Science.gov

Sample records for selective a1a-blocker randomized

  1. Random Selection for Drug Screening

    SciTech Connect

    Center for Human Reliability Studies

    2007-05-01

    Simple random sampling is generally the starting point for a random sampling process. This sampling technique ensures that each individual within a group (population) has an equal chance of being selected. There are a variety of ways to implement random sampling in a practical situation.
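
    To make the technique concrete, here is a minimal Python sketch of simple random sampling (illustrative names; not part of the original record):

```python
import random

def select_for_screening(population, n, seed=None):
    """Simple random sample: every member of the population has an
    equal chance of being selected."""
    rng = random.Random(seed)
    return rng.sample(population, n)

# Example: randomly select 5 of 100 employee IDs for screening.
employees = [f"EMP-{i:03d}" for i in range(100)]
print(select_for_screening(employees, 5, seed=42))
```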

  2. Random Selection for Drug Screening

    SciTech Connect

    Center for Human Reliability Studies

    2007-05-01

    Sampling is the process of choosing some members out of a group or population. Probability sampling, or random sampling, is the process of selecting members by chance with a known probability of each individual being chosen.

  3. Randomized selection on the GPU

    SciTech Connect

    Monroe, Laura Marie; Wendelberger, Joanne R; Michalak, Sarah E

    2011-01-13

    We implement here a fast and memory-sparing probabilistic top-N selection algorithm on the GPU. To our knowledge, this is the first direct selection algorithm for the GPU in the literature. The algorithm proceeds via a probabilistic guess-and-check process searching for the Nth element. It always gives a correct result and always terminates. The use of randomization reduces the amount of data that needs heavy processing, and so reduces the average time required by the algorithm. Probabilistic Las Vegas algorithms of this kind are a form of stochastic optimization and can be well suited to more general parallel processors with limited amounts of fast memory.
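
    The guess-and-check idea translates directly into a (CPU-side, illustrative) sketch; the paper's contribution is the GPU implementation, which this does not attempt:

```python
import random

def randomized_top_n(data, n, oversample=1000, seed=0):
    """Las Vegas top-N selection: guess a cutoff from a small random
    sample, check it against the full data, and retry on a bad guess.
    The result is always correct; randomness only affects run time."""
    rng = random.Random(seed)
    while True:
        k = min(len(data), max(n, oversample))
        guess = sorted(rng.sample(data, k), reverse=True)[min(n, k) - 1]
        # Check step: elements at least as large as the guessed cutoff.
        candidates = [x for x in data if x >= guess]
        if len(candidates) >= n:
            # The true top N are all >= the Nth largest value >= guess,
            # so they are contained in the (small) candidate list.
            return sorted(candidates, reverse=True)[:n]
        # Guess was too aggressive (unlucky sample); loop and re-guess.

data = [random.random() for _ in range(1_000_000)]
print(randomized_top_n(data, 10)[:3])
```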

  4. 47 CFR 1.1603 - Conduct of random selection.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1603 Conduct of random selection. The random selection probabilities will be...

  5. Randomness in post-selected events

    NASA Astrophysics Data System (ADS)

    Phuc Thinh, Le; de la Torre, Gonzalo; Bancal, Jean-Daniel; Pironio, Stefano; Scarani, Valerio

    2016-03-01

    Bell inequality violations can be used to certify private randomness for use in cryptographic applications. In photonic Bell experiments, a large amount of the data that is generated comes from no-detection events and presumably contains little randomness. This raises the question as to whether randomness can be extracted only from the smaller post-selected subset corresponding to proper detection events, instead of from the entire set of data. This could in principle be feasible without opening an analogue of the detection loophole as long as the min-entropy of the post-selected data is evaluated by taking all the information into account, including no-detection events. The possibility of extracting randomness from a short string has a practical advantage, because it reduces the computational time of the extraction. Here, we investigate the above idea in a simple scenario, where the devices and the adversary behave according to i.i.d. strategies. We show that indeed almost all the randomness is present in the pair of outcomes for which at least one detection happened. We further show that in some cases applying pre-processing to the data can capture features that an analysis based on global frequencies only misses, thus resulting in the certification of more randomness. We then briefly consider non-i.i.d. strategies and provide an explicit example of such a strategy that is more powerful than any i.i.d. one even in the asymptotic limit of infinitely many measurement rounds, something that was not reported before in the context of Bell inequalities.

  6. Selecting materialized views using random algorithm

    NASA Astrophysics Data System (ADS)

    Zhou, Lijuan; Hao, Zhongxiao; Liu, Chi

    2007-04-01

    The data warehouse is a repository of information collected from multiple, possibly heterogeneous, autonomous, distributed databases. The information stored in the data warehouse is in the form of views, referred to as materialized views, which are stored for the purpose of efficiently implementing on-line analytical processing queries; their selection is one of the most important decisions in designing a data warehouse. The first issue for the user to consider is query response time. In this paper, we therefore develop algorithms to select a set of views to materialize in a data warehouse so as to minimize the total view maintenance cost under the constraint of a given query response time; we call this the query-cost view-selection problem. First, the cost graph and cost model of the query-cost view-selection problem are presented. Second, methods for selecting materialized views using randomized algorithms are presented. A genetic algorithm is applied to the materialized-view selection problem, but as the genetic process develops, producing legal solutions becomes more and more difficult: many solutions are eliminated, and the time needed to produce solutions grows. We therefore present an improved algorithm that combines simulated annealing with the genetic algorithm to solve the query-cost view-selection problem. Finally, simulation experiments are used to test the correctness and efficiency of our algorithms. The experiments show that the given methods can provide near-optimal solutions in limited time and work better in practical cases. Randomized algorithms will become invaluable tools for data warehouse evolution.
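
    A minimal sketch of the kind of simulated-annealing-inside-GA hybrid described above, with a hypothetical cost function standing in for the paper's query/maintenance cost model:

```python
import math
import random

def hybrid_ga_sa(candidates, cost, pop_size=30, generations=50,
                 t0=1.0, cooling=0.95, seed=1):
    """GA for view selection whose mutation step is a simulated-annealing
    move, in the spirit of the combined algorithm described above."""
    rng = random.Random(seed)

    def random_solution():
        return frozenset(v for v in candidates if rng.random() < 0.5)

    def sa_mutate(sol, temp):
        # Flip one view in/out; accept a worse set with prob e^(-delta/T).
        v = rng.choice(candidates)
        new = frozenset(set(sol) ^ {v})
        delta = cost(new) - cost(sol)
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            return new
        return sol

    population = [random_solution() for _ in range(pop_size)]
    temp = t0
    for _ in range(generations):
        population.sort(key=cost)
        parents = population[: pop_size // 2]          # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)              # uniform crossover
            child = frozenset(v for v in candidates
                              if v in (a if rng.random() < 0.5 else b))
            children.append(sa_mutate(child, temp))
        population = parents + children
        temp *= cooling                                # annealing schedule
    return min(population, key=cost)

# Toy usage: maintenance cost grows with the number of views, with a
# penalty (a stand-in for query response time) if too few are kept.
views = list(range(20))
def cost(s): return len(s) * 1.0 + (0.0 if len(s) >= 5 else 100.0)
print(sorted(hybrid_ga_sa(views, cost)))
```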

  7. 32 CFR 1624.1 - Random selection procedures for induction.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Random selection procedures for induction. 1624... SYSTEM INDUCTIONS § 1624.1 Random selection procedures for induction. (a) The Director of Selective Service shall from time to time establish a random selection sequence for induction by a drawing to...

  8. 32 CFR 1624.1 - Random selection procedures for induction.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Random selection procedures for induction. 1624... SYSTEM INDUCTIONS § 1624.1 Random selection procedures for induction. (a) The Director of Selective Service shall from time to time establish a random selection sequence for induction by a drawing to...

  9. 47 CFR 1.1602 - Designation for random selection.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures §...

  10. 47 CFR 1.1603 - Conduct of random selection.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1603 Conduct of random selection....

  11. 47 CFR 1.1603 - Conduct of random selection.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1603...

  12. 47 CFR 1.1602 - Designation for random selection.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1602 Designation for random selection. Applications in the services specified in...

  13. 47 CFR 1.1602 - Designation for random selection.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1602 Designation for random...

  14. Protein minimization by random fragmentation and selection.

    PubMed

    Rudgers, G W; Palzkill, T

    2001-07-01

    Protein-protein interactions are involved in most biological processes and are important targets for drug design. Over the past decade, there has been increased interest in the design of small molecules that mimic functional epitopes of protein inhibitors. BLIP is a 165 amino acid protein that is a potent inhibitor of TEM-1 beta-lactamase (Ki = 0.1 nM). To aid in the development of new inhibitors of beta-lactamase, the gene encoding BLIP was randomly fragmented and DNA segments encoding peptides that retain the ability to bind TEM-1 beta-lactamase were isolated using phage display. The selected peptides revealed a common, overlapping region that includes BLIP residues C30-D49. Synthesis and binding analysis of the C30-D49 peptide indicate that this peptide inhibits TEM-1 beta-lactamase. Therefore, a peptide derivative of BLIP that has been reduced in size by 88% compared with wild-type BLIP retains the ability to bind and inhibit beta-lactamase.

  15. Bayesian nonparametric centered random effects models with variable selection.

    PubMed

    Yang, Mingan

    2013-03-01

    In a linear mixed effects model, it is common practice to assume that the random effects follow a parametric distribution such as a normal distribution with mean zero. However, in the case of variable selection, substantial violation of the normality assumption can potentially impact the subset selection and result in poor interpretation and even incorrect results. In nonparametric random effects models, the random effects generally have a nonzero mean, which causes an identifiability problem for the fixed effects that are paired with the random effects. In this article, we focus on a Bayesian method for variable selection. We characterize the subject-specific random effects nonparametrically with a Dirichlet process and resolve the bias simultaneously. In particular, we propose flexible modeling of the conditional distribution of the random effects with changes across the predictor space. The approach is implemented using a stochastic search Gibbs sampler to identify subsets of fixed effects and random effects to be included in the model. Simulations are provided to evaluate and compare the performance of our approach to the existing ones. We then apply the new approach to a real data example, cross-country and interlaboratory rodent uterotrophic bioassay.

  16. Alternative Modal Basis Selection Procedures for Nonlinear Random Response Simulation

    NASA Technical Reports Server (NTRS)

    Przekop, Adam; Guo, Xinyun; Rizzi, Stephen A.

    2010-01-01

    Three procedures to guide selection of an efficient modal basis in a nonlinear random response analysis are examined. One method is based only on proper orthogonal decomposition, while the other two additionally involve smooth orthogonal decomposition. Acoustic random response problems are employed to assess the performance of the three modal basis selection approaches. A thermally post-buckled beam exhibiting snap-through behavior, a shallowly curved arch in the auto-parametric response regime and a plate structure are used as numerical test articles. The results of the three reduced-order analyses are compared with the results of the computationally taxing simulation in the physical degrees of freedom. For the cases considered, all three methods are shown to produce modal bases resulting in accurate and computationally efficient reduced-order nonlinear simulations.

  17. Selective advantage for sexual reproduction with random haploid fusion.

    PubMed

    Tannenbaum, Emmanuel

    2009-05-01

    This article develops a simplified set of models describing asexual and sexual replication in unicellular diploid organisms. The models assume organisms whose genomes consist of two chromosomes, where each chromosome is assumed to be functional if it is equal to some master sequence σ0, and non-functional otherwise. We review the previously studied case of selective mating, where it is assumed that only haploids with functional chromosomes can fuse, and also consider the case of random haploid fusion. When the cost for sex is small, as measured by the ratio of the characteristic haploid fusion time to the characteristic growth time, we find that sexual replication with random haploid fusion leads to a greater mean fitness for the population than a purely asexual strategy. However, independently of the cost for sex, we find that sexual replication with a selective mating strategy leads to a higher mean fitness than the random mating strategy. The results of this article are consistent with previous studies suggesting that sex is favored at intermediate mutation rates, for slowly replicating organisms, and at high population densities. Furthermore, the results of this article provide a basis for understanding sex as a stress response in unicellular organisms such as Saccharomyces cerevisiae (Baker's yeast).

  18. Selective randomized load balancing and mesh networks with changing demands

    NASA Astrophysics Data System (ADS)

    Shepherd, F. B.; Winzer, P. J.

    2006-05-01

    We consider the problem of building cost-effective networks that are robust to dynamic changes in demand patterns. We compare several architectures using demand-oblivious routing strategies. Traditional approaches include single-hop architectures based on a (static or dynamic) circuit-switched core infrastructure and multihop (packet-switched) architectures based on point-to-point circuits in the core. To address demand uncertainty, we seek minimum cost networks that can carry the class of hose demand matrices. Apart from shortest-path routing, Valiant's randomized load balancing (RLB), and virtual private network (VPN) tree routing, we propose a further, highly attractive approach: selective randomized load balancing (SRLB). This is a blend of dual-hop hub routing and randomized load balancing that combines the advantages of both architectures in terms of network cost, delay, and delay jitter. In particular, we give empirical analyses for the cost (in terms of transport and switching equipment) for the discussed architectures, based on three representative carrier networks. Of these three networks, SRLB maintains the resilience properties of RLB while achieving significant cost reduction over all other architectures, including RLB and multihop Internet protocol/multiprotocol label switching (IP/MPLS) networks using VPN-tree routing.

  19. Selecting Random Distributed Elements for HIFU using Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Zhou, Yufeng

    2011-09-01

    As an effective and noninvasive therapeutic modality for tumor treatment, high-intensity focused ultrasound (HIFU) has attracted attention from both physicians and patients. New generations of HIFU systems with the ability to electrically steer the HIFU focus using phased-array transducers have been under development. The presence of side and grating lobes may cause undesired thermal accumulation at the interface of the coupling medium (i.e., water) and skin, or in the intervening tissue. Although sparse, randomly distributed piston elements can reduce the amplitude of grating lobes, there are theoretically no grating lobes with the use of concave elements in the new phased-array HIFU. A new HIFU transmission strategy is proposed in this study: firing some, but not all, elements for a certain period and then changing to another group for the next firing sequence. The advantages are: 1) the asymmetric positions of the active elements may reduce the side lobes, and 2) each element has some resting time during the entire HIFU ablation (up to several hours for some clinical applications), so that the decrease in transducer efficiency due to thermal accumulation is minimized. A genetic algorithm was used to select the randomly distributed elements in the HIFU array, with the amplitudes of the first side lobes at the focal plane used as the fitness value in the optimization. Overall, it is suggested that the proposed strategy could reduce the side lobes and their consequent side effects, and that the genetic algorithm is effective in selecting the randomly distributed elements in a HIFU array.

  20. Hierarchy and extremes in selections from pools of randomized proteins

    PubMed Central

    Boyer, Sébastien; Biswas, Dipanwita; Kumar Soshee, Ananda; Scaramozzino, Natale; Nizak, Clément; Rivoire, Olivier

    2016-01-01

    Variation and selection are the core principles of Darwinian evolution, but quantitatively relating the diversity of a population to its capacity to respond to selection is challenging. Here, we examine this problem at a molecular level in the context of populations of partially randomized proteins selected for binding to well-defined targets. We built several minimal protein libraries, screened them in vitro by phage display, and analyzed their response to selection by high-throughput sequencing. A statistical analysis of the results reveals two main findings. First, libraries with the same sequence diversity but built around different “frameworks” typically have vastly different responses; second, the distribution of responses of the best binders in a library follows a simple scaling law. We show how an elementary probabilistic model based on extreme value theory rationalizes the latter finding. Our results have implications for designing synthetic protein libraries, estimating the density of functional biomolecules in sequence space, characterizing diversity in natural populations, and experimentally investigating evolvability (i.e., the potential for future evolution). PMID:26969726

  1. Unbiased split variable selection for random survival forests using maximally selected rank statistics.

    PubMed

    Wright, Marvin N; Dankowski, Theresa; Ziegler, Andreas

    2017-04-15

    The most popular approach for analyzing survival data is the Cox regression model. The Cox model may, however, be misspecified, and its proportionality assumption may not always be fulfilled. An alternative approach for survival prediction is random forests for survival outcomes. The standard split criterion for random survival forests is the log-rank test statistic, which favors splitting variables with many possible split points. Conditional inference forests avoid this split variable selection bias. However, linear rank statistics are utilized by default in conditional inference forests to select the optimal splitting variable, which cannot detect non-linear effects in the independent variables. An alternative is to use maximally selected rank statistics for the split point selection. As in conditional inference forests, splitting variables are compared on the p-value scale. However, instead of the conditional Monte-Carlo approach used in conditional inference forests, p-value approximations are employed. We describe several p-value approximations and the implementation of the proposed random forest approach. A simulation study demonstrates that unbiased split variable selection is possible. However, there is a trade-off between unbiased split variable selection and runtime. In benchmark studies of prediction performance on simulated and real datasets, the new method performs better than random survival forests if informative dichotomous variables are combined with uninformative variables with more categories and better than conditional inference forests if non-linear covariate effects are included. In a runtime comparison, the method proves to be computationally faster than both alternatives, if a simple p-value approximation is used. Copyright © 2017 John Wiley & Sons, Ltd.
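
    A toy illustration of a maximally selected rank statistic for a numeric (not survival) outcome, with a permutation p-value standing in for the paper's analytic approximations; all names and settings are illustrative:

```python
import numpy as np

def max_selected_rank_statistic(x, y, n_perm=2000, seed=0):
    """Toy version: scan every split point of covariate x, compute a
    standardized Wilcoxon rank-sum statistic at each split, take the
    maximum, and calibrate its p-value by permutation. (The paper uses
    fast analytic p-value approximations and survival outcomes.)"""
    rng = np.random.default_rng(seed)
    n = len(x)
    order = np.argsort(x)
    ranks = np.argsort(np.argsort(y)) + 1.0   # ranks of the outcome

    def max_stat(r):
        left_sums = np.cumsum(r[order])[:-1]   # rank sum left of each split
        k = np.arange(1, n)                    # left-group sizes
        mean = k * (n + 1) / 2.0               # Wilcoxon null mean
        var = k * (n - k) * (n + 1) / 12.0     # Wilcoxon null variance
        return np.max(np.abs(left_sums - mean) / np.sqrt(var))

    observed = max_stat(ranks)
    perm = np.array([max_stat(rng.permutation(ranks)) for _ in range(n_perm)])
    return observed, float(np.mean(perm >= observed))

x = np.random.default_rng(1).normal(size=100)
y = (x > 0.5) * 1.0 + np.random.default_rng(2).normal(size=100)  # step effect
stat, p = max_selected_rank_statistic(x, y)
print(stat, p)   # small p: the maximally selected statistic finds the step
```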

  2. Thymidine kinase mutants obtained by random sequence selection.

    PubMed

    Munir, K M; French, D C; Loeb, L A

    1993-05-01

    Knowledge of the catalytic properties and structural information regarding the amino acid residues that comprise the active site of an enzyme allows one, in principle, to use site-specific mutagenesis to construct genes that encode enzymes with altered functions. However, such information about most enzymes is not known and the effects of specific amino acid substitutions are not generally predictable. An alternative approach is to substitute random nucleotides for key codons in a gene and to use genetic selection to identify new and interesting enzyme variants. We describe here the construction, selection, and characterization of herpes simplex virus type 1 thymidine kinase mutants either with different catalytic properties or with enhanced thermostability. From a library containing 2 × 10^6 plasmid-encoded herpes thymidine kinase genes, each with a different nucleotide sequence at the putative nucleoside binding site, we obtained 1540 active mutants. Using this library and one previously constructed, we identified by secondary selection Escherichia coli harboring thymidine kinase mutant clones that were unable to grow in the presence of concentrations of 3'-azido-3'-deoxythymidine (AZT) that permit colony formation by E. coli harboring the wild-type plasmid. Two of the mutant enzymes exhibited a reduced Km for AZT, one of which displayed a higher catalytic efficiency for AZT over thymidine relative to that of the wild type. We also identified one mutant with enhanced thermostability. These mutants may have clinical potential as the promise of gene therapy is increasingly becoming a reality.

  3. Materials selection for oxide-based resistive random access memories

    SciTech Connect

    Guo, Yuzheng; Robertson, John

    2014-12-01

    The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO₂, TiO₂, Ta₂O₅, and Al₂O₃, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta₂O₅ RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.

  4. Ammons quick test validity among randomly selected referrals.

    PubMed

    Zagar, Robert John; Kovach, Joseph W; Busch, Kenneth G; Zablocki, Michael D; Osnowitz, William; Neuhengen, Jonas; Liu, Yutong; Zagar, Agata Karolina

    2013-12-01

    After selection using a random number table from volunteer referrals, 89 youths (61 boys, 28 girls; 48 African Americans, 2 Asian Americans, 27 Euro-Americans, 12 Hispanic Americans) and 147 adults (107 men, 40 women; 11 African Americans, 6 Asian Americans, 124 Euro-Americans, 6 Hispanic Americans) were administered the Ammons Quick Test (QT). Means, confidence intervals, standard deviations, and Pearson product-moment correlations among tests were computed. The Ammons QT was moderately to strongly, and statistically significantly, correlated with: the Peabody Picture Vocabulary Test-3b (PPVT-3b); the Vineland Adaptive Behavior Scales-2 Parent/Teacher Form; the Wechsler Intelligence Scale for Children (WISC-4) or the Wechsler Adult Intelligence Scale (WAIS-4); and the Wide Range Achievement Test-Fourth Edition (WRAT-4) Blue and Green Forms. After 51 years, the original norms for the Ammons QT remain valid measures of receptive vocabulary, verbal intelligence, and auditory information processing useful to clinicians.

  5. Random copolymers at a selective interface: saturation effects.

    PubMed

    Kłos, J; Sommer, J-U

    2007-11-07

    Combining scaling arguments and Monte Carlo simulations using the bond fluctuation method, we have studied concentration effects for the adsorption of symmetric AB-random copolymers at selective, symmetric interfaces. For the scaling analysis we consider a hierarchy of two length scales given by the excess (adsorption) blobs and by two-dimensional thermal blobs in the semidilute surface regime. When both length scales match, a densely packed array of adsorption blobs is formed (saturation). We show that for random copolymer adsorption the interface concentration can be further increased (oversaturation) due to reorganization of excess blobs. Crossing over this threshold results in a qualitative change in the behavior of the adsorption layer which involves a change in the average shape of the adsorbed chains towards a hairpin-like form. We have analyzed the distribution of loops and tails of adsorbed chains in the various concentration regimes as well as the chain order parameter, concentration profiles, and the exchange rate of individual chains. We emphasize the role of saturation scaling, which dominates the behavior of static and dynamic quantities at higher surface concentration.

  6. A Bayesian random effects discrete-choice model for resource selection: Population-level selection inference

    USGS Publications Warehouse

    Thomas, D.L.; Johnson, D.; Griffith, B.

    2006-01-01

    Modeling the probability of use of land units characterized by discrete and continuous measures, we present a Bayesian random-effects model to assess resource selection. This model provides simultaneous estimation of both individual- and population-level selection. Deviance information criterion (DIC), a Bayesian alternative to AIC that is sample-size specific, is used for model selection. Aerial radiolocation data from 76 adult female caribou (Rangifer tarandus) and calf pairs during 1 year on an Arctic coastal plain calving ground were used to illustrate models and assess population-level selection of landscape attributes, as well as individual heterogeneity of selection. Landscape attributes included elevation, NDVI (a measure of forage greenness), and land cover-type classification. Results from the first of a 2-stage model-selection procedure indicated that there is substantial heterogeneity among cow-calf pairs with respect to selection of the landscape attributes. In the second stage, selection of models with heterogeneity included indicated that at the population level, NDVI and land cover class were significant attributes for selection of different landscapes by pairs on the calving ground. Population-level selection coefficients indicate that the pairs generally select landscapes with higher levels of NDVI, but the relationship is quadratic. The highest rate of selection occurs at values of NDVI less than the maximum observed. Results for land cover-class selection coefficients indicate that wet sedge, moist sedge, herbaceous tussock tundra, and shrub tussock tundra are selected at approximately the same rate, while alpine and sparsely vegetated landscapes are selected at a lower rate. Furthermore, the variability in selection by individual caribou for moist sedge and sparsely vegetated landscapes is large relative to the variability in selection of other land cover types. The example analysis illustrates that, while sometimes computationally intense, a…

  7. Study on MAX-MIN Ant System with Random Selection in Quadratic Assignment Problem

    NASA Astrophysics Data System (ADS)

    Iimura, Ichiro; Yoshida, Kenji; Ishibashi, Ken; Nakayama, Shigeru

    Ant Colony Optimization (ACO), a type of swarm intelligence inspired by ants' foraging behavior, has been studied extensively, and its effectiveness has been shown by many researchers. Previous studies have reported that MAX-MIN Ant System (MMAS) is one of the most effective ACO algorithms. MMAS maintains the balance between intensification and diversification of pheromone by limiting the quantity of pheromone to a range between minimum and maximum values. In this paper, we propose MAX-MIN Ant System with Random Selection (MMASRS) to improve the search performance even further. MMASRS is a new ACO algorithm: MMAS into which random selection, an edge-choosing method used by agents (ants), is newly introduced. In our experimental evaluation using ten quadratic assignment problems, we show that the proposed MMASRS with random selection is superior to the conventional MMAS without it from the viewpoint of search performance.
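
    A sketch of what such a random-selection edge choice might look like; p_random and the data structures are assumptions of this illustration, not taken from the paper:

```python
import random

def choose_next_city(current, unvisited, pheromone, heuristic,
                     alpha=1.0, beta=2.0, p_random=0.1, rng=random.Random(0)):
    """Edge choice in the spirit of MMASRS: with probability p_random the
    ant picks a random unvisited city (the newly introduced random
    selection); otherwise it uses the usual MMAS roulette-wheel rule on
    pheromone and heuristic values. All names here are illustrative."""
    unvisited = list(unvisited)
    if rng.random() < p_random:
        return rng.choice(unvisited)          # random-selection step
    weights = [pheromone[current][j] ** alpha * heuristic[current][j] ** beta
               for j in unvisited]
    r = rng.random() * sum(weights)           # roulette-wheel selection
    for j, w in zip(unvisited, weights):
        r -= w
        if r <= 0:
            return j
    return unvisited[-1]

# Tiny usage example with hand-made pheromone/heuristic tables.
pher = {0: {1: 1.0, 2: 1.0}}
heur = {0: {1: 0.5, 2: 2.0}}
print(choose_next_city(0, [1, 2], pher, heur))
```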

  8. Testing, Selection, and Implementation of Random Number Generators

    DTIC Science & Technology

    2008-07-01

    U.S. Army Research Laboratory report (ATTN: AMSRD-ARL-SL-BD, Aberdeen Proving Ground, MD 21005-5068). Only report-form and table-of-contents fragments are available for this record; the contents cover random number generators, linear RNGs, and the characteristic polynomial.

  9. The Application of Imperialist Competitive Algorithm for Fuzzy Random Portfolio Selection Problem

    NASA Astrophysics Data System (ADS)

    EhsanHesamSadati, Mir; Bagherzadeh Mohasefi, Jamshid

    2013-10-01

    This paper presents an implementation of the Imperialist Competitive Algorithm (ICA) for solving the fuzzy random portfolio selection problem, where asset returns are represented by fuzzy random variables. Portfolio optimization is an important research field in modern finance. Using the necessity-based model, the fuzzy random variables are reformulated into a linear programming problem, and an ICA is designed to find the optimum solution. To show the efficiency of the proposed method, a numerical example illustrates the whole idea of implementing ICA for the fuzzy random portfolio selection problem.

  10. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…
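
    A hedged sketch of the general idea (propensity modeling followed by stratified sampling); the inputs and stratification details are assumptions of this illustration, not the authors' procedure:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def propensity_stratified_sample(X, eligible, n_strata=5, per_stratum=20,
                                 seed=0):
    """Model each population unit's propensity to be in the recruitment
    pool, cut the population into propensity strata, then draw a random
    sample from every stratum so the experimental sample resembles the
    whole inference population. X (unit covariates, array) and eligible
    (0/1 pool membership, array) are hypothetical inputs."""
    rng = np.random.default_rng(seed)
    ps = LogisticRegression(max_iter=1000).fit(X, eligible).predict_proba(X)[:, 1]
    edges = np.quantile(ps, np.linspace(0, 1, n_strata + 1))
    strata = np.clip(np.searchsorted(edges, ps, side="right") - 1,
                     0, n_strata - 1)
    chosen = []
    for s in range(n_strata):
        pool = np.where((strata == s) & (eligible == 1))[0]
        take = min(per_stratum, len(pool))
        chosen.extend(rng.choice(pool, size=take, replace=False))
    return np.asarray(chosen)
```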

  11. Acceptance sampling using judgmental and randomly selected samples

    SciTech Connect

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling, where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
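
    A single-group simplification of such a Bayesian acceptance-sampling calculation, assuming a conjugate Beta prior (the paper's two-group judgmental structure is omitted):

```python
from scipy.stats import beta

def prob_fraction_acceptable(n_clean, p_target=0.99, a=1.0, b=1.0):
    """n_clean randomly sampled items were all acceptable; with a
    Beta(a, b) prior on the acceptable fraction, return the posterior
    probability that at least p_target of the population is acceptable."""
    posterior = beta(a + n_clean, b)     # 0 unacceptable items observed
    return posterior.sf(p_target)

# Samples needed (all acceptable) for 95% confidence that >=99% is clean:
n = 1
while prob_fraction_acceptable(n) < 0.95:
    n += 1
print(n)   # 298 under a uniform prior
```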

  12. Selective advantage for sexual replication with random haploid fusion

    NASA Astrophysics Data System (ADS)

    Tannenbaum, Emmanuel

    2008-03-01

    This talk develops a simplified set of models describing asexual and sexual replication in unicellular diploid organisms. The models assume organisms whose genomes consist of two chromosomes, where each chromosome is assumed to be functional if and only if it is equal to some master sequence. The fitness of an organism is determined by the number of functional chromosomes in its genome. For a population replicating asexually, a cell replicates both of its chromosomes, and then divides and splits its genetic material evenly between the two cells. For a population replicating sexually, a given cell first divides into two haploids, which enter a haploid pool. Within the haploid pool, haploids fuse into diploids, which then divide via the normal mitotic process. When the cost for sex is small, as measured by the ratio of the characteristic haploid fusion time to the characteristic growth time, we find that sexual replication with random haploid fusion leads to a greater mean fitness for the population than a purely asexual strategy. The results of this talk are consistent with previous studies suggesting that sex is favored at intermediate mutation rates, for slowly replicating organisms, and at high population densities.

  13. Random Forests-Based Feature Selection for Land-Use Classification Using LIDAR Data and Orthoimagery

    NASA Astrophysics Data System (ADS)

    Guan, H.; Yu, J.; Li, J.; Luo, L.

    2012-07-01

    The development of lidar systems, especially those incorporating high-resolution camera components, has shown great potential for urban classification. However, automatically selecting the best features for land-use classification is challenging. Random Forests, a newly developed machine learning algorithm, is receiving considerable attention in the field of image classification and pattern recognition; in particular, it can provide a measure of variable importance. In this study, the performance of Random Forests-based feature selection for urban areas was therefore explored. First, we extract features from the lidar data, including height-based and intensity-based GLCM measures; other spectral features can be obtained from imagery, such as the Red, Green, and Blue bands and GLCM-based measures. Finally, Random Forests is used to automatically select the optimal and uncorrelated features for land-use classification. 0.5-meter-resolution lidar data and aerial imagery are used to assess the feature-selection performance of Random Forests in the study area, located in Mannheim, Germany. The results clearly demonstrate that the use of Random Forests-based feature selection can improve classification performance through the selected features.
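
    A minimal sketch of importance-based selection with a correlation filter, assuming scikit-learn's RandomForestClassifier; the thresholds are illustrative, not the study's settings:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

def rf_select_features(X, y, n_keep=10, corr_max=0.9, seed=0):
    """Rank features by Random Forests variable importance, then keep
    top-ranked features while skipping any candidate highly correlated
    with one already kept (optimal *and* uncorrelated features)."""
    rf = RandomForestClassifier(n_estimators=500, random_state=seed).fit(X, y)
    order = np.argsort(rf.feature_importances_)[::-1]
    corr = np.abs(np.corrcoef(X, rowvar=False))
    kept = []
    for i in order:
        if all(corr[i, j] < corr_max for j in kept):
            kept.append(i)
        if len(kept) == n_keep:
            break
    return kept

X, y = make_classification(n_samples=500, n_features=40, n_informative=8,
                           random_state=1)
print(rf_select_features(X, y))   # indices of selected, uncorrelated features
```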

  14. Selection on plasticity of seasonal life-history traits using random regression mixed model analysis

    PubMed Central

    Brommer, Jon E; Kontiainen, Pekka; Pietiäinen, Hannu

    2012-01-01

    Theory considers the covariation of seasonal life-history traits as an optimal reaction norm, implying that deviating from this reaction norm reduces fitness. However, the estimation of reaction-norm properties (i.e., elevation, linear slope, and higher order slope terms) and the selection on these is statistically challenging. We here advocate the use of random regression mixed models to estimate reaction-norm properties and the use of bivariate random regression to estimate selection on these properties within a single model. We illustrate the approach by random regression mixed models on 1115 observations of clutch sizes and laying dates of 361 female Ural owls (Strix uralensis) collected over 31 years to show that (1) there is variation across individuals in the slope of their clutch size–laying date relationship, and that (2) there is selection on the slope of the reaction norm between these two traits. Hence, natural selection potentially drives the negative covariance in clutch size and laying date in this species. The random-regression approach is hampered by an inability to estimate nonlinear selection, but avoids a number of disadvantages (stats-on-stats, connecting reaction-norm properties to fitness). The approach is of value in describing and studying selection on behavioral reaction norms (behavioral syndromes) or life-history reaction norms. The approach can also be extended to consider the genetic underpinning of reaction-norm properties. PMID:22837818
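
    A runnable sketch of a univariate random regression mixed model (random intercept and slope per female) using statsmodels, on synthetic stand-in data; the paper's bivariate model linking reaction-norm properties to fitness is beyond this sketch:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the owl data: females differ in both elevation
# and slope of the clutch size-laying date reaction norm. All names and
# numbers here are illustrative, not the paper's data.
rng = np.random.default_rng(0)
rows = []
for female in range(100):
    slope = -0.5 + rng.normal(0, 0.2)      # female-specific slope
    elev = 5 + rng.normal(0, 0.8)          # female-specific elevation
    for _ in range(5):                     # repeated breeding attempts
        date = rng.normal(0, 1)            # mean-centered laying date
        rows.append((female, date, elev + slope * date + rng.normal(0, 0.5)))
df = pd.DataFrame(rows, columns=["female_id", "laying_date", "clutch_size"])

# Random regression mixed model: re_formula adds a random slope on
# laying_date in addition to the random intercept for each female.
model = smf.mixedlm("clutch_size ~ laying_date", data=df,
                    groups=df["female_id"], re_formula="~laying_date")
fit = model.fit()
print(fit.summary())   # the laying_date random-effect variance estimates
                       # among-female variation in reaction-norm slope
```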

  15. An Evaluation of Information Criteria Use for Correct Cross-Classified Random Effects Model Selection

    ERIC Educational Resources Information Center

    Beretvas, S. Natasha; Murphy, Daniel L.

    2013-01-01

    The authors assessed correct model identification rates of Akaike's information criterion (AIC), corrected criterion (AICC), consistent AIC (CAIC), Hannon and Quinn's information criterion (HQIC), and Bayesian information criterion (BIC) for selecting among cross-classified random effects models. Performance of default values for the 5…

  16. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 32 2013-07-01 2013-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or...

  17. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or...

  18. Variable selection with random forest: Balancing stability, performance, and interpretation in ecological and environmental modeling

    EPA Science Inventory

    Random forest (RF) is popular in ecological and environmental modeling, in part, because of its insensitivity to correlated predictors and resistance to overfitting. Although variable selection has been proposed to improve both performance and interpretation of RF models, it is u...

  19. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1... nearly equal as possible) halves. For example, divide the area into top and bottom halves or left and right halves. Choose the top/bottom or left/right division that produces halves having as close to...
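
    The coin-flip character of the procedure can be sketched in a few lines (illustration only; the regulation text governs the actual sampling):

```python
import random

def select_half(rng=random.Random()):
    """Split the 1 m^2 portion into two equal halves along a randomly
    chosen axis (top/bottom or left/right), then randomly pick one half
    for the surface wipe sample."""
    axis = rng.choice(["top/bottom", "left/right"])
    half = rng.choice(axis.split("/"))
    return axis, half

print(select_half())   # e.g., ('left/right', 'left')
```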

  20. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Sampling Non-Porous Surfaces for Measurement-Based... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each...

  1. Optimization Of Mean-Semivariance-Skewness Portfolio Selection Model In Fuzzy Random Environment

    NASA Astrophysics Data System (ADS)

    Chatterjee, Amitava; Bhattacharyya, Rupak; Mukherjee, Supratim; Kar, Samarjit

    2010-10-01

    The purpose of this paper is to construct a mean-semivariance-skewness portfolio selection model in a fuzzy random environment. The objective is to maximize skewness under a predefined maximum risk tolerance and minimum expected return. The security returns in the objectives and constraints are assumed to be fuzzy random variables; the vagueness of these fuzzy random variables in the objectives and constraints is then transformed into fuzzy variables similar to trapezoidal numbers. The newly formed fuzzy model is then converted into a deterministic optimization model. The feasibility and effectiveness of the proposed method are verified by a numerical example extracted from the Bombay Stock Exchange (BSE). The exact parameters of the fuzzy membership function and probability density function are obtained through fuzzy random simulation of past data.

  2. COMPARISON OF RANDOM AND SYSTEMATIC SITE SELECTION FOR ASSESSING ATTAINMENT OF AQUATIC LIFE USES IN SEGMENTS OF THE OHIO RIVER

    EPA Science Inventory

    This report is a description of field work and data analysis results comparing a design comparable to systematic site selection with one based on random selection of sites. The report is expected to validate the use of random site selection in the bioassessment program for the O...

  3. Adaptive consensus of scale-free multi-agent system by randomly selecting links

    NASA Astrophysics Data System (ADS)

    Mou, Jinping; Ge, Huafeng

    2016-06-01

    This paper investigates an adaptive consensus problem for distributed scale-free multi-agent systems (SFMASs) with randomly selected links, where the degree of each node follows a power-law distribution. The random link selection is based on the assumption that every agent decides, with a certain probability, to select links among its neighbours according to the received data. Accordingly, a novel consensus protocol based on the range of the received data is developed, and each node updates its state according to the protocol. Using an iterative method and the Cauchy inequality, the theoretical analysis shows that all errors among agents converge to zero and, in the meantime, several criteria for consensus are obtained. A numerical example shows the reliability of the proposed methods.

  4. Development of Solution Algorithm and Sensitivity Analysis for Random Fuzzy Portfolio Selection Model

    NASA Astrophysics Data System (ADS)

    Hasuike, Takashi; Katagiri, Hideki

    2010-10-01

    This paper proposes a portfolio selection problem that takes an investor's subjectivity into account, together with a sensitivity analysis for changes in that subjectivity. Since the proposed problem is formulated as a random fuzzy programming problem, owing to both randomness and subjectivity represented by fuzzy numbers, it is not well defined. Therefore, by introducing the Sharpe ratio, one of the most important performance measures of portfolio models, the main problem is transformed into a standard fuzzy programming problem. Furthermore, using sensitivity analysis for the fuzziness, the analytical optimal portfolio with the sensitivity factor is obtained.

  5. Alternative Modal Basis Selection Procedures For Reduced-Order Nonlinear Random Response Simulation

    NASA Technical Reports Server (NTRS)

    Przekop, Adam; Guo, Xinyun; Rizzi, Stephen A.

    2012-01-01

    Three procedures to guide selection of an efficient modal basis in a nonlinear random response analysis are examined. One method is based only on proper orthogonal decomposition, while the other two additionally involve smooth orthogonal decomposition. Acoustic random response problems are employed to assess the performance of the three modal basis selection approaches. A thermally post-buckled beam exhibiting snap-through behavior, a shallowly curved arch in the auto-parametric response regime and a plate structure are used as numerical test articles. The results of a computationally taxing full-order analysis in physical degrees of freedom are taken as the benchmark for comparison with the results from the three reduced-order analyses. For the cases considered, all three methods are shown to produce modal bases resulting in accurate and computationally efficient reduced-order nonlinear simulations.

  6. Random Drift versus Selection in Academic Vocabulary: An Evolutionary Analysis of Published Keywords

    PubMed Central

    Bentley, R. Alexander

    2008-01-01

    The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena wherever the popularity of multiple items through time has been recorded, as with web searches, or sales of popular music and books, for example. PMID:18728786
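
    A minimal simulation of a random-copying null model of this kind (the innovation rate mu and all settings are illustrative):

```python
import numpy as np

def random_copying(n_agents=1000, n_steps=200, mu=0.01, seed=0):
    """Neutral random-copying model: in each step every author copies a
    keyword from a randomly chosen member of the previous generation,
    except that with probability mu a brand-new keyword is coined.
    Keyword frequencies then drift with no selection at all."""
    rng = np.random.default_rng(seed)
    keywords = np.zeros(n_agents, dtype=np.int64)
    next_id = 1
    for _ in range(n_steps):
        keywords = keywords[rng.integers(0, n_agents, n_agents)]  # copying
        novel = rng.random(n_agents) < mu                         # innovation
        n_new = int(novel.sum())
        keywords[novel] = np.arange(next_id, next_id + n_new)
        next_id += n_new
    return np.bincount(keywords)

freqs = np.sort(random_copying())[::-1]
print(freqs[:10])   # top keyword frequencies expected under pure drift
```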

  7. Topology-selective jamming of fully-connected, code-division random-access networks

    NASA Technical Reports Server (NTRS)

    Polydoros, Andreas; Cheng, Unjeng

    1990-01-01

    The purpose is to introduce certain models of topology-selective stochastic jamming and examine their impact on a class of fully-connected, spread-spectrum, slotted ALOHA-type random access networks. The theory covers dedicated as well as half-duplex units. The dominant role of the spatial duty factor is established, and connections with the dual concept of time-selective jamming are discussed. The optimal choices of coding rate and link access parameters (from the users' side) and the jamming spatial fraction are numerically established for DS and FH spreading.

  8. Gene selection using iterative feature elimination random forests for survival outcomes

    PubMed Central

    Pang, Herbert; George, Stephen L.; Hui, Ken; Tong, Tiejun

    2012-01-01

    Although many feature selection methods for classification have been developed, there is a need to identify genes in high-dimensional data with censored survival outcomes. Traditional methods for gene selection in classification problems have several drawbacks. First, the majority of the gene selection approaches for classification are single-gene based. Second, many of the gene selection procedures are not embedded within the algorithm itself. The technique of random forests has been found to perform well in high dimensional data settings with survival outcomes. It also has an embedded feature to identify variables of importance. Therefore, it is an ideal candidate for gene selection in high dimensional data with survival outcomes. In this paper, we develop a novel method based on the random forests to identify a set of prognostic genes. We compare our method with several machine learning methods and various node split criteria using several real data sets. Our method performed well in both simulations and real data analysis. Additionally, we have shown the advantages of our approach over single-gene based approaches. Our method incorporates multivariate correlations in microarray data for survival outcomes. The described method allows us to best utilize the information available from microarray data with survival outcomes. PMID:22547432

  9. Malaria life cycle intensifies both natural selection and random genetic drift.

    PubMed

    Chang, Hsiao-Han; Moss, Eli L; Park, Daniel J; Ndiaye, Daouda; Mboup, Souleymane; Volkman, Sarah K; Sabeti, Pardis C; Wirth, Dyann F; Neafsey, Daniel E; Hartl, Daniel L

    2013-12-10

    Analysis of genome sequences of 159 isolates of Plasmodium falciparum from Senegal yields an extraordinarily high proportion (26.85%) of protein-coding genes with the ratio of nonsynonymous to synonymous polymorphism greater than one. This proportion is much greater than observed in other organisms. Also unusual is that the site-frequency spectra of synonymous and nonsynonymous polymorphisms are virtually indistinguishable. We hypothesized that the complicated life cycle of malaria parasites might lead to qualitatively different population genetics from that predicted from the classical Wright-Fisher (WF) model, which assumes a single random-mating population with a finite and constant population size in an organism with nonoverlapping generations. This paper summarizes simulation studies of random genetic drift and selection in malaria parasites that take into account their unusual life history. Our results show that random genetic drift in the malaria life cycle is more pronounced than under the WF model. Paradoxically, the efficiency of purifying selection in the malaria life cycle is also greater than under WF, and the relative efficiency of positive selection varies according to conditions. Additionally, the site-frequency spectrum under neutrality is also more skewed toward low-frequency alleles than expected with WF. These results highlight the importance of considering the malaria life cycle when applying existing population genetic tools based on the WF model. The same caveat applies to other species with similarly complex life cycles.
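
    For reference, a minimal simulation of the classical Wright-Fisher baseline that the paper compares against (all parameters illustrative):

```python
import numpy as np

def wright_fisher(n=1000, p0=0.1, s=0.02, max_gen=5000, seed=0):
    """Classical Wright-Fisher model: one panmictic population of
    constant size n; each generation the allele frequency is tilted by
    selection coefficient s, then resampled binomially (drift). The
    paper's point is that the malaria life cycle amplifies both effects
    relative to this baseline."""
    rng = np.random.default_rng(seed)
    p, trajectory = p0, [p0]
    for _ in range(max_gen):
        w = p * (1 + s) / (p * (1 + s) + (1 - p))   # selection
        p = rng.binomial(n, w) / n                  # drift
        trajectory.append(p)
        if p in (0.0, 1.0):                         # loss or fixation
            break
    return np.array(trajectory)

traj = wright_fisher()
print(len(traj), traj[-1])   # generations simulated, final frequency
```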

  10. Fuzzy Random λ-Mean SAD Portfolio Selection Problem: An Ant Colony Optimization Approach

    NASA Astrophysics Data System (ADS)

    Thakur, Gour Sundar Mitra; Bhattacharyya, Rupak; Mitra, Swapan Kumar

    2010-10-01

    To reach an investment goal, one has to select a combination of securities from among different portfolios containing large numbers of securities. The past records of each security alone do not guarantee future returns. Since many uncertain factors directly or indirectly influence the stock market, and some newer stock markets do not have enough historical data, experts' expectations and experience must be combined with past records to generate an effective portfolio selection model. In this paper the return of a security is assumed to be a Fuzzy Random Variable Set (FRVS), where returns are sets of random numbers which are in turn fuzzy numbers. A new λ-Mean Semi Absolute Deviation (λ-MSAD) portfolio selection model is developed. The subjective opinions of investors about the rate of return of each security are taken into consideration by introducing a pessimistic-optimistic parameter vector λ. The λ-MSAD model is preferred because it uses the absolute deviation of the rate of return of a portfolio, instead of the variance, as the measure of risk. As this model can be reduced to a Linear Programming Problem (LPP), it can be solved much faster than quadratic programming problems. Ant Colony Optimization (ACO), a paradigm for designing meta-heuristic algorithms for combinatorial optimization problems, is used to solve the portfolio selection problem. Data from the BSE are used for illustration.
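
    A sketch of the crisp mean-semi-absolute-deviation core as a linear program via scipy; the fuzzy-random layer and the λ vector of the paper are omitted, and all names are illustrative:

```python
import numpy as np
from scipy.optimize import linprog

def msad_portfolio(returns, target):
    """Minimize average downside deviation subject to a minimum expected
    return, as a linear program. Decision variables: [x_1..x_m, d_1..d_T],
    where x are portfolio weights and d_t are deviation slack variables."""
    T, m = returns.shape
    mu = returns.mean(axis=0)
    c = np.concatenate([np.zeros(m), np.ones(T) / T])       # (1/T) sum d_t
    # d_t >= (mu - r_t) . x   <=>   (mu - r_t) . x - d_t <= 0
    A_ub = np.hstack([mu - returns, -np.eye(T)])
    b_ub = np.zeros(T)
    # expected return >= target   <=>   -mu . x <= -target
    A_ub = np.vstack([A_ub, np.concatenate([-mu, np.zeros(T)])])
    b_ub = np.append(b_ub, -target)
    A_eq = np.concatenate([np.ones(m), np.zeros(T)])[None, :]  # sum x = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (m + T))
    return res.x[:m]

rng = np.random.default_rng(0)
hist = rng.normal(0.001, 0.02, size=(250, 5))   # toy daily returns, 5 assets
goal = hist.mean(axis=0).max() * 0.8            # feasible return target
print(msad_portfolio(hist, goal).round(3))      # portfolio weights
```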

  11. Developmental contributions to macronutrient selection: a randomized controlled trial in adult survivors of malnutrition

    PubMed Central

    Campbell, Claudia P.; Raubenheimer, David; Badaloo, Asha V.; Gluckman, Peter D.; Martinez, Claudia; Gosby, Alison; Simpson, Stephen J.; Osmond, Clive; Boyne, Michael S.; Forrester, Terrence E.

    2016-01-01

    Background and objectives: Birthweight differences between kwashiorkor and marasmus suggest that intrauterine factors influence the development of these syndromes of malnutrition and may modulate risk of obesity through dietary intake. We tested the hypotheses that the target protein intake in adulthood is associated with birthweight, and that protein leveraging to maintain this target protein intake would influence energy intake (EI) and body weight in adult survivors of malnutrition. Methodology: Sixty-three adult survivors of marasmus and kwashiorkor could freely compose a diet from foods containing 10%, 15% and 25% of energy derived from protein (PEP) for 3 days (Phase 1). Participants were then randomized in Phase 2 (5 days) to diets with PEP fixed at 10%, 15% or 25%. Results: Self-selected PEP was similar in both groups. In the groups combined, the selected PEP was 14.7%, which differed significantly (P < 0.0001) from the null expectation (16.7%) of no selection. Self-selected PEP was inversely related to birthweight, the effect disappearing after adjusting for sex and current body weight. In Phase 2, PEP correlated inversely with EI (P = 0.002) and with weight change from Phase 1 to 2 (P = 0.002). Protein intake increased with increasing PEP, but to a lesser extent than energy intake increased with decreasing PEP. Conclusions and implications: Macronutrient intakes were not independently related to birthweight or diagnosis. In a free-choice situation (Phase 1), subjects selected a dietary PEP significantly lower than random. Lower-PEP diets induce increased energy and decreased protein intake, and are associated with weight gain. PMID:26817484

  12. Selection of informative metabolites using random forests based on model population analysis.

    PubMed

    Huang, Jian-Hua; Yan, Jun; Wu, Qing-Hua; Duarte Ferro, Miguel; Yi, Lun-Zhao; Lu, Hong-Mei; Xu, Qing-Song; Liang, Yi-Zeng

    2013-12-15

    One of the main goals of metabolomics studies is to discover informative metabolites or biomarkers, which may be used to diagnose diseases and to uncover pathology. Sophisticated feature selection approaches are required to extract the information hidden in such complex 'omics' data. In this study, we propose a new and robust selection method, combining random forests (RF) with model population analysis (MPA), for selecting informative metabolites from three metabolomic datasets. According to their contribution to the classification accuracy, the metabolites were classified into three kinds: informative, non-informative, and interfering metabolites. Based on the proposed method, informative metabolites were selected for the three datasets; further analyses of these metabolites between healthy and diseased groups were then performed, with t-tests showing that the P values for all selected metabolites were lower than 0.05. Moreover, the informative metabolites identified by the current method were demonstrated to be correlated with the clinical outcome under investigation. The source code of MPA-RF in Matlab can be freely downloaded from http://code.google.com/p/my-research-list/downloads/list.

  13. Estimating genetic architectures from artificial-selection responses: a random-effect framework.

    PubMed

    Le Rouzic, Arnaud; Skaug, Hans J; Hansen, Thomas F

    2010-03-01

    Artificial-selection experiments on plants and animals generate large datasets reporting phenotypic changes in the course of time. The dynamics of the changes reflect the underlying genetic architecture, but only simple statistical tools have so far been available to analyze such time series. This manuscript describes a general statistical framework based on random-effect models aimed at estimating key parameters of genetic architectures from artificial-selection responses. We derive explicit Mendelian models (in which the genetic architecture relies on one or two large-effect loci) and compare them with classical polygenic models. With simulations, we show that the models are accurate and powerful enough to provide useful estimates from realistic experimental designs, and we demonstrate that model selection is effective in distinguishing few-locus from polygenic genetic architectures even from medium-quality artificial-selection data. The method is illustrated by the analysis of a historical selection experiment on color pattern in rats, carried out by Castle et al.

  14. Identification of residues critical for metallo-β-lactamase function by codon randomization and selection

    PubMed Central

    Materon, Isabel C.; Palzkill, Timothy

    2001-01-01

    IMP-1 β-lactamase is a zinc metallo-enzyme encoded by the transferable blaIMP-1 gene, which confers resistance to virtually all β-lactam antibiotics including carbapenems. To understand how IMP-1 recognizes and hydrolyzes β-lactam antibiotics it is important to determine which amino acid residues are critical for catalysis and which residues control substrate specificity. We randomized 27 individual codons in the blaIMP-1 gene to create libraries that contain all possible amino acid substitutions at residue positions in and near the active site of IMP-1. Mutants from the random libraries were selected for the ability to confer ampicillin resistance to Escherichia coli. Of the positions randomized, >50% do not tolerate amino acid substitutions, suggesting they are essential for IMP-1 function. The remaining positions tolerate amino acid substitutions and may influence the substrate specificity of the enzyme. Interestingly, kinetic studies for one of the functional mutants, Asn233Ala, indicate that an alanine substitution at this position significantly increases catalytic efficiency as compared with the wild-type enzyme. PMID:11714924

  15. Feature selection for outcome prediction in oesophageal cancer using genetic algorithm and random forest classifier.

    PubMed

    Paul, Desbordes; Su, Ruan; Romain, Modzelewski; Sébastien, Vauclin; Pierre, Vera; Isabelle, Gardin

    2016-12-28

The outcome prediction of patients can greatly help to personalize cancer treatment. A large number of quantitative features (clinical exams, imaging, …) are potentially useful for assessing patient outcome. The challenge is to choose the most predictive subset of features. In this paper, we propose a new feature selection strategy called GARF (genetic algorithm based on random forest), applied to features extracted from positron emission tomography (PET) images and clinical data. The most relevant features, predictive of therapeutic response or prognostic of patient survival 3 years after the end of treatment, were selected using GARF on a cohort of 65 patients with locally advanced oesophageal cancer eligible for chemo-radiation therapy. The most relevant predictive results were obtained with a subset of 9 features, leading to a random forest misclassification rate of 18±4% and an area under the receiver operating characteristic (ROC) curve (AUC) of 0.823±0.032. The most relevant prognostic results were obtained with 8 features, leading to an error rate of 20±7% and an AUC of 0.750±0.108. Both predictive and prognostic results showed better performance with GARF than with the 4 other methods studied.
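
    The wrapper idea behind GARF can be sketched as follows (a deliberately simplified, mutation-only genetic algorithm in Python on synthetic data; the population size, mutation rate, and use of out-of-bag accuracy as fitness are assumptions, not the authors' implementation).

        # Toy GA-over-feature-masks wrapper scored by random-forest OOB accuracy.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(1)
        X = rng.normal(size=(65, 40))                 # 65 patients x 40 features (synthetic)
        y = rng.integers(0, 2, size=65)

        def fitness(mask):
            if mask.sum() == 0:
                return 0.0
            rf = RandomForestClassifier(n_estimators=200, oob_score=True,
                                        random_state=0)
            rf.fit(X[:, mask.astype(bool)], y)
            return rf.oob_score_                      # out-of-bag accuracy as fitness

        pop = rng.integers(0, 2, size=(20, X.shape[1]))   # 20 random feature masks
        for generation in range(10):
            scores = np.array([fitness(m) for m in pop])
            parents = pop[np.argsort(scores)[-10:]]       # keep the better half
            children = parents[rng.integers(0, 10, size=20)].copy()
            flips = rng.random(children.shape) < 0.05     # mutate ~5% of the bits
            children[flips] ^= 1
            pop = children

        best = pop[np.argmax([fitness(m) for m in pop])]
        print("selected features:", np.flatnonzero(best))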

  16. Distribution of Orientation Selectivity in Recurrent Networks of Spiking Neurons with Different Random Topologies

    PubMed Central

    Sadeh, Sadra; Rotter, Stefan

    2014-01-01

    Neurons in the primary visual cortex are more or less selective for the orientation of a light bar used for stimulation. A broad distribution of individual grades of orientation selectivity has in fact been reported in all species. A possible reason for emergence of broad distributions is the recurrent network within which the stimulus is being processed. Here we compute the distribution of orientation selectivity in randomly connected model networks that are equipped with different spatial patterns of connectivity. We show that, for a wide variety of connectivity patterns, a linear theory based on firing rates accurately approximates the outcome of direct numerical simulations of networks of spiking neurons. Distance dependent connectivity in networks with a more biologically realistic structure does not compromise our linear analysis, as long as the linearized dynamics, and hence the uniform asynchronous irregular activity state, remain stable. We conclude that linear mechanisms of stimulus processing are indeed responsible for the emergence of orientation selectivity and its distribution in recurrent networks with functionally heterogeneous synaptic connectivity. PMID:25469704

  17. The selection and design of control conditions for randomized controlled trials of psychological interventions.

    PubMed

    Mohr, David C; Spring, Bonnie; Freedland, Kenneth E; Beckner, Victoria; Arean, Patricia; Hollon, Steven D; Ockene, Judith; Kaplan, Robert

    2009-01-01

The randomized controlled trial (RCT) provides critical support for evidence-based practice using psychological interventions. The control condition is the principal method of removing the influence of unwanted variables in RCTs. There is little agreement or consistency in the design and construction of control conditions. Because control conditions have variable effects, the results of RCTs can depend as much on control condition selection as on the experimental intervention. The aim of this paper is to present a framework for the selection and design of control conditions for these trials. Threats to internal validity arising from modern RCT methodology are reviewed and reconsidered. The strengths and weaknesses of several categories of control conditions are examined, including those under experimental control, those under the control of clinical service providers, and no-treatment controls. Considerations in the selection of control conditions are discussed and several recommendations are proposed. The goal is to begin to define principles by which control conditions can be selected or developed in a manner that assists both investigators and grant reviewers.

  18. Direct selection of targeted adenovirus vectors by random peptide display on the fiber knob.

    PubMed

    Miura, Y; Yoshida, K; Nishimoto, T; Hatanaka, K; Ohnami, S; Asaka, M; Douglas, J T; Curiel, D T; Yoshida, T; Aoki, K

    2007-10-01

    Targeting of gene transfer at the level of cell entry is one of the most attractive challenges in vector development. However, attempts to redirect adenovirus vectors to alternative receptors by engineering the capsid-coding region have shown limited success because proper targeting ligand-receptor systems on the cells of interest are generally unknown. Systematic approaches to generate adenovirus vectors targeting any given cell type need to be developed to achieve this goal. Here, we constructed an adenovirus library that was generated by a Cre-lox-mediated in vitro recombination between an adenoviral fiber-modified plasmid library and genomic DNA to display random peptides on a fiber knob. As proof of concept, we screened the adenovirus display library on a glioma cell line and observed selection of several particular peptide sequences. The targeted vector carrying the most frequently isolated peptide significantly enhanced gene transduction in the glioma cell line but not in many other cell lines. Because the insertion of a pre-selected peptide into a fiber knob often fails to generate an adenovirus vector, the selection of targeting peptides is highly useful in the context of the adenoviral capsid. This vector-screening system can facilitate the development of a targeted adenovirus vector for a variety of applications in medicine.

  19. Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.

    PubMed

    Ma, Yunbei; Zhou, Xiao-Hua

    2017-02-01

For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and for the difference between the covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and for the difference between each pair of covariate-specific treatment effect curves over a fixed interval of biomarker values. We constructed the confidence bands based on a resampling technique. We also conducted simulation studies to evaluate the finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed methods in a real-world data set.

  20. Selection of Unique Escherichia coli Clones by Random Amplified Polymorphic DNA (RAPD): Evaluation by Whole Genome Sequencing

    PubMed Central

    Nielsen, Karen L.; Godfrey, Paul A.; Stegger, Marc; Andersen, Paal S.; Feldgarden, Michael; Frimodt-Møller, Niels

    2014-01-01

Identifying and characterizing clonal diversity is important when analysing fecal flora. We evaluated random amplified polymorphic DNA (RAPD) PCR, applied for the selection of Escherichia coli isolates, by whole genome sequencing. RAPD was fast and reproducible as a screening method for the selection of distinct E. coli clones in fecal swabs. PMID:24912108

  1. Implications of structural genomics target selection strategies: Pfam5000, whole genome, and random approaches

    SciTech Connect

    Chandonia, John-Marc; Brenner, Steven E.

    2004-07-14

The structural genomics project is an international effort to determine the three-dimensional shapes of all important biological macromolecules, with a primary focus on proteins. Target proteins should be selected according to a strategy that is medically and biologically relevant, of good value, and tractable. As an option to consider, we present the Pfam5000 strategy, which involves selecting the 5000 most important families from the Pfam database as sources for targets. We compare the Pfam5000 strategy to several other proposed strategies that would require similar numbers of targets. These include complete solution of several small to moderately sized bacterial proteomes, partial coverage of the human proteome, and random selection of approximately 5000 targets from sequenced genomes. We measure the impact that successful implementation of these strategies would have upon structural interpretation of the proteins in Swiss-Prot, TrEMBL, and 131 complete proteomes (including 10 eukaryotes) from the Proteome Analysis database at EBI. Solving the structures of proteins from the 5000 largest Pfam families would allow accurate fold assignment for approximately 68 percent of all prokaryotic proteins (covering 59 percent of residues) and 61 percent of eukaryotic proteins (40 percent of residues). More fine-grained coverage that would allow accurate modeling of these proteins would require an order of magnitude more targets. The Pfam5000 strategy may be modified in several ways, for example to focus on larger families, bacterial sequences, or eukaryotic sequences; as long as secondary consideration is given to large families within Pfam, coverage results vary only slightly. In contrast, focusing structural genomics on a single tractable genome would have only a limited impact on structural knowledge of other proteomes: a significant fraction (about 30-40 percent of the proteins, and 40-60 percent of the residues) of each proteome is classified in small families.

  2. Selecting instruments for Mendelian randomization in the wake of genome-wide association studies

    PubMed Central

    Swerdlow, Daniel I; Kuchenbaecker, Karoline B; Shah, Sonia; Sofat, Reecha; Holmes, Michael V; White, Jon; Mindell, Jennifer S; Kivimaki, Mika; Brunner, Eric J; Whittaker, John C; Casas, Juan P; Hingorani, Aroon D

    2016-01-01

    Mendelian randomization (MR) studies typically assess the pathogenic relevance of environmental exposures or disease biomarkers, using genetic variants that instrument these exposures. The approach is gaining popularity—our systematic review reveals a greater than 10-fold increase in MR studies published between 2004 and 2015. When the MR paradigm was first proposed, few biomarker- or exposure-related genetic variants were known, most having been identified by candidate gene studies. However, genome-wide association studies (GWAS) are now providing a rich source of potential instruments for MR analysis. Many early reviews covering the concept, applications and analytical aspects of the MR technique preceded the surge in GWAS, and thus the question of how best to select instruments for MR studies from the now extensive pool of available variants has received insufficient attention. Here we focus on the most common category of MR studies—those concerning disease biomarkers. We consider how the selection of instruments for MR analysis from GWAS requires consideration of: the assumptions underlying the MR approach; the biology of the biomarker; the genome-wide distribution, frequency and effect size of biomarker-associated variants (the genetic architecture); and the specificity of the genetic associations. Based on this, we develop guidance that may help investigators to plan and readers interpret MR studies. PMID:27342221

  3. Construction of random tumor transcriptome expression library for creating and selecting novel tumor antigens.

    PubMed

    Zhao, Huizhun; Zhao, Xiuyun; Du, Peng; Qi, Gaofu

    2016-09-01

Novel tumor antigens are necessary for the development of efficient tumor vaccines to overcome the immunotolerance and immunosuppression induced by tumors. Here, we developed a novel strategy to create tumor antigens through the construction of a random tumor transcriptome expression library (RTTEL). The complementary DNA (cDNA) from S180 sarcoma was used as a template for arbitrarily amplifying gene fragments with random primers by PCR; the fragments were then ligated to the C-terminus of HSP65 in a plasmid pET28a-HSP to construct the RTTEL in Escherichia coli. A novel antigen, A5, was selected from the RTTEL with the strongest immunotherapeutic effects on S180 sarcoma. Adoptive immunotherapy with anti-A5 sera also inhibited tumor growth, further confirming the key antitumor role of A5-specific antibodies in mice. A5 contains a sequence similar to protein-L-isoaspartate (D-aspartate) O-methyltransferase (PCMT1). The antisera of A5 were verified to cross-react with PCMT1 by Western blotting assay and vice versa. Both anti-A5 sera and anti-PCMT1 sera could induce antibody-dependent cell-mediated cytotoxicity and complement-dependent cytotoxicity toward S180 cells in vitro. Further assay with fluorescent staining showed that PCMT1 is detectable on the surface of S180 cells. In summary, the strategy of constructing an RTTEL has potential for creating and screening novel tumor antigens to develop efficient tumor vaccines. Using the RTTEL, we successfully created a protein antigen, A5, with significant immunotherapeutic effects on S180 sarcoma through the induction of antibodies targeting PCMT1.

  4. Application of random coherence order selection in gradient-enhanced multidimensional NMR

    NASA Astrophysics Data System (ADS)

    Bostock, Mark J.; Nietlispach, Daniel

    2016-03-01

Development of multidimensional NMR is essential to many applications, for example in high resolution structural studies of biomolecules. Multidimensional techniques enable separation of NMR signals over several dimensions, improving signal resolution, whilst also allowing identification of new connectivities. However, these advantages come at a significant cost. The Fourier transform theorem requires acquisition of a grid of regularly spaced points to satisfy the Nyquist criterion, while frequency discrimination and acquisition of a pure phase spectrum require acquisition of both quadrature components for each time point in every indirect (non-acquisition) dimension, adding a factor of 2^(N-1) to the number of free-induction decays which must be acquired, where N is the number of dimensions. Compressed sensing (CS) ℓ1-norm minimisation in combination with non-uniform sampling (NUS) has been shown to be extremely successful in overcoming the Nyquist criterion. Previously, maximum entropy reconstruction has also been used to overcome the limitation of frequency discrimination, processing data acquired with only one quadrature component at a given time interval, known as random phase detection (RPD), allowing a factor of two reduction in the number of points for each indirect dimension (Maciejewski et al. 2011 PNAS 108 16640). However, whilst this approach can be easily applied in situations where the quadrature components are acquired as amplitude modulated data, the same principle is not easily extended to phase modulated (P-/N-type) experiments where data is acquired in the form exp(iωt) or exp(-iωt), and which make up many of the multidimensional experiments used in modern NMR. Here we demonstrate a modification of the CS ℓ1-norm approach to allow random coherence order selection (RCS) for phase modulated experiments; we generalise the nomenclature for RCS and RPD as random quadrature detection (RQD). With this method, the power of RQD can be extended
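
    The reconstruction step that CS makes possible can be illustrated generically (a toy ℓ1-minimisation by iterative soft thresholding on a synthetic sparse signal; this is textbook compressed sensing, not the authors' RQD processing code).

        # Toy CS l1-norm reconstruction by ISTA: recover a sparse "spectrum"
        # from undersampled measurements (generic illustration, not RQD itself).
        import numpy as np

        rng = np.random.default_rng(0)
        n, m = 128, 40
        x_true = np.zeros(n)
        x_true[[5, 37, 90]] = [1.0, -0.7, 0.5]          # three sparse peaks
        A = rng.normal(size=(m, n)) / np.sqrt(m)        # random sampling operator
        b = A @ x_true                                  # undersampled data

        x, lam, eta = np.zeros(n), 0.01, 0.1
        for _ in range(1000):                           # ISTA iterations
            g = x - eta * A.T @ (A @ x - b)             # gradient step
            x = np.sign(g) * np.maximum(np.abs(g) - eta * lam, 0.0)  # soft threshold
        print("recovered peak positions:", np.flatnonzero(np.abs(x) > 0.1))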

  5. Equivalence between Step Selection Functions and Biased Correlated Random Walks for Statistical Inference on Animal Movement

    PubMed Central

    Duchesne, Thierry; Fortin, Daniel; Rivest, Louis-Paul

    2015-01-01

    Animal movement has a fundamental impact on population and community structure and dynamics. Biased correlated random walks (BCRW) and step selection functions (SSF) are commonly used to study movements. Because no studies have contrasted the parameters and the statistical properties of their estimators for models constructed under these two Lagrangian approaches, it remains unclear whether or not they allow for similar inference. First, we used the Weak Law of Large Numbers to demonstrate that the log-likelihood function for estimating the parameters of BCRW models can be approximated by the log-likelihood of SSFs. Second, we illustrated the link between the two approaches by fitting BCRW with maximum likelihood and with SSF to simulated movement data in virtual environments and to the trajectory of bison (Bison bison L.) trails in natural landscapes. Using simulated and empirical data, we found that the parameters of a BCRW estimated directly from maximum likelihood and by fitting an SSF were remarkably similar. Movement analysis is increasingly used as a tool for understanding the influence of landscape properties on animal distribution. In the rapidly developing field of movement ecology, management and conservation biologists must decide which method they should implement to accurately assess the determinants of animal movement. We showed that BCRW and SSF can provide similar insights into the environmental features influencing animal movements. Both techniques have advantages. BCRW has already been extended to allow for multi-state modeling. Unlike BCRW, however, SSF can be estimated using most statistical packages, it can simultaneously evaluate habitat selection and movement biases, and can easily integrate a large number of movement taxes at multiple scales. SSF thus offers a simple, yet effective, statistical technique to identify movement taxis. PMID:25898019

  6. Equivalence between Step Selection Functions and Biased Correlated Random Walks for Statistical Inference on Animal Movement.

    PubMed

    Duchesne, Thierry; Fortin, Daniel; Rivest, Louis-Paul

    2015-01-01

    Animal movement has a fundamental impact on population and community structure and dynamics. Biased correlated random walks (BCRW) and step selection functions (SSF) are commonly used to study movements. Because no studies have contrasted the parameters and the statistical properties of their estimators for models constructed under these two Lagrangian approaches, it remains unclear whether or not they allow for similar inference. First, we used the Weak Law of Large Numbers to demonstrate that the log-likelihood function for estimating the parameters of BCRW models can be approximated by the log-likelihood of SSFs. Second, we illustrated the link between the two approaches by fitting BCRW with maximum likelihood and with SSF to simulated movement data in virtual environments and to the trajectory of bison (Bison bison L.) trails in natural landscapes. Using simulated and empirical data, we found that the parameters of a BCRW estimated directly from maximum likelihood and by fitting an SSF were remarkably similar. Movement analysis is increasingly used as a tool for understanding the influence of landscape properties on animal distribution. In the rapidly developing field of movement ecology, management and conservation biologists must decide which method they should implement to accurately assess the determinants of animal movement. We showed that BCRW and SSF can provide similar insights into the environmental features influencing animal movements. Both techniques have advantages. BCRW has already been extended to allow for multi-state modeling. Unlike BCRW, however, SSF can be estimated using most statistical packages, it can simultaneously evaluate habitat selection and movement biases, and can easily integrate a large number of movement taxes at multiple scales. SSF thus offers a simple, yet effective, statistical technique to identify movement taxis.

  7. Computerized stratified random site-selection approaches for design of a ground-water-quality sampling network

    USGS Publications Warehouse

    Scott, J.C.

    1990-01-01

    Computer software was written to randomly select sites for a ground-water-quality sampling network. The software uses digital cartographic techniques and subroutines from a proprietary geographic information system. The report presents the approaches, computer software, and sample applications. It is often desirable to collect ground-water-quality samples from various areas in a study region that have different values of a spatial characteristic, such as land-use or hydrogeologic setting. A stratified network can be used for testing hypotheses about relations between spatial characteristics and water quality, or for calculating statistical descriptions of water-quality data that account for variations that correspond to the spatial characteristic. In the software described, a study region is subdivided into areal subsets that have a common spatial characteristic to stratify the population into several categories from which sampling sites are selected. Different numbers of sites may be selected from each category of areal subsets. A population of potential sampling sites may be defined by either specifying a fixed population of existing sites, or by preparing an equally spaced population of potential sites. In either case, each site is identified with a single category, depending on the value of the spatial characteristic of the areal subset in which the site is located. Sites are selected from one category at a time. One of two approaches may be used to select sites. Sites may be selected randomly, or the areal subsets in the category can be grouped into cells and sites selected randomly from each cell.
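
    The core selection step is straightforward to sketch (illustrative Python with invented category names and quotas; the report's actual software is GIS-based and reads real site data).

        # Stratified random site selection: draw a fixed number of sites
        # at random from each category of areal subsets.
        import random

        random.seed(42)
        sites = [{"id": i, "landuse": random.choice(["urban", "crop", "forest"])}
                 for i in range(500)]                     # synthetic candidate sites
        quota = {"urban": 10, "crop": 15, "forest": 5}    # sites wanted per category

        network = []
        for category, n in quota.items():
            pool = [s for s in sites if s["landuse"] == category]
            network.extend(random.sample(pool, n))        # simple random draw per stratum
        print(len(network), "sites selected")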

  8. The Kilkenny Health Project: food and nutrient intakes in randomly selected healthy adults.

    PubMed

    Gibney, M J; Moloney, M; Shelley, E

    1989-03-01

    1. Sixty healthy subjects aged 35-44 years (thirty men and thirty women) were randomly selected from electoral registers to participate in a dietary survey using the 7 d weighed-intake method during June-August 1985. 2. Energy intake (MJ/d) was 12.5 for men and 8.4 for women. Fat contributed 36.0 and 39.1% of the total energy intake of men and women respectively. When this was adjusted to exclude energy derived from alcoholic beverages, the corresponding values were 38.8 and 39.7% respectively. The major sources of dietary fat (%) were spreadable fats (28), meat (23), milk (12) and biscuits and cakes (11). 3. The subjects were divided into low- and high-fat groups both on the relative intake of fat (less than 35% or greater than 40% dietary energy from fat) and on the absolute intake of fat (greater than or less than 120 g fat/d). By either criterion, high-fat consumers had lower than average intakes of low-fat, high-carbohydrate foods such as potatoes, bread, fruit and table sugar, and higher intakes of milk, butter and confectionery products. Meat intake was higher among high-fat eaters only when a high-fat diet was defined as a percentage of energy.

  9. Issues Relating to Selective Reporting When Including Non-Randomized Studies in Systematic Reviews on the Effects of Healthcare Interventions

    ERIC Educational Resources Information Center

    Norris, Susan L.; Moher, David; Reeves, Barnaby C.; Shea, Beverley; Loke, Yoon; Garner, Sarah; Anderson, Laurie; Tugwell, Peter; Wells, George

    2013-01-01

    Background: Selective outcome and analysis reporting (SOR and SAR) occur when only a subset of outcomes measured and analyzed in a study is fully reported, and are an important source of potential bias. Key methodological issues: We describe what is known about the prevalence and effects of SOR and SAR in both randomized controlled trials (RCTs)…

  10. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti

    PubMed Central

    2013-01-01

    Background A remote sensing technique was developed which combines a Geographic Information System (GIS); Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground based surveys to collect household location data and can be performed in a rapid, cost-effective manner. Methods The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. Results A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. Conclusions The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only rarely was local knowledge

  11. Generation of Aptamers from A Primer-Free Randomized ssDNA Library Using Magnetic-Assisted Rapid Aptamer Selection.

    PubMed

    Tsao, Shih-Ming; Lai, Ji-Ching; Horng, Horng-Er; Liu, Tu-Chen; Hong, Chin-Yih

    2017-04-03

Aptamers are oligonucleotides that can bind to specific target molecules. Most aptamers are generated using random libraries in the standard systematic evolution of ligands by exponential enrichment (SELEX). Each random library contains oligonucleotides with a randomized central region and two fixed primer regions at both ends. The fixed primer regions are necessary for amplifying target-bound sequences by PCR. However, these extra sequences may cause non-specific binding, which potentially interferes with proper binding of the random sequences. The Magnetic-Assisted Rapid Aptamer Selection (MARAS) is a newly developed protocol for generating single-stranded DNA aptamers. No repeated selection cycle is required in the protocol. This study proposes and demonstrates a method to isolate aptamers for C-reactive protein (CRP) from a randomized ssDNA library containing no fixed sequences at the 5' and 3' termini using the MARAS platform. Furthermore, the isolated primer-free aptamer was sequenced, and its binding affinity for CRP was analyzed. The specificity of the obtained aptamer was validated using blind serum samples. The result was consistent with monoclonal antibody-based nephelometry analysis, which indicated that a primer-free aptamer has high specificity toward its target. MARAS is a feasible platform for efficiently generating primer-free aptamers for clinical diagnoses.

  12. Generation of Aptamers from A Primer-Free Randomized ssDNA Library Using Magnetic-Assisted Rapid Aptamer Selection

    PubMed Central

    Tsao, Shih-Ming; Lai, Ji-Ching; Horng, Horng-Er; Liu, Tu-Chen; Hong, Chin-Yih

    2017-01-01

Aptamers are oligonucleotides that can bind to specific target molecules. Most aptamers are generated using random libraries in the standard systematic evolution of ligands by exponential enrichment (SELEX). Each random library contains oligonucleotides with a randomized central region and two fixed primer regions at both ends. The fixed primer regions are necessary for amplifying target-bound sequences by PCR. However, these extra sequences may cause non-specific binding, which potentially interferes with proper binding of the random sequences. The Magnetic-Assisted Rapid Aptamer Selection (MARAS) is a newly developed protocol for generating single-stranded DNA aptamers. No repeated selection cycle is required in the protocol. This study proposes and demonstrates a method to isolate aptamers for C-reactive protein (CRP) from a randomized ssDNA library containing no fixed sequences at the 5′ and 3′ termini using the MARAS platform. Furthermore, the isolated primer-free aptamer was sequenced, and its binding affinity for CRP was analyzed. The specificity of the obtained aptamer was validated using blind serum samples. The result was consistent with monoclonal antibody-based nephelometry analysis, which indicated that a primer-free aptamer has high specificity toward its target. MARAS is a feasible platform for efficiently generating primer-free aptamers for clinical diagnoses. PMID:28367958

  13. Automatised selection of load paths to construct reduced-order models in computational damage micromechanics: from dissipation-driven random selection to Bayesian optimization

    NASA Astrophysics Data System (ADS)

    Goury, Olivier; Amsallem, David; Bordas, Stéphane Pierre Alain; Liu, Wing Kam; Kerfriden, Pierre

    2016-08-01

In this paper, we present new reliable model order reduction strategies for computational micromechanics. The difficulties lie mainly in the high dimensionality of the parameter space, represented by any load path applied onto the representative volume element. We take special care of the challenge of selecting an exhaustive snapshot set. This is treated first by random sampling of energy-dissipating load paths, and then, in a more advanced way, by Bayesian optimization associated with an interlocked division of the parameter space. Results show that we can ensure the selection of an exhaustive snapshot set from which a reliable reduced-order model can be built.

  14. The basic science and mathematics of random mutation and natural selection.

    PubMed

    Kleinman, Alan

    2014-12-20

The mutation and natural selection phenomenon can and often does cause the failure of antimicrobial, herbicidal, pesticide, and cancer treatments, which act as selection pressures. This phenomenon operates in a mathematically predictable way which, when understood, leads to approaches to reduce and prevent the failure of these selection pressures. The mathematical behavior of mutation and selection is derived using the principles of probability theory. The derivation of the equations describing the mutation and selection phenomenon is carried out in the context of an empirical example.
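
    A worked instance of the kind of calculation the paper builds on (illustrative only, not quoted from the paper):

        % Illustrative probability calculation. If each replication produces a
        % particular resistant mutation with probability $\mu$, the chance that
        % at least one of $n$ replications produces it is
        \[ P(\text{at least one}) = 1 - (1 - \mu)^{n} . \]
        % If defeating a treatment requires two independent mutations
        % simultaneously, $\mu$ is effectively replaced by $\mu^{2}$, which is
        % the standard argument for why combination therapies fail far less
        % often than single-agent treatments.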

  15. Introduction of Mismatches in a Random shRNA-Encoding Library Improves Potency for Phenotypic Selection

    PubMed Central

    Wang, Yongping; Speier, Jacqueline S.; Engram-Pearl, Jessica; Wilson, Robert B.

    2014-01-01

    RNA interference (RNAi) is a mechanism for interfering with gene expression through the action of small, non-coding RNAs. We previously constructed a short-hairpin-loop RNA (shRNA) encoding library that is random at the nucleotide level [1]. In this library, the stems of the hairpin are completely complementary. To improve the potency of initial hits, and therefore signal-to-noise ratios in library screening, as well as to simplify hit-sequence retrieval by PCR, we constructed a second-generation library in which we introduced random mismatches between the two halves of the stem of each hairpin, on a random template background. In a screen for shRNAs that protect an interleukin-3 (IL3) dependent cell line from IL3 withdrawal, our second-generation library yielded hit sequences with significantly higher potencies than those from the first-generation library in the same screen. Our method of random mutagenesis was effective for a random template and is likely suitable, therefore, for any DNA template of interest. The improved potency of our second-generation library expands the range of possible unbiased screens for small-RNA therapeutics and biologic tools. PMID:24498319

  16. The Implications of Teacher Selection and Teacher Effects in Individually Randomized Group Treatment Trials

    ERIC Educational Resources Information Center

    Weiss, Michael J.

    2010-01-01

    Randomized experiments have become an increasingly popular design to evaluate the effectiveness of interventions in education (Spybrook, 2008). Many of the interventions evaluated in education are delivered to groups of students, rather than to individuals. Experiments designed to evaluate programs delivered at the group level often…

  17. The Jackprot Simulation Couples Mutation Rate with Natural Selection to Illustrate How Protein Evolution Is Not Random

    PubMed Central

    Espinosa, Avelina; Bai, Chunyan Y.

    2016-01-01

    Protein evolution is not a random process. Views which attribute randomness to molecular change, deleterious nature to single-gene mutations, insufficient geological time, or population size for molecular improvements to occur, or invoke “design creationism” to account for complexity in molecular structures and biological processes, are unfounded. Scientific evidence suggests that natural selection tinkers with molecular improvements by retaining adaptive peptide sequence. We used slot-machine probabilities and ion channels to show biological directionality on molecular change. Because ion channels reside in the lipid bilayer of cell membranes, their residue location must be in balance with the membrane's hydrophobic/philic nature; a selective “pore” for ion passage is located within the hydrophobic region. We contrasted the random generation of DNA sequence for KcsA, a bacterial two-transmembrane-domain (2TM) potassium channel, from Streptomyces lividans, with an under-selection scenario, the “jackprot,” which predicted much faster evolution than by chance. We wrote a computer program in JAVA APPLET version 1.0 and designed an online interface, The Jackprot Simulation http://faculty.rwu.edu/cbai/JackprotSimulation.htm, to model a numerical interaction between mutation rate and natural selection during a scenario of polypeptide evolution. Winning the “jackprot,” or highest-fitness complete-peptide sequence, required cumulative smaller “wins” (rewarded by selection) at the first, second, and third positions in each of the 161 KcsA codons (“jackdons” that led to “jackacids” that led to the “jackprot”). The “jackprot” is a didactic tool to demonstrate how mutation rate coupled with natural selection suffices to explain the evolution of specialized proteins, such as the complex six-transmembrane (6TM) domain potassium, sodium, or calcium channels. Ancestral DNA sequences coding for 2TM-like proteins underwent nucleotide

  18. The Jackprot Simulation Couples Mutation Rate with Natural Selection to Illustrate How Protein Evolution Is Not Random.

    PubMed

    Paz-Y-Miño C, Guillermo; Espinosa, Avelina; Bai, Chunyan Y

    2011-09-01

    Protein evolution is not a random process. Views which attribute randomness to molecular change, deleterious nature to single-gene mutations, insufficient geological time, or population size for molecular improvements to occur, or invoke "design creationism" to account for complexity in molecular structures and biological processes, are unfounded. Scientific evidence suggests that natural selection tinkers with molecular improvements by retaining adaptive peptide sequence. We used slot-machine probabilities and ion channels to show biological directionality on molecular change. Because ion channels reside in the lipid bilayer of cell membranes, their residue location must be in balance with the membrane's hydrophobic/philic nature; a selective "pore" for ion passage is located within the hydrophobic region. We contrasted the random generation of DNA sequence for KcsA, a bacterial two-transmembrane-domain (2TM) potassium channel, from Streptomyces lividans, with an under-selection scenario, the "jackprot," which predicted much faster evolution than by chance. We wrote a computer program in JAVA APPLET version 1.0 and designed an online interface, The Jackprot Simulation http://faculty.rwu.edu/cbai/JackprotSimulation.htm, to model a numerical interaction between mutation rate and natural selection during a scenario of polypeptide evolution. Winning the "jackprot," or highest-fitness complete-peptide sequence, required cumulative smaller "wins" (rewarded by selection) at the first, second, and third positions in each of the 161 KcsA codons ("jackdons" that led to "jackacids" that led to the "jackprot"). The "jackprot" is a didactic tool to demonstrate how mutation rate coupled with natural selection suffices to explain the evolution of specialized proteins, such as the complex six-transmembrane (6TM) domain potassium, sodium, or calcium channels. Ancestral DNA sequences coding for 2TM-like proteins underwent nucleotide "edition" and gene duplications to generate the 6

  19. The Effect of Basis Selection on Static and Random Acoustic Response Prediction Using a Nonlinear Modal Simulation

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Przekop, Adam

    2005-01-01

    An investigation of the effect of basis selection on geometric nonlinear response prediction using a reduced-order nonlinear modal simulation is presented. The accuracy is dictated by the selection of the basis used to determine the nonlinear modal stiffness. This study considers a suite of available bases including bending modes only, bending and membrane modes, coupled bending and companion modes, and uncoupled bending and companion modes. The nonlinear modal simulation presented is broadly applicable and is demonstrated for nonlinear quasi-static and random acoustic response of flat beam and plate structures with isotropic material properties. Reduced-order analysis predictions are compared with those made using a numerical simulation in physical degrees-of-freedom to quantify the error associated with the selected modal bases. Bending and membrane responses are separately presented to help differentiate the bases.

  20. Selecting Random Latin Hypercube Dimensions and Designs through Estimation of Maximum Absolute Pairwise Correlation

    DTIC Science & Technology

    2012-12-01

values are assigned at random to the n design points, with all n! possible permutations being equally likely. This generates the X_j column in the ... design matrix. The permutation process is performed independently for each of the k input variables. Therefore, for each column X_j, all of the n ... lattice RLH corresponds to independently generating k permutations of the first n natural numbers and appropriately scaling the columns to cover the
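
    Despite the truncation, the construction is clear enough to state in full (a minimal Python sketch of one common variant that places points at cell centers).

        # Random Latin hypercube in k dimensions: k independent random
        # permutations of the n levels, scaled into the unit interval.
        import numpy as np

        def random_latin_hypercube(n, k, rng):
            cols = [rng.permutation(n) for _ in range(k)]   # one permutation per input
            return (np.stack(cols, axis=1) + 0.5) / n       # scale cell centers to (0, 1)

        rng = np.random.default_rng(0)
        X = random_latin_hypercube(n=10, k=3, rng=rng)
        print(X)   # each column contains every level exactly once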

  1. Single-primer-limited amplification: a method to generate random single-stranded DNA sub-library for aptamer selection.

    PubMed

    He, Chao-Zhu; Zhang, Kun-He; Wang, Ting; Wan, Qin-Si; Hu, Piao-Ping; Hu, Mei-Di; Huang, De-Qiang; Lv, Nong-Hua

    2013-09-01

The amplification of a random single-stranded DNA (ssDNA) library by polymerase chain reaction (PCR) is a key step in each round of aptamer selection by systematic evolution of ligands by exponential enrichment (SELEX), but it can be impeded by the amplification of by-products due to severely nonspecific hybridization among the various sequences in the PCR system. To amplify a random ssDNA library free from by-products, we developed a novel method termed single-primer-limited amplification (SPLA), which begins with amplification of the minus-stranded DNA (msDNA) of the ssDNA library with the reverse primer limited to a 5-fold molar quantity of the template, followed by amplification of the plus-stranded DNA (psDNA) from the msDNA with the forward primer limited to a 10-fold molar quantity of the template, and recovery of the psDNA by gel excision. We found that the amount of by-products increased with increasing template amount and thermal cycle number. With the optimized template amount and thermal cycling, SPLA could amplify the target ssDNA without detectable by-products or nonspecific products and could produce 16.1 times as much psDNA as asymmetric PCR. In conclusion, SPLA is a simple and feasible method to efficiently generate a random ssDNA sub-library for aptamer selection.

  2. Robust estimates of divergence times and selection with a poisson random field model: a case study of comparative phylogeographic data.

    PubMed

    Amei, Amei; Smith, Brian Tilston

    2014-01-01

Mutation frequencies can be modeled as a Poisson random field (PRF) to estimate speciation times and the degree of selection on newly arisen mutations. This approach provides a quantitative theory for comparing intraspecific polymorphism with interspecific divergence in the presence of selection and can be used to estimate population genetic parameters. Although the original PRF model has been extended to more general biological settings to make statistical inference about selection and divergence among model organisms, it has not been incorporated into phylogeographic studies that focus on estimating population genetic parameters for nonmodel organisms. Here, we modified a recently developed time-dependent PRF model to independently estimate genetic parameters from a nuclear and mitochondrial DNA data set of 22 sister pairs of birds that have diverged across a biogeographic barrier. We found that species that inhabit humid habitats had more recent divergence times and larger effective population sizes than those that inhabit drier habitats, and divergence times estimated from the PRF model were similar to estimates from a coalescent species-tree approach. Selection coefficients were higher in sister pairs that inhabited drier habitats than in those in humid habitats, but overall the mitochondrial DNA was under weak selection. Our study indicates that PRF models are useful for estimating various population genetic parameters and serve as a framework for incorporating estimates of selection into comparative phylogeographic studies.

  3. Identification of polypeptides with selective affinity to intact mouse cerebellar granule neurons from a random peptide-presenting phage library.

    PubMed

    Hou, Sheng T; Dove, Mike; Anderson, Erica; Zhang, Jiangbing; MacKenzie, C Roger

    2004-09-30

    Targeting of postmitotic neurons selectively for gene delivery poses a challenge. One way to achieve such a selective targeting is to link the gene delivery vector with small ligand-binding polypeptides which have selective affinity to intact neurons. In order to identify such novel neuron selective polypeptides, we screened a phage-display library displaying random 12-mer polypeptides and subtractively bio-panned for clones having selectivity towards cultured mouse cerebellar granule neurons. The selected phage clones were amplified and sequenced. Affinities of these clones to neurons were determined by the visible presence or absence of fluorescence of phage particles as detected by immunocytochemistry using an antibody to M-13 phage. This affinity was further qualified by how much phage was bound, and where in or on the cell it tended to accumulate. The selectivity of binding to neurons was determined by the negative binding of these clones to several cultured non-neuronal cells, including, primary glial cells, NT2 cells, human embryonic kidney 293 cells, neuroblastoma cells, and mouse 3T3 cells. Among the 46 clones that we have sequenced and characterized, four clones appeared to have excellent selectivity in binding to neurons. Homology comparison of these polypeptides revealed that three of them contained a consensus D(E)-W(F)-I(N)-D-W motif. This motif was also present in the Bdm1 gene product which was predominantly expressed in postnatal brains. Further characterizations of these polypeptides are required to reveal the utilities of these peptides to function as an effective linker to facilitate gene transfer selectively to neurons.

  4. Natural Selection VS. Random Drift: Evidence from Temporal Variation in Allele Frequencies in Nature

    PubMed Central

    Mueller, Laurence D.; Barr, Lorraine G.; Ayala, Francisco J.

    1985-01-01

    We have obtained monthly samples of two species, Drosophila pseudoobscura and Drosophila persimilis, in a natural population from Napa County, California. In each species, about 300 genes have been assayed by electrophoresis for each of seven enzyme loci in each monthly sample from March 1972 to June 1975. Using statistical methods developed for the purpose, we have examined whether the allele frequencies at different loci vary in a correlated fashion. The methods used do not detect natural selection when it is deterministic (e.g., overdominance or directional selection), but only when alleles at different loci vary simultaneously in response to the same environmental variations. Moreover, only relatively large fitness differences (of the order of 15%) are detectable. We have found strong evidence of correlated allele frequency variation in 13–20% of the cases examined. We interpret this as evidence that natural selection plays a major role in the evolution of protein polymorphisms in nature. PMID:4054608

  5. Yearling trait comparisons among inbred lines and selected noninbred and randomly bred control groups of Rambouillet, Targhee and Columbia ewes.

    PubMed

    Ercanbrack, S K; Knight, A D

    1983-02-01

    Inbreeding with concurrent selection was used to develop 26 Rambouillet, 20 Targhee and 10 Columbia inbred lines of sheep. Inbreeding coefficients averaged 30, 29 and 30% for the three breeds, respectively, at the conclusion of the study. A selected noninbred control group and a randomly bred unselected control group were maintained for each breed. Yearling traits were evaluated for 545 Rambouillet, 572 Targhee and 411 Columbia yearling ewes, each belonging to one of the inbred lines or control groups. In each breed, the selected controls were generally of greatest overall merit, the unselected controls intermediate and the inbred lines of least merit. Only a few yearling traits of only a few inbred lines were superior (P less than .05) to those of their appropriate selected control groups. Selection within inbred lines was generally ineffective in offsetting inbreeding depression. However, single trait selection for traits of high heritability, notably yearling weight, clean fleece weight and staple length, appeared to compensate for inbreeding effects on those traits. Deleterious consequences of inbreeding were particularly apparent in yearling weight, average daily gain, type and condition scores, grease and clean fleece weights and index of overall merit. Inbreeding also resulted in fewer neck folds among inbreds of all three breeds. Correlations between the rankings of inbred lines at weaning and yearling ages were high for traits of higher heritability. Superiority of the selected controls in most traits was of about the same magnitude at weaning and yearling ages. In no case did the final overall merit (index value) of an inbred line of any of the three breeds significantly exceed the overall merit of its respective selected control group.

  6. Pattern selection and self-organization induced by random boundary initial values in a neuronal network

    NASA Astrophysics Data System (ADS)

    Ma, Jun; Xu, Ying; Wang, Chunni; Jin, Wuyin

    2016-11-01

Regular spatial patterns can be observed in spatiotemporal systems far from equilibrium. Artificial networks with different topologies are often designed to reproduce the collective behaviors of nodes (or neurons), with the local kinetics of each node described by some type of oscillator model. It is believed that the self-organization of a network depends strongly on the bifurcation parameters and the connection topology. Indeed, the boundary effect is very important for pattern formation in a network. In this paper, a regular network of Hindmarsh-Rose neurons is designed in a two-dimensional square array with nearest-neighbor connections. The neurons on the boundary are excited with random stimuli. It is found that spiral waves, and even a pair of spiral waves, can develop in the network under appropriate coupling intensity; otherwise, the spatial distribution of the network shows irregular states. A statistical variable based on mean field theory is defined to detect the collective behavior. It is confirmed that regular patterns can develop when the synchronization degree is low. The potential mechanism could be that random perturbation on the boundary induces coherence resonance-like behavior, so that spiral waves can develop in the network.
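
    For reference, the node model named above can be sketched as a single uncoupled Hindmarsh-Rose neuron (forward-Euler integration with textbook parameter values; the paper's network coupling and random boundary drive are omitted).

        # Single Hindmarsh-Rose neuron, standard three-variable form:
        #   dx/dt = y + b x^2 - a x^3 - z + I   (membrane potential)
        #   dy/dt = c - d x^2 - y               (fast recovery)
        #   dz/dt = r (s (x - x_rest) - z)      (slow adaptation)
        a, b, c, d = 1.0, 3.0, 1.0, 5.0
        r, s, x_rest, I = 0.006, 4.0, -1.6, 3.0
        x, y, z, dt = -1.0, 0.0, 0.0, 0.01

        trace = []
        for _ in range(200000):                 # forward-Euler integration
            dx = y + b * x**2 - a * x**3 - z + I
            dy = c - d * x**2 - y
            dz = r * (s * (x - x_rest) - z)
            x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
            trace.append(x)
        print("membrane potential range:", min(trace), max(trace))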

  7. Random frog: an efficient reversible jump Markov Chain Monte Carlo-like approach for variable selection with applications to gene selection and disease classification.

    PubMed

    Li, Hong-Dong; Xu, Qing-Song; Liang, Yi-Zeng

    2012-08-31

The identification of disease-relevant genes represents a challenge in microarray-based disease diagnosis, where the sample size is often limited. Among established methods, reversible jump Markov Chain Monte Carlo (RJMCMC) methods have proven to be quite promising for variable selection. However, the design and application of an RJMCMC algorithm requires, for example, special criteria for prior distributions. Also, simulation from the joint posterior distributions of models is computationally expensive, and may even be mathematically intractable. These disadvantages may limit the applications of RJMCMC algorithms. Therefore, algorithms are needed that possess the advantages of RJMCMC methods while being efficient and easy to follow for selecting disease-associated genes. Here we report an RJMCMC-like method, called random frog, that possesses the advantages of RJMCMC methods and is much easier to implement. Using the colon and estrogen gene expression datasets, we show that random frog is effective in identifying discriminating genes. The top 2 ranked genes are Z50753 and U00968 for the colon dataset, and Y10871_at and Z22536_at for the estrogen dataset. (The source codes with GNU General Public License Version 2.0 are freely available to non-commercial users at: http://code.google.com/p/randomfrog/.)
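
    The general flavor of the approach can be caricatured as follows (a drastically simplified hill-climbing sketch in Python on synthetic data; the actual random frog uses an RJMCMC-like acceptance scheme and dimension-changing moves, and is available at the URL above).

        # Caricature of subset-perturbation variable selection: repeatedly swap
        # one variable in the current subset, keep the swap if it scores at
        # least as well, and rank variables by how often they are retained.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 100))                 # 60 samples x 100 genes (synthetic)
        y = (X[:, 3] - X[:, 7] > 0).astype(int)        # genes 3 and 7 carry the signal

        def score(subset):
            return cross_val_score(LogisticRegression(max_iter=1000),
                                   X[:, subset], y, cv=5).mean()

        subset = list(rng.choice(100, size=10, replace=False))
        counts = np.zeros(100)
        for _ in range(200):
            candidate = list(subset)
            candidate[rng.integers(len(candidate))] = int(rng.integers(100))
            if len(set(candidate)) == len(candidate) and score(candidate) >= score(subset):
                subset = candidate                     # accept the swap
            counts[subset] += 1                        # tally retained variables
        print("most frequently retained variables:", np.argsort(counts)[::-1][:5])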

  8. Random fluctuation of selection coefficients and the extent of nucleotide variation in human populations

    PubMed Central

    Miura, Sayaka; Zhang, Zhenguo; Nei, Masatoshi

    2013-01-01

    It is well known that the selection coefficient of a mutant allele varies from generation to generation, and the effect of this factor on genetic variation has been studied by many theoreticians. However, no consensus has been reached. One group of investigators believes that fluctuating selection has an effect of enhancing genetic variation, whereas the other group contends that it has a diversity-reducing effect. In recent years, it has become possible to study this problem by using single nucleotide polymorphisms (SNPs) as well as exome sequence data. In this article we present the theoretical distributions of mutant nucleotide frequencies for the two models of fluctuating selection and then compare the distributions with the empirical distributions obtained from SNP and exome sequence data in human populations. Interestingly, both SNP and exome sequence data showed that the neutral mutation model fits the empirical distribution quite well. Furthermore, the mathematical models with diversity-enhancing and diversity-reducing effects also fit the empirical distribution reasonably well. This result implies that there is no need of distinguishing between the diversity-enhancing and diversity-reducing models of fluctuating selection and the nucleotide polymorphism in human populations can be explained largely by neutral mutations when long-term evolution is considered. PMID:23754436

  9. Moral hazard and selection among the poor: evidence from a randomized experiment.

    PubMed

    Spenkuch, Jörg L

    2012-01-01

    Not only does economic theory predict high-risk individuals to be more likely to purchase insurance, but insurance coverage is also thought to crowd out precautionary activities. In spite of stark theoretical predictions, there is conflicting empirical evidence on adverse selection, and evidence on ex ante moral hazard is very scarce. Using data from the Seguro Popular Experiment in Mexico, this paper documents patterns of selection on observables into health insurance as well as the existence of non-negligible ex ante moral hazard. More specifically, the findings indicate that (i) agents in poor self-assessed health prior to the intervention have, all else equal, a higher propensity to take up insurance; and (ii) insurance coverage reduces the demand for self-protection in the form of preventive care. Curiously, however, individuals do not sort based on objective measures of their health.

  10. Code to generate random identifiers and select QA/QC samples

    USGS Publications Warehouse

    Mehnert, Edward

    1992-01-01

SAMPLID is a PC-based FORTRAN-77 code that generates unique numbers for the identification of samples, selection of QA/QC samples, and generation of labels. These procedures are tedious when performed manually, but using a computer code such as SAMPLID can increase efficiency and reduce or eliminate errors and bias. The algorithm used in SAMPLID for generating pseudorandom numbers is free of statistical flaws present in commonly available algorithms.
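
    The function such a tool performs is simple to illustrate (a Python re-creation of the idea; the original is FORTRAN-77 and its exact numbering scheme may differ).

        # Assign unique random identifiers to samples and flag a random
        # subset as blind QA/QC samples.
        import random

        random.seed(7)
        ids = random.sample(range(1000, 10000), 40)   # 40 unique 4-digit sample IDs
        qaqc = set(random.sample(ids, 4))             # random QA/QC subset
        for sid in ids:
            print(sid, "*QA/QC*" if sid in qaqc else "")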

  11. The Ecological Effects of Universal and Selective Violence Prevention Programs for Middle School Students: A Randomized Trial

    PubMed Central

    2009-01-01

    This study reports the findings of a multisite randomized trial evaluating the separate and combined effects of 2 school-based approaches to reduce violence among early adolescents. A total of 37 schools at 4 sites were randomized to 4 conditions: (1) a universal intervention that involved implementing a student curriculum and teacher training with 6th-grade students and teachers, (2) a selective intervention in which a family intervention was implemented with a subset of 6th-grade students exhibiting high levels of aggression and social influence, (3) a combined intervention condition, and (4) a no-intervention control condition. Analyses of multiple waves of data from 2 cohorts of students at each school (N = 5,581) within the grade targeted by the interventions revealed a complex pattern. There was some evidence to suggest that the universal intervention was associated with increases in aggression and reductions in victimization; however, these effects were moderated by preintervention risk. In contrast, the selective intervention was associated with decreases in aggression but no changes in victimization. These findings have important implications for efforts to develop effective violence prevention programs. PMID:19485593

  12. The ecological effects of universal and selective violence prevention programs for middle school students: a randomized trial.

    PubMed

    2009-06-01

    This study reports the findings of a multisite randomized trial evaluating the separate and combined effects of 2 school-based approaches to reduce violence among early adolescents. A total of 37 schools at 4 sites were randomized to 4 conditions: (1) a universal intervention that involved implementing a student curriculum and teacher training with 6th-grade students and teachers, (2) a selective intervention in which a family intervention was implemented with a subset of 6th-grade students exhibiting high levels of aggression and social influence, (3) a combined intervention condition, and (4) a no-intervention control condition. Analyses of multiple waves of data from 2 cohorts of students at each school (N = 5,581) within the grade targeted by the interventions revealed a complex pattern. There was some evidence to suggest that the universal intervention was associated with increases in aggression and reductions in victimization; however, these effects were moderated by preintervention risk. In contrast, the selective intervention was associated with decreases in aggression but no changes in victimization. These findings have important implications for efforts to develop effective violence prevention programs.

  13. Randomized Comparison of Actual and Ideal Body Weight for Size Selection of the Laryngeal Mask Airway Classic in Overweight Patients.

    PubMed

    Kim, Min-Soo; Lee, Jong Seok; Nam, Sang Beom; Kang, Hyo Jong; Kim, Ji Eun

    2015-08-01

    Size selection of the laryngeal mask airway (LMA) Classic based on actual body weight remains a common practice. However, ideal body weight might allow for better size selection in obese patients. The purpose of our study was to compare the utility of ideal body weight and actual body weight for choosing the appropriate size of the LMA Classic in a randomized clinical trial. One hundred patients aged 20 to 70 yr, with a body mass index ≥25 kg/m(2) and a discrepancy between the LMA sizes indicated by actual and ideal body weight, were allocated to have the LMA Classic sized using either actual body weight or ideal body weight in a weight-based formula. After insertion of the device, several variables including insertion parameters, sealing function, fiberoptic imaging, and complications were investigated. The insertion success rate at the first attempt was lower in the actual weight group (82%) than in the ideal weight group (96%), although the difference was not statistically significant. The ideal weight group had significantly shorter insertion times and easier placement. However, fiberoptic views were significantly better in the actual weight group. Intraoperative complications, sore throat in the recovery room, and dysphonia at postoperative 24 hr occurred significantly less often in the ideal weight group than in the actual weight group. These findings suggest that ideal body weight may be beneficial for size selection of the LMA Classic in overweight patients (Clinical Trial Registry, NCT 01843270).

  14. Benefits of Selected Physical Exercise Programs in Detention: A Randomized Controlled Study

    PubMed Central

    Battaglia, Claudia; di Cagno, Alessandra; Fiorilli, Giovanni; Giombini, Arrigo; Fagnani, Federica; Borrione, Paolo; Marchetti, Marco; Pigozzi, Fabio

    2013-01-01

    The aim of the study was to determine which kind of physical activity could help inmate populations improve their health status and fitness levels. A repeated-measures design was used to evaluate the effects of two different training protocols on subjects in a state of detention, tested pre- and post-experimental protocol. Seventy-five male subjects were enrolled in the study and randomly allocated to three groups: the cardiovascular plus resistance training protocol group (CRT) (n = 25; mean age 30.9 ± 8.9 years), the high-intensity strength training protocol group (HIST) (n = 25; mean age 33.9 ± 6.8 years), and a control group (C) (n = 25; mean age 32.9 ± 8.9 years) receiving no treatment. All subjects underwent a clinical assessment and fitness tests. MANOVA revealed significant multivariate effects of group (p < 0.01) and the group-training interaction (p < 0.05). The CRT protocol proved the most effective at improving fitness test outcomes. Both the CRT and HIST protocols produced significant gains in the functional capacity (cardio-respiratory capacity and reduced cardiovascular disease risk) of incarcerated males. The significant gains obtained in functional capacity reflect the great potential of supervised exercise interventions for improving the health status of incarcerated people. PMID:24185842

  15. Simple random sampling-based probe station selection for fault detection in wireless sensor networks.

    PubMed

    Huang, Rimao; Qiu, Xuesong; Rui, Lanlan

    2011-01-01

    Fault detection for wireless sensor networks (WSNs) has been studied intensively in recent years. Most existing works statically choose the manager nodes as probe stations and probe the network at a fixed frequency. This straightforward solution leads, however, to several deficiencies. Firstly, by assigning the fault detection task only to the manager node, the whole network is thrown out of balance, which quickly overloads the already heavily burdened manager node and ultimately shortens the lifetime of the whole network. Secondly, probing at a fixed frequency often generates too much useless network traffic, which wastes the limited network energy. Thirdly, the traditional algorithm for choosing a probing node is too complicated to be used in energy-critical wireless sensor networks. In this paper, we study the distribution characteristics of faulty nodes in wireless sensor networks and validate the Pareto principle that a small number of clusters contain most of the faults. We then present a simple random sampling-based algorithm to dynamically choose sensor nodes as probe stations; a sketch of both ideas follows below. A dynamic adjustment rule for the probing frequency is also proposed to reduce the number of useless probing packets. The simulation experiments demonstrate that the algorithm and adjustment rule we present can effectively prolong the lifetime of a wireless sensor network without decreasing the fault detection rate.
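
    The two ingredients of the approach, equal-probability selection of probe stations and an adaptive probing frequency, can be illustrated in a few lines. The Python sketch below is a toy reconstruction under stated assumptions: the node names, interval bounds, and doubling/halving back-off rule are hypothetical stand-ins, not the authors' exact algorithm.

        import random

        def choose_probe_stations(cluster_heads, k, rng):
            """Simple random sampling: every cluster head has an equal
            chance of serving as a probe station this round."""
            return rng.sample(cluster_heads, k)

        def next_interval(interval, faults_found, lo=5.0, hi=300.0):
            """Back off probing while the network looks healthy; probe
            more often as soon as faults appear (assumed rule)."""
            interval = interval / 2 if faults_found else interval * 2
            return min(max(interval, lo), hi)

        rng = random.Random(0)
        heads = [f"node-{i}" for i in range(40)]
        interval = 60.0
        for rnd in range(5):
            probes = choose_probe_stations(heads, k=5, rng=rng)
            faults = rng.random() < 0.2  # stand-in for real probe results
            interval = next_interval(interval, faults)
            print(rnd, probes[:2], round(interval, 1))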

  16. Conflicts of Interest, Selective Inertia, and Research Malpractice in Randomized Clinical Trials: An Unholy Trinity

    PubMed Central

    Berger, Vance W.

    2014-01-01

    Recently a great deal of attention has been paid to conflicts of interest in medical research, and the Institute of Medicine has called for more research into this important area. One research question that has not received sufficient attention concerns the mechanisms of action by which conflicts of interest can result in biased and/or flawed research. What discretion do conflicted researchers have to sway the results one way or the other? We address this issue from the perspective of selective inertia, or an unnatural selection of research methods based on which are most likely to establish the preferred conclusions, rather than on which are most valid. In many cases it is abundantly clear that a method that is not being used in practice is superior to the one that is being used in practice, at least from the perspective of validity, and that it is only inertia, as opposed to any serious suggestion that the incumbent method is superior (or even comparable), that keeps the inferior procedure in use, to the exclusion of the superior one. By focusing on these flawed research methods we can go beyond statements of potential harm from real conflicts of interest, and can more directly assess actual (not potential) harm. PMID:25150846

  17. The Effect of Basis Selection on Thermal-Acoustic Random Response Prediction Using Nonlinear Modal Simulation

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Przekop, Adam

    2004-01-01

    The goal of this investigation is to further develop nonlinear modal numerical simulation methods for prediction of geometrically nonlinear response due to combined thermal-acoustic loadings. As with any such method, the accuracy of the solution is dictated by the selection of the modal basis, through which the nonlinear modal stiffness is determined. In this study, a suite of available bases is considered, including (i) bending modes only; (ii) coupled bending and companion modes; (iii) uncoupled bending and companion modes; and (iv) bending and membrane modes. Comparison of these solutions with numerical simulation in physical degrees-of-freedom indicates that inclusion of any of the membrane mode variants (ii-iv) in the basis affects the bending displacement and stress response predictions. The most significant effect is on the membrane displacement, where it is shown that only the type (iv) basis accurately predicts its behavior. Results are presented for beam and plate structures in the thermally pre-buckled regime.

  18. Nitrates and bone turnover (NABT) - trial to select the best nitrate preparation: study protocol for a randomized controlled trial

    PubMed Central

    2013-01-01

    will use the ‘multiple comparisons with the best’ approach for data analyses, as this strategy allows practical considerations of ease of use and tolerability to guide selection of the preparation for future studies. Discussion Data from this protocol will be used to develop a randomized, controlled trial of nitrates to prevent osteoporotic fractures. Trial registration ClinicalTrials.gov Identifier: NCT01387672. Controlled-Trials.com: ISRCTN08860742. PMID:24010992

  19. Predicting the continuum between corridors and barriers to animal movements using Step Selection Functions and Randomized Shortest Paths.

    PubMed

    Panzacchi, Manuela; Van Moorter, Bram; Strand, Olav; Saerens, Marco; Kivimäki, Ilkka; St Clair, Colleen C; Herfindal, Ivar; Boitani, Luigi

    2016-01-01

    The loss, fragmentation and degradation of habitat everywhere on Earth prompt increasing attention to identifying landscape features that support animal movement (corridors) or impede it (barriers). Most algorithms used to predict corridors assume that animals move through preferred habitat either optimally (e.g. least-cost path, LCP) or as random walkers (e.g. current models), but neither extreme is realistic. We propose that corridors and barriers are two sides of the same coin and that animals experience landscapes as spatiotemporally dynamic corridor-barrier continua connecting (separating) functional areas where individuals fulfil specific ecological processes. Based on this conceptual framework, we propose a novel methodological approach that uses high-resolution individual-based movement data to predict corridor-barrier continua with increased realism. Our approach consists of two innovations. First, we use step selection functions (SSF) to predict friction maps quantifying corridor-barrier continua for tactical steps between consecutive locations. Secondly, we introduce to movement ecology the randomized shortest path algorithm (RSP), which operates on friction maps to predict the corridor-barrier continuum for strategic movements between functional areas. By modulating the parameter θ, which controls the trade-off between exploration and optimal exploitation of the environment, RSP bridges the gap between algorithms assuming optimal movement (as θ approaches infinity, RSP converges to the LCP) and those assuming a random walk (as θ → 0, RSP converges to current models). Using this approach, we identify migration corridors for GPS-monitored wild reindeer (Rangifer t. tarandus) in Norway. We demonstrate that reindeer movement is best predicted by an intermediate value of θ, indicative of a movement trade-off between optimization and exploration. Model calibration allows identification of a corridor-barrier continuum that closely fits empirical data and demonstrates that RSP
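
    The role of θ can be illustrated with a one-node toy example. The Python sketch below applies a Boltzmann-like choice rule, which captures the elementary exploration-exploitation trade-off underlying RSP rather than the full algorithm of Saerens and colleagues; the step costs are invented for illustration.

        import numpy as np

        def move_probabilities(costs, theta):
            """Boltzmann-like choice rule at one node: P(move j) is
            proportional to exp(-theta * cost_j). theta -> 0 gives a
            uniform random walk; theta -> infinity always picks the
            least-cost move, mirroring the RSP continuum above."""
            w = np.exp(-theta * np.asarray(costs, dtype=float))
            return w / w.sum()

        costs = [1.0, 2.0, 5.0]  # friction of three candidate steps
        for theta in (0.0, 0.5, 2.0, 10.0):
            print(theta, np.round(move_probabilities(costs, theta), 3))

    Running this shows the probabilities sliding from uniform (θ = 0) toward all mass on the cheapest step as θ grows, which is the continuum the paper calibrates against reindeer data.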

  20. Pregnancy is not a risk factor for gallstone disease: Results of a randomly selected population sample

    PubMed Central

    Walcher, Thomas; Haenle, Mark Martin; Kron, Martina; Hay, Birgit; Mason, Richard Andrew; von Schmiesing, Alexa Friederike Alice; Imhof, Armin; Koenig, Wolfgang; Kern, Peter; Boehm, Bernhard Otto; Kratzer, Wolfgang

    2005-01-01

    AIM: To investigate the prevalence, risk factors, and selection of the study population for cholecystolithiasis in an urban population in Germany, in relation to our own findings and to the results in the international literature. METHODS: A total of 2 147 persons (1 111 females, age 42.8 ± 12.7 years; 1 036 males, age 42.3 ± 13.1 years) participating in an investigation on the prevalence of Echinococcus multilocularis were studied for risk factors and prevalence of gallbladder stone disease. Risk factors were assessed by means of a standardized interview and calculation of body mass index (BMI). A diagnostic ultrasound examination of the gallbladder was performed. Data were analyzed by multiple logistic regression, using the SAS statistical software package. RESULTS: Gallbladder stones were detected in 171 study participants (8.0%, n = 2 147). Risk factors for the development of gallbladder stone disease included age, sex, BMI, and positive family history. In a separate analysis of female study participants, pregnancy (yes/no) and number of pregnancies did not exert any influence. CONCLUSION: Findings of the present study confirm that age, female sex, BMI, and positive family history are risk factors for the development of gallbladder stone disease. Pregnancy and the number of pregnancies, however, could not be shown to be risk factors. There seem to be no differences in the respective prevalence for gallbladder stone disease in urban and rural populations. PMID:16425387

  1. Sequence-Based Prediction of RNA-Binding Proteins Using Random Forest with Minimum Redundancy Maximum Relevance Feature Selection

    PubMed Central

    Ma, Xin; Guo, Jing; Sun, Xiao

    2015-01-01

    The prediction of RNA-binding proteins is one of the most challenging problems in computational biology. Although some studies have investigated this problem, the accuracy of prediction is still not sufficient. In this study, a highly accurate method was developed to predict RNA-binding proteins from amino acid sequences using random forests with the minimum redundancy maximum relevance (mRMR) method, followed by incremental feature selection (IFS). We incorporated conjoint triad features and three novel features: binding propensity (BP), nonbinding propensity (NBP), and evolutionary information combined with physicochemical properties (EIPP). The results showed that these novel features play important roles in improving the performance of the predictor. Using the mRMR-IFS method, our predictor achieved the best performance (86.62% accuracy and a 0.737 Matthews correlation coefficient). The high prediction accuracy and successful prediction performance suggest that our method can be a useful approach to identify RNA-binding proteins from sequence information. PMID:26543860
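
    The overall selection scheme, ranking features and then growing the feature set until cross-validated accuracy stops improving, can be sketched compactly. In the Python toy below, the data are synthetic and random forest impurity importance stands in for the paper's mRMR ranking; only the incremental feature selection loop is the technique being illustrated.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        # Toy data standing in for sequence-derived features (BP, NBP, EIPP, ...).
        X, y = make_classification(n_samples=300, n_features=40,
                                   n_informative=8, random_state=0)

        # Stand-in ranking: RF impurity importance replaces the mRMR ranking.
        rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        order = np.argsort(rf.feature_importances_)[::-1]

        # Incremental feature selection: grow the feature set along the
        # ranking and keep the size with the best cross-validated accuracy.
        best_k, best_acc = 1, 0.0
        for k in range(1, len(order) + 1):
            acc = cross_val_score(
                RandomForestClassifier(n_estimators=100, random_state=0),
                X[:, order[:k]], y, cv=5).mean()
            if acc > best_acc:
                best_k, best_acc = k, acc
        print(best_k, round(best_acc, 3))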

  2. Characterization of indoor air contaminants in a randomly selected set of commercial nail salons in Salt Lake County, Utah, USA.

    PubMed

    Alaves, Victor M; Sleeth, Darrah K; Thiese, Matthew S; Larson, Rodney R

    2013-01-01

    Air samples were collected in 12 randomly selected commercial nail salons in Salt Lake County, Utah. Measurements of salon physical/chemical parameters (room volume, CO2 levels) were obtained. Volatile organic compound (VOC) concentrations were collected using summa air canisters and sorbent media tubes over an 8-h period. Multivariate analyses were used to identify relationships between salon physical/chemical characteristics and the VOCs found in the air samples. The ACGIH(®) additive mixing formula was also applied to determine whether there were potential overexposures to the combined airborne concentrations of the chemicals monitored. Methyl methacrylate was detected in 58% of the establishments despite having been banned for use in nail products by the state of Utah. Formaldehyde was found above the NIOSH REL(®) (0.016 ppm) in 58% of the establishments. Given the assortment of VOCs to which nail salon workers are potentially exposed, a combination of engineering controls and personal protective equipment is recommended.

  3. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness.

    PubMed

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia's marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to 'small p and large n' problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and

  4. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    PubMed Central

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and
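
    Of the FS methods named above, AVI is especially simple to sketch: importance scores are averaged over repeated random forest fits so the ranking does not hinge on a single random seed. The Python sketch below is a minimal illustration on synthetic stand-in data, not the study's backscatter predictors or its exact AVI procedure.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier

        # Toy stand-in for the multibeam-derived predictors.
        X, y = make_classification(n_samples=400, n_features=15,
                                   n_informative=5, random_state=1)

        def averaged_importance(X, y, repeats=20):
            """Averaged variable importance (AVI): average RF importance
            scores over repeated fits with different seeds, stabilizing
            the ranking against run-to-run randomness."""
            scores = np.zeros(X.shape[1])
            for seed in range(repeats):
                rf = RandomForestClassifier(n_estimators=200,
                                            random_state=seed).fit(X, y)
                scores += rf.feature_importances_
            return scores / repeats

        avi = averaged_importance(X, y)
        print("predictors by averaged importance:", np.argsort(avi)[::-1][:6])

    The repeated fitting is the cost the abstract alludes to when calling for automated programs to improve AVI's computational efficiency.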

  5. Polarimetric SAR decomposition parameter subset selection and their optimal dynamic range evaluation for urban area classification using Random Forest

    NASA Astrophysics Data System (ADS)

    Hariharan, Siddharth; Tirodkar, Siddhesh; Bhattacharya, Avik

    2016-02-01

    Urban area classification is important for monitoring ever-increasing urbanization and studying its environmental impact. Two NASA JPL UAVSAR L-band (wavelength: 23 cm) datasets were used in this study for urban area classification. The two datasets differ in their urban area structures, building patterns, and the geometric shapes and sizes of the buildings. In these datasets, some urban areas appear oriented about the radar line of sight (LOS) while other areas appear non-oriented. In this study, roll-invariant polarimetric SAR decomposition parameters were used to classify these urban areas. Random Forest (RF), an ensemble decision tree learning technique, was used in this study. RF performs parameter subset selection as part of its classification procedure. In this study, parameter subsets were obtained and analyzed to infer scattering mechanisms useful for urban area classification. The Cloude-Pottier α, the Touzi dominant scattering amplitude αs1 and the anisotropy A were among the top six important parameters selected for both datasets. However, these parameters were ranked differently for the two datasets. The urban area classification using RF was compared with the Support Vector Machine (SVM) and the Maximum Likelihood Classifier (MLC) for both datasets. RF outperforms SVM by 4% and MLC by 12% on Dataset 1. It also outperforms SVM and MLC by 3.5% and 11%, respectively, on Dataset 2.
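
    A minimal sketch of this workflow, training a random forest alongside an SVM and reading the parameter ranking off the fitted forest, might look as follows in Python. The features are synthetic stand-ins for the decomposition parameters, so accuracies on this toy data will not match the figures above.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import accuracy_score
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        # Toy features standing in for roll-invariant decomposition
        # parameters (Cloude-Pottier alpha, anisotropy A, Touzi alpha_s1, ...).
        names = [f"param_{i}" for i in range(10)]
        X, y = make_classification(n_samples=500, n_features=10,
                                   n_informative=4, n_classes=3,
                                   n_clusters_per_class=1, random_state=2)
        Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=2)

        rf = RandomForestClassifier(n_estimators=300, random_state=2).fit(Xtr, ytr)
        svm = SVC(kernel="rbf").fit(Xtr, ytr)

        # RF ranks the input parameters as a by-product of training.
        top = np.argsort(rf.feature_importances_)[::-1][:6]
        print("top parameters:", [names[i] for i in top])
        print("RF accuracy:", accuracy_score(yte, rf.predict(Xte)))
        print("SVM accuracy:", accuracy_score(yte, svm.predict(Xte)))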

  6. DNABP: Identification of DNA-Binding Proteins Based on Feature Selection Using a Random Forest and Predicting Binding Residues

    PubMed Central

    Guo, Jing; Sun, Xiao

    2016-01-01

    DNA-binding proteins are fundamentally important in cellular processes. Several computational-based methods have been developed to improve the prediction of DNA-binding proteins in previous years. However, insufficient work has been done on the prediction of DNA-binding proteins from protein sequence information. In this paper, a novel predictor, DNABP (DNA-binding proteins), was designed to predict DNA-binding proteins using the random forest (RF) classifier with a hybrid feature. The hybrid feature contains two types of novel sequence features, which reflect information about the conservation of physicochemical properties of the amino acids, and the binding propensity of DNA-binding residues and non-binding propensities of non-binding residues. The comparisons with each feature demonstrated that these two novel features contributed most to the improvement in predictive ability. Furthermore, to improve the prediction performance of the DNABP model, feature selection using the minimum redundancy maximum relevance (mRMR) method combined with incremental feature selection (IFS) was carried out during the model construction. The results showed that the DNABP model could achieve 86.90% accuracy, 83.76% sensitivity, 90.03% specificity and a Matthews correlation coefficient of 0.727. High prediction accuracy and performance comparisons with previous research suggested that DNABP could be a useful approach to identify DNA-binding proteins from sequence information. The DNABP web server system is freely available at http://www.cbi.seu.edu.cn/DNABP/. PMID:27907159

  7. K-Ras(G12D)-selective inhibitory peptides generated by random peptide T7 phage display technology.

    PubMed

    Sakamoto, Kotaro; Kamada, Yusuke; Sameshima, Tomoya; Yaguchi, Masahiro; Niida, Ayumu; Sasaki, Shigekazu; Miwa, Masanori; Ohkubo, Shoichi; Sakamoto, Jun-Ichi; Kamaura, Masahiro; Cho, Nobuo; Tani, Akiyoshi

    2017-03-11

    Amino-acid mutations of Gly(12) (e.g. G12D, G12V, G12C) of V-Ki-ras2 Kirsten rat sarcoma viral oncogene homolog (K-Ras), the most promising drug target in cancer therapy, are major growth drivers in various cancers. Although over 30 years have passed since the discovery of these mutations in most cancer patients, effective mutated K-Ras inhibitors have not been marketed. Here, we report novel and selective inhibitory peptides to K-Ras(G12D). We screened random peptide libraries displayed on T7 phage against purified recombinant K-Ras(G12D), with thorough subtraction of phages bound to wild-type K-Ras, and obtained KRpep-2 (Ac-RRCPLYISYDPVCRR-NH2) as a consensus sequence. KRpep-2 showed more than 10-fold binding- and inhibition-selectivity to K-Ras(G12D), both in SPR analysis and GDP/GTP exchange enzyme assay. KD and IC50 values were 51 and 8.9 nM, respectively. After subsequent sequence optimization, we successfully generated KRpep-2d (Ac-RRRRCPLYISYDPVCRRRR-NH2) that inhibited enzyme activity of K-Ras(G12D) with IC50 = 1.6 nM and significantly suppressed ERK-phosphorylation, downstream of K-Ras(G12D), along with A427 cancer cell proliferation at 30 μM peptide concentration. To our knowledge, this is the first report of a K-Ras(G12D)-selective inhibitor, contributing to the development and study of K-Ras(G12D)-targeting drugs.

  8. Quantitative structure-property relationships of retention indices of some sulfur organic compounds using random forest technique as a variable selection and modeling method.

    PubMed

    Goudarzi, Nasser; Shahsavani, Davood; Emadi-Gandaghi, Fereshteh; Chamjangali, Mansour Arab

    2016-10-01

    In this work, a novel quantitative structure-property relationship technique is proposed on the basis of the random forest for prediction of the retention indices of some sulfur organic compounds. To calculate the retention indices of these compounds, theoretical descriptors produced from their molecular structures are employed. The influence of the significant parameters affecting the predictive power of the developed random forest model, such as the number of randomly selected variables applied to split each node (m) and the number of trees (nt), is studied to obtain the best model. After optimizing the nt and m parameters, the random forest model built with m = 70 and nt = 460 was found to yield the best results. The artificial neural network and multiple linear regression modeling techniques are also used to predict the retention index values for these compounds for comparison with the results of the random forest model. The descriptors selected by stepwise regression and the random forest model are used to build the artificial neural network models. The results show the superiority of the random forest model over the other models for prediction of the retention indices of the studied compounds.
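
    In scikit-learn terms, nt is n_estimators and m is max_features, so the tuning described above amounts to a small grid search. The Python sketch below runs such a search on synthetic descriptors; the grid values echo the paper's optimum, but the data and score are made up for illustration.

        from sklearn.datasets import make_regression
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import GridSearchCV

        # Toy data standing in for the molecular descriptors.
        X, y = make_regression(n_samples=200, n_features=100,
                               n_informative=20, noise=5.0, random_state=3)

        # nt corresponds to n_estimators, m to max_features.
        grid = {"n_estimators": [100, 300, 460],
                "max_features": [10, 40, 70]}
        search = GridSearchCV(RandomForestRegressor(random_state=3), grid,
                              scoring="neg_root_mean_squared_error", cv=5)
        search.fit(X, y)
        print(search.best_params_)            # best (nt, m) pair on this toy data
        print(round(-search.best_score_, 2))  # corresponding cross-validated RMSE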

  9. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling

    PubMed Central

    Zhou, Fuqun; Zhang, Aining

    2016-01-01

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2–3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle these two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two Random Forests features: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data. PMID:27792152

  10. A Preliminary Investigation of the Jack-Bean Urease Inhibition by Randomly Selected Traditionally Used Herbal Medicine

    PubMed Central

    Biglar, Mahmood; Soltani, Khadijeh; Nabati, Farzaneh; Bazl, Roya; Mojab, Faraz; Amanlou, Massoud

    2012-01-01

    Helicobacter pylori (H. pylori) infection leads to different clinical and pathological outcomes in humans, including chronic gastritis, peptic ulcer disease, gastric neoplasia and even gastric cancer, and its eradication depends upon multi-drug therapy. The most effective therapy is still unknown, which prompts great efforts to find better and more modern natural or synthetic anti-H. pylori agents. In this report, 21 randomly selected herbal methanolic extracts were evaluated for their inhibition of jack-bean urease using the indophenol method as described by Weatherburn. The inhibition potency was measured by UV spectroscopy at 630 nm, the absorbance attributable to released ammonium. Among these extracts, five showed potent inhibitory activities with IC50 values in the range of 18-35 μg/mL. These plants are Matricaria disciforme (IC50: 35 μg/mL), Nasturtium officinale (IC50: 18 μg/mL), Punica granatum (IC50: 30 μg/mL), Camelia sinensis (IC50: 35 μg/mL), and Citrus aurantifolia (IC50: 28 μg/mL). PMID:24250509

  11. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling.

    PubMed

    Zhou, Fuqun; Zhang, Aining

    2016-10-25

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle these two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two Random Forests features: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.
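
    Both Random Forests features mentioned above can be approximated with off-the-shelf tools. In the Python sketch below the data are synthetic, and a low-confidence-vote filter stands in for Random Forests' proximity-based outlier measure, which scikit-learn does not expose directly; importance ranking selects roughly half of the variables, and the filter screens samples transferred to an adjacent year.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier

        # Toy stand-in for multi-date MODIS composites (40 band/date variables).
        X1, y1 = make_classification(n_samples=500, n_features=40,
                                     n_informative=10, random_state=4)

        rf = RandomForestClassifier(n_estimators=300, random_state=4).fit(X1, y1)

        # 1) Variable importance: keep roughly the top half of the variables.
        keep = np.argsort(rf.feature_importances_)[::-1][:20]

        # 2) Sample transfer: score last year's samples against "next year's"
        # imagery and discard likely outliers via low classification confidence.
        X2 = X1 + np.random.default_rng(4).normal(0, 0.3, X1.shape)  # next year
        votes = rf.predict_proba(X2).max(axis=1)
        transferred = np.flatnonzero(votes > 0.7)
        print(len(keep), "variables kept,", len(transferred), "samples transferred")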

  12. Effects of a selective serotonin reuptake inhibitor escitalopram on the cutaneous silent period: a randomized controlled study in healthy volunteers.

    PubMed

    Pujia, Francesco; Serrao, Mariano; Brienza, Marianna; Vestrini, Elisa; Valente, Gabriele Oreste; Coppola, Gianluca; Pierelli, Francesco

    2014-04-30

    The cutaneous silent period (CSP) is a transient inhibition of the electromyographic (EMG) activity in the hand muscles induced by painful electrical stimulation of the digital nerves. The neurotransmitters involved in mediating the CSP have not been completely elucidated thus far. However, a few studies suggest that the monoaminergic system may play a role in the CSP. We elicited CSPs in the first dorsal interosseous muscle of the right hand before and 3 h after administration of a single oral dose of the selective serotonin reuptake inhibitor escitalopram (20 mg) or placebo. The two experimental sessions (drug and placebo) were performed in random order at ≥1-week intervals. All recordings were numbered anonymously and analysed offline in a blind manner by one investigator. A significant increase in CSP duration was observed 3 h after escitalopram administration (p=0.01), with no changes in reflex latency or subjective pain sensation (p>0.05). No significant changes were observed in CSP duration in subjects who received the placebo (all p>0.05). Our results indicate that escitalopram increases the central availability of serotonin and enhances the activity of the spinal inhibitory interneurons on the α-motoneurons of the hand muscles. Thus, our results point to the involvement of the monoaminergic system in the control of spinal pain mechanisms by supraspinal descending pathways originating in brainstem neural structures.

  13. The Long-Term Effectiveness of a Selective, Personality-Targeted Prevention Program in Reducing Alcohol Use and Related Harms: A Cluster Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Newton, Nicola C.; Conrod, Patricia J.; Slade, Tim; Carragher, Natacha; Champion, Katrina E.; Barrett, Emma L.; Kelly, Erin V.; Nair, Natasha K.; Stapinski, Lexine; Teesson, Maree

    2016-01-01

    Background: This study investigated the long-term effectiveness of Preventure, a selective personality-targeted prevention program, in reducing the uptake of alcohol, harmful use of alcohol, and alcohol-related harms over a 3-year period. Methods: A cluster randomized controlled trial was conducted to assess the effectiveness of Preventure.…

  14. Enumeration of Escherichia coli cells on chicken carcasses as a potential measure of microbial process control in a random selection of slaughter establishments in the United States

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The purpose of this study was to evaluate whether the measurement of Escherichia coli levels at two points during the chicken slaughter process has utility as a measure of quality control. A one year long survey was conducted during 2004 and 2005 in 20 randomly selected United States chicken slaught...

  15. Robust prediction of B-factor profile from sequence using two-stage SVR based on random forest feature selection.

    PubMed

    Pan, Xiao-Yong; Shen, Hong-Bin

    2009-01-01

    B-factor is highly correlated with protein internal motion and is used to measure the uncertainty in the position of an atom within a crystal structure. Although the rapid progress of structural biology in recent years has made more accurate protein structures available than ever, with the avalanche of new protein sequences emerging during the post-genomic era, the gap between the known protein sequences and the known protein structures grows wider and wider. It is urgent to develop automated methods to predict B-factor profiles directly from amino acid sequences, so that they can be utilized for basic research in a timely manner. In this article, we propose a novel approach, called PredBF, to predict the real value of the B-factor. We first extract both global and local features from the protein sequences as well as their evolutionary information; then random forest feature selection is applied to rank their importance, and the most important features are input to a two-stage support vector regression (SVR) for prediction, where the initial predicted outputs from the first SVR are further input to the second-layer SVR for final refinement. Our results reveal that a systematic analysis of the importance of different features offers deep insights into their different contributions and is very necessary for developing effective B-factor prediction tools. The two-layer SVR prediction model designed in this study further enhanced the robustness of predicting the B-factor profile. As a web server, PredBF is freely available at: http://www.csbio.sjtu.edu.cn/bioinf/PredBF for academic use.
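
    The two-stage arrangement, where a second regressor refines the output of the first, can be sketched compactly. The Python toy below uses synthetic features, and for simplicity stage 2 is trained on in-sample stage-1 predictions; a real pipeline would produce those predictions out-of-fold to avoid overfitting. Only the overall structure (RF feature selection feeding chained SVRs) mirrors the description above.

        import numpy as np
        from sklearn.datasets import make_regression
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.svm import SVR

        # Toy data standing in for per-residue sequence/evolutionary features
        # with a real-valued B-factor target.
        X, y = make_regression(n_samples=400, n_features=60,
                               n_informative=15, noise=10.0, random_state=5)

        # Random forest feature selection: keep the top-ranked features.
        rf = RandomForestRegressor(n_estimators=200, random_state=5).fit(X, y)
        keep = np.argsort(rf.feature_importances_)[::-1][:20]
        Xs = X[:, keep]

        # Stage 1: SVR on the selected features.
        svr1 = SVR(kernel="rbf", C=10.0).fit(Xs, y)
        stage1 = svr1.predict(Xs)

        # Stage 2: a second SVR refines the initial predictions by taking
        # them as an extra input alongside the selected features.
        X2 = np.column_stack([Xs, stage1])
        svr2 = SVR(kernel="rbf", C=10.0).fit(X2, y)
        refined = svr2.predict(X2)
        print(round(np.corrcoef(refined, y)[0, 1], 3))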

  16. Water chemistry in 179 randomly selected Swedish headwater streams related to forest production, clear-felling and climate.

    PubMed

    Löfgren, Stefan; Fröberg, Mats; Yu, Jun; Nisell, Jakob; Ranneby, Bo

    2014-12-01

    From a policy perspective, it is important to understand forestry effects on surface waters from a landscape perspective. The EU Water Framework Directive demands remedial actions if good ecological status is not achieved. In Sweden, 44% of the surface water bodies have moderate ecological status or worse. Many of these drain catchments with a mosaic of managed forests. It is important for the forestry sector and water authorities to be able to identify where in the forested landscape special precautions are necessary. The aim of this study was to quantify the relations between forestry parameters and headwater stream concentrations of nutrients, organic matter and acid-base chemistry. The results are put into the context of regional climate, sulphur and nitrogen deposition, as well as marine influences. Water chemistry was measured in 179 randomly selected headwater streams from two regions in southwest and central Sweden, corresponding to 10% of the Swedish land area. Forest status was determined from satellite images and Swedish National Forest Inventory data using the probabilistic classifier method, which was used to model stream water chemistry with Bayesian model averaging. The results indicate that concentrations of e.g. nitrogen, phosphorus and organic matter are related to factors associated with forest production, but that it is not forestry per se that causes the excess losses. Instead, factors simultaneously affecting forest production and stream water chemistry, such as climate, extensive soil pools and nitrogen deposition, are the most likely candidates. The relationships with clear-felled and wetland areas are likely to be direct effects.

  17. The UCSD Statin Study: a randomized controlled trial assessing the impact of statins on selected noncardiac outcomes.

    PubMed

    Golomb, Beatrice A; Criqui, Michael H; White, Halbert L; Dimsdale, Joel E

    2004-04-01

    There has been persistent controversy regarding possible favorable or adverse effects of statins or of cholesterol reduction on cognition, mood and behavior (including aggressive or violent behavior), muscle function, and quality of life. The UCSD Statin Study seeks to ascertain the beneficial or adverse effects of statin cholesterol-lowering drugs on a set of noncardiac endpoints, including cognition, behavior, and serotonin biochemistry. The study will enroll 1000 subjects (minimum 20% female) of mixed ethnicity from San Diego. Subjects must be age 20 and older, postmenopausal if female, without known cardiovascular disease or diabetes, and with LDL-cholesterol between 115 and 190 mg/dl. Subjects will be randomized to a double-blind, placebo-controlled trial with assignment 1/3, 1/3, 1/3 to placebo, simvastatin 20 mg, or pravastatin 40 mg (equipotent LDL-cholesterol-lowering doses for drug arms with simvastatin and pravastatin chosen to represent the extremes of the lipophilicity spectrum) for 6 months of treatment followed by 2 months postcessation follow-up. Primary outcomes are cognition (cognitive battery), irritability/aggression (behavior measure), and serotonin (gauged by whole blood serotonin), assessed as the difference between baseline and 6 months, judging combined statin groups vs. placebo. Secondary outcomes include mood (CES-D and Wakefield depression inventory), quality of life (SF-12V), sleep (Leeds sleep scale, modified), and secondary aggression measures (Conflict Tactics Scale; Overt Aggression Scale, Modified). Cardiovascular reactivity will be examined in a 10% subset. As additional secondary endpoints, primary and selected secondary outcomes will be assessed by statin assignment (lipophilic simvastatin vs. hydrophilic pravastatin). "Reversibility" of changes, if any, at 2 months postcessation will be determined. If effects (favorable or unfavorable) are identified, we will seek to ascertain whether there are baseline variables that predict

  18. The UCSD Statin Study: a randomized controlled trial assessing the impact of statins on selected noncardiac outcomes

    PubMed Central

    Golomb, Beatrice A.; Criqui, Michael H.; White, Halbert L.; Dimsdale, Joel E.

    2013-01-01

    There has been persistent controversy regarding possible favorable or adverse effects of statins or of cholesterol reduction on cognition, mood and behavior (including aggressive or violent behavior), muscle function, and quality of life. The UCSD Statin Study seeks to ascertain the beneficial or adverse effects of statin cholesterol-lowering drugs on a set of noncardiac endpoints, including cognition, behavior, and serotonin biochemistry. The study will enroll 1000 subjects (minimum 20% female) of mixed ethnicity from San Diego. Subjects must be age 20 and older, postmenopausal if female, without known cardiovascular disease or diabetes, and with LDL-cholesterol between 115 and 190 mg/dl. Subjects will be randomized to a double-blind, placebo-controlled trial with assignment 1/3, 1/3, 1/3 to placebo, simvastatin 20 mg, or pravastatin 40 mg (equipotent LDL-cholesterol-lowering doses for drug arms with simvastatin and pravastatin chosen to represent the extremes of the lipophilicity spectrum) for 6 months of treatment followed by 2 months postcessation follow-up. Primary outcomes are cognition (cognitive battery), irritability/aggression (behavior measure), and serotonin (gauged by whole blood serotonin), assessed as the difference between baseline and 6 months, judging combined statin groups vs. placebo. Secondary outcomes include mood (CES-D and Wakefield depression inventory), quality of life (SF-12V), sleep (Leeds sleep scale, modified), and secondary aggression measures (Conflict Tactics Scale; Overt Aggression Scale, Modified). Cardiovascular reactivity will be examined in a 10% subset. As additional secondary endpoints, primary and selected secondary outcomes will be assessed by statin assignment (lipophilic simvastatin vs. hydrophilic pravastatin). “Reversibility” of changes, if any, at 2 months postcessation will be determined. If effects (favorable or unfavorable) are identified, we will seek to ascertain whether there are baseline variables that

  19. The Ecological Effects of Universal and Selective Violence Prevention Programs for Middle School Students: A Randomized Trial

    ERIC Educational Resources Information Center

    Simon, Thomas R.; Ikeda, Robin M.; Smith, Emilie Phillips; Reese, Le'Roy E.; Rabiner, David L.; Miller, Shari; Winn, Donna-Marie; Dodge, Kenneth A.; Asher, Steven R.; Horne, Arthur M.; Orpinas, Pamela; Martin, Roy; Quinn, William H.; Tolan, Patrick H.; Gorman-Smith, Deborah; Henry, David B.; Gay, Franklin N.; Schoeny, Michael; Farrell, Albert D.; Meyer, Aleta L.; Sullivan, Terri N.; Allison, Kevin W.

    2009-01-01

    This study reports the findings of a multisite randomized trial evaluating the separate and combined effects of 2 school-based approaches to reduce violence among early adolescents. A total of 37 schools at 4 sites were randomized to 4 conditions: (1) a universal intervention that involved implementing a student curriculum and teacher training…

  20. Sexual selection has minimal impact on effective population sizes in species with high rates of random offspring mortality: an empirical demonstration using fitness distributions

    PubMed Central

    Pischedda, Alison; Friberg, Urban; Stewart, Andrew D.; Miller, Paige M.; Rice, William R.

    2015-01-01

    The effective population size (Ne) is a fundamental parameter in population genetics that influences the rate of loss of genetic diversity. Sexual selection has the potential to reduce Ne by causing the sex-specific distributions of individuals that successfully reproduce to diverge. To empirically estimate the effect of sexual selection on Ne, we obtained fitness distributions for males and females from an outbred, laboratory-adapted population of Drosophila melanogaster. We observed strong sexual selection in this population (the variance in male reproductive success was ∼14 times higher than that for females), but found that sexual selection had only a modest effect on Ne, which was 75% of the census size. This occurs because the substantial random offspring mortality in this population diminishes the effects of sexual selection on Ne, a result that necessarily applies to other high fecundity species. The inclusion of this random offspring mortality creates a scaling effect that reduces the variance/mean ratios for male and female reproductive success and causes them to converge. Our results demonstrate that measuring reproductive success without considering offspring mortality can underestimate Ne and overestimate the genetic consequences of sexual selection. Similarly, comparing genetic diversity among different genomic components may fail to detect strong sexual selection. PMID:26374275
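
    The scaling effect described above follows from binomial thinning: if each offspring survives independently with probability p, the variance/mean ratio of surviving offspring counts is p(σ²/μ) + (1 − p), which approaches 1 as p shrinks, so the male and female distributions converge. The Python sketch below is a numerical illustration of this convergence under assumed offspring distributions, not a re-analysis of the authors' Drosophila data.

        import numpy as np

        rng = np.random.default_rng(6)

        def thinned_ratio(counts, p):
            """Variance/mean ratio of offspring counts after independent
            (binomial) survival of each offspring with probability p."""
            survivors = rng.binomial(counts, p)
            return survivors.var() / survivors.mean()

        # Skewed male vs. near-Poisson female reproductive success,
        # matched in mean (both are illustrative assumptions).
        males = rng.negative_binomial(0.5, 0.2, size=20000)  # high variance
        females = rng.poisson(males.mean(), size=20000)

        for p in (1.0, 0.1, 0.01):
            print(p, round(thinned_ratio(males, p), 2),
                  round(thinned_ratio(females, p), 2))

    As survival probability drops, both ratios approach 1, mirroring the paper's point that heavy random offspring mortality mutes the effect of sexual selection on Ne.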

  1. Changing friend selection in middle school: A social network analysis of a randomized intervention study designed to prevent adolescent problem behavior

    PubMed Central

    DeLay, Dawn; Ha, Thao; Van Ryzin, Mark; Winter, Charlotte; Dishion, Thomas J.

    2015-01-01

    Adolescent friendships that promote problem behavior are often chosen in middle school. The current study examines the unintended impact of a randomized school based intervention on the selection of friends in middle school, as well as on observations of deviant talk with friends five years later. Participants included 998 middle school students (526 boys and 472 girls) recruited at the onset of middle school (age 11-12 years) from three public middle schools participating in the Family Check-up model intervention. The current study focuses only on the effects of the SHAPe curriculum—one level of the Family Check-up model—on friendship choices. Participants nominated friends and completed measures of deviant peer affiliation. Approximately half of the sample (n=500) was randomly assigned to the intervention and the other half (n=498) comprised the control group within each school. The results indicate that the SHAPe curriculum affected friend selection within School 1, but not within Schools 2 or 3. The effects of friend selection in School 1 translated into reductions in observed deviancy training five years later (age 16-17 years). By coupling longitudinal social network analysis with a randomized intervention study the current findings provide initial evidence that a randomized public middle school intervention can disrupt the formation of deviant peer groups and diminish levels of adolescent deviance five years later. PMID:26377235

  2. Changing Friend Selection in Middle School: A Social Network Analysis of a Randomized Intervention Study Designed to Prevent Adolescent Problem Behavior.

    PubMed

    DeLay, Dawn; Ha, Thao; Van Ryzin, Mark; Winter, Charlotte; Dishion, Thomas J

    2016-04-01

    Adolescent friendships that promote problem behavior are often chosen in middle school. The current study examines the unintended impact of a randomized school-based intervention on the selection of friends in middle school, as well as on observations of deviant talk with friends 5 years later. Participants included 998 middle school students (526 boys and 472 girls) recruited at the onset of middle school (age 11-12 years) from three public middle schools participating in the Family Check-up model intervention. The current study focuses only on the effects of the SHAPe curriculum-one level of the Family Check-up model-on friendship choices. Participants nominated friends and completed measures of deviant peer affiliation. Approximately half of the sample (n = 500) was randomly assigned to the intervention, and the other half (n = 498) comprised the control group within each school. The results indicate that the SHAPe curriculum affected friend selection within school 1 but not within schools 2 or 3. The effects of friend selection in school 1 translated into reductions in observed deviancy training 5 years later (age 16-17 years). By coupling longitudinal social network analysis with a randomized intervention study, the current findings provide initial evidence that a randomized public middle school intervention can disrupt the formation of deviant peer groups and diminish levels of adolescent deviance 5 years later.

  3. Affinity selection of Nipah and Hendra virus-related vaccine candidates from a complex random peptide library displayed on bacteriophage virus-like particles

    DOEpatents

    Peabody, David S.; Chackerian, Bryce; Ashley, Carlee; Carnes, Eric; Negrete, Oscar

    2017-01-24

    The invention relates to virus-like particles of bacteriophage MS2 (MS2 VLPs) displaying peptide epitopes or peptide mimics of epitopes of the Nipah Virus envelope glycoprotein that elicit an immune response against Nipah Virus upon vaccination of humans or animals. Affinity selection on Nipah Virus-neutralizing monoclonal antibodies using random sequence peptide libraries on MS2 VLPs selected peptides with sequence similarity to peptide sequences found within the envelope glycoprotein of Nipah itself, thus identifying the epitopes the antibodies recognize. The selected peptide sequences themselves are not necessarily identical in all respects to a sequence within the Nipah Virus glycoprotein, and therefore may be referred to as epitope mimics. VLPs displaying these epitope mimics can serve as a vaccine. On the other hand, display of the corresponding wild-type sequence derived from Nipah Virus and corresponding to the epitope mapped by affinity selection may also be used as a vaccine.

  4. Increasing the Generalizability of ANOVA Results by Judicious Selection of Fixed-, Random-, and Mixed-Effects ANOVA Models.

    ERIC Educational Resources Information Center

    Baugh, Frank G.

    The analysis of variance (ANOVA) is a frequently used statistical procedure by which the equality of more than two population means can be tested without inflating the Type I error rate (D. Hinkle, W. Wiersma, and S. Jurs, 1998). Fixed-, random-, and mixed-effects ANOVA models are each capable of yielding interesting and useful results when…

  5. Blood Selenium Concentration and Blood Cystatin C Concentration in a Randomly Selected Population of Healthy Children Environmentally Exposed to Lead and Cadmium.

    PubMed

    Gać, Paweł; Pawlas, Natalia; Wylężek, Paweł; Poręba, Rafał; Poręba, Małgorzata; Pawlas, Krystyna

    2017-01-01

    This study aimed to evaluate the relationship between blood selenium concentration (Se-B) and blood cystatin C concentration (CST) in a randomly selected population of healthy children environmentally exposed to lead and cadmium. The studies were conducted on 172 randomly selected children (7.98 ± 0.97 years). Subgroups of participants were distinguished with marginally low blood selenium concentration (Se-B 40-59 μg/l), suboptimal blood selenium concentration (Se-B 60-79 μg/l) or optimal blood selenium concentration (Se-B ≥ 80 μg/l). At the subsequent stage, analogous subgroups were selected separately among children with BMI below the median value (BMI <16.48 kg/m(2)) and children with BMI at or above the median value (BMI ≥16.48 kg/m(2)). In all participants, Se-B and CST values were estimated. In the entire group of examined children, no significant differences in mean CST values were detected between groups distinguished on the basis of normative Se-B values. Among children with BMI below 16.48 kg/m(2), children with marginally low Se-B manifested significantly higher mean CST values compared to children with optimal Se-B (0.95 ± 0.07 vs. 0.82 ± 0.15 mg/l, p < 0.05). In summary, in a randomly selected population of healthy children, no overall relationship could be detected between blood selenium concentration and blood cystatin C concentration. In children with a low body mass index, however, a negative non-linear relationship was present between blood selenium concentration and blood cystatin C concentration.

  6. Purification of polyclonal anti-conformational antibodies for use in affinity selection from random peptide phage display libraries: A study using the hydatid vaccine EG95

    PubMed Central

    Read, A.J.; Gauci, C.G.; Lightowlers, M.W.

    2009-01-01

    The use of polyclonal antibodies to screen random peptide phage display libraries often results in the recognition of a large number of peptides that mimic linear epitopes on various proteins. There appears to be a bias in the use of this technology toward the selection of peptides that mimic linear epitopes. In many circumstances the correct folding of a protein immunogen is required for conferring protection. The use of random peptide phage display libraries to identify peptide mimics of conformational epitopes in these cases requires a strategy for overcoming this bias. Conformational epitopes on the hydatid vaccine EG95 have been shown to result in protective immunity in sheep, whereas linear epitopes are not protective. In this paper we describe a strategy that results in the purification of polyclonal antibodies directed against conformational epitopes while eliminating antibodies directed against linear epitopes. These affinity purified antibodies were then used to select a peptide from a random peptide phage display library that has the capacity to mimic conformational epitopes on EG95. This peptide was subsequently used to affinity purify monospecific antibodies against EG95. PMID:19349218

  7. Free variable selection QSPR study to predict (19)F chemical shifts of some fluorinated organic compounds using Random Forest and RBF-PLS methods.

    PubMed

    Goudarzi, Nasser

    2016-04-05

    In this work, two new and powerful chemometrics methods are applied to the modeling and prediction of the (19)F chemical shift values of some fluorinated organic compounds. The radial basis function-partial least squares (RBF-PLS) and random forest (RF) methods are employed to construct models to predict the (19)F chemical shifts. No separate variable selection method was used in this study, since the RF method can serve as both a variable selection and a modeling technique. Effects of the important parameters affecting the RF prediction power, such as the number of trees (nt) and the number of randomly selected variables used to split each node (m), were investigated. The root-mean-square errors of prediction (RMSEP) for the training set and the prediction set for the RBF-PLS and RF models were 44.70, 23.86, 29.77, and 23.69, respectively. Also, the correlation coefficients of the prediction set for the RBF-PLS and RF models were 0.8684 and 0.9313, respectively. The results obtained reveal that the RF model can be used as a powerful chemometrics tool for quantitative structure-property relationship (QSPR) studies.

  8. SNPs selected by information content outperform randomly selected microsatellite loci for delineating genetic identification and introgression in the endangered dark European honeybee (Apis mellifera mellifera).

    PubMed

    Muñoz, Irene; Henriques, Dora; Jara, Laura; Johnston, J Spencer; Chávez-Galarza, Julio; De La Rúa, Pilar; Pinto, M Alice

    2016-11-14

    The honeybee (Apis mellifera) has been threatened by multiple factors including pests and pathogens, pesticides and loss of locally adapted gene complexes due to replacement and introgression. In western Europe, the genetic integrity of the native A. m. mellifera (M-lineage) is endangered due to trading and intensive queen breeding with commercial subspecies of eastern European ancestry (C-lineage). Effective conservation actions require reliable molecular tools to identify pure-bred A. m. mellifera colonies. Microsatellites have been preferred for identification of A. m. mellifera stocks across conservation centres. However, owing to high throughput, easy transferability between laboratories and low genotyping error, SNPs promise to become popular. Here, we compared the resolving power of a widely utilized microsatellite set to detect structure and introgression with that of different sets that combine a variable number of SNPs selected for their information content and genomic proximity to the microsatellite loci. Contrary to every SNP data set, microsatellites did not discriminate between the two lineages in the PCA space. Mean introgression proportions were identical across the two marker types, although at the individual level, microsatellites' performance was relatively poor at the upper range of Q-values, a result reflected by their lower precision. Our results suggest that SNPs are more accurate and powerful than microsatellites for identification of A. m. mellifera colonies, especially when they are selected by information content.

  9. Feature selection and classification of urinary mRNA microarray data by iterative random forest to diagnose renal fibrosis: a two-stage study

    PubMed Central

    Zhou, Le-Ting; Cao, Yu-Han; Lv, Lin-Li; Ma, Kun-Ling; Chen, Ping-Sheng; Ni, Hai-Feng; Lei, Xiang-Dong; Liu, Bi-Cheng

    2017-01-01

    Renal fibrosis is a common pathological pathway of progressive chronic kidney disease (CKD). However, kidney function parameters are suboptimal for detecting early fibrosis, and therefore, novel biomarkers are urgently needed. We designed a 2-stage study and constructed a targeted microarray to detect urinary mRNAs of CKD patients with renal biopsy and healthy participants. We analysed the microarray data by an iterative random forest method to select candidate biomarkers and produce a more accurate classifier of renal fibrosis. Seventy-six and 49 participants were enrolled into stage I and stage II studies, respectively. By the iterative random forest method, we identified a four-mRNA signature in urinary sediment, including TGFβ1, MMP9, TIMP2, and vimentin, as important features of tubulointerstitial fibrosis (TIF). All four mRNAs significantly correlated with TIF scores and discriminated TIF with high sensitivity, which was further validated in the stage-II study. The combined classifiers showed excellent sensitivity and outperformed serum creatinine and estimated glomerular filtration rate measurements in diagnosing TIF. Another four mRNAs significantly correlated with glomerulosclerosis. These findings showed that urinary mRNAs can serve as sensitive biomarkers of renal fibrosis, and the random forest classifier containing urinary mRNAs showed favourable performance in diagnosing early renal fibrosis. PMID:28045061
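
    The sketch below shows one generic form of iterative random-forest feature selection of the kind used here: fit a forest, rank features by importance, discard the weaker half, and repeat until a small signature remains. The iteration rule, data and gene labels are illustrative assumptions; the paper's exact procedure may differ.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(1)
        X = rng.normal(size=(125, 60))            # urinary mRNA levels (synthetic)
        y = (X[:, 0] + X[:, 1] > 0).astype(int)   # toy fibrosis labels
        genes = np.array(["mRNA_%02d" % i for i in range(X.shape[1])])

        keep = np.arange(X.shape[1])
        while keep.size > 4:                      # stop at a four-feature signature
            rf = RandomForestClassifier(n_estimators=500, random_state=0)
            rf.fit(X[:, keep], y)
            order = np.argsort(rf.feature_importances_)[::-1]
            keep = keep[order[:max(4, keep.size // 2)]]   # retain the top half

        print("selected signature:", genes[keep])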

  10. H-DROP: an SVM based helical domain linker predictor trained with features optimized by combining random forest and stepwise selection

    NASA Astrophysics Data System (ADS)

    Ebina, Teppei; Suzuki, Ryosuke; Tsuji, Ryotaro; Kuroda, Yutaka

    2014-08-01

    Domain linker prediction is attracting much interest as it can help identify novel domains suitable for high throughput proteomics analysis. Here, we report H-DROP, an SVM-based Helical Domain linker pRediction using OPtimal features. H-DROP is, to the best of our knowledge, the first predictor for specifically and effectively identifying helical linkers. This was made possible first because a large training dataset became available from IS-Dom, and second because we selected a small number of optimal features from a huge number of potential ones. The training helical linker dataset, which included 261 helical linkers, was constructed by detecting helical residues at the boundary regions of two independent structural domains listed in our previously reported IS-Dom dataset. 45 optimal feature candidates were selected from 3,000 features by random forest, which were further reduced to 26 optimal features by stepwise selection. The prediction sensitivity and precision of H-DROP were 35.2 and 38.8 %, respectively. These values were over 10.7 % higher than those of control methods, including our previously developed DROP, which is a coil linker predictor, and PPRODO, which is trained with undifferentiated domain boundary sequences. Overall, these results indicated that helical linkers can be predicted from sequence information alone by using a strictly curated training dataset for helical linkers and a carefully selected set of optimal features. H-DROP is available at http://domserv.lab.tuat.ac.jp.
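
    A minimal sketch of the two-stage feature optimization described above: random-forest importances shortlist candidate features, and a forward stepwise search wrapped around the final SVM selects the optimal subset. Feature counts are scaled down, the data are synthetic, and scikit-learn's SequentialFeatureSelector stands in for the paper's stepwise procedure.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.feature_selection import SequentialFeatureSelector
        from sklearn.svm import SVC

        rng = np.random.default_rng(2)
        X = rng.normal(size=(200, 300))              # sequence-derived features (synthetic)
        y = (X[:, :3].sum(axis=1) > 0).astype(int)   # toy linker / non-linker labels

        # Stage 1: random-forest importance ranking -> candidate subset
        rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
        candidates = np.argsort(rf.feature_importances_)[::-1][:20]

        # Stage 2: forward stepwise selection around the SVM itself
        sfs = SequentialFeatureSelector(SVC(kernel="rbf"), n_features_to_select=8,
                                        direction="forward", cv=5)
        sfs.fit(X[:, candidates], y)
        print("optimal features:", sorted(candidates[sfs.get_support()]))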

  11. Bendamustine, thalidomide and dexamethasone combination therapy for relapsed/refractory myeloma patients: results of the MUKone randomized dose selection trial.

    PubMed

    Schey, Steve; Brown, Sarah R; Tillotson, Avie-Lee; Yong, Kwee; Williams, Cathy; Davies, Faith; Morgan, Gareth; Cavenagh, Jamie; Cook, Gordon; Cook, Mark; Orti, Guillermo; Morris, Curly; Sherratt, Debbie; Flanagan, Louise; Gregory, Walter; Cavet, James

    2015-08-01

    There is a significant unmet need for effective therapy in relapsed myeloma patients once they become refractory to bortezomib and lenalidomide. While data from the front-line setting suggest bendamustine is superior to melphalan, there is no information defining the optimal bendamustine dose in multiply-treated patients. We report a multi-centre randomized two-stage phase 2 trial simultaneously assessing the deliverability and activity of two doses of bendamustine (60 mg/m2 vs. 100 mg/m2) on days 1 and 8, thalidomide (100 mg) on days 1-21 and low-dose dexamethasone (20 mg) on days 1, 8, 15 and 22 of a 28-d cycle. Ninety-four relapsing patients were treated on trial, with a median of three prior treatment lines. A pre-planned interim deliverability and activity assessment led to closure of the 100 mg/m2 arm due to excess cytopenias, and to amendment of the entry criteria for cytopenias. Non-haematological toxicities, including thromboembolism and neurotoxicity, were infrequent. In the 60 mg/m2 arm, treatment was deliverable in 61.1% of subjects and the partial response rate was 46.3% in the study-eligible population, with 7.5 months progression-free survival. This study demonstrates that bendamustine at 60 mg/m2 twice per month with thalidomide and dexamethasone is deliverable for repeated cycles in heavily pre-treated myeloma patients and has substantial clinical activity.

  12. Recruitment strategies should not be randomly selected: empirically improving recruitment success and diversity in developmental psychology research

    PubMed Central

    Sugden, Nicole A.; Moulson, Margaret C.

    2015-01-01

    Psychological and developmental research has been critiqued for the lack of diversity of research samples. Because differences in culture, race, and ethnicity can influence participant behavior, limited diversity limits the generalizability of the findings. These differences may also impact how participants behave in response to recruitment attempts, which suggests that recruitment itself may be leveraged to increase sample diversity. The goal of the current study was to determine what factors, within a recruitment interaction, could be leveraged to increase success and diversity when recruiting families with children for developmental research. Study 1 found that three factors influenced success: (1) recruitment was more successful when other potential participants were also interested (i.e., recruiters were busy), (2) recruiters of particular races were more successful than recruiters of other races, and (3) differences in success were related to what the recruiter said to engage the potential participant (i.e., the script). The latter two factors interacted, suggesting some recruiters were using less optimal scripts. To improve success rates, study 2 randomly assigned scripts to recruiters and encouraged them to recruit more vigorously during busy periods. Study 2 found that two factors influenced success: (1) some scripts were more successful than others and (2) we were more successful at recruiting non-White potential participants than White participants. These two factors interacted, with some scripts being more successful with White families and other scripts being more successful with non-White families. This intervention significantly increased the recruitment success rate by 8.1% and the overall number of families recruited by 15.3%. These findings reveal that empirically evaluating and tailoring recruitment efforts based on the most successful strategies is effective in boosting diversity through increased participation of children from non-White families. PMID:25972829

  13. Recruitment strategies should not be randomly selected: empirically improving recruitment success and diversity in developmental psychology research.

    PubMed

    Sugden, Nicole A; Moulson, Margaret C

    2015-01-01

    Psychological and developmental research has been critiqued for the lack of diversity of research samples. Because differences in culture, race, and ethnicity can influence participant behavior, limited diversity limits the generalizability of the findings. These differences may also impact how participants behave in response to recruitment attempts, which suggests that recruitment itself may be leveraged to increase sample diversity. The goal of the current study was to determine what factors, within a recruitment interaction, could be leveraged to increase success and diversity when recruiting families with children for developmental research. Study 1 found that three factors influenced success: (1) recruitment was more successful when other potential participants were also interested (i.e., recruiters were busy), (2) recruiters of particular races were more successful than recruiters of other races, and (3) differences in success were related to what the recruiter said to engage the potential participant (i.e., the script). The latter two factors interacted, suggesting some recruiters were using less optimal scripts. To improve success rates, study 2 randomly assigned scripts to recruiters and encouraged them to recruit more vigorously during busy periods. Study 2 found that two factors influenced success: (1) some scripts were more successful than others and (2) we were more successful at recruiting non-White potential participants than White participants. These two factors interacted, with some scripts being more successful with White families and other scripts being more successful with non-White families. This intervention significantly increased the recruitment success rate by 8.1% and the overall number of families recruited by 15.3%. These findings reveal that empirically evaluating and tailoring recruitment efforts based on the most successful strategies is effective in boosting diversity through increased participation of children from non-White families.

  14. Treatment Selection Choices Should Not Be Based on Benefits or Costs Alone: A Head-to-Head Randomized Controlled Trial of Antiviral Drugs for Hepatitis C

    PubMed Central

    Davitkov, Perica; Chandar, Apoorva Krishna; Hirsch, Amy; Compan, Anita; Silveira, Marina G.; Anthony, Donald D.; Smith, Suzanne; Gideon, Clare; Bonomo, Robert A.; Falck-Ytter, Yngve

    2016-01-01

    , pragmatic randomized controlled trials are necessary for guidance beyond just acquisition costs and to make evidence-based formulary selections when multiple effective treatments are available. (Clinicaltrials.gov registration: NCT02113631). PMID:27741230

  15. Impact of random and systematic recall errors and selection bias in case-control studies on mobile phone use and brain tumors in adolescents (CEFALO study).

    PubMed

    Aydin, Denis; Feychting, Maria; Schüz, Joachim; Andersen, Tina Veje; Poulsen, Aslak Harbo; Prochazka, Michaela; Klaeboe, Lars; Kuehni, Claudia E; Tynes, Tore; Röösli, Martin

    2011-07-01

    Whether the use of mobile phones is a risk factor for brain tumors in adolescents is currently being studied. Case-control studies investigating this possible relationship are prone to recall error and selection bias. We assessed the potential impact of random and systematic recall error and selection bias on odds ratios (ORs) by performing simulations based on real data from an ongoing case-control study of mobile phones and brain tumor risk in children and adolescents (CEFALO study). Simulations were conducted for two mobile phone exposure categories: regular and heavy use. Our choice of levels of recall error was guided by a validation study that compared objective network operator data with the self-reported amount of mobile phone use in CEFALO. In our validation study, cases overestimated their number of calls by 9% on average and controls by 34%. Cases also overestimated their duration of calls by 52% on average and controls by 163%. The participation rates in CEFALO were 83% for cases and 71% for controls. In a variety of scenarios, the combined impact of recall error and selection bias on the estimated ORs was complex. These simulations are useful for the interpretation of previous case-control studies on brain tumor and mobile phone use in adults as well as for the interpretation of future studies on adolescents.
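
    The flavor of these simulations can be sketched in a few lines: generate true phone use with no effect on case status, let cases and controls over-report by the different factors quoted above (9% vs. 34% for call counts), and recompute the odds ratio for "heavy use". The usage distribution, cut-off and sample size below are illustrative assumptions, not CEFALO's.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 2000
        case = rng.integers(0, 2, n).astype(bool)            # case/control status
        true_calls = rng.gamma(2.0, 50.0, size=n)            # true call counts (toy)
        reported = true_calls * np.where(case, 1.09, 1.34)   # differential recall error
        heavy = reported > np.quantile(reported, 0.75)       # "heavy user" cut-off

        a = np.sum(heavy & case);  b = np.sum(heavy & ~case)
        c = np.sum(~heavy & case); d = np.sum(~heavy & ~case)
        print("OR under differential recall error: %.2f" % (a * d / (b * c)))
        # With no true effect built in, any departure of the OR from 1.0
        # here reflects the differential reporting error alone.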

  16. A theory for the origin of a self-replicating chemical system. I - Natural selection of the autogen from short, random oligomers

    NASA Technical Reports Server (NTRS)

    White, D. H.

    1980-01-01

    A general theory is presented for the origin of a self-replicating chemical system, termed an autogen, which is capable of both crude replication and translation (protein synthesis). The theory requires the availability of free energy and monomers to the system, a significant background low-yield synthesis of kinetically stable oligopeptides and oligonucleotides, the localization of the oligomers, crude oligonucleotide selectivity of amino acids during oligopeptide synthesis, crude oligonucleotide replication, and two short peptide families which catalyze replication and translation, to produce a localized group of at least one copy each of two protogenes and two protoenzymes. The model posits a process of random oligomerization, followed by the random nucleation of functional components and the rapid autocatalytic growth of the functioning autogen to macroscopic amounts, to account for the origin of the first self-replicating system. Such a process contains steps of such high probability and short time periods that it is suggested that the emergence of an autogen in a laboratory experiment of reasonable time scale may be possible.

  17. A comparison of the effects of random and selective mass extinctions on erosion of evolutionary history in communities of digital organisms.

    PubMed

    Yedid, Gabriel; Stredwick, Jason; Ofria, Charles A; Agapow, Paul-Michael

    2012-01-01

    The effect of mass extinctions on phylogenetic diversity and branching history of clades remains poorly understood in paleobiology. We examined the phylogenies of communities of digital organisms undergoing open-ended evolution as we subjected them to instantaneous "pulse" extinctions, choosing survivors at random, and to prolonged "press" extinctions involving a period of low resource availability. We measured age of the phylogenetic root and tree stemminess, and evaluated how branching history of the phylogenetic trees was affected by the extinction treatments. We found that strong random (pulse) and strong selective extinction (press) both left clear long-term signatures in root age distribution and tree stemminess, and eroded deep branching history to a greater degree than did weak extinction and control treatments. The widely-used Pybus-Harvey gamma statistic showed a clear short-term response to extinction and recovery, but differences between treatments diminished over time and did not show a long-term signature. The characteristics of post-extinction phylogenies were often affected as much by the recovery interval as by the extinction episode itself.
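
    For reference, the Pybus-Harvey gamma statistic used above can be computed from the internode intervals of an ultrametric tree (g_k being the time during which exactly k lineages exist, k = 2..n), following the Pybus & Harvey (2000) definition; the example intervals below are made up.

        import math

        def pybus_harvey_gamma(g):          # g[0] is g_2, ..., g[-1] is g_n
            n = len(g) + 1                  # number of tips
            T = sum((k + 2) * gk for k, gk in enumerate(g))   # T = sum_j j * g_j
            partial, acc = [], 0.0
            for k, gk in enumerate(g[:-1]): # T_i = sum_{k=2}^{i} k * g_k, i = 2..n-1
                acc += (k + 2) * gk
                partial.append(acc)
            mean_partial = sum(partial) / (n - 2)
            return (mean_partial - T / 2.0) / (T * math.sqrt(1.0 / (12.0 * (n - 2))))

        # Roughly constant intervals give gamma near 0; intervals concentrated
        # near the root (early branching) push gamma negative.
        print(round(pybus_harvey_gamma([0.5, 0.4, 0.3, 0.25, 0.2, 0.15]), 3))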

  18. A Comparison of the Effects of Random and Selective Mass Extinctions on Erosion of Evolutionary History in Communities of Digital Organisms

    PubMed Central

    Yedid, Gabriel; Stredwick, Jason; Ofria, Charles A.; Agapow, Paul-Michael

    2012-01-01

    The effect of mass extinctions on phylogenetic diversity and branching history of clades remains poorly understood in paleobiology. We examined the phylogenies of communities of digital organisms undergoing open-ended evolution as we subjected them to instantaneous “pulse” extinctions, choosing survivors at random, and to prolonged “press” extinctions involving a period of low resource availability. We measured age of the phylogenetic root and tree stemminess, and evaluated how branching history of the phylogenetic trees was affected by the extinction treatments. We found that strong random (pulse) and strong selective extinction (press) both left clear long-term signatures in root age distribution and tree stemminess, and eroded deep branching history to a greater degree than did weak extinction and control treatments. The widely-used Pybus-Harvey gamma statistic showed a clear short-term response to extinction and recovery, but differences between treatments diminished over time and did not show a long-term signature. The characteristics of post-extinction phylogenies were often affected as much by the recovery interval as by the extinction episode itself. PMID:22693570

  19. Customer-oriented counseling for physical activity in older people: study protocol and selected baseline results of a randomized-controlled trial (ISRCTN 07330512).

    PubMed

    Leinonen, R; Heikkinen, E; Hirvensalo, M; Lintunen, T; Rasinaho, M; Sakari-Rantala, R; Kallinen, M; Koski, J; Möttönen, S; Kannas, S; Huovinen, P; Rantanen, T

    2007-04-01

    The objective of this study is to describe the rationale, design and selected baseline results of a 2-year randomized-controlled trial (RCT) on the effects of physical activity counseling in community-living older people. After a four-phase screening and data-collection process targeting all independently living people in the city center of Jyväskylä, Finland, six hundred and thirty-two 75-81-year-old cognitively intact, sedentary persons who were able to move independently outdoors at least minimally and willing to take part in the RCT were randomized into intervention and control groups. At baseline, over half of the subjects exercised less than two to three times a month and two-thirds were willing to increase their physical activity level. The desire to increase physical activity was more common (86%) among subjects with mobility limitation compared with those without (60%, P=0.004). The intervention group received an individualized face-to-face counseling session, followed by phone contacts every 3 months throughout the intervention. The study outcomes include physical activity level, mobility limitation, functional impairments, disability, mood, quality of life, use of services, institutionalization and mortality. The screening and recruitment process was feasible and succeeded well, and showed that unmet physical activity needs are common in older people.

  20. Examination of the transcription factor NtcA-binding motif by in vitro selection of DNA sequences from a random library.

    PubMed

    Jiang, F; Wisén, S; Widersten, M; Bergman, B; Mannervik, B

    2000-08-25

    A recursive in vitro selection among random DNA sequences was used for analysis of the cyanobacterial transcription factor NtcA-binding motifs. An eight-base palindromic sequence, TGTA-(N8)-TACA, was found to be the optimal NtcA-binding sequence. The more divergent the binding sequences, compared to this consensus sequence, the lower the NtcA affinity. The second and third bases in each four-nucleotide half of the consensus sequence were crucial for NtcA binding, and they were in general highly conserved. The most frequently occurring sequence in the middle weakly conserved region was similar to that of the NtcA-binding motif of the Anabaena sp. strain PCC 7120 glnA gene, previously known to have high affinity for NtcA. This indicates that the middle sequences were selected for high NtcA affinity. Analysis of natural NtcA-binding motifs showed that these could be classified into two groups based on differences in recognition consensus sequences. It is suggested that NtcA naturally recognizes different DNA-binding motifs, or has differential affinities to these sequences under different physiological conditions.
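
    The consensus motif reported above is easy to scan for programmatically; as a small illustration, the regular expression below matches TGTA followed by any eight bases and then TACA (the promoter string is hypothetical).

        import re

        motif = re.compile(r"TGTA[ACGT]{8}TACA")
        promoter = "GGCTTGTAACCGTTAGTACATTGG"   # made-up sequence with one hit
        for m in motif.finditer(promoter):
            print(m.start(), m.group())          # -> 4 TGTAACCGTTAGTACA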

  1. A randomized controlled trial investigating the use of a predictive nomogram for the selection of the FSH starting dose in IVF/ICSI cycles.

    PubMed

    Allegra, Adolfo; Marino, Angelo; Volpes, Aldo; Coffaro, Francesco; Scaglione, Piero; Gullo, Salvatore; La Marca, Antonio

    2017-01-23

    The number of oocytes retrieved is a relevant intermediate outcome in women undergoing IVF/intracytoplasmic sperm injection (ICSI). This trial compared the efficiency of the selection of the FSH starting dose according to a nomogram based on multiple biomarkers (age, day 3 FSH, anti-Müllerian hormone) versus an age-based strategy. The primary outcome measure was the proportion of women with an optimal number of retrieved oocytes defined as 8-14. At their first IVF/ICSI cycle, 191 patients underwent a long gonadotrophin-releasing hormone agonist protocol and were randomized to receive a starting dose of recombinant (human) FSH, based on their age (150 IU if ≤35 years, 225 IU if >35 years) or based on the nomogram. Optimal response was observed in 58/92 patients (63%) in the nomogram group and in 42/99 (42%) in the control group (+21%, 95% CI = 0.07 to 0.35, P = 0.0037). No significant differences were found in the clinical pregnancy rate or the number of embryos cryopreserved per patient. The study showed that the FSH starting dose selected according to ovarian reserve is associated with an increase in the proportion of patients with an optimal response: large trials are recommended to investigate any possible effect on the live-birth rate.

  2. A Randomized, Phase II, Biomarker-Selected Study Comparing Erlotinib to Erlotinib Intercalated With Chemotherapy in First-Line Therapy for Advanced Non–Small-Cell Lung Cancer

    PubMed Central

    Hirsch, Fred R.; Kabbinavar, Fairooz; Eisen, Tim; Martins, Renato; Schnell, Fredrick M.; Dziadziuszko, Rafal; Richardson, Katherine; Richardson, Frank; Wacker, Bret; Sternberg, David W.; Rusk, Jason; Franklin, Wilbur A.; Varella-Garcia, Marileila; Bunn, Paul A.; Camidge, D. Ross

    2011-01-01

    Purpose: Erlotinib prolongs survival in patients with advanced non–small-cell lung cancer (NSCLC). We report the results of a randomized, phase II study of erlotinib alone or intercalated with chemotherapy (CT + erlotinib) in chemotherapy-naïve patients with advanced NSCLC who were positive for epidermal growth factor receptor (EGFR) protein expression and/or with high EGFR gene copy number. Patients and Methods: A total of 143 patients were randomly assigned to either erlotinib 150 mg daily orally until disease progression (PD) occurred or to chemotherapy with paclitaxel 200 mg/m2 intravenously (IV) and carboplatin dosed by creatinine clearance (AUC 6) IV on day 1 intercalated with erlotinib 150 mg orally on days 2 through 15 every 3 weeks for four cycles followed by erlotinib 150 mg orally until PD occurred (CT + erlotinib). The primary end point was 6-month progression-free survival (PFS); secondary end points included response rate, PFS, and survival. EGFR, KRAS mutation, EGFR fluorescent in situ hybridization and immunohistochemistry, and E-cadherin and vimentin protein levels were also assessed. Results: Six-month PFS rates were 26% and 31% for the two arms (CT + erlotinib and erlotinib alone, respectively). Both were less than the historical control of 45% (P = .001 and P = .011, respectively). Median PFS times were 4.57 and 2.69 months, respectively. Patients with tumors harboring EGFR activating mutations fared better on erlotinib alone (median PFS, 18.2 months v 4.9 months for CT + erlotinib). Conclusion: The feasibility of a multicenter biomarker-driven study was demonstrated, but neither treatment arm exceeded historical controls. This study does not support combined chemotherapy and erlotinib in first-line treatment of EGFR-selected advanced NSCLC, and the patients with tumors harboring EGFR mutations had a better outcome on erlotinib alone. PMID:21825259

  3. NBI‐98854, a selective monoamine transport inhibitor for the treatment of tardive dyskinesia: A randomized, double‐blind, placebo‐controlled study

    PubMed Central

    Jimenez, Roland; Hauser, Robert A.; Factor, Stewart A.; Burke, Joshua; Mandri, Daniel; Castro‐Gayol, Julio C.

    2015-01-01

    Background: Tardive dyskinesia is a persistent movement disorder induced by chronic neuroleptic exposure. NBI‐98854 is a novel, highly selective, vesicular monoamine transporter 2 inhibitor. We present results of a randomized, 6‐week, double‐blind, placebo‐controlled, dose‐titration study evaluating the safety, tolerability, and efficacy of NBI‐98854 for the treatment of tardive dyskinesia. Methods: Male and female adult subjects with moderate or severe tardive dyskinesia were included. NBI‐98854 or placebo was given once per day starting at 25 mg and then escalated by 25 mg to a maximum of 75 mg based on dyskinesia and tolerability assessment. The primary efficacy endpoint was the change in Abnormal Involuntary Movement Scale score from baseline at week 6, scored by blinded, central video raters. The secondary endpoint was the Clinical Global Impression of Change—Tardive Dyskinesia score assessed by the blinded investigator. Results: Two hundred five potential subjects were screened, and 102 were randomized; 76% of NBI‐98854 subjects and 80% of placebo subjects reached the maximum allowed dose. Abnormal Involuntary Movement Scale scores for NBI‐98854 compared with placebo were significantly reduced (p = 0.0005). Active drug was also superior on the Clinical Global Impression of Change—Tardive Dyskinesia (p < 0.0001). Treatment‐emergent adverse event rates were 49% in the NBI‐98854 and 33% in the placebo subjects. The most common adverse events (active vs. placebo) were fatigue and headache (9.8% vs. 4.1%) and constipation and urinary tract infection (3.9% vs. 6.1%). No clinically relevant changes in safety assessments were noted. Conclusion: NBI‐98854 significantly improved tardive dyskinesia and was well tolerated in patients. These results support the phase 3 clinical trials of NBI‐98854 now underway. © 2015 The Authors. Movement Disorders published by Wiley Periodicals, Inc. on behalf of the International Parkinson and Movement Disorder Society.

  4. Selection of IgG Variants with Increased FcRn Binding Using Random and Directed Mutagenesis: Impact on Effector Functions

    PubMed Central

    Monnet, Céline; Jorieux, Sylvie; Urbain, Rémi; Fournier, Nathalie; Bouayadi, Khalil; De Romeuf, Christophe; Behrens, Christian K.; Fontayne, Alexandre; Mondon, Philippe

    2015-01-01

    Despite the reasonably long half-life of immunoglogulin G (IgGs), market pressure for higher patient convenience while conserving efficacy continues to drive IgG half-life improvement. IgG half-life is dependent on the neonatal Fc receptor (FcRn), which among other functions, protects IgG from catabolism. FcRn binds the Fc domain of IgG at an acidic pH ensuring that endocytosed IgG will not be degraded in lysosomal compartments and will then be released into the bloodstream. Consistent with this mechanism of action, several Fc-engineered IgG with increased FcRn affinity and conserved pH dependency were designed and resulted in longer half-life in vivo in human FcRn-transgenic mice (hFcRn), cynomolgus monkeys, and recently in healthy humans. These IgG variants were usually obtained by in silico approaches or directed mutagenesis in the FcRn-binding site. Using random mutagenesis, combined with a pH-dependent phage display selection process, we isolated IgG variants with improved FcRn-binding, which exhibited longer in vivo half-life in hFcRn mice. Interestingly, many mutations enhancing Fc/FcRn interaction were located at a distance from the FcRn-binding site validating our random molecular approach. Directed mutagenesis was then applied to generate new variants to further characterize our IgG variants and the effect of the mutations selected. Since these mutations are distributed over the whole Fc sequence, binding to other Fc effectors, such as complement C1q and FcγRs, was dramatically modified, even by mutations distant from these effectors’ binding sites. Hence, we obtained numerous IgG variants with increased FcRn-binding and different binding patterns to other Fc effectors, including variants without any effector function, providing distinct “fit-for-purpose” Fc molecules. We therefore provide evidence that half-life and effector functions should be optimized simultaneously as mutations can have unexpected effects on all Fc receptors that are critical

  5. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    ERIC Educational Resources Information Center

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…
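
    The contrast the activity teaches can be shown in a few lines of Python (population and group sizes are made up): random selection draws the sample from the population, while random assignment then splits that sample into treatment groups.

        import random

        random.seed(0)
        population = ["roach_%02d" % i for i in range(40)]

        sample = random.sample(population, 10)   # random selection from the population
        random.shuffle(sample)                   # random assignment to conditions
        treatment, control = sample[:5], sample[5:]
        print("treatment:", treatment)
        print("control:  ", control)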

  6. Rock magnetic evidence of non-random raw material selection criteria in Cerro Toledo Obsidian Artifacts from Valles Caldera, New Mexico

    NASA Astrophysics Data System (ADS)

    Gregovich, A.; Feinberg, J. M.; Steffen, A.; Sternberg, R. S.

    2014-12-01

    Stone tools are one of the most enduring forms of ancient human behavior available to anthropologists. The geologic materials that comprise stone tools are a reflection of the rocks that were available locally or through trade, as are the intended use of the tools and the knapping technology needed to produce them. Investigation of the rock magnetic and geochemical characteristics of the artifacts and the geological source materials provides a baseline to explore these past behaviors. This study uses rock magnetic properties to explore the raw material selection criteria involved in the production of obsidian tools in the region around Valles Caldera in northern New Mexico. Obsidian is locally abundant and was traded by tribes across the central United States. Here we compare the rock magnetic properties of a sample of obsidian projectile points (N =25) that have been geochemically sourced to the Cerro Toledo obsidian flow with geological samples collected from four sites within the same flow (N =135). This collection of archaeological artifacts, albeit small, contains representatives of at least 8 different point styles that were used over 6000 years from the Archaic into the Late Prehistoric. Bulk rock hysteresis parameters (Mr, Ms, Bc, and Bcr) and low-field susceptibility (Χ) measurements show that the projectile points generally contain a lower concentration of magnetic minerals than the geologic samples. For example, the artifacts' median Ms value is 2.9 x 10-3 Am2kg-1, while that of the geological samples is 6.5 x 10-3 Am2kg-1. The concentration of magnetic minerals in obsidian is a proxy for the concentration of microlites in general, and this relationship suggests that although obsidian was locally abundant, toolmakers employed non-random selection criteria resulting in generally lower concentrations of microlites in their obsidian tools.

  7. Selective processing of auditory evoked responses with iterative-randomized stimulation and averaging: A strategy for evaluating the time-invariant assumption.

    PubMed

    Valderrama, Joaquin T; de la Torre, Angel; Medina, Carlos; Segura, Jose C; Thornton, A Roger D

    2016-03-01

    The recording of auditory evoked potentials (AEPs) at fast rates allows the study of neural adaptation, improves accuracy in estimating hearing threshold and may help in diagnosing certain pathologies. Stimulation sequences used to record AEPs at fast rates must be designed with a certain jitter, i.e., they cannot be strictly periodic. Some authors believe that stimuli from wide-jittered sequences may evoke auditory responses of different morphology, in which case the time-invariance assumption would not hold. This paper describes a methodology that can be used to analyze the time-invariance assumption in jittered stimulation sequences. The proposed method [Split-IRSA] is based on an extended version of the iterative randomized stimulation and averaging (IRSA) technique, including selective processing of sweeps according to a predefined criterion. The fundamentals, the mathematical basis and relevant implementation guidelines of this technique are presented in this paper. The results of this study show that Split-IRSA performs adequately and that both fast and slow mechanisms of adaptation influence the evoked-response morphology; both mechanisms should therefore be considered when time-invariance is assumed. The significance of these findings is discussed.
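
    The core IRSA idea can be sketched as follows; this is a deliberately simplified toy version under stated assumptions (synthetic signal, onsets and response shape), not the authors' implementation. With jittered, overlapping stimulation, each iteration subtracts the current response estimate at every neighbouring stimulus from each sweep and re-averages, so the estimate converges despite the overlap.

        import numpy as np

        rng = np.random.default_rng(4)
        M = 80                                         # response length (samples)
        true_resp = (np.sin(np.linspace(0, 3 * np.pi, M))
                     * np.exp(-np.linspace(0, 4, M)))
        onsets = np.cumsum(rng.integers(30, 60, 200))  # jittered ISIs shorter than M
        y = rng.normal(scale=0.3, size=onsets[-1] + M) # noise floor
        for t in onsets:
            y[t:t + M] += true_resp                    # overlapping evoked responses

        est = np.zeros(M)
        for _ in range(20):                            # iterative refinement
            clean = y.copy()
            for t in onsets:                           # remove current estimate everywhere
                clean[t:t + M] -= est
            # each sweep = raw segment minus neighbours' estimated contributions
            sweeps = np.stack([clean[t:t + M] + est for t in onsets])
            est = sweeps.mean(axis=0)

        print("correlation with true response: %.3f"
              % np.corrcoef(est, true_resp)[0, 1])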

  8. Prevalence of skeletal and eye malformations in frogs from north-central United States: estimations based on collections from randomly selected sites

    USGS Publications Warehouse

    Schoff, P.K.; Johnson, C.M.; Schotthoefer, A.M.; Murphy, J.E.; Lieske, C.; Cole, R.A.; Johnson, L.B.; Beasley, V.R.

    2003-01-01

    Skeletal malformation rates for several frog species were determined in a set of randomly selected wetlands in the north-central USA over three consecutive years. In 1998, 62 sites yielded 389 metamorphic frogs, nine (2.3%) of which had skeletal or eye malformations. A subset of the original sites was surveyed in the following 2 yr. In 1999, 1,085 metamorphic frogs were collected from 36 sites and 17 (1.6%) had skeletal or eye malformations, while in 2000, examination of 1,131 metamorphs yielded 16 (1.4%) with skeletal or eye malformations. Hindlimb malformations predominated in all three years, but other abnormalities, involving forelimb, eye, and pelvis were also found. Northern leopard frogs (Rana pipiens) constituted the majority of collected metamorphs as well as most of the malformed specimens. However, malformations were also noted in mink frogs (R. septentrionalis), wood frogs (R. sylvatica), and gray tree frogs (Hyla spp.). The malformed specimens were found in clustered sites in all three years but the cluster locations were not the same in any year. The malformation rates reported here are higher than the 0.3% rate determined for metamorphic frogs collected from similar sites in Minnesota in the 1960s, and thus, appear to represent an elevation of an earlier baseline malformation rate.

  9. On Random Numbers and Design

    ERIC Educational Resources Information Center

    Ben-Ari, Morechai

    2004-01-01

    The term "random" is frequently used in discussion of the theory of evolution, even though the mathematical concept of randomness is problematic and of little relevance in the theory. Therefore, since the core concept of the theory of evolution is the non-random process of natural selection, the term random should not be used in teaching the…

  10. An assessment of the quality of care for children in eighteen randomly selected district and sub-district hospitals in Bangladesh

    PubMed Central

    2012-01-01

    Background: Quality hospital care is important in ensuring that the needs of severely ill children are met to avert child mortality. However, the quality of hospital care for children in developing countries has often been found poor. As the first step of a country road map for improving hospital care for children, we assessed the baseline situation with respect to the quality of care provided to children under five years of age in district and sub-district level hospitals in Bangladesh. Methods: Using adapted World Health Organization (WHO) hospital assessment tools and standards, an assessment of 18 randomly selected district (n=6) and sub-district (n=12) hospitals was undertaken. Teams of trained assessors used direct case observation, record review, interviews, and Management Information System (MIS) data to assess the quality of clinical case management and monitoring; infrastructure, processes and hospital administration; and essential hospital and laboratory supports, drugs and equipment. Results: Findings demonstrate that the overall quality of care provided in these hospitals was poor. No hospital had a functioning triage system to prioritise those children most in need of immediate care. Laboratory supports and essential equipment were deficient. Only one hospital had all of the essential drugs for paediatric care. Less than a third of hospitals had a back-up power supply, and just under half had functioning arrangements for safe drinking water. Clinical case management was found to be sub-optimal for prevalent illnesses, as was the quality of neonatal care. Conclusion: Action is needed to improve the quality of paediatric care in hospital settings in Bangladesh, with a particular need to invest in improving newborn care. PMID:23268650

  11. Sequence Based Prediction of DNA-Binding Proteins Based on Hybrid Feature Selection Using Random Forest and Gaussian Naïve Bayes

    PubMed Central

    Lou, Wangchao; Wang, Xiaoqing; Chen, Fan; Chen, Yixiao; Jiang, Bo; Zhang, Hua

    2014-01-01

    Developing an efficient method for determination of DNA-binding proteins, due to their vital roles in gene regulation, is highly desired, since it would be invaluable for advancing our understanding of protein functions. In this study, we proposed a new method for the prediction of DNA-binding proteins, by performing the feature ranking using random forest and the wrapper-based feature selection using a forward best-first search strategy. The features comprise information from the primary sequence, predicted secondary structure, predicted relative solvent accessibility, and position specific scoring matrix. The proposed method, called DBPPred, used Gaussian naïve Bayes as the underlying classifier, since it outperformed five other classifiers, including decision tree, logistic regression, k-nearest neighbor, support vector machine with polynomial kernel, and support vector machine with radial basis function. As a result, the proposed DBPPred yields the highest average accuracy of 0.791 and average MCC of 0.583 according to the five-fold cross validation with ten runs on the training benchmark dataset PDB594. Subsequently, blind tests on the independent dataset PDB186 were performed by the proposed model trained on the entire PDB594 dataset and by five other existing methods (including iDNA-Prot, DNA-Prot, DNAbinder, DNABIND and DBD-Threader), with the result that the proposed DBPPred yielded the highest accuracy of 0.769, MCC of 0.538, and AUC of 0.790. The independent tests performed by the proposed DBPPred on a large non-DNA-binding protein dataset and two RNA-binding protein datasets also showed improved or comparable quality when compared with the relevant prediction methods. Moreover, we observed that the majority of the selected features are statistically significantly different between the mean feature values of the DNA-binding and the non-DNA-binding proteins. All of the experimental results indicate that the proposed DBPPred
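
    The classifier comparison behind the choice of Gaussian naïve Bayes can be sketched as a simple cross-validated bake-off; the feature matrix below is synthetic and the candidate list mirrors the six classifiers named in the abstract.

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.naive_bayes import GaussianNB
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.svm import SVC

        rng = np.random.default_rng(5)
        X = rng.normal(size=(300, 40))               # stand-in sequence features
        y = (X[:, :4].sum(axis=1) > 0).astype(int)   # toy binding labels

        classifiers = {
            "GaussianNB": GaussianNB(),
            "DecisionTree": DecisionTreeClassifier(random_state=0),
            "LogisticRegression": LogisticRegression(max_iter=1000),
            "kNN": KNeighborsClassifier(),
            "SVM (poly)": SVC(kernel="poly"),
            "SVM (RBF)": SVC(kernel="rbf"),
        }
        for name, clf in classifiers.items():
            acc = cross_val_score(clf, X, y, cv=5).mean()   # five-fold CV
            print("%-20s %.3f" % (name, acc))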

  12. A prospective randomized multicenter trial of amnioreduction versus selective fetoscopic laser photocoagulation for the treatment of severe twin–twin transfusion syndrome

    PubMed Central

    Crombleholme, Timothy M.; Shera, David; Lee, Hanmin; Johnson, Mark; D’Alton, Mary; Porter, Flint; Chyu, Jacquelyn; Silver, Richard; Abuhamad, Alfred; Saade, George; Shields, Laurence; Kauffman, David; Stone, Joanne; Albanese, Craig T.; Bahado-Singh, Ray; Ball, Robert H.; Bilaniuk, Larissa; Coleman, Beverly; Farmer, Diana; Feldstein, Vickie; Harrison, Michael R.; Hedrick, Holly; Livingston, Jeffrey; Lorenz, Robert P.; Miller, David A.; Norton, Mary E.; Polzin, William J.; Robinson, Julian N.; Rychik, Jack; Sandberg, Per L.; Seri, Istvan; Simon, Erin; Simpson, Lynn L.; Yedigarova, Larisa; Wilson, R. Douglas; Young, Bruce

    2009-01-01

    Objective: To examine the effect of selective fetoscopic laser photocoagulation (SFLP) versus serial amnioreduction (AR) on perinatal mortality in severe twin-twin transfusion syndrome (TTTS). Study Design: 5-year multicenter prospective randomized controlled trial. The primary outcome variable was 30-day postnatal survival of donors and recipients. Results: There is no statistically significant difference in 30-day postnatal survival between SFLP or AR treatment for donors at 55% (11/20) vs 55% (11/20) (p=1, OR=1, 95%CI=0.242 to 4.14) or recipients at 30% (6/20) vs 45% (9/20) (p=0.51, OR=1.88, 95%CI=0.44 to 8.64). There is no difference in 30-day survival of one or both twins on a per pregnancy basis between AR at 75% (15/20) and SFLP at 65% (13/20) (p=0.73, OR=1.62, 95%CI=0.34 to 8.09). Overall survival (newborns divided by the number of fetuses treated) is not statistically significant for AR at 60% (24/40) vs SFLP 45% (18/40) (p=0.18, OR=2.01, 95%CI=0.76 to 5.44). There is a statistically significant increase in fetal recipient mortality in the SFLP arm at 70% (14/20) versus the AR arm at 35% (7/20) (p=0.025, OR=5.31, 95%CI=1.19 to 27.6). This is offset by increased recipient neonatal mortality of 30% (6/20) in the AR arm. Echocardiographic abnormality in recipient twin Cardiovascular Profile Score is the most significant predictor of recipient mortality (p=0.055, OR=3.025/point) by logistic regression analysis. Conclusions: The outcome of the trial does not conclusively determine whether AR or SFLP is the superior treatment modality. TTTS cardiomyopathy appears to be an important factor in recipient survival in TTTS. PMID:17904975

  13. The RANDOM computer program: A linear congruential random number generator

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
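
    The linear congruential form is x_{n+1} = (a * x_n + c) mod m. A minimal Python sketch follows; the constants shown are the classic Park-Miller "minimal standard" parameters, not necessarily those selected in the report.

        def lcg(seed, a=16807, c=0, m=2**31 - 1):
            """Yield uniform draws in (0, 1) from a linear congruential generator."""
            x = seed
            while True:
                x = (a * x + c) % m
                yield x / m

        gen = lcg(seed=12345)
        print([round(next(gen), 6) for _ in range(5)])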

  14. Selective laser melting: a unit cell approach for the manufacture of porous, titanium, bone in-growth constructs, suitable for orthopedic applications. II. Randomized structures.

    PubMed

    Mullen, Lewis; Stamp, Robin C; Fox, Peter; Jones, Eric; Ngo, Chau; Sutcliffe, Christopher J

    2010-01-01

    In this study, the unit cell approach, which has previously been demonstrated as a method of manufacturing porous components suitable for use as orthopedic implants, has been further developed to include randomized structures. These random structures may aid the bone in-growth process because of their similarity in appearance to trabecular bone, and are shown to carry legacy properties that can be related back to the original unit cell on which they are ultimately based. In addition, it has been shown that randomization improves the mechanical properties of regular unit cell structures, resulting in anticipated improvements to both implant functionality and longevity. The study also evaluates the effect that a post-process sinter cycle has on the components, outlines the improved mechanical properties that are attainable, and describes the changes in both the macro- and microstructure that occur.

  15. [An open randomized comparative trial of efficacy and safety of selective alpha-adrenoblocker setegis (terazosin) in therapy of patients with chronic bacterial prostatitis].

    PubMed

    Trapeznikova, M F; Morozov, A P; Dutov, V V; Urenkov, S B; Pozdniakov, K V; Bychkova, N V

    2007-01-01

    An open randomized comparative trial of setegis (terazosin) has shown good subjective and objective results in patients with chronic bacterial prostatitis. The drug is well tolerated and produces insignificant side effects. It is also demonstrated that combined therapy with alpha-adrenoblockers is more effective than monotherapy with antibacterial drugs in patients with bacterial prostatitis.

  16. Glass transition of a particle in a random potential, front selection in nonlinear renormalization group, and entropic phenomena in Liouville and sinh-Gordon models

    NASA Astrophysics Data System (ADS)

    Carpentier, David; Le Doussal, Pierre

    2001-02-01

    We study via renormalization group (RG), numerics, exact bounds, and qualitative arguments the equilibrium Gibbs measure of a particle in a d-dimensional Gaussian random potential with translationally invariant logarithmic spatial correlations. We show that for any d ≥ 1 it exhibits a transition at T=Tc>0. The low-temperature glass phase has a nontrivial structure, being dominated by a few distant states (with replica symmetry breaking phenomenology). In finite dimension this transition exists only in this "marginal glass" case (energy fluctuation exponent θ=0) and disappears if correlations grow faster (single ground-state dominance θ>0) or slower (high-temperature phase). The associated extremal statistics problem for correlated energy landscapes exhibits universal features which we describe using a nonlinear Kolmogorov (KPP) RG equation. These include the tails of the distribution of the minimal energy (or free energy) and the finite-size corrections, which are universal. The glass transition is closely related to Derrida's random energy models. In d=2, the connection between this problem and Liouville and sinh-Gordon models is discussed. The glass transition of the particle exhibits interesting similarities with the weak- to strong-coupling transition in Liouville (c=1 barrier) and with a transition that we conjecture for the sinh-Gordon model, with correspondence in some exact results and RG analysis. Glassy freezing of the particle is associated with the generation under RG of new local operators and of nonsmooth configurations in Liouville. Applications to Dirac fermions in random magnetic fields at criticality reveal a peculiar "quasilocalized" regime (corresponding to the glass phase for the particle), where eigenfunctions are concentrated over a finite number of distant regions, and allow us to recover the multifractal spectrum in the delocalized regime.
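
    For orientation, the textbook form of the KPP (Fisher-KPP) front-selection equation that this class of RG flows belongs to reads (generic form only, not the paper's specific RG equation):

        \partial_t u = \partial_x^2 u + u(1 - u), \qquad v^* = 2,

    where v* = 2 is the minimal velocity selected by pulled fronts invading the unstable state u = 0 in these rescaled units, the front-selection mechanism alluded to above.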

  17. Randomization Strategies.

    PubMed

    Kepler, Christopher K

    2017-04-01

    An understanding of randomization is important both for study design and to assist medical professionals in evaluating the medical literature. Simple randomization can be done through a variety of techniques, but carries a risk of unequal distribution of subjects into treatment groups. Block randomization can be used to overcome this limitation by ensuring that small subgroups are distributed evenly between treatment groups. Finally, techniques can be used to evenly distribute subjects between treatment groups while accounting for confounding variables, so as to not skew results when there is a high index of suspicion that a particular variable will influence outcome.
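
    As an illustration of the block technique described above, the sketch below (plain Python, made-up arm labels, block size four) permutes treatments within each block so that group sizes never drift far apart.

        import random

        def block_randomization(n_subjects, block_size=4, arms=("A", "B")):
            assert block_size % len(arms) == 0
            schedule = []
            while len(schedule) < n_subjects:
                block = list(arms) * (block_size // len(arms))
                random.shuffle(block)        # permute within the block
                schedule.extend(block)
            return schedule[:n_subjects]

        random.seed(0)
        print(block_randomization(10))       # balanced after every full block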

  18. Randomization and sampling issues

    USGS Publications Warehouse

    Geissler, P.H.

    1996-01-01

    The need for randomly selected routes and other sampling issues have been debated by the Amphibian electronic discussion group. Many excellent comments have been made, pro and con, but we have not reached consensus yet. This paper brings those comments together and attempts a synthesis. I hope that the resulting discussion will bring us closer to a consensus.

  19. Random thoughts

    NASA Astrophysics Data System (ADS)

    ajansen; kwhitefoot; panteltje1; edprochak; sudhakar, the

    2014-07-01

    In reply to the physicsworld.com news story “How to make a quantum random-number generator from a mobile phone” (16 May, http://ow.ly/xFiYc, see also p5), which describes a way of delivering random numbers by counting the number of photons that impinge on each of the individual pixels in the camera of a Nokia N9 smartphone.

  20. Three lessons from a randomized trial of massage and meditation at end of life: patient benefit, outcome measure selection, and design of trials with terminally ill patients.

    PubMed

    Downey, Lois; Engelberg, Ruth A; Standish, Leanna J; Kozak, Leila; Lafferty, William E

    2009-01-01

    Improving end-of-life care is a priority in the United States, but assigning priorities for standard care services requires evaluations using appropriate study design and appropriate outcome indicators. A recent randomized controlled trial with terminally ill patients produced no evidence of benefit from massage or guided meditation when evaluated with measures of global quality of life or pain distress over the course of patient participation. However, reanalysis using a more targeted outcome, surrogates' assessment of patients' benefit from the study intervention, suggested significant gains from massage, the treatment to which patients gave their highest preassignment preference ratings. The authors conclude that adding a menu of complementary therapies as part of standard end-of-life care may yield significant benefit, that patient preference is an important predictor of outcome, and that modifications in trial design may be appropriate for end-of-life studies.

  1. Randomized Comparison of Selective Internal Radiotherapy (SIRT) Versus Drug-Eluting Bead Transarterial Chemoembolization (DEB-TACE) for the Treatment of Hepatocellular Carcinoma

    SciTech Connect

    Pitton, Michael B.; Kloeckner, Roman; Ruckes, Christian; Wirth, Gesine M.; Eichhorn, Waltraud; Wörns, Marcus A.; Weinmann, Arndt; Schreckenberger, Mathias; Galle, Peter R.; Otto, Gerd; Dueber, Christoph

    2015-04-15

    Purpose: To prospectively compare SIRT and DEB-TACE for treating hepatocellular carcinoma (HCC). Methods: From 04/2010–07/2012, 24 patients with histologically proven unresectable N0, M0 HCCs were randomized 1:1 to receive SIRT or DEB-TACE. SIRT could be repeated once in case of recurrence, whereas TACE was repeated every 6 weeks until no viable tumor tissue was detected by MRI or contraindications prohibited further treatment. Patients were followed up by MRI every 3 months; the final evaluation was 05/2013. Results: Both groups were comparable in demographics (SIRT: 8 males/4 females, mean age 72 ± 7 years; TACE: 10 males/2 females, mean age 71 ± 9 years), initial tumor load (1 patient ≥25 % in each group), and BCLC (Barcelona Clinic Liver Cancer) stage (SIRT: 12×B; TACE: 1×A, 11×B). Median progression-free survival (PFS) was 180 days for SIRT versus 216 days for TACE patients (p = 0.6193), with a median TTP of 371 days versus 336 days, respectively (p = 0.5764). Median OS was 592 days for SIRT versus 788 days for TACE patients (p = 0.9271). Seven patients died in each group. Causes of death were liver failure (n = 4, SIRT group), tumor progression (n = 4, TACE group), cardiovascular events, and inconclusive (n = 1 in each group). Conclusions: No significant differences were found in median PFS, OS, and TTP. The lower rate of tumor progression in the SIRT group was nullified by a greater incidence of liver failure. This pilot study is the first prospective randomized trial comparing SIRT and TACE for treating HCC, and its results can be used for sample size calculations of future studies.

  2. Selective CO2 Sequestration with Monolithic Bimodal Micro/Macroporous Carbon Aerogels Derived from Stepwise Pyrolytic Decomposition of Polyamide-Polyimide-Polyurea Random Copolymers.

    PubMed

    Saeed, Adnan M; Rewatkar, Parwani M; Majedi Far, Hojat; Taghvaee, Tahereh; Donthula, Suraj; Mandal, Chandana; Sotiriou-Leventis, Chariklia; Leventis, Nicholas

    2017-04-05

    Polymeric aerogels (PA-xx) were synthesized via room-temperature reaction of an aromatic triisocyanate (tris(4-isocyanatophenyl) methane) with pyromellitic acid. Using solid-state CPMAS (13)C and (15)N NMR, it was found that the skeletal framework of PA-xx was a statistical copolymer of polyamide, polyurea, polyimide, and of the primary condensation product of the two reactants, a carbamic-anhydride adduct. Stepwise pyrolytic decomposition of those components yielded carbon aerogels with both open and closed microporosity. The open micropore surface area increased from <15 m(2) g(-1) in PA-xx to 340 m(2) g(-1) in the carbons. Next, reactive etching at 1,000 °C with CO2 opened access to the closed pores and the micropore area increased by almost 4× to 1150 m(2) g(-1) (out of 1750 m(2) g(-1) of a total BET surface area). At 0 °C, etched carbon aerogels demonstrated a good balance of adsorption capacity for CO2 (up to 4.9 mmol g(-1)) and selectivity for CO2 over other gases (estimated via Henry's law). The selectivity for CO2 versus H2 (up to 928:1) is suitable for precombustion fuel purification. Relevant to postcombustion CO2 capture and sequestration (CCS), the selectivity for CO2 versus N2 was in the 17:1 to 31:1 range. In addition to typical factors involved in gas sorption (kinetic diameters, quadrupole moments and polarizabilities of the adsorbates), it is also suggested that CO2 is preferentially engaged by surface pyridinic and pyridonic N on carbon (identified with XPS) in an energy-neutral surface reaction. Relatively high uptake of CH4 (2.16 mmol g(-1) at 0 °C/1 bar) was attributed to its low polarizability, and that finding paves the way for further studies on adsorption of higher (i.e., more polarizable) hydrocarbons. Overall, high CO2 selectivities, in combination with attractive CO2 adsorption capacities, low monomer cost, and the innate physicochemical stability of carbon render the materials of this study reasonable candidates for further practical

  3. Random Vibrations

    NASA Technical Reports Server (NTRS)

    Messaro, Semma; Harrison, Phillip

    2010-01-01

    Ares I zonal random vibration environments due to acoustic impingement and combustion processes are developed for liftoff, ascent and reentry. Random vibration test criteria for Ares I Upper Stage pyrotechnic components are developed by enveloping the applicable zonal environments where each component is located. Random vibration tests will be conducted to assure that these components will survive and function appropriately after exposure to the expected vibration environments. Methodology: Random vibration test criteria were needed for the Ares I Upper Stage pyrotechnic components that would envelop all the applicable environments where each component is located. Applicable Ares I vehicle drawings and design information needed to be assessed to determine the location(s) of each component on the Ares I Upper Stage. Design and test criteria needed to be developed by plotting and enveloping the applicable environments using Microsoft Excel spreadsheet software and documenting them in a report using Microsoft Word processing software. Conclusion: Random vibration liftoff, ascent, and green run design and test criteria for the Upper Stage pyrotechnic components were developed by using Microsoft Excel to envelop the zonal environments applicable to each component. Results were transferred from Excel into a report using Microsoft Word. After the report is reviewed and edited by my mentor, it will be submitted for publication as an attachment to a memorandum. Pyrotechnic component designers will extract criteria from my report for incorporation into the design and test specifications for components. Eventually the hardware will be tested to the environments I developed to assure that the components will survive and function appropriately after exposure to the expected vibration environments.

  4. Enhanced mRNA-protein fusion efficiency of a single-domain antibody by selection of mRNA display with additional random sequences in the terminal translated regions

    PubMed Central

    Takahashi, Kazuki; Sunohara, Masato; Terai, Takuya; Kumachi, Shigefumi; Nemoto, Naoto

    2017-01-01

    In vitro display technologies such as mRNA and cDNA display are powerful tools to create and select functional peptides. However, in some cases, efficiency of mRNA-protein fusion is very low, which results in decreased library size and lower chance of successful selection. In this study, to improve mRNA-protein fusion efficiency, we prepared an mRNA display library of a protein with random N- and C-terminal coding regions consisting of 12 nucleotides (i.e. four amino acids), and performed an electrophoresis mobility shift assay (EMSA)-based selection of successfully formed mRNA display molecules. A single-domain antibody (Nanobody, or VHH) was used as a model protein, and as a result, a pair of sequences was identified that increased mRNA-protein fusion efficiency of this protein by approximately 20%. Interestingly, enhancement of the fusion efficiency induced by the identified sequences was protein-specific, and different results were obtained for other proteins including VHHs with different CDRs. The results suggested that conformation of mRNA as a whole, rather than the amino acid sequence of the translated peptide, is an important factor to determine mRNA-protein fusion efficiency. PMID:28275529

  5. Enhanced mRNA-protein fusion efficiency of a single-domain antibody by selection of mRNA display with additional random sequences in the terminal translated regions.

    PubMed

    Takahashi, Kazuki; Sunohara, Masato; Terai, Takuya; Kumachi, Shigefumi; Nemoto, Naoto

    2017-01-01

    In vitro display technologies such as mRNA and cDNA display are powerful tools to create and select functional peptides. However, in some cases, efficiency of mRNA-protein fusion is very low, which results in decreased library size and lower chance of successful selection. In this study, to improve mRNA-protein fusion efficiency, we prepared an mRNA display library of a protein with random N- and C-terminal coding regions consisting of 12 nucleotides (i.e. four amino acids), and performed an electrophoresis mobility shift assay (EMSA)-based selection of successfully formed mRNA display molecules. A single-domain antibody (Nanobody, or VHH) was used as a model protein, and as a result, a pair of sequences was identified that increased mRNA-protein fusion efficiency of this protein by approximately 20%. Interestingly, enhancement of the fusion efficiency induced by the identified sequences was protein-specific, and different results were obtained for other proteins including VHHs with different CDRs. The results suggested that conformation of mRNA as a whole, rather than the amino acid sequence of the translated peptide, is an important factor to determine mRNA-protein fusion efficiency.

  6. Fractional randomness

    NASA Astrophysics Data System (ADS)

    Tapiero, Charles S.; Vallois, Pierre

    2016-11-01

    The premise of this paper is that a fractional probability distribution is based on fractional operators and on the fractional (Hurst) index used, which alters the classical setting of random variables. For example, a random variable defined by its density function might not have a fractional density function defined in the conventional sense. Practically, it implies that a distribution's granularity defined by a fractional kernel may have properties that differ due to the fractional index used and the fractional calculus applied to define it. The purpose of this paper is to consider an application of fractional calculus to define the fractional density function of a random variable. In addition, we provide and prove a number of results, defining the functional forms of these distributions as well as their existence. In particular, we define fractional probability distributions for increasing and decreasing functions that are right continuous. Examples are used to motivate the usefulness of a statistical approach to fractional calculus and its application to economic and financial problems. In conclusion, this paper is a preliminary attempt to construct statistical fractional models; given the breadth and extent of such problems, it should be considered an initial step in that direction.

  7. Developing bifunctional beta-lactamase molecules with built-in target-recognizing module for prodrug therapy: identification of Enterobacter cloacae P99 cephalosporinase loops suitable for randomization and phage-display selection.

    PubMed

    Shukla, Girja S; Krag, David N

    2009-01-01

    This study was focused on developing catalytically active beta-lactamase enzyme molecules that have target-recognizing sites built within their scaffold. Using phage-display approach, nine libraries were constructed by inserting the randomized linear or cysteine-constrained heptapeptides in the five different loops on the outer surface of P99 beta-lactamase molecule. The pIII signal peptide of Sec-pathway was employed for a periplasmic translocation of the beta-lactamase fusion protein, which we found more efficient than the DsbA signal peptide of SRP-pathway. The randomized heptapeptide loops replaced native amino acids between positions (34)Y-(37)K, (238)M-(246)A, (275)N-(280)A, (305)A-(311)S, or (329)I-(334)I of the P99 beta-lactamase molecules for generating the loop-1 to -5 libraries, respectively. The diversity of each loop library was judged by counting the primary and beta-lactamase-active clones. The linear peptide inserts in the loop-2 library showed the maximum number of the beta-lactamase-active clones, followed by the loop-5, loop-3, and loop-4. The insertion of the cysteine-constrained loops exhibited a dramatic loss of the enzyme-active beta-lactamase clones. The complexity of the loop-2 linear library, as determined by the frequency and diversity of amino acid distributions in the randomized region, appears consistent with the standards of other types of phage display library systems. The selection of the loop-2 linear library on streptavidin protein as a test target identified several beta-lactamase clones that specifically bound to streptavidin. In conclusion, this study identified the suitability of the loop-2 of P99 beta-lactamase for constructing a phage-display library of the beta-lactamase enzyme-active molecules that can be selected against a target. This is an enabling step in our long-term goal of developing bifunctional beta-lactamase molecules against cancer-specific targets for enzyme prodrug therapy of cancer.

  8. Random grammars

    NASA Astrophysics Data System (ADS)

    Malyshev, V. A.

    1998-04-01

    Contents: § 1. Definitions: 1.1. Grammars; 1.2. Random grammars and L-systems; 1.3. Semigroup representations. § 2. Infinite string dynamics: 2.1. Cluster expansion; 2.2. Cluster dynamics; 2.3. Local observer. § 3. Large time behaviour: small perturbations: 3.1. Invariant measures; 3.2. Classification. § 4. Large time behaviour: context-free case: 4.1. Invariant measures for grammars; 4.2. L-systems; 4.3. Fractal correlation functions; 4.4. Measures on languages. Bibliography.

  9. Is random access memory random?

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    Most software is constructed on the assumption that the programs and data are stored in random access memory (RAM). Physical limitations on the relative speeds of processor and memory elements lead to a variety of memory organizations that match processor addressing rate with memory service rate. These include interleaved and cached memory. A very high fraction of a processor's address requests can be satisfied from the cache without reference to the main memory. The cache requests information from main memory in blocks that can be transferred at the full memory speed. Programmers who organize algorithms for locality can realize the highest performance from these computers.
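
    The locality point above is easy to demonstrate. The following illustrative Python sketch sums the same C-ordered matrix twice: row by row (contiguous memory, cache-friendly) and column by column (strided access); on typical cached hardware the first sweep is noticeably faster. Timings will vary by machine.

        import time
        import numpy as np

        a = np.random.rand(4000, 4000)      # C-ordered: each row is contiguous

        t0 = time.perf_counter()
        row_sum = sum(a[i, :].sum() for i in range(a.shape[0]))   # cache-friendly
        t1 = time.perf_counter()
        col_sum = sum(a[:, j].sum() for j in range(a.shape[1]))   # strided access
        t2 = time.perf_counter()

        print(f"row-major sweep:    {t1 - t0:.3f} s")
        print(f"column-major sweep: {t2 - t1:.3f} s")
        print("sums agree:", np.isclose(row_sum, col_sum))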

  10. Repetitive transcranial magnetic stimulation (rTMS) augmentation of selective serotonin reuptake inhibitors (SSRIs) for SSRI-resistant obsessive-compulsive disorder (OCD): a meta-analysis of randomized controlled trials

    PubMed Central

    Ma, Zhong-Rui; Shi, Li-Jun

    2014-01-01

    Background and objective: Randomized controlled trials (RCTs) on repetitive transcranial magnetic stimulation (rTMS) as augmentation of selective serotonin reuptake inhibitors (SSRIs) for SSRI-resistant obsessive-compulsive disorder (OCD) have yielded conflicting results. Therefore, this meta-analysis was conducted to assess the efficacy of this strategy for SSRI-resistant OCD. Methods: Scientific and medical databases, including international databases (PubMed, MEDLINE, EMBASE, CCTR, Web of Science, PsycINFO), two Chinese databases (CBM-disc, CNKI), and relevant websites dated up to July 2014, were searched for RCTs on this strategy for treating OCD. A Mantel-Haenszel random-effects model was used. Yale-Brown Obsessive Compulsive Scale (Y-BOCS) scores, response rates, and drop-out rates were evaluated. Results: Data were obtained from nine RCTs consisting of 290 subjects. Active rTMS was an effective augmentation strategy in treating SSRI-resistant OCD, with a pooled WMD of 3.89 (95% CI = [1.27, 6.50]) for reducing Y-BOCS score and a pooled odds ratio (OR) of 2.65 (95% CI = [1.36, 5.17]) for response rates. No significant differences in drop-out rates were found. No publication bias was detected. Conclusion: The pooled examination demonstrated that this strategy seems to be efficacious and acceptable for treating SSRI-resistant OCD. As the number of RCTs included here was limited, further large-scale multi-center RCTs are required to validate our conclusions. PMID:25663986
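
    For readers unfamiliar with pooling, the following hedged Python sketch shows one standard random-effects computation, the DerSimonian-Laird estimator, on invented per-study effects and variances; these are not the meta-analysis data, and this is not necessarily the exact Mantel-Haenszel variant the authors used.

        import numpy as np

        effects = np.array([4.2, 2.9, 5.1, 1.8])      # per-study effects (invented)
        variances = np.array([1.5, 0.9, 2.2, 1.1])    # per-study variances (invented)

        w = 1.0 / variances                            # fixed-effect weights
        mean_fe = np.sum(w * effects) / w.sum()
        q = np.sum(w * (effects - mean_fe) ** 2)       # Cochran's Q
        c = w.sum() - np.sum(w ** 2) / w.sum()
        tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance

        w_re = 1.0 / (variances + tau2)                # random-effects weights
        pooled = np.sum(w_re * effects) / w_re.sum()
        se = np.sqrt(1.0 / w_re.sum())
        print(f"pooled effect = {pooled:.2f}, "
              f"95% CI = [{pooled - 1.96 * se:.2f}, {pooled + 1.96 * se:.2f}]")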

  11. Evaluation of the effect of aromatherapy with Rosa damascena Mill. on postoperative pain intensity in hospitalized children in selected hospitals affiliated to Isfahan University of Medical Sciences in 2013: A randomized clinical trial

    PubMed Central

    Marofi, Maryam; Sirousfard, Motahareh; Moeini, Mahin; Ghanadi, Alireza

    2015-01-01

    Background: Pain is a common complication after surgery. The aim of this study was to evaluate the effect of aromatherapy with Rosa damascena Mill. on postoperative pain in children. Materials and Methods: In a double-blind, placebo-controlled clinical trial, we selected 64 children of 3–6 years of age through convenience sampling and divided them randomly into two groups. Patients in group A were given inhalation aromatherapy with R. damascena Mill., and in group B, the patients were given almond oil as a placebo. Inhalation aromatherapy was used at the time of the subjects’ arrival to the ward and then at 3, 6, 9, and 12 h afterward. Common palliative treatments to relieve pain were used in both groups. Thirty minutes after aromatherapy, postoperative pain in the children was evaluated with the Toddler Preschooler Postoperative Pain Scale (TPPPS). Data were statistically analyzed using the Chi-square test, one-way analysis of variance (ANOVA), and repeated measures ANOVA. Results: There was no significant difference in pain scores between the two groups at the time of the subjects’ arrival to the ward (before receiving any aromatherapy or palliative care). After each application of aromatherapy and at the end of treatment, the pain score was significantly reduced in the aromatherapy group with R. damascena Mill. compared to the placebo group. Conclusions: According to our results, aromatherapy with R. damascena Mill. can be used for postoperative pain in children, together with other common treatments, without any significant side effects. PMID:25878704

  12. Efficacy and tolerability balance of oxycodone/naloxone and tapentadol in chronic low back pain with a neuropathic component: a blinded end point analysis of randomly selected routine data from 12-week prospective open-label observations

    PubMed Central

    Ueberall, Michael A; Mueller-Schwefe, Gerhard H H

    2016-01-01

    Objective To evaluate the benefit–risk profile (BRP) of oxycodone/naloxone (OXN) and tapentadol (TAP) in patients with chronic low back pain (cLBP) with a neuropathic component (NC) in routine clinical practice. Methods This was a blinded end point analysis of randomly selected 12-week routine/open-label data of the German Pain Registry on adult patients with cLBP-NC who initiated an index treatment in compliance with the current German prescribing information between 1st January and 31st October 2015 (OXN/TAP, n=128/133). Primary end point was defined as a composite of three efficacy components (≥30% improvement of pain, pain-related disability, and quality of life each at the end of observation vs baseline) and three tolerability components (normal bowel function, absence of either central nervous system side effects, and treatment-emergent adverse event [TEAE]-related treatment discontinuation during the observation period) adopted to reflect BRP assessments under real-life conditions. Results Demographic as well as baseline and pretreatment characteristics were comparable for the randomly selected data sets of both index groups without any indicators for critical selection biases. Treatment with OXN resulted formally in a BRP noninferior to that of TAP and showed a significantly higher primary end point response vs TAP (39.8% vs 25.6%, odds ratio: 1.93; P=0.014), due to superior analgesic effects. Between-group differences increased with stricter response definitions for all three efficacy components in favor of OXN: ≥30%/≥50%/≥70% response rates for OXN vs TAP were seen for pain intensity in 85.2%/67.2%/39.1% vs 83.5%/54.1%/15.8% (P= ns/0.031/<0.001), for pain-related disability in 78.1%/64.8%/43.8% vs 66.9%/50.4%/24.8% (P=0.043/0.018/0.001), and for quality of life in 76.6%/68.0%/50.0% vs 63.9%/54.1%/34.6% (P=0.026/0.022/0.017). Overall, OXN vs TAP treatments were well tolerated, and proportions of patients who either maintained a normal bowel

  13. Effect of chemotherapy on the impact of FDG-PET/CT in selection of patients for surgical resection of colorectal liver metastases: single center analysis of PET-CAM randomized trial.

    PubMed

    Metser, Ur; Halankar, Jaydeep; Langer, Deanna; Mohan, Ravi; Hussey, Douglas; Hadas, Moshonov; Tamir, Shlomit

    2017-02-01

    The largest randomized controlled trial (RCT) on the effect of FDG-PET on surgical management for metastatic colorectal adenocarcinoma to liver ("PET-CAM") reported only a modest change in surgical management (8%).

  14. Fragmentation of random trees

    NASA Astrophysics Data System (ADS)

    Kalay, Z.; Ben-Naim, E.

    2015-01-01

    We study fragmentation of a random recursive tree into a forest by repeated removal of nodes. The initial tree consists of N nodes and it is generated by sequential addition of nodes with each new node attaching to a randomly-selected existing node. As nodes are removed from the tree, one at a time, the tree dissolves into an ensemble of separate trees, namely, a forest. We study statistical properties of trees and nodes in this heterogeneous forest, and find that the fraction of remaining nodes m characterizes the system in the limit N → ∞. We obtain analytically the size density φ_s of trees of size s. The size density has a power-law tail φ_s ∼ s^(−α) with exponent α = 1 + 1/m. Therefore, the tail becomes steeper as further nodes are removed, and the fragmentation process is unusual in that the exponent α increases continuously with time. We also extend our analysis to the case where nodes are added as well as removed, and obtain the asymptotic size density for growing trees.
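
    A minimal simulation sketch of the process described above (not the authors' code): grow a random recursive tree, keep each node independently with probability m as an approximation to removing a fixed fraction of nodes, and tally surviving tree sizes with union-find. The tail of the size histogram should decay roughly as s^-(1 + 1/m).

        import random
        from collections import Counter

        def surviving_tree_sizes(n=200_000, m=0.7, seed=1):
            rng = random.Random(seed)
            parent = [0] + [rng.randrange(i) for i in range(1, n)]  # recursive tree
            alive = [rng.random() < m for _ in range(n)]            # keep ~fraction m

            uf = list(range(n))                                     # union-find
            def find(x):
                while uf[x] != x:
                    uf[x] = uf[uf[x]]                               # path halving
                    x = uf[x]
                return x

            for child in range(1, n):
                if alive[child] and alive[parent[child]]:           # edge survives
                    uf[find(child)] = find(parent[child])

            return Counter(find(i) for i in range(n) if alive[i])   # root -> size

        hist = Counter(surviving_tree_sizes().values())             # size -> count
        for s in sorted(hist)[:10]:
            print(s, hist[s])    # counts should decay roughly as s^-(1 + 1/m)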

  15. Randomization in clinical trials in orthodontics: its significance in research design and methods to achieve it.

    PubMed

    Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore

    2011-12-01

    Randomization is a key step in reducing selection bias during the treatment allocation phase in randomized clinical trials. The process of randomization follows specific steps: generation of the randomization list, allocation concealment, and implementation of randomization. In the dental and orthodontic literature, treatment allocation is frequently characterized as random; however, the randomization procedures followed are often not appropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. The most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons that make randomization an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.

  16. Random broadcast on random geometric graphs

    SciTech Connect

    Bradonjic, Milan; Elsasser, Robert; Friedrich, Tobias

    2009-01-01

    In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs, when with high probability: (i) the RGG is connected, (ii) there exists a giant component in the RGG. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph, or of the giant component, for the regimes (i) or (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
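
    The push protocol defined above is straightforward to simulate. The sketch below runs it on an Erdos-Renyi random graph as a convenient stand-in for a connected RGG; the graph model and parameters are illustrative assumptions, not those of the paper.

        import random

        def push_broadcast_rounds(adj, start=0, seed=0):
            """Rounds until every node is informed by the push protocol."""
            rng = random.Random(seed)
            informed = {start}
            rounds = 0
            while len(informed) < len(adj) and rounds < 10_000:  # cap for safety
                rounds += 1
                for node in list(informed):
                    if adj[node]:                  # inform one random neighbor
                        informed.add(rng.choice(adj[node]))
            return rounds

        n, p, rng = 500, 0.02, random.Random(42)   # dense enough to be connected
        adj = [[] for _ in range(n)]
        for u in range(n):
            for v in range(u + 1, n):
                if rng.random() < p:
                    adj[u].append(v)
                    adj[v].append(u)

        print("rounds to inform all nodes:", push_broadcast_rounds(adj))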

  17. How random is a random vector?

    SciTech Connect

    Eliazar, Iddo

    2015-12-15

    Over 80 years ago Samuel Wilks proposed that the “generalized variance” of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the “Wilks standard deviation” (the square root of the generalized variance) is indeed the standard deviation of a random vector. We further establish that the “uncorrelation index” (a derivative of the Wilks standard deviation) is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: “randomness measures” and “independence indices” of random vectors. In turn, these general notions give rise to “randomness diagrams”: tangible planar visualizations that answer the question: How random is a random vector? The notion of “independence indices” yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vector empirical data.
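
    A small numerical illustration of the quantities defined above, using NumPy on synthetic correlated Gaussian data: the generalized variance is the determinant of the sample covariance matrix, and the Wilks standard deviation is its square root.

        import numpy as np

        rng = np.random.default_rng(0)
        cov_true = np.array([[1.0, 0.6, 0.2],      # correlated 3-D Gaussian
                             [0.6, 1.0, 0.4],
                             [0.2, 0.4, 1.0]])
        x = rng.multivariate_normal(np.zeros(3), cov_true, size=100_000)

        cov = np.cov(x, rowvar=False)              # sample covariance matrix
        gen_var = np.linalg.det(cov)               # Wilks' generalized variance
        wilks_sd = np.sqrt(gen_var)                # "Wilks standard deviation"
        print(f"generalized variance = {gen_var:.3f}, Wilks sd = {wilks_sd:.3f}")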

  18. Nonvolatile random access memory

    NASA Technical Reports Server (NTRS)

    Wu, Jiin-Chuan (Inventor); Stadler, Henry L. (Inventor); Katti, Romney R. (Inventor)

    1994-01-01

    A nonvolatile magnetic random access memory can be achieved by an array of magnet-Hall effect (M-H) elements. The storage function is realized with a rectangular thin-film ferromagnetic material having an in-plane, uniaxial anisotropy and in-plane bipolar remanent magnetization states. The thin-film magnetic element is magnetized by a local applied field, whose direction is used to form either a 0 or 1 state. The element remains in the 0 or 1 state until a switching field is applied to change its state. The stored information is detected by a Hall-effect sensor which senses the fringing field from the magnetic storage element. The circuit design for addressing each cell includes transistor switches for providing a current of selected polarity to store a binary digit through a separate conductor overlying the magnetic element of the cell. To read out a stored binary digit, transistor switches are employed to provide a current through a row of Hall-effect sensors connected in series and enabling a differential voltage amplifier connected to all Hall-effect sensors of a column in series. To avoid read-out voltage errors due to shunt currents through resistive loads of the Hall-effect sensors of other cells in the same column, at least one transistor switch is provided between every pair of adjacent cells in every row which are not turned on except in the row of the selected cell.

  19. Directed random walk with random restarts: The Sisyphus random walk

    NASA Astrophysics Data System (ADS)

    Montero, Miquel; Villarroel, Javier

    2016-09-01

    In this paper we consider a particular version of the random walk with restarts: random reset events which suddenly bring the system to the starting value. We analyze its relevant statistical properties, like the transition probability, and show how an equilibrium state appears. Formulas for the first-passage time, high-water marks, and other extreme statistics are also derived; we consider counting problems naturally associated with the system. Finally we indicate feasible generalizations useful for interpreting different physical effects.

  20. Directed random walk with random restarts: The Sisyphus random walk.

    PubMed

    Montero, Miquel; Villarroel, Javier

    2016-09-01

    In this paper we consider a particular version of the random walk with restarts: random reset events which suddenly bring the system to the starting value. We analyze its relevant statistical properties, like the transition probability, and show how an equilibrium state appears. Formulas for the first-passage time, high-water marks, and other extreme statistics are also derived; we consider counting problems naturally associated with the system. Finally we indicate feasible generalizations useful for interpreting different physical effects.

  1. Associative Hierarchical Random Fields.

    PubMed

    Ladický, L'ubor; Russell, Chris; Kohli, Pushmeet; Torr, Philip H S

    2014-06-01

    This paper makes two contributions: the first is the proposal of a new model, the associative hierarchical random field (AHRF), together with a novel algorithm for its optimization; the second is the application of this model to the problem of semantic segmentation. Most methods for semantic segmentation are formulated as a labeling problem for variables that might correspond to either pixels or segments such as super-pixels. It is well known that the generation of super-pixel segmentations is not unique. This has motivated many researchers to use multiple super-pixel segmentations for problems such as semantic segmentation or single-view reconstruction. These super-pixels have not yet been combined in a principled manner; this is a difficult problem, as they may overlap, or be nested in such a way that the segmentations form a segmentation tree. Our new hierarchical random field model allows information from all of the multiple segmentations to contribute to a global energy. MAP inference in this model can be performed efficiently using powerful graph-cut-based move-making algorithms. Our framework generalizes much of the previous work based on pixels or segments, and the resulting labelings can be viewed both as a detailed segmentation at the pixel level or, at the other extreme, as a segment selector that pieces together a solution like a jigsaw, selecting the best segments from different segmentations as pieces. We evaluate its performance on some of the most challenging data sets for object class segmentation, and show that this ability to perform inference using multiple overlapping segmentations leads to state-of-the-art results.

  2. Spectroscopy with Random and Displaced Random Ensembles

    NASA Astrophysics Data System (ADS)

    Velázquez, V.; Zuker, A. P.

    2002-02-01

    Because of the time-reversal invariance of the angular momentum operator J², the average energies and variances at fixed J for random two-body Hamiltonians exhibit odd-even-J staggering that may be especially strong for J = 0. It is shown that upon ensemble averaging over random runs, this behavior is reflected in the yrast states. Displaced (attractive) random ensembles lead to rotational spectra with strongly enhanced B(E2) transitions for a certain class of model spaces. It is explained how to generalize these results to other forms of collectivity.

  3. On Gaussian random supergravity

    NASA Astrophysics Data System (ADS)

    Bachlechner, Thomas C.

    2014-04-01

    We study the distribution of metastable vacua and the likelihood of slow roll inflation in high dimensional random landscapes. We consider two examples of landscapes: a Gaussian random potential and an effective supergravity potential defined via a Gaussian random superpotential and a trivial Kähler potential. To examine these landscapes we introduce a random matrix model that describes the correlations between various derivatives and we propose an efficient algorithm that allows for a numerical study of high dimensional random fields. Using these novel tools, we find that the vast majority of metastable critical points in N dimensional random supergravities are either approximately supersymmetric with |F| ≪ M_susy or supersymmetric. Such approximately supersymmetric points are dynamical attractors in the landscape, and the probability that a randomly chosen critical point is metastable scales as log(P) ∝ −N. We argue that random supergravities lead to potentially interesting inflationary dynamics.

  4. Quantum random number generation

    SciTech Connect

    Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; Zhang, Zhen; Qi, Bing

    2016-06-28

    Quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness -- coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness on the device and the random number generation speed.

  5. Quantum random number generation

    DOE PAGES

    Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; ...

    2016-06-28

    Quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness -- coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness on the device and the random number generation speed.

  6. Quantum random number generation

    NASA Astrophysics Data System (ADS)

    Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; Qi, Bing; Zhang, Zhen

    2016-06-01

    Quantum physics can be exploited to generate true random numbers, which have important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness—coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. On the basis of the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modelling the devices. The second category is self-testing QRNG, in which verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category that provides a tradeoff between the trustworthiness on the device and the random number generation speed.

  7. Randomization methods in emergency setting trials: a descriptive review

    PubMed Central

    Moe‐Byrne, Thirimon; Oddie, Sam; McGuire, William

    2015-01-01

    Background Quasi‐randomization might expedite recruitment into trials in emergency care settings but may also introduce selection bias. Methods We searched the Cochrane Library and other databases for systematic reviews of interventions in emergency medicine or urgent care settings. We assessed selection bias (baseline imbalances) in prognostic indicators between treatment groups in trials using true randomization versus trials using quasi‐randomization. Results Seven reviews contained 16 trials that used true randomization and 11 that used quasi‐randomization. Baseline group imbalance was identified in four trials using true randomization (25%) and in two quasi‐randomized trials (18%). Of the four truly randomized trials with imbalance, three concealed treatment allocation adequately. Clinical heterogeneity and poor reporting limited the assessment of trial recruitment outcomes. Conclusions We did not find strong or consistent evidence that quasi‐randomization is associated with selection bias more often than true randomization. High risk of bias judgements for quasi‐randomized emergency studies should therefore not be assumed in systematic reviews. Clinical heterogeneity across trials within reviews, coupled with limited availability of relevant trial accrual data, meant it was not possible to adequately explore the possibility that true randomization might result in slower trial recruitment rates, or the recruitment of less representative populations. PMID:26333419

  8. Quantum random number generators

    NASA Astrophysics Data System (ADS)

    Herrero-Collantes, Miguel; Garcia-Escartin, Juan Carlos

    2017-01-01

    Random numbers are a fundamental resource in science and engineering with important applications in simulation and cryptography. The inherent randomness at the core of quantum mechanics makes quantum systems a perfect source of entropy. Quantum random number generation is one of the most mature quantum technologies with many alternative generation methods. This review discusses the different technologies in quantum random number generation from the early devices based on radioactive decay to the multiple ways to use the quantum states of light to gather entropy from a quantum origin. Randomness extraction and amplification and the notable possibility of generating trusted random numbers even with untrusted hardware using device-independent generation protocols are also discussed.

  9. Invitation to Random Tensors

    NASA Astrophysics Data System (ADS)

    Gurau, Razvan

    2016-09-01

    This article is a preface to the SIGMA special issue "Tensor Models, Formalism and Applications", http://www.emis.de/journals/SIGMA/Tensor_Models.html. The issue is a collection of eight excellent, up-to-date reviews on random tensor models. The reviews combine pedagogical introductions meant for a general audience with presentations of the most recent developments in the field. This preface aims to give a condensed panoramic overview of random tensors as the natural generalization of random matrices to higher dimensions.

  10. EDITORIAL: Nano and random lasers Nano and random lasers

    NASA Astrophysics Data System (ADS)

    Wiersma, Diederik S.; Noginov, Mikhail A.

    2010-02-01

    The field of extreme miniature sources of stimulated emission represented by random lasers and nanolasers has gone through an enormous development in recent years. Random lasers are disordered optical structures in which light waves are both multiply scattered and amplified. Multiple scattering is a process that we all know very well from daily experience. Many familiar materials are actually disordered dielectrics and owe their optical appearance to multiple light scattering. Examples are white marble, white painted walls, paper, white flowers, etc. Light waves inside such materials perform random walks, that is, they are scattered several times in random directions before they leave the material, and this gives it an opaque white appearance. This multiple scattering process does not destroy the coherence of the light. It just creates a very complex interference pattern (also known as speckle). Random lasers can be made of basically any disordered dielectric material by adding an optical gain mechanism to the structure. In practice this can be achieved with, for instance, laser dye that is dissolved in the material and optically excited by a pump laser. Alternative routes to incorporate gain are achieved using rare-earth or transition metal doped solid-state laser materials or direct band gap semiconductors. The latter can potentially be pumped electrically. After excitation, the material is capable of scattering light and amplifying it, and these two ingredients form the basis for a random laser. Random laser emission can be highly coherent, even in the absence of an optical cavity. The reason is that random structures can sustain optical modes that are spectrally narrow. This provides a spectral selection mechanism that, together with gain saturation, leads to coherent emission. A random laser can have a large number of (randomly distributed) modes that are usually strongly coupled. This means that many modes compete for the gain that is available in a random

  11. Randomized SUSAN edge detector

    NASA Astrophysics Data System (ADS)

    Qu, Zhi-Guo; Wang, Ping; Gao, Ying-Hui; Wang, Peng

    2011-11-01

    A speed up technique for the SUSAN edge detector based on random sampling is proposed. Instead of sliding the mask pixel by pixel on an image as the SUSAN edge detector does, the proposed scheme places the mask randomly on pixels to find edges in the image; we hereby name it randomized SUSAN edge detector (R-SUSAN). Specifically, the R-SUSAN edge detector adopts three approaches in the framework of random sampling to accelerate a SUSAN edge detector: procedure integration of response computation and nonmaxima suppression, reduction of unnecessary processing for obvious nonedge pixels, and early termination. Experimental results demonstrate the effectiveness of the proposed method.

  12. Random Packing and Random Covering Sequences.

    DTIC Science & Technology

    1988-03-24

    obtained by appealing to a result due to Marsaglia [3] and de Finetti [8]. Their result states that if (X₁, X₂, …, Xₙ) is a random point on the simplex {X ∈ … [fragmentary reference entries: … sequential coverage problems, J. Appl. Prob. 11, 281-293; [8] de Finetti, B. (1964). Alcune osservazioni in tema di "suddivisione casuale." Giornale …]

  13. A machine learning methodology for the selection and classification of spontaneous spinal cord dorsum potentials allows disclosure of structured (non-random) changes in neuronal connectivity induced by nociceptive stimulation.

    PubMed

    Martin, Mario; Contreras-Hernández, Enrique; Béjar, Javier; Esposito, Gennaro; Chávez, Diógenes; Glusman, Silvio; Cortés, Ulises; Rudomin, Pablo

    2015-01-01

    Previous studies aiming to disclose the functional organization of the neuronal networks involved in the generation of the spontaneous cord dorsum potentials (CDPs) in the lumbosacral spinal segments used predetermined templates to select specific classes of spontaneous CDPs. Since this procedure was time consuming and required continuous supervision, it was limited to the analysis of two specific types of CDPs (negative CDPs and negative positive CDPs), thus excluding potentials that may reflect activation of other neuronal networks of presumed functional relevance. We now present a novel procedure based on machine learning that allows the efficient and unbiased selection of a variety of spontaneous CDPs with different shapes and amplitudes. The reliability and performance of the present method is evaluated by analyzing the effects on the probabilities of generation of different classes of spontaneous CDPs induced by the intradermic injection of small amounts of capsaicin in the anesthetized cat, a procedure known to induce a state of central sensitization leading to allodynia and hyperalgesia. The results obtained with the selection method presently described allowed detection of spontaneous CDPs with specific shapes and amplitudes that are assumed to represent the activation of functionally coupled sets of dorsal horn neurones that acquire different, structured configurations in response to nociceptive stimuli. These changes are considered as responses tending to adapt the transmission of sensory information to specific functional requirements as part of homeostatic adjustments.

  14. A machine learning methodology for the selection and classification of spontaneous spinal cord dorsum potentials allows disclosure of structured (non-random) changes in neuronal connectivity induced by nociceptive stimulation

    PubMed Central

    Martin, Mario; Contreras-Hernández, Enrique; Béjar, Javier; Esposito, Gennaro; Chávez, Diógenes; Glusman, Silvio; Cortés, Ulises; Rudomin, Pablo

    2015-01-01

    Previous studies aiming to disclose the functional organization of the neuronal networks involved in the generation of the spontaneous cord dorsum potentials (CDPs) in the lumbosacral spinal segments used predetermined templates to select specific classes of spontaneous CDPs. Since this procedure was time consuming and required continuous supervision, it was limited to the analysis of two specific types of CDPs (negative CDPs and negative positive CDPs), thus excluding potentials that may reflect activation of other neuronal networks of presumed functional relevance. We now present a novel procedure based on machine learning that allows the efficient and unbiased selection of a variety of spontaneous CDPs with different shapes and amplitudes. The reliability and performance of the present method is evaluated by analyzing the effects on the probabilities of generation of different classes of spontaneous CDPs induced by the intradermic injection of small amounts of capsaicin in the anesthetized cat, a procedure known to induce a state of central sensitization leading to allodynia and hyperalgesia. The results obtained with the selection method presently described allowed detection of spontaneous CDPs with specific shapes and amplitudes that are assumed to represent the activation of functionally coupled sets of dorsal horn neurones that acquire different, structured configurations in response to nociceptive stimuli. These changes are considered as responses tending to adapt the transmission of sensory information to specific functional requirements as part of homeostatic adjustments. PMID:26379540

  15. Quantum random number generator

    DOEpatents

    Pooser, Raphael C.

    2016-05-10

    A quantum random number generator (QRNG) and a photon generator for a QRNG are provided. The photon generator may be operated in a spontaneous mode below a lasing threshold to emit photons. Photons emitted from the photon generator may have at least one random characteristic, which may be monitored by the QRNG to generate a random number. In one embodiment, the photon generator may include a photon emitter and an amplifier coupled to the photon emitter. The amplifier may enable the photon generator to be used in the QRNG without introducing significant bias in the random number and may enable multiplexing of multiple random numbers. The amplifier may also desensitize the photon generator to fluctuations in power supplied thereto while operating in the spontaneous mode. In one embodiment, the photon emitter and amplifier may be a tapered diode amplifier.

  16. Randomness: Quantum versus classical

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2016-05-01

    The recent tremendous development of quantum information theory has led to a number of quantum technological projects, e.g. quantum random generators. This development has stimulated a new wave of interest in quantum foundations. One of the most intriguing problems of quantum foundations is the elaboration of a consistent and commonly accepted interpretation of a quantum state. A closely related problem is the clarification of the notion of quantum randomness and its interrelation with classical randomness. In this short review, we discuss the basics of the classical theory of randomness (which is itself very complex and characterized by a diversity of approaches) and compare it with irreducible quantum randomness. We also briefly discuss “digital philosophy”, its role in physics (classical and quantum), and its coupling to the information interpretation of quantum mechanics (QM).

  17. Autonomous Byte Stream Randomizer

    NASA Technical Reports Server (NTRS)

    Paloulian, George K.; Woo, Simon S.; Chow, Edward T.

    2013-01-01

    Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, the data transmission has to be efficient without any redundant or exuberant metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security, requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity unreadable in its own right, but when combined with all N pieces, can be reconstructed back into one. Reconstruction requires possession of the key used for randomizing the bytes, leading to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. This software is a cornerstone capability possessing the ability to generate the same cryptographically secure sequence on different machines and time intervals, thus allowing this software to be used more heavily in net-centric environments where data transfer bandwidth is limited.
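
    A minimal sketch of the core idea, assuming a seeded deterministic generator stands in for the cryptographically secure one the software presumably uses: the seed drives an in-place Fisher-Yates shuffle of the byte stream, and regenerating the same swap sequence from the same seed undoes it.

        import random

        def shuffle_bytes(data: bytes, seed: int) -> bytearray:
            buf = bytearray(data)
            rng = random.Random(seed)            # stand-in for a secure generator
            for i in range(len(buf) - 1, 0, -1): # in-place Fisher-Yates shuffle
                j = rng.randint(0, i)
                buf[i], buf[j] = buf[j], buf[i]
            return buf

        def unshuffle_bytes(data: bytes, seed: int) -> bytearray:
            buf = bytearray(data)
            rng = random.Random(seed)            # same seed -> same swap sequence
            swaps = [(i, rng.randint(0, i)) for i in range(len(buf) - 1, 0, -1)]
            for i, j in reversed(swaps):         # undo the swaps in reverse order
                buf[i], buf[j] = buf[j], buf[i]
            return buf

        msg = b"attack at dawn"
        scrambled = shuffle_bytes(msg, seed=2024)
        assert unshuffle_bytes(bytes(scrambled), seed=2024) == bytearray(msg)
        print(scrambled)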

  18. Random survival forests for competing risks

    PubMed Central

    Ishwaran, Hemant; Gerds, Thomas A.; Kogalur, Udaya B.; Moore, Richard D.; Gange, Stephen J.; Lau, Bryan M.

    2014-01-01

    We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection in high-dimensional problems and in settings such as HIV/AIDS that involve many competing risks. PMID:24728979

  19. Random one-of-N selector

    DOEpatents

    Kronberg, J.W.

    1993-04-20

    An apparatus for selecting at random one item of N items on the average comprising counter and reset elements for counting repeatedly between zero and N, a number selected by the user, a circuit for activating and deactivating the counter, a comparator to determine if the counter stopped at a count of zero, an output to indicate an item has been selected when the count is zero or not selected if the count is not zero. Randomness is provided by having the counter cycle very often while varying the relatively longer duration between activation and deactivation of the count. The passive circuit components of the activating/deactivating circuit and those of the counter are selected for the sensitivity of their response to variations in temperature and other physical characteristics of the environment so that the response time of the circuitry varies. Additionally, the items themselves, which may be people, may vary in shape or the time they press a pushbutton, so that, for example, an ultrasonic beam broken by the item or person passing through it will add to the duration of the count and thus to the randomness of the selection.

  20. Random one-of-N selector

    DOEpatents

    Kronberg, James W.

    1993-01-01

    An apparatus for selecting at random one item of N items on the average comprising counter and reset elements for counting repeatedly between zero and N, a number selected by the user, a circuit for activating and deactivating the counter, a comparator to determine if the counter stopped at a count of zero, an output to indicate an item has been selected when the count is zero or not selected if the count is not zero. Randomness is provided by having the counter cycle very often while varying the relatively longer duration between activation and deactivation of the count. The passive circuit components of the activating/deactivating circuit and those of the counter are selected for the sensitivity of their response to variations in temperature and other physical characteristics of the environment so that the response time of the circuitry varies. Additionally, the items themselves, which may be people, may vary in shape or the time they press a pushbutton, so that, for example, an ultrasonic beam broken by the item or person passing through it will add to the duration of the count and thus to the randomness of the selection.
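
    A software analogue may clarify the principle of the apparatus described above (this is a sketch, not the patented circuit): a counter cycles rapidly modulo N during an activation window of jittered duration, and the item is selected only when the counter stops at zero, which occurs with probability of about 1/N.

        import random

        def one_of_n_selected(n: int, rng: random.Random) -> bool:
            cycles = rng.randint(10_000, 20_000)  # jittered activation window
            return cycles % n == 0                 # counter stopped at zero?

        rng = random.Random(7)
        n, trials = 5, 100_000
        hits = sum(one_of_n_selected(n, rng) for _ in range(trials))
        print(f"selected {hits}/{trials} times (expected about {trials // n})")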

  1. Correlated randomness and switching phenomena

    NASA Astrophysics Data System (ADS)

    Stanley, H. E.; Buldyrev, S. V.; Franzese, G.; Havlin, S.; Mallamace, F.; Kumar, P.; Plerou, V.; Preis, T.

    2010-08-01

    One challenge of biology, medicine, and economics is that the systems treated by these serious scientific disciplines have no perfect metronome in time and no perfect spatial architecture, crystalline or otherwise. Nonetheless, as if by magic, out of nothing but randomness one finds remarkably fine-tuned processes in time and remarkably fine-tuned structures in space. Further, many of these processes and structures have the remarkable feature of “switching” from one behavior to another as if by magic. The past century has, philosophically, been concerned with placing aside the human tendency to see the universe as a fine-tuned machine. Here we will address the challenge of uncovering how, through randomness (albeit, as we shall see, strongly correlated randomness), one can arrive at some of the many spatial and temporal patterns in biology, medicine, and economics, and even begin to characterize the switching phenomena that enable a system to pass from one state to another. Inspired by principles developed by A. Nihat Berker and scores of other statistical physicists in recent years, we discuss some applications of correlated randomness to understand switching phenomena in various fields. Specifically, we present evidence from experiments and from computer simulations supporting the hypothesis that water’s anomalies are related to a switching point (which is not unlike the “tipping point” immortalized by Malcolm Gladwell), and that the bubbles in economic phenomena that occur on all scales are not “outliers” (another Gladwell immortalization). Though more speculative, we support the idea of disease as arising from some kind of yet-to-be-understood complex switching phenomenon by discussing data on selected examples, including heart disease and Alzheimer disease.

  2. Optofluidic random laser

    NASA Astrophysics Data System (ADS)

    Shivakiran Bhaktha, B. N.; Bachelard, Nicolas; Noblin, Xavier; Sebbah, Patrick

    2012-10-01

    Random lasing is reported in a dye-circulated structured polymeric microfluidic channel. The role of disorder, which results from limited accuracy of photolithographic process, is demonstrated by the variation of the emission spectrum with local-pump position and by the extreme sensitivity to a local perturbation of the structure. Thresholds comparable to those of conventional microfluidic lasers are achieved, without the hurdle of state-of-the-art cavity fabrication. Potential applications of optofluidic random lasers for on-chip sensors are discussed. Introduction of random lasers in the field of optofluidics is a promising alternative to on-chip laser integration with light and fluidic functionalities.

  3. Assessment of non-BDNF neurotrophins and GDNF levels after depression treatment with sertraline and transcranial direct current stimulation in a factorial, randomized, sham-controlled trial (SELECT-TDCS): an exploratory analysis.

    PubMed

    Brunoni, André R; Machado-Vieira, Rodrigo; Zarate, Carlos A; Vieira, Erica L M; Valiengo, Leandro; Benseñor, Isabela M; Lotufo, Paulo A; Gattaz, Wagner F; Teixeira, Antonio L

    2015-01-02

    The neurotrophic hypothesis of depression states that the major depressive episode is associated with lower neurotrophic factors levels, which increase with amelioration of depressive symptoms. However, this hypothesis has not been extended to investigate neurotrophic factors other than the brain-derived neurotrophic factor (BDNF). We therefore explored whether plasma levels of neurotrophins 3 (NT-3) and 4 (NT-4), nerve growth factor (NGF) and glial cell line derived neurotrophic factor (GDNF) changed after antidepressant treatment and correlated with treatment response. Seventy-three patients with moderate-to-severe, antidepressant-free unipolar depression were assigned to a pharmacological (sertraline) and a non-pharmacological (transcranial direct current stimulation, tDCS) intervention in a randomized, 2 × 2, placebo-controlled design. The plasma levels of NT-3, NT-4, NGF and GDNF were determined by enzyme-linked immunosorbent assay before and after a 6-week treatment course and analyzed according to clinical response and allocation group. We found that tDCS and sertraline (separately and combined) produced significant improvement in depressive symptoms. Plasma levels of all neurotrophic factors were similar across groups at baseline and remained significantly unchanged regardless of the intervention and of clinical response. Also, baseline plasma levels were not associated with clinical response. To conclude, in this 6-week placebo-controlled trial, NT-3, NT-4, NGF and GDNF plasma levels did not significantly change with sertraline or tDCS. These data suggest that these neurotrophic factors are not surrogate biomarkers of treatment response or involved in the antidepressant mechanisms of tDCS.

  4. Bayesian Enrichment Strategies for Randomized Discontinuation Trials

    PubMed Central

    Trippa, Lorenzo; Rosner, Gary L.; Müller, Peter

    2013-01-01

    Summary We propose optimal choice of the design parameters for random discontinuation designs (RDD) using a Bayesian decision-theoretic approach. We consider applications of RDDs to oncology phase II studies evaluating activity of cytostatic agents. The design consists of two stages. The preliminary open-label stage treats all patients with the new agent and identifies a possibly sensitive subpopulation. The subsequent second stage randomizes, treats, follows, and compares outcomes among patients in the identified subgroup, with randomization to either the new or a control treatment. Several tuning parameters characterize the design: the number of patients in the trial, the duration of the preliminary stage, and the duration of follow-up after randomization. We define a probability model for tumor growth, specify a suitable utility function, and develop a computational procedure for selecting the optimal tuning parameters. PMID:21714780

  5. Random array grid collimator

    DOEpatents

    Fenimore, E.E.

    1980-08-22

    A hexagonally shaped quasi-random no-two-holes-touching grid collimator. The quasi-random array grid collimator eliminates contamination from small-angle off-axis rays by using a no-two-holes-touching pattern which simultaneously provides for a self-supporting array, increasing throughput by eliminating the substrate. The present invention also provides maximum throughput using hexagonally shaped holes in a hexagonal lattice pattern for diffraction-limited applications. Mosaicking is also disclosed for reducing fabrication effort.

  6. 47 CFR 1.1604 - Post-selection hearings.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings....

  7. 47 CFR 1.1604 - Post-selection hearings.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a) Following the random selection, the Commission...

  8. The pursuit of balance: An overview of covariate-adaptive randomization techniques in clinical trials.

    PubMed

    Lin, Yunzhi; Zhu, Ming; Su, Zheng

    2015-11-01

    Randomization is fundamental to the design and conduct of clinical trials. Simple randomization ensures independence among subject treatment assignments and prevents potential selection biases, yet it does not guarantee balance in covariate distributions across treatment groups. Ensuring balance in important prognostic covariates across treatment groups is desirable for many reasons. A broad class of randomization methods for achieving balance are reviewed in this paper; these include block randomization, stratified randomization, minimization, and dynamic hierarchical randomization. Practical considerations arising from experience with using the techniques are described. A review of randomization methods used in practice in recent randomized clinical trials is also provided.
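
    As a concrete illustration of one reviewed technique, the following Python sketch implements permuted-block randomization with two arms and a block size of 4, which keeps group sizes balanced throughout accrual; the block size and arm labels are illustrative choices.

        import random

        def block_randomize(n_subjects: int, block_size: int = 4, seed: int = 0):
            rng = random.Random(seed)
            assignments = []
            while len(assignments) < n_subjects:
                block = ["A"] * (block_size // 2) + ["B"] * (block_size // 2)
                rng.shuffle(block)               # random order within each block
                assignments.extend(block)
            return assignments[:n_subjects]

        seq = block_randomize(10)
        print(seq, "| A:", seq.count("A"), "B:", seq.count("B"))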

  9. Rationale and design of the randomized, double-blind trial testing INtraveNous and Oral administration of elinogrel, a selective and reversible P2Y(12)-receptor inhibitor, versus clopidogrel to eVAluate Tolerability and Efficacy in nonurgent Percutaneous Coronary Interventions patients (INNOVATE-PCI).

    PubMed

    Leonardi, Sergio; Rao, Sunil V; Harrington, Robert A; Bhatt, Deepak L; Gibson, C Michael; Roe, Matthew T; Kochman, Janusz; Huber, Kurt; Zeymer, Uwe; Madan, Mina; Gretler, Daniel D; McClure, Matthew W; Paynter, Gayle E; Thompson, Vivian; Welsh, Robert C

    2010-07-01

    Despite current dual-antiplatelet therapy with aspirin and clopidogrel, adverse clinical events continue to occur during and after percutaneous coronary intervention (PCI). The failure of clopidogrel to provide optimal protection may be related to delayed onset of action, interpatient variability in its effect, and an insufficient level of platelet inhibition. Furthermore, the irreversible binding of clopidogrel to the P2Y(12) receptor for the life span of the platelet is associated with increased bleeding risk especially during urgent or emergency surgery. Novel antiplatelet agents are required to improve management of patients undergoing PCI. Elinogrel is a potent, direct-acting (ie, non-prodrug), selective, competitive, and reversible P2Y(12) inhibitor available in both intravenous and oral formulations. The INNOVATE-PCI study is a phase 2 randomized, double-blind, clopidogrel-controlled trial to evaluate the safety, tolerability, and preliminary efficacy of this novel antiplatelet agent in patients undergoing nonurgent PCI.

  10. Fluctuating Selection in the Moran

    PubMed Central

    Dean, Antony M.; Lehman, Clarence; Yi, Xiao

    2017-01-01

    Contrary to classical population genetics theory, experiments demonstrate that fluctuating selection can protect a haploid polymorphism in the absence of frequency dependent effects on fitness. Using forward simulations with the Moran model, we confirm our analytical results showing that a fluctuating selection regime, with a mean selection coefficient of zero, promotes polymorphism. We find that increases in heterozygosity over neutral expectations are especially pronounced when fluctuations are rapid, mutation is weak, the population size is large, and the variance in selection is big. Lowering the frequency of fluctuations makes selection more directional, and so heterozygosity declines. We also show that fluctuating selection raises dN/dS ratios for polymorphism, not only by sweeping selected alleles into the population, but also by purging the neutral variants of selected alleles as they undergo repeated bottlenecks. Our analysis shows that randomly fluctuating selection increases the rate of evolution by increasing the probability of fixation. The impact is especially noticeable when the selection is strong and mutation is weak. Simulations show the increase in the rate of evolution declines as the rate of new mutations entering the population increases, an effect attributable to clonal interference. Intriguingly, fluctuating selection increases the dN/dS ratios for divergence more than for polymorphism, a pattern commonly seen in comparative genomics. Our model, which extends the classical neutral model of molecular evolution by incorporating random fluctuations in selection, accommodates a wide variety of observations, both neutral and selected, with economy. PMID:28108586
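
    A rough Moran-model simulation in the spirit of the abstract (an illustrative sketch, not the authors' code): two haploid alleles, a selection coefficient that flips sign at random with mean zero, and one fitness-weighted birth plus one uniform death per step.

        import random

        def moran_fluctuating(pop_size=1000, s_mag=0.1, flip_rate=0.05,
                              steps=200_000, seed=3):
            rng = random.Random(seed)
            count_a = pop_size // 2              # start at 50:50
            s = s_mag                            # current selection coefficient
            for _ in range(steps):
                if rng.random() < flip_rate:     # environment switches
                    s = -s
                w_a = (1.0 + s) * count_a        # fitness-weighted birth of A
                p_birth_a = w_a / (w_a + (pop_size - count_a))
                birth_a = rng.random() < p_birth_a
                death_a = rng.random() < count_a / pop_size   # uniform death
                count_a += int(birth_a) - int(death_a)
                count_a = max(0, min(pop_size, count_a))
                if count_a in (0, pop_size):
                    break                        # fixation or loss
            return count_a

        print("final count of allele A:", moran_fluctuating())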

  11. Strategies for Improving Precision in Group-Randomized Experiments

    ERIC Educational Resources Information Center

    Raudenbush, Stephen W.; Martinez, Andres; Spybrook, Jessaca

    2007-01-01

    Interest has rapidly increased in studies that randomly assign classrooms or schools to interventions. When well implemented, such studies eliminate selection bias, providing strong evidence about the impact of the interventions. However, unless expected impacts are large, the number of units to be randomized needs to be quite large to achieve…

  12. Randomized Prediction Games for Adversarial Machine Learning.

    PubMed

    Rota Bulo, Samuel; Biggio, Battista; Pillai, Ignazio; Pelillo, Marcello; Roli, Fabio

    2016-08-04

    In spam and malware detection, attackers exploit randomization to obfuscate malicious data and increase their chances of evading detection at test time, e.g., malware code is typically obfuscated using random strings or byte sequences to hide known exploits. Interestingly, randomization has also been proposed to improve security of learning algorithms against evasion attacks, as it results in hiding information about the classifier to the attacker. Recent work has proposed game-theoretical formulations to learn secure classifiers, by simulating different evasion attacks and modifying the classification function accordingly. However, both the classification function and the simulated data manipulations have been modeled in a deterministic manner, without accounting for any form of randomization. In this paper, we overcome this limitation by proposing a randomized prediction game, namely, a noncooperative game-theoretic formulation in which the classifier and the attacker make randomized strategy selections according to some probability distribution defined over the respective strategy set. We show that our approach allows one to improve the tradeoff between attack detection and false alarms with respect to the state-of-the-art secure classifiers, even against attacks that are different from those hypothesized during design, on application examples including handwritten digit recognition, spam, and malware detection.

  13. Tunable random fiber laser

    SciTech Connect

    Babin, S. A.; Podivilov, E. V.; El-Taher, A. E.; Harper, P.; Turitsyn, S. K.

    2011-08-15

    An optical fiber is treated as a natural one-dimensional random system where lasing is possible due to a combination of Rayleigh scattering by refractive index inhomogeneities and distributed amplification through the Raman effect. We present such a random fiber laser that is tunable over a broad wavelength range with uniquely flat output power and high efficiency, which outperforms traditional lasers of the same category. Outstanding characteristics defined by deep underlying physics and the simplicity of the scheme make the demonstrated laser a very attractive light source both for fundamental science and practical applications.

  14. Randomness Of Amoeba Movements

    NASA Astrophysics Data System (ADS)

    Hashiguchi, S.; Khadijah, Siti; Kuwajima, T.; Ohki, M.; Tacano, M.; Sikula, J.

    2005-11-01

    Movements of amoebas were automatically traced using the difference between two successive frames of a microscopic movie. The movements were observed to be almost random, in that the directions and magnitudes of two successive steps are uncorrelated, and the distance from the origin was proportional to the square root of the step number.
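    The scaling claim is easy to check numerically: the sketch below simulates uncorrelated 2D walks and shows that the mean displacement grows roughly as the square root of the step count. The step model and sample sizes are illustrative choices, not the paper's data.

```python
import random, math

def mean_distance(n_steps, n_walks=2000, rng=random.Random(10)):
    """Mean end-to-end distance of 2D unit-step walks with
    uncorrelated, uniformly random directions."""
    total = 0.0
    for _ in range(n_walks):
        x = y = 0.0
        for _ in range(n_steps):
            a = rng.uniform(0, 2 * math.pi)   # direction uncorrelated with the last
            x += math.cos(a)
            y += math.sin(a)
        total += math.hypot(x, y)
    return total / n_walks

for n in (25, 100, 400):                      # 4x the steps -> ~2x the distance
    print(n, round(mean_distance(n), 2))
```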

  15. Random lattice superstrings

    SciTech Connect

    Feng Haidong; Siegel, Warren

    2006-08-15

    We propose some new simplifying ingredients for Feynman diagrams that seem necessary for random lattice formulations of superstrings. In particular, half the fermionic variables appear only in particle loops (similarly to loop momenta), reducing the supersymmetry of the constituents of the type IIB superstring to N=1, as expected from their interpretation in the 1/N expansion as super Yang-Mills.

  16. Generating "Random" Integers

    ERIC Educational Resources Information Center

    Griffiths, Martin

    2011-01-01

    One of the author's undergraduate students recently asked him whether it was possible to generate a random positive integer. After some thought, the author realised that there were plenty of interesting mathematical ideas inherent in her question. So much so in fact, that the author decided to organise a workshop, open both to undergraduates and…

  17. Powerful narrow linewidth random fiber laser

    NASA Astrophysics Data System (ADS)

    Ye, Jun; Xu, Jiangming; Zhang, Hanwei; Zhou, Pu

    2016-11-01

    In this paper, we demonstrate a narrow linewidth random fiber laser, which employs a tunable pump laser to select the operating wavelength for efficiency optimization, a narrow-band fiber Bragg grating (FBG) and a section of single mode fiber to construct a half-open cavity, and a circulator to separate pump light input and random lasing output. Spectral linewidth down to 42.31 GHz is achieved through filtering by the FBG. When 8.97 W pump light centered at the optimized wavelength 1036.5 nm is launched into the half-open cavity, 1081.4 nm random lasing with the maximum output power of 2.15 W is achieved, which is more powerful than previously reported results.

  18. Efficient robust conditional random fields.

    PubMed

    Song, Dongjin; Liu, Wei; Zhou, Tianyi; Tao, Dacheng; Meyer, David A

    2015-10-01

    Conditional random fields (CRFs) are a flexible yet powerful probabilistic approach and have shown advantages for popular applications in various areas, including text analysis, bioinformatics, and computer vision. Traditional CRF models, however, are incapable of selecting relevant features or suppressing noise from noisy original features. Moreover, conventional optimization methods often converge slowly when training CRFs, and degrade significantly for tasks with a large number of samples and features. In this paper, we propose robust CRFs (RCRFs) to simultaneously select relevant features. An optimal gradient method (OGM) is further designed to train RCRFs efficiently. Specifically, the proposed RCRFs employ the l1 norm of the model parameters to regularize the objective used by traditional CRFs, thereby enabling discovery of the relevant unary and pairwise features of CRFs. In each iteration of OGM, the gradient direction is determined jointly by the current gradient together with the historical gradients, and the Lipschitz constant is leveraged to specify a proper step size. We show that the OGM can tackle RCRF model training very efficiently, achieving the optimal convergence rate O(1/k²), where k is the number of iterations. This convergence rate is theoretically superior to the O(1/k) rate of previous first-order optimization methods. Extensive experiments performed on three practical image segmentation tasks demonstrate the efficacy of OGM in training our proposed RCRFs.
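    The OGM described is a Nesterov-type accelerated first-order method. As a generic, hedged stand-in (not the authors' CRF objective), the sketch below applies the same acceleration-plus-l1 idea to a least-squares problem via FISTA, which attains the O(1/k²) rate quoted above.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_lasso(X, y, lam=0.1, iters=200):
    """Accelerated (Nesterov-type) proximal gradient for
    min_w 0.5*||Xw - y||^2 + lam*||w||_1. An illustrative
    least-squares stand-in for an l1-regularized training objective.
    """
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    w = z = np.zeros(X.shape[1])
    t = 1.0
    for _ in range(iters):
        grad = X.T @ (X @ z - y)
        w_next = soft_threshold(z - grad / L, lam / L)
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = w_next + ((t - 1) / t_next) * (w_next - w)  # momentum from history
        w, t = w_next, t_next
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true + 0.01 * rng.normal(size=50)
print(np.round(fista_lasso(X, y), 2))      # recovers the three relevant features
```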

  19. Random matrix theory

    NASA Astrophysics Data System (ADS)

    Edelman, Alan; Rao, N. Raj

    Random matrix theory is now a big subject with applications in many disciplines of science, engineering and finance. This article is a survey specifically oriented towards the needs and interests of a numerical analyst. This survey includes some original material not found anywhere else. We include the important mathematics which is a very modern development, as well as the computational software that is transforming the theory into useful practice.

  1. Diffusion in random networks

    DOE PAGES

    Zhang, Duan Z.; Padrino, Juan C.

    2017-06-01

    The ensemble averaging technique is applied to model mass transport by diffusion in random networks. The system consists of an ensemble of random networks, where each network is made of pockets connected by tortuous channels. Inside a channel, fluid transport is assumed to be governed by the one-dimensional diffusion equation. Mass balance leads to an integro-differential equation for the pocket mass density. The so-called dual-porosity model is found to be equivalent to the leading-order approximation of the integration kernel when the diffusion time scale inside the channels is small compared to the macroscopic time scale. As a test problem, we consider one-dimensional mass diffusion in a semi-infinite domain. Because of the time required to establish the linear concentration profile inside a channel, at early times the similarity variable is xt^{-1/4} rather than xt^{-1/2} as in the traditional theory. We find that this early-time similarity can be explained by random-walk theory through the network.

  2. Random forests for genomic data analysis.

    PubMed

    Chen, Xi; Ishwaran, Hemant

    2012-06-01

    Random forests (RF) is a popular tree-based ensemble machine learning tool that is highly data adaptive, applies to "large p, small n" problems, and is able to account for correlation as well as interactions among features. This makes RF particularly appealing for high-dimensional genomic data analysis. In this article, we systematically review the applications and recent progress of RF for genomic data, including prediction and classification, variable selection, pathway analysis, genetic association and epistasis detection, and unsupervised learning.
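    A hedged sketch of the variable-selection use case, assuming scikit-learn is available; the synthetic "large p, small n" data stands in for a real genomic matrix, and impurity-based importances are used as a simple screening criterion.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic "large p, small n" data: 500 features, 100 samples,
# only 5 of which are informative (a toy stand-in for a genomic matrix).
X, y = make_classification(n_samples=100, n_features=500,
                           n_informative=5, n_redundant=0, random_state=0)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(X, y)

# Rank features by impurity-based importance for variable selection.
top = np.argsort(rf.feature_importances_)[::-1][:10]
print("top features:", top)
```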

  3. Programmable disorder in random DNA tilings

    NASA Astrophysics Data System (ADS)

    Tikhomirov, Grigory; Petersen, Philip; Qian, Lulu

    2016-11-01

    Scaling up the complexity and diversity of synthetic molecular structures will require strategies that exploit the inherent stochasticity of molecular systems in a controlled fashion. Here we demonstrate a framework for programming random DNA tilings and show how to control the properties of global patterns through simple, local rules. We constructed three general forms of planar network—random loops, mazes and trees—on the surface of self-assembled DNA origami arrays on the micrometre scale with nanometre resolution. Using simple molecular building blocks and robust experimental conditions, we demonstrate control of a wide range of properties of the random networks, including the branching rules, the growth directions, the proximity between adjacent networks and the size distribution. Much as combinatorial approaches for generating random one-dimensional chains of polymers have been used to revolutionize chemical synthesis and the selection of functional nucleic acids, our strategy extends these principles to random two-dimensional networks of molecules and creates new opportunities for fabricating more complex molecular devices that are organized by DNA nanostructures.

  4. Three dimensional imaging with randomly distributed sensors.

    PubMed

    DaneshPanah, Mehdi; Javidi, Bahram; Watson, Edward A

    2008-04-28

    As a promising three-dimensional passive imaging modality, Integral Imaging (II) has been investigated widely within the research community. In virtually all such investigations, there is an implicit assumption that the collection of elemental images lies on a simple geometric surface (e.g., flat or concave), also known as the pickup surface. In this paper, we present a generalized framework for 3D II with arbitrary pickup surface geometry and a randomly distributed sensor configuration. In particular, we study the case of Synthetic Aperture Integral Imaging (SAII) with random camera locations in space, where all cameras have parallel optical axes but different distances from the 3D scene. We assume that the sensors are randomly distributed in the 3D volume of the pickup space. For 3D reconstruction, a finite number of sensors with known coordinates are randomly selected from within this volume. The mathematical framework for 3D scene reconstruction is developed based on an affine transform representation of imaging in the geometrical optics regime. We demonstrate the feasibility of the proposed methods with experimental results. To the best of our knowledge, this is the first report on 3D imaging using randomly distributed sensors.

  5. Selective mutism

    MedlinePlus

    ... in selective mutism. Treatment: Treating selective mutism involves behavior changes. The child's family and school should be involved. Certain medicines that treat anxiety and social phobia have been used safely and successfully. Support ...

  6. Random bits, true and unbiased, from atmospheric turbulence

    PubMed Central

    Marangon, Davide G.; Vallone, Giuseppe; Villoresi, Paolo

    2014-01-01

    Random numbers are a fundamental ingredient for secure communications and numerical simulation, as well as for games and, more generally, for information science. Physical processes with intrinsic unpredictability may be exploited to generate genuine random numbers. Optical propagation through strong atmospheric turbulence is exploited here for this purpose, by observing a laser beam after a 143 km free-space path. In addition, we developed an algorithm to extract the randomness of the beam images at the receiver without post-processing. The numbers passed very selective randomness tests for qualification as genuine random numbers. The extracting algorithm can be easily generalized to random images generated by different physical processes. PMID:24976499

  7. Randomization Methods in Emergency Setting Trials: A Descriptive Review

    ERIC Educational Resources Information Center

    Corbett, Mark Stephen; Moe-Byrne, Thirimon; Oddie, Sam; McGuire, William

    2016-01-01

    Background: Quasi-randomization might expedite recruitment into trials in emergency care settings but may also introduce selection bias. Methods: We searched the Cochrane Library and other databases for systematic reviews of interventions in emergency medicine or urgent care settings. We assessed selection bias (baseline imbalances) in prognostic…

  8. Certified randomness in quantum physics.

    PubMed

    Acín, Antonio; Masanes, Lluis

    2016-12-07

    The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.

  9. Certified randomness in quantum physics

    NASA Astrophysics Data System (ADS)

    Acín, Antonio; Masanes, Lluis

    2016-12-01

    The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.

  10. Randomly Hyperbranched Polymers

    NASA Astrophysics Data System (ADS)

    Konkolewicz, Dominik; Gilbert, Robert G.; Gray-Weale, Angus

    2007-06-01

    We describe a model for the structures of randomly hyperbranched polymers in solution, and find a logarithmic growth of radius with polymer mass. We include segmental overcrowding, which puts an upper limit on the density. The model is tested against simulations, against data on amylopectin, a major component of starch, on glycogen, and on polyglycerols. For samples of synthetic polyglycerol and glycogen, our model holds well for all the available data. The model reveals higher-level scaling structure in glycogen, related to the β particles seen in electron microscopy.

  11. Randomly hyperbranched polymers.

    PubMed

    Konkolewicz, Dominik; Gilbert, Robert G; Gray-Weale, Angus

    2007-06-08

    We describe a model for the structures of randomly hyperbranched polymers in solution, and find a logarithmic growth of radius with polymer mass. We include segmental overcrowding, which puts an upper limit on the density. The model is tested against simulations, against data on amylopectin, a major component of starch, on glycogen, and on polyglycerols. For samples of synthetic polyglycerol and glycogen, our model holds well for all the available data. The model reveals higher-level scaling structure in glycogen, related to the beta particles seen in electron microscopy.

  12. Generation of kth-order random toposequences

    NASA Astrophysics Data System (ADS)

    Odgers, Nathan P.; McBratney, Alex. B.; Minasny, Budiman

    2008-05-01

    The model presented in this paper derives toposequences from a digital elevation model (DEM). It is written in ArcInfo Macro Language (AML). The toposequences are called kth-order random toposequences, because they take a random path uphill to the top of a hill and downhill to a stream or valley bottom from a randomly selected seed point, and they are located in a streamshed of order k according to a particular stream-ordering system. We define a kth-order streamshed as the area of land that drains directly to a stream segment of stream order k. The model attempts to optimise the spatial configuration of a set of derived toposequences iteratively by using simulated annealing to maximise the total sum of distances between each toposequence hilltop in the set. The user is able to select the order, k, of the derived toposequences. Toposequences are useful for determining soil sampling locations for use in collecting soil data for digital soil mapping applications. Sampling locations can be allocated according to equal elevation or equal-distance intervals along the length of the toposequence, for example. We demonstrate the use of this model for a study area in the Hunter Valley of New South Wales, Australia. Of the 64 toposequences derived, 32 were first-order random toposequences according to Strahler's stream-ordering system, and 32 were second-order random toposequences. The model that we present in this paper is an efficient method for sampling soil along soil toposequences. The soils along a toposequence are related to each other by the topography they are found in, so soil data collected by this method is useful for establishing soil-landscape rules for the preparation of digital soil maps.
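    A stripped-down sketch of the core random walk on a raster DEM follows; the streamshed-ordering and simulated-annealing components of the AML model are omitted, and the neighbour handling and synthetic DEM are assumptions for illustration.

```python
import numpy as np

def random_toposequence(dem, seed_point, rng):
    """From a seed cell, walk uphill by randomly choosing among strictly
    higher neighbours until a local top is reached, then downhill to a
    local bottom; return the hilltop-to-valley path through the seed.
    """
    def walk(start, ascend):
        path, cur = [start], start
        while True:
            r, c = cur
            nbrs = [(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr or dc)
                    and 0 <= r + dr < dem.shape[0] and 0 <= c + dc < dem.shape[1]]
            cand = ([n for n in nbrs if dem[n] > dem[cur]] if ascend
                    else [n for n in nbrs if dem[n] < dem[cur]])
            if not cand:
                return path                       # local top/bottom reached
            cur = cand[rng.integers(len(cand))]   # random uphill/downhill step
            path.append(cur)

    up = walk(seed_point, ascend=True)
    down = walk(seed_point, ascend=False)
    return list(reversed(up))[:-1] + down          # hilltop -> seed -> valley

rng = np.random.default_rng(3)
dem = rng.random((50, 50))                         # synthetic elevation grid
print(random_toposequence(dem, (25, 25), rng)[:5])
```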

  13. Random numbers from vacuum fluctuations

    NASA Astrophysics Data System (ADS)

    Shi, Yicheng; Chng, Brenda; Kurtsiefer, Christian

    2016-07-01

    We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.
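    As a schematic of LFSR-based whitening only (a toy, not the authors' verified extraction scheme, and not a provable extractor on its own), one might fold raw biased bits into a maximal-length 32-bit LFSR as below; the tap set and state size are illustrative assumptions.

```python
def lfsr_extract(raw_bits, taps=(32, 22, 2, 1), n_out=128):
    """Toy randomness whitening: raw, possibly biased bits are folded
    into the feedback of a 32-bit Fibonacci LFSR and output bits are
    taken from the evolving state.
    """
    state = (1 << 32) - 1                      # non-zero initial state
    out = []
    bits = iter(raw_bits)
    while len(out) < n_out:
        fb = 0
        for t in taps:
            fb ^= (state >> (t - 1)) & 1       # feedback from the tap positions
        fb ^= next(bits, 0)                    # mix in one raw input bit
        state = ((state << 1) | fb) & ((1 << 32) - 1)
        out.append(state & 1)
    return out

import random
raw = [random.getrandbits(1) | random.getrandbits(1) for _ in range(4096)]  # biased toward 1
print(sum(lfsr_extract(raw)), "ones out of 128")
```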

  14. Instant Random Information

    NASA Astrophysics Data System (ADS)

    Abramson, Nils H.

    2010-12-01

    Information is carried by matter or by energy, and thus Einstein stated that "no information can travel faster than light." He was also very critical of the "spooky action at a distance" described in quantum physics. However, many verified experiments have proven that the "spooky actions" not only work at a distance but also travel at a velocity faster than light, probably at infinite velocity. Examples are Young's fringes at low light levels, or entanglement. My explanation is that this information is without energy. In the following I will refer to this spooky information as exformation, where "ex-" refers to existence: the information is not transported in any way, it simply exists. Thus Einstein might have been wrong when he stated that no information can travel faster than light. But he was right in that no detectable information can travel faster than light. Phenomena connected to entanglement appear at first to be exceptions, but in those cases the information cannot be reconstructed until energy is later sent, in the form of correlation using ordinary information, at the velocity of light. In entanglement we see that even if the exformation cannot be detected directly because of its lack of energy, it can still influence what happens at random, because in quantum physics there is by definition no energy difference between two states that occur randomly.

  15. Fragmentation of random trees

    NASA Astrophysics Data System (ADS)

    Kalay, Ziya; Ben-Naim, Eli

    2015-03-01

    We investigate the fragmentation of a random recursive tree by repeated removal of nodes, resulting in a forest of disjoint trees. The initial tree is generated by sequentially attaching new nodes to randomly chosen existing nodes until the tree contains N nodes. As nodes are removed, one at a time, the tree dissolves into an ensemble of separate trees, namely a forest. We study the statistical properties of trees and nodes in this heterogeneous forest. In the limit N → ∞, we find that the system is characterized by a single parameter: the fraction of remaining nodes m. We obtain analytically the size density φ_s of trees of size s, which has a power-law tail φ_s ~ s^{-α}, with exponent α = 1 + 1/m. Therefore, the tail becomes steeper as further nodes are removed, producing an unusual scaling exponent that increases continuously with time. Furthermore, we investigate the fragment size distribution in a growing tree, where nodes are added as well as removed, and find that the distribution for this case is much narrower.
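    The process is straightforward to simulate. The sketch below grows a random recursive tree, deletes a random fraction 1 - m of the nodes, and tallies surviving fragment sizes with a union-find pass; the sizes and seed are illustrative.

```python
import random
from collections import Counter

def fragment_size_counts(N=20000, m=0.5, seed=4):
    """Grow a random recursive tree on N nodes (each new node attaches to
    a uniformly random earlier node), remove a random fraction 1-m of
    the nodes, and count the sizes of the surviving fragments."""
    rng = random.Random(seed)
    parent = [0] * N                       # parent[v] for v >= 1; node 0 is the root
    for v in range(1, N):
        parent[v] = rng.randrange(v)
    alive = [True] * N
    for v in rng.sample(range(N), int((1 - m) * N)):
        alive[v] = False

    uf = list(range(N))                    # union-find over surviving edges
    def find(x):
        while uf[x] != x:
            uf[x] = uf[uf[x]]              # path halving
            x = uf[x]
        return x
    for v in range(1, N):
        if alive[v] and alive[parent[v]]:
            uf[find(v)] = find(parent[v])

    sizes = Counter(find(v) for v in range(N) if alive[v])
    return Counter(sizes.values())         # histogram: fragment size -> count

hist = fragment_size_counts()
print(sorted(hist.items())[:10])           # heavy tail, ~ s^{-(1 + 1/m)}
```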

  16. Random-walk enzymes.

    PubMed

    Mak, Chi H; Pham, Phuong; Afif, Samir A; Goodman, Myron F

    2015-09-01

    Enzymes that rely on random walk to search for substrate targets in a heterogeneously dispersed medium can leave behind complex spatial profiles of their catalyzed conversions. The catalytic signatures of these random-walk enzymes are the result of two coupled stochastic processes: scanning and catalysis. Here we develop analytical models to understand the conversion profiles produced by these enzymes, comparing an intrusive model, in which scanning and catalysis are tightly coupled, against a loosely coupled passive model. Diagrammatic theory and path-integral solutions of these models revealed clearly distinct predictions. Comparison to experimental data from catalyzed deaminations deposited on single-stranded DNA by the enzyme activation-induced deoxycytidine deaminase (AID) demonstrates that catalysis and diffusion are strongly intertwined, where the chemical conversions give rise to new stochastic trajectories that were absent if the substrate DNA was homogeneous. The C→U deamination profiles in both analytical predictions and experiments exhibit a strong contextual dependence, where the conversion rate of each target site is strongly contingent on the identities of other surrounding targets, with the intrusive model showing an excellent fit to the data. These methods can be applied to deduce sequence-dependent catalytic signatures of other DNA modification enzymes, with potential applications to cancer, gene regulation, and epigenetics.

  17. Random-walk enzymes

    PubMed Central

    Mak, Chi H.; Pham, Phuong; Afif, Samir A.; Goodman, Myron F.

    2015-01-01

    Enzymes that rely on random walk to search for substrate targets in a heterogeneously dispersed medium can leave behind complex spatial profiles of their catalyzed conversions. The catalytic signatures of these random-walk enzymes are the result of two coupled stochastic processes: scanning and catalysis. Here we develop analytical models to understand the conversion profiles produced by these enzymes, comparing an intrusive model, in which scanning and catalysis are tightly coupled, against a loosely coupled passive model. Diagrammatic theory and path-integral solutions of these models revealed clearly distinct predictions. Comparison to experimental data from catalyzed deaminations deposited on single-stranded DNA by the enzyme activation-induced deoxycytidine deaminase (AID) demonstrates that catalysis and diffusion are strongly intertwined, where the chemical conversions give rise to new stochastic trajectories that were absent if the substrate DNA was homogeneous. The C → U deamination profiles in both analytical predictions and experiments exhibit a strong contextual dependence, where the conversion rate of each target site is strongly contingent on the identities of other surrounding targets, with the intrusive model showing an excellent fit to the data. These methods can be applied to deduce sequence-dependent catalytic signatures of other DNA modification enzymes, with potential applications to cancer, gene regulation, and epigenetics. PMID:26465508

  18. Random-walk enzymes

    NASA Astrophysics Data System (ADS)

    Mak, Chi H.; Pham, Phuong; Afif, Samir A.; Goodman, Myron F.

    2015-09-01

    Enzymes that rely on random walk to search for substrate targets in a heterogeneously dispersed medium can leave behind complex spatial profiles of their catalyzed conversions. The catalytic signatures of these random-walk enzymes are the result of two coupled stochastic processes: scanning and catalysis. Here we develop analytical models to understand the conversion profiles produced by these enzymes, comparing an intrusive model, in which scanning and catalysis are tightly coupled, against a loosely coupled passive model. Diagrammatic theory and path-integral solutions of these models revealed clearly distinct predictions. Comparison to experimental data from catalyzed deaminations deposited on single-stranded DNA by the enzyme activation-induced deoxycytidine deaminase (AID) demonstrates that catalysis and diffusion are strongly intertwined, where the chemical conversions give rise to new stochastic trajectories that were absent if the substrate DNA was homogeneous. The C →U deamination profiles in both analytical predictions and experiments exhibit a strong contextual dependence, where the conversion rate of each target site is strongly contingent on the identities of other surrounding targets, with the intrusive model showing an excellent fit to the data. These methods can be applied to deduce sequence-dependent catalytic signatures of other DNA modification enzymes, with potential applications to cancer, gene regulation, and epigenetics.

  19. Randomness and Non-Locality

    NASA Astrophysics Data System (ADS)

    Senno, Gabriel; Bendersky, Ariel; Figueira, Santiago

    2016-07-01

    The concepts of randomness and non-locality are intimately intertwined: outcomes of randomly chosen measurements over entangled systems exhibiting non-local correlations are, if we preclude instantaneous influence between distant measurement choices and outcomes, random. In this paper, we survey some recent advances in the knowledge of the interplay between these two important notions from a quantum information science perspective.

  20. Random Numbers and Quantum Computers

    ERIC Educational Resources Information Center

    McCartney, Mark; Glass, David

    2002-01-01

    The topic of random numbers is investigated in such a way as to illustrate links between mathematics, physics and computer science. First, the generation of random numbers by a classical computer using the linear congruential generator and logistic map is considered. It is noted that these procedures yield only pseudo-random numbers since…

  1. Investigating the Randomness of Numbers

    ERIC Educational Resources Information Center

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…

  2. Wireless Network Security Using Randomness

    DTIC Science & Technology

    2012-06-19

    The present invention provides systems and methods for securing communications in a wireless network by utilizing the inherent randomness of propagation errors to enable legitimate users to dynamically... Subject terms: patent, security, wireless networks, randomness. Inventors: Sheng Xiao, Weibo Gong.

  3. Improved Compressive Sensing of Natural Scenes Using Localized Random Sampling

    NASA Astrophysics Data System (ADS)

    Barranca, Victor J.; Kovačič, Gregor; Zhou, Douglas; Cai, David

    2016-08-01

    Compressive sensing (CS) theory demonstrates that by using uniformly-random sampling, rather than uniformly-spaced sampling, higher quality image reconstructions are often achievable. Considering that the structure of sampling protocols has such a profound impact on the quality of image reconstructions, we formulate a new sampling scheme motivated by physiological receptive field structure, localized random sampling, which yields significantly improved CS image reconstructions. For each set of localized image measurements, our sampling method first randomly selects an image pixel and then measures its nearby pixels with probability depending on their distance from the initially selected pixel. We compare the uniformly-random and localized random sampling methods over a large space of sampling parameters, and show that, for the optimal parameter choices, higher quality image reconstructions can be consistently obtained by using localized random sampling. In addition, we argue that the localized random CS optimal parameter choice is stable with respect to diverse natural images, and scales with the number of samples used for reconstruction. We expect that the localized random sampling protocol helps to explain the evolutionarily advantageous nature of receptive field structure in visual systems and suggests several future research areas in CS theory and its application to brain imaging.
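    A sketch of one localized measurement under a Gaussian decay assumption (the paper's exact decay profile and parameter values are not reproduced here); each call produces one row of a hypothetical CS measurement.

```python
import numpy as np

def localized_random_measurement(image, n_local=16, radius=4.0, rng=None):
    """One localized measurement: pick a random centre pixel, sample
    nearby pixels with probability decaying in distance from the centre
    (Gaussian assumption), and return the summed intensity."""
    rng = rng or np.random.default_rng()
    h, w = image.shape
    cy, cx = rng.integers(h), rng.integers(w)
    ys, xs = np.mgrid[0:h, 0:w]
    d2 = (ys - cy) ** 2 + (xs - cx) ** 2
    p = np.exp(-d2 / (2 * radius ** 2)).ravel()
    p /= p.sum()                                    # distance-dependent probabilities
    idx = rng.choice(h * w, size=n_local, replace=False, p=p)
    mask = np.zeros(h * w)
    mask[idx] = 1.0
    return mask @ image.ravel()                     # one row of the measurement matrix

rng = np.random.default_rng(5)
img = rng.random((32, 32))
measurements = [localized_random_measurement(img, rng=rng) for _ in range(100)]
print(len(measurements), "localized measurements")
```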

  4. Weighted Hybrid Decision Tree Model for Random Forest Classifier

    NASA Astrophysics Data System (ADS)

    Kulkarni, Vrushali Y.; Sinha, Pradeep K.; Petare, Manisha C.

    2016-06-01

    Random Forest is an ensemble, supervised machine-learning algorithm. An ensemble generates many classifiers and combines their results by majority voting. Random forest uses the decision tree as its base classifier. In decision tree induction, an attribute split/evaluation measure is used to decide the best split at each node of the decision tree. The generalization error of a forest of tree classifiers depends on the strength of the individual trees in the forest and the correlation among them. The work presented in this paper concerns attribute split measures and proceeds in two steps. First, a theoretical study of the five selected split measures is carried out and a comparison matrix is generated to understand the pros and cons of each measure. These theoretical results are then verified by empirical analysis: a random forest is generated using each of the five selected split measures, chosen one at a time (i.e., a random forest using information gain, a random forest using gain ratio, and so on). Based on this theoretical and empirical analysis, a new approach, a hybrid decision tree model for the random forest classifier, is proposed. In this model, the individual decision trees in the random forest are generated using different split measures. The model is augmented by weighted voting based on the strength of each individual tree. The new approach shows a notable increase in the accuracy of the random forest.
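    A hedged sketch of the hybrid idea using scikit-learn, which exposes only two impurity criteria ("gini", "entropy") rather than the paper's five; validation accuracy stands in for tree strength, and reusing the validation set for the final score is a deliberate simplification.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
trees, weights = [], []
for i in range(50):
    boot = rng.integers(0, len(X_tr), len(X_tr))          # bootstrap sample
    crit = "gini" if i % 2 == 0 else "entropy"            # alternate split measures
    t = DecisionTreeClassifier(criterion=crit, max_features="sqrt",
                               random_state=i).fit(X_tr[boot], y_tr[boot])
    trees.append(t)
    weights.append(t.score(X_val, y_val))                 # strength of the tree

# Accuracy-weighted voting across the hybrid forest.
votes = np.array([w * t.predict_proba(X_val) for t, w in zip(trees, weights)])
pred = votes.sum(axis=0).argmax(axis=1)
print("weighted-vote accuracy:", (pred == y_val).mean())
```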

  5. Improved Compressive Sensing of Natural Scenes Using Localized Random Sampling

    PubMed Central

    Barranca, Victor J.; Kovačič, Gregor; Zhou, Douglas; Cai, David

    2016-01-01

    Compressive sensing (CS) theory demonstrates that by using uniformly-random sampling, rather than uniformly-spaced sampling, higher quality image reconstructions are often achievable. Considering that the structure of sampling protocols has such a profound impact on the quality of image reconstructions, we formulate a new sampling scheme motivated by physiological receptive field structure, localized random sampling, which yields significantly improved CS image reconstructions. For each set of localized image measurements, our sampling method first randomly selects an image pixel and then measures its nearby pixels with probability depending on their distance from the initially selected pixel. We compare the uniformly-random and localized random sampling methods over a large space of sampling parameters, and show that, for the optimal parameter choices, higher quality image reconstructions can be consistently obtained by using localized random sampling. In addition, we argue that the localized random CS optimal parameter choice is stable with respect to diverse natural images, and scales with the number of samples used for reconstruction. We expect that the localized random sampling protocol helps to explain the evolutionarily advantageous nature of receptive field structure in visual systems and suggests several future research areas in CS theory and its application to brain imaging. PMID:27555464

  6. Molecular selection in a unified evolutionary sequence

    NASA Technical Reports Server (NTRS)

    Fox, S. W.

    1986-01-01

    With guidance from experiments and observations that indicate internally limited phenomena, an outline of a unified evolutionary sequence is inferred. Such unification is not visible in a context of random matrix and random mutation. The sequence proceeds from the Big Bang through prebiotic matter and protocells, through the evolving cell via molecular and natural selection, to mind, behavior, and society.

  7. Mapping in random-structures

    SciTech Connect

    Reidys, C.M.

    1996-06-01

    A mapping in random-structures is defined on the vertices of a generalized hypercube Q_α^n. A random-structure will consist of (1) a random contact graph and (2) a family of relations imposed on adjacent vertices. The vertex set of a random contact graph will be the set of all coordinates of a vertex P ∈ Q_α^n. Its edge set will be the union of the edge sets of two random graphs: the first is a random 1-regular graph on 2m vertices (coordinates), and the second is a random graph G_p with p = c_2/n on all n vertices (coordinates). The structure of the random contact graphs is investigated, and it is shown that for certain values of m and c_2 the mapping in random-structures allows search over the set of random-structures. This is applied to mappings in RNA secondary structures. The results on random-structures might also be helpful for designing 3D-folding algorithms for RNA.

  8. Structure of Random Foam

    NASA Astrophysics Data System (ADS)

    Kraynik, Andrew M.; Reinelt, Douglas A.; van Swol, Frank

    2004-11-01

    The Surface Evolver was used to compute the equilibrium microstructure of dry soap foams with random structure and a wide range of cell-size distributions. Topological and geometric properties of foams and individual cells were evaluated. The theory for isotropic Plateau polyhedra describes the dependence of cell geometric properties on their volume and number of faces. The surface area of all cells is about 10% greater than a sphere of equal volume; this leads to a simple but accurate theory for the surface free energy density of foam. A novel parameter based on the surface-volume mean bubble radius R32 is used to characterize foam polydispersity. The foam energy, total cell edge length, and average number of faces per cell all decrease with increasing polydispersity. Pentagonal faces are the most common in monodisperse foam but quadrilaterals take over in highly polydisperse structures.

  9. Investments in random environments

    NASA Astrophysics Data System (ADS)

    Navarro-Barrientos, Jesús Emeterio; Cantero-Álvarez, Rubén; Matias Rodrigues, João F.; Schweitzer, Frank

    2008-03-01

    We present analytical investigations of a multiplicative stochastic process that models a simple investor dynamics in a random environment. The dynamics of the investor's budget, x(t), depends on the stochasticity of the return on investment, r(t), for which different model assumptions are discussed. The fat-tail distribution of the budget is investigated and compared with theoretical predictions. We are mainly interested in the most probable value x_mp of the budget, which reaches a constant value over time. Based on an analytical investigation of the dynamics, we are able to predict the stationary value x_mp^stat. We find a scaling law that relates the most probable value to the characteristic parameters describing the stochastic process. Our analytical results are confirmed by stochastic computer simulations that show very good agreement with the predictions.
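    A Kesten-type sketch of such a multiplicative budget process with an additive inflow, assuming log-normal returns; all parameter values and the crude mode estimate are illustrative assumptions, not the paper's model details.

```python
import numpy as np

def budget_most_probable(n_agents=10000, steps=500, a=1.0,
                         mu=-0.02, sigma=0.25, seed=8):
    """Simulate x(t+1) = r(t) * x(t) + a with log r(t) ~ Normal(mu, sigma^2).
    For mu < 0 the budget distribution becomes stationary with a fat
    (power-law) tail; the most probable value is estimated from a
    histogram of log-budgets."""
    rng = np.random.default_rng(seed)
    x = np.ones(n_agents)
    for _ in range(steps):
        r = np.exp(rng.normal(mu, sigma, n_agents))   # random return on investment
        x = r * x + a
    hist, edges = np.histogram(np.log(x), bins=80)
    return np.exp(edges[hist.argmax()])               # crude most probable value

print("estimated most probable budget:", round(budget_most_probable(), 2))
```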

  10. Structure of random foam.

    SciTech Connect

    Reinelt, Douglas A.; van Swol, Frank B.; Kraynik, Andrew Michael

    2004-06-01

    The Surface Evolver was used to compute the equilibrium microstructure of dry soap foams with random structure and a wide range of cell-size distributions. Topological and geometric properties of foams and individual cells were evaluated. The theory for isotropic Plateau polyhedra describes the dependence of cell geometric properties on their volume and number of faces. The surface area of all cells is about 10% greater than a sphere of equal volume; this leads to a simple but accurate theory for the surface free energy density of foam. A novel parameter based on the surface-volume mean bubble radius R32 is used to characterize foam polydispersity. The foam energy, total cell edge length, and average number of faces per cell all decrease with increasing polydispersity. Pentagonal faces are the most common in monodisperse foam but quadrilaterals take over in highly polydisperse structures.

  11. Generalized random sequential adsorption

    NASA Astrophysics Data System (ADS)

    Tarjus, G.; Schaaf, P.; Talbot, J.

    1990-12-01

    Adsorption of hard spherical particles onto a flat uniform surface is analyzed by using generalized random sequential adsorption (RSA) models. These models are defined by releasing the condition of immobility present in the usual RSA rules to allow for desorption or surface diffusion. Contrary to the simple RSA case, generalized RSA processes are no longer irreversible and the system formed by the adsorbed particles on the surface may reach an equilibrium state. We show by using a distribution function approach that the kinetics of such processes can be described by means of an exact infinite hierarchy of equations reminiscent of the Kirkwood-Salsburg hierarchy for systems at equilibrium. We illustrate the way in which the systems produced by adsorption/desorption and by adsorption/diffusion evolve between the two limits represented by "simple RSA" and "equilibrium" by considering approximate solutions in terms of truncated density expansions.
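    For contrast with the generalized models, plain irreversible RSA of hard disks takes only a few lines; the disk radius and attempt budget below are arbitrary illustrative choices.

```python
import math
import random

def rsa_disks(radius=0.03, attempts=20000, seed=6):
    """Simple (irreversible) random sequential adsorption of hard disks
    on the unit square: trial centres are drawn uniformly and accepted
    only if they do not overlap a previously adsorbed disk. The
    generalized models additionally allow desorption or diffusion moves.
    """
    rng = random.Random(seed)
    placed = []
    d2 = (2 * radius) ** 2                 # squared minimum centre separation
    for _ in range(attempts):
        x, y = rng.random(), rng.random()
        if all((x - px) ** 2 + (y - py) ** 2 >= d2 for px, py in placed):
            placed.append((x, y))
    coverage = len(placed) * math.pi * radius ** 2
    return len(placed), coverage           # coverage approaches the jamming limit ~0.547

print(rsa_disks())
```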

  12. Adaptive random testing with combinatorial input domain.

    PubMed

    Huang, Rubing; Chen, Jinfu; Lu, Yansheng

    2014-01-01

    Random testing (RT) is a fundamental testing technique for assessing software reliability, in which test cases are simply selected at random from the whole input domain. As an enhancement of RT, adaptive random testing (ART) has better failure-detection capability and has been widely applied in different scenarios, such as numerical programs, some object-oriented programs, and mobile applications. However, little work has been done on the effectiveness of ART for programs with combinatorial input domains (i.e., sets of categorical data). To extend these ideas to testing over combinatorial input domains, we have adopted different similarity measures that are widely used for categorical data in data mining and have proposed two similarity measures based on interaction coverage. We then propose a new version, named ART-CID, as an extension of ART to combinatorial input domains, which selects as the next test case the element of the categorical data with the lowest similarity to the already generated test cases. Experimental results show that ART-CID generally performs better than RT with respect to different evaluation metrics.
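    A sketch of the selection step under a simple overlap similarity (one hypothetical choice; the paper also evaluates interaction-coverage-based measures). The parameter domains and candidate count are illustrative.

```python
import random

def similarity(a, b):
    """Fraction of matching categorical parameters (a simple overlap measure)."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def next_test_case(executed, domains, n_candidates=10, rng=random):
    """ART for combinatorial input domains: draw random candidates and
    pick the one least similar to every already-executed test case."""
    candidates = [tuple(rng.choice(d) for d in domains) for _ in range(n_candidates)]
    return min(candidates,
               key=lambda c: max((similarity(c, e) for e in executed), default=0))

domains = [["red", "green", "blue"], ["small", "large"], ["on", "off"]]
executed = [("red", "small", "on")]
for _ in range(5):
    executed.append(next_test_case(executed, domains))
print(executed)
```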

  13. Selective Mutism

    PubMed Central

    2010-01-01

    Selective mutism is a rare and multidimensional childhood disorder that typically affects children entering school age. It is characterized by a persistent failure to speak in select social settings despite the ability to speak comfortably in more familiar settings. Many theories attempt to explain the etiology of selective mutism. Comorbidities and treatment: Selective mutism can present with a variety of comorbidities, including enuresis, encopresis, obsessive-compulsive disorder, depression, premorbid speech and language abnormalities, developmental delay, and Asperger's disorder. The specific manifestations and severity of these comorbidities vary based on the individual. Given the multidimensional manifestations of selective mutism, treatment options are similarly diverse; they include individual behavioral therapy, family therapy, and psychotherapy with antidepressants and anti-anxiety medications. Future directions: While studies have helped to elucidate the phenomenology of selective mutism, limitations and gaps in knowledge persist. In particular, the literature on selective mutism consists primarily of small sample populations and case reports. Future research aims to develop an increasingly integrated, multidimensional framework for evaluating and treating children with selective mutism. PMID:20436772

  14. Resolving social dilemmas on evolving random networks

    NASA Astrophysics Data System (ADS)

    Szolnoki, Attila; Perc, Matjaž

    2009-05-01

    We show that strategy-independent adaptations of random interaction networks can induce powerful mechanisms, ranging from the Red Queen to group selection, which promote cooperation in evolutionary social dilemmas. These two mechanisms emerge spontaneously as dynamical processes due to deletions and additions of links, which are performed whenever players adopt new strategies and after a certain number of game iterations, respectively. The potency of cooperation promotion, as well as the mechanism responsible for it, can thereby be tuned via a single parameter determining the frequency of link additions. We thus demonstrate that coevolving random networks may evoke an appropriate mechanism for each social dilemma, such that cooperation prevails even in highly unfavorable conditions.

  15. Purely antiferromagnetic magnetoelectric random access memory.

    PubMed

    Kosub, Tobias; Kopte, Martin; Hühne, Ruben; Appel, Patrick; Shields, Brendan; Maletinsky, Patrick; Hübner, René; Liedke, Maciej Oskar; Fassbender, Jürgen; Schmidt, Oliver G; Makarov, Denys

    2017-01-03

    Magnetic random access memory schemes employing magnetoelectric coupling to write binary information promise outstanding energy efficiency. We propose and demonstrate a purely antiferromagnetic magnetoelectric random access memory (AF-MERAM) that offers a remarkable 50-fold reduction of the writing threshold compared with ferromagnet-based counterparts, is robust against magnetic disturbances and exhibits no ferromagnetic hysteresis losses. Using the magnetoelectric antiferromagnet Cr2O3, we demonstrate reliable isothermal switching via gate voltage pulses and all-electric readout at room temperature. As no ferromagnetic component is present in the system, the writing magnetic field does not need to be pulsed for readout, allowing permanent magnets to be used. Based on our prototypes, we construct a comprehensive model of the magnetoelectric selection mechanisms in thin films of magnetoelectric antiferromagnets, revealing misfit induced ferrimagnetism as an important factor. Beyond memory applications, the AF-MERAM concept introduces a general all-electric interface for antiferromagnets and should find wide applicability in antiferromagnetic spintronics.

  16. Purely antiferromagnetic magnetoelectric random access memory

    NASA Astrophysics Data System (ADS)

    Kosub, Tobias; Kopte, Martin; Hühne, Ruben; Appel, Patrick; Shields, Brendan; Maletinsky, Patrick; Hübner, René; Liedke, Maciej Oskar; Fassbender, Jürgen; Schmidt, Oliver G.; Makarov, Denys

    2017-01-01

    Magnetic random access memory schemes employing magnetoelectric coupling to write binary information promise outstanding energy efficiency. We propose and demonstrate a purely antiferromagnetic magnetoelectric random access memory (AF-MERAM) that offers a remarkable 50-fold reduction of the writing threshold compared with ferromagnet-based counterparts, is robust against magnetic disturbances and exhibits no ferromagnetic hysteresis losses. Using the magnetoelectric antiferromagnet Cr2O3, we demonstrate reliable isothermal switching via gate voltage pulses and all-electric readout at room temperature. As no ferromagnetic component is present in the system, the writing magnetic field does not need to be pulsed for readout, allowing permanent magnets to be used. Based on our prototypes, we construct a comprehensive model of the magnetoelectric selection mechanisms in thin films of magnetoelectric antiferromagnets, revealing misfit induced ferrimagnetism as an important factor. Beyond memory applications, the AF-MERAM concept introduces a general all-electric interface for antiferromagnets and should find wide applicability in antiferromagnetic spintronics.

  17. Random Test Run Length and Effectiveness

    NASA Technical Reports Server (NTRS)

    Andrews, James H.; Groce, Alex; Weston, Melissa; Xu, Ru-Gang

    2008-01-01

    A poorly understood but important factor in many applications of random testing is the selection of a maximum length for test runs. Given a limited time for testing, it is seldom clear whether executing a small number of long runs or a large number of short runs maximizes utility. It is generally expected that longer runs are more likely to expose failures -- which is certainly true with respect to runs shorter than the shortest failing trace. However, longer runs produce longer failing traces, requiring more effort from humans in debugging or more resources for automated minimization. In testing with feedback, increasing ranges for parameters may also cause the probability of failure to decrease in longer runs. We show that the choice of test length dramatically impacts the effectiveness of random testing, and that the patterns observed in simple models and predicted by analysis are useful in understanding effects observed.

  18. Simulation of pedigree genotypes by random walks.

    PubMed Central

    Lange, K; Matthysse, S

    1989-01-01

    A random walk method, based on the Metropolis algorithm, is developed for simulating the distribution of trait and linkage marker genotypes in pedigrees where trait phenotypes are already known. The method complements techniques suggested by Ploughman and Boehnke and by Ott that are based on sequential sampling of genotypes within a pedigree. These methods are useful for estimating the power of linkage analysis before complete study of a pedigree is undertaken. We apply the random walk technique to a partially penetrant disease, schizophrenia, and to a recessive disease, ataxia-telangiectasia. In the first case we show that accessory phenotypes with higher penetrance than that of schizophrenia itself may be crucial for effective linkage analysis, and in the second case we show that impressionistic selection of informative pedigrees may be misleading. PMID:2589323
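    The underlying machinery is a plain Metropolis random walk. A generic sketch with an abstract target distribution follows; the pedigree-specific state space and likelihood are not reproduced here, and the toy target and proposal are assumptions for illustration.

```python
import random, math

def metropolis(logp, propose, x0, n_samples, burn=1000, seed=9):
    """Generic Metropolis sampler: propose a symmetric random-walk move
    and accept with probability min(1, p(x')/p(x))."""
    rng = random.Random(seed)
    x, lp = x0, logp(x0)
    out = []
    for i in range(burn + n_samples):
        y = propose(x, rng)
        lq = logp(y)
        if math.log(rng.random()) < lq - lp:   # Metropolis acceptance test
            x, lp = y, lq
        if i >= burn:
            out.append(x)
    return out

# Toy discrete target over "allele counts" 0..20, peaked near 12.
logp = lambda k: -((k - 12) ** 2) / 8 if 0 <= k <= 20 else float("-inf")
propose = lambda k, rng: k + rng.choice((-1, 1))
samples = metropolis(logp, propose, x0=10, n_samples=5000)
print(sum(samples) / len(samples))             # concentrates near 12
```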

  19. Pseudo-random number generator for the Sigma 5 computer

    NASA Technical Reports Server (NTRS)

    Carroll, S. N.

    1983-01-01

    A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
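    The construction described is a Lehmer (multiplicative linear congruential) generator. The sketch below uses the classic Park-Miller constants, since the Sigma 5's own prime and primitive root are not given in the abstract; those constants are an assumption for illustration.

```python
def lehmer(seed, m=2**31 - 1, a=16807):
    """Multiplicative linear congruential generator with a prime modulus
    and a primitive root as multiplier. m = 2^31 - 1 is a Mersenne prime
    and a = 16807 = 7^5 is a primitive root mod m (the Park-Miller choice).
    Yields uniform variates in (0, 1)."""
    x = seed
    while True:
        x = (a * x) % m
        yield x / m

gen = lehmer(42)
print([round(next(gen), 6) for _ in range(5)])
```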

  20. Neither fixed nor random: weighted least squares meta-regression.

    PubMed

    Stanley, T D; Doucouliagos, Hristos

    2017-03-01

    Our study revisits and challenges two core conventional meta-regression estimators: the prevalent use of 'mixed-effects' or random-effects meta-regression analysis, and the correction of standard errors that defines fixed-effects meta-regression analysis (FE-MRA). We show how, and explain why, an unrestricted weighted least squares MRA (WLS-MRA) estimator is superior to conventional random-effects (or mixed-effects) meta-regression when there is publication (or small-sample) bias, is as good as FE-MRA in all cases, and is better than fixed effects in most practical applications. Simulations and statistical theory show that WLS-MRA provides satisfactory estimates of meta-regression coefficients that are practically equivalent to mixed effects or random effects when there is no publication bias. When there is publication selection bias, WLS-MRA always has smaller bias than mixed effects or random effects. In practical applications, an unrestricted WLS meta-regression is likely to give practically equivalent or superior estimates to fixed-effects, random-effects, and mixed-effects meta-regression approaches. However, random-effects meta-regression remains viable, and perhaps somewhat preferable, if selection for statistical significance (publication bias) can be ruled out and when random, additive normal heterogeneity is known to directly affect the 'true' regression coefficient.
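    A minimal numerical sketch of the unrestricted WLS point estimator under the usual inverse-variance weighting; the simulated effects, standard errors, and moderator below are illustrative assumptions, not data from the paper.

```python
import numpy as np

def wls_mra(effects, ses, covariates):
    """Unrestricted weighted least squares meta-regression: regress the
    estimated effects on covariates with inverse-variance weights 1/SE^2
    and no multiplicative restriction on the error variance."""
    X = np.column_stack([np.ones(len(effects)), covariates])
    w = 1.0 / np.asarray(ses) ** 2
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ effects)

rng = np.random.default_rng(7)
ses = rng.uniform(0.05, 0.5, size=40)          # per-study standard errors
x = rng.normal(size=40)                        # a study-level moderator
effects = 0.3 + 0.2 * x + rng.normal(scale=ses)
print(wls_mra(effects, ses, x))                # approximately [0.3, 0.2]
```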

  1. Fractional random walk lattice dynamics

    NASA Astrophysics Data System (ADS)

    Michelitsch, T. M.; Collet, B. A.; Riascos, A. P.; Nowakowski, A. F.; Nicolleau, F. C. G. A.

    2017-02-01

    We analyze time-discrete and time-continuous 'fractional' random walks on undirected regular networks, with special focus on cubic periodic lattices in n = 1, 2, 3, ... dimensions. The fractional random walk dynamics is governed by a master equation involving fractional powers L^{α/2} of the Laplacian matrix L, where α = 2 recovers the normal walk. First we demonstrate that the interval 0 < α ≤ 2 is admissible for the fractional random walk. We derive analytical expressions for the transition matrix of the fractional random walk and the closely related average return probabilities. We further obtain the fundamental matrix Z(α) and the mean relaxation time (Kemeny constant) for the fractional random walk. The representation of the fundamental matrix Z(α) relates fractional random walks to normal random walks. We show that, for large cubic n-dimensional lattices, the matrix elements of the transition matrix of the fractional random walk exhibit a power-law decay of the type associated with the n-dimensional infinite-space Riesz fractional derivative, indicating the emergence of Lévy flights. As a further footprint of Lévy flights in n-dimensional space, the transition matrix and return probabilities of the fractional random walk are dominated at large times t by slowly relaxing long-wave modes, leading to a characteristic t^{-n/α} decay. It can be concluded that, due to the long-range moves of the fractional random walk, a small-world property emerges, increasing the efficiency of exploring the lattice when a fractional rather than a normal random walk is chosen.
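    The fractional transition matrix is easy to construct numerically on a small lattice via the eigendecomposition of the Laplacian. Below is a sketch on a one-dimensional ring; the normalization that makes the rows stochastic is a simple illustrative choice, not the paper's exact convention.

```python
import numpy as np

def fractional_transition_matrix(n=64, alpha=1.0):
    """Transition matrix of a 'fractional' random walk on a ring of n
    nodes: T = I - L^{alpha/2} / max(diag(L^{alpha/2})), with the
    fractional power taken via the eigendecomposition of the graph
    Laplacian L. alpha = 2 recovers the normal nearest-neighbour walk;
    0 < alpha < 2 produces long-range (Levy-flight-like) moves."""
    L = 2 * np.eye(n) - np.roll(np.eye(n), 1, 0) - np.roll(np.eye(n), -1, 0)
    vals, vecs = np.linalg.eigh(L)
    L_alpha = vecs @ np.diag(np.clip(vals, 0, None) ** (alpha / 2)) @ vecs.T
    return np.eye(n) - L_alpha / np.max(np.diag(L_alpha))  # rows sum to 1

T = fractional_transition_matrix(alpha=1.0)
print(T[0, :6].round(4), "row sum:", T[0].sum().round(6))
```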

  2. Does Random Dispersion Help Survival?

    NASA Astrophysics Data System (ADS)

    Schinazi, Rinaldo B.

    2015-04-01

    Many species live in colonies that prosper for a while and then collapse. After the collapse the colony survivors disperse randomly and found new colonies that may or may not make it depending on the new environment they find. We use birth and death chains in random environments to model such a population and to argue that random dispersion is a superior strategy for survival.

  3. Cluster randomized trials for pharmacy practice research.

    PubMed

    Gums, Tyler; Carter, Barry; Foster, Eric

    2016-06-01

    Introduction: Cluster randomized trials (CRTs) are now the gold standard in health services research, including pharmacy-based interventions. Studies of behaviour, epidemiology, lifestyle modifications, educational programs, and health care models are utilizing the strengths of cluster randomized analyses. Methodology: The key property of CRTs is the unit of randomization (clusters), which may be different from the unit of analysis (individual). Subject sample size and, ideally, the number of clusters are determined by the relationship of between-cluster and within-cluster variability. The correlation among participants recruited from the same cluster is known as the intraclass correlation coefficient (ICC). Generally, having more clusters with smaller ICC values will lead to smaller sample sizes. When selecting clusters, stratification before randomization may be useful in decreasing imbalances between study arms. Participant recruitment methods can differ from those of other types of randomized trials, as blinding a behavioural intervention cannot always be done. When to use: CRTs can yield results that are relevant for making "real world" decisions, and they are often used in non-therapeutic intervention studies (e.g., a change in practice guidelines). The advantages of the CRT design in pharmacy research are avoiding contamination and the generalizability of the results. A large CRT that studied physician-pharmacist collaborative management of hypertension is used in this manuscript as a CRT example. The trial, entitled Collaboration Among Pharmacists and physicians To Improve Outcomes Now (CAPTION), was implemented in primary care offices in the United States for hypertensive patients. Limitations: CRT design limitations include the need for a large number of clusters, high costs, increased training, increased monitoring, and statistical complexity.

  4. Selecting Interventions.

    ERIC Educational Resources Information Center

    Langdon, Danny G.

    1997-01-01

    Describes a systematic approach to selecting instructional designs, discussing performance analysis, gaps, elements (inputs, conditions, process, outputs, consequences, feedback), matrices, changes in performance state (establishing, improving, maintaining, and extinguishing performance), intervention interference, and involving others in…

  5. Selected References.

    ERIC Educational Resources Information Center

    Allen, Walter C.

    1987-01-01

    This extensive bibliography on library building includes 15 categories: bibliography; background; general; planning teams; building programs; alternatives to new buildings; academic libraries; public libraries; school libraries; special libraries; site selection; interior planning and equipment; maintenance; security; and moving. (MES)

  6. A Random Variable Transformation Process.

    ERIC Educational Resources Information Center

    Scheuermann, Larry

    1989-01-01

    Provides a short BASIC program, RANVAR, which generates random variates for various theoretical probability distributions. The seven variates include: uniform, exponential, normal, binomial, Poisson, Pascal, and triangular. (MVL)
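
    The record describes a BASIC program; as a hedged illustration, here is the same idea in Python for two of the seven variates via inverse-transform sampling (the abstract does not say which method RANVAR uses):

        import math
        import random

        def exponential(rate, rng=random):
            # Inverse transform: F^-1(u) = -ln(1 - u) / rate
            return -math.log(1.0 - rng.random()) / rate

        def triangular(a, c, b, rng=random):
            # Inverse transform for a triangular(a, mode=c, b) distribution
            u, f = rng.random(), (c - a) / (b - a)
            if u < f:
                return a + math.sqrt(u * (b - a) * (c - a))
            return b - math.sqrt((1 - u) * (b - a) * (b - c))

        print(exponential(2.0), triangular(0.0, 0.3, 1.0))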

  7. Phase transitions on random lattices: how random is topological disorder?

    PubMed

    Barghathi, Hatem; Vojta, Thomas

    2014-09-19

    We study the effects of topological (connectivity) disorder on phase transitions. We identify a broad class of random lattices whose disorder fluctuations decay much faster with increasing length scale than those of generic random systems, yielding a wandering exponent of ω=(d-1)/(2d) in d dimensions. The stability of clean critical points is thus governed by the criterion (d+1)ν>2 rather than the usual Harris criterion dν>2, making topological disorder less relevant than generic randomness. The Imry-Ma criterion is also modified, allowing first-order transitions to survive in all dimensions d>1. These results explain a host of puzzling violations of the original criteria for equilibrium and nonequilibrium phase transitions on random lattices. We discuss applications, and we illustrate our theory by computer simulations of random Voronoi and other lattices.

  8. Random distributed feedback fibre lasers

    NASA Astrophysics Data System (ADS)

    Turitsyn, Sergei K.; Babin, Sergey A.; Churkin, Dmitry V.; Vatnik, Ilya D.; Nikulin, Maxim; Podivilov, Evgenii V.

    2014-09-01

    The concept of random lasers exploiting multiple scattering of photons in an amplifying disordered medium to generate coherent light without a traditional laser resonator has attracted a great deal of attention in recent years. This research area lies at the interface of the fundamental theory of disordered systems and laser science. The idea was originally proposed in the context of astrophysics in the 1960s by V.S. Letokhov, who studied scattering with "negative absorption" in interstellar molecular clouds. Research on random lasers has since developed into a mature experimental and theoretical field. A simple design of such lasers would be promising for potential applications. However, in traditional random lasers the properties of the output radiation are typically characterized by complex features in the spatial, spectral and time domains, making them less attractive than standard laser systems for practical applications. Recently, an interesting and novel type of one-dimensional random laser that operates in a conventional telecommunication fibre without any pre-designed resonator mirrors, the random distributed feedback fibre laser, was demonstrated. The positive feedback required for laser generation in random fibre lasers is provided by Rayleigh scattering from the inhomogeneities of the refractive index that are naturally present in silica glass. In the proposed laser concept, the randomly backscattered light is amplified through the Raman effect, providing distributed gain over distances of up to 100 km. Although the effective reflection due to Rayleigh scattering is extremely small (~0.1%), the lasing threshold may be exceeded when a sufficiently large distributed Raman gain is provided. Such a random distributed feedback fibre laser has a number of interesting and attractive features. The fibre waveguide geometry provides transverse confinement, and effectively one-dimensional random distributed feedback leads to the generation

  9. Model Selection with the Linear Mixed Model for Longitudinal Data

    ERIC Educational Resources Information Center

    Ryoo, Ji Hoon

    2011-01-01

    Model building or model selection with linear mixed models (LMMs) is complicated by the presence of both fixed effects and random effects. The fixed effects structure and random effects structure are codependent, so selection of one influences the other. Most presentations of LMM in psychology and education are based on a multilevel or…

  10. When Is Selection Effective?

    PubMed

    Gravel, Simon

    2016-05-01

    Deleterious alleles can reach high frequency in small populations because of random fluctuations in allele frequency. This may lead, over time, to reduced average fitness. In this sense, selection is more "effective" in larger populations. Recent studies have considered whether the different demographic histories across human populations have resulted in differences in the number, distribution, and severity of deleterious variants, leading to an animated debate. This article first seeks to clarify some terms of the debate by identifying differences in definitions and assumptions used in recent studies. We argue that variants of Morton, Crow, and Muller's "total mutational damage" provide the soundest and most practical basis for such comparisons. Using simulations, analytical calculations, and 1000 Genomes Project data, we provide an intuitive and quantitative explanation for the observed similarity in genetic load across populations. We show that recent demography has likely modulated the effect of selection and still affects it, but the net result of the accumulated differences is small. Direct observation of differential efficacy of selection for specific allele classes is nevertheless possible with contemporary data sets. By contrast, identifying average genome-wide differences in the efficacy of selection across populations will require many modeling assumptions and is unlikely to provide much biological insight about human populations.

  11. Ticks of a Random clock

    NASA Astrophysics Data System (ADS)

    Jung, P.; Talkner, P.

    2010-09-01

    A simple way to convert a purely random sequence of events into a signal with a strong periodic component is proposed. The signal consists of those instants of time at which the length of the random sequence exceeds an integer multiple of a given number. The larger this number the more pronounced the periodic behavior becomes.
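
    Under one natural reading of this construction (tick whenever the running event count of a random stream first exceeds successive multiples of a chosen number T), a minimal simulation looks as follows; the names and parameters are ours:

        import random

        def random_clock(rate, T, n_ticks, seed=0):
            """Tick when the cumulative count of a Poisson event stream first
            exceeds k*T, k = 1, 2, ...; larger T gives more periodic ticks."""
            rng, t, count, ticks, k = random.Random(seed), 0.0, 0, [], 1
            while len(ticks) < n_ticks:
                t += rng.expovariate(rate)   # next event of the random sequence
                count += 1
                if count > k * T:
                    ticks.append(t)
                    k += 1
            return ticks

        ticks = random_clock(rate=1.0, T=100, n_ticks=5)
        print([round(b - a, 1) for a, b in zip(ticks, ticks[1:])])  # ~100 apart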

  12. The random continued fraction transformation

    NASA Astrophysics Data System (ADS)

    Kalle, Charlene; Kempton, Tom; Verbitskiy, Evgeny

    2017-03-01

    We introduce a random dynamical system related to continued fraction expansions. It uses random combinations of the Gauss map and the Rényi (or backwards) continued fraction map. We explore the continued fraction expansions that this system produces, as well as the dynamical properties of the system.
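
    A minimal sketch of such a random system, composing the Gauss map and the Rényi (backwards) map in random order (the selection probability p is our assumption):

        import random

        def gauss(x):   # Gauss continued-fraction map
            return (1.0 / x) % 1.0

        def renyi(x):   # Renyi (backwards) continued-fraction map
            return (1.0 / (1.0 - x)) % 1.0

        def random_cf_orbit(x0, n, p=0.5, seed=1):
            rng, x, orbit = random.Random(seed), x0, []
            for _ in range(n):
                if x in (0.0, 1.0):    # each map is singular at one endpoint
                    break
                x = gauss(x) if rng.random() < p else renyi(x)
                orbit.append(x)
            return orbit

        print(random_cf_orbit(0.7071067811865476, 8))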

  13. A brief note regarding randomization.

    PubMed

    Senn, Stephen

    2013-01-01

    This note argues, contrary to claims in this journal, that the possible existence of indefinitely many causal factors does not invalidate randomization. The effect of such factors has to be bounded by outcome, and since inference is based on a ratio of between-treatment-group to within-treatment-group variation, randomization remains valid.

  14. Quantum to classical randomness extractors

    NASA Astrophysics Data System (ADS)

    Wehner, Stephanie; Berta, Mario; Fawzi, Omar

    2013-03-01

    The goal of randomness extraction is to distill (almost) perfect randomness from a weak source of randomness. When the source yields a classical string X, many extractor constructions are known. Yet, when considering a physical randomness source, X is itself ultimately the result of a measurement on an underlying quantum system. When characterizing the power of a source to supply randomness it is hence a natural question to ask how much classical randomness we can extract from a quantum system. To tackle this question we here introduce the notion of quantum-to-classical randomness extractors (QC-extractors). We identify an entropic quantity that determines exactly how much randomness can be obtained. Furthermore, we provide constructions of QC-extractors based on measurements in a full set of mutually unbiased bases (MUBs), and certain single qubit measurements. As the first application, we show that any QC-extractor gives rise to entropic uncertainty relations with respect to quantum side information. Such relations were previously only known for two measurements. As the second application, we resolve the central open question in the noisy-storage model [Wehner et al., PRL 100, 220502 (2008)] by linking security to the quantum capacity of the adversary's storage device.

  15. Selected Health Practices Among Ohio's Rural Residents.

    ERIC Educational Resources Information Center

    Phillips, G. Howard; Pugh, Albert

    Using a stratified random sample of 12 of Ohio's 88 counties, this 1967 study had as its objectives (1) to measure the level of participation in selected health practices by Ohio's rural residents, (2) to compare the level of participation in selected health practices of farm and rural nonfarm residents, and (3) to examine levels of participation…

  16. Evaluating the random representation assumption of lexical semantics in cognitive models.

    PubMed

    Johns, Brendan T; Jones, Michael N

    2010-10-01

    A common assumption implicit in cognitive models is that lexical semantics can be approximated by using randomly generated representations to stand in for word meaning. However, the use of random representations contains the hidden assumption that semantic similarity is symmetrically distributed across randomly selected words or between instances within a semantic category. We evaluated this assumption by computing similarity distributions for randomly selected words from a number of well-known semantic measures and comparing them with the distributions from random representations commonly used in cognitive models. The similarity distributions from all semantic measures were positively skewed compared with the symmetric normal distributions assumed by random representations. We discuss potential consequences that this false assumption may have for conclusions drawn from process models that use random representations.
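
    The contrast being drawn can be reproduced in a few lines: cosine similarities between random Gaussian vectors are symmetric about zero (near-zero skewness), whereas the paper reports positive skew for real semantic measures. The dimensions and counts below are arbitrary:

        import numpy as np

        rng = np.random.default_rng(0)
        V = rng.standard_normal((2000, 300))     # 2000 "words" as random vectors
        V /= np.linalg.norm(V, axis=1, keepdims=True)
        i, j = rng.integers(0, 2000, size=(2, 20000))
        keep = i != j                            # drop accidental self-pairs
        sims = np.einsum('ij,ij->i', V[i[keep]], V[j[keep]])

        z = (sims - sims.mean()) / sims.std()
        print(f"mean={sims.mean():.4f}  skewness={np.mean(z**3):.4f}")  # both ~ 0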

  17. Aging transition by random errors

    PubMed Central

    Sun, Zhongkui; Ma, Ning; Xu, Wei

    2017-01-01

    In this paper, the effects of random errors on oscillating behaviors have been studied theoretically and numerically in a prototypical coupled nonlinear oscillator. Two kinds of noise are employed to represent measurement errors in the parameter specifying the distance from a Hopf bifurcation in the Stuart-Landau model. It is demonstrated that when the random errors are uniform random noise, increasing the noise intensity can effectively increase the robustness of the system. When the random errors are normal random noise, increasing the variance can likewise enhance the robustness of the system, provided that the probability of an aging transition reaches a certain threshold; the opposite conclusion holds when the probability is below the threshold. These findings provide an alternative candidate for controlling the critical value of the aging transition in coupled oscillator systems, which in practice are composed of active and inactive oscillators. PMID:28198430

  18. Aging transition by random errors

    NASA Astrophysics Data System (ADS)

    Sun, Zhongkui; Ma, Ning; Xu, Wei

    2017-02-01

    In this paper, the effects of random errors on oscillating behaviors have been studied theoretically and numerically in a prototypical coupled nonlinear oscillator. Two kinds of noise are employed to represent measurement errors in the parameter specifying the distance from a Hopf bifurcation in the Stuart-Landau model. It is demonstrated that when the random errors are uniform random noise, increasing the noise intensity can effectively increase the robustness of the system. When the random errors are normal random noise, increasing the variance can likewise enhance the robustness of the system, provided that the probability of an aging transition reaches a certain threshold; the opposite conclusion holds when the probability is below the threshold. These findings provide an alternative candidate for controlling the critical value of the aging transition in coupled oscillator systems, which in practice are composed of active and inactive oscillators.

  19. KASER: Knowledge Amplification by Structured Expert Randomization.

    PubMed

    Rubin, Stuart H; Murthy, S N Jayaram; Smith, Michael H; Trajković, Ljiljana

    2004-12-01

    In this paper and attached video, we present a third-generation expert system named Knowledge Amplification by Structured Expert Randomization (KASER) for which a patent has been filed by the U.S. Navy's SPAWAR Systems Center, San Diego, CA (SSC SD). KASER is a creative expert system. It is capable of deductive, inductive, and mixed derivations. Its qualitative creativity is realized by using a tree-search mechanism. The system achieves creative reasoning by using a declarative representation of knowledge consisting of object trees and inheritance. KASER computes with words and phrases. It possesses a capability for metaphor-based explanations. This capability is useful in explaining its creative suggestions and serves to augment the capabilities provided by the explanation subsystems of conventional expert systems. KASER also exhibits an accelerated capability to learn. However, this capability depends on the particulars of the selected application domain. For example, application domains such as the game of chess exhibit a high degree of geometric symmetry. Conversely, application domains such as the game of craps played with two dice exhibit no predictable pattern, unless the dice are loaded. More generally, we say that domains whose informative content can be compressed to a significant degree without loss (or with relatively little loss) are symmetric. Incompressible domains are said to be asymmetric or random. The measure of symmetry plus the measure of randomness must always sum to unity.
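
    The closing compressibility remark can be illustrated with a crude proxy that uses off-the-shelf compression as the measure; this is our illustration, not KASER's internal metric:

        import os
        import zlib

        def randomness_score(data: bytes) -> float:
            """Crude proxy: highly compressible (symmetric) data scores near 0,
            incompressible (random) data scores near 1."""
            return min(len(zlib.compress(data, 9)) / len(data), 1.0)

        symmetric = b"position repeats " * 500     # chess-like regularity
        random_ish = os.urandom(len(symmetric))    # craps-like incompressibility
        print(randomness_score(symmetric))         # close to 0
        print(randomness_score(random_ish))        # close to 1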

  20. Non-Hermitian random matrix models: Free random variable approach

    SciTech Connect

    Janik, R. A.; Nowak, M. A.; Papp, G.; Wambach, J.; Zahed, I.

    1997-04-01

    Using the standard concepts of free random variables, we show that for a large class of non-Hermitian random matrix models, the support of the eigenvalue distribution follows from their Hermitian analogs using a conformal transformation. We also extend the concepts of free random variables to the class of non-Hermitian matrices, and apply them to the models discussed by Ginibre-Girko (elliptic ensemble) [J. Ginibre, J. Math. Phys. 6, 1440 (1965); V. L. Girko, Spectral Theory of Random Matrices (in Russian) (Nauka, Moscow, 1988)] and Mahaux-Weidenmüller (chaotic resonance scattering) [C. Mahaux and H. A. Weidenmüller, Shell-Model Approach to Nuclear Reactions (North-Holland, Amsterdam, 1969)]. © 1997 The American Physical Society

  1. True Randomness from Big Data

    NASA Astrophysics Data System (ADS)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.

  2. True Randomness from Big Data

    PubMed Central

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-01-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514

  3. True Randomness from Big Data.

    PubMed

    Papakonstantinou, Periklis A; Woodruff, David P; Yang, Guang

    2016-09-26

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.

  4. Random sequential adsorption on fractals.

    PubMed

    Ciesla, Michal; Barbasz, Jakub

    2012-07-28

    Irreversible adsorption of spheres on flat collectors having dimension d < 2 is studied. Molecules are adsorbed on Sierpinski's triangle and carpet-like fractals (1 < d < 2), and on a general Cantor set (d < 1). The adsorption process is modeled numerically using the random sequential adsorption (RSA) algorithm. The paper concentrates on measuring fundamental properties of the coverages, i.e., the maximal random coverage ratio and the density autocorrelation function, as well as the RSA kinetics. The obtained results allow us to improve the phenomenological relation between the maximal random coverage ratio and the collector dimension. Moreover, simulations show that, in general, most of the known dimensional properties of adsorbed monolayers remain valid for non-integer dimensions.
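
    A one-dimensional sketch of the RSA algorithm itself, on a plain interval rather than a fractal (the parameters are illustrative):

        import random

        def rsa_line(length=200.0, diameter=1.0, attempts=20000, seed=0):
            """Random sequential adsorption of segments on a line: propose
            random centers; accept only those not overlapping earlier ones."""
            rng, centers = random.Random(seed), []
            for _ in range(attempts):
                x = rng.uniform(diameter / 2, length - diameter / 2)
                if all(abs(x - c) >= diameter for c in centers):
                    centers.append(x)
            return len(centers) * diameter / length   # coverage ratio

        # Approaches Renyi's car-parking constant (~0.7476) slowly from below
        print(rsa_line())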

  5. A random spatial sampling method in a rural developing nation

    PubMed Central

    2014-01-01

    Background: Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. Methods: We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. Results: This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Conclusions: Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available. PMID:24716473

  6. Performance of Random Effects Model Estimators under Complex Sampling Designs

    ERIC Educational Resources Information Center

    Jia, Yue; Stokes, Lynne; Harris, Ian; Wang, Yan

    2011-01-01

    In this article, we consider estimation of parameters of random effects models from samples collected via complex multistage designs. Incorporation of sampling weights is one way to reduce estimation bias due to unequal probabilities of selection. Several weighting methods have been proposed in the literature for estimating the parameters of…

  7. Record statistics of financial time series and geometric random walks.

    PubMed

    Sabir, Behlool; Santhanam, M S

    2014-09-01

    The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of selected stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5≤α≤1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of geometric random walk series are in good agreement with those obtained from empirical stock data.
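
    A simulation sketch of the quantity studied, the ages between successive record highs of a geometric random walk (the drift and volatility values are arbitrary):

        import math
        import random

        def record_ages(n_steps=100000, mu=0.0, sigma=0.01, seed=2):
            """Ages (waiting times) between successive record highs of a
            geometric random walk x_t = x_{t-1} * exp(mu + sigma * xi_t)."""
            rng = random.Random(seed)
            x, best, t_last, ages = 1.0, 1.0, 0, []
            for t in range(1, n_steps + 1):
                x *= math.exp(mu + sigma * rng.gauss(0.0, 1.0))
                if x > best:
                    best = x
                    ages.append(t - t_last)
                    t_last = t
            return ages

        ages = record_ages()
        print(len(ages), max(ages))   # few records; ages spread over many scales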

  8. Record statistics of financial time series and geometric random walks

    NASA Astrophysics Data System (ADS)

    Sabir, Behlool; Santhanam, M. S.

    2014-09-01

    The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of selected stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5≤α≤1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of geometric random walk series are in good agreement with those obtained from empirical stock data.

  9. Random property allocation: A novel geographic imputation procedure based on a complete geocoded address file.

    PubMed

    Walter, Scott R; Rose, Nectarios

    2013-09-01

    Allocating an incomplete address to randomly selected property coordinates within a locality, known as random property allocation, has many advantages over other geoimputation techniques. We compared the performance of random property allocation to four other methods under various conditions using a simulation approach. All methods performed well for large spatial units, but random property allocation was the least prone to bias and error under volatile scenarios with small units and low prevalence. Both its coordinate based approach as well as the random process of assignment contribute to its increased accuracy and reduced bias in many scenarios. Hence it is preferable to fixed or areal geoimputation for many epidemiological and surveillance applications.

  10. Selective Enumeration

    DTIC Science & Technology

    2000-07-01

    claims about properties of a specification by solving formulae derived from the specification. Ladybug is a new tool that incorporates selective enumeration...languages. Ladybug includes implementations of three significant new algorithms to help reduce the search space: bounded generation, domain coloring, and...thesis, I have applied Ladybug to a suite of software specifications, including both artificial and "real-world" specifications, to quantify the

  11. Quantifying randomness in real networks

    PubMed Central

    Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-01-01

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks—the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain—and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs. PMID:26482121
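
    The simplest rung of the dk-series ladder, the '1k' randomization that preserves every node's degree, can be sketched with double edge swaps; this is a generic illustration, not the software released with the paper:

        import random

        def double_edge_swap(edges, n_swaps=1000, seed=0):
            """Degree-preserving randomization: repeatedly pick edges (a,b),
            (c,d) and rewire to (a,d), (c,b), rejecting self-loops/multi-edges."""
            rng = random.Random(seed)
            edges = [tuple(e) for e in edges]
            present = set(frozenset(e) for e in edges)
            done = 0
            while done < n_swaps:
                i, j = rng.randrange(len(edges)), rng.randrange(len(edges))
                (a, b), (c, d) = edges[i], edges[j]
                if len({a, b, c, d}) < 4:
                    continue
                if frozenset((a, d)) in present or frozenset((c, b)) in present:
                    continue
                present -= {frozenset((a, b)), frozenset((c, d))}
                present |= {frozenset((a, d)), frozenset((c, b))}
                edges[i], edges[j] = (a, d), (c, b)
                done += 1
            return edges

        ring = [(k, (k + 1) % 20) for k in range(20)]
        print(double_edge_swap(ring, n_swaps=50)[:5])  # same degrees, new wiring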

  12. Quantifying randomness in real networks

    NASA Astrophysics Data System (ADS)

    Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-10-01

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs.

  13. Quantum-noise randomized ciphers

    SciTech Connect

    Nair, Ranjith; Yuen, Horace P.; Kumar, Prem; Corndorf, Eric; Eguchi, Takami

    2006-11-15

    We review the notion of a classical random cipher and its advantages. We sharpen the usual description of random ciphers to a particular mathematical characterization suggested by the salient feature responsible for their increased security. We describe a concrete system known as αη and show that it is equivalent to a random cipher in which the required randomization is effected by coherent-state quantum noise. We describe the currently known security features of αη and similar systems, including lower bounds on the unicity distances against ciphertext-only and known-plaintext attacks. We show how αη used in conjunction with any standard stream cipher such as the Advanced Encryption Standard provides an additional, qualitatively different layer of security from physical encryption against known-plaintext attacks on the key. We refute some claims in the literature that αη is equivalent to a nonrandom stream cipher.

  14. Cluster randomization and political philosophy.

    PubMed

    Chwang, Eric

    2012-11-01

    In this paper, I will argue that, while the ethical issues raised by cluster randomization can be challenging, they are not new. My thesis divides neatly into two parts. In the first, easier part I argue that many of the ethical challenges posed by cluster randomized human subjects research are clearly present in other types of human subjects research, and so are not novel. In the second, more difficult part I discuss the thorniest ethical challenge for cluster randomized research--cases where consent is genuinely impractical to obtain. I argue that once again these cases require no new analytic insight; instead, we should look to political philosophy for guidance. In other words, the most serious ethical problem that arises in cluster randomized research also arises in political philosophy.

  15. Quantifying randomness in real networks.

    PubMed

    Orsini, Chiara; Dankulov, Marija M; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-10-20

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs.

  16. Selective Emitters

    NASA Technical Reports Server (NTRS)

    Chubb, Donald L. (Inventor)

    1992-01-01

    This invention relates to a small particle selective emitter for converting thermal energy into narrow band radiation with high efficiency. The small particle selective emitter is used in combination with a photovoltaic array to provide a thermal to electrical energy conversion device. An energy conversion apparatus of this type is called a thermo-photovoltaic device. In the first embodiment, small diameter particles of a rare earth oxide are suspended in an inert gas enclosed between concentric cylinders. The rare earth oxides are used because they have the desired property of large emittance in a narrow wavelength band and small emittance outside the band. However, it should be emphasized that it is the smallness of the particles that enhances the radiation property. The small particle selective emitter is surrounded by a photovoltaic array. In an alternate embodiment, the small particle gas mixture is circulated through a thermal energy source. This thermal energy source can be a nuclear reactor, solar receiver, or combustor of a fossil fuel.

  17. Selective emitters

    NASA Astrophysics Data System (ADS)

    Chubb, Donald L.

    1992-01-01

    This invention relates to a small particle selective emitter for converting thermal energy into narrow band radiation with high efficiency. The small particle selective emitter is used in combination with a photovoltaic array to provide a thermal to electrical energy conversion device. An energy conversion apparatus of this type is called a thermo-photovoltaic device. In the first embodiment, small diameter particles of a rare earth oxide are suspended in an inert gas enclosed between concentric cylinders. The rare earth oxides are used because they have the desired property of large emittance in a narrow wavelength band and small emittance outside the band. However, it should be emphasized that it is the smallness of the particles that enhances the radiation property. The small particle selective emitter is surrounded by a photovoltaic array. In an alternate embodiment, the small particle gas mixture is circulated through a thermal energy source. This thermal energy source can be a nuclear reactor, solar receiver, or combustor of a fossil fuel.

  18. Staggered chiral random matrix theory

    SciTech Connect

    Osborn, James C.

    2011-02-01

    We present a random matrix theory for the staggered lattice QCD Dirac operator. The staggered random matrix theory is equivalent to the zero-momentum limit of the staggered chiral Lagrangian and includes all taste breaking terms at their leading order. This is an extension of previous work which only included some of the taste breaking terms. We will also present some results for the taste breaking contributions to the partition function and the Dirac eigenvalues.

  19. Linear equations with random variables.

    PubMed

    Tango, Toshiro

    2005-10-30

    A system of linear equations is presented where the unknowns are unobserved values of random variables. A maximum likelihood estimator assuming a multivariate normal distribution and a non-parametric proportional allotment estimator are proposed for the unobserved values of the random variables and for their means. Both estimators can be computed by simple iterative procedures and are shown to perform similarly. The methods are illustrated with data from a national nutrition survey in Japan.

  20. Large Deviations for Random Trees

    PubMed Central

    Heitsch, Christine

    2010-01-01

    We consider large random trees under Gibbs distributions and prove a Large Deviation Principle (LDP) for the distribution of degrees of vertices of the tree. The LDP rate function is given explicitly. An immediate consequence is a Law of Large Numbers for the distribution of vertex degrees in a large random tree. Our motivation for this study comes from the analysis of RNA secondary structures. PMID:20216937

  1. On Pfaffian Random Point Fields

    NASA Astrophysics Data System (ADS)

    Kargin, V.

    2014-02-01

    We study Pfaffian random point fields by using the Moore-Dyson quaternion determinants. First, we give sufficient conditions that ensure that a self-dual quaternion kernel defines a valid random point field, and then we prove a CLT for Pfaffian point fields. The proofs are based on a new quaternion extension of the Cauchy-Binet determinantal identity. In addition, we derive the Fredholm determinantal formulas for the Pfaffian point fields which use the quaternion determinant.

  2. The MIXMAX random number generator

    NASA Astrophysics Data System (ADS)

    Savvidy, Konstantin G.

    2015-11-01

    In this paper, we study the randomness properties of unimodular matrix random number generators. Under well-known conditions, these discrete-time dynamical systems have the highly desirable K-mixing properties which guarantee high-quality random numbers. It is found that some widely used random number generators have poor Kolmogorov entropy and consequently fail empirical tests of randomness. These tests show that the lowest acceptable value of the Kolmogorov entropy is around 50. Next, we provide a solution to the problem of determining the maximal period of unimodular matrix generators of pseudo-random numbers. We formulate the necessary and sufficient condition to attain the maximum period and present a family of specific generators in the MIXMAX family with superior performance and excellent statistical properties. Finally, we construct three efficient algorithms for operations with the MIXMAX matrix, which is a multi-dimensional generalization of the famous cat map: the first computes multiplication by the MIXMAX matrix in O(N) operations; the second recursively computes its characteristic polynomial in O(N²) operations; and the third applies skips of a large number of steps S to the sequence in O(N² log(S)) operations.
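
    The 'cat map' mentioned above is the N = 2 ancestor of the MIXMAX matrix; a floating-point toy version of such a unimodular-matrix generator follows (illustrative only; production generators work over integers for exactness):

        def cat_map_generator(x=0.12345, y=0.67891, n=5):
            """Iterate Arnold's cat map, the unimodular matrix [[2, 1], [1, 1]]
            acting mod 1; MIXMAX generalizes this to large N and long periods."""
            for _ in range(n):
                x, y = (2 * x + y) % 1.0, (x + y) % 1.0
                yield x

        print(list(cat_map_generator()))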

  3. Nematode-Bacteria Mutualism: Selection Within the Mutualism Supersedes Selection Outside of the Mutualism

    PubMed Central

    Morran, Levi T.; Penley, McKenna J.; Byrd, Victoria S.; Meyer, Andrew J.; O’Sullivan, Timothy S.; Bashey, Farrah; Goodrich-Blair, Heidi; Lively, Curtis M.

    2016-01-01

    The coevolution of interacting species can lead to co-dependent mutualists. Little is known about the effect of selection on partners within versus apart from the association. Here, we determined the effect of selection on bacteria (Xenorhabdus nematophila) both within and apart from its mutualistic partner (a nematode, Steinernema carpocapsae). In nature, the two species cooperatively infect and kill arthropods. We passaged the bacteria either together with (M+), or isolated from (M−), nematodes under two different selection regimes: random selection (S−) and selection for increased virulence against arthropod hosts (S+). We found that the isolated bacteria evolved greater virulence under selection for greater virulence (M−S+) than under random selection (M−S−). In addition, the response to selection in the isolated bacteria (M−S+) caused a breakdown of the mutualism following reintroduction to the nematode. Finally, selection for greater virulence did not alter the evolutionary trajectories of bacteria passaged within the mutualism (M+S+ = M+S−), indicating that selection for the maintenance of the mutualism was stronger than selection for increased virulence. The results show that selection on isolated mutualists can rapidly break down beneficial interactions between species, but that selection within a mutualism can supersede external selection, potentially generating co-dependence over time. PMID:26867502

  4. Opinion dynamics with similarity-based random neighbors

    NASA Astrophysics Data System (ADS)

    Liu, Qipeng; Wang, Xiaofan

    2013-10-01

    A typical assumption made in the existing opinion formation models is that two individuals can communicate with each other only if the distance between their opinions is less than a threshold called bound of confidence. However, in the real world it is quite possible that people may also have a few friends with quite different opinions. To model this situation, we propose a bounded confidence plus random selection model, in which each agent has several long-range neighbors outside the bound who are selected according to a similarity-based probability rule. We find that the opinions of all agents can reach a consensus in bounded time. We further consider the situation when agents ignore the bound of confidence and select all their neighbors randomly according to the similarity-based probability rule. We prove that in this scenario the whole group could also reach a consensus but in the probability sense.

  5. EDITORIAL: Nanotechnological selection

    NASA Astrophysics Data System (ADS)

    Demming, Anna

    2013-01-01

    At the nanoscale, measures can move from mass-scale analogue calibration to counts of discrete units. The shift redefines the levels of control that can be achieved in a system if adequate selectivity can be imposed. For example, as ionic substances pass through nanoscale pores, the quantity of ions is low enough that the pore can contain either negative or positive ions. Yet precise control over this selectivity still raises difficulties. In this issue researchers address the challenge of how to regulate the ionic selectivity of negative and positive charges with the use of an external charge. The approach may be useful for controlling the behaviour, properties and chemical composition of liquids and has possible technical applications for nanofluidic field effect transistors [1]. Selectivity is a critical advantage in the administration of drugs. Nanoparticles functionalized with targeting moieties can allow delivery of anti-cancer drugs to tumour cells, whilst avoiding healthy cells and hence reducing some of the debilitating side effects of cancer treatments [2]. Researchers in Belarus and the US developed a new theranostic approach, combining therapy and diagnosis, to support the evident benefits of cellular selectivity that can be achieved when nanoparticles are applied in medicine [3]. Their process uses nanobubbles of photothermal vapour, referred to as plasmonic nanobubbles, generated by plasmonic excitations in gold nanoparticles conjugated to diagnosis-specific antibodies. The intracellular plasmonic nanobubbles are controlled by laser fluence so that the response can be tuned in individual living cells. Lower fluence allows non-invasive high-sensitivity imaging for diagnosis, and higher fluence can disrupt the cellular membrane for treatments. The selective response of carbon nanotubes to different gases has led them to be used in various types of sensors, as summarized in a review by researchers at the University of

  6. On Convergent Probability of a Random Walk

    ERIC Educational Resources Information Center

    Lee, Y.-F.; Ching, W.-K.

    2006-01-01

    This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrent relations is used to obtain the convergent probability of the random walk with different initial positions.

  7. Wave propagation through a random medium - The random slab problem

    NASA Technical Reports Server (NTRS)

    Acquista, C.

    1978-01-01

    The first-order smoothing approximation yields integral equations for the mean and the two-point correlation function of a wave in a random medium. A method is presented for the approximate solution of these equations that combines features of the eiconal approximation and of the Born expansion. This method is applied to the problem of reflection and transmission of a plane wave by a slab of a random medium. Both the mean wave and the covariance are calculated to determine the reflected and transmitted amplitudes and intensities.

  8. Cover times of random searches

    NASA Astrophysics Data System (ADS)

    Chupeau, Marie; Bénichou, Olivier; Voituriez, Raphaël

    2015-10-01

    How long must one undertake a random search to visit all sites of a given domain? This time, known as the cover time, is a key observable to quantify the efficiency of exhaustive searches, which require a complete exploration of an area and not only the discovery of a single target. Examples range from immune-system cells chasing pathogens to animals harvesting resources, from robotic exploration for cleaning or demining to the task of improving search algorithms. Despite its broad relevance, the cover time has remained elusive and so far explicit results have been scarce and mostly limited to regular random walks. Here we determine the full distribution of the cover time for a broad range of random search processes, including Lévy strategies, intermittent strategies, persistent random walks and random walks on complex networks, and reveal its universal features. We show that for all these examples the mean cover time can be minimized, and that the corresponding optimal strategies also minimize the mean search time for a single target, unambiguously pointing towards their robustness.
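
    A minimal simulation of the observable in question, the cover time of a simple random walk on a ring, whose exact mean for an N-site cycle is the classical N(N-1)/2:

        import random

        def cover_time_ring(N, seed=None):
            """Steps needed for a simple random walk to visit all N ring sites."""
            rng = random.Random(seed)
            pos, seen, steps = 0, {0}, 0
            while len(seen) < N:
                pos = (pos + rng.choice((-1, 1))) % N
                seen.add(pos)
                steps += 1
            return steps

        trials = [cover_time_ring(50, seed=s) for s in range(200)]
        print(sum(trials) / len(trials))   # near the exact mean 50*49/2 = 1225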

  9. Forest Fires in a Random Forest

    NASA Astrophysics Data System (ADS)

    Leuenberger, Michael; Kanevski, Mikhaïl; Vega Orozco, Carmen D.

    2013-04-01

    Forest fires in Canton Ticino (Switzerland) are very complex phenomena. Meteorological data can explain some occurrences of fires in time, but not necessarily in space. Using anthropogenic and geographical feature data with the random forest algorithm, this study tries to highlight the factors that most influence fire ignition and to identify areas under risk. The fundamental scientific problem considered in the present research is the application of random forest algorithms to the analysis and modeling of forest fire patterns in a high-dimensional input feature space. This study is focused on the 2,224 anthropogenic forest fires among the 2,401 forest fire ignition points that occurred in Canton Ticino from 1969 to 2008. Provided by the Swiss Federal Institute for Forest, Snow and Landscape Research (WSL), the database characterizes each fire by its location (x,y coordinates of the ignition point), start date, duration, burned area, and other information such as ignition cause and topographic features (slope, aspect, altitude, etc.). In addition, the database VECTOR25 from SwissTopo was used to extract the distances between fire ignition points and anthropogenic structures such as buildings, the road network, the rail network, etc. Developed by L. Breiman and A. Cutler, the Random Forests (RF) algorithm provides an ensemble of classification and regression trees. By pseudo-random variable selection at each split node, the method grows a variety of decision trees that do not return the same results, and thus, by a committee system, returns values with better accuracy than other machine learning methods. The algorithm directly incorporates a measure of variable importance, which is used to display the factors affecting forest fires. With this measure, several models can be fitted, and a prediction can then be made throughout the validity domain of Canton Ticino. Comprehensive RF analysis was carried out in order to 1
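
    A toy sketch of the variable-importance mechanism described above, using scikit-learn on synthetic data; the feature names and ignition rule are invented and are not the study's WSL/VECTOR25 data:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        names = ["slope", "altitude", "dist_to_road", "dist_to_building"]
        X = rng.random((1000, 4))
        y = (X[:, 2] < 0.3).astype(int)    # invented rule: ignition near roads

        rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        for name, imp in zip(names, rf.feature_importances_):
            print(f"{name:18s} {imp:.3f}")  # dist_to_road dominates, as built in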

  10. The XXZ Heisenberg model on random surfaces

    NASA Astrophysics Data System (ADS)

    Ambjørn, J.; Sedrakyan, A.

    2013-09-01

    We consider integrable models, or in general any model defined by an R-matrix, on random surfaces, which are discretized using random Manhattan lattices. The set of random Manhattan lattices is defined as the set dual to the lattice random surfaces embedded on a regular d-dimensional lattice. They can also be associated with the random graphs of multiparticle scattering nodes. As an example we formulate a random matrix model whose partition function reproduces the annealed average of the XXZ Heisenberg model over all random Manhattan lattices. A technique is presented which reduces the random matrix integration in the partition function to an integration over eigenvalues.

  11. Site selection

    SciTech Connect

    Olsen, C.W.

    1983-07-01

    The conditions and criteria for selecting a site for a nuclear weapons test at the Nevada Test Site are summarized. Factors considered are: (1) scheduling of drill rigs, (2) scheduling of site preparation (dirt work, auger hole, surface casing, cementing), (3) schedule of event (when are drill hole data needed), (4) depth range of proposed W.P., (5) geologic structure (faults, Pz contact, etc.), (6) stratigraphy (alluvium, location of Grouse Canyon Tuff, etc.), (7) material properties (particularly montmorillonite and CO/sub 2/ content), (8) water table depth, (9) potential drilling problems (caving), (10) adjacent collapse craters and chimneys, (11) adjacent expended but uncollapsed sites, (12) adjacent post-shot or other small diameter holes, (13) adjacent stockpile emplacement holes, (14) adjacent planned events (including LANL), (15) projected needs of Test Program for various DOB's and operational separations, and (16) optimal use of NTS real estate.

  12. Triangulation in Random Refractive Distortions.

    PubMed

    Alterman, Marina; Schechner, Yoav Y; Swirski, Yohay

    2017-03-01

    Random refraction occurs in turbulence and through a wavy water-air interface. It creates distortion that changes in space, time and with viewpoint. Localizing objects in three dimensions (3D) despite this random distortion is important to some predators and also to submariners avoiding the salient use of periscopes. We take a multiview approach to this task. Refracted distortion statistics induce a probabilistic relation between any pixel location and a line of sight in space. Measurements of an object's random projection from multiple views and times lead to a likelihood function of the object's 3D location. The likelihood leads to estimates of the 3D location and its uncertainty. Furthermore, multiview images acquired simultaneously in a wide stereo baseline have uncorrelated distortions. This helps reduce the acquisition time needed for localization. The method is demonstrated in stereoscopic video sequences, both in a lab and a swimming pool.

  13. Risk, randomness, crashes and quants

    NASA Astrophysics Data System (ADS)

    Farhadi, Alessio; Vvedensky, Dimitri

    2003-03-01

    Market movements, whether short-term fluctuations, long-term trends, or sudden surges or crashes, have an immense and widespread economic impact. These movements are suggestive of the complex behaviour seen in many non-equilibrium physical systems. Not surprisingly, therefore, the characterization of market behaviour presents an inviting challenge to the physical sciences and, indeed, many concepts and methods developed for modelling non-equilibrium natural phenomena have found fertile ground in financial settings. In this review, we begin with the simplest random process, the random walk, and, assuming no prior knowledge of markets, build up to the conceptual and computational machinery used to analyse and model the behaviour of financial systems. We then consider the evidence that calls into question several aspects of the random walk model of markets and discuss some ideas that have been put forward to account for the observed discrepancies. The application of all of these methods is illustrated with examples of actual market data.

  14. Balancing Participation across Students in Large College Classes via Randomized Participation Credit

    ERIC Educational Resources Information Center

    McCleary, Daniel F.; Aspiranti, Kathleen B.; Foster, Lisa N.; Blondin, Carolyn A.; Gaylon, Charles E.; Yaw, Jared S.; Forbes, Bethany N.; Williams, Robert L.

    2011-01-01

    The study examines the effects of randomized credit on the percentage of students participating at four predefined levels. Students recorded their comments on specially designed record cards, and days were randomly selected for participation credit. This arrangement balanced participation across students while cutting instructor time for recording…

  15. Analysis of Quantitative Traits in Two Long-Term Randomly Mated Soybean Populations I. Genetic Variances

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The genetic effects of long term random mating and natural selection aided by genetic male sterility were evaluated in two soybean [Glycine max (L.) Merr.] populations: RSII and RSIII. Population means, variances, and heritabilities were estimated to determine the effects of 26 generations of random...

  16. 78 FR 71036 - Pipeline Safety: Random Drug Testing Rate; Contractor Management Information System Reporting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-27

    ... for Random Drug Testing: Operators of gas, hazardous liquid, and carbon dioxide pipelines and operators of liquefied natural gas facilities must randomly select and test a percentage of covered employees...://opsweb.phmsa.dot.gov/portal_message/PHMSA_Portal_Registration.pdf . Pursuant to Sec. Sec. 199.119(a)...

  17. Granulator Selection

    SciTech Connect

    Gould, T H; Armantrout, G

    1999-08-02

    Following our detailed review of the granulation reports and additional conversations with process and development personnel, we have reached a consensus position regarding granulator selection. At this time, we recommend going forward with implementation of the tumbling granulator approach (GEMCO) based on our assessment of the tested granulation techniques using the established criteria. The basis for this selection is summarized in the following sections, followed by our recommendations for proceeding with implementation of the tumbling granulation approach. All five granulation technologies produced granulated products that can be made into acceptable sintered pucks. A possible exception is the product from the fluidized bed granulator. This material has been more difficult to press into uniform pucks without subsequent cracking of the puck during the sintering cycle for the pucks in this series of tests. This problem may be an artifact of the conditions of the particular granulation demonstration run involved, but earlier results have also been mixed. All granulators made acceptable granulated feed from the standpoint of transfer and press feeding, though the roller compactor and fluidized bed products were dustier than the rest. There was also differentiation among the granulators in the operational areas of (1) potential for process upset, (2) plant implementation and operational complexity, and (3) maintenance concerns. These considerations will be discussed further in the next section. Note that concerns also exist regarding the extension of the granulation processes to powders containing actinides. Only the method that involves tumbling and moisture addition has been tested with uranium, and in that instance, significant differences were found in the granulation behavior of the powders.

  18. Systematic random sampling of the comet assay.

    PubMed

    McArt, Darragh G; Wasson, Gillian R; McKerr, George; Saetzler, Kurt; Reed, Matt; Howard, C Vyvyan

    2009-07-01

    The comet assay is a technique used to quantify DNA damage and repair at a cellular level. In the assay, cells are embedded in agarose and the cellular content is stripped away leaving only the DNA trapped in an agarose cavity which can then be electrophoresed. The damaged DNA can enter the agarose and migrate while the undamaged DNA cannot and is retained. DNA damage is measured as the proportion of the migratory 'tail' DNA compared to the total DNA in the cell. The fundamental basis of these arbitrary values is obtained in the comet acquisition phase using fluorescence microscopy with a stoichiometric stain in tandem with image analysis software. Comets scored in such an acquisition are expected to be selected both objectively and at random. In this paper we examine the 'randomness' of the acquisition phase and suggest an alternative method that offers objective and unbiased comet selection. To achieve this, we have adopted a survey sampling approach widely used in stereology, which offers a method of systematic random sampling (SRS). This is desirable as it offers an impartial and reproducible method of comet analysis that can be used either manually or in automated form. By making use of an unbiased sampling frame and microscope verniers, we are able to increase the precision of estimates of DNA damage. Results obtained from a multiple-user pooled-variation experiment showed that the SRS technique attained lower variability than the traditional approach. The analysis of a single user with repetition showed greater individual variances while not being detrimental to overall averages. This suggests that the SRS method offers a better reflection of DNA damage for a given slide and also offers better user reproducibility.
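
    The SRS idea itself is compact: one random start within the first sampling interval, then every k-th item. A generic sketch (not the authors' microscope protocol):

        import random

        def systematic_random_sample(items, k, seed=None):
            """Systematic random sampling: a single random start in [0, k),
            then every k-th item; each item has inclusion probability 1/k."""
            start = random.Random(seed).randrange(k)
            return items[start::k]

        comets = [f"comet_{i:03d}" for i in range(100)]
        print(systematic_random_sample(comets, k=10, seed=7))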

  19. Feature Learning Based Random Walk for Liver Segmentation

    PubMed Central

    Zheng, Yongchang; Ai, Danni; Zhang, Pan; Gao, Yefei; Xia, Likun; Du, Shunda; Sang, Xinting; Yang, Jian

    2016-01-01

    Liver segmentation is a significant processing technique for computer-assisted diagnosis. This method has attracted considerable attention and achieved effective results. However, liver segmentation using computed tomography (CT) images remains a challenging task because of the low contrast between the liver and adjacent organs. This paper proposes a feature-learning-based random walk method for liver segmentation using CT images. Four texture features were extracted and then classified to determine the classification probability corresponding to the test images. Seed points on the original test image were automatically selected and further used in the random walk (RW) algorithm to achieve results comparable to previous segmentation methods. PMID:27846217
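    The seeded random walk step can be sketched with scikit-image's random_walker; the synthetic image and hand-placed seeds below stand in for the paper's learned probability maps and automatic seed selection:

    ```python
    import numpy as np
    from skimage.segmentation import random_walker

    # Synthetic stand-in for a CT slice: a bright "organ" blob on a noisy background.
    rng = np.random.default_rng(0)
    image = np.zeros((128, 128))
    image[40:90, 30:100] = 1.0
    image += 0.3 * rng.standard_normal(image.shape)

    # Seed labels: 1 = organ, 2 = background, 0 = to be decided by the walk.
    labels = np.zeros(image.shape, dtype=np.uint8)
    labels[60:70, 60:70] = 1
    labels[:10, :] = 2

    # Each unlabeled pixel receives the seed label that a random walker
    # starting there is most likely to reach first.
    segmentation = random_walker(image, labels, beta=130, mode='bf')
    ```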

  20. Using Psychokinesis to Explore the Nature of Quantum Randomness

    NASA Astrophysics Data System (ADS)

    Burns, Jean E.

    2011-11-01

    In retrocausation different causal events can produce different successor events, yet a successor event reflecting a particular cause occurs before the causal event does. It is sometimes proposed that the successor event is determined by propagation of the causal effect backwards in time via the dynamical equations governing the events. However, because dynamical equations are time reversible, the evolution of the system is not subject to change. Therefore, the backward propagation hypothesis implies that what may have seemed to be an arbitrary selection of a causal factor was in reality predetermined. Yet quantum randomness can be used to determine the causal factor, and a quantum random event is ordinarily thought of as being arbitrarily generated. So we must ask, when quantum random events occur, are they arbitrary (subject to their probabilistic constraints) or are they predetermined? Because psychokinesis (PK) can act on quantum random events, it can be used as a probe to explore questions such as the above. It is found that if quantum random events are predetermined (aside from the action of PK), certain types of experimental design can show enhanced PK through the use of precognition. Actual experiments are examined and compared, and most of those for which the design is especially suitable for showing this effect had unusually low p values for the number of trials. It is concluded that either the experimenter produced a remarkably strong experimenter effect or quantum random events are predetermined, thereby enabling enhanced PK in suitable experimental designs.

  1. Mixing rates and limit theorems for random intermittent maps

    NASA Astrophysics Data System (ADS)

    Bahsoun, Wael; Bose, Christopher

    2016-04-01

    We study random transformations built from intermittent maps on the unit interval that share a common neutral fixed point. We focus mainly on random selections of Pomeau-Manneville-type maps T_α, using, in general, the full parameter range 0 < α < ∞. We derive a number of results around a common theme that illustrates in detail how the constituent map that is fastest mixing (i.e. smallest α), combined with details of the randomizing process, determines the asymptotic properties of the random transformation. Our key result (theorem 1.1) establishes sharp estimates on the position of return time intervals for the quenched dynamics. The main applications of this estimate are to limit laws (in particular, CLT and stable laws, depending on the parameters chosen in the range 0 < α < 1) for the associated skew product; these are detailed in theorem 3.2. Since our estimates in theorem 1.1 also hold for 1 ≤ α < ∞, we study a second class of random transformations derived from piecewise affine Gaspard-Wang maps, prove existence of an infinite (σ-finite) invariant measure and study the corresponding correlation asymptotics. To the best of our knowledge, this latter kind of result is completely new in the setting of random transformations.
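    As a concrete illustration, a quenched orbit of such a random transformation can be simulated by composing maps drawn at random at each step. The sketch below uses the Liverani-Saussol-Vaienti form of the Pomeau-Manneville family and an i.i.d. selection rule, both simplifying assumptions:

    ```python
    import random

    def lsv_map(x, alpha):
        # Liverani-Saussol-Vaienti form of a Pomeau-Manneville map with
        # a neutral fixed point at x = 0.
        if x <= 0.5:
            return x * (1.0 + (2.0 * x) ** alpha)
        return 2.0 * x - 1.0

    def random_orbit(x0, alphas, n):
        # Quenched random dynamics: at every step one constituent map
        # T_alpha is selected at random (i.i.d. here, for simplicity).
        x, orbit = x0, []
        for _ in range(n):
            x = lsv_map(x, random.choice(alphas))
            orbit.append(x)
        return orbit

    orbit = random_orbit(0.3, alphas=[0.2, 0.8], n=10_000)
    ```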

  2. Using Psychokinesis to Explore the Nature of Quantum Randomness

    SciTech Connect

    Burns, Jean E.

    2011-11-29

    In retrocausation different causal events can produce different successor events, yet a successor event reflecting a particular cause occurs before the causal event does. It is sometimes proposed that the successor event is determined by propagation of the causal effect backwards in time via the dynamical equations governing the events. However, because dynamical equations are time reversible, the evolution of the system is not subject to change. Therefore, the backward propagation hypothesis implies that what may have seemed to be an arbitrary selection of a causal factor was in reality predetermined. Yet quantum randomness can be used to determine the causal factor, and a quantum random event is ordinarily thought of as being arbitrarily generated. So we must ask, when quantum random events occur, are they arbitrary (subject to their probabilistic constraints) or are they predetermined? Because psychokinesis (PK) can act on quantum random events, it can be used as a probe to explore questions such as the above. It is found that if quantum random events are predetermined (aside from the action of PK), certain types of experimental design can show enhanced PK through the use of precognition. Actual experiments are examined and compared, and most of those for which the design is especially suitable for showing this effect had unusually low p values for the number of trials. It is concluded that either the experimenter produced a remarkably strong experimenter effect or quantum random events are predetermined, thereby enabling enhanced PK in suitable experimental designs.

  3. A New Random Walk for Replica Detection in WSNs

    PubMed Central

    Aalsalem, Mohammed Y.; Saad, N. M.; Hossain, Md. Shohrab; Atiquzzaman, Mohammed; Khan, Muhammad Khurram

    2016-01-01

    Wireless Sensor Networks (WSNs) are vulnerable to Node Replication attacks or Clone attacks. Among all the existing clone detection protocols in WSNs, RAWL shows the most promising results by employing Simple Random Walk (SRW). More recently, RAND outperforms RAWL by incorporating Network Division with SRW. Both RAND and RAWL have used SRW for random selection of witness nodes, which is problematic because the walk frequently revisits previously passed nodes, leading to longer delays, higher energy expenditure, and a lower probability that witness nodes intersect. To circumvent this problem, we propose to employ a new kind of constrained random walk, namely Single Stage Memory Random Walk, and present a distributed technique called SSRWND (Single Stage Memory Random Walk with Network Division). In SSRWND, single stage memory random walk is combined with network division, aiming to decrease the communication and memory costs while keeping the detection probability higher. Through intensive simulations it is verified that SSRWND guarantees higher witness node security with moderate communication and memory overheads. SSRWND is expedient for security-oriented WSN application fields such as military and medical. PMID:27409082
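    The constraint is simply that the walk remembers its previous node and never immediately backtracks. A minimal sketch on an adjacency-list graph (the graph and step count are illustrative):

    ```python
    import random

    def ssm_random_walk(adj, start, steps):
        # Single Stage Memory Random Walk: remember one previous node and
        # exclude it from the next-hop candidates, so the walk never
        # immediately backtracks (unless it reaches a dead end).
        prev, cur, path = None, start, [start]
        for _ in range(steps):
            candidates = [v for v in adj[cur] if v != prev] or adj[cur]
            prev, cur = cur, random.choice(candidates)
            path.append(cur)
        return path

    adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}
    print(ssm_random_walk(adj, start=0, steps=8))
    ```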

  4. Fragmentation of Fractal Random Structures

    NASA Astrophysics Data System (ADS)

    Elçi, Eren Metin; Weigel, Martin; Fytas, Nikolaos G.

    2015-03-01

    We analyze the fragmentation behavior of random clusters on the lattice under a process where bonds between neighboring sites are successively broken. Modeling such structures by configurations of a generalized Potts or random-cluster model allows us to discuss a wide range of systems with fractal properties including trees as well as dense clusters. We present exact results for the densities of fragmenting edges and the distribution of fragment sizes for critical clusters in two dimensions. Dynamical fragmentation with a size cutoff leads to broad distributions of fragment sizes. The resulting power laws are shown to encode characteristic fingerprints of the fragmented objects.

  5. Random matrix theory within superstatistics.

    PubMed

    Abul-Magd, A Y

    2005-12-01

    We propose a generalization of the random matrix theory following the basic prescription of the recently suggested concept of superstatistics. Spectral characteristics of systems with mixed regular-chaotic dynamics are expressed as weighted averages of the corresponding quantities in the standard theory assuming that the mean level spacing itself is a stochastic variable. We illustrate the method by calculating the level density, the nearest-neighbor-spacing distributions, and the two-level correlation functions for systems in transition from order to chaos. The calculated spacing distribution fits the resonance statistics of random binary networks obtained in a recent numerical experiment.

  6. Neutron transport in random media

    SciTech Connect

    Makai, M.

    1996-08-01

    The survey reviews the methods available in the literature which allow a discussion of corium recriticality after a severe accident and a characterization of the corium. It appears that to date no one has considered the eigenvalue problem, though for the source problem several approaches have been proposed. The mathematical formulation of a random medium may be approached in different ways. Based on the review of the literature, we can draw three basic conclusions. The problem of static, random perturbations has been solved. The static case is tractable by the Monte Carlo method. There is a specific time dependent case for which the average flux is given as a series expansion.

  7. Molecular random tilings as glasses

    PubMed Central

    Garrahan, Juan P.; Stannard, Andrew; Blunt, Matthew O.; Beton, Peter H.

    2009-01-01

    We have recently shown that p-terphenyl-3,5,3′,5′-tetracarboxylic acid adsorbed on graphite self-assembles into a two-dimensional rhombus random tiling. This tiling is close to ideal, displaying long-range correlations punctuated by sparse localized tiling defects. In this article we explore the analogy between dynamic arrest in this type of random tilings and that of structural glasses. We show that the structural relaxation of these systems is via the propagation–reaction of tiling defects, giving rise to dynamic heterogeneity. We study the scaling properties of the dynamics and discuss connections with kinetically constrained models of glasses. PMID:19720990

  8. Synchronizability of random rectangular graphs

    SciTech Connect

    Estrada, Ernesto Chen, Guanrong

    2015-08-15

    Random rectangular graphs (RRGs) represent a generalization of the random geometric graphs in which the nodes are embedded into hyperrectangles instead of on hypercubes. The synchronizability of the RRG model is studied. Both upper and lower bounds of the eigenratio of the network Laplacian matrix are determined analytically. It is proven that as the rectangular network becomes more elongated, the network becomes harder to synchronize. The synchronization processing behavior of a RRG network of chaotic Lorenz system nodes is numerically investigated, showing complete consistency with the theoretical results.

  9. Speckle spectroscopy of fluorescent randomly inhomogeneous media

    NASA Astrophysics Data System (ADS)

    Zimnyakov, D. A.; Asharchuk, I. A.; Yuvchenko, S. A.; Sviridov, A. P.

    2016-11-01

    We propose a coherence optical method for probing fluorescent randomly inhomogeneous media based on the statistical analysis of spatial fluctuations of spectrally selected fluorescence radiation. We develop a phenomenological model that interrelates the flicker index of the spatial distribution of the fluorescence intensity at a fixed wavelength and the mean path difference of partial components of the fluorescence radiation field in the probed medium. The results of experimental testing of the developed method using layers of densely packed silicon dioxide particles saturated with an aqueous rhodamine 6G solution with a high concentration of the dye are presented. The experimentally observed significant decrease in the flicker index as the wavelength is tuned from the edges of the fluorescence spectrum towards its central part is presumably a manifestation of spectrally dependent negative absorption in the medium.

  10. Background Extraction Using Random Walk Image Fusion.

    PubMed

    Hua, Kai-Lung; Wang, Hong-Cyuan; Yeh, Chih-Hsiang; Cheng, Wen-Huang; Lai, Yu-Chi

    2016-12-23

    It is important to extract a clear background for computer vision and augmented reality. Generally, background extraction assumes the existence of a clean background shot through the input sequence, but realistic situations such as highway traffic videos may violate this assumption. Therefore, our probabilistic model-based method formulates fusion of candidate background patches of the input sequence as a random walk problem and seeks a globally optimal solution based on their temporal and spatial relationship. Furthermore, we also design two quality measures that consider spatial and temporal coherence and contrast distinctness among pixels as the basis for background selection. A static background should have high temporal coherence among frames, and thus we improve our fusion precision with a temporal contrast filter and an optical-flow-based motionless patch extractor. Experiments demonstrate that our algorithm can successfully extract artifact-free background images with low computational cost compared to state-of-the-art algorithms.

  11. Two-Stage Modelling Of Random Phenomena

    NASA Astrophysics Data System (ADS)

    Barańska, Anna

    2015-12-01

    The main objective of this publication was to present a two-stage algorithm for modelling random phenomena, based on multidimensional function modelling, using two examples: modelling the real estate market for the purpose of real estate valuation, and estimating the model parameters of vertical displacements of foundations. The first stage of the presented algorithm includes selection of a suitable form of the function model. In the classical algorithms, based on function modelling, the prediction of the dependent variable is its value obtained directly from the model. The better the model reflects the relationship between the independent variables and their effect on the dependent variable, the more reliable is the model value. In this paper, an algorithm has been proposed which comprises adjustment of the value obtained from the model with a random correction determined from the residuals of the model for those cases which, in a separate analysis, were considered to be the most similar to the object for which we want to model the dependent variable. The effect of applying the developed quantitative procedures for calculating the corrections, and of the qualitative methods used to assess similarity, on the final outcome of the prediction and its accuracy was examined by statistical methods, mainly using appropriate parametric tests of significance. The idea of the presented algorithm is to approximate the value of the dependent variable of the studied phenomenon to its value in reality and, at the same time, to have it "smoothed out" by a well-fitted modelling function.

  12. Purely antiferromagnetic magnetoelectric random access memory

    PubMed Central

    Kosub, Tobias; Kopte, Martin; Hühne, Ruben; Appel, Patrick; Shields, Brendan; Maletinsky, Patrick; Hübner, René; Liedke, Maciej Oskar; Fassbender, Jürgen; Schmidt, Oliver G.; Makarov, Denys

    2017-01-01

    Magnetic random access memory schemes employing magnetoelectric coupling to write binary information promise outstanding energy efficiency. We propose and demonstrate a purely antiferromagnetic magnetoelectric random access memory (AF-MERAM) that offers a remarkable 50-fold reduction of the writing threshold compared with ferromagnet-based counterparts, is robust against magnetic disturbances and exhibits no ferromagnetic hysteresis losses. Using the magnetoelectric antiferromagnet Cr2O3, we demonstrate reliable isothermal switching via gate voltage pulses and all-electric readout at room temperature. As no ferromagnetic component is present in the system, the writing magnetic field does not need to be pulsed for readout, allowing permanent magnets to be used. Based on our prototypes, we construct a comprehensive model of the magnetoelectric selection mechanisms in thin films of magnetoelectric antiferromagnets, revealing misfit induced ferrimagnetism as an important factor. Beyond memory applications, the AF-MERAM concept introduces a general all-electric interface for antiferromagnets and should find wide applicability in antiferromagnetic spintronics. PMID:28045029

  13. Bridging Emergent Attributes and Darwinian Principles in Teaching Natural Selection

    ERIC Educational Resources Information Center

    Xu, Dongchen; Chi, Michelene T. H.

    2016-01-01

    Students often have misconceptions about natural selection as they misuse a direct causal schema to explain the process. Natural selection is in fact an emergent process where random interactions lead to changes in a population. The misconceptions stem from students' lack of emergent schema for natural selection. In order to help students…

  14. Random potentials and cosmological attractors

    NASA Astrophysics Data System (ADS)

    Linde, Andrei

    2017-02-01

    I show that the problem of realizing inflation in theories with random potentials of a limited number of fields can be solved, and agreement with the observational data can be naturally achieved if at least one of these fields has a non-minimal kinetic term of the type used in the theory of cosmological α-attractors.

  15. Structure of random monodisperse foam

    NASA Astrophysics Data System (ADS)

    Kraynik, Andrew M.; Reinelt, Douglas A.; van Swol, Frank

    2003-03-01

    The Surface Evolver was used to calculate the equilibrium microstructure of random monodisperse soap froth, starting from Voronoi partitions of randomly packed spheres. The sphere packing has a strong influence on foam properties, such as E (surface free energy) and ⟨F⟩ (average number of faces per cell). This means that random foams composed of equal-volume cells come in a range of structures with different topological and geometric properties. Annealing, i.e. subjecting relaxed foams to large-deformation, tension-compression cycles, provokes topological transitions that can further reduce E and ⟨F⟩. All of the foams have ⟨F⟩ ⩽ 14. The topological statistics and census of cell types for fully annealed foams are in excellent agreement with experiments by Matzke. Geometric properties related to surface area, edge length, and stress are evaluated for the foams and their individual cells. Simple models based on regular polygons predict trends for the edge length of individual cells and the area of individual faces. Graphs of surface area vs shape anisotropy for the cells reflect the geometrical frustration in random monodisperse foam, which is epitomized by pentagonal dodecahedra: they have low surface area but do not pack to fill space.

  16. Plated wire random access memories

    NASA Technical Reports Server (NTRS)

    Gouldin, L. D.

    1975-01-01

    A program was conducted to construct 4096-word by 18-bit random access, NDRO plated-wire memory units. The memory units were subjected to comprehensive functional and environmental tests at the end-item level to verify conformance with the specified requirements. A technical description of the unit is given, along with acceptance test data sheets.

  17. Common Randomness Principles of Secrecy

    ERIC Educational Resources Information Center

    Tyagi, Himanshu

    2013-01-01

    This dissertation concerns the secure processing of distributed data by multiple terminals, using interactive public communication among themselves, in order to accomplish a given computational task. In the setting of a probabilistic multiterminal source model in which several terminals observe correlated random signals, we analyze secure…

  18. On-chip random spectrometer

    NASA Astrophysics Data System (ADS)

    Redding, B.; Liew, S. F.; Sarma, R.; Cao, H.

    2014-05-01

    Spectrometers are widely used tools in chemical and biological sensing, material analysis, and light source characterization. The development of a high-resolution on-chip spectrometer could enable compact, low-cost spectroscopy for portable sensing as well as increasing lab-on-a-chip functionality. However, the spectral resolution of traditional grating-based spectrometers scales with the optical pathlength, which translates to the linear dimension or footprint of the system, which is limited on-chip. In this work, we utilize multiple scattering in a random photonic structure fabricated on a silicon chip to fold the optical path, making the effective pathlength much longer than the linear dimension of the system and enabling high spectral resolution with a small footprint. Of course, the random spectrometer also requires a different operating paradigm, since different wavelengths are not spatially separated by the random structure, as they would be by a grating. Instead, light transmitted through the random structure produces a wavelength-dependent speckle pattern which can be used as a fingerprint to identify the input spectra after calibration. In practice, these wavelength-dependent speckle patterns are experimentally measured and stored in a transmission matrix, which describes the spectral-to-spatial mapping of the spectrometer. After calibrating the transmission matrix, an arbitrary input spectrum can be reconstructed from its speckle pattern. We achieved sub-nm resolution with 25 nm bandwidth at a wavelength of 1500 nm using a scattering medium with largest dimension of merely 50 μm.
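    The reconstruction step can be sketched as a linear inverse problem: after calibration, the measured speckle s and the stored transmission matrix T are related by s = T·a, and the input spectrum a is recovered by least squares. The sketch below uses non-negative least squares and arbitrary dimensions:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(1)

    n_pixels, n_channels = 400, 100          # speckle pixels x spectral channels
    T = rng.random((n_pixels, n_channels))   # calibrated transmission matrix

    a_true = np.zeros(n_channels)            # unknown input spectrum (three lines)
    a_true[[30, 31, 70]] = [1.0, 0.6, 0.8]

    s = T @ a_true + 0.01 * rng.standard_normal(n_pixels)  # measured speckle

    # Non-negative least squares keeps the recovered intensities physical.
    a_est, residual = nnls(T, s)
    ```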

  19. A Mixed Effects Randomized Item Response Model

    ERIC Educational Resources Information Center

    Fox, J.-P.; Wyrick, Cheryl

    2008-01-01

    The randomized response technique ensures that individual item responses, denoted as true item responses, are randomized before observing them and so-called randomized item responses are observed. A relationship is specified between randomized item response data and true item response data. True item response data are modeled with a (non)linear…

  20. Ultra-fast Quantum Random Number Generator

    NASA Astrophysics Data System (ADS)

    Yicheng, Shi

    We describe a series of Randomness Extractors for removing bias and residual correlations in random numbers generated from measurements on noisy physical systems. The structures of the randomness extractors are based on Linear Feedback Shift Registers (LFSR). This leads to a significant simplification in the implementation of randomness extractors.
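    A Galois-configuration LFSR of the kind used as a building block in such extractors can be sketched as follows; the 16-bit register and tap mask are illustrative, not those of the cited design:

    ```python
    def lfsr_stream(seed, n_bits, taps=0xB400):
        # 16-bit Galois LFSR: on each shift the dropped bit, if set,
        # feeds back by XORing the tap mask into the register. The mask
        # 0xB400 yields a maximal-length (2**16 - 1) sequence.
        state = seed & 0xFFFF
        out = []
        for _ in range(n_bits):
            lsb = state & 1
            state >>= 1
            if lsb:
                state ^= taps
            out.append(lsb)
        return out

    bits = lfsr_stream(seed=0xACE1, n_bits=32)
    ```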

  1. Clifford Algebras, Random Graphs, and Quantum Random Variables

    NASA Astrophysics Data System (ADS)

    Schott, René; Staples, G. Stacey

    2008-08-01

    For fixed n > 0, the space of finite graphs on n vertices is canonically associated with an abelian, nilpotent-generated subalgebra of the Clifford algebra Cl_{2n,2n}, which is canonically isomorphic to the 2n-particle fermion algebra. Using the generators of the subalgebra, an algebraic probability space of "Clifford adjacency matrices" associated with finite graphs is defined. Each Clifford adjacency matrix is a quantum random variable whose mth moment corresponds to the number of m-cycles in the graph G. Each matrix admits a canonical "quantum decomposition" into a sum of three algebraic random variables: a = aΔ + aΥ + aΛ, where aΔ is classical while aΥ and aΛ are quantum. Moreover, within the Clifford algebra context the NP problem of cycle enumeration is reduced to matrix multiplication, requiring no more than n^4 Clifford (geometric) multiplications within the algebra.

  2. Target detection with randomized thresholds for lidar applications.

    PubMed

    Johnson, Steven E

    2012-06-20

    Light detection and ranging (lidar) systems use binary hypothesis tests to detect the presence of a target in a range interval. For systems that count photon detections, hypothesis test thresholds are normally set so that a target detection is declared if the number of detections exceeds a particular number. When this method is employed, the false alarm probability cannot be selected arbitrarily. In this paper, a hypothesis test that uses randomized thresholds is described. This randomized method of thresholding allows lidar operation at any false alarm probability. When there is a maximum allowable false alarm probability, the hypothesis test that uses randomized thresholds generally produces higher target detection probabilities than the conventional (nonrandom) hypothesis test.
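    Assuming Poisson-distributed background counts (an assumption of this sketch, common in photon-counting models), a randomized threshold attaining an arbitrary false alarm probability can be built as follows: counts above k always declare a detection, while counts exactly equal to k declare one with a probability γ chosen to hit the target false alarm rate exactly:

    ```python
    import random
    from scipy.stats import poisson

    def randomized_threshold(mean_background, p_fa):
        # Smallest k with P(N > k) <= p_fa, then gamma such that
        # P(N > k) + gamma * P(N = k) = p_fa exactly.
        k = 0
        while poisson.sf(k, mean_background) > p_fa:
            k += 1
        gamma = (p_fa - poisson.sf(k, mean_background)) / poisson.pmf(k, mean_background)
        return k, gamma

    def detect(counts, k, gamma):
        if counts > k:
            return True
        if counts == k:
            return random.random() < gamma   # randomized tie-break
        return False

    k, gamma = randomized_threshold(mean_background=2.0, p_fa=1e-3)
    ```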

  3. Initial Status in Growth Curve Modeling for Randomized Trials

    PubMed Central

    Chou, Chih-Ping; Chi, Felicia; Weisner, Constance; Pentz, MaryAnn; Hser, Yih-Ing

    2010-01-01

    The growth curve modeling (GCM) technique has been widely adopted in longitudinal studies to investigate progression over time. The simplest growth profile involves two growth factors, initial status (intercept) and growth trajectory (slope). Conventionally, all repeated measures of outcome are included as components of the growth profile, and the first measure is used to reflect the initial status. Selection of the initial status, however, can greatly influence study findings, especially for randomized trials. In this article, we propose an alternative GCM approach involving only post-intervention measures in the growth profile and treating the first wave after intervention as the initial status. We discuss and empirically illustrate how choices of initial status may influence study conclusions in addressing research questions in randomized trials using two longitudinal studies. Data from two randomized trials are used to illustrate that the alternative GCM approach proposed in this article offers better model fitting and more meaningful results. PMID:21572585

  4. Evolutionary games on cycles with strong selection.

    PubMed

    Altrock, P M; Traulsen, A; Nowak, M A

    2017-02-01

    Evolutionary games on graphs describe how strategic interactions and population structure determine evolutionary success, quantified by the probability that a single mutant takes over a population. Graph structures, compared to the well-mixed case, can act as amplifiers or suppressors of selection by increasing or decreasing the fixation probability of a beneficial mutant. Properties of the associated mean fixation times can be more intricate, especially when selection is strong. The intuition is that fixation of a beneficial mutant happens fast in a dominance game, that fixation takes very long in a coexistence game, and that strong selection eliminates demographic noise. Here we show that these intuitions can be misleading in structured populations. We analyze mean fixation times on the cycle graph under strong frequency-dependent selection for two different microscopic evolutionary update rules (death-birth and birth-death). We establish exact analytical results for fixation times under strong selection and show that there are coexistence games in which fixation occurs in time polynomial in population size. Depending on the underlying game, we observe inherence of demographic noise even under strong selection if the process is driven by random death before selection for birth of an offspring (death-birth update). In contrast, if selection for an offspring occurs before random removal (birth-death update), then strong selection can remove demographic noise almost entirely.
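    A minimal simulation of the death-birth update on a cycle; the 2x2 payoff matrix and the exponential mapping from payoff to fitness below are illustrative choices, not the paper's exact parameterization:

    ```python
    import math, random

    def payoff(s, neighbor_strategies, A):
        # Average game payoff of strategy s against the neighbors.
        return sum(A[s][t] for t in neighbor_strategies) / len(neighbor_strategies)

    def fixation_time_db(n, A, beta):
        # Death-birth updating on a cycle of n sites, starting from one
        # mutant (strategy 1). Fitness = exp(beta * payoff); large beta
        # means strong selection. Returns (steps, mutant_fixed).
        pop = [0] * n
        pop[0] = 1
        t = 0
        while 0 < sum(pop) < n:
            i = random.randrange(n)               # random death
            nbrs = [(i - 1) % n, (i + 1) % n]
            f = [math.exp(beta * payoff(pop[j], [pop[(j - 1) % n], pop[(j + 1) % n]], A))
                 for j in nbrs]
            # The two neighbors compete for the empty site proportionally to fitness.
            winner = nbrs[0] if random.random() < f[0] / (f[0] + f[1]) else nbrs[1]
            pop[i] = pop[winner]
            t += 1
        return t, sum(pop) == n

    A = [[3, 1], [1, 2]]   # a coordination game
    print(fixation_time_db(n=50, A=A, beta=5.0))
    ```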

  5. Evolutionary games on cycles with strong selection

    NASA Astrophysics Data System (ADS)

    Altrock, P. M.; Traulsen, A.; Nowak, M. A.

    2017-02-01

    Evolutionary games on graphs describe how strategic interactions and population structure determine evolutionary success, quantified by the probability that a single mutant takes over a population. Graph structures, compared to the well-mixed case, can act as amplifiers or suppressors of selection by increasing or decreasing the fixation probability of a beneficial mutant. Properties of the associated mean fixation times can be more intricate, especially when selection is strong. The intuition is that fixation of a beneficial mutant happens fast in a dominance game, that fixation takes very long in a coexistence game, and that strong selection eliminates demographic noise. Here we show that these intuitions can be misleading in structured populations. We analyze mean fixation times on the cycle graph under strong frequency-dependent selection for two different microscopic evolutionary update rules (death-birth and birth-death). We establish exact analytical results for fixation times under strong selection and show that there are coexistence games in which fixation occurs in time polynomial in population size. Depending on the underlying game, we observe inherence of demographic noise even under strong selection if the process is driven by random death before selection for birth of an offspring (death-birth update). In contrast, if selection for an offspring occurs before random removal (birth-death update), then strong selection can remove demographic noise almost entirely.

  6. Random walk with random resetting to the maximum position

    NASA Astrophysics Data System (ADS)

    Majumdar, Satya N.; Sabhapandit, Sanjib; Schehr, Grégory

    2015-11-01

    We study analytically a simple random walk model on a one-dimensional lattice, where at each time step the walker resets to the maximum of the already visited positions (to the rightmost visited site) with probability r, and with probability (1 − r) it undergoes a symmetric random walk, i.e., it hops to one of its neighboring sites with equal probability (1 − r)/2. For r = 0, it reduces to a standard random walk whose typical distance grows as √n for large n. In the presence of a nonzero resetting rate 0
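    The model is straightforward to simulate directly; a minimal sketch of a single trajectory:

    ```python
    import random

    def walk_with_reset_to_max(n, r):
        # Each step: with probability r jump to the rightmost visited
        # site; otherwise hop left or right with probability (1 - r)/2 each.
        x = m = 0
        for _ in range(n):
            if random.random() < r:
                x = m                        # reset to the current maximum
            else:
                x += random.choice((-1, 1))  # symmetric random walk step
            m = max(m, x)
        return x, m

    print(walk_with_reset_to_max(n=100_000, r=0.1))
    ```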

  7. In vitro selection of catalytic RNAs

    NASA Technical Reports Server (NTRS)

    Chapman, K. B.; Szostak, J. W.

    1994-01-01

    In vitro selection techniques are poised to allow a rapid expansion of the study of catalysis by RNA enzymes (ribozymes). This truly molecular version of genetics has already been applied to the study of the structures of known ribozymes and to the tailoring of their catalytic activity to meet specific requirements of substrate specificity or reaction conditions. During the past year, in vitro selection has been successfully used to isolate novel RNA catalysts from random sequence pools.

  8. Randomized interpolative decomposition of separated representations

    NASA Astrophysics Data System (ADS)

    Biagioni, David J.; Beylkin, Daniel; Beylkin, Gregory

    2015-01-01

    We introduce an algorithm to compute tensor interpolative decomposition (dubbed CTD-ID) for the reduction of the separation rank of Canonical Tensor Decompositions (CTDs). Tensor ID selects, for a user-defined accuracy ɛ, a near optimal subset of terms of a CTD to represent the remaining terms via a linear combination of the selected terms. CTD-ID can be used as an alternative to or in combination with the Alternating Least Squares (ALS) algorithm. We present examples of its use within a convergent iteration to compute inverse operators in high dimensions. We also briefly discuss the spectral norm as a computational alternative to the Frobenius norm in estimating approximation errors of tensor ID. We reduce the problem of finding tensor IDs to that of constructing interpolative decompositions of certain matrices. These matrices are generated via randomized projection of the terms of the given tensor. We provide cost estimates and several examples of the new approach to the reduction of separation rank.
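    For the underlying matrix step, SciPy's scipy.linalg.interpolative module implements a randomized interpolative decomposition of this flavor; a small demonstration on a synthetic numerically low-rank matrix (dimensions and tolerance are arbitrary):

    ```python
    import numpy as np
    import scipy.linalg.interpolative as sli

    rng = np.random.default_rng(2)

    # A numerically low-rank matrix: product of two thin random factors.
    A = rng.standard_normal((200, 30)) @ rng.standard_normal((30, 150))

    # Randomized ID to relative precision eps: selects k columns of A and
    # a projection expressing the remaining columns through them.
    k, idx, proj = sli.interp_decomp(A, eps_or_k=1e-8)

    B = sli.reconstruct_skel_matrix(A, k, idx)        # the selected columns
    A_approx = sli.reconstruct_matrix_from_id(B, idx, proj)
    print(k, np.linalg.norm(A - A_approx) / np.linalg.norm(A))
    ```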

  9. Shapes of randomly placed droplets

    NASA Astrophysics Data System (ADS)

    Panchagnula, Mahesh; Janardan, Nachiketa; Deevi, Sri Vallabha

    2016-11-01

    Surface characterization is essential for many industrial applications. Surface defects result in a range of contact angles, which lead to Contact Angle Hysteresis (CAH). We use shapes of randomly shaped drops on surfaces to study the family of shapes that may result from CAH. We image the triple line from these drops and extract additional information related to local contact angles as well as curvatures from these images. We perform a generalized extreme value analysis (GEV) on this microscopic contact angle data. From this analysis, we predict a range for extreme contact angles that are possible for a sessile drop. We have also measured the macroscopic advancing and receding contact angles using a Goniometer. From the extreme values of the contact line curvature, we estimate the pinning stress distribution responsible for the random shapes. It is seen that this range follows the same trend as the macroscopic CAH measured using a Goniometer, and can be used as a method of characterizing the surface.
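    The extreme value step can be sketched with SciPy's generalized extreme value distribution; the synthetic angles below stand in for the per-drop extreme contact angles extracted from triple-line images:

    ```python
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(3)

    # Stand-in data: maximum local contact angle (degrees) from each of
    # 200 imaged drops.
    max_angles = 70 + 10 * rng.gumbel(size=200)

    # Fit the GEV; c is the shape parameter (c = 0 is the Gumbel case).
    c, loc, scale = genextreme.fit(max_angles)

    # e.g. an estimate of the 99th percentile of local contact angles
    print(genextreme.ppf(0.99, c, loc=loc, scale=scale))
    ```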

  10. Optimal randomized scheduling by replacement

    SciTech Connect

    Saias, I.

    1996-05-01

    In the replacement scheduling problem, a system is composed of n processors drawn from a pool of p. The processors can become faulty while in operation and faulty processors never recover. A report is issued whenever a fault occurs. This report states only the existence of a fault but does not indicate its location. Based on this report, the scheduler can reconfigure the system and choose another set of n processors. The system operates satisfactorily as long as, upon report of a fault, the scheduler chooses n non-faulty processors. We provide a randomized protocol maximizing the expected number of faults the system can sustain before the occurrence of a crash. The optimality of the protocol is established by considering a closely related dual optimization problem. The game-theoretic technical difficulties that we solve in this paper are very general and encountered whenever proving the optimality of a randomized algorithm in parallel and distributed computation.

  11. Propensity Score Matching: Retrospective Randomization?

    PubMed

    Jupiter, Daniel C

    Randomized controlled trials are viewed as the optimal study design. In this commentary, we explore the strength of this design and its complexity. We also discuss some situations in which these trials are not possible, or not ethical, or not economical. In such situations, specifically, in retrospective studies, we should make every effort to recapitulate the rigor and strength of the randomized trial. However, we could be faced with an inherent indication bias in such a setting. Thus, we consider the tools available to address that bias. Specifically, we examine matching and introduce and explore a new tool: propensity score matching. This tool allows us to group subjects according to their propensity to be in a particular treatment group and, in so doing, to account for the indication bias.
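    A bare-bones version of the procedure, assuming scikit-learn, a covariate matrix X, and a binary treatment indicator; real analyses add balance diagnostics and caliper rules:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    def propensity_match(X, treated):
        # Step 1: model each subject's propensity to receive treatment.
        ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
        # Step 2: match every treated subject to the control with the
        # nearest propensity score (1:1, with replacement).
        t_idx, c_idx = np.where(treated == 1)[0], np.where(treated == 0)[0]
        nn = NearestNeighbors(n_neighbors=1).fit(ps[c_idx].reshape(-1, 1))
        _, j = nn.kneighbors(ps[t_idx].reshape(-1, 1))
        return t_idx, c_idx[j.ravel()]     # paired treated/control indices

    rng = np.random.default_rng(4)
    X = rng.standard_normal((300, 3))
    treated = (X[:, 0] + rng.standard_normal(300) > 0).astype(int)
    pairs = propensity_match(X, treated)
    ```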

  12. On Combinations of Random Loads

    DTIC Science & Technology

    1980-01-01

    Report NPS55-80-006, Naval Postgraduate School, Monterey, California, June 1980: On Combinations of Random Loads, by D. P. Gaver.

  13. Random Variate Generation: A Survey.

    DTIC Science & Technology

    1980-06-01

    Lawrance and Lewis (1977, 1978), Jacobs and Lewis (1977) and Schmeiser and Lal (1979) consider time series having gamma marginal distributions. Price...random variables from probability distributions," Proceedings of the Winter Simulation Conference, 269-280. Lawrance, A.J. and P.A.W. Lewis (1977), "An exponential moving-average sequence and point process (EMA1)," J. Appl. Prob., 14, 98-113. Lawrance, A.J. and P.A.W. Lewis (1978), "An exponential

  14. Random drift and culture change.

    PubMed Central

    Bentley, R. Alexander; Hahn, Matthew W.; Shennan, Stephen J.

    2004-01-01

    We show that the frequency distributions of cultural variants, in three different real-world examples--first names, archaeological pottery and applications for technology patents--follow power laws that can be explained by a simple model of random drift. We conclude that cultural and economic choices often reflect a decision process that is value-neutral; this result has far-reaching testable implications for social-science research. PMID:15306315

  15. Approximating random quantum optimization problems

    NASA Astrophysics Data System (ADS)

    Hsu, B.; Laumann, C. R.; Läuchli, A. M.; Moessner, R.; Sondhi, S. L.

    2013-06-01

    We report a cluster of results regarding the difficulty of finding approximate ground states to typical instances of the k-body quantum satisfiability problem (k-QSAT) on large random graphs. As an approximation strategy, we optimize the solution space over "classical" product states, which in turn introduces a novel autonomous classical optimization problem, PSAT, over a space of continuous degrees of freedom rather than discrete bits. Our central results are (i) the derivation of a set of bounds and approximations in various limits of the problem, several of which we believe may be amenable to a rigorous treatment; (ii) a demonstration that an approximation based on a greedy algorithm borrowed from the study of frustrated magnetism performs well over a wide range in parameter space, and that its performance reflects the structure of the solution space of random k-QSAT (simulated annealing exhibits metastability in similar "hard" regions of parameter space); and (iii) a generalization of belief propagation algorithms introduced for classical problems to the case of continuous spins. This yields both approximate solutions and insights into the free energy "landscape" of the approximation problem, including a so-called dynamical transition near the satisfiability threshold. Taken together, these results allow us to elucidate the phase diagram of random k-QSAT in the two-dimensional space of energy density and clause density.

  16. Resolution analysis by random probing

    NASA Astrophysics Data System (ADS)

    Simutė, S.; Fichtner, A.; van Leeuwen, T.

    2015-12-01

    We develop and apply methods for resolution analysis in tomography, based on stochastic probing of the Hessian or resolution operators. Key properties of our methods are (i) low algorithmic complexity and easy implementation, (ii) applicability to any tomographic technique, including full-waveform inversion and linearized ray tomography, (iii) applicability in any spatial dimension and to inversions with a large number of model parameters, (iv) low computational costs that are mostly a fraction of those required for synthetic recovery tests, and (v) the ability to quantify both spatial resolution and inter-parameter trade-offs. Using synthetic full-waveform inversions as benchmarks, we demonstrate that auto-correlations of random-model applications to the Hessian yield various resolution measures, including direction- and position-dependent resolution lengths, and the strength of inter-parameter mappings. We observe that the required number of random test models is around 5 in one, two and three dimensions. This means that the proposed resolution analyses are not only more meaningful than recovery tests but also computationally less expensive. We demonstrate the applicability of our method in 3D real-data full-waveform inversions for the western Mediterranean and Japan. In addition to tomographic problems, resolution analysis by random probing may be used in other inverse methods that constrain continuously distributed properties, including electromagnetic and potential-field inversions, as well as recently emerging geodynamic data assimilation.
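    The core idea can be sketched with Hutchinson-style probing: apply the Hessian (or resolution operator) to a handful of random test models and autocorrelate. Below, apply_H is any Hessian-vector product, with an explicit matrix standing in for the tomographic operator:

    ```python
    import numpy as np

    def probe_diagonal(apply_H, n, n_probes=5, seed=None):
        # For Rademacher test models v, the average of v * (H v)
        # converges to diag(H), a pointwise resolution measure.
        rng = np.random.default_rng(seed)
        acc = np.zeros(n)
        for _ in range(n_probes):
            v = rng.choice([-1.0, 1.0], size=n)   # random test model
            acc += v * apply_H(v)
        return acc / n_probes

    # Example with an explicit matrix in place of the Hessian:
    H = np.diag([4.0, 3.0, 2.0, 1.0]) + 0.1
    diag_est = probe_diagonal(lambda v: H @ v, n=4, n_probes=500)
    ```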

  17. Transport on randomly evolving trees

    NASA Astrophysics Data System (ADS)

    Pál, L.

    2005-11-01

    The time process of transport on randomly evolving trees is investigated. By introducing the notions of living and dead nodes, a model of random tree evolution is constructed which describes the spreading in time of objects corresponding to nodes. It is assumed that at t=0 the tree consists of a single living node (root), from which the evolution may begin. At a certain time instant τ⩾0, the root produces ν⩾0 living nodes connected by lines to the root, which becomes dead at the moment of the offspring production. In the evolution process each of the new living nodes evolves further like a root, independently of the others. By using the methods of the age-dependent branching processes we derive the joint distribution function of the numbers of living and dead nodes, and determine the correlation between these node numbers as a function of time. It is proved that the correlation function converges to √3/2 independently of the distributions of ν and τ when q₁→1 and t→∞. Also analyzed are the stochastic properties of the end nodes; the correlation between the numbers of living and dead end nodes is shown to change its character suddenly at the very beginning of the evolution process. The survival probability of random trees is investigated and expressions are derived for this probability.

  18. Enhanced hyperuniformity from random reorganization.

    PubMed

    Hexner, Daniel; Chaikin, Paul M; Levine, Dov

    2017-04-10

    Diffusion relaxes density fluctuations toward a uniform random state whose variance in regions of volume ℓ^d scales as σ² ∝ ℓ^(−d). Systems whose fluctuations decay faster, σ² ∝ ℓ^(−(d+λ)) with λ > 0, are called hyperuniform. The larger λ, the more uniform, with systems like crystals achieving the maximum value: λ = 1. Although finite temperature equilibrium dynamics will not yield hyperuniform states, driven, nonequilibrium dynamics may. Such is the case, for example, in a simple model where overlapping particles are each given a small random displacement. Above a critical particle density ρ_c, the system evolves forever, never finding a configuration where no particles overlap. Below ρ_c, however, it eventually finds such a state, and stops evolving. This "absorbing state" is hyperuniform up to a length scale ξ, which diverges at ρ_c. An important question is whether hyperuniformity survives noise and thermal fluctuations. We find that hyperuniformity of the absorbing state is not only robust against noise, diffusion, or activity, but that such perturbations reduce fluctuations toward their limiting behavior, σ² ∝ ℓ^(−(d+1)), a uniformity similar to random close packing and early universe fluctuations, but with arbitrary controllable density.
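    A driven model of this kind (random organization) is easy to simulate; in the one-dimensional sketch below, overlapping particles receive small random kicks until, below the critical density, an absorbing overlap-free state is found. All parameters are illustrative:

    ```python
    import numpy as np

    def random_organization(n, density, radius=0.5, kick=0.25, max_steps=100_000):
        # 1D periodic system: particles closer than 2*radius overlap and
        # are both kicked. Below the critical density the dynamics stops
        # in an absorbing state; above it, activity persists.
        rng = np.random.default_rng()
        L = n / density
        x = rng.uniform(0, L, size=n)
        for step in range(max_steps):
            x = np.sort(x)
            gaps = np.diff(np.append(x, x[0] + L))   # includes wraparound gap
            overlapping = np.where(gaps < 2 * radius)[0]
            if overlapping.size == 0:
                return step                          # absorbing state reached
            movers = np.unique(np.concatenate([overlapping, (overlapping + 1) % n]))
            x[movers] += rng.uniform(-kick, kick, size=movers.size)
            x %= L
        return max_steps                             # still active

    print(random_organization(n=500, density=0.4))
    ```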

  19. Transport on randomly evolving trees.

    PubMed

    Pál, L

    2005-11-01

    The time process of transport on randomly evolving trees is investigated. By introducing the notions of living and dead nodes, a model of random tree evolution is constructed which describes the spreading in time of objects corresponding to nodes. It is assumed that at t=0 the tree consists of a single living node (root), from which the evolution may begin. At a certain time instant τ ⩾ 0, the root produces ν ⩾ 0 living nodes connected by lines to the root, which becomes dead at the moment of the offspring production. In the evolution process each of the new living nodes evolves further like a root, independently of the others. By using the methods of the age-dependent branching processes we derive the joint distribution function of the numbers of living and dead nodes, and determine the correlation between these node numbers as a function of time. It is proved that the correlation function converges to √3/2 independently of the distributions of ν and τ when q₁ → 1 and t → ∞. Also analyzed are the stochastic properties of the end nodes; the correlation between the numbers of living and dead end nodes is shown to change its character suddenly at the very beginning of the evolution process. The survival probability of random trees is investigated and expressions are derived for this probability.

  20. Error threshold transition in the random-energy model

    NASA Astrophysics Data System (ADS)

    Campos, Paulo R.

    2002-12-01

    We perform a statistical analysis of the error threshold transition in quasispecies evolution on a random-energy fitness landscape. We obtain a precise description of the genealogical properties of the population through extensive numerical simulations. We find a clear phase transition and can distinguish two regimes of evolution: The first, for low mutation rates, is characterized by strong selection, and the second, for high mutation rates, is characterized by quasineutral evolution.

  1. The emergence of collective phenomena in systems with random interactions

    NASA Astrophysics Data System (ADS)

    Abramkina, Volha

    Emergent phenomena are one of the most profound topics in modern science, addressing the ways that collectivities and complex patterns appear due to multiplicity of components and simple interactions. Ensembles of random Hamiltonians allow one to explore emergent phenomena in a statistical way. In this work we adopt a shell model approach with a two-body interaction Hamiltonian. The sets of the two-body interaction strengths are selected at random, resulting in the two-body random ensemble (TBRE). Symmetries such as angular momentum, isospin, and parity, entangled with complex many-body dynamics, result in surprising order discovered in the spectrum of low-lying excitations. The statistical patterns exhibited in the TBRE are remarkably similar to those observed in real nuclei. Signs of almost every collective feature seen in nuclei, namely pairing superconductivity, deformation, and vibration, have been observed in random ensembles [3, 4, 5, 6]. In what follows, a systematic investigation of nuclear shape collectivities in random ensembles is conducted. The development of the mean field, its geometry, multipole collectivities and their dependence on the underlying two-body interaction are explored. Apart from the role of static symmetries such as the SU(2) angular momentum and isospin groups, the emergence of dynamical symmetries, including the seniority SU(2), rotational symmetry, as well as the Elliott SU(3), is shown to be an important precursor for the existence of geometric collectivities.

  2. DNA-based random number generation in security circuitry.

    PubMed

    Gearheart, Christy M; Arazi, Benjamin; Rouchka, Eric C

    2010-06-01

    DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. This research focuses on further developing DNA-based methodologies to mimic digital data manipulation. While this work exhibits fundamental principles, it was done in conjunction with the vision that DNA-based circuitry, when the technology matures, will form the basis for a tamper-proof security module, revolutionizing the meaning and concept of tamper-proofing and possibly preventing it altogether based on accurate scientific observations. A paramount part of such a solution would be self-generation of random numbers. A novel prototype schema employs solid phase synthesis of oligonucleotides for random construction of DNA sequences; temporary storage and retrieval is achieved through plasmid vectors. A discussion of how to evaluate sequence randomness is included, as well as how these techniques are applied to a simulation of the random number generation circuitry. Simulation results show that generated sequences successfully pass three selected NIST random number generation tests specified for security applications.
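    The simplest of the NIST SP 800-22 checks, the frequency (monobit) test, illustrates the kind of evaluation such sequences must pass (a sketch of one test, not the full suite):

    ```python
    import math

    def monobit_test(bits):
        # NIST SP 800-22 frequency test: the normalized excess of ones
        # over zeros should behave like a standard normal variate.
        s = sum(1 if b else -1 for b in bits)
        s_obs = abs(s) / math.sqrt(len(bits))
        p_value = math.erfc(s_obs / math.sqrt(2))
        return p_value   # conventionally passes if p_value >= 0.01

    print(monobit_test([1, 0, 1, 1, 0, 0, 1, 0] * 128))
    ```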

  3. RACBVHs: random-accessible compressed bounding volume hierarchies.

    PubMed

    Kim, Tae-Joon; Moon, Bochang; Kim, Duksu; Yoon, Sung-Eui

    2010-01-01

    We present a novel compressed bounding volume hierarchy (BVH) representation, random-accessible compressed bounding volume hierarchies (RACBVHs), for various applications requiring random access on BVHs of massive models. Our RACBVH representation is compact and transparently supports random access on the compressed BVHs without decompressing the whole BVH. To support random access on our compressed BVHs, we decompose a BVH into a set of clusters. Each cluster contains consecutive bounding volume (BV) nodes in the original layout of the BVH. Also, each cluster is compressed separately from other clusters and serves as an access point to the RACBVH representation. We provide the general BVH access API to transparently access our RACBVH representation. At runtime, our decompression framework is guaranteed to provide correct BV nodes without decompressing the whole BVH. Also, our method is extended to support parallel random access that can utilize the multicore CPU architecture. Our method can achieve up to a 12:1 compression ratio, and more importantly, can decompress 4.2 M BV nodes (=135 MB) per second using a single CPU core. To highlight the benefits of our approach, we apply our method to two different applications: ray tracing and collision detection. We can improve the runtime performance by more than a factor of 4 as compared to using the uncompressed original data. This improvement is a result of the fast decompression performance and reduced data access time by selectively fetching and decompressing small regions of the compressed BVHs requested by applications.

  4. Neither fixed nor random: weighted least squares meta-analysis.

    PubMed

    Stanley, T D; Doucouliagos, Hristos

    2015-06-15

    This study challenges two core conventional meta-analysis methods: fixed effect and random effects. We show how and explain why an unrestricted weighted least squares estimator is superior to conventional random-effects meta-analysis when there is publication (or small-sample) bias and better than a fixed-effect weighted average if there is heterogeneity. Statistical theory and simulations of effect sizes, log odds ratios and regression coefficients demonstrate that this unrestricted weighted least squares estimator provides satisfactory estimates and confidence intervals that are comparable to random effects when there is no publication (or small-sample) bias and identical to fixed-effect meta-analysis when there is no heterogeneity. When there is publication selection bias, the unrestricted weighted least squares approach dominates random effects; when there is excess heterogeneity, it is clearly superior to fixed-effect meta-analysis. In practical applications, an unrestricted weighted least squares weighted average will often provide superior estimates to both conventional fixed and random effects.
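    Numerically, the unrestricted WLS estimator shares the inverse-variance point estimate of the fixed-effect average but scales its variance by the regression mean squared error instead of fixing it at one. A sketch with illustrative data, reflecting our reading of the estimator rather than the authors' code:

    ```python
    import numpy as np

    def wls_meta(effects, se):
        # Unrestricted weighted least squares meta-analytic average.
        w = 1.0 / se**2
        est = np.sum(w * effects) / np.sum(w)        # fixed-effect point estimate
        mse = np.sum(w * (effects - est) ** 2) / (len(effects) - 1)
        se_wls = np.sqrt(mse / np.sum(w))            # variance scaled by MSE
        return est, se_wls

    effects = np.array([0.10, 0.25, 0.18, 0.40, 0.05])
    se = np.array([0.08, 0.10, 0.06, 0.15, 0.12])
    print(wls_meta(effects, se))
    ```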

  5. Random Time Identity Based Firewall In Mobile Ad hoc Networks

    NASA Astrophysics Data System (ADS)

    Suman, Patel, R. B.; Singh, Parvinder

    2010-11-01

    A mobile ad hoc network (MANET) is a self-organizing network of mobile routers and associated hosts connected by wireless links. MANETs are highly flexible and adaptable, but at the same time are highly prone to security risks due to the open medium, dynamically changing network topology, cooperative algorithms, and lack of centralized control. A firewall is an effective means of protecting a local network from network-based security threats and forms a key component in MANET security architecture. This paper presents a review of firewall implementation techniques in MANETs and their relative merits and demerits. A new approach is proposed to select MANET nodes at random for firewall implementation. This approach randomly selects a new node as the firewall after a fixed time, based on critical values of certain parameters such as power backup. It effectively balances power and resource utilization across the entire MANET because the responsibility of implementing the firewall is equally shared among all the nodes. At the same time it ensures improved security for MANETs against outside attacks, as an intruder will not be able to find the entry point into the MANET due to the random selection of nodes for firewall implementation.

  6. Adaptive biased urn randomization in small strata when blinding is impossible.

    PubMed

    Schouten, H J

    1995-12-01

    Adaptive biased urn randomization, applied in, e.g., a clinical trial, has certain attractive properties. If stratified randomization is desired, a good balance between group sizes can be guaranteed, even in (very) small strata. Yet treatment assignment may be kept unpredictable, which is necessary to avoid selection bias if blinding is impossible. In the present paper a more flexible urn model is described. The investigator may choose assignment probabilities that strongly depend on the degree of imbalance when the groups are still small, but with a tendency toward complete randomization when the groups become large. It is also possible to keep the difference in group size below a chosen maximum, which is useful if population characteristics may change during the course of a trial. The new urn model includes random permutations and complete randomization as special cases. An extension of the model allows the promotion of unequal group sizes. Some attention is paid to a randomized version of the minimization method.
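    A minimal sketch of imbalance-dependent assignment in this spirit: the bias toward the under-represented arm is strong while the groups are small and fades toward complete randomization as they grow. The specific probability rule is an illustrative choice, not the paper's urn model:

    ```python
    import random

    def next_assignment(n_a, n_b, strength=4.0):
        # d > 0 means arm A is behind; the correction shrinks as the
        # total n grows, so assignment tends to a fair coin.
        n, d = n_a + n_b, n_b - n_a
        p_a = 0.5 + 0.5 * d / (n + strength)
        return 'A' if random.random() < p_a else 'B'

    counts = {'A': 0, 'B': 0}
    for _ in range(20):
        counts[next_assignment(counts['A'], counts['B'])] += 1
    print(counts)
    ```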

  7. Instructive selection and immunological theory.

    PubMed

    Lederberg, Joshua

    2002-07-01

    The turning point of modern immunological theory was the advent of the clonal selection theory (Burnet, Talmage, 1957). A useful heuristic in the classification of theoretical models was the contrast of 'instructive' with 'selective' models of the acquisition of information by biological systems. The neo-Darwinian synthesis of the 1940s had consolidated biologists' model of evolution based on prior random variation and natural selection, viz. differential fecundity. While evolution in the large was by then pretty well settled, controversy remained about examples of cellular adaptation to chemical challenges, like induced drug-resistance, enzyme formation and the antibody response. While instructive theories have been on the decline, some clear-cut examples can be found of molecular imprinting in the abiotic world, leading, e.g., to the production of specific sorbents. Template-driven assembly, as in DNA synthesis, has remained a paradigm of instructive specification. Nevertheless, the classification may break down with more microscopic scrutiny of the processes of molecular fit of substrates with enzymes, or of monomers to an elongating polymer chain, as the reactants often traverse a state space from which activated components are appropriately selected. The same process may be 'instructive' from a holistic perspective and 'selective' from an atomic one.

  8. Feature-selective attention in healthy old age: a selective decline in selective attention?

    PubMed

    Quigley, Cliodhna; Müller, Matthias M

    2014-02-12

    Deficient selection against irrelevant information has been proposed to underlie age-related cognitive decline. We recently reported evidence for maintained early sensory selection when older and younger adults used spatial selective attention to perform a challenging task. Here we explored age-related differences when spatial selection is not possible and feature-selective attention must be deployed. We additionally compared the integrity of feedforward processing by exploiting the well established phenomenon of suppression of visual cortical responses attributable to interstimulus competition. Electroencephalogram was measured while older and younger human adults responded to brief occurrences of coherent motion in an attended stimulus composed of randomly moving, orientation-defined, flickering bars. Attention was directed to horizontal or vertical bars by a pretrial cue, after which two orthogonally oriented, overlapping stimuli or a single stimulus were presented. Horizontal and vertical bars flickered at different frequencies and thereby elicited separable steady-state visual-evoked potentials, which were used to examine the effect of feature-based selection and the competitive influence of a second stimulus on ongoing visual processing. Age differences were found in feature-selective attentional modulation of visual responses: older adults did not show consistent modulation of magnitude or phase. In contrast, the suppressive effect of a second stimulus was robust and comparable in magnitude across age groups, suggesting that bottom-up processing of the current stimuli is essentially unchanged in healthy old age. Thus, it seems that visual processing per se is unchanged, but top-down attentional control is compromised in older adults when space cannot be used to guide selection.

  9. Quantum random walks without walking

    SciTech Connect

    Manouchehri, K.; Wang, J. B.

    2009-12-15

    Quantum random walks have received much interest due to their nonintuitive dynamics, which may hold the key to a new generation of quantum algorithms. What remains a major challenge is a physical realization that is experimentally viable and not limited to special connectivity criteria. We present a scheme for walking on arbitrarily complex graphs, which can be realized using a variety of quantum systems, such as a Bose-Einstein condensate trapped inside an optical lattice. This scheme is particularly elegant since the walker is not required to physically step between the nodes; flipping coins alone is sufficient.
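
    For intuition about the dynamics at stake, below is a minimal numpy sketch of a generic discrete-time coined quantum walk on a cycle (an illustration of quantum-walk dynamics in general, not of the paper's stepping-free realization; the lattice size, step count, and initial coin state are arbitrary choices):

        import numpy as np

        N, steps = 64, 30
        state = np.zeros((N, 2), dtype=complex)            # position x coin
        state[N // 2] = [1 / np.sqrt(2), 1j / np.sqrt(2)]  # balanced start

        H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # coin

        for _ in range(steps):
            state = state @ H.T                     # flip the coin at each node
            state[:, 0] = np.roll(state[:, 0], -1)  # coin 0 shifts left
            state[:, 1] = np.roll(state[:, 1], 1)   # coin 1 shifts right

        prob = (np.abs(state) ** 2).sum(axis=1)
        x = np.arange(N)
        mean = (x * prob).sum()
        spread = np.sqrt((((x - mean) ** 2) * prob).sum())
        print(spread)  # grows linearly in steps (ballistic), not as a square root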

  10. Position modulation with random pulses.

    PubMed

    Yao, Min; Korotkova, Olga; Ding, Chaoliang; Pan, Liuzhan

    2014-06-30

    A new class of sources generating an ensemble of random pulses is introduced, based on superposition of the mutual coherence functions of several Multi-Gaussian Schell-model sources that separately are capable of shaping the propagating pulse's average intensity into flat profiles with adjustable duration and edge sharpness. Under certain conditions, which we discuss in detail, such superposition allows for the production of a pulse ensemble that, after a sufficiently long propagation distance in a dispersive medium, reshapes its average intensity from an arbitrary initial profile to a train whose parts have flat intensities of different levels and durations and can be either temporally separated or adjacent.

  11. Ring correlations in random networks

    NASA Astrophysics Data System (ADS)

    Sadjadi, Mahdi; Thorpe, M. F.

    2016-12-01

    We examine the correlations between rings in random network glasses in two dimensions as a function of their separation. Initially, we use the topological separation (measured by the number of intervening rings), but this leads to pseudo-long-range correlations due to a lack of topological charge neutrality in the shells surrounding a central ring. This effect is associated with the noncircular nature of the shells. It is, therefore, necessary to use the geometrical distance between ring centers. Hence we find a generalization of the Aboav-Weaire law out to larger distances, with the correlations between rings decaying away when two rings are more than about three rings apart.

  12. Demographic parameters and natural selection.

    PubMed

    Demetrius, L

    1974-12-01

    This paper introduces two new demographic parameters, the entropy and the reproductive potential of a population. The entropy of a population measures the variability of the contribution of the different age classes to the stationary age distribution. The reproductive potential measures the mean of the contribution of the different age classes to the growth rate. Using a relation between these measures and the Malthusian parameter, it is shown that in a random mating population in Hardy-Weinberg equilibrium, and under slow selection, the rate of change of entropy is equal to the genetic variance in entropy minus the genetic covariance of entropy and reproductive potential. This result is an analogue of Fisher's fundamental theorem of natural selection.
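
    Stated symbolically (a transcription of the abstract's claim, with notation chosen here: H for population entropy, Φ for reproductive potential, and σ_g²/σ_g denoting genetic variance and covariance):

        \frac{dH}{dt} \;=\; \sigma_g^2(H) \;-\; \sigma_g(H, \Phi)

    which reduces to a Fisher-like statement when the covariance term vanishes.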

  13. Self-correcting random number generator

    SciTech Connect

    Humble, Travis S.; Pooser, Raphael C.

    2016-09-06

    A system and method for generating random numbers. The system may include a random number generator (RNG), such as a quantum random number generator (QRNG) configured to self-correct or adapt in order to substantially achieve randomness from the output of the RNG. By adapting, the RNG may generate a random number that may be considered random regardless of whether the random number itself is tested as such. As an example, the RNG may include components to monitor one or more characteristics of the RNG during operation, and may use the monitored characteristics as a basis for adapting, or self-correcting, to provide a random number according to one or more performance criteria.
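
    One plausible toy realization of the idea (hypothetical, not the patented design: the block size, bias tolerance, and choice of a von Neumann extractor as the correction step are all assumptions) monitors the empirical bias of the raw bit stream and debiases only the blocks that fail the check:

        import random

        def von_neumann(bits):
            """Debias a bit stream: map pairs 01 -> 0, 10 -> 1, drop 00/11."""
            return [a for a, b in zip(bits[::2], bits[1::2]) if a != b]

        def self_correcting_stream(raw_source, block=1000, tol=0.02):
            """Monitor each block's bias; debias blocks that drift too far."""
            while True:
                blk = [raw_source() for _ in range(block)]
                bias = abs(sum(blk) / block - 0.5)
                yield from (blk if bias <= tol else von_neumann(blk))

        # Example with a deliberately biased raw source (P(1) = 0.6).
        biased = lambda: 1 if random.random() < 0.6 else 0
        gen = self_correcting_stream(biased)
        sample = [next(gen) for _ in range(1000)]
        print(sum(sample) / len(sample))  # close to 0.5 after correction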

  14. Berkson's bias, selection bias, and missing data.

    PubMed

    Westreich, Daniel

    2012-01-01

    Although Berkson's bias is widely recognized in the epidemiologic literature, it remains underappreciated as a model of both selection bias and bias due to missing data. Simple causal diagrams and 2 × 2 tables illustrate how Berkson's bias connects to collider bias and selection bias more generally, and show the strong analogies between Berksonian selection bias and bias due to missing data. In some situations, considerations of whether data are missing at random or missing not at random are less important than the causal structure of the missing data process. Although dealing with missing data always relies on strong assumptions about unobserved variables, the intuitions built with simple examples can provide a better understanding of approaches to missing data in real-world situations.
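
    The collider mechanism is easy to reproduce in a toy simulation (illustrative numbers only): two independent conditions become negatively associated once attention is restricted to hospitalized patients.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        d1 = rng.random(n) < 0.10                 # condition 1
        d2 = rng.random(n) < 0.10                 # condition 2, independent
        # Hospitalization is the collider: either condition raises the odds.
        admitted = rng.random(n) < np.where(d1 | d2, 0.6, 0.05)

        def odds_ratio(a, b):
            t = [[np.sum(a & b), np.sum(a & ~b)],
                 [np.sum(~a & b), np.sum(~a & ~b)]]
            return (t[0][0] * t[1][1]) / (t[0][1] * t[1][0])

        print(odds_ratio(d1, d2))                      # about 1.0
        print(odds_ratio(d1[admitted], d2[admitted]))  # well below 1.0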

  15. On the randomness of pulsar nulls

    NASA Astrophysics Data System (ADS)

    Redman, Stephen L.; Rankin, Joanna M.

    2009-05-01

    Pulsar nulling is not always a random process; most pulsars, in fact, null non-randomly. The Wald-Wolfowitz statistical runs test is a simple diagnostic that pulsar astronomers can use to identify pulsars that have non-random nulls. It is not clear at this point how the dichotomy in pulsar nulling randomness is related to the underlying nulling phenomenon, but its nature suggests that there are at least two distinct reasons that pulsars null.
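
    The test itself is only a few lines; a minimal implementation on a binary null sequence (assumed encoding: 1 = pulse nulled, 0 = pulse detected), where a large |z| flags non-random nulling:

        import math

        def runs_test(seq):
            """Wald-Wolfowitz runs test on a binary list; returns a z-score."""
            n1, n0 = seq.count(1), seq.count(0)
            runs = 1 + sum(a != b for a, b in zip(seq, seq[1:]))
            mu = 2 * n1 * n0 / (n1 + n0) + 1           # expected number of runs
            var = (mu - 1) * (mu - 2) / (n1 + n0 - 1)  # variance of run count
            return (runs - mu) / math.sqrt(var)

        # Clustered nulls (far fewer runs than expected) give z << 0.
        clustered = [1] * 20 + [0] * 80
        print(runs_test(clustered))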

  16. High speed optical quantum random number generation.

    PubMed

    Fürst, Martin; Weier, Henning; Nauerth, Sebastian; Marangon, Davide G; Kurtsiefer, Christian; Weinfurter, Harald

    2010-06-07

    We present a fully integrated, ready-for-use quantum random number generator (QRNG) whose stochastic model is based on the randomness of detecting single photons in attenuated light. We show that the often-annoying dead-time effects associated with photomultiplier tubes (PMTs) can be utilized to avoid postprocessing for bias or correlations. The random numbers, delivered directly to a PC and generated at a rate of up to 50 Mbit/s, clearly pass all tests relevant for (physical) random number generators.

  17. Supersymmetric vacua in random supergravity

    NASA Astrophysics Data System (ADS)

    Bachlechner, Thomas C.; Marsh, David; McAllister, Liam; Wrase, Timm

    2013-01-01

    We determine the spectrum of scalar masses in a supersymmetric vacuum of a general 𝒩 = 1 supergravity theory, with the Kähler potential and superpotential taken to be random functions of N complex scalar fields. We derive a random matrix model for the Hessian matrix and compute the eigenvalue spectrum. Tachyons consistent with the Breitenlohner-Freedman bound are generically present, and although these tachyons cannot destabilize the supersymmetric vacuum, they do influence the likelihood of the existence of an 'uplift' to a metastable vacuum with positive cosmological constant. We show that the probability that a supersymmetric AdS vacuum has no tachyons is formally equivalent to the probability of a large fluctuation of the smallest eigenvalue of a certain real Wishart matrix. For normally-distributed matrix entries and any N, this probability is given exactly by P = exp(−2N²|W|² / m_susy²), with W denoting the superpotential and m_susy the supersymmetric mass scale; for more general distributions of the entries, our result is accurate when N ≫ 1. We conclude that for |W| ≳ m_susy/N, tachyonic instabilities are ubiquitous in configurations obtained by uplifting supersymmetric vacua.

  18. Persistence of random walk records

    NASA Astrophysics Data System (ADS)

    Ben-Naim, E.; Krapivsky, P. L.

    2014-06-01

    We study records generated by Brownian particles in one dimension. Specifically, we investigate an ordinary random walk and define the record as the maximal position of the walk. We compare the record of an individual random walk with the mean record, obtained as an average over infinitely many realizations. We term the walk 'superior' if the record is always above average, and conversely, the walk is said to be 'inferior' if the record is always below average. We find that the fraction of superior walks, S, decays algebraically with time, S ∼ t^(−β), in the limit t → ∞, and that the persistence exponent is nontrivial, β = 0.382258…. The fraction of inferior walks, I, also decays as a power law, I ∼ t^(−α), but the persistence exponent is smaller, α = 0.241608…. Both exponents are roots of transcendental equations involving the parabolic cylinder function. To obtain these theoretical results, we analyze the joint density of superior walks with a given record and position, while for inferior walks it suffices to study the density as a function of position.
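
    These fractions are straightforward to estimate numerically; a Monte Carlo sketch (using the large-t Brownian approximation sqrt(2t/π) for the mean record, so small-t lattice effects are ignored):

        import numpy as np

        rng = np.random.default_rng(1)
        walks, t_max = 5_000, 2_000
        steps = rng.choice([-1, 1], size=(walks, t_max))
        record = np.maximum.accumulate(np.cumsum(steps, axis=1), axis=1)
        t = np.arange(1, t_max + 1)
        mean_record = np.sqrt(2 * t / np.pi)           # Brownian approximation
        S = (record > mean_record).all(axis=1).mean()  # always above average
        I = (record < mean_record).all(axis=1).mean()  # always below average
        print(S, I)  # both shrink as power laws when t_max grows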

  19. Drop Spreading with Random Viscosity

    NASA Astrophysics Data System (ADS)

    Xu, Feng; Jensen, Oliver

    2016-11-01

    Airway mucus acts as a barrier to protect the lung. However, as a biological material, its physical properties are known imperfectly and can be spatially heterogeneous. In this study we assess the impact of these uncertainties on the rate of spreading of a drop (representing an inhaled aerosol) over a mucus film. We model the film as Newtonian, having a viscosity that depends linearly on the concentration of a passive solute (a crude proxy for mucin proteins). Given an initial random solute (and hence viscosity) distribution, described as a Gaussian random field with a given correlation structure, we seek to quantify the uncertainties in outcomes as the drop spreads. Using lubrication theory, we describe the spreading of the drop in terms of a system of coupled nonlinear PDEs governing the evolution of film height and the vertically-averaged solute concentration. We perform Monte Carlo simulations to predict the variability in the drop centre location and width (1D) or area (2D). We show how simulation results are well described (at much lower computational cost) by a low-order model using a weak disorder expansion. Our results show, for example, how variability in the drop location is a non-monotonic function of the solute correlation length. Funded by the Engineering and Physical Sciences Research Council.

  20. Chromatic polynomials of random graphs

    NASA Astrophysics Data System (ADS)

    Van Bussel, Frank; Ehrlich, Christoph; Fliegner, Denny; Stolzenberg, Sebastian; Timme, Marc

    2010-04-01

    Chromatic polynomials and related graph invariants are central objects in both graph theory and statistical physics. Computational difficulties, however, have so far restricted studies of such polynomials to graphs that were either very small, very sparse or highly structured. Recent algorithmic advances (Timme et al 2009 New J. Phys. 11 023001) now make it possible to compute chromatic polynomials for moderately sized graphs of arbitrary structure and number of edges. Here we present chromatic polynomials of ensembles of random graphs with up to 30 vertices, over the entire range of edge density. We specifically focus on the locations of the zeros of the polynomial in the complex plane. The results indicate that the chromatic zeros of random graphs have a very consistent layout. In particular, the crossing point, the point at which the chromatic zeros with non-zero imaginary part approach the real axis, scales linearly with the average degree over most of the density range. While the scaling laws obtained are purely empirical, if they continue to hold in general there are significant implications: the crossing points of chromatic zeros in the thermodynamic limit separate systems with zero ground state entropy from systems with positive ground state entropy, the latter an exception to the third law of thermodynamics.

  1. Random Interchange of Magnetic Connectivity

    NASA Astrophysics Data System (ADS)

    Matthaeus, W. H.; Ruffolo, D. J.; Servidio, S.; Wan, M.; Rappazzo, A. F.

    2015-12-01

    Magnetic connectivity, the connection between two points along a magnetic field line, has a stochastic character associated with field lines random walking in space due to magnetic fluctuations, but connectivity can also change in time due to dynamical activity [1]. For fluctuations transverse to a strong mean field, this connectivity change can be caused by stochastic interchange due to component reconnection. The process may be understood approximately by formulating a diffusion-like Fokker-Planck coefficient [2] that is asymptotically related to standard field line random walk. Quantitative estimates are provided for transverse magnetic field models and anisotropic models such as reduced magnetohydrodynamics. In heliospheric applications, these estimates may be useful for understanding mixing between open and closed field line regions near coronal hole boundaries, and large latitude excursions of connectivity associated with turbulence. [1] A. F. Rappazzo, W. H. Matthaeus, D. Ruffolo, S. Servidio & M. Velli, ApJL, 758, L14 (2012) [2] D. Ruffolo & W. Matthaeus, ApJ, 806, 233 (2015)

  2. 49 CFR 382.305 - Random testing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ALCOHOL USE AND TESTING Tests Required § 382.305 Random testing. (a) Every employer shall comply with the requirements of this section. Every driver shall submit to random alcohol and controlled substance testing as... minimum annual percentage rate for random alcohol testing shall be 10 percent of the average number...

  3. 49 CFR 382.305 - Random testing.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... ALCOHOL USE AND TESTING Tests Required § 382.305 Random testing. (a) Every employer shall comply with the requirements of this section. Every driver shall submit to random alcohol and controlled substance testing as... minimum annual percentage rate for random alcohol testing shall be 10 percent of the average number...

  4. Randomness in Sequence Evolution Increases over Time

    PubMed Central

    Wang, Guangyu; Sun, Shixiang; Zhang, Zhang

    2016-01-01

    The second law of thermodynamics states that entropy, as a measure of randomness in a system, increases over time. Although studies have investigated biological sequence randomness from different aspects, it remains unknown whether sequence randomness changes over time and whether this change is consistent with the second law of thermodynamics. To capture the dynamics of randomness in molecular sequence evolution, here we detect sequence randomness based on a collection of eight statistical randomness tests and investigate the randomness variation of coding sequences with an application to Escherichia coli. Given that core/essential genes are more ancient than specific/non-essential genes, our results clearly show that core/essential genes are more random than specific/non-essential genes and accordingly indicate that sequence randomness indeed increases over time, consistent with the second law of thermodynamics. We further find that an increase in sequence randomness leads to increasing randomness of GC content and longer sequence length. Taken together, our study presents an important finding, for the first time, that sequence randomness increases over time, which may provide profound insights for unveiling the underlying mechanisms of molecular sequence evolution. PMID:27224236

  5. Randomness, Its Meanings and Educational Implications.

    ERIC Educational Resources Information Center

    Batanero, Carmen; Green, David R.; Serrano, Luis Romero

    1998-01-01

    Presents an analysis of the different meanings associated with randomness throughout its historical evolution as well as a summary of research concerning the subjective perception of randomness by children and adolescents. Some teaching suggestions are included to help students gradually understand the characteristics of random phenomena. Contains…

  6. Source-Independent Quantum Random Number Generation

    NASA Astrophysics Data System (ADS)

    Cao, Zhu; Zhou, Hongyi; Yuan, Xiao; Ma, Xiongfeng

    2016-01-01

    Quantum random number generators can provide genuine randomness by appealing to the fundamental principles of quantum mechanics. In general, a physical generator contains two parts—a randomness source and its readout. The source is essential to the quality of the resulting random numbers; hence, it needs to be carefully calibrated and modeled to achieve information-theoretical provable randomness. However, in practice, the source is a complicated physical system, such as a light source or an atomic ensemble, and any deviations in the real-life implementation from the theoretical model may affect the randomness of the output. To close this gap, we propose a source-independent scheme for quantum random number generation in which output randomness can be certified, even when the source is uncharacterized and untrusted. In our randomness analysis, we make no assumptions about the dimension of the source. For instance, multiphoton emissions are allowed in optical implementations. Our analysis takes into account the finite-key effect with the composable security definition. In the limit of large data size, the length of the input random seed is exponentially small compared to that of the output random bit. In addition, by modifying a quantum key distribution system, we experimentally demonstrate our scheme and achieve a randomness generation rate of over 5 × 10^3 bit/s.

  7. Parabolic Anderson Model in a Dynamic Random Environment: Random Conductances

    NASA Astrophysics Data System (ADS)

    Erhard, D.; den Hollander, F.; Maillard, G.

    2016-06-01

    The parabolic Anderson model is defined as the partial differential equation ∂u(x, t)/∂t = κΔu(x, t) + ξ(x, t)u(x, t), x ∈ ℤ^d, t ≥ 0, where κ ∈ [0, ∞) is the diffusion constant, Δ is the discrete Laplacian, and ξ is a dynamic random environment that drives the equation. The initial condition u(x, 0) = u_0(x), x ∈ ℤ^d, is typically taken to be non-negative and bounded. The solution of the parabolic Anderson equation describes the evolution of a field of particles performing independent simple random walks with binary branching: particles jump at rate 2dκ, split into two at rate ξ ∨ 0, and die at rate (−ξ) ∨ 0. In earlier work we looked at the Lyapunov exponents λ_p(κ) = lim_{t→∞} (1/t) log E([u(0, t)]^p)^{1/p}, p ∈ ℕ, and λ_0(κ) = lim_{t→∞} (1/t) log u(0, t). For the former we derived quantitative results on the κ-dependence for four choices of ξ: space-time white noise, independent simple random walks, the exclusion process and the voter model. For the latter we obtained qualitative results under certain space-time mixing conditions on ξ. In the present paper we investigate what happens when κΔ is replaced by Δ_𝓚, where 𝓚 = {𝓚(x, y) : x, y ∈ ℤ^d, x ∼ y} is a collection of random conductances between neighbouring sites replacing the constant conductances κ in the homogeneous model. We show that the associated annealed Lyapunov exponents λ_p(𝓚), p ∈ ℕ, are given by the formula λ_p(𝓚) = sup{λ_p(κ) : κ ∈ Supp(𝓚)}, where, for a fixed realisation of 𝓚, Supp(𝓚) is the set of values taken by the 𝓚-field. We also show that for the associated quenched Lyapunov exponent λ_0(𝓚) this formula only provides a lower bound, and we conjecture that an upper bound holds when Supp(𝓚) is replaced by its convex hull. Our proof is valid for three classes of reversible ξ, and for all 𝓚

  8. Quantifying consistent individual differences in habitat selection.

    PubMed

    Leclerc, Martin; Vander Wal, Eric; Zedrosser, Andreas; Swenson, Jon E; Kindberg, Jonas; Pelletier, Fanie

    2016-03-01

    Habitat selection is a fundamental behaviour that links individuals to the resources required for survival and reproduction. Although natural selection acts on an individual's phenotype, research on habitat selection often pools inter-individual patterns to provide inferences on the population scale. Here, we expanded a traditional approach of quantifying habitat selection at the individual level to explore the potential for consistent individual differences of habitat selection. We used random coefficients in resource selection functions (RSFs) and repeatability estimates to test for variability in habitat selection. We applied our method to a detailed dataset of GPS relocations of brown bears (Ursus arctos) taken over a period of 6 years, and assessed whether they displayed repeatable individual differences in habitat selection toward two habitat types: bogs and recent timber-harvest cut blocks. In our analyses, we controlled for the availability of habitat, i.e., the functional response in habitat selection. Repeatability estimates of habitat selection toward bogs and cut blocks were 0.304 and 0.420, respectively. Therefore, 30.4% and 42.0% of the population-scale habitat selection variability for bogs and cut blocks, respectively, was due to differences among individuals, suggesting that consistent individual variation in habitat selection exists in brown bears. Using simulations, we posit that repeatability values of habitat selection are not related to the value and significance of β estimates in RSFs. Although individual differences in habitat selection could be the results of non-exclusive factors, our results illustrate the evolutionary potential of habitat selection.

  9. Minimal sufficient balance-a new strategy to balance baseline covariates and preserve randomness of treatment allocation.

    PubMed

    Zhao, Wenle; Hill, Michael D; Palesch, Yuko

    2015-12-01

    In many clinical trials, baseline covariates could affect the primary outcome. Commonly used strategies to balance baseline covariates include stratified constrained randomization and minimization. Stratification is limited to a few categorical covariates. Minimization lacks the randomness of treatment allocation. Both apply only to categorical covariates. As a result, serious imbalances could occur in important baseline covariates not included in the randomization algorithm. Furthermore, randomness of treatment allocation could be significantly compromised because of the high proportion of deterministic assignments associated with stratified block randomization and minimization, potentially resulting in selection bias. Serious baseline covariate imbalances and selection biases often contribute to controversial interpretation of the trial results. The National Institute of Neurological Disorders and Stroke recombinant tissue plasminogen activator Stroke Trial and the Captopril Prevention Project are two examples. In this article, we propose a new randomization strategy, termed the minimal sufficient balance randomization, which will dually prevent serious imbalances in all important baseline covariates, including both categorical and continuous types, and preserve the randomness of treatment allocation. Computer simulations are conducted using the data from the National Institute of Neurological Disorders and Stroke recombinant tissue plasminogen activator Stroke Trial. Serious imbalances in four continuous and one categorical covariate are prevented with a small cost in treatment allocation randomness. A scenario of simultaneously balancing 11 baseline covariates is explored with similar promising results. The proposed minimal sufficient balance randomization algorithm can be easily implemented in computerized central randomization systems for large multicenter trials.
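
    A schematic of the mechanism (hypothetical thresholds and test, not the authors' exact algorithm): test each important covariate for imbalance between arms as each subject arrives, and bias the allocation coin only when some covariate is seriously imbalanced, toward the arm the new subject would help rebalance.

        import random
        from scipy import stats

        def msb_assign(x_new, arm_a, arm_b, p_flag=0.3, bias=0.7):
            """x_new: covariates of the incoming subject;
            arm_a, arm_b: covariate vectors already assigned to each arm.
            Complete randomization unless some covariate shows imbalance
            (two-sample t-test p-value < p_flag); then a biased coin."""
            worst_p, favored = 1.0, None
            for j in range(len(x_new)):
                xa = [x[j] for x in arm_a]
                xb = [x[j] for x in arm_b]
                p = stats.ttest_ind(xa, xb).pvalue
                if p < worst_p:
                    worst_p = p
                    low = 'A' if sum(xa) / len(xa) < sum(xb) / len(xb) else 'B'
                    pooled = (sum(xa) + sum(xb)) / (len(xa) + len(xb))
                    # A high new value helps the low-mean arm catch up.
                    favored = low if x_new[j] > pooled else ('B' if low == 'A' else 'A')
            if worst_p < p_flag:
                return favored if random.random() < bias else ('B' if favored == 'A' else 'A')
            return random.choice('AB')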

  10. Network meta-analysis incorporating randomized controlled trials and non-randomized comparative cohort studies for assessing the safety and effectiveness of medical treatments: challenges and opportunities.

    PubMed

    Cameron, Chris; Fireman, Bruce; Hutton, Brian; Clifford, Tammy; Coyle, Doug; Wells, George; Dormuth, Colin R; Platt, Robert; Toh, Sengwee

    2015-11-05

    Network meta-analysis is increasingly used to allow comparison of multiple treatment alternatives simultaneously, some of which may not have been compared directly in primary research studies. The majority of network meta-analyses published to date have incorporated data from randomized controlled trials (RCTs) only; however, inclusion of non-randomized studies may sometimes be considered. Non-randomized studies can complement RCTs or address some of their limitations, such as short follow-up time, small sample size, highly selected population, high cost, and ethical restrictions. In this paper, we discuss the challenges and opportunities of incorporating both RCTs and non-randomized comparative cohort studies into network meta-analysis for assessing the safety and effectiveness of medical treatments. Non-randomized studies with inadequate control of biases such as confounding may threaten the validity of the entire network meta-analysis. Therefore, identification and inclusion of non-randomized studies must balance their strengths with their limitations. Inclusion of both RCTs and non-randomized studies in network meta-analysis will likely increase in the future due to the growing need to assess multiple treatments simultaneously, the availability of higher quality non-randomized data and more valid methods, and the increased use of progressive licensing and product listing agreements requiring collection of data over the life cycle of medical products. Inappropriate inclusion of non-randomized studies could perpetuate the biases that are unknown, unmeasured, or uncontrolled. However, thoughtful integration of randomized and non-randomized studies may offer opportunities to provide more timely, comprehensive, and generalizable evidence about the comparative safety and effectiveness of medical treatments.

  11. Truly random number generation: an example

    NASA Astrophysics Data System (ADS)

    Frauchiger, Daniela; Renner, Renato

    2013-10-01

    Randomness is crucial for a variety of applications, ranging from gambling to computer simulations, and from cryptography to statistics. However, many of the currently used methods for generating randomness do not meet the criteria that are necessary for these applications to work properly and safely. A common problem is that a sequence of numbers may look random but nevertheless not be truly random. In fact, the sequence may pass all standard statistical tests and yet be perfectly predictable. This renders it useless for many applications. For example, in cryptography, the predictability of a "randomly" chosen password is obviously undesirable. Here, we review a recently developed approach to generating true, and hence unpredictable, randomness.

  12. Random matrix techniques in quantum information theory

    SciTech Connect

    Collins, Benoît; Nechita, Ion

    2016-01-15

    The purpose of this review is to present some of the latest developments using random techniques, and in particular, random matrix techniques in quantum information theory. Our review is a blend of a rather exhaustive survey and more detailed examples, coming mainly from research projects in which the authors were involved. We focus on two main topics, random quantum states and random quantum channels. We present results related to entropic quantities, entanglement of typical states, entanglement thresholds, the output set of quantum channels, and violations of the minimum output entropy of random channels.

  13. Model ecosystems with random nonlinear interspecies interactions

    NASA Astrophysics Data System (ADS)

    Santos, Danielle O. C.; Fontanari, José F.

    2004-12-01

    The principle of competitive exclusion in ecology establishes that two species living together cannot occupy the same ecological niche. Here we present a model ecosystem in which the species are described by a series of phenotypic characters and the strength of the competition between two species is given by a nondecreasing (modulating) function of the number of common characters. Using analytical tools of statistical mechanics we find that the ecosystem diversity, defined as the fraction of species that coexist at equilibrium, decreases as the complexity (i.e., number of characters) of the species increases, regardless of the modulating function. By considering both selective and random elimination of the links in the community web, we show that ecosystems composed of simple species are more robust than those composed of complex species. In addition, we show that the puzzling result that there exists either rich or poor ecosystems for a linear modulating function is not typical of communities in which the interspecies interactions are determined by a complementarity rule.

  14. Social Selection and Religiously Selective Faith Schools

    ERIC Educational Resources Information Center

    Pettinger, Paul

    2014-01-01

    This article reviews recent research looking at the socio-economic profile of pupils at faith schools and the contribution religiously selective admission arrangements make. It finds that selection by faith leads to greater social segregation and is open to manipulation. It urges that such selection should end, making the state-funded school…

  15. 40 CFR 91.506 - Engine sample selection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) CONTROL OF EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Manufacturer Production Line Testing... manufacturer will begin to randomly select engines from each engine family for production line testing at a rate of one percent. Each engine will be selected from the end of the assembly line. (1) For...

  16. 40 CFR 91.506 - Engine sample selection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) CONTROL OF EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Manufacturer Production Line Testing... manufacturer will begin to randomly select engines from each engine family for production line testing at a rate of one percent. Each engine will be selected from the end of the assembly line. (1) For...

  17. 40 CFR 91.506 - Engine sample selection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) CONTROL OF EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Manufacturer Production Line Testing Program § 91... will begin to randomly select engines from each engine family for production line testing at a rate of one percent. Each engine will be selected from the end of the assembly line. (1) For newly...

  18. 40 CFR 91.506 - Engine sample selection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) CONTROL OF EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Manufacturer Production Line Testing... manufacturer will begin to randomly select engines from each engine family for production line testing at a rate of one percent. Each engine will be selected from the end of the assembly line. (1) For...

  19. Selective Exposure and Retention of Political Advertising: A Regional Comparison.

    ERIC Educational Resources Information Center

    Surlin, Stuart H.; Gordon, Thomas F.

    The results presented in this article are but a portion of the information gathered in a larger survey examining the relative roles of "selective exposure" to and "selective retention" of political advertising during the 1972 presidential election. Random samples in two metropolitan areas in different regions of the country (Atlanta, Ga., n=281;…

  20. In vitro selection of optimal DNA substrates for T4 RNA ligase

    NASA Technical Reports Server (NTRS)

    Harada, Kazuo; Orgel, Leslie E.

    1993-01-01

    We have used in vitro selection techniques to characterize DNA sequences that are ligated efficiently by T4 RNA ligase. We find that the ensemble of selected sequences ligated about 10 times as efficiently as the random mixture of sequences used as the input for selection. Surprisingly, the majority of the selected sequences approximated a well-defined consensus sequence.

  1. Fourier dimension of random images

    NASA Astrophysics Data System (ADS)

    Ekström, Fredrik

    2016-10-01

    Given a compact set of real numbers, a random C^{m + α}-diffeomorphism is constructed such that the image of any measure that is concentrated on the set and satisfies a certain condition involving a real number s almost surely has Fourier dimension greater than or equal to s / (m + α). This is used to show that every Borel subset of the real numbers of Hausdorff dimension s is C^{m + α}-equivalent to a set of Fourier dimension greater than or equal to s / (m + α). In particular, every Borel set is diffeomorphic to a Salem set, and the Fourier dimension is not invariant under C^m-diffeomorphisms for any m.

  2. Experimental studies: randomized clinical trials.

    PubMed

    Gjorgov, A N

    1998-01-01

    There are two major approaches to medical investigations: observational studies and experimental trials. The classical application of the experimental design to studies of human populations is the randomized clinical trial of the efficacy of a new drug or treatment. A further application of experimental studies is the testing of hypotheses about the etiology of a disease that have already been tested and corroborated in various forms of observational studies. Ethical considerations and requirements for the consent of the experimental subjects are of primary concern in clinical trials, and those concerns set the first and final limits on implementing a trial. General moral principles for research with human subjects and animals, defined by the "Nuremberg Code," set strict criteria for the approval, endorsement and evaluation of a clinical trial.

  3. The Random Quadratic Assignment Problem

    NASA Astrophysics Data System (ADS)

    Paul, Gerald; Shao, Jia; Stanley, H. Eugene

    2011-11-01

    The quadratic assignment problem, QAP, is one of the most difficult of all combinatorial optimization problems. Here, we use an abbreviated application of the statistical mechanics replica method to study the asymptotic behavior of instances in which the entries of at least one of the two matrices that specify the problem are chosen from a random distribution P. Surprisingly, the QAP has not been studied before using the replica method, despite the fact that the QAP was first proposed over 50 years ago and the replica method was developed over 30 years ago. We find simple forms for C_min and C_max, the costs of the minimal and maximal solutions respectively. Notable features of our results are the symmetry of the results for C_min and C_max and their dependence on P only through its mean and standard deviation, independent of the details of P.

  4. Structure of random discrete spacetime

    NASA Technical Reports Server (NTRS)

    Brightwell, Graham; Gregory, Ruth

    1991-01-01

    The usual picture of spacetime consists of a continuous manifold, together with a metric of Lorentzian signature which imposes a causal structure on the spacetime. A model, first suggested by Bombelli et al., is considered in which spacetime consists of a discrete set of points taken at random from a manifold, with only the causal structure on this set remaining. This structure constitutes a partially ordered set (or poset). Working from the poset alone, it is shown how to construct a metric on the space which closely approximates the metric on the original spacetime manifold, how to define the effective dimension of the spacetime, and how such quantities may depend on the scale of measurement. Possible desirable features of the model are discussed.

  5. Clique percolation in random networks.

    PubMed

    Derényi, Imre; Palla, Gergely; Vicsek, Tamás

    2005-04-29

    The notion of k-clique percolation in random graphs is introduced, where k is the size of the complete subgraphs whose large-scale organizations are analytically and numerically investigated. For the Erdős-Rényi graph of N vertices we obtain that the percolation transition of k-cliques takes place when the probability of two vertices being connected by an edge reaches the threshold p_c(k) = [(k − 1)N]^(−1/(k − 1)). At the transition point the scaling of the giant component with N is highly nontrivial and depends on k. We discuss why clique percolation is a novel and efficient approach to the identification of overlapping communities in large real networks.
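
    The threshold is easy to probe numerically with networkx's built-in k-clique community finder (the graph size and k below are arbitrary choices):

        import networkx as nx
        from networkx.algorithms.community import k_clique_communities

        N, k = 200, 3
        p_c = ((k - 1) * N) ** (-1 / (k - 1))       # predicted threshold
        for factor in (0.5, 1.0, 2.0):
            G = nx.erdos_renyi_graph(N, factor * p_c, seed=42)
            comms = list(k_clique_communities(G, k))
            giant = max((len(c) for c in comms), default=0)
            print(f"p = {factor} * p_c: largest {k}-clique community = {giant}")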

  6. Structure of random bidisperse foam.

    SciTech Connect

    Reinelt, Douglas A.; van Swol, Frank B.; Kraynik, Andrew Michael

    2005-02-01

    The Surface Evolver was used to compute the equilibrium microstructure of random soap foams with bidisperse cell-size distributions and to evaluate topological and geometric properties of the foams and individual cells. The simulations agree with the experimental data of Matzke and Nestler for the probability ρ(F) of finding cells with F faces and its dependence on the fraction of large cells. The simulations also agree with the theory for isotropic Plateau polyhedra (IPP), which describes the F-dependence of cell geometric properties, such as surface area, edge length, and mean curvature (diffusive growth rate); this is consistent with results for polydisperse foams. Cell surface areas are about 10% greater than spheres of equal volume, which leads to a simple but accurate relation for the surface free energy density of foams. The Aboav-Weaire law is not valid for bidisperse foams.

  7. Lasso adjustments of treatment effect estimates in randomized experiments

    PubMed Central

    Bloniarz, Adam; Liu, Hanzhong; Zhang, Cun-Hui; Sekhon, Jasjeet S.; Yu, Bin

    2016-01-01

    We provide a principled way for investigators to analyze randomized experiments when the number of covariates is large. Investigators often use linear multivariate regression to analyze randomized experiments instead of simply reporting the difference of means between treatment and control groups. Their aim is to reduce the variance of the estimated treatment effect by adjusting for covariates. If there are a large number of covariates relative to the number of observations, regression may perform poorly because of overfitting. In such cases, the least absolute shrinkage and selection operator (Lasso) may be helpful. We study the resulting Lasso-based treatment effect estimator under the Neyman–Rubin model of randomized experiments. We present theoretical conditions that guarantee that the estimator is more efficient than the simple difference-of-means estimator, and we provide a conservative estimator of the asymptotic variance, which can yield tighter confidence intervals than the difference-of-means estimator. Simulation and data examples show that Lasso-based adjustment can be advantageous even when the number of covariates is less than the number of observations. Specifically, a variant using Lasso for selection and ordinary least squares (OLS) for estimation performs particularly well, and it chooses a smoothing parameter based on combined performance of Lasso and OLS. PMID:27382153
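
    The Lasso-for-selection, OLS-for-estimation variant can be sketched in a few lines (simulated data; all names and dimensions below are illustrative, and the Lasso here is run on the covariates only, with the treatment added back at the OLS stage):

        import numpy as np
        from sklearn.linear_model import LassoCV

        rng = np.random.default_rng(0)
        n, p = 200, 150                       # many covariates, modest n
        X = rng.standard_normal((n, p))
        T = rng.integers(0, 2, n)             # randomized treatment
        y = 2.0 * T + X[:, :5] @ np.ones(5) + rng.standard_normal(n)

        # Step 1: Lasso selects the covariates worth adjusting for.
        sel = np.flatnonzero(LassoCV(cv=5).fit(X, y).coef_)

        # Step 2: OLS of the outcome on treatment plus selected covariates.
        Z = np.column_stack([np.ones(n), T, X[:, sel]])
        beta = np.linalg.lstsq(Z, y, rcond=None)[0]
        print("difference of means :", y[T == 1].mean() - y[T == 0].mean())
        print("Lasso + OLS estimate:", beta[1])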

  8. Compressive sensing optical coherence tomography using randomly accessible lasers

    NASA Astrophysics Data System (ADS)

    Harfouche, Mark; Satyan, Naresh; Vasilyev, Arseny; Yariv, Amnon

    2014-05-01

    We propose and demonstrate a novel compressive sensing swept source optical coherence tomography (SSOCT) system that enables high-speed images to be taken while maintaining the high resolution offered by a large-bandwidth sweep. Conventional SSOCT systems sweep the optical frequency of a laser ω(t) to determine the depth of the reflectors at a given lateral location. A scatterer located at delay τ appears as a sinusoid cos(ω(t)τ) at the photodetector. The finite optical chirp rate and the speed of analog-to-digital and digital-to-analog converters limit the acquisition rate of an axial scan. The proposed acquisition modality enables much faster image acquisition rates by interrogating the beat signal at randomly selected optical frequencies while preserving resolution and depth of field. The system utilizes a randomly accessible laser, a modulated grating Y-branch laser, to sample the interference pattern from a scene at randomly selected optical frequencies over an optical bandwidth of 5 THz, corresponding to a resolution of 30 μm in air. The depth profile is then reconstructed using an l1-minimization algorithm with a LASSO constraint. Signal-dependent noise sources, shot noise and phase noise, are analyzed and taken into consideration during the recovery. Redundant dictionaries are used to improve the reconstruction of the depth profile. A compression by a factor of 10 for sparse targets up to a depth of 15 mm in noisy environments is shown.

  9. Weight distributions for turbo codes using random and nonrandom permutations

    NASA Technical Reports Server (NTRS)

    Dolinar, S.; Divsalar, D.

    1995-01-01

    This article takes a preliminary look at the weight distributions achievable for turbo codes using random, nonrandom, and semirandom permutations. Due to the recursiveness of the encoders, it is important to distinguish between self-terminating and non-self-terminating input sequences. The non-self-terminating sequences have little effect on decoder performance, because they accumulate high encoded weight until they are artificially terminated at the end of the block. From probabilistic arguments based on selecting the permutations randomly, it is concluded that the self-terminating weight-2 data sequences are the most important consideration in the design of constituent codes; higher-weight self-terminating sequences have successively decreasing importance. Also, increasing the number of codes and, correspondingly, the number of permutations makes it more and more likely that the bad input sequences will be broken up by one or more of the permuters. It is possible to design nonrandom permutations that ensure that the minimum distance due to weight-2 input sequences grows roughly as the square root of (2N), where N is the block length. However, these nonrandom permutations amplify the bad effects of higher-weight inputs, and as a result they are inferior in performance to randomly selected permutations. But there are 'semirandom' permutations that perform nearly as well as the designed nonrandom permutations with respect to weight-2 input sequences and are not as susceptible to being foiled by higher-weight inputs.

  10. Random stress and Omori's law

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.

    2011-09-01

    We consider two statistical regularities that were used to explain Omori's law of the aftershock rate decay: the Lévy and Inverse Gaussian (IGD) distributions. These distributions are thought to describe stress behaviour influenced by various random factors: post-earthquake stress time history is described by a Brownian motion. Both distributions decay to zero for time intervals close to zero. But this feature contradicts the high immediate aftershock level according to Omori's law. We propose that these statistical distributions are influenced by the power-law stress distribution near the earthquake focal zone and we derive new distributions as a mixture of power-law stress with the exponent ψ and Lévy as well as IGD distributions. Such new distributions describe the resulting inter-earthquake time intervals and closely resemble Omori's law. The new Lévy distribution has a pure power law form with the exponent -(1 +ψ/2) and the mixed IGD has two exponents: the same as Lévy for small time intervals and -(1 +ψ) for longer times. For even longer time intervals this power-law behaviour should be replaced by a uniform seismicity rate corresponding to the long-term tectonic deformation. We compute these background rates using our former analysis of earthquake size distribution and its connection to plate tectonics. We analyse several earthquake catalogues to confirm and illustrate our theoretical results. Finally, we discuss how the parameters of random stress dynamics can be determined through a more detailed statistical analysis of earthquake occurrence or by new laboratory experiments.

  11. Universality of fixation probabilities in randomly structured populations.

    PubMed

    Adlam, Ben; Nowak, Martin A

    2014-10-27

    The stage of evolution is the population of reproducing individuals. The structure of the population is known to affect the dynamics and outcome of evolutionary processes, but analytical results for generic random structures have been lacking. The most general result so far, the isothermal theorem, assumes the propensity for change in each position is exactly the same, but realistic biological structures are always subject to variation and noise. We consider a finite population under constant selection whose structure is given by a variety of weighted, directed, random graphs; vertices represent individuals and edges interactions between individuals. By establishing a robustness result for the isothermal theorem and using large deviation estimates to understand the typical structure of random graphs, we prove that for a generalization of the Erdős-Rényi model, the fixation probability of an invading mutant is approximately the same as that of a mutant of equal fitness in a well-mixed population with high probability. Simulations of perturbed lattices, small-world networks, and scale-free networks behave similarly. We conjecture that the fixation probability in a well-mixed population, (1 − r^(−1))/(1 − r^(−n)), is universal: for many random graph models, the fixation probability approaches the above function uniformly as the graphs become large.
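
    The conjectured universal function is the classical Moran fixation probability, which is simple to check against simulation in the well-mixed case (a sketch; population size, fitness, and trial count below are arbitrary):

        import numpy as np

        def fixation_formula(r, n):
            return (1 - r**-1) / (1 - r**-n)

        def simulate_moran(r, n, trials=5000, seed=0):
            rng = np.random.default_rng(seed)
            fixed = 0
            for _ in range(trials):
                m = 1                          # one invading mutant
                while 0 < m < n:
                    # Birth chosen proportional to fitness, death uniform.
                    birth_mut = rng.random() < r * m / (r * m + n - m)
                    death_mut = rng.random() < m / n
                    m += int(birth_mut) - int(death_mut)
                fixed += (m == n)
            return fixed / trials

        r, n = 1.5, 20
        print(fixation_formula(r, n), simulate_moran(r, n))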

  12. The group selection controversy.

    PubMed

    Leigh, E G

    2010-01-01

    Many thought Darwinian natural selection could not explain altruism. This error led Wynne-Edwards to explain sustainable exploitation in animals by selection against overexploiting groups. Williams riposted that selection among groups rarely overrides within-group selection. Hamilton showed that altruism can evolve through kin selection. How strongly does group selection influence evolution? Following Price, Hamilton showed how levels of selection interact: group selection prevails if Hamilton's rule applies. Several showed that group selection drove some major evolutionary transitions. Following Hamilton's lead, Queller extended Hamilton's rule, replacing genealogical relatedness by the regression on an actor's genotypic altruism of interacting neighbours' phenotypic altruism. Price's theorem shows the generality of Hamilton's rule. All instances of group selection can be viewed as increasing inclusive fitness of autosomal genomes. Nonetheless, to grasp fully how cooperation and altruism evolve, most biologists need more concrete concepts like kin selection, group selection and selection among individuals for their common good.

  13. Postprocessing for quantum random-number generators: Entropy evaluation and randomness extraction

    NASA Astrophysics Data System (ADS)

    Ma, Xiongfeng; Xu, Feihu; Xu, He; Tan, Xiaoqing; Qi, Bing; Lo, Hoi-Kwong

    2013-06-01

    Quantum random-number generators (QRNGs) can offer a means to generate information-theoretically provable random numbers, in principle. In practice, unfortunately, the quantum randomness is inevitably mixed with classical randomness due to classical noises. To distill this quantum randomness, one needs to quantify the randomness of the source and apply a randomness extractor. Here, we propose a generic framework for evaluating quantum randomness of real-life QRNGs by min-entropy, and apply it to two different existing quantum random-number systems in the literature. Moreover, we provide a guideline of QRNG data postprocessing for which we implement two information-theoretically provable randomness extractors: Toeplitz-hashing extractor and Trevisan's extractor.
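
    The Toeplitz-hashing extractor, in particular, is compact: an m x n binary Toeplitz matrix built from n + m − 1 uniform seed bits multiplies the raw bits modulo 2. A minimal numpy version (the output length m would in practice be set by the evaluated min-entropy; here it is just an assumed parameter):

        import numpy as np
        from scipy.linalg import toeplitz

        def toeplitz_extract(raw_bits, m, seed_bits):
            """Extract m nearly uniform bits from n raw bits."""
            n = len(raw_bits)
            assert len(seed_bits) == n + m - 1
            T = toeplitz(seed_bits[:m], seed_bits[m - 1:])  # m x n, binary
            return (T @ np.asarray(raw_bits)) % 2

        rng = np.random.default_rng(7)
        raw = (rng.random(1024) < 0.7).astype(int)   # biased raw source
        seed = rng.integers(0, 2, 1024 + 256 - 1)
        out = toeplitz_extract(raw, 256, seed)
        print(out.mean())                            # close to 0.5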

  14. 77 FR 72905 - Pipeline Safety: Random Drug Testing Rate; Contractor MIS Reporting; and Obtaining DAMIS Sign-In...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-06

    ... for Random Drug Testing Operators of gas, hazardous liquid, and carbon dioxide pipelines and operators of liquefied natural gas facilities must randomly select and test a percentage of covered employees..._Registration.pdf . Pursuant to 49 CFR Parts 199.119(a) and 199.229(a), operators with 50 or more...

  15. Semi-device-independent randomness expansion with partially free random sources using 3 →1 quantum random access code

    NASA Astrophysics Data System (ADS)

    Zhou, Yu-Qian; Gao, Fei; Li, Dan-Dan; Li, Xin-Hui; Wen, Qiao-Yan

    2016-09-01

    We have proved that new randomness can be certified by partially free sources using 2 →1 quantum random access code (QRAC) in the framework of semi-device-independent (SDI) protocols [Y.-Q. Zhou, H.-W. Li, Y.-K. Wang, D.-D. Li, F. Gao, and Q.-Y. Wen, Phys. Rev. A 92, 022331 (2015), 10.1103/PhysRevA.92.022331]. To improve the effectiveness of the randomness generation, here we propose the SDI randomness expansion using 3 →1 QRAC and obtain the corresponding classical and quantum bounds of the two-dimensional quantum witness. Moreover, we get the condition which should be satisfied by the partially free sources to successfully certify new randomness, and the analytic relationship between the certified randomness and the two-dimensional quantum witness violation.

  16. A Markov Chain Model for evaluating the effectiveness of randomized surveillance procedures

    SciTech Connect

    Edmunds, T.A.

    1994-01-01

    A Markov Chain Model has been developed to evaluate the effectiveness of randomized surveillance procedures. The model is applicable for surveillance systems that monitor a collection of assets by randomly selecting and inspecting the assets. The model provides an estimate of the detection probability as a function of the amount of time that an adversary would require to steal or sabotage the asset. An interactive computer code has been written to perform the necessary computations.
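
    The flavor of the computation can be conveyed without the full chain: if the system inspects a fixed number of assets drawn uniformly at random each period, the detection probability after t periods follows directly (an illustrative simplification of the model, not the code described above):

        def detection_probability(n_assets, inspected_per_period, periods):
            """P(the adversary's asset is inspected at least once) under
            uniform random selection each period."""
            p_miss = 1 - inspected_per_period / n_assets
            return 1 - p_miss ** periods

        for t in (1, 5, 10, 30):
            print(t, detection_probability(100, 5, t))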

  17. Estimating the causal effect of randomization versus treatment preference in a doubly randomized preference trial.

    PubMed

    Marcus, Sue M; Stuart, Elizabeth A; Wang, Pei; Shadish, William R; Steiner, Peter M

    2012-06-01

    Although randomized studies have high internal validity, generalizability of the estimated causal effect from randomized clinical trials to real-world clinical or educational practice may be limited. We consider the implication of randomized assignment to treatment, as compared with choice of preferred treatment as it occurs in real-world conditions. Compliance, engagement, or motivation may be better with a preferred treatment, and this can complicate the generalizability of results from randomized trials. The doubly randomized preference trial (DRPT) is a hybrid randomized and nonrandomized design that allows for estimation of the causal effect of randomization versus treatment preference. In the DRPT, individuals are first randomized to either randomized assignment or choice assignment. Those in the randomized assignment group are then randomized to treatment or control, and those in the choice group receive their preference of treatment versus control. Using the potential outcomes framework, we apply the algebra of conditional independence to show how the DRPT can be used to derive an unbiased estimate of the causal effect of randomization versus preference for each of the treatment and comparison conditions. Also, we show how these results can be implemented using full matching on the propensity score. The methodology is illustrated with a DRPT of introductory psychology students who were randomized to randomized assignment or preference of mathematics versus vocabulary training. We found a small to moderate benefit of preference versus randomization with respect to the mathematics outcome for those who received mathematics training.

  18. Lower bounds for randomized Exclusive Write PRAMs

    SciTech Connect

    MacKenzie, P.D.

    1995-05-02

    In this paper we study the question: How useful is randomization in speeding up Exclusive Write PRAM computations? Our results give further evidence that randomization is of limited use in these types of computations. First we examine a compaction problem on both the CREW and EREW PRAM models, and we present randomized lower bounds which match the best deterministic lower bounds known. (For the CREW PRAM model, the lower bound is asymptotically optimal.) These are the first non-trivial randomized lower bounds known for the compaction problem on these models. We show that our lower bounds also apply to the problem of approximate compaction. Next we examine the problem of computing boolean functions on the CREW PRAM model, and we present a randomized lower bound, which improves on the previous best randomized lower bound for many boolean functions, including the OR function. (The previous lower bounds for these functions were asymptotically optimal, but we improve the constant multiplicative factor.) We also give an alternate proof for the randomized lower bound on PARITY, which was already optimal to within a constant additive factor. Lastly, we give a randomized lower bound for integer merging on an EREW PRAM which matches the best deterministic lower bound known. In all our proofs, we use the Random Adversary method, which has previously only been used for proving lower bounds on models with Concurrent Write capabilities. Thus this paper also serves to illustrate the power and generality of this method for proving parallel randomized lower bounds.

  19. Random geometric graph description of connectedness percolation in rod systems.

    PubMed

    Chatterjee, Avik P; Grimaldi, Claudio

    2015-09-01

    The problem of continuum percolation in dispersions of rods is reformulated in terms of weighted random geometric graphs. Nodes (or sites or vertices) in the graph represent spatial locations occupied by the centers of the rods. The probability that an edge (or link) connects any randomly selected pair of nodes depends upon the rod volume fraction as well as the distribution over their sizes and shapes, and also upon quantities that characterize their state of dispersion (such as the orientational distribution function). We employ the observation that contributions from closed loops of connected rods are negligible in the limit of large aspect ratios to obtain percolation thresholds that are fully equivalent to those calculated within the second-virial approximation of the connectedness Ornstein-Zernike equation. Our formulation can account for effects due to interactions between the rods, and many-body features can be partially addressed by suitable choices for the edge probabilities.

  20. Structure damage detection based on random forest recursive feature elimination

    NASA Astrophysics Data System (ADS)

    Zhou, Qifeng; Zhou, Hao; Zhou, Qingqing; Yang, Fan; Luo, Linkai

    2014-05-01

    Feature extraction is a key preliminary step in structural damage detection. In this paper, a structural damage detection method based on wavelet packet decomposition (WPD) and random forest recursive feature elimination (RF-RFE) is proposed. In order to obtain the most effective feature subset and to improve the identification accuracy, a two-stage feature selection method is adopted after WPD. First, the damage features are sorted according to the original random forest variable-importance analysis. Second, RF-RFE is used to eliminate the least important feature and reorder the feature list at each iteration, yielding a new feature-importance sequence. Finally, the k-nearest neighbor (KNN) algorithm, as a benchmark classifier, is used to evaluate the extracted feature subset. A four-storey steel shear building model is chosen as an example for method verification. The experimental results show that using the fewer features obtained from the proposed method can achieve higher identification accuracy and reduce the detection time cost.
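
    The selection stage maps directly onto scikit-learn's RFE wrapper around a random forest; a generic sketch (synthetic features standing in for the WPD energies, with dimensions chosen arbitrarily):

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.feature_selection import RFE
        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier

        X, y = make_classification(n_samples=300, n_features=40,
                                   n_informative=8, random_state=0)

        # Recursive elimination ranked by random-forest importances,
        # dropping the least important feature on each iteration.
        rfe = RFE(RandomForestClassifier(n_estimators=200, random_state=0),
                  n_features_to_select=8, step=1).fit(X, y)
        X_sel = X[:, rfe.support_]

        # KNN as the benchmark classifier for the selected subset.
        print(cross_val_score(KNeighborsClassifier(), X_sel, y, cv=5).mean())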

  1. Random geometric graph description of connectedness percolation in rod systems

    NASA Astrophysics Data System (ADS)

    Chatterjee, Avik P.; Grimaldi, Claudio

    2015-09-01

    The problem of continuum percolation in dispersions of rods is reformulated in terms of weighted random geometric graphs. Nodes (or sites or vertices) in the graph represent spatial locations occupied by the centers of the rods. The probability that an edge (or link) connects any randomly selected pair of nodes depends upon the rod volume fraction as well as the distribution over their sizes and shapes, and also upon quantities that characterize their state of dispersion (such as the orientational distribution function). We employ the observation that contributions from closed loops of connected rods are negligible in the limit of large aspect ratios to obtain percolation thresholds that are fully equivalent to those calculated within the second-virial approximation of the connectedness Ornstein-Zernike equation. Our formulation can account for effects due to interactions between the rods, and many-body features can be partially addressed by suitable choices for the edge probabilities.

  2. Non-local MRI denoising using random sampling.

    PubMed

    Hu, Jinrong; Zhou, Jiliu; Wu, Xi

    2016-09-01

    In this paper, we propose a random sampling non-local means (SNLM) algorithm to eliminate noise in 3D MRI datasets. Non-local means (NLM) algorithms have been implemented efficiently for MRI denoising, but are always limited by high computational complexity. Compared to conventional methods, which raster through the entire search window when computing similarity weights, the proposed SNLM algorithm randomly selects a small subset of voxels, which dramatically decreases the computational burden while achieving competitive denoising results. Moreover, a structure tensor, which encapsulates high-order information, was introduced as an optimal sampling pattern for further improvement. Numerical experiments demonstrated that the proposed SNLM method achieves a good balance between denoising quality and computational efficiency. At a relative sampling ratio of ξ = 0.05, SNLM can remove noise as effectively as full NLM, while the running time is reduced to 1/20 of that of full NLM.
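    The core sampling trick can be sketched in 2D as follows (the paper operates on 3D MRI volumes and adds a structure-tensor sampling pattern; the patch size, Gaussian weight kernel, and sampling ratio below are illustrative assumptions):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def snlm_pixel(img, i, j, patch=3, search=10, h=0.1, ratio=0.05):
        """Denoise one pixel with a randomly sampled non-local-means estimate.

        Instead of rastering over the full search window, draw only a
        fraction `ratio` of candidate locations (the core idea of SNLM)."""
        half = patch // 2
        p_ref = img[i - half:i + half + 1, j - half:j + half + 1]
        # candidate patch centers inside the search window, clipped to bounds
        ii = np.arange(max(half, i - search), min(img.shape[0] - half, i + search + 1))
        jj = np.arange(max(half, j - search), min(img.shape[1] - half, j + search + 1))
        cand = np.array([(a, b) for a in ii for b in jj])
        k = max(1, int(ratio * len(cand)))
        picks = cand[rng.choice(len(cand), size=k, replace=False)]
        weights, values = [], []
        for (a, b) in picks:
            q = img[a - half:a + half + 1, b - half:b + half + 1]
            d2 = ((p_ref - q) ** 2).mean()          # patch similarity
            weights.append(np.exp(-d2 / h ** 2))    # Gaussian weighting
            values.append(img[a, b])
        w = np.array(weights)
        return float((w * np.array(values)).sum() / w.sum())

    noisy = rng.normal(0.5, 0.05, size=(64, 64))
    print(snlm_pixel(noisy, 32, 32))
    ```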

  3. Strategic use of number representation is independent of test instruction in random number generation.

    PubMed

    Strenge, Hans; Rogge, Carolin

    2010-04-01

    The effects of different instructions on verbal random number generation were examined in 40 healthy students who attempted to generate random sequences of the digits 1 to 6. Two groups of 20 received different instructions with alternative numerical representations. The Symbolic group (Arabic digits) was instructed to randomize while continuously using the analogy of selecting and replacing numbered balls from a hat, whereas the Nonsymbolic group (arrays of dots) was instructed to imagine repeatedly throwing a die. When asked for self-reports on their strategies, participants spontaneously reported visuospatial imagination of a mental number line (42%) or imagining throwing a die (23%). Individual number representation was not affected by the initial instruction. There were no differences in randomization performance by group. Comprehensive understanding of the nature of the randomization task requires considering individual differences in the construction of mental models.

  4. Selective mutism - resources

    MedlinePlus

    Resources - selective mutism ... The following organizations are good resources for information on selective mutism: American Speech-Language-Hearing Association -- www.asha.org/public/speech/disorders/selectivemutism.htm Selective Mutism and ...

  5. Rare attributes in finite universe: Hypotheses testing specification and exact randomized upper confidence bounds

    SciTech Connect

    Wright, T.

    1993-03-01

    When attributes are rare and few or none are observed in the selected sample from a finite universe, sampling statisticians are increasingly being challenged to use whatever methods are available to declare with high probability or confidence that the universe is near or completely attribute-free. This is especially true when the attribute is undesirable. Approximations such as those based on normal theory are frequently inadequate with rare attributes. For simple random sampling without replacement, an appropriate probability distribution for statistical inference is the hypergeometric distribution. But even with the hypergeometric distribution, the investigator is limited from making claims of attribute-free with high confidence unless the sample size is quite large using nonrandomized techniques. In the hypergeometric setting with rare attributes, exact randomized tests of hypothesis are investigated to determine the effect on power of how one specifies the null hypothesis. In particular, specifying the null hypothesis as zero attributes does not always yield maximum possible power. We also consider the hypothesis specification question under complex sampling designs including stratified random sampling and two-stage cluster sampling (one case involves random selection at first stage and another case involves probability proportional to size without replacement selection at first stage). Also under simple random sampling, this article defines and presents a simple algorithm for the construction of exact "randomized" upper confidence bounds which permit one to possibly report tighter bounds than those exact bounds obtained using "nonrandomized" methods.
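    A minimal sketch of inverting the exact hypergeometric test to obtain an upper confidence bound, with the randomized variant smoothing the boundary outcome by an auxiliary uniform draw (one common construction; the article's exact algorithm may differ):

    ```python
    import numpy as np
    from scipy.stats import hypergeom

    def exact_upper_bound(N, n, x, alpha=0.05, randomized=False, rng=None):
        """Exact upper confidence bound for the number of attribute items D
        in a universe of size N, given x attributes seen in a simple random
        sample of size n.  With u = 1 this is the usual nonrandomized bound;
        the randomized variant draws u ~ Uniform(0, 1) to sharpen it."""
        u = rng.random() if randomized and rng is not None else 1.0
        D_up = x  # the bound can never fall below the observed count
        for D in range(x, N + 1):
            # randomized tail probability P(X < x) + u * P(X = x) under D
            tail = hypergeom.cdf(x - 1, N, D, n) + u * hypergeom.pmf(x, N, D, n)
            if tail >= alpha:
                D_up = D        # D is still inside the confidence set
            else:
                break           # tail is decreasing in D, so we can stop
        return D_up

    rng = np.random.default_rng(2)
    print(exact_upper_bound(N=1000, n=100, x=0))                        # nonrandomized
    print(exact_upper_bound(N=1000, n=100, x=0,
                            randomized=True, rng=rng))                  # typically tighter
    ```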

  7. Randomized Controlled Trials of Add-On Antidepressants in Schizophrenia

    PubMed Central

    Joffe, Grigori; Stenberg, Jan-Henry

    2015-01-01

    Background: Despite adequate treatment with antipsychotics, a substantial number of patients with schizophrenia demonstrate only suboptimal clinical outcome. To overcome this challenge, various psychopharmacological combination strategies have been used, including antidepressants added to antipsychotics. Methods: To analyze the efficacy of add-on antidepressants for the treatment of negative, positive, cognitive, depressive, and antipsychotic-induced extrapyramidal symptoms in schizophrenia, published randomized controlled trials assessing the efficacy of adjunctive antidepressants in schizophrenia were reviewed using the following parameters: baseline clinical characteristics and number of patients, their on-going antipsychotic treatment, dosage of the add-on antidepressants, duration of the trial, efficacy measures, and outcomes. Results: There were 36 randomized controlled trials reported in 41 journal publications (n=1582). The antidepressants used were the selective serotonin reuptake inhibitors, duloxetine, imipramine, mianserin, mirtazapine, nefazodone, reboxetine, trazodone, and bupropion. Mirtazapine and mianserin showed somewhat consistent efficacy for negative symptoms and both seemed to enhance neurocognition. Trazodone and nefazodone appeared to improve antipsychotic-induced extrapyramidal symptoms. Imipramine and duloxetine tended to improve depressive symptoms. No clear evidence supporting selective serotonin reuptake inhibitors’ efficacy on any clinical domain of schizophrenia was found. Add-on antidepressants did not worsen psychosis. Conclusions: Despite a substantial number of randomized controlled trials, the overall efficacy of add-on antidepressants in schizophrenia remains uncertain mainly due to methodological issues. Some differences in efficacy on several schizophrenia domains seem, however, to exist and to vary by the antidepressant subgroups—plausibly due to differences in the mechanisms of action. Antidepressants may not worsen psychosis.

  8. Inherent randomness of evolving populations.

    PubMed

    Harper, Marc

    2014-03-01

    The entropy rates of the Wright-Fisher process, the Moran process, and generalizations are computed and used to compare these processes and their dependence on standard evolutionary parameters. Entropy rates are measures of the variation dependent on both short-run and long-run behaviors and allow the relationships between mutation, selection, and population size to be examined. Bounds for the entropy rate are given for the Moran process (independent of population size) and for the Wright-Fisher process (bounded for fixed population size). A generational Moran process is also presented for comparison to the Wright-Fisher Process. Results include analytic results and computational extensions.

  9. Inherent randomness of evolving populations

    NASA Astrophysics Data System (ADS)

    Harper, Marc

    2014-03-01

    The entropy rates of the Wright-Fisher process, the Moran process, and generalizations are computed and used to compare these processes and their dependence on standard evolutionary parameters. Entropy rates are measures of the variation dependent on both short-run and long-run behaviors and allow the relationships between mutation, selection, and population size to be examined. Bounds for the entropy rate are given for the Moran process (independent of population size) and for the Wright-Fisher process (bounded for fixed population size). A generational Moran process is also presented for comparison to the Wright-Fisher Process. Results include analytic results and computational extensions.

  10. From random walks to spin glasses

    NASA Astrophysics Data System (ADS)

    Derrida, B.

    1997-02-01

    The talk was a short review on systems which exhibit non-self-averaging effects: sums of random variables when the distribution has a long tail, mean field spin glasses, random map models and returns of a random walk to the origin. Non-self-averaging effects are identical in the case of sums of random variables and in the spin glass problem as predicted by the replica approach. Also we will see that for the random map models or for the problem of the returns of a random walk to the origin, the non-self-averaging effects coincide with the results of the replica approach when the number of replicas is n = -1/2 or n = -1.

  11. Tunable random lasing behavior in plasmonic nanostructures

    NASA Astrophysics Data System (ADS)

    Yadav, Ashish; Zhong, Liubiao; Sun, Jun; Jiang, Lin; Cheng, Gary J.; Chi, Lifeng

    2017-01-01

    Random lasing is desired in plasmonic nanostructures through surface plasmon amplification. In this study, tunable random lasing behavior was observed in dye molecules attached to Au nanorods (NRs), Au nanoparticles (NPs) and Au@Ag nanorods (NRs), respectively. Our experimental investigations showed that all nanostructures, i.e., Au@AgNRs, AuNRs and AuNPs, have intense tunable spectral effects. Random lasing was observed at an excitation wavelength of 532 nm and varying pump powers. The best random lasing properties were observed in the Au@AgNRs structure, which exhibits a broad absorption spectrum that overlaps sufficiently with that of the dye Rhodamine B (RhB). Au@AgNRs significantly enhance the tunable spectral behavior through the localized electromagnetic field and scattering. The random lasing in Au@AgNRs provides efficient coherent feedback for random lasers.

  12. Organization of growing random networks

    SciTech Connect

    Krapivsky, P. L.; Redner, S.

    2001-06-01

    The organizational development of growing random networks is investigated. These growing networks are built by adding nodes successively, and linking each to an earlier node of degree k with an attachment probability A_k. When A_k grows more slowly than linearly with k, the number of nodes with k links, N_k(t), decays faster than a power law in k, while for A_k growing faster than linearly in k, a single node emerges which connects to nearly all other nodes. When A_k is asymptotically linear, N_k(t) ∼ t k^(−ν), with ν dependent on details of the attachment probability, but in the range 2 < ν < ∞. The combined age and degree distribution of nodes shows that old nodes typically have a large degree. There is also a significant correlation in the degrees of neighboring nodes, so that nodes of similar degree are more likely to be connected. The size distributions of the in and out components of the network with respect to a given node (namely, its "descendants" and "ancestors") are also determined. The in component exhibits a robust s^(−2) power-law tail, where s is the component size. The out component has a typical size of order ln t, and it provides basic insights into the genealogy of the network.
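    A short simulation of this growth rule with the attachment kernel A_k = k^gamma (parameter values are illustrative) shows the qualitative regimes; in particular, for gamma > 1 a single node begins to capture a finite fraction of all links:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def grow_network(T, gamma=1.0):
        """Grow a random network by attaching each new node to one earlier
        node chosen with probability proportional to A_k = k^gamma."""
        degree = [1, 1]          # start from a single edge between nodes 0 and 1
        for _ in range(T):
            k = np.asarray(degree, dtype=float)
            probs = k ** gamma
            target = rng.choice(len(degree), p=probs / probs.sum())
            degree[target] += 1
            degree.append(1)     # the newcomer arrives with one link
        return np.asarray(degree)

    for gamma in (0.5, 1.0, 1.5):
        deg = grow_network(3000, gamma)
        print(f"gamma={gamma}: max degree fraction = {deg.max() / deg.sum():.3f}")
    ```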

  13. Aggregated Recommendation through Random Forests

    PubMed Central

    2014-01-01

    Aggregated recommendation refers to the process of suggesting one kind of items to a group of users. Compared to user-oriented or item-oriented approaches, it is more general and, therefore, more appropriate for cold-start recommendation. In this paper, we propose a random forest approach to create aggregated recommender systems. The approach is used to predict the rating of a group of users to a kind of items. In the preprocessing stage, we merge user, item, and rating information to construct an aggregated decision table, where rating information serves as the decision attribute. We also model the data conversion process corresponding to the new-user, new-item, and both-new problems. In the training stage, a forest is built for the aggregated training set, where each leaf is assigned a distribution of discrete ratings. In the testing stage, we present four prediction approaches to compute evaluation values based on the distribution of each tree. Experimental results on the well-known MovieLens dataset show that the aggregated approach maintains an acceptable level of accuracy. PMID:25180204

  14. Relationships between age and epi-genotype of the FMR1 exon 1/intron 1 boundary are consistent with non-random X-chromosome inactivation in FM individuals, with the selection for the unmethylated state being most significant between birth and puberty.

    PubMed

    Godler, David E; Inaba, Yoshimi; Shi, Elva Z; Skinner, Cindy; Bui, Quang M; Francis, David; Amor, David J; Hopper, John L; Loesch, Danuta Z; Hagerman, Randi J; Schwartz, Charles E; Slater, Howard R

    2013-04-15

    Methylation of the fragile X-related epigenetic element 2 (FREE2) located on the exon 1/intron 1 boundary of the FMR1 gene is related to FMRP expression and cognitive impairment in full mutation (FM; CGG>200) individuals. We examined the relationship between age, the size of the FMR1 CGG expansion and the methylation output ratio (MOR) at 12 CpG sites proximal to the exon 1/intron 1 boundary using FREE2 MALDI-TOF MS. The patient cohort included 119 males and 368 females, i.e. 121 healthy controls (CGG<40), 176 premutation (CGG 55-170) and 190 FM (CGG 213-2000). For all CpG units examined, FM males showed a significantly elevated MOR compared with that in hypermethylated FM females. In FM males the MOR for most CpG units significantly positively correlated with both age and CGG size (P < 0.05). In FM females the skewing towards the unmethylated state was significant for half of the units between birth and puberty (P < 0.05). The methylation status of intron 1 CpG10-12, which was most significantly related to cognitive impairment in our earlier study, did not change significantly with age in FM females. These results challenge the concept of fragile X syndrome (FXS)-related methylation being static over time, and suggest that due to the preference for the unmethylated state in FM females, X-inactivation at this locus is not random. The findings also highlight that the prognostic value of FXS methylation testing is not uniform between all CpG sites, and thus may need to be evaluated on a site-by-site basis.

  15. Rewards teach visual selective attention.

    PubMed

    Chelazzi, Leonardo; Perlato, Andrea; Santandrea, Elisa; Della Libera, Chiara

    2013-06-07

    Visual selective attention is the brain function that modulates ongoing processing of retinal input in order for selected representations to gain privileged access to perceptual awareness and guide behavior. Enhanced analysis of currently relevant or otherwise salient information is often accompanied by suppressed processing of the less relevant or salient input. Recent findings indicate that rewards exert a powerful influence on the deployment of visual selective attention. Such influence takes different forms depending on the specific protocol adopted in the given study. In some cases, the prospect of earning a larger reward in relation to a specific stimulus or location biases attention accordingly in order to maximize overall gain. This is mediated by an effect of reward acting as a type of incentive motivation for the strategic control of attention. In contrast, reward delivery can directly alter the processing of specific stimuli by increasing their attentional priority, and this can be measured even when rewards are no longer involved, reflecting a form of reward-mediated attentional learning. As a further development, recent work demonstrates that rewards can affect attentional learning in dissociable ways depending on whether rewards are perceived as feedback on performance or instead are registered as random-like events occurring during task performance. Specifically, it appears that visual selective attention is shaped by two distinct reward-related learning mechanisms: one requiring active monitoring of performance and outcome, and a second one detecting the sheer association between objects in the environment (whether attended or ignored) and the more-or-less rewarding events that accompany them. Overall this emerging literature demonstrates unequivocally that rewards "teach" visual selective attention so that processing resources will be allocated to objects, features and locations which are likely to optimize the organism's interaction with the environment.

  16. Some Tests of Randomness with Applications

    DTIC Science & Technology

    1981-02-01

    Relative efficiencies of distribution-free tests of randomness against normal alternatives. J. Am. Statist. Assoc. 49, 147-57. Wald, A. and Wolfowitz, J. ... assumption of randomness in data analysis. There may be possibilities of grave errors in assuming the randomness of a given set of data while it may ... Equidistribution test. This can be performed by using the Kolmogorov-Smirnov statistic to test uniformity of the real-valued sequence so generated. The discrete form of

  17. All-optical fast random number generator.

    PubMed

    Li, Pu; Wang, Yun-Cai; Zhang, Jian-Zhong

    2010-09-13

    We propose a scheme for an all-optical random number generator (RNG), which consists of an ultra-wide bandwidth (UWB) chaotic laser, an all-optical sampler and an all-optical comparator. Free from electronic-device bandwidth limitations, it can generate 10 Gbit/s random numbers in our simulation. The high-speed bit sequences can pass standard statistical tests for randomness after an all-optical exclusive-or (XOR) operation.

  18. Some physical applications of random hierarchical matrices

    SciTech Connect

    Avetisov, V. A.; Bikulov, A. Kh.; Vasilyev, O. A.; Nechaev, S. K.; Chertovich, A. V.

    2009-09-15

    The investigation of spectral properties of random block-hierarchical matrices as applied to dynamic and structural characteristics of complex hierarchical systems with disorder is proposed for the first time. Peculiarities of dynamics on random ultrametric energy landscapes are discussed and the statistical properties of scale-free and polyscale (depending on the topological characteristics under investigation) random hierarchical networks (graphs) obtained by multiple mapping are considered.

  19. The Theory of Random Laser Systems

    SciTech Connect

    Jiang, Xunya

    2001-01-01

    Studies of random laser systems are a new direction with promising potential applications and theoretical interest. The research is based on the theories of localization and laser physics. So far, the research shows that there are random lasing modes inside the systems, which is quite different from common laser systems. From the properties of the random lasing modes, one can understand the phenomena observed in the experiments, such as multi-peak and anisotropic spectra, lasing mode number saturation, mode competition and dynamic processes, etc. To summarize, this dissertation has contributed the following to the study of random laser systems: (1) by comparing the Lamb theory with the Letokhov theory, general formulas for the threshold length or gain of random laser systems were obtained; (2) the vital weakness of previous time-independent methods in random laser research was pointed out; (3) a new model combining the FDTD method and semi-classical laser theory was constructed, whose solutions explain the experimental results of multi-peak and anisotropic emission spectra and predict the saturation of the number of lasing modes and the length of localized lasing modes; (4) theoretical (Lamb theory) and numerical (FDTD and transfer-matrix calculation) studies of the origin of localized lasing modes in random laser systems were carried out; and (5) the use of random lasing modes was proposed as a new path to study wave localization in random systems, with a prediction of the lasing threshold discontinuity at the mobility edge.

  20. Random walks on simplicial complexes and harmonics

    PubMed Central

    Steenbergen, John

    2016-01-01

    In this paper, we introduce a class of random walks with absorbing states on simplicial complexes. Given a simplicial complex of dimension d, a random walk with an absorbing state is defined which relates to the spectrum of the k‐dimensional Laplacian for 1 ≤ k ≤ d. We study an example of random walks on simplicial complexes in the context of a semi‐supervised learning problem. Specifically, we consider a label propagation algorithm on oriented edges, which applies to a generalization of the partially labelled classification problem on graphs.

  1. True random numbers from amplified quantum vacuum.

    PubMed

    Jofre, M; Curty, M; Steinlechner, F; Anzolin, G; Torres, J P; Mitchell, M W; Pruneri, V

    2011-10-10

    Random numbers are essential for applications ranging from secure communications to numerical simulation and quantitative finance. Algorithms can rapidly produce pseudo-random outcomes, series of numbers that mimic most properties of true random numbers, while quantum random number generators (QRNGs) exploit intrinsic quantum randomness to produce true random numbers. Single-photon QRNGs are conceptually simple but produce few random bits per detection. In contrast, vacuum fluctuations are a vast resource for QRNGs: they are broad-band and thus can encode many random bits per second. Direct recording of vacuum fluctuations is possible, but requires shot-noise-limited detectors, at the cost of bandwidth. We demonstrate efficient conversion of vacuum fluctuations to true random bits using optical amplification of vacuum and interferometry. Using commercially available optical components, we demonstrate a QRNG at a bit rate of 1.11 Gbps. The proposed scheme has the potential to be extended to 10 Gbps and even up to 100 Gbps by taking advantage of high speed modulation sources and detectors for optical fiber telecommunication devices.

  2. Direct dialling of Haar random unitary matrices

    NASA Astrophysics Data System (ADS)

    Russell, Nicholas J.; Chakhmakhchyan, Levon; O’Brien, Jeremy L.; Laing, Anthony

    2017-03-01

    Random unitary matrices find a number of applications in quantum information science, and are central to the recently defined boson sampling algorithm for photons in linear optics. We describe an operationally simple method to directly implement Haar random unitary matrices in optical circuits, with no requirement for prior or explicit matrix calculations. Our physically motivated and compact representation directly maps independent probability density functions for parameters in Haar random unitary matrices, to optical circuit components. We go on to extend the results to the case of random unitaries for qubits.
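    For reference, the standard numerical recipe for sampling Haar random unitaries (QR decomposition of a complex Gaussian matrix with a phase correction) is sketched below; note the paper's contribution is a direct optical-circuit parametrization rather than this matrix computation:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def haar_unitary(n):
        """Draw an n x n unitary from the Haar measure via the QR trick
        (Ginibre matrix -> QR -> fix the phases of R's diagonal)."""
        z = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2)
        q, r = np.linalg.qr(z)
        # normalize so the distribution is exactly Haar, not biased by the
        # QR sign/phase convention
        phases = np.diag(r) / np.abs(np.diag(r))
        return q * phases            # scales column j of q by phases[j]

    U = haar_unitary(4)
    print(np.allclose(U.conj().T @ U, np.eye(4)))   # unitarity check
    ```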

  3. Feature Selection for Chemical Sensor Arrays Using Mutual Information

    PubMed Central

    Wang, X. Rosalind; Lizier, Joseph T.; Nowotny, Thomas; Berna, Amalia Z.; Prokopenko, Mikhail; Trowell, Stephen C.

    2014-01-01

    We address the problem of feature selection for classifying a diverse set of chemicals using an array of metal oxide sensors. Our aim is to evaluate a filter approach to feature selection with reference to previous work, which used a wrapper approach on the same data set, and established best features and upper bounds on classification performance. We selected feature sets that exhibit the maximal mutual information with the identity of the chemicals. The selected features closely match those found to perform well in the previous study using a wrapper approach to conduct an exhaustive search of all permitted feature combinations. By comparing the classification performance of support vector machines (using features selected by mutual information) with the performance observed in the previous study, we found that while our approach does not always give the maximum possible classification performance, it always selects features that achieve classification performance approaching the optimum obtained by exhaustive search. We performed further classification using the selected feature set with some common classifiers and found that, for the selected features, Bayesian Networks gave the best performance. Finally, we compared the observed classification performances with the performance of classifiers using randomly selected features. We found that the selected features consistently outperformed randomly selected features for all tested classifiers. The mutual information filter approach is therefore a computationally efficient method for selecting near optimal features for chemical sensor arrays. PMID:24595058
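    A compact sketch of the filter approach on synthetic stand-in data: rank features by mutual information with the class label, classify with the top-ranked subset, and compare against randomly chosen features (ranking each feature's marginal MI is a simplification of selecting the jointly maximal-MI set):

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import mutual_info_classif
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    # Hypothetical stand-in for the sensor-array responses; sizes are
    # illustrative assumptions.
    X, y = make_classification(n_samples=400, n_features=40, n_informative=8,
                               random_state=0)

    # Filter approach: rank features by mutual information with the label.
    mi = mutual_info_classif(X, y, random_state=0)
    top = np.argsort(mi)[::-1][:8]

    svm_sel = cross_val_score(SVC(), X[:, top], y, cv=5).mean()
    rand = np.random.default_rng(0).choice(X.shape[1], size=8, replace=False)
    svm_rnd = cross_val_score(SVC(), X[:, rand], y, cv=5).mean()
    print(f"MI-selected: {svm_sel:.3f}  random: {svm_rnd:.3f}")
    ```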

  4. Addressing selection bias in dental health services research.

    PubMed

    Lee, J Y; Rozier, R G; Norton, E C; Vann, W F

    2005-10-01

    When randomization is not possible, researchers must control for non-random assignment to experimental groups. One technique for statistical adjustment for non-random assignment is the use of a two-stage analytical technique. The purpose of this study was to demonstrate the use of this technique to control for selection bias in examining the effects of the Supplemental Program for Women, Infants, and Children (WIC) on dental visits. From 5 data sources, an analysis file was constructed for 49,512 children ages 1-5 years. The two-stage technique was used to control for selection bias in WIC participation, the potentially endogenous variable. Specification tests showed that WIC participation was not random and that selection bias was present. The estimated effects of WIC on dental use differed by 36% after adjustment for selection bias by means of the two-stage technique. This technique can be used to control for potential selection bias in dental research when randomization is not possible.
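    One standard way to implement such a two-stage correction is a Heckman-style control function: a first-stage probit for participation whose generalized residual is added to the second-stage outcome model. The sketch below uses simulated data and illustrative variable names; the study's exact specification may differ:

    ```python
    import numpy as np
    import statsmodels.api as sm
    from scipy.stats import norm

    rng = np.random.default_rng(10)
    n = 5000

    # Simulated data with selection on unobservables: participants (d=1)
    # differ systematically from non-participants through u.
    z = rng.normal(size=n)                   # instrument affecting enrollment only
    u = rng.normal(size=n)
    d = (0.8 * z + u > 0).astype(float)      # program participation
    y = 1.0 + 0.5 * d + 0.6 * u + rng.normal(size=n)   # outcome shares u with d

    # Naive OLS is biased because d is endogenous.
    naive = sm.OLS(y, sm.add_constant(d)).fit()

    # Stage 1: probit for participation; stage 2: add the generalized residual
    # (inverse-Mills-ratio terms) as a control function.
    probit = sm.Probit(d, sm.add_constant(z)).fit(disp=0)
    xb = probit.fittedvalues                 # linear predictor X*beta
    gen_resid = d * (norm.pdf(xb) / norm.cdf(xb)) \
        - (1 - d) * (norm.pdf(xb) / (1 - norm.cdf(xb)))
    X2 = sm.add_constant(np.column_stack([d, gen_resid]))
    corrected = sm.OLS(y, X2).fit()

    print("naive effect:    ", naive.params[1].round(3))
    print("corrected effect:", corrected.params[1].round(3))
    ```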

  5. Using histograms to introduce randomization in the generation of ensembles of decision trees

    DOEpatents

    Kamath, Chandrika; Cantu-Paz, Erick; Littau, David

    2005-02-22

    A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data; creating a histogram; evaluating a potential split according to some criterion using the histogram; selecting a split point randomly in an interval around the best split; splitting the data; and combining multiple decision trees in ensembles.
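    A minimal sketch of the split-selection step for one feature (the bin count, the Gini impurity criterion, and the width of the randomization interval are assumptions, not the patent's specification):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def _gini(y):
        """Gini impurity of a label vector."""
        _, counts = np.unique(y, return_counts=True)
        p = counts / counts.sum()
        return 1.0 - (p ** 2).sum()

    def histogram_random_split(x, y, n_bins=32):
        """Pick a split threshold for one feature: build a histogram, find
        the best bin boundary by Gini gain, then draw the actual threshold
        uniformly from an interval around that boundary (the randomization
        step that decorrelates trees in the ensemble)."""
        edges = np.histogram_bin_edges(x, bins=n_bins)
        best_gain, best_edge = -np.inf, None
        parent = _gini(y)
        for e in edges[1:-1]:                     # interior bin boundaries
            left, right = y[x <= e], y[x > e]
            if len(left) == 0 or len(right) == 0:
                continue
            w = len(left) / len(y)
            gain = parent - (w * _gini(left) + (1 - w) * _gini(right))
            if gain > best_gain:
                best_gain, best_edge = gain, e
        # randomize within one bin width centered on the best boundary
        width = edges[1] - edges[0]
        return rng.uniform(best_edge - width / 2, best_edge + width / 2)

    x = rng.normal(size=500)
    y = (x > 0.3).astype(int)
    print("chosen split:", histogram_random_split(x, y))
    ```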

  6. Raman mode random lasing in ZnS-β-carotene random gain media

    NASA Astrophysics Data System (ADS)

    Bingi, Jayachandra; Warrier, Anita R.; Vijayan, C.

    2013-06-01

    Raman mode random lasing is demonstrated in ZnS-β-carotene random gain media at room temperature. A self-assembled random medium is prepared from ZnS sub-micron spheres synthesized by the homogeneous precipitation method. β-Carotene extracted from pale green leaves is embedded in this random medium. The emission band of the ZnS random medium (on excitation at 488 nm) overlaps considerably with that of β-carotene, which functions as a gain medium. Here, the random medium works as a cavity, leading to Raman mode lasing at 517 nm and 527 nm triggered by stimulated resonance Raman scattering.

  7. Physical randomness sources for loophole-free Bell tests

    NASA Astrophysics Data System (ADS)

    Mitchell, Morgan W.

    2016-05-01

    We describe the strategy and physics used to select unpredictable measurement settings in the loophole-free Bell tests reported in [Hensen et al. Nature 2015, Giustina et al. PRL 2015, and Shalm et al. PRL 2015]. We demonstrate direct measurements of laser phase diffusion, a process driven by spontaneous emission, rigorous bounds on the effect of other, less-trusted contributions, and exponential predictability reduction by randomness extraction. As required for the cited experiments, we show the six-sigma bound for the predictability of the basis choices is below 0.001%. C. Abellan et al. PRL 2015.

  8. Thermodynamics of protein folding: a random matrix formulation.

    PubMed

    Shukla, Pragya

    2010-10-20

    The process of protein folding from an unfolded state to a biologically active, folded conformation is governed by many parameters, e.g. the sequence of amino acids, intermolecular interactions, the solvent, temperature and chaperon molecules. Our study, based on random matrix modeling of the interactions, shows, however, that the evolution of statistical measures such as the Gibbs free energy, heat capacity, and entropy is governed by a single parameter. This information can explain the selection of specific folding pathways from an infinite number of possible ways, as well as other folding characteristics observed in computer simulation studies.

  9. A random interacting network model for complex networks

    PubMed Central

    Goswami, Bedartha; Shekatkar, Snehal M.; Rheinwalt, Aljoscha; Ambika, G.; Kurths, Jürgen

    2015-01-01

    We propose a RAndom Interacting Network (RAIN) model to study the interactions between a pair of complex networks. The model involves two major steps: (i) the selection of a pair of nodes, one from each network, based on intra-network node-based characteristics, and (ii) the placement of a link between selected nodes based on the similarity of their relative importance in their respective networks. Node selection is based on a selection fitness function and node linkage is based on a linkage probability defined on the linkage scores of nodes. The model allows us to relate within-network characteristics to between-network structure. We apply the model to the interaction between the USA and Schengen airline transportation networks (ATNs). Our results indicate that two mechanisms: degree-based preferential node selection and degree-assortative link placement are necessary to replicate the observed inter-network degree distributions as well as the observed inter-network assortativity. The RAIN model offers the possibility to test multiple hypotheses regarding the mechanisms underlying network interactions. It can also incorporate complex interaction topologies. Furthermore, the framework of the RAIN model is general and can be potentially adapted to various real-world complex systems. PMID:26657032

  10. Model selection for logistic regression models

    NASA Astrophysics Data System (ADS)

    Duller, Christine

    2012-09-01

    Model selection for logistic regression models decides which of some given potential regressors have an effect and hence should be included in the final model. The second interesting question is whether a certain factor is heterogeneous among some subsets, i.e. whether the model should include a random intercept or not. In this paper these questions will be answered with classical as well as with Bayesian methods. The application shows some results of recent research projects in medicine and business administration.

  11. Theories and Quantification of Thymic Selection

    PubMed Central

    Yates, Andrew J.

    2013-01-01

    The peripheral T cell repertoire is sculpted from prototypic T cells in the thymus bearing randomly generated T cell receptors (TCR) and by a series of developmental and selection steps that remove cells that are unresponsive or overly reactive to self-peptide–MHC complexes. The challenge of understanding how the kinetics of T cell development and the statistics of the selection processes combine to provide a diverse but self-tolerant T cell repertoire has invited quantitative modeling approaches, which are reviewed here. PMID:24550908

  12. Application of stochastic processes in random growth and evolutionary dynamics

    NASA Astrophysics Data System (ADS)

    Oikonomou, Panagiotis

    We study the effect of power-law distributed randomness on the dynamical behavior of processes such as stochastic growth patterns and evolution. First, we examine the geometrical properties of random shapes produced by a generalized stochastic Loewner Evolution driven by a superposition of a Brownian motion and a stable Levy process. The situation is defined by the usual stochastic Loewner Evolution parameter, kappa, as well as alpha, which defines the power-law tail of the stable Levy distribution. We show that the properties of these patterns change qualitatively and singularly at critical values of kappa and alpha. It is reasonable to call such changes "phase transitions". These transitions occur as kappa passes through four and as alpha passes through one. Numerical simulations are used to explore the global scaling behavior of these patterns in each "phase". We show both analytically and numerically that the growth continues indefinitely in the vertical direction for alpha greater than 1, grows logarithmically with time for alpha equal to 1, and saturates for alpha smaller than 1. The probability density has two different scales corresponding to directions along and perpendicular to the boundary. Scaling functions for the probability density are given for various limiting cases. Second, we study the effect of the architecture of biological networks on their evolutionary dynamics. In recent years, studies of the architecture of large networks have unveiled a common topology, called scale-free, in which a majority of the elements are poorly connected except for a small fraction of highly connected components. We ask how networks with distinct topologies can evolve towards a pre-established target phenotype through a process of random mutations and selection. We use networks of Boolean components as a framework to model a large class of phenotypes. Within this approach, we find that homogeneous random networks and scale-free networks exhibit drastically

  13. Extrapolating Weak Selection in Evolutionary Games

    PubMed Central

    Wu, Bin; García, Julián; Hauert, Christoph; Traulsen, Arne

    2013-01-01

    In evolutionary games, reproductive success is determined by payoffs. Weak selection means that even large differences in game outcomes translate into small fitness differences. Many results have been derived using weak selection approximations, in which perturbation analysis facilitates the derivation of analytical results. Here, we ask whether results derived under weak selection are also qualitatively valid for intermediate and strong selection. By “qualitatively valid” we mean that the ranking of strategies induced by an evolutionary process does not change when the intensity of selection increases. For two-strategy games, we show that the ranking obtained under weak selection cannot be carried over to higher selection intensity if the number of players exceeds two. For games with three (or more) strategies, previous examples for multiplayer games have shown that the ranking of strategies can change with the intensity of selection. In particular, rank changes imply that the most abundant strategy at one intensity of selection can become the least abundant for another. We show that this applies already to pairwise interactions for a broad class of evolutionary processes. Even when both weak and strong selection limits lead to consistent predictions, rank changes can occur for intermediate intensities of selection. To analyze how common such games are, we show numerically that for randomly drawn two-player games with three or more strategies, rank changes frequently occur and their likelihood increases rapidly with the number of strategies. In particular, rank changes are almost certain for games with many strategies, which jeopardizes the predictive power of results derived for weak selection. PMID:24339769

  14. Realization of high performance random laser diodes

    NASA Astrophysics Data System (ADS)

    Yu, S. F.

    2011-03-01

    For the past four decades, extensive studies have been concentrated on understanding the physics of random lasing phenomena in scattering media with optical gain. Although lasing modes can be excited in mirrorless scattering media, their high scattering loss, multi-directional emission, and multimode oscillation have prevented them from being used as practical laser cavities. Furthermore, due to the difficulty of achieving high optical gain under electrical excitation, electrically excited random lasing action was seldom reported. Hence, mirrorless random cavities have never been used to realize lasers for practical applications -- CD, DVD, pico-projector, etc. Nowadays, studies of random lasing are still limited to scientific research. Recently, the difficulty of achieving 'battery driven' random laser diodes has been overcome by using nano-structured ZnO as the random medium and the careful design of heterojunctions. This led to the first demonstration of room-temperature electrically pumped random lasing action under continuous-wave and pulsed operation. In this presentation, we propose to realize an array of quasi-one-dimensional ZnO random laser diodes. We can show that if the laser array is manipulated such that every individual random laser is coupled laterally to, and locked in a particular phase relationship with, its adjacent neighbor, the laser array can obtain coherent addition of random modes. Hence, output power can be multiplied and only one lasing mode will be supported due to the repulsion characteristics of random modes. This work was supported by HK PolyU grant no. 1-ZV6X.

  15. RARtool: A MATLAB Software Package for Designing Response-Adaptive Randomized Clinical Trials with Time-to-Event Outcomes.

    PubMed

    Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee

    2015-08-01

    Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool, a user interface software developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature.

  16. [Analysis of population stratification using random SNPs in genome-wide association studies].

    PubMed

    Cao, Zong-Fu; Ma, Chuan-Xiang; Wang, Lei; Cai, Bin

    2010-09-01

    Since population genetic structure can increase the false-positive rate in genome-wide association studies (GWAS) of complex diseases, the effect of population stratification should be taken into account in GWAS. However, the effectiveness of randomly selected SNPs for population stratification analysis is undetermined. In this study, based on the genotype data generated on the Genome-Wide Human SNP Array 6.0 from unrelated individuals of HapMap Phase 2, we randomly selected SNPs that were evenly distributed across the whole genome, and acquired Ancestry Informative Markers (AIMs) by the method of f value and allelic Fisher exact test. F-statistics and STRUCTURE analysis based on the different sets of selected SNPs were used to evaluate their ability to distinguish the populations of HapMap Phase 3. We found that randomly selected SNPs that were evenly distributed across the whole genome were able to identify the population structure. This study further indicated that more than 3,000 such randomly selected SNPs can substitute for AIMs in population stratification analysis when no AIMs are available for specific populations.

  17. High-Tg Polynorbornene-Based Block and Random Copolymers for Butanol Pervaporation Membranes

    NASA Astrophysics Data System (ADS)

    Register, Richard A.; Kim, Dong-Gyun; Takigawa, Tamami; Kashino, Tomomasa; Burtovyy, Oleksandr; Bell, Andrew

    Vinyl addition polymers of substituted norbornene (NB) monomers possess desirably high glass transition temperatures (Tg); however, until very recently, the lack of an applicable living polymerization chemistry had precluded the synthesis of such polymers with controlled architecture, or copolymers with controlled sequence distribution. We have recently synthesized block and random copolymers of NB monomers bearing hydroxyhexafluoroisopropyl and n-butyl substituents (HFANB and BuNB) via living vinyl addition polymerization with Pd-based catalysts. Both series of polymers were cast into the selective skin layers of thin film composite (TFC) membranes, and these organophilic membranes were investigated for the isolation of n-butanol from dilute aqueous solution (model fermentation broth) via pervaporation. The block copolymers show well-defined microphase-separated morphologies, both in bulk and as the selective skin layers on TFC membranes, while the random copolymers are homogeneous. Both block and random vinyl addition copolymers are effective as n-butanol pervaporation membranes, with the block copolymers showing a better flux-selectivity balance. While polyHFANB has much higher permeability and n-butanol selectivity than polyBuNB, incorporating BuNB units into the polymer (in either a block or random sequence) limits the swelling of the polyHFANB and thereby improves the n-butanol pervaporation selectivity.

  18. Random generalized linear model: a highly accurate and interpretable ensemble predictor

    PubMed Central

    2013-01-01

    Background Ensemble predictors such as the random forest are known to have superior accuracy but their black-box predictions are difficult to interpret. In contrast, a generalized linear model (GLM) is very interpretable especially when forward feature selection is used to construct the model. However, forward feature selection tends to overfit the data and leads to low predictive accuracy. Therefore, it remains an important research goal to combine the advantages of ensemble predictors (high accuracy) with the advantages of forward regression modeling (interpretability). To address this goal several articles have explored GLM based ensemble predictors. Since limited evaluations suggested that these ensemble predictors were less accurate than alternative predictors, they have found little attention in the literature. Results Comprehensive evaluations involving hundreds of genomic data sets, the UCI machine learning benchmark data, and simulations are used to give GLM based ensemble predictors a new and careful look. A novel bootstrap aggregated (bagged) GLM predictor that incorporates several elements of randomness and instability (random subspace method, optional interaction terms, forward variable selection) often outperforms a host of alternative prediction methods including random forests and penalized regression models (ridge regression, elastic net, lasso). This random generalized linear model (RGLM) predictor provides variable importance measures that can be used to define a “thinned” ensemble predictor (involving few features) that retains excellent predictive accuracy. Conclusion RGLM is a state of the art predictor that shares the advantages of a random forest (excellent predictive accuracy, feature importance measures, out-of-bag estimates of accuracy) with those of a forward selected generalized linear model (interpretability). These methods are implemented in the freely available R software package randomGLM. PMID:23323760
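    A pared-down sketch of the bagged-GLM idea using scikit-learn (bootstrap plus random feature subspace, aggregated by averaging probabilities); the published RGLM additionally performs forward variable selection and optional interaction terms, and is distributed as the R package randomGLM:

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(6)
    X, y = make_classification(n_samples=500, n_features=30, n_informative=5,
                               random_state=0)

    # Each bag: a bootstrap sample of rows plus a random subspace of columns,
    # fit with a logistic GLM (sizes here are illustrative assumptions).
    models, subspaces = [], []
    for b in range(50):
        rows = rng.integers(0, len(y), size=len(y))            # bootstrap sample
        cols = rng.choice(X.shape[1], size=10, replace=False)  # random subspace
        m = LogisticRegression(max_iter=1000).fit(X[rows][:, cols], y[rows])
        models.append(m)
        subspaces.append(cols)

    # Aggregate the ensemble by averaging predicted probabilities.
    proba = np.mean([m.predict_proba(X[:, c])[:, 1]
                     for m, c in zip(models, subspaces)], axis=0)
    print("training accuracy:", ((proba > 0.5) == y).mean())
    ```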

  19. Color Charts, Esthetics, and Subjective Randomness

    ERIC Educational Resources Information Center

    Sanderson, Yasmine B.

    2012-01-01

    Color charts, or grids of evenly spaced multicolored dots or squares, appear in the work of modern artists and designers. Often the artist/designer distributes the many colors in a way that could be described as "random," that is, without an obvious pattern. We conduct a statistical analysis of 125 "random-looking" art and design color charts and…

  20. 49 CFR 655.45 - Random testing.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., DEPARTMENT OF TRANSPORTATION PREVENTION OF ALCOHOL MISUSE AND PROHIBITED DRUG USE IN TRANSIT OPERATIONS Types... employees; the random alcohol testing rate shall be 10 percent. As provided in paragraph (b) of this section... increase or decrease the minimum annual percentage rate for random drug and alcohol testing is...

  1. Quantum random walks and decision making.

    PubMed

    Shankar, Karthik H

    2014-01-01

    How realistic is it to adopt a quantum random walk model to account for decisions involving two choices? Here, we discuss the neural plausibility and the effect of initial state and boundary thresholds on such a model and contrast it with various features of the classical random walk model of decision making.

  2. Non-Hermitian Euclidean random matrix theory.

    PubMed

    Goetschy, A; Skipetrov, S E

    2011-07-01

    We develop a theory for the eigenvalue density of arbitrary non-Hermitian Euclidean matrices. Closed equations for the resolvent and the eigenvector correlator are derived. The theory is applied to the random Green's matrix relevant to wave propagation in an ensemble of pointlike scattering centers. This opens a new perspective in the study of wave diffusion, Anderson localization, and random lasing.

  3. Random Assignment: Practical Considerations from Field Experiments.

    ERIC Educational Resources Information Center

    Dunford, Franklyn W.

    1990-01-01

    Seven qualitative issues associated with randomization that have the potential to weaken or destroy otherwise sound experimental designs are reviewed and illustrated via actual field experiments. Issue areas include ethics and legality, liability risks, manipulation of randomized outcomes, hidden bias, design intrusiveness, case flow, and…

  4. The Design of Cluster Randomized Crossover Trials

    ERIC Educational Resources Information Center

    Rietbergen, Charlotte; Moerbeek, Mirjam

    2011-01-01

    The inefficiency induced by between-cluster variation in cluster randomized (CR) trials can be reduced by implementing a crossover (CO) design. In a simple CO trial, each subject receives each treatment in random order. A powerful characteristic of this design is that each subject serves as its own control. In a CR CO trial, clusters of subjects…

  5. Random numbers certified by Bell's theorem.

    PubMed

    Pironio, S; Acín, A; Massar, S; de la Giroday, A Boyer; Matsukevich, D N; Maunz, P; Olmschenk, S; Hayes, D; Luo, L; Manning, T A; Monroe, C

    2010-04-15

    Randomness is a fundamental feature of nature and a valuable resource for applications ranging from cryptography and gambling to numerical simulation of physical and biological systems. Random numbers, however, are difficult to characterize mathematically, and their generation must rely on an unpredictable physical process. Inaccuracies in the theoretical modelling of such processes or failures of the devices, possibly due to adversarial attacks, limit the reliability of random number generators in ways that are difficult to control and detect. Here, inspired by earlier work on non-locality-based and device-independent quantum information processing, we show that the non-local correlations of entangled quantum particles can be used to certify the presence of genuine randomness. It is thereby possible to design a cryptographically secure random number generator that does not require any assumption about the internal working of the device. Such a strong form of randomness generation is impossible classically and possible in quantum systems only if certified by a Bell inequality violation. We carry out a proof-of-concept demonstration of this proposal in a system of two entangled atoms separated by approximately one metre. The observed Bell inequality violation, featuring near perfect detection efficiency, guarantees that 42 new random numbers are generated with 99 per cent confidence. Our results lay the groundwork for future device-independent quantum information experiments and for addressing fundamental issues raised by the intrinsic randomness of quantum theory.

  6. Effect Sizes in Cluster-Randomized Designs

    ERIC Educational Resources Information Center

    Hedges, Larry V.

    2007-01-01

    Multisite research designs involving cluster randomization are becoming increasingly important in educational and behavioral research. Researchers would like to compute effect size indexes based on the standardized mean difference to compare the results of cluster-randomized studies (and corresponding quasi-experiments) with other studies and to…

  7. Random Item Generation Is Affected by Age

    ERIC Educational Resources Information Center

    Multani, Namita; Rudzicz, Frank; Wong, Wing Yiu Stephanie; Namasivayam, Aravind Kumar; van Lieshout, Pascal

    2016-01-01

    Purpose: Random item generation (RIG) involves central executive functioning. Measuring aspects of random sequences can therefore provide a simple method to complement other tools for cognitive assessment. We examine the extent to which RIG relates to specific measures of cognitive function, and whether those measures can be estimated using RIG…

  8. Individual Differences Methods for Randomized Experiments

    ERIC Educational Resources Information Center

    Tucker-Drob, Elliot M.

    2011-01-01

    Experiments allow researchers to randomly vary the key manipulation, the instruments of measurement, and the sequences of the measurements and manipulations across participants. To date, however, the advantages of randomized experiments to manipulate both the aspects of interest and the aspects that threaten internal validity have been primarily…

  9. Evaluation of transmittance of selected infrared bands. [of air pollutants

    NASA Technical Reports Server (NTRS)

    Gupta, S. K.; Tiwari, S. N.

    1976-01-01

    Computer programs were developed for evaluating homogeneous path transmittance with line-by-line and quasi-random band model formulations. Spectral transmittances for some selected bands of different gases (CO, N2O, CO2, H2O) were obtained using these programs. Results of theoretical computations are compared with available experimental measurements. Significant errors are observed in the results obtained from a quasi-random band model formulation, indicating that it is inadequate to meet the accuracy requirements for atmospheric work.

  10. Weyl node with random vector potential

    NASA Astrophysics Data System (ADS)

    Sbierski, Björn; Decker, Kevin S. C.; Brouwer, Piet W.

    2016-12-01

    We study Weyl semimetals in the presence of generic disorder, consisting of a random vector potential as well as a random scalar potential. We derive renormalization group flow equations to second order in the disorder strength. These flow equations predict a disorder-induced phase transition between a pseudoballistic weak-disorder phase and a diffusive strong-disorder phase for a sufficiently strong random scalar potential or for a pure three-component random vector potential. We verify these predictions using a numerical study of the density of states near the Weyl point and of quantum transport properties at the Weyl point. In contrast, for a pure single-component random vector potential, the diffusive strong-disorder phase is absent.

  11. Statistical bubble localization with random interactions

    NASA Astrophysics Data System (ADS)

    Li, Xiaopeng; Deng, Dong-Ling; Wu, Yang-Le; Das Sarma, S.

    2017-01-01

    We study one-dimensional spinless fermions with random interactions, but without any on-site disorder. We find that random interactions generically stabilize a many-body localized phase, in spite of the completely extended single-particle degrees of freedom. In the large randomness limit, we construct "bubble-neck" eigenstates having a universal area-law entanglement entropy on average, with the number of volume-law states being exponentially suppressed. We argue that this statistical localization is beyond the phenomenological local-integrals-of-motion description of many-body localization. With exact diagonalization, we confirm the robustness of the many-body localized phase at finite randomness by investigating eigenstate properties such as level statistics, entanglement/participation entropies, and nonergodic quantum dynamics. At weak random interactions, the system develops a thermalization transition when the single-particle hopping becomes dominant.

  12. Unbounded random operators and Feynman formulae

    NASA Astrophysics Data System (ADS)

    Orlov, Yu. N.; Sakbaev, V. Zh.; Smolyanov, O. G.

    2016-12-01

    We introduce and study probabilistic interpolations of various quantization methods. To do this, we develop a method for finding the expectations of unbounded random operators on a Hilbert space by averaging (with the help of Feynman formulae) the random one-parameter semigroups generated by these operators (the usual method for finding the expectations of bounded random operators is generally inapplicable to unbounded ones). Although the averaging of families of semigroups generates a function that need not possess the semigroup property, the Chernoff iterates of this function approximate a certain semigroup, whose generator is taken for the expectation of the original random operator. In the case of bounded random operators, this expectation coincides with the ordinary one.

  13. Random walks in the history of life

    PubMed Central

    Cornette, James L.; Lieberman, Bruce S.

    2004-01-01

    The simplest null hypothesis for evolutionary time series is that the observed data follow a random walk. We examined whether aspects of Sepkoski's compilation of marine generic diversity depart from a random walk by using statistical tests from econometrics. Throughout most of the Phanerozoic, the random-walk null hypothesis is not rejected for marine diversity, accumulated origination or accumulated extinction, suggesting that either these variables were correlated with environmental variables that follow a random walk or so many mechanisms were affecting these variables, in different ways, that the resultant trends appear random. The only deviation from this pattern involves rejection of the null hypothesis for roughly the last 75 million years for the diversity and accumulated origination time series. PMID:14684835
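    The random-walk null can be examined with standard econometric unit-root machinery, for example an augmented Dickey-Fuller test (the specific battery of tests used in the paper is not reproduced here):

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import adfuller

    rng = np.random.default_rng(7)

    # Two toy "diversity" series: a pure random walk and a mean-reverting AR(1).
    walk = np.cumsum(rng.normal(size=500))
    ar1 = np.zeros(500)
    for t in range(1, 500):
        ar1[t] = 0.5 * ar1[t - 1] + rng.normal()

    # Augmented Dickey-Fuller: H0 = unit root (random walk).  Failing to
    # reject, as for `walk`, is consistent with the random-walk null used
    # for the diversity curves.
    for name, series in [("random walk", walk), ("AR(1)", ar1)]:
        stat, pvalue, *_ = adfuller(series)
        print(f"{name}: ADF p-value = {pvalue:.3f}")
    ```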

  14. A new derivation of the randomness parameter

    NASA Astrophysics Data System (ADS)

    Wang, Hongyun

    2007-10-01

    For a stochastic stepper that can only step forward, there are two randomnesses: (1) the randomness in the cycle time and (2) the randomness in the number of steps (cycles) over long times. The equivalence between these two randomnesses was previously established using the Laplace transform approach [M. J. Schnitzer and S. M. Block, "Statistical kinetics of processive enzymes," Cold Spring Harbor Symp. Quant. Biol. 60, 793 (1995)]. In this study, we first discuss the problems this approach encounters when the cycle-time distribution has a discrete component, and then present a new derivation based on the framework of semi-Markov processes with age structure. We also show that the equivalence between the two randomnesses depends on the existence of the first moment of the waiting time for completing the first cycle, which is strongly affected by the initial age distribution. Therefore, any derivation that concludes the equivalence categorically, regardless of the initial age distribution, is mathematically questionable.
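
    For reference, the two randomness measures under discussion are conventionally written as below (Schnitzer-Block-style notation; the symbols are assumed here):

    ```latex
    r_{\text{steps}} \;=\; \lim_{t \to \infty}
        \frac{\langle n(t)^{2} \rangle - \langle n(t) \rangle^{2}}{\langle n(t) \rangle},
    \qquad
    r_{\text{time}} \;=\;
        \frac{\langle \tau^{2} \rangle - \langle \tau \rangle^{2}}{\langle \tau \rangle^{2}},
    ```

    where n(t) counts completed cycles up to time t and τ is the cycle time; the paper's point is that the equality r_steps = r_time holds only under conditions on the initial age distribution and the first moment of the first-cycle waiting time.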

  15. Quantum walks with random phase shifts

    SciTech Connect

    Kosik, Jozef; Buzek, Vladimir; Hillery, Mark

    2006-08-15

    We investigate quantum walks in multiple dimensions with different quantum coins. We augment the model by assuming that at each step the amplitudes of the coin state are multiplied by random phases. This model enables us to study in detail the role of decoherence in quantum walks and to investigate the quantum-to-classical transition. We also provide classical analogs of the quantum random walks studied. Interestingly enough, it turns out that the classical counterparts of some quantum random walks are classical random walks with a memory and a biased coin. In addition, random phase shifts 'simplify' the dynamics (the cross-interference terms of different paths vanish on average) and enable us to give a compact formula for the dispersion of such walks.
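
    The quantum-to-classical transition described here is easy to reproduce numerically; in the Python sketch below (Hadamard coin, fully random phases at every step; lattice size and step count are arbitrary choices), the position variance grows linearly with the number of steps, as for a classical walk, instead of quadratically as for a coherent quantum walk.

    ```python
    # 1-D coined quantum walk with random phase shifts at each step.
    import numpy as np

    N, steps = 201, 80
    rng = np.random.default_rng(2)
    psi = np.zeros((N, 2), dtype=complex)
    psi[N // 2, 0] = 1.0                            # walker starts at the center
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard coin

    for _ in range(steps):
        psi = psi @ H.T                             # coin toss at every site
        psi *= np.exp(1j * rng.uniform(0, 2 * np.pi, (N, 2)))  # random phases
        shifted = np.zeros_like(psi)
        shifted[1:, 0] = psi[:-1, 0]                # coin 0 moves right
        shifted[:-1, 1] = psi[1:, 1]                # coin 1 moves left
        psi = shifted

    p = (np.abs(psi) ** 2).sum(axis=1)
    x = np.arange(N) - N // 2
    print("position variance =", float((p * x**2).sum()))  # ~ steps (classical-like)
    ```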

  16. Self-Testing Quantum Random Number Generator

    NASA Astrophysics Data System (ADS)

    Lunghi, Tommaso; Brask, Jonatan Bohr; Lim, Charles Ci Wen; Lavigne, Quentin; Bowles, Joseph; Martin, Anthony; Zbinden, Hugo; Brunner, Nicolas

    2015-04-01

    The generation of random numbers is a task of paramount importance in modern science. A central problem for both classical and quantum randomness generation is to estimate the entropy of the data generated by a given device. Here we present a protocol for self-testing quantum random number generation, in which the user can monitor the entropy in real time. Based on a few general assumptions, our protocol guarantees continuous generation of high quality randomness, without the need for a detailed characterization of the devices. Using a fully optical setup, we implement our protocol and illustrate its self-testing capacity. Our work thus provides a practical approach to quantum randomness generation in a scenario of trusted but error-prone devices.

  17. What is the difference between missing completely at random and missing at random?

    PubMed

    Bhaskaran, Krishnan; Smeeth, Liam

    2014-08-01

    The terminology describing missingness mechanisms is confusing. In particular, the meaning of 'missing at random' is often misunderstood, leading researchers faced with missing-data problems away from multiple imputation, a method with considerable advantages. The purpose of this article is to clarify how 'missing at random' differs from 'missing completely at random' via an imagined dialogue between a clinical researcher and a statistician.
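
    A small simulation makes the distinction concrete. In this Python sketch (variable names and effect sizes are invented for illustration), missingness that depends only on an observed covariate (MAR) biases a complete-case mean, while MCAR missingness does not; under MAR, imputation using the observed covariate can recover the truth.

    ```python
    # MCAR vs MAR on synthetic data (all names/parameters are assumptions).
    import numpy as np

    rng = np.random.default_rng(3)
    n = 10_000
    age = rng.normal(50, 10, n)                      # fully observed covariate
    bp = 100 + 0.5 * age + rng.normal(0, 5, n)       # variable subject to missingness

    mcar = rng.random(n) < 0.3                       # missingness ignores all data
    mar = rng.random(n) < 1 / (1 + np.exp(-(age - 50) / 5))  # depends on age only

    print("true mean:               ", bp.mean())
    print("complete-case mean, MCAR:", bp[~mcar].mean())  # ~unbiased
    print("complete-case mean, MAR: ", bp[~mar].mean())   # biased downward
    ```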

  18. Empirical power and sample size calculations for cluster-randomized and cluster-randomized crossover studies.

    PubMed

    Reich, Nicholas G; Myers, Jessica A; Obeng, Daniel; Milstone, Aaron M; Perl, Trish M

    2012-01-01

    In recent years, the number of studies using a cluster-randomized design has grown dramatically. In addition, the cluster-randomized crossover design has been touted as a methodological advance that can increase the efficiency of cluster-randomized studies in certain situations. While the cluster-randomized crossover trial has become a popular tool, standards of design, analysis, reporting and implementation have not been established for this emerging design. We address one particular aspect of cluster-randomized and cluster-randomized crossover trial design: estimating statistical power. We present a general framework for estimating power via simulation in cluster-randomized studies with or without one or more crossover periods. We have implemented this framework in the clusterPower software package for R, freely available online from the Comprehensive R Archive Network. Our simulation framework is easy to implement and users may customize the methods used for data analysis. We give four examples of using the software in practice. The clusterPower package could play an important role in the design of future cluster-randomized and cluster-randomized crossover studies. This work is the first to establish a universal method for calculating power for both cluster-randomized and cluster-randomized crossover clinical trials. More research is needed to develop standardized and recommended methodology for cluster-randomized crossover studies.
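
    The simulation-based approach can be sketched compactly. The toy Python example below mirrors the general framework (it is not the clusterPower R API, and every design parameter is an assumption): simulate the cluster-randomized trial many times, analyze each replicate with a cluster-level t test, and report the rejection rate as estimated power.

    ```python
    # Simulation-based power for a two-arm parallel cluster-randomized trial.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)

    def simulated_power(n_clusters=20, n_per=30, effect=0.4, icc=0.05,
                        sims=500, alpha=0.05):
        sigma_b, sigma_w = np.sqrt(icc), np.sqrt(1 - icc)  # between/within SDs
        hits = 0
        for _ in range(sims):
            arm = rng.permutation(np.repeat([0, 1], n_clusters // 2))
            re = rng.normal(0, sigma_b, n_clusters)        # cluster random effects
            means = np.array([(effect * arm[c] + re[c]
                               + rng.normal(0, sigma_w, n_per)).mean()
                              for c in range(n_clusters)])
            _, p = stats.ttest_ind(means[arm == 1], means[arm == 0])
            hits += p < alpha
        return hits / sims

    print("estimated power:", simulated_power())
    ```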

  19. Estimating the Causal Effect of Randomization versus Treatment Preference in a Doubly Randomized Preference Trial

    ERIC Educational Resources Information Center

    Marcus, Sue M.; Stuart, Elizabeth A.; Wang, Pei; Shadish, William R.; Steiner, Peter M.

    2012-01-01

    Although randomized studies have high internal validity, generalizability of the estimated causal effect from randomized clinical trials to real-world clinical or educational practice may be limited. We consider the implication of randomized assignment to treatment, as compared with choice of preferred treatment as it occurs in real-world…

  20. A Secure LFSR Based Random Measurement Matrix for Compressive Sensing

    NASA Astrophysics Data System (ADS)

    George, Sudhish N.; Pattathil, Deepthi P.

    2014-11-01

    In this paper, a novel approach for generating the secure measurement matrix for compressive sensing (CS) based on a linear feedback shift register (LFSR) is presented. The basic idea is to select the different states of the LFSR as the random entries of the measurement matrix and normalize these values to get independent and identically distributed (i.i.d.) random variables with zero mean and variance 1/N, where N is the number of input samples. The initial seed of the LFSR system acts as the user's key and provides security. Since the measurement matrix is generated from the LFSR system, the memory overhead of storing the measurement matrix is avoided. Moreover, the proposed system provides security while maintaining the robustness to noise of the CS system. The proposed system is validated through different block-based CS techniques for images. To enhance security, the different blocks of images are measured with different measurement matrices so that the proposed encryption system can withstand known-plaintext attacks. A modulo division circuit is used to reseed the LFSR system to generate multiple random measurement matrices, whereby after each fundamental period of the LFSR, the feedback polynomial of the modulo circuit is modified in terms of a chaotic value. The proposed secure robust CS paradigm for images is subjected to several forms of attack and proves resistant to them. Experimental analysis shows that the proposed system provides better performance than its counterparts.
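
    The core idea can be sketched as follows in Python (the tap polynomial, register length, and seed are placeholders rather than the paper's choices, and the chaotic reseeding circuit is omitted): successive LFSR states are normalized into i.i.d.-like entries and reshaped into the measurement matrix.

    ```python
    # LFSR-state-driven measurement matrix for compressive sensing (sketch).
    import numpy as np

    def lfsr_states(seed, taps=(16, 14, 13, 11), nbits=16, count=1024):
        """Collect successive states of a Fibonacci LFSR over GF(2)."""
        mask = (1 << nbits) - 1
        state = seed & mask
        out = np.empty(count, dtype=np.int64)
        for k in range(count):
            out[k] = state
            fb = 0
            for t in taps:                        # XOR of the tap bits
                fb ^= (state >> (t - 1)) & 1
            state = ((state >> 1) | (fb << (nbits - 1))) & mask
        return out

    M, N = 64, 256                                # measurements x input samples
    raw = lfsr_states(seed=0xACE1, count=M * N).astype(float)
    vals = (raw - raw.mean()) / raw.std()         # ~ zero mean, unit variance
    Phi = vals.reshape(M, N) / np.sqrt(N)         # entries with variance ~ 1/N
    y = Phi @ np.ones(N)                          # compressive measurement of a toy signal
    ```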

  1. Random Subspace Aggregation for Cancer Prediction with Gene Expression Profiles

    PubMed Central

    Yuan, Xiguo; Zhang, Junying

    2016-01-01

    Background. Precisely predicting cancer is crucial for cancer treatment. Gene expression profiles make it possible to analyze patterns between genes and cancers on the genome-wide scale. Gene expression data analysis, however, is confronted with enormous challenges owing to characteristics such as high dimensionality, small sample size, and low signal-to-noise ratio. Results. This paper proposes a method, termed RS_SVM, to predict cancer from gene expression profiles by aggregating SVMs trained on random subspaces. After choosing gene features through statistical analysis, RS_SVM randomly selects feature subsets to yield random subspaces, trains SVM classifiers on them accordingly, and then aggregates the SVM classifiers to capture the advantage of ensemble learning. Experiments on eight real gene expression datasets were performed to validate the RS_SVM method. Experimental results show that RS_SVM achieved better classification accuracy and generalization performance than single SVM, K-nearest neighbor, decision tree, Bagging, AdaBoost, and other state-of-the-art methods. Experiments also explored the effect of subspace size on prediction performance. Conclusions. The proposed RS_SVM method yielded superior performance in analyzing gene expression profiles, demonstrating that RS_SVM is well suited to such biological data. PMID:27999797
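
    A hedged Python sketch of the random-subspace aggregation step (the ensemble size, subspace size, and synthetic data are assumptions, not the paper's tuned values):

    ```python
    # Random-subspace SVM aggregation (RS_SVM-style sketch).
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=100, n_features=2000,
                               n_informative=50, random_state=0)
    rng = np.random.default_rng(0)

    ensemble = []
    for _ in range(25):                            # 25 random subspaces
        feats = rng.choice(X.shape[1], size=100, replace=False)
        clf = SVC(kernel="linear").fit(X[:, feats], y)
        ensemble.append((feats, clf))

    def predict(X_new):
        votes = np.mean([clf.predict(X_new[:, f]) for f, clf in ensemble], axis=0)
        return (votes >= 0.5).astype(int)          # majority vote

    print("training accuracy:", (predict(X) == y).mean())
    ```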

  2. Repeated randomization and matching in multi-arm trials.

    PubMed

    Xu, Zhenzhen; Kalbfleisch, John D

    2013-12-01

    Cluster randomized trials with relatively few clusters have been widely used in recent years for evaluation of health-care strategies. The balance match weighted (BMW) design, introduced in Xu and Kalbfleisch (2010, Biometrics 66, 813-823), applies the optimal full matching with constraints technique to a prospective randomized design with the aim of minimizing the mean squared error (MSE) of the treatment effect estimator. This is accomplished through consideration of M independent randomizations of the experimental units and then selecting the one which provides the most balance evaluated by matching on the estimated propensity scores. Often in practice, clinical trials may involve more than two treatment arms and multiple treatment options need to be evaluated. Therefore, we consider extensions of the BMW propensity score matching method to allow for studies with three or more arms. In this article, we propose three approaches to extend the BMW design to clinical trials with more than two arms and evaluate the property of the extended design in simulation studies.
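
    The "repeat the randomization, keep the most balanced one" mechanic extends naturally to three arms. The toy Python sketch below uses a crude covariate-mean imbalance criterion in place of the paper's propensity-score optimal matching (the sizes and the criterion itself are assumptions):

    ```python
    # Select the most balanced of M candidate randomizations (three arms).
    import numpy as np

    rng = np.random.default_rng(5)
    X = rng.normal(size=(30, 4))                   # 30 units, 4 baseline covariates
    arms, M = 3, 1000

    def imbalance(assign):
        means = np.stack([X[assign == a].mean(axis=0) for a in range(arms)])
        return np.abs(means - means.mean(axis=0)).max()  # worst covariate deviation

    candidates = (rng.permutation(np.repeat(np.arange(arms), len(X) // arms))
                  for _ in range(M))
    best = min(candidates, key=imbalance)
    print("imbalance of selected assignment:", imbalance(best))
    ```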

  3. Use of group-randomized trials in pet population research.

    PubMed

    Lord, L K; Wittum, T E; Scarlett, J M

    2007-12-14

    Communities invest considerable resources to address the animal welfare and public health concerns resulting from unwanted pet animals. Traditionally, research in this area has enumerated the pet-owning population, described pet population dynamics in individual communities, and estimated national euthanasia figures. Recent research has investigated the human-animal bond and explored the community implications of managed feral cat colonies. These reports have utilized traditional epidemiologic study designs to generate observational data to describe populations and measure associations. However, rigorous scientific evaluations of potential interventions at the group level have been lacking. Group-randomized trials have been used extensively in public health research to evaluate interventions that change a population's behavior, not just the behavior of selected individuals. We briefly describe the strengths and limitations of group-randomized trials as they are used to evaluate interventions that promote social and behavioral changes in the human public health field. We extend these examples to suggest the appropriate application of group-randomized trials for pet population dynamics research.

  4. Isotopic Randomness and Maxwell's Demon

    NASA Astrophysics Data System (ADS)

    Berezin, Alexander A.

    2005-03-01

    Isotopic disorder in crystals can lead to suppression of thermal conductivity, mobility variations, and (weak) Anderson localization on isotopic fluctuations. The latter (AAB, J. Chem. Phys., 1984) is akin to the polaron effect (self-localization due to polarization). The possibility of isotopic patterning (IP) increases near the melting point (thermally activated isotopic hopping swaps). A crystal near the melting threshold becomes “informationally sensitive,” as if its IP were operated by some external Maxwell’s Demon, MD (AAB, URAM J, 2002). In this state, short-range (e.g., electrostatic inverse-square) forces evolve into long-range interactions (due to divergence of the order parameter), and the information sensitivity can be further amplified by (say) a single fast electron (e.g., a beta particle from the decay of 14-C or another radioactive isotope), which may result in a cascade of impact-ionization events and a (short time-scale) enhancement of screening by impact-generated non-equilibrium (non-thermal) electrons. In this state, informationally driven (MD-controlled) IP (the Eccles effect) can result in a decrease of positional entropy, signifying the emergence of physical complexity out of pure information, similar to the peculiar “jinni effect” on closed time loops in relativistic cosmology (R. J. Gott, 2001) or Wheeler’s “it from bit” metaphor. By selecting a special IP, the MD modifies the ergodicity principle in favor of information-rich states.

  5. Unexpected substrate specificity of T4 DNA ligase revealed by in vitro selection

    NASA Technical Reports Server (NTRS)

    Harada, Kazuo; Orgel, Leslie E.

    1993-01-01

    We have used in vitro selection techniques to characterize DNA sequences that are ligated efficiently by T4 DNA ligase. We find that the ensemble of selected sequences ligates about 50 times as efficiently as the random mixture of sequences used as the input for selection. Surprisingly, many of the selected sequences failed to produce a match at or close to the ligation junction. None of the 20 selected oligomers that we sequenced produced a match two bases upstream of the ligation junction.

  6. A New Method of Random Environmental Walking for Assessing Behavioral Preferences for Different Lighting Applications

    PubMed Central

    Patching, Geoffrey R.; Rahm, Johan; Jansson, Märit; Johansson, Maria

    2017-01-01

    Accurate assessment of people’s preferences for different outdoor lighting applications is increasingly considered important in the development of new urban environments. Here a new method of random environmental walking is proposed to complement current methods of assessing urban lighting applications, such as self-report questionnaires. The procedure involves participants repeatedly walking between different lighting applications by random selection of a lighting application and preferred choice or by random selection of a lighting application alone. In this manner, participants are exposed to all lighting applications of interest more than once and participants’ preferences for the different lighting applications are reflected in the number of times they walk to each lighting application. On the basis of an initial simulation study, to explore the feasibility of this approach, a comprehensive field test was undertaken. The field test included random environmental walking and collection of participants’ subjective ratings of perceived pleasantness (PP), perceived quality, perceived strength, and perceived flicker of four lighting applications. The results indicate that random environmental walking can reveal participants’ preferences for different lighting applications that, in the present study, conformed to participants’ ratings of PP and perceived quality of the lighting applications. As a complement to subjectively stated environmental preferences, random environmental walking has the potential to expose behavioral preferences for different lighting applications. PMID:28337163

  8. Inverse probability weighting for covariate adjustment in randomized studies.

    PubMed

    Shen, Changyu; Li, Xiaochun; Li, Lingling

    2014-02-20

    Covariate adjustment in randomized clinical trials has the potential benefit of precision gain. It also has the potential pitfall of reduced objectivity, as it opens the possibility of selecting a 'favorable' model that yields a strong treatment-benefit estimate. Although there is a large volume of statistical literature targeting the first aspect, realistic solutions that enforce objective inference and improve precision are rare. As a typical randomized trial needs to accommodate many implementation issues beyond statistical considerations, maintaining objectivity is at least as important as precision gain, if not more so, particularly from the perspective of the regulatory agencies. In this article, we propose a two-stage estimation procedure based on inverse probability weighting to achieve better precision without compromising objectivity. The procedure is designed so that the covariate adjustment is performed before seeing the outcome, effectively reducing the possibility of selecting a 'favorable' model that yields a strong intervention effect. Both theoretical and numerical properties of the estimation procedure are presented, along with an application of the proposed method to a real data example.
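
    A minimal sketch of the two-stage flavour on toy data (all model choices below are assumptions): stage one fits the assignment model from covariates alone, without examining outcomes, to form inverse-probability weights; stage two computes the weighted difference in means.

    ```python
    # Two-stage IPW covariate adjustment in a randomized trial (sketch).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(6)
    n = 2000
    X = rng.normal(size=(n, 3))                    # baseline covariates
    T = rng.integers(0, 2, n)                      # randomized assignment
    Y = 1.0 * T + X @ np.array([0.5, -0.3, 0.2]) + rng.normal(size=n)

    # Stage 1 (outcome-blind): propensity model and weights.
    ps = LogisticRegression().fit(X, T).predict_proba(X)[:, 1]
    w = np.where(T == 1, 1 / ps, 1 / (1 - ps))

    # Stage 2: weighted difference in means.
    effect = (np.average(Y[T == 1], weights=w[T == 1])
              - np.average(Y[T == 0], weights=w[T == 0]))
    print("IPW estimate of the treatment effect:", effect)
    ```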

  9. Randomized Controlled Trial of Sertraline, Prolonged Exposure Therapy and their Combination in OEF/OIF Combat Veterans with PTSD

    DTIC Science & Technology

    2012-12-01

    PTSD include exposure therapy (such as PE) and selective serotonin reuptake inhibitors (SSRIs; such as SERT). To date, there have been no randomized … has selected Don Robinaugh as their Fidelity Rater for the study in 2012Q3. Mr. Robinaugh completed fidelity training in 2012Q4 … Bethany Wangelin

  10. Reduction of display artifacts by random sampling

    NASA Technical Reports Server (NTRS)

    Ahumada, A. J., Jr.; Nagel, D. C.; Watson, A. B.; Yellott, J. I., Jr.

    1983-01-01

    The application of random-sampling techniques to remove visible artifacts (such as flicker, moire patterns, and paradoxical motion) introduced in TV-type displays by discrete sequential scanning is discussed and demonstrated. Sequential-scanning artifacts are described; the window of visibility defined in spatiotemporal frequency space by Watson and Ahumada (1982 and 1983) and Watson et al. (1983) is explained; the basic principles of random sampling are reviewed and illustrated by the case of the human retina; and it is proposed that the sampling artifacts can be replaced by random noise, which can then be shifted to frequency-space regions outside the window of visibility. Vertical sequential, single-random-sequence, and continuously renewed random-sequence plotting displays generating 128 points at update rates up to 130 Hz are applied to images of stationary and moving lines, and best results are obtained with the single random sequence for the stationary lines and with the renewed random sequence for the moving lines.

  11. Random dynamics of the Morris-Lecar neural model.

    PubMed

    Tateno, Takashi; Pakdaman, Khashayar

    2004-09-01

    Determining the response characteristics of neurons to fluctuating, noise-like inputs similar to realistic stimuli is essential for understanding neuronal coding. This study addresses this issue by providing a random dynamical system analysis of the Morris-Lecar neural model driven by a white Gaussian noise current. Depending on parameter selection, the deterministic Morris-Lecar model can be considered a canonical prototype for widely encountered classes of neuronal membranes, referred to as class I and class II membranes. In both classes, the transitions from excitable to oscillating regimes are associated with different bifurcation scenarios. This work examines how random perturbations affect these two bifurcation scenarios. It is first shown numerically that the Morris-Lecar model driven by white Gaussian noise current tends to have a unique stationary distribution in the phase space. Numerical evaluations also reveal quantitative and qualitative changes in this distribution in the vicinity of the bifurcations of the deterministic system. These changes notwithstanding, our numerical simulations show that the Lyapunov exponents of the system remain negative in these parameter regions, indicating that no dynamical stochastic bifurcations take place. Moreover, our numerical simulations confirm that, regardless of the asymptotic dynamics of the deterministic system, the random Morris-Lecar model stabilizes at a unique stationary stochastic process. In terms of random dynamical system theory, our analysis shows that additive noise destroys the above-mentioned bifurcation sequences that characterize the class I and class II regimes in the Morris-Lecar model. The interpretation of this result in terms of neuronal coding is that, despite the differences in the deterministic dynamics of class I and class II membranes, their responses to noise-like stimuli present a reliable feature.
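
    For concreteness, the noise-driven model can be integrated with an Euler-Maruyama scheme; in this Python sketch the parameter set (a common class-I-style choice) and the noise strength are assumptions.

    ```python
    # Morris-Lecar membrane driven by white Gaussian noise current (sketch).
    import numpy as np

    C, gL, gCa, gK = 20.0, 2.0, 4.0, 8.0          # capacitance and conductances
    VL, VCa, VK = -60.0, 120.0, -84.0             # reversal potentials
    V1, V2, V3, V4, phi = -1.2, 18.0, 12.0, 17.4, 0.067
    I0, sigma = 40.0, 5.0                          # mean input, noise strength
    dt, T = 0.05, 2000.0

    rng = np.random.default_rng(7)
    steps = int(T / dt)
    V, w = -60.0, 0.0
    trace = np.empty(steps)
    for k in range(steps):
        minf = 0.5 * (1 + np.tanh((V - V1) / V2))
        winf = 0.5 * (1 + np.tanh((V - V3) / V4))
        tauw = 1 / np.cosh((V - V3) / (2 * V4))
        I = I0 + sigma * rng.normal() / np.sqrt(dt)      # white-noise current
        dV = (I - gL*(V - VL) - gCa*minf*(V - VCa) - gK*w*(V - VK)) / C
        dw = phi * (winf - w) / tauw
        V, w = V + dV * dt, w + dw * dt
        trace[k] = V
    print(f"mean V = {trace.mean():.1f} mV, std V = {trace.std():.1f} mV")
    ```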

  12. Robust automated lymph node segmentation with random forests

    NASA Astrophysics Data System (ADS)

    Allen, David; Lu, Le; Yao, Jianhua; Liu, Jiamin; Turkbey, Evrim; Summers, Ronald M.

    2014-03-01

    Enlarged lymph nodes may indicate the presence of illness; identification and measurement of lymph nodes therefore provide essential biomarkers for diagnosing disease. Accurate automatic detection and measurement of lymph nodes can assist radiologists, improving repeatability and quality assurance, but the task is challenging because lymph nodes are often very small and have highly variable shapes. In this paper, we propose to tackle this problem via supervised statistical learning-based robust voxel labeling, specifically the random forest algorithm. A random forest employs an ensemble of decision trees trained on labeled multi-class data to recognize data features, and it is adopted here to handle low-level image features sampled and extracted from 3D medical scans. We exploit three types of image features (intensity, order-1 contrast, and order-2 contrast) and evaluate their effectiveness in a random forest feature-selection setting. The trained forest can then be applied to unseen data by voxel scanning via sliding windows (11×11×11), assigning a class label and class-conditional probability to each unlabeled voxel at the center of the window. Voxels from the manually annotated lymph nodes in a CT volume are treated as the positive class; background non-lymph-node voxels as negatives. We show that the random forest algorithm can be adapted to perform the voxel labeling task accurately and efficiently. The experimental results are very promising, with AUCs (area under curve) of the training and validation ROC (receiver operating characteristic) of 0.972 and 0.959, respectively. The visualized voxel labeling results also confirm the validity of the approach.
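
    The voxel-labeling pipeline can be miniaturized as follows. This Python sketch uses a synthetic volume and simplified window statistics standing in for the paper's three feature types; the 11×11×11 window is the one quoted above, everything else is an assumption.

    ```python
    # Random-forest voxel labeling with sliding-window features (sketch).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(8)
    vol = rng.normal(size=(40, 40, 40))            # stand-in CT volume
    mask = np.zeros(vol.shape, dtype=bool)         # stand-in lymph-node annotation
    mask[18:24, 18:24, 18:24] = True

    def features(idx, r=5):                        # 11x11x11 window (r = 5)
        x, y, z = idx
        win = vol[x-r:x+r+1, y-r:y+r+1, z-r:z+r+1]
        return [win.mean(), win.std(), vol[x, y, z]]   # crude intensity/contrast

    coords = [(x, y, z) for x in range(5, 35)
              for y in range(5, 35) for z in range(5, 35)]
    pick = rng.choice(len(coords), 4000, replace=False)
    Xf = np.array([features(coords[i]) for i in pick])
    yf = np.array([mask[coords[i]] for i in pick])     # positive = lymph-node voxel

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xf, yf)
    print("class-conditional probabilities:", clf.predict_proba(Xf[:1]))
    ```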

  13. Demonstrating Natural Selection

    ERIC Educational Resources Information Center

    Hinds, David S.; Amundson, John C.

    1975-01-01

    Describes laboratory exercises with chickens selecting their food from dyed and natural corn kernels as a method of demonstrating natural selection. The procedure is based on the fact that organisms that blend into their surroundings escape predation. (BR)

  14. Experimenting with Apostatic Selection.

    ERIC Educational Resources Information Center

    Allen, J. A.; Cooper, J. M.

    1988-01-01

    Reviewed is some of the experimental evidence for apostatic selection from work with artificial prey. Guidelines for further experiments are suggested including experimental design, analysis, variables, and selection in the wild. (Author/CW)

  15. Semi-device-independent randomness expansion with partially free random sources

    NASA Astrophysics Data System (ADS)

    Zhou, Yu-Qian; Li, Hong-Wei; Wang, Yu-Kun; Li, Dan-Dan; Gao, Fei; Wen, Qiao-Yan

    2015-08-01

    By proposing device-independent protocols, Pironio et al. [Nature (London) 464, 1021 (2010), 10.1038/nature09008] and Colbeck et al. [Nat. Phys. 8, 450 (2012), 10.1038/nphys2300] proved that new randomness can be generated by using perfectly free random sources or partially free ones as a seed. Subsequently, Li et al. [Phys. Rev. A 84, 034301 (2011), 10.1103/PhysRevA.84.034301] studied this topic in the semi-device-independent framework and proved that new randomness can be obtained from perfectly free random sources. Here we discuss whether and how partially free random sources bring us new randomness in a semi-device-independent scenario. We propose a semi-device-independent randomness expansion protocol with partially free random sources and obtain the condition that the partially free random sources should satisfy to generate new randomness. In the course of the analysis, we obtain a two-dimensional quantum witness. Furthermore, we derive the analytic relationship between the generated randomness and the violation of this two-dimensional quantum witness.

  16. The history of random vibrations through 1958

    NASA Astrophysics Data System (ADS)

    Paez, Thomas L.

    2006-11-01

    Interest in the analysis of random vibrations of mechanical systems started to grow about a half century ago in response to the need for a theory that could accurately predict structural response to jet engine noise and missile launch-induced environments. However, the work that enabled development of the theory of random vibrations started about a half century earlier. This paper discusses contributions to the theory of random vibrations from the time of Einstein to the time of an MIT workshop that was organized by Crandall in 1958.

  17. Universality in complex networks: random matrix analysis.

    PubMed

    Bandyopadhyay, Jayendra N; Jalan, Sarika

    2007-08-01

    We apply random matrix theory to complex networks. First, we show that the nearest-neighbor spacing distribution of the eigenvalues of the adjacency matrices of various model networks, namely scale-free, small-world, and random networks, follows the universal Gaussian orthogonal ensemble statistics of random matrix theory. Second, we show an analogy between the onset of small-world behavior, quantified by the structural properties of networks, and the transition from Poisson to Gaussian orthogonal ensemble statistics, quantified by the Brody parameter characterizing a spectral property. We also present our analysis for a protein-protein interaction network in budding yeast.
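
    As a rough illustration of the spacing analysis (the network model and sizes are arbitrary, and a crude moving-average unfolding replaces the careful unfolding such studies use), the Python sketch below compares the unfolded nearest-neighbor spacings of a small-world adjacency spectrum with the GOE Wigner surmise P(s) = (π/2) s exp(-πs²/4).

    ```python
    # Nearest-neighbor spacing statistics of a small-world network spectrum.
    import numpy as np
    import networkx as nx

    G = nx.watts_strogatz_graph(n=1000, k=20, p=0.1, seed=0)
    E = np.sort(np.linalg.eigvalsh(nx.to_numpy_array(G)))

    gaps = np.diff(E)
    win = 51                                         # crude local-unfolding window
    local = np.convolve(gaps, np.ones(win) / win, mode="same")
    s = gaps / local                                 # unfolded spacings, mean ~ 1

    hist, edges = np.histogram(s, bins=30, range=(0, 3), density=True)
    mid = 0.5 * (edges[1:] + edges[:-1])
    goe = (np.pi / 2) * mid * np.exp(-np.pi * mid**2 / 4)   # Wigner surmise
    print("max |empirical - GOE surmise| =", float(np.abs(hist - goe).max()))
    ```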

  18. The defect variance of random spherical harmonics

    NASA Astrophysics Data System (ADS)

    Marinucci, Domenico; Wigman, Igor

    2011-09-01

    The defect of a function f: M → ℝ is defined as the difference between the measure of its positive and negative regions. In this paper, we begin the analysis of the distribution of the defect of random Gaussian spherical harmonics. By an easy argument, the defect is non-trivial only for even degree, and the expected value always vanishes. Our principal result is an evaluation of the defect variance, asymptotically in the high-frequency limit. Like other geometric functionals of random eigenfunctions, the defect may be used as a tool to probe the statistical properties of spherical random fields, a topic of great interest for modern cosmological data analysis.
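
    In symbols (with μ the measure on M; notation assumed):

    ```latex
    D(f) \;=\; \mu\{x : f(x) > 0\} \;-\; \mu\{x : f(x) < 0\}
          \;=\; \int_{M} \operatorname{sgn}\bigl(f(x)\bigr)\, d\mu(x).
    ```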

  19. Fraunhofer diffraction by a random screen.

    PubMed

    Malinka, Aleksey V

    2011-08-01

    The stochastic approach is applied to the problem of Fraunhofer diffraction by a random screen. The diffraction pattern is expressed through the random chord distribution. Two cases are considered: the sparse ensemble, where the interference between different obstacles can be neglected, and the densely packed ensemble, where this interference is to be taken into account. The solution is found for the general case and the analytical formulas are obtained for the Switzer model of a random screen, i.e., for the case of Markov statistics.

  20. Self-assembly of Random Copolymers

    PubMed Central

    Li, Longyu; Raghupathi, Kishore; Song, Cunfeng; Prasad, Priyaa; Thayumanavan, S.

    2014-01-01

    Self-assembly of random copolymers has attracted considerable attention recently. In this feature article, we highlight the use of random copolymers to prepare nanostructures with different morphologies and to prepare nanomaterials that are responsive to single or multiple stimuli. The synthesis of single-chain nanoparticles from random copolymers and their potential applications are also discussed in some detail. We aim to draw more attention to these easily accessible copolymers, which are likely to play an important role in translational polymer research. PMID:25036552