Science.gov

Sample records for selective a1a-blocker randomized

  1. Random Selection for Drug Screening

    SciTech Connect

    Center for Human Reliability Studies

    2007-05-01

    Simple random sampling is generally the starting point for a random sampling process. This sampling technique ensures that each individual within a group (population) has an equal chance of being selected. There are a variety of ways to implement random sampling in a practical situation.
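    The equal-chance property described above is easy to demonstrate in code. Below is a minimal Python sketch (the roster and the function name are invented for illustration, not part of the original report) that draws a drug-screening pool so that each individual has the same probability of selection:

```python
import random

def simple_random_sample(population, k, seed=None):
    """Return k members chosen so that every individual has an equal chance."""
    rng = random.Random(seed)         # a recorded seed makes the draw reproducible
    return rng.sample(population, k)  # uniform sampling without replacement

employees = [f"employee_{i}" for i in range(100)]
screened = simple_random_sample(employees, 10, seed=42)
print(screened)
```

Seeding is optional; in an auditable screening program, recording the seed lets the draw be reproduced and verified later.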

  2. Blocked randomization with randomly selected block sizes.

    PubMed

    Efird, Jimmy

    2011-01-01

    When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
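    The scheme the abstract describes can be sketched in a few lines of Python (a minimal illustration; the arm labels, block-size multiples, and function name are assumptions, not the paper's code). Each block contains an equal number of assignments to every arm, so the allocation stays balanced, while the block size is itself drawn at random so that an unblinded observer cannot predict the tail of a block:

```python
import random

def blocked_randomization(n_participants, arms=("treatment", "control"),
                          block_multiples=(1, 2), seed=None):
    """Assign participants to arms in balanced blocks of randomly chosen size."""
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_participants:
        # Block size is a random multiple of the number of arms (here 2 or 4).
        block_size = rng.choice(block_multiples) * len(arms)
        block = list(arms) * (block_size // len(arms))  # equal count per arm
        rng.shuffle(block)                              # random order within block
        assignments.extend(block)
    return assignments[:n_participants]

schedule = blocked_randomization(20, seed=7)
print(schedule.count("treatment"), schedule.count("control"))
```

Because every completed block is balanced, the two arm counts can differ by at most half of the final, possibly truncated block.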

  3. Blocked Randomization with Randomly Selected Block Sizes

    PubMed Central

    Efird, Jimmy

    2011-01-01

    When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes. PMID:21318011

  4. Random Selection for Drug Screening

    SciTech Connect

    Center for Human Reliability Studies

    2007-05-01

    Sampling is the process of choosing some members out of a group or population. Probability sampling, or random sampling, is the process of selecting members by chance with a known probability of each individual being chosen.

  5. Randomized selection on the GPU

    SciTech Connect

    Monroe, Laura Marie; Wendelberger, Joanne R; Michalak, Sarah E

    2011-01-13

    We implement here a fast and memory-sparing probabilistic top-N selection algorithm on the GPU. To our knowledge, this is the first direct selection algorithm for the GPU in the literature. The algorithm proceeds via a probabilistic guess-and-check process searching for the Nth element. It always gives a correct result and always terminates. The use of randomization reduces the amount of data that needs heavy processing, and so reduces the average time required by the algorithm. Probabilistic Las Vegas algorithms of this kind are a form of stochastic optimization and can be well suited to more general parallel processors with limited amounts of fast memory.
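    The paper's GPU implementation is not reproduced here, but the Las Vegas guess-and-check idea can be sketched on the CPU (hypothetical data; `randomized_select` is an illustrative name, and it selects the n-th smallest rather than the paper's top-N): a random pivot is guessed, the data are partitioned, and the search continues in the side that must contain the target. The random choice affects only the running time, never the correctness.

```python
import random

def randomized_select(data, n, seed=None):
    """Return the n-th smallest element (1-indexed) via random pivoting.

    A Las Vegas algorithm: always correct, always terminates; randomness
    only changes how much data needs heavy processing on average.
    """
    rng = random.Random(seed)
    items = list(data)
    k = n - 1  # 0-indexed rank of the target
    while True:
        pivot = rng.choice(items)                    # probabilistic guess
        lows = [x for x in items if x < pivot]       # check: partition around it
        pivots = [x for x in items if x == pivot]
        highs = [x for x in items if x > pivot]
        if k < len(lows):
            items = lows                             # target is below the pivot
        elif k < len(lows) + len(pivots):
            return pivot                             # guess verified
        else:
            k -= len(lows) + len(pivots)
            items = highs                            # target is above the pivot

values = [17, 3, 44, 8, 23, 5, 91, 2]
print(randomized_select(values, 3))  # prints 5, the third-smallest value
```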

  6. 47 CFR 1.1603 - Conduct of random selection.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1603 Conduct of random selection. The random selection probabilities will be...

  7. Randomness in post-selected events

    NASA Astrophysics Data System (ADS)

    Phuc Thinh, Le; de la Torre, Gonzalo; Bancal, Jean-Daniel; Pironio, Stefano; Scarani, Valerio

    2016-03-01

    Bell inequality violations can be used to certify private randomness for use in cryptographic applications. In photonic Bell experiments, a large amount of the data that is generated comes from no-detection events and presumably contains little randomness. This raises the question as to whether randomness can be extracted only from the smaller post-selected subset corresponding to proper detection events, instead of from the entire set of data. This could in principle be feasible without opening an analogue of the detection loophole as long as the min-entropy of the post-selected data is evaluated by taking all the information into account, including no-detection events. The possibility of extracting randomness from a short string has a practical advantage, because it reduces the computational time of the extraction. Here, we investigate the above idea in a simple scenario, where the devices and the adversary behave according to i.i.d. strategies. We show that indeed almost all the randomness is present in the pair of outcomes for which at least one detection happened. We further show that in some cases applying a pre-processing on the data can capture features that an analysis based on global frequencies only misses, thus resulting in the certification of more randomness. We then briefly consider non-i.i.d. strategies and provide an explicit example of such a strategy that is more powerful than any i.i.d. one, even in the asymptotic limit of infinitely many measurement rounds, something that had not been reported before in the context of Bell inequalities.

  8. Improving randomness characterization through Bayesian model selection.

    PubMed

    Díaz Hernández Rojas, Rafael; Solís, Aldo; Angulo Martínez, Alí M; U'Ren, Alfred B; Hirsch, Jorge G; Marsili, Matteo; Pérez Castillo, Isaac

    2017-06-08

    Random number generation plays an essential role in technology, with important applications in areas ranging from cryptography to Monte Carlo methods and other probabilistic algorithms. All such applications require high-quality sources of random numbers, yet effective methods for assessing whether a source produces truly random sequences are still missing. Current methods either do not rely on a formal description of randomness (the NIST test suite), or are inapplicable in principle (the characterization derived from the Algorithmic Theory of Information), since they require testing all the possible computer programs that could produce the sequence to be analysed. Here we present a rigorous method that overcomes these problems based on Bayesian model selection. We derive analytic expressions for a model's likelihood, which is then used to compute its posterior distribution. Our method proves to be more rigorous than NIST's suite and the Borel-Normality criterion, and its implementation is straightforward. We applied our method to an experimental device based on the process of spontaneous parametric downconversion to confirm that it behaves as a genuine quantum random number generator. As our approach relies on Bayesian inference, our scheme transcends individual sequence analysis, leading to a characterization of the source itself.

  9. Selecting materialized views using random algorithm

    NASA Astrophysics Data System (ADS)

    Zhou, Lijuan; Hao, Zhongxiao; Liu, Chi

    2007-04-01

    The data warehouse is a repository of information collected from multiple, possibly heterogeneous, autonomous distributed databases. The information stored at the data warehouse is in the form of views, referred to as materialized views. The selection of materialized views is one of the most important decisions in designing a data warehouse. Materialized views are stored in the data warehouse for the purpose of efficiently implementing on-line analytical processing queries. The first issue for the user to consider is query response time. In this paper we therefore develop algorithms to select a set of views to materialize in a data warehouse in order to minimize the total view maintenance cost under the constraint of a given query response time; we call this the query-cost view-selection problem. First, the cost graph and cost model of the query-cost view-selection problem are presented. Second, methods for selecting materialized views using randomized algorithms are presented. A genetic algorithm is applied to the materialized-view selection problem, but as the genetic process evolves, legal solutions become increasingly difficult to produce, so many candidate solutions are eliminated and the time needed to generate valid solutions grows. We therefore present an improved algorithm that combines simulated annealing with the genetic algorithm to solve the query-cost view-selection problem. Finally, simulation experiments are used to test the correctness and efficiency of our algorithms. The experiments show that the given methods provide near-optimal solutions in limited time and work well in practical cases. Randomized algorithms will become invaluable tools for data warehouse evolution.
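    As a concrete illustration of the randomized approach, here is a minimal simulated-annealing sketch of the query-cost view-selection problem (the cost model, function names, and parameters are all simplifying assumptions for illustration, not the paper's combined genetic/annealing algorithm). Each view is either materialized, adding its maintenance cost, or computed on the fly, adding its query cost; we minimize total maintenance cost subject to a response-time cap:

```python
import math
import random

def anneal_view_selection(query_cost, maint_cost, max_query_time,
                          seed=0, t0=5.0, cooling=0.995, steps=5000):
    """Choose which views to materialize so that maintenance cost is minimized
    while total query response time stays under max_query_time."""
    n = len(query_cost)
    rng = random.Random(seed)

    def response_time(sel):
        return sum(query_cost[i] for i in range(n) if not sel[i])

    def maintenance(sel):
        return sum(maint_cost[i] for i in range(n) if sel[i])

    sel = [True] * n  # materialize everything: a feasible starting point
    best, best_cost = sel[:], maintenance(sel)
    t = t0
    for _ in range(steps):
        cand = sel[:]
        cand[rng.randrange(n)] ^= True       # flip one view's status
        if response_time(cand) > max_query_time:
            continue                         # reject infeasible neighbors
        delta = maintenance(cand) - maintenance(sel)
        # Accept improvements always; accept worse moves with a probability
        # that shrinks as the temperature cools.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            sel = cand
            if maintenance(sel) < best_cost:
                best, best_cost = sel[:], maintenance(sel)
        t *= cooling
    return best, best_cost

qc = [5, 3, 8, 2, 7]   # per-view query cost if not materialized (made up)
mc = [4, 6, 1, 9, 2]   # per-view maintenance cost if materialized (made up)
sel, cost = anneal_view_selection(qc, mc, max_query_time=10)
print(sel, cost)
```

Starting from the all-materialized solution and rejecting infeasible neighbors keeps every accepted state within the response-time constraint, mirroring how the paper treats query response time as a hard constraint.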

  10. 32 CFR 1624.1 - Random selection procedures for induction.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Random selection procedures for induction. 1624... SYSTEM INDUCTIONS § 1624.1 Random selection procedures for induction. (a) The Director of Selective Service shall from time to time establish a random selection sequence for induction by a drawing to be...

  11. 47 CFR 1.1602 - Designation for random selection.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1602 Designation for random selection...

  12. 47 CFR 1.1602 - Designation for random selection.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1602 Designation for random selection...

  13. 47 CFR 1.1603 - Conduct of random selection.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1603 Conduct...

  14. 47 CFR 1.1602 - Designation for random selection.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1602...

  15. 47 CFR 1.1603 - Conduct of random selection.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1603 Conduct of random selection. The...

  16. Protein minimization by random fragmentation and selection.

    PubMed

    Rudgers, G W; Palzkill, T

    2001-07-01

    Protein-protein interactions are involved in most biological processes and are important targets for drug design. Over the past decade, there has been increased interest in the design of small molecules that mimic functional epitopes of protein inhibitors. BLIP is a 165 amino acid protein that is a potent inhibitor of TEM-1 beta-lactamase (K(i) = 0.1 nM). To aid in the development of new inhibitors of beta-lactamase, the gene encoding BLIP was randomly fragmented and DNA segments encoding peptides that retain the ability to bind TEM-1 beta-lactamase were isolated using phage display. The selected peptides revealed a common, overlapping region that includes BLIP residues C30-D49. Synthesis and binding analysis of the C30-D49 peptide indicate that this peptide inhibits TEM-1 beta-lactamase. Therefore, a peptide derivative of BLIP that has been reduced in size by 88% compared with wild-type BLIP retains the ability to bind and inhibit beta-lactamase.

  17. Fixed and random effects selection in linear and logistic models.

    PubMed

    Kinney, Satkartar K; Dunson, David B

    2007-09-01

    We address the problem of selecting which variables should be included in the fixed and random components of logistic mixed effects models for correlated data. A fully Bayesian variable selection is implemented using a stochastic search Gibbs sampler to estimate the exact model-averaged posterior distribution. This approach automatically identifies subsets of predictors having nonzero fixed effect coefficients or nonzero random effects variance, while allowing uncertainty in the model selection process. Default priors are proposed for the variance components and an efficient parameter expansion Gibbs sampler is developed for posterior computation. The approach is illustrated using simulated data and an epidemiologic example.

  18. Bayesian nonparametric centered random effects models with variable selection.

    PubMed

    Yang, Mingan

    2013-03-01

    In a linear mixed effects model, it is common practice to assume that the random effects follow a parametric distribution such as a normal distribution with mean zero. However, in the case of variable selection, substantial violation of the normality assumption can potentially impact the subset selection and result in poor interpretation and even incorrect results. In nonparametric random effects models, the random effects generally have a nonzero mean, which causes an identifiability problem for the fixed effects that are paired with the random effects. In this article, we focus on a Bayesian method for variable selection. We characterize the subject-specific random effects nonparametrically with a Dirichlet process and resolve the bias simultaneously. In particular, we propose flexible modeling of the conditional distribution of the random effects with changes across the predictor space. The approach is implemented using a stochastic search Gibbs sampler to identify subsets of fixed effects and random effects to be included in the model. Simulations are provided to evaluate and compare the performance of our approach to the existing ones. We then apply the new approach to a real data example, cross-country and interlaboratory rodent uterotrophic bioassay.

  19. 47 CFR 1.824 - Random selection procedures for Multichannel Multipoint Distribution Service and Multipoint...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Random selection procedures for Multichannel..., and Reports Involving Common Carriers Grants by Random Selection § 1.824 Random selection procedures..., the Commission may use the random selection process to select the conditional licensee or licensee...

  20. Alternative Modal Basis Selection Procedures for Nonlinear Random Response Simulation

    NASA Technical Reports Server (NTRS)

    Przekop, Adam; Guo, Xinyun; Rizzi, Stephen A.

    2010-01-01

    Three procedures to guide selection of an efficient modal basis in a nonlinear random response analysis are examined. One method is based only on proper orthogonal decomposition, while the other two additionally involve smooth orthogonal decomposition. Acoustic random response problems are employed to assess the performance of the three modal basis selection approaches. A thermally post-buckled beam exhibiting snap-through behavior, a shallowly curved arch in the auto-parametric response regime and a plate structure are used as numerical test articles. The results of the three reduced-order analyses are compared with the results of the computationally taxing simulation in the physical degrees of freedom. For the cases considered, all three methods are shown to produce modal bases resulting in accurate and computationally efficient reduced-order nonlinear simulations.

  21. Feature selection and classification of leukocytes using random forest.

    PubMed

    Saraswat, Mukesh; Arya, K V

    2014-12-01

    In automatic segmentation of leukocytes from the complex morphological background of tissue section images, a vast number of artifacts/noise are also extracted causing large amount of multivariate data generation. This multivariate data degrades the performance of a classifier to discriminate between leukocytes and artifacts/noise. However, the selection of prominent features plays an important role in reducing the computational complexity and increasing the performance of the classifier as compared to a high-dimensional features space. Therefore, this paper introduces a novel Gini importance-based binary random forest feature selection method. Moreover, the random forest classifier is used to classify the extracted objects into artifacts, mononuclear cells, and polymorphonuclear cells. The experimental results establish that the proposed method effectively eliminates the irrelevant features, maintaining the high classification accuracy as compared to other feature reduction methods.

  22. Selective advantage for sexual reproduction with random haploid fusion.

    PubMed

    Tannenbaum, Emmanuel

    2009-05-01

    This article develops a simplified set of models describing asexual and sexual replication in unicellular diploid organisms. The models assume organisms whose genomes consist of two chromosomes, where each chromosome is assumed to be functional if it is equal to some master sequence sigma(0), and non-functional otherwise. We review the previously studied case of selective mating, where it is assumed that only haploids with functional chromosomes can fuse, and also consider the case of random haploid fusion. When the cost for sex is small, as measured by the ratio of the characteristic haploid fusion time to the characteristic growth time, we find that sexual replication with random haploid fusion leads to a greater mean fitness for the population than a purely asexual strategy. However, independently of the cost for sex, we find that sexual replication with a selective mating strategy leads to a higher mean fitness than the random mating strategy. The results of this article are consistent with previous studies suggesting that sex is favored at intermediate mutation rates, for slowly replicating organisms, and at high population densities. Furthermore, the results of this article provide a basis for understanding sex as a stress response in unicellular organisms such as Saccharomyces cerevisiae (Baker's yeast).

  23. Selective randomized load balancing and mesh networks with changing demands

    NASA Astrophysics Data System (ADS)

    Shepherd, F. B.; Winzer, P. J.

    2006-05-01

    We consider the problem of building cost-effective networks that are robust to dynamic changes in demand patterns. We compare several architectures using demand-oblivious routing strategies. Traditional approaches include single-hop architectures based on a (static or dynamic) circuit-switched core infrastructure and multihop (packet-switched) architectures based on point-to-point circuits in the core. To address demand uncertainty, we seek minimum cost networks that can carry the class of hose demand matrices. Apart from shortest-path routing, Valiant's randomized load balancing (RLB), and virtual private network (VPN) tree routing, we propose a third, highly attractive approach: selective randomized load balancing (SRLB). This is a blend of dual-hop hub routing and randomized load balancing that combines the advantages of both architectures in terms of network cost, delay, and delay jitter. In particular, we give empirical analyses for the cost (in terms of transport and switching equipment) for the discussed architectures, based on three representative carrier networks. Of these three networks, SRLB maintains the resilience properties of RLB while achieving significant cost reduction over all other architectures, including RLB and multihop Internet protocol/multiprotocol label switching (IP/MPLS) networks using VPN-tree routing.

  24. Hierarchy and extremes in selections from pools of randomized proteins

    PubMed Central

    Boyer, Sébastien; Biswas, Dipanwita; Kumar Soshee, Ananda; Scaramozzino, Natale; Nizak, Clément; Rivoire, Olivier

    2016-01-01

    Variation and selection are the core principles of Darwinian evolution, but quantitatively relating the diversity of a population to its capacity to respond to selection is challenging. Here, we examine this problem at a molecular level in the context of populations of partially randomized proteins selected for binding to well-defined targets. We built several minimal protein libraries, screened them in vitro by phage display, and analyzed their response to selection by high-throughput sequencing. A statistical analysis of the results reveals two main findings. First, libraries with the same sequence diversity but built around different “frameworks” typically have vastly different responses; second, the distribution of responses of the best binders in a library follows a simple scaling law. We show how an elementary probabilistic model based on extreme value theory rationalizes the latter finding. Our results have implications for designing synthetic protein libraries, estimating the density of functional biomolecules in sequence space, characterizing diversity in natural populations, and experimentally investigating evolvability (i.e., the potential for future evolution). PMID:26969726

  25. Selecting Random Distributed Elements for HIFU using Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Zhou, Yufeng

    2011-09-01

    As an effective and noninvasive therapeutic modality for tumor treatment, high-intensity focused ultrasound (HIFU) has attracted attention from both physicians and patients. New generations of HIFU systems, with the ability to electrically steer the HIFU focus using phased-array transducers, have been under development. The presence of side and grating lobes may cause undesired thermal accumulation at the interface of the coupling medium (i.e., water) and skin, or in the intervening tissue. Although sparse, randomly distributed piston elements can reduce the amplitude of grating lobes, there are theoretically no grating lobes with the use of concave elements in the new phased-array HIFU. A new HIFU transmission strategy is proposed in this study: firing some, but not all, elements for a certain period and then switching to another group for the next firing sequence. The advantages are that 1) the asymmetric positions of the active elements may reduce the side lobes, and 2) each element has some resting time during the entire HIFU ablation (up to several hours for some clinical applications), so that the loss of transducer efficiency due to thermal accumulation is minimized. A genetic algorithm was used to select the randomly distributed elements in a HIFU array, with the amplitudes of the first side lobes at the focal plane used as the fitness value in the optimization. Overall, it is suggested that the proposed strategy could reduce the side lobes and their consequent side effects, and that the genetic algorithm is effective in selecting randomly distributed elements in a HIFU array.

  26. Unbiased split variable selection for random survival forests using maximally selected rank statistics.

    PubMed

    Wright, Marvin N; Dankowski, Theresa; Ziegler, Andreas

    2017-04-15

    The most popular approach for analyzing survival data is the Cox regression model. The Cox model may, however, be misspecified, and its proportionality assumption may not always be fulfilled. An alternative approach for survival prediction is random forests for survival outcomes. The standard split criterion for random survival forests is the log-rank test statistic, which favors splitting variables with many possible split points. Conditional inference forests avoid this split variable selection bias. However, linear rank statistics are utilized by default in conditional inference forests to select the optimal splitting variable, which cannot detect non-linear effects in the independent variables. An alternative is to use maximally selected rank statistics for the split point selection. As in conditional inference forests, splitting variables are compared on the p-value scale. However, instead of the conditional Monte-Carlo approach used in conditional inference forests, p-value approximations are employed. We describe several p-value approximations and the implementation of the proposed random forest approach. A simulation study demonstrates that unbiased split variable selection is possible. However, there is a trade-off between unbiased split variable selection and runtime. In benchmark studies of prediction performance on simulated and real datasets, the new method performs better than random survival forests if informative dichotomous variables are combined with uninformative variables with more categories and better than conditional inference forests if non-linear covariate effects are included. In a runtime comparison, the method proves to be computationally faster than both alternatives, if a simple p-value approximation is used. Copyright © 2017 John Wiley & Sons, Ltd.

  27. A Bayesian random effects discrete-choice model for resource selection: Population-level selection inference

    USGS Publications Warehouse

    Thomas, D.L.; Johnson, D.; Griffith, B.

    2006-01-01

    Modeling the probability of use of land units characterized by discrete and continuous measures, we present a Bayesian random-effects model to assess resource selection. This model provides simultaneous estimation of both individual- and population-level selection. Deviance information criterion (DIC), a Bayesian alternative to AIC that is sample-size specific, is used for model selection. Aerial radiolocation data from 76 adult female caribou (Rangifer tarandus) and calf pairs during 1 year on an Arctic coastal plain calving ground were used to illustrate models and assess population-level selection of landscape attributes, as well as individual heterogeneity of selection. Landscape attributes included elevation, NDVI (a measure of forage greenness), and land cover-type classification. Results from the first of a 2-stage model-selection procedure indicated that there is substantial heterogeneity among cow-calf pairs with respect to selection of the landscape attributes. In the second stage, selection of models with heterogeneity included indicated that at the population level, NDVI and land cover class were significant attributes for selection of different landscapes by pairs on the calving ground. Population-level selection coefficients indicate that the pairs generally select landscapes with higher levels of NDVI, but the relationship is quadratic. The highest rate of selection occurs at values of NDVI less than the maximum observed. Results for land cover-class selection coefficients indicate that wet sedge, moist sedge, herbaceous tussock tundra, and shrub tussock tundra are selected at approximately the same rate, while alpine and sparsely vegetated landscapes are selected at a lower rate. Furthermore, the variability in selection by individual caribou for moist sedge and sparsely vegetated landscapes is large relative to the variability in selection of other land cover types.
The example analysis illustrates that, while sometimes computationally intense, a

  28. Thymidine kinase mutants obtained by random sequence selection.

    PubMed

    Munir, K M; French, D C; Loeb, L A

    1993-05-01

    Knowledge of the catalytic properties and structural information regarding the amino acid residues that comprise the active site of an enzyme allows one, in principle, to use site-specific mutagenesis to construct genes that encode enzymes with altered functions. However, such information is not known for most enzymes, and the effects of specific amino acid substitutions are not generally predictable. An alternative approach is to substitute random nucleotides for key codons in a gene and to use genetic selection to identify new and interesting enzyme variants. We describe here the construction, selection, and characterization of herpes simplex virus type 1 thymidine kinase mutants either with different catalytic properties or with enhanced thermostability. From a library containing 2 × 10^6 plasmid-encoded herpes thymidine kinase genes, each with a different nucleotide sequence at the putative nucleoside binding site, we obtained 1540 active mutants. Using this library and one previously constructed, we identified by secondary selection Escherichia coli harboring thymidine kinase mutant clones that were unable to grow in the presence of concentrations of 3'-azido-3'-deoxythymidine (AZT) that permit colony formation by E. coli harboring the wild-type plasmid. Two of the mutant enzymes exhibited a reduced Km for AZT, one of which displayed a higher catalytic efficiency for AZT over thymidine relative to that of the wild type. We also identified one mutant with enhanced thermostability. These mutants may have clinical potential as the promise of gene therapy increasingly becomes a reality.

  29. Materials selection for oxide-based resistive random access memories

    NASA Astrophysics Data System (ADS)

    Guo, Yuzheng; Robertson, John

    2014-12-01

    The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO2, TiO2, Ta2O5, and Al2O3, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta2O5 RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.

  30. Ammons quick test validity among randomly selected referrals.

    PubMed

    Zagar, Robert John; Kovach, Joseph W; Busch, Kenneth G; Zablocki, Michael D; Osnowitz, William; Neuhengen, Jonas; Liu, Yutong; Zagar, Agata Karolina

    2013-12-01

    After selection using a random number table from volunteer referrals, 89 youths (61 boys, 28 girls; 48 African Americans, 2 Asian Americans, 27 Euro-Americans, 12 Hispanic Americans) and 147 adults (107 men, 40 women; 11 African Americans, 6 Asian Americans, 124 Euro-Americans, 6 Hispanic Americans) were administered the Ammons Quick Test (QT). Means, confidence intervals, standard deviations, and Pearson product-moment correlations among tests were computed. The Ammons QT correlated moderately to strongly, and statistically significantly, with: the Peabody Picture Vocabulary Test-3b (PPVT-3b); the Vineland Adaptive Behavior Scales-2 Parent/Teacher Form; the Wechsler Intelligence Scale for Children (WISC-4) or the Wechsler Adult Intelligence Scale (WAIS-4); and the Wide Range Achievement Test-Fourth Edition (WRAT-4) Blue and Green Forms. After 51 years, the original norms for the Ammons QT remain valid measures of receptive vocabulary, verbal intelligence, and auditory information processing useful to clinicians.

  12. Random copolymers at a selective interface: saturation effects.

    PubMed

    Kłos, J; Sommer, J-U

    2007-11-07

    Combining scaling arguments and Monte Carlo simulations using the bond fluctuation method, we have studied concentration effects for the adsorption of symmetric AB-random copolymers at selective, symmetric interfaces. For the scaling analysis we consider a hierarchy of two length scales given by the excess (adsorption) blobs and by two-dimensional thermal blobs in the semidilute surface regime. When both length scales match, a densely packed array of adsorption blobs is formed (saturation). We show that for random copolymer adsorption the interface concentration can be further increased (oversaturation) due to reorganization of excess blobs. Crossing over this threshold results in a qualitative change in the behavior of the adsorption layer which involves a change in the average shape of the adsorbed chains towards a hairpinlike form. We have analyzed the distribution of loops and tails of adsorbed chains in the various concentration regimes as well as the chain order parameter, concentration profiles, and the exchange rate of individual chains. We emphasize the role of saturation scaling, which dominates the behavior of static and dynamic quantities at higher surface concentration.

  13. Study on MAX-MIN Ant System with Random Selection in Quadratic Assignment Problem

    NASA Astrophysics Data System (ADS)

    Iimura, Ichiro; Yoshida, Kenji; Ishibashi, Ken; Nakayama, Shigeru

    Ant Colony Optimization (ACO), a type of swarm intelligence inspired by the foraging behavior of ants, has been studied extensively, and its effectiveness has been shown by many researchers. Previous studies have reported that MAX-MIN Ant System (MMAS) is one of the most effective ACO algorithms. MMAS maintains the balance between intensification and diversification by limiting the quantity of pheromone to a range between minimum and maximum values. In this paper, we propose MAX-MIN Ant System with Random Selection (MMASRS) to improve the search performance even further. MMASRS is a new ACO algorithm that extends MMAS with random selection, an edge-choosing method for the agents (ants). In an experimental evaluation using ten quadratic assignment problems, we show that the proposed MMASRS with random selection is superior to the conventional MMAS without random selection in terms of search performance.
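    The tour-construction step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: pheromone values are clamped to a [tau_min, tau_max] range as in MMAS, and with some probability the ant picks the next node uniformly at random (the MMASRS random selection) instead of by the usual pheromone-weighted roulette wheel. All parameter values are illustrative assumptions.

```python
import random

TAU_MIN, TAU_MAX = 0.1, 5.0   # MMAS pheromone bounds (assumed values)
P_RANDOM = 0.1                # probability of the random-selection move
ALPHA = 1.0                   # pheromone weight

def clamp(tau):
    """Keep a pheromone value inside the MMAS bounds."""
    return max(TAU_MIN, min(TAU_MAX, tau))

def choose_next(current, unvisited, tau, rng):
    """Pick the next node from `unvisited` given pheromone matrix `tau`."""
    candidates = sorted(unvisited)
    if rng.random() < P_RANDOM:            # MMASRS: uniformly random edge
        return rng.choice(candidates)
    weights = [clamp(tau[current][j]) ** ALPHA for j in candidates]
    r = rng.random() * sum(weights)        # roulette-wheel selection
    for j, w in zip(candidates, weights):
        r -= w
        if r <= 0:
            return j
    return candidates[-1]

# Build one tour over a tiny 5-node instance with uniform pheromone.
rng = random.Random(0)
n = 5
tau = [[1.0] * n for _ in range(n)]
tour = [0]
unvisited = set(range(1, n))
while unvisited:
    nxt = choose_next(tour[-1], unvisited, tau, rng)
    tour.append(nxt)
    unvisited.remove(nxt)
```

In a full MMAS/MMASRS loop the pheromone matrix would be evaporated and reinforced after each iteration, with every update passed through `clamp`.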

  14. 32 CFR 1624.1 - Random selection procedures for induction.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., during the specified calendar year(s) attain their 18th year of birth. The drawing, commencing with the... date of birth of the registrant that appears on his Selective Service Registration Record on the day... date of birth in all matters pertaining to his relations with the Selective Service System. ...

  15. 32 CFR 1624.1 - Random selection procedures for induction.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., during the specified calendar year(s) attain their 18th year of birth. The drawing, commencing with the... date of birth of the registrant that appears on his Selective Service Registration Record on the day... date of birth in all matters pertaining to his relations with the Selective Service System. ...

  16. Assessing risk-adjustment approaches under non-random selection.

    PubMed

    Luft, Harold S; Dudley, R Adams

    2004-01-01

    Various approaches have been proposed to adjust for differences in enrollee risk in health plans. Because risk-selection strategies may have different effects on enrollment, we simulated three types of selection--dumping, skimming, and stinting. Concurrent diagnosis-based risk adjustment, and a hybrid using concurrent adjustment for about 8% of the cases and prospective adjustment for the rest, perform markedly better than prospective or demographic adjustments, both in terms of R2 and the extent to which plans experience unwarranted gains or losses. The simulation approach offers a valuable tool for analysts in assessing various risk-adjustment strategies under different selection situations.

  17. Testing, Selection, and Implementation of Random Number Generators

    DTIC Science & Technology

    2008-07-01

    U.S. Army Research Laboratory (ATTN: AMSRD-ARL-SL-BD), Aberdeen Proving Ground, MD 21005-5068. The report's contents cover random number generators, linear RNGs, and the characteristic polynomial.

  18. The Application of Imperialist Competitive Algorithm for Fuzzy Random Portfolio Selection Problem

    NASA Astrophysics Data System (ADS)

    EhsanHesamSadati, Mir; Bagherzadeh Mohasefi, Jamshid

    2013-10-01

    This paper presents an implementation of the Imperialist Competitive Algorithm (ICA) for solving the fuzzy random portfolio selection problem, where asset returns are represented by fuzzy random variables. Portfolio optimization is an important research field in modern finance. Using a necessity-based model, the fuzzy random variables are reformulated into a linear program, and ICA is designed to find the optimal solution. To show the efficiency of the proposed method, a numerical example illustrates the implementation of ICA for the fuzzy random portfolio selection problem.

  19. Fitting Selected Random Planetary Systems to Titius-Bode Laws

    NASA Astrophysics Data System (ADS)

    Hayes, Wayne; Tremaine, Scott

    1998-10-01

    Simple “solar systems” are generated with planetary orbital radii r distributed uniformly at random in log r between 0.2 and 50 AU, with masses and order identical to our own Solar System. A conservative stability criterion is imposed by requiring that adjacent planets are separated by a minimum distance of k times the sum of their Hill radii, for values of k ranging from 0 to 8. Least-squares fits of these systems to generalized Bode laws are performed and compared to the fit of our own Solar System. We find that this stability criterion and other “radius-exclusion” laws generally produce approximately geometrically spaced planets that fit a Titius-Bode law about as well as our own Solar System. We then allow the random systems the same exceptions that have historically been applied to our own Solar System. Namely, one gap may be inserted, similar to the gap between Mars and Jupiter, and up to 3 planets may be “ignored,” similar to how some forms of Bode's law ignore Mercury, Neptune, and Pluto. With these particular exceptions, we find that our Solar System fits significantly better than the random ones. However, we believe that this choice of exceptions, designed specifically to give our own Solar System a better fit, gives it an unfair advantage that would be lost if other exception rules were used. We compare our results to previous work that uses a “law of increasing differences” as a basis for judging the significance of Bode's law. We note that the law of increasing differences is not physically based and is probably too stringent a constraint for judging the significance of Bode's law. We conclude that the significance of Bode's law is simply that stable planetary systems tend to be regularly spaced and conjecture that this conclusion could be strengthened by the use of more rigorous methods of rejecting unstable planetary systems, such as long-term orbit integrations.
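    The core of the procedure, drawing radii uniformly in log r and least-squares fitting a generalized Bode law r_n = a·b^n (a straight line in n versus log r), can be sketched as below. This is a simplified illustration, not the authors' code: masses, Hill-radius stability filtering, gaps, and ignored planets are all omitted.

```python
import math
import random

def fit_geometric(radii):
    """Least-squares fit of log(r_n) = c0 + c1*n; returns (a, b, rms residual)."""
    ys = [math.log(r) for r in radii]
    ns = list(range(len(radii)))
    n = len(ns)
    xbar = sum(ns) / n
    ybar = sum(ys) / n
    c1 = (sum((x - xbar) * (y - ybar) for x, y in zip(ns, ys))
          / sum((x - xbar) ** 2 for x in ns))
    c0 = ybar - c1 * xbar
    resid = [y - (c0 + c1 * x) for x, y in zip(ns, ys)]
    rms = math.sqrt(sum(e * e for e in resid) / n)
    return math.exp(c0), math.exp(c1), rms

# One random "solar system": 9 radii uniform in log r between 0.2 and 50 AU.
rng = random.Random(42)
lo, hi = math.log(0.2), math.log(50.0)
radii = sorted(math.exp(rng.uniform(lo, hi)) for _ in range(9))
a, b, rms = fit_geometric(radii)   # rms residual measures Bode-law quality
```

A perfectly geometric spacing (e.g. radii 1, 2, 4, 8, 16) fits with zero residual; the paper's comparison amounts to asking how the rms residual of real and random systems differ.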

  20. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    EPA Science Inventory

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  1. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…

  3. On Measuring and Reducing Selection Bias with a Quasi-Doubly Randomized Preference Trial

    ERIC Educational Resources Information Center

    Joyce, Ted; Remler, Dahlia K.; Jaeger, David A.; Altindag, Onur; O'Connell, Stephen D.; Crockett, Sean

    2017-01-01

    Randomized experiments provide unbiased estimates of treatment effects, but are costly and time consuming. We demonstrate how a randomized experiment can be leveraged to measure selection bias by conducting a subsequent observational study that is identical in every way except that subjects choose their treatment--a quasi-doubly randomized…

  4. Selective advantage for sexual replication with random haploid fusion

    NASA Astrophysics Data System (ADS)

    Tannenbaum, Emmanuel

    2008-03-01

    This talk develops a simplified set of models describing asexual and sexual replication in unicellular diploid organisms. The models assume organisms whose genomes consist of two chromosomes, where each chromosome is assumed to be functional if and only if it is equal to some master sequence. The fitness of an organism is determined by the number of functional chromosomes in its genome. For a population replicating asexually, a cell replicates both of its chromosomes, and then divides and splits its genetic material evenly between the two cells. For a population replicating sexually, a given cell first divides into two haploids, which enter a haploid pool. Within the haploid pool, haploids fuse into diploids, which then divide via the normal mitotic process. When the cost for sex is small, as measured by the ratio of the characteristic haploid fusion time to the characteristic growth time, we find that sexual replication with random haploid fusion leads to a greater mean fitness for the population than a purely asexual strategy. The results of this talk are consistent with previous studies suggesting that sex is favored at intermediate mutation rates, for slowly replicating organisms, and at high population densities.

  5. Acceptance sampling using judgmental and randomly selected samples

    SciTech Connect

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
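    One of the simpler Bayesian formulations that the abstract describes as a special case can be sketched in closed form. Under a uniform Beta(1, 1) prior on the acceptable fraction theta, observing n randomly sampled items that are all acceptable gives a Beta(n + 1, 1) posterior, so P(theta > p) = 1 - p^(n+1). This single-group sketch is an assumption for illustration; it ignores the paper's judgmental/high-risk stratum.

```python
def prob_fraction_acceptable(n, p):
    """P(theta > p | n acceptable observations, uniform Beta(1,1) prior)."""
    return 1.0 - p ** (n + 1)

def samples_needed(p, confidence):
    """Smallest n of all-acceptable samples with P(theta > p) >= confidence."""
    n = 0
    while prob_fraction_acceptable(n, p) < confidence:
        n += 1
    return n

# Samples needed to claim, with 95% probability, that 95% of items are OK.
n95 = samples_needed(0.95, 0.95)
```

The required n grows with both the target fraction p and the confidence level, which is the monotonicity property the paper examines in its more general two-group setting.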

  6. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample selection by random number... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square... area created in accordance with paragraph (a) of this section, select two random numbers: one each for...
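    The coordinate-drawing step this section describes, selecting two random numbers, one per axis, to pick a point on a two-dimensional square grid, can be sketched as follows. This is an illustrative sketch, not regulatory guidance, and the grid dimensions are assumed for the example.

```python
import random

def select_grid_cell(n_rows, n_cols, rng):
    """Return a randomly selected (row, col) cell on an n_rows x n_cols grid:
    one random number per axis, as in the two-dimensional grid procedure."""
    return rng.randrange(n_rows), rng.randrange(n_cols)

rng = random.Random(7)
row, col = select_grid_cell(10, 10, rng)   # one sampling location on a 10x10 grid
```

Drawing the two axes independently makes every cell of the grid equally likely to be selected.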

  7. THE SELECTION OF A NATIONAL RANDOM SAMPLE OF TEACHERS FOR EXPERIMENTAL CURRICULUM EVALUATION.

    ERIC Educational Resources Information Center

    WELCH, WAYNE W.; AND OTHERS

    MEMBERS OF THE EVALUATION SECTION OF HARVARD PROJECT PHYSICS, DESCRIBING WHAT IS SAID TO BE THE FIRST ATTEMPT TO SELECT A NATIONAL RANDOM SAMPLE OF (HIGH SCHOOL PHYSICS) TEACHERS, LIST THE STEPS AS (1) PURCHASE OF A LIST OF PHYSICS TEACHERS FROM THE NATIONAL SCIENCE TEACHERS ASSOCIATION (MOST COMPLETE AVAILABLE), (2) SELECTION OF 136 NAMES BY A…

  8. Random Forests-Based Feature Selection for Land-Use Classification Using LIDAR Data and Orthoimagery

    NASA Astrophysics Data System (ADS)

    Guan, H.; Yu, J.; Li, J.; Luo, L.

    2012-07-01

    The development of lidar systems, especially those incorporating high-resolution camera components, has shown great potential for urban classification. However, automatically selecting the best features for land-use classification is challenging. Random Forests, a relatively recently developed machine learning algorithm, is receiving considerable attention in the field of image classification and pattern recognition, in particular because it provides a measure of variable importance. In this study, therefore, the performance of Random Forests-based feature selection for urban areas was explored. First, we extract features from the lidar data, including height-based and intensity-based GLCM measures; other spectral features are obtained from the imagery, such as the Red, Green, and Blue bands and GLCM-based measures. Finally, Random Forests is used to automatically select the optimal, uncorrelated features for land-use classification. Lidar data at 0.5 m resolution and aerial imagery are used to assess the feature-selection performance of Random Forests in the study area, located in Mannheim, Germany. The results clearly demonstrate that Random Forests-based feature selection can improve classification performance.
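    The importance-based selection idea can be illustrated with a deliberately tiny stand-in. This is a toy sketch, not the authors' pipeline: each feature is scored by the best single-threshold Gini-impurity decrease it can achieve (a one-node "stump"), and the top-scoring features are kept. A real workflow would use a full Random Forests implementation's variable importance instead.

```python
import random

def gini(labels):
    """Gini impurity of a binary label list."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def stump_score(xs, ys):
    """Impurity decrease of the best single-threshold split on one feature."""
    base = gini(ys)
    best = 0.0
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        w = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        best = max(best, base - w)
    return best

def select_features(X, ys, k):
    """Return indices of the k highest-scoring features."""
    scores = [stump_score([row[j] for row in X], ys) for j in range(len(X[0]))]
    return sorted(range(len(scores)), key=lambda j: -scores[j])[:k]

# Synthetic data: feature 0 determines the label, feature 1 is pure noise.
rng = random.Random(0)
X = [[i, rng.random()] for i in range(20)]
ys = [1 if row[0] >= 10 else 0 for row in X]
top = select_features(X, ys, 1)   # should rank the informative feature first
```

The informative feature separates the classes perfectly and therefore receives the maximum possible impurity decrease, while the noise feature scores lower.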

  9. Statistical considerations of the random selection process in a drug testing program

    SciTech Connect

    Burtis, C.A.; Owings, J.H.; Leete, R.S. Jr.

    1987-01-01

    In a prospective drug testing program, individuals whose job classifications have been defined as sensitive are placed in a selection pool. On a periodic basis, individuals are chosen from this pool for drug testing. Random selection is a fair and impartial approach. A random selection process generates a Poisson distribution of probabilities that can be used to predict how many times an individual will be selected during a specific time interval. This information can be used to model the selection part of a drug testing program to determine whether specific conditions of testing are met. For example, the probability of being selected a given number of times during the testing period can be minimized or maximized by varying the frequency of the sampling process. Consequently, the Poisson distribution and the mathematics governing it can be used to structure a drug testing program to meet the needs and dictates of any given situation.
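    The Poisson reasoning above can be made concrete with a short calculation. This is a minimal sketch under assumed figures: if a pool of N individuals is sampled m times per period with n drawn each time, a given individual is selected on average lam = m·n/N times, and the number of selections is approximately Poisson(lam). The pool size and sampling schedule below are illustrative, not from the source.

```python
import math

def poisson_pmf(k, lam):
    """P(exactly k selections) under a Poisson(lam) model."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

N, m, n = 500, 12, 25          # pool size, samplings per year, picks per sampling
lam = m * n / N                # expected selections per person per year
p_never = poisson_pmf(0, lam)  # probability of not being tested all year
p_at_least_once = 1 - p_never
```

Varying the sampling frequency m changes lam and hence these probabilities, which is exactly the lever the abstract describes for structuring a testing program.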

  10. Selection on plasticity of seasonal life-history traits using random regression mixed model analysis

    PubMed Central

    Brommer, Jon E; Kontiainen, Pekka; Pietiäinen, Hannu

    2012-01-01

    Theory considers the covariation of seasonal life-history traits as an optimal reaction norm, implying that deviating from this reaction norm reduces fitness. However, the estimation of reaction-norm properties (i.e., elevation, linear slope, and higher order slope terms) and the selection on these is statistically challenging. We here advocate the use of random regression mixed models to estimate reaction-norm properties and the use of bivariate random regression to estimate selection on these properties within a single model. We illustrate the approach by random regression mixed models on 1115 observations of clutch sizes and laying dates of 361 female Ural owl Strix uralensis collected over 31 years to show that (1) there is variation across individuals in the slope of their clutch size–laying date relationship, and that (2) there is selection on the slope of the reaction norm between these two traits. Hence, natural selection potentially drives the negative covariance in clutch size and laying date in this species. The random-regression approach is hampered by inability to estimate nonlinear selection, but avoids a number of disadvantages (stats-on-stats, connecting reaction-norm properties to fitness). The approach is of value in describing and studying selection on behavioral reaction norms (behavioral syndromes) or life-history reaction norms. The approach can also be extended to consider the genetic underpinning of reaction-norm properties. PMID:22837818

  11. SNP selection and classification of genome-wide SNP data using stratified sampling random forests.

    PubMed

    Wu, Qingyao; Ye, Yunming; Liu, Yang; Ng, Michael K

    2012-09-01

    In high-dimensional genome-wide association (GWA) case-control data for complex disease, a large proportion of single-nucleotide polymorphisms (SNPs) are usually irrelevant to the disease. The simple random sampling method in random forests, using the default mtry parameter to choose the feature subspace, will select too many subspaces without informative SNPs. An exhaustive search for an optimal mtry is often required in order to include useful and relevant SNPs and discard the vast number of non-informative SNPs; however, this is too time-consuming and therefore unfavorable for high-dimensional GWA data. The main aim of this paper is to propose a stratified sampling method for feature-subspace selection when generating decision trees in a random forest for high-dimensional GWA data. Our idea is to design an equal-width discretization scheme for informativeness that divides SNPs into multiple groups. In feature-subspace selection, we randomly select the same number of SNPs from each group and combine them to form a subspace for generating a decision tree. This stratified sampling procedure ensures that each subspace contains enough useful SNPs, while avoiding the very high computational cost of an exhaustive search for an optimal mtry and maintaining the randomness of a random forest. We employ two genome-wide SNP data sets (Parkinson case-control data comprising 408,803 SNPs and Alzheimer case-control data comprising 380,157 SNPs) to demonstrate that the proposed stratified sampling method is effective and can generate a better random forest, with higher accuracy and a lower error bound, than Breiman's random forest generation method. For the Parkinson data, we also show some interesting genes identified by the method that may be associated with neurological disorders and merit further biological investigation.
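    The stratified subspace step can be sketched in a few lines. This is a minimal illustration of the idea, not the authors' implementation: SNPs are binned by an equal-width discretization of an informativeness score, and the subspace for each tree draws the same number of SNPs from every group. The scores and sizes below are illustrative assumptions.

```python
import random

def equal_width_groups(scores, n_groups):
    """Assign each feature index to one of n_groups equal-width score bins."""
    lo, hi = min(scores), max(scores)
    width = (hi - lo) / n_groups or 1.0   # guard against all-equal scores
    groups = [[] for _ in range(n_groups)]
    for idx, s in enumerate(scores):
        g = min(int((s - lo) / width), n_groups - 1)
        groups[g].append(idx)
    return groups

def stratified_subspace(groups, per_group, rng):
    """Sample the same number of features from each non-empty group."""
    subspace = []
    for g in groups:
        if g:
            subspace.extend(rng.sample(g, min(per_group, len(g))))
    return subspace

# 100 SNPs with stand-in informativeness scores, 4 groups, 5 SNPs per group.
rng = random.Random(1)
scores = [rng.random() for _ in range(100)]
groups = equal_width_groups(scores, 4)
subspace = stratified_subspace(groups, 5, rng)   # features for one tree
```

Each tree in the forest would draw its own subspace this way, so every subspace is guaranteed a share of the higher-informativeness bins without an exhaustive mtry search.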

  12. Aiming for a representative sample: Simulating random versus purposive strategies for hospital selection.

    PubMed

    van Hoeven, Loan R; Janssen, Mart P; Roes, Kit C B; Koffijberg, Hendrik

    2015-10-23

    A ubiquitous issue in research is selecting a representative sample from the study population. While random sampling strategies are the gold standard, in practice random sampling of participants is neither always feasible nor necessarily the optimal choice. In our case, a selection must be made of 12 hospitals (out of 89 Dutch hospitals in total); with this selection of 12 hospitals, it should be possible to estimate blood use in the remaining hospitals as well. In this paper, we evaluate both random and purposive strategies for the case of estimating blood use in Dutch hospitals. Available population-wide data on hospital blood use and number of hospital beds are used to simulate five sampling strategies: (1) select only the largest hospitals, (2) select the largest and the smallest hospitals ('maximum variation'), (3) select hospitals randomly, (4) select hospitals from as many different geographic regions as possible, (5) select hospitals from only two regions. Simulations of each strategy result in different selections of hospitals, each of which is used to estimate blood use in the remaining hospitals. The estimates are compared to the actual population values; the resulting prediction errors indicate the quality of each sampling strategy. The strategy leading to the lowest prediction error in the case study was maximum variation sampling, followed by random, regional variation, and two-region sampling, with sampling the largest hospitals resulting in the worst performance. Maximum variation sampling led to a hospital-level prediction error of 15%, whereas random sampling led to a prediction error of 19% (95% CI 17%-26%). While lowering the sample size reduced the differences between maximum variation and the random strategies, increasing the sample size to n = 18 did not change the ranking of the strategies and led to only slightly better predictions. The optimal strategy for estimating blood use was maximum variation sampling. When proxy data
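    The simulation logic can be re-created in miniature. This is a sketch with made-up numbers, not the study's data: hospitals' blood use scales roughly with bed count plus noise, a sample of 12 is chosen either at random or by "maximum variation" (largest plus smallest), and the sampled use-per-bed rate predicts use in the remaining hospitals. The bed counts and noise model are illustrative assumptions.

```python
import random

rng = random.Random(3)
beds = [rng.randint(100, 1200) for _ in range(89)]           # 89 hospitals
use = [b * 30 * rng.uniform(0.8, 1.2) for b in beds]         # true blood use

def prediction_error(sample_idx):
    """Mean absolute relative error predicting unsampled hospitals' use
    from the sample's pooled use-per-bed rate."""
    rate = sum(use[i] for i in sample_idx) / sum(beds[i] for i in sample_idx)
    rest = [i for i in range(len(beds)) if i not in set(sample_idx)]
    errs = [abs(rate * beds[i] - use[i]) / use[i] for i in rest]
    return sum(errs) / len(errs)

order = sorted(range(89), key=lambda i: beds[i])
max_variation = order[:6] + order[-6:]     # 6 smallest + 6 largest hospitals
random_sample = rng.sample(range(89), 12)  # simple random selection
err_mv = prediction_error(max_variation)
err_rand = prediction_error(random_sample)
```

Repeating the random draw many times and averaging the errors would reproduce the paper's comparison of strategies; a single draw, as here, only illustrates the mechanics.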

  13. Unbiased feature selection in learning random forests for high-dimensional data.

    PubMed

    Nguyen, Thanh-Tung; Huang, Joshua Zhexue; Nguyen, Thuy Thi

    2015-01-01

    Random forests (RFs) have been widely used as a powerful classification method. However, with the randomization in both bagging samples and feature selection, the trees in the forest tend to select uninformative features for node splitting, which gives RFs poor accuracy when working with high-dimensional data. RFs are also biased in the feature selection process, where multivalued features are favored. Aiming to debias feature selection in RFs, we propose a new RF algorithm, called xRF, to select good features when learning RFs for high-dimensional data. We first remove the uninformative features using p-value assessment, and a subset of unbiased features is then selected based on statistical measures. This feature subset is partitioned into two subsets, and a feature-weighting sampling technique is used to sample features from these two subsets for building trees. This approach generates more accurate trees while reducing the dimensionality and the amount of data needed for learning RFs. An extensive set of experiments was conducted on 47 high-dimensional real-world datasets, including image datasets. The experimental results show that RFs with the proposed approach outperform existing random forests on both accuracy and AUC measures.

  14. COMPARISON OF RANDOM AND SYSTEMATIC SITE SELECTION FOR ASSESSING ATTAINMENT OF AQUATIC LIFE USES IN SEGMENTS OF THE OHIO RIVER

    EPA Science Inventory

    This report is a description of field work and data analysis results comparing a design comparable to systematic site selection with one based on random selection of sites. The report is expected to validate the use of random site selection in the bioassessment program for the O...

  16. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or...

  17. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 32 2013-07-01 2013-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or...

  18. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or...

  19. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or...

  20. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or...

  1. Variable selection with random forest: Balancing stability, performance, and interpretation in ecological and environmental modeling

    EPA Science Inventory

    Random forest (RF) is popular in ecological and environmental modeling, in part, because of its insensitivity to correlated predictors and resistance to overfitting. Although variable selection has been proposed to improve both performance and interpretation of RF models, it is u...

  3. An Evaluation of Information Criteria Use for Correct Cross-Classified Random Effects Model Selection

    ERIC Educational Resources Information Center

    Beretvas, S. Natasha; Murphy, Daniel L.

    2013-01-01

    The authors assessed correct model identification rates of Akaike's information criterion (AIC), corrected criterion (AICC), consistent AIC (CAIC), Hannon and Quinn's information criterion (HQIC), and Bayesian information criterion (BIC) for selecting among cross-classified random effects models. Performance of default values for the 5…

  5. Hebbian Learning in a Random Network Captures Selectivity Properties of Prefrontal Cortex.

    PubMed

    Lindsay, Grace W; Rigotti, Mattia; Warden, Melissa R; Miller, Earl K; Fusi, Stefano

    2017-10-11

    Complex cognitive behaviors, such as context-switching and rule-following, are thought to be supported by prefrontal cortex (PFC). Neural activity in PFC must thus be specialized to specific tasks while retaining flexibility. Nonlinear 'mixed' selectivity is an important neurophysiological trait for enabling complex and context-dependent behaviors. Here we investigate (1) the extent to which PFC exhibits computationally-relevant properties such as mixed selectivity and (2) how such properties could arise via circuit mechanisms. We show that PFC cells recorded from male and female rhesus macaques during a complex task show a moderate level of specialization and structure that is not replicated by a model wherein cells receive random feedforward inputs. While random connectivity can be effective at generating mixed selectivity, the data shows significantly more mixed selectivity than predicted by a model with otherwise matched parameters. A simple Hebbian learning rule applied to the random connectivity, however, increases mixed selectivity and allows the model to match the data more accurately. To explain how learning achieves this, we provide analysis along with a clear geometric interpretation of the impact of learning on selectivity. After learning, the model also matches the data on measures of noise, response density, clustering, and the distribution of selectivities. Of two styles of Hebbian learning tested, the simpler and more biologically plausible option better matches the data. These modeling results give intuition about how neural properties important for cognition can arise in a circuit and make clear experimental predictions regarding how various measures of selectivity would evolve during animal training.Significance Statement: Prefrontal cortex (PFC) is a brain region believed to support the ability of animals to engage in complex behavior. 
How neurons in this area respond to stimuli---and in particular, to combinations of stimuli ("mixed selectivity
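    The notion of nonlinear mixed selectivity used in this abstract can be sketched with a toy simulation: a cell is nonlinearly mixed-selective when its response to a conjunction of task variables cannot be written as a sum of responses to the variables alone. The ReLU cells, binary task variables, and Gaussian weights below are all assumptions for illustration, not the authors' actual model.

    ```python
    import random

    random.seed(0)

    def relu(x):
        return max(0.0, x)

    def make_cell():
        # random feedforward weights, as in the random-connectivity null model
        wa, wc, b = random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1)
        return lambda a, c: relu(wa * a + wc * c + b)

    def has_nonlinear_mixed_selectivity(cell, tol=1e-9):
        # a purely linear cell satisfies r(1,1) - r(1,0) - r(0,1) + r(0,0) = 0;
        # a nonzero interaction term signals nonlinear mixed selectivity
        interaction = cell(1, 1) - cell(1, 0) - cell(0, 1) + cell(0, 0)
        return abs(interaction) > tol

    cells = [make_cell() for _ in range(1000)]
    frac_mixed = sum(has_nonlinear_mixed_selectivity(c) for c in cells) / len(cells)
    print(f"fraction with nonlinear mixed selectivity: {frac_mixed:.2f}")
    ```

    With a ReLU nonlinearity, only cells whose four pre-activations all share a sign behave linearly over the conditions, so a sizable fraction of randomly wired cells already shows mixed selectivity.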

  6. Optimization Of Mean-Semivariance-Skewness Portfolio Selection Model In Fuzzy Random Environment

    NASA Astrophysics Data System (ADS)

    Chatterjee, Amitava; Bhattacharyya, Rupak; Mukherjee, Supratim; Kar, Samarjit

    2010-10-01

    The purpose of the paper is to construct a mean-semivariance-skewness portfolio selection model in a fuzzy random environment. The objective is to maximize skewness subject to a predefined maximum risk tolerance and a minimum expected return. The security returns in the objectives and constraints are assumed to be fuzzy random variables; their vagueness is then transformed into fuzzy variables similar to trapezoidal numbers. The resulting fuzzy model is converted into a deterministic optimization model. The feasibility and effectiveness of the proposed method are verified by a numerical example extracted from the Bombay Stock Exchange (BSE). The exact parameters of the fuzzy membership function and probability density function are obtained through fuzzy random simulation of past data.

  7. Adaptive consensus of scale-free multi-agent system by randomly selecting links

    NASA Astrophysics Data System (ADS)

    Mou, Jinping; Ge, Huafeng

    2016-06-01

    This paper investigates an adaptive consensus problem for distributed scale-free multi-agent systems (SFMASs) with randomly selected links, where the degree of each node follows a power-law distribution. Link selection is random in the sense that every agent selects links among its neighbours, with a certain probability, according to the received data. Accordingly, a novel consensus protocol based on the range of the received data is developed, and each node updates its state according to the protocol. Using an iterative method and the Cauchy inequality, the theoretical analysis shows that all errors among agents converge to zero; meanwhile, several consensus criteria are obtained. A numerical example shows the reliability of the proposed methods.
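    The core mechanic of consensus under random link selection can be sketched in a few lines: each step, every agent keeps each neighbour link only with some probability and averages over itself plus the selected neighbours. This toy uses a fully connected graph and a fixed selection probability for simplicity, whereas the paper studies scale-free topologies and data-dependent selection, so treat it as a stand-in.

    ```python
    import random

    random.seed(1)

    n = 20
    state = [random.uniform(0, 10) for _ in range(n)]
    # fully connected graph for simplicity (the paper uses scale-free networks)
    neighbours = {i: [j for j in range(n) if j != i] for i in range(n)}

    def step(state, p=0.3):
        # each agent independently keeps each neighbour link with probability p,
        # then averages over itself plus the selected neighbours
        new = []
        for i in range(n):
            chosen = [j for j in neighbours[i] if random.random() < p]
            group = [state[i]] + [state[j] for j in chosen]
            new.append(sum(group) / len(group))
        return new

    for _ in range(50):
        state = step(state)

    spread = max(state) - min(state)
    print(f"disagreement after 50 steps: {spread:.2e}")
    ```

    Because every update stays inside the current range of states, the spread is non-increasing, and random averaging drives all agents toward a common value.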

  8. Classification of epileptic EEG signals based on simple random sampling and sequential feature selection.

    PubMed

    Ghayab, Hadi Ratham Al; Li, Yan; Abdulla, Shahab; Diykh, Mohammed; Wan, Xiangkui

    2016-06-01

    Electroencephalogram (EEG) signals are used broadly in the medical field. The main applications of EEG signals are the diagnosis and treatment of diseases such as epilepsy, Alzheimer's disease, and sleep disorders. This paper presents a new method that extracts and selects features from multi-channel EEG signals. The research focuses on three main points. First, the simple random sampling (SRS) technique is used to extract features from the time domain of the EEG signals. Second, the sequential feature selection (SFS) algorithm is applied to select the key features and to reduce the dimensionality of the data. Finally, the selected features are forwarded to a least squares support vector machine (LS_SVM) classifier to classify the EEG signals. The experimental results show that the method achieves 99.90%, 99.80% and 100% for classification accuracy, sensitivity and specificity, respectively.
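    The two-stage idea (simple random sampling to summarize the signal, then sequential forward selection of the most discriminative summaries) can be sketched with toy data. The synthetic "signals", the nearest-centroid scorer, and the four summary statistics are all assumptions for illustration; the paper uses an LS_SVM on real EEG.

    ```python
    import random
    import statistics

    random.seed(2)

    def make_signal(label, n=500):
        # toy data: class-1 signals are noisier, mimicking a seizure channel
        scale = 1.0 if label == 0 else 3.0
        return [random.gauss(0, scale) for _ in range(n)]

    def srs_features(signal, k=50):
        # stage 1: simple random sampling, then summary statistics
        sample = random.sample(signal, k)
        return [statistics.mean(sample), statistics.stdev(sample),
                min(sample), max(sample)]

    data = [(srs_features(make_signal(y)), y) for y in [0, 1] * 100]

    def accuracy(idxs):
        # nearest-centroid classification on the chosen feature subset
        if not idxs:
            return 0.0
        centroids = {}
        for y in (0, 1):
            pts = [[f[i] for i in idxs] for f, lab in data if lab == y]
            centroids[y] = [statistics.mean(col) for col in zip(*pts)]
        def dist(v, c):
            return sum((a - b) ** 2 for a, b in zip(v, c))
        hits = sum(min((0, 1), key=lambda y: dist([f[i] for i in idxs],
                                                  centroids[y])) == lab
                   for f, lab in data)
        return hits / len(data)

    # stage 2: sequential (greedy) forward feature selection
    selected, remaining = [], [0, 1, 2, 3]
    while remaining:
        best = max(remaining, key=lambda i: accuracy(selected + [i]))
        if accuracy(selected + [best]) <= accuracy(selected):
            break
        selected.append(best)
        remaining.remove(best)

    print("selected feature indices:", selected, "accuracy:", accuracy(selected))
    ```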

  9. Development of Solution Algorithm and Sensitivity Analysis for Random Fuzzy Portfolio Selection Model

    NASA Astrophysics Data System (ADS)

    Hasuike, Takashi; Katagiri, Hideki

    2010-10-01

    This paper focuses on the proposition of a portfolio selection problem considering an investor's subjectivity and the sensitivity analysis for the change of subjectivity. Since this proposed problem is formulated as a random fuzzy programming problem due to both randomness and subjectivity presented by fuzzy numbers, it is not well-defined. Therefore, introducing Sharpe ratio which is one of important performance measures of portfolio models, the main problem is transformed into the standard fuzzy programming problem. Furthermore, using the sensitivity analysis for fuzziness, the analytical optimal portfolio with the sensitivity factor is obtained.

  10. Response selection in dual task paradigms: observations from random generation tasks.

    PubMed

    Dirnberger, Georg; Jahanshahi, Marjan

    2010-03-01

    Performance of attention-demanding tasks is worse if two tasks are carried out simultaneously than if each of the tasks is performed alone. Our aim was to determine whether these 'dual task costs' can be attributed to mechanisms on a supra-trial level such as switching of limited resources between trials or concurrent breakdown of supervisory functions, or to mechanisms effective within each trial such as demands of response selection. Twenty healthy volunteers performed verbal random number generation (RNG) and random movement generation (RMG) at three different rates. For each rate, both tasks were examined once in a single task condition and once in a dual task condition. Results showed that performance (quality of randomness) in each random generation task (RNG/RMG) was reduced at faster rates and impaired by concurrent performance of a secondary random generation task. In the dual task condition, transient increase or decrease of bias in one random generation task during any short interval was not associated with concurrent increase or decrease of bias in the other task. In conclusion, the fact that during dual task performance transient bias in one task was not associated with concurrent improvement of performance in the other task indicates that alternation of supervisory control or attentional resources from one to the other task does not mediate the observed dual task costs. Resources of the central executive are not re-allocated or 'switched' from one to the other task. Dual task costs may result from mechanisms effective within each trial such as the demands of response selection.
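    "Quality of randomness" in generation tasks is typically scored with sequence statistics. One common choice, sketched below, is the entropy of the digram (response-pair) distribution, which drops when responses follow stereotyped habits such as counting. This is a generic illustration of such a measure, not the specific indices used in the study.

    ```python
    import math
    import random
    from collections import Counter

    random.seed(3)

    def digram_entropy(seq):
        # entropy (bits) of the distribution of adjacent response pairs;
        # lower values indicate more bias/stereotypy in the sequence
        pairs = Counter(zip(seq, seq[1:]))
        total = sum(pairs.values())
        return -sum((c / total) * math.log2(c / total) for c in pairs.values())

    unbiased = [random.randint(1, 9) for _ in range(2000)]
    counting = [1 + (i % 9) for i in range(2000)]   # stereotyped "1,2,3,..." habit

    print(digram_entropy(unbiased), digram_entropy(counting))
    ```

    The unbiased sequence approaches the maximum of log2(81) ≈ 6.34 bits over the 81 possible digrams, while the counting habit collapses to roughly log2(9) ≈ 3.17 bits because only nine digrams ever occur.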

  11. Alternative Modal Basis Selection Procedures For Reduced-Order Nonlinear Random Response Simulation

    NASA Technical Reports Server (NTRS)

    Przekop, Adam; Guo, Xinyun; Rizzi, Stephen A.

    2012-01-01

    Three procedures to guide selection of an efficient modal basis in a nonlinear random response analysis are examined. One method is based only on proper orthogonal decomposition, while the other two additionally involve smooth orthogonal decomposition. Acoustic random response problems are employed to assess the performance of the three modal basis selection approaches. A thermally post-buckled beam exhibiting snap-through behavior, a shallowly curved arch in the auto-parametric response regime and a plate structure are used as numerical test articles. The results of a computationally taxing full-order analysis in physical degrees of freedom are taken as the benchmark for comparison with the results from the three reduced-order analyses. For the cases considered, all three methods are shown to produce modal bases resulting in accurate and computationally efficient reduced-order nonlinear simulations.
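    In proper orthogonal decomposition, modes are ranked by singular value and a reduced basis is commonly chosen as the smallest set capturing a target fraction of the total "energy" (sum of squared singular values). The helper below sketches that truncation rule on a hypothetical decaying spectrum; it is a generic POD illustration, not the paper's specific selection procedures.

    ```python
    def select_pod_basis(singular_values, target=0.999):
        # assumes singular values are sorted in descending order;
        # returns the smallest number of modes capturing `target` energy
        energies = [s * s for s in singular_values]
        total = sum(energies)
        kept, cum = 0, 0.0
        for e in energies:
            cum += e
            kept += 1
            if cum / total >= target:
                break
        return kept

    # e.g. a rapidly decaying POD spectrum (made-up values)
    svals = [100.0, 30.0, 10.0, 3.0, 1.0, 0.3, 0.1]
    print(select_pod_basis(svals))          # -> 3
    ```

    With these values the first three modes already carry 99.9% of the energy, which is why reduced-order models can be dramatically cheaper than the full-order analysis.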

  13. Alternative modal basis selection procedures for reduced-order nonlinear random response simulation

    NASA Astrophysics Data System (ADS)

    Przekop, Adam; Guo, Xinyun; Rizzi, Stephen A.

    2012-08-01

    Three procedures to guide selection of an efficient modal basis in a nonlinear random response analysis are examined. One method is based only on proper orthogonal decomposition, while the other two additionally involve smooth orthogonal decomposition. Acoustic random response problems are employed to assess the performance of the three modal basis selection approaches. A thermally post-buckled beam exhibiting snap-through behavior, a shallowly curved arch in the auto-parametric response regime and a plate structure are used as numerical test articles. The results of a computationally taxing full-order analysis in physical degrees of freedom are taken as the benchmark for comparison with the results from the three reduced-order analyses. For the cases considered, all three methods are shown to produce modal bases resulting in accurate and computationally efficient reduced-order nonlinear simulations.

  14. Random Drift versus Selection in Academic Vocabulary: An Evolutionary Analysis of Published Keywords

    PubMed Central

    Bentley, R. Alexander

    2008-01-01

    The evolution of vocabulary in academic publishing is characterized via keyword frequencies recorded in the ISI Web of Science citations database. In four distinct case studies, evolutionary analysis of keyword frequency change through time is compared to a model of random copying used as the null hypothesis, such that selection may be identified against it. The case studies from the physical sciences indicate greater selection in keyword choice than in the social sciences. Similar evolutionary analyses can be applied to a wide range of phenomena wherever the popularity of multiple items through time has been recorded, as with web searches or sales of popular music and books, for example. PMID:18728786
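    The random-copying null model is simple to simulate: each "author" copies a keyword from the previous generation with probability 1 − μ, or coins a new one with probability μ, so frequency change is driven by drift alone. The sketch below is a generic neutral-copying simulation under assumed parameters, not the authors' fitted model.

    ```python
    import random

    random.seed(4)

    def random_copy_step(population, mu=0.01):
        # each new "author" copies a keyword from the previous generation
        # with probability 1 - mu, or invents a novel one with probability mu
        new = []
        for _ in population:
            if random.random() < mu:
                new.append(object())            # novel keyword
            else:
                new.append(random.choice(population))
        return new

    pop = [object() for _ in range(5)] * 40     # 5 keywords shared by 200 authors
    for _ in range(100):
        pop = random_copy_step(pop)

    n_variants = len(set(pop))
    print("distinct keywords after 100 generations:", n_variants)
    ```

    Against simulated trajectories from this null model, keywords whose frequencies change faster or more consistently than drift allows are the candidates for selection.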

  15. Topology-selective jamming of fully-connected, code-division random-access networks

    NASA Technical Reports Server (NTRS)

    Polydoros, Andreas; Cheng, Unjeng

    1990-01-01

    The purpose is to introduce certain models of topology selective stochastic jamming and examine its impact on a class of fully-connected, spread-spectrum, slotted ALOHA-type random access networks. The theory covers dedicated as well as half-duplex units. The dominant role of the spatial duty factor is established, and connections with the dual concept of time selective jamming are discussed. The optimal choices of coding rate and link access parameters (from the users' side) and the jamming spatial fraction are numerically established for DS and FH spreading.

  16. Gene selection using iterative feature elimination random forests for survival outcomes

    PubMed Central

    Pang, Herbert; George, Stephen L.; Hui, Ken; Tong, Tiejun

    2012-01-01

    Although many feature selection methods for classification have been developed, there is a need to identify genes in high-dimensional data with censored survival outcomes. Traditional methods for gene selection in classification problems have several drawbacks. First, the majority of the gene selection approaches for classification are single-gene based. Second, many of the gene selection procedures are not embedded within the algorithm itself. The technique of random forests has been found to perform well in high dimensional data settings with survival outcomes. It also has an embedded feature to identify variables of importance. Therefore, it is an ideal candidate for gene selection in high dimensional data with survival outcomes. In this paper, we develop a novel method based on the random forests to identify a set of prognostic genes. We compare our method with several machine learning methods and various node split criteria using several real data sets. Our method performed well in both simulations and real data analysis. Additionally, we have shown the advantages of our approach over single-gene based approaches. Our method incorporates multivariate correlations in microarray data for survival outcomes. The described method allows us to best utilize the information available from microarray data with survival outcomes. PMID:22547432

  17. Malaria life cycle intensifies both natural selection and random genetic drift.

    PubMed

    Chang, Hsiao-Han; Moss, Eli L; Park, Daniel J; Ndiaye, Daouda; Mboup, Souleymane; Volkman, Sarah K; Sabeti, Pardis C; Wirth, Dyann F; Neafsey, Daniel E; Hartl, Daniel L

    2013-12-10

    Analysis of genome sequences of 159 isolates of Plasmodium falciparum from Senegal yields an extraordinarily high proportion (26.85%) of protein-coding genes with the ratio of nonsynonymous to synonymous polymorphism greater than one. This proportion is much greater than observed in other organisms. Also unusual is that the site-frequency spectra of synonymous and nonsynonymous polymorphisms are virtually indistinguishable. We hypothesized that the complicated life cycle of malaria parasites might lead to qualitatively different population genetics from that predicted from the classical Wright-Fisher (WF) model, which assumes a single random-mating population with a finite and constant population size in an organism with nonoverlapping generations. This paper summarizes simulation studies of random genetic drift and selection in malaria parasites that take into account their unusual life history. Our results show that random genetic drift in the malaria life cycle is more pronounced than under the WF model. Paradoxically, the efficiency of purifying selection in the malaria life cycle is also greater than under WF, and the relative efficiency of positive selection varies according to conditions. Additionally, the site-frequency spectrum under neutrality is also more skewed toward low-frequency alleles than expected with WF. These results highlight the importance of considering the malaria life cycle when applying existing population genetic tools based on the WF model. The same caveat applies to other species with similarly complex life cycles.
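    The classical Wright-Fisher (WF) baseline the authors compare against combines deterministic selection on allele frequency with binomial resampling (drift). The sketch below implements that baseline with assumed parameter values; the paper's point is that the malaria life cycle adds bottlenecks on top of this and so departs from it.

    ```python
    import random

    random.seed(5)

    def wright_fisher(n=1000, p0=0.5, s=0.05, generations=200):
        # one WF population: selection adjusts the expected frequency,
        # then binomial resampling of n individuals supplies the drift
        p = p0
        for _ in range(generations):
            w = p * (1 + s) / (p * (1 + s) + (1 - p))
            p = sum(random.random() < w for _ in range(n)) / n
            if p in (0.0, 1.0):
                break
        return p

    final = [wright_fisher() for _ in range(20)]
    wins = sum(p > 0.9 for p in final)
    print(f"favoured allele near fixation in {wins}/20 replicates")
    ```

    With 2Ns well above 1 the favoured allele almost always sweeps; shrinking n (as life-cycle bottlenecks effectively do) makes drift dominate and outcomes far noisier.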

  18. Fuzzy Random λ-Mean SAD Portfolio Selection Problem: An Ant Colony Optimization Approach

    NASA Astrophysics Data System (ADS)

    Thakur, Gour Sundar Mitra; Bhattacharyya, Rupak; Mitra, Swapan Kumar

    2010-10-01

    To reach an investment goal, one has to select a combination of securities from portfolios containing a large number of securities. Past records of each security alone do not guarantee future returns. As there are many uncertain factors that directly or indirectly influence the stock market, and some newer stock markets lack sufficient historical data, experts' expectations and experience must be combined with past records to generate an effective portfolio selection model. In this paper, the return of a security is assumed to be a Fuzzy Random Variable Set (FRVS), where returns are sets of random numbers that are in turn fuzzy numbers. A new λ-Mean Semi Absolute Deviation (λ-MSAD) portfolio selection model is developed. The subjective opinions of investors on the rate of return of each security are taken into consideration by introducing a pessimistic-optimistic parameter vector λ. The λ-MSAD model is preferred because it uses the absolute deviation of the rate of return of a portfolio, instead of the variance, as the measure of risk. As this model can be reduced to a Linear Programming Problem (LPP), it can be solved much faster than quadratic programming problems. Ant Colony Optimization (ACO), a paradigm for designing meta-heuristic algorithms for combinatorial optimization problems, is used to solve the portfolio selection problem. Data from the Bombay Stock Exchange (BSE) are used for illustration.
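    The risk measure behind this model, the (mean) semi-absolute deviation, penalizes only returns that fall below the mean and is piecewise linear, which is what lets the model reduce to a linear program. The scenario returns and weights below are made-up numbers for illustration; only the formula is taken from the abstract's description.

    ```python
    def portfolio_returns(weights, scenario_returns):
        # portfolio return in each scenario (row) as a weighted sum
        return [sum(w * r for w, r in zip(weights, row)) for row in scenario_returns]

    def mean_semi_absolute_deviation(returns):
        # average shortfall below the mean return; linear, unlike variance
        mu = sum(returns) / len(returns)
        return sum(max(mu - r, 0.0) for r in returns) / len(returns)

    # three securities, four hypothetical return scenarios (rows)
    scenarios = [
        [0.02, 0.05, -0.01],
        [0.01, -0.02, 0.03],
        [0.03, 0.04, 0.00],
        [-0.01, 0.01, 0.02],
    ]
    w = [0.5, 0.3, 0.2]
    rets = portfolio_returns(w, scenarios)
    print(f"mean SAD: {mean_semi_absolute_deviation(rets):.6f}")
    ```

    In the full model this quantity enters the constraints/objective linearly in the weights, so standard LP solvers (or, as here, ACO as a metaheuristic) can search over portfolios efficiently.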

  19. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology

    PubMed Central

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C.; Hobbs, Brian P.; Berry, Donald A.; Pentz, Rebecca D.; Tam, Alda; Hong, Waun K.; Ellis, Lee M.; Abbruzzese, James; Overman, Michael J.

    2015-01-01

    Purpose: The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. Methods: We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. Results: A total of 86 primary end points were reported in 74 randomized trials; nine trials had more than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Twenty-eight studies (37.8%) reported a total of 65 unplanned end points, 52 (80.0%) of which were not identified as unplanned. Thirty-one (41.9%) and 19 (25.7%) of 74 trials reported a total of 52 unplanned analyses involving primary end points and 33 unplanned analyses involving nonprimary end points, respectively. Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts (unplanned end points odds ratio, 6.8; P = .002; unplanned analyses odds ratio, 8.4; P = .007). Conclusion: Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. PMID:26304898

  20. Developmental contributions to macronutrient selection: a randomized controlled trial in adult survivors of malnutrition

    PubMed Central

    Campbell, Claudia P.; Raubenheimer, David; Badaloo, Asha V.; Gluckman, Peter D.; Martinez, Claudia; Gosby, Alison; Simpson, Stephen J.; Osmond, Clive; Boyne, Michael S.; Forrester, Terrence E.

    2016-01-01

    Background and objectives: Birthweight differences between kwashiorkor and marasmus suggest that intrauterine factors influence the development of these syndromes of malnutrition and may modulate risk of obesity through dietary intake. We tested the hypotheses that the target protein intake in adulthood is associated with birthweight, and that protein leveraging to maintain this target protein intake would influence energy intake (EI) and body weight in adult survivors of malnutrition. Methodology: Sixty-three adult survivors of marasmus and kwashiorkor could freely compose a diet from foods containing 10%, 15% and 25% of energy derived from protein (PEP) for 3 days (Phase 1). Participants were then randomized in Phase 2 (5 days) to diets with PEP fixed at 10%, 15% or 25%. Results: Self-selected PEP was similar in both groups. In the groups combined, selected PEP was 14.7%, which differed significantly (P < 0.0001) from the null expectation (16.7%) of no selection. Self-selected PEP was inversely related to birthweight, the effect disappearing after adjusting for sex and current body weight. In Phase 2, PEP correlated inversely with EI (P = 0.002) and with weight change from Phase 1 to 2 (P = 0.002). Protein intake increased with increasing PEP, but to a lesser extent than energy intake increased with decreasing PEP. Conclusions and implications: Macronutrient intakes were not independently related to birthweight or diagnosis. In a free-choice situation (Phase 1), subjects selected a dietary PEP significantly lower than random. Lower-PEP diets induce increased energy and decreased protein intake, and are associated with weight gain. PMID:26817484

  1. Selection of informative metabolites using random forests based on model population analysis.

    PubMed

    Huang, Jian-Hua; Yan, Jun; Wu, Qing-Hua; Duarte Ferro, Miguel; Yi, Lun-Zhao; Lu, Hong-Mei; Xu, Qing-Song; Liang, Yi-Zeng

    2013-12-15

    One of the main goals of metabolomics studies is to discover informative metabolites or biomarkers, which may be used to diagnose diseases and to elucidate pathology. Sophisticated feature selection approaches are required to extract the information hidden in such complex 'omics' data. In this study, we propose a new and robust selection method, combining random forests (RF) with model population analysis (MPA), for selecting informative metabolites from three metabolomic datasets. According to their contribution to the classification accuracy, the metabolites were classified into three kinds: informative, non-informative, and interfering metabolites. Based on the proposed method, informative metabolites were selected for the three datasets; further analyses of these metabolites between healthy and diseased groups were then performed, with t-tests showing that the P values for all the selected metabolites were lower than 0.05. Moreover, the informative metabolites identified by the current method were demonstrated to be correlated with the clinical outcome under investigation. The source code of MPA-RF in Matlab can be freely downloaded from http://code.google.com/p/my-research-list/downloads/list.

  2. Estimating genetic architectures from artificial-selection responses: a random-effect framework.

    PubMed

    Le Rouzic, Arnaud; Skaug, Hans J; Hansen, Thomas F

    2010-03-01

    Artificial-selection experiments on plants and animals generate large datasets reporting phenotypic changes over time. The dynamics of the changes reflect the underlying genetic architecture, but only simple statistical tools have so far been available to analyze such time series. This manuscript describes a general statistical framework based on random-effect models that aims at estimating key parameters of genetic architectures from artificial-selection responses. We derive explicit Mendelian models (in which the genetic architecture relies on one or two large-effect loci) and compare them with classical polygenic models. With simulations, we show that the models are accurate and powerful enough to provide useful estimates from realistic experimental designs, and we demonstrate that model selection is effective in distinguishing few-locus from polygenic genetic architectures even from medium-quality artificial-selection data. The method is illustrated by the analysis of a historical selection experiment on coat color pattern in rats, carried out by Castle et al.
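    The polygenic baseline such frameworks compare against is the infinitesimal model, in which the response to truncation selection follows the breeder's equation R = h²S (S being the selection differential). The simulation below sketches that baseline with assumed parameter values (heritability, truncation fraction, unit phenotypic variance); it is not the authors' estimation framework.

    ```python
    import random
    import statistics

    random.seed(6)

    def truncation_selection(pop_size=2000, h2=0.4, keep_frac=0.2, generations=10):
        # infinitesimal-model sketch: phenotype = genetic mean + N(0, 1) noise;
        # each generation the top keep_frac reproduce, and the genetic mean
        # responds by R = h2 * S (breeder's equation)
        mean_g = 0.0
        means = [mean_g]
        for _ in range(generations):
            phenos = [mean_g + random.gauss(0, 1) for _ in range(pop_size)]
            cutoff = sorted(phenos, reverse=True)[int(pop_size * keep_frac)]
            parents = [p for p in phenos if p >= cutoff]
            S = statistics.mean(parents) - statistics.mean(phenos)
            mean_g += h2 * S
            means.append(mean_g)
        return means

    traj = truncation_selection()
    print(f"cumulative response after 10 generations: {traj[-1]:.2f}")
    ```

    A steady, roughly linear response like this is the polygenic signature; one or two large-effect loci instead produce step-like or plateauing trajectories, which is what lets model selection tell the architectures apart.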

  3. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology.

    PubMed

    Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H

    2017-07-01

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables is used, or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. 
A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in
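    The backward-elimination loop evaluated in this paper can be sketched generically: repeatedly drop the least important predictor, re-score the model, and finally pick the smallest set whose score is within a tolerance of the best. Here `fit_and_score` and `importances` are hypothetical stand-ins for an RF fit and its importance measures, with a toy additive "signal" in place of real data.

    ```python
    def backward_eliminate(features, fit_and_score, importances, min_keep=2):
        history = []
        feats = list(features)
        while len(feats) >= min_keep:
            history.append((list(feats), fit_and_score(feats)))
            if len(feats) == min_keep:
                break
            ranked = importances(feats)          # most important first
            feats.remove(ranked[-1])             # drop the least important
        # smallest set whose score is within a tolerance of the best score
        best = max(score for _, score in history)
        candidates = [(f, s) for f, s in history if s >= best - 0.01]
        return min(candidates, key=lambda fs: len(fs[0]))[0]

    # toy stand-ins: features "a" and "b" carry signal, the rest are noise
    signal = {"a": 0.5, "b": 0.3}
    def fit_and_score(feats):
        return sum(signal.get(f, 0.0) for f in feats)
    def importances(feats):
        return sorted(feats, key=lambda f: signal.get(f, 0.0), reverse=True)

    print(backward_eliminate(["a", "b", "c", "d", "e"], fit_and_score, importances))
    ```

    The paper's caution applies directly to loops like this: if the scores in `history` come from the same data used to rank and drop variables (e.g. out-of-bag estimates inside the selection loop), they are optimistically biased, so the final score should come from validation folds kept external to the whole procedure.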

  4. Identification of residues critical for metallo-β-lactamase function by codon randomization and selection

    PubMed Central

    Materon, Isabel C.; Palzkill, Timothy

    2001-01-01

    IMP-1 β-lactamase is a zinc metallo-enzyme encoded by the transferable blaIMP-1 gene, which confers resistance to virtually all β-lactam antibiotics including carbapenems. To understand how IMP-1 recognizes and hydrolyzes β-lactam antibiotics it is important to determine which amino acid residues are critical for catalysis and which residues control substrate specificity. We randomized 27 individual codons in the blaIMP-1 gene to create libraries that contain all possible amino acid substitutions at residue positions in and near the active site of IMP-1. Mutants from the random libraries were selected for the ability to confer ampicillin resistance to Escherichia coli. Of the positions randomized, >50% do not tolerate amino acid substitutions, suggesting they are essential for IMP-1 function. The remaining positions tolerate amino acid substitutions and may influence the substrate specificity of the enzyme. Interestingly, kinetic studies for one of the functional mutants, Asn233Ala, indicate that an alanine substitution at this position significantly increases catalytic efficiency as compared with the wild-type enzyme. PMID:11714924

  5. Feature selection for outcome prediction in oesophageal cancer using genetic algorithm and random forest classifier.

    PubMed

    Paul, Desbordes; Su, Ruan; Romain, Modzelewski; Sébastien, Vauclin; Pierre, Vera; Isabelle, Gardin

    2016-12-28

    The outcome prediction of patients can greatly help to personalize cancer treatment. A large amount of quantitative features (clinical exams, imaging, …) is potentially useful for assessing patient outcome. The challenge is to choose the most predictive subset of features. In this paper, we propose a new feature selection strategy called GARF (genetic algorithm based on random forest), applied to features extracted from positron emission tomography (PET) images and clinical data. The most relevant features, predictive of the therapeutic response or prognostic of patient survival 3 years after the end of treatment, were selected using GARF on a cohort of 65 patients with locally advanced oesophageal cancer eligible for chemo-radiation therapy. The most relevant predictive results were obtained with a subset of 9 features, leading to a random forest misclassification rate of 18±4% and an area under the receiver operating characteristic (ROC) curve (AUC) of 0.823±0.032. The most relevant prognostic results were obtained with 8 features, leading to an error rate of 20±7% and an AUC of 0.750±0.108. Both predictive and prognostic results show better performances using GARF than using the 4 other studied methods.
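    The GA-over-feature-subsets idea can be sketched with a bitmask genome: each individual encodes a feature subset, and a fitness function (in GARF, the random forest error rate) scores each subset. Below, `fitness` is a toy stand-in that rewards a known set of "informative" features and penalizes subset size; the population sizes, rates, and ground truth are all made up for illustration.

    ```python
    import random

    random.seed(7)

    N_FEATURES = 12
    GOOD = {0, 3, 7}                 # toy ground truth: informative features

    def fitness(mask):
        # stand-in for a random forest score on the chosen subset
        chosen = {i for i, bit in enumerate(mask) if bit}
        return len(chosen & GOOD) - 0.1 * len(chosen)

    def mutate(mask, rate=0.1):
        return tuple(b ^ (random.random() < rate) for b in mask)

    def crossover(a, b):
        cut = random.randrange(1, N_FEATURES)
        return a[:cut] + b[cut:]

    pop = [tuple(random.randint(0, 1) for _ in range(N_FEATURES))
           for _ in range(30)]
    for _ in range(40):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:10]                     # elitist selection
        pop = survivors + [mutate(crossover(random.choice(survivors),
                                            random.choice(survivors)))
                           for _ in range(20)]

    best = max(pop, key=fitness)
    print("selected features:", sorted(i for i, b in enumerate(best) if b))
    ```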

  6. Trends of periodontal conditions in two different randomly selected Swiss (Bernese) cohorts 25 years apart.

    PubMed

    Schürch, Ernst; Dulla, Joëlle A; Bürgin, Walter; Lussi, Adrian; Lang, Niklaus P

    2015-10-01

    To assess the periodontal conditions of two randomly selected Swiss cohorts 25 years apart. Standardized examinations were performed to assess the periodontal conditions of two randomly selected populations of the Canton of Bern; oral cleanliness was evaluated using the plaque index (PlI) and the retention index (RI). Gingival health was scored according to the gingival index (GI). Periodontal conditions were evaluated by pocket probing depth (PPD) and loss of attachment (LA). At the first examination in 1985, 206 out of 350 subjects were evaluated, while at the second examination in 2010, 134 out of 490 subjects attended. In 1985, subjects showed a mean PlI of 1.16, compared with 0.77 in 2010. The RI was 0.81 in 1985 and 0.36 in 2010. Mean GI was 1.34 and 0.6, respectively. The mean proportion of PPD ≤3 mm was 72% in 1985 and 97.3% in 2010. PPD ≥6 mm affected 2.0% in 1985 and 0.3% in 2010. In 1985, subjects had an average of 20.7 teeth, while in 2010 the average was 24.6. In 1985, 7.3% of the subjects were edentulous, while in 2010, 4.5% had no teeth. Trends toward improvement, resulting in more teeth in function and better periodontal conditions, were recognized. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. Distribution of Orientation Selectivity in Recurrent Networks of Spiking Neurons with Different Random Topologies

    PubMed Central

    Sadeh, Sadra; Rotter, Stefan

    2014-01-01

    Neurons in the primary visual cortex are more or less selective for the orientation of a light bar used for stimulation. A broad distribution of individual grades of orientation selectivity has in fact been reported in all species. A possible reason for emergence of broad distributions is the recurrent network within which the stimulus is being processed. Here we compute the distribution of orientation selectivity in randomly connected model networks that are equipped with different spatial patterns of connectivity. We show that, for a wide variety of connectivity patterns, a linear theory based on firing rates accurately approximates the outcome of direct numerical simulations of networks of spiking neurons. Distance dependent connectivity in networks with a more biologically realistic structure does not compromise our linear analysis, as long as the linearized dynamics, and hence the uniform asynchronous irregular activity state, remain stable. We conclude that linear mechanisms of stimulus processing are indeed responsible for the emergence of orientation selectivity and its distribution in recurrent networks with functionally heterogeneous synaptic connectivity. PMID:25469704
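    Individual "grades" of orientation selectivity are commonly quantified with a circular-variance-based orientation selectivity index (OSI), |Σₖ rₖ e^{2iθₖ}| / Σₖ rₖ, which is 0 for a flat tuning curve and 1 for a perfectly sharp one. The sketch below shows the measure on two made-up tuning curves; it is a standard metric, not necessarily the exact one used in this paper.

    ```python
    import math

    def osi(rates, thetas_deg):
        # circular-variance-based orientation selectivity index;
        # the factor 2 accounts for the 180-degree periodicity of orientation
        num = complex(0.0, 0.0)
        for r, th in zip(rates, thetas_deg):
            num += r * complex(math.cos(2 * math.radians(th)),
                               math.sin(2 * math.radians(th)))
        return abs(num) / sum(rates)

    thetas = [0, 45, 90, 135]
    flat = [10, 10, 10, 10]          # unselective cell
    tuned = [20, 5, 1, 5]            # prefers 0 degrees
    print(f"flat OSI: {osi(flat, thetas):.2f}, tuned OSI: {osi(tuned, thetas):.2f}")
    ```

    Computing this index for every neuron in a simulated recurrent network yields exactly the kind of selectivity distribution whose shape the paper compares across connectivity patterns.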

  8. The selection and design of control conditions for randomized controlled trials of psychological interventions.

    PubMed

    Mohr, David C; Spring, Bonnie; Freedland, Kenneth E; Beckner, Victoria; Arean, Patricia; Hollon, Steven D; Ockene, Judith; Kaplan, Robert

    2009-01-01

    The randomized controlled trial (RCT) provides critical support for evidence-based practice using psychological interventions. The control condition is the principal method of removing the influence of unwanted variables in RCTs. There is little agreement or consistency in the design and construction of control conditions. Because control conditions have variable effects, the results of RCTs can depend as much on control condition selection as on the experimental intervention. The aim of this paper is to present a framework for the selection and design of control conditions for these trials. Threats to internal validity arising from modern RCT methodology are reviewed and reconsidered. The strengths and weaknesses of several categories of control conditions are examined, including the ones that are under experimental control, the ones that are under the control of clinical service providers, and no-treatment controls. Considerations in the selection of control conditions are discussed and several recommendations are proposed. The aim of this paper is to begin to define principles by which control conditions can be selected or developed in a manner that can assist both investigators and grant reviewers. Copyright 2009 S. Karger AG, Basel.

  9. A Prescriptively Selected Nonthrust Manipulation Versus a Therapist-Selected Nonthrust Manipulation for Treatment of Individuals With Low Back Pain: A Randomized Clinical Trial.

    PubMed

    Donaldson, Megan; Petersen, Shannon; Cook, Chad; Learman, Ken

    2016-04-01

    Randomized controlled trial. Several studies that have investigated the effects of a therapist-selected versus a randomly assigned segmental approach have looked at immediate effects only for pain-related outcomes. To examine differences in outcomes following a therapist-selected nonthrust manipulation versus a prescriptively selected nonthrust manipulation in subjects with low back pain. Subjects with mechanically producible low back pain were randomly treated with nonthrust manipulation in a therapist-selected approach or a prescriptively selected approach. All subjects received a standardized home exercise program. Outcome measures included pain, disability, global rating of change, and patient acceptable symptom state. Analyses of covariance, chi-square tests, and Mann-Whitney U tests were used to determine differences between groups. Sixty-three subjects were tracked for 6 months, during which subjects in both groups significantly improved. There were no differences between groups in pain, disability, or patient acceptable symptom state scores at 6 months. There was a significant difference in global rating of change scores favoring the therapist-selected manipulation group at 6 months. This study measured long-term differences between a prescriptively selected nonthrust manipulation and a therapist-selected approach to nonthrust manipulation. There were no differences in pain, disability, or patient acceptable symptom state, findings similar to those of studies of immediate effects. After 6 months, perceived well-being was significantly higher for those in the therapist-selected treatment group. The study was registered at ClinicalTrials.gov (NCT01940744). Therapy, level 1b.

  10. Using interviewer random effects to remove selection bias from HIV prevalence estimates.

    PubMed

    McGovern, Mark E; Bärnighausen, Till; Salomon, Joshua A; Canning, David

    2015-02-05

    Selection bias in HIV prevalence estimates occurs if non-participation in testing is correlated with HIV status. Longitudinal data suggests that individuals who know or suspect they are HIV positive are less likely to participate in testing in HIV surveys, in which case methods to correct for missing data which are based on imputation and observed characteristics will produce biased results. The identity of the HIV survey interviewer is typically associated with HIV testing participation, but is unlikely to be correlated with HIV status. Interviewer identity can thus be used as a selection variable allowing estimation of Heckman-type selection models. These models produce asymptotically unbiased HIV prevalence estimates, even when non-participation is correlated with unobserved characteristics, such as knowledge of HIV status. We introduce a new random effects method to these selection models which overcomes non-convergence caused by collinearity, small sample bias, and incorrect inference in existing approaches. Our method is easy to implement in standard statistical software, and allows the construction of bootstrapped standard errors which adjust for the fact that the relationship between testing and HIV status is uncertain and needs to be estimated. Using nationally representative data from the Demographic and Health Surveys, we illustrate our approach with new point estimates and confidence intervals (CI) for HIV prevalence among men in Ghana (2003) and Zambia (2007). In Ghana, we find little evidence of selection bias as our selection model gives an HIV prevalence estimate of 1.4% (95% CI 1.2% - 1.6%), compared to 1.6% among those with a valid HIV test. In Zambia, our selection model gives an HIV prevalence estimate of 16.3% (95% CI 11.0% - 18.4%), compared to 12.1% among those with a valid HIV test. Therefore, those who decline to test in Zambia are found to be more likely to be HIV positive. Our approach corrects for selection bias in HIV prevalence
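
The correction term at the heart of such Heckman-type selection models can be written down directly. Below is a minimal, hypothetical sketch (not the authors' implementation, which handles a binary outcome and interviewer random effects) of the inverse Mills ratio φ(z)/Φ(z), the term the classic linear-outcome two-step estimator appends to the outcome model to adjust for non-random participation:

```python
import math

def inverse_mills(z):
    """Inverse Mills ratio phi(z)/Phi(z): the correction regressor used in
    classic Heckman two-step selection models, evaluated at the fitted
    selection-equation index z from a first-stage probit."""
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal density
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal CDF
    return pdf / cdf
```

In the two-step procedure, this ratio is computed from the probit selection equation (here, with interviewer identity among the selection variables) and added as an extra regressor in the second-stage outcome model; a significant coefficient on it signals selection on unobservables.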

  11. Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.

    PubMed

    Ma, Yunbei; Zhou, Xiao-Hua

    2017-02-01

    For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and the difference between covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and the difference between each pair of covariate-specific treatment effect curves over a fixed interval of biomarker values. We constructed the confidence bands based on a resampling technique. We also conducted simulation studies to evaluate finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed method in a real-world data set.
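
A percentile bootstrap is one common resampling technique for interval estimates of a treatment-effect difference. The sketch below is a simplified, hypothetical illustration only: it is pointwise (one biomarker stratum at a time) and treats the outcome as a plain numeric endpoint, ignoring the censoring that the authors' time-to-event methods must handle.

```python
import random
import statistics

def bootstrap_diff_ci(outcomes_a, outcomes_b, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for the difference in mean
    outcome between treatment arms A and B within one biomarker stratum
    (a toy, pointwise analogue of a covariate-specific effect comparison)."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        # Resample each arm with replacement and record the mean difference.
        resample_a = rng.choices(outcomes_a, k=len(outcomes_a))
        resample_b = rng.choices(outcomes_b, k=len(outcomes_b))
        diffs.append(statistics.fmean(resample_a) - statistics.fmean(resample_b))
    diffs.sort()
    lo = diffs[int((alpha / 2) * n_boot)]
    hi = diffs[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Repeating this at each biomarker value (with a multiplicity adjustment) is the intuition behind a simultaneous confidence band over an interval of biomarker values.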

  12. Selection of DNA binding sites for zinc fingers using rationally randomized DNA reveals coded interactions.

    PubMed Central

    Choo, Y; Klug, A

    1994-01-01

    In the preceding paper [Choo, Y. & Klug, A. (1994) Proc. Natl. Acad. Sci. USA 91, 11163-11167], we showed how selections from a library of zinc fingers displayed on phage yielded fingers able to bind to a number of DNA triplets. Here, we describe a technique to deal efficiently with the converse problem--namely, the selection of a DNA binding site for a given zinc finger. This is done by screening against libraries of DNA triplet binding sites randomized in two positions but having one base fixed in the third position. The technique is applied here to determine the specificity of fingers previously selected by phage display. We find that some of these fingers are able to specify a unique base in each position of the cognate triplet. This is further illustrated by examples of fingers which can discriminate between closely related triplets as measured by their respective equilibrium dissociation constants. Comparing the amino acid sequences of fingers which specify a particular base in a triplet, we infer that in most instances, sequence-specific binding of zinc fingers to DNA can be achieved by using a small set of amino acid-nucleotide base contacts amenable to a code. PMID:7972028
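
The partially randomized libraries described here, with two fully variable positions and one fixed base, are small enough to enumerate exhaustively. A minimal sketch (function name hypothetical):

```python
from itertools import product

BASES = "ACGT"

def triplet_library(fixed_pos, fixed_base):
    """All DNA triplets with one base fixed at position fixed_pos (0-2)
    and the other two positions fully randomized, mirroring the
    partially randomized binding-site libraries described above."""
    triplets = []
    for combo in product(BASES, repeat=2):   # 4 x 4 = 16 variable-position combinations
        bases = list(combo)
        bases.insert(fixed_pos, fixed_base)  # pin the fixed base into place
        triplets.append("".join(bases))
    return triplets
```

For example, fixing G in the first position yields the 16-member sublibrary GNN; screening a finger against the four such sublibraries per position reveals which base it specifies there.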

  13. Direct selection of targeted adenovirus vectors by random peptide display on the fiber knob.

    PubMed

    Miura, Y; Yoshida, K; Nishimoto, T; Hatanaka, K; Ohnami, S; Asaka, M; Douglas, J T; Curiel, D T; Yoshida, T; Aoki, K

    2007-10-01

    Targeting of gene transfer at the level of cell entry is one of the most attractive challenges in vector development. However, attempts to redirect adenovirus vectors to alternative receptors by engineering the capsid-coding region have shown limited success because proper targeting ligand-receptor systems on the cells of interest are generally unknown. Systematic approaches to generate adenovirus vectors targeting any given cell type need to be developed to achieve this goal. Here, we constructed an adenovirus library that was generated by a Cre-lox-mediated in vitro recombination between an adenoviral fiber-modified plasmid library and genomic DNA to display random peptides on a fiber knob. As proof of concept, we screened the adenovirus display library on a glioma cell line and observed selection of several particular peptide sequences. The targeted vector carrying the most frequently isolated peptide significantly enhanced gene transduction in the glioma cell line but not in many other cell lines. Because the insertion of a pre-selected peptide into a fiber knob often fails to generate an adenovirus vector, the selection of targeting peptides is highly useful in the context of the adenoviral capsid. This vector-screening system can facilitate the development of a targeted adenovirus vector for a variety of applications in medicine.

  14. Selection of Unique Escherichia coli Clones by Random Amplified Polymorphic DNA (RAPD): Evaluation by Whole Genome Sequencing

    PubMed Central

    Nielsen, Karen L.; Godfrey, Paul A.; Stegger, Marc; Andersen, Paal S.; Feldgarden, Michael; Frimodt-Møller, Niels

    2014-01-01

    Identifying and characterizing clonal diversity is important when analysing fecal flora. We evaluated random amplified polymorphic DNA (RAPD) PCR, applied for selection of Escherichia coli isolates, by whole genome sequencing. RAPD was fast and reproducible as a screening method for selection of distinct E. coli clones in fecal swabs. PMID:24912108

  15. Selective outcome reporting and sponsorship in randomized controlled trials in IVF and ICSI.

    PubMed

    Braakhekke, M; Scholten, I; Mol, F; Limpens, J; Mol, B W; van der Veen, F

    2017-10-01

    Are randomized controlled trials (RCTs) on IVF and ICSI subject to selective outcome reporting and is this related to sponsorship? There are inconsistencies, independent from sponsorship, in the reporting of primary outcome measures in the majority of IVF and ICSI trials, indicating selective outcome reporting. RCTs are subject to bias at various levels. Of these biases, selective outcome reporting is particularly relevant to IVF and ICSI trials since there is a wide variety of outcome measures to choose from. An established cause of reporting bias is sponsorship. It is, at present, unknown whether RCTs in IVF/ICSI are subject to selective outcome reporting and whether this is related to sponsorship. We systematically searched RCTs on IVF and ICSI published between January 2009 and March 2016 in MEDLINE, EMBASE, the Cochrane Central Register of Controlled Trials and the publisher subset of PubMed. We analysed 415 RCTs. Per included RCT, we extracted data on impact factor of the journal, sample size, power calculation, and trial registry and thereafter data on primary outcome measure, the direction of trial results and sponsorship. Of the 415 identified RCTs, 235 were excluded for our primary analysis, because the sponsorship was not reported. Of the 180 RCTs included in our analysis, 7 trials did not report on any primary outcome measure and 107 of the remaining 173 trials (62%) reported on surrogate primary outcome measures. Of the 114 registered trials, 21 trials (18%) provided primary outcomes in their manuscript that were different from those in the trial registry. This indicates selective outcome reporting. We found no association between selective outcome reporting and sponsorship. We ran additional analyses to include the trials that had not reported sponsorship and found no outcomes that differed from our primary analysis. Since the majority of the trials did not report on sponsorship, there is a risk of sampling bias.
IVF and ICSI trials are subject to

  16. A randomized clinical trial of selective laser trabeculoplasty versus argon laser trabeculoplasty in patients with pseudoexfoliation.

    PubMed

    Kent, Shefalee S; Hutnik, Cindy M L; Birt, Catherine M; Damji, Karim F; Harasymowycz, Paul; Si, Francie; Hodge, William; Pan, Irene; Crichton, Andrew

    2015-01-01

    To evaluate the efficacy of selective laser trabeculoplasty (SLT) versus argon laser trabeculoplasty (ALT) in lowering the intraocular pressure (IOP) in patients with open-angle glaucoma or ocular hypertension secondary to pseudoexfoliation. Multicentered randomized clinical trial. A total of 76 eyes from 60 patients with pseudoexfoliation and uncontrolled IOP were recruited from 5 Canadian academic institutions. Patients with prior laser trabeculoplasty, ocular surgery within 6 months, previous glaucoma surgery, an advanced visual field defect, or current steroid use, as well as monocular patients, were excluded. Eyes were randomized to receive either 180-degree SLT or 180-degree ALT by a nonblocked randomization schedule stratified by center. The primary outcome was the change in IOP at 6 months versus baseline and secondary outcomes included change in number of glaucoma medications after laser. Baseline variables included age, sex, angle grade, angle pigmentation, and number of glaucoma medications. Of the 76 eyes, 45 eyes received SLT and 31 eyes received ALT. The mean age was 72.9 years (65% female). The baseline IOPs in the SLT and ALT groups were 23.1 and 25.2 mm Hg, respectively (P=0.03). The IOP reduction 6 months after SLT was -6.8 mm Hg and post-ALT was -7.7 mm Hg (P>0.05). The SLT group had reduced glaucoma medications by 0.16 medications at 6 months and the ALT group had no decrease in medications over the same time period (P=0.59). There were no postlaser IOP spikes in either group. ALT and SLT are equivalent in lowering IOP at 6 months posttreatment in patients with PXF.

  17. Implications of structural genomics target selection strategies: Pfam5000, whole genome, and random approaches

    SciTech Connect

    Chandonia, John-Marc; Brenner, Steven E.

    2004-07-14

    The structural genomics project is an international effort to determine the three-dimensional shapes of all important biological macromolecules, with a primary focus on proteins. Target proteins should be selected according to a strategy which is medically and biologically relevant, of good value, and tractable. As an option to consider, we present the Pfam5000 strategy, which involves selecting the 5000 most important families from the Pfam database as sources for targets. We compare the Pfam5000 strategy to several other proposed strategies that would require similar numbers of targets. These include complete solution of several small to moderately sized bacterial proteomes, partial coverage of the human proteome, and random selection of approximately 5000 targets from sequenced genomes. We measure the impact that successful implementation of these strategies would have upon structural interpretation of the proteins in Swiss-Prot, TrEMBL, and 131 complete proteomes (including 10 of eukaryotes) from the Proteome Analysis database at EBI. Solving the structures of proteins from the 5000 largest Pfam families would allow accurate fold assignment for approximately 68 percent of all prokaryotic proteins (covering 59 percent of residues) and 61 percent of eukaryotic proteins (40 percent of residues). More fine-grained coverage, which would allow accurate modeling of these proteins, would require an order of magnitude more targets. The Pfam5000 strategy may be modified in several ways, for example to focus on larger families, bacterial sequences, or eukaryotic sequences; as long as secondary consideration is given to large families within Pfam, coverage results vary only slightly. In contrast, focusing structural genomics on a single tractable genome would have only a limited impact in structural knowledge of other proteomes: a significant fraction (about 30-40 percent of the proteins, and 40-60 percent of the residues) of each proteome is classified in small
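
The coverage calculation underlying the Pfam5000 argument can be illustrated with a toy greedy computation: solve the largest families first and count the member proteins that gain a structural template. A hypothetical sketch, not the authors' pipeline:

```python
def coverage_from_largest_families(family_sizes, n_families):
    """Fraction of proteins that would gain a structural template if the
    n_families largest families were solved (toy version of the Pfam5000
    coverage argument). family_sizes maps family name -> member count."""
    total = sum(family_sizes.values())
    # Greedy choice: take the families with the most member proteins.
    chosen = sorted(family_sizes.values(), reverse=True)[:n_families]
    return sum(chosen) / total
```

The long tail of small families is exactly why coverage saturates: after the largest families are taken, each additional target buys structural knowledge for very few further proteins.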

  18. Multilabel learning via random label selection for protein subcellular multilocations prediction.

    PubMed

    Wang, Xiao; Li, Guo-Zheng

    2013-01-01

    Prediction of protein subcellular localization is an important but challenging problem, particularly when proteins may simultaneously exist at, or move between, two or more different subcellular location sites. Most of the existing protein subcellular localization methods are only used to deal with the single-location proteins. In the past few years, only a few methods have been proposed to tackle proteins with multiple locations. However, they only adopt a simple strategy, that is, transforming the multilocation proteins to multiple proteins with single location, which does not take correlations among different subcellular locations into account. In this paper, a novel method, multilabel learning via random label selection (RALS), which extends the simple binary relevance (BR) method, is proposed to learn from multilocation proteins in an effective and efficient way. RALS does not explicitly find the correlations among labels, but rather implicitly attempts to learn the label correlations from data by augmenting the original feature space with randomly selected labels as additional input features. Through the fivefold cross-validation test on a benchmark data set, we demonstrate that our proposed method, with consideration of label correlations, clearly outperforms the baseline BR method without consideration of label correlations, indicating that correlations among different subcellular locations really exist and contribute to improvement of prediction performance. Experimental results on two benchmark data sets also show that our proposed methods achieve significantly higher performance than some other state-of-the-art methods in predicting subcellular multilocations of proteins. The prediction web server is available at http://levis.tongji.edu.cn:8080/bioinfo/MLPred-Euk/ for public use.
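
The data transformation at the core of RALS, appending randomly selected labels to the original feature space, can be sketched as follows. This is a simplified, hypothetical illustration of the training-time augmentation step only; the full method must also estimate the appended labels at prediction time, which is omitted here.

```python
import random

def augment_with_random_labels(features, labels, k, seed=0):
    """Append k randomly selected label columns to each feature vector,
    the core data transformation behind random label selection (RALS).
    features: list of feature vectors; labels: parallel list of 0/1
    label vectors. Returns the augmented features and chosen indices."""
    rng = random.Random(seed)
    n_labels = len(labels[0])
    chosen = rng.sample(range(n_labels), k)   # which labels become extra features
    augmented = [list(x) + [y[j] for j in chosen]
                 for x, y in zip(features, labels)]
    return augmented, chosen
```

A per-label classifier trained on the augmented vectors can then pick up correlations between its target label and the appended ones without ever modeling the label dependency structure explicitly.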

  19. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    PubMed

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.
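
The geographic component of this scheme, drawing control-selection areas with probability proportional to population, can be sketched with weighted sampling. A hypothetical toy version (area names invented; the published method additionally applies age/sex frequency matching, which is omitted here):

```python
import random

def select_control_areas(area_populations, n_controls, seed=0):
    """Draw areas for control selection with probability proportional to
    population, so the geographical distribution of controls mirrors
    population density. area_populations maps area name -> population."""
    rng = random.Random(seed)
    areas = list(area_populations)
    weights = [area_populations[a] for a in areas]
    # Weighted sampling with replacement: denser areas yield more controls.
    return rng.choices(areas, weights=weights, k=n_controls)
```

Within each drawn area, a field team would then locate an eligible individual matching the required age/sex stratum, which is where the frequency matching enters.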

  20. Selecting instruments for Mendelian randomization in the wake of genome-wide association studies

    PubMed Central

    Swerdlow, Daniel I; Kuchenbaecker, Karoline B; Shah, Sonia; Sofat, Reecha; Holmes, Michael V; White, Jon; Mindell, Jennifer S; Kivimaki, Mika; Brunner, Eric J; Whittaker, John C; Casas, Juan P; Hingorani, Aroon D

    2016-01-01

    Mendelian randomization (MR) studies typically assess the pathogenic relevance of environmental exposures or disease biomarkers, using genetic variants that instrument these exposures. The approach is gaining popularity—our systematic review reveals a greater than 10-fold increase in MR studies published between 2004 and 2015. When the MR paradigm was first proposed, few biomarker- or exposure-related genetic variants were known, most having been identified by candidate gene studies. However, genome-wide association studies (GWAS) are now providing a rich source of potential instruments for MR analysis. Many early reviews covering the concept, applications and analytical aspects of the MR technique preceded the surge in GWAS, and thus the question of how best to select instruments for MR studies from the now extensive pool of available variants has received insufficient attention. Here we focus on the most common category of MR studies—those concerning disease biomarkers. We consider how the selection of instruments for MR analysis from GWAS requires consideration of: the assumptions underlying the MR approach; the biology of the biomarker; the genome-wide distribution, frequency and effect size of biomarker-associated variants (the genetic architecture); and the specificity of the genetic associations. Based on this, we develop guidance that may help investigators to plan and readers interpret MR studies. PMID:27342221
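
One widely used screening rule when selecting instruments from GWAS results is to require an approximate F-statistic, roughly (beta/se)^2 for a single variant, above 10 to guard against weak-instrument bias. A hypothetical sketch of such a filter (the field names and threshold are illustrative conventions, not the authors' specific guidance):

```python
def select_instruments(variants, f_threshold=10.0):
    """Filter candidate GWAS variants for MR to those whose approximate
    instrument strength F = (beta/se)^2 exceeds a conventional threshold
    (F > 10 is a common rule of thumb against weak instruments).
    variants: list of dicts with per-allele 'beta' and 'se' for the
    biomarker association."""
    return [v for v in variants
            if (v["beta"] / v["se"]) ** 2 > f_threshold]
```

Strength is only one of the considerations the review raises; specificity of the association (no pleiotropic pathways) must still be assessed separately for each retained variant.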

  1. Selective spinal anesthesia for outpatient transurethral prostatectomy (TURP): randomized controlled comparison of chloroprocaine with lidocaine.

    PubMed

    Vaghadia, H; Neilson, G; Lennox, P H

    2012-02-01

    This is a study comparing two short-acting local anesthetics, lidocaine and 2-chloroprocaine, in combination with fentanyl, to provide selective spinal anesthesia for outpatient transurethral resection of the prostate (TURP). In this prospective, randomized double-blind study, selective spinal anesthesia was performed in 40 American Society of Anesthesiologists I-III outpatients undergoing TURP using either 40 mg of chloroprocaine mixed with 12.5 μg of fentanyl (n = 20) or 35 mg of lidocaine mixed with 12.5 μg of fentanyl (n = 20). The primary outcome was duration of spinal block. Secondary outcomes were time to reach T10 (onset), time to maximal level, duration above T10 and L3, maximal level of block, and adverse effects. The median (minimum, maximum) onset time was 4 (1, 16) and 3 (2, 10) min for chloroprocaine and lidocaine, respectively. Time to maximal level was 20 (17, 29) and 22 (16, 26) min for chloroprocaine and lidocaine, respectively. Mean maximal level was T7-T8 for both agents. Duration of block above T10 was 54 (28, 88) and 63 (31, 87) min for chloroprocaine and lidocaine, respectively. Duration of block above L3 was 93 (56, 218) and 98 (58, 151) min for chloroprocaine and lidocaine, respectively. There was no statistical difference between the two groups with respect to these clinical end points. Four patients in the lidocaine group developed transient neurological symptoms. One patient in the chloroprocaine group developed a cauda equina-like syndrome but recovered fully after several weeks. Selective spinal anesthesia with chloroprocaine and lidocaine for TURP yielded comparable results for clinical characteristics. Further research on transient neurological symptom and cauda equina risk with chloroprocaine is warranted. © 2012 The Authors Acta Anaesthesiologica Scandinavica © 2012 The Acta Anaesthesiologica Scandinavica Foundation.

  2. Pilot randomized trial of selective internal radiation therapy vs. chemoembolization in unresectable hepatocellular carcinoma.

    PubMed

    Kolligs, Frank T; Bilbao, Jose I; Jakobs, Tobias; Iñarrairaegui, Mercedes; Nagel, Jutta M; Rodriguez, Macarena; Haug, Alexander; D'Avola, Delia; op den Winkel, Mark; Martinez-Cuesta, Antonio; Trumm, Christoph; Benito, Alberto; Tatsch, Klaus; Zech, Christoph J; Hoffmann, Ralf-Thorsten; Sangro, Bruno

    2015-06-01

    To compare selective internal radiation therapy (SIRT) with transarterial chemoembolization (TACE), the standard-of-care for intermediate-stage unresectable, hepatocellular carcinoma (HCC), as first-line treatment. SIRTACE was an open-label multicenter randomized-controlled pilot study, which prospectively compared primarily the safety and health-related quality of life (HRQoL) changes following TACE and SIRT. Patients with unresectable HCC, Child-Pugh ≤B7, ECOG performance status ≤2 and ≤5 liver lesions (≤20 cm total maximum diameter) without extrahepatic spread were randomized to receive either TACE (at 6-weekly intervals until tumour enhancement was not observed on MRI or disease progression) or single-session SIRT (yttrium-90 resin microspheres). Twenty-eight patients with BCLC stage A (32.1%), B (46.4%) or C (21.4%) received either a mean of 3.4 (median 2) TACE interventions (N = 15) or single SIRT (N = 13). Both treatments were well tolerated. Despite SIRT patients having significantly worse physical functioning at baseline, at week-12, neither treatment had a significantly different impact on HRQoL as measured by Functional Assessment of Cancer Therapy-Hepatobiliary total or its subscales. Both TACE and SIRT were effective for the local control of liver tumours. Best overall response rates (RECIST 1.0) of target lesions were 13.3% and 30.8%, and disease control rates were 73.3% and 76.9%, for TACE and SIRT, respectively. Two patients in each group were down-staged for liver transplantation (N = 3) or radiofrequency ablation (N = 1). Single-session SIRT appeared to be as safe and had a similar impact on HRQoL as multiple sessions of TACE, suggesting that SIRT might be an alternative option for patients eligible for TACE. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  3. A randomized controlled trial of third-trimester routine ultrasound in a non-selected population.

    PubMed

    Skråstad, Ragnhild B; Eik-Nes, Sturla H; Sviggum, Oddvar; Johansen, Ole J; Salvesen, Kjell Å; Romundstad, Pål R; Blaas, Harm-Gerd K

    2013-12-01

    To compare detection rates of small-for-gestational-age fetuses, large-for-gestational-age fetuses, congenital anomalies and adverse perinatal outcomes in pregnancies randomized to third-trimester routine ultrasound or ultrasound on clinical indication. Randomized controlled trial. National Center for Fetal Medicine in Norway between 1989 and 1992. A total of 6780 pregnancies from a non-selected population. Two routine ultrasound examinations at 18 and 33 weeks were compared with routine ultrasound at 18 weeks and ultrasound on clinical indication. Suspected small-for-gestational-age fetuses were followed with serial scans and cardiotocography. Doppler ultrasound was not used. Detection rates of small-for-gestational-age and large-for-gestational-age fetuses, congenital anomalies and adverse perinatal outcomes. Third trimester routine ultrasound improved detection rates of small-for-gestational-age fetuses from 46 to 80%, but overall perinatal morbidity and mortality remained unchanged. Detection of large-for-gestational-age fetuses increased from 36 to 91%. There was a significant increase of induction of labor and elective cesarean sections due to suspected small-for-gestational-age and a significant decrease of induction of labor and elective cesarean sections due to suspected large-for-gestational-age in the study group; there were no other differences regarding intervention. The detection rate of congenital anomalies was 56%, with no significant difference between the groups. Routine use of third-trimester ultrasound increased detection rates of small-for-gestational-age and large-for-gestational-age fetuses. This did not alter perinatal outcomes. Third-trimester ultrasound screening should not be rejected before a policy of adding Doppler surveillance to the high-risk group identified has been investigated further. © 2013 Nordic Federation of Societies of Obstetrics and Gynecology.

  4. Construction of random tumor transcriptome expression library for creating and selecting novel tumor antigens.

    PubMed

    Zhao, Huizhun; Zhao, Xiuyun; Du, Peng; Qi, Gaofu

    2016-09-01

    Novel tumor antigens are necessary for the development of efficient tumor vaccines for overcoming the immunotolerance and immunosuppression induced by tumors. Here, we developed a novel strategy to create tumor antigens by construction of a random tumor transcriptome expression library (RTTEL). The complementary DNA (cDNA) from S180 sarcoma was used as template for arbitrarily amplifying gene fragments with random primers by PCR, and then ligated to the C-terminus of HSP65 in a plasmid pET28a-HSP for constructing the RTTEL in Escherichia coli. A novel antigen of A5 was selected from the RTTEL with the strongest immunotherapeutic effects on S180 sarcoma. Adoptive immunotherapy with anti-A5 sera also inhibited tumor growth, further confirming the key antitumor roles of A5-specific antibodies in mice. A5 contains a sequence similar to protein-L-isoaspartate (D-aspartate) O-methyltransferase (PCMT1). The antisera of A5 were verified to cross-react with PCMT1 by Western blotting assay and vice versa. Both anti-A5 sera and anti-PCMT1 sera could induce antibody-dependent cell-mediated cytotoxicity and complement-dependent cytotoxicity toward S180 cells by in vitro assay. Further assay with fluorescent staining showed that PCMT1 is detectable on the surface of S180 cells. In summary, the strategy of constructing an RTTEL has potential for creating and screening novel tumor antigens to develop efficient tumor vaccines. By RTTEL, we successfully created a protein antigen of A5 with significant immunotherapeutic effects on S180 sarcoma by induction of antibodies targeting PCMT1.

  5. Selective laser trabeculoplasty versus medical therapy as initial treatment of glaucoma: a prospective, randomized trial.

    PubMed

    Katz, L Jay; Steinmann, William C; Kabir, Azad; Molineaux, Jeanne; Wizov, Sheryl S; Marcellino, George

    2012-09-01

    To compare outcomes of selective laser trabeculoplasty (SLT) with drug therapy for glaucoma patients in a prospective randomized clinical trial. Sixty-nine patients (127 eyes) with open-angle glaucoma or ocular hypertension were randomized to SLT or medical therapy. Target intraocular pressure (IOP) was determined using the Collaborative Initial Glaucoma Treatment Study formula. Patients were treated with SLT (100 applications over 360 degrees) or medical therapy (prostaglandin analog). Six visits over 1 year followed initial treatment. If target IOP range was not attained with SLT, additional SLT was the next step, or in the medical arm additional medications were added. The primary outcome was IOP; the secondary outcome was the number of treatment steps. Sixty-nine patients were treated. Data collection terminated with 54 patients reaching 9 to 12 months of follow-up. Twenty-nine patients were in the SLT group, 25 patients in the medical group. Baseline mean IOP for all eyes was 24.5 mm Hg in the SLT group, 24.7 mm Hg in the medical group. Mean IOP (both eyes) at last follow-up was 18.2 mm Hg (6.3 mm Hg reduction) in the SLT arm, 17.7 mm Hg (7.0 mm Hg reduction) in the medical arm. By last follow-up, 11% of eyes received additional SLT, 27% required additional medication. There was not a statistically significant difference between the SLT and medication groups. IOP reduction was similar in both arms after 9 to 12 months of follow-up. More treatment steps were necessary to maintain target IOP in the medication group, although there was not a statistically significant difference between groups. These results support the option of SLT as a safe and effective initial therapy in open-angle glaucoma or ocular hypertension.

  6. Application of random coherence order selection in gradient-enhanced multidimensional NMR

    NASA Astrophysics Data System (ADS)

    Bostock, Mark J.; Nietlispach, Daniel

    2016-03-01

    Development of multidimensional NMR is essential to many applications, for example in high resolution structural studies of biomolecules. Multidimensional techniques enable separation of NMR signals over several dimensions, improving signal resolution, whilst also allowing identification of new connectivities. However, these advantages come at a significant cost. The Fourier transform theorem requires acquisition of a grid of regularly spaced points to satisfy the Nyquist criterion, while frequency discrimination and acquisition of a pure phase spectrum require acquisition of both quadrature components for each time point in every indirect (non-acquisition) dimension, adding a factor of 2^(N-1) to the number of free-induction decays which must be acquired, where N is the number of dimensions. Compressed sensing (CS) ℓ1-norm minimisation in combination with non-uniform sampling (NUS) has been shown to be extremely successful in overcoming the Nyquist criterion. Previously, maximum entropy reconstruction has also been used to overcome the limitation of frequency discrimination, processing data acquired with only one quadrature component at a given time interval, known as random phase detection (RPD), allowing a factor of two reduction in the number of points for each indirect dimension (Maciejewski et al., 2011, PNAS 108, 16640). However, whilst this approach can be easily applied in situations where the quadrature components are acquired as amplitude modulated data, the same principle is not easily extended to phase modulated (P-/N-type) experiments where data is acquired in the form exp(iωt) or exp(-iωt), and which make up many of the multidimensional experiments used in modern NMR. Here we demonstrate a modification of the CS ℓ1-norm approach to allow random coherence order selection (RCS) for phase modulated experiments; we generalise the nomenclature for RCS and RPD as random quadrature detection (RQD). With this method, the power of RQD can be extended

  7. Darwinian dynamics of intratumoral heterogeneity: not solely random mutations but also variable environmental selection forces

    PubMed Central

    Lloyd, Mark C; Cunningham, Jessica J; Bui, Marilyn M; Gillies, Robert J; Brown, Joel S; Gatenby, Robert A

    2017-01-01

    molecular heterogeneity in cancer cells in tumors is governed by predictable regional variations in environmental selection forces, arguing against the assumption that cancer cells can evolve toward a local fitness maximum by random accumulation of mutations. Major Findings: Like invasive species in nature, cancer cells at the leading edge of the tumor possess a different phenotype from cells in the tumor core. We conclude that at least some intratumoral heterogeneity in the molecular properties of cancer cells is governed by predictable regional variations in environmental selection forces. PMID:27009166

  8. Computerized stratified random site-selection approaches for design of a ground-water-quality sampling network

    USGS Publications Warehouse

    Scott, J.C.

    1990-01-01

    Computer software was written to randomly select sites for a ground-water-quality sampling network. The software uses digital cartographic techniques and subroutines from a proprietary geographic information system. The report presents the approaches, computer software, and sample applications. It is often desirable to collect ground-water-quality samples from various areas in a study region that have different values of a spatial characteristic, such as land-use or hydrogeologic setting. A stratified network can be used for testing hypotheses about relations between spatial characteristics and water quality, or for calculating statistical descriptions of water-quality data that account for variations that correspond to the spatial characteristic. In the software described, a study region is subdivided into areal subsets that have a common spatial characteristic to stratify the population into several categories from which sampling sites are selected. Different numbers of sites may be selected from each category of areal subsets. A population of potential sampling sites may be defined by either specifying a fixed population of existing sites, or by preparing an equally spaced population of potential sites. In either case, each site is identified with a single category, depending on the value of the spatial characteristic of the areal subset in which the site is located. Sites are selected from one category at a time. One of two approaches may be used to select sites. Sites may be selected randomly, or the areal subsets in the category can be grouped into cells and sites selected randomly from each cell.
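
The stratification logic described above can be sketched in a few lines. This is a hypothetical illustration, not the report's GIS-based software; the strata names, site IDs, and per-stratum counts are invented.

```python
# Sketch: stratified random selection of sampling sites by category.
import random

def stratified_sample(sites_by_stratum, n_per_stratum, seed=0):
    """Draw sites uniformly at random, independently within each stratum."""
    rng = random.Random(seed)
    return {stratum: rng.sample(sites, n_per_stratum[stratum])
            for stratum, sites in sites_by_stratum.items()}

# Hypothetical candidate wells grouped by land-use category.
candidates = {
    "urban":        [f"U{i:03d}" for i in range(40)],
    "agricultural": [f"A{i:03d}" for i in range(60)],
    "forest":       [f"F{i:03d}" for i in range(25)],
}
network = stratified_sample(candidates,
                            {"urban": 5, "agricultural": 8, "forest": 3})
```

Selecting independently per stratum is what lets different numbers of sites be drawn from each category, as the abstract describes.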

  9. Random genetic drift, natural selection, and noise in human cranial evolution.

    PubMed

    Roseman, Charles C

    2016-08-01

    This study assesses the extent to which relationships among groups complicate comparative studies of adaptation in recent human cranial variation and the extent to which departures from neutral additive models of evolution hinder the reconstruction of population relationships among groups using cranial morphology. Using a maximum likelihood evolutionary model fitting approach and a mixed population genomic and cranial data set, I evaluate the relative fits of several widely used models of human cranial evolution. Moreover, I compare the goodness of fit of models of cranial evolution constrained by genomic variation to test hypotheses about population specific departures from neutrality. Models from population genomics are much better fits to cranial variation than are traditional models from comparative human biology. There is not enough evolutionary information in the cranium to reconstruct much of recent human evolution but the influence of population history on cranial variation is strong enough to cause comparative studies of adaptation serious difficulties. Deviations from a model of random genetic drift along a tree-like population history show the importance of environmental effects, gene flow, and/or natural selection on human cranial variation. Moreover, there is a strong signal of the effect of natural selection or an environmental factor on a group of humans from Siberia. The evolution of the human cranium is complex and no one evolutionary process has prevailed at the expense of all others. A holistic unification of phenome, genome, and environmental context, gives us a strong point of purchase on these problems, which is unavailable to any one traditional approach alone. Am J Phys Anthropol 160:582-592, 2016. © 2016 Wiley Periodicals, Inc.

  10. Equivalence between Step Selection Functions and Biased Correlated Random Walks for Statistical Inference on Animal Movement.

    PubMed

    Duchesne, Thierry; Fortin, Daniel; Rivest, Louis-Paul

    2015-01-01

    Animal movement has a fundamental impact on population and community structure and dynamics. Biased correlated random walks (BCRW) and step selection functions (SSF) are commonly used to study movements. Because no studies have contrasted the parameters and the statistical properties of their estimators for models constructed under these two Lagrangian approaches, it remains unclear whether or not they allow for similar inference. First, we used the Weak Law of Large Numbers to demonstrate that the log-likelihood function for estimating the parameters of BCRW models can be approximated by the log-likelihood of SSFs. Second, we illustrated the link between the two approaches by fitting BCRW with maximum likelihood and with SSF to simulated movement data in virtual environments and to the trajectory of bison (Bison bison L.) trails in natural landscapes. Using simulated and empirical data, we found that the parameters of a BCRW estimated directly from maximum likelihood and by fitting an SSF were remarkably similar. Movement analysis is increasingly used as a tool for understanding the influence of landscape properties on animal distribution. In the rapidly developing field of movement ecology, management and conservation biologists must decide which method they should implement to accurately assess the determinants of animal movement. We showed that BCRW and SSF can provide similar insights into the environmental features influencing animal movements. Both techniques have advantages. BCRW has already been extended to allow for multi-state modeling. Unlike BCRW, however, SSF can be estimated using most statistical packages, it can simultaneously evaluate habitat selection and movement biases, and can easily integrate a large number of movement taxes at multiple scales. SSF thus offers a simple, yet effective, statistical technique to identify movement taxis.
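
The SSF side of this comparison has a simple likelihood structure. A minimal sketch under the standard conditional-logit formulation of an SSF, with a single covariate; the step data are made up for illustration and are not the authors' bison trajectories.

```python
# Sketch: SSF log-likelihood in conditional-logit form,
#   L_i(beta) = exp(beta * x_obs) / sum_j exp(beta * x_j),
# where each observed step is compared against its control steps.
import math

def ssf_loglik(beta, choice_sets):
    """choice_sets: list of (observed covariate, [covariates of all candidates])."""
    ll = 0.0
    for x_obs, x_all in choice_sets:
        denom = sum(math.exp(beta * x) for x in x_all)
        ll += beta * x_obs - math.log(denom)
    return ll

# Three hypothetical steps; the observed step (first entry of each candidate
# list) always has the highest covariate value, so the likelihood should
# favour beta > 0, i.e. selection for the covariate.
steps = [(2.0, [2.0, 0.5, 0.1]),
         (1.5, [1.5, 0.2, 0.9]),
         (1.8, [1.8, 1.0, 0.3])]
```

At beta = 0 every candidate is equally likely, so each choice set contributes -log(3); maximising this function over beta is the SSF fit that the paper shows approximates the BCRW likelihood.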

  11. Equivalence between Step Selection Functions and Biased Correlated Random Walks for Statistical Inference on Animal Movement

    PubMed Central

    Duchesne, Thierry; Fortin, Daniel; Rivest, Louis-Paul

    2015-01-01

    Animal movement has a fundamental impact on population and community structure and dynamics. Biased correlated random walks (BCRW) and step selection functions (SSF) are commonly used to study movements. Because no studies have contrasted the parameters and the statistical properties of their estimators for models constructed under these two Lagrangian approaches, it remains unclear whether or not they allow for similar inference. First, we used the Weak Law of Large Numbers to demonstrate that the log-likelihood function for estimating the parameters of BCRW models can be approximated by the log-likelihood of SSFs. Second, we illustrated the link between the two approaches by fitting BCRW with maximum likelihood and with SSF to simulated movement data in virtual environments and to the trajectory of bison (Bison bison L.) trails in natural landscapes. Using simulated and empirical data, we found that the parameters of a BCRW estimated directly from maximum likelihood and by fitting an SSF were remarkably similar. Movement analysis is increasingly used as a tool for understanding the influence of landscape properties on animal distribution. In the rapidly developing field of movement ecology, management and conservation biologists must decide which method they should implement to accurately assess the determinants of animal movement. We showed that BCRW and SSF can provide similar insights into the environmental features influencing animal movements. Both techniques have advantages. BCRW has already been extended to allow for multi-state modeling. Unlike BCRW, however, SSF can be estimated using most statistical packages, it can simultaneously evaluate habitat selection and movement biases, and can easily integrate a large number of movement taxes at multiple scales. SSF thus offers a simple, yet effective, statistical technique to identify movement taxis. PMID:25898019

  12. CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests.

    PubMed

    Ma, Li; Fan, Suohai

    2017-03-14

    The random forests algorithm is a type of classifier with prominent universality, a wide application range, and robustness against overfitting. But there are still some drawbacks to random forests. Therefore, to improve the performance of random forests, this paper seeks to improve imbalanced-data processing, feature selection and parameter optimization. We propose the CURE-SMOTE algorithm for the imbalanced data classification problem. Experiments on imbalanced UCI data reveal that combining Clustering Using Representatives (CURE) with the original synthetic minority oversampling technique (SMOTE) is effective, improving on the classification results obtained on the original data using random sampling, Borderline-SMOTE1, safe-level SMOTE, C-SMOTE, and k-means-SMOTE. Additionally, a hybrid RF (random forests) algorithm is proposed for feature selection and parameter optimization, which uses the minimum out-of-bag (OOB) error as its objective function. Simulation results on binary and higher-dimensional data indicate that the proposed hybrid RF algorithms (hybrid genetic-random forests, hybrid particle swarm-random forests and hybrid fish swarm-random forests) can achieve the minimum OOB error and show the best generalization ability. The training set produced from the proposed CURE-SMOTE algorithm is closer to the original data distribution because it contains minimal noise. Thus, better classification results are produced from this feasible and effective algorithm. Moreover, the hybrid algorithms' F-value, G-mean, AUC and OOB scores demonstrate that they surpass the performance of the original RF algorithm. Hence, these hybrid algorithms provide a new way to perform feature selection and parameter optimization.
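
The SMOTE interpolation step that CURE-SMOTE builds on can be sketched briefly. This is a plain illustration of classic SMOTE, not the CURE-SMOTE algorithm itself; the minority points and parameters are invented.

```python
# Sketch: SMOTE places each synthetic minority sample on the line segment
# between a minority point and one of its k nearest minority neighbours.
import random

def euclidean(a, b):
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

def smote(minority, k=3, n_new=10, seed=0):
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        neighbours = sorted((p for p in minority if p is not x),
                            key=lambda p: euclidean(x, p))[:k]
        nb = rng.choice(neighbours)
        u = rng.random()          # interpolation fraction in [0, 1)
        synthetic.append(tuple(xi + u * (ni - xi) for xi, ni in zip(x, nb)))
    return synthetic

minority = [(0.0, 0.0), (1.0, 0.2), (0.3, 1.0), (0.8, 0.9), (0.1, 0.6)]
new_points = smote(minority, k=2, n_new=8)
```

Because each synthetic point is a convex combination of two minority points, all generated samples stay inside the bounding box of the minority class; CURE's contribution, per the abstract, is cleaning the minority set of noise and outliers before this interpolation runs.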

  13. Melanocytic Hyperplasia in the Epidermis Overlying Trichoblastomas in 100 Randomly Selected Cases.

    PubMed

    Al Omoush, Tahseen M M; Michal, Michael; Konstantinova, Anastasia M; Michal, Michal; Kutzner, Heinz; Kazakov, Dmitry V

    2016-04-01

    One hundred cases of trichoblastomas (large nodular, small nodular, cribriform, lymphadenoma, and columnar) were randomly selected and studied for the presence of melanocytic hyperplasia in the epidermis overlying the tumors, which was defined as foci of increased melanocytes in the basal layer of the epidermis (more than 1 per 4 basal keratinocytes). Focal melanocytic hyperplasia was detected in a total of 22 cases of trichoblastoma (22%), and this phenomenon was most frequently seen in columnar trichoblastoma (7 cases), followed by large nodular trichoblastoma (5 cases). The mechanism of epidermal melanocytic hyperplasia overlying trichoblastoma is unclear. Ultraviolet may be a contributing factor, as focal melanocytic hyperplasia was also detected in one-third of cases in the epidermis overlying uninvolved skin, usually associated with solar elastosis. This is further corroborated by the occurrence of the lesions predominantly on the face. Melanocytic hyperplasia overlying trichoblastoma appears to have no impact on the clinical appearance of the lesion and is recognized only microscopically. In an adequate biopsy specimen containing at least part of trichoblastoma, it should not cause any diagnostic problems.

  14. The Kilkenny Health Project: food and nutrient intakes in randomly selected healthy adults.

    PubMed

    Gibney, M J; Moloney, M; Shelley, E

    1989-03-01

    1. Sixty healthy subjects aged 35-44 years (thirty men and thirty women) were randomly selected from electoral registers to participate in a dietary survey using the 7 d weighed-intake method during June-August 1985. 2. Energy intake (MJ/d) was 12.5 for men and 8.4 for women. Fat contributed 36.0 and 39.1% of the total energy intake of men and women respectively. When this was adjusted to exclude energy derived from alcoholic beverages, the corresponding values were 38.8 and 39.7% respectively. The major sources of dietary fat (%) were spreadable fats (28), meat (23), milk (12) and biscuits and cakes (11). 3. The subjects were divided into low- and high-fat groups both on the relative intake of fat (less than 35% or greater than 40% dietary energy from fat) and on the absolute intake of fat (greater than or less than 120 g fat/d). By either criterion, high-fat consumers had lower than average intakes of low-fat, high-carbohydrate foods such as potatoes, bread, fruit and table sugar, and higher intakes of milk, butter and confectionery products. Meat intake was higher among high-fat eaters only when a high-fat diet was defined as a percentage of energy.
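
The adjustment for alcohol energy can be checked with simple arithmetic. A back-of-envelope sketch using the men's figures quoted above; the alcohol energy value is an inference from the two reported fat percentages, not a number given in the paper.

```python
# Sketch: percentage of energy from fat, before and after excluding alcohol.
total_mj = 12.5      # men's reported daily energy intake (MJ/d)
fat_frac = 0.360     # fat share of total energy (36.0%)
alcohol_mj = 0.9     # inferred alcohol energy (assumption, not reported)

fat_mj = fat_frac * total_mj
adjusted_pct = 100 * fat_mj / (total_mj - alcohol_mj)   # ~38.8%, as reported
```

Roughly 0.9 MJ/d of alcohol energy is what it takes to move the fat share from 36.0% to the reported alcohol-excluded 38.8%.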

  15. Issues Relating to Selective Reporting When Including Non-Randomized Studies in Systematic Reviews on the Effects of Healthcare Interventions

    ERIC Educational Resources Information Center

    Norris, Susan L.; Moher, David; Reeves, Barnaby C.; Shea, Beverley; Loke, Yoon; Garner, Sarah; Anderson, Laurie; Tugwell, Peter; Wells, George

    2013-01-01

    Background: Selective outcome and analysis reporting (SOR and SAR) occur when only a subset of outcomes measured and analyzed in a study is fully reported, and are an important source of potential bias. Key methodological issues: We describe what is known about the prevalence and effects of SOR and SAR in both randomized controlled trials (RCTs)…

  17. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    Sample selection by random number generation on any two-dimensional square grid. Section 761.308, Title 40 (Protection of Environment), Environmental Protection Agency (continued), Toxic Substances Control Act, Polychlorinated Biphenyls (PCBs...
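
The grid-based selection can be sketched generically. This is only an illustration of picking distinct cells on a square grid by random number generation, not the regulatory procedure itself; the grid size and sample count are invented.

```python
# Sketch: choose n distinct (row, col) cells on a square grid at random.
import random

def select_grid_points(rows, cols, n, seed=0):
    """Return n distinct grid cells chosen uniformly at random."""
    rng = random.Random(seed)
    cells = [(r, c) for r in range(rows) for c in range(cols)]
    return rng.sample(cells, n)

points = select_grid_points(10, 10, 7)
```

Sampling without replacement over the enumerated cells guarantees no cell is chosen twice.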

  18. Most Undirected Random Graphs Are Amplifiers of Selection for Birth-Death Dynamics, but Suppressors of Selection for Death-Birth Dynamics.

    PubMed

    Hindersin, Laura; Traulsen, Arne

    2015-11-01

    We analyze evolutionary dynamics on graphs, where the nodes represent individuals of a population. The links of a node describe which other individuals can be displaced by the offspring of the individual on that node. Amplifiers of selection are graphs for which the fixation probability is increased for advantageous mutants and decreased for disadvantageous mutants. A few examples of such amplifiers have been developed, but so far it is unclear how many such structures exist and how to construct them. Here, we show that almost any undirected random graph is an amplifier of selection for Birth-death updating, where an individual is selected to reproduce with probability proportional to its fitness and one of its neighbors is replaced by that offspring at random. If we instead focus on death-Birth updating, in which a random individual is removed and its neighbors compete for the empty spot, then the same ensemble of graphs consists of almost only suppressors of selection for which the fixation probability is decreased for advantageous mutants and increased for disadvantageous mutants. Thus, the impact of population structure on evolutionary dynamics is a subtle issue that will depend on seemingly minor details of the underlying evolutionary process.
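
The Birth-death updating rule can be made concrete with a small simulation. This is a sketch, not the authors' code: on the complete graph the process reduces to the classical Moran process, so the Monte-Carlo estimate can be checked against the known fixation probability (1 - 1/r)/(1 - 1/r^N).

```python
# Sketch: Birth-death updating on a graph. Each step, a reproducer is chosen
# proportionally to fitness and its offspring replaces a random neighbour.
import random

def fixation_prob(adj, r, runs=2000, seed=0):
    rng = random.Random(seed)
    n, fixed = len(adj), 0
    for _ in range(runs):
        mutant = [False] * n
        mutant[rng.randrange(n)] = True          # one initial mutant
        count = 1
        while 0 < count < n:                      # run to fixation/extinction
            weights = [r if mutant[v] else 1.0 for v in range(n)]
            parent = rng.choices(range(n), weights=weights)[0]
            child = rng.choice(adj[parent])       # neighbour to be replaced
            if mutant[child] != mutant[parent]:
                count += 1 if mutant[parent] else -1
                mutant[child] = mutant[parent]
        fixed += count == n
    return fixed / runs

N, r = 6, 2.0
complete = [[v for v in range(N) if v != u] for u in range(N)]
rho_sim = fixation_prob(complete, r)
rho_moran = (1 - 1 / r) / (1 - 1 / r ** N)        # Moran benchmark
```

Swapping `complete` for the adjacency list of a random graph is the experiment the paper performs at scale; amplifiers are the graphs where the advantageous-mutant estimate exceeds the Moran benchmark.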

  19. Bayesian dose selection design for a binary outcome using restricted response adaptive randomization.

    PubMed

    Meinzer, Caitlyn; Martin, Renee; Suarez, Jose I

    2017-09-08

    In phase II trials, the most efficacious dose is usually not known. Moreover, given limited resources, it is difficult to robustly identify a dose while also testing for a signal of efficacy that would support a phase III trial. Recent designs have sought to be more efficient by exploring multiple doses through the use of adaptive strategies. However, the added flexibility may potentially increase the risk of making incorrect assumptions and reduce the total amount of information available across the dose range as a function of imbalanced sample size. To balance these challenges, a novel placebo-controlled design is presented in which a restricted Bayesian response adaptive randomization (RAR) is used to allocate a majority of subjects to the optimal dose of active drug, defined as the dose with the lowest probability of poor outcome. However, the allocation between subjects who receive active drug or placebo is held constant to retain the maximum possible power for a hypothesis test of overall efficacy comparing the optimal dose to placebo. The design properties and optimization of the design are presented in the context of a phase II trial for subarachnoid hemorrhage. For a fixed total sample size, a trade-off exists between the ability to select the optimal dose and the probability of rejecting the null hypothesis. This relationship is modified by the allocation ratio between active and control subjects, the choice of RAR algorithm, and the number of subjects allocated to an initial fixed allocation period. While a responsive RAR algorithm improves the ability to select the correct dose, there is an increased risk of assigning more subjects to a worse arm as a function of ephemeral trends in the data. A subarachnoid treatment trial is used to illustrate how this design can be customized for specific objectives and available data. Bayesian adaptive designs are a flexible approach to addressing multiple questions surrounding the optimal dose for treatment efficacy
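
The restricted-allocation idea can be sketched as follows. This is a hypothetical illustration of the general mechanism (fixed placebo share, active doses weighted by the posterior probability of having the lowest poor-outcome rate), not the trial's actual algorithm; arm names and counts are invented.

```python
# Sketch: restricted response-adaptive randomization with Beta posteriors.
import random

def rar_allocation(events, totals, placebo_share=0.4, draws=4000, seed=0):
    """events/totals: poor outcomes and enrolment per active dose."""
    rng = random.Random(seed)
    k = len(events)
    wins = [0] * k
    for _ in range(draws):
        # Beta(1 + events, 1 + non-events) posterior draw per dose.
        samples = [rng.betavariate(1 + events[i], 1 + totals[i] - events[i])
                   for i in range(k)]
        wins[samples.index(min(samples))] += 1   # lowest poor-outcome rate wins
    active = [(1 - placebo_share) * w / draws for w in wins]
    return {"placebo": placebo_share,
            **{f"dose_{i + 1}": p for i, p in enumerate(active)}}

alloc = rar_allocation(events=[6, 3, 9], totals=[20, 20, 20])
```

The placebo share never adapts, which is what preserves power for the final optimal-dose-versus-placebo comparison; only the split among active doses follows the posterior.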

  20. Herpes thymidine kinase mutants with altered catalytic efficiencies obtained by random sequence selection.

    PubMed

    Munir, K M; French, D C; Dube, D K; Loeb, L A

    1994-01-01

    We have obtained 190 active Herpes simplex virus type 1 thymidine kinase mutants by substituting a 33 nucleotide sequence with 20% degeneracy for a portion of the nucleotide sequence that encodes the putative thymidine binding site [K.M. Munir, D.C. French, D.K. Dube and L.A. Loeb (1992) J. Biol. Chem., 167, 6584-6589]. In order to classify these mutants with respect to thymidine kinase activity we determined the ability of Escherichia coli harboring these mutants to form colonies in the presence of varying concentrations of thymidine. Escherichia coli harboring one of the mutant enzymes was able to form colonies at a concentration of thymidine lower than did the wild type. It was able to phosphorylate thymidine more rapidly than the wild type both in vivo and in vitro. The increased thymidine kinase activity was manifested by (i) a 42% enhanced uptake of [methyl-3H]thymidine into E. coli, (ii) a 2.4 times higher rate of [methyl-3H]thymidine incorporation into acid-insoluble material and (iii) a 5-fold increase in the kcat of the purified enzyme compared to the wild type. Herpes thymidine kinase purified from other mutants that formed colonies at higher thymidine concentrations than that of the wild type exhibited a decrease in kcat. The kcat of one of these mutant thymidine kinases was 10^-4 of that of the wild type enzyme. This study demonstrates that a spectrum of mutant enzymes with different catalytic properties can be obtained by selection from a plasmid with random sequence substitutions and this can be done in the absence of rational protein design.

  1. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti.

    PubMed

    Wampler, Peter J; Rediske, Richard R; Molla, Azizur R

    2013-01-18

    A remote sensing technique was developed which combines a Geographic Information System (GIS); Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only rarely was local knowledge required to identify and locate households. 
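
The selection step itself is simple once homes are digitised. A minimal sketch with invented IDs, mirroring the 537-home pool and 96-home subset described above.

```python
# Sketch: draw a random subset of mapped homes for field location by GPS.
import random

mapped_homes = [f"home_{i:03d}" for i in range(537)]   # 537 mapped dwellings
rng = random.Random(42)
survey_targets = rng.sample(mapped_homes, 96)          # randomized subset of 96
```

In the study, a list like `survey_targets` (with coordinates attached) is what gets loaded onto the handheld GPS units.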

  2. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti

    PubMed Central

    2013-01-01

    Background A remote sensing technique was developed which combines a Geographic Information System (GIS); Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground based surveys to collect household location data and can be performed in a rapid, cost-effective manner. Methods The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. Results A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. Conclusions The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only rarely was local knowledge

  3. Automatised selection of load paths to construct reduced-order models in computational damage micromechanics: from dissipation-driven random selection to Bayesian optimization

    NASA Astrophysics Data System (ADS)

    Goury, Olivier; Amsallem, David; Bordas, Stéphane Pierre Alain; Liu, Wing Kam; Kerfriden, Pierre

    2016-08-01

    In this paper, we present new reliable model order reduction strategies for computational micromechanics. The difficulty lies mainly in the high dimensionality of the parameter space represented by any load path applied onto the representative volume element. We take special care of the challenge of selecting an exhaustive snapshot set. This is treated by first using a random sampling of energy-dissipating load paths and then, in a more advanced way, using Bayesian optimization associated with an interlocked division of the parameter space. Results show that we can ensure the selection of an exhaustive snapshot set from which a reliable reduced-order model can be built.

  4. Generation of Aptamers from A Primer-Free Randomized ssDNA Library Using Magnetic-Assisted Rapid Aptamer Selection

    NASA Astrophysics Data System (ADS)

    Tsao, Shih-Ming; Lai, Ji-Ching; Horng, Horng-Er; Liu, Tu-Chen; Hong, Chin-Yih

    2017-04-01

    Aptamers are oligonucleotides that can bind to specific target molecules. Most aptamers are generated using random libraries in the standard systematic evolution of ligands by exponential enrichment (SELEX). Each random library contains oligonucleotides with a randomized central region and two fixed primer regions at both ends. The fixed primer regions are necessary for amplifying target-bound sequences by PCR. However, these extra-sequences may cause non-specific bindings, which potentially interfere with good binding for random sequences. The Magnetic-Assisted Rapid Aptamer Selection (MARAS) is a newly developed protocol for generating single-strand DNA aptamers. No repeat selection cycle is required in the protocol. This study proposes and demonstrates a method to isolate aptamers for C-reactive proteins (CRP) from a randomized ssDNA library containing no fixed sequences at 5′ and 3′ termini using the MARAS platform. Furthermore, the isolated primer-free aptamer was sequenced and binding affinity for CRP was analyzed. The specificity of the obtained aptamer was validated using blind serum samples. The result was consistent with monoclonal antibody-based nephelometry analysis, which indicated that a primer-free aptamer has high specificity toward targets. MARAS is a feasible platform for efficiently generating primer-free aptamers for clinical diagnoses.

  5. Generation of Aptamers from A Primer-Free Randomized ssDNA Library Using Magnetic-Assisted Rapid Aptamer Selection.

    PubMed

    Tsao, Shih-Ming; Lai, Ji-Ching; Horng, Horng-Er; Liu, Tu-Chen; Hong, Chin-Yih

    2017-04-03

    Aptamers are oligonucleotides that can bind to specific target molecules. Most aptamers are generated using random libraries in the standard systematic evolution of ligands by exponential enrichment (SELEX). Each random library contains oligonucleotides with a randomized central region and two fixed primer regions at both ends. The fixed primer regions are necessary for amplifying target-bound sequences by PCR. However, these extra-sequences may cause non-specific bindings, which potentially interfere with good binding for random sequences. The Magnetic-Assisted Rapid Aptamer Selection (MARAS) is a newly developed protocol for generating single-strand DNA aptamers. No repeat selection cycle is required in the protocol. This study proposes and demonstrates a method to isolate aptamers for C-reactive proteins (CRP) from a randomized ssDNA library containing no fixed sequences at 5' and 3' termini using the MARAS platform. Furthermore, the isolated primer-free aptamer was sequenced and binding affinity for CRP was analyzed. The specificity of the obtained aptamer was validated using blind serum samples. The result was consistent with monoclonal antibody-based nephelometry analysis, which indicated that a primer-free aptamer has high specificity toward targets. MARAS is a feasible platform for efficiently generating primer-free aptamers for clinical diagnoses.

  6. Generation of Aptamers from A Primer-Free Randomized ssDNA Library Using Magnetic-Assisted Rapid Aptamer Selection

    PubMed Central

    Tsao, Shih-Ming; Lai, Ji-Ching; Horng, Horng-Er; Liu, Tu-Chen; Hong, Chin-Yih

    2017-01-01

    Aptamers are oligonucleotides that can bind to specific target molecules. Most aptamers are generated using random libraries in the standard systematic evolution of ligands by exponential enrichment (SELEX). Each random library contains oligonucleotides with a randomized central region and two fixed primer regions at both ends. The fixed primer regions are necessary for amplifying target-bound sequences by PCR. However, these extra-sequences may cause non-specific bindings, which potentially interfere with good binding for random sequences. The Magnetic-Assisted Rapid Aptamer Selection (MARAS) is a newly developed protocol for generating single-strand DNA aptamers. No repeat selection cycle is required in the protocol. This study proposes and demonstrates a method to isolate aptamers for C-reactive proteins (CRP) from a randomized ssDNA library containing no fixed sequences at 5′ and 3′ termini using the MARAS platform. Furthermore, the isolated primer-free aptamer was sequenced and binding affinity for CRP was analyzed. The specificity of the obtained aptamer was validated using blind serum samples. The result was consistent with monoclonal antibody-based nephelometry analysis, which indicated that a primer-free aptamer has high specificity toward targets. MARAS is a feasible platform for efficiently generating primer-free aptamers for clinical diagnoses. PMID:28367958

  7. The basic science and mathematics of random mutation and natural selection.

    PubMed

    Kleinman, Alan

    2014-12-20

    The mutation and natural selection phenomenon can and often does cause the failure of antimicrobial, herbicidal, pesticide and cancer treatments, which act as selection pressures. This phenomenon operates in a mathematically predictable way which, when understood, leads to approaches to reduce and prevent the failure of the use of these selection pressures. The mathematical behavior of mutation and selection is derived using the principles given by probability theory. The derivation of the equations describing the mutation and selection phenomenon is carried out in the context of an empirical example. Copyright © 2014 John Wiley & Sons, Ltd.
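
One building block of such derivations is the probability that a population produces at least one mutant able to survive the selection pressure. A minimal sketch of this standard calculation (not taken from the paper); the rate and population size are illustrative.

```python
# Sketch: with per-replication beneficial-mutation probability mu, a
# population undergoing n replications yields at least one such mutant
# with probability P = 1 - (1 - mu)**n.
import math

def p_at_least_one(mu, n):
    return 1 - (1 - mu) ** n

p = p_at_least_one(1e-8, 10**8)    # mu*n = 1, so P is close to 1 - 1/e
```

This is why combination treatments work: requiring two independent mutations multiplies the per-replication probabilities, collapsing P for realistic population sizes.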

  8. Introduction of Mismatches in a Random shRNA-Encoding Library Improves Potency for Phenotypic Selection

    PubMed Central

    Wang, Yongping; Speier, Jacqueline S.; Engram-Pearl, Jessica; Wilson, Robert B.

    2014-01-01

    RNA interference (RNAi) is a mechanism for interfering with gene expression through the action of small, non-coding RNAs. We previously constructed a short-hairpin-loop RNA (shRNA) encoding library that is random at the nucleotide level [1]. In this library, the stems of the hairpin are completely complementary. To improve the potency of initial hits, and therefore signal-to-noise ratios in library screening, as well as to simplify hit-sequence retrieval by PCR, we constructed a second-generation library in which we introduced random mismatches between the two halves of the stem of each hairpin, on a random template background. In a screen for shRNAs that protect an interleukin-3 (IL3) dependent cell line from IL3 withdrawal, our second-generation library yielded hit sequences with significantly higher potencies than those from the first-generation library in the same screen. Our method of random mutagenesis was effective for a random template and is likely suitable, therefore, for any DNA template of interest. The improved potency of our second-generation library expands the range of possible unbiased screens for small-RNA therapeutics and biologic tools. PMID:24498319

  9. Maintenance of Genetic Variability under the Joint Effect of Mutation, Selection and Random Drift

    PubMed Central

    Li, Wen-Hsiung

    1978-01-01

    Formulae are developed for the distribution of allele frequencies (the frequency spectrum), the mean number of alleles in a sample, and the mean and variance of heterozygosity under mutation pressure and under either genic or recessive selection. Numerical computations are carried out by using these formulae and Watterson's (1977) formula for the distribution of allele frequencies under overdominant selection. The following properties are observed: (1) The effect of selection on the distribution of allele frequencies is slight when 4Ns ≤ 4, but becomes strong when 4Ns becomes larger than 10, where N denotes the effective size and s the selective difference between alleles. Genic selection and recessive selection tend to force the distribution to be U-shaped, whereas overdominant selection has the opposite tendency. (2) The mean total number of alleles in a sample is much more strongly affected by selection than the mean number of rare alleles in a sample. (3) Even slight heterozygote advantage, as small as 10^-5, increases considerably the mean heterozygosity of a population, as compared to the case of neutral mutations. On the other hand, even slight genic or recessive selection causes a great reduction in heterozygosity when population size is large. (4) As a test statistic, the variance of heterozygosity can be used to detect the presence of selection, though it is not efficient when the selection intensity is very weak, say when 4Ns is around 4 or less. A model, which is somewhat similar to Ohta's (1976) model of slightly deleterious mutations, has been proposed to explain the following general patterns of genic variation: (i) There seems to be an upper limit for the observed average heterozygosities. (ii) The distribution of allele frequencies is U-shaped for every species surveyed. (iii) Most of the species surveyed tend to have an excess of rare alleles as compared with that expected under the neutral mutation hypothesis. PMID:17248867
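
    As a point of reference for the neutral baseline the abstract compares against, the classical infinite-alleles expectation for heterozygosity under mutation and drift alone can be sketched as follows (a standard textbook result, not a formula from this paper):

```python
def expected_heterozygosity(N: float, mu: float) -> float:
    """Neutral infinite-alleles expectation: H = theta / (1 + theta),
    where theta = 4*N*mu, N is the effective population size and mu
    the per-generation mutation rate."""
    theta = 4.0 * N * mu
    return theta / (1.0 + theta)

# Even a modest theta yields substantial heterozygosity at equilibrium;
# the parameter values here are illustrative only.
h = expected_heterozygosity(N=1e6, mu=1e-7)  # theta = 0.4
```
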

  10. The Implications of Teacher Selection and Teacher Effects in Individually Randomized Group Treatment Trials

    ERIC Educational Resources Information Center

    Weiss, Michael J.

    2010-01-01

    Randomized experiments have become an increasingly popular design to evaluate the effectiveness of interventions in education (Spybrook, 2008). Many of the interventions evaluated in education are delivered to groups of students, rather than to individuals. Experiments designed to evaluate programs delivered at the group level often…

  11. Demonstrating an Interactive Genetic Drift Exercise: Examining the Processes of Random Mating and Selection.

    ERIC Educational Resources Information Center

    Carter, Ashley J. R.

    2002-01-01

    Presents a hands-on activity on the phenomenon of genetic drift in populations that reinforces the random nature of drift and demonstrates the effect of the population size on the mean frequency of an allele over a few generations. Includes materials for the demonstration, procedures, and discussion topics. (KHR)

  12. Prevalence and Severity of College Student Bereavement Examined in a Randomly Selected Sample

    ERIC Educational Resources Information Center

    Balk, David E.; Walker, Andrea C.; Baker, Ardith

    2010-01-01

    The authors used stratified random sampling to assess the prevalence and severity of bereavement in college undergraduates, providing an advance over findings that emerge from convenience sampling methods or from anecdotal observations. Prior research using convenience sampling indicated that 22% to 30% of college students are within 12 months of…

  13. The Jackprot Simulation Couples Mutation Rate with Natural Selection to Illustrate How Protein Evolution Is Not Random

    PubMed Central

    Espinosa, Avelina; Bai, Chunyan Y.

    2016-01-01

    Protein evolution is not a random process. Views which attribute randomness to molecular change, deleterious nature to single-gene mutations, insufficient geological time, or population size for molecular improvements to occur, or invoke “design creationism” to account for complexity in molecular structures and biological processes, are unfounded. Scientific evidence suggests that natural selection tinkers with molecular improvements by retaining adaptive peptide sequence. We used slot-machine probabilities and ion channels to show biological directionality on molecular change. Because ion channels reside in the lipid bilayer of cell membranes, their residue location must be in balance with the membrane's hydrophobic/philic nature; a selective “pore” for ion passage is located within the hydrophobic region. We contrasted the random generation of DNA sequence for KcsA, a bacterial two-transmembrane-domain (2TM) potassium channel, from Streptomyces lividans, with an under-selection scenario, the “jackprot,” which predicted much faster evolution than by chance. We wrote a computer program in JAVA APPLET version 1.0 and designed an online interface, The Jackprot Simulation http://faculty.rwu.edu/cbai/JackprotSimulation.htm, to model a numerical interaction between mutation rate and natural selection during a scenario of polypeptide evolution. Winning the “jackprot,” or highest-fitness complete-peptide sequence, required cumulative smaller “wins” (rewarded by selection) at the first, second, and third positions in each of the 161 KcsA codons (“jackdons” that led to “jackacids” that led to the “jackprot”). The “jackprot” is a didactic tool to demonstrate how mutation rate coupled with natural selection suffices to explain the evolution of specialized proteins, such as the complex six-transmembrane (6TM) domain potassium, sodium, or calcium channels. Ancestral DNA sequences coding for 2TM-like proteins underwent nucleotide

  14. The Jackprot Simulation Couples Mutation Rate with Natural Selection to Illustrate How Protein Evolution Is Not Random.

    PubMed

    Paz-Y-Miño C, Guillermo; Espinosa, Avelina; Bai, Chunyan Y

    2011-09-01

    Protein evolution is not a random process. Views which attribute randomness to molecular change, deleterious nature to single-gene mutations, insufficient geological time, or population size for molecular improvements to occur, or invoke "design creationism" to account for complexity in molecular structures and biological processes, are unfounded. Scientific evidence suggests that natural selection tinkers with molecular improvements by retaining adaptive peptide sequence. We used slot-machine probabilities and ion channels to show biological directionality on molecular change. Because ion channels reside in the lipid bilayer of cell membranes, their residue location must be in balance with the membrane's hydrophobic/philic nature; a selective "pore" for ion passage is located within the hydrophobic region. We contrasted the random generation of DNA sequence for KcsA, a bacterial two-transmembrane-domain (2TM) potassium channel, from Streptomyces lividans, with an under-selection scenario, the "jackprot," which predicted much faster evolution than by chance. We wrote a computer program in JAVA APPLET version 1.0 and designed an online interface, The Jackprot Simulation http://faculty.rwu.edu/cbai/JackprotSimulation.htm, to model a numerical interaction between mutation rate and natural selection during a scenario of polypeptide evolution. Winning the "jackprot," or highest-fitness complete-peptide sequence, required cumulative smaller "wins" (rewarded by selection) at the first, second, and third positions in each of the 161 KcsA codons ("jackdons" that led to "jackacids" that led to the "jackprot"). The "jackprot" is a didactic tool to demonstrate how mutation rate coupled with natural selection suffices to explain the evolution of specialized proteins, such as the complex six-transmembrane (6TM) domain potassium, sodium, or calcium channels. Ancestral DNA sequences coding for 2TM-like proteins underwent nucleotide "edition" and gene duplications to generate the 6
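
    The contrast the authors draw between blind chance and cumulative selection can be sketched with a toy simulation (a minimal re-implementation of the idea only, not the authors' JAVA applet; the target sequence below is arbitrary):

```python
import random

def cumulative_selection(target: str, alphabet: str = "ACGT", seed: int = 1) -> int:
    """Count the generations needed to reach `target` when selection
    retains each correct nucleotide once it appears (cumulative small
    'wins'), while mismatched positions keep mutating at random."""
    rng = random.Random(seed)
    current = [rng.choice(alphabet) for _ in target]
    generations = 0
    while "".join(current) != target:
        generations += 1
        for i, ch in enumerate(current):
            if ch != target[i]:                    # selection retains matches;
                current[i] = rng.choice(alphabet)  # only mismatches re-mutate
    return generations

# Cumulative selection finds a 24-nt target in a handful of generations,
# whereas blind chance would need on the order of 4**24 trials.
gens = cumulative_selection("ATGGCTAAGCTTGGATCCGAATTC")
```
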

  15. The Effect of Basis Selection on Static and Random Acoustic Response Prediction Using a Nonlinear Modal Simulation

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Przekop, Adam

    2005-01-01

    An investigation of the effect of basis selection on geometric nonlinear response prediction using a reduced-order nonlinear modal simulation is presented. The accuracy is dictated by the selection of the basis used to determine the nonlinear modal stiffness. This study considers a suite of available bases including bending modes only, bending and membrane modes, coupled bending and companion modes, and uncoupled bending and companion modes. The nonlinear modal simulation presented is broadly applicable and is demonstrated for nonlinear quasi-static and random acoustic response of flat beam and plate structures with isotropic material properties. Reduced-order analysis predictions are compared with those made using a numerical simulation in physical degrees-of-freedom to quantify the error associated with the selected modal bases. Bending and membrane responses are separately presented to help differentiate the bases.

  16. Does the Use of a Decision Aid Improve Decision Making in Prosthetic Heart Valve Selection? A Multicenter Randomized Trial.

    PubMed

    Korteland, Nelleke M; Ahmed, Yunus; Koolbergen, David R; Brouwer, Marjan; de Heer, Frederiek; Kluin, Jolanda; Bruggemans, Eline F; Klautz, Robert J M; Stiggelbout, Anne M; Bucx, Jeroen J J; Roos-Hesselink, Jolien W; Polak, Peter; Markou, Thanasie; van den Broek, Inge; Ligthart, Rene; Bogers, Ad J J C; Takkenberg, Johanna J M

    2017-02-01

    A Dutch online patient decision aid to support prosthetic heart valve selection was recently developed. A multicenter randomized controlled trial was conducted to assess whether use of the patient decision aid results in optimization of shared decision making in prosthetic heart valve selection. In a 5-center randomized controlled trial, patients were allocated to receive either standard preoperative care (control group) or additional access to the patient decision aid (intervention group). Legally capable adult patients accepted for elective isolated or combined aortic and mitral valve replacement were included. Primary outcome was preoperative decisional conflict (Decisional Conflict Scale); secondary outcomes included patient knowledge, involvement in valve selection, anxiety and depression, (valve-specific) quality of life, and regret. Out of 306 eligible patients, 155 were randomized (78 control and 77 intervention). Preoperative decisional conflict did not differ between the groups (34% versus 33%; P=0.834). Intervention patients felt better informed (median Decisional Conflict Scale informed subscore: 8 versus 17; P=0.046) and had a better knowledge of prosthetic valves (85% versus 68%; P=0.004). Intervention patients experienced less anxiety and depression (median Hospital Anxiety and Depression Scale score: 6 versus 9; P=0.015) and better mental well-being (mean Short Form Health Survey score: 54 versus 50; P=0.032). Three months postoperatively, valve-specific quality of life and regret did not differ between the groups. A patient decision aid to support shared decision making in prosthetic heart valve selection does not lower decisional conflict. It does result in more knowledgeable, better informed, and less anxious and depressed patients, with a better mental well-being. http://www.trialregister.nl. Unique identifier: NTR4350. © 2017 American Heart Association, Inc.

  17. Single-primer-limited amplification: a method to generate random single-stranded DNA sub-library for aptamer selection.

    PubMed

    He, Chao-Zhu; Zhang, Kun-He; Wang, Ting; Wan, Qin-Si; Hu, Piao-Ping; Hu, Mei-Di; Huang, De-Qiang; Lv, Nong-Hua

    2013-09-01

    The amplification of a random single-stranded DNA (ssDNA) library by polymerase chain reaction (PCR) is a key step in each round of aptamer selection by systematic evolution of ligands by exponential enrichment (SELEX), but it can be impeded by the amplification of by-products arising from severe nonspecific hybridization among the various sequences in the PCR system. To amplify a random ssDNA library free from by-products, we developed a novel method termed single-primer-limited amplification (SPLA), which was initiated from the amplification of minus-stranded DNA (msDNA) of an ssDNA library with reverse primer limited to 5-fold molar quantity of the template, followed by the amplification of plus-stranded DNA (psDNA) of the msDNA with forward primer limited to 10-fold molar quantity of the template and recovery of psDNA by gel excision. We found that the amount of by-products increased with the template amount and the number of thermal cycles. With the optimized template amount and thermal cycling, SPLA could amplify target ssDNA without detectable by-products or nonspecific products and could produce 16.1 times as much psDNA as asymmetric PCR. In conclusion, SPLA is a simple and feasible method to efficiently generate a random ssDNA sub-library for aptamer selection.

  18. Final state-selected spectra in unimolecular reactions: A transition-state-based random matrix model for overlapping resonances

    SciTech Connect

    Peskin, U.; Miller, W.H.; Reisler, H.

    1995-06-08

    Final state-selected spectra in unimolecular decomposition are obtained by a random matrix version of Feshbach's optical model. The number of final states which are independently coupled to the molecular quasibound states is identified with the number of states at the dividing surface of transition state theory (TST). The coupling of the transition state to the molecular complex is modeled via a universal random matrix effective Hamiltonian which is characterized by its resonance eigenstates and provides the correct average unimolecular decay rate. The transition from nonoverlapping resonances, which are associated with isolated Lorentzian spectral peaks, to overlapping resonances, associated with more complex spectra, is characterized in terms of deviations from a χ²-like distribution of the resonance widths and the approach to a random phase-distribution of the resonance scattering amplitudes. The evolution of the system from a tight transition state to reaction products is treated explicitly as a scattering process where specific dynamics can be incorporated. Comparisons with recently measured final state-selected spectra and rotational distributions for the unimolecular reaction of NO₂ show that the present model provides a useful new approach for understanding and interpreting experimental results which are dominated by overlapping resonances.

  19. Robust estimates of divergence times and selection with a poisson random field model: a case study of comparative phylogeographic data.

    PubMed

    Amei, Amei; Smith, Brian Tilston

    2014-01-01

    Mutation frequencies can be modeled as a Poisson random field (PRF) to estimate speciation times and the degree of selection on newly arisen mutations. This approach provides a quantitative theory for comparing intraspecific polymorphism with interspecific divergence in the presence of selection and can be used to estimate population genetic parameters. Although the original PRF model has been extended to more general biological settings to make statistical inference about selection and divergence among model organisms, it has not been incorporated into phylogeographic studies that focus on estimating population genetic parameters for nonmodel organisms. Here, we modified a recently developed time-dependent PRF model to independently estimate genetic parameters from a nuclear and mitochondrial DNA data set of 22 sister pairs of birds that have diverged across a biogeographic barrier. We found that species that inhabit humid habitats had more recent divergence times and larger effective population sizes than those that inhabit drier habitats, and divergence times estimated from the PRF model were similar to estimates from a coalescent species-tree approach. Selection coefficients were higher in sister pairs that inhabited drier habitats than in those in humid habitats, but overall the mitochondrial DNA was under weak selection. Our study indicates that PRF models are useful for estimating various population genetic parameters and serve as a framework for incorporating estimates of selection into comparative phylogeographic studies.

  20. r2VIM: A new variable selection method for random forests in genome-wide association studies.

    PubMed

    Szymczak, Silke; Holzinger, Emily; Dasgupta, Abhijit; Malley, James D; Molloy, Anne M; Mills, James L; Brody, Lawrence C; Stambolian, Dwight; Bailey-Wilson, Joan E

    2016-01-01

    Machine learning methods and in particular random forests (RFs) are a promising alternative to standard single SNP analyses in genome-wide association studies (GWAS). RFs provide variable importance measures (VIMs) to rank SNPs according to their predictive power. However, in contrast to the established genome-wide significance threshold, no clear criteria exist to determine how many SNPs should be selected for downstream analyses. We propose a new variable selection approach, recurrent relative variable importance measure (r2VIM). Importance values are calculated relative to an observed minimal importance score for several runs of RF and only SNPs with large relative VIMs in all of the runs are selected as important. Evaluations on simulated GWAS data show that the new method controls the number of false-positives under the null hypothesis. Under a simple alternative hypothesis with several independent main effects it is only slightly less powerful than logistic regression. In an experimental GWAS data set, the same strong signal is identified while the approach selects none of the SNPs in an underpowered GWAS. The novel variable selection method r2VIM is a promising extension to standard RF for objectively selecting relevant SNPs in GWAS while controlling the number of false-positive results.
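
    The selection rule can be sketched independently of any particular random forest implementation; the per-run importance scores below would come from, e.g., RF permutation importances. This is a simplified sketch of the published method's idea, with a hypothetical threshold, not the released r2VIM software:

```python
import numpy as np

def r2vim_select(importance_runs, threshold=3.0):
    """Sketch of the r2VIM rule: scale each run's importance scores by
    the magnitude of that run's minimal observed score (treated as a
    noise floor), then keep only variables whose relative importance
    exceeds `threshold` in every run."""
    runs = np.asarray(importance_runs, dtype=float)
    floors = np.maximum(np.abs(runs).min(axis=1, keepdims=True), 1e-12)
    relative = runs / floors
    return np.where((relative >= threshold).all(axis=0))[0]

# Two runs over three variables: only the last variable is consistently
# far above each run's noise floor, so only it is selected.
selected = r2vim_select([[0.01, 0.02, 0.50],
                         [0.02, 0.01, 0.40]])
```

    Requiring a large relative importance in *all* runs, rather than in any single run, is what controls false positives under the null.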

  1. Identification of polypeptides with selective affinity to intact mouse cerebellar granule neurons from a random peptide-presenting phage library.

    PubMed

    Hou, Sheng T; Dove, Mike; Anderson, Erica; Zhang, Jiangbing; MacKenzie, C Roger

    2004-09-30

    Targeting of postmitotic neurons selectively for gene delivery poses a challenge. One way to achieve such selective targeting is to link the gene delivery vector with small ligand-binding polypeptides which have selective affinity to intact neurons. In order to identify such novel neuron-selective polypeptides, we screened a phage-display library displaying random 12-mer polypeptides and subtractively bio-panned for clones having selectivity towards cultured mouse cerebellar granule neurons. The selected phage clones were amplified and sequenced. Affinities of these clones to neurons were determined by the visible presence or absence of fluorescence of phage particles as detected by immunocytochemistry using an antibody to M-13 phage. This affinity was further qualified by how much phage was bound, and where in or on the cell it tended to accumulate. The selectivity of binding to neurons was determined by the negative binding of these clones to several cultured non-neuronal cells, including primary glial cells, NT2 cells, human embryonic kidney 293 cells, neuroblastoma cells, and mouse 3T3 cells. Among the 46 clones that we sequenced and characterized, four clones appeared to have excellent selectivity in binding to neurons. Homology comparison of these polypeptides revealed that three of them contained a consensus D(E)-W(F)-I(N)-D-W motif. This motif was also present in the Bdm1 gene product, which is predominantly expressed in postnatal brains. Further characterization of these polypeptides is required to reveal their utility as effective linkers to facilitate gene transfer selectively to neurons.

  2. Selecting Random Latin Hypercube Dimensions and Designs through Estimation of Maximum Absolute Pairwise Correlation

    DTIC Science & Technology

    2012-12-01

    values are assigned at random to the n design points, with all n! possible permutations being equally likely. This generates the Xj column in the ... design matrix. The permutation process is performed independently for each of the k input variables. Therefore, for each column Xj, all of the n ... lattice RLH corresponds to independently generating k permutations of the first n natural numbers and appropriately scaling the columns to cover the
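
    The construction described above — one independent random permutation of the first n natural numbers per input variable, scaled to cover the unit interval — can be sketched as follows (a generic lattice random Latin hypercube, not code from the report):

```python
import numpy as np

def random_latin_hypercube(n: int, k: int, seed: int = 0) -> np.ndarray:
    """Generate an n-point, k-variable lattice random Latin hypercube:
    each column is an independent random permutation of 1..n, scaled so
    the n levels sit at the midpoints of n equal bins in [0, 1]."""
    rng = np.random.default_rng(seed)
    # one independent permutation of the first n natural numbers per column
    columns = [rng.permutation(n) + 1 for _ in range(k)]
    design = np.column_stack(columns)
    return (design - 0.5) / n

X = random_latin_hypercube(8, 3, seed=42)
```

    Each one-dimensional projection of such a design covers all n levels exactly once, which is the defining Latin hypercube property.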

  3. Random frog: an efficient reversible jump Markov Chain Monte Carlo-like approach for variable selection with applications to gene selection and disease classification.

    PubMed

    Li, Hong-Dong; Xu, Qing-Song; Liang, Yi-Zeng

    2012-08-31

    The identification of disease-relevant genes represents a challenge in microarray-based disease diagnosis where the sample size is often limited. Among established methods, reversible jump Markov Chain Monte Carlo (RJMCMC) methods have proven to be quite promising for variable selection. However, the design and application of an RJMCMC algorithm requires, for example, special criteria for prior distributions. Also, the simulation from joint posterior distributions of models is computationally expensive, and may even be mathematically intractable. These disadvantages may limit the applications of RJMCMC algorithms. Therefore, the development of algorithms that possess the advantages of RJMCMC methods and are also efficient and easy to follow for selecting disease-associated genes is required. Here we report a RJMCMC-like method, called random frog, that possesses the advantages of RJMCMC methods and is much easier to implement. Using the colon and the estrogen gene expression datasets, we show that random frog is effective in identifying discriminating genes. The top 2 ranked genes for colon and estrogen are Z50753, U00968, and Y10871_at, Z22536_at, respectively. (The source codes with GNU General Public License Version 2.0 are freely available to non-commercial users at: http://code.google.com/p/randomfrog/.)

  4. Inbreeding Effects on Quantitative Traits in Random Mating and Selected Populations of the Mulberry Silkworm, Bombyx mori

    PubMed Central

    Doreswamy, Jamuna; Gopal, Subramanya

    2012-01-01

    The objective of the present study was to estimate the level of inbreeding coefficient during inbreeding of the pedigree of random mating and selected populations of two distinct races of mulberry silkworm, Bombyx mori (Lepidoptera: Bombycidae), in the silkworm germplasm. The six generation data of the two races, namely multivoltine Pure Mysore and bivoltine NB4D2, were studied for inbreeding depression coefficient using the residual maximum likelihood method, utilizing two statistical models by analyzing six quantitative traits, namely, larval weight, cocoon weight, shell weight, shell ratio, pupation rate, and filament length. The results of the present experiment demonstrated that the inbreeding coefficient was significant in Model 1 for most of the economic traits in the random mating populations of both the races compared to those of selected populations. These results suggest that during stock maintenance, application of rigid selection for increased numbers of generations helps to retain original characteristics of the pure races while reducing the deleterious effects of inbreeding. The significance of inbreeding coefficient is discussed with reference to the inbreeding of silk moths in the silkworm germplasm. PMID:23461728

  5. Yearling trait comparisons among inbred lines and selected noninbred and randomly bred control groups of Rambouillet, Targhee and Columbia ewes.

    PubMed

    Ercanbrack, S K; Knight, A D

    1983-02-01

    Inbreeding with concurrent selection was used to develop 26 Rambouillet, 20 Targhee and 10 Columbia inbred lines of sheep. Inbreeding coefficients averaged 30, 29 and 30% for the three breeds, respectively, at the conclusion of the study. A selected noninbred control group and a randomly bred unselected control group were maintained for each breed. Yearling traits were evaluated for 545 Rambouillet, 572 Targhee and 411 Columbia yearling ewes, each belonging to one of the inbred lines or control groups. In each breed, the selected controls were generally of greatest overall merit, the unselected controls intermediate and the inbred lines of least merit. Only a few yearling traits of only a few inbred lines were superior (P less than .05) to those of their appropriate selected control groups. Selection within inbred lines was generally ineffective in offsetting inbreeding depression. However, single trait selection for traits of high heritability, notably yearling weight, clean fleece weight and staple length, appeared to compensate for inbreeding effects on those traits. Deleterious consequences of inbreeding were particularly apparent in yearling weight, average daily gain, type and condition scores, grease and clean fleece weights and index of overall merit. Inbreeding also resulted in fewer neck folds among inbreds of all three breeds. Correlations between the rankings of inbred lines at weaning and yearling ages were high for traits of higher heritability. Superiority of the selected controls in most traits was of about the same magnitude at weaning and yearling ages. In no case did the final overall merit (index value) of an inbred line of any of the three breeds significantly exceed the overall merit of its respective selected control group.

  6. Feature selection for MLP neural network: the use of random permutation of probabilistic outputs.

    PubMed

    Yang, Jian-Bo; Shen, Kai-Quan; Ong, Chong-Jin; Li, Xiao-Ping

    2009-12-01

    This paper presents a new wrapper-based feature selection method for multilayer perceptron (MLP) neural networks. It uses a feature ranking criterion to measure the importance of a feature by computing the aggregate difference, over the feature space, of the probabilistic outputs of the MLP with and without the feature. Thus, a score of importance with respect to every feature can be provided using this criterion. Based on the numerical experiments on several artificial and real-world data sets, the proposed method performs, in general, better than several selected feature selection methods for MLP, particularly when the data set is sparse or has many redundant features. In addition, as a wrapper-based approach, the computational cost for the proposed method is modest.
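
    The ranking criterion — the aggregate difference of the model's probabilistic outputs with and without a feature — can be sketched generically. In this sketch a feature is "removed" by replacing it with its column mean, one simple stand-in for the paper's exact removal scheme:

```python
import numpy as np

def feature_scores(predict_proba, X):
    """Score each feature by the mean absolute change in the model's
    probabilistic output when that feature is neutralized (here: replaced
    by its column mean).  `predict_proba` maps an (n, d) array to an
    (n,) array of probabilities; larger scores mean more important."""
    base = predict_proba(X)
    scores = []
    for j in range(X.shape[1]):
        Xj = X.copy()
        Xj[:, j] = X[:, j].mean()   # neutralize feature j
        scores.append(np.abs(predict_proba(Xj) - base).mean())
    return np.array(scores)
```

    A wrapper method like this evaluates features through the trained model itself, which is why it can catch redundant features that filter criteria miss, at the cost of one extra model evaluation per feature.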

  7. Natural Selection vs. Random Drift: Evidence from Temporal Variation in Allele Frequencies in Nature

    PubMed Central

    Mueller, Laurence D.; Barr, Lorraine G.; Ayala, Francisco J.

    1985-01-01

    We have obtained monthly samples of two species, Drosophila pseudoobscura and Drosophila persimilis, in a natural population from Napa County, California. In each species, about 300 genes have been assayed by electrophoresis for each of seven enzyme loci in each monthly sample from March 1972 to June 1975. Using statistical methods developed for the purpose, we have examined whether the allele frequencies at different loci vary in a correlated fashion. The methods used do not detect natural selection when it is deterministic (e.g., overdominance or directional selection), but only when alleles at different loci vary simultaneously in response to the same environmental variations. Moreover, only relatively large fitness differences (of the order of 15%) are detectable. We have found strong evidence of correlated allele frequency variation in 13–20% of the cases examined. We interpret this as evidence that natural selection plays a major role in the evolution of protein polymorphisms in nature. PMID:4054608

  8. Pattern selection and self-organization induced by random boundary initial values in a neuronal network

    NASA Astrophysics Data System (ADS)

    Ma, Jun; Xu, Ying; Wang, Chunni; Jin, Wuyin

    2016-11-01

    Regular spatial patterns can be observed in spatiotemporal systems far from equilibrium. Artificial networks with different topologies are often designed to reproduce the collective behaviors of nodes (or neurons), where the local kinetics of each node is described by some oscillator model. It is believed that the self-organization of a network depends strongly on the bifurcation parameters and the topology of connections. Indeed, boundary effects are very important for pattern formation in networks. In this paper, a regular network of Hindmarsh-Rose neurons is designed on a two-dimensional square array with nearest-neighbor connections. The neurons on the boundary are excited with a random stimulus. It is found that spiral waves, and even a pair of spiral waves, can develop in the network under appropriate coupling intensity; otherwise, the spatial distribution of the network shows irregular states. A statistical variable based on mean-field theory is defined to detect the collective behavior. It is confirmed that regular patterns can develop when the synchronization degree is low. The potential mechanism could be that random perturbation on the boundary induces coherence resonance-like behavior, so that spiral waves can develop in the network.
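
    A mean-field statistic for quantifying collective behavior in such a network can be sketched as a synchronization factor (one common definition in this literature; the paper's exact statistical variable may be defined differently):

```python
import numpy as np

def synchronization_factor(x: np.ndarray) -> float:
    """Synchronization factor R = Var_t(F) / mean_i Var_t(x_i) for a
    time series x of shape (timesteps, n_nodes), where F(t) is the
    instantaneous mean field.  R -> 1 for perfect synchronization and
    R -> 0 for incoherent (spatially disordered) dynamics."""
    mean_field = x.mean(axis=1)          # F(t): average over all nodes
    var_field = mean_field.var()         # temporal variance of the mean field
    var_nodes = x.var(axis=0).mean()     # average per-node temporal variance
    return float(var_field / var_nodes)
```

    Low R is consistent with the regime in which, per the abstract, regular patterns such as spiral waves develop: the nodes are locally coherent but not globally synchronized.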

  9. Instrument Selection for Randomized Controlled Trials: Why This and Not That?

    PubMed Central

    Records, Kathie; Keller, Colleen; Ainsworth, Barbara; Permana, Paska

    2011-01-01

    A fundamental linchpin for obtaining rigorous findings in quantitative research involves the selection of survey instruments. Psychometric recommendations are available for the processes of scale development and testing, as well as guidance for the selection of established scales. These processes are necessary to address the validity link between the phenomena under investigation, the empirical measures and, ultimately, the theoretical ties between these and the world views of the participants. Detailed information is most often provided about study design and protocols, but far less frequently is a detailed theoretical explanation provided for why specific instruments are chosen. Guidance to inform choices is often difficult to find when scales are needed for specific cultural, ethnic, or racial groups. This paper details the rationale underlying instrument selection for measurement of the major processes (intervention, mediator and moderator variables, outcome variables) in an ongoing study of postpartum Latinas, Madres para la Salud [Mothers for Health]. The rationale underpinning our choices includes a discussion of alternatives, when appropriate. These exemplars may provide direction for other intervention researchers who are working with specific cultural, racial, or ethnic groups or for other investigators who are seeking to select the ‘best’ instrument. Thoughtful consideration of measurement and articulation of the rationale underlying our choices facilitates the maintenance of rigor within the study design and improves our ability to assess study outcomes. PMID:21986392

  10. Specific and selective probes for Staphylococcus aureus from phage-displayed random peptide libraries.

    PubMed

    De Plano, Laura M; Carnazza, Santina; Messina, Grazia M L; Rizzo, Maria Giovanna; Marletta, Giovanni; Guglielmino, Salvatore P P

    2017-09-01

    Staphylococcus aureus is a major human pathogen causing health care-associated and community-associated infections. Early diagnosis is essential to prevent disease progression and to reduce complications that can be serious. In this study, we selected, from a 9-mer phage peptide library, a phage clone displaying a peptide capable of specific binding to the S. aureus cell surface, namely St.au9IVS5 (peptide sequence RVRSAPSSS). The ability of the isolated phage clone to interact specifically with S. aureus and the efficacy of its bacteria-binding properties were established by using an enzyme-linked immunosorbent assay (ELISA). We also demonstrated by Western blot analysis that the most reactive and selective phage peptide binds a 78 kDa protein on the bacterial cell surface. Furthermore, we observed selectivity of phage-bacteria binding, allowing identification of clinical isolates of S. aureus in comparison with a panel of other bacterial species. In order to explore the possibility of realizing a selective bacterial biosensor device based on immobilization of affinity-selected phage, we studied physisorbed phage deposition onto a mica surface. Atomic Force Microscopy (AFM) was used to determine the organization of phage on the mica surface, and the binding performance of mica-physisorbed phage to the bacterial target was then evaluated over time by fluorescence microscopy. The system is able to bind specifically about 50% of S. aureus cells after 15 minutes and 90% after one hour. Due to its specificity and rapidity, this biosensing strategy paves the way for the development of new, inexpensive biosensors, such as lab-on-chip (LOC) devices to detect bacterial agents in clinical diagnostics applications, for use in developing countries. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Moral hazard and selection among the poor: evidence from a randomized experiment.

    PubMed

    Spenkuch, Jörg L

    2012-01-01

    Not only does economic theory predict that high-risk individuals are more likely to purchase insurance, but insurance coverage is also thought to crowd out precautionary activities. In spite of stark theoretical predictions, there is conflicting empirical evidence on adverse selection, and evidence on ex ante moral hazard is very scarce. Using data from the Seguro Popular Experiment in Mexico, this paper documents patterns of selection on observables into health insurance as well as the existence of non-negligible ex ante moral hazard. More specifically, the findings indicate that (i) agents in poor self-assessed health prior to the intervention have, all else equal, a higher propensity to take up insurance; and (ii) insurance coverage reduces the demand for self-protection in the form of preventive care. Curiously, however, individuals do not sort based on objective measures of their health.

  12. A Randomized Controlled Trial of Cognitive Debiasing Improves Assessment and Treatment Selection for Pediatric Bipolar Disorder

    PubMed Central

    Jenkins, Melissa M.; Youngstrom, Eric A.

    2015-01-01

    Objective This study examined the efficacy of a new cognitive debiasing intervention in reducing decision-making errors in the assessment of pediatric bipolar disorder (PBD). Method The study was a randomized controlled trial using case vignette methodology. Participants were 137 mental health professionals working in different regions of the US (M=8.6±7.5 years of experience). Participants were randomly assigned to a (1) brief overview of PBD (control condition), or (2) the same brief overview plus a cognitive debiasing intervention (treatment condition) that educated participants about common cognitive pitfalls (e.g., base-rate neglect; search satisficing) and taught corrective strategies (e.g., mnemonics, Bayesian tools). Both groups evaluated four identical case vignettes. Primary outcome measures were clinicians’ diagnoses and treatment decisions. The vignette characters’ race/ethnicity was experimentally manipulated. Results Participants in the treatment group showed better overall judgment accuracy, p < .001, and committed significantly fewer decision-making errors, p < .001. Inaccurate and somewhat accurate diagnostic decisions were significantly associated with different treatment and clinical recommendations, particularly in cases where participants missed comorbid conditions, failed to detect the possibility of hypomania or mania in depressed youths, and misdiagnosed classic manic symptoms. In contrast, effects of patient race were negligible. Conclusions The cognitive debiasing intervention outperformed the control condition. Examining specific heuristics in cases of PBD may identify especially problematic mismatches between typical habits of thought and characteristics of the disorder. The debiasing intervention was brief and delivered via the Web; it has the potential to generalize and extend to other diagnoses as well as to various practice and training settings. PMID:26727411

  13. Code to generate random identifiers and select QA/QC samples

    USGS Publications Warehouse

    Mehnert, Edward

    1992-01-01

    SAMPLID is a PC-based FORTRAN-77 code that generates unique numbers for identification of samples, selection of QA/QC samples, and generation of labels. These procedures are tedious, but using a computer code such as SAMPLID can increase efficiency and reduce or eliminate errors and bias. The algorithm used in SAMPLID to generate pseudorandom numbers is free of the statistical flaws present in commonly available algorithms.
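    The abstract does not reproduce SAMPLID's algorithm, so the sketch below only illustrates the same tasks in Python: generating unique random sample identifiers and drawing a QA/QC subset. The ID format, the 10% QA/QC fraction, and the use of the standard library's Mersenne Twister (in place of the FORTRAN-77 generator) are all assumptions for illustration:

```python
import random

def make_sample_ids(n, seed=None, width=6):
    """Generate n unique random sample identifiers: zero-padded numbers
    drawn without replacement, so no ID can repeat."""
    rng = random.Random(seed)
    numbers = rng.sample(range(10**width), n)   # unique by construction
    return [f"S{num:0{width}d}" for num in numbers]

def select_qaqc(ids, fraction=0.1, seed=None):
    """Randomly flag a fraction of the samples as QA/QC duplicates."""
    rng = random.Random(seed)
    k = max(1, round(len(ids) * fraction))
    return sorted(rng.sample(ids, k))

ids = make_sample_ids(20, seed=42)
qaqc = select_qaqc(ids, fraction=0.1, seed=42)
print(len(ids), len(set(ids)), len(qaqc))
```

    Drawing without replacement (`random.sample`) guarantees uniqueness, which is the property the abstract emphasizes for avoiding labeling errors and bias.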

  14. The ecological effects of universal and selective violence prevention programs for middle school students: a randomized trial.

    PubMed

    2009-06-01

    This study reports the findings of a multisite randomized trial evaluating the separate and combined effects of 2 school-based approaches to reduce violence among early adolescents. A total of 37 schools at 4 sites were randomized to 4 conditions: (1) a universal intervention that involved implementing a student curriculum and teacher training with 6th-grade students and teachers, (2) a selective intervention in which a family intervention was implemented with a subset of 6th-grade students exhibiting high levels of aggression and social influence, (3) a combined intervention condition, and (4) a no-intervention control condition. Analyses of multiple waves of data from 2 cohorts of students at each school (N = 5,581) within the grade targeted by the interventions revealed a complex pattern. There was some evidence to suggest that the universal intervention was associated with increases in aggression and reductions in victimization; however, these effects were moderated by preintervention risk. In contrast, the selective intervention was associated with decreases in aggression but no changes in victimization. These findings have important implications for efforts to develop effective violence prevention programs.

  15. The Ecological Effects of Universal and Selective Violence Prevention Programs for Middle School Students: A Randomized Trial

    PubMed Central

    2009-01-01

    This study reports the findings of a multisite randomized trial evaluating the separate and combined effects of 2 school-based approaches to reduce violence among early adolescents. A total of 37 schools at 4 sites were randomized to 4 conditions: (1) a universal intervention that involved implementing a student curriculum and teacher training with 6th-grade students and teachers, (2) a selective intervention in which a family intervention was implemented with a subset of 6th-grade students exhibiting high levels of aggression and social influence, (3) a combined intervention condition, and (4) a no-intervention control condition. Analyses of multiple waves of data from 2 cohorts of students at each school (N = 5,581) within the grade targeted by the interventions revealed a complex pattern. There was some evidence to suggest that the universal intervention was associated with increases in aggression and reductions in victimization; however, these effects were moderated by preintervention risk. In contrast, the selective intervention was associated with decreases in aggression but no changes in victimization. These findings have important implications for efforts to develop effective violence prevention programs. PMID:19485593

  16. Randomized Comparison of Actual and Ideal Body Weight for Size Selection of the Laryngeal Mask Airway Classic in Overweight Patients.

    PubMed

    Kim, Min-Soo; Lee, Jong Seok; Nam, Sang Beom; Kang, Hyo Jong; Kim, Ji Eun

    2015-08-01

    Size selection of the laryngeal mask airway (LMA) Classic based on actual body weight remains common practice. However, ideal body weight might allow better size selection in obese patients. The purpose of our study was to compare the utility of ideal body weight and actual body weight for choosing the appropriate size of the LMA Classic in a randomized clinical trial. One hundred patients aged 20 to 70 yr, with body mass index ≥25 kg/m(2) and a difference between the LMA sizes indicated by actual and ideal body weight, were allocated to insertion of the LMA Classic using either actual body weight or ideal body weight in a weight-based formula for size selection. After insertion of the device, several variables, including insertion parameters, sealing function, fiberoptic imaging, and complications, were investigated. The insertion success rate at the first attempt was lower in the actual weight group (82%) than in the ideal weight group (96%), although the difference was not statistically significant. The ideal weight group had significantly shorter insertion time and easier placement. However, fiberoptic views were significantly better in the actual weight group. Intraoperative complications, sore throat in the recovery room, and dysphonia at postoperative 24 hr occurred significantly less often in the ideal weight group than in the actual weight group. These results suggest that ideal body weight may be beneficial for size selection of the LMA Classic in overweight patients (Clinical Trial Registry, NCT 01843270).

  17. Simple Random Sampling-Based Probe Station Selection for Fault Detection in Wireless Sensor Networks

    PubMed Central

    Huang, Rimao; Qiu, Xuesong; Rui, Lanlan

    2011-01-01

    Fault detection for wireless sensor networks (WSNs) has been studied intensively in recent years. Most existing works statically choose the manager nodes as probe stations and probe the network at a fixed frequency. This straightforward solution, however, leads to several deficiencies. Firstly, assigning the fault detection task only to the manager node puts the whole network out of balance and quickly overloads the already heavily burdened manager node, which in turn shortens the lifetime of the whole network. Secondly, probing at a fixed frequency often generates too much useless network traffic, which wastes the limited network energy. Thirdly, the traditional algorithm for choosing a probing node is too complicated to be used in energy-critical wireless sensor networks. In this paper, we study the distribution characteristics of fault nodes in wireless sensor networks and validate the Pareto principle that a small number of clusters contain most of the faults. We then present a Simple Random Sampling-based algorithm to dynamically choose sensor nodes as probe stations. A dynamic adjustment rule for the probing frequency is also proposed to reduce the number of useless probing packets. The simulation experiments demonstrate that the algorithm and adjustment rule we present can effectively prolong the lifetime of a wireless sensor network without decreasing the fault detection rate. PMID:22163789

  18. Simple random sampling-based probe station selection for fault detection in wireless sensor networks.

    PubMed

    Huang, Rimao; Qiu, Xuesong; Rui, Lanlan

    2011-01-01

    Fault detection for wireless sensor networks (WSNs) has been studied intensively in recent years. Most existing works statically choose the manager nodes as probe stations and probe the network at a fixed frequency. This straightforward solution, however, leads to several deficiencies. Firstly, assigning the fault detection task only to the manager node puts the whole network out of balance and quickly overloads the already heavily burdened manager node, which in turn shortens the lifetime of the whole network. Secondly, probing at a fixed frequency often generates too much useless network traffic, which wastes the limited network energy. Thirdly, the traditional algorithm for choosing a probing node is too complicated to be used in energy-critical wireless sensor networks. In this paper, we study the distribution characteristics of fault nodes in wireless sensor networks and validate the Pareto principle that a small number of clusters contain most of the faults. We then present a Simple Random Sampling-based algorithm to dynamically choose sensor nodes as probe stations. A dynamic adjustment rule for the probing frequency is also proposed to reduce the number of useless probing packets. The simulation experiments demonstrate that the algorithm and adjustment rule we present can effectively prolong the lifetime of a wireless sensor network without decreasing the fault detection rate.
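    A toy sketch of the two ideas, assuming hypothetical node names, sample size, and interval bounds (the paper's actual selection and adjustment rules are more elaborate): probe stations drawn by simple random sampling, so every node is equally likely to carry the probing burden in a given round, plus a dynamic rule that probes more often after a fault and backs off otherwise:

```python
import random

def choose_probe_stations(node_ids, k, seed=None):
    """Simple random sampling: every node has equal probability of
    serving as a probe station this round."""
    rng = random.Random(seed)
    return rng.sample(node_ids, k)

def adjust_interval(interval, faults_found, lo=5.0, hi=60.0):
    """Hypothetical adjustment rule: halve the probing interval after a
    fault, back off by 50% when the network looks healthy, and clamp
    to [lo, hi] seconds."""
    interval = interval / 2 if faults_found else interval * 1.5
    return min(max(interval, lo), hi)

nodes = [f"n{i}" for i in range(100)]
probes = choose_probe_stations(nodes, k=10, seed=1)
interval = adjust_interval(30.0, faults_found=True)
print(len(probes), interval)
```

    Rotating the probe stations each round spreads the energy cost across the network instead of concentrating it on the manager node, which is the load-balancing argument the abstract makes.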

  19. Benefits of Selected Physical Exercise Programs in Detention: A Randomized Controlled Study

    PubMed Central

    Battaglia, Claudia; di Cagno, Alessandra; Fiorilli, Giovanni; Giombini, Arrigo; Fagnani, Federica; Borrione, Paolo; Marchetti, Marco; Pigozzi, Fabio

    2013-01-01

    The aim of the study was to determine which kind of physical activity could be useful to inmate populations to improve their health status and fitness levels. A repeated-measures design was used to evaluate the effects of two different training protocols on subjects in a state of detention, tested pre- and post-experimental protocol. Seventy-five male subjects were enrolled in the study and randomly allocated to three groups: the cardiovascular plus resistance training protocol group (CRT) (n = 25; mean age 30.9 ± 8.9 years), the high-intensity strength training protocol group (HIST) (n = 25; mean age 33.9 ± 6.8 years), and a control group (C) (n = 25; mean age 32.9 ± 8.9 years) receiving no treatment. All subjects underwent a clinical assessment and fitness tests. MANOVA revealed significant multivariate effects of group (p < 0.01) and of the group-training interaction (p < 0.05). The CRT protocol proved the most effective at improving fitness test outcomes. Both the CRT and HIST protocols produced significant gains in the functional capacity (cardio-respiratory capacity and decrease in cardiovascular disease risk) of incarcerated males. The significant gains obtained in functional capacity reflect the great potential of supervised exercise interventions for improving the health status of incarcerated people. PMID:24185842

  20. Content analysis of a stratified random selection of JVME articles: 1974-2004.

    PubMed

    Olson, Lynne E

    2011-01-01

    A content analysis was performed on a random sample (N = 168) of 25% of the articles published in the Journal of Veterinary Medical Education (JVME) per year from 1974 through 2004. Over time, there were increased numbers of authors per paper, more cross-institutional collaborations, greater prevalence of references or endnotes, and lengthier articles, which could indicate a trend toward publications describing more complex or complete work. The number of first authors that could be identified as female was greatest for the most recent time period studied (2000-2004). Two different categorization schemes were created to assess the content of the publications. The first categorization scheme identified the most frequently published topics as admissions, descriptions of courses, the effect of changing teaching methods, issues facing the profession, and examples of uses of technology. The second categorization scheme identified the subset of articles that described medical education research on the basis of the purpose of the research, which represented only 14% of the sample articles (24 of 168). Of that group, only three of 24, or 12%, represented studies based on a firm conceptual framework that could be confirmed or refuted by the study's results. The results indicate that JVME is meeting its broadly based mission and that publications in the veterinary medical education literature have features common to publications in medicine and medical education.

  1. Random walk designs for selecting pool sizes in group testing estimation with small samples.

    PubMed

    Haber, Gregory; Malinovsky, Yaakov

    2017-08-09

    Group testing estimation, which utilizes pooled rather than individual units for testing, has been an ongoing area of research for over six decades. While it is often argued that such methods can yield large savings in terms of resources and/or time, these benefits depend very much on the initial choice of pool sizes. In fact, when poor group sizes are used, the results can be much worse than those obtained using standard techniques. Tools for addressing this problem in the literature have been based on either large sample results or prior knowledge of the parameter being estimated, with little guidance when these assumptions are not met. In this paper, we introduce and study random walk designs for choosing pool sizes when only a small number of tests can be run and prior knowledge is vague. To illustrate these methods, application is made to the estimation of prevalence for two diseases among Australian chrysanthemum crops. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
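    The random walk design itself adapts the pool size between tests, but the estimator underneath is the classical MLE of prevalence from equal-size pools: for m pools of size k with T testing positive, p̂ = 1 − (1 − T/m)^(1/k), assuming a perfect test. A minimal sketch with made-up numbers:

```python
def pooled_prevalence_mle(positive_pools, total_pools, pool_size):
    """MLE of individual prevalence from equal-size pooled tests:
    p_hat = 1 - (1 - T/m)**(1/k).  Assumes a perfect (error-free) test."""
    if not 0 <= positive_pools <= total_pools:
        raise ValueError("positive_pools must lie in [0, total_pools]")
    frac_negative = 1 - positive_pools / total_pools
    return 1 - frac_negative ** (1 / pool_size)

# Hypothetical survey: 4 of 20 pools, each pooling 10 plants, test positive.
p_hat = pooled_prevalence_mle(4, 20, 10)
print(round(p_hat, 4))
```

    The estimator's sensitivity to k is exactly why the initial choice of pool sizes matters so much: a pool size far from the optimum for the true prevalence inflates the variance of p̂, which is the problem the random walk designs address.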

  2. Conflicts of Interest, Selective Inertia, and Research Malpractice in Randomized Clinical Trials: An Unholy Trinity

    PubMed Central

    Berger, Vance W.

    2014-01-01

    Recently a great deal of attention has been paid to conflicts of interest in medical research, and the Institute of Medicine has called for more research into this important area. One research question that has not received sufficient attention concerns the mechanisms of action by which conflicts of interest can result in biased and/or flawed research. What discretion do conflicted researchers have to sway the results one way or the other? We address this issue from the perspective of selective inertia, or an unnatural selection of research methods based on which are most likely to establish the preferred conclusions, rather than on which are most valid. In many cases it is abundantly clear that a method that is not being used in practice is superior to the one that is being used in practice, at least from the perspective of validity, and that it is only inertia, as opposed to any serious suggestion that the incumbent method is superior (or even comparable), that keeps the inferior procedure in use, to the exclusion of the superior one. By focusing on these flawed research methods we can go beyond statements of potential harm from real conflicts of interest, and can more directly assess actual (not potential) harm. PMID:25150846

  3. The Effect of Basis Selection on Thermal-Acoustic Random Response Prediction Using Nonlinear Modal Simulation

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Przekop, Adam

    2004-01-01

    The goal of this investigation is to further develop nonlinear modal numerical simulation methods for prediction of geometrically nonlinear response due to combined thermal-acoustic loadings. As with any such method, the accuracy of the solution is dictated by the selection of the modal basis, through which the nonlinear modal stiffness is determined. In this study, a suite of available bases are considered including (i) bending modes only; (ii) coupled bending and companion modes; (iii) uncoupled bending and companion modes; and (iv) bending and membrane modes. Comparison of these solutions with numerical simulation in physical degrees-of-freedom indicates that inclusion of any membrane mode variants (ii - iv) in the basis affects the bending displacement and stress response predictions. The most significant effect is on the membrane displacement, where it is shown that only the type (iv) basis accurately predicts its behavior. Results are presented for beam and plate structures in the thermally pre-buckled regime.

  4. Selective medicated (saline + natural surfactant) bronchoalveolar lavage in unilateral lung contusion. A clinical randomized controlled trial.

    PubMed

    Marraro, Giuseppe A; Denaro, Carmelo; Spada, Claudio; Luchetti, Marco; Giansiracusa, Carla

    2010-02-01

    Open-lung, low tidal volume ventilation appears to be a promising ventilation strategy for chest trauma, as it can reduce ARDS and improve outcome. Local therapy (e.g., BAL) can act synergistically to remove debris from the lung, mitigate the inflammatory cascade, and prevent damage from spreading to uncompromised lung areas. Forty-four patients with pulmonary contusion were randomized to receive broncho-suction and volume-controlled low tidal volume ventilation (VCLTVV) (Control Group) or the same ventilation plus medicated (saline + surfactant) BAL (Treatment Group). A tidal volume <10 ml/kg, PEEP of 10-12 cm H(2)O, PaO(2) of 60-100 mm Hg, and PaCO(2) of 35-45 mm Hg were used in both groups. BAL was performed using a fiberscope: 4 boluses of 25 ml saline with 2.4 mg/ml of surfactant were introduced into each contused lobe, into which, subsequently, 240 mg of surfactant was instilled. All patients survived. In the Control Group, 18 patients developed pneumonia and 5 developed ARDS, and days of intubation were 11.50 (3.83), compared to 5.05 (1.21) in the Treatment Group, in which OI and PaO(2)/FiO(2) improved significantly from 36 h. VCLTVV alone was not able to prevent ARDS and infection in the Control Group, nor to reduce the duration of intubation. In the Treatment Group, VCLTVV and medicated BAL facilitated the removal of degraded lung material and recruited the contused lung regions, enabling healing of the lung pathology.

  5. Mirnacle: machine learning with SMOTE and random forest for improving selectivity in pre-miRNA ab initio prediction.

    PubMed

    Marques, Yuri Bento; de Paiva Oliveira, Alcione; Ribeiro Vasconcelos, Ana Tereza; Cerqueira, Fabio Ribeiro

    2016-12-15

    MicroRNAs (miRNAs) are key gene expression regulators in plants and animals. miRNAs are therefore involved in several biological processes, making the study of these molecules one of the most relevant topics of molecular biology today. However, characterizing miRNAs in vivo is still a complex task. As a consequence, in silico methods have been developed to predict miRNA loci. A common ab initio strategy to find miRNAs in genomic data is to search for sequences that can fold into the typical hairpin structure of miRNA precursors (pre-miRNAs). The current ab initio approaches, however, have selectivity issues, i.e., a high number of false positives is reported, which can lead to laborious and costly attempts at biological validation. This study presents an extension of the ab initio method miRNAFold, with the aim of improving selectivity through machine learning techniques, namely random forest combined with the SMOTE procedure, which copes with imbalanced datasets. By comparing our method, termed Mirnacle, with other important approaches in the literature, we demonstrate that Mirnacle substantially improves selectivity without compromising sensitivity. For the three datasets used in our experiments, our method achieved at least 97% sensitivity and delivered two-fold, 20-fold, and 6-fold increases in selectivity, respectively, compared with the best results of current computational tools. The extension of miRNAFold with machine learning techniques significantly increases selectivity in pre-miRNA ab initio prediction, which contributes to advanced studies on miRNAs, as the need for biological validation is diminished. Hopefully, new research, such as studies of severe diseases caused by miRNA malfunction, will benefit from the proposed computational tool.
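    Mirnacle combines SMOTE with a random forest classifier; the random forest would come from a standard library, but the SMOTE step itself is short enough to sketch in pure NumPy. The data, neighbour count, and synthetic-sample count below are illustrative:

```python
import numpy as np

def smote(X_min, n_synthetic, k=5, seed=0):
    """Minimal SMOTE sketch: each synthetic point is interpolated between
    a random minority-class sample and one of its k nearest minority-class
    neighbours, oversampling the minority class without duplicating rows."""
    rng = np.random.default_rng(seed)
    X_min = np.asarray(X_min, dtype=float)
    n = len(X_min)
    k = min(k, n - 1)
    # pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None] - X_min[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)              # a point is not its own neighbour
    neighbours = np.argsort(d, axis=1)[:, :k]
    out = np.empty((n_synthetic, X_min.shape[1]))
    for i in range(n_synthetic):
        j = rng.integers(n)                  # pick a minority sample
        nb = neighbours[j, rng.integers(k)]  # pick one of its neighbours
        gap = rng.random()                   # interpolation factor in [0, 1)
        out[i] = X_min[j] + gap * (X_min[nb] - X_min[j])
    return out

X_minority = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
synthetic = smote(X_minority, n_synthetic=8)
print(synthetic.shape)
```

    Each synthetic point lies on the segment between a minority sample and a minority-class neighbour, which is how SMOTE rebalances a training set dominated by non-pre-miRNA negatives before the classifier is fit.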

  6. Random forests for feature selection in QSPR Models - an application for predicting standard enthalpy of formation of hydrocarbons

    PubMed Central

    2013-01-01

    Background One of the main topics in the development of quantitative structure-property relationship (QSPR) predictive models is the identification of the subset of variables that represent the structure of a molecule and are predictors for a given property. There are several automated feature selection methods, ranging from backward, forward, or stepwise procedures to more elaborate methodologies such as evolutionary programming. The problem lies in selecting the minimum subset of descriptors that can predict a certain property with good performance, in a computationally efficient and more robust way, since the presence of irrelevant or redundant features can cause poor generalization capacity. In this paper, an alternative selection method, based on Random Forests to determine variable importance, is proposed in the context of QSPR regression problems, with an application to a manually curated dataset for predicting the standard enthalpy of formation. The subsequent predictive models are trained with support vector machines, introducing the variables sequentially from a list ranked by variable importance. Results The model generalizes well even with a high-dimensional dataset and in the presence of highly correlated variables. The feature selection step was shown to yield lower prediction errors, with RMSE values 23% lower than without feature selection, while using only 6% of the total number of variables (89 of the original 1485). The proposed approach also compared favourably with other feature selection methods and with dimension reduction of the feature space. The predictive model was selected using a 10-fold cross-validation procedure and, after selection, was validated on an independent set to assess its performance on new data; the results were similar to those obtained for the training set, supporting the robustness of the proposed approach. 
Conclusions The proposed methodology seemingly improves the prediction
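    A compact stand-in for the scheme described above, with two substitutions loudly noted: an absolute-correlation score replaces Random Forest variable importance, and ordinary least squares replaces the support vector machine, so only the rank-then-add-sequentially logic is faithful to the paper. The synthetic data and the train/validation split are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 20
X = rng.normal(size=(n, p))
# Synthetic "property": only 3 of the 20 descriptors are informative.
y = 2 * X[:, 0] - 3 * X[:, 1] + X[:, 2] + 0.1 * rng.normal(size=n)

X_tr, X_va, y_tr, y_va = X[:150], X[150:], y[:150], y[150:]

# Stand-in importance score (|correlation with the target|); the paper
# uses Random Forest variable importance here instead.
importance = np.abs([np.corrcoef(X_tr[:, j], y_tr)[0, 1] for j in range(p)])
ranked = np.argsort(importance)[::-1]

def rmse_with(features):
    """Fit least squares on the chosen descriptors, score on the hold-out set."""
    coef, *_ = np.linalg.lstsq(X_tr[:, features], y_tr, rcond=None)
    pred = X_va[:, features] @ coef
    return float(np.sqrt(np.mean((pred - y_va) ** 2)))

# Add descriptors sequentially from the ranked list; keep the best subset.
errors = [rmse_with(ranked[:k]) for k in range(1, p + 1)]
best_k = int(np.argmin(errors)) + 1
print(best_k, round(min(errors), 3))
```

    The validation error drops sharply once the informative descriptors enter and then flattens or worsens as noise variables are added, which is the behaviour that lets the ranked-sequential procedure pick a small subset.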

  7. Selected characteristics of a new polyvinyl siloxane impression material--a randomized clinical trial.

    PubMed

    Blatz, Markus B; Sadan, Avishai; Burgess, John O; Mercante, Donald; Hoist, Stefan

    2005-02-01

    This study evaluated the ability of a new polyvinyl siloxane impression material (Affinis, Coltène/Whaledent, material A) to produce final impressions free of bubbles and voids for indirect fixed cuspal-coverage restorations. The results were compared to those of a control polyvinyl siloxane impression material (material B). Both materials were handled by inexperienced clinicians (undergraduate dental students) in student clinics. One hundred and thirty patients who were treated in the Louisiana State University School of Dentistry Junior Student Clinic for indirect fixed cuspal-coverage restorations and who met the inclusion criteria were randomly assigned to one of two treatment groups, group A (n = 65) or group B (n = 65). Two calibrated examiners evaluated the first impression of prepared posterior teeth at a magnification of 10x for acceptability (no voids or bubbles). Position of tooth, type of preparation, preparation finish line (Class I-V), and gingival bleeding scores were recorded. All statistical tests were performed with the level of significance set at .05. The Fisher-Freeman-Halton test did not reveal significant associations between material and gingival bleeding score (P = .492). Significant differences in the location of the preparation finish line between materials were observed (P = .0096); material A was more frequently used in cases where the preparation finish line was located at least 2 mm subgingivally. Logistic regression was used to assess the effect of the material on the success of the impression (acceptable/unacceptable). Material was highly significant in the logistic model (P < .001), with the odds in favor of an acceptable impression being eight times higher with material A than with material B (odds ratio = 8.00; 95% confidence interval for odds ratio: 2.832, 22.601). Overall, 60/65 (92.3%) impressions made with material A and 39/65 (60%) made with material B were rated "acceptable." 
The new polyvinyl siloxane impression material

  8. Selective bowel decontamination for the prevention of infection in acute myelogenous leukemia: a prospective randomized trial.

    PubMed

    Lee, Dong Gun; Choi, Su Mi; Choi, Jung Hyun; Yoo, Jin Hong; Park, Yoon Hee; Kim, Yoo Jin; Lee, Seok; Min, Chang Ki; Kim, Hee Je; Kim, Dong Wook; Lee, Jong Wook; Min, Woo Sung; Shin, Wan Shik; Kim, Chun Choo

    2002-03-01

    Infection is still a frequent cause of morbidity and mortality in acute myelogenous leukemia (AML) patients receiving chemotherapy. Recently, the main cause of infection has shifted from gram-negative to gram-positive bacteria, and resistance to antibiotics has increased. This study aimed to assess the effectiveness of antimicrobial prophylaxis (AP) with orally absorbable antibiotics. Ninety-five AML patients receiving chemotherapy at the Catholic Hemopoietic Stem Cell Transplantation Center from March 1999 to July 1999 were randomly divided into the AP group (250 mg ciprofloxacin twice a day, 150 mg roxithromycin twice a day, 50 mg fluconazole once a day) and the control group for a prospective analysis. The incidence of fever was 82.6% in the AP group and 91.6% in the control group (p = 0.15). Although the classification and sites of infections showed no difference between the two groups, catheter-associated infections occurred significantly more frequently in the AP group. The time interval between initiation of chemotherapy and onset of fever, white blood cell (WBC) count at the onset of fever, duration of leukopenia (WBC < 1,000/mm3), duration of systemic antibiotic therapy, mortality due to infection, and hospitalization period from the date chemotherapy was started showed no differences between the two groups. Infections due to gram-negative bacteria decreased to 33.3% in the AP group (vs. 92% in the control group), but infections due to gram-positive bacteria increased to 66.7% (vs. 8% in the control group). Gram-negative bacteria showed 100% resistance to ciprofloxacin in the AP group, and gram-positive bacteria showed 90-100% resistance to erythromycin, regardless of the presence of AP. AP could not reduce the occurrence of infection or infection-associated death in AML patients receiving chemotherapy. Considering the increased gram-positive infections and resistance to fluoroquinolones and macrolides, routine prescription of AP should be reconsidered. Further

  9. Nitrates and bone turnover (NABT) - trial to select the best nitrate preparation: study protocol for a randomized controlled trial.

    PubMed

    Bucur, Roxana C; Reid, Lauren S; Hamilton, Celeste J; Cummings, Steven R; Jamal, Sophie A

    2013-09-08

    This trial will use the 'multiple comparisons with the best' approach for data analyses, as this strategy allows practical considerations of ease of use and tolerability to guide selection of the preparation for future studies. Data from this protocol will be used to develop a randomized, controlled trial of nitrates to prevent osteoporotic fractures. ClinicalTrials.gov Identifier: NCT01387672. Controlled-Trials.com: ISRCTN08860742.

  10. Nitrates and bone turnover (NABT) - trial to select the best nitrate preparation: study protocol for a randomized controlled trial

    PubMed Central

    2013-01-01

    will use the ‘multiple comparisons with the best’ approach for data analyses, as this strategy allows practical considerations of ease of use and tolerability to guide selection of the preparation for future studies. Discussion Data from this protocol will be used to develop a randomized, controlled trial of nitrates to prevent osteoporotic fractures. Trial registration ClinicalTrials.gov Identifier: NCT01387672. Controlled-Trials.com: ISRCTN08860742. PMID:24010992

  11. Predicting the continuum between corridors and barriers to animal movements using Step Selection Functions and Randomized Shortest Paths.

    PubMed

    Panzacchi, Manuela; Van Moorter, Bram; Strand, Olav; Saerens, Marco; Kivimäki, Ilkka; St Clair, Colleen C; Herfindal, Ivar; Boitani, Luigi

    2016-01-01

    The loss, fragmentation and degradation of habitat everywhere on Earth prompt increasing attention to identifying landscape features that support animal movement (corridors) or impede it (barriers). Most algorithms used to predict corridors assume that animals move through preferred habitat either optimally (e.g. least-cost path, LCP) or as random walkers (e.g. current models), but neither extreme is realistic. We propose that corridors and barriers are two sides of the same coin and that animals experience landscapes as spatiotemporally dynamic corridor-barrier continua connecting (separating) functional areas where individuals fulfil specific ecological processes. Based on this conceptual framework, we propose a novel methodological approach that uses high-resolution individual-based movement data to predict corridor-barrier continua with increased realism. Our approach consists of two innovations. First, we use step selection functions (SSF) to predict friction maps quantifying corridor-barrier continua for tactical steps between consecutive locations. Secondly, we introduce to movement ecology the randomized shortest path algorithm (RSP), which operates on friction maps to predict the corridor-barrier continuum for strategic movements between functional areas. By modulating the parameter θ, which controls the trade-off between exploration and optimal exploitation of the environment, RSP bridges the gap between algorithms assuming optimal movements (when θ approaches infinity, RSP is equivalent to LCP) or random walks (when θ → 0, RSP → current models). Using this approach, we identify migration corridors for GPS-monitored wild reindeer (Rangifer t. tarandus) in Norway. We demonstrate that reindeer movement is best predicted by an intermediate value of θ, indicative of a movement trade-off between optimization and exploration. Model calibration allows identification of a corridor-barrier continuum that closely fits empirical data and demonstrates that RSP
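
    The θ trade-off described in this abstract can be illustrated with a toy Boltzmann distribution over candidate paths. The graph, costs, and two-path enumeration below are invented for illustration; the actual RSP algorithm operates on full friction maps and transition matrices, not an enumerated path set:

```python
import math

# Toy corridor/barrier landscape: two routes from functional area A to D.
# Edge costs stand in for the SSF-derived friction values in the abstract.
edges = {
    ("A", "B"): 1.0, ("B", "D"): 1.0,  # low-friction corridor, total cost 2
    ("A", "C"): 2.0, ("C", "D"): 2.0,  # high-friction detour, total cost 4
}

def path_cost(path):
    return sum(edges[(a, b)] for a, b in zip(path, path[1:]))

def rsp_path_probs(paths, theta):
    """Boltzmann distribution over candidate paths, P(p) ∝ exp(-θ·cost(p)):
    the core exploration/exploitation trade-off behind randomized shortest paths."""
    weights = [math.exp(-theta * path_cost(p)) for p in paths]
    total = sum(weights)
    return [w / total for w in weights]

paths = [("A", "B", "D"), ("A", "C", "D")]
for theta in (0.01, 1.0, 10.0):
    print(theta, [round(p, 3) for p in rsp_path_probs(paths, theta)])
```

    As θ grows, the probability mass concentrates on the least-cost corridor (the LCP limit); as θ → 0 the two routes become nearly equiprobable, mimicking a random walk.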

  12. Pregnancy is not a risk factor for gallstone disease: Results of a randomly selected population sample

    PubMed Central

    Walcher, Thomas; Haenle, Mark Martin; Kron, Martina; Hay, Birgit; Mason, Richard Andrew; von Schmiesing, Alexa Friederike Alice; Imhof, Armin; Koenig, Wolfgang; Kern, Peter; Boehm, Bernhard Otto; Kratzer, Wolfgang

    2005-01-01

    AIM: To investigate the prevalence, risk factors, and selection of the study population for cholecystolithiasis in an urban population in Germany, in relation to our own findings and to the results in the international literature. METHODS: A total of 2 147 persons (1 111 females, age 42.8 ± 12.7 years; 1 036 males, age 42.3 ± 13.1 years) participating in an investigation on the prevalence of Echinococcus multilocularis were studied for risk factors and prevalence of gallbladder stone disease. Risk factors were assessed by means of a standardized interview and calculation of body mass index (BMI). A diagnostic ultrasound examination of the gallbladder was performed. Data were analyzed by multiple logistic regression, using the SAS statistical software package. RESULTS: Gallbladder stones were detected in 171 study participants (8.0%, n = 2 147). Risk factors for the development of gallbladder stone disease included age, sex, BMI, and positive family history. In a separate analysis of female study participants, pregnancy (yes/no) and number of pregnancies did not exert any influence. CONCLUSION: Findings of the present study confirm that age, female sex, BMI, and positive family history are risk factors for the development of gallbladder stone disease. Pregnancy and the number of pregnancies, however, could not be shown to be risk factors. There seem to be no differences in the respective prevalence for gallbladder stone disease in urban and rural populations. PMID:16425387

  13. Pregnancy is not a risk factor for gallstone disease: results of a randomly selected population sample.

    PubMed

    Walcher, Thomas; Haenle, Mark Martin; Kron, Martina; Hay, Birgit; Mason, Richard Andrew; von Schmiesing, Alexa Friederike Alice; Imhof, Armin; Koenig, Wolfgang; Kern, Peter; Boehm, Bernhard Otto; Kratzer, Wolfgang

    2005-11-21

    To investigate the prevalence, risk factors, and selection of the study population for cholecystolithiasis in an urban population in Germany, in relation to our own findings and to the results in the international literature. A total of 2,147 persons (1,111 females, age 42.8±12.7 years; 1,036 males, age 42.3±13.1 years) participating in an investigation on the prevalence of Echinococcus multilocularis were studied for risk factors and prevalence of gallbladder stone disease. Risk factors were assessed by means of a standardized interview and calculation of body mass index (BMI). A diagnostic ultrasound examination of the gallbladder was performed. Data were analyzed by multiple logistic regression, using the SAS statistical software package. Gallbladder stones were detected in 171 study participants (8.0%, n=2,147). Risk factors for the development of gallbladder stone disease included age, sex, BMI, and positive family history. In a separate analysis of female study participants, pregnancy (yes/no) and number of pregnancies did not exert any influence. Findings of the present study confirm that age, female sex, BMI, and positive family history are risk factors for the development of gallbladder stone disease. Pregnancy and the number of pregnancies, however, could not be shown to be risk factors. There seem to be no differences in the respective prevalence for gallbladder stone disease in urban and rural populations.

  14. Sequence-Based Prediction of RNA-Binding Proteins Using Random Forest with Minimum Redundancy Maximum Relevance Feature Selection

    PubMed Central

    Ma, Xin; Guo, Jing; Sun, Xiao

    2015-01-01

    The prediction of RNA-binding proteins is one of the most challenging problems in computational biology. Although some studies have investigated this problem, the accuracy of prediction is still not sufficient. In this study, a highly accurate method was developed to predict RNA-binding proteins from amino acid sequences using random forests with the minimum redundancy maximum relevance (mRMR) method, followed by incremental feature selection (IFS). We incorporated conjoint triad features and three novel features: binding propensity (BP), nonbinding propensity (NBP), and evolutionary information combined with physicochemical properties (EIPP). The results showed that these novel features have important roles in improving the performance of the predictor. Using the mRMR-IFS method, our predictor achieved the best performance (86.62% accuracy and 0.737 Matthews correlation coefficient). High prediction accuracy and successful prediction performance suggested that our method can be a useful approach to identify RNA-binding proteins from sequence information. PMID:26543860
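
    The greedy mRMR step described above can be sketched in a few lines, using Pearson correlation as a simple stand-in for the mutual-information criterion of the original method (feature names and data below are made up for illustration):

```python
import math

def pearson(xs, ys):
    """Pearson correlation, standing in for mRMR's mutual-information measure."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

def mrmr_select(features, target, k):
    """Greedy mRMR: repeatedly pick the feature with the highest
    relevance-to-target minus mean redundancy with already-selected features."""
    selected, remaining = [], list(features)
    while remaining and len(selected) < k:
        def score(name):
            relevance = abs(pearson(features[name], target))
            if not selected:
                return relevance
            redundancy = sum(abs(pearson(features[name], features[s]))
                             for s in selected) / len(selected)
            return relevance - redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy data: f1 and f2 are near-duplicates tracking the target; f3 is weaker
# but non-redundant, so mRMR prefers it for the second slot.
target = [1.0, 2.0, 3.0, 4.0, 5.0]
features = {
    "f1": [1.1, 2.0, 2.9, 4.2, 5.0],
    "f2": [1.0, 2.1, 3.0, 4.1, 5.1],
    "f3": [1.5, 1.0, 3.5, 3.0, 5.5],
}
print(mrmr_select(features, target, 2))  # → ['f2', 'f3']
```

    Incremental feature selection (IFS) would then evaluate the predictor on the top-1, top-2, … ranked subsets and keep the best-performing one.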

  15. Characterization of indoor air contaminants in a randomly selected set of commercial nail salons in Salt Lake County, Utah, USA.

    PubMed

    Alaves, Victor M; Sleeth, Darrah K; Thiese, Matthew S; Larson, Rodney R

    2013-01-01

    Air samples were collected in 12 randomly selected commercial nail salons in Salt Lake County, Utah. Measurements of salon physical/chemical parameters (room volume, CO2 levels) were obtained. Volatile organic compound (VOC) samples were collected using Summa air canisters and sorbent media tubes for an 8-h period. Multivariate analyses were used to identify relationships between salon physical/chemical characteristics and the VOCs found in the air samples. The ACGIH® additive mixing formula was also applied to determine if there were potential overexposures to the combined airborne concentrations of chemicals monitored. Methyl methacrylate was detected in 58% of the establishments despite having been banned for use in nail products by the state of Utah. Formaldehyde was found above the NIOSH REL® (0.016 ppm) in 58% of the establishments. Given the assortment of VOCs to which nail salon workers are potentially exposed, a combination of engineering controls as well as personal protective equipment is recommended.
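
    The ACGIH additive mixing formula referenced above sums each measured concentration divided by its exposure limit; a result above 1 indicates a combined overexposure even when every individual agent is below its own limit. A minimal sketch (only the 0.016 ppm formaldehyde REL comes from the abstract; the other concentrations and limits are hypothetical):

```python
def additive_mixture_index(exposures):
    """ACGIH additive mixing formula for agents acting on the same target
    organ: sum of (measured concentration / exposure limit)."""
    return sum(conc / limit for conc, limit in exposures)

# (concentration, limit) pairs in ppm. Apart from the 0.016 ppm formaldehyde
# REL cited in the abstract, the numbers are made up for illustration.
samples = [
    (0.010, 0.016),  # formaldehyde: below its REL on its own
    (40.0, 100.0),   # hypothetical solvent A
    (30.0, 100.0),   # hypothetical solvent B
]
index = additive_mixture_index(samples)
print(round(index, 3), "overexposure" if index > 1 else "within limits")
```

    Here each agent is individually under its limit, yet the mixture index of 1.325 exceeds 1, which is exactly the situation the formula is designed to flag.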

  16. Prospective randomized study of selective neck dissection versus observation for N0 neck of early tongue carcinoma.

    PubMed

    Yuen, Anthony Po-Wing; Ho, Chiu Ming; Chow, Tam Lin; Tang, Lap Chiu; Cheung, Wing Yung; Ng, Raymond Wai-Man; Wei, William Ignace; Kong, Chi Kwan; Book, Kwok Shing; Yuen, Wai Cheung; Lam, Alfred King-Yin; Yuen, Nancy Wah-Fun; Trendell-Smith, Nigel Jeremy; Chan, Yue Wai; Wong, Birgitta Yee-Hang; Li, George Kam-Hop; Ho, Ambrose Chung-Wai; Ho, Wai Kuen; Wong, Sau Yan; Yao, Tzy-Jyun

    2009-06-01

    There is controversy over the benefits of elective neck dissection (END) for oral tongue carcinoma. This is a prospective randomized study of elective selective I, II, III neck dissection versus observation for N0 neck of stage I to II oral tongue carcinoma. There were 35 patients on the observation arm and 36 patients on the END arm. The main outcome assessment parameters were node-related mortality and disease-specific survival rate. There were 11 patients in the observation arm and 2 patients in the END arm who developed nodal recurrence alone without associated local or distant recurrence. All 13 patients were salvaged, and no patient died of nodal recurrence. The 5-year disease-specific survival rate was 87% for the observation arm and 89% for the END arm; the 2% difference was not significant. Observation may be an acceptable alternative to END if strict adherence to a cancer surveillance protocol is followed.

  17. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    PubMed Central

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and

  18. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness.

    PubMed

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia's marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to 'small p and large n' problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and
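
    Permutation importance, the idea underlying the VI/AVI methods compared above, can be sketched as follows. The fixed-rule "model" and toy data are invented for illustration; the study used fitted random forests:

```python
import random

def permutation_importance(model, X, y, n_repeats=30, seed=0):
    """Permutation importance, averaged over repeats (in the spirit of
    averaged variable importance): shuffle one column at a time and record
    the mean drop in accuracy of an already-fitted model."""
    rng = random.Random(seed)
    base = sum(model(row) == label for row, label in zip(X, y)) / len(y)
    importances = []
    for j in range(len(X[0])):
        drops = []
        for _ in range(n_repeats):
            col = [row[j] for row in X]
            rng.shuffle(col)
            X_perm = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, col)]
            acc = sum(model(row) == label for row, label in zip(X_perm, y)) / len(y)
            drops.append(base - acc)
        importances.append(sum(drops) / n_repeats)
    return importances

# Toy "model": a fixed rule that only uses feature 0 (a stand-in for a trained
# random forest). Feature 1 is ignored, so its importance comes out as 0.
model = lambda row: 1 if row[0] > 0.5 else 0
X = [[0.1, 0.9], [0.9, 0.1], [0.2, 0.8], [0.8, 0.2], [0.3, 0.7], [0.7, 0.3]]
y = [0, 1, 0, 1, 0, 1]
print([round(v, 3) for v in permutation_importance(model, X, y)])
```

    Averaging over repeated shuffles (and, in AVI, over repeated model fits) stabilises the ranking, which is why the ranking can still differ between datasets as noted above.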

  19. Polarimetric SAR decomposition parameter subset selection and their optimal dynamic range evaluation for urban area classification using Random Forest

    NASA Astrophysics Data System (ADS)

    Hariharan, Siddharth; Tirodkar, Siddhesh; Bhattacharya, Avik

    2016-02-01

    Urban area classification is important for monitoring the ever-increasing urbanization and studying its environmental impact. Two L-band (wavelength: 23 cm) UAVSAR datasets from NASA JPL were used in this study for urban area classification. The two datasets differ in terms of urban area structures, building patterns, and their geometric shapes and sizes. In these datasets, some urban areas appear oriented about the radar line of sight (LOS) while others appear non-oriented. In this study, roll-invariant polarimetric SAR decomposition parameters were used to classify these urban areas. Random Forest (RF), which is an ensemble decision-tree learning technique, was used in this study. RF performs parameter subset selection as a part of its classification procedure. In this study, parameter subsets were obtained and analyzed to infer scattering mechanisms useful for urban area classification. The Cloude-Pottier α, the Touzi dominant scattering amplitude αs1 and the anisotropy A were among the top six important parameters selected for both the datasets. However, it was observed that these parameters were ranked differently for the two datasets. The urban area classification using RF was compared with the Support Vector Machine (SVM) and the Maximum Likelihood Classifier (MLC) for both the datasets. RF outperforms SVM by 4% and MLC by 12% in Dataset 1. It also outperforms SVM and MLC by 3.5% and 11% respectively in Dataset 2.

  20. DNABP: Identification of DNA-Binding Proteins Based on Feature Selection Using a Random Forest and Predicting Binding Residues

    PubMed Central

    Guo, Jing; Sun, Xiao

    2016-01-01

    DNA-binding proteins are fundamentally important in cellular processes. Several computational-based methods have been developed to improve the prediction of DNA-binding proteins in previous years. However, insufficient work has been done on the prediction of DNA-binding proteins from protein sequence information. In this paper, a novel predictor, DNABP (DNA-binding proteins), was designed to predict DNA-binding proteins using the random forest (RF) classifier with a hybrid feature. The hybrid feature contains two types of novel sequence features, which reflect information about the conservation of physicochemical properties of the amino acids, and the binding propensity of DNA-binding residues and non-binding propensities of non-binding residues. The comparisons with each feature demonstrated that these two novel features contributed most to the improvement in predictive ability. Furthermore, to improve the prediction performance of the DNABP model, feature selection using the minimum redundancy maximum relevance (mRMR) method combined with incremental feature selection (IFS) was carried out during the model construction. The results showed that the DNABP model could achieve 86.90% accuracy, 83.76% sensitivity, 90.03% specificity and a Matthews correlation coefficient of 0.727. High prediction accuracy and performance comparisons with previous research suggested that DNABP could be a useful approach to identify DNA-binding proteins from sequence information. The DNABP web server system is freely available at http://www.cbi.seu.edu.cn/DNABP/. PMID:27907159

  1. K-Ras(G12D)-selective inhibitory peptides generated by random peptide T7 phage display technology.

    PubMed

    Sakamoto, Kotaro; Kamada, Yusuke; Sameshima, Tomoya; Yaguchi, Masahiro; Niida, Ayumu; Sasaki, Shigekazu; Miwa, Masanori; Ohkubo, Shoichi; Sakamoto, Jun-Ichi; Kamaura, Masahiro; Cho, Nobuo; Tani, Akiyoshi

    2017-03-11

    Amino-acid mutations of Gly(12) (e.g. G12D, G12V, G12C) of V-Ki-ras2 Kirsten rat sarcoma viral oncogene homolog (K-Ras), the most promising drug target in cancer therapy, are major growth drivers in various cancers. Although over 30 years have passed since the discovery of these mutations in most cancer patients, effective mutated K-Ras inhibitors have not been marketed. Here, we report novel and selective inhibitory peptides to K-Ras(G12D). We screened random peptide libraries displayed on T7 phage against purified recombinant K-Ras(G12D), with thorough subtraction of phages bound to wild-type K-Ras, and obtained KRpep-2 (Ac-RRCPLYISYDPVCRR-NH2) as a consensus sequence. KRpep-2 showed more than 10-fold binding- and inhibition-selectivity to K-Ras(G12D), both in SPR analysis and GDP/GTP exchange enzyme assay. KD and IC50 values were 51 and 8.9 nM, respectively. After subsequent sequence optimization, we successfully generated KRpep-2d (Ac-RRRRCPLYISYDPVCRRRR-NH2) that inhibited enzyme activity of K-Ras(G12D) with IC50 = 1.6 nM and significantly suppressed ERK-phosphorylation, downstream of K-Ras(G12D), along with A427 cancer cell proliferation at 30 μM peptide concentration. To our knowledge, this is the first report of a K-Ras(G12D)-selective inhibitor, contributing to the development and study of K-Ras(G12D)-targeting drugs.

  2. The prevalence of symptoms associated with pulmonary tuberculosis in randomly selected children from a high burden community.

    PubMed

    Marais, B J; Obihara, C C; Gie, R P; Schaaf, H S; Hesseling, A C; Lombard, C; Enarson, D; Bateman, E; Beyers, N

    2005-11-01

    Diagnosis of childhood tuberculosis is problematic, and symptom-based diagnostic approaches are often promoted in high burden settings. This study aimed (i) to document the prevalence of symptoms associated with tuberculosis among randomly selected children living in a high burden community, and (ii) to compare the prevalence of these symptoms in children without tuberculosis to those in children with newly diagnosed tuberculosis. A cross-sectional, community-based survey was performed on a 15% random sample of residential addresses. A symptom-based questionnaire and tuberculin skin test (TST) were completed in all children. Chest radiographs were performed according to South African National Tuberculosis Control Program guidelines. Results were available in 1415 children, of whom 451 (31.9%) were TST positive. Tuberculosis was diagnosed in 18 (1.3%) children. Of the 1397 children without tuberculosis, 253 (26.4%) reported a cough during the preceding 3 months. Comparison of individual symptoms (cough, dyspnoea, chest pain, haemoptysis, anorexia, weight loss, fatigue, fever, night sweats) in children with and without tuberculosis revealed that only weight loss differed significantly (OR = 4.5, 95% CI 1.5 to 12.3), while the combination of cough and weight loss was most significant (OR = 5.4, 95% CI 1.7 to 16.9). Children with newly diagnosed tuberculosis reported no symptoms in 50% of cases. Children from this high burden community frequently reported symptoms associated with tuberculosis. These symptoms had limited value to differentiate children diagnosed with tuberculosis from those without tuberculosis. Improved case definitions and symptom characterisation are required when evaluating the diagnostic value of symptoms.
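
    Odds ratios with 95% confidence intervals like those reported above can be computed from a 2×2 table with the standard Woolf log-normal method. A minimal sketch (the counts below are hypothetical, chosen only to give an OR near the reported value, since the abstract does not report the raw table):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Woolf (log-normal) 95% CI from a 2x2 table:
    a/b = cases with/without the symptom, c/d = controls with/without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for the weight-loss comparison, for illustration only.
or_, lo, hi = odds_ratio_ci(6, 12, 138, 1259)
print(f"OR = {or_:.1f} (95% CI {lo:.1f} to {hi:.1f})")
```

    The wide interval reflects the small number of tuberculosis cases (18), which is why even a large point estimate carries considerable uncertainty.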

  3. Quantitative structure-property relationships of retention indices of some sulfur organic compounds using random forest technique as a variable selection and modeling method.

    PubMed

    Goudarzi, Nasser; Shahsavani, Davood; Emadi-Gandaghi, Fereshteh; Chamjangali, Mansour Arab

    2016-10-01

    In this work, a novel quantitative structure-property relationship technique is proposed on the basis of the random forest for prediction of the retention indices of some sulfur organic compounds. In order to calculate the retention indices of these compounds, the theoretical descriptors produced using their molecular structures are employed. The influence of the significant parameters affecting the prediction power of the developed random forest, such as the number of randomly selected variables applied to split each node (m) and the number of trees (nt), is studied to obtain the best model. After optimizing the nt and m parameters, the random forest model conducted for m = 70 and nt = 460 was found to yield the best results. The artificial neural network and multiple linear regression modeling techniques are also used to predict the retention index values for these compounds for comparison with the results of the random forest model. The descriptors selected by the stepwise regression and random forest model are used to build the artificial neural network models. The results achieved showed the superiority of the random forest model over the other models for prediction of the retention indices of the studied compounds.

  4. Clinical validation of embryo culture and selection by morphokinetic analysis: a randomized, controlled trial of the EmbryoScope.

    PubMed

    Rubio, Irene; Galán, Arancha; Larreategui, Zaloa; Ayerdi, Fernando; Bellver, Jose; Herrero, Javier; Meseguer, Marcos

    2014-11-01

    To determine whether incubation in the integrated EmbryoScope time-lapse monitoring system (TMS) and selection supported by the use of a multivariable morphokinetic model improve reproductive outcomes in comparison with incubation in a standard incubator (SI) embryo culture and selection based exclusively on morphology. Prospective, randomized, double-blinded, controlled study. University-affiliated private in vitro fertilization (IVF) clinic. Eight hundred forty-three infertile couples undergoing intracytoplasmic sperm injection (ICSI). No patient intervention; embryos cultured in SI with development evaluated only by morphology (control group) and embryos cultured in TMS with embryo selection was based on a multivariable model (study group). Rates of embryo implantation, pregnancy, ongoing pregnancy (OPR), and early pregnancy loss. Analyzing per treated cycle, the ongoing pregnancy rate was statistically significantly increased 51.4% (95% CI, 46.7-56.0) for the TMS group compared with 41.7% (95% CI, 36.9-46.5) for the SI group. For pregnancy rate, differences were not statistically significant at 61.6% (95% CI, 56.9-66.0) versus 56.3% (95% CI, 51.4-61.0). The results per transfer were similar: statistically significant differences in ongoing pregnancy rate of 54.5% (95% CI, 49.6-59.2) versus 45.3% (95% CI, 40.3-50.4) and not statistically significant for pregnancy rate at 65.2% (95% CI, 60.6-69.8) versus 61.1% (95% CI, 56.2-66.1). Early pregnancy loss was statistically significantly decreased for the TMS group with 16.6% (95% CI, 12.6-21.4) versus 25.8% (95% CI, 20.6-31.9). The implantation rate was statistically significantly increased at 44.9% (95% CI, 41.4-48.4) versus 37.1% (95% CI, 33.6-40.7). The strategy of culturing and selecting embryos in the integrated EmbryoScope time-lapse monitoring system improves reproductive outcomes. NCT01549262.

  5. Biased random key genetic algorithm with insertion and gender selection for capacitated vehicle routing problem with time windows

    NASA Astrophysics Data System (ADS)

    Rochman, Auliya Noor; Prasetyo, Hari; Nugroho, Munajat Tri

    2017-06-01

    Vehicle Routing Problem (VRP) often occurs when manufacturers need to distribute their product to some customers/outlets. The distribution process is typically restricted by the capacity of the vehicle and the working hours at the distributor. This type of VRP is also known as the Capacitated Vehicle Routing Problem with Time Windows (CVRPTW). A Biased Random Key Genetic Algorithm (BRKGA) was designed and coded in MATLAB to solve the CVRPTW case of soft drink distribution. The standard BRKGA was then modified by applying chromosome insertion into the initial population and defining chromosome gender for parents undergoing crossover operations. The performance of the established algorithms was then compared to a heuristic procedure for solving a soft drink distribution. Some findings are revealed: (1) the total distribution cost of BRKGA with insertion (BRKGA-I) results in a cost saving of 39% compared to the total cost of the heuristic method; (2) BRKGA with gender selection (BRKGA-GS) could further improve the performance of the heuristic method. However, BRKGA-GS tends to yield worse results compared to those obtained from the standard BRKGA.
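
    The random-key representation at the core of BRKGA can be sketched in a few lines: a chromosome of floats in [0, 1) is decoded by sorting into a customer visiting order, and the "biased" crossover favours the elite parent. This is a generic CVRP decoder sketch with invented demands, not the authors' MATLAB implementation, and time windows are omitted:

```python
import random

def decode(chromosome, demands, capacity):
    """Decode a random-key chromosome into CVRP routes: customers are visited
    in ascending order of their keys, and a new vehicle starts whenever its
    capacity would be exceeded."""
    order = sorted(range(len(chromosome)), key=lambda i: chromosome[i])
    routes, current, load = [], [], 0.0
    for cust in order:
        if load + demands[cust] > capacity:
            routes.append(current)
            current, load = [], 0.0
        current.append(cust)
        load += demands[cust]
    if current:
        routes.append(current)
    return routes

def biased_crossover(elite, non_elite, p_elite=0.7, rng=random):
    """Biased crossover: each gene comes from the elite parent with
    probability p_elite -- the 'biased' part of BRKGA."""
    return [e if rng.random() < p_elite else n for e, n in zip(elite, non_elite)]

rng = random.Random(42)
demands = [4, 3, 5, 2, 6]          # one demand per customer (made up)
chromosome = [rng.random() for _ in demands]
print(decode(chromosome, demands, capacity=10))  # → [[1, 3, 2], [0, 4]]
```

    Because any key vector decodes to a feasible solution, standard genetic operators never produce invalid offspring, which is the main appeal of the random-key encoding.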

  6. Estimating the Optimal Personalized Treatment Strategy Based on Selected Variables to Prolong Survival via Random Survival Forest with Weighted Bootstrap.

    PubMed

    Shen, Jincheng; Wang, Lu; Daignault, Stephanie; Spratt, Daniel E; Morgan, Todd M; Taylor, Jeremy M G

    2017-09-21

    A personalized treatment policy requires defining the optimal treatment for each patient based on their clinical and other characteristics. Here we consider a situation commonly encountered in practice when analyzing data from observational cohorts: there are auxiliary variables which affect both the treatment and the outcome, yet these variables are not of primary interest to be included in a generalizable treatment strategy. Furthermore, there is not enough prior knowledge of the effect of the treatments or of the importance of the covariates for us to explicitly specify the dependency between the outcome and different covariates, thus we choose a model that is flexible enough to accommodate the possibly complex association of the outcome with the covariates. We consider observational studies with a survival outcome and propose to use Random Survival Forest with Weighted Bootstrap (RSFWB) to model the counterfactual outcomes while marginalizing over the auxiliary covariates. By maximizing the restricted mean survival time, we estimate the optimal regime for a target population based on a selected set of covariates. Simulation studies illustrate that the proposed method performs reliably across a range of different scenarios. We further apply RSFWB to a prostate cancer study.
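
    Restricted mean survival time, the quantity maximized above, is the area under the survival curve up to a horizon τ. A minimal sketch computing it from a Kaplan-Meier estimate (toy data; the study's RSFWB machinery is not reproduced here):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate from (time, event) pairs,
    with event = 1 for death and 0 for censoring.
    Returns the step curve as (time, S(t)) at each event time."""
    data = sorted(zip(times, events))
    at_risk, s, curve, i = len(data), 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data[i:] if tt == t and e == 1)
        ties = sum(1 for tt, e in data[i:] if tt == t)
        if deaths:
            s *= 1 - deaths / at_risk
            curve.append((t, s))
        at_risk -= ties
        i += ties
    return curve

def rmst(curve, tau):
    """Restricted mean survival time: area under S(t) from 0 to tau."""
    area, prev_t, prev_s = 0.0, 0.0, 1.0
    for t, s in curve:
        if t > tau:
            break
        area += prev_s * (t - prev_t)
        prev_t, prev_s = t, s
    area += prev_s * (tau - prev_t)
    return area

times = [2, 3, 5, 8, 12]   # toy follow-up times
events = [1, 0, 1, 1, 0]   # 1 = death, 0 = censored
print(round(rmst(kaplan_meier(times, events), tau=10), 3))  # → 6.533
```

    Comparing this area between candidate treatment rules (estimated counterfactually) is what drives the regime selection in the approach described above.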

  7. Acute Hemodynamic Effects of a Selective Serotonin Reuptake Inhibitor in Postural Tachycardia Syndrome: A Randomized, Crossover Trial

    PubMed Central

    Mar, Philip L; Raj, Vidya; Black, Bonnie K; Biaggioni, Italo; Shibao, Cyndya A; Paranjape, Sachin Y; Dupont, William D; Robertson, David; Raj, Satish R

    2014-01-01

    Background Selective serotonin reuptake inhibitors (SSRIs) are often prescribed in patients with postural tachycardia syndrome (POTS), and act at synaptic terminals to increase monoamine neurotransmitters. We hypothesized that they act to increase blood pressure (BP) and attenuate reflex tachycardia, thereby improving symptoms. Acute hemodynamic profiles after SSRI administration in POTS patients have not previously been reported. Methods Patients with POTS (n=39; F=37, 39 ±9 years) underwent a randomized crossover trial with sertraline 50mg and placebo. Heart rate (HR), systolic, diastolic, and mean BP were measured with the patient seated and standing for 10 minutes prior to drug or placebo administration, and then hourly for 4 hours. The primary endpoint was standing HR at 4 hours. Results At 4 hours, standing HR and systolic BP were not significantly different between sertraline and placebo. Seated systolic (106±12 mmHg vs. 101±8 mmHg; P=0.041), diastolic (72±8 mmHg vs. 69±8 mmHg; P=0.022), and mean BP (86±9 mmHg vs. 81±9 mmHg; P=0.007) were significantly higher after sertraline administration than placebo. At 4 hours, symptoms were worse with sertraline than placebo. Conclusions Sertraline had a modest pressor effect in POTS patients, but this did not translate into a reduced HR or improved symptoms. PMID:24227635

  8. Effects of a selective serotonin reuptake inhibitor escitalopram on the cutaneous silent period: a randomized controlled study in healthy volunteers.

    PubMed

    Pujia, Francesco; Serrao, Mariano; Brienza, Marianna; Vestrini, Elisa; Valente, Gabriele Oreste; Coppola, Gianluca; Pierelli, Francesco

    2014-04-30

    The cutaneous silent period (CSP) involves a transient inhibition of the electromyographic (EMG) activity in the hand muscles induced by a painful electrical stimulation of the digital nerves. The neurotransmitters potentially involved in mediating the CSP have not been completely elucidated thus far. However, few studies suggest that the monoaminergic system may play a role in the CSP. We elicited CSPs in the first dorsal interosseous muscle of the right hand before and 3 h after administration of a single oral dose of the selective serotonin reuptake inhibitor escitalopram (20 mg) or placebo. The two experimental sessions (drug and placebo) were performed in a random order at ≥1-week intervals. All recordings were numbered anonymously and analysed offline in a blind manner by one investigator. A significant increase in the CSP duration was observed 3 h after escitalopram administration (p=0.01), and no changes were observed in the reflex latency and subjective pain sensation (p>0.05). No significant changes were observed in the CSP duration in subjects who received the placebo (all, p>0.05). Our results indicate that escitalopram increases the central disposition of serotonin and increases the activity of the spinal inhibitory interneurons on the α-motoneurons of the hand muscles. Thus, our results indicate the involvement of the monoaminergic system in controlling the spinal pain mechanisms by supraspinal descending pathways originating from the brainstem neural structures.

  9. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling

    PubMed Central

    Zhou, Fuqun; Zhang, Aining

    2016-01-01

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2–3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle the two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of Random Forests’ features: variable importance, outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about a half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data. PMID:27792152

  10. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling.

    PubMed

    Zhou, Fuqun; Zhang, Aining

    2016-10-25

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle the two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of Random Forests' features: variable importance, outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about a half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.
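The variable-importance step described above can be sketched with scikit-learn's Random Forest on synthetic stand-in data (the band count, labels, and "keep half" threshold here are illustrative assumptions, not the CCRS dataset):

```python
# Sketch of Random Forests variable-importance ranking for selecting an
# optimal subset of time-series bands (synthetic data; assumes scikit-learn).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_samples, n_bands = 500, 36              # e.g. 36 ten-day composites
X = rng.normal(size=(n_samples, n_bands))
y = (X[:, 5] + X[:, 20] > 0).astype(int)  # labels depend on two bands only

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
ranked = np.argsort(rf.feature_importances_)[::-1]  # most important first
half = ranked[: n_bands // 2]                       # keep ~half the variables
print("top bands:", sorted(half[:5].tolist()))
```

On data like this, the informative bands (5 and 20 here) rise to the top of the ranking, and classification on the retained half loses little accuracy, mirroring the paper's finding.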

  11. High resolution general purpose D-layer experiment for EISCAT incoherent scatter radars using selected set of random codes

    NASA Astrophysics Data System (ADS)

    Turunen, T.; Westman, A.; Häggström, I.; Wannberg, G.

    2002-09-01

    The ionospheric D-layer is a narrow bandwidth radar target often with a very small scattering cross section. The target autocorrelation function can be obtained by transmitting a series of relatively short coded pulses and computing the correlation between data obtained from different pulses. The spatial resolution should be as high as possible and the spatial side lobes of the codes used should be as small as possible. However, due to the short pulse repetition period (on the order of milliseconds) at any instant, the radar receives detectable scattered signals not only from the pulse illuminating the D-region but also from 3-5 ambiguous-range pulses, which makes it difficult to produce a reliable estimate near zero lag of the autocorrelation function. A new experimental solution to this measurement problem, using a selected set of 40-bit random codes with 4 µs elements, giving 600 m spatial resolution, is presented. The zero lag is approximated by dividing the pulse into two 20-bit codes and computing the correlation between those two pulses. The lowest altitudes of the E-layer are measured by dividing the pulse into 5 pieces of 8 bits, which allows for computation of 4 lags. In addition, coherent integration of data from four pulses is used for obtaining separately the autocorrelation function estimate for the lowest altitudes and in cases when the target contains structures with a long coherence time. Design details and responses of the experiment are given, and analysed test data are shown.
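A toy calculation shows why the codes must be selected: the autocorrelation of a random ±1 code has a zero-lag peak equal to the code length, while the off-zero lags are the range sidelobes one wants small. This sketch uses plain NumPy and illustrative parameters, not the actual EISCAT code set:

```python
# Autocorrelation of a random +/-1 code: the zero-lag peak equals the code
# length (40), while off-zero lags ("range sidelobes") stay small for a
# well-chosen code.
import numpy as np

rng = np.random.default_rng(1)
code = rng.choice([-1, 1], size=40)           # one 40-bit random code
acf = np.correlate(code, code, mode="full")   # lags -39 .. +39
peak = acf[len(code) - 1]                     # zero-lag value = 40
sidelobes = np.delete(acf, len(code) - 1)     # everything except zero lag
print(peak, int(np.abs(sidelobes).max()))
```

Selecting a set of codes then amounts to generating many candidates and keeping those whose maximum sidelobe (and cross-correlation with the other chosen codes) is lowest.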

  12. A Preliminary Investigation of the Jack-Bean Urease Inhibition by Randomly Selected Traditionally Used Herbal Medicine

    PubMed Central

    Biglar, Mahmood; Soltani, Khadijeh; Nabati, Farzaneh; Bazl, Roya; Mojab, Faraz; Amanlou, Massoud

    2012-01-01

    Helicobacter pylori (H. pylori) infection leads to different clinical and pathological outcomes in humans, including chronic gastritis, peptic ulcer disease, gastric neoplasia and even gastric cancer, and its eradication depends upon multi-drug therapy. The most effective therapy is still unknown, which prompts continued efforts to find better and more modern natural or synthetic anti-H. pylori agents. In this report 21 randomly selected herbal methanolic extracts were evaluated for their effect on inhibition of Jack-bean urease using the indophenol method as described by Weatherburn. The inhibition potency was measured by UV spectroscopy at 630 nm, which corresponds to the released ammonium. Among these extracts, five showed potent inhibitory activities with IC50 ranges of 18-35 μg/mL. These plants are Matricaria disciforme (IC50:35 μg/mL), Nasturtium officinale (IC50:18 μg/mL), Punica granatum (IC50:30 μg/mL), Camellia sinensis (IC50:35 μg/mL), Citrus aurantifolia (IC50:28 μg/mL). PMID:24250509

  13. Enumeration of Escherichia coli cells on chicken carcasses as a potential measure of microbial process control in a random selection of slaughter establishments in the United States

    USDA-ARS?s Scientific Manuscript database

    The purpose of this study was to evaluate whether the measurement of Escherichia coli levels at two points during the chicken slaughter process has utility as a measure of quality control. A one-year-long survey was conducted during 2004 and 2005 in 20 randomly selected United States chicken slaught...

  14. The Long-Term Effectiveness of a Selective, Personality-Targeted Prevention Program in Reducing Alcohol Use and Related Harms: A Cluster Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Newton, Nicola C.; Conrod, Patricia J.; Slade, Tim; Carragher, Natacha; Champion, Katrina E.; Barrett, Emma L.; Kelly, Erin V.; Nair, Natasha K.; Stapinski, Lexine; Teesson, Maree

    2016-01-01

    Background: This study investigated the long-term effectiveness of Preventure, a selective personality-targeted prevention program, in reducing the uptake of alcohol, harmful use of alcohol, and alcohol-related harms over a 3-year period. Methods: A cluster randomized controlled trial was conducted to assess the effectiveness of Preventure.…

  15. The Long-Term Effectiveness of a Selective, Personality-Targeted Prevention Program in Reducing Alcohol Use and Related Harms: A Cluster Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Newton, Nicola C.; Conrod, Patricia J.; Slade, Tim; Carragher, Natacha; Champion, Katrina E.; Barrett, Emma L.; Kelly, Erin V.; Nair, Natasha K.; Stapinski, Lexine; Teesson, Maree

    2016-01-01

    Background: This study investigated the long-term effectiveness of Preventure, a selective personality-targeted prevention program, in reducing the uptake of alcohol, harmful use of alcohol, and alcohol-related harms over a 3-year period. Methods: A cluster randomized controlled trial was conducted to assess the effectiveness of Preventure.…

  16. Robust prediction of B-factor profile from sequence using two-stage SVR based on random forest feature selection.

    PubMed

    Pan, Xiao-Yong; Shen, Hong-Bin

    2009-01-01

    B-factor measures the uncertainty in the position of an atom within a crystal structure and is highly correlated with protein internal motion. Although the rapid progress of structural biology in recent years makes more accurate protein structures available than ever, with the avalanche of new protein sequences emerging during the post-genomic era, the gap between the known protein sequences and the known protein structures becomes wider and wider. It is urgent to develop automated methods to predict the B-factor profile directly from the amino acid sequence, so as to be able to utilize it in a timely fashion for basic research. In this article, we propose a novel approach, called PredBF, to predict the real value of the B-factor. We first extract both global and local features from the protein sequences as well as their evolution information; random forests feature selection is then applied to rank their importance, and the most important features are input to a two-stage support vector regression (SVR) for prediction, where the initial predicted outputs from the 1st SVR are further input to the 2nd-layer SVR for final refinement. Our results reveal that a systematic analysis of the importance of different features yields deep insights into their different contributions and is necessary for developing effective B-factor prediction tools. The two-layer SVR prediction model designed in this study further enhances the robustness of predicting the B-factor profile. As a web server, PredBF is freely available at: http://www.csbio.sjtu.edu.cn/bioinf/PredBF for academic use.
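A hedged sketch of the two-stage scheme (Random Forest importance ranking feeding a first SVR, whose output is appended as a feature for a refining second SVR), using scikit-learn and synthetic data in place of the PredBF sequence features:

```python
# Two-stage SVR with Random Forest feature selection (sketch, not PredBF
# itself): synthetic regression data where only features 0 and 3 matter.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))
y = 2.0 * X[:, 0] + np.sin(X[:, 3]) + 0.1 * rng.normal(size=300)

# Stage 0: rank features by Random Forest importance, keep the top 5.
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
top = np.argsort(rf.feature_importances_)[::-1][:5]

# Stage 1: initial SVR prediction on the selected features.
svr1 = SVR().fit(X[:, top], y)
y1 = svr1.predict(X[:, top])

# Stage 2: refine by feeding the stage-1 output back in as an extra feature.
X2 = np.column_stack([X[:, top], y1])
svr2 = SVR().fit(X2, y)
y2 = svr2.predict(X2)
print(float(np.corrcoef(y, y2)[0, 1]))  # in-sample correlation with truth
```

In PredBF the inputs are sequence-derived and evolutionary features rather than random numbers, but the wiring (importance ranking, then SVR, then a second SVR over the selected features plus the first prediction) follows this pattern.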

  17. The UCSD Statin Study: a randomized controlled trial assessing the impact of statins on selected noncardiac outcomes.

    PubMed

    Golomb, Beatrice A; Criqui, Michael H; White, Halbert L; Dimsdale, Joel E

    2004-04-01

    There has been persistent controversy regarding possible favorable or adverse effects of statins or of cholesterol reduction on cognition, mood and behavior (including aggressive or violent behavior), muscle function, and quality of life. The UCSD Statin Study seeks to ascertain the beneficial or adverse effects of statin cholesterol-lowering drugs on a set of noncardiac endpoints, including cognition, behavior, and serotonin biochemistry. The study will enroll 1000 subjects (minimum 20% female) of mixed ethnicity from San Diego. Subjects must be age 20 and older, postmenopausal if female, without known cardiovascular disease or diabetes, and with LDL-cholesterol between 115 and 190 mg/dl. Subjects will be randomized to a double-blind, placebo-controlled trial with assignment 1/3, 1/3, 1/3 to placebo, simvastatin 20 mg, or pravastatin 40 mg (equipotent LDL-cholesterol-lowering doses for drug arms with simvastatin and pravastatin chosen to represent the extremes of the lipophilicity spectrum) for 6 months of treatment followed by 2 months postcessation follow-up. Primary outcomes are cognition (cognitive battery), irritability/aggression (behavior measure), and serotonin (gauged by whole blood serotonin), assessed as the difference between baseline and 6 months, judging combined statin groups vs. placebo. Secondary outcomes include mood (CES-D and Wakefield depression inventory), quality of life (SF-12V), sleep (Leeds sleep scale, modified), and secondary aggression measures (Conflict Tactics Scale; Overt Aggression Scale, Modified). Cardiovascular reactivity will be examined in a 10% subset. As additional secondary endpoints, primary and selected secondary outcomes will be assessed by statin assignment (lipophilic simvastatin vs. hydrophilic pravastatin). "Reversibility" of changes, if any, at 2 months postcessation will be determined. If effects (favorable or unfavorable) are identified, we will seek to ascertain whether there are baseline variables that predict

  18. The UCSD Statin Study: a randomized controlled trial assessing the impact of statins on selected noncardiac outcomes

    PubMed Central

    Golomb, Beatrice A.; Criqui, Michael H.; White, Halbert L.; Dimsdale, Joel E.

    2013-01-01

    There has been persistent controversy regarding possible favorable or adverse effects of statins or of cholesterol reduction on cognition, mood and behavior (including aggressive or violent behavior), muscle function, and quality of life. The UCSD Statin Study seeks to ascertain the beneficial or adverse effects of statin cholesterol-lowering drugs on a set of noncardiac endpoints, including cognition, behavior, and serotonin biochemistry. The study will enroll 1000 subjects (minimum 20% female) of mixed ethnicity from San Diego. Subjects must be age 20 and older, postmenopausal if female, without known cardiovascular disease or diabetes, and with LDL-cholesterol between 115 and 190 mg/dl. Subjects will be randomized to a double-blind, placebo-controlled trial with assignment 1/3, 1/3, 1/3 to placebo, simvastatin 20 mg, or pravastatin 40 mg (equipotent LDL-cholesterol-lowering doses for drug arms with simvastatin and pravastatin chosen to represent the extremes of the lipophilicity spectrum) for 6 months of treatment followed by 2 months postcessation follow-up. Primary outcomes are cognition (cognitive battery), irritability/aggression (behavior measure), and serotonin (gauged by whole blood serotonin), assessed as the difference between baseline and 6 months, judging combined statin groups vs. placebo. Secondary outcomes include mood (CES-D and Wakefield depression inventory), quality of life (SF-12V), sleep (Leeds sleep scale, modified), and secondary aggression measures (Conflict Tactics Scale; Overt Aggression Scale, Modified). Cardiovascular reactivity will be examined in a 10% subset. As additional secondary endpoints, primary and selected secondary outcomes will be assessed by statin assignment (lipophilic simvastatin vs. hydrophilic pravastatin). “Reversibility” of changes, if any, at 2 months postcessation will be determined. If effects (favorable or unfavorable) are identified, we will seek to ascertain whether there are baseline variables that

  19. Water chemistry in 179 randomly selected Swedish headwater streams related to forest production, clear-felling and climate.

    PubMed

    Löfgren, Stefan; Fröberg, Mats; Yu, Jun; Nisell, Jakob; Ranneby, Bo

    2014-12-01

    From a policy perspective, it is important to understand forestry effects on surface waters from a landscape perspective. The EU Water Framework Directive demands remedial actions if not achieving good ecological status. In Sweden, 44 % of the surface water bodies have moderate ecological status or worse. Many of these drain catchments with a mosaic of managed forests. It is important for the forestry sector and water authorities to be able to identify where, in the forested landscape, special precautions are necessary. The aim of this study was to quantify the relations between forestry parameters and headwater stream concentrations of nutrients, organic matter and acid-base chemistry. The results are put into the context of regional climate, sulphur and nitrogen deposition, as well as marine influences. Water chemistry was measured in 179 randomly selected headwater streams from two regions in southwest and central Sweden, corresponding to 10 % of the Swedish land area. Forest status was determined from satellite images and Swedish National Forest Inventory data using the probabilistic classifier method, which was used to model stream water chemistry with Bayesian model averaging. The results indicate that concentrations of e.g. nitrogen, phosphorus and organic matter are related to factors associated with forest production but that it is not forestry per se that causes the excess losses. Instead, factors simultaneously affecting forest production and stream water chemistry, such as climate, extensive soil pools and nitrogen deposition, are the most likely candidates. The relationships with clear-felled and wetland areas are likely to be direct effects.

  20. Factors that influence the selection of sterile glove brand: a randomized controlled trial evaluating the performance and cost of gloves.

    PubMed

    Johnson, Rebecca L; Smith, Hugh M; Duncan, Christopher M; Torsher, Laurence C; Schroeder, Darrell R; Hebl, James R

    2013-07-01

    To determine whether glove use modifies tactile and psychomotor performance of health care providers when compared with no glove use and to evaluate factors that influence the selection of sterile glove brand. Forty-two anesthesia providers (nine anesthesiologists, seven nurse anesthetists, 20 residents, six student nurse anesthetists) enrolled in and completed this cross-over randomized trial from May 2010 until August 2011. Participants underwent standardized psychomotor testing while wearing five different types of protective gloves. Assessments of psychomotor performance included tactile, fine motor/dexterity, and hand-eye coordination tests. Subjective ratings of glove comfort and performance were reported at the completion of each glove trial. The manufacturer's suggested retail price was collected for each glove tested. There were statistically significant differences in touch sensitivity for all nerve distributions, with all glove types resulting in less sensitivity than a bare hand. When compared with the non-sterile glove, only the thickest glove tested (Ansell Perry® Orthopaedic) was found to have less touch sensitivity. Fine motor dexterity testing revealed no statistically significant differences in time to completion amongst glove types or bare handed performance. In hand-eye coordination testing across treatment conditions, the thickest glove tested (Ansell Perry® Orthopaedic) was the only glove to show a statistically significant difference from a bare hand. There were statistically significant differences in glove comfort ratings across glove types, with latex-free, powder-free (Cardinal Esteem®), and latex powder-free (Mölnlycke-Biogel®) rated highest; however, there were no statistically significant differences in subjective performance ratings across glove types. Given the observed similarities in touch sensitivity and psychomotor performance associated with five different glove types, our results suggest that subjective provider

  1. Sexual selection has minimal impact on effective population sizes in species with high rates of random offspring mortality: An empirical demonstration using fitness distributions.

    PubMed

    Pischedda, Alison; Friberg, Urban; Stewart, Andrew D; Miller, Paige M; Rice, William R

    2015-10-01

    The effective population size (Ne) is a fundamental parameter in population genetics that influences the rate of loss of genetic diversity. Sexual selection has the potential to reduce Ne by causing the sex-specific distributions of individuals that successfully reproduce to diverge. To empirically estimate the effect of sexual selection on Ne, we obtained fitness distributions for males and females from an outbred, laboratory-adapted population of Drosophila melanogaster. We observed strong sexual selection in this population (the variance in male reproductive success was ∼14 times higher than that for females), but found that sexual selection had only a modest effect on Ne, which was 75% of the census size. This occurs because the substantial random offspring mortality in this population diminishes the effects of sexual selection on Ne, a result that necessarily applies to other high fecundity species. The inclusion of this random offspring mortality creates a scaling effect that reduces the variance/mean ratios for male and female reproductive success and causes them to converge. Our results demonstrate that measuring reproductive success without considering offspring mortality can underestimate Ne and overestimate the genetic consequences of sexual selection. Similarly, comparing genetic diversity among different genomic components may fail to detect strong sexual selection.

  2. Sexual selection has minimal impact on effective population sizes in species with high rates of random offspring mortality: an empirical demonstration using fitness distributions

    PubMed Central

    Pischedda, Alison; Friberg, Urban; Stewart, Andrew D.; Miller, Paige M.; Rice, William R.

    2015-01-01

    The effective population size (Ne) is a fundamental parameter in population genetics that influences the rate of loss of genetic diversity. Sexual selection has the potential to reduce Ne by causing the sex-specific distributions of individuals that successfully reproduce to diverge. To empirically estimate the effect of sexual selection on Ne, we obtained fitness distributions for males and females from an outbred, laboratory-adapted population of Drosophila melanogaster. We observed strong sexual selection in this population (the variance in male reproductive success was ∼14 times higher than that for females), but found that sexual selection had only a modest effect on Ne, which was 75% of the census size. This occurs because the substantial random offspring mortality in this population diminishes the effects of sexual selection on Ne, a result that necessarily applies to other high fecundity species. The inclusion of this random offspring mortality creates a scaling effect that reduces the variance/mean ratios for male and female reproductive success and causes them to converge. Our results demonstrate that measuring reproductive success without considering offspring mortality can underestimate Ne and overestimate the genetic consequences of sexual selection. Similarly, comparing genetic diversity among different genomic components may fail to detect strong sexual selection. PMID:26374275
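A minimal simulation can illustrate the scaling effect described above, assuming binomial thinning as the model of random offspring mortality (toy parameters, not the authors' Drosophila data):

```python
# Toy illustration: each adult's offspring count is thinned binomially by
# random juvenile mortality; the variance/mean ratio of surviving offspring
# shrinks sharply, damping the signature of strong sexual selection.
import numpy as np

rng = np.random.default_rng(2)
# Highly skewed "male reproductive success" (mean ~20, variance ~800).
offspring = rng.gamma(shape=0.5, scale=40.0, size=10_000).astype(int)
# 95% random offspring mortality, independent of the parent.
survivors = rng.binomial(offspring, 0.05)

def vmr(x):
    """Variance-to-mean ratio of reproductive success."""
    return x.var() / x.mean()

print(round(vmr(offspring), 1), round(vmr(survivors), 1))
```

Because binomial thinning with survival probability p scales the variance by p² but the mean only by p, the ratio collapses toward 1, which is why pre-mortality measurements overstate the effect of sexual selection on Ne.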

  3. The Ecological Effects of Universal and Selective Violence Prevention Programs for Middle School Students: A Randomized Trial

    ERIC Educational Resources Information Center

    Simon, Thomas R.; Ikeda, Robin M.; Smith, Emilie Phillips; Reese, Le'Roy E.; Rabiner, David L.; Miller, Shari; Winn, Donna-Marie; Dodge, Kenneth A.; Asher, Steven R.; Horne, Arthur M.; Orpinas, Pamela; Martin, Roy; Quinn, William H.; Tolan, Patrick H.; Gorman-Smith, Deborah; Henry, David B.; Gay, Franklin N.; Schoeny, Michael; Farrell, Albert D.; Meyer, Aleta L.; Sullivan, Terri N.; Allison, Kevin W.

    2009-01-01

    This study reports the findings of a multisite randomized trial evaluating the separate and combined effects of 2 school-based approaches to reduce violence among early adolescents. A total of 37 schools at 4 sites were randomized to 4 conditions: (1) a universal intervention that involved implementing a student curriculum and teacher training…

  4. The Ecological Effects of Universal and Selective Violence Prevention Programs for Middle School Students: A Randomized Trial

    ERIC Educational Resources Information Center

    Simon, Thomas R.; Ikeda, Robin M.; Smith, Emilie Phillips; Reese, Le'Roy E.; Rabiner, David L.; Miller, Shari; Winn, Donna-Marie; Dodge, Kenneth A.; Asher, Steven R.; Horne, Arthur M.; Orpinas, Pamela; Martin, Roy; Quinn, William H.; Tolan, Patrick H.; Gorman-Smith, Deborah; Henry, David B.; Gay, Franklin N.; Schoeny, Michael; Farrell, Albert D.; Meyer, Aleta L.; Sullivan, Terri N.; Allison, Kevin W.

    2009-01-01

    This study reports the findings of a multisite randomized trial evaluating the separate and combined effects of 2 school-based approaches to reduce violence among early adolescents. A total of 37 schools at 4 sites were randomized to 4 conditions: (1) a universal intervention that involved implementing a student curriculum and teacher training…

  5. Changing friend selection in middle school: A social network analysis of a randomized intervention study designed to prevent adolescent problem behavior

    PubMed Central

    DeLay, Dawn; Ha, Thao; Van Ryzin, Mark; Winter, Charlotte; Dishion, Thomas J.

    2015-01-01

    Adolescent friendships that promote problem behavior are often chosen in middle school. The current study examines the unintended impact of a randomized school-based intervention on the selection of friends in middle school, as well as on observations of deviant talk with friends five years later. Participants included 998 middle school students (526 boys and 472 girls) recruited at the onset of middle school (age 11-12 years) from three public middle schools participating in the Family Check-up model intervention. The current study focuses only on the effects of the SHAPe curriculum—one level of the Family Check-up model—on friendship choices. Participants nominated friends and completed measures of deviant peer affiliation. Approximately half of the sample (n=500) was randomly assigned to the intervention and the other half (n=498) comprised the control group within each school. The results indicate that the SHAPe curriculum affected friend selection within School 1, but not within Schools 2 or 3. The effects of friend selection in School 1 translated into reductions in observed deviancy training five years later (age 16-17 years). By coupling longitudinal social network analysis with a randomized intervention study the current findings provide initial evidence that a randomized public middle school intervention can disrupt the formation of deviant peer groups and diminish levels of adolescent deviance five years later. PMID:26377235

  6. Changing Friend Selection in Middle School: A Social Network Analysis of a Randomized Intervention Study Designed to Prevent Adolescent Problem Behavior.

    PubMed

    DeLay, Dawn; Ha, Thao; Van Ryzin, Mark; Winter, Charlotte; Dishion, Thomas J

    2016-04-01

    Adolescent friendships that promote problem behavior are often chosen in middle school. The current study examines the unintended impact of a randomized school-based intervention on the selection of friends in middle school, as well as on observations of deviant talk with friends 5 years later. Participants included 998 middle school students (526 boys and 472 girls) recruited at the onset of middle school (age 11-12 years) from three public middle schools participating in the Family Check-up model intervention. The current study focuses only on the effects of the SHAPe curriculum-one level of the Family Check-up model-on friendship choices. Participants nominated friends and completed measures of deviant peer affiliation. Approximately half of the sample (n = 500) was randomly assigned to the intervention, and the other half (n = 498) comprised the control group within each school. The results indicate that the SHAPe curriculum affected friend selection within school 1 but not within schools 2 or 3. The effects of friend selection in school 1 translated into reductions in observed deviancy training 5 years later (age 16-17 years). By coupling longitudinal social network analysis with a randomized intervention study, the current findings provide initial evidence that a randomized public middle school intervention can disrupt the formation of deviant peer groups and diminish levels of adolescent deviance 5 years later.

  7. Affinity selection of Nipah and Hendra virus-related vaccine candidates from a complex random peptide library displayed on bacteriophage virus-like particles

    DOEpatents

    Peabody, David S.; Chackerian, Bryce; Ashley, Carlee; Carnes, Eric; Negrete, Oscar

    2017-01-24

    The invention relates to virus-like particles of bacteriophage MS2 (MS2 VLPs) displaying peptide epitopes or peptide mimics of epitopes of Nipah Virus envelope glycoprotein that elicit an immune response against Nipah Virus upon vaccination of humans or animals. Affinity selection on Nipah Virus-neutralizing monoclonal antibodies using random sequence peptide libraries on MS2 VLPs selected peptides with sequence similarity to peptide sequences found within the envelope glycoprotein of Nipah itself, thus identifying the epitopes the antibodies recognize. The selected peptide sequences themselves are not necessarily identical in all respects to a sequence within Nipah Virus glycoprotein, and therefore may be referred to as epitope mimics. VLPs displaying these epitope mimics can serve as a vaccine. On the other hand, display of the corresponding wild-type sequence derived from Nipah Virus and corresponding to the epitope mapped by affinity selection, may also be used as a vaccine.

  8. Increasing the Generalizability of ANOVA Results by Judicious Selection of Fixed-, Random-, and Mixed-Effects ANOVA Models.

    ERIC Educational Resources Information Center

    Baugh, Frank G.

    The analysis of variance (ANOVA) is a frequently used statistical procedure by which the equality of more than two population means can be tested without inflating the Type I error rate (D. Hinkle, W. Wiersma, and S. Jurs, 1998). Fixed-, random-, and mixed-effects ANOVA models are each capable of yielding interesting and useful results when…

  9. Blood Selenium Concentration and Blood Cystatin C Concentration in a Randomly Selected Population of Healthy Children Environmentally Exposed to Lead and Cadmium.

    PubMed

    Gać, Paweł; Pawlas, Natalia; Wylężek, Paweł; Poręba, Rafał; Poręba, Małgorzata; Pawlas, Krystyna

    2017-01-01

    This study aimed at evaluation of a relationship between blood selenium concentration (Se-B) and blood cystatin C concentration (CST) in a randomly selected population of healthy children, environmentally exposed to lead and cadmium. The studies were conducted on 172 randomly selected children (7.98 ± 0.97 years). Among participants, the subgroups were distinguished, manifesting marginally low blood selenium concentration (Se-B 40-59 μg/l), suboptimal blood selenium concentration (Se-B: 60-79 μg/l) or optimal blood selenium concentration (Se-B ≥ 80 μg/l). At the subsequent stage, analogous subgroups of participants were selected separately in groups of children with BMI below median value (BMI <16.48 kg/m(2)) and in children with BMI ≥ median value (BMI ≥16.48 kg/m(2)). In all participants, values of Se-B and CST were estimated. In the entire group of examined children, no significant differences in mean CST values were detected between groups distinguished on the basis of normative Se-B values. Among children with BMI below 16.48 kg/m(2), children with marginally low Se-B manifested significantly higher mean CST values, as compared to children with optimal Se-B (0.95 ± 0.07 vs. 0.82 ± 0.15 mg/l, p < 0.05). In summary, in a randomly selected population of healthy children no relationships could be detected between blood selenium concentration and blood cystatin C concentration. On the other hand, in children with low body mass index, a negative non-linear relationship was present between blood selenium concentration and blood cystatin C concentration.

  10. Purification of polyclonal anti-conformational antibodies for use in affinity selection from random peptide phage display libraries: A study using the hydatid vaccine EG95

    PubMed Central

    Read, A.J.; Gauci, C.G.; Lightowlers, M.W.

    2009-01-01

    The use of polyclonal antibodies to screen random peptide phage display libraries often results in the recognition of a large number of peptides that mimic linear epitopes on various proteins. There appears to be a bias in the use of this technology toward the selection of peptides that mimic linear epitopes. In many circumstances the correct folding of a protein immunogen is required for conferring protection. The use of random peptide phage display libraries to identify peptide mimics of conformational epitopes in these cases requires a strategy for overcoming this bias. Conformational epitopes on the hydatid vaccine EG95 have been shown to result in protective immunity in sheep, whereas linear epitopes are not protective. In this paper we describe a strategy that results in the purification of polyclonal antibodies directed against conformational epitopes while eliminating antibodies directed against linear epitopes. These affinity purified antibodies were then used to select a peptide from a random peptide phage display library that has the capacity to mimic conformational epitopes on EG95. This peptide was subsequently used to affinity purify monospecific antibodies against EG95. PMID:19349218

  11. Defining the sequence specificity of DNA-binding proteins by selecting binding sites from random-sequence oligonucleotides: analysis of yeast GCN4 protein.

    PubMed

    Oliphant, A R; Brandl, C J; Struhl, K

    1989-07-01

We describe a new method for accurately defining the sequence recognition properties of DNA-binding proteins by selecting high-affinity binding sites from random-sequence DNA. The yeast transcriptional activator protein GCN4 was coupled to a Sepharose column, and binding sites were isolated by passing short, random-sequence oligonucleotides over the column and eluting them with increasing salt concentrations. Of 43 specifically bound oligonucleotides, 40 contained the symmetric sequence TGA(C/G)TCA, whereas the other 3 contained sequences matching six of these seven bases. The extreme preference for this 7-base-pair sequence suggests that each position directly contacts GCN4. The three nucleotide positions on each side of this core heptanucleotide also showed sequence preferences, indicating their effect on GCN4 binding. Interestingly, deviations in the core and a stronger sequence preference in the flanking region were found on one side of the central C·G base pair. Although GCN4 binds as a dimer, this asymmetry supports a model in which interactions on each side of the binding site are not equivalent. The random selection method should prove generally useful for defining the specificities of other DNA-binding proteins and for identifying putative target sequences from genomic DNA.

  12. A randomized, double-blind, placebo-controlled trial of a selective COX-2 inhibitor, GW406381, in patients with postherpetic neuralgia.

    PubMed

    Shackelford, Steve; Rauck, Richard; Quessy, Steve; Blum, David; Hodge, Rachel; Philipson, Richard

    2009-06-01

    In this randomized, double-blind, placebo-controlled study, we evaluated the efficacy and safety of GW406381, an investigational selective cyclooxygenase (COX)-2 inhibitor with both peripheral and central actions, in 209 patients with postherpetic neuralgia (PHN). Patients were randomly assigned to GW406381 25 mg or 50 mg or placebo treatments for 3 weeks. The primary efficacy outcome measure was the change in average daily pain intensity score from baseline to the last week of treatment. Both doses of GW406381 produced greater reduction in pain score than placebo, but the treatment difference did not reach statistical significance. It was possible that the 3-week duration was too short, as there was a tendency for increasing separation from placebo over time that did not appear to reach maximum effect by the end of the study for either GW406381 treatment group. Overall, GW406381 was well tolerated in this elderly population. To our knowledge, this is the first report of a randomized, controlled clinical trial of a selective or nonselective COX inhibitor in neuropathic pain. The results of this study were inconclusive regarding the clinical relevance of the role of COX-2 in modulation of the symptoms of PHN.

  13. Prevalence and classification of chronic kidney disease in cats randomly selected from four age groups and in cats recruited for degenerative joint disease studies.

    PubMed

    Marino, Christina L; Lascelles, B Duncan X; Vaden, Shelly L; Gruen, Margaret E; Marks, Steven L

    2014-06-01

    Chronic kidney disease (CKD) and degenerative joint disease are both considered common in older cats. Information on the co-prevalence of these two diseases is lacking. This retrospective study was designed to determine the prevalence of CKD in two cohorts of cats: cats randomly selected from four evenly distributed age groups (RS group) and cats recruited for degenerative joint disease studies (DJD group), and to evaluate the concurrence of CKD and DJD in these cohorts. The RS group was randomly selected from four age groups from 6 months to 20 years, and the DJD group comprised cats recruited to four previous DJD studies, with the DJD group excluding cats with a blood urea nitrogen and/or serum creatinine concentration >20% (the upper end of normal) for two studies and cats with CKD stages 3 and 4 for the other two studies. The prevalence of CKD in the RS and DJD groups was higher than expected at 50% and 68.8%, respectively. CKD was common in cats between 1 and 15 years of age, with a similar prevalence of CKD stages 1 and 2 across age groups in both the RS and DJD cats, respectively. We found significant concurrence between CKD and DJD in cats of all ages, indicating the need for increased screening for CKD when selecting DJD treatments. Additionally, this study offers the idea of a relationship and causal commonality between CKD and DJD owing to the striking concurrence across age groups and life stages. © ISFM and AAFP 2013.

  14. The prevalence and classification of chronic kidney disease in cats randomly selected within four age groups and in cats recruited for degenerative joint disease studies

    PubMed Central

    Marino, Christina L; Lascelles, B Duncan X; Vaden, Shelly L; Gruen, Margaret E; Marks, Steven L

    2015-01-01

    Chronic kidney disease (CKD) and degenerative joint disease are both considered common in older cats. Information on the co-prevalence of these two diseases is lacking. This retrospective study was designed to determine the prevalence of CKD in two cohorts of cats: cats randomly selected from four evenly distributed age groups (RS group) and cats recruited for degenerative joint disease studies (DJD group), and to evaluate the concurrence of CKD and DJD in these cohorts. The RS group was randomly selected from four age groups from 6 months to 20 years, and the DJD group comprised cats recruited to four previous DJD studies, with the DJD group excluding cats with a blood urea nitrogen and/or serum creatinine concentration >20% (the upper end of normal) for two studies and cats with CKD stages 3 and 4 for the other two studies. The prevalence of CKD in the RS and DJD groups was higher than expected at 50% and 68.8%, respectively. CKD was common in cats between 1 and 15 years of age, with a similar prevalence of CKD stages 1 and 2 across age groups in both the RS and DJD cats, respectively. We found significant concurrence between CKD and DJD in cats of all ages, indicating the need for increased screening for CKD when selecting DJD treatments. Additionally, this study offers the idea of a relationship and causal commonality between CKD and DJD owing to the striking concurrence across age groups and life stages. PMID:24217707

15. Free variable selection QSPR study to predict ¹⁹F chemical shifts of some fluorinated organic compounds using Random Forest and RBF-PLS methods.

    PubMed

    Goudarzi, Nasser

    2016-04-05

In this work, two new and powerful chemometrics methods are applied for the modeling and prediction of the ¹⁹F chemical shift values of some fluorinated organic compounds. The radial basis function-partial least square (RBF-PLS) and random forest (RF) methods are employed to construct the models to predict the ¹⁹F chemical shifts. In this study, no separate variable selection method was used, since the RF method can itself serve as both a variable selection and a modeling technique. Effects of the important parameters governing the RF prediction power, such as the number of trees (nt) and the number of randomly selected variables tried at each node split (m), were investigated. The root-mean-square errors of prediction (RMSEP) for the training set and the prediction set for the RBF-PLS and RF models were 44.70, 23.86, 29.77, and 23.69, respectively. Also, the correlation coefficients of the prediction set for the RBF-PLS and RF models were 0.8684 and 0.9313, respectively. The results obtained reveal that the RF model can be used as a powerful chemometrics tool for quantitative structure-property relationship (QSPR) studies.
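The two random-forest hyperparameters named in the record above, the number of trees (nt) and the number of randomly selected variables tried at each split (m, commonly called mtry), can be tuned with a simple grid search. The sketch below is illustrative only: it uses scikit-learn and synthetic data, not the study's ¹⁹F descriptor matrix, and all variable names are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic stand-in for a QSPR descriptor matrix (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))
y = X[:, 0] * 2.0 - X[:, 1] + rng.normal(scale=0.1, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

best = None
for nt in (100, 300):              # number of trees (nt)
    for m in (5, 10, "sqrt"):      # variables tried at each split (mtry)
        rf = RandomForestRegressor(n_estimators=nt, max_features=m,
                                   random_state=0).fit(X_tr, y_tr)
        rmsep = mean_squared_error(y_te, rf.predict(X_te)) ** 0.5
        if best is None or rmsep < best[0]:
            best = (rmsep, nt, m)

print(f"best RMSEP={best[0]:.3f} with nt={best[1]}, m={best[2]}")
```

Lowering m decorrelates the trees (each split sees a different random variable subset), which is what lets an RF double as an implicit variable ranker via its importance scores.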

  16. Effects of one versus two bouts of moderate intensity physical activity on selective attention during a school morning in Dutch primary schoolchildren: A randomized controlled trial.

    PubMed

    Altenburg, Teatske M; Chinapaw, Mai J M; Singh, Amika S

    2016-10-01

    Evidence suggests that physical activity is positively related to several aspects of cognitive functioning in children, among which is selective attention. To date, no information is available on the optimal frequency of physical activity on cognitive functioning in children. The current study examined the acute effects of one and two bouts of moderate-intensity physical activity on children's selective attention. Randomized controlled trial (ISRCTN97975679). Thirty boys and twenty-six girls, aged 10-13 years, were randomly assigned to three conditions: (A) sitting all morning working on simulated school tasks; (B) one 20-min physical activity bout after 90min; and (C) two 20-min physical activity bouts, i.e. at the start and after 90min. Selective attention was assessed at five time points during the morning (i.e. at baseline and after 20, 110, 130 and 220min), using the 'Sky Search' subtest of the 'Test of Selective Attention in Children'. We used GEE analysis to examine differences in Sky Search scores between the three experimental conditions, adjusting for school, baseline scores, self-reported screen time and time spent in sports. Children who performed two 20-min bouts of moderate-intensity physical activity had significantly better Sky Search scores compared to children who performed one physical activity bout or remained seated the whole morning (B=-0.26; 95% CI=[-0.52; -0.00]). Our findings support the importance of repeated physical activity during the school day for beneficial effects on selective attention in children. Copyright © 2015 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  17. Toward a code for the interactions of zinc fingers with DNA: selection of randomized fingers displayed on phage.

    PubMed Central

    Choo, Y; Klug, A

    1994-01-01

We have used two selection techniques to study sequence-specific DNA recognition by the zinc finger, a small, modular DNA-binding minidomain. We have chosen zinc fingers because they bind as independent modules and so can be linked together in a peptide designed to bind a predetermined DNA site. In this paper, we describe how a library of zinc fingers displayed on the surface of bacteriophage enables selection of fingers capable of binding to given DNA triplets. The amino acid sequences of selected fingers which bind the same triplet are compared to examine how sequence-specific DNA recognition occurs. Our results can be rationalized in terms of coded interactions between zinc fingers and DNA, involving base contacts from a few alpha-helical positions. In the paper following this one, we describe a complementary technique which confirms the identity of amino acids capable of DNA sequence discrimination from these positions. PMID:7972027

  18. [Selective hemoperfusion in gram-negative severe sepsis of patients after cardiac surgery: a prospective randomized study].

    PubMed

    Iarustovskiĭ, M B; Abramian, M V; Popok, Z V; Nazarova, E I; Stupchenko, O S; Popov, D A; Pliushch, M G

    2010-01-01

    Early in the new millennium, sepsis remains one of the most urgent problems of modern reanimatology. Endotoxin, a component of the cell wall of gram-negative bacteria is of paramount importance in the pathogenesis of sepsis. Complex intensive care for severe sepsis involves selective endotoxin hemoperfusion with Polymyxin B and Alteco LPS adsorber, which has been performed in 2 patients. This study will enable specialists to formulate their opinion as to whether it is expedient to incorporate selective endotoxin hemoperfusion into complex intensive care for severe sepsis.

  19. SNPs selected by information content outperform randomly selected microsatellite loci for delineating genetic identification and introgression in the endangered dark European honeybee (Apis mellifera mellifera).

    PubMed

    Muñoz, Irene; Henriques, Dora; Jara, Laura; Johnston, J Spencer; Chávez-Galarza, Julio; De La Rúa, Pilar; Pinto, M Alice

    2017-07-01

    The honeybee (Apis mellifera) has been threatened by multiple factors including pests and pathogens, pesticides and loss of locally adapted gene complexes due to replacement and introgression. In western Europe, the genetic integrity of the native A. m. mellifera (M-lineage) is endangered due to trading and intensive queen breeding with commercial subspecies of eastern European ancestry (C-lineage). Effective conservation actions require reliable molecular tools to identify pure-bred A. m. mellifera colonies. Microsatellites have been preferred for identification of A. m. mellifera stocks across conservation centres. However, owing to high throughput, easy transferability between laboratories and low genotyping error, SNPs promise to become popular. Here, we compared the resolving power of a widely utilized microsatellite set to detect structure and introgression with that of different sets that combine a variable number of SNPs selected for their information content and genomic proximity to the microsatellite loci. Contrary to every SNP data set, microsatellites did not discriminate between the two lineages in the PCA space. Mean introgression proportions were identical across the two marker types, although at the individual level, microsatellites' performance was relatively poor at the upper range of Q-values, a result reflected by their lower precision. Our results suggest that SNPs are more accurate and powerful than microsatellites for identification of A. m. mellifera colonies, especially when they are selected by information content. © 2016 John Wiley & Sons Ltd.

  20. Selective internal radiotherapy (SIRT) versus transarterial chemoembolization (TACE) for the treatment of intrahepatic cholangiocellular carcinoma (CCC): study protocol for a randomized controlled trial.

    PubMed

    Kloeckner, Roman; Ruckes, Christian; Kronfeld, Kai; Wörns, Marcus Alexander; Weinmann, Arndt; Galle, Peter Robert; Lang, Hauke; Otto, Gerd; Eichhorn, Waltraud; Schreckenberger, Mathias; Dueber, Christoph; Pitton, Michael Bernhard

    2014-08-06

    Cholangiocellular carcinoma is the second most common primary liver cancer after hepatocellular carcinoma. Over the last 30 years, the incidence of intrahepatic cholangiocellular carcinoma has risen continuously worldwide. Meanwhile, the intrahepatic cholangiocellular carcinoma has become more common than the extrahepatic growth type and currently accounts for 10-15% of all primary hepatic malignancies. Intrahepatic cholangiocellular carcinoma is typically diagnosed in advanced stages due to late clinical symptoms and an absence of classic risk factors. A late diagnosis precludes curative surgical resection. There is evidence that transarterial chemoembolization leads to better local tumor control and prolongs survival compared to systemic chemotherapy. New data indicates that selective internal radiotherapy, also referred to as radioembolization, provides promising results for treating intrahepatic cholangiocellular carcinoma. This pilot study is a randomized, controlled, single center, phase II trial. Twenty-four patients with intrahepatic cholangiocellular carcinoma will be randomized in a 1:1 ratio to receive either chemoembolization or radioembolization. Randomization will be stratified according to tumor load. Progression-free survival is the primary endpoint; overall survival and time to progression are secondary endpoints. To evaluate treatment success, patients will receive contrast enhanced magnetic resonance imaging every 3 months. Currently, chemoembolization is routinely performed in many centers instead of systemic chemotherapy for treating intrahepatic cholangiocellular carcinoma confined to the liver. Recently, radioembolization has been increasingly applied to cholangiocellular carcinoma as second line therapy after TACE failure or even as an alternative first line therapy. Nonetheless, no randomized studies have compared radioembolization and chemoembolization. Considering all this background information, we recognized a strong need for a

  1. Risk-based Patient Selection for Magnetic Resonance Imaging-targeted Prostate Biopsy after Negative Transrectal Ultrasound-guided Random Biopsy Avoids Unnecessary Magnetic Resonance Imaging Scans.

    PubMed

    Alberts, Arnout R; Schoots, Ivo G; Bokhorst, Leonard P; van Leenders, Geert J; Bangma, Chris H; Roobol, Monique J

    2016-06-01

    Multiparametric magnetic resonance imaging (mpMRI) is increasingly used in men with suspicion of prostate cancer (PCa) after negative transrectal ultrasound (TRUS)-guided random biopsy. Risk-based patient selection for mpMRI could help to avoid unnecessary mpMRIs. To study the rate of potentially avoided mpMRIs after negative TRUS-guided random biopsy by risk-based patient selection using the Rotterdam Prostate Cancer Risk Calculator (RPCRC). One hundred and twenty two consecutive men received a mpMRI scan and subsequent MRI-TRUS fusion targeted biopsy in case of suspicious lesion(s) (Prostate Imaging Reporting and Data System ≥ 3) after negative TRUS-guided random biopsy. Men were retrospectively stratified according to the RPCRC biopsy advice to compare targeted biopsy outcomes after risk-based patient selection with standard (prostate specific antigen and/or digital rectal examination-driven) patient selection. The rate of potentially avoided mpMRIs by RPCRC-based patient selection in relation to the rate of missed high-grade (Gleason ≥ 3+4) PCa. Receiver operating characteristic curve analysis was performed to determine the area under the curve of the RPCRC for (high-grade) PCa. Of the 60 men with a positive biopsy advice, six (10%) had low-grade PCa and 28 (47%) had high-grade PCa in targeted biopsy. Of the 62 men with a negative advice, two (3%) had low-grade PCa and three (5%) had high-grade PCa. Upfront RPCRC-based patient selection would have avoided 62 (51%) of 122 mpMRIs and two (25%) of eight low-grade PCa diagnoses, missing three (10%) of 31 high-grade PCa. The area under the curve of the RPCRC for PCa and high-grade PCa was respectively 0.76 (95% confidence interval 0.67-0.85) and 0.84 (95% confidence interval 0.76-0.93). Risk-based patient selection with the RPCRC can avoid half of mpMRIs after a negative prostate specific antigen and/or digital rectal examination-driven TRUS-guided random biopsy. Further improvement in risk-based patient

  2. Obstetric and perinatal outcome of babies born from sperm selected by MACS from a randomized controlled trial.

    PubMed

    Romany, Laura; Garrido, Nicolas; Cobo, Ana; Aparicio-Ruiz, Belen; Serra, Vicente; Meseguer, Marcos

    2017-02-01

The purpose of this study is to assess the effects of magnetic-activated cell sorting (MACS) technology on obstetric and perinatal outcomes compared with those achieved after swim-up sperm selection, in a randomized controlled trial. This is a two-arm, unicentric, prospective, randomized, and triple-blinded trial including a total of 237 infertile couples, recruited between October 2010 and January 2013. A total of 65 and 66 newborns from the MACS and control groups, respectively, were described. MACS had no clinically relevant adverse effects on obstetric and perinatal outcomes. No differences were found for obstetric problems including premature rupture of membranes 6.1% (CI95% 0-12.8) vs. 5.9% (CI95% 0-12.4), 1st trimester bleeding 28.6% (CI95% 15.9-41.2) vs. 23.5% (CI95% 11.9-35.1), invasive procedures such as amniocentesis 2.0% (CI95% 0-5.9) vs. 3.9% (CI95% 0-9.2), diabetes 14.3% (CI95% 4.5-24.1) vs. 9.8% (CI95% 1.6-17.9), anemia 6.1% (CI95% 0-12.8) vs. 5.9% (CI95% 0-12.4), 2nd and 3rd trimesters 10.2% (CI95% 1.7-18.7) vs. 5.9% (CI95% 0-12.4), urinary tract infection 8.2% (CI95% 0.5-15.9) vs. 3.9% (CI95% 0-9.2), pregnancy-induced hypertension 6.1% (CI95% 0-12.8) vs. 15.7% (CI95% 5.7-25.7), birth weight (g) 2684.10 (CI95% 2499.48-2868.72) vs. 2676.12 (CI95% 2499.02-2852.21), neonatal height (cm) 48.3 (CI95% 47.1-49.4) vs. 46.5 (CI95% 44.6-48.4), and gestational cholestasis 0% (CI95% 0-0) vs. 3.9% (CI95% 0-9.2), respectively, in the MACS group compared with the control group. Our data suggest that MACS technology does not increase or decrease adverse obstetric and perinatal outcomes in children conceived when this technology was performed, this being the largest randomized controlled trial with reported live-birth results for MACS.

  3. Selective prevention of combat-related post-traumatic stress disorder using attention bias modification training: a randomized controlled trial.

    PubMed

    Wald, I; Fruchter, E; Ginat, K; Stolin, E; Dagan, D; Bliese, P D; Quartana, P J; Sipos, M L; Pine, D S; Bar-Haim, Y

    2016-09-01

    Efficacy of pre-trauma prevention for post-traumatic stress disorder (PTSD) has not yet been established in a randomized controlled trial. Attention bias modification training (ABMT), a computerized intervention, is thought to mitigate stress-related symptoms by targeting disruptions in threat monitoring. We examined the efficacy of ABMT delivered before combat in mitigating risk for PTSD following combat. We conducted a double-blind, four-arm randomized controlled trial of 719 infantry soldiers to compare the efficacy of eight sessions of ABMT (n = 179), four sessions of ABMT (n = 184), four sessions of attention control training (ACT; n = 180), or no-training control (n = 176). Outcome symptoms were measured at baseline, 6-month follow-up, 10 days following combat exposure, and 4 months following combat. Primary outcome was PTSD prevalence 4 months post-combat determined in a clinical interview using the Clinician-Administered PTSD Scale. Secondary outcomes were self-reported PTSD and depression symptoms, collected at all four assessments. PTSD prevalence 4 months post-combat was 7.8% in the no-training control group, 6.7% with eight-session ABMT, 2.6% with four-session ABMT, and 5% with ACT. Four sessions of ABMT reduced risk for PTSD relative to the no-training condition (odds ratio 3.13, 95% confidence interval 1.01-9.22, p < 0.05, number needed to treat = 19.2). No other between-group differences were found. The results were consistent across a variety of analytic techniques and data imputation approaches. Four sessions of ABMT, delivered prior to combat deployment, mitigated PTSD risk following combat exposure. Given its low cost and high scalability potential, and observed number needed to treat, research into larger-scale applications is warranted. The ClinicalTrials.gov identifier is NCT01723215.

  4. Acute changes of hip joint range of motion using selected clinical stretching procedures: A randomized crossover study.

    PubMed

    Hammer, Adam M; Hammer, Roger L; Lomond, Karen V; O'Connor, Paul

    2017-09-01

    Hip adductor flexibility and strength is an important component of athletic performance and many activities of daily living. Little research has been done on the acute effects of a single session of stretching on hip abduction range of motion (ROM). The aim of this study was to compare 3 clinical stretching procedures against passive static stretching and control on ROM and peak isometric maximal voluntary contraction (MVC). Using a randomized crossover study design, a total of 40 participants (20 male and 20 female) who had reduced hip adductor muscle length attended a familiarization session and 5 testing sessions on non-consecutive days. Following the warm-up and pre-intervention measures of ROM and MVC, participants were randomly assigned 1 of 3 clinical stretching procedures (modified lunge, multidirectional, and joint mobilization) or a static stretch or control condition. Post-intervention measures of ROM and MVC were taken immediately following completion of the assigned condition. An ANOVA using a repeated measure design with the change score was conducted. All interventions resulted in small but statistically significant (p < 0.05) increases (1.0°-1.7°) in ROM with no inter-condition differences except one. Multidirectional stretching was greater than control (p = 0.031). These data suggest that a single session of stretching has only a minimal effect on acute changes of hip abduction ROM. Although hip abduction is a frontal plane motion, to effectively increase the extensibility of the structures that limit abduction, integrating multi-planar stretches may be indicated. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Feature selection and classification of urinary mRNA microarray data by iterative random forest to diagnose renal fibrosis: a two-stage study

    PubMed Central

    Zhou, Le-Ting; Cao, Yu-Han; Lv, Lin-Li; Ma, Kun-Ling; Chen, Ping-Sheng; Ni, Hai-Feng; Lei, Xiang-Dong; Liu, Bi-Cheng

    2017-01-01

    Renal fibrosis is a common pathological pathway of progressive chronic kidney disease (CKD). However, kidney function parameters are suboptimal for detecting early fibrosis, and therefore, novel biomarkers are urgently needed. We designed a 2-stage study and constructed a targeted microarray to detect urinary mRNAs of CKD patients with renal biopsy and healthy participants. We analysed the microarray data by an iterative random forest method to select candidate biomarkers and produce a more accurate classifier of renal fibrosis. Seventy-six and 49 participants were enrolled into stage I and stage II studies, respectively. By the iterative random forest method, we identified a four-mRNA signature in urinary sediment, including TGFβ1, MMP9, TIMP2, and vimentin, as important features of tubulointerstitial fibrosis (TIF). All four mRNAs significantly correlated with TIF scores and discriminated TIF with high sensitivity, which was further validated in the stage-II study. The combined classifiers showed excellent sensitivity and outperformed serum creatinine and estimated glomerular filtration rate measurements in diagnosing TIF. Another four mRNAs significantly correlated with glomerulosclerosis. These findings showed that urinary mRNAs can serve as sensitive biomarkers of renal fibrosis, and the random forest classifier containing urinary mRNAs showed favourable performance in diagnosing early renal fibrosis. PMID:28045061
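The iterative random-forest procedure described in the record above (fit a forest, rank features by importance, retain the top-ranked features, and refit) can be sketched as follows. The data, panel size, and halving schedule are illustrative assumptions, not the study's urinary-mRNA microarray; scikit-learn stands in for whatever implementation the authors used.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n, p = 120, 40
X = rng.normal(size=(n, p))
# Binary outcome driven by 4 "marker" features (indices 0-3),
# mimicking a small diagnostic signature among many candidates.
logits = X[:, :4].sum(axis=1)
y = (logits + rng.normal(scale=0.5, size=n) > 0).astype(int)

features = np.arange(p)
# Iteratively keep the more informative half until a small panel remains.
while features.size > 4:
    rf = RandomForestClassifier(n_estimators=300, random_state=0)
    rf.fit(X[:, features], y)
    order = np.argsort(rf.feature_importances_)[::-1]  # descending importance
    features = features[order[: max(4, features.size // 2)]]

print("selected features:", sorted(features.tolist()))
```

Re-fitting after each elimination matters because importance scores shift once correlated, uninformative features are removed, which is the point of iterating rather than ranking once.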

  6. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination.

    PubMed

    Koh, Bongyeun; Hong, Sunggi; Kim, Soon-Sim; Hyun, Jin-Sook; Baek, Milye; Moon, Jundong; Kwon, Hayran; Kim, Gyoungyong; Min, Seonggi; Kang, Gu-Hyun

    2016-01-01

    The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.

  7. Expression analysis onto microarrays of randomly selected cDNA clones highlights HOXB13 as a marker of human prostate cancer

    PubMed Central

    Edwards, S; Campbell, C; Flohr, P; Shipley, J; Giddings, I; te-Poele, R; Dodson, A; Foster, C; Clark, J; Jhavar, S; Kovacs, G; Cooper, C S

    2004-01-01

    In a strategy aimed at identifying novel markers of human prostate cancer, we performed expression analysis using microarrays of clones randomly selected from a cDNA library prepared from the LNCaP prostate cancer cell line. Comparisons of expression profiles in primary human prostate cancer, adjacent normal prostate tissue, and a selection of other (nonprostate) normal human tissues, led to the identification of a set of clones that were judged as the best candidate markers of normal and/or malignant prostate tissue. DNA sequencing of the selected clones revealed that they included 10 genes that had previously been established as prostate markers: NKX3.1, KLK2, KLK3 (PSA), FOLH1 (PSMA), STEAP2, PSGR, PRAC, RDH11, Prostein and FASN. Following analysis of the expression patterns of all selected and sequenced genes through interrogation of SAGE databases, a further three genes from our clone set, HOXB13, SPON2 and NCAM2, emerged as additional candidate markers of human prostate cancer. Quantitative RT–PCR demonstrated the specificity of expression of HOXB13 in prostate tissue and revealed its ubiquitous expression in a series of 37 primary prostate cancers and 20 normal prostates. These results demonstrate the utility of this expression-microarray approach in hunting for new markers of individual human cancer types. PMID:15583692

  8. Expression analysis onto microarrays of randomly selected cDNA clones highlights HOXB13 as a marker of human prostate cancer.

    PubMed

    Edwards, S; Campbell, C; Flohr, P; Shipley, J; Giddings, I; Te-Poele, R; Dodson, A; Foster, C; Clark, J; Jhavar, S; Kovacs, G; Cooper, C S

    2005-01-31

    In a strategy aimed at identifying novel markers of human prostate cancer, we performed expression analysis using microarrays of clones randomly selected from a cDNA library prepared from the LNCaP prostate cancer cell line. Comparisons of expression profiles in primary human prostate cancer, adjacent normal prostate tissue, and a selection of other (nonprostate) normal human tissues, led to the identification of a set of clones that were judged as the best candidate markers of normal and/or malignant prostate tissue. DNA sequencing of the selected clones revealed that they included 10 genes that had previously been established as prostate markers: NKX3.1, KLK2, KLK3 (PSA), FOLH1 (PSMA), STEAP2, PSGR, PRAC, RDH11, Prostein and FASN. Following analysis of the expression patterns of all selected and sequenced genes through interrogation of SAGE databases, a further three genes from our clone set, HOXB13, SPON2 and NCAM2, emerged as additional candidate markers of human prostate cancer. Quantitative RT-PCR demonstrated the specificity of expression of HOXB13 in prostate tissue and revealed its ubiquitous expression in a series of 37 primary prostate cancers and 20 normal prostates. These results demonstrate the utility of this expression-microarray approach in hunting for new markers of individual human cancer types.

  9. H-DROP: an SVM based helical domain linker predictor trained with features optimized by combining random forest and stepwise selection

    NASA Astrophysics Data System (ADS)

    Ebina, Teppei; Suzuki, Ryosuke; Tsuji, Ryotaro; Kuroda, Yutaka

    2014-08-01

    Domain linker prediction is attracting much interest as it can help identify novel domains suitable for high throughput proteomics analysis. Here, we report H-DROP, an SVM-based Helical Domain linker pRediction using OPtimal features. H-DROP is, to the best of our knowledge, the first predictor for specifically and effectively identifying helical linkers. This was made possible first because a large training dataset became available from IS-Dom, and second because we selected a small number of optimal features from a huge number of potential ones. The training helical linker dataset, which included 261 helical linkers, was constructed by detecting helical residues at the boundary regions of two independent structural domains listed in our previously reported IS-Dom dataset. 45 optimal feature candidates were selected from 3,000 features by random forest, which were further reduced to 26 optimal features by stepwise selection. The prediction sensitivity and precision of H-DROP were 35.2% and 38.8%, respectively. These values were over 10.7% higher than those of control methods, including our previously developed DROP, which is a coil linker predictor, and PPRODO, which is trained with un-differentiated domain boundary sequences. Overall, these results indicated that helical linkers can be predicted from sequence information alone by using a strictly curated training dataset for helical linkers and a carefully selected set of optimal features. H-DROP is available at http://domserv.lab.tuat.ac.jp.
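
    The second-stage reduction from 45 candidates to 26 optimal features can be sketched as a greedy forward stepwise search. This is a generic illustration, not the authors' code: the candidate list is assumed to come from the random-forest prescreen, and the subset-scoring function (e.g. cross-validated SVM accuracy) is supplied by the caller.

```python
def forward_stepwise(candidates, score, max_features=None):
    """Greedy forward stepwise selection: repeatedly add the candidate
    feature that most improves the subset score, stopping when no
    remaining feature yields an improvement."""
    selected = []
    best = score(selected)
    while max_features is None or len(selected) < max_features:
        remaining = [f for f in candidates if f not in selected]
        if not remaining:
            break
        # Score every one-feature extension of the current subset
        gains = [(score(selected + [f]), f) for f in remaining]
        top_score, top_f = max(gains)
        if top_score <= best:
            break  # no candidate improves the score: stop
        selected.append(top_f)
        best = top_score
    return selected, best


if __name__ == "__main__":
    # Toy scorer (hypothetical): only 'f1' and 'f2' are informative,
    # and each uninformative feature costs a small penalty.
    informative = {"f1", "f2"}

    def toy_score(subset):
        hits = sum(1 for f in subset if f in informative)
        noise = sum(1 for f in subset if f not in informative)
        return hits - 0.1 * noise

    sel, best = forward_stepwise(["f1", "f2", "f3", "f4"], toy_score)
    print(sel, best)  # selects only the two informative features
```

    With a real scorer, the same loop applies: the search halts once adding any further feature no longer raises cross-validated performance.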

  10. Effects of choice architecture and chef-enhanced meals on the selection and consumption of healthier school foods: a randomized clinical trial.

    PubMed

    Cohen, Juliana F W; Richardson, Scott A; Cluggish, Sarah A; Parker, Ellen; Catalano, Paul J; Rimm, Eric B

    2015-05-01

    Little is known about the long-term effect of a chef-enhanced menu on healthier food selection and consumption in school lunchrooms. In addition, it remains unclear if extended exposure to other strategies to promote healthier foods (eg, choice architecture) also improves food selection or consumption. To evaluate the short- and long-term effects of chef-enhanced meals and extended exposure to choice architecture on healthier school food selection and consumption. A school-based randomized clinical trial was conducted during the 2011-2012 school year among 14 elementary and middle schools in 2 urban, low-income school districts (intent-to-treat analysis). Included in the study were 2638 students in grades 3 through 8 attending participating schools (38.4% of eligible participants). Schools were first randomized to receive a professional chef to improve school meal palatability (chef schools) or to a delayed intervention (control group). To assess the effect of choice architecture (smart café), all schools after 3 months were then randomized to the smart café intervention or to the control group. School food selection was recorded, and consumption was measured using plate waste methods. After 3 months, vegetable selection increased in chef vs control schools (odds ratio [OR], 1.75; 95% CI, 1.36-2.24), but there was no effect on the selection of other components or on meal consumption. After long-term or extended exposure to the chef or smart café intervention, fruit selection increased in the chef (OR, 3.08; 95% CI, 2.23-4.25), smart café (OR, 1.45; 95% CI, 1.13-1.87), and chef plus smart café (OR, 3.10; 95% CI, 2.26-4.25) schools compared with the control schools, and consumption increased in the chef schools (OR, 0.17; 95% CI, 0.03-0.30 cups/d). Vegetable selection increased in the chef (OR, 2.54; 95% CI, 1.83-3.54), smart café (OR, 1.91; 95% CI, 1.46-2.50), and chef plus smart café schools (OR, 7.38, 95% CI, 5.26-10.35) compared with the control schools

  11. Age-related Cataract in a Randomized Trial of Selenium and Vitamin E in Men: The SELECT Eye Endpoints (SEE) Study

    PubMed Central

    Christen, William G.; Glynn, Robert J.; Gaziano, J. Michael; Darke, Amy K.; Crowley, John J.; Goodman, Phyllis J.; Lippman, Scott M.; Lad, Thomas E.; Bearden, James D.; Goodman, Gary E.; Minasian, Lori M.; Thompson, Ian M.; Blanke, Charles D.; Klein, Eric A.

    2014-01-01

    Importance Observational studies suggest a role for dietary nutrients such as vitamin E and selenium in cataract prevention. However, the results of randomized trials of vitamin E supplements and cataract have been disappointing, and are not yet available for selenium. Objective To test whether long-term supplementation with selenium and vitamin E affects the incidence of cataract in a large cohort of men. Design, Setting, and Participants The SELECT Eye Endpoints (SEE) study was an ancillary study of the SWOG-coordinated Selenium and Vitamin E Cancer Prevention Trial (SELECT), a randomized, placebo-controlled, four arm trial of selenium and vitamin E conducted among 35,533 men aged 50 years and older for African Americans and 55 and older for all other men, at 427 participating sites in the US, Canada, and Puerto Rico. A total of 11,267 SELECT participants from 128 SELECT sites participated in the SEE ancillary study. Intervention Individual supplements of selenium (200 µg/d from L-selenomethionine) and vitamin E (400 IU/d of all rac-α-tocopheryl acetate). Main Outcome Measures Incident cataract, defined as a lens opacity, age-related in origin, responsible for a reduction in best-corrected visual acuity to 20/30 or worse based on self-report confirmed by medical record review, and cataract extraction, defined as the surgical removal of an incident cataract. Results During a mean (SD) of 5.6 (1.2) years of treatment and follow-up, 389 cases of cataract were documented. There were 185 cataracts in the selenium group and 204 in the no selenium group (hazard ratio [HR], 0.91; 95 percent confidence interval [CI], 0.75 to 1.11; P=.37). For vitamin E, there were 197 cases in the treated group and 192 in the placebo group (HR, 1.02; CI, 0.84 to 1.25; P=.81). Similar results were observed for cataract extraction. Conclusions and Relevance These randomized trial data from a large cohort of apparently healthy men indicate that long-term daily supplementation with selenium

  12. Selective press extinctions, but not random pulse extinctions, cause delayed ecological recovery in communities of digital organisms.

    PubMed

    Yedid, Gabriel; Ofria, Charles A; Lenski, Richard E

    2009-04-01

    A key issue concerning recovery from mass extinctions is how extinction and diversification mechanisms affect the recovery process. We evolved communities of digital organisms, subjecting them to instantaneous "pulse" extinctions, choosing survivors at random, or to prolonged "press" extinctions involving a period of low resource availability. Functional activity at low trophic levels recovered faster than at higher levels, with the most extensive delays seen at the top level. Postpress communities generally did not fully recover functional activity in the allotted time, which equaled that of their original diversification. We measured recovery of phenotypic diversity, observing considerable variation in outcomes. Communities subjected to pulse extinctions recovered functional activity and phenotypic diversity substantially faster than when subjected to press extinctions. Follow-up experiments tested whether organisms with shorter generation times and low functional activity contributed to delayed recovery after press extinctions. The results indicate that adaptation during the press episode degraded the organisms' ability to re-evolve preextinction functionality. There are interesting parallels with patterns from the paleontological record. We suggest that some delayed recoveries from mass extinction may reflect the need to both re-evolve biological functions and reconstruct ecological interactions lost during the extinction. Adaptation to conditions during an extended disturbance may hinder subsequent recovery.

  13. Bendamustine, thalidomide and dexamethasone combination therapy for relapsed/refractory myeloma patients: results of the MUKone randomized dose selection trial.

    PubMed

    Schey, Steve; Brown, Sarah R; Tillotson, Avie-Lee; Yong, Kwee; Williams, Cathy; Davies, Faith; Morgan, Gareth; Cavenagh, Jamie; Cook, Gordon; Cook, Mark; Orti, Guillermo; Morris, Curly; Sherratt, Debbie; Flanagan, Louise; Gregory, Walter; Cavet, James

    2015-08-01

    There is a significant unmet need in effective therapy for relapsed myeloma patients once they become refractory to bortezomib and lenalidomide. While data from the front line setting suggest bendamustine is superior to melphalan, there is no information defining optimal bendamustine dose in multiply-treated patients. We report a multi-centre randomized two-stage phase 2 trial simultaneously assessing deliverability and activity of two doses of bendamustine (60 mg/m2 vs. 100 mg/m2) days 1 and 8, thalidomide (100 mg) days 1-21 and low dose dexamethasone (20 mg) days 1, 8, 15 and 22 of a 28-d cycle. Ninety-four relapsing patients were treated on trial, with a median three prior treatment lines. A pre-planned interim deliverability and activity assessment led to closure of the 100 mg/m2 arm due to excess cytopenias, and led to amendment of entry criteria for cytopenias. Non-haematological toxicities including thromboembolism and neurotoxicity were infrequent. In the 60 mg/m2 arm, treatment was deliverable in 61.1% subjects and the partial response rate was 46.3% in the study eligible population, with 7.5 months progression-free survival. This study demonstrates bendamustine at 60 mg/m2 twice per month with thalidomide and dexamethasone is deliverable for repeated cycles in heavily pre-treated myeloma patients and has substantial clinical activity.

  14. Recruitment strategies should not be randomly selected: empirically improving recruitment success and diversity in developmental psychology research

    PubMed Central

    Sugden, Nicole A.; Moulson, Margaret C.

    2015-01-01

    Psychological and developmental research have been critiqued for the lack of diversity of research samples. Because differences in culture, race, and ethnicity can influence participant behavior, limited diversity limits the generalizability of the findings. These differences may also impact how participants behave in response to recruitment attempts, which suggests that recruitment itself may be leveraged to increase sample diversity. The goal of the current study was to determine what factors, within a recruitment interaction, could be leveraged to increase success and diversity when recruiting families with children for developmental research. Study 1 found three factors influenced success: (1) recruitment was more successful when other potential participants were also interested (i.e., recruiters were busy), (2) recruiters of particular races were more successful than recruiters of other races, and (3) differences in success were related to what the recruiter said to engage the potential participant (i.e., the script). The latter two factors interacted, suggesting some recruiters were using less optimal scripts. To improve success rates, study 2 randomly assigned scripts to recruiters and encouraged them to recruit more vigorously during busy periods. Study 2 found that two factors influenced success: (1) some scripts were more successful than others and (2) we were more successful at recruiting non-White potential participants than White participants. These two interacted, with some scripts being more successful with White and other scripts being more successful with non-White families. This intervention significantly increased recruitment success rate by 8.1% and the overall number of families recruited by 15.3%. These findings reveal that empirically evaluating and tailoring recruitment efforts based on the most successful strategies is effective in boosting diversity through increased participation of children from non-White families. PMID:25972829

  15. Recruitment strategies should not be randomly selected: empirically improving recruitment success and diversity in developmental psychology research.

    PubMed

    Sugden, Nicole A; Moulson, Margaret C

    2015-01-01

    Psychological and developmental research have been critiqued for the lack of diversity of research samples. Because differences in culture, race, and ethnicity can influence participant behavior, limited diversity limits the generalizability of the findings. These differences may also impact how participants behave in response to recruitment attempts, which suggests that recruitment itself may be leveraged to increase sample diversity. The goal of the current study was to determine what factors, within a recruitment interaction, could be leveraged to increase success and diversity when recruiting families with children for developmental research. Study 1 found three factors influenced success: (1) recruitment was more successful when other potential participants were also interested (i.e., recruiters were busy), (2) recruiters of particular races were more successful than recruiters of other races, and (3) differences in success were related to what the recruiter said to engage the potential participant (i.e., the script). The latter two factors interacted, suggesting some recruiters were using less optimal scripts. To improve success rates, study 2 randomly assigned scripts to recruiters and encouraged them to recruit more vigorously during busy periods. Study 2 found that two factors influenced success: (1) some scripts were more successful than others and (2) we were more successful at recruiting non-White potential participants than White participants. These two interacted, with some scripts being more successful with White and other scripts being more successful with non-White families. This intervention significantly increased recruitment success rate by 8.1% and the overall number of families recruited by 15.3%. These findings reveal that empirically evaluating and tailoring recruitment efforts based on the most successful strategies is effective in boosting diversity through increased participation of children from non-White families.

  16. Randomization and In Vivo Selection Reveal a GGRG Motif Essential for Packaging Human Immunodeficiency Virus Type 2 RNA

    PubMed Central

    Baig, Tayyba T.; Lanchy, Jean-Marc; Lodmell, J. Stephen

    2009-01-01

    The packaging signal (ψ) of human immunodeficiency virus type 2 (HIV-2) is present in the 5′ noncoding region of RNA and contains a 10-nucleotide palindrome (pal; 5′-392-GGAGUGCUCC) located upstream of the dimerization signal stem-loop 1 (SL1). pal has been shown to be functionally important in vitro and in vivo. We previously showed that the 3′ side of pal (GCUCC-3′) is involved in base-pairing interactions with a sequence downstream of SL1 to make an extended SL1, which is important for replication in vivo and the regulation of dimerization in vitro. However, the role of the 5′ side of pal (5′-GGAGU) was less clear. Here, we characterized this role using an in vivo SELEX approach. We produced a population of HIV-2 DNA genomes with random sequences within the 5′ side of pal and transfected these into COS-7 cells. Viruses from COS-7 cells were used to infect C8166 permissive cells. After several weeks of serial passage in C8166 cells, surviving viruses were sequenced. On the 5′ side of pal there was a striking convergence toward a GGRGN consensus sequence. Individual clones with consensus and nonconsensus sequences were tested in infectivity and packaging assays. Analysis of individuals that diverged from the consensus sequence showed normal viral RNA and protein synthesis but had replication defects and impaired RNA packaging. These findings clearly indicate that the GGRG motif is essential for viral replication and genomic RNA packaging. PMID:18971263
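
    The convergence toward a GGRGN consensus can be illustrated with a minimal per-position consensus sketch using IUPAC degenerate codes (R = A or G). The sequences below are toy examples, not the study's actual clone sequences.

```python
from collections import Counter

# Minimal IUPAC nucleotide code table for the ambiguities this toy needs
IUPAC = {
    frozenset("A"): "A", frozenset("C"): "C",
    frozenset("G"): "G", frozenset("U"): "U",
    frozenset("AG"): "R",   # purine
    frozenset("CU"): "Y",   # pyrimidine
}


def consensus(seqs, min_freq=0.25):
    """Per-position consensus of equal-length sequences: bases appearing
    in at least min_freq of the sequences are collapsed into a single
    IUPAC symbol; unlisted combinations fall back to 'N'."""
    out = []
    for col in zip(*seqs):
        counts = Counter(col)
        kept = frozenset(b for b, n in counts.items() if n / len(col) >= min_freq)
        out.append(IUPAC.get(kept, "N"))
    return "".join(out)


if __name__ == "__main__":
    # Hypothetical surviving-clone sequences over the 5' side of pal
    clones = ["GGAGU", "GGGGC", "GGAGA", "GGGGU"]
    print(consensus(clones))  # -> GGRGN
```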

  17. Sex-specific fitness returns are too weak to select for non-random patterns of sex allocation in a viviparous snake.

    PubMed

    Baron, Jean-Pierre; Tully, Thomas; Le Galliard, Jean-François

    2010-10-01

    When environmental conditions exert sex-specific selection on offspring, mothers should benefit from biasing their sex allocation towards the sex with the highest fitness in a given environment. Yet, studies show mixed support for such adaptive strategies in vertebrates, which may be due to mechanistic constraints and/or weak selection on facultative sex allocation. In an attempt to disentangle these alternatives, we quantified sex-specific fitness returns and sex allocation (sex ratio and sex-specific mass at birth) according to maternal factors (body size, age, birth date, and litter size), habitat, and year in a viviparous snake with genotypic sex determination. We used data on 106 litters from 19 years of field survey in two nearby habitats occupied by the meadow viper Vipera ursinii ursinii in south-eastern France. Maternal reproductive investment and habitat quality had no differential effects on the growth and survival of sons and daughters. Sex ratio at birth was balanced despite a slight female-biased mortality before birth. No sexual mass dimorphism between offspring was evident. Sex allocation was almost random apart from a trend towards more male-biased litters as females grew older, which could be explained by an inbreeding avoidance strategy. Thus, a weak selection for facultative sex allocation seems sufficient to explain the almost equal sex allocation in the meadow viper.

  18. A Compact Half Select Disturb Free Static Random Access Memory Cell with Stacked Vertical Metal-Oxide-Semiconductor Field-Effect Transistor

    NASA Astrophysics Data System (ADS)

    Na, Hyoungjun; Endoh, Tetsuo

    2012-02-01

    In this paper, a half select disturb free compact static random access memory (SRAM) cell with a stacked vertical metal-oxide-semiconductor field-effect transistor (MOSFET) is proposed, and the impacts on its cell size, stability and speed performance are evaluated. The proposed SRAM cell has a small cell size, which is 67% of that of the conventional eight-transistor (8T) SRAM cell, because of its stacked vertical MOSFET structure. It realizes half select disturb free SRAM operation; therefore, a static noise margin 5.9 times larger than that of the conventional 8T SRAM cell is achieved. It suppresses the degradation of the write margin, which remains 84.2% of that of the conventional 8T SRAM cell. Furthermore, it suppresses the degradation of the write time by 39% (0.249 ns). The proposed compact SRAM cell with the stacked vertical MOSFET is thus a suitable SRAM cell offering a small cell size, immunity to the half select disturb, a wide write margin and a fast write time.

  19. Impact of random and systematic recall errors and selection bias in case-control studies on mobile phone use and brain tumors in adolescents (CEFALO study).

    PubMed

    Aydin, Denis; Feychting, Maria; Schüz, Joachim; Andersen, Tina Veje; Poulsen, Aslak Harbo; Prochazka, Michaela; Klaeboe, Lars; Kuehni, Claudia E; Tynes, Tore; Röösli, Martin

    2011-07-01

    Whether the use of mobile phones is a risk factor for brain tumors in adolescents is currently being studied. Case-control studies investigating this possible relationship are prone to recall error and selection bias. We assessed the potential impact of random and systematic recall error and selection bias on odds ratios (ORs) by performing simulations based on real data from an ongoing case-control study of mobile phones and brain tumor risk in children and adolescents (CEFALO study). Simulations were conducted for two mobile phone exposure categories: regular and heavy use. Our choice of levels of recall error was guided by a validation study that compared objective network operator data with the self-reported amount of mobile phone use in CEFALO. In our validation study, cases overestimated their number of calls by 9% on average and controls by 34%. Cases also overestimated their duration of calls by 52% on average and controls by 163%. The participation rates in CEFALO were 83% for cases and 71% for controls. In a variety of scenarios, the combined impact of recall error and selection bias on the estimated ORs was complex. These simulations are useful for the interpretation of previous case-control studies on brain tumor and mobile phone use in adults as well as for the interpretation of future studies on adolescents.
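
    The kind of exposure misclassification such simulations probe can be illustrated with a minimal 2x2-table sketch. This is a generic textbook construction, not the study's simulation code; the table counts, sensitivities, and specificities below are hypothetical.

```python
def odds_ratio(a, b, c, d):
    """OR from a 2x2 table: a=exposed cases, b=unexposed cases,
    c=exposed controls, d=unexposed controls."""
    return (a * d) / (b * c)


def biased_table(a, b, c, d, sens_cases, sens_controls, spec_cases, spec_controls):
    """Apply exposure misclassification to expected cell counts.
    sens = P(reported exposed | truly exposed),
    spec = P(reported unexposed | truly unexposed),
    allowed to differ between cases and controls (differential recall)."""
    a2 = a * sens_cases + b * (1 - spec_cases)
    b2 = a * (1 - sens_cases) + b * spec_cases
    c2 = c * sens_controls + d * (1 - spec_controls)
    d2 = c * (1 - sens_controls) + d * spec_controls
    return a2, b2, c2, d2


if __name__ == "__main__":
    # Hypothetical true table with OR = 2.25
    true_or = odds_ratio(60, 40, 40, 60)
    # Non-differential error (same sens/spec in both groups)
    obs = biased_table(60, 40, 40, 60, 0.9, 0.9, 0.9, 0.9)
    print(true_or, odds_ratio(*obs))  # observed OR is pulled toward 1
```

    Making the error differential (e.g. controls over-reporting more than cases, as in the CEFALO validation data) can bias the OR in either direction, which is why the abstract describes the combined impact as complex.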

  20. Treatment Selection Choices Should Not Be Based on Benefits or Costs Alone: A Head-to-Head Randomized Controlled Trial of Antiviral Drugs for Hepatitis C

    PubMed Central

    Davitkov, Perica; Chandar, Apoorva Krishna; Hirsch, Amy; Compan, Anita; Silveira, Marina G.; Anthony, Donald D.; Smith, Suzanne; Gideon, Clare; Bonomo, Robert A.; Falck-Ytter, Yngve

    2016-01-01

    , pragmatic randomized controlled trials are necessary for guidance beyond just acquisition costs and to make evidence-based formulary selections when multiple effective treatments are available. (Clinicaltrials.gov registration: NCT02113631). PMID:27741230

  1. Efficacy and acceptability of selective serotonin reuptake inhibitors for the treatment of depression in Parkinson's disease: a systematic review and meta-analysis of randomized controlled trials

    PubMed Central

    2010-01-01

    Background Selective serotonin reuptake inhibitors (SSRIs) are the most commonly prescribed antidepressants for the treatment of depression in patients with Parkinson's Disease (PD), but data on their efficacy are controversial. Methods We conducted a systematic review and meta-analysis of randomized controlled trials to investigate the efficacy and acceptability of SSRIs in the treatment of depression in PD. Results Ten studies were included. In the comparison between SSRIs and placebo (n = 6 studies), the combined risk ratio (random effects) was 1.08 (95% confidence interval: 0.77 - 1.55, p = 0.67). In the comparison between SSRIs and tricyclic antidepressants (TCAs) (n = 3 studies) the combined risk ratio was 0.75 (0.39 - 1.42, p = 0.37). An acceptability analysis showed that SSRIs were generally well tolerated. Conclusions These results suggest that there is insufficient evidence to reject the null hypothesis of no differences in efficacy between SSRIs and placebo in the treatment of depression in PD. Due to the limited number of studies and the small sample sizes, a type II error (false negative) cannot be excluded. The comparison between SSRIs and TCAs is based on only three studies, and further trials with a more pragmatic design are needed. PMID:20565960
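
    The "combined risk ratio (random effects)" figures above come from standard random-effects pooling. A minimal sketch of the DerSimonian-Laird estimator commonly used for such meta-analyses (a generic illustration with hypothetical inputs, not the review's own computation):

```python
import math


def dersimonian_laird(log_rrs, variances):
    """Pool per-study log risk ratios with the DerSimonian-Laird
    random-effects estimator; returns the pooled RR and its 95% CI."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    # Cochran's Q heterogeneity statistic and between-study variance tau^2
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))
    df = len(log_rrs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight with tau^2 added to each study's variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se),
                              math.exp(pooled + 1.96 * se))


if __name__ == "__main__":
    # Hypothetical log risk ratios and variances for three studies
    rr, ci = dersimonian_laird([0.1, 0.3, -0.05], [0.04, 0.09, 0.06])
    print(rr, ci)
```

    When the studies are homogeneous (Q below its degrees of freedom), tau^2 collapses to zero and the estimate reduces to the fixed-effect result.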

  2. A Comparison of the Effects of Random and Selective Mass Extinctions on Erosion of Evolutionary History in Communities of Digital Organisms

    PubMed Central

    Yedid, Gabriel; Stredwick, Jason; Ofria, Charles A.; Agapow, Paul-Michael

    2012-01-01

    The effect of mass extinctions on phylogenetic diversity and branching history of clades remains poorly understood in paleobiology. We examined the phylogenies of communities of digital organisms undergoing open-ended evolution as we subjected them to instantaneous “pulse” extinctions, choosing survivors at random, and to prolonged “press” extinctions involving a period of low resource availability. We measured age of the phylogenetic root and tree stemminess, and evaluated how branching history of the phylogenetic trees was affected by the extinction treatments. We found that strong random (pulse) and strong selective extinction (press) both left clear long-term signatures in root age distribution and tree stemminess, and eroded deep branching history to a greater degree than did weak extinction and control treatments. The widely-used Pybus-Harvey gamma statistic showed a clear short-term response to extinction and recovery, but differences between treatments diminished over time and did not show a long-term signature. The characteristics of post-extinction phylogenies were often affected as much by the recovery interval as by the extinction episode itself. PMID:22693570

  3. A theory for the origin of a self-replicating chemical system. I - Natural selection of the autogen from short, random oligomers

    NASA Technical Reports Server (NTRS)

    White, D. H.

    1980-01-01

    A general theory is presented for the origin of a self-replicating chemical system, termed an autogen, which is capable of both crude replication and translation (protein synthesis). The theory requires the availability of free energy and monomers to the system, a significant background low-yield synthesis of kinetically stable oligopeptides and oligonucleotides, the localization of the oligomers, crude oligonucleotide selectivity of amino acids during oligopeptide synthesis, crude oligonucleotide replication, and two short peptide families which catalyze replication and translation, to produce a localized group of at least one copy each of two protogenes and two protoenzymes. The model posits a process of random oligomerization, followed by the random nucleation of functional components and the rapid autocatalytic growth of the functioning autogen to macroscopic amounts, to account for the origin of the first self-replicating system. Such a process contains steps of such high probability and short time periods that it is suggested that the emergence of an autogen in a laboratory experiment of reasonable time scale may be possible.

  4. Customer-oriented counseling for physical activity in older people: study protocol and selected baseline results of a randomized-controlled trial (ISRCTN 07330512).

    PubMed

    Leinonen, R; Heikkinen, E; Hirvensalo, M; Lintunen, T; Rasinaho, M; Sakari-Rantala, R; Kallinen, M; Koski, J; Möttönen, S; Kannas, S; Huovinen, P; Rantanen, T

    2007-04-01

    The objective of this study is to describe the rationale, design and selected baseline results of a 2-year randomized-controlled trial (RCT) on the effects of physical activity counseling in community-living older people. After a four-phase screening and data-collection process targeting all independently living people in the city center of Jyväskylä, Finland, six hundred and thirty-two 75-81-year-old cognitively intact, sedentary persons who were able to move independently outdoors at least minimally and willing to take part in the RCT were randomized into intervention and control groups. At baseline, over half of the subjects exercised less than two to three times a month and two-thirds were willing to increase their physical activity level. The desire to increase physical activity was more common (86%) among subjects with mobility limitation compared with those without (60%, P=0.004). The intervention group received an individualized face-to-face counseling session, followed by phone contacts every 3 months throughout the intervention. The study outcomes include physical activity level, mobility limitation, functional impairments, disability, mood, quality of life, use of services, institutionalization and mortality. The screening and recruitment process was feasible and succeeded well, and showed that unmet physical activity needs are common in older people.

  5. Deviations from compositional randomness in eukaryotic and prokaryotic proteins: the hypothesis of selective-stochastic stability and a principle of charge conservation.

    PubMed

    Holmquist, R

    1975-03-24

    Eight proteins of diverse lengths, functions, and origins are examined for compositional non-randomness amino acid by amino acid. The proteins investigated are human fibrinopeptide A, guinea pig insulin, rattlesnake cytochrome c, MS2 phage coat protein, rabbit triosephosphate isomerase, bovine pancreatic deoxyribonuclease A, bovine glutamate dehydrogenase, and Bacillus thermoproteolyticus thermolysin. As a result of this study the experimentally testable hypothesis is put forth that, for a large class of proteins, the ratio of that fraction of the molecule which exhibits compositional non-randomness to that fraction which does not is, on the average, stable about a mean value (estimated as 0.32 plus or minus 0.17) and (nearly) independent of protein length. Stochastic and selective evolutionary forces are viewed as interacting rather than independent phenomena. With respect to amino acid composition, this coupling ameliorates the current controversy over Darwinian vs. non-Darwinian evolution, selectionist vs. neutralist, in favor of neither: within the context of the quantitative data, the evolution of real proteins is seen as a compromise between the two viewpoints, both important. The compositional fluctuations of the electrically charged amino acids glutamic and aspartic acid, lysine and arginine, are examined in depth for over eighty protein families, both prokaryotic and eukaryotic. For both taxa, each of the acidic amino acids is present in amounts roughly twice that predicted from the genetic code. The presence of an excess of glutamic acid is independent of the presence of an excess of aspartic acid and vice versa.

  6. A comparison of the effects of random and selective mass extinctions on erosion of evolutionary history in communities of digital organisms.

    PubMed

    Yedid, Gabriel; Stredwick, Jason; Ofria, Charles A; Agapow, Paul-Michael

    2012-01-01

    The effect of mass extinctions on phylogenetic diversity and branching history of clades remains poorly understood in paleobiology. We examined the phylogenies of communities of digital organisms undergoing open-ended evolution as we subjected them to instantaneous "pulse" extinctions, choosing survivors at random, and to prolonged "press" extinctions involving a period of low resource availability. We measured age of the phylogenetic root and tree stemminess, and evaluated how branching history of the phylogenetic trees was affected by the extinction treatments. We found that strong random (pulse) and strong selective extinction (press) both left clear long-term signatures in root age distribution and tree stemminess, and eroded deep branching history to a greater degree than did weak extinction and control treatments. The widely-used Pybus-Harvey gamma statistic showed a clear short-term response to extinction and recovery, but differences between treatments diminished over time and did not show a long-term signature. The characteristics of post-extinction phylogenies were often affected as much by the recovery interval as by the extinction episode itself.

  7. Genetic evaluation and selection response for growth in meat-type quail through random regression models using B-spline functions and Legendre polynomials.

    PubMed

    Mota, L F M; Martins, P G M A; Littiere, T O; Abreu, L R A; Silva, M A; Bonafé, C M

    2017-08-14

    The objective was to estimate (co)variance functions using random regression models (RRM) with Legendre polynomials or B-spline functions, and multi-trait models, aimed at evaluating genetic parameters of growth traits in meat-type quail. A database containing the complete pedigree information of 7000 meat-type quail was used. The models included the fixed effects of contemporary group and generation. Direct additive genetic and permanent environmental effects, considered random, were modeled either with B-spline functions, using quadratic or cubic polynomials within each of 2 to 4 individual segments, or with Legendre polynomials of age, with orders of fit ranging from 2 to 4. Residual variances were grouped into four age classes. The model with quadratic B-spline adjustment, using four segments for direct additive genetic and permanent environmental effects, was the most appropriate and parsimonious to describe the covariance structure of the data. The RRM using Legendre polynomials underestimated the residual variance. Lower heritability estimates were observed for multi-trait models than for RRM at the evaluated ages. In general, the genetic correlations between measures of body weight (BW) from hatching to 35 days of age decreased as the range between the evaluated ages increased. The genetic trend for BW was positive and significant across the selection generations, and the genetic response to selection for BW at the evaluated ages was greater for RRM than for multi-trait models. In summary, RRM using B-spline functions with four segments and four residual variance classes provided the best fit for genetic evaluation of growth traits in meat-type quail; RRM should thus be considered in the genetic evaluation of breeding programs.
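
    The Legendre-polynomial part of such an RRM amounts to evaluating an orthogonal polynomial basis at ages standardized to the interval [-1, 1]. A hypothetical numpy sketch of how the age covariate matrix is typically built (illustrative only; the function name is my own, and this is not the authors' software):

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_covariates(ages, order):
    """Design matrix of Legendre polynomials P_0..P_{order-1}
    evaluated at ages rescaled to the interval [-1, 1]."""
    a = np.asarray(ages, dtype=float)
    x = 2.0 * (a - a.min()) / (a.max() - a.min()) - 1.0
    # column j holds P_j(x): coefficient vector [0]*j + [1] selects P_j
    return np.column_stack(
        [legendre.legval(x, [0] * j + [1]) for j in range(order)]
    )

# weekly ages from hatching to 35 days, cubic order of fit (3 coefficients)
phi = legendre_covariates([0, 7, 14, 21, 28, 35], order=3)
print(phi.shape)   # (6, 3)
print(phi[0])      # at x = -1: P_0 = 1, P_1 = -1, P_2 = 1
```

    Each animal's additive genetic and permanent environmental growth curves are then linear combinations of these basis columns with animal-specific random regression coefficients.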

  9. The effects of Nordic Walking training on selected upper-body muscle groups in female-office workers: A randomized trial.

    PubMed

    Kocur, Piotr; Pospieszna, Barbara; Choszczewski, Daniel; Michalowski, Lukasz; Wiernicka, Marzena; Lewandowski, Jacek

    2017-01-01

    Regular Nordic Walking training could improve fitness and reduce tenderness in selected muscle groups in office workers. The aim was to assess the effects of a 12-week Nordic Walking training program on the perceived pain threshold (PPT) and the flexibility of selected upper-body muscle groups in postmenopausal female office workers. Thirty-nine office workers were randomly assigned to the treatment group (NWg, n = 20) or the control group (Cg, n = 19). The NW group completed a 12-week Nordic Walking training program (3 times a week, 1 hour per session). PPT measurements in selected muscles and a functional test of upper-body flexibility (Back Scratch, BS) were carried out twice in every participant: before and after the training program. A significant increase in PPT (kg/cm2) was observed in the NW group only, in the following muscles: upper trapezius (from 1.32 to 1.99 kg/cm2), mid trapezius (from 2.92 to 3.30 kg/cm2), latissimus dorsi (from 1.66 to 2.21 kg/cm2) and infraspinatus (from 1.63 to 2.93 kg/cm2). Moreover, a significant improvement in the BS test was noted in the NW group compared with the control group (from -1.16±5.7 cm to 2.18±5.1 cm in the NW group vs. from -2.52±6.1 cm to -2.92±6.2 cm in the control group). A 12-week Nordic Walking training routine improves shoulder mobility and reduces tenderness in the upper and middle trapezius, infraspinatus and latissimus dorsi in female office workers.

  10. Examination of the transcription factor NtcA-binding motif by in vitro selection of DNA sequences from a random library.

    PubMed

    Jiang, F; Wisén, S; Widersten, M; Bergman, B; Mannervik, B

    2000-08-25

    A recursive in vitro selection among random DNA sequences was used for analysis of the cyanobacterial transcription factor NtcA-binding motifs. An eight-base palindromic sequence, TGTA-(N(8))-TACA, was found to be the optimal NtcA-binding sequence. The more divergent the binding sequences, compared to this consensus sequence, the lower the NtcA affinity. The second and third bases in each four-nucleotide half of the consensus sequence were crucial for NtcA binding, and they were in general highly conserved. The most frequently occurring sequence in the middle weakly conserved region was similar to that of the NtcA-binding motif of the Anabaena sp. strain PCC 7120 glnA gene, previously known to have high affinity for NtcA. This indicates that the middle sequences were selected for high NtcA affinity. Analysis of natural NtcA-binding motifs showed that these could be classified into two groups based on differences in recognition consensus sequences. It is suggested that NtcA naturally recognizes different DNA-binding motifs, or has differential affinities to these sequences under different physiological conditions.
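
    As an illustration of what divergence from the consensus means operationally, the TGTA-(N(8))-TACA motif can be screened for by counting mismatches in the two conserved half-sites while ignoring the weakly conserved 8-base spacer. A toy sketch of my own, not the paper's selection procedure:

```python
def scan_ntca_motif(seq, max_mismatch=1):
    """Find candidate NtcA-binding sites matching TGTA-(N8)-TACA,
    counting mismatches only in the two conserved half-sites."""
    seq = seq.upper()
    hits = []
    for i in range(len(seq) - 15):            # motif length = 4 + 8 + 4
        left, right = seq[i:i + 4], seq[i + 12:i + 16]
        mm = sum(a != b for a, b in zip(left, "TGTA"))
        mm += sum(a != b for a, b in zip(right, "TACA"))
        if mm <= max_mismatch:
            hits.append((i, mm))              # (position, mismatch count)
    return hits

print(scan_ntca_motif("CCTGTAGGATCGATTACACC"))   # exact site at index 2
```

    Raising `max_mismatch` admits the more divergent sequences that, per the abstract, bind NtcA with lower affinity.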

  11. A randomized controlled trial investigating the use of a predictive nomogram for the selection of the FSH starting dose in IVF/ICSI cycles.

    PubMed

    Allegra, Adolfo; Marino, Angelo; Volpes, Aldo; Coffaro, Francesco; Scaglione, Piero; Gullo, Salvatore; La Marca, Antonio

    2017-01-23

    The number of oocytes retrieved is a relevant intermediate outcome in women undergoing IVF/intracytoplasmic sperm injection (ICSI). This trial compared the efficiency of the selection of the FSH starting dose according to a nomogram based on multiple biomarkers (age, day 3 FSH, anti-Müllerian hormone) versus an age-based strategy. The primary outcome measure was the proportion of women with an optimal number of retrieved oocytes defined as 8-14. At their first IVF/ICSI cycle, 191 patients underwent a long gonadotrophin-releasing hormone agonist protocol and were randomized to receive a starting dose of recombinant (human) FSH, based on their age (150 IU if ≤35 years, 225 IU if >35 years) or based on the nomogram. Optimal response was observed in 58/92 patients (63%) in the nomogram group and in 42/99 (42%) in the control group (+21%, 95% CI = 0.07 to 0.35, P = 0.0037). No significant differences were found in the clinical pregnancy rate or the number of embryos cryopreserved per patient. The study showed that the FSH starting dose selected according to ovarian reserve is associated with an increase in the proportion of patients with an optimal response: large trials are recommended to investigate any possible effect on the live-birth rate.
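
    The reported effect size (+21%, 95% CI 0.07 to 0.35) can be approximately reproduced from the raw counts with a Wald interval for the difference of two proportions; the trial's exact interval method may differ slightly. A quick check:

```python
import math

def diff_proportions_ci(x1, n1, x2, n2, z=1.96):
    """Point estimate and Wald 95% CI for p1 - p2."""
    p1, p2 = x1 / n1, x2 / n2
    d = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, d - z * se, d + z * se

# optimal response: 58/92 (nomogram group) vs 42/99 (age-based group)
d, lo, hi = diff_proportions_ci(58, 92, 42, 99)
print(f"diff = {d:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```

    This Wald interval comes out at roughly 0.07 to 0.34; the small discrepancy in the upper bound versus the published 0.35 is consistent with the trial having used a slightly different interval method.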

  12. Routine invasive versus selective invasive strategies for Non-ST-elevation acute coronary syndromes: An Updated meta-analysis of randomized trials.

    PubMed

    Elgendy, Islam Y; Kumbhani, Dharam J; Mahmoud, Ahmed N; Wen, Xuerong; Bhatt, Deepak L; Bavry, Anthony A

    2016-11-01

    To perform an updated systematic review comparing a routine invasive strategy with a selective invasive strategy for patients with non-ST-elevation acute coronary syndromes (NSTE-ACS) in the era of stents and antiplatelet therapy. Recent meta-analyses comparing both strategies have shown conflicting results. Electronic databases were searched for randomized trials that compared a routine invasive strategy (i.e., routine coronary angiography +/- revascularization) versus a selective invasive strategy (i.e., medical stabilization and coronary angiography +/- revascularization if objective evidence of ischemia or refractory ischemia) in patients with NSTE-ACS. Summary odds ratios (OR) were primarily constructed using Peto's model. Twelve trials with 9,650 patients were included. Compared with a selective invasive strategy, a routine invasive strategy was associated with a reduction in the composite of all-cause mortality or myocardial infarction (MI) [OR: 0.86, 95% confidence interval (CI) 0.77-0.96] at a mean follow-up of 39 months, primarily due to a reduction in the risk of MI (OR: 0.78, 95% CI: 0.68-0.88). The risk of all-cause mortality was non-significantly reduced with a routine invasive strategy (OR: 0.88, 95% CI: 0.77-1.01). The risk of recurrent angina was reduced with a routine invasive strategy (OR: 0.55, 95% CI: 0.49-0.62), as was the risk of future revascularization procedures (OR: 0.35, 95% CI: 0.30-0.39). In patients with NSTE-ACS, a routine invasive strategy reduced the risk of ischemic events, including the risk of mortality or MI, as well as the risk of recurrent angina and future revascularization procedures. © 2016 Wiley Periodicals, Inc.
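
    Peto's model, used here to pool the summary odds ratios, computes the observed-minus-expected events (O - E) and a hypergeometric variance V for each trial, then estimates the pooled OR as exp(sum(O - E) / sum(V)). A minimal sketch with a made-up 2x2 table, not data from this review:

```python
import math

def peto_components(events_t, n_t, events_c, n_c):
    """Peto one-step components (O - E, V) for a single 2x2 trial table."""
    N = n_t + n_c
    m1 = events_t + events_c                 # total events
    m0 = N - m1                              # total non-events
    E = n_t * m1 / N                         # expected events, treatment arm
    V = n_t * n_c * m1 * m0 / (N ** 2 * (N - 1))   # hypergeometric variance
    return events_t - E, V

def pooled_peto_or(trials):
    """Pool (O - E) and V across trials: OR = exp(sum(O-E) / sum(V))."""
    o_minus_e = sum(peto_components(*t)[0] for t in trials)
    v = sum(peto_components(*t)[1] for t in trials)
    return math.exp(o_minus_e / v)

# hypothetical single trial: 10/100 events on treatment vs 20/100 on control
print(round(pooled_peto_or([(10, 100, 20, 100)]), 3))   # → 0.458
```

    With real data, each included trial contributes one `(events_t, n_t, events_c, n_c)` tuple to the pooled estimate.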

  13. Recognizing and predicting thioether bridges formed by lanthionine and β-methyllanthionine in lantibiotics using a random forest approach with feature selection.

    PubMed

    Wang, ShaoPeng; Zhang, Yu-Hang; Zhang, Ning; Chen, Lei; Huang, Tao; Cai, Yu-Dong

    2017-03-10

    Lantibiotics, usually produced by Gram-positive bacteria, are regarded as a special type of bacteriocin. They contain ring structures formed by lanthionine (Lan) and β-methyllanthionine (MeLan) residues, which are derived from serine and threonine residues and are essential to preventing the growth of other similar strains. In this work, we propose a machine learning method to recognize and predict the Lan and MeLan residues in the protein sequences of lantibiotics. We adopted maximal relevance minimal redundancy (mRMR) and incremental feature selection (IFS) to select optimal features, and random forest (RF) to build classifiers determining the Lan and MeLan residues. A 10-fold cross-validation test was performed to evaluate the classifiers' predictive performance. The Matthews correlation coefficient (MCC) values for predicting the Lan and MeLan residues were 0.813 and 0.769, respectively, showing that the constructed RF classifiers reliably recognize Lan and MeLan residues in lantibiotic sequences. For comparison, three other methods, Dagging, the nearest neighbor algorithm (NNA) and sequential minimal optimization (SMO), were also used to build classifiers predicting Lan and MeLan residues. The optimal features were also analyzed, and their relationships to biological importance are discussed. We believe the selected optimal features and this analysis will contribute to a better understanding of the sequence and structural features around Lan and MeLan residues, and will provide useful information and practical suggestions for experimental and computational exploration of such special residues in lantibiotics.
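
    The mRMR step described above greedily adds, at each round, the feature with the best trade-off between relevance to the label and redundancy with features already selected. A toy numpy sketch of that greedy loop; mRMR is usually defined with mutual information, but this illustration substitutes absolute Pearson correlation, and the RF and IFS stages are omitted:

```python
import numpy as np

def mrmr(X, y, k):
    """Greedy mRMR: pick k feature indices maximizing
    |corr(feature, y)| - mean |corr(feature, already-selected features)|."""
    n_feat = X.shape[1]
    rel = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_feat)])
    selected = [int(np.argmax(rel))]          # start with most relevant
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_feat):
            if j in selected:
                continue
            red = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                           for s in selected])
            score = rel[j] - red              # relevance minus redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

rng = np.random.default_rng(0)
y = rng.normal(size=200)
f0 = y + rng.normal(size=200)      # relevant feature
f1 = f0.copy()                     # redundant duplicate of f0
f2 = rng.normal(size=200)          # irrelevant but non-redundant
X = np.column_stack([f0, f1, f2])
print(mrmr(X, y, 2))               # picks f0 first, then the non-redundant f2
```

    The IFS stage would then train a classifier (RF in the paper) on the top-1, top-2, ... ranked features and keep the prefix with the best cross-validated MCC.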

  14. Excimer laser trabeculotomy vs 180 degrees selective laser trabeculoplasty in primary open-angle glaucoma. A 2-year randomized, controlled trial.

    PubMed

    Babighian, S; Caretti, L; Tavolato, M; Cian, R; Galan, A

    2010-04-01

    To compare the effectiveness and safety of excimer laser trabeculotomy (ELT) ab interno vs. selective laser trabeculoplasty (SLT) over 24 months of follow-up in patients with primary open-angle glaucoma (POAG) refractory to medical therapy. This prospective, randomized study included 30 consecutive eyes assigned randomly to either the ELT or the SLT group. ELT was carried out with a XeCl excimer laser emitting at 308 nm; eight spots were equally distributed, 500 µm apart, over the anterior trabeculum. SLT patients were treated with a frequency-doubled Q-switched neodymium:yttrium-aluminum-garnet laser (wavelength 532 nm); approximately 50 adjacent, but not overlapping, laser spots were distributed over 180 degrees of the trabecular meshwork, using an energy level of 0.7 to 1.0 mJ per pulse. The main outcome measure was intraocular pressure (IOP) lowering after ELT and SLT. Success was defined as a ≥20% reduction in IOP without further glaucoma intervention. At 24 months, complete success rates were 53.3% for the ELT group and 40% for the SLT group (P=0.35, Fisher's exact test); qualified success rates were 33.3% for the ELT and 26.6% for the SLT group (P=0.5, Fisher's exact test). Mean IOP decreased from 25.0±1.9 to 17.6±2.2 mmHg (-29.6%; P<0.0001) in the ELT group and from 23.9±0.9 to 19.1±1.8 mmHg (-21%; P<0.0001) in the SLT group. Both ELT and SLT proved to be effective techniques in the treatment of POAG refractory to medical therapy.

  15. NBI‐98854, a selective monoamine transport inhibitor for the treatment of tardive dyskinesia: A randomized, double‐blind, placebo‐controlled study

    PubMed Central

    Jimenez, Roland; Hauser, Robert A.; Factor, Stewart A.; Burke, Joshua; Mandri, Daniel; Castro‐Gayol, Julio C.

    2015-01-01

    ABSTRACT Background Tardive dyskinesia is a persistent movement disorder induced by chronic neuroleptic exposure. NBI‐98854 is a novel, highly selective, vesicular monoamine transporter 2 inhibitor. We present results of a randomized, 6‐week, double‐blind, placebo‐controlled, dose‐titration study evaluating the safety, tolerability, and efficacy of NBI‐98854 for the treatment of tardive dyskinesia. Methods Male and female adult subjects with moderate or severe tardive dyskinesia were included. NBI‐98854 or placebo was given once per day starting at 25 mg and then escalated by 25 mg to a maximum of 75 mg based on dyskinesia and tolerability assessment. The primary efficacy endpoint was the change in Abnormal Involuntary Movement Scale from baseline at week 6 scored by blinded, central video raters. The secondary endpoint was the Clinical Global Impression of Change—Tardive Dyskinesia score assessed by the blinded investigator. Results Two hundred five potential subjects were screened, and 102 were randomized; 76% of NBI‐98854 subjects and 80% of placebo subjects reached the maximum allowed dose. Abnormal Involuntary Movement Scale scores for NBI‐98854 compared with placebo were significantly reduced (p = 0.0005). Active drug was also superior on the Clinical Global Impression of Change—Tardive Dyskinesia (p < 0.0001). Treatment‐emergent adverse event rates were 49% in the NBI‐98854 and 33% in the placebo subjects. The most common adverse events (active vs. placebo) were fatigue and headache (9.8% vs. 4.1%) and constipation and urinary tract infection (3.9% vs. 6.1%). No clinically relevant changes in safety assessments were noted. Conclusion NBI‐98854 significantly improved tardive dyskinesia and was well tolerated in patients. These results support the phase 3 clinical trials of NBI‐98854 now underway. © 2015 The Authors. Movement Disorders published by Wiley Periodicals, Inc. on behalf of International Parkinson and Movement Disorder

  16. A Randomized, Phase II, Biomarker-Selected Study Comparing Erlotinib to Erlotinib Intercalated With Chemotherapy in First-Line Therapy for Advanced Non–Small-Cell Lung Cancer

    PubMed Central

    Hirsch, Fred R.; Kabbinavar, Fairooz; Eisen, Tim; Martins, Renato; Schnell, Fredrick M.; Dziadziuszko, Rafal; Richardson, Katherine; Richardson, Frank; Wacker, Bret; Sternberg, David W.; Rusk, Jason; Franklin, Wilbur A.; Varella-Garcia, Marileila; Bunn, Paul A.; Camidge, D. Ross

    2011-01-01

    Purpose Erlotinib prolongs survival in patients with advanced non–small-cell lung cancer (NSCLC). We report the results of a randomized, phase II study of erlotinib alone or intercalated with chemotherapy (CT + erlotinib) in chemotherapy-naïve patients with advanced NSCLC who were positive for epidermal growth factor receptor (EGFR) protein expression and/or with high EGFR gene copy number. Patients and Methods A total of 143 patients were randomly assigned to either erlotinib 150 mg daily orally until disease progression (PD) occurred or to chemotherapy with paclitaxel 200 mg/m2 intravenously (IV) and carboplatin dosed by creatinine clearance (AUC 6) IV on day 1 intercalated with erlotinib 150 mg orally on days 2 through 15 every 3 weeks for four cycles followed by erlotinib 150 mg orally until PD occurred (CT + erlotinib). The primary end point was 6-month progression-free survival (PFS); secondary end points included response rate, PFS, and survival. EGFR, KRAS mutation, EGFR fluorescent in situ hybridization and immunohistochemistry, and E-cadherin and vimentin protein levels were also assessed. Results Six-month PFS rates were 26% and 31% for the two arms (CT + erlotinib and erlotinib alone, respectively). Both were less than the historical control of 45% (P = .001 and P = .011, respectively). Median PFS times were 4.57 and 2.69 months, respectively. Patients with tumors harboring EGFR activating mutations fared better on erlotinib alone (median PFS, 18.2 months v 4.9 months for CT + erlotinib). Conclusion The feasibility of a multicenter biomarker-driven study was demonstrated, but neither treatment arms exceeded historical controls. This study does not support combined chemotherapy and erlotinib in first-line treatment of EGFR-selected advanced NSCLC, and the patients with tumors harboring EGFR mutations had a better outcome on erlotinib alone. PMID:21825259

  17. Impact of retreatment with an artemisinin-based combination on malaria incidence and its potential selection of resistant strains: study protocol for a randomized controlled clinical trial

    PubMed Central

    2013-01-01

    Background Artemisinin-based combination therapy is currently recommended by the World Health Organization as first-line treatment of uncomplicated malaria. Recommendations were adapted in 2010 regarding rescue treatment in case of treatment failure: instead of quinine monotherapy, quinine should be combined with an antibiotic with antimalarial properties; alternatively, another artemisinin-based combination therapy may be used. However, no clear evidence is yet available to inform these policy changes, and the need to provide policy makers with hard data on the appropriate rescue therapy is obvious. We hypothesize that the same artemisinin-based combination therapy used as rescue treatment is as efficacious as quinine + clindamycin or an alternative artemisinin-based combination therapy, without the risk of selecting drug-resistant strains. Design We embed a randomized, open-label, three-arm clinical trial in a longitudinal cohort design, following up children with uncomplicated malaria until they are malaria parasite free for 4 weeks. The study is conducted in both the Democratic Republic of Congo and Uganda and performed in three steps. In the first step, the pre-randomized controlled trial (RCT) phase, children aged 12 to 59 months with uncomplicated malaria are treated with the recommended first-line drug and constitute a cohort that is passively followed up for 42 days. If the patients experience an uncomplicated malaria episode between days 14 and 42 of follow-up, they are randomized to quinine + clindamycin, an alternative artemisinin-based combination therapy, or the same first-line artemisinin-based combination therapy, and followed up for 28 additional days. If between days 14 and 28 the patients experience a recurrent parasitemia, they are retreated with the recommended first-line regimen and actively followed up for another 28 additional days (step three; the post-RCT phase). The same methodology is followed for each subsequent

  18. Mavoglurant Augmentation in OCD Patients Resistant to Selective Serotonin Reuptake Inhibitors: A Proof-of-Concept, Randomized, Placebo-Controlled, Phase 2 Study.

    PubMed

    Rutrick, Daniel; Stein, Dan J; Subramanian, Ganesan; Smith, Brian; Fava, Maurizio; Hasler, Gregor; Cha, Jang-Ho; Gasparini, Fabrizio; Donchev, Toni; Ocwieja, Magdalena; Johns, Donald; Gomez-Mancilla, Baltazar

    2017-02-01

    To determine if mavoglurant (modified release) as an augmentation therapy to selective serotonin reuptake inhibitors (SSRIs) could have beneficial effects reducing Yale-Brown Obsessive Compulsive Scale (Y-BOCS) total score in patients with obsessive-compulsive disorder (OCD) resistant to SSRI treatment. This was a multicenter, randomized, double-blind, placebo-controlled, parallel-group, phase 2 study. Patients remained on their SSRI treatment and mavoglurant or placebo was added on. Non-smoking men and women aged 18-65 years primarily diagnosed with OCD according to Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; DSM-IV-TR) criteria were randomized (1:1) to mavoglurant or placebo groups. After 50 patients were randomized, an interim analysis was conducted to determine whether the study should be continued. The primary outcome measure was absolute change in Y-BOCS from baseline at week 17. Safety was assessed by recording adverse events (AEs) and serious adverse events (SAEs). Interim analysis led to a decision to terminate the study. In total 38 (76.0%) participants completed 17 weeks of treatment and 37 (74.0%) completed the study. There was no significant difference in least squares (LS) mean change from baseline at week 17 in Y-BOCS total score for mavoglurant compared with placebo groups [-6.9 (1.75) vs. -8.0 (1.78), respectively; LS mean difference 1.1; 95% CI -3.9, 6.2; p = 0.671]. The incidence of AEs was higher in the mavoglurant compared with the placebo group (80.8% vs. 70.8%, respectively). This study of mavoglurant in OCD was terminated because of the lack of efficacy at interim analysis. The study did not support the use of an antagonist of mGluR5 receptors for OCD treatment. The study was registered with ClinicalTrials.gov: NCT01813019. This study was sponsored by Novartis Pharma AG, Basel, Switzerland.

  19. Randomized trial of switching from prescribed non-selective non-steroidal anti-inflammatory drugs to prescribed celecoxib: the Standard care vs. Celecoxib Outcome Trial (SCOT).

    PubMed

    MacDonald, Thomas M; Hawkey, Chris J; Ford, Ian; McMurray, John J V; Scheiman, James M; Hallas, Jesper; Findlay, Evelyn; Grobbee, Diederick E; Hobbs, F D Richard; Ralston, Stuart H; Reid, David M; Walters, Matthew R; Webster, John; Ruschitzka, Frank; Ritchie, Lewis D; Perez-Gutthann, Susana; Connolly, Eugene; Greenlaw, Nicola; Wilson, Adam; Wei, Li; Mackenzie, Isla S

    2017-06-14

    Selective cyclooxygenase-2 inhibitors and conventional non-selective non-steroidal anti-inflammatory drugs (nsNSAIDs) have been associated with adverse cardiovascular (CV) effects. We compared the CV safety of switching to celecoxib vs. continuing nsNSAID therapy in a European setting. Patients aged 60 years and over with osteoarthritis or rheumatoid arthritis, free from established CV disease and taking chronic prescribed nsNSAIDs, were randomized to switch to celecoxib or to continue their previous nsNSAID. The primary endpoint was hospitalization for non-fatal myocardial infarction or other biomarker positive acute coronary syndrome, non-fatal stroke or CV death analysed using a Cox model with a pre-specified non-inferiority limit of 1.4 for the hazard ratio (HR). In total, 7297 participants were randomized. During a median 3-year follow-up, fewer subjects than expected developed an on-treatment (OT) primary CV event and the rate was similar for celecoxib, 0.95 per 100 patient-years, and nsNSAIDs, 0.86 per 100 patient-years (HR = 1.12, 95% confidence interval, 0.81-1.55; P = 0.50). Comparable intention-to-treat (ITT) rates were 1.14 per 100 patient-years with celecoxib and 1.10 per 100 patient-years with nsNSAIDs (HR = 1.04; 95% confidence interval, 0.81-1.33; P = 0.75). Pre-specified non-inferiority was achieved in the ITT analysis. The upper bound of the 95% confidence limit for the absolute increase in OT risk associated with celecoxib treatment was two primary events per 1000 patient-years exposure. There were only 15 adjudicated secondary upper gastrointestinal complication endpoints (0.078/100 patient-years on celecoxib vs. 0.053 on nsNSAIDs OT, 0.078 vs. 0.053 ITT). More gastrointestinal serious adverse reactions and haematological adverse reactions were reported on nsNSAIDs than celecoxib, but more patients withdrew from celecoxib than nsNSAIDs (50.9% patients vs. 30.2%; P < 0.0001). In subjects 60 years and over, free from CV disease and

  20. Selection of IgG Variants with Increased FcRn Binding Using Random and Directed Mutagenesis: Impact on Effector Functions

    PubMed Central

    Monnet, Céline; Jorieux, Sylvie; Urbain, Rémi; Fournier, Nathalie; Bouayadi, Khalil; De Romeuf, Christophe; Behrens, Christian K.; Fontayne, Alexandre; Mondon, Philippe

    2015-01-01

    Despite the reasonably long half-life of immunoglobulin G (IgG), market pressure for higher patient convenience while conserving efficacy continues to drive IgG half-life improvement. IgG half-life is dependent on the neonatal Fc receptor (FcRn), which among other functions, protects IgG from catabolism. FcRn binds the Fc domain of IgG at an acidic pH, ensuring that endocytosed IgG will not be degraded in lysosomal compartments and will then be released into the bloodstream. Consistent with this mechanism of action, several Fc-engineered IgG with increased FcRn affinity and conserved pH dependency were designed and resulted in longer half-life in vivo in human FcRn-transgenic mice (hFcRn), cynomolgus monkeys, and recently in healthy humans. These IgG variants were usually obtained by in silico approaches or directed mutagenesis in the FcRn-binding site. Using random mutagenesis, combined with a pH-dependent phage display selection process, we isolated IgG variants with improved FcRn-binding, which exhibited longer in vivo half-life in hFcRn mice. Interestingly, many mutations enhancing the Fc/FcRn interaction were located at a distance from the FcRn-binding site, validating our random molecular approach. Directed mutagenesis was then applied to generate new variants, to further characterize our IgG variants and the effect of the mutations selected. Since these mutations are distributed over the whole Fc sequence, binding to other Fc effectors, such as complement C1q and FcγRs, was dramatically modified, even by mutations distant from these effectors’ binding sites. Hence, we obtained numerous IgG variants with increased FcRn-binding and different binding patterns to other Fc effectors, including variants without any effector function, providing distinct “fit-for-purpose” Fc molecules. We therefore provide evidence that half-life and effector functions should be optimized simultaneously as mutations can have unexpected effects on all Fc receptors that are critical

  1. Rock magnetic evidence of non-random raw material selection criteria in Cerro Toledo Obsidian Artifacts from Valles Caldera, New Mexico

    NASA Astrophysics Data System (ADS)

    Gregovich, A.; Feinberg, J. M.; Steffen, A.; Sternberg, R. S.

    2014-12-01

    Stone tools are one of the most enduring forms of ancient human behavior available to anthropologists. The geologic materials that comprise stone tools are a reflection of the rocks that were available locally or through trade, as are the intended use of the tools and the knapping technology needed to produce them. Investigation of the rock magnetic and geochemical characteristics of the artifacts and the geological source materials provides a baseline to explore these past behaviors. This study uses rock magnetic properties to explore the raw material selection criteria involved in the production of obsidian tools in the region around Valles Caldera in northern New Mexico. Obsidian is locally abundant and was traded by tribes across the central United States. Here we compare the rock magnetic properties of a sample of obsidian projectile points (N = 25) that have been geochemically sourced to the Cerro Toledo obsidian flow with geological samples collected from four sites within the same flow (N = 135). This collection of archaeological artifacts, albeit small, contains representatives of at least 8 different point styles that were used over 6000 years from the Archaic into the Late Prehistoric. Bulk rock hysteresis parameters (Mr, Ms, Bc, and Bcr) and low-field susceptibility (χ) measurements show that the projectile points generally contain a lower concentration of magnetic minerals than the geologic samples. For example, the artifacts' median Ms value is 2.9 x 10^-3 Am^2 kg^-1, while that of the geological samples is 6.5 x 10^-3 Am^2 kg^-1. The concentration of magnetic minerals in obsidian is a proxy for the concentration of microlites in general, and this relationship suggests that although obsidian was locally abundant, toolmakers employed non-random selection criteria resulting in generally lower concentrations of microlites in their obsidian tools.

  2. Evaluation of minimally invasive therapies and rationale for a prospective randomized trial to evaluate selective intra-arterial lysis for clinically complete central retinal artery occlusion.

    PubMed

    Mueller, Arthur J; Neubauer, Aljoscha S; Schaller, Ulrich; Kampik, Anselm

    2003-10-01

    To determine the effect of commonly used minimally invasive treatments for clinically complete nonarteritic central retinal artery occlusion (CRAO) and design a prospective randomized trial to evaluate selective intra-arterial lysis for this condition. In this retrospective noncomparative case series, all medical records of patients with a diagnosis of CRAO treated at the Department of Ophthalmology, Ludwig-Maximilians-Universität, Munich, Germany, from 1994 through 1999 were reviewed for treatments administered and course of visual acuity. The main outcome measure was best-corrected visual acuity (BCVA) at the initial and last visit. We identified 102 patient medical records; 71 were suitable for further analysis. Forty-four (62%) of the 71 patients included were treated with oral acetylsalicylate; 44 (62%), with oral acetazolamide; 32 (45%), with ocular massage; 22 (31%), with isovolemic hemodilution; 19 (27%), with oral pentoxifylline; 8 (11%), with topical beta-blocker; 6 (8%), with paracentesis of the anterior chamber; and 4 (6%), with subcutaneous heparin. A mean ± SD of 2.5 ± 1.4 treatments was administered per patient, and BCVA increased by a mean ± SD of 0.7 ± 2.8 Snellen lines. The BCVA in 11 patients (15%) increased by 3 or more lines. Multivariate stepwise regression did not reveal any single or combination treatment as a significant factor for improvement in BCVA. Patient age and duration of visual impairment before the initial examination were not significant predictors of final BCVA. Commonly used minimally invasive treatments of CRAO do not improve the natural course of the disease. A prospective trial by the European Assessment Group for Lysis in the Eye is under way to evaluate selective intra-arterial lysis; in this trial, some of these minimally invasive treatments are used in the control group.

  3. Selective processing of auditory evoked responses with iterative-randomized stimulation and averaging: A strategy for evaluating the time-invariant assumption.

    PubMed

    Valderrama, Joaquin T; de la Torre, Angel; Medina, Carlos; Segura, Jose C; Thornton, A Roger D

    2016-03-01

    The recording of auditory evoked potentials (AEPs) at fast rates allows the study of neural adaptation, improves accuracy in estimating hearing threshold and may help in diagnosing certain pathologies. Stimulation sequences used to record AEPs at fast rates must be designed with a certain jitter, i.e., they cannot be strictly periodic. Some authors believe that stimuli from wide-jittered sequences may evoke auditory responses of different morphology, in which case the time-invariance assumption would not hold. This paper describes a methodology that can be used to analyze the time-invariance assumption in jittered stimulation sequences. The proposed method, Split-IRSA, is based on an extended version of the iterative randomized stimulation and averaging (IRSA) technique, including selective processing of sweeps according to a predefined criterion. The fundamentals, the mathematical basis and relevant implementation guidelines of this technique are presented in this paper. The results of this study show that Split-IRSA performs adequately and that both fast and slow mechanisms of adaptation influence evoked-response morphology; both mechanisms should therefore be considered when time-invariance is assumed. The significance of these findings is discussed.
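
    The key property behind randomized (jittered) stimulation is that the overlapping responses become separable: the recording y equals a convolution matrix S, built from the stimulus onsets, times the evoked response r, so r can be recovered by least squares. A simplified, noise-free illustration of this idea (my own sketch; IRSA itself solves the problem iteratively, and Split-IRSA additionally partitions the sweeps by a criterion before deconvolution):

```python
import numpy as np

def recover_response(y, onsets, resp_len):
    """Least-squares estimate of an evoked response of length `resp_len`
    from a recording `y` with jittered stimulus `onsets`."""
    T = len(y)
    S = np.zeros((T, resp_len))
    for o in onsets:                    # S @ r superimposes one shifted
        for l in range(resp_len):       # copy of r per stimulus onset
            if o + l < T:
                S[o + l, l] += 1.0
    r, *_ = np.linalg.lstsq(S, y, rcond=None)
    return r

template = np.array([0.0, 1.0, 2.0, 1.0, -1.0])   # synthetic evoked response
onsets = [0, 3, 7, 12, 14, 20]                    # jittered, overlapping
y = np.zeros(30)
for o in onsets:
    y[o:o + 5] += template                        # overlapped recording
print(np.allclose(recover_response(y, onsets, 5), template))  # True
```

    With strictly periodic onsets the columns of S become linearly dependent and the response cannot be separated from its overlapped copies, which is why the jitter is required in the first place.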

  4. Prevalence, diagnostics and management of musculoskeletal disorders in primary health care in Sweden – an investigation of 2000 randomly selected patient records

    PubMed Central

    Fahlström, Martin; Djupsjöbacka, Mats

    2016-01-01

Abstract Rationale, aims and objectives The aims of this study were to investigate the prevalence of patients seeking care due to different musculoskeletal disorders (MSDs) at primary health care centres (PHCs), to chart factors such as symptoms, diagnosis and actions prescribed for patients who visited the PHCs due to MSD, and to make comparisons regarding differences due to gender, age and rural or urban PHC. Methods Patient records (n = 2000) for patients of working age were randomly selected, distributed equally between one rural and one urban PHC. A 3‐year period was reviewed retrospectively. For all patient records, background data, cause of the visit and diagnosis were registered. For visits due to MSD, the type and location of symptoms and the actions taken to resolve the patients' problems were registered. Data were analysed using cross tabulation and multidimensional chi‐squared tests. Results The prevalence of MSD was high; almost 60% of all patients were seeking care due to MSD. Upper and lower limb problems were most common. Symptoms were most prevalent in the young and middle age groups. The patients received a variety of different diagnoses, and between 13 and 35% of the patients did not receive an MSD diagnosis despite having MSD symptoms. There was great variation in how the cases were handled. Conclusions The present study points out some weaknesses regarding diagnostics and management of MSD in primary care. PMID:27538347

  5. Prevalence of skeletal and eye malformations in frogs from north-central United States: estimations based on collections from randomly selected sites

    USGS Publications Warehouse

    Schoff, P.K.; Johnson, C.M.; Schotthoefer, A.M.; Murphy, J.E.; Lieske, C.; Cole, R.A.; Johnson, L.B.; Beasley, V.R.

    2003-01-01

    Skeletal malformation rates for several frog species were determined in a set of randomly selected wetlands in the north-central USA over three consecutive years. In 1998, 62 sites yielded 389 metamorphic frogs, nine (2.3%) of which had skeletal or eye malformations. A subset of the original sites was surveyed in the following 2 yr. In 1999, 1,085 metamorphic frogs were collected from 36 sites and 17 (1.6%) had skeletal or eye malformations, while in 2000, examination of 1,131 metamorphs yielded 16 (1.4%) with skeletal or eye malformations. Hindlimb malformations predominated in all three years, but other abnormalities, involving forelimb, eye, and pelvis were also found. Northern leopard frogs (Rana pipiens) constituted the majority of collected metamorphs as well as most of the malformed specimens. However, malformations were also noted in mink frogs (R. septentrionalis), wood frogs (R. sylvatica), and gray tree frogs (Hyla spp.). The malformed specimens were found in clustered sites in all three years but the cluster locations were not the same in any year. The malformation rates reported here are higher than the 0.3% rate determined for metamorphic frogs collected from similar sites in Minnesota in the 1960s, and thus, appear to represent an elevation of an earlier baseline malformation rate.

  6. Early prevention of antisocial personality: long-term follow-up of two randomized controlled trials comparing indicated and selective approaches.

    PubMed

    Scott, Stephen; Briskman, Jackie; O'Connor, Thomas G

    2014-06-01

    Antisocial personality is a common adult problem that imposes a major public health burden, but for which there is no effective treatment. Affected individuals exhibit persistent antisocial behavior and pervasive antisocial character traits, such as irritability, manipulativeness, and lack of remorse. Prevention of antisocial personality in childhood has been advocated, but evidence for effective interventions is lacking. The authors conducted two follow-up studies of randomized trials of group parent training. One involved 120 clinic-referred 3- to 7-year-olds with severe antisocial behavior for whom treatment was indicated, 93 of whom were reassessed between ages 10 and 17. The other involved 109 high-risk 4- to 6-year-olds with elevated antisocial behavior who were selectively screened from the community, 90 of whom were reassessed between ages 9 and 13. The primary psychiatric outcome measures were the two elements of antisocial personality, namely, antisocial behavior (assessed by a diagnostic interview) and antisocial character traits (assessed by a questionnaire). Also assessed were reading achievement (an important domain of youth functioning at work) and parent-adolescent relationship quality. In the indicated sample, both elements of antisocial personality were improved in the early intervention group at long-term follow-up compared with the control group (antisocial behavior: odds ratio of oppositional defiant disorder=0.20, 95% CI=0.06, 0.69; antisocial character traits: B=-4.41, 95% CI=-1.12, -8.64). Additionally, reading ability improved (B=9.18, 95% CI=0.58, 18.0). Parental expressed emotion was warmer (B=0.86, 95% CI=0.20, 1.41) and supervision was closer (B=-0.43, 95% CI=-0.11, -0.75), but direct observation of parenting showed no differences. Teacher-rated and self-rated antisocial behavior were unchanged. In contrast, in the selective high-risk sample, early intervention was not associated with improved long-term outcomes. 
Early intervention with

  7. Pattern randomness aftereffect

    PubMed Central

    Yamada, Yuki; Kawabe, Takahiro; Miyazaki, Makoto

    2013-01-01

Humans can easily discriminate a randomly spaced from a regularly spaced visual pattern. Here, we demonstrate that observers can adapt to pattern randomness. Following prolonged exposure (adaptation) to two-dimensional patterns with varying levels of physical randomness, observers judged the randomness of the pattern. Perceived randomness decreased (increased) following adaptation to high (low) physical randomness (Experiment 1). Adaptation to 22.5°-rotated adaptor stimuli did not cause a randomness aftereffect (Experiment 2), suggesting that positional variation is unlikely to be responsible for the pattern randomness perception. Moreover, the aftereffect was not selective to contrast polarity (Experiment 3) and was not affected by spatial jitter (Experiment 4). Last, the aftereffect was not affected by adaptor configuration (Experiment 5). Our data were consistent with a model assuming filter-rectify-filter processing for orientation inputs. Thus, we infer that neural processing for orientation grouping/segregation underlies the perception of pattern randomness. PMID:24113916

  8. The effects of selective head cooling versus whole-body cooling on some neural and inflammatory biomarkers: a randomized controlled pilot study.

    PubMed

    Çelik, Yalçın; Atıcı, Aytuğ; Gülaşı, Selvi; Makharoblıdze, Khatuna; Eskandari, Gülçin; Sungur, Mehmet Ali; Akbayır, Serin

    2015-10-15

    Therapeutic hypothermia (TH) has become standard care in newborns with moderate to severe hypoxic ischemic encephalopathy (HIE), and the 2 most commonly used methods are selective head cooling (SHC) and whole body cooling (WBC). This study aimed to determine if the effects of the 2 methods on some neural and inflammatory biomarkers differ. This prospective randomized pilot study included newborns delivered after >36 weeks of gestation. SHC or WBC was administered randomly to newborns with moderate to severe HIE that were prescribed TH. The serum interleukin (IL)-1β, IL-6, neuron-specific enolase (NSE), brain-specific creatine kinase (CK-BB), tumor necrosis factor-alpha (TNF-α), and protein S100 levels, the urine S100B level, and the urine lactate/creatinine (L/C) ratio were evaluated 6 and 72 h after birth. The Bayley Scales of Infant and Toddler Development-III was administered at month 12 for assessment of neurodevelopmental findings. The SHC group included 14 newborns, the WBC group included 10, the mild HIE group included 7, and the control group included 9. All the biomarker levels in the SHC and WBC groups at 6 and 72 h were similar, and all the changes in the biomarker levels between 6 and 72 h were similar in both groups. The serum IL-6 and protein S100 levels at 6 h in the SHC and WBC groups were significantly higher than in the control group. The urine L/C ratio at 6 h in the SHC and WBC groups was significantly higher than in the mild HIE and control groups. The IL-6 level and L/C ratio at 6 and 72 h in the patients that had died or had disability at month 12 were significantly higher than in the patients without disability at month 12. The effects of SHC and WBC on the biomarkers evaluated did not differ. The urine L/C ratio might be useful for differentiating newborns with moderate and severe HIE from those with mild HIE. Furthermore, the serum IL-6 level and the L/C ratio might be useful for predicting disability and mortality in newborns with HIE.

  9. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    ERIC Educational Resources Information Center

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…
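The two concepts the activity teaches are easy to conflate. Purely as a hypothetical sketch (the names and numbers below are ours, not the article's): random *selection* draws a sample from a population, which supports generalization, while random *assignment* allocates that sample to experimental arms, which supports causal inference.

```python
import random

random.seed(42)
population = [f"roach_{i}" for i in range(1000)]

# Random SELECTION: draw a sample of subjects from the population.
sample = random.sample(population, 20)

# Random ASSIGNMENT: shuffle the chosen sample and split it into two arms.
random.shuffle(sample)
treatment, control = sample[:10], sample[10:]

print(len(treatment), len(control))  # 10 10
```

A study can have either, both, or neither; the activity's point is that each randomization step guards against a different kind of bias.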

  11. Sequence Based Prediction of DNA-Binding Proteins Based on Hybrid Feature Selection Using Random Forest and Gaussian Naïve Bayes

    PubMed Central

    Lou, Wangchao; Wang, Xiaoqing; Chen, Fan; Chen, Yixiao; Jiang, Bo; Zhang, Hua

    2014-01-01

Developing an efficient method for identifying DNA-binding proteins is highly desirable, given their vital roles in gene regulation and the value of such predictions for advancing our understanding of protein function. In this study, we proposed a new method for the prediction of DNA-binding proteins, ranking features with a random forest and performing wrapper-based feature selection with a forward best-first search strategy. The features comprise information from primary sequence, predicted secondary structure, predicted relative solvent accessibility, and position specific scoring matrix. The proposed method, called DBPPred, used Gaussian naïve Bayes as the underlying classifier since it outperformed five other classifiers, including decision tree, logistic regression, k-nearest neighbor, support vector machine with polynomial kernel, and support vector machine with radial basis function. As a result, the proposed DBPPred yields the highest average accuracy of 0.791 and average MCC of 0.583 in five-fold cross-validation with ten runs on the training benchmark dataset PDB594. Subsequently, blind tests on the independent dataset PDB186 by the proposed model trained on the entire PDB594 dataset and by five other existing methods (including iDNA-Prot, DNA-Prot, DNAbinder, DNABIND and DBD-Threader) were performed; the proposed DBPPred yielded the highest accuracy of 0.769, MCC of 0.538, and AUC of 0.790. Independent tests performed by the proposed DBPPred on a large non-DNA-binding protein dataset and two RNA-binding protein datasets also showed improved or comparable quality when compared with the relevant prediction methods. Moreover, we observed that for the majority of the selected features, the mean feature values differ significantly between the DNA-binding and the non-DNA-binding proteins. 
All of the experimental results indicate that the proposed DBPPred
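A minimal sketch of this filter-then-wrapper scheme, on synthetic data (our own illustration, not the authors' DBPPred code or features, and with the best-first search simplified to a greedy forward pass over the ranked features):

```python
# Hybrid feature selection: rank features by random-forest importance
# (filter stage), then add them one at a time, keeping a feature only if
# it improves the cross-validated accuracy of a Gaussian naive Bayes
# classifier (wrapper stage).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=200, n_features=20,
                           n_informative=5, random_state=0)

# Filter stage: rank all features by random-forest importance.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
ranked = np.argsort(rf.feature_importances_)[::-1]

# Wrapper stage: greedy forward selection with Gaussian naive Bayes.
selected, best = [], 0.0
for f in ranked:
    trial = selected + [f]
    score = cross_val_score(GaussianNB(), X[:, trial], y, cv=5).mean()
    if score > best:
        selected, best = trial, score

print(f"kept {len(selected)} of {X.shape[1]} features, CV accuracy {best:.3f}")
```

The filter stage makes the wrapper tractable: instead of searching all 2^20 feature subsets, the wrapper only considers prefixes of the importance ranking.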

  12. Combining a dopamine agonist and selective serotonin reuptake inhibitor for the treatment of depression: A double-blind, randomized pilot study

    PubMed Central

    Franco-Chaves, Jose A.; Mateus, Camilo F.; Luckenbaugh, David A.; Martinez, Pedro E.; Mallinger, Alan G.; Zarate, Carlos A.

    2013-01-01

    Background Antidepressants that act on two or more amine neurotransmitters may confer higher remission rates when first-line agents affecting a single neurotransmitter have failed. Pramipexole, a dopamine agonist, has antidepressant effects in patients with major depressive disorder (MDD). This pilot study examined the efficacy and safety of combination therapy with pramipexole and the selective serotonin reuptake inhibitor (SSRI) escitalopram in MDD. Methods In this double-blind, controlled, pilot study, 39 patients with DSM-IV MDD who had failed to respond to a standard antidepressant treatment trial were randomized to receive pramipexole (n=13), escitalopram (n=13), or their combination (n=13) for six weeks. Pramipexole was started at 0.375 mg/day and titrated weekly up to 2.25 mg/day; escitalopram dosage remained at 10 mg/day. The primary outcome measure was the Montgomery–Asberg Depression Rating Scale (MADRS). Results Subjects receiving pramipexole monotherapy had significantly lower MADRS scores than the combination group (p=0.01); no other primary drug comparisons were significant. The combination group had a substantially higher dropout rate than the escitalopram and pramipexole groups (69%, 15%, 15%, respectively). Only 15% of patients in the combination group tolerated regularly scheduled increases of pramipexole throughout the study, compared with 46% of patients in the pramipexole group. Limitations Group size was small and the treatment phase lasted for only six weeks. Conclusions The combination of an SSRI and a dopamine agonist was not more effective than either agent alone, nor did it produce a more rapid onset of antidepressant action. Combination therapy with escitalopram and pramipexole may not be well-tolerated. PMID:23517885

  13. Trigemino-gustatory interactions: a randomized controlled clinical trial assessing the effects of selective anesthesia of dental afferents on taste thresholds.

    PubMed

    Lecor, Papa Abdou; Touré, Babacar; Boucher, Yves

    2017-08-31

This study aimed at analyzing the effect of the temporary removal of trigeminal dental afferents on electrogustometric thresholds (EGMt). EGMt were measured in 300 healthy subjects randomized into three groups, at nine loci on the right and left sides (RS, LS) of the tongue surface before and after anesthesia. Group IAN (n = 56 RS, n = 44 LS) received intraosseous local anesthesia of the inferior alveolar nerve (IAN). Group MdN received a mandibular nerve (MdN) block targeting the IAN before its entrance into the mandibular foramen (n = 60 RS and n = 40 LS); group MxN, receiving maxillary nerve (MxN) anesthesia (n = 56 RS and n = 44 LS), was the control group. Differences between mean EGMt were analyzed with the Wilcoxon test; the correlation between type of anesthesia and EGMt was assessed with Spearman's rho, all with a level of significance set at p ≤ 0.05. Significant EGMt (μA) differences before and after anesthesia were found in all loci with MdN and IAN on the ipsilateral side (p < 0.05), but not with MxN. Anesthesia of the MdN was positively correlated with the increase in EGMt (p < 0.001). Selective anesthesia of the IAN was positively correlated only with the increase in EGMt measured at posterior and dorsal loci of the tongue surface (p < 0.01). The increase in EGMt following IAN anesthesia suggests a participation of dental afferents in taste perception. Extraction of teeth may impair food intake not only due to impaired masticatory ability but also due to alteration of neurological trigemino-gustatory interactions. PACTR201602001452260.

  14. An assessment of the quality of care for children in eighteen randomly selected district and sub-district hospitals in Bangladesh

    PubMed Central

    2012-01-01

    Background Quality hospital care is important in ensuring that the needs of severely ill children are met to avert child mortality. However, the quality of hospital care for children in developing countries has often been found poor. As the first step of a country road map for improving hospital care for children, we assessed the baseline situation with respect to the quality of care provided to children under-five years age in district and sub-district level hospitals in Bangladesh. Methods Using adapted World Health Organization (WHO) hospital assessment tools and standards, an assessment of 18 randomly selected district (n=6) and sub-district (n=12) hospitals was undertaken. Teams of trained assessors used direct case observation, record review, interviews, and Management Information System (MIS) data to assess the quality of clinical case management and monitoring; infrastructure, processes and hospital administration; essential hospital and laboratory supports, drugs and equipment. Results Findings demonstrate that the overall quality of care provided in these hospitals was poor. No hospital had a functioning triage system to prioritise those children most in need of immediate care. Laboratory supports and essential equipment were deficient. Only one hospital had all of the essential drugs for paediatric care. Less than a third of hospitals had a back-up power supply, and just under half had functioning arrangements for safe-drinking water. Clinical case management was found to be sub-optimal for prevalent illnesses, as was the quality of neonatal care. Conclusion Action is needed to improve the quality of paediatric care in hospital settings in Bangladesh, with a particular need to invest in improving newborn care. PMID:23268650

  15. Gastrointestinal symptoms in NSAID users in an 'average risk population': results of a large population-based study in randomly selected Dutch inhabitants.

    PubMed

    Tielemans, M M; van Rossum, L G M; Eikendal, T; Focks, J J; Laheij, R J F; Jansen, J B M J; van Oijen, M G H

    2014-04-01

Non-steroidal anti-inflammatory drug (NSAID) use is widespread and associated with gastrointestinal symptoms and complications. The aims of this study were to assess (i) gastrointestinal symptoms in users of prescribed and over-the-counter (OTC) NSAIDs and (ii) proton pump inhibitor (PPI) co-prescription rates in NSAID users at increased risk for gastrointestinal complications. Surveys were sent to a randomly selected sample of the adult Dutch general population in December 2008. Questions included demographics, gastrointestinal symptoms, medication use and comorbidity. The main outcome measure was the presence of gastrointestinal symptoms. A total of 18,317 surveys were returned (response rate 35%), of which 16,758 surveys were eligible for analysis. Of these, 3233 participants (19%) reported NSAID use. NSAID users more frequently reported gastrointestinal symptoms than persons not using NSAIDs (33% vs. 24%, p < 0.01). Respondents who reported prescription NSAID use (n = 683) were older, reported more comorbidity, and experienced more gastrointestinal symptoms (41%) than OTC users (n = 894, 33%, p < 0.01). This difference was not statistically significant after adjustment for confounders (0.99, 95% CI 0.71-1.37). In respondents with an increased gastrointestinal risk profile, PPI co-prescription rates were 51% for prescription users and 25% for OTC users. The prevalence of gastrointestinal symptoms was high in both prescribed and OTC NSAID users, emphasising the side effects of both types of NSAIDs. PPI co-prescription rates in NSAID users at risk for gastrointestinal complications were low. © 2014 John Wiley & Sons Ltd.

  16. An assessment of the quality of care for children in eighteen randomly selected district and sub-district hospitals in Bangladesh.

    PubMed

    Hoque, Dewan M E; Rahman, Muntasirur; Billah, Sk Masum; Savic, Michael; Karim, A Q M Rezaul; Chowdhury, Enayet K; Hossain, Altaf; Musa, S A J Md; Kumar, Harish; Malhotra, Sudhansh; Matin, Ziaul; Raina, Neena; Weber, Martin W; El Arifeen, Shams

    2012-12-26

    Quality hospital care is important in ensuring that the needs of severely ill children are met to avert child mortality. However, the quality of hospital care for children in developing countries has often been found poor. As the first step of a country road map for improving hospital care for children, we assessed the baseline situation with respect to the quality of care provided to children under-five years age in district and sub-district level hospitals in Bangladesh. Using adapted World Health Organization (WHO) hospital assessment tools and standards, an assessment of 18 randomly selected district (n=6) and sub-district (n=12) hospitals was undertaken. Teams of trained assessors used direct case observation, record review, interviews, and Management Information System (MIS) data to assess the quality of clinical case management and monitoring; infrastructure, processes and hospital administration; essential hospital and laboratory supports, drugs and equipment. Findings demonstrate that the overall quality of care provided in these hospitals was poor. No hospital had a functioning triage system to prioritise those children most in need of immediate care. Laboratory supports and essential equipment were deficient. Only one hospital had all of the essential drugs for paediatric care. Less than a third of hospitals had a back-up power supply, and just under half had functioning arrangements for safe-drinking water. Clinical case management was found to be sub-optimal for prevalent illnesses, as was the quality of neonatal care. Action is needed to improve the quality of paediatric care in hospital settings in Bangladesh, with a particular need to invest in improving newborn care.

  17. Cerebral blood flow reactivity in patients undergoing selective amygdalohippocampectomy for epilepsy of mesial temporal origin. A prospective randomized comparison of the trans-Sylvian and the transcortical approach.

    PubMed

    Schatlo, Bawarjan; Jägersberg, Max; Paass, Gerhard; Faltermeier, Rupert; Streich, Jörg; Meyer, Bernhard; Schaller, Karl

    2015-01-01

The aim of this study was to assess (1) whether vasoreactivity is altered in patients with epilepsy and (2) whether the two most commonly used approaches, the trans-Sylvian (TS) and the trans-cortical (TC) route, differ in their impact on cortical blood flow. Patients were randomized to undergo selective amygdalohippocampectomy (selAH) through a TC or TS route. Before and after selAH, we recorded microcirculation parameters on the superficial cortex surrounding the surgical corridor. Blood flow and velocity were measured using laser Doppler flowmetry and micro-Doppler, respectively. Cortical oxygen saturation (SO2) was measured using remission spectrophotometry under hypocapnic and normocapnic conditions. Ten patients were operated using the TS approach, and eight were operated via the TC approach. Vasomotor reactivity patterns measured with micro-Doppler were physiological prior to selAH in both groups. After completion of surgery, a significant increase in SO2 values occurred in the TS group (before: 56.7 ± 2.2, after: 65.5 ± 3.0 %SO2), but not in the TC group (before: 52.9 ± 5.2, after: 53.0 ± 3.7 %SO2). The rate of critical SO2 values below 25% was significantly higher after the TC approach (12.3%) compared to the TS approach (5.2%; p < 0.05). Our findings provide the first invasively measured evidence that patients with mesial temporal lobe epilepsy have preserved cerebral blood flow responses to alterations in CO2. In addition, local cortical SO2 was higher in the TS group than in the TC group after selAH. This may be a sign of reactive cortical vessel dilation after proximal vessel manipulation associated with the TS approach. In contrast, the lower values of SO2 after the TC approach indicate tissue ischaemia surrounding the surgical corridor and the corticotomy.

  18. Acne RA-1,2, a novel UV-selective face cream for patients with acne: Efficacy and tolerability results of a randomized, placebo-controlled clinical study.

    PubMed

    Cestone, Enza; Michelotti, Angela; Zanoletti, Valentina; Zanardi, Andrea; Mantegazza, Raffaella; Dossena, Maurizia

    2017-01-29

General skincare measures such as the use of moisturisers and products containing adequate photoprotection are important components of acne patients' management to complement the pharmacological regimen. Acne RA-1,2 is a novel dermato-cosmetic product which contains selective photofilters and active ingredients against the multifactorial pathophysiology of acne. To evaluate the tolerability of Acne RA-1,2 and its effect on the clinical signs of acne. This double-blind, placebo-controlled study randomized 40 adult patients with 10-25 comedones per half face to once-daily application of Acne RA-1,2 or placebo for 8 weeks. Evaluations after 4 and 8 weeks included the number of comedones, transepidermal water loss (TEWL), sebum production, and tolerability. In the Acne RA-1,2 group, there was a significant 35% decrease in the mean number of comedones from 26 at baseline to 17 at Week 8 (P<.001), a significant 7% reduction in TEWL (9.32 to 8.66 g/h/m²; P<.001), and a significant 24% reduction in sebum production (154.8 to 117.6 μg/cm²; P<.001). The reductions in TEWL and sebum production were significantly greater than those in the placebo group at Weeks 4 and 8 (P<0.05). There were no adverse events. Acne RA-1,2 was well tolerated and effective at reducing comedones and sebum production and improving epidermal barrier function. These results suggest that Acne RA-1,2 is useful for acne-prone facial skin, particularly as it targets sebum production, which topical pharmacological acne therapies do not address. © 2017 Wiley Periodicals, Inc.

  19. On Random Numbers and Design

    ERIC Educational Resources Information Center

    Ben-Ari, Morechai

    2004-01-01

    The term "random" is frequently used in discussion of the theory of evolution, even though the mathematical concept of randomness is problematic and of little relevance in the theory. Therefore, since the core concept of the theory of evolution is the non-random process of natural selection, the term random should not be used in teaching the…

  1. A prospective randomized multicenter trial of amnioreduction versus selective fetoscopic laser photocoagulation for the treatment of severe twin–twin transfusion syndrome

    PubMed Central

    Crombleholme, Timothy M.; Shera, David; Lee, Hanmin; Johnson, Mark; D’Alton, Mary; Porter, Flint; Chyu, Jacquelyn; Silver, Richard; Abuhamad, Alfred; Saade, George; Shields, Laurence; Kauffman, David; Stone, Joanne; Albanese, Craig T.; Bahado-Singh, Ray; Ball, Robert H.; Bilaniuk, Larissa; Coleman, Beverly; Farmer, Diana; Feldstein, Vickie; Harrison, Michael R.; Hedrick, Holly; Livingston, Jeffrey; Lorenz, Robert P.; Miller, David A.; Norton, Mary E.; Polzin, William J.; Robinson, Julian N.; Rychik, Jack; Sandberg, Per L.; Seri, Istvan; Simon, Erin; Simpson, Lynn L.; Yedigarova, Larisa; Wilson, R. Douglas; Young, Bruce

    2009-01-01

Objective To examine the effect of selective fetoscopic laser photocoagulation (SFLP) versus serial amnioreduction (AR) on perinatal mortality in severe twin-twin transfusion syndrome (TTTS). Study Design 5-year multicenter prospective randomized controlled trial. The primary outcome variable was 30-day postnatal survival of donors and recipients. Results There is no statistically significant difference in 30-day postnatal survival between SFLP or AR treatment for donors at 55% (11/20) vs 55% (11/20) (p=1, OR=1, 95%CI=0.242 to 4.14) or recipients at 30% (6/20) vs 45% (9/20) (p=0.51, OR=1.88, 95%CI=0.44 to 8.64). There is no difference in 30-day survival of one or both twins on a per pregnancy basis between AR at 75% (15/20) and SFLP at 65% (13/20) (p=0.73, OR=1.62, 95%CI=0.34 to 8.09). Overall survival (newborns divided by the number of fetuses treated) is not statistically significant for AR at 60% (24/40) vs SFLP 45% (18/40) (p=0.18, OR=2.01, 95%CI=0.76 to 5.44). There is a statistically significant increase in fetal recipient mortality in the SFLP arm at 70% (14/20) versus the AR arm at 35% (7/20) (p=0.025, OR=5.31, 95%CI=1.19 to 27.6). This is offset by increased recipient neonatal mortality of 30% (6/20) in the AR arm. Echocardiographic abnormality in recipient twin Cardiovascular Profile Score is the most significant predictor of recipient mortality (p=0.055, OR=3.025/point) by logistic regression analysis. Conclusions The outcome of the trial does not conclusively determine whether AR or SFLP is a superior treatment modality. TTTS cardiomyopathy appears to be an important factor in recipient survival in TTTS. PMID:17904975

  2. Selective laser melting: a unit cell approach for the manufacture of porous, titanium, bone in-growth constructs, suitable for orthopedic applications. II. Randomized structures.

    PubMed

    Mullen, Lewis; Stamp, Robin C; Fox, Peter; Jones, Eric; Ngo, Chau; Sutcliffe, Christopher J

    2010-01-01

In this study, the unit cell approach, which has previously been demonstrated as a method of manufacturing porous components suitable for use as orthopedic implants, has been further developed to include randomized structures. These random structures may aid the bone in-growth process because of their similarity in appearance to trabecular bone, and they are shown to carry legacy properties that can be related back to the original unit cell on which they are ultimately based. In addition, it has been shown that randomization improves the mechanical properties of regular unit cell structures, resulting in anticipated improvements to both implant functionality and longevity. The study also evaluates the effect that a post-process sinter cycle has on the components and outlines the improved mechanical properties that are attainable, as well as the changes that occur in both the macro- and microstructure.

  3. Methods and analysis of realizing randomized grouping.

    PubMed

    Hu, Liang-Ping; Bao, Xiao-Lei; Wang, Qi

    2011-07-01

Randomization is one of the four basic principles of research design. The meaning of randomization includes two aspects: one is to randomly select samples from the population, which is known as random sampling; the other is to randomly group all the samples, which is called randomized grouping. Randomized grouping can be subdivided into three categories: complete, stratified and dynamic randomized grouping. This article mainly introduces the steps of complete randomization, the definition of dynamic randomization and the realization of random sampling and grouping with SAS software.
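The complete randomization the article describes (there realized in SAS) can be sketched in Python. This shows one common realization: shuffle the subject roster and deal it out in turn, which allocates every subject purely by chance while also keeping group sizes balanced.

```python
import random

def complete_randomization(subjects, k, seed=None):
    """Completely randomized grouping: shuffle the roster, deal into k groups."""
    rng = random.Random(seed)
    shuffled = list(subjects)
    rng.shuffle(shuffled)
    # Deal the shuffled roster out round-robin: group i gets every k-th subject.
    return [shuffled[i::k] for i in range(k)]

groups = complete_randomization(range(12), k=3, seed=1)
print([sorted(g) for g in groups])  # three groups of four, each a random subset
```

Stratified and dynamic randomization build on this: stratified randomization runs the same procedure separately within each stratum (e.g., by sex or site), while dynamic methods adjust each allocation probability as the trial accrues.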

  4. [An open randomized comparative trial of efficacy and safety of selective alpha-adrenoblocker setegis (terazosin) in therapy of patients with chronic bacterial prostatitis].

    PubMed

    Trapeznikova, M F; Morozov, A P; Dutov, V V; Urenkov, S B; Pozdniakov, K V; Bychkova, N V

    2007-01-01

An open randomized comparative trial of setegis (terazosin) has shown good subjective and objective results in patients with chronic bacterial prostatitis. The drug is well tolerated and produces insignificant side effects. It is also demonstrated that combined therapy with alpha-adrenoblockers is more effective than monotherapy with antibacterial drugs in patients with bacterial prostatitis.

  5. Calcium imaging and selective computed tomography angiography in comparison to functional testing for suspected coronary artery disease: the multicentre, randomized CRESCENT trial.

    PubMed

    Lubbers, Marisa; Dedic, Admir; Coenen, Adriaan; Galema, Tjebbe; Akkerhuis, Jurgen; Bruning, Tobias; Krenning, Boudewijn; Musters, Paul; Ouhlous, Mohamed; Liem, Ahno; Niezen, Andre; Hunink, Miriam; de Feijter, Pim; Nieman, Koen

    2016-04-14

To compare the effectiveness and safety of a cardiac computed tomography (CT) algorithm with functional testing in patients with symptoms suggestive of coronary artery disease (CAD). Between April 2011 and July 2013, 350 patients with stable angina, referred to the outpatient clinic of four Dutch hospitals, were prospectively randomized between cardiac CT and functional testing (2 : 1 ratio). The tiered cardiac CT protocol included a calcium scan followed by CT angiography if the Agatston calcium score was between 1 and 400. Patients with test-specific contraindications were not excluded from study participation. By 1 year, fewer patients randomized to cardiac CT reported anginal complaints (P = 0.012). The cumulative radiation dose was slightly higher in the CT group (6.6 ± 8.7 vs. 6.1 ± 9.3 mSv; P < 0.0001). After 1.2 years, event-free survival was 96.7% for patients randomized to CT and 89.8% for patients randomized to functional testing (P = 0.011). After CT, the final diagnosis was established sooner (P < 0.0001), and additional downstream testing was required less frequently (25 vs. 53%, P < 0.0001), resulting in lower cumulative diagnostic costs (€369 vs. €440; P < 0.0001). For patients with suspected stable CAD, a tiered cardiac CT protocol offers an effective and safe alternative to functional testing. Incorporating the calcium scan into the diagnostic workup was safe and lowered diagnostic expenses and radiation exposure.

  6. The RANDOM computer program: A linear congruential random number generator

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) The RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) The RANCYCLE and the ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
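    An LCG of the kind the report tests produces the sequence x_{n+1} = (a*x_n + c) mod m. A minimal Python sketch (the multiplier, increment, and modulus below are the classic ANSI C rand() constants, used here only as placeholders; the report selects its own parameters for the IBM PC):

    ```python
    class LCG:
        """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m."""

        def __init__(self, seed, a=1103515245, c=12345, m=2**31):
            self.a, self.c, self.m = a, c, m
            self.state = seed % m

        def next_int(self):
            """Advance the recurrence and return the new state."""
            self.state = (self.a * self.state + self.c) % self.m
            return self.state

        def next_float(self):
            """Scale the state to a uniform value in [0, 1)."""
            return self.next_int() / self.m

    gen = LCG(seed=42)
    values = [gen.next_float() for _ in range(5)]
    ```

    The full period m is achieved only for parameter choices satisfying the Hull-Dobell conditions; checking properties like cycle length is the kind of computational assistance the RANCYCLE and ARITH programs provide.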

  7. Glass transition of a particle in a random potential, front selection in nonlinear renormalization group, and entropic phenomena in Liouville and sinh-Gordon models

    NASA Astrophysics Data System (ADS)

    Carpentier, David; Le Doussal, Pierre

    2001-02-01

    We study via renormalization group (RG), numerics, exact bounds, and qualitative arguments the equilibrium Gibbs measure of a particle in a d-dimensional Gaussian random potential with translationally invariant logarithmic spatial correlations. We show that for any d>=1 it exhibits a transition at T=Tc>0. The low-temperature glass phase has a nontrivial structure, being dominated by a few distant states (with replica symmetry breaking phenomenology). In finite dimension this transition exists only in this ``marginal glass'' case (energy fluctuation exponent θ=0) and disappears if correlations grow faster (single ground-state dominance θ>0) or slower (high-temperature phase). The associated extremal statistics problem for correlated energy landscapes exhibits universal features which we describe using a nonlinear Kolmogorov (KPP) RG equation. These include the tails of the distribution of the minimal energy (or free energy) and the finite-size corrections, which are universal. The glass transition is closely related to Derrida's random energy models. In d=2, the connection between this problem and Liouville and sinh-Gordon models is discussed. The glass transition of the particle exhibits interesting similarities with the weak- to strong-coupling transition in Liouville (c=1 barrier) and with a transition that we conjecture for the sinh-Gordon model, with correspondence in some exact results and RG analysis. Glassy freezing of the particle is associated with the generation under RG of new local operators and of nonsmooth configurations in Liouville. Applications to Dirac fermions in random magnetic fields at criticality reveal a peculiar ``quasilocalized'' regime (corresponding to the glass phase for the particle), where eigenfunctions are concentrated over a finite number of distant regions, and allow us to recover the multifractal spectrum in the delocalized regime.

  8. Selected CD133⁺ progenitor cells to promote angiogenesis in patients with refractory angina: final results of the PROGENITOR randomized trial.

    PubMed

    Jimenez-Quevedo, Pilar; Gonzalez-Ferrer, Juan Jose; Sabate, Manel; Garcia-Moll, Xavier; Delgado-Bolton, Roberto; Llorente, Leopoldo; Bernardo, Esther; Ortega-Pozzi, Aranzazu; Hernandez-Antolin, Rosana; Alfonso, Fernando; Gonzalo, Nieves; Escaned, Javier; Bañuelos, Camino; Regueiro, Ander; Marin, Pedro; Fernandez-Ortiz, Antonio; Neves, Barbara Das; Del Trigo, Maria; Fernandez, Cristina; Tejerina, Teresa; Redondo, Santiago; Garcia, Eulogio; Macaya, Carlos

    2014-11-07

Refractory angina constitutes a clinical problem. The aim of this study was to assess the safety and the feasibility of transendocardial injection of CD133(+) cells to foster angiogenesis in patients with refractory angina. In this randomized, double-blinded, multicenter controlled trial, eligible patients were treated with granulocyte colony-stimulating factor, underwent an apheresis and electromechanical mapping, and were randomized to receive treatment with CD133(+) cells or no treatment. The primary end point was the safety of transendocardial injection of CD133(+) cells, as measured by the occurrence of major adverse cardiac and cerebrovascular events at 6 months. Secondary end points analyzed the efficacy. Twenty-eight patients were included (n=19 treatment; n=9 control). At 6 months, 1 patient in each group had ventricular fibrillation and 1 patient in each group died. One patient (treatment group) had a cardiac tamponade during mapping. There were no significant differences between groups with respect to efficacy parameters; however, the comparison within groups showed a significant improvement in the number of angina episodes per month (median absolute difference, -8.5 [95% confidence interval, -15.0 to -4.0]) and in angina functional class in the treatment arm but not in the control group. At 6 months, only 1 single-photon emission computed tomography (SPECT) parameter, the summed score, improved significantly in the treatment group at rest and at stress (median absolute difference, -1.0 [95% confidence interval, -1.9 to -0.1]) but not in the control arm. Our findings support the feasibility and safety of transendocardial injection of CD133(+) cells in patients with refractory angina. The promising clinical results and favorable data observed in the SPECT summed score may set up the basis to test the efficacy of cell therapy in a larger randomized trial.

  9. Glass transition of a particle in a random potential, front selection in nonlinear renormalization group, and entropic phenomena in Liouville and sinh-Gordon models.

    PubMed

    Carpentier, D; Le Doussal, P

    2001-02-01

We study via renormalization group (RG), numerics, exact bounds, and qualitative arguments the equilibrium Gibbs measure of a particle in a d-dimensional Gaussian random potential with translationally invariant logarithmic spatial correlations. We show that for any d>=1 it exhibits a transition at T=Tc>0. The low-temperature glass phase has a nontrivial structure, being dominated by a few distant states (with replica symmetry breaking phenomenology). In finite dimension this transition exists only in this "marginal glass" case (energy fluctuation exponent θ=0) and disappears if correlations grow faster (single ground-state dominance θ>0) or slower (high-temperature phase). The associated extremal statistics problem for correlated energy landscapes exhibits universal features which we describe using a nonlinear Kolmogorov (KPP) RG equation. These include the tails of the distribution of the minimal energy (or free energy) and the finite-size corrections, which are universal. The glass transition is closely related to Derrida's random energy models. In d=2, the connection between this problem and Liouville and sinh-Gordon models is discussed. The glass transition of the particle exhibits interesting similarities with the weak- to strong-coupling transition in Liouville (c=1 barrier) and with a transition that we conjecture for the sinh-Gordon model, with correspondence in some exact results and RG analysis. Glassy freezing of the particle is associated with the generation under RG of new local operators and of nonsmooth configurations in Liouville. Applications to Dirac fermions in random magnetic fields at criticality reveal a peculiar "quasilocalized" regime (corresponding to the glass phase for the particle), where eigenfunctions are concentrated over a finite number of distant regions, and allow us to recover the multifractal spectrum in the delocalized regime.

  10. Similar delivery rates in a selected group of patients, for day 2 and day 5 embryos both cultured in sequential medium: a randomized study.

    PubMed

    Emiliani, Serena; Delbaere, Anne; Vannin, Anne-Sophie; Biramane, Jamila; Verdoodt, Miranda; Englert, Yvon; Devreker, Fabienne

    2003-10-01

    The existence of a real benefit of blastocyst transfer is still a matter of debate. The aim of this study was to compare, in a prospective randomized trial, the outcome of day 2 and day 5 transfer of human embryos cultured in an 'in-house' sequential medium. A total of 193 cycles from 171 patients with less than four previous IVF cycles, <39 years old and with four or more zygotes on day 1, were randomly allocated to day 2 (94 cycles) or day 5 (99 cycles) transfer. Zygotes were kept in fertilization medium until 18 h post-fertilization and then placed in a 'glucose-free' cleavage medium. Embryos allocated for day 5 transfer were placed in a blastocyst medium 66 h post-fertilization. Two or three embryos were replaced according to the morphology. A mean (+/- SEM) number of 2.1 +/- 0.4 and 1.9 +/- 0.3 embryos were replaced on day 2 and day 5 (P < 0.001) respectively. Delivery rates per transfer were 44.1 and 37.1% [P = not significant (NS)], implantation rates were 31.4 and 29.4% (NS) and multiple delivery rates 22 and 36% (NS) for day 2 and day 5 groups respectively. Ten patients (10.1%) had no blastocysts available for transfer. No clear benefits were observed using blastocyst transfer for patients aged <39 years who had had less than four previous IVF cycle attempts.

  11. Utility and sensitivity of the sore throat pain model: results of a randomized controlled trial on the COX-2 selective inhibitor valdecoxib.

    PubMed

    Schachtel, Bernard P; Pan, Sharon; Kohles, Joseph D; Sanner, Kathleen M; Schachtel, Emily P; Bey, Mary

    2007-07-01

The sore throat pain model was employed in this randomized, placebo-controlled trial to examine the sensitivity of the model in testing the efficacy of valdecoxib as an acute analgesic drug. Changes were made to the study design by employing a different diagnostic index for tonsillo-pharyngitis, a different rating scale (derived from Lasagna's pain thermometer), and alternative analyses, individual responder rates. Under double-blind conditions, 197 patients with painful pharyngitis were randomly allocated to valdecoxib 20 mg bid (n = 65), valdecoxib 40 mg qd (n = 66), or placebo (n = 66) for 24 hours. The expanded Tonsillo-Pharyngitis Assessment and the Lasagna Pain Scale were validated as sensitive study instruments. Both dosage regimens provided significantly greater pain relief compared with placebo on standard efficacy measures over the 24-hour study (all P < .05). Tests for individual response (e.g., percentage of patients with at least moderate relief) confirmed these results, and other response rates identified the high sensitivity of the model itself (e.g., only 5% of placebo-treated patients achieved ≥50% of maximum total pain relief over 6 hours). These findings indicate that sore throat is a sensitive model to assess analgesic efficacy.

  12. Selection of Patients and Anesthetic Types for Endovascular Treatment in Acute Ischemic Stroke: A Meta-Analysis of Randomized Controlled Trials

    PubMed Central

    Ouyang, Fubing; Chen, Yicong; Zhao, Yuhui; Dang, Ge; Liang, Jiahui; Zeng, Jinsheng

    2016-01-01

Background and Purpose: Recent randomized controlled trials have demonstrated consistent effectiveness of endovascular treatment (EVT) for acute ischemic stroke, leading to update on stroke management guidelines. We conducted this meta-analysis to assess the efficacy and safety of EVT overall and in subgroups stratified by age, baseline stroke severity, brain imaging feature, and anesthetic type. Methods: Published randomized controlled trials comparing EVT and standard medical care alone were evaluated. The measured outcomes were 90-day functional independence (modified Rankin Scale ≤2), all-cause mortality, and symptomatic intracranial hemorrhage. Results: Nine trials enrolling 2476 patients were included (1338 EVT, 1138 standard medical care alone). For patients with large vessel occlusions confirmed by noninvasive vessel imaging, EVT yielded improved functional outcome (pooled odds ratio [OR], 2.02; 95% confidence interval [CI], 1.64–2.50), lower mortality (OR, 0.75; 95% CI, 0.58–0.97), and similar symptomatic intracranial hemorrhage rate (OR, 1.12; 95% CI, 0.72–1.76) compared with standard medical care. A higher proportion of functional independence was seen in patients with terminus intracranial artery occlusion (±M1) (OR, 3.16; 95% CI, 1.64–6.06), baseline Alberta Stroke Program Early CT score of 8–10 (OR, 2.11; 95% CI, 1.25–3.57) and age ≤70 years (OR, 3.01; 95% CI, 1.73–5.24). EVT performed under conscious sedation had better functional outcomes (OR, 2.08; 95% CI, 1.47–2.96) without increased risk of symptomatic intracranial hemorrhage or short-term mortality compared with general anesthesia. Conclusions: Vessel-imaging proven large vessel occlusion, a favorable scan, and younger age are useful predictors to identify anterior circulation stroke patients who may benefit from EVT. Conscious sedation is feasible and safe in EVT based on available data. However, firm conclusion on the choice of anesthetic types should be drawn from more
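    As a generic illustration of how trial-level odds ratios such as those above are combined (this is a standard fixed-effect inverse-variance sketch with made-up 2×2 tables, not the authors' actual method or data):

    ```python
    import math

    def pooled_or_fixed(tables):
        """Fixed-effect (inverse-variance) pooling of odds ratios.

        Each table is (a, b, c, d): treatment events/non-events,
        control events/non-events. Returns the pooled OR and 95% CI.
        """
        num = den = 0.0
        for a, b, c, d in tables:
            log_or = math.log((a * d) / (b * c))
            var = 1 / a + 1 / b + 1 / c + 1 / d  # variance of log(OR)
            weight = 1.0 / var                   # inverse-variance weight
            num += weight * log_or
            den += weight
        mean = num / den
        se = math.sqrt(1.0 / den)
        return (math.exp(mean),
                (math.exp(mean - 1.96 * se), math.exp(mean + 1.96 * se)))

    # Two hypothetical trials, each favoring treatment.
    or_hat, (lo, hi) = pooled_or_fixed([(40, 60, 25, 75), (55, 45, 35, 65)])
    ```

    In practice a meta-analysis would also report heterogeneity statistics and often a random-effects model; the sketch above shows only the weighting idea behind a pooled OR and its confidence interval.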

  13. Three lessons from a randomized trial of massage and meditation at end of life: patient benefit, outcome measure selection, and design of trials with terminally ill patients.

    PubMed

    Downey, Lois; Engelberg, Ruth A; Standish, Leanna J; Kozak, Leila; Lafferty, William E

    2009-01-01

    Improving end-of-life care is a priority in the United States, but assigning priorities for standard care services requires evaluations using appropriate study design and appropriate outcome indicators. A recent randomized controlled trial with terminally ill patients produced no evidence of benefit from massage or guided meditation, when evaluated with measures of global quality of life or pain distress over the course of patient participation. However, reanalysis using a more targeted outcome, surrogates' assessment of patients' benefit from the study intervention, suggested significant gains from massage-the treatment patients gave their highest preassignment preference ratings. The authors conclude that adding a menu of complementary therapies as part of standard end-of-life care may yield significant benefit, that patient preference is an important predictor of outcome, and that modifications in trial design may be appropriate for end-of-life studies.

  14. A double-blind, placebo-controlled randomized trial of Serratulae quinquefoliae folium, a new source of β-arbutin, in selected skin hyperpigmentations.

    PubMed

    Morag, Monika; Nawrot, Joanna; Siatkowski, Idzi; Adamski, Zygmunt; Fedorowicz, Tomasz; Dawid-Pac, Renata; Urbanska, Maria; Nowak, Gerard

    2015-09-01

Arbutin is one of the most effective lightening substances. Serratula quinquefolia is a new source of its β-anomer. The HPLC method showed that the solid content of this compound in the dried plant raw material accounts for 6.86%. The leaves of Serratula quinquefolia do not contain hydroquinone. To assess the efficacy of the aqueous extract from the leaf of five-leaf serratula as a skin-lightening agent, we did a randomized, placebo-controlled, double-blind trial. The study involved 102 women aged 26-55, with two kinds of hyperpigmentary diseases: melasma and lentigo solaris. Patients were randomly assigned to one of the treatment groups: a study group (N = 54) or a control group (N = 48). The study group applied the cream with the aqueous extract from the leaf of five-leaf serratula containing 2.51% of arbutin. The cream was applied twice a day on the discolored side for 8 weeks. The experimental data showed that the cream with the extract causes a decreased level of melanin in the skin pigmentation spot. A clinical effect in the form of lightening and evening skin tone on the discolored side was observed in 75.86% of the female patients with melasma and 56.00% of the female patients with lentigo solaris. The cream with the aqueous extract from the leaf of five-leaf serratula proved to be an effective and safe preparation for lightening skin discolorations (66.67% of the female patients in the study group).

  15. Selection of peripheral intravenous catheters with 24-gauge side-holes versus those with 22-gauge end-hole for MDCT: A prospective randomized study.

    PubMed

    Tamura, Akio; Kato, Kenichi; Kamata, Masayoshi; Suzuki, Tomohiro; Suzuki, Michiko; Nakayama, Manabu; Tomabechi, Makiko; Nakasato, Tatsuhiko; Ehara, Shigeru

    2017-02-01

To compare the 24-gauge side-holes catheter and the conventional 22-gauge end-hole catheter in terms of safety, injection pressure, and contrast enhancement on multi-detector computed tomography (MDCT). In a randomized single-center study, 180 patients were randomized to either the 24-gauge side-holes catheter or the 22-gauge end-hole catheter group. The primary endpoint was safety during intravenous administration of contrast material for MDCT, using a non-inferiority analysis (lower limit of the 95% CI greater than the -10% non-inferiority margin for the group difference). The secondary endpoints were injection pressure and contrast enhancement. A total of 174 patients were analyzed for safety during intravenous contrast material administration for MDCT. The overall extravasation rate was 1.1% (2/174 patients); 1 (1.2%) minor episode occurred in the 24-gauge side-holes catheter group and 1 (1.1%) in the 22-gauge end-hole catheter group (difference: 0.1%, 95% CI: -3.17% to 3.28%, non-inferiority P=1). The mean maximum pressure was higher with the 24-gauge side-holes catheter than with the 22-gauge end-hole catheter (8.16±0.95 kg/cm² vs. 4.79±0.63 kg/cm², P<0.001). The mean contrast enhancement of the abdominal aorta, celiac artery, superior mesenteric artery, and pancreatic parenchyma in the two groups was not significantly different. In conclusion, our study showed that the 24-gauge side-holes catheter is safe and suitable for delivering iodine at a concentration of 300 mg/mL at a flow rate of 3 mL/s, and it may contribute to the care of some patients, such as those who have fragile and small veins. (Trial registration: UMIN000023727).

  16. Efficacy, Safety, Pharmacokinetics, and Pharmacodynamics of Filgotinib, a Selective JAK-1 Inhibitor, After Short-Term Treatment of Rheumatoid Arthritis: Results of Two Randomized Phase IIa Trials.

    PubMed

    Vanhoutte, Frédéric; Mazur, Minodora; Voloshyn, Oleksandr; Stanislavchuk, Mykola; Van der Aa, Annegret; Namour, Florence; Galien, René; Meuleners, Luc; van 't Klooster, Gerben

    2017-06-16

    JAK inhibitors have shown efficacy in rheumatoid arthritis (RA). We undertook this study to test our hypothesis that selective inhibition of JAK-1 would combine good efficacy with a better safety profile compared with less selective JAK inhibitors. In two 4-week exploratory, double-blind, placebo-controlled phase IIa trials, 127 RA patients with an insufficient response to methotrexate (MTX) received filgotinib (GLPG0634, GS-6034) oral capsules (100 mg twice daily or 30, 75, 150, 200, or 300 mg once daily) or placebo, added onto a stable regimen of MTX, to evaluate safety, efficacy, pharmacokinetics (PK), and pharmacodynamics (PD) of filgotinib. The primary efficacy end point was the number and percentage of patients in each treatment group meeting the American College of Rheumatology 20% improvement criteria (achieving an ACR20 response) at week 4. Treatment with filgotinib at 75-300 mg met the primary end point and showed early onset of efficacy. ACR20 response rates progressively increased to week 4, and the Disease Activity Score in 28 joints using the C-reactive protein (CRP) level decreased. Marked and sustained improvements were observed in serum CRP level and other PD markers. The PK of filgotinib and its major metabolite was dose proportional over the 30-300 mg range. Early side effects seen with other less selective JAK inhibitors were not observed (e.g., there was no worsening of anemia [JAK-2 inhibition related], no effects on liver transaminases, and no increase in low-density lipoprotein or total cholesterol). A limited decrease in neutrophils without neutropenia was consistent with immunomodulatory effects through JAK-1 inhibition. There were no infections. Overall, filgotinib was well tolerated. Events related to study drug were mild or moderate and transient during therapy, and the most common such event was nausea. Selective inhibition of JAK-1 with filgotinib shows initial efficacy in RA with an encouraging safety profile in these exploratory

  17. Decision-making after continuous wins or losses in a randomized guessing task: implications for how the prior selection results affect subsequent decision-making

    PubMed Central

    2014-01-01

    Background Human decision-making is often affected by prior selections and their outcomes, even in situations where decisions are independent and outcomes are unpredictable. Methods In this study, we created a task that simulated real-life non-strategic gambling to examine the effect of prior outcomes on subsequent decisions in a group of male college students. Results Behavioral performance showed that participants needed more time to react after continuous losses (LOSS) than continuous wins (WIN) and discontinuous outcomes (CONTROL). In addition, participants were more likely to repeat their selections in both WIN and LOSS conditions. Functional MRI data revealed that decisions in WINs were associated with increased activation in the mesolimbic pathway, but decreased activation in the inferior frontal gyrus relative to LOSS. Increased prefrontal cortical activation was observed during LOSS relative to WIN and CONTROL conditions. Conclusion Taken together, the behavioral and neuroimaging findings suggest that participants tended to repeat previous selections during LOSS trials, a pattern resembling the gambler’s fallacy. However, during WIN trials, participants tended to follow their previous lucky decisions, like the ‘hot hand’ fallacy. PMID:24708897

  18. [Evaluation of the therapeutic efficacy and safety of the selective anxiolytic afobazole in generalized anxiety disorder and adjustment disorders: Results of a multicenter randomized comparative study of diazepam].

    PubMed

    Syunyakov, T S; Neznamov, G G

To summarize the previously published results of a multicenter randomized phase III clinical trial of afobazole (INN: fabomotizole) versus diazepam in the treatment of patients with generalized anxiety disorder (GAD) and adjustment disorders (AD). Five investigating centers included 150 patients aged 18 to 60 years (60 patients with GAD and 90 with AD) with a simple structure of anxiety disorders without concurrent mental, neurological or somatic disorders. Patients were randomized to take afobazole (30 mg/day; n=100) or diazepam (30 mg/day; n=50) for 30 days. Prior to drug administration, patients susceptible to placebo were excluded according to the results of its 7-day use. Withdrawal syndrome was evaluated within 10 days after completion of active therapy. The primary efficacy endpoint was the change of Hamilton Anxiety Rating Scale (HAMA) total score. The scores of the Clinical Global Impression (CGI) Scale and the Sheehan Scale were analyzed as secondary efficacy endpoints. Drug safety was evaluated by assessment of adverse events. Afobazole and diazepam caused a significant reduction of HAMA total score. In the afobazole group, the reduction of anxiety exceeded that in the diazepam group (the difference in the total score changes was 2.93 [0.67; 5.19]; p=0.01). The proportion of patients with reduction of disease severity was 72% in the afobazole group and 58% in the diazepam group. After therapy completion, the proportion of patients with no or mild disorder in the afobazole group was significantly higher than that in the diazepam group (69 and 44%, respectively; χ2=12.46; p=0.014). There was a trend toward a higher subjective patient-rated estimate of the afobazole effect using the Sheehan scale. There were a total of 15 and 199 adverse events in the afobazole and diazepam groups, respectively. No manifestations of afobazole withdrawal syndrome were found. Diazepam withdrawal syndrome was observed in 34 (68%) patients. Afobazole is an

  19. Randomized Comparison of Selective Internal Radiotherapy (SIRT) Versus Drug-Eluting Bead Transarterial Chemoembolization (DEB-TACE) for the Treatment of Hepatocellular Carcinoma

    SciTech Connect

    Pitton, Michael B. Kloeckner, Roman; Ruckes, Christian; Wirth, Gesine M.; Eichhorn, Waltraud; Wörns, Marcus A.; Weinmann, Arndt; Schreckenberger, Mathias; Galle, Peter R.; Otto, Gerd; Dueber, Christoph

    2015-04-15

Purpose: To prospectively compare SIRT and DEB-TACE for treating hepatocellular carcinoma (HCC). Methods: From 04/2010 to 07/2012, 24 patients with histologically proven unresectable N0, M0 HCCs were randomized 1:1 to receive SIRT or DEB-TACE. SIRT could be repeated once in case of recurrence, while TACE was repeated every 6 weeks until no viable tumor tissue was detected by MRI or contraindications prohibited further treatment. Patients were followed up by MRI every 3 months; the final evaluation was 05/2013. Results: Both groups were comparable in demographics (SIRT: 8 males/4 females, mean age 72 ± 7 years; TACE: 10 males/2 females, mean age 71 ± 9 years), initial tumor load (1 patient ≥25% in each group), and BCLC (Barcelona Clinic Liver Cancer) stage (SIRT: 12×B; TACE: 1×A, 11×B). Median progression-free survival (PFS) was 180 days for SIRT versus 216 days for TACE patients (p = 0.6193), with a median TTP of 371 days versus 336 days, respectively (p = 0.5764). Median OS was 592 days for SIRT versus 788 days for TACE patients (p = 0.9271). Seven patients died in each group. Causes of death were liver failure (n = 4, SIRT group), tumor progression (n = 4, TACE group), cardiovascular events, and inconclusive (n = 1 in each group). Conclusions: No significant differences were found in median PFS, OS, and TTP. The lower rate of tumor progression in the SIRT group was nullified by a greater incidence of liver failure. This pilot study is the first prospective randomized trial comparing SIRT and TACE for treating HCC, and results can be used for sample size calculations of future studies.

  20. Selective CO2 Sequestration with Monolithic Bimodal Micro/Macroporous Carbon Aerogels Derived from Stepwise Pyrolytic Decomposition of Polyamide-Polyimide-Polyurea Random Copolymers.

    PubMed

    Saeed, Adnan M; Rewatkar, Parwani M; Majedi Far, Hojat; Taghvaee, Tahereh; Donthula, Suraj; Mandal, Chandana; Sotiriou-Leventis, Chariklia; Leventis, Nicholas

    2017-04-19

    Polymeric aerogels (PA-xx) were synthesized via room-temperature reaction of an aromatic triisocyanate (tris(4-isocyanatophenyl) methane) with pyromellitic acid. Using solid-state CPMAS (13)C and (15)N NMR, it was found that the skeletal framework of PA-xx was a statistical copolymer of polyamide, polyurea, polyimide, and of the primary condensation product of the two reactants, a carbamic-anhydride adduct. Stepwise pyrolytic decomposition of those components yielded carbon aerogels with both open and closed microporosity. The open micropore surface area increased from <15 m(2) g(-1) in PA-xx to 340 m(2) g(-1) in the carbons. Next, reactive etching at 1,000 °C with CO2 opened access to the closed pores and the micropore area increased by almost 4× to 1150 m(2) g(-1) (out of 1750 m(2) g(-1) of a total BET surface area). At 0 °C, etched carbon aerogels demonstrated a good balance of adsorption capacity for CO2 (up to 4.9 mmol g(-1)), and selectivity toward other gases (via Henry's law). The selectivity for CO2 versus H2 (up to 928:1) is suitable for precombustion fuel purification. Relevant to postcombustion CO2 capture and sequestration (CCS), the selectivity for CO2 versus N2 was in the 17:1 to 31:1 range. In addition to typical factors involved in gas sorption (kinetic diameters, quadrupole moments and polarizabilities of the adsorbates), it is also suggested that CO2 is preferentially engaged by surface pyridinic and pyridonic N on carbon (identified with XPS) in an energy-neutral surface reaction. Relatively high uptake of CH4 (2.16 mmol g(-1) at 0 °C/1 bar) was attributed to its low polarizability, and that finding paves the way for further studies on adsorption of higher (i.e., more polarizable) hydrocarbons. Overall, high CO2 selectivities, in combination with attractive CO2 adsorption capacities, low monomer cost, and the innate physicochemical stability of carbon render the materials of this study reasonable candidates for further practical

  1. Randomized comparison of selective serotonin reuptake inhibitor (escitalopram) monotherapy and antidepressant combination pharmacotherapy for major depressive disorder with melancholic features: a CO-MED report.

    PubMed

    Bobo, William V; Chen, Helen; Trivedi, Madhukar H; Stewart, Jonathan W; Nierenberg, Andrew A; Fava, Maurizio; Kurian, Benji T; Warden, Diane; Morris, David W; Luther, James F; Husain, Mustafa M; Cook, Ian A; Lesser, Ira M; Kornstein, Susan G; Wisniewski, Stephen R; Rush, A John; Shelton, Richard C

    2011-10-01

    The clinical effects of antidepressant combinations vs. monotherapy as initial treatment for major depression with melancholic features (MDD-MF) are unknown. Outpatients with chronic or recurrent major depression (MDD) were randomized to initial treatment with escitalopram+placebo (the MONO condition), bupropion sustained-release+escitalopram, or venlafaxine extended-release+mirtazapine (the COMB conditions) in the Combining Medications to Enhance Depression Outcomes (CO-MED) trial. Secondary data analyses were conducted to compare demographic and clinical characteristics, and to contrast clinical responses according to drug treatment, in patients with MDD-MF (n=124) and non-melancholic MDD (n=481). While numerically lower, remission rates in MDD-MF did not differ significantly from those in non-melancholic MDD at either 12 (33.1% vs. 41.0%, aOR=1.16, p=0.58) or 28 (39.5% vs. 46.8%, aOR=1.02, p=0.93) weeks of treatment. Remission rates did not differ significantly between combination and monotherapy groups in either MDD-MF or non-melancholic MDD patients at either time point. Similar conclusions were reached for response rates, premature study discontinuation, and self-rated depression symptom severity. This is a secondary analysis of data from the CO-MED trial, which was not designed to address differential treatment response in melancholic and non-melancholic MDD. We found no evidence of differential remission or response rates to antidepressant combination or monotherapy between melancholic/non-melancholic MDD patients, or according to antidepressant treatment group, after 12 and 28 weeks. Melancholic features may not be a valid predictor of a more favorable response to antidepressant combination therapy as initial treatment. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. Do Verbal and Tactile Cueing Selectively Alter Gluteus Maximus and Hamstring Recruitment During a Supine Bridging Exercise in Active Females? A Randomized Controlled Trial.

    PubMed

    Hollman, John H; Berling, Tyler A; Crum, Ellen O; Miller, Kelsie M; Simmons, Brent T; Youdas, James W

    2017-01-25

    Hip extension with hamstring- rather than gluteus maximus-dominant recruitment may increase anterior femoroacetabular forces and contribute to conditions that cause hip pain. Cueing methods during hip extension exercises may facilitate greater gluteus maximus recruitment. We examined whether specific verbal and tactile cues facilitate gluteus maximus recruitment while inhibiting hamstring recruitment during a bridging exercise. Randomized controlled trial. Biomechanics laboratory. 30 young adult women [age 24 (3) years, BMI 22.2 (2.4) kg/m²]. Participants were tested over 2 sessions, 1 week apart, while performing 5 repetitions of a bridging exercise. At their second visit, participants in the experimental group received verbal and tactile cues intended to facilitate gluteus maximus recruitment and inhibit hamstring recruitment. Control group participants received no additional cues beyond the original instructions. Gluteus maximus and hamstring recruitment were measured with surface electromyography, normalized to maximal voluntary isometric contractions (MVICs). Gluteus maximus recruitment was unchanged in the control group and increased from 16.8 to 33.0% MVIC in the cueing group (F = 33.369, P < .001). Hamstring recruitment was unchanged in the control group but increased from 16.5 to 29.8% MVIC in the cueing group (F = 6.400, P = .018). The effect size of the change in gluteus maximus recruitment in the cueing group (Cohen's d = 1.5, 95% CI = 0.9 to 2.2) was not significantly greater than the effect size for hamstring recruitment (Cohen's d = 0.8, 95% CI = 0.1 to 1.5). Verbal and tactile cues hypothesized to facilitate gluteus maximus recruitment yielded comparable increases in both gluteus maximus and hamstring recruitment. If one intends to promote hip extension by facilitating gluteus maximus recruitment while inhibiting hamstring recruitment during bridging exercises, the cueing methods employed in this study may not produce the desired effects.

  3. A Randomized Trial of a Low Trapping Non-Selective N-methyl-D-aspartate (NMDA) Channel Blocker in Major Depression

    PubMed Central

    Zarate, Carlos A; Mathews, Daniel; Ibrahim, Lobna; Chaves, Jose Franco; Marquardt, Craig; Ukoh, Immaculata; Jolkovsky, Libby; Brutsche, Nancy E; Smith, Mark A.; Luckenbaugh, David A

    2012-01-01

    Background The high-affinity N-methyl-D-aspartate (NMDA) antagonist ketamine exerts rapid antidepressant effects but has psychotomimetic properties. AZD6765 is a low-trapping NMDA channel blocker with low rates of associated psychotomimetic effects. This study investigated whether AZD6765 could produce rapid antidepressant effects in subjects with treatment-resistant major depressive disorder (MDD). Methods In this double-blind, randomized, crossover, placebo-controlled study, 22 subjects with DSM-IV treatment-resistant MDD received a single infusion of either AZD6765 (150 mg) or placebo on two test days one week apart. The primary outcome measure was the Montgomery-Asberg Depression Rating Scale (MADRS), which was used to rate overall depressive symptoms at baseline; at 60, 80, 110, and 230 minutes post-infusion; and on Days 1, 2, 3, and 7 post-infusion. Several secondary outcome measures were also used, including the Hamilton Depression Rating Scale (HDRS). Results Within 80 minutes, MADRS scores significantly improved in subjects receiving AZD6765 compared to placebo; this improvement remained significant only through 110 minutes (d=0.40). On the HDRS, a drug difference was found at 80 and 110 minutes and at Day 2 (d=0.49). Overall, 32% of subjects responded to AZD6765 and 15% responded to placebo at some point during the trial. No difference was observed between the groups with regard to psychotomimetic or dissociative adverse effects. Conclusions In patients with treatment-resistant MDD, a single intravenous dose of the low-trapping NMDA channel blocker AZD6765 was associated with rapid but short-lived antidepressant effects; no psychotomimetic effects were observed. PMID:23206319

  4. Randomization and sampling issues

    USGS Publications Warehouse

    Geissler, P.H.

    1996-01-01

    The need for randomly selected routes and other sampling issues have been debated by the Amphibian electronic discussion group. Many excellent comments have been made, pro and con, but we have not reached consensus yet. This paper brings those comments together and attempts a synthesis. I hope that the resulting discussion will bring us closer to a consensus.

  5. Enhanced mRNA-protein fusion efficiency of a single-domain antibody by selection of mRNA display with additional random sequences in the terminal translated regions

    PubMed Central

    Takahashi, Kazuki; Sunohara, Masato; Terai, Takuya; Kumachi, Shigefumi; Nemoto, Naoto

    2017-01-01

    In vitro display technologies such as mRNA and cDNA display are powerful tools to create and select functional peptides. However, in some cases, efficiency of mRNA-protein fusion is very low, which results in decreased library size and lower chance of successful selection. In this study, to improve mRNA-protein fusion efficiency, we prepared an mRNA display library of a protein with random N- and C-terminal coding regions consisting of 12 nucleotides (i.e. four amino acids), and performed an electrophoresis mobility shift assay (EMSA)-based selection of successfully formed mRNA display molecules. A single-domain antibody (Nanobody, or VHH) was used as a model protein, and as a result, a pair of sequences was identified that increased mRNA-protein fusion efficiency of this protein by approximately 20%. Interestingly, enhancement of the fusion efficiency induced by the identified sequences was protein-specific, and different results were obtained for other proteins including VHHs with different CDRs. The results suggested that conformation of mRNA as a whole, rather than the amino acid sequence of the translated peptide, is an important factor to determine mRNA-protein fusion efficiency. PMID:28275529

  6. Enhanced mRNA-protein fusion efficiency of a single-domain antibody by selection of mRNA display with additional random sequences in the terminal translated regions.

    PubMed

    Takahashi, Kazuki; Sunohara, Masato; Terai, Takuya; Kumachi, Shigefumi; Nemoto, Naoto

    2017-01-01

    In vitro display technologies such as mRNA and cDNA display are powerful tools to create and select functional peptides. However, in some cases, efficiency of mRNA-protein fusion is very low, which results in decreased library size and lower chance of successful selection. In this study, to improve mRNA-protein fusion efficiency, we prepared an mRNA display library of a protein with random N- and C-terminal coding regions consisting of 12 nucleotides (i.e. four amino acids), and performed an electrophoresis mobility shift assay (EMSA)-based selection of successfully formed mRNA display molecules. A single-domain antibody (Nanobody, or VHH) was used as a model protein, and as a result, a pair of sequences was identified that increased mRNA-protein fusion efficiency of this protein by approximately 20%. Interestingly, enhancement of the fusion efficiency induced by the identified sequences was protein-specific, and different results were obtained for other proteins including VHHs with different CDRs. The results suggested that conformation of mRNA as a whole, rather than the amino acid sequence of the translated peptide, is an important factor to determine mRNA-protein fusion efficiency.

  7. Promoting mobility after hip fracture (ProMo): study protocol and selected baseline results of a year-long randomized controlled trial among community-dwelling older people

    PubMed Central

    2011-01-01

    Background To cope at home, community-dwelling older people surviving a hip fracture need a sufficient amount of functional ability and mobility. There is a lack of evidence on the best practices supporting recovery after hip fracture. The purpose of this article is to describe the design, intervention and demographic baseline results of a study investigating the effects of a rehabilitation program aiming to restore mobility and functional capacity among community-dwelling participants after hip fracture. Methods/Design A population-based sample of community-dwelling men and women over 60 years of age operated on for hip fracture (n = 81, mean age 79 years, 78% women) participated in this study and was randomly allocated into control (Standard Care) and ProMo intervention groups on average 10 weeks post fracture and 6 weeks after discharge home. Standard Care included a written home exercise program with 5-7 exercises for the lower limbs. Of all participants, 12 received a referral to physiotherapy. After discharge home, only 50% adhered to Standard Care. No participants were followed up for Standard Care adherence or mobility recovery. The ProMo intervention included Standard Care and a year-long program including evaluation/modification of environmental hazards, guidance for safe walking, pain management, a progressive home exercise program and physical activity counseling. Measurements included a comprehensive battery of laboratory tests and self-reports on mobility limitation, disability, physical functional capacity and health, as well as assessments of the key prerequisites for mobility, disability and functional capacity. All assessments were performed blinded at the research laboratory. No significant differences were observed between intervention and control groups in any of the demographic variables. Discussion Ten weeks post hip fracture, only half of the participants were compliant with Standard Care, and no follow-up for Standard Care or mobility recovery occurred

  8. Embolization of the Gastroduodenal Artery Before Selective Internal Radiotherapy: A Prospectively Randomized Trial Comparing Platinum-Fibered Microcoils with the Amplatzer Vascular Plug II

    SciTech Connect

    Pech, Maciej Kraetsch, Annett; Wieners, Gero; Redlich, Ulf; Gaffke, Gunnar; Ricke, Jens; Dudeck, Oliver

    2009-05-15

    The Amplatzer Vascular Plug II (AVP II) is a novel device for transcatheter vessel occlusion, for which only limited comparative data exist. Embolotherapy of the gastroduodenal artery (GDA) is essential before selective internal radiotherapy (SIRT) in order to prevent radiation-induced peptic ulcerations due to migration of yttrium-90 microspheres. The purpose of this study was to compare the vascular anatomical limitations, procedure time, effectiveness, and safety of embolization of the GDA with coils versus the AVP II. Fifty patients stratified for SIRT were prospectively randomized for embolization of the GDA with either coils or the AVP II. The angle between the aorta and the celiac trunk, diameter of the GDA, fluoroscopy time and total time for embolization, number of embolization devices, complications, and durability of vessel occlusion at follow-up angiography for SIRT were recorded. A t-test was used for statistical analysis. Embolizations with either coils or the AVP II were technically feasible in all but two patients scheduled for embolization of the GDA with the AVP II. In both cases the plug could not be positioned due to the small celiac trunk outlet angles of 17° and 21°. The mean diameter of the GDA was 3.7 mm (range, 2.2-4.8 mm) for both groups. The procedures differed significantly in fluoroscopy time (7.8 min for coils vs. 2.6 min for the AVP II; P < 0.001) and embolization time (23.1 min for coils vs. 8.8 min for the AVP II; P < 0.001). A mean of 6.0 ± 3.2 coils were used for GDA embolization, whereas no more than one AVP II was needed for successful vessel occlusion (P < 0.001). One coil migration occurred during coil embolization, whereas no procedural complication was encountered with the use of the AVP II. Vessel reperfusion was noted in only one patient, in whom coil embolization was performed. In conclusion, embolization of the GDA with the AVP II is safe, easy, rapid, and highly effective; only an extremely sharp-angled celiac trunk

  9. The selective beta 1-blocking agent metoprolol compared with antithyroid drug and thyroxine as preoperative treatment of patients with hyperthyroidism. Results from a prospective, randomized study.

    PubMed

    Adlerberth, A; Stenström, G; Hasselgren, P O

    1987-02-01

    Despite the increasing use of beta-blocking agents alone as preoperative treatment of patients with hyperthyroidism, there are no controlled clinical studies in which this regimen has been compared with a more conventional preoperative treatment. Thirty patients with newly diagnosed and untreated hyperthyroidism were randomized to preoperative treatment with methimazole in combination with thyroxine (Group I) or the beta 1-blocking agent metoprolol (Group II). Metoprolol was used because it has been demonstrated that the beneficial effect of beta-blockade in hyperthyroidism is mainly due to beta 1-blockade. The preoperative, intraoperative, and postoperative courses in the two groups were compared, and patients were followed up for 1 year after thyroidectomy. At the time of diagnosis, serum concentration of triiodothyronine (T3) was 6.1 ± 0.59 nmol/L in Group I and 5.7 ± 0.66 nmol/L in Group II (reference interval 1.5-3.0 nmol/L). Clinical improvement during preoperative treatment was similar in the two groups of patients, but serum T3 was normalized only in Group I. The median length of preoperative treatment was 12 weeks in Group I and 5 weeks in Group II (p < 0.01). There were no serious adverse effects of the drugs during preoperative preparation in either treatment group. Operating time, consistency and vascularity of the thyroid gland, and intraoperative blood loss were similar in the two groups. No anesthesiologic or cardiovascular complications occurred during operation in either group. One patient in Group I (7%) and three patients in Group II (20%) had clinical signs of hyperthyroid function during the first postoperative day. These symptoms were abolished by the administration of small doses of metoprolol, and no case of thyroid storm occurred. Postoperative hypocalcemia or recurrent laryngeal nerve paralysis did not occur in either group. During the first postoperative year, hypothyroidism developed in two patients in Group I (13%) and in six

  10. SIRFLOX: Randomized Phase III Trial Comparing First-Line mFOLFOX6 (Plus or Minus Bevacizumab) Versus mFOLFOX6 (Plus or Minus Bevacizumab) Plus Selective Internal Radiation Therapy in Patients With Metastatic Colorectal Cancer.

    PubMed

    van Hazel, Guy A; Heinemann, Volker; Sharma, Navesh K; Findlay, Michael P N; Ricke, Jens; Peeters, Marc; Perez, David; Robinson, Bridget A; Strickland, Andrew H; Ferguson, Tom; Rodríguez, Javier; Kröning, Hendrik; Wolf, Ido; Ganju, Vinod; Walpole, Euan; Boucher, Eveline; Tichler, Thomas; Shacham-Shmueli, Einat; Powell, Alex; Eliadis, Paul; Isaacs, Richard; Price, David; Moeslein, Fred; Taieb, Julien; Bower, Geoff; Gebski, Val; Van Buskirk, Mark; Cade, David N; Thurston, Kenneth; Gibbs, Peter

    2016-05-20

    SIRFLOX was a randomized, multicenter trial designed to assess the efficacy and safety of adding selective internal radiation therapy (SIRT) using yttrium-90 resin microspheres to standard fluorouracil, leucovorin, and oxaliplatin (FOLFOX)-based chemotherapy in patients with previously untreated metastatic colorectal cancer. Chemotherapy-naïve patients with liver metastases plus or minus limited extrahepatic metastases were randomly assigned to receive either modified FOLFOX (mFOLFOX6; control) or mFOLFOX6 plus SIRT (SIRT) plus or minus bevacizumab. The primary end point was progression-free survival (PFS) at any site as assessed by independent centralized radiology review blinded to study arm. Between October 2006 and April 2013, 530 patients were randomly assigned to treatment (control, 263; SIRT, 267). Median PFS at any site was 10.2 v 10.7 months in control versus SIRT (hazard ratio, 0.93; 95% CI, 0.77 to 1.12; P = .43). Median PFS in the liver by competing risk analysis was 12.6 v 20.5 months in control versus SIRT (hazard ratio, 0.69; 95% CI, 0.55 to 0.90; P = .002). Objective response rates (ORRs) at any site were similar (68.1% v 76.4% in control v SIRT; P = .113). ORR in the liver was improved with the addition of SIRT (68.8% v 78.7% in control v SIRT; P = .042). Grade ≥ 3 adverse events, including recognized SIRT-related effects, were reported in 73.4% and 85.4% of patients in control versus SIRT. The addition of SIRT to FOLFOX-based first-line chemotherapy in patients with liver-dominant or liver-only metastatic colorectal cancer did not improve PFS at any site but significantly delayed disease progression in the liver. The safety profile was as expected and was consistent with previous studies. © 2016 by American Society of Clinical Oncology.

  11. Repetitive transcranial magnetic stimulation (rTMS) augmentation of selective serotonin reuptake inhibitors (SSRIs) for SSRI-resistant obsessive-compulsive disorder (OCD): a meta-analysis of randomized controlled trials.

    PubMed

    Ma, Zhong-Rui; Shi, Li-Jun

    2014-01-01

    Randomized controlled trials (RCTs) on repetitive transcranial magnetic stimulation (rTMS) as augmentation of selective serotonin reuptake inhibitors (SSRIs) for SSRI-resistant obsessive-compulsive disorder (OCD) have yielded conflicting results. Therefore, this meta-analysis was conducted to assess the efficacy of this strategy for SSRI-resistant OCD. Scientific and medical databases, including international databases (PubMed, MEDLINE, EMBASE, CCTR, Web of Science, PsycINFO), two Chinese databases (CBM-disc, CNKI), and relevant websites dated up to July 2014, were searched for RCTs on this strategy for treating OCD. A Mantel-Haenszel random-effects model was used. Yale-Brown Obsessive Compulsive Scale (Y-BOCS) score, response rates and drop-out rates were evaluated. Data were obtained from nine RCTs consisting of 290 subjects. Active rTMS was an effective augmentation strategy in treating SSRI-resistant OCD, with a pooled WMD of 3.89 (95% CI = [1.27, 6.50]) for reducing Y-BOCS score and a pooled odds ratio (OR) of 2.65 (95% CI = [1.36, 5.17]) for response rates. No significant differences in drop-out rates were found. No publication bias was detected. The pooled examination demonstrated that this strategy seems to be efficacious and acceptable for treating SSRI-resistant OCD. As the number of RCTs included here was limited, further large-scale multi-center RCTs are required to validate our conclusions.
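The pooling step in a random-effects meta-analysis like the one above combines per-study effect estimates with weights that incorporate between-study heterogeneity. The sketch below is an illustrative DerSimonian-Laird inverse-variance pooling of odds ratios (a common companion to the Mantel-Haenszel approach named in the abstract); the input numbers are hypothetical, not the trial data:

```python
import math

def pooled_or_random_effects(studies):
    """DerSimonian-Laird random-effects pooling.
    `studies` is a list of (log_odds_ratio, variance) pairs, one per trial."""
    w = [1.0 / v for _, v in studies]
    y = [lor for lor, _ in studies]
    fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    # Cochran's Q and the between-study variance tau^2
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(studies) - 1)) / c)
    # Re-weight with tau^2 added to each study's variance
    w_re = [1.0 / (v + tau2) for _, v in studies]
    pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return math.exp(pooled), ci

# Hypothetical per-study log-ORs and variances
or_pooled, (lo, hi) = pooled_or_random_effects(
    [(math.log(2.5), 0.20), (math.log(1.8), 0.15), (math.log(3.1), 0.30)])
```

When the studies are homogeneous, tau² collapses to zero and the result reduces to the fixed-effect inverse-variance estimate; heterogeneity widens the confidence interval, as it should.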

  12. Randomization Strategies.

    PubMed

    Kepler, Christopher K

    2017-04-01

    An understanding of randomization is important both for study design and to assist medical professionals in evaluating the medical literature. Simple randomization can be done through a variety of techniques, but carries a risk of unequal distribution of subjects into treatment groups. Block randomization can be used to overcome this limitation by ensuring that small subgroups are distributed evenly between treatment groups. Finally, techniques can be used to evenly distribute subjects between treatment groups while accounting for confounding variables, so as to not skew results when there is a high index of suspicion that a particular variable will influence outcome.
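The block technique described above, combined with randomly selected block sizes so the allocation sequence stays unpredictable even to an unblinded investigator, can be sketched in a few lines. Arm labels, block sizes, and the function name are illustrative:

```python
import random

def blocked_randomization(n_participants, arms=("treatment", "control"),
                          block_sizes=(2, 4, 6), seed=None):
    """Assign participants to arms in shuffled blocks whose sizes are
    drawn at random; block sizes must be multiples of the number of arms."""
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_participants:
        size = rng.choice(block_sizes)            # random block size
        block = list(arms) * (size // len(arms))  # equal counts per arm
        rng.shuffle(block)                        # random order within block
        assignments.extend(block)
    return assignments[:n_participants]

schedule = blocked_randomization(12, seed=7)
```

Each completed block contains equal numbers per arm, so the running allocation can never drift out of balance by more than half of the largest block size.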

  13. Developing bifunctional beta-lactamase molecules with built-in target-recognizing module for prodrug therapy: identification of Enterobacter cloacae P99 cephalosporinase loops suitable for randomization and phage-display selection.

    PubMed

    Shukla, Girja S; Krag, David N

    2009-01-01

    This study was focused on developing catalytically active beta-lactamase enzyme molecules that have target-recognizing sites built within their scaffold. Using phage-display approach, nine libraries were constructed by inserting the randomized linear or cysteine-constrained heptapeptides in the five different loops on the outer surface of P99 beta-lactamase molecule. The pIII signal peptide of Sec-pathway was employed for a periplasmic translocation of the beta-lactamase fusion protein, which we found more efficient than the DsbA signal peptide of SRP-pathway. The randomized heptapeptide loops replaced native amino acids between positions (34)Y-(37)K, (238)M-(246)A, (275)N-(280)A, (305)A-(311)S, or (329)I-(334)I of the P99 beta-lactamase molecules for generating the loop-1 to -5 libraries, respectively. The diversity of each loop library was judged by counting the primary and beta-lactamase-active clones. The linear peptide inserts in the loop-2 library showed the maximum number of the beta-lactamase-active clones, followed by the loop-5, loop-3, and loop-4. The insertion of the cysteine-constrained loops exhibited a dramatic loss of the enzyme-active beta-lactamase clones. The complexity of the loop-2 linear library, as determined by the frequency and diversity of amino acid distributions in the randomized region, appears consistent with the standards of other types of phage display library systems. The selection of the loop-2 linear library on streptavidin protein as a test target identified several beta-lactamase clones that specifically bound to streptavidin. In conclusion, this study identified the suitability of the loop-2 of P99 beta-lactamase for constructing a phage-display library of the beta-lactamase enzyme-active molecules that can be selected against a target. This is an enabling step in our long-term goal of developing bifunctional beta-lactamase molecules against cancer-specific targets for enzyme prodrug therapy of cancer.

  14. Random thoughts

    NASA Astrophysics Data System (ADS)

    ajansen; kwhitefoot; panteltje1; edprochak; sudhakar, the

    2014-07-01

    In reply to the physicsworld.com news story “How to make a quantum random-number generator from a mobile phone” (16 May, http://ow.ly/xFiYc, see also p5), which describes a way of delivering random numbers by counting the number of photons that impinge on each of the individual pixels in the camera of a Nokia N9 smartphone.
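A camera-based generator of the kind discussed turns noisy per-pixel photon counts into bits and then debiases them. The snippet below is a generic illustration, not the scheme in the cited article: it takes the least-significant bit of each count (here simulated) and applies the classic von Neumann extractor:

```python
import random

def von_neumann_extract(bits):
    """Debias a raw bit stream: examine non-overlapping pairs, emit 0 for
    a (0,1) pair and 1 for a (1,0) pair, discard (0,0) and (1,1)."""
    return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

# Hypothetical stand-in for per-pixel photon counts from one camera frame
rng = random.Random(1)
photon_counts = [rng.randint(0, 255) for _ in range(1000)]
raw_bits = [c & 1 for c in photon_counts]  # least-significant bit of each count
random_bits = von_neumann_extract(raw_bits)
```

The extractor trades throughput for uniformity: it discards at least half the raw bits, but the surviving bits are unbiased whenever the raw bits are independent, even if each is individually biased.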

  15. Comparison of the Efficacy and Safety of Aripiprazole Versus Bupropion Augmentation in Patients With Major Depressive Disorder Unresponsive to Selective Serotonin Reuptake Inhibitors: A Randomized, Prospective, Open-Label Study.

    PubMed

    Cheon, Eun-Jin; Lee, Kwang-Hun; Park, Young-Woo; Lee, Jong-Hun; Koo, Bon-Hoon; Lee, Seung-Jae; Sung, Hyung-Mo

    2017-04-01

    The purpose of this study was to compare the efficacy and safety of aripiprazole versus bupropion augmentation in patients with major depressive disorder (MDD) unresponsive to selective serotonin reuptake inhibitors (SSRIs). This is the first randomized, prospective, open-label, direct comparison study between aripiprazole and bupropion augmentation. Participants had at least moderately severe depressive symptoms after 4 weeks or more of SSRI treatment. A total of 103 patients were randomized to either aripiprazole (n = 56) or bupropion (n = 47) augmentation for 6 weeks. Concomitant use of psychotropic agents was prohibited. Montgomery-Asberg Depression Rating Scale, 17-item Hamilton Depression Rating Scale, Iowa Fatigue Scale, Drug-Induced Extrapyramidal Symptoms Scale, and Psychotropic-Related Sexual Dysfunction Questionnaire scores were obtained at baseline and after 1, 2, 4, and 6 weeks of treatment. Overall, both treatments significantly improved depressive symptoms without causing serious adverse events. There were no significant differences in the Montgomery-Asberg Depression Rating Scale, 17-item Hamilton Depression Rating Scale, and Iowa Fatigue Scale scores, or in response rates. However, significant differences in remission rates between the 2 groups were evident at week 6 (55.4% vs 34.0%, respectively; P = 0.031), favoring aripiprazole over bupropion. There were no significant differences in adverse sexual events, extrapyramidal symptoms, or akathisia between the 2 groups. The present study suggests that aripiprazole augmentation is at least comparable to bupropion augmentation in combination with an SSRI in terms of efficacy and tolerability in patients with MDD. Both aripiprazole and bupropion could help reduce sexual dysfunction and fatigue in patients with MDD. Aripiprazole and bupropion may offer effective and safe augmentation strategies in patients with MDD who are unresponsive to SSRIs. Double-blind trials are warranted to confirm the present findings.

  16. Repetitive transcranial magnetic stimulation (rTMS) augmentation of selective serotonin reuptake inhibitors (SSRIs) for SSRI-resistant obsessive-compulsive disorder (OCD): a meta-analysis of randomized controlled trials

    PubMed Central

    Ma, Zhong-Rui; Shi, Li-Jun

    2014-01-01

    Background and objective: Randomized controlled trials (RCTs) on repetitive transcranial magnetic stimulation (rTMS) as augmentation of selective serotonin reuptake inhibitors (SSRIs) for SSRI-resistant obsessive-compulsive disorder (OCD) have yielded conflicting results. Therefore, this meta-analysis was conducted to assess the efficacy of this strategy for SSRI-resistant OCD. Methods: Scientific and medical databases, including international databases (PubMed, MEDLINE, EMBASE, CCTR, Web of Science, PsycINFO), two Chinese databases (CBM-disc, CNKI), and relevant websites dated up to July 2014, were searched for RCTs on this strategy for treating OCD. A Mantel-Haenszel random-effects model was used. Yale-Brown Obsessive Compulsive Scale (Y-BOCS) score, response rates and drop-out rates were evaluated. Results: Data were obtained from nine RCTs consisting of 290 subjects. Active rTMS was an effective augmentation strategy in treating SSRI-resistant OCD, with a pooled WMD of 3.89 (95% CI = [1.27, 6.50]) for reducing Y-BOCS score and a pooled odds ratio (OR) of 2.65 (95% CI = [1.36, 5.17]) for response rates. No significant differences in drop-out rates were found. No publication bias was detected. Conclusion: The pooled examination demonstrated that this strategy seems to be efficacious and acceptable for treating SSRI-resistant OCD. As the number of RCTs included here was limited, further large-scale multi-center RCTs are required to validate our conclusions. PMID:25663986

  17. A phase IIA randomized, placebo-controlled clinical trial to study the efficacy and safety of the selective androgen receptor modulator (SARM), MK-0773 in female participants with sarcopenia.

    PubMed

    Papanicolaou, D A; Ather, S N; Zhu, H; Zhou, Y; Lutkiewicz, J; Scott, B B; Chandler, J

    2013-01-01

    Sarcopenia, the age-related loss of muscle mass [defined as appendicular LBM/height² (aLBM/ht²) more than 1 SD below the peak value], strength and function, is a major contributing factor to frailty in the elderly. MK-0773 is a selective androgen receptor modulator designed to improve muscle function while minimizing effects on other tissues. The primary objective of this study was to demonstrate an improvement in muscle strength and lean body mass (LBM) in sarcopenic frail elderly women treated with MK-0773 relative to placebo. This was a randomized, double-blind, parallel-arm, placebo-controlled, multicenter, 6-month study. Participants were randomized in a 1:1 ratio to receive either MK-0773 50 mg b.i.d. or placebo; all participants received vitamin D and protein supplementation. General community. 170 women aged ≥65 with sarcopenia and moderate physical dysfunction. Dual-energy X-ray absorptiometry, muscle strength and power, physical performance measures. Participants receiving MK-0773 showed a statistically significant increase in LBM from baseline at Month 6 vs. placebo (p<0.001). Both the MK-0773 and placebo groups showed a statistically significant increase in strength from baseline to Month 6, but the mean difference between the two groups was not significant (p=0.269). Both groups showed significant improvement from baseline at Month 6 in physical performance measures, but there were no statistically significant differences between participants receiving MK-0773 and placebo. A greater number of participants experienced elevated transaminases in the MK-0773 group vs. placebo, which resolved after discontinuation of study therapy. MK-0773 was generally well tolerated, with no evidence of androgenization. The MK-0773-induced increase in LBM did not translate into improvement in strength or function vs. placebo. The improvement of strength and physical function in the placebo group could be at least partly attributed to protein and vitamin D supplementation.

  18. A pilot study examining the effectiveness of physical therapy as an adjunct to selective nerve root block in the treatment of lumbar radicular pain from disk herniation: a randomized controlled trial.

    PubMed

    Thackeray, Anne; Fritz, Julie M; Brennan, Gerard P; Zaman, Faisel M; Willick, Stuart E

    2010-12-01

    Therapeutic selective nerve root blocks (SNRBs) are a common intervention for patients with sciatica. Patients often are referred to physical therapy after SNRBs, although the effectiveness of this intervention sequence has not been investigated. This study was a preliminary investigation of the effectiveness of SNRBs, with or without subsequent physical therapy, in people with low back pain and sciatica. This investigation was a pilot randomized controlled clinical trial. The settings were spine specialty and physical therapy clinics. Forty-four participants (64% men; mean age=38.5 years, SD=11.6 years) with low back pain, with clinical and imaging findings consistent with lumbar disk herniation, and scheduled to receive SNRBs participated in the study. They were randomly assigned to receive either 4 weeks of physical therapy (SNRB+PT group) or no physical therapy (SNRB alone [SNRB group]) after the injections. All participants received at least 1 SNRB; 28 participants (64%) received multiple injections. Participants in the SNRB+PT group attended an average of 6.0 physical therapy sessions over an average of 23.9 days. Outcomes were assessed at baseline, 8 weeks, and 6 months with the Low Back Pain Disability Questionnaire, a numeric pain rating scale, and the Global Rating of Change. Significant reductions in pain and disability occurred over time in both groups, with no differences between groups at either follow-up for any outcome. Nine participants (5 in the SNRB group and 4 in the SNRB+PT group) underwent surgery during the follow-up period. The limitations of this study were a relatively short-term follow-up period and a small sample size. A physical therapy intervention after SNRBs did not result in additional reductions in pain and disability or perceived improvements in participants with low back pain and sciatica.

  19. Site selective bond breaking in random media

    NASA Astrophysics Data System (ADS)

    Antonyuk, B. P.; Obidin, A. Z.; Vartapetov, S. K.; Lapshin, K. E.

    2008-01-01

We show that strong light with a frequency slightly less than the 'gap' in the electron spectrum of fused quartz cuts macroscopic balls of fixed diameter 2 ± 0.2 μm out of the material and throws them out of the ablation crater. The phenomenon arises from light-driven electron bunching and the generation of a strong static electric field at the bunch boundaries, resulting in effective photon absorption and bond breaking in these regions, while the main part of the sample remains almost transparent to the photons.

  20. Random Vibrations

    NASA Technical Reports Server (NTRS)

Messaro, Semma; Harrison, Phillip

    2010-01-01

Ares I zonal random vibration environments due to acoustic impingement and combustion processes are developed for liftoff, ascent, and reentry. Random vibration test criteria for Ares I Upper Stage pyrotechnic components are developed by enveloping the applicable zonal environments where each component is located. Random vibration tests will be conducted to assure that these components will survive and function appropriately after exposure to the expected vibration environments. Methodology: Random vibration test criteria for Ares I Upper Stage pyrotechnic components were desired that would envelope all the applicable environments for each component's location. Applicable Ares I vehicle drawings and design information were assessed to determine the location(s) of each component on the Ares I Upper Stage. Design and test criteria were developed by plotting and enveloping the applicable environments in Microsoft Excel and documenting them in a report written in Microsoft Word. Conclusion: Random vibration liftoff, ascent, and green run design and test criteria for the Upper Stage pyrotechnic components were developed by using Microsoft Excel to envelope the zonal environments applicable to each component. Results were transferred from Excel into a Microsoft Word report. After the report is reviewed and edited by my mentor, it will be submitted for publication as an attachment to a memorandum. Pyrotechnic component designers will extract criteria from the report for incorporation into component design and test specifications. Eventually the hardware will be tested to the environments I developed, to assure that the components will survive and function appropriately after exposure to the expected vibration environments.

  1. 5-year clinical outcomes in the ICTUS (Invasive versus Conservative Treatment in Unstable coronary Syndromes) trial a randomized comparison of an early invasive versus selective invasive management in patients with non-ST-segment elevation acute coronary syndrome.

    PubMed

    Damman, Peter; Hirsch, Alexander; Windhausen, Fons; Tijssen, Jan G P; de Winter, Robbert J

    2010-03-02

    We present the 5-year clinical outcomes according to treatment strategy with additional risk stratification of the ICTUS (Invasive versus Conservative Treatment in Unstable coronary Syndromes) trial. Long-term outcomes may be relevant to decide treatment strategy for patients presenting with non-ST-segment elevation acute coronary syndromes (NSTE-ACS) and elevated troponin T. We randomly assigned 1,200 patients to an early invasive or selective invasive strategy. The outcomes were the composite of death or myocardial infarction (MI) and its individual components. Risk stratification was performed with the FRISC (Fast Revascularization in InStability in Coronary artery disease) risk score. At 5-year follow-up, revascularization rates were 81% in the early invasive and 60% in the selective invasive group. Cumulative death or MI rates were 22.3% and 18.1%, respectively (hazard ratio [HR]: 1.29, 95% confidence interval [CI]: 1.00 to 1.66, p = 0.053). No difference was observed in mortality (HR: 1.13, 95% CI: 0.80 to 1.60, p = 0.49) or MI (HR: 1.24, 95% CI: 0.90 to 1.70, p = 0.20). After risk stratification, no benefit of an early invasive strategy was observed in reducing death or spontaneous MI in any of the risk groups. In patients presenting with NSTE-ACS and elevated troponin T, we could not demonstrate a long-term benefit of an early invasive strategy in reducing death or MI. (Invasive versus Conservative Treatment in Unstable coronary Syndromes [ICTUS]; ISRCTN82153174). Copyright 2010 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  2. Evaluation of the effect of aromatherapy with Rosa damascena Mill. on postoperative pain intensity in hospitalized children in selected hospitals affiliated to Isfahan University of Medical Sciences in 2013: A randomized clinical trial

    PubMed Central

    Marofi, Maryam; Sirousfard, Motahareh; Moeini, Mahin; Ghanadi, Alireza

    2015-01-01

Background: Pain is a common complication after surgery. The aim of this study was to evaluate the effect of aromatherapy with Rosa damascena Mill. on postoperative pain in children. Materials and Methods: In a double-blind, placebo-controlled clinical trial, we selected 64 children aged 3–6 years through convenience sampling and divided them randomly into two groups. Patients in group A were given inhalation aromatherapy with R. damascena Mill., and in group B, the patients were given almond oil as a placebo. Inhalation aromatherapy was used on subjects' first arrival to the ward and then at 3, 6, 9, and 12 h afterward. Common palliative treatments to relieve pain were used in both groups. Thirty minutes after aromatherapy, the postoperative pain in children was evaluated with the Toddler Preschooler Postoperative Pain Scale (TPPPS). Data were statistically analyzed using the Chi-square test, one-way analysis of variance (ANOVA), and repeated measures ANOVA. Results: There was no significant difference in pain scores between the two groups on subjects' first arrival to the ward (before receiving any aromatherapy or palliative care). After each administration of aromatherapy and at the end of treatment, the pain score was significantly reduced in the aromatherapy group with R. damascena Mill. compared to the placebo group. Conclusions: According to our results, aromatherapy with R. damascena Mill. can be used for postoperative pain in children, together with other common treatments, without any significant side effects. PMID:25878704

  3. A Randomized, Double-Blind Placebo-Controlled Trial of Oral Creatine Monohydrate Augmentation for Enhanced Response to a Selective Serotonin Reuptake Inhibitor in Women With Major Depressive Disorder

    PubMed Central

    Kim, Tae-Suk; Hwang, Jaeuk; Kim, Jieun E.; Won, Wangyoun; Bae, Sujin; Renshaw, Perry F.

    2015-01-01

    Objective Antidepressants targeting monoaminergic neurotransmitter systems, despite their immediate effects at the synaptic level, usually require several weeks of administration to achieve clinical efficacy. The authors propose a strategy of adding creatine monohydrate (creatine) to a selective serotonin reuptake inhibitor (SSRI) in the treatment of patients with major depressive disorder. Such augmentation may lead to a more rapid onset of antidepressant effects and a greater treatment response, potentially by restoring brain bioenergetics at the cellular level. Method Fifty-two women with major depressive disorder were enrolled in an 8-week double-blind placebo-controlled clinical trial and randomly assigned to receive escitalopram in addition to either creatine (5 g/day, N=25) or placebo (N=27). Efficacy was primarily assessed by changes in the Hamilton Depression Rating Scale (HAM-D) score. Results In comparison to the placebo augmentation group, patients receiving creatine augmentation showed significantly greater improvements in HAM-D score, as early as week 2 of treatment. This differential improvement favoring creatine was maintained at weeks 4 and 8. There were no differences between treatment groups in the proportion of patients who discontinued treatment prematurely (creatine: N=8, 32.0%; placebo: N=5, 18.5%) or in the overall frequency of all reported adverse events (creatine: 36 events; placebo: 45 events). Conclusions The current study suggests that creatine augmentation of SSRI treatment may be a promising therapeutic approach that exhibits more rapid and efficacious responses in women with major depressive disorder. PMID:22864465

  4. Fractional randomness

    NASA Astrophysics Data System (ADS)

    Tapiero, Charles S.; Vallois, Pierre

    2016-11-01

The premise of this paper is that a fractional probability distribution is based on fractional operators and the fractional (Hurst) index, which alter the classical setting of random variables. For example, a random variable defined by its density function might not have a fractional density function defined in the conventional sense. Practically, this implies that a distribution's granularity, defined by a fractional kernel, may have properties that differ depending on the fractional index used and the fractional calculus applied to define it. The purpose of this paper is to consider an application of fractional calculus to define the fractional density function of a random variable. In addition, we provide and prove a number of results defining the functional forms of these distributions as well as their existence. In particular, we define fractional probability distributions for increasing and decreasing functions that are right continuous. Examples are used to motivate the usefulness of a statistical approach to fractional calculus and its application to economic and financial problems. In conclusion, given the breadth and extent of such problems, this paper is a preliminary attempt to construct statistical fractional models.

  5. Efficacy and tolerability balance of oxycodone/naloxone and tapentadol in chronic low back pain with a neuropathic component: a blinded end point analysis of randomly selected routine data from 12-week prospective open-label observations

    PubMed Central

    Ueberall, Michael A; Mueller-Schwefe, Gerhard H H

    2016-01-01

    Objective To evaluate the benefit–risk profile (BRP) of oxycodone/naloxone (OXN) and tapentadol (TAP) in patients with chronic low back pain (cLBP) with a neuropathic component (NC) in routine clinical practice. Methods This was a blinded end point analysis of randomly selected 12-week routine/open-label data of the German Pain Registry on adult patients with cLBP-NC who initiated an index treatment in compliance with the current German prescribing information between 1st January and 31st October 2015 (OXN/TAP, n=128/133). Primary end point was defined as a composite of three efficacy components (≥30% improvement of pain, pain-related disability, and quality of life each at the end of observation vs baseline) and three tolerability components (normal bowel function, absence of either central nervous system side effects, and treatment-emergent adverse event [TEAE]-related treatment discontinuation during the observation period) adopted to reflect BRP assessments under real-life conditions. Results Demographic as well as baseline and pretreatment characteristics were comparable for the randomly selected data sets of both index groups without any indicators for critical selection biases. Treatment with OXN resulted formally in a BRP noninferior to that of TAP and showed a significantly higher primary end point response vs TAP (39.8% vs 25.6%, odds ratio: 1.93; P=0.014), due to superior analgesic effects. Between-group differences increased with stricter response definitions for all three efficacy components in favor of OXN: ≥30%/≥50%/≥70% response rates for OXN vs TAP were seen for pain intensity in 85.2%/67.2%/39.1% vs 83.5%/54.1%/15.8% (P= ns/0.031/<0.001), for pain-related disability in 78.1%/64.8%/43.8% vs 66.9%/50.4%/24.8% (P=0.043/0.018/0.001), and for quality of life in 76.6%/68.0%/50.0% vs 63.9%/54.1%/34.6% (P=0.026/0.022/0.017). Overall, OXN vs TAP treatments were well tolerated, and proportions of patients who either maintained a normal bowel

  6. Effect of soothing-liver and nourishing-heart acupuncture on early selective serotonin reuptake inhibitor treatment onset for depressive disorder and related indicators of neuroimmunology: a randomized controlled clinical trial.

    PubMed

    Liu, Yi; Feng, Hui; Mo, Yali; Gao, Jingfang; Mao, Hongjing; Song, Mingfen; Wang, Shengdong; Yin, Yan; Liu, Wenjuan

    2015-10-01

To observe the effect of the soothing-liver and nourishing-heart acupuncture method on the onset of selective serotonin reuptake inhibitor (SSRI) treatment effects in patients with depressive disorder, and on related indicators of neuroimmunology. Overall, 126 patients with depressive disorder were randomly divided into a medicine group and an acupuncture-medicine group using a random number table. Patients were treated for 6 consecutive weeks. The two groups were evaluated with the Montgomery-Asberg Depression Rating Scale (MADRS) and the Side Effects Rating Scale (SERS) to assess the effect of the soothing-liver and nourishing-heart acupuncture method on early onset of the SSRI treatment effect. Changes in serum 5-hydroxytryptamine (5-HT) and inflammatory cytokines before and after treatment were recorded and compared between the medicine and acupuncture-medicine groups. The acupuncture-medicine group had significantly lower MADRS scores at weeks 1, 2, 4, and 6 after treatment compared with the medicine group (P < 0.01), and significantly lower SERS scores at the same time points (P < 0.01). At 6 weeks after treatment, serum 5-HT in the acupuncture-medicine group was significantly higher compared with the medicine group (P < 0.01). Interleukin-6 (IL-6) in the acupuncture-medicine group was significantly lower than in the medicine group (P < 0.01), whereas there was no significant difference in IL-1β between the groups (P > 0.05). The anti-inflammatory cytokines IL-4 and IL-10 were significantly higher in the acupuncture-medicine group compared with the medicine group (P < 0.01 and P < 0.05, respectively). The soothing-liver and nourishing-heart acupuncture method can effectively accelerate the onset of SSRI effects when treating depressive disorder and can significantly reduce the adverse reactions of SSRIs. Moreover, acupuncture can enhance serum 5-HT and regulate the balance of pro-inflammatory cytokines and anti

  7. Implementing multifactorial psychotherapy research in online virtual environments (IMPROVE-2): study protocol for a phase III trial of the MOST randomized component selection method for internet cognitive-behavioural therapy for depression.

    PubMed

    Watkins, Edward; Newbold, Alexandra; Tester-Jones, Michelle; Javaid, Mahmood; Cadman, Jennifer; Collins, Linda M; Graham, John; Mostazir, Mohammod

    2016-10-06

Depression is a global health challenge. Although there are effective psychological and pharmaceutical interventions, our best treatments achieve remission rates of less than one-third and limited sustained recovery. Underpinning this efficacy gap is limited understanding of how complex psychological interventions for depression work. Recent reviews have argued that the active ingredients of therapy need to be identified so that therapy can be made briefer and more potent, and to improve scalability. This in turn requires the use of rigorous study designs that test the presence or absence of individual therapeutic elements, rather than standard comparative randomized controlled trials. One such approach is the Multiphase Optimization Strategy, which uses efficient experimentation such as factorial designs to identify active factors in complex interventions. This approach has been successfully applied to behavioural health but not yet to mental health interventions. A Phase III randomized, single-blind balanced fractional factorial trial, based in England and conducted on the internet, randomized at the level of the patient, will investigate the active ingredients of internet cognitive-behavioural therapy (CBT) for depression. Adults with depression (operationalized as PHQ-9 score ≥ 10), recruited directly from the internet and from a UK National Health Service Improving Access to Psychological Therapies service, will be randomized across seven experimental factors, each reflecting the presence versus absence of specific treatment components (activity scheduling, functional analysis, thought challenging, relaxation, concreteness training, absorption, self-compassion training), using a 32-condition balanced fractional factorial design (2_IV^(7-2)). The primary outcome is symptoms of depression (PHQ-9) at 12 weeks. Secondary outcomes include symptoms of anxiety and process measures related to hypothesized mechanisms. Better understanding of the active ingredients of

  8. Random grammars

    NASA Astrophysics Data System (ADS)

    Malyshev, V. A.

    1998-04-01

Contents. § 1. Definitions: 1.1. Grammars; 1.2. Random grammars and L-systems; 1.3. Semigroup representations. § 2. Infinite string dynamics: 2.1. Cluster expansion; 2.2. Cluster dynamics; 2.3. Local observer. § 3. Large time behaviour: small perturbations: 3.1. Invariant measures; 3.2. Classification. § 4. Large time behaviour: context-free case: 4.1. Invariant measures for grammars; 4.2. L-systems; 4.3. Fractal correlation functions; 4.4. Measures on languages. Bibliography.

  9. Protocol for Combined Analysis of FOXFIRE, SIRFLOX, and FOXFIRE-Global Randomized Phase III Trials of Chemotherapy +/- Selective Internal Radiation Therapy as First-Line Treatment for Patients With Metastatic Colorectal Cancer.

    PubMed

    Virdee, Pradeep S; Moschandreas, Joanna; Gebski, Val; Love, Sharon B; Francis, E Anne; Wasan, Harpreet S; van Hazel, Guy; Gibbs, Peter; Sharma, Ricky A

    2017-03-28

    In colorectal cancer (CRC), unresectable liver metastases are associated with a poor prognosis. The FOXFIRE (an open-label randomized phase III trial of 5-fluorouracil, oxaliplatin, and folinic acid +/- interventional radioembolization as first-line treatment for patients with unresectable liver-only or liver-predominant metastatic colorectal cancer), SIRFLOX (randomized comparative study of FOLFOX6m plus SIR-Spheres microspheres versus FOLFOX6m alone as first-line treatment in patients with nonresectable liver metastases from primary colorectal carcinoma), and FOXFIRE-Global (assessment of overall survival of FOLFOX6m plus SIR-Spheres microspheres versus FOLFOX6m alone as first-line treatment in patients with nonresectable liver metastases from primary colorectal carcinoma in a randomized clinical study) clinical trials were designed to evaluate the efficacy and safety of combining first-line chemotherapy with selective internal radiation therapy (SIRT) using yttrium-90 resin microspheres, also called transarterial radioembolization. The aim of this analysis is to prospectively combine clinical data from 3 trials to allow adequate power to evaluate the impact of chemotherapy with SIRT on overall survival. Eligible patients are adults with histologically confirmed CRC and unequivocal evidence of liver metastases which are not treatable by surgical resection or local ablation with curative intent at the time of study entry. Patients may also have limited extrahepatic metastases. Final analysis will take place when all participants have been followed up for a minimum of 2 years. Efficacy and safety estimates derived using individual participant data (IPD) from SIRFLOX, FOXFIRE, and FOXFIRE-Global will be pooled using 2-stage prospective meta-analysis. Secondary outcome measures include progression-free survival (PFS), liver-specific PFS, health-related quality of life, response rate, resection rate, and adverse event profile. The large study population will

  10. Protocol for Combined Analysis of FOXFIRE, SIRFLOX, and FOXFIRE-Global Randomized Phase III Trials of Chemotherapy +/- Selective Internal Radiation Therapy as First-Line Treatment for Patients With Metastatic Colorectal Cancer

    PubMed Central

    Virdee, Pradeep S; Moschandreas, Joanna; Gebski, Val; Love, Sharon B; Francis, E Anne; Wasan, Harpreet S; van Hazel, Guy; Gibbs, Peter

    2017-01-01

Background In colorectal cancer (CRC), unresectable liver metastases are associated with a poor prognosis. The FOXFIRE (an open-label randomized phase III trial of 5-fluorouracil, oxaliplatin, and folinic acid +/- interventional radioembolization as first-line treatment for patients with unresectable liver-only or liver-predominant metastatic colorectal cancer), SIRFLOX (randomized comparative study of FOLFOX6m plus SIR-Spheres microspheres versus FOLFOX6m alone as first-line treatment in patients with nonresectable liver metastases from primary colorectal carcinoma), and FOXFIRE-Global (assessment of overall survival of FOLFOX6m plus SIR-Spheres microspheres versus FOLFOX6m alone as first-line treatment in patients with nonresectable liver metastases from primary colorectal carcinoma in a randomized clinical study) clinical trials were designed to evaluate the efficacy and safety of combining first-line chemotherapy with selective internal radiation therapy (SIRT) using yttrium-90 resin microspheres, also called transarterial radioembolization. Objective The aim of this analysis is to prospectively combine clinical data from 3 trials to allow adequate power to evaluate the impact of chemotherapy with SIRT on overall survival. Methods Eligible patients are adults with histologically confirmed CRC and unequivocal evidence of liver metastases which are not treatable by surgical resection or local ablation with curative intent at the time of study entry. Patients may also have limited extrahepatic metastases. Final analysis will take place when all participants have been followed up for a minimum of 2 years. Results Efficacy and safety estimates derived using individual participant data (IPD) from SIRFLOX, FOXFIRE, and FOXFIRE-Global will be pooled using 2-stage prospective meta-analysis. Secondary outcome measures include progression-free survival (PFS), liver-specific PFS, health-related quality of life, response rate, resection rate, and adverse event profile. The

  11. Random number generation and creativity.

    PubMed

    Bains, William

    2008-01-01

A previous paper suggested that humans can generate genuinely random numbers. I tested this hypothesis by repeating the experiment with a larger number of highly numerate subjects, asking them to call out a sequence of digits selected from 0 through 9. The resulting sequences were substantially non-random, with an excess of sequential pairs of numbers and a deficit of repeats of the same number, in line with previous literature. However, the previous literature suggests that humans generate random numbers with substantial conscious effort, and that distractions which reduce that effort reduce the randomness of the numbers. I reduced my subjects' concentration by asking them to call out in another language, and with alcohol; neither affected the randomness of their responses. This suggests that the ability to generate random numbers is a 'basic' function of the human mind, even if those numbers are not mathematically 'random'. I hypothesise that there is a 'creativity' mechanism which, while not truly random, provides novelty as part of the mind's defence against closed programming loops, and that testing for the effects seen here in people more or less familiar with numbers or with spontaneous creativity could identify more features of this process. It is possible that training to perform better at simple random generation tasks could help to increase creativity, by training people to reduce the conscious mind's suppression of the 'spontaneous', creative response to new questions.
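    The two departures from randomness reported in this abstract (an excess of sequential pairs, a deficit of immediate repeats) are straightforward to measure. The sketch below is illustrative only; the function name and the example digit sequence are hypothetical, not data from the study:

    ```python
    def pair_rates(digits):
        """Return the rate of immediate repeats (d, d) and of ascending
        sequential pairs (d, d+1 mod 10) among adjacent digit pairs.
        For a uniformly random digit sequence both rates are expected
        to be 0.1."""
        pairs = list(zip(digits, digits[1:]))
        repeats = sum(1 for a, b in pairs if b == a)
        sequential = sum(1 for a, b in pairs if b == (a + 1) % 10)
        return repeats / len(pairs), sequential / len(pairs)

    # a short made-up sequence: 7 adjacent pairs, 1 repeat, 3 sequential pairs
    repeat_rate, sequential_rate = pair_rates([3, 4, 5, 1, 1, 9, 0, 7])
    ```

    Comparing these observed rates against the 0.1 baseline over a long called-out sequence is one simple way to quantify the biases the paper describes.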

  12. How to do random allocation (randomization).

    PubMed

    Kim, Jeehyoung; Shin, Wonshik

    2014-03-01

    To explain the concept and procedure of random allocation as used in a randomized controlled study. We explain the general concept of random allocation and demonstrate how to perform the procedure easily and how to report it in a paper.
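    The simple random allocation procedure described here can be sketched in a few lines of Python. This is an illustrative sketch, not code from the paper; the function name and arm labels are hypothetical:

    ```python
    import random

    def random_allocation(n_participants, arms=("treatment", "control"), seed=None):
        """Simple random allocation: each participant is independently
        assigned to one arm with equal probability."""
        rng = random.Random(seed)  # seeded for a reproducible allocation list
        return [rng.choice(arms) for _ in range(n_participants)]

    allocation = random_allocation(20, seed=42)
    ```

    Note that nothing in this scheme forces the arms to end up equal in size; that chance imbalance is exactly what blocked randomization is designed to prevent.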

  13. Is random access memory random?

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

Most software is constructed on the assumption that programs and data are stored in random access memory (RAM). Physical limitations on the relative speeds of processor and memory elements lead to a variety of memory organizations that match the processor addressing rate with the memory service rate. These include interleaved and cached memory. A very high fraction of a processor's address requests can be satisfied from the cache without reference to the main memory. The cache requests information from main memory in blocks that can be transferred at the full memory speed. Programmers who organize algorithms for locality can realize the highest performance from these computers.

  14. Single administration of Selective Internal Radiation Therapy versus continuous treatment with sorafeNIB in locally advanced hepatocellular carcinoma (SIRveNIB): study protocol for a phase iii randomized controlled trial.

    PubMed

    Gandhi, Mihir; Choo, Su Pin; Thng, Choon Hua; Tan, Say Beng; Low, Albert Su Chong; Cheow, Peng Chung; Goh, Anthony Soon Whatt; Tay, Kiang Hiong; Lo, Richard Hoau Gong; Goh, Brian Kim Poh; Wong, Jen San; Ng, David Chee Eng; Soo, Khee Chee; Liew, Wei Ming; Chow, Pierce K H

    2016-11-07

Approximately 20% of hepatocellular carcinoma (HCC) patients diagnosed in the early stages may benefit from potentially curative ablative therapies such as surgical resection, transplantation or radiofrequency ablation. For patients not eligible for such options, prognosis is poor. Sorafenib and Selective Internal Radiation Therapy (SIRT) are clinically proven treatment options in patients with unresectable HCC, and this study aims to assess overall survival following either SIRT or sorafenib therapy for locally advanced HCC patients. This investigator-initiated, multi-centre, open-label, randomized, controlled trial will enrol 360 patients with locally advanced HCC, as defined by Barcelona Clinic Liver Cancer stage B or stage C, without distant metastases, and which is not amenable to immediate curative treatment. Exclusion criteria include previous systemic therapy, metastatic disease, complete occlusion of the main portal vein, or a Child-Pugh score of >7. Eligible patients will be randomised 1:1, stratified by centre and by presence or absence of portal vein thrombosis, to receive either a single administration of SIRT using yttrium-90 resin microspheres (SIR-Spheres®, Sirtex Medical Limited, Sydney, Australia) targeted at HCC in the liver by the trans-arterial route, or continuous oral sorafenib (Nexavar®, Bayer Pharma AG, Berlin, Germany) at a dose of 400 mg twice daily until disease progression, no further response, complete regression or unacceptable toxicity. Patients in both the sorafenib and SIRT arms will be followed up every 4 weeks for the first 3 months and 12-weekly thereafter. Overall survival is the primary endpoint, assessed for the intention-to-treat population. Secondary endpoints are tumour response rate, time-to-tumour progression, progression-free survival, quality of life and down-staging to receive potentially curative therapy. Definitive data comparing these two therapies will help to determine clinical practice in the large group

  15. The impact of chronic depression on acute and long-term outcomes in a randomized trial comparing selective serotonin reuptake inhibitor monotherapy versus each of 2 different antidepressant medication combinations.

    PubMed

    Sung, Sharon C; Haley, Charlotte L; Wisniewski, Stephen R; Fava, Maurizio; Nierenberg, Andrew A; Warden, Diane; Morris, David W; Kurian, Benji T; Trivedi, Madhukar H; Rush, A John

    2012-07-01

    To compare sociodemographic and clinical features, acute and continuation treatment outcomes, and adverse events/side effect burden between outpatients with chronic (current episode > 2 years) versus nonchronic major depressive disorder (MDD) who were treated with combination antidepressant therapy or selective serotonin reuptake inhibitor (SSRI) monotherapy. 663 outpatients with chronic (n = 368) or nonchronic (n = 295) moderate to severe DSM-IV-TR MDD (17-item Hamilton Depression Rating Scale score ≥ 16) were enrolled from March 2008 through September 2009 in a single-blind 7-month prospective randomized trial conducted at 6 primary and 9 psychiatric care sites across the United States. Participants were treated with escitalopram monotherapy plus placebo or 1 of 2 combination treatments (bupropion sustained-release [SR] + escitalopram or venlafaxine extended-release [XR] + mirtazapine). Analyses compared baseline sociodemographic and clinical characteristics, rates of remission (at least 1 of the last 2 consecutive scores on the 16-item Quick Inventory of Depressive Symptomatology-Self-Report [QIDS-SR16] < 6, with the other < 8), and adverse events/side effect burden (Frequency, Intensity, and Burden of Side Effects Ratings) obtained at 12 and 28 weeks. Participants with chronic MDD were at greater socioeconomic disadvantage and had greater medical and psychiatric disease burden. The chronic and nonchronic groups did not differ in rates of remission at 12 weeks (35.9% vs 42.0%, respectively; odds ratio [OR] = 0.778, P = .1500; adjusted OR [AOR] = 0.956, P = .8130) or at 28 weeks (41.0% vs 49.8%, respectively; OR = 0.706, P = .0416; AOR = 0.837, P = .3448). Participants with chronic MDD had higher final QIDS-SR(16) scores and smaller overall percent changes in QIDS-SR(16) from baseline to exit, but these differences did not remain after adjusting for covariates. There were no significant differences in adverse events or side effect burden. No significant

  16. Effect of chemotherapy on the impact of FDG-PET/CT in selection of patients for surgical resection of colorectal liver metastases: single center analysis of PET-CAM randomized trial.

    PubMed

    Metser, Ur; Halankar, Jaydeep; Langer, Deanna; Mohan, Ravi; Hussey, Douglas; Hadas, Moshonov; Tamir, Shlomit

    2017-02-01

    The largest randomized controlled trial (RCT) on the effect of FDG-PET on surgical management for metastatic colorectal adenocarcinoma to liver ("PET-CAM") reported only a modest change in surgical management (8%).

  17. Characteristics of Prostate Cancer Found at Fifth Screening in the European Randomized Study of Screening for Prostate Cancer Rotterdam: Can We Selectively Detect High-grade Prostate Cancer with Upfront Multivariable Risk Stratification and Magnetic Resonance Imaging?

    PubMed

    Alberts, Arnout R; Schoots, Ivo G; Bokhorst, Leonard P; Drost, Frank-Jan H; van Leenders, Geert J; Krestin, Gabriel P; Dwarkasing, Roy S; Barentsz, Jelle O; Schröder, Fritz H; Bangma, Chris H; Roobol, Monique J

    2017-06-21

    The harm of screening (unnecessary biopsies and overdiagnosis) generally outweighs the benefit of reducing prostate cancer (PCa) mortality in men aged ≥70 yr. Patient selection for biopsy using risk stratification and magnetic resonance imaging (MRI) may improve this benefit-to-harm ratio. To assess the potential of a risk-based strategy including MRI to selectively identify men aged ≥70 yr with high-grade PCa. Three hundred and thirty-seven men with prostate-specific antigen ≥3.0 ng/ml at a fifth screening (71-75 yr) in the European Randomized study of Screening for Prostate Cancer Rotterdam were biopsied. One hundred and seventy-nine men received six-core transrectal ultrasound biopsy (TRUS-Bx), while 158 men received MRI, 12-core TRUS-Bx, and fusion TBx in case of Prostate Imaging Reporting and Data System ≥3 lesions. The primary outcome was the overall, low-grade (Gleason Score 3+3) and high-grade (Gleason Score ≥ 3+4) PCa rate. Secondary outcome was the low- and high-grade PCa rate detected by six-core TRUS-Bx, 12-core TRUS-Bx, and MRI ± TBx. Tertiary outcome was the reduction of biopsies and low-grade PCa detection by upfront risk stratification with the Rotterdam Prostate Cancer Risk Calculator 4. Fifty-five percent of men were previously biopsied. The overall, low-grade, and high-grade PCa rates in biopsy naïve men were 48%, 27%, and 22%, respectively. In previously biopsied men these PCa rates were 25%, 20%, and 5%. Sextant TRUS-Bx, 12-core TRUS-Bx, and MRI ± TBx had a similar high-grade PCa rate (11%, 12%, and 11%) but a significantly different low-grade PCa rate (17%, 28%, and 7%). Rotterdam Prostate Cancer Risk Calculator 4-based stratification combined with 12-core TRUS-Bx ± MRI-TBx would have avoided 65% of biopsies and 68% of low-grade PCa while detecting an equal percentage of high-grade PCa (83%) compared with a TRUS-Bx all men approach (79%). After four repeated screens and ≥1 previous biopsies in half of men, a significant

  18. Randomization in clinical trials in orthodontics: its significance in research design and methods to achieve it.

    PubMed

    Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore

    2011-12-01

    Randomization is a key step in reducing selection bias during the treatment allocation phase of randomized clinical trials. The process of randomization follows specific steps: generation of the randomization list, allocation concealment, and implementation of randomization. In the dental and orthodontic literature, treatment allocation is frequently characterized as random; however, the randomization procedures actually followed are often inappropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. Randomization entails generation of the random allocation sequence, allocation concealment, and the actual methodology for implementing treatment allocation randomly and unpredictably. The most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons that make randomization an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
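
    The restricted (permuted-block) scheme mentioned above can be sketched in a few lines. This is a minimal illustration only, not taken from the article; the function name and parameters are hypothetical, and a real trial would also conceal the generated list from recruiters.

```python
import random

def blocked_randomization_list(n_participants, arms=("A", "B"),
                               block_sizes=(4, 6), seed=2024):
    """Permuted-block allocation with randomly chosen block sizes.

    Every completed block contains each arm equally often, keeping group
    sizes balanced; varying the block size at random makes the next
    assignment harder to predict.
    """
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_participants:
        size = rng.choice(block_sizes)             # random block size resists prediction
        block = list(arms) * (size // len(arms))   # equal representation per arm
        rng.shuffle(block)                         # permute assignments within the block
        allocation.extend(block)
    return allocation[:n_participants]

schedule = blocked_randomization_list(20)
```

    After any completed block the arms are exactly balanced, so the overall imbalance can never exceed one block size.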

  19. How to Do Random Allocation (Randomization)

    PubMed Central

    Shin, Wonshik

    2014-01-01

    Purpose: To explain the concept and procedure of random allocation as used in a randomized controlled study. Methods: We explain the general concept of random allocation and demonstrate how to perform the procedure easily and how to report it in a paper. PMID:24605197
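
    As a minimal illustration of the procedure described above (hypothetical names; a real study would document seed handling and conceal assignments from recruiters), simple random allocation can be sketched as:

```python
import random

def simple_random_allocation(participant_ids, arms=("treatment", "control"), seed=7):
    """Assign each participant to an arm independently, with equal probability."""
    rng = random.Random(seed)
    return {pid: rng.choice(arms) for pid in participant_ids}

groups = simple_random_allocation(range(12))
# With independent draws the arms can end up unequal in size by chance --
# the imbalance that restricted (blocked) designs are meant to limit.
```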

  20. Fragmentation of random trees

    NASA Astrophysics Data System (ADS)

    Kalay, Z.; Ben-Naim, E.

    2015-01-01

    We study fragmentation of a random recursive tree into a forest by repeated removal of nodes. The initial tree consists of N nodes and it is generated by sequential addition of nodes with each new node attaching to a randomly-selected existing node. As nodes are removed from the tree, one at a time, the tree dissolves into an ensemble of separate trees, namely, a forest. We study statistical properties of trees and nodes in this heterogeneous forest, and find that the fraction of remaining nodes m characterizes the system in the limit N → ∞. We obtain analytically the size density φ_s of trees of size s. The size density has a power-law tail φ_s ~ s^(−α) with exponent α = 1 + 1/m. Therefore, the tail becomes steeper as further nodes are removed, and the fragmentation process is unusual in that the exponent α increases continuously with time. We also extend our analysis to the case where nodes are added as well as removed, and obtain the asymptotic size density for growing trees.
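
    The growth-and-removal process described above is straightforward to simulate. The sketch below (helper names are illustrative, not from the paper) builds a random recursive tree, deletes a random subset of nodes, and measures the sizes of the surviving trees in the forest:

```python
import random
from collections import defaultdict, deque

def random_recursive_tree(n, rng):
    """Each new node i attaches to a uniformly random existing node in 0..i-1."""
    return [(i, rng.randrange(i)) for i in range(1, n)]

def component_sizes(n, edges, removed):
    """Sizes of the trees left after deleting the nodes in `removed` (BFS)."""
    adj = defaultdict(list)
    for u, v in edges:
        if u not in removed and v not in removed:
            adj[u].append(v)
            adj[v].append(u)
    seen, sizes = set(), []
    for start in range(n):
        if start in removed or start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            node = queue.popleft()
            size += 1
            for nxt in adj[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        sizes.append(size)
    return sizes

rng = random.Random(1)
n = 200
edges = random_recursive_tree(n, rng)
removed = set(rng.sample(range(n), 50))   # remove a quarter of the nodes
sizes = component_sizes(n, edges, removed)
```

    The component sizes always sum to the number of remaining nodes, which provides a quick sanity check on the simulation.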

  1. 40 CFR 205.57-2 - Test vehicle sample selection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., the random selection will be achieved by sequentially numbering all of the vehicles in the batch and then using a table of random numbers to select the number of vehicles as specified in paragraph (c) of... alternative random selection plan may be used by a manufacturer: Provided, That such a plan is approved by the...

  2. The Patient Deficit Model Overturned: a qualitative study of patients' perceptions of invitation to participate in a randomized controlled trial comparing selective bladder preservation against surgery in muscle invasive bladder cancer (SPARE, CRUK/07/011).

    PubMed

    Moynihan, Clare; Lewis, Rebecca; Hall, Emma; Jones, Emma; Birtle, Alison; Huddart, Robert

    2012-11-29

    Evidence suggests that poor recruitment into clinical trials rests on a patient 'deficit' model - an inability to comprehend trial processes. Poor communication has also been cited as a possible barrier to recruitment. A qualitative patient interview study was included within the feasibility stage of a phase III non-inferiority Randomized Controlled Trial (RCT) (SPARE, CRUK/07/011) in muscle invasive bladder cancer. The aim was to illuminate problems in the context of randomization. The qualitative study used a 'Framework Analysis' that included 'constant comparison' in which semi-structured interviews are transcribed, analyzed, compared and contrasted both between and within transcripts. Three researchers coded and interpreted data. Twenty-four patients agreed to enter the interview study; 10 decliners of randomization and 14 accepters, of whom 2 subsequently declined their allocated treatment. The main theme applying to the majority of the sample was confusion and ambiguity. There was little indication that confusion directly impacted on decisions to enter the SPARE trial. However, confusion did appear to impact on ethical considerations surrounding 'informed consent', as well as cause a sense of alienation between patients and health personnel. Sub-optimal communication in many guises accounted for the confusion, together with the logistical elements of a trial that involved treatment options delivered in a number of geographical locations. These data highlight the difficulty of providing balanced and clear trial information within the UK health system, despite best intentions. Involvement of multiple professionals can impact on communication processes with patients who are considering participation in RCTs. Our results led us to question the 'deficit' model of patient behavior. It is suggested that health professionals might consider facilitating a context in which patients feel fully included in the trial enterprise and potentially consider alternatives to

  3. Random broadcast on random geometric graphs

    SciTech Connect

    Bradonjic, Milan; Elsasser, Robert; Friedrich, Tobias

    2009-01-01

    In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as follows: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs when, with high probability: (i) the RGG is connected, or (ii) there exists a giant component in the RGG. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph, or of the giant component, for regimes (i) and (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
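
    The push model defined above can be simulated directly. The following sketch uses a small cycle graph as a stand-in for a connected RGG (names and parameters are illustrative):

```python
import random

def push_broadcast_rounds(adj, start=0, seed=3):
    """Rounds until all nodes are informed: each round, every informed node
    pushes the message to one uniformly random neighbor."""
    rng = random.Random(seed)
    informed = {start}
    rounds = 0
    while len(informed) < len(adj):
        newly = set()
        for node in informed:
            newly.add(rng.choice(adj[node]))   # pick one neighbor at random
        informed |= newly
        rounds += 1
    return rounds

# Cycle graph on 16 nodes (a simple connected stand-in for an RGG)
n = 16
adj = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
rounds = push_broadcast_rounds(adj)
```

    Since the informed set can at most double each round, at least ⌈log₂ n⌉ rounds are always needed; on the cycle the diameter term dominates, consistent with the Θ(diam(G)) bound.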

  4. Nonvolatile random access memory

    NASA Technical Reports Server (NTRS)

    Wu, Jiin-Chuan (Inventor); Stadler, Henry L. (Inventor); Katti, Romney R. (Inventor)

    1994-01-01

    A nonvolatile magnetic random access memory can be achieved by an array of magnet-Hall effect (M-H) elements. The storage function is realized with a rectangular thin-film ferromagnetic material having an in-plane, uniaxial anisotropy and in-plane bipolar remanent magnetization states. The thin-film magnetic element is magnetized by a local applied field, whose direction is used to form either a 0 or 1 state. The element remains in the 0 or 1 state until a switching field is applied to change its state. The stored information is detected by a Hall-effect sensor which senses the fringing field from the magnetic storage element. The circuit design for addressing each cell includes transistor switches for providing a current of selected polarity to store a binary digit through a separate conductor overlying the magnetic element of the cell. To read out a stored binary digit, transistor switches are employed to provide a current through a row of Hall-effect sensors connected in series and enabling a differential voltage amplifier connected to all Hall-effect sensors of a column in series. To avoid read-out voltage errors due to shunt currents through resistive loads of the Hall-effect sensors of other cells in the same column, at least one transistor switch is provided between every pair of adjacent cells in every row; these switches are turned on only in the row of the selected cell.

  5. How random is a random vector?

    SciTech Connect

    Eliazar, Iddo

    2015-12-15

    Over 80 years ago Samuel Wilks proposed that the “generalized variance” of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the “Wilks standard deviation” – the square root of the generalized variance – is indeed the standard deviation of a random vector. We further establish that the “uncorrelation index” – a derivative of the Wilks standard deviation – is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: “randomness measures” and “independence indices” of random vectors. In turn, these general notions give rise to “randomness diagrams” – tangible planar visualizations that answer the question: How random is a random vector? The notion of “independence indices” yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vector empirical data.
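
    Computing the Wilks standard deviation from data is straightforward: it is the square root of the determinant of the sample covariance matrix. A quick numerical sketch (illustrative only; the data and mixing matrix are invented for the demonstration):

```python
import numpy as np

def wilks_std(samples):
    """Square root of the generalized variance (determinant of the covariance)."""
    cov = np.cov(samples, rowvar=False)
    return np.sqrt(np.linalg.det(cov))

rng = np.random.default_rng(0)
independent = rng.normal(size=(5000, 2))   # uncorrelated unit-variance components
mix = np.array([[1.0, 0.9],
                [0.0, 0.1]])
correlated = independent @ mix             # strongly correlated components

# Correlation shrinks the generalized variance, so the Wilks standard
# deviation of the correlated sample is much smaller than that of the
# independent one.
```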

  6. Selective Mutism

    MedlinePlus

    ... and Statistical Manual of Mental Disorders: Fifth Edition (DSM-5: pp.195–197). Children with selective mutism ... How common is selective mutism? According to the DSM-5, selective mutism is an apparently rare disorder ...

  7. Directed random walk with random restarts: The Sisyphus random walk

    NASA Astrophysics Data System (ADS)

    Montero, Miquel; Villarroel, Javier

    2016-09-01

    In this paper we consider a particular version of the random walk with restarts: random reset events which suddenly bring the system to the starting value. We analyze its relevant statistical properties, like the transition probability, and show how an equilibrium state appears. Formulas for the first-passage time, high-water marks, and other extreme statistics are also derived; we consider counting problems naturally associated with the system. Finally we indicate feasible generalizations useful for interpreting different physical effects.
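
    A reset-to-origin walk of this kind is easy to simulate. In the sketch below (an illustrative variant, not necessarily the authors' exact model) the walker advances one unit per step but restarts at the origin with probability r, which yields a geometric equilibrium distribution with mean (1 − r)/r:

```python
import random

def sisyphus_walk(steps, restart_prob=0.2, seed=5):
    """Directed walk with random restarts: +1 each step, but with probability
    `restart_prob` the walker is suddenly reset to the origin."""
    rng = random.Random(seed)
    pos, path = 0, []
    for _ in range(steps):
        if rng.random() < restart_prob:
            pos = 0          # sudden reset to the starting value
        else:
            pos += 1         # otherwise advance one unit
        path.append(pos)
    return path

path = sisyphus_walk(100_000)
mean_pos = sum(path) / len(path)   # should approach (1 - r)/r = 4 for r = 0.2
```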

  8. Directed random walk with random restarts: The Sisyphus random walk.

    PubMed

    Montero, Miquel; Villarroel, Javier

    2016-09-01

    In this paper we consider a particular version of the random walk with restarts: random reset events which suddenly bring the system to the starting value. We analyze its relevant statistical properties, like the transition probability, and show how an equilibrium state appears. Formulas for the first-passage time, high-water marks, and other extreme statistics are also derived; we consider counting problems naturally associated with the system. Finally we indicate feasible generalizations useful for interpreting different physical effects.

  9. Associative Hierarchical Random Fields.

    PubMed

    Ladický, L'ubor; Russell, Chris; Kohli, Pushmeet; Torr, Philip H S

    2014-06-01

    This paper makes two contributions: the first is the proposal of a new model, the associative hierarchical random field (AHRF), and a novel algorithm for its optimization; the second is the application of this model to the problem of semantic segmentation. Most methods for semantic segmentation are formulated as a labeling problem for variables that might correspond to either pixels or segments such as super-pixels. It is well known that the generation of super-pixel segmentations is not unique. This has motivated many researchers to use multiple super-pixel segmentations for problems such as semantic segmentation or single-view reconstruction. These super-pixels have not yet been combined in a principled manner; this is a difficult problem, as they may overlap, or be nested in such a way that the segmentations form a segmentation tree. Our new hierarchical random field model allows information from all of the multiple segmentations to contribute to a global energy. MAP inference in this model can be performed efficiently using powerful graph-cut-based move-making algorithms. Our framework generalizes much of the previous work based on pixels or segments, and the resulting labelings can be viewed either as a detailed segmentation at the pixel level or, at the other extreme, as a segment selector that pieces together a solution like a jigsaw, selecting the best segments from different segmentations as pieces. We evaluate its performance on some of the most challenging data sets for object-class segmentation, and show that this ability to perform inference using multiple overlapping segmentations leads to state-of-the-art results.

  10. 47 CFR 1.1604 - Post-selection hearings.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a...

  11. Fluctuating Selection in the Moran

    PubMed Central

    Dean, Antony M.; Lehman, Clarence; Yi, Xiao

    2017-01-01

    Contrary to classical population genetics theory, experiments demonstrate that fluctuating selection can protect a haploid polymorphism in the absence of frequency dependent effects on fitness. Using forward simulations with the Moran model, we confirm our analytical results showing that a fluctuating selection regime, with a mean selection coefficient of zero, promotes polymorphism. We find that increases in heterozygosity over neutral expectations are especially pronounced when fluctuations are rapid, mutation is weak, the population size is large, and the variance in selection is big. Lowering the frequency of fluctuations makes selection more directional, and so heterozygosity declines. We also show that fluctuating selection raises dn/ds ratios for polymorphism, not only by sweeping selected alleles into the population, but also by purging the neutral variants of selected alleles as they undergo repeated bottlenecks. Our analysis shows that randomly fluctuating selection increases the rate of evolution by increasing the probability of fixation. The impact is especially noticeable when the selection is strong and mutation is weak. Simulations show the increase in the rate of evolution declines as the rate of new mutations entering the population increases, an effect attributable to clonal interference. Intriguingly, fluctuating selection increases the dn/ds ratios for divergence more than for polymorphism, a pattern commonly seen in comparative genomics. Our model, which extends the classical neutral model of molecular evolution by incorporating random fluctuations in selection, accommodates a wide variety of observations, both neutral and selected, with economy. PMID:28108586
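
    The forward simulations referred to above can be sketched with a minimal Moran-type birth-death process in which the selection coefficient flips sign periodically, so its time average is zero. All names and parameter values here are illustrative assumptions, not those of the study:

```python
import random

def moran_fluctuating(pop_size=200, steps=20_000, s=0.3, epoch=50, seed=11):
    """Moran-style process for two haploid types A/B where the selection
    coefficient on A flips sign every `epoch` steps (mean selection = 0)."""
    rng = random.Random(seed)
    n_a = pop_size // 2
    sign = 1
    for t in range(steps):
        if t % epoch == 0:
            sign = -sign                          # fluctuating selection regime
        w_a = 1 + sign * s                        # fitness of type A this epoch
        p = n_a * w_a / (n_a * w_a + (pop_size - n_a))
        birth_a = rng.random() < p                # fitness-proportional birth
        death_a = rng.random() < n_a / pop_size   # uniformly random death
        n_a += int(birth_a) - int(death_a)
        n_a = max(0, min(pop_size, n_a))
        if n_a in (0, pop_size):
            break                                 # fixation: polymorphism lost
    return n_a

final = moran_fluctuating()
```

    Tracking the final count of type A shows whether the polymorphism survived the run; averaging heterozygosity over many seeds would be the way to compare against neutral expectations.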

  12. 32 CFR 1624.4 - Selection and/or rescheduling of registrants for induction.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... their random sequence number (RSN) established by random selection procedures in accord with § 1624.1 (d) Registrants in the age 20 selection group for the current calendar year in the order of their random sequence number (RSN) established by random selection procedures in accord with § 1624.1. (e) Registrants in each...

  13. ABSTRACTS SELECTED FOR AEC 2017: LUMINAL PLENARY: OR-LUM-01: Comparison of endoscopic ultrasound-guided fine needle aspiration by capillary action, suction, and no suction methods: A randomized blinded study

    PubMed Central

    Bansal, Rinkesh; Puri, Rajesh; Choudhary, Narendra S.; Sud, Randhir; Patle, Saurabh; Guleria, Mridula; Sarin, Haimanti; Kaur, Gagandeep; Prabha, Chandra; Bhatia, Sumit

    2017-01-01

    Background: Different types of endoscopic ultrasound (EUS)-guided fine needle aspiration (FNA) techniques are used in clinical practice; the best method in terms of outcome is not standardized. Objectives: To compare diagnostic adequacy of aspirated material, cytopathologic and EUS morphological features among capillary action, suction, and no suction FNA methods. Methods: A prospective, single-blinded, randomized study was conducted at a tertiary care hospital. A total of 37 patients were excluded, and a total of 300 (100 in each arm) patients were included. Patients were randomized into the three groups, i.e., capillary action (Group 1), suction (Group 2), and no suction (Group 3). Results: A total of 300 patients (195 males) underwent EUS-guided FNA of 235 lymph nodes and 65 pancreatic masses (distribution not statistically different among groups); mean age was 52 ± 14 years. A 22-gauge needle was used in the majority of cases (93%). There was no statistically significant difference among the groups regarding lymph node size at the long axis and ratio, type of needle, echo-features, echogenicity, calcification, necrosis, shape, borders (lymph nodes), number of passes, and cellularity. Diagnostic adequacy of the specimen was 91%, 91%, and 94% in Groups 1, 2, and 3, respectively (P = 0.665). The suction group had a significantly greater number of slides and more hemorrhagic slides in comparison to the other groups. Conclusion: EUS-guided FNA by capillary action, suction, and no suction methods has similar diagnostic adequacy of the specimen; the suction method has the disadvantage of a greater number of slides and more hemorrhagic slides.

  14. Efficacy and Safety of the Selective β3-Adrenoceptor Agonist Mirabegron in Japanese Patients with Overactive Bladder: A Randomized, Double-Blind, Placebo-Controlled, Dose-Finding Study.

    PubMed

    Yamaguchi, Osamu; Marui, Eiji; Igawa, Yasuhiko; Takeda, Masayuki; Nishizawa, Osamu; Ikeda, Yasushi; Ohkawa, Sumito

    2015-05-01

    To evaluate the efficacy and safety of the β3-adrenoceptor agonist, mirabegron, compared with placebo in Japanese patients with overactive bladder (OAB). Patients with OAB symptoms for ≥24 weeks, ≥8 micturitions/24 h on average, and ≥1 episode of urgency and/or urgency incontinence/24 h were randomized to mirabegron (25, 50 or 100 mg) or placebo for 12 weeks. The primary endpoint was change from baseline to end of study in the mean number of micturitions/24 h. Secondary endpoints included micturition variables related to urgency, incontinence, volume voided, and quality of life based on the King's Health Questionnaire (KHQ). Safety was evaluated based on adverse events (AEs), laboratory findings, vital signs, electrocardiogram, and post-void residual volume. In total, 842 patients were randomized to placebo (n = 214), mirabegron 25 mg (n = 211), 50 mg (n = 208), or 100 mg (n = 209). The primary endpoint was significantly improved in each mirabegron group compared with placebo (P < 0.001; Williams' multiple comparison test). The maximal efficacy in the primary endpoint was observed at the 50 mg dose. Significant improvements were also observed in incontinence, urgency incontinence, mean volume voided, and 3 of the 9 domains from the KHQ (incontinence impact, physical limitations, and severity measures) at each mirabegron dose. Urgency episodes decreased, and mean volume voided increased, dose-dependently. The incidence of AEs in each mirabegron dose was comparable with placebo. Mirabegron demonstrated significant improvements in OAB symptoms compared with placebo and was well tolerated. © 2014 Wiley Publishing Asia Pty Ltd.

  15. A machine learning methodology for the selection and classification of spontaneous spinal cord dorsum potentials allows disclosure of structured (non-random) changes in neuronal connectivity induced by nociceptive stimulation.

    PubMed

    Martin, Mario; Contreras-Hernández, Enrique; Béjar, Javier; Esposito, Gennaro; Chávez, Diógenes; Glusman, Silvio; Cortés, Ulises; Rudomin, Pablo

    2015-01-01

    Previous studies aimed at disclosing the functional organization of the neuronal networks involved in the generation of the spontaneous cord dorsum potentials (CDPs) in the lumbosacral spinal segments used predetermined templates to select specific classes of spontaneous CDPs. Since this procedure was time consuming and required continuous supervision, it was limited to the analysis of two specific types of CDPs (negative CDPs and negative-positive CDPs), thus excluding potentials that may reflect activation of other neuronal networks of presumed functional relevance. We now present a novel procedure based on machine learning that allows the efficient and unbiased selection of a variety of spontaneous CDPs with different shapes and amplitudes. The reliability and performance of the present method are evaluated by analyzing the effects on the probabilities of generation of different classes of spontaneous CDPs induced by the intradermic injection of small amounts of capsaicin in the anesthetized cat, a procedure known to induce a state of central sensitization leading to allodynia and hyperalgesia. The results obtained with the selection method presently described allowed detection of spontaneous CDPs with specific shapes and amplitudes that are assumed to represent the activation of functionally coupled sets of dorsal horn neurones that acquire different, structured configurations in response to nociceptive stimuli. These changes are considered as responses that tend to adjust the transmission of sensory information to specific functional requirements as part of homeostatic adjustments.

  16. A machine learning methodology for the selection and classification of spontaneous spinal cord dorsum potentials allows disclosure of structured (non-random) changes in neuronal connectivity induced by nociceptive stimulation

    PubMed Central

    Martin, Mario; Contreras-Hernández, Enrique; Béjar, Javier; Esposito, Gennaro; Chávez, Diógenes; Glusman, Silvio; Cortés, Ulises; Rudomin, Pablo

    2015-01-01

    Previous studies aimed at disclosing the functional organization of the neuronal networks involved in the generation of the spontaneous cord dorsum potentials (CDPs) in the lumbosacral spinal segments used predetermined templates to select specific classes of spontaneous CDPs. Since this procedure was time consuming and required continuous supervision, it was limited to the analysis of two specific types of CDPs (negative CDPs and negative-positive CDPs), thus excluding potentials that may reflect activation of other neuronal networks of presumed functional relevance. We now present a novel procedure based on machine learning that allows the efficient and unbiased selection of a variety of spontaneous CDPs with different shapes and amplitudes. The reliability and performance of the present method are evaluated by analyzing the effects on the probabilities of generation of different classes of spontaneous CDPs induced by the intradermic injection of small amounts of capsaicin in the anesthetized cat, a procedure known to induce a state of central sensitization leading to allodynia and hyperalgesia. The results obtained with the selection method presently described allowed detection of spontaneous CDPs with specific shapes and amplitudes that are assumed to represent the activation of functionally coupled sets of dorsal horn neurones that acquire different, structured configurations in response to nociceptive stimuli. These changes are considered as responses that tend to adjust the transmission of sensory information to specific functional requirements as part of homeostatic adjustments. PMID:26379540

  17. Hearing random matrices and random waves

    NASA Astrophysics Data System (ADS)

    Berry, M. V.; Shukla, Pragya

    2013-01-01

    The eigenangles of random matrices in the three standard circular ensembles are rendered as sounds in several different ways. The different fluctuation properties of these ensembles can be heard, and distinguished from the two extreme cases, of angles that are distributed uniformly round the unit circle and those that are random and uncorrelated. Similarly, in Gaussian random superpositions of monochromatic plane waves in one, two and three dimensions, the dimensions can be distinguished in sounds created from one-dimensional sections. This paper is dedicated to the memory of Richard E Crandall.

  18. Spectroscopy with Random and Displaced Random Ensembles

    NASA Astrophysics Data System (ADS)

    Velázquez, V.; Zuker, A. P.

    2002-02-01

    Because of the time-reversal invariance of the angular momentum operator J², the average energies and variances at fixed J for random two-body Hamiltonians exhibit odd-even-J staggering that may be especially strong for J = 0. It is shown that upon ensemble averaging over random runs, this behavior is reflected in the yrast states. Displaced (attractive) random ensembles lead to rotational spectra with strongly enhanced B(E2) transitions for a certain class of model spaces. It is explained how to generalize these results to other forms of collectivity.

  19. On Gaussian random supergravity

    NASA Astrophysics Data System (ADS)

    Bachlechner, Thomas C.

    2014-04-01

    We study the distribution of metastable vacua and the likelihood of slow roll inflation in high dimensional random landscapes. We consider two examples of landscapes: a Gaussian random potential and an effective supergravity potential defined via a Gaussian random superpotential and a trivial Kähler potential. To examine these landscapes we introduce a random matrix model that describes the correlations between various derivatives and we propose an efficient algorithm that allows for a numerical study of high dimensional random fields. Using these novel tools, we find that the vast majority of metastable critical points in N dimensional random supergravities are either approximately supersymmetric with |F| ≪ M_susy or supersymmetric. Such approximately supersymmetric points are dynamical attractors in the landscape and the probability that a randomly chosen critical point is metastable scales as log(P) ∝ −N. We argue that random supergravities lead to potentially interesting inflationary dynamics.

  20. Quantum random number generator based on twin beams.

    PubMed

    Zhang, Qiang; Deng, Xiaowei; Tian, Caixing; Su, Xiaolong

    2017-03-01

    We produce two strings of quantum random numbers simultaneously from the intensity fluctuations of the twin beams generated by a nondegenerate optical parametric oscillator. Two strings of quantum random numbers with bit rates up to 60 Mb/s are extracted simultaneously with a suitable post-processing algorithm. By post-selecting the identical data from two raw sequences and using a suitable hash function, we also extract two strings of identical quantum random numbers. The obtained random numbers pass all NIST randomness tests. The presented scheme shows the feasibility of generating quantum random numbers from the intensity of a macroscopic optical field.

  1. Randomization methods in emergency setting trials: a descriptive review

    PubMed Central

    Moe‐Byrne, Thirimon; Oddie, Sam; McGuire, William

    2015-01-01

    Background Quasi‐randomization might expedite recruitment into trials in emergency care settings but may also introduce selection bias. Methods We searched the Cochrane Library and other databases for systematic reviews of interventions in emergency medicine or urgent care settings. We assessed selection bias (baseline imbalances) in prognostic indicators between treatment groups in trials using true randomization versus trials using quasi‐randomization. Results Seven reviews contained 16 trials that used true randomization and 11 that used quasi‐randomization. Baseline group imbalance was identified in four trials using true randomization (25%) and in two quasi‐randomized trials (18%). Of the four truly randomized trials with imbalance, three concealed treatment allocation adequately. Clinical heterogeneity and poor reporting limited the assessment of trial recruitment outcomes. Conclusions We did not find strong or consistent evidence that quasi‐randomization is associated with selection bias more often than true randomization. High risk of bias judgements for quasi‐randomized emergency studies should therefore not be assumed in systematic reviews. Clinical heterogeneity across trials within reviews, coupled with limited availability of relevant trial accrual data, meant it was not possible to adequately explore the possibility that true randomization might result in slower trial recruitment rates, or the recruitment of less representative populations. © 2015 The Authors. Research Synthesis Methods published by John Wiley & Sons, Ltd. PMID:26333419

  2. 9 CFR 590.350 - Procedures for selecting appeal samples.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... containers plus an equal number of containers selected at random. When the original sample containers cannot be located, the appeal sample shall consist of product taken at random from double the number of... the original sample containers plus an equal number of containers selected at random. A condition...

  3. 7 CFR 42.105 - Basis for selection of sample.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... either sampling procedure used to select the sample.) (1) Proportional random sampling. When the number... per code are known, select sample units at random within each mark and in a number proportionate to the number of containers represented by such mark. (2) Simple random sampling. When there are no code...
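
    The two procedures excerpted above differ only in how sample units are spread across code marks. A sketch of the proportional variant (the function name and lot data are hypothetical illustrations):

```python
import random

def proportional_random_sample(lot, total_sample, seed=9):
    """Proportional random sampling sketch: from each code mark, draw a number
    of sample units proportional to the containers that mark represents,
    selecting units at random (without replacement) within each mark.

    `lot` maps a code mark to the number of containers bearing that mark.
    """
    rng = random.Random(seed)
    total = sum(lot.values())
    sample = {}
    for mark, count in lot.items():
        n = round(total_sample * count / total)      # proportional share for this mark
        sample[mark] = rng.sample(range(count), n)   # random container indices
    return sample

lot = {"MARK-A": 600, "MARK-B": 300, "MARK-C": 100}
sample = proportional_random_sample(lot, total_sample=20)
```

    With a 20-unit sample and marks representing 600, 300, and 100 containers, the proportional shares are 12, 6, and 2 units, respectively.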

  4. 47 CFR 1.1604 - Post-selection hearings.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a) Following the random... grant. If the Commission is unable to make such a determination, it shall order that another random...

  5. 40 CFR 205.57-2 - Test vehicle sample selection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... random selection will be achieved by sequentially numbering all of the vehicles in the batch and then using a table of random numbers to select the number of vehicles as specified in paragraph (c) of this section based on the batch size designated by the Administrator in the test request. An alternative random...

  6. Quantum random number generation

    SciTech Connect

    Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; Zhang, Zhen; Qi, Bing

    2016-06-28

    Quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness -- coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness on the device and the random number generation speed.

  7. Quantum random number generation

    DOE PAGES

    Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; ...

    2016-06-28

    Quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness -- coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness on the device and the random number generation speed.

  8. Quantum random number generation

    NASA Astrophysics Data System (ADS)

    Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; Qi, Bing; Zhang, Zhen

    2016-06-01

    Quantum physics can be exploited to generate true random numbers, which have important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness—coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. On the basis of the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modelling the devices. The second category is self-testing QRNG, in which verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category that provides a tradeoff between the trustworthiness on the device and the random number generation speed.

  9. Assessment of non-BDNF neurotrophins and GDNF levels after depression treatment with sertraline and transcranial direct current stimulation in a factorial, randomized, sham-controlled trial (SELECT-TDCS): an exploratory analysis.

    PubMed

    Brunoni, André R; Machado-Vieira, Rodrigo; Zarate, Carlos A; Vieira, Erica L M; Valiengo, Leandro; Benseñor, Isabela M; Lotufo, Paulo A; Gattaz, Wagner F; Teixeira, Antonio L

    2015-01-02

    The neurotrophic hypothesis of depression states that the major depressive episode is associated with lower neurotrophic factors levels, which increase with amelioration of depressive symptoms. However, this hypothesis has not been extended to investigate neurotrophic factors other than the brain-derived neurotrophic factor (BDNF). We therefore explored whether plasma levels of neurotrophins 3 (NT-3) and 4 (NT-4), nerve growth factor (NGF) and glial cell line derived neurotrophic factor (GDNF) changed after antidepressant treatment and correlated with treatment response. Seventy-three patients with moderate-to-severe, antidepressant-free unipolar depression were assigned to a pharmacological (sertraline) and a non-pharmacological (transcranial direct current stimulation, tDCS) intervention in a randomized, 2 × 2, placebo-controlled design. The plasma levels of NT-3, NT-4, NGF and GDNF were determined by enzyme-linked immunosorbent assay before and after a 6-week treatment course and analyzed according to clinical response and allocation group. We found that tDCS and sertraline (separately and combined) produced significant improvement in depressive symptoms. Plasma levels of all neurotrophic factors were similar across groups at baseline and remained significantly unchanged regardless of the intervention and of clinical response. Also, baseline plasma levels were not associated with clinical response. To conclude, in this 6-week placebo-controlled trial, NT-3, NT-4, NGF and GDNF plasma levels did not significantly change with sertraline or tDCS. These data suggest that these neurotrophic factors are not surrogate biomarkers of treatment response or involved in the antidepressant mechanisms of tDCS. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. A cluster-randomized controlled trial evaluating the effects of delaying onset of adolescent substance abuse on cognitive development and addiction following a selective, personality-targeted intervention programme: the Co-Venture trial.

    PubMed

    O'Leary-Barrett, Maeve; Mâsse, Benoit; Pihl, Robert O; Stewart, Sherry H; Séguin, Jean R; Conrod, Patricia J

    2017-10-01

    Substance use and binge drinking during early adolescence are associated with neurocognitive abnormalities, mental health problems and an increased risk for future addiction. The trial aims to evaluate the protective effects of an evidence-based substance use prevention programme on the onset of alcohol and drug use in adolescence, as well as on cognitive, mental health and addiction outcomes over 5 years. Thirty-eight high schools will be recruited, with a final sample of 31 schools assigned to intervention or control conditions (3826 youth). Brief personality-targeted interventions will be delivered to high-risk youth attending intervention schools during the first year of the trial. Control school participants will receive no intervention above what is offered to them in the regular curriculum by their respective schools. Public/private French and English high schools in Montreal (Canada). All grade 7 students (12-13 years old) will be invited to participate. High-risk youth will be identified as those scoring one standard deviation or more above the school mean on one of the four personality subscales of the Substance Use Risk Profile Scale (40-45% youth). Self-reported substance use and mental health symptoms and cognitive functioning measured annually throughout 5 years. Primary outcomes are the onset of substance use disorders at 4 years post-intervention (year 5). Secondary intermediate outcomes are the onset of alcohol and substance use 2 years post-intervention and neuropsychological functions; namely, the protective effects of substance use prevention on cognitive functions generally, and executive functions and reward sensitivity specifically. This longitudinal, cluster-randomized controlled trial will investigate the impact of a brief personality-targeted intervention program on reducing the onset of addiction 4 years-post intervention. Results will tease apart the developmental sequences of uptake and growth in substance use and cognitive

  11. Additional benefit of using a risk-based selection for prostate biopsy: an analysis of biopsy complications in the Rotterdam section of the European Randomized Study of Screening for Prostate Cancer.

    PubMed

    Chiu, Peter K; Alberts, Arnout R; Venderbos, Lionne D F; Bangma, Chris H; Roobol, Monique J

    2017-09-01

    To investigate biopsy complications and hospital admissions that could be reduced by the use of European Randomized Study of Screening for Prostate Cancer (ERSPC) risk calculators. All biopsies performed in the Rotterdam section of the ERSPC between 1993 and 2015 were included. Biopsy complications and hospital admission data were prospectively recorded in questionnaires that were completed 2 weeks after biopsy. The ERSPC risk calculators 3 (RC3) and 4 (RC4) were applied to men attending the first and subsequent rounds of screening, respectively. Applying the predefined RC3/4 probability thresholds for prostate cancer (PCa) risk of ≥12.5% and high-grade PCa risk ≥3%, we assessed the number of complications, admissions and costs that could be reduced by avoiding biopsies in men below these thresholds. A total of 10 747 biopsies with complete questionnaires were included. For these biopsies a complication rate of 67.9% (7294/10 747), a post-biopsy fever rate of 3.9% (424/10747) and a hospital admission rate of 0.9% (92/10747) were recorded. The fever rate was found to be static over the years, but the hospital admission rate tripled from 0.6% (1993-1996) to 2.1% (2009-2015). Among 7704 biopsies which fit the criteria for RC3 or RC4, 35.8% of biopsies (2757/7704), 37.4% of complications (1972/5268), 39.4% of fever events (128/325) and 42.3% of admissions (30/71) could have been avoided by using one of the risk calculators. More complications could have been avoided if RC4 had been used and for more recent biopsies (2009-2015). Our findings show that 35.9% of the total cost of biopsies and complication treatment could have been avoided. A significant proportion of biopsy complications, hospital admissions and costs could be reduced if biopsy decisions were based on ERSPC risk calculators instead of PSA only. This effect was most prominent in more recent biopsies and in men with repeated biopsies or screening. © 2017 The Authors BJU International © 2017 BJU
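The risk-based selection described above reduces to two threshold checks. A minimal sketch, assuming the rule is disjunctive (either threshold alone triggers a biopsy recommendation); the threshold values come from the abstract, while the function name and the combination rule are assumptions:

```python
def biopsy_advised(pca_risk: float, high_grade_risk: float) -> bool:
    """Apply the predefined ERSPC risk-calculator thresholds from the
    abstract: overall PCa risk >= 12.5% or high-grade PCa risk >= 3%
    (risks expressed as fractions). Men falling below both thresholds
    would avoid biopsy, and its complication risk, under this rule."""
    return pca_risk >= 0.125 or high_grade_risk >= 0.03

# A man with 8% overall risk and 1% high-grade risk falls below both
# thresholds, so the biopsy would be avoided.
print(biopsy_advised(0.08, 0.01))  # → False
```

The risk-calculator inputs themselves (PSA, prostate volume, DRE findings, and so on) are outside this sketch; it only shows how the predefined cut-offs gate the biopsy decision.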

  12. The Efficacy of Single-Agent Epidermal Growth Factor Receptor Tyrosine Kinase Inhibitor Therapy in Biologically Selected Patients with Non-Small-Cell Lung Cancer: A Meta-Analysis of 19 Randomized Controlled Trials.

    PubMed

    Li, Guifang; Gao, Shunji; Sheng, Zhixin; Li, Bin

    2016-01-01

    To determine the efficacy of first-generation single-agent epidermal growth factor receptor (EGFR) tyrosine kinase inhibitor (TKI) therapy in advanced non-small-cell lung cancer patients with known EGFR mutation status, we undertook this pooled analysis. We searched for randomized controlled trials (RCTs) in Medline, Embase, the Cochrane Controlled Trials Register, the Science Citation Index, and the American Society of Clinical Oncology annual meetings. Out of 2,129 retrieved articles, 19 RCTs enrolling 2,016 patients with wild-type EGFR tumors and 1,034 patients with mutant EGFR tumors were identified. For these EGFR mutant patients, single-agent EGFR-TKI therapy improved progression-free survival (PFS) over chemotherapy: the summary hazard ratios (HRs) were 0.41 (p < 0.001) for the first-line setting and 0.46 (p = 0.02) for the second-/third-line setting. For those EGFR wild-type patients, single-agent EGFR-TKI therapy did not do as well as chemotherapy in the first-line setting (HR = 1.65, p = 0.03) and in the second-/third-line setting (HR = 1.27, p = 0.006). No statistically significant difference was observed in terms of overall survival (OS). Using platinum-based doublet chemotherapy as a common comparator, indirect comparison showed the superior efficacy of single-agent EGFR-TKI therapy over EGFR-TKIs added to chemotherapy in PFS [HR = 1.35 (1.03, 1.77), p = 0.03]. Additionally, a marginal trend towards the same direction was found in the OS analysis [HR = 1.16 (0.99, 1.35), p = 0.06]. Interestingly, for those EGFR wild-type tumors, single-agent EGFR-TKI therapy was inferior to EGFR-TKIs added to chemotherapy in PFS [HR = 0.38 (0.33, 0.44), p < 0.001] and OS [HR = 0.83 (0.71, 0.97), p = 0.02]. For these EGFR mutant patients, single-agent EGFR-TKI therapy prolonged PFS over chemotherapy. However, single-agent EGFR-TKI therapy was inferior to chemotherapy in PFS for those EGFR wild-type patients. Single-agent EGFR-TKI therapy could improve PFS over the

  13. [Effect of treatment with a food supplement (containing: selected sea fish cartilage, vitamin C, vitamin E, folic acid, zinc, copper) in women with iron deficiency: double blind, randomized, placebo-controlled trial].

    PubMed

    Rondanelli, M; Opizzi, A; Andreoni, L; Trotti, R

    2006-10-01

    The term iron deficiency is used to indicate a condition in which the content of iron (Fe) in the organism is low, even before the consequent reduction in erythropoiesis comes about. This clinical situation is very frequent in patients of fertile age. The therapy commonly used (Fe salts) is often poorly tolerated. The use of a food supplement containing nutrients useful for improving the bioavailability of Fe, and that is well tolerated, can represent a valid alternative to iron therapy. The present study examined 49 fertile women with iron deficiency, of normal weight and not undergoing estroprogestin treatment. The patients underwent 3 assessments: basal, after 30 and after 60 days, to determine their complete blood count, blood iron, blood ferritin, blood transferrin, iron binding capacity, folates, TSH, FT3, and FT4. Following the basal assessment, patients were randomly assigned to 1 of 2 treatment groups: treatment A (25 patients): food supplement containing hydrolyzed sea fish cartilage, vitamin C, vitamin E, folic acid, zinc, copper (Captafer); treatment B (24 patients): placebo. The patients were then subdivided into 2 groups according to the basal blood iron (<60 microg/dL) or blood ferritin (<20 ng/mL) values. In the group presenting blood iron of <60 microg/dL, only the treatment A supplement produced a significant improvement in blood iron after 30 (P<0.001) and after 60 (P<0.005) days of treatment. The group with basal blood ferritin of <20 ng/mL presented blood iron levels of >60 microg/dL; in these patients, after 60 days of treatment with the supplement, there was a significant increase in blood ferritin (P<0.05); the patients treated with placebo, on the other hand, did not show any significant difference compared to basal values. This study has shown that, in patients with iron deficiency, the use of a food supplement, consisting of nutrients that improve the bioavailability of Fe, leads to a significant improvement in blood iron and blood ferritin levels.

  14. EDITORIAL: Nano and random lasers

    NASA Astrophysics Data System (ADS)

    Wiersma, Diederik S.; Noginov, Mikhail A.

    2010-02-01

    The field of extreme miniature sources of stimulated emission represented by random lasers and nanolasers has gone through an enormous development in recent years. Random lasers are disordered optical structures in which light waves are both multiply scattered and amplified. Multiple scattering is a process that we all know very well from daily experience. Many familiar materials are actually disordered dielectrics and owe their optical appearance to multiple light scattering. Examples are white marble, white painted walls, paper, white flowers, etc. Light waves inside such materials perform random walks, that is they are scattered several times in random directions before they leave the material, and this gives it an opaque white appearance. This multiple scattering process does not destroy the coherence of the light. It just creates a very complex interference pattern (also known as speckle). Random lasers can be made of basically any disordered dielectric material by adding an optical gain mechanism to the structure. In practice this can be achieved with, for instance, laser dye that is dissolved in the material and optically excited by a pump laser. Alternative routes to incorporate gain are achieved using rare-earth or transition metal doped solid-state laser materials or direct band gap semiconductors. The latter can potentially be pumped electrically. After excitation, the material is capable of scattering light and amplifying it, and these two ingredients form the basis for a random laser. Random laser emission can be highly coherent, even in the absence of an optical cavity. The reason is that random structures can sustain optical modes that are spectrally narrow. This provides a spectral selection mechanism that, together with gain saturation, leads to coherent emission. A random laser can have a large number of (randomly distributed) modes that are usually strongly coupled. This means that many modes compete for the gain that is available in a random

  15. An Overview of Randomization and Minimization Programs for Randomized Clinical Trials

    PubMed Central

    Saghaei, Mahmoud

    2011-01-01

    Randomization is an essential component of sound clinical trials, which prevents selection biases and helps in blinding the allocations. Randomization is a process by which subsequent subjects are enrolled into trial groups only by chance, which essentially eliminates selection biases. A possible consequence of randomization is severe imbalance among the treatment groups with respect to some prognostic factors, which can invalidate the trial results or necessitate complex and usually unreliable secondary analyses to eradicate the source of imbalance. Minimization, on the other hand, tends to allocate in such a way as to minimize the differences among groups with respect to prognostic factors. Pure minimization is therefore completely deterministic; that is, one can predict the allocation of the next subject from the factor levels of previously enrolled subjects and the characteristics of the next subject. To eliminate this predictability, it is necessary to include some element of randomness in the minimization algorithm. In this article brief descriptions of randomization and minimization are presented, followed by an introduction to selected randomization and minimization programs. PMID:22606659
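The blocked-randomization scheme with randomly selected block sizes that recurs in these records can be sketched in a few lines. This is an illustrative implementation, not code from any cited program; the block sizes, arm labels, and function name are arbitrary choices:

```python
import random

def blocked_randomization(n, block_sizes=(2, 4, 6), arms=("A", "B"), seed=None):
    """Allocate n participants to arms in blocks whose sizes are drawn at
    random. Allocation stays balanced within every completed block, while
    the block boundaries (and hence upcoming assignments) remain
    unpredictable to an unblinded investigator."""
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n:
        size = rng.choice(block_sizes)            # randomly selected block size
        block = list(arms) * (size // len(arms))  # equal count of each arm
        rng.shuffle(block)                        # random order within the block
        allocation.extend(block)
    return allocation[:n]

alloc = blocked_randomization(20, seed=7)
# Within every completed block the arms are exactly balanced, so the overall
# imbalance is bounded by the truncated final block.
```

Because each block size must be a multiple of the number of arms, the worst-case arm imbalance is half of the largest block size, here 3.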

  16. Quantum random number generators

    NASA Astrophysics Data System (ADS)

    Herrero-Collantes, Miguel; Garcia-Escartin, Juan Carlos

    2017-01-01

    Random numbers are a fundamental resource in science and engineering with important applications in simulation and cryptography. The inherent randomness at the core of quantum mechanics makes quantum systems a perfect source of entropy. Quantum random number generation is one of the most mature quantum technologies with many alternative generation methods. This review discusses the different technologies in quantum random number generation from the early devices based on radioactive decay to the multiple ways to use the quantum states of light to gather entropy from a quantum origin. Randomness extraction and amplification and the notable possibility of generating trusted random numbers even with untrusted hardware using device-independent generation protocols are also discussed.
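The randomness extraction step mentioned at the end of this abstract can be illustrated with the classic von Neumann debiasing procedure; this is a deliberately simple sketch, not one of the stronger seeded extractors deployed QRNGs actually use:

```python
def von_neumann_extract(bits):
    """Consume raw bits in non-overlapping pairs: emit the first bit of each
    unequal pair and discard equal pairs. If the input bits are independent
    with a constant (even unknown) bias, the output bits are exactly
    unbiased, at the cost of throwing many raw bits away."""
    return [a for a, b in zip(bits[::2], bits[1::2]) if a != b]

raw = [1, 1, 0, 1, 1, 0, 0, 0, 0, 1]
# Pairs: (1,1) dropped, (0,1) -> 0, (1,0) -> 1, (0,0) dropped, (0,1) -> 0
print(von_neumann_extract(raw))  # → [0, 1, 0]
```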

  17. Random-Dot Stereogram

    NASA Astrophysics Data System (ADS)

    Kuwayama, Tetsuro

    The technology and history of the random-dot stereogram are described. The paper on the two-picture random-dot stereogram was published in 1960, and the technique became widely known during the 1960s. The principle of the single-image random-dot stereogram (autostereogram), on the other hand, was invented in 1979 but only became widely known after its presentation at an SPIE conference in 1990. The wallpaper stereogram is also described.

  18. Invitation to Random Tensors

    NASA Astrophysics Data System (ADS)

    Gurau, Razvan

    2016-09-01

    This article is a preface to the SIGMA special issue ''Tensor Models, Formalism and Applications'', http://www.emis.de/journals/SIGMA/Tensor_Models.html. The issue is a collection of eight excellent, up-to-date reviews on random tensor models. The reviews combine pedagogical introductions meant for a general audience with presentations of the most recent developments in the field. This preface aims to give a condensed panoramic overview of random tensors as the natural generalization of random matrices to higher dimensions.

  19. Random survival forests for competing risks

    PubMed Central

    Ishwaran, Hemant; Gerds, Thomas A.; Kogalur, Udaya B.; Moore, Richard D.; Gange, Stephen J.; Lau, Bryan M.

    2014-01-01

    We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection in high-dimensional problems and in settings such as HIV/AIDS that involve many competing risks. PMID:24728979

  20. Randomized SUSAN edge detector

    NASA Astrophysics Data System (ADS)

    Qu, Zhi-Guo; Wang, Ping; Gao, Ying-Hui; Wang, Peng

    2011-11-01

    A speed-up technique for the SUSAN edge detector based on random sampling is proposed. Instead of sliding the mask pixel by pixel over an image as the SUSAN edge detector does, the proposed scheme places the mask on randomly chosen pixels to find edges in the image; we hereby name it the randomized SUSAN edge detector (R-SUSAN). Specifically, the R-SUSAN edge detector adopts three approaches in the framework of random sampling to accelerate the SUSAN edge detector: integration of response computation and non-maxima suppression, reduction of unnecessary processing for obviously non-edge pixels, and early termination. Experimental results demonstrate the effectiveness of the proposed method.
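The random-placement idea can be sketched with a toy USAN test on a 3x3 mask. This is an illustrative reduction, not the paper's implementation: the real detector uses a larger circular mask, a soft similarity function, and non-maxima suppression, and the thresholds t and g below are assumptions:

```python
import random

def usan_area(img, x, y, t=27):
    """USAN area at nucleus (x, y): count of 3x3-neighbourhood pixels whose
    intensity lies within t of the nucleus. A small area suggests an edge."""
    c = img[y][x]
    return sum(
        1
        for dy in (-1, 0, 1)
        for dx in (-1, 0, 1)
        if abs(img[y + dy][x + dx] - c) <= t
    )

def randomized_susan_edges(img, n_samples, g=6.75, seed=0):
    """R-SUSAN idea: place the mask at randomly sampled interior pixels
    instead of sliding it over every pixel; flag an edge wherever the USAN
    area falls below the geometric threshold g."""
    rng = random.Random(seed)
    h, w = len(img), len(img[0])
    edges = set()
    for _ in range(n_samples):
        x, y = rng.randrange(1, w - 1), rng.randrange(1, h - 1)
        if usan_area(img, x, y) < g:
            edges.add((x, y))
    return edges

# Tiny synthetic image: dark left half, bright right half.
img = [[0] * 4 + [255] * 4 for _ in range(8)]
edges = randomized_susan_edges(img, n_samples=200)
# Every flagged pixel sits on the vertical intensity boundary (x = 3 or 4).
```

Sampling trades completeness for speed: with enough samples the boundary is still densely hit, while far fewer mask evaluations are performed than an exhaustive scan would require.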

  1. Do randomized clinical trial selection criteria reflect levels of risk as observed in a general population of acute myocardial infarction survivors? The PEGASUS trial in the light of the FAST-MI 2005 registry.

    PubMed

    Puymirat, Etienne; Schiele, François; Zeller, Marianne; Jacquemin, Laurent; Leclercq, Florence; Marcaggi, Xavier; Ferrières, Jean; Simon, Tabassome; Danchin, Nicolas

    2016-11-15

    Few clinical trials have focused on populations with a history of distant myocardial infarction (MI). The PEGASUS trial assessed the impact of dual antiplatelet therapy in such patients, selected by enrichment criteria of high cardiovascular risk. Whether the PEGASUS population reflects the risk of a broader post-MI population is questionable. We analyzed whether 4-year mortality of a routine-practice population would differ according to the inclusion and exclusion criteria used in PEGASUS. FAST-MI is a nationwide French registry recruiting acute MI patients in November 2005; 2490 patients alive and without recurrent MI at one year were classified into three groups: Group 1 ("PEGASUS-like" population; n=1395; 56%), Group 2 (population having ≥1 exclusion criterion for the trial; n=677; 27%), and Group 3 (population meeting neither the PEGASUS inclusion nor exclusion criteria; n=418, 17%). Group 1 patients were older than Group 3 patients, with higher GRACE scores, more comorbidity, and less STEMI, but were younger than the PEGASUS trial population. Enrichment criteria successfully defined a population at higher risk: 4-year survival 83% in Group 1, 97% in Group 2, and 68% in Group 3 (P<0.001). Among risk-enrichment criteria, age alone was highly discriminant: in PEGASUS-like patients, survival was 78% in those ≥65 versus 94% in those <65 years. Enrichment criteria used in PEGASUS succeed in defining a population at increased risk in patients with prior MI, age being the most discriminant factor. The trial population, however, was notably younger and included a higher proportion of men than the corresponding real-life population in France. Clinicaltrials.gov number: NCT00673036. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  2. Rationale and design of the randomized, double-blind trial testing INtraveNous and Oral administration of elinogrel, a selective and reversible P2Y(12)-receptor inhibitor, versus clopidogrel to eVAluate Tolerability and Efficacy in nonurgent Percutaneous Coronary Interventions patients (INNOVATE-PCI).

    PubMed

    Leonardi, Sergio; Rao, Sunil V; Harrington, Robert A; Bhatt, Deepak L; Gibson, C Michael; Roe, Matthew T; Kochman, Janusz; Huber, Kurt; Zeymer, Uwe; Madan, Mina; Gretler, Daniel D; McClure, Matthew W; Paynter, Gayle E; Thompson, Vivian; Welsh, Robert C

    2010-07-01

    Despite current dual-antiplatelet therapy with aspirin and clopidogrel, adverse clinical events continue to occur during and after percutaneous coronary intervention (PCI). The failure of clopidogrel to provide optimal protection may be related to delayed onset of action, interpatient variability in its effect, and an insufficient level of platelet inhibition. Furthermore, the irreversible binding of clopidogrel to the P2Y(12) receptor for the life span of the platelet is associated with increased bleeding risk especially during urgent or emergency surgery. Novel antiplatelet agents are required to improve management of patients undergoing PCI. Elinogrel is a potent, direct-acting (ie, non-prodrug), selective, competitive, and reversible P2Y(12) inhibitor available in both intravenous and oral formulations. The INNOVATE-PCI study is a phase 2 randomized, double-blind, clopidogrel-controlled trial to evaluate the safety, tolerability, and preliminary efficacy of this novel antiplatelet agent in patients undergoing nonurgent PCI.

  3. Random Packing and Random Covering Sequences.

    DTIC Science & Technology

    1988-03-24

    obtained by appealing to a result due to Marsaglia [3] and de Finetti [8]. Their result states that if (X1, X2, ..., Xn) is a random point on the simplex {X ∈ ... to sequential coverage problems. J. Appl. Prob. 11, 281-293. [8] de Finetti, B. (1964). Alcune osservazioni in tema di "suddivisione casuale." Giornale I

  4. Random one-of-N selector

    DOEpatents

    Kronberg, James W.

    1993-01-01

    An apparatus for selecting at random one item of N items on the average comprising counter and reset elements for counting repeatedly between zero and N, a number selected by the user, a circuit for activating and deactivating the counter, a comparator to determine if the counter stopped at a count of zero, an output to indicate an item has been selected when the count is zero or not selected if the count is not zero. Randomness is provided by having the counter cycle very often while varying the relatively longer duration between activation and deactivation of the count. The passive circuit components of the activating/deactivating circuit and those of the counter are selected for the sensitivity of their response to variations in temperature and other physical characteristics of the environment so that the response time of the circuitry varies. Additionally, the items themselves, which may be people, may vary in shape or the time they press a pushbutton, so that, for example, an ultrasonic beam broken by the item or person passing through it will add to the duration of the count and thus to the randomness of the selection.

  5. Random one-of-N selector

    DOEpatents

    Kronberg, J.W.

    1993-04-20

    An apparatus for selecting at random one item of N items on the average comprising counter and reset elements for counting repeatedly between zero and N, a number selected by the user, a circuit for activating and deactivating the counter, a comparator to determine if the counter stopped at a count of zero, an output to indicate an item has been selected when the count is zero or not selected if the count is not zero. Randomness is provided by having the counter cycle very often while varying the relatively longer duration between activation and deactivation of the count. The passive circuit components of the activating/deactivating circuit and those of the counter are selected for the sensitivity of their response to variations in temperature and other physical characteristics of the environment so that the response time of the circuitry varies. Additionally, the items themselves, which may be people, may vary in shape or the time they press a pushbutton, so that, for example, an ultrasonic beam broken by the item or person passing through it will add to the duration of the count and thus to the randomness of the selection.
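The patent's principle, a free-running modulo-N counter sampled after an unpredictable dwell time, can be mimicked in software. This is a sketch only: the hardware derives its unpredictability from physical timing variation (component drift, an interrupted ultrasonic beam, button-press duration), for which a pseudo-random tick count merely stands in:

```python
import random

def one_of_n_selected(n, ticks, start=0):
    """A modulo-N counter advanced for an unpredictable number of ticks is
    caught at zero with probability about 1/N, so each item is selected one
    time in N on average."""
    return (start + ticks) % n == 0

# Simulate many 'button presses' whose jittery durations stand in for the
# environmental variation the patented circuit exploits.
rng = random.Random(1)
hits = sum(one_of_n_selected(5, rng.randrange(10_000)) for _ in range(10_000))
# hits is close to 10_000 / 5 = 2_000
```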

  6. Quantum random number generator

    SciTech Connect

    Pooser, Raphael C.

    2016-05-10

    A quantum random number generator (QRNG) and a photon generator for a QRNG are provided. The photon generator may be operated in a spontaneous mode below a lasing threshold to emit photons. Photons emitted from the photon generator may have at least one random characteristic, which may be monitored by the QRNG to generate a random number. In one embodiment, the photon generator may include a photon emitter and an amplifier coupled to the photon emitter. The amplifier may enable the photon generator to be used in the QRNG without introducing significant bias in the random number and may enable multiplexing of multiple random numbers. The amplifier may also desensitize the photon generator to fluctuations in power supplied thereto while operating in the spontaneous mode. In one embodiment, the photon emitter and amplifier may be a tapered diode amplifier.

  7. Randomness: Quantum versus classical

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2016-05-01

    Recent tremendous development of quantum information theory has led to a number of quantum technological projects, e.g. quantum random generators. This development had stimulated a new wave of interest in quantum foundations. One of the most intriguing problems of quantum foundations is the elaboration of a consistent and commonly accepted interpretation of a quantum state. Closely related problem is the clarification of the notion of quantum randomness and its interrelation with classical randomness. In this short review, we shall discuss basics of classical theory of randomness (which by itself is very complex and characterized by diversity of approaches) and compare it with irreducible quantum randomness. We also discuss briefly “digital philosophy”, its role in physics (classical and quantum) and its coupling to the information interpretation of quantum mechanics (QM).

  8. Bayesian Enrichment Strategies for Randomized Discontinuation Trials

    PubMed Central

    Trippa, Lorenzo; Rosner, Gary L.; Müller, Peter

    2013-01-01

    Summary We propose optimal choice of the design parameters for random discontinuation designs (RDD) using a Bayesian decision-theoretic approach. We consider applications of RDDs to oncology phase II studies evaluating activity of cytostatic agents. The design consists of two stages. The preliminary open-label stage treats all patients with the new agent and identifies a possibly sensitive subpopulation. The subsequent second stage randomizes, treats, follows, and compares outcomes among patients in the identified subgroup, with randomization to either the new or a control treatment. Several tuning parameters characterize the design: the number of patients in the trial, the duration of the preliminary stage, and the duration of follow-up after randomization. We define a probability model for tumor growth, specify a suitable utility function, and develop a computational procedure for selecting the optimal tuning parameters. PMID:21714780

  9. Autonomous Byte Stream Randomizer

    NASA Technical Reports Server (NTRS)

    Paloulian, George K.; Woo, Simon S.; Chow, Edward T.

    2013-01-01

    Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, data transmission has to be efficient, without any redundant or extraneous metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security, requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity, unreadable in its own right, but when combined with all N pieces it can be reconstructed. Reconstruction requires possession of the key used for randomizing the bytes, leading to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. This software is a cornerstone capability possessing the ability to generate the same cryptographically secure sequence on different machines and at different times, thus allowing it to be used more heavily in net-centric environments where data transfer bandwidth is limited.
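The core mechanism, a seeded in-place Fisher-Yates shuffle that can be replayed to reconstruct the original bytes, can be sketched as follows. This is illustrative only: random.Random stands in for the cryptographically secure generator the record describes, and the function names are hypothetical:

```python
import random

def randomize_bytes(data: bytes, seed: int) -> bytes:
    """Scramble the byte stream with a seeded in-place Fisher-Yates shuffle."""
    buf = bytearray(data)
    rng = random.Random(seed)
    for i in range(len(buf) - 1, 0, -1):
        j = rng.randrange(i + 1)        # unbiased index in [0, i]
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

def derandomize_bytes(data: bytes, seed: int) -> bytes:
    """Reconstruct the original stream: regenerate the same swap sequence
    from the seed, then undo the swaps in reverse order (each swap is its
    own inverse)."""
    buf = bytearray(data)
    rng = random.Random(seed)
    swaps = [(i, rng.randrange(i + 1)) for i in range(len(buf) - 1, 0, -1)]
    for i, j in reversed(swaps):
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

msg = b"net-centric payload"
scrambled = randomize_bytes(msg, seed=2023)
assert sorted(scrambled) == sorted(msg)           # a permutation of the bytes
assert derandomize_bytes(scrambled, 2023) == msg  # round-trip with the seed
```

Because the shuffle only permutes bytes, each scrambled fragment carries no readable structure on its own, yet anyone holding the seed can deterministically regenerate the swap sequence and invert it.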

  10. Selective Mutism

    PubMed Central

    2010-01-01

    Selective mutism is a rare, multidimensional childhood disorder that typically affects children entering school age. It is characterized by the persistent failure to speak in select social settings despite the ability to speak, and to speak comfortably, in more familiar settings. Many theories attempt to explain the etiology of selective mutism. Comorbidities and treatment: selective mutism can present with a variety of comorbidities, including enuresis, encopresis, obsessive-compulsive disorder, depression, premorbid speech and language abnormalities, developmental delay, and Asperger's disorder. The specific manifestations and severity of these comorbidities vary by individual. Given the multidimensional manifestations of selective mutism, treatment options are similarly diverse; they include individual behavioral therapy, family therapy, and psychotherapy with antidepressant and anti-anxiety medications. Future directions: while studies have helped to elucidate the phenomenology of selective mutism, limitations and gaps in knowledge persist. In particular, the literature on selective mutism consists primarily of small sample populations and case reports. Future research aims to develop an increasingly integrated, multidimensional framework for evaluating and treating children with selective mutism. PMID:20436772

  11. Teacher Selection.

    ERIC Educational Resources Information Center

    Heynderickx, James J.

    1987-01-01

    A one-page introduction is followed by three pages containing summaries of three journal articles and two documents on teacher selection. Mary Cihak Jensen argues that final selection decisions should be based on multiple information sources, since teaching requires proficiency in many interrelated skills. Superintendent Richard J. Caliendo…

  12. Correlated randomness and switching phenomena

    NASA Astrophysics Data System (ADS)

    Stanley, H. E.; Buldyrev, S. V.; Franzese, G.; Havlin, S.; Mallamace, F.; Kumar, P.; Plerou, V.; Preis, T.

    2010-08-01

    One challenge of biology, medicine, and economics is that the systems treated by these serious scientific disciplines have no perfect metronome in time and no perfect spatial architecture, crystalline or otherwise. Nonetheless, as if by magic, out of nothing but randomness one finds remarkably fine-tuned processes in time and remarkably fine-tuned structures in space. Further, many of these processes and structures have the remarkable feature of "switching" from one behavior to another as if by magic. The past century has, philosophically, been concerned with setting aside the human tendency to see the universe as a fine-tuned machine. Here we address the challenge of uncovering how, through randomness (albeit, as we shall see, strongly correlated randomness), one can arrive at some of the many spatial and temporal patterns in biology, medicine, and economics, and even begin to characterize the switching phenomena that enable a system to pass from one state to another. Inspired by principles developed by A. Nihat Berker and scores of other statistical physicists in recent years, we discuss some applications of correlated randomness to understanding switching phenomena in various fields. Specifically, we present evidence from experiments and from computer simulations supporting the hypothesis that water's anomalies are related to a switching point (not unlike the "tipping point" immortalized by Malcolm Gladwell), and that the bubbles in economic phenomena that occur on all scales are not "outliers" (another Gladwell immortalization). Though more speculative, we support the idea of disease as arising from some kind of yet-to-be-understood complex switching phenomenon by discussing data on selected examples, including heart disease and Alzheimer disease.

  13. The pursuit of balance: An overview of covariate-adaptive randomization techniques in clinical trials.

    PubMed

    Lin, Yunzhi; Zhu, Ming; Su, Zheng

    2015-11-01

    Randomization is fundamental to the design and conduct of clinical trials. Simple randomization ensures independence among subject treatment assignments and prevents potential selection biases, yet it does not guarantee balance in covariate distributions across treatment groups. Ensuring balance in important prognostic covariates across treatment groups is desirable for many reasons. A broad class of randomization methods for achieving balance are reviewed in this paper; these include block randomization, stratified randomization, minimization, and dynamic hierarchical randomization. Practical considerations arising from experience with using the techniques are described. A review of randomization methods used in practice in recent randomized clinical trials is also provided.
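    One of the reviewed techniques, permuted-block randomization with randomly selected block sizes, can be sketched as follows. This is a minimal illustration, not code from the paper; the arm names, candidate block sizes, and seed are assumptions for the example.

```python
import random

def blocked_randomization(n_subjects, arms=("treatment", "control"),
                          block_sizes=(2, 4, 6), seed=None):
    """Assign subjects to arms in permuted blocks whose sizes are drawn at random."""
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_subjects:
        size = rng.choice(block_sizes)            # random block size (multiple of #arms)
        block = list(arms) * (size // len(arms))  # each block is balanced across arms
        rng.shuffle(block)                        # permute assignments within the block
        assignments.extend(block)
    return assignments[:n_subjects]

alloc = blocked_randomization(20, seed=42)
```

    Because the block size varies unpredictably, an unblinded investigator cannot deduce the next assignment from the position within a block, while near-balance across arms is still guaranteed.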

  14. Optofluidic random laser

    NASA Astrophysics Data System (ADS)

    Shivakiran Bhaktha, B. N.; Bachelard, Nicolas; Noblin, Xavier; Sebbah, Patrick

    2012-10-01

    Random lasing is reported in a dye-circulated structured polymeric microfluidic channel. The role of disorder, which results from limited accuracy of photolithographic process, is demonstrated by the variation of the emission spectrum with local-pump position and by the extreme sensitivity to a local perturbation of the structure. Thresholds comparable to those of conventional microfluidic lasers are achieved, without the hurdle of state-of-the-art cavity fabrication. Potential applications of optofluidic random lasers for on-chip sensors are discussed. Introduction of random lasers in the field of optofluidics is a promising alternative to on-chip laser integration with light and fluidic functionalities.

  15. Derivation of Randomized Algorithms.

    DTIC Science & Technology

    1985-10-01

    Referenced algorithms: 1) a randomized algorithm […81], 2) the AKS deterministic algorithm [AKS83], 3) the Column Sorting algorithm [Leighton 83], 4) the FLASH SORT algorithm [Reif and Valiant 83]. … This result immediately implies that r ≤ c·log N with probability > 1 − O(N^−a) (for any a), thus proving the claim. Lemma 3.3: a random S ⊆ X of … will be sampleselect(⌊N/2⌋, N). With this modification, quicksort becomes: algorithm samplesort(X): if |X| = 1 then return X; choose a random subset S ⊆ X …
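    The samplesort idea recoverable from this fragmentary extract, replacing quicksort's pivot choice with a pivot drawn from a small random subset S ⊆ X, can be sketched as follows. This is a hedged reconstruction for illustration, not the report's algorithm; the sample size of 3 is an assumption.

```python
import random

def samplesort(xs, rng=None):
    """Quicksort variant: the pivot is the median of a small random subset S of X."""
    rng = rng or random.Random(0)
    if len(xs) <= 1:
        return list(xs)
    k = min(3, len(xs))
    sample = rng.sample(xs, k)        # choose a random subset S of X
    pivot = sorted(sample)[k // 2]    # median of S: a pivot likely near the true median
    less = [x for x in xs if x < pivot]
    equal = [x for x in xs if x == pivot]
    greater = [x for x in xs if x > pivot]
    return samplesort(less, rng) + equal + samplesort(greater, rng)
```

    Sampling the pivot makes the worst-case input data-independent: bad splits happen only with small probability over the algorithm's own random choices, which is the source of the high-probability O(log N) recursion-depth bounds the extract alludes to.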

  16. Universal statistics of selected values

    NASA Astrophysics Data System (ADS)

    Smerlak, Matteo; Youssef, Ahmed

    2017-03-01

    Selection, the tendency of some traits to become more frequent than others under the influence of some (natural or artificial) agency, is a key component of Darwinian evolution and countless other natural and social phenomena. Yet a general theory of selection, analogous to the Fisher-Tippett-Gnedenko theory of extreme events, is lacking. Here we introduce a probabilistic definition of selection and show that selected values are attracted to a universal family of limiting distributions which generalize the log-normal distribution. The universality classes and scaling exponents are determined by the tail thickness of the random variable under selection. Our results provide a possible explanation for skewed distributions observed in diverse contexts where selection plays a key role, from molecular biology to agriculture and sport.

  17. Molecular selection in a unified evolutionary sequence

    NASA Technical Reports Server (NTRS)

    Fox, S. W.

    1986-01-01

    With guidance from experiments and observations that indicate internally limited phenomena, an outline of a unified evolutionary sequence is inferred. Such unification is not visible in a context of random matrix and random mutation. The sequence proceeds from the Big Bang through prebiotic matter and protocells, through the evolving cell via molecular and natural selection, to mind, behavior, and society.

  18. On selective influences, marginal selectivity, and Bell/CHSH inequalities.

    PubMed

    Dzhafarov, Ehtibar N; Kujala, Janne V

    2014-01-01

    The Bell/CHSH inequalities of quantum physics are identical with the inequalities derived in mathematical psychology for the problem of selective influences in cases involving two binary experimental factors and two binary random variables recorded in response to them. The following points are made regarding cognitive science applications: (1) compliance of data with these inequalities is informative only if the data satisfy the requirement known as marginal selectivity; (2) both violations of marginal selectivity and violations of the Bell/CHSH inequalities are interpretable as indicating that at least one of the two responses is influenced by both experimental factors. Copyright © 2013 Cognitive Science Society, Inc.
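    A small numeric sketch makes the two points concrete. The hypothetical joint distributions below, for two ±1 responses under 2×2 factor levels, satisfy marginal selectivity and yet violate the CHSH bound of 2; all numbers are invented for illustration and are not from the paper.

```python
from itertools import product

def make_joint(ea, eb, eab):
    """Joint pmf of two ±1 variables from marginal expectations and their correlation."""
    return {(a, b): (1 + a * ea + b * eb + a * b * eab) / 4
            for a in (1, -1) for b in (1, -1)}

# p[(alpha, beta)] is the joint distribution of (A, B) under factor levels alpha, beta.
p = {(1, 1): make_joint(0.0, 0.0, 0.6),
     (1, 2): make_joint(0.0, 0.0, 0.6),
     (2, 1): make_joint(0.0, 0.0, 0.6),
     (2, 2): make_joint(0.0, 0.0, -0.6)}

def marginal_selectivity(p):
    """A's marginal must not depend on beta; B's must not depend on alpha."""
    a_marg = {c: sum(q for (a, _), q in p[c].items() if a == 1) for c in p}
    b_marg = {c: sum(q for (_, b), q in p[c].items() if b == 1) for c in p}
    ok_a = all(abs(a_marg[(i, 1)] - a_marg[(i, 2)]) < 1e-9 for i in (1, 2))
    ok_b = all(abs(b_marg[(1, j)] - b_marg[(2, j)]) < 1e-9 for j in (1, 2))
    return ok_a and ok_b

def chsh(p):
    """Largest CHSH combination (one minus sign) of the four correlations E[AB]."""
    e = {c: sum(a * b * q for (a, b), q in p[c].items()) for c in p}
    return max(abs(sum(s * e[c] for s, c in zip(signs, sorted(e))))
               for signs in product((1, -1), repeat=4) if signs.count(-1) == 1)
```

    Here `chsh(p)` evaluates to 2.4 > 2 even though marginal selectivity holds, illustrating that the inequalities carry information beyond the marginal-selectivity requirement.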

  19. Selecting Interventions.

    ERIC Educational Resources Information Center

    Langdon, Danny G.

    1997-01-01

    Describes a systematic approach to selecting instructional designs, discussing performance analysis, gaps, elements (inputs, conditions, process, outputs, consequences, feedback), matrices, changes in performance state (establishing, improving, maintaining, and extinguishing performance), intervention interference, and involving others in…

  20. Selected References.

    ERIC Educational Resources Information Center

    Allen, Walter C.

    1987-01-01

    This extensive bibliography on library building includes 15 categories: bibliography; background; general; planning teams; building programs; alternatives to new buildings; academic libraries; public libraries; school libraries; special libraries; site selection; interior planning and equipment; maintenance; security; and moving. (MES)

  1. Strategies for Improving Precision in Group-Randomized Experiments

    ERIC Educational Resources Information Center

    Raudenbush, Stephen W.; Martinez, Andres; Spybrook, Jessaca

    2007-01-01

    Interest has rapidly increased in studies that randomly assign classrooms or schools to interventions. When well implemented, such studies eliminate selection bias, providing strong evidence about the impact of the interventions. However, unless expected impacts are large, the number of units to be randomized needs to be quite large to achieve…

  2. Random array grid collimator

    DOEpatents

    Fenimore, E.E.

    1980-08-22

    A hexagonally shaped quasi-random no-two-holes-touching grid collimator. The quasi-random array grid collimator eliminates contamination from small-angle off-axis rays by using a no-two-holes-touching pattern which simultaneously provides a self-supporting array, increasing throughput by eliminating the substrate. The present invention also provides maximum throughput using hexagonally shaped holes in a hexagonal lattice pattern for diffraction-limited applications. Mosaicking is also disclosed for reducing fabrication effort.

  3. Randomized Prediction Games for Adversarial Machine Learning.

    PubMed

    Rota Bulo, Samuel; Biggio, Battista; Pillai, Ignazio; Pelillo, Marcello; Roli, Fabio

    2016-08-04

    In spam and malware detection, attackers exploit randomization to obfuscate malicious data and increase their chances of evading detection at test time, e.g., malware code is typically obfuscated using random strings or byte sequences to hide known exploits. Interestingly, randomization has also been proposed to improve security of learning algorithms against evasion attacks, as it results in hiding information about the classifier to the attacker. Recent work has proposed game-theoretical formulations to learn secure classifiers, by simulating different evasion attacks and modifying the classification function accordingly. However, both the classification function and the simulated data manipulations have been modeled in a deterministic manner, without accounting for any form of randomization. In this paper, we overcome this limitation by proposing a randomized prediction game, namely, a noncooperative game-theoretic formulation in which the classifier and the attacker make randomized strategy selections according to some probability distribution defined over the respective strategy set. We show that our approach allows one to improve the tradeoff between attack detection and false alarms with respect to the state-of-the-art secure classifiers, even against attacks that are different from those hypothesized during design, on application examples including handwritten digit recognition, spam, and malware detection.
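    The benefit of a randomized strategy selection can be illustrated on a toy two-classifier, two-attack game. The detection rates below are invented for illustration; the paper's actual game is played over classifier functions and simulated data manipulations, not a 2×2 matrix.

```python
def worst_case_detection(p, payoff):
    """Attacker best-responds: the worst column when the defender plays row 0 w.p. p."""
    return min(p * payoff[0][j] + (1 - p) * payoff[1][j] for j in range(2))

# Hypothetical detection rates: rows are two classifiers, columns two evasion attacks.
payoff = [[0.9, 0.2],
          [0.3, 0.8]]

best_pure = max(min(row) for row in payoff)                       # best deterministic choice
best_mixed = max(worst_case_detection(k / 100, payoff) for k in range(101))
```

    The best deterministic classifier guarantees only 0.3 detection against a best-responding attacker, while mixing between the two classifiers raises the worst-case guarantee above 0.5, which is the tradeoff improvement the randomized prediction game formalizes.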

  4. Evolutionary dynamics on random structures

    SciTech Connect

    Fraser, S.M.; Reidys, C.M.

    1997-04-01

    In this paper the authors consider the evolutionary dynamics of populations of sequences, under a process of selection at the phenotypic level of structures. They use a simple graph-theoretic representation of structures which captures well the properties of the mapping between RNA sequences and their molecular structure. Each sequence is assigned to a structure by means of a sequence-to-structure mapping. The authors make the basic assumption that every fitness landscape can be factorized through the structures. The set of all sequences that map into a particular random structure can then be modeled as a random graph in sequence space, the so-called neutral network. They analyze in detail how an evolving population searches for new structures, in particular how they switch from one neutral network to another. They verify that transitions occur directly between neutral networks, and study the effects of different population sizes and the influence of the relatedness of the structures on these transitions. In fitness landscapes where several structures exhibit high fitness, the authors then study evolutionary paths on the structural level taken by the population during its search. They present a new way of expressing structural similarities which are shown to have relevant implications for the time evolution of the population.

  5. When Is Selection Effective?

    PubMed

    Gravel, Simon

    2016-05-01

    Deleterious alleles can reach high frequency in small populations because of random fluctuations in allele frequency. This may lead, over time, to reduced average fitness. In this sense, selection is more "effective" in larger populations. Recent studies have considered whether the different demographic histories across human populations have resulted in differences in the number, distribution, and severity of deleterious variants, leading to an animated debate. This article first seeks to clarify some terms of the debate by identifying differences in definitions and assumptions used in recent studies. We argue that variants of Morton, Crow, and Muller's "total mutational damage" provide the soundest and most practical basis for such comparisons. Using simulations, analytical calculations, and 1000 Genomes Project data, we provide an intuitive and quantitative explanation for the observed similarity in genetic load across populations. We show that recent demography has likely modulated the effect of selection and still affects it, but the net result of the accumulated differences is small. Direct observation of differential efficacy of selection for specific allele classes is nevertheless possible with contemporary data sets. By contrast, identifying average genome-wide differences in the efficacy of selection across populations will require many modeling assumptions and is unlikely to provide much biological insight about human populations.
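    The interplay of drift and selection discussed above can be illustrated with a toy Wright-Fisher simulation; the population sizes, selection coefficient, and seeds below are illustrative assumptions, not parameters from the article.

```python
import random

def wright_fisher(n, s, p0=0.5, generations=200, rng=None):
    """Final frequency of a deleterious allele (fitness 1 - s) under binomial drift."""
    rng = rng or random.Random(1)
    p = p0
    for _ in range(generations):
        w = p * (1 - s) + (1 - p)                  # mean population fitness
        p_sel = p * (1 - s) / w                    # deterministic effect of selection
        p = sum(rng.random() < p_sel for _ in range(n)) / n  # drift: binomial resampling
        if p in (0.0, 1.0):
            break                                  # allele lost or fixed
    return p

small_pop = [wright_fisher(20, 0.02, rng=random.Random(k)) for k in range(100)]
big_pop = [wright_fisher(500, 0.02, generations=100, rng=random.Random(k))
           for k in range(30)]
fixed_small = sum(p == 1.0 for p in small_pop) / len(small_pop)
fixed_big = sum(p == 1.0 for p in big_pop) / len(big_pop)
```

    With n = 20 the deleterious allele frequently drifts to fixation, while with n = 500 selection reliably purges it: the sense in which selection is more "effective" in larger populations.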

  6. The in vitro selection world.

    PubMed

    Jijakli, Kenan; Khraiwesh, Basel; Fu, Weiqi; Luo, Liming; Alzahmi, Amnah; Koussa, Joseph; Chaiboonchoe, Amphun; Kirmizialtin, Serdal; Yen, Laising; Salehi-Ashtiani, Kourosh

    2016-08-15

    Through iterative cycles of selection, amplification, and mutagenesis, in vitro selection provides the ability to isolate molecules of desired properties and function from large pools (libraries) of random molecules with as many as 10^16 distinct species. This review, in recognition of a quarter century of scientific discoveries made through in vitro selection, starts with a brief overview of the method and its history. It further covers recent developments, with a focus on tools that enhance the capabilities of in vitro selection and its expansion from purely nucleic acid selection to that of polypeptides and proteins. In addition, we cover how next-generation sequencing and modern computational biology tools are being used to complement in vitro selection experiments. At the very least, sequencing and computational tools can translate the large volume of information associated with in vitro selection experiments into manageable, analyzable, and exploitable information. Finally, in vivo selection is briefly compared and contrasted with in vitro selection to highlight the unique capabilities of each method. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
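    The selection-amplification-mutagenesis cycle can be simulated in miniature. The target motif, pool size, survival fraction, and mutation rate below are invented for illustration; real selections act on binding or catalysis, not a string match.

```python
import random

BASES = "ACGU"
MOTIF = "GGAA"   # hypothetical target motif the selection enriches for

def fitness(seq):
    """Best match (0-4) of any window of the sequence against the motif."""
    return max(sum(a == b for a, b in zip(seq[i:], MOTIF))
               for i in range(len(seq) - len(MOTIF) + 1))

def selection_round(pool, rng, keep=0.2, mutation_rate=0.01):
    """One cycle: select the fittest fifth, amplify 5x, then mutagenize."""
    survivors = sorted(pool, key=fitness, reverse=True)[: int(len(pool) * keep)]
    amplified = [rng.choice(survivors) for _ in range(5 * len(survivors))]
    return ["".join(rng.choice(BASES) if rng.random() < mutation_rate else c
                    for c in s) for s in amplified]

rng = random.Random(7)
pool = ["".join(rng.choice(BASES) for _ in range(20)) for _ in range(500)]
start_fitness = sum(map(fitness, pool)) / len(pool)
for _ in range(8):
    pool = selection_round(pool, rng)
end_fitness = sum(map(fitness, pool)) / len(pool)
```

    Over a handful of rounds the pool's mean motif match climbs toward the maximum, mirroring how iterated selection enriches rare functional species from an initially random library.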

  7. Theory of random packings

    NASA Astrophysics Data System (ADS)

    Makse, Hernan; Song, Chaoming; Wang, Ping

    2009-03-01

    We present a theory of random packings to describe the statistical mechanics of jammed matter, with the aim of shedding light on the long-standing problem of characterizing the random close packing (RCP) and random loose packing (RLP) of particles. We describe the jammed system with equations of state relating observables such as entropy, coordination number, volume fraction, and compactivity, as well as the probability distributions of volume and contacts. We follow a systematic route to classify packings into a phase diagram of jamming, from frictionless to frictional particles, from hard to deformable particles, from monodisperse to polydisperse systems, and from spherical particles to nonspherical convex particles, in an attempt to understand the packing problem from a unifying perspective. The studies of RCP and RLP include 2d, nd, and the mean-field limit of infinite dimension.

  8. Reconstructing random media

    NASA Astrophysics Data System (ADS)

    Yeong, C. L. Y.; Torquato, S.

    1998-01-01

    We formulate a procedure to reconstruct the structure of general random heterogeneous media from limited morphological information by extending the methodology of Rintoul and Torquato [J. Colloid Interface Sci. 186, 467 (1997)] developed for dispersions. The procedure has the advantages that it is simple to implement and generally applicable to multidimensional, multiphase, and anisotropic structures. Furthermore, an extremely useful feature is that it can incorporate any type and number of correlation functions in order to provide as much morphological information as is necessary for accurate reconstruction. We consider a variety of one- and two-dimensional reconstructions, including periodic and random arrays of rods, various distributions of disks, Debye random media, and a Fontainebleau sandstone sample. We also use our algorithm to construct heterogeneous media from specified hypothetical correlation functions, including an exponentially damped, oscillating function as well as physically unrealizable ones.

  9. Leaky Random Oracle

    NASA Astrophysics Data System (ADS)

    Yoneyama, Kazuki; Miyagawa, Satoshi; Ohta, Kazuo

    This work focuses on a vulnerability of hash functions due to sloppy usage or implementation in the real world. Even if the cryptographic research community succeeded in developing a perfectly secure random function as the random oracle, it might still be broken in some sense by invalid uses. In this paper, we propose a new variant of the random oracle model in order to analyze the security of cryptographic protocols under invalid use of hash functions. Our model allows adversaries to obtain arbitrary contents of the hash list of input and output pairs. We then analyze the security of several prevailing protocols (FDH, OAEP, the Cramer-Shoup cryptosystem, the Kurosawa-Desmedt cryptosystem, NAXOS) in our model. As a result of these analyses, we clarify that FDH and the Cramer-Shoup cryptosystem remain secure in our model, while the others are insecure. This result shows the separation between our model and the standard model.

  10. 9 CFR 592.450 - Procedures for selecting appeal samples.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... random. When the original sample containers cannot be located, the appeal sample shall consist of product taken at random from double the number of original sample containers. (c) Condition inspection. The... containers selected at random. A condition appeal cannot be made unless all originally sampled containers are...

  11. Integral Histogram with Random Projection for Pedestrian Detection.

    PubMed

    Liu, Chang-Hua; Lin, Jian-Kun

    2015-01-01

    In this paper, we give a systematic study reporting several deep insights into the HOG feature, one of the most widely used features in modern computer vision and image processing applications. We first show that its gradient magnitudes can be randomly projected with a random matrix. To handle over-fitting, an integral histogram based on the differences of randomly selected blocks is proposed. The experiments show that both the random projection and the integral histogram clearly outperform the HOG feature. Finally, the two ideas are combined into a new descriptor, termed IHRP, which outperforms the HOG feature with fewer dimensions and higher speed.
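    The random-projection step can be sketched generically. This is not the authors' code; the feature dimension, target dimension, and the use of uniform vectors as stand-ins for gradient magnitudes are assumptions for illustration.

```python
import math
import random

def random_projection(d_in, d_out, seed=0):
    """Gaussian random matrix scaled so distances are preserved in expectation."""
    rng = random.Random(seed)
    s = 1.0 / math.sqrt(d_out)
    return [[s * rng.gauss(0, 1) for _ in range(d_in)] for _ in range(d_out)]

def project(vec, mat):
    """Map a d_in-dimensional vector to d_out dimensions."""
    return [sum(r * v for r, v in zip(row, vec)) for row in mat]

rng = random.Random(1)
x = [rng.random() for _ in range(256)]   # stand-in for a vector of gradient magnitudes
y = [rng.random() for _ in range(256)]
R = random_projection(256, 64)
ratio = math.dist(project(x, R), project(y, R)) / math.dist(x, y)
```

    In the Johnson-Lindenstrauss sense, pairwise distances survive the 4x dimension reduction up to small relative error, which is why a classifier can operate on the projected features with fewer dimensions and higher speed.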

  12. Integral Histogram with Random Projection for Pedestrian Detection

    PubMed Central

    Liu, Chang-Hua; Lin, Jian-Kun

    2015-01-01

    In this paper, we give a systematic study reporting several deep insights into the HOG feature, one of the most widely used features in modern computer vision and image processing applications. We first show that its gradient magnitudes can be randomly projected with a random matrix. To handle over-fitting, an integral histogram based on the differences of randomly selected blocks is proposed. The experiments show that both the random projection and the integral histogram clearly outperform the HOG feature. Finally, the two ideas are combined into a new descriptor, termed IHRP, which outperforms the HOG feature with fewer dimensions and higher speed. PMID:26569486

  13. Tunable random fiber laser

    SciTech Connect

    Babin, S. A.; Podivilov, E. V.; El-Taher, A. E.; Harper, P.; Turitsyn, S. K.

    2011-08-15

    An optical fiber is treated as a natural one-dimensional random system where lasing is possible due to a combination of Rayleigh scattering by refractive index inhomogeneities and distributed amplification through the Raman effect. We present such a random fiber laser that is tunable over a broad wavelength range with uniquely flat output power and high efficiency, which outperforms traditional lasers of the same category. Outstanding characteristics defined by deep underlying physics and the simplicity of the scheme make the demonstrated laser a very attractive light source both for fundamental science and practical applications.

  14. Conformations of Random Polyampholytes

    NASA Astrophysics Data System (ADS)

    Yamakov, Vesselin; Milchev, Andrey; Limbach, Hans Jörg; Dünweg, Burkhard; Everaers, Ralf

    2000-11-01

    We study the size Rg of random polyampholytes (i.e., polymers with randomly charged monomers) as a function of their length N. All results of our extensive Monte Carlo simulations can be rationalized in terms of the scaling theory we develop for the Kantor-Kardar necklace model, although this theory neglects the quenched disorder in the charge sequence along the chain. We find Rg ~ N^(1/2). The elongated-globule model, the initial predictions of both Higgs and Joanny (Rg ~ N^(1/3)) and Kantor and Kardar (Rg ~ N), and previous numerical estimates are ruled out.

  15. Powerful narrow linewidth random fiber laser

    NASA Astrophysics Data System (ADS)

    Ye, Jun; Xu, Jiangming; Zhang, Hanwei; Zhou, Pu

    2017-03-01

    In this paper, we demonstrate a narrow linewidth random fiber laser, which employs a tunable pump laser to select the operating wavelength for efficiency optimization, a narrow-band fiber Bragg grating (FBG) and a section of single mode fiber to construct a half-open cavity, and a circulator to separate pump light input and random lasing output. Spectral linewidth down to 42.31 GHz is achieved through filtering by the FBG. When 8.97 W pump light centered at the optimized wavelength 1036.5 nm is launched into the half-open cavity, 1081.4 nm random lasing with the maximum output power of 2.15 W is achieved, which is more powerful than the previous reported results.

  16. Powerful narrow linewidth random fiber laser

    NASA Astrophysics Data System (ADS)

    Ye, Jun; Xu, Jiangming; Zhang, Hanwei; Zhou, Pu

    2016-11-01

    In this paper, we demonstrate a narrow linewidth random fiber laser, which employs a tunable pump laser to select the operating wavelength for efficiency optimization, a narrow-band fiber Bragg grating (FBG) and a section of single mode fiber to construct a half-open cavity, and a circulator to separate pump light input and random lasing output. Spectral linewidth down to 42.31 GHz is achieved through filtering by the FBG. When 8.97 W pump light centered at the optimized wavelength 1036.5 nm is launched into the half-open cavity, 1081.4 nm random lasing with the maximum output power of 2.15 W is achieved, which is more powerful than the previous reported results.

  17. Statistical properties of random clique networks

    NASA Astrophysics Data System (ADS)

    Ding, Yi-Min; Meng, Jun; Fan, Jing-Fang; Ye, Fang-Fu; Chen, Xiao-Song

    2017-10-01

    In this paper, a random clique network model is introduced to mimic the large clustering coefficient and the modular structure found in many real complex networks, such as social networks, artificial networks, and protein interaction networks, by combining the random selection rule of the Erdős-Rényi (ER) model with the concept of cliques. We find that random clique networks with a small average degree differ from the ER network in having a large clustering coefficient and a power-law clustering spectrum, while networks with a high average degree have properties similar to the ER model. In addition, we find that the relation between the clustering coefficient and the average degree is non-monotonic and that the degree distributions can be fit by multiple Poisson curves; we explain the origin of these novel behaviors and degree distributions.
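    A minimal sketch of such a model plants cliques on uniformly selected node sets and measures the resulting clustering; the graph size, number of cliques, and clique size below are illustrative, not parameters from the paper.

```python
import random
from itertools import combinations

def random_clique_network(n, n_cliques, q, seed=0):
    """Adjacency sets of a graph built from cliques on uniformly selected q-node sets."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for _ in range(n_cliques):
        members = rng.sample(range(n), q)       # ER-style uniform random selection
        for u, v in combinations(members, 2):   # wire the selected nodes into a clique
            adj[u].add(v)
            adj[v].add(u)
    return adj

def clustering(adj, v):
    """Fraction of v's neighbour pairs that are themselves linked."""
    nb = list(adj[v])
    if len(nb) < 2:
        return 0.0
    closed = sum(1 for u, w in combinations(nb, 2) if w in adj[u])
    return 2.0 * closed / (len(nb) * (len(nb) - 1))

g = random_clique_network(200, 80, 4)
avg_clustering = sum(clustering(g, v) for v in g) / len(g)
```

    At this low average degree the clustering coefficient is far above the ER value for the same edge density (roughly the density itself, here about 0.02), reflecting the locally complete neighbourhoods the planted cliques create.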

  18. Selected Health Practices Among Ohio's Rural Residents.

    ERIC Educational Resources Information Center

    Phillips, G. Howard; Pugh, Albert

    Using a stratified random sample of 12 of Ohio's 88 counties, this 1967 study had as its objectives (1) to measure the level of participation in selected health practices by Ohio's rural residents, (2) to compare the level of participation in selected health practices of farm and rural nonfarm residents, and (3) to examine levels of participation…

  19. Model Selection with the Linear Mixed Model for Longitudinal Data

    ERIC Educational Resources Information Center

    Ryoo, Ji Hoon

    2011-01-01

    Model building or model selection with linear mixed models (LMMs) is complicated by the presence of both fixed effects and random effects. The fixed effects structure and random effects structure are codependent, so selection of one influences the other. Most presentations of LMM in psychology and education are based on a multilevel or…

  20. 40 CFR 204.57-2 - Test compressor sample selection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of compressors of the category or configuration specified in the test request. The random selection... of random numbers to select the number of compressors, as specified in paragraph (c) of this section...) The Acceptable Quality Level is 10 percent. The appropriate sampling plans associated with the...

  1. 40 CFR 761.355 - Third level of sample selection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...

  2. Model Selection with the Linear Mixed Model for Longitudinal Data

    ERIC Educational Resources Information Center

    Ryoo, Ji Hoon

    2011-01-01

    Model building or model selection with linear mixed models (LMMs) is complicated by the presence of both fixed effects and random effects. The fixed effects structure and random effects structure are codependent, so selection of one influences the other. Most presentations of LMM in psychology and education are based on a multilevel or…

  3. Randomness + determinism = progresses: why random processes could be favored by evolution.

    PubMed

    Schabanel, Nicolas

    2012-09-01

    Biologists are in some sense pioneers of the idea that progress can be driven by randomness: randomness is one of the main engines of evolution, and small variations induced by randomness, coupled with natural selection, allow species to adapt to their changing environment. Studies from the last 40 years in computer science suggest that randomness is in fact capable of much more, and have revealed unexpected possibilities that might appear impossible at first. Furthermore, it turns out that these discoveries are faster, cheaper, and above all exponentially thriftier than their deterministic alternatives. This means that random exploration would almost surely generate a stochastic process long before any equivalent deterministic counterpart is found. It follows that such processes are most likely favored by evolution and should thus be known to anyone dealing with systems (alive or not) having access to random sources. This article presents some of these counter-intuitive results as a possible source of inspiration for studying systems fed with randomness. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. [Intel random number generator-based true random number generator].

    PubMed

    Huang, Feng; Shen, Hong

    2004-09-01

    To establish a true random number generator on the basis of certain Intel chips, random numbers were acquired programmatically (Microsoft Visual C++ 6.0) by register reading from the random number generator (RNG) unit of an Intel 815 chipset-based computer with the Intel Security Driver (ISD). We tested the generator with 500 random numbers using the NIST FIPS 140-1 tests and a chi-square (χ2) goodness-of-fit test, and the results showed that the random numbers it generated satisfied the requirements of independence and uniform distribution. We also compared, statistically, the random numbers generated by the Intel RNG-based true random number generator with those from a random number table, using the same amount of 7500 random numbers in the same value domain; the SD, SE, and CV of the Intel RNG-based generator were smaller than those of the random number table. A u test of the two CVs revealed no significant difference between the two methods. The Intel RNG-based random number generator can produce high-quality random numbers with good independence and uniform distribution, and solves some problems that random number tables present in the acquisition of random numbers.
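    A chi-square uniformity check of the kind described can be sketched generically. This is not the authors' exact procedure; the bin count, seed, and the use of Python's PRNG in place of the hardware RNG are assumptions, with the sample count of 7500 echoing the abstract's comparison.

```python
import random

def chi_square_uniform(samples, bins=10):
    """Pearson chi-square statistic for uniformity of [0, 1) samples over equal bins."""
    counts = [0] * bins
    for x in samples:
        counts[min(int(x * bins), bins - 1)] += 1
    expected = len(samples) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

rng = random.Random(2024)
stat = chi_square_uniform([rng.random() for _ in range(7500)])
```

    With 10 bins the statistic has 9 degrees of freedom, so values far above the 1% critical value of about 21.7 would flag non-uniform output; a healthy generator typically lands near 9.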

  5. Randomization Methods in Emergency Setting Trials: A Descriptive Review

    ERIC Educational Resources Information Center

    Corbett, Mark Stephen; Moe-Byrne, Thirimon; Oddie, Sam; McGuire, William

    2016-01-01

    Background: Quasi-randomization might expedite recruitment into trials in emergency care settings but may also introduce selection bias. Methods: We searched the Cochrane Library and other databases for systematic reviews of interventions in emergency medicine or urgent care settings. We assessed selection bias (baseline imbalances) in prognostic…

  7. Efficient robust conditional random fields.

    PubMed

    Song, Dongjin; Liu, Wei; Zhou, Tianyi; Tao, Dacheng; Meyer, David A

    2015-10-01

    Conditional random fields (CRFs) are a flexible yet powerful probabilistic approach and have shown advantages for popular applications in various areas, including text analysis, bioinformatics, and computer vision. Traditional CRF models, however, are incapable of selecting relevant features and suppressing noise in the original features. Moreover, conventional optimization methods often converge slowly when training CRFs, and degrade significantly for tasks with a large number of samples and features. In this paper, we propose robust CRFs (RCRFs) that simultaneously select relevant features and suppress noise. An optimal gradient method (OGM) is further designed to train RCRFs efficiently. Specifically, the proposed RCRFs employ the l1 norm of the model parameters to regularize the objective used by traditional CRFs, thereby enabling discovery of the relevant unary and pairwise features of CRFs. In each iteration of OGM, the gradient direction is determined jointly by the current gradient and the historical gradients, and the Lipschitz constant is leveraged to specify the proper step size. We show that OGM can tackle RCRF model training very efficiently, achieving the optimal convergence rate [Formula: see text] (where k is the number of iterations). This convergence rate is theoretically superior to the O(1/k) rate of previous first-order optimization methods. Extensive experiments performed on three practical image segmentation tasks demonstrate the efficacy of OGM in training our proposed RCRFs.
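
    The paper's OGM is specific to RCRFs, but the two ingredients named above (a step size of 1/L from the Lipschitz constant, and a search direction that blends the current gradient with history) also appear in classical Nesterov acceleration, which attains the same O(1/k²) rate on smooth convex problems. A minimal sketch on a least-squares objective, not the authors' algorithm:

```python
import numpy as np

def nesterov_least_squares(A, b, iters=200):
    """Accelerated (Nesterov-style) gradient descent on f(x) = 0.5*||Ax - b||^2.

    The step size 1/L uses the Lipschitz constant L of the gradient (largest
    eigenvalue of A^T A), and each update combines the current gradient with
    momentum built from previous iterates, which is what yields the O(1/k^2)
    rate on smooth convex problems.
    """
    L = np.linalg.eigvalsh(A.T @ A).max()        # Lipschitz constant of grad f
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(iters):
        x_next = y - (A.T @ (A @ y - b)) / L     # gradient step from lookahead point
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_next + ((t - 1) / t_next) * (x_next - x)  # momentum from history
        x, t = x_next, t_next
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 10))
b = rng.normal(size=50)
x_hat = nesterov_least_squares(A, b)
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)   # reference solution
print("distance to least-squares solution:", np.linalg.norm(x_hat - x_star))
```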

  8. Adhesion promotion with random copolymers

    NASA Astrophysics Data System (ADS)

    Simmons, Edward Read

    This thesis presents a study of adhesion promotion with random copolymers (RCP's). Monte Carlo (MC) simulations are used to study the potential use of RCP's as interfacial strengtheners at a homopolymer-solid interface. We discuss the effect of varying several design parameters of the RCP chains on interfacial strength. We find that RCP's can promote adhesion dependent upon careful selection of the parameters such as the RCP composition, blockiness, and concentration. We draw our conclusions from both equilibrium and non-equilibrium MC simulations in which we impose a normal stress on the interfacial chain system and observe the response as the system is deformed. These simulations are designed to reflect experimentally realizable conditions as closely as possible. The ultimate goal of our work is to guide experimentalists in the design and selection of the best adhesion promoter for a given system. With this goal in mind, we suggest several extensions of our methodology to further tighten the connection between simulation and experiment.

  9. Generating "Random" Integers

    ERIC Educational Resources Information Center

    Griffiths, Martin

    2011-01-01

    One of the author's undergraduate students recently asked him whether it was possible to generate a random positive integer. After some thought, the author realised that there were plenty of interesting mathematical ideas inherent in her question. So much so in fact, that the author decided to organise a workshop, open both to undergraduates and…

  10. Randomness Of Amoeba Movements

    NASA Astrophysics Data System (ADS)

    Hashiguchi, S.; Khadijah, Siti; Kuwajima, T.; Ohki, M.; Tacano, M.; Sikula, J.

    2005-11-01

    Movements of amoebas were automatically traced using the difference between two successive frames of a microscopic movie. It was observed that the movements were almost random, in that the directions and magnitudes of two successive steps were uncorrelated, and that the distance from the origin was proportional to the square root of the step number.
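
    Both observations (uncorrelated successive steps, and distance from the origin growing as the square root of the step number) are easy to reproduce with a simulated isotropic random walk. This is a generic sketch, not the authors' tracing pipeline:

```python
import math
import random

random.seed(42)

def random_walk(n):
    """2-D unit-step random walk; returns the list of (dx, dy) steps."""
    steps = []
    for _ in range(n):
        theta = random.uniform(0, 2 * math.pi)
        steps.append((math.cos(theta), math.sin(theta)))
    return steps

n = 20000
steps = random_walk(n)

# Mean dot product of successive unit steps: near zero for uncorrelated directions.
corr = sum(ax * bx + ay * by for (ax, ay), (bx, by) in zip(steps, steps[1:])) / (n - 1)

# Distance from the origin after n steps: scales like sqrt(n) for a random walk.
x = sum(dx for dx, _ in steps)
y = sum(dy for _, dy in steps)
dist = math.hypot(x, y)
print(f"successive-step correlation ~ {corr:.3f}, distance/sqrt(n) ~ {dist / math.sqrt(n):.2f}")
```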

  11. Randomized branch sampling

    Treesearch

    Harry T. Valentine

    2002-01-01

    Randomized branch sampling (RBS) is a special application of multistage probability sampling (see Sampling, environmental), which was developed originally by Jessen [3] to estimate fruit counts on individual orchard trees. In general, the method can be used to obtain estimates of many different attributes of trees or other branched plants. The usual objective of RBS is...
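
    A minimal sketch of the RBS idea on a toy tree (hypothetical numbers, not Jessen's data): at each fork one branch is selected with probability proportional to a size measure, and counts along the path are divided by the cumulative selection probability, giving an unbiased estimate of the whole-tree total:

```python
import random

random.seed(7)

# Toy tree: each node carries a fruit count and child branches with "size"
# measures (e.g., cross-sectional area) used as selection probabilities.
tree = {
    "fruit": 0,
    "children": [
        {"size": 3.0, "fruit": 10, "children": []},
        {"size": 1.0, "fruit": 2,  "children": [
            {"size": 1.0, "fruit": 5, "children": []},
        ]},
    ],
}

def true_total(node):
    return node["fruit"] + sum(true_total(c) for c in node["children"])

def rbs_estimate(node):
    """One randomized-branch-sampling path: at each fork, choose a branch with
    probability proportional to its size; divide counts by the cumulative
    selection probability (a Horvitz-Thompson-style estimator)."""
    total, prob, current = node["fruit"], 1.0, node
    while current["children"]:
        sizes = [c["size"] for c in current["children"]]
        weights = [s / sum(sizes) for s in sizes]
        idx = random.choices(range(len(sizes)), weights=weights)[0]
        prob *= weights[idx]
        current = current["children"][idx]
        total += current["fruit"] / prob
    return total

estimates = [rbs_estimate(tree) for _ in range(20000)]
print(f"true total: {true_total(tree)}, mean RBS estimate: {sum(estimates) / len(estimates):.2f}")
```

Averaged over many independent paths, the estimate converges to the true total, which is what makes a single random path per tree a legitimate sampling unit.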

  12. Random lattice superstrings

    SciTech Connect

    Feng Haidong; Siegel, Warren

    2006-08-15

    We propose some new simplifying ingredients for Feynman diagrams that seem necessary for random lattice formulations of superstrings. In particular, half the fermionic variables appear only in particle loops (similarly to loop momenta), reducing the supersymmetry of the constituents of the type IIB superstring to N=1, as expected from their interpretation in the 1/N expansion as super Yang-Mills.

  13. Selective Enumeration

    DTIC Science & Technology

    2000-07-01

    claims about properties of a specification by solving formulae derived from the specification. Ladybug is a new tool that incorporates selective enumeration...languages. Ladybug includes implementations of three significant new algorithms to help reduce the search space: bounded generation, domain coloring, and...thesis, I have applied Ladybug to a suite of software specifications, including both artificial and "real-world" specifications, to quantify the

  14. Selective Emitters

    NASA Technical Reports Server (NTRS)

    Chubb, Donald L. (Inventor)

    1992-01-01

    This invention relates to a small particle selective emitter for converting thermal energy into narrow band radiation with high efficiency. The small particle selective emitter is used in combination with a photovoltaic array to provide a thermal to electrical energy conversion device. An energy conversion apparatus of this type is called a thermo-photovoltaic device. In the first embodiment, small diameter particles of a rare earth oxide are suspended in an inert gas enclosed between concentric cylinders. The rare earth oxides are used because they have the desired property of large emittance in a narrow wavelength band and small emittance outside the band. However, it should be emphasized that it is the smallness of the particles that enhances the radiation property. The small particle selective emitter is surrounded by a photovoltaic array. In an alternate embodiment, the small particle gas mixture is circulated through a thermal energy source. This thermal energy source can be a nuclear reactor, solar receiver, or combustor of a fossil fuel.

  15. Selective emitters

    NASA Astrophysics Data System (ADS)

    Chubb, Donald L.

    1992-01-01

    This invention relates to a small particle selective emitter for converting thermal energy into narrow band radiation with high efficiency. The small particle selective emitter is used in combination with a photovoltaic array to provide a thermal to electrical energy conversion device. An energy conversion apparatus of this type is called a thermo-photovoltaic device. In the first embodiment, small diameter particles of a rare earth oxide are suspended in an inert gas enclosed between concentric cylinders. The rare earth oxides are used because they have the desired property of large emittance in a narrow wavelength band and small emittance outside the band. However, it should be emphasized that it is the smallness of the particles that enhances the radiation property. The small particle selective emitter is surrounded by a photovoltaic array. In an alternate embodiment, the small particle gas mixture is circulated through a thermal energy source. This thermal energy source can be a nuclear reactor, solar receiver, or combustor of a fossil fuel.

  16. Random bits, true and unbiased, from atmospheric turbulence

    PubMed Central

    Marangon, Davide G.; Vallone, Giuseppe; Villoresi, Paolo

    2014-01-01

    Random numbers are a fundamental ingredient for secure communications and numerical simulation, as well as for games and, in general, for information science. Physical processes with intrinsic unpredictability may be exploited to generate genuine random numbers. The optical propagation of a laser beam in strong atmospheric turbulence is exploited here for this purpose, by observing the beam after a 143 km free-space path. In addition, we developed an algorithm to extract the randomness of the beam images at the receiver without post-processing. The numbers passed very selective randomness tests for qualification as genuine random numbers. The extracting algorithm can be easily generalized to random images generated by different physical processes. PMID:24976499
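
    The paper's extractor works directly on beam images; as a simpler, classical illustration of how a biased but independent physical source can still yield unbiased bits, the von Neumann procedure can be sketched as follows:

```python
import random

def von_neumann_extract(bits):
    """Classic von Neumann unbiasing: read bits in non-overlapping pairs,
    emit 0 for the pair (0, 1) and 1 for (1, 0), and discard (0, 0) and
    (1, 1). The output is unbiased whenever the input bits are independent,
    even if each bit is heavily biased."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

# A heavily biased but independent source still yields unbiased output bits.
random.seed(1)
biased = [1 if random.random() < 0.8 else 0 for _ in range(100000)]
extracted = von_neumann_extract(biased)
ones = sum(extracted) / len(extracted)
print(f"input bias: 0.80, output frequency of 1s: {ones:.3f}")
```

The price is throughput: a source with bias p keeps only a fraction 2p(1-p) of its bit pairs.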

  17. Three dimensional imaging with randomly distributed sensors.

    PubMed

    DaneshPanah, Mehdi; Javidi, Bahram; Watson, Edward A

    2008-04-28

    As a promising three dimensional passive imaging modality, Integral Imaging (II) has been investigated widely within the research community. In virtually all such investigations, there is an implicit assumption that the collection of elemental images lies on a simple geometric surface (e.g. flat, concave, etc.), also known as the pickup surface. In this paper, we present a generalized framework for 3D II with arbitrary pickup surface geometry and randomly distributed sensor configuration. In particular, we study the case of Synthetic Aperture Integral Imaging (SAII) with random camera locations in space, where all cameras have parallel optical axes but different distances from the 3D scene. We assume that the sensors are randomly distributed in the 3D volume of the pickup space. For 3D reconstruction, a finite number of sensors with known coordinates are randomly selected from within this volume. The mathematical framework for 3D scene reconstruction is developed based on an affine transform representation of imaging under the geometrical optics regime. We demonstrate the feasibility of the proposed methods with experimental results. To the best of our knowledge, this is the first report on 3D imaging using randomly distributed sensors.

  18. Programmable disorder in random DNA tilings

    NASA Astrophysics Data System (ADS)

    Tikhomirov, Grigory; Petersen, Philip; Qian, Lulu

    2016-11-01

    Scaling up the complexity and diversity of synthetic molecular structures will require strategies that exploit the inherent stochasticity of molecular systems in a controlled fashion. Here we demonstrate a framework for programming random DNA tilings and show how to control the properties of global patterns through simple, local rules. We constructed three general forms of planar network—random loops, mazes and trees—on the surface of self-assembled DNA origami arrays on the micrometre scale with nanometre resolution. Using simple molecular building blocks and robust experimental conditions, we demonstrate control of a wide range of properties of the random networks, including the branching rules, the growth directions, the proximity between adjacent networks and the size distribution. Much as combinatorial approaches for generating random one-dimensional chains of polymers have been used to revolutionize chemical synthesis and the selection of functional nucleic acids, our strategy extends these principles to random two-dimensional networks of molecules and creates new opportunities for fabricating more complex molecular devices that are organized by DNA nanostructures.

  19. Programmable disorder in random DNA tilings

    NASA Astrophysics Data System (ADS)

    Tikhomirov, Grigory; Petersen, Philip; Qian, Lulu

    2017-03-01

    Scaling up the complexity and diversity of synthetic molecular structures will require strategies that exploit the inherent stochasticity of molecular systems in a controlled fashion. Here we demonstrate a framework for programming random DNA tilings and show how to control the properties of global patterns through simple, local rules. We constructed three general forms of planar network—random loops, mazes and trees—on the surface of self-assembled DNA origami arrays on the micrometre scale with nanometre resolution. Using simple molecular building blocks and robust experimental conditions, we demonstrate control of a wide range of properties of the random networks, including the branching rules, the growth directions, the proximity between adjacent networks and the size distribution. Much as combinatorial approaches for generating random one-dimensional chains of polymers have been used to revolutionize chemical synthesis and the selection of functional nucleic acids, our strategy extends these principles to random two-dimensional networks of molecules and creates new opportunities for fabricating more complex molecular devices that are organized by DNA nanostructures.

  20. Selection for complex traits leaves little or no classic signatures of selection.

    PubMed

    Kemper, Kathryn E; Saxton, Sarah J; Bolormaa, Sunduimijid; Hayes, Benjamin J; Goddard, Michael E

    2014-03-28

    Selection signatures aim to identify genomic regions underlying recent adaptations in populations. However, the effects of selection in the genome are difficult to distinguish from random processes, such as genetic drift. Often an association between selection signatures and selected variants for complex traits is assumed, even though this is rarely (if ever) tested. In this paper, we use 8 breeds of domestic cattle under strong artificial selection to investigate whether selection signatures are co-located with genomic regions that are likely to be under selection. Our approaches to identify selection signatures (haplotype heterozygosity, integrated haplotype score and FST) identified strong and recent selection near many loci with mutations affecting simple traits under strong selection, such as coat colour. However, there was little evidence for a genome-wide association between strong selection signatures and regions affecting complex traits under selection, such as milk yield in dairy cattle. Even identifying selection signatures near some major loci was hindered by factors including allelic heterogeneity, selection for ancestral alleles and interactions with nearby selected loci. Selection signatures detect loci with large effects under strong selection. However, the methodology is often assumed to also detect loci affecting complex traits, where the selection pressure at any individual locus is weak. We present empirical evidence suggesting little discernible 'selection signature' for complex traits in the genome of dairy cattle despite very strong and recent artificial selection.

  1. EDITORIAL: Nanotechnological selection Nanotechnological selection

    NASA Astrophysics Data System (ADS)

    Demming, Anna

    2013-01-01

    At the nanoscale, measures can move from a mass-scale analogue calibration to counters of discrete units. The shift redefines the possible levels of control that can be achieved in a system if adequate selectivity can be imposed. As an example, as ionic substances pass through nanoscale pores, the quantity of ions is low enough that the pore can contain either negative or positive ions. Yet precise control over this selectivity still raises difficulties. In this issue researchers address the challenge of how to regulate the ionic selectivity of negative and positive charges with the use of an external charge. The approach may be useful for controlling the behaviour, properties and chemical composition of liquids and has possible technical applications for nanofluidic field effect transistors [1]. Selectivity is a critical advantage in the administration of drugs. Nanoparticles functionalized with targeting moieties can allow delivery of anti-cancer drugs to tumour cells, whilst avoiding healthy cells and hence reducing some of the debilitating side effects of cancer treatments [2]. Researchers in Belarus and the US developed a new theranostic approach (combining therapy and diagnosis) to support the evident benefits of cellular selectivity that can be achieved when nanoparticles are applied in medicine [3]. Their process uses nanobubbles of photothermal vapour, referred to as plasmonic nanobubbles, generated by plasmonic excitations in gold nanoparticles conjugated to diagnosis-specific antibodies. The intracellular plasmonic nanobubbles are controlled by laser fluence so that the response can be tuned in individual living cells. Lower fluence allows non-invasive high-sensitivity imaging for diagnosis, and higher fluence can disrupt the cellular membrane for treatments. The selective response of carbon nanotubes to different gases has lent them to use within various types of sensors, as summarized in a review by researchers at the University of

  2. Random matrix theory

    NASA Astrophysics Data System (ADS)

    Edelman, Alan; Rao, N. Raj

    Random matrix theory is now a big subject with applications in many disciplines of science, engineering and finance. This article is a survey specifically oriented towards the needs and interests of a numerical analyst. This survey includes some original material not found anywhere else. We include the important mathematics which is a very modern development, as well as the computational software that is transforming the theory into useful practice.
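
    As a small computational illustration of the kind a numerical analyst might start from (not taken from the survey itself), a symmetric Gaussian matrix scaled by 1/sqrt(2n) has eigenvalues that settle onto the Wigner semicircle law on [-2, 2] as n grows:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400

# Symmetric Gaussian (Wigner) matrix, normalized so that off-diagonal entries
# have variance 1/n; its spectrum converges to the semicircle law on [-2, 2].
G = rng.normal(size=(n, n))
W = (G + G.T) / np.sqrt(2 * n)
eigs = np.linalg.eigvalsh(W)

print(f"spectral range: [{eigs.min():.2f}, {eigs.max():.2f}]  (approaches [-2, 2] for large n)")
```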

  3. Random equations in aerodynamics

    NASA Technical Reports Server (NTRS)

    Bharucha-Reid, A. T.

    1984-01-01

    The literature was reviewed to identify aerodynamic models which might be treated by probabilistic methods. The numerical solution of some integral equations that arise in aerodynamical problems was investigated. On the basis of the numerical studies, a qualitative theory of random integral equations was developed to provide information on the behavior of the solutions of these equations (in particular, boundary and asymptotic behavior, and stability) and their statistical properties without actually obtaining explicit solutions of the equations.

  4. Diffusion in random networks

    DOE PAGES

    Zhang, Duan Z.; Padrino, Juan C.

    2017-06-01

    The ensemble averaging technique is applied to model mass transport by diffusion in random networks. The system consists of an ensemble of random networks, where each network is made of pockets connected by tortuous channels. Inside a channel, fluid transport is assumed to be governed by the one-dimensional diffusion equation. Mass balance leads to an integro-differential equation for the pocket mass density. The so-called dual-porosity model is found to be equivalent to the leading order approximation of the integration kernel when the diffusion time scale inside the channels is small compared to the macroscopic time scale. As a test problem, we consider the one-dimensional mass diffusion in a semi-infinite domain. Because of the time required to establish the linear concentration profile inside a channel, for early times the similarity variable is x t^(-1/4) rather than x t^(-1/2) as in the traditional theory. We found this early time similarity can be explained by random walk theory through the network.

  5. Sex Differences in the Performance of Cardiac Computed Tomography Compared With Functional Testing in Evaluating Stable Chest Pain: Subanalysis of the Multicenter, Randomized CRESCENT Trial (Calcium Imaging and Selective CT Angiography in Comparison to Functional Testing for Suspected Coronary Artery Disease).

    PubMed

    Lubbers, Marisa; Coenen, Adriaan; Bruning, Tobias; Galema, Tjebbe; Akkerhuis, Jurgen; Krenning, Boudewijn; Musters, Paul; Ouhlous, Mohamed; Liem, Ahno; Niezen, Andre; Dedic, Admir; van Domburg, Ron; Hunink, Miriam; Nieman, Koen

    2017-02-01

    Cardiac computed tomography (CT) represents an alternative diagnostic strategy for women with suspected coronary artery disease, with potential benefits in terms of effectiveness and cost-efficiency. The CRESCENT trial (Calcium Imaging and Selective CT Angiography in Comparison to Functional Testing for Suspected Coronary Artery Disease) prospectively randomized 350 patients with stable angina (55% women; aged 55±10 years), mostly with an intermediate coronary artery disease probability, between cardiac CT and functional testing. The tiered cardiac CT protocol included a calcium scan followed by CT angiography if the Agatston calcium score was between 1 and 400. Patients with test-specific contraindications were not excluded from study participation. Sex differences were studied as a prespecified subanalysis. Enrolled women presented more frequently with atypical chest pain and had a lower pretest probability of coronary artery disease compared with men. Independently of these differences, cardiac CT led in both sexes to a fast final diagnosis when compared with functional testing, although the effect was larger in women (P interaction=0.01). The reduced need for further testing after CT, compared with functional testing, was most evident in women (P interaction=0.009). However, no sex interaction was observed with respect to changes in angina and quality of life, cumulative diagnostic costs, and applied radiation dose (all P interactions≥0.097). Cardiac CT is more efficient in women than in men in terms of time to reach the final diagnosis and downstream testing. However, overall clinical outcome showed no significant difference between women and men after 1 year. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01393028. © 2017 American Heart Association, Inc.

  6. Predictors of postdischarge outcomes from information acquired shortly after admission for acute heart failure: a report from the Placebo-Controlled Randomized Study of the Selective A1 Adenosine Receptor Antagonist Rolofylline for Patients Hospitalized With Acute Decompensated Heart Failure and Volume Overload to Assess Treatment Effect on Congestion and Renal Function (PROTECT) Study.

    PubMed

    Cleland, John G; Chiswell, Karen; Teerlink, John R; Stevens, Susanna; Fiuzat, Mona; Givertz, Michael M; Davison, Beth A; Mansoor, George A; Ponikowski, Piotr; Voors, Adriaan A; Cotter, Gad; Metra, Marco; Massie, Barry M; O'Connor, Christopher M

    2014-01-01

    Acute heart failure is a common reason for admission, and outcome is often poor. Improved prognostic risk stratification may assist in the design of future trials and in patient management. Using data from a large randomized trial, we explored the prognostic value of clinical variables, measured at hospital admission for acute heart failure, to determine whether a few selected variables were inferior to an extended data set. The prognostic model included 37 clinical characteristics collected at baseline in PROTECT, a study comparing rolofylline and placebo in 2033 patients admitted with acute heart failure. Prespecified outcomes at 30 days were death or rehospitalization for any reason; death or rehospitalization for cardiovascular or renal reasons; and, at both 30 and 180 days, all-cause mortality. No variable had a c-index>0.70, and few had values>0.60; c-indices were lower for composite outcomes than for mortality. Blood urea was generally the strongest single predictor. Eighteen variables contributed independent prognostic information, but a reduced model using only 8 items (age, previous heart failure hospitalization, peripheral edema, systolic blood pressure, serum sodium, urea, creatinine, and albumin) performed similarly. For prediction of all-cause mortality at 180 days, the model c-index using all variables was 0.72 and for the simplified model, also 0.72. A few simple clinical variables measured on admission in patients with acute heart failure predict a variety of adverse outcomes with accuracy similar to more complex models. However, predictive models were of only moderate accuracy, especially for outcomes that included nonfatal events. Better methods of risk stratification are required. URL: http://www.clinicaltrials.gov. Unique identifiers: NCT00328692 and NCT00354458.
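
    The c-index quoted above is, for a binary outcome, the probability that a randomly chosen case with the event received a higher risk score than a randomly chosen case without it (0.5 is chance, 1.0 is perfect ranking). A minimal sketch with made-up scores, not PROTECT data:

```python
from itertools import product

def c_index(scores, events):
    """Concordance index for a binary outcome: among all (event, non-event)
    pairs, the fraction in which the event case received the higher risk
    score, with ties counting one half."""
    pairs = [(s1, s0) for (s1, e1), (s0, e0) in product(zip(scores, events), repeat=2)
             if e1 == 1 and e0 == 0]
    if not pairs:
        raise ValueError("need at least one event and one non-event")
    wins = sum(1.0 if s1 > s0 else 0.5 if s1 == s0 else 0.0 for s1, s0 in pairs)
    return wins / len(pairs)

# Hypothetical risk scores (e.g. from an 8-variable model) and 180-day deaths.
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
deaths = [1,   0,   1,   0,   1,   0,   0,   0]
print(f"c-index: {c_index(scores, deaths):.2f}")  # → 0.80
```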

  7. Certified randomness in quantum physics.

    PubMed

    Acín, Antonio; Masanes, Lluis

    2016-12-07

    The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.

  8. Certified randomness in quantum physics

    NASA Astrophysics Data System (ADS)

    Acín, Antonio; Masanes, Lluis

    2016-12-01

    The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.

  9. Generation of kth-order random toposequences

    NASA Astrophysics Data System (ADS)

    Odgers, Nathan P.; McBratney, Alex. B.; Minasny, Budiman

    2008-05-01

    The model presented in this paper derives toposequences from a digital elevation model (DEM). It is written in ArcInfo Macro Language (AML). The toposequences are called kth-order random toposequences, because they take a random path uphill to the top of a hill and downhill to a stream or valley bottom from a randomly selected seed point, and they are located in a streamshed of order k according to a particular stream-ordering system. We define a kth-order streamshed as the area of land that drains directly to a stream segment of stream order k. The model attempts to optimise the spatial configuration of a set of derived toposequences iteratively by using simulated annealing to maximise the total sum of distances between each toposequence hilltop in the set. The user is able to select the order, k, of the derived toposequences. Toposequences are useful for determining soil sampling locations for use in collecting soil data for digital soil mapping applications. Sampling locations can be allocated according to equal elevation or equal-distance intervals along the length of the toposequence, for example. We demonstrate the use of this model for a study area in the Hunter Valley of New South Wales, Australia. Of the 64 toposequences derived, 32 were first-order random toposequences according to Strahler's stream-ordering system, and 32 were second-order random toposequences. The model that we present in this paper is an efficient method for sampling soil along soil toposequences. The soils along a toposequence are related to each other by the topography they are found in, so soil data collected by this method is useful for establishing soil-landscape rules for the preparation of digital soil maps.
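
    The actual model is written in AML, but the optimization step described (simulated annealing to maximize the total sum of pairwise distances among selected points) can be sketched generically in Python, here with hypothetical seed-point data rather than DEM-derived hilltops:

```python
import math
import random

random.seed(3)

def total_pairwise_distance(selected, points):
    """Objective: sum of distances between every pair of selected points."""
    pts = [points[i] for i in selected]
    return sum(math.dist(p, q) for i, p in enumerate(pts) for q in pts[i + 1:])

def anneal_selection(points, k, steps=3000, t0=1.0, cooling=0.999):
    """Simulated annealing over k-subsets: swap one selected point for an
    unselected one, accepting worse configurations with probability
    exp(delta / T) as the temperature T cools."""
    selected = random.sample(range(len(points)), k)
    score = total_pairwise_distance(selected, points)
    t = t0
    for _ in range(steps):
        i = random.randrange(k)
        j = random.choice([p for p in range(len(points)) if p not in selected])
        candidate = selected[:i] + [j] + selected[i + 1:]
        cand_score = total_pairwise_distance(candidate, points)
        delta = cand_score - score
        if delta > 0 or random.random() < math.exp(delta / t):
            selected, score = candidate, cand_score
        t *= cooling
    return selected, score

points = [(random.random(), random.random()) for _ in range(100)]
initial = random.sample(range(100), 8)
_, best = anneal_selection(points, 8)
print(f"random-subset spread: {total_pairwise_distance(initial, points):.2f}, annealed: {best:.2f}")
```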

  10. Improved Compressive Sensing of Natural Scenes Using Localized Random Sampling

    PubMed Central

    Barranca, Victor J.; Kovačič, Gregor; Zhou, Douglas; Cai, David

    2016-01-01

    Compressive sensing (CS) theory demonstrates that by using uniformly-random sampling, rather than uniformly-spaced sampling, higher quality image reconstructions are often achievable. Considering that the structure of sampling protocols has such a profound impact on the quality of image reconstructions, we formulate a new sampling scheme motivated by physiological receptive field structure, localized random sampling, which yields significantly improved CS image reconstructions. For each set of localized image measurements, our sampling method first randomly selects an image pixel and then measures its nearby pixels with probability depending on their distance from the initially selected pixel. We compare the uniformly-random and localized random sampling methods over a large space of sampling parameters, and show that, for the optimal parameter choices, higher quality image reconstructions can be consistently obtained by using localized random sampling. In addition, we argue that the localized random CS optimal parameter choice is stable with respect to diverse natural images, and scales with the number of samples used for reconstruction. We expect that the localized random sampling protocol helps to explain the evolutionarily advantageous nature of receptive field structure in visual systems and suggests several future research areas in CS theory and its application to brain imaging. PMID:27555464
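
    The sampling scheme described can be sketched in a few lines: pick a random center pixel, then keep nearby pixels with probability decaying in their distance from it. This is a generic illustration with an assumed Gaussian-like kernel, not the authors' exact protocol:

```python
import math
import random

random.seed(0)

def localized_samples(width, height, n_centers, per_center, scale=3.0):
    """Localized random sampling: for each measurement set, pick a random
    center pixel, then accept candidate pixels with probability decaying in
    their distance from the center (Gaussian-like kernel, bandwidth `scale`)."""
    samples = []
    for _ in range(n_centers):
        cx, cy = random.randrange(width), random.randrange(height)
        samples.append((cx, cy))
        kept = 0
        while kept < per_center:
            x, y = random.randrange(width), random.randrange(height)
            d = math.hypot(x - cx, y - cy)
            if random.random() < math.exp(-(d / scale) ** 2):  # distance-dependent keep probability
                samples.append((x, y))
                kept += 1
    return samples

pts = localized_samples(64, 64, n_centers=10, per_center=5)
print(f"{len(pts)} sample locations, e.g. {pts[:3]}")
```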

  11. Improved Compressive Sensing of Natural Scenes Using Localized Random Sampling

    NASA Astrophysics Data System (ADS)

    Barranca, Victor J.; Kovačič, Gregor; Zhou, Douglas; Cai, David

    2016-08-01

    Compressive sensing (CS) theory demonstrates that by using uniformly-random sampling, rather than uniformly-spaced sampling, higher quality image reconstructions are often achievable. Considering that the structure of sampling protocols has such a profound impact on the quality of image reconstructions, we formulate a new sampling scheme motivated by physiological receptive field structure, localized random sampling, which yields significantly improved CS image reconstructions. For each set of localized image measurements, our sampling method first randomly selects an image pixel and then measures its nearby pixels with probability depending on their distance from the initially selected pixel. We compare the uniformly-random and localized random sampling methods over a large space of sampling parameters, and show that, for the optimal parameter choices, higher quality image reconstructions can be consistently obtained by using localized random sampling. In addition, we argue that the localized random CS optimal parameter choice is stable with respect to diverse natural images, and scales with the number of samples used for reconstruction. We expect that the localized random sampling protocol helps to explain the evolutionarily advantageous nature of receptive field structure in visual systems and suggests several future research areas in CS theory and its application to brain imaging.

  12. Weighted Hybrid Decision Tree Model for Random Forest Classifier

    NASA Astrophysics Data System (ADS)

    Kulkarni, Vrushali Y.; Sinha, Pradeep K.; Petare, Manisha C.

    2016-06-01

    Random Forest is an ensemble, supervised machine learning algorithm. An ensemble generates many classifiers and combines their results by majority voting. Random forest uses the decision tree as its base classifier. In decision tree induction, an attribute split/evaluation measure is used to decide the best split at each node of the decision tree. The generalization error of a forest of tree classifiers depends on the strength of the individual trees in the forest and the correlation among them. The work presented in this paper relates to attribute split measures and is a two-step process: first, a theoretical study of five selected split measures is done and a comparison matrix is generated to understand the pros and cons of each measure. These theoretical results are then verified by empirical analysis. For the empirical analysis, a random forest is generated using each of the five selected split measures, chosen one at a time (i.e., a random forest using information gain, a random forest using gain ratio, etc.). Next, based on this theoretical and empirical analysis, a new hybrid decision tree model for the random forest classifier is proposed. In this model, the individual decision trees in the random forest are generated using different split measures. The model is augmented by weighted voting based on the strength of the individual trees. The new approach has shown a notable increase in the accuracy of the random forest.
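
    The ingredients of the proposed model (impurity-based split measures such as information gain and the Gini index, plus strength-weighted voting) can be sketched minimally as follows; this is a generic illustration, not the authors' implementation:

```python
import math
from collections import Counter, defaultdict

def entropy(labels):
    """Impurity used by the information-gain split measure."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gini(labels):
    """Impurity used by the Gini-index split measure."""
    n = len(labels)
    return 1 - sum((c / n) ** 2 for c in Counter(labels).values())

def weighted_vote(predictions, weights):
    """Combine per-tree predictions by voting weighted with each tree's
    strength (e.g. its out-of-bag accuracy)."""
    tally = defaultdict(float)
    for label, w in zip(predictions, weights):
        tally[label] += w
    return max(tally, key=tally.get)

# A maximally impure two-class node under each measure.
print(entropy(["a", "a", "b", "b"]), gini(["a", "a", "b", "b"]))  # → 1.0 0.5

# Three hypothetical trees (say, grown with information gain, gain ratio and
# Gini) predict for one sample; the stronger trees carry more voting weight.
preds = ["cat", "dog", "cat"]
strengths = [0.9, 0.6, 0.7]
print(weighted_vote(preds, strengths))  # → cat
```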

  13. Improved Compressive Sensing of Natural Scenes Using Localized Random Sampling.

    PubMed

    Barranca, Victor J; Kovačič, Gregor; Zhou, Douglas; Cai, David

    2016-08-24

    Compressive sensing (CS) theory demonstrates that by using uniformly-random sampling, rather than uniformly-spaced sampling, higher quality image reconstructions are often achievable. Considering that the structure of sampling protocols has such a profound impact on the quality of image reconstructions, we formulate a new sampling scheme motivated by physiological receptive field structure, localized random sampling, which yields significantly improved CS image reconstructions. For each set of localized image measurements, our sampling method first randomly selects an image pixel and then measures its nearby pixels with probability depending on their distance from the initially selected pixel. We compare the uniformly-random and localized random sampling methods over a large space of sampling parameters, and show that, for the optimal parameter choices, higher quality image reconstructions can be consistently obtained by using localized random sampling. In addition, we argue that the localized random CS optimal parameter choice is stable with respect to diverse natural images, and scales with the number of samples used for reconstruction. We expect that the localized random sampling protocol helps to explain the evolutionarily advantageous nature of receptive field structure in visual systems and suggests several future research areas in CS theory and its application to brain imaging.
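    The localized sampling step the authors describe — pick a seed pixel uniformly at random, then admit nearby pixels with a probability that decreases with distance — can be sketched as follows. The exponential acceptance kernel and all parameter names are illustrative assumptions, not the paper's exact protocol.

```python
import math
import random

def localized_sample(width, height, n_local, scale, rng=random):
    """One localized measurement set: choose a seed pixel uniformly at
    random, then accept further pixels with probability exp(-d/scale),
    where d is the Euclidean distance to the seed."""
    sx, sy = rng.randrange(width), rng.randrange(height)
    chosen = [(sx, sy)]
    while len(chosen) < n_local:
        x, y = rng.randrange(width), rng.randrange(height)
        d = math.hypot(x - sx, y - sy)
        if rng.random() < math.exp(-d / scale):
            chosen.append((x, y))
    return chosen

random.seed(0)
pixels = localized_sample(32, 32, 8, 4.0)
print(len(pixels))  # 8 pixels clustered around the random seed
```

    Repeating this for many seeds yields a measurement matrix whose rows resemble localized receptive fields rather than uniformly scattered samples.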

  14. Site selection

    SciTech Connect

    Olsen, C.W.

    1983-07-01

    The conditions and criteria for selecting a site for a nuclear weapons test at the Nevada Test Site are summarized. Factors considered are: (1) scheduling of drill rigs, (2) scheduling of site preparation (dirt work, auger hole, surface casing, cementing), (3) schedule of event (when are drill hole data needed), (4) depth range of proposed W.P., (5) geologic structure (faults, Pz contact, etc.), (6) stratigraphy (alluvium, location of Grouse Canyon Tuff, etc.), (7) material properties (particularly montmorillonite and CO/sub 2/ content), (8) water table depth, (9) potential drilling problems (caving), (10) adjacent collapse craters and chimneys, (11) adjacent expended but uncollapsed sites, (12) adjacent post-shot or other small diameter holes, (13) adjacent stockpile emplacement holes, (14) adjacent planned events (including LANL), (15) projected needs of Test Program for various DOB's and operational separations, and (16) optimal use of NTS real estate.

  15. Granulator Selection

    SciTech Connect

    Gould, T H; Armantrout, G

    1999-08-02

    Following our detailed review of the granulation reports and additional conversations with process and development personnel, we have reached a consensus position regarding granulator selection. At this time, we recommend going forward with implementation of the tumbling granulator approach (GEMCO), based on our assessment of the tested granulation techniques against the established criteria. The basis for this selection is summarized in the following sections, followed by our recommendations for proceeding with implementation of the tumbling granulation approach. All five granulation technologies produced granulated products that can be made into acceptable sintered pucks. A possible exception is the product from the fluidized bed granulator: in this series of tests, this material was more difficult to press into uniform pucks without subsequent cracking during the sintering cycle. This problem may be an artifact of the conditions of the particular granulation demonstration run involved, but earlier results have also been mixed. All granulators made acceptable granulated feed from the standpoint of transfer and press feeding, though the roller compactor and fluidized bed products were dustier than the rest. There was also differentiation among the granulators in the operational areas of (1) potential for process upset, (2) plant implementation and operational complexity, and (3) maintenance concerns. These considerations will be discussed further in the next section. Note that concerns also exist regarding the extension of the granulation processes to powders containing actinides. Only the method that involves tumbling and moisture addition has been tested with uranium, and in that instance, significant differences were found in the granulation behavior of the powders.

  16. Nematode-bacteria mutualism: Selection within the mutualism supersedes selection outside of the mutualism.

    PubMed

    Morran, Levi T; Penley, McKenna J; Byrd, Victoria S; Meyer, Andrew J; O'Sullivan, Timothy S; Bashey, Farrah; Goodrich-Blair, Heidi; Lively, Curtis M

    2016-03-01

    The coevolution of interacting species can lead to codependent mutualists. Little is known about the effect of selection on partners within versus apart from the association. Here, we determined the effect of selection on bacteria (Xenorhabdus nematophila) both within and apart from its mutualistic partner (a nematode, Steinernema carpocapsae). In nature, the two species cooperatively infect and kill arthropods. We passaged the bacteria either together with (M+), or isolated from (M-), nematodes under two different selection regimes: random selection (S-) and selection for increased virulence against arthropod hosts (S+). We found that the isolated bacteria evolved greater virulence under selection for greater virulence (M-S+) than under random selection (M-S-). In addition, the response to selection in the isolated bacteria (M-S+) caused a breakdown of the mutualism following reintroduction to the nematode. Finally, selection for greater virulence did not alter the evolutionary trajectories of bacteria passaged within the mutualism (M+S+ = M+S-), indicating that selection for the maintenance of the mutualism was stronger than selection for increased virulence. The results show that selection on isolated mutualists can rapidly break down beneficial interactions between species, but that selection within a mutualism can supersede external selection, potentially generating codependence over time.

  17. Nematode-Bacteria Mutualism: Selection Within the Mutualism Supersedes Selection Outside of the Mutualism

    PubMed Central

    Morran, Levi T.; Penley, McKenna J.; Byrd, Victoria S.; Meyer, Andrew J.; O’Sullivan, Timothy S.; Bashey, Farrah; Goodrich-Blair, Heidi; Lively, Curtis M.

    2016-01-01

    The coevolution of interacting species can lead to co-dependent mutualists. Little is known about the effect of selection on partners within versus apart from the association. Here, we determined the effect of selection on bacteria (Xenorhabdus nematophila) both within and apart from its mutualistic partner (a nematode, Steinernema carpocapsae). In nature, the two species cooperatively infect and kill arthropods. We passaged the bacteria either together with (M+), or isolated from (M−), nematodes under two different selection regimes: random selection (S−) and selection for increased virulence against arthropod hosts (S+). We found that the isolated bacteria evolved greater virulence under selection for greater virulence (M−S+) than under random selection (M−S−). In addition, the response to selection in the isolated bacteria (M−S+) caused a breakdown of the mutualism following reintroduction to the nematode. Finally, selection for greater virulence did not alter the evolutionary trajectories of bacteria passaged within the mutualism (M+S+ = M+S−), indicating that selection for the maintenance of the mutualism was stronger than selection for increased virulence. The results show that selection on isolated mutualists can rapidly break down beneficial interactions between species, but that selection within a mutualism can supersede external selection, potentially generating co-dependence over time. PMID:26867502

  18. Comparison of selective genotyping strategies for prediction of breeding values in a population undergoing selection.

    PubMed

    Boligon, A A; Long, N; Albuquerque, L G; Weigel, K A; Gianola, D; Rosa, G J M

    2012-12-01

    Genomewide marker information can improve the reliability of breeding value predictions for young selection candidates in genomic selection. However, the cost of genotyping limits its use to elite animals, and how such selective genotyping affects predictive ability of genomic selection models is an open question. We performed a simulation study to evaluate the quality of breeding value predictions for selection candidates based on different selective genotyping strategies in a population undergoing selection. The genome consisted of 10 chromosomes of 100 cM each. After 5,000 generations of random mating with a population size of 100 (50 males and 50 females), generation G(0) (reference population) was produced via a full factorial mating between the 50 males and 50 females from generation 5,000. Different levels of selection intensities (animals with the largest yield deviation value) in G(0) or random sampling (no selection) were used to produce offspring of G(0) generation (G(1)). Five genotyping strategies were used to choose 500 animals in G(0) to be genotyped: 1) Random: randomly selected animals, 2) Top: animals with largest yield deviation values, 3) Bottom: animals with lowest yield deviations values, 4) Extreme: animals with the 250 largest and the 250 lowest yield deviations values, and 5) Less Related: less genetically related animals. The number of individuals in G(0) and G(1) was fixed at 2,500 each, and different levels of heritability were considered (0.10, 0.25, and 0.50). Additionally, all 5 selective genotyping strategies (Random, Top, Bottom, Extreme, and Less Related) were applied to an indicator trait in generation G(0,) and the results were evaluated for the target trait in generation G(1), with the genetic correlation between the 2 traits set to 0.50. The 5 genotyping strategies applied to individuals in G(0) (reference population) were compared in terms of their ability to predict the genetic values of the animals in G(1) (selection

  19. Random noun generation in younger and older adults.

    PubMed

    Heuer, Herbert; Janczyk, Markus; Kunde, Wilfried

    2010-03-01

    We examined age-related changes of executive functions by means of random noun generation. Consistent with previous observations on random letter generation, older participants produced more prepotent responses than younger ones. In the case of random noun generation, prepotent responses are nouns of the same category as the preceding noun. In contrast to previous observations, older participants exhibited stronger repetition avoidance and a stronger tendency toward local evenness-that is, toward equal frequencies of the alternative responses even in short subsequences. These data suggest that at higher adult age inhibition of prepotent responses is impaired. In addition, strategic attentional processes of response selection are strengthened, in particular the application of a heuristic for randomness. In this sense response selection is more controlled in older than in younger adults.

  20. Challenges of cluster randomized trials.

    PubMed

    Campbell, Michael J

    2014-05-01

    Cluster randomized trials are trials that randomize clusters of people, rather than individuals. They are becoming increasingly common. A number of innovations have been developed recently, particularly in the calculation of the required size of a cluster trial, the handling of missing data, designs to minimize recruitment bias, the ethics of cluster randomized trials and the stepped wedge design. This article will highlight and illustrate these developments. It will also discuss issues with regards to the reporting of cluster randomized trials.

  1. Random numbers from vacuum fluctuations

    NASA Astrophysics Data System (ADS)

    Shi, Yicheng; Chng, Brenda; Kurtsiefer, Christian

    2016-07-01

    We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.
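    A linear feedback shift register of the kind underlying such randomness extraction can be sketched as follows. This toy example XORs raw (possibly biased) sample bits with the keystream of a maximal-length 16-bit Galois LFSR; the constants are the textbook example values (state 0xACE1, tap mask 0xB400 for x^16 + x^14 + x^13 + x^11 + 1), and the paper's actual extraction scheme is certainly more elaborate than this illustration.

```python
def lfsr_stream(state, taps, n):
    """Galois LFSR: emit the low bit, shift right, and XOR in the tap mask
    whenever the emitted bit was 1. Returns n keystream bits."""
    out = []
    for _ in range(n):
        lsb = state & 1
        out.append(lsb)
        state >>= 1
        if lsb:
            state ^= taps
    return out

# Toy extractor: whiten biased raw bits by XORing with the LFSR keystream.
raw = [1, 1, 1, 0, 1, 1, 0, 1]            # biased raw samples (illustrative)
key = lfsr_stream(0xACE1, 0xB400, len(raw))
mixed = [r ^ k for r, k in zip(raw, key)]
print(mixed)
```

    A real extractor would be parameterized by the measured entropy of the raw signal; the point here is only the structure of LFSR-based post-processing.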

  2. Random numbers from vacuum fluctuations

    SciTech Connect

    Shi, Yicheng; Kurtsiefer, Christian; Chng, Brenda

    2016-07-25

    We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.

  3. Random recursive trees and the elephant random walk

    NASA Astrophysics Data System (ADS)

    Kürsten, Rüdiger

    2016-03-01

    Elephant random walks, a class of random walks with infinite memory, are simple models describing anomalous diffusion. We present a surprising connection between these models and bond percolation on random recursive trees. We use a coupling between the two models to translate results from elephant random walks to the percolation process. We calculate, among other quantities, exact expressions for the first and second moments of the root cluster size and of the number of nodes in child clusters of the first generation. We further introduce another model, the skew elephant random walk, and calculate the first and second moments of this process.
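    A minimal simulation of the elephant random walk, assuming the standard formulation (each new step recalls a uniformly chosen earlier step and repeats it with probability p, reversing it otherwise):

```python
import random

def elephant_walk(n_steps, p, q=0.5, rng=random):
    """Elephant random walk: the first step is +1 with probability q; every
    later step copies a uniformly random past step with probability p and
    reverses it with probability 1-p. Returns the final position."""
    steps = [1 if rng.random() < q else -1]
    for _ in range(n_steps - 1):
        past = rng.choice(steps)
        steps.append(past if rng.random() < p else -past)
    return sum(steps)

random.seed(1)
print(elephant_walk(1000, 0.75))  # p > 1/2 gives the superdiffusive regime
```

    The memory parameter p plays the role of the percolation parameter in the tree coupling described in the abstract.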

  4. Randomly hyperbranched polymers.

    PubMed

    Konkolewicz, Dominik; Gilbert, Robert G; Gray-Weale, Angus

    2007-06-08

    We describe a model for the structures of randomly hyperbranched polymers in solution, and find a logarithmic growth of radius with polymer mass. We include segmental overcrowding, which puts an upper limit on the density. The model is tested against simulations, against data on amylopectin, a major component of starch, on glycogen, and on polyglycerols. For samples of synthetic polyglycerol and glycogen, our model holds well for all the available data. The model reveals higher-level scaling structure in glycogen, related to the beta particles seen in electron microscopy.

  5. Randomly Hyperbranched Polymers

    NASA Astrophysics Data System (ADS)

    Konkolewicz, Dominik; Gilbert, Robert G.; Gray-Weale, Angus

    2007-06-01

    We describe a model for the structures of randomly hyperbranched polymers in solution, and find a logarithmic growth of radius with polymer mass. We include segmental overcrowding, which puts an upper limit on the density. The model is tested against simulations, against data on amylopectin, a major component of starch, on glycogen, and on polyglycerols. For samples of synthetic polyglycerol and glycogen, our model holds well for all the available data. The model reveals higher-level scaling structure in glycogen, related to the β particles seen in electron microscopy.

  6. Adaptive random testing with combinatorial input domain.

    PubMed

    Huang, Rubing; Chen, Jinfu; Lu, Yansheng

    2014-01-01

    Random testing (RT) is a fundamental testing technique to assess software reliability, by simply selecting test cases in a random manner from the whole input domain. As an enhancement of RT, adaptive random testing (ART) has better failure-detection capability and has been widely applied in different scenarios, such as numerical programs, some object-oriented programs, and mobile applications. However, not much work has been done on the effectiveness of ART for the programs with combinatorial input domain (i.e., the set of categorical data). To extend the ideas to the testing for combinatorial input domain, we have adopted different similarity measures that are widely used for categorical data in data mining and have proposed two similarity measures based on interaction coverage. Then, we propose a new version named ART-CID as an extension of ART in combinatorial input domain, which selects an element from categorical data as the next test case such that it has the lowest similarity against already generated test cases. Experimental results show that ART-CID generally performs better than RT, with respect to different evaluation metrics.
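    The core ART selection rule — among random candidates, choose the test case least similar to those already executed — can be sketched with a simple categorical distance. The mismatch-count measure below is one generic choice for categorical data, not necessarily among the similarity measures the paper evaluates.

```python
def dissimilarity(a, b):
    """Number of positions at which two categorical test cases differ
    (a simple distance for categorical inputs)."""
    return sum(x != y for x, y in zip(a, b))

def next_test(executed, candidates):
    """ART step: pick the candidate whose minimum distance to every
    already-executed test case is largest (i.e. lowest similarity)."""
    return max(candidates,
               key=lambda c: min(dissimilarity(c, e) for e in executed))

executed = [("linux", "ipv4", "tcp")]
candidates = [("linux", "ipv4", "udp"), ("macos", "ipv6", "udp")]
print(next_test(executed, candidates))  # -> ('macos', 'ipv6', 'udp')
```

    The selected test differs from the executed one in all three categories, spreading test cases across the combinatorial input domain.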

  7. Random Numbers and Quantum Computers

    ERIC Educational Resources Information Center

    McCartney, Mark; Glass, David

    2002-01-01

    The topic of random numbers is investigated in such a way as to illustrate links between mathematics, physics and computer science. First, the generation of random numbers by a classical computer using the linear congruential generator and logistic map is considered. It is noted that these procedures yield only pseudo-random numbers since…
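    The two pseudo-random procedures mentioned can be sketched directly. The LCG constants below are the common Numerical Recipes choice, an assumption on my part, since the article's own parameters are not given here.

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator x -> (a*x + c) mod m, normalized
    to [0, 1). Constants are the Numerical Recipes example values."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

def logistic(x0, r=4.0):
    """Logistic map x -> r*x*(1-x); at r = 4 the iterates wander over
    (0, 1) in a way that looks random but is fully deterministic."""
    x = x0
    while True:
        x = r * x * (1 - x)
        yield x

g = lcg(42)
print([round(next(g), 3) for _ in range(3)])
```

    Both are deterministic, which is why such sequences are only pseudo-random: the same seed always reproduces the same stream.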

  8. Investigating the Randomness of Numbers

    ERIC Educational Resources Information Center

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…

  9. Wireless Network Security Using Randomness

    DTIC Science & Technology

    2012-06-19

    Xiao, Sheng; Gong, Weibo

    The present invention provides systems and methods for securing communications in a wireless network by utilizing the inherent randomness of propagation errors to enable legitimate users to dynamically... Subject terms: patent, security, wireless networks, randomness.

  10. Randomness and Non-Locality

    NASA Astrophysics Data System (ADS)

    Senno, Gabriel; Bendersky, Ariel; Figueira, Santiago

    2016-07-01

    The concepts of randomness and non-locality are intimately intertwined: outcomes of randomly chosen measurements over entangled systems exhibiting non-local correlations are, if we preclude instantaneous influence between distant measurement choices and outcomes, random. In this paper, we survey some recent advances in the knowledge of the interplay between these two important notions from a quantum information science perspective.

  11. Investigating the Randomness of Numbers

    ERIC Educational Resources Information Center

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…

  12. Random Numbers and Quantum Computers

    ERIC Educational Resources Information Center

    McCartney, Mark; Glass, David

    2002-01-01

    The topic of random numbers is investigated in such a way as to illustrate links between mathematics, physics and computer science. First, the generation of random numbers by a classical computer using the linear congruential generator and logistic map is considered. It is noted that these procedures yield only pseudo-random numbers since…

  13. Instant Random Information

    NASA Astrophysics Data System (ADS)

    Abramson, Nils H.

    2010-12-01

    Information is carried by matter or by energy, and thus Einstein stated that "no information can travel faster than light." He was also very critical of the "spooky action at a distance" described in quantum physics. However, many verified experiments have proven that the "spooky actions" not only work at a distance but also travel at a velocity faster than light, probably at infinite velocity. Examples are Young's fringes at low light levels, or entanglements. My explanation is that this information is without energy. In the following I will refer to this spooky information as exformation, where "ex-" refers to existence: the information is not transported in any way, it simply exists. Thus Einstein might have been wrong when he stated that no information can travel faster than light. But he was right in that no detectable information can travel faster than light. Phenomena connected to entanglement appear at first to be exceptions, but in those cases the information cannot be reconstructed until energy is later sent in the form of correlation using ordinary information at the velocity of light. In entanglement we see that even though the exformation cannot be detected directly, because of its lack of energy, it can still influence what happens at random, because in quantum physics there is by definition no energy difference between two states that occur randomly.

  14. Fragmentation of random trees

    NASA Astrophysics Data System (ADS)

    Kalay, Ziya; Ben-Naim, Eli

    2015-03-01

    We investigate the fragmentation of a random recursive tree by repeated removal of nodes, resulting in a forest of disjoint trees. The initial tree is generated by sequentially attaching new nodes to randomly chosen existing nodes until the tree contains N nodes. As nodes are removed, one at a time, the tree dissolves into an ensemble of separate trees, namely a forest. We study the statistical properties of trees and nodes in this heterogeneous forest. In the limit N → ∞, we find that the system is characterized by a single parameter: the fraction of remaining nodes m. We obtain analytically the size density ϕ_s of trees of size s, which has a power-law tail ϕ_s ~ s^(−α), with exponent α = 1 + 1/m. Therefore, the tail becomes steeper as further nodes are removed, producing an unusual scaling exponent that increases continuously with time. Furthermore, we investigate the fragment size distribution in a growing tree, where nodes are added as well as removed, and find that the distribution for this case is much narrower.
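    The setup can be simulated directly: grow a random recursive tree, delete a fraction 1 − m of its nodes, and measure the surviving fragment sizes. A minimal sketch (not the authors' code):

```python
import random
from collections import defaultdict

def random_recursive_tree(n, rng=random):
    """Each new node attaches to a uniformly chosen existing node."""
    parent = {0: None}
    for v in range(1, n):
        parent[v] = rng.randrange(v)
    return parent

def forest_sizes(parent, removed):
    """Sizes of the connected components that survive after deleting the
    nodes in `removed` (depth-first search over surviving edges)."""
    alive = set(parent) - removed
    adj = defaultdict(list)
    for v, p in parent.items():
        if p is not None and v in alive and p in alive:
            adj[v].append(p)
            adj[p].append(v)
    sizes, seen = [], set()
    for v in alive:
        if v in seen:
            continue
        stack, size = [v], 0
        seen.add(v)
        while stack:
            u = stack.pop()
            size += 1
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        sizes.append(size)
    return sizes

random.seed(0)
tree = random_recursive_tree(1000)
removed = set(random.sample(range(1000), 500))  # keep fraction m = 0.5
print(len(forest_sizes(tree, removed)))         # number of fragments
```

    With m = 0.5 the predicted tail exponent is α = 1 + 1/m = 3; verifying that would require averaging over many realizations.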

  15. Random-walk enzymes

    PubMed Central

    Mak, Chi H.; Pham, Phuong; Afif, Samir A.; Goodman, Myron F.

    2015-01-01

    Enzymes that rely on random walk to search for substrate targets in a heterogeneously dispersed medium can leave behind complex spatial profiles of their catalyzed conversions. The catalytic signatures of these random-walk enzymes are the result of two coupled stochastic processes: scanning and catalysis. Here we develop analytical models to understand the conversion profiles produced by these enzymes, comparing an intrusive model, in which scanning and catalysis are tightly coupled, against a loosely coupled passive model. Diagrammatic theory and path-integral solutions of these models revealed clearly distinct predictions. Comparison to experimental data from catalyzed deaminations deposited on single-stranded DNA by the enzyme activation-induced deoxycytidine deaminase (AID) demonstrates that catalysis and diffusion are strongly intertwined, where the chemical conversions give rise to new stochastic trajectories that were absent if the substrate DNA was homogeneous. The C → U deamination profiles in both analytical predictions and experiments exhibit a strong contextual dependence, where the conversion rate of each target site is strongly contingent on the identities of other surrounding targets, with the intrusive model showing an excellent fit to the data. These methods can be applied to deduce sequence-dependent catalytic signatures of other DNA modification enzymes, with potential applications to cancer, gene regulation, and epigenetics. PMID:26465508

  16. Random-walk enzymes.

    PubMed

    Mak, Chi H; Pham, Phuong; Afif, Samir A; Goodman, Myron F

    2015-09-01

    Enzymes that rely on random walk to search for substrate targets in a heterogeneously dispersed medium can leave behind complex spatial profiles of their catalyzed conversions. The catalytic signatures of these random-walk enzymes are the result of two coupled stochastic processes: scanning and catalysis. Here we develop analytical models to understand the conversion profiles produced by these enzymes, comparing an intrusive model, in which scanning and catalysis are tightly coupled, against a loosely coupled passive model. Diagrammatic theory and path-integral solutions of these models revealed clearly distinct predictions. Comparison to experimental data from catalyzed deaminations deposited on single-stranded DNA by the enzyme activation-induced deoxycytidine deaminase (AID) demonstrates that catalysis and diffusion are strongly intertwined, where the chemical conversions give rise to new stochastic trajectories that were absent if the substrate DNA was homogeneous. The C→U deamination profiles in both analytical predictions and experiments exhibit a strong contextual dependence, where the conversion rate of each target site is strongly contingent on the identities of other surrounding targets, with the intrusive model showing an excellent fit to the data. These methods can be applied to deduce sequence-dependent catalytic signatures of other DNA modification enzymes, with potential applications to cancer, gene regulation, and epigenetics.

  17. Random walk with barriers

    PubMed Central

    Novikov, Dmitry S.; Fieremans, Els; Jensen, Jens H.; Helpern, Joseph A.

    2011-01-01

    Restrictions to molecular motion by barriers (membranes) are ubiquitous in porous media, composite materials and biological tissues. A major challenge is to characterize the microstructure of a material or an organism nondestructively using a bulk transport measurement. Here we demonstrate how the long-range structural correlations introduced by permeable membranes give rise to distinct features of transport. We consider Brownian motion restricted by randomly placed and oriented membranes (d − 1 dimensional planes in d dimensions) and focus on the disorder-averaged diffusion propagator using a scattering approach. The renormalization group solution reveals a scaling behavior of the diffusion coefficient for large times, with a characteristically slow inverse square root time dependence for any d. Its origin lies in the strong structural fluctuations introduced by the spatially extended random restrictions, representing a novel universality class of the structural disorder. Our results agree well with Monte Carlo simulations in two dimensions. They can be used to identify permeable barriers as restrictions to transport, and to quantify their permeability and surface area. PMID:21686083

  18. Random-walk enzymes

    NASA Astrophysics Data System (ADS)

    Mak, Chi H.; Pham, Phuong; Afif, Samir A.; Goodman, Myron F.

    2015-09-01

    Enzymes that rely on random walk to search for substrate targets in a heterogeneously dispersed medium can leave behind complex spatial profiles of their catalyzed conversions. The catalytic signatures of these random-walk enzymes are the result of two coupled stochastic processes: scanning and catalysis. Here we develop analytical models to understand the conversion profiles produced by these enzymes, comparing an intrusive model, in which scanning and catalysis are tightly coupled, against a loosely coupled passive model. Diagrammatic theory and path-integral solutions of these models revealed clearly distinct predictions. Comparison to experimental data from catalyzed deaminations deposited on single-stranded DNA by the enzyme activation-induced deoxycytidine deaminase (AID) demonstrates that catalysis and diffusion are strongly intertwined, where the chemical conversions give rise to new stochastic trajectories that were absent if the substrate DNA was homogeneous. The C →U deamination profiles in both analytical predictions and experiments exhibit a strong contextual dependence, where the conversion rate of each target site is strongly contingent on the identities of other surrounding targets, with the intrusive model showing an excellent fit to the data. These methods can be applied to deduce sequence-dependent catalytic signatures of other DNA modification enzymes, with potential applications to cancer, gene regulation, and epigenetics.

  19. Pseudo-random number generator for the Sigma 5 computer

    NASA Technical Reports Server (NTRS)

    Carroll, S. N.

    1983-01-01

    A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
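    A generator of this linear congruential form, with a prime modulus and a primitive root as multiplier, can be sketched as follows. The constants are the well-known "minimal standard" pair m = 2^31 − 1 (a Mersenne prime) and a = 16807, not necessarily the values chosen for the Sigma 5.

```python
def lehmer(seed, a=16807, m=2**31 - 1):
    """Multiplicative LCG x -> (a*x) mod m with prime modulus m and
    primitive root a, so the sequence cycles through all of 1..m-1."""
    x = seed
    while True:
        x = (a * x) % m
        yield x

g = lehmer(1)
vals = [next(g) for _ in range(3)]
print(vals)  # -> [16807, 282475249, 1622650073]
```

    Because a is a primitive root modulo the prime m, the period is the full m − 1, which is the property the lattice test and primitive-root selection are protecting.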

  20. Mapping in random-structures

    SciTech Connect

    Reidys, C.M.

    1996-06-01

    A mapping in random-structures is defined on the vertices of a generalized hypercube Q_α^n. A random-structure consists of (1) a random contact graph and (2) a family of relations imposed on adjacent vertices. The vertex set of a random contact graph is the set of all coordinates of a vertex P ∈ Q_α^n. Its edge set is the union of the edge sets of two random graphs: the first is a random 1-regular graph on 2m vertices (coordinates), and the second is a random graph G_p with p = c_2/n on all n vertices (coordinates). The structure of the random contact graphs is investigated, and it is shown that for certain values of m and c_2 the mapping in random-structures allows search over the set of random-structures. This is applied to mappings in RNA secondary structures. The results on random-structures might also be helpful for designing 3D-folding algorithms for RNA.

  1. Discriminative Projection Selection Based Face Image Hashing

    NASA Astrophysics Data System (ADS)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.
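    The random-projection half of such a scheme can be sketched as below. Sign quantization stands in for the paper's bimodal Gaussian-mixture quantizer, and the Fisher-criterion row selection is omitted, so this is only an illustrative baseline, not the proposed method.

```python
import random

def random_projection_hash(features, n_bits, rng):
    """Project a feature vector onto n_bits random Gaussian directions and
    quantize each projection by sign, producing a binary hash."""
    bits = []
    for _ in range(n_bits):
        row = [rng.gauss(0, 1) for _ in features]
        proj = sum(r * f for r, f in zip(row, features))
        bits.append(1 if proj >= 0 else 0)
    return bits

rng = random.Random(7)
print(random_projection_hash([0.2, -1.3, 0.7, 0.1], 8, rng))
```

    In a user-dependent variant, the rows would be scored (e.g. by the Fisher criterion on enrollment data) and only the most discriminative rows kept.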

  2. Mobile access to virtual randomization for investigator-initiated trials.

    PubMed

    Deserno, Thomas M; Keszei, András P

    2017-08-01

    Background/aims: Randomization is indispensable in clinical trials in order to provide unbiased treatment allocation and a valid statistical inference. Improper handling of allocation lists can be avoided using central systems, for example, human-based services. However, central systems are unaffordable for investigator-initiated trials and might be inaccessible from some places where study subjects need allocations. We propose mobile access to virtual randomization, where the randomization lists are non-existent and the appropriate allocation is computed on demand. Methods: The core of the system architecture is an electronic data capture system or a clinical trial management system, which is extended by an R interface connecting to the R server using the Java R Interface. Mobile devices communicate via representational state transfer (REST) web services. Furthermore, a simple web-based setup allows non-statisticians to configure the appropriate statistics. Our comprehensive R script supports simple randomization, restricted randomization using a random allocation rule, block randomization, and stratified randomization for un-blinded, single-blinded, and double-blinded trials. For each trial, the electronic data capture system or the clinical trial management system stores the randomization parameters and the subject assignments. Results: Apps are provided for iOS and Android, and subjects are randomized using smartphones. After logging onto the system, the user selects the trial and the subject, and the allocation number and treatment arm are displayed instantaneously and stored in the core system. So far, 156 subjects have been allocated from mobile devices serving five investigator-initiated trials. Conclusion: Transforming pre-printed allocation lists into virtual ones ensures the correct conduct of trials and guarantees strictly sequential processing in all trial sites. Covering 88% of all randomization models that are used in recent trials, virtual randomization
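
The allocations in this system are computed on demand by an R script behind a REST interface; the code itself is not shown in the abstract. As a rough illustration of the "virtual list" idea, a blocked allocation can be derived deterministically from a trial seed, so every site computes the same sequence without any stored list (a Python sketch with hypothetical parameters, not the authors' implementation):

```python
import random

def block_allocation(subject_index, seed, block_size=4, arms=("A", "B")):
    """Return the arm for the subject at 0-based position `subject_index`.

    No allocation list is stored: the permutation of each block is
    recomputed on demand from the trial seed, so every site derives the
    identical sequence (the "virtual" list).
    """
    assert block_size % len(arms) == 0, "block must balance across arms"
    block_no, offset = divmod(subject_index, block_size)
    # Seed a private generator from (trial seed, block number), combined
    # into one integer so the permutation is reproducible everywhere.
    rng = random.Random(seed * 1_000_003 + block_no)
    block = list(arms) * (block_size // len(arms))
    rng.shuffle(block)
    return block[offset]

# Any site asking for subject 5 of the (hypothetical) trial 1234 computes
# the same arm, with every block of 4 balanced 2:2 between arms.
arm = block_allocation(5, seed=1234)
```

Because each block's permutation is a pure function of the seed and block number, never materializing the list cannot desynchronize sites.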

  3. Purely antiferromagnetic magnetoelectric random access memory.

    PubMed

    Kosub, Tobias; Kopte, Martin; Hühne, Ruben; Appel, Patrick; Shields, Brendan; Maletinsky, Patrick; Hübner, René; Liedke, Maciej Oskar; Fassbender, Jürgen; Schmidt, Oliver G; Makarov, Denys

    2017-01-03

    Magnetic random access memory schemes employing magnetoelectric coupling to write binary information promise outstanding energy efficiency. We propose and demonstrate a purely antiferromagnetic magnetoelectric random access memory (AF-MERAM) that offers a remarkable 50-fold reduction of the writing threshold compared with ferromagnet-based counterparts, is robust against magnetic disturbances and exhibits no ferromagnetic hysteresis losses. Using the magnetoelectric antiferromagnet Cr2O3, we demonstrate reliable isothermal switching via gate voltage pulses and all-electric readout at room temperature. As no ferromagnetic component is present in the system, the writing magnetic field does not need to be pulsed for readout, allowing permanent magnets to be used. Based on our prototypes, we construct a comprehensive model of the magnetoelectric selection mechanisms in thin films of magnetoelectric antiferromagnets, revealing misfit induced ferrimagnetism as an important factor. Beyond memory applications, the AF-MERAM concept introduces a general all-electric interface for antiferromagnets and should find wide applicability in antiferromagnetic spintronics.

  4. Purely antiferromagnetic magnetoelectric random access memory

    NASA Astrophysics Data System (ADS)

    Kosub, Tobias; Kopte, Martin; Hühne, Ruben; Appel, Patrick; Shields, Brendan; Maletinsky, Patrick; Hübner, René; Liedke, Maciej Oskar; Fassbender, Jürgen; Schmidt, Oliver G.; Makarov, Denys

    2017-01-01

    Magnetic random access memory schemes employing magnetoelectric coupling to write binary information promise outstanding energy efficiency. We propose and demonstrate a purely antiferromagnetic magnetoelectric random access memory (AF-MERAM) that offers a remarkable 50-fold reduction of the writing threshold compared with ferromagnet-based counterparts, is robust against magnetic disturbances and exhibits no ferromagnetic hysteresis losses. Using the magnetoelectric antiferromagnet Cr2O3, we demonstrate reliable isothermal switching via gate voltage pulses and all-electric readout at room temperature. As no ferromagnetic component is present in the system, the writing magnetic field does not need to be pulsed for readout, allowing permanent magnets to be used. Based on our prototypes, we construct a comprehensive model of the magnetoelectric selection mechanisms in thin films of magnetoelectric antiferromagnets, revealing misfit induced ferrimagnetism as an important factor. Beyond memory applications, the AF-MERAM concept introduces a general all-electric interface for antiferromagnets and should find wide applicability in antiferromagnetic spintronics.

  5. Resolving social dilemmas on evolving random networks

    NASA Astrophysics Data System (ADS)

    Szolnoki, Attila; Perc, Matjaž

    2009-05-01

    We show that strategy-independent adaptations of random interaction networks can induce powerful mechanisms, ranging from the Red Queen to group selection, which promote cooperation in evolutionary social dilemmas. These two mechanisms emerge spontaneously as dynamical processes due to deletions and additions of links, which are performed whenever players adopt new strategies and after a certain number of game iterations, respectively. The potency of cooperation promotion, as well as the mechanism responsible for it, can thereby be tuned via a single parameter determining the frequency of link additions. We thus demonstrate that coevolving random networks may evoke an appropriate mechanism for each social dilemma, such that cooperation prevails even in highly unfavorable conditions.

  6. Random Test Run Length and Effectiveness

    NASA Technical Reports Server (NTRS)

    Andrews, James H.; Groce, Alex; Weston, Melissa; Xu, Ru-Gang

    2008-01-01

    A poorly understood but important factor in many applications of random testing is the selection of a maximum length for test runs. Given a limited time for testing, it is seldom clear whether executing a small number of long runs or a large number of short runs maximizes utility. It is generally expected that longer runs are more likely to expose failures -- which is certainly true with respect to runs shorter than the shortest failing trace. However, longer runs produce longer failing traces, requiring more effort from humans in debugging or more resources for automated minimization. In testing with feedback, increasing ranges for parameters may also cause the probability of failure to decrease in longer runs. We show that the choice of test length dramatically impacts the effectiveness of random testing, and that the patterns observed in simple models and predicted by analysis are useful in understanding effects observed.
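
The trade-off the authors study can be reproduced in miniature with a toy memoryless failure model (each step fails independently, an assumption real systems with feedback violate). Under a fixed step budget, many short runs expose more independent failures here, while the few long runs yield longer failing traces:

```python
import random

def run_test(max_len, rng, p_fail=0.01):
    """One random test run of at most `max_len` steps; return the length of
    the failing trace, or None if the run ends without exposing a failure."""
    for step in range(1, max_len + 1):
        if rng.random() < p_fail:  # toy model: each step fails independently
            return step
    return None

def campaign(budget, run_len, rng):
    """Spend a fixed step budget on runs of a given length (each run is
    charged its full length, for simplicity) and record the lengths of the
    failing traces, which drive debugging/minimization cost."""
    failures = []
    for _ in range(budget // run_len):
        result = run_test(run_len, rng)
        if result is not None:
            failures.append(result)
    return failures

rng = random.Random(42)
short_runs = campaign(100_000, run_len=10, rng=rng)    # 10,000 short runs
long_runs = campaign(100_000, run_len=1_000, rng=rng)  # 100 long runs
```

With per-step failure probability 0.01, a 10-step run fails about 10% of the time while a 1000-step run almost always fails, but its failing trace averages around 100 steps.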

  7. Facade Segmentation with a Structured Random Forest

    NASA Astrophysics Data System (ADS)

    Rahmani, K.; Huang, H.; Mayer, H.

    2017-05-01

    In this paper we present a bottom-up approach for the semantic segmentation of building facades. Facades have a predefined topology, contain specific objects such as doors and windows, and follow architectural rules. Our goal is to create homogeneous segments for facade objects. To this end, we have created a pixelwise labeling method using a Structured Random Forest. Evaluation on two datasets shows that the classifier achieves this goal, producing a nearly noise-free labeling image and performing on par with or even slightly better than the classifier-only stages of state-of-the-art approaches. This is due to the encoding of the local topological structure of the facade objects in the Structured Random Forest. Additionally, we have employed an iterative optimization approach to select the best possible labeling.

  9. Simulation of pedigree genotypes by random walks.

    PubMed Central

    Lange, K; Matthysse, S

    1989-01-01

    A random walk method, based on the Metropolis algorithm, is developed for simulating the distribution of trait and linkage marker genotypes in pedigrees where trait phenotypes are already known. The method complements techniques suggested by Ploughman and Boehnke and by Ott that are based on sequential sampling of genotypes within a pedigree. These methods are useful for estimating the power of linkage analysis before complete study of a pedigree is undertaken. We apply the random walk technique to a partially penetrant disease, schizophrenia, and to a recessive disease, ataxia-telangiectasia. In the first case we show that accessory phenotypes with higher penetrance than that of schizophrenia itself may be crucial for effective linkage analysis, and in the second case we show that impressionistic selection of informative pedigrees may be misleading. PMID:2589323
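
The Metropolis random walk underlying the method is generic; here is a minimal sketch on a toy continuous target (a standard normal standing in for the pedigree genotype distribution, which requires the pedigree-likelihood machinery the paper develops):

```python
import math
import random

def metropolis(logp, step, x0, n_steps, rng):
    """Metropolis random walk: propose a symmetric move and accept it with
    probability min(1, p(x') / p(x)); the visited states sample p."""
    x = x0
    chain = []
    for _ in range(n_steps):
        x_new = x + step * rng.uniform(-1.0, 1.0)
        # Accept or reject via the log-density ratio (tiny offset guards
        # against log(0) on the vanishingly rare draw u == 0).
        if math.log(rng.random() + 1e-300) < logp(x_new) - logp(x):
            x = x_new
        chain.append(x)
    return chain

rng = random.Random(0)
chain = metropolis(lambda x: -0.5 * x * x, step=1.0, x0=0.0,
                   n_steps=50_000, rng=rng)
```

For the pedigree application, `x` would be a full genotype configuration and the proposal a local change to one member's genotype, but the accept/reject skeleton is the same.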

  10. Neither fixed nor random: weighted least squares meta-regression.

    PubMed

    Stanley, T D; Doucouliagos, Hristos

    2017-03-01

    Our study revisits and challenges two core conventional meta-regression estimators: the prevalent use of 'mixed-effects' or random-effects meta-regression analysis and the correction of standard errors that defines fixed-effects meta-regression analysis (FE-MRA). We show how and explain why an unrestricted weighted least squares MRA (WLS-MRA) estimator is superior to conventional random-effects (or mixed-effects) meta-regression when there is publication (or small-sample) bias, is as good as FE-MRA in all cases, and is better than fixed effects in most practical applications. Simulations and statistical theory show that WLS-MRA provides satisfactory estimates of meta-regression coefficients that are practically equivalent to mixed effects or random effects when there is no publication bias. When there is publication selection bias, WLS-MRA always has smaller bias than mixed effects or random effects. In practical applications, an unrestricted WLS meta-regression is likely to give practically equivalent or superior estimates to fixed-effects, random-effects, and mixed-effects meta-regression approaches. However, random-effects meta-regression remains viable and perhaps somewhat preferable if selection for statistical significance (publication bias) can be ruled out and when random, additive normal heterogeneity is known to directly affect the 'true' regression coefficient. Copyright © 2016 John Wiley & Sons, Ltd.
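
At its core, the unrestricted WLS-MRA is ordinary weighted least squares with inverse-variance weights 1/SE_i^2 and no restriction imposed on the error variance (the restriction that defines FE-MRA). A sketch on hypothetical effect sizes (the numbers are illustrative, not from the study):

```python
import numpy as np

# Hypothetical meta-analysis data: effect estimates b, their standard
# errors se, and one moderator x (illustrative numbers only).
b = np.array([0.30, 0.45, 0.25, 0.60, 0.50, 0.35])
se = np.array([0.10, 0.15, 0.08, 0.20, 0.12, 0.09])
x = np.array([0.0, 1.0, 0.0, 2.0, 1.0, 0.5])

# Unrestricted WLS-MRA: weighted least squares with inverse-variance
# weights, solving the normal equations X'WX beta = X'Wb.
w = 1.0 / se**2
X = np.column_stack([np.ones_like(x), x])
XtW = X.T * w                      # equals X.T @ diag(w)
beta = np.linalg.solve(XtW @ X, XtW @ b)
intercept, slope = beta
```

The intercept is the bias-corrected summary effect at x = 0; the slope estimates how the moderator shifts reported effects.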

  11. Avoid Early Selection for Growth Rate in Cottonwood

    Treesearch

    D. T. Cooper; Robert B. Ferguson

    1971-01-01

    A sample of 37 cottonwood clones from a selection program was compared with a sample of 40 random clones in a 14-year test at two sites near Stoneville, Mississippi. Throughout the test period, the select sample was slightly better in mean growth rate, but this difference decreased with age. Performance of "blue tag" clones selected at age 5 and planted...

  12. Bridging Emergent Attributes and Darwinian Principles in Teaching Natural Selection

    ERIC Educational Resources Information Center

    Xu, Dongchen; Chi, Michelene T. H.

    2016-01-01

    Students often have misconceptions about natural selection as they misuse a direct causal schema to explain the process. Natural selection is in fact an emergent process where random interactions lead to changes in a population. The misconceptions stem from students' lack of emergent schema for natural selection. In order to help students…

  13. Generalized random sequential adsorption

    NASA Astrophysics Data System (ADS)

    Tarjus, G.; Schaaf, P.; Talbot, J.

    1990-12-01

    Adsorption of hard spherical particles onto a flat uniform surface is analyzed by using generalized random sequential adsorption (RSA) models. These models are defined by releasing the condition of immobility present in the usual RSA rules to allow for desorption or surface diffusion. Contrary to the simple RSA case, generalized RSA processes are no longer irreversible and the system formed by the adsorbed particles on the surface may reach an equilibrium state. We show by using a distribution function approach that the kinetics of such processes can be described by means of an exact infinite hierarchy of equations reminiscent of the Kirkwood-Salsburg hierarchy for systems at equilibrium. We illustrate the way in which the systems produced by adsorption/desorption and by adsorption/diffusion evolve between the two limits represented by ``simple RSA'' and ``equilibrium'' by considering approximate solutions in terms of truncated density expansions.
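
The "simple RSA" limit of these models is easy to simulate: particles arrive at random positions and stick only if they overlap nothing already adsorbed, with no desorption or diffusion. A one-dimensional sketch (the classic car-parking process, whose jamming coverage is Rényi's constant, about 0.7476):

```python
import random

def rsa_1d(length, n_attempts, rng, diameter=1.0):
    """Simple (irreversible) RSA on a line: unit segments arrive at uniform
    random positions and adsorb only if they overlap no earlier segment."""
    centers = []
    for _ in range(n_attempts):
        c = rng.uniform(diameter / 2, length - diameter / 2)
        if all(abs(c - other) >= diameter for other in centers):
            centers.append(c)
    return centers

rng = random.Random(7)
length = 200.0
centers = rsa_1d(length, n_attempts=20_000, rng=rng)
# Coverage creeps toward the jamming limit (~0.7476) and then freezes;
# generalized RSA with desorption or diffusion would instead relax
# toward an equilibrium state.
coverage = len(centers) * 1.0 / length
```

Adding a desorption move (remove a random adsorbed particle with some rate) would turn this irreversible process into one of the generalized, equilibrating models the paper analyzes.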

  14. Tailored Random Graph Ensembles

    NASA Astrophysics Data System (ADS)

    Roberts, E. S.; Annibale, A.; Coolen, A. C. C.

    2013-02-01

    Tailored graph ensembles are a developing bridge between biological networks and statistical mechanics. The aim is to use this concept to generate a suite of rigorous tools that can be used to quantify and compare the topology of cellular signalling networks, such as protein-protein interaction networks and gene regulation networks. We calculate exact and explicit formulae for the leading orders in the system size of the Shannon entropies of random graph ensembles constrained with degree distribution and degree-degree correlation. We also construct an ergodic detailed balance Markov chain with non-trivial acceptance probabilities which converges to a strictly uniform measure and is based on edge swaps that conserve all degrees. The acceptance probabilities can be generalized to define Markov chains that target any alternative desired measure on the space of directed or undirected graphs, in order to generate graphs with more sophisticated topological features.
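
The degree-conserving edge swap at the heart of such a Markov chain is simple to sketch. Note that the plain swap below omits the non-trivial acceptance probabilities the authors derive, which are what make the stationary measure strictly uniform:

```python
import random

def edge_swaps(edges, n_swaps, rng):
    """Randomize a simple undirected graph while conserving every node's
    degree: repeatedly pick edges (a,b) and (c,d) and rewire them to
    (a,d), (c,b), rejecting swaps that would create a self-loop or a
    duplicate edge."""
    edges = [tuple(e) for e in edges]
    present = {frozenset(e) for e in edges}
    for _ in range(n_swaps):
        i, j = rng.randrange(len(edges)), rng.randrange(len(edges))
        (a, b), (c, d) = edges[i], edges[j]
        if len({a, b, c, d}) < 4:
            continue  # swap would create a self-loop or a multi-edge
        if frozenset((a, d)) in present or frozenset((c, b)) in present:
            continue  # swap would duplicate an existing edge
        present -= {frozenset((a, b)), frozenset((c, d))}
        present |= {frozenset((a, d)), frozenset((c, b))}
        edges[i], edges[j] = (a, d), (c, b)
    return edges

rng = random.Random(3)
original = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]
shuffled = edge_swaps(original, 1_000, rng)
```

Because rejections subtly bias the plain chain, uniform sampling requires the corrected acceptance probabilities; the sketch shows only the degree-preserving move itself.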

  15. Investments in random environments

    NASA Astrophysics Data System (ADS)

    Navarro-Barrientos, Jesús Emeterio; Cantero-Álvarez, Rubén; Matias Rodrigues, João F.; Schweitzer, Frank

    2008-03-01

    We present analytical investigations of a multiplicative stochastic process that models a simple investor dynamics in a random environment. The dynamics of the investor's budget, x(t) , depends on the stochasticity of the return on investment, r(t) , for which different model assumptions are discussed. The fat-tail distribution of the budget is investigated and compared with theoretical predictions. We are mainly interested in the most probable value xmp of the budget that reaches a constant value over time. Based on an analytical in