Sample records for human-generated randomness

  1. On grey levels in random CAPTCHA generation

    NASA Astrophysics Data System (ADS)

    Newton, Fraser; Kouritzin, Michael A.

    2011-06-01

    A CAPTCHA is an automatically generated test designed to distinguish between humans and computer programs; specifically, they are designed to be easy for humans but difficult for computer programs to pass in order to prevent the abuse of resources by automated bots. They are commonly seen guarding webmail registration forms, online auction sites, and preventing brute force attacks on passwords. In the following, we address the question: How does adding a grey level to random CAPTCHA generation affect the utility of the CAPTCHA? We treat the problem of generating the random CAPTCHA as one of random field simulation: An initial state of background noise is evolved over time using Gibbs sampling and an efficient algorithm for generating correlated random variables. This approach has already been found to yield highly-readable yet difficult-to-crack CAPTCHAs. We detail how the requisite parameters for introducing grey levels are estimated and how we generate the random CAPTCHA. The resulting CAPTCHA will be evaluated in terms of human readability as well as its resistance to automated attacks in the forms of character segmentation and optical character recognition.
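
    The abstract above describes the general recipe (evolve a noise field by Gibbs sampling until a word emerges) but not the exact conditional probabilities. The following is only a minimal sketch of that general idea, with an Ising-like nearest-neighbour coupling and a hypothetical target mask standing in for the rendered word; the grid size, number of grey levels and the `beta`/`alpha` strengths are all illustrative assumptions, not the authors' algorithm.

    ```python
    # Minimal sketch (not the authors' algorithm): Gibbs resampling of a small
    # lattice field with three grey levels, a nearest-neighbour coupling and a
    # hypothetical target mask standing in for the rendered CAPTCHA word.
    import numpy as np

    rng = np.random.default_rng(0)
    H, W, LEVELS = 20, 60, 3          # hypothetical grid size and grey levels
    beta, alpha = 1.0, 0.8            # hypothetical coupling / mask-attraction strengths
    target = rng.integers(0, LEVELS, size=(H, W))   # stand-in for a rendered word mask
    field = rng.integers(0, LEVELS, size=(H, W))    # initial background noise

    def neighbours(i, j):
        """4-neighbourhood, clipped at the borders."""
        for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < H and 0 <= nj < W:
                yield ni, nj

    for _ in range(50):                      # Gibbs sweeps over the whole field
        for i in range(H):
            for j in range(W):
                # Conditional energy of each grey level given the current neighbours
                energies = np.array([
                    -beta * sum(field[ni, nj] == g for ni, nj in neighbours(i, j))
                    - alpha * (target[i, j] == g)
                    for g in range(LEVELS)
                ])
                probs = np.exp(-energies)
                probs /= probs.sum()
                field[i, j] = rng.choice(LEVELS, p=probs)
    ```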

  2. On random field Completely Automated Public Turing Test to Tell Computers and Humans Apart generation.

    PubMed

    Kouritzin, Michael A; Newton, Fraser; Wu, Biao

    2013-04-01

    Herein, we propose generating CAPTCHAs through random field simulation and give a novel, effective and efficient algorithm to do so. Indeed, we demonstrate that sufficient information about word tests for easy human recognition is contained in the site marginal probabilities and the site-to-nearby-site covariances and that these quantities can be embedded directly into certain conditional probabilities, designed for effective simulation. The CAPTCHAs are then partial random realizations of the random CAPTCHA word. We start with an initial random field (e.g., randomly scattered letter pieces) and use Gibbs resampling to re-simulate portions of the field repeatedly using these conditional probabilities until the word becomes human-readable. The residual randomness from the initial random field together with the random implementation of the CAPTCHA word provide significant resistance to attack. This results in a CAPTCHA, which is unrecognizable to modern optical character recognition but is recognized about 95% of the time in a human readability study.

  3. An Examination of the Utility of Non-Linear Dynamics Techniques for Analyzing Human Information Behaviors.

    ERIC Educational Resources Information Center

    Snyder, Herbert; Kurtze, Douglas

    1992-01-01

    Discusses the use of chaos, or nonlinear dynamics, for investigating computer-mediated communication. A comparison between real, human-generated data from a computer network and similarly constructed random-generated data is made, and mathematical procedures for determining chaos are described. (seven references) (LRW)

  4. Assessing randomness and complexity in human motion trajectories through analysis of symbolic sequences

    PubMed Central

    Peng, Zhen; Genewein, Tim; Braun, Daniel A.

    2014-01-01

    Complexity is a hallmark of intelligent behavior consisting both of regular patterns and random variation. To quantitatively assess the complexity and randomness of human motion, we designed a motor task in which we translated subjects' motion trajectories into strings of symbol sequences. In the first part of the experiment participants were asked to perform self-paced movements to create repetitive patterns, copy pre-specified letter sequences, and generate random movements. To investigate whether the degree of randomness can be manipulated, in the second part of the experiment participants were asked to perform unpredictable movements in the context of a pursuit game, where they received feedback from an online Bayesian predictor guessing their next move. We analyzed symbol sequences representing subjects' motion trajectories with five common complexity measures: predictability, compressibility, approximate entropy, Lempel-Ziv complexity, as well as effective measure complexity. We found that subjects' self-created patterns were the most complex, followed by drawing movements of letters and self-paced random motion. We also found that participants could change the randomness of their behavior depending on context and feedback. Our results suggest that humans can adjust both complexity and regularity in different movement types and contexts and that this can be assessed with information-theoretic measures of the symbolic sequences generated from movement trajectories. PMID:24744716
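
    As an illustration of one of the measures named above, the sketch below counts distinct phrases in a Lempel-Ziv-style incremental parse of a symbol string; more random strings tend to parse into more phrases. The mapping from motion trajectories to symbols used in the study is not reproduced here, and the example strings are arbitrary.

    ```python
    # Minimal sketch: count distinct phrases in an LZ78-style incremental parse.
    # More complex/random symbol sequences yield more phrases.
    def lz_phrase_count(sequence: str) -> int:
        phrases, phrase = set(), ""
        for symbol in sequence:
            phrase += symbol
            if phrase not in phrases:                  # a new phrase ends here
                phrases.add(phrase)
                phrase = ""
        return len(phrases) + (1 if phrase else 0)     # count a trailing partial phrase

    print(lz_phrase_count("abababababab"))   # repetitive pattern -> 6 phrases
    print(lz_phrase_count("abaabbbaababa"))  # irregular string   -> 7 phrases
    ```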

  5. Human Capital Background and the Educational Attainment of Second-Generation Immigrants in France

    ERIC Educational Resources Information Center

    Dos Santos, Manon Domingues; Wolff, Francois-Charles

    2011-01-01

    In this paper, we study the impact of parental human capital background on ethnic educational gaps between second-generation immigrants using a large data set collected in France in 2003. Estimates from censored random effect ordered Probit regressions show that the skills of immigrants explain, for the most part, the ethnic educational gap between…

  6. The nature and perception of fluctuations in human musical rhythms

    NASA Astrophysics Data System (ADS)

    Hennig, Holger; Fleischmann, Ragnar; Fredebohm, Anneke; Hagmayer, York; Nagler, Jan; Witt, Annette; Theis, Fabian; Geisel, Theo

    2012-02-01

    Although human musical performances represent one of the most valuable achievements of mankind, the best musicians perform imperfectly. Musical rhythms are not entirely accurate and thus inevitably deviate from the ideal beat pattern. Nevertheless, computer generated perfect beat patterns are frequently devalued by listeners due to a perceived lack of human touch. Professional audio editing software therefore offers a humanizing feature which artificially generates rhythmic fluctuations. However, the built-in humanizing units are essentially random number generators producing only simple uncorrelated fluctuations. Here, for the first time, we establish long-range fluctuations as an inevitable natural companion of both simple and complex human rhythmic performances [1]. Moreover, we demonstrate that listeners strongly prefer long-range correlated fluctuations in musical rhythms. Thus, the favorable fluctuation type for humanizing interbeat intervals coincides with the one generically inherent in human musical performances. [1] HH et al., PLoS ONE,6,e26457 (2011)

  7. Genetic recombination pathways and their application for genome modification of human embryonic stem cells.

    PubMed

    Nieminen, Mikko; Tuuri, Timo; Savilahti, Harri

    2010-10-01

    Human embryonic stem cells are pluripotent cells derived from early human embryo and retain a potential to differentiate into all adult cell types. They provide vast opportunities in cell replacement therapies and are expected to become significant tools in drug discovery as well as in the studies of cellular and developmental functions of human genes. The progress in applying different types of DNA recombination reactions for genome modification in a variety of eukaryotic cell types has provided means to utilize recombination-based strategies also in human embryonic stem cells. Homologous recombination-based methods, particularly those utilizing extended homologous regions and those employing zinc finger nucleases to boost genomic integration, have shown their usefulness in efficient genome modification. Site-specific recombination systems are potent genome modifiers, and they can be used to integrate DNA into loci that contain an appropriate recombination signal sequence, either naturally occurring or suitably pre-engineered. Non-homologous recombination can be used to generate random integrations in genomes relatively effortlessly, albeit with a moderate efficiency and precision. DNA transposition-based strategies offer substantially more efficient random strategies and provide means to generate single-copy insertions, thus potentiating the generation of genome-wide insertion libraries applicable in genetic screens. 2010 Elsevier Inc. All rights reserved.

  8. Effects of stereospecific positioning of fatty acids in triacylglycerol structures in native and randomized fats: a review of their nutritional implications

    PubMed Central

    Karupaiah, Tilakavati; Sundram, Kalyana

    2007-01-01

    Most studies on lipid-lowering diets have focused on the total content of saturated, polyunsaturated and monounsaturated fatty acids. However, the distribution of these fatty acids on the triacylglycerol (TAG) molecule and the molecular TAG species generated by this stereospecificity are characteristic for various native dietary TAGs. Fat randomization or interesterification is a process involving the positional redistribution of fatty acids, which leads to the generation of new TAG molecular species. A comparison between native and randomized TAGs is the subject of this review with regard to the role of stereospecificity of fatty acids in metabolic processing and effects on fasting lipids and postprandial lipemia. The positioning of unsaturated versus saturated fatty acids in the sn-2 position of TAGs indicates differences in early metabolic processing and postprandial clearance, which may explain modulatory effects on atherogenicity and thrombogenicity. Both human and animal studies are discussed with implications for human health. PMID:17625019

  9. Subjective randomness as statistical inference.

    PubMed

    Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B

    2018-06-01

    Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.
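
    The framing above treats subjective randomness as the evidence a sequence provides for a random versus a regular generating process. A minimal sketch of that idea for binary sequences follows, with the "regular" process modelled here as a hypothetical repetition-biased Markov chain rather than the restricted machine classes and distributions developed in the paper.

    ```python
    # Minimal sketch: subjective randomness as a log-likelihood ratio
    #   randomness(x) = log2 P(x | random) - log2 P(x | regular)
    # The "regular" process below is a hypothetical repetition-biased Markov chain.
    from math import log2

    def randomness_score(bits: str, p_repeat: float = 0.8) -> float:
        log_p_random = -len(bits)                      # log2 of (1/2)^n
        log_p_regular = -1.0                           # first symbol is 50/50
        for prev, cur in zip(bits, bits[1:]):
            log_p_regular += log2(p_repeat if cur == prev else 1 - p_repeat)
        return log_p_random - log_p_regular

    print(randomness_score("11111111"))   # negative: better explained as regular
    print(randomness_score("10010110"))   # positive: looks more random
    ```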

  10. Quantifying randomness in real networks

    NASA Astrophysics Data System (ADS)

    Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-10-01

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs.
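
    The authors' released dk-series software is not reproduced here; as a rough illustration of the weakest member of that family (degree-preserving, or 1k, randomization), the sketch below rewires a stand-in graph while keeping its degree sequence and compares clustering and degree correlations before and after. The example graph and swap counts are arbitrary.

    ```python
    # Rough illustration (not the authors' dk-series code): degree-preserving
    # rewiring ("1k-randomization") and a before/after comparison of structure.
    import networkx as nx

    G = nx.watts_strogatz_graph(n=500, k=6, p=0.05, seed=1)   # stand-in "real" network
    R = G.copy()
    nx.double_edge_swap(R, nswap=10 * G.number_of_edges(), max_tries=10**6, seed=1)

    for name, g in (("original", G), ("1k-random", R)):
        print(name,
              "clustering:", round(nx.average_clustering(g), 3),
              "assortativity:", round(nx.degree_assortativity_coefficient(g), 3))
    ```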

  11. Human cerebral potentials evoked by moving dynamic random dot stereograms.

    PubMed

    Herpers, M J; Caberg, H B; Mol, J M

    1981-07-01

    In 11 normal healthy human subjects an evoked potential was elicited by moving dynamic random dot stereograms. The random dots were generated by a minicomputer. An average of each of 8 EEG channels of the subjects tested was made. The maximum of the cerebral evoked potentials thus found was localized in the central and parietal region. No response earlier than 130-150 msec after the stimulus could be demonstrated. The influence of fixation, the number of dots provided, an interocular interstimulus interval in the presentation of the dots, and lens accommodation movements on the evoked stereoptic potentials was investigated and discussed. An interocular interstimulus interval (left eye leading) in the presentation of the dots caused an increase in latency of the response much longer than the imposed interstimulus interval itself. It was shown that no accommodation was needed to perceive the depth impression, and to evoke the cerebral response with random dot stereograms. There are indications of an asymmetry between the two hemispheres in the handling of depth perception after 250 msec. The potential distribution of the evoked potentials strongly suggests that they are not generated in the occipital region.

  12. Shocking Behavior: Random Wealth in Antebellum Georgia and Human Capital Across Generations

    PubMed Central

    Bleakley, Hoyt; Ferrie, Joseph

    2017-01-01

    Does the lack of wealth constrain parents’ investments in the human capital of their descendants? We conduct a nearly fifty-year followup of an episode in which such constraints would have been plausibly relaxed by a random allocation of substantial wealth to families. We track descendants of participants in Georgia’s Cherokee Land Lottery of 1832, in which nearly every adult white male in Georgia took part. Winners received close to the median level of wealth – a large financial windfall orthogonal to participants’ underlying characteristics that might have also affected their children’s human capital. Although winners had slightly more children than non-winners, they did not send them to school more. Sons of winners have no better adult outcomes (wealth, income, literacy) than the sons of non-winners, and winners’ grandchildren do not have higher literacy or school attendance than non-winners’ grandchildren. This suggests only a limited role for family financial resources in the formation of human capital in the next generations in this environment and a potentially more important role for other factors that persist through family lines. PMID:28529385

  13. Model-based VQ for image data archival, retrieval and distribution

    NASA Technical Reports Server (NTRS)

    Manohar, Mareboyana; Tilton, James C.

    1995-01-01

    An ideal image compression technique for image data archival, retrieval and distribution would be one with the asymmetrical computational requirements of Vector Quantization (VQ), but without the complications arising from VQ codebooks. Codebook generation and maintenance are stumbling blocks which have limited the use of VQ as a practical image compression algorithm. Model-based VQ (MVQ), a variant of VQ described here, has the computational properties of VQ but does not require explicit codebooks. The codebooks are internally generated using mean-removed error and Human Visual System (HVS) models. The error model assumed is the Laplacian distribution with mean lambda, computed from a sample of the input image. A Laplacian distribution with mean lambda is generated with a uniform random number generator. These random numbers are grouped into vectors. These vectors are further conditioned to make them perceptually meaningful by filtering the DCT coefficients from each vector. The DCT coefficients are filtered by multiplying by a weight matrix that is found to be optimal for human perception. The inverse DCT is performed to produce the conditioned vectors for the codebook. The only image-dependent parameter used in the generation of the codebook is the mean lambda, which is included in the coded file so that the codebook generation process can be repeated for decoding.
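
    A minimal sketch of the codebook-generation steps described above is given below; the HVS weight vector is a hypothetical placeholder (the paper's perceptually optimal weight matrix is not reproduced), and lambda would in practice be estimated from the input image.

    ```python
    # Minimal sketch of the MVQ codebook-generation steps described above; the
    # HVS weight vector used here is a hypothetical placeholder, not the paper's.
    import numpy as np
    from scipy.fft import dct, idct

    rng = np.random.default_rng(0)
    lam = 4.0                      # would be estimated from the input image in MVQ
    dim, codebook_size = 16, 256

    vectors = rng.laplace(loc=0.0, scale=lam, size=(codebook_size, dim))
    weights = 1.0 / (1.0 + np.arange(dim))     # hypothetical HVS-like low-pass weighting

    coeffs = dct(vectors, type=2, norm="ortho", axis=1)                # to DCT domain
    codebook = idct(coeffs * weights, type=2, norm="ortho", axis=1)    # conditioned vectors
    print(codebook.shape)          # (256, 16): generated internally, needs only lam
    ```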

  14. Spatial Analysis of “Crazy Quilts”, a Class of Potentially Random Aesthetic Artefacts

    PubMed Central

    Westphal-Fitch, Gesche; Fitch, W. Tecumseh

    2013-01-01

    Human artefacts in general are highly structured and often display ordering principles such as translational, reflectional or rotational symmetry. In contrast, human artefacts that are intended to appear random and non symmetrical are very rare. Furthermore, many studies show that humans find it extremely difficult to recognize or reproduce truly random patterns or sequences. Here, we attempt to model two-dimensional decorative spatial patterns produced by humans that show no obvious order. “Crazy quilts” represent a historically important style of quilt making that became popular in the 1870s, and lasted about 50 years. Crazy quilts are unusual because unlike most human artefacts, they are specifically intended to appear haphazard and unstructured. We evaluate the degree to which this intention was achieved by using statistical techniques of spatial point pattern analysis to compare crazy quilts with regular quilts from the same region and era and to evaluate the fit of various random distributions to these two quilt classes. We found that the two quilt categories exhibit fundamentally different spatial characteristics: The patch areas of crazy quilts derive from a continuous random distribution, while area distributions of regular quilts consist of Gaussian mixtures. These Gaussian mixtures derive from regular pattern motifs that are repeated and we suggest that such a mixture is a distinctive signature of human-made visual patterns. In contrast, the distribution found in crazy quilts is shared with many other naturally occurring spatial patterns. Centroids of patches in the two quilt classes are spaced differently and in general, crazy quilts but not regular quilts are well-fitted by a random Strauss process. These results indicate that, within the constraints of the quilt format, Victorian quilters indeed achieved their goal of generating random structures. PMID:24066095

  15. Spatial analysis of "crazy quilts", a class of potentially random aesthetic artefacts.

    PubMed

    Westphal-Fitch, Gesche; Fitch, W Tecumseh

    2013-01-01

    Human artefacts in general are highly structured and often display ordering principles such as translational, reflectional or rotational symmetry. In contrast, human artefacts that are intended to appear random and non symmetrical are very rare. Furthermore, many studies show that humans find it extremely difficult to recognize or reproduce truly random patterns or sequences. Here, we attempt to model two-dimensional decorative spatial patterns produced by humans that show no obvious order. "Crazy quilts" represent a historically important style of quilt making that became popular in the 1870s, and lasted about 50 years. Crazy quilts are unusual because unlike most human artefacts, they are specifically intended to appear haphazard and unstructured. We evaluate the degree to which this intention was achieved by using statistical techniques of spatial point pattern analysis to compare crazy quilts with regular quilts from the same region and era and to evaluate the fit of various random distributions to these two quilt classes. We found that the two quilt categories exhibit fundamentally different spatial characteristics: The patch areas of crazy quilts derive from a continuous random distribution, while area distributions of regular quilts consist of Gaussian mixtures. These Gaussian mixtures derive from regular pattern motifs that are repeated and we suggest that such a mixture is a distinctive signature of human-made visual patterns. In contrast, the distribution found in crazy quilts is shared with many other naturally occurring spatial patterns. Centroids of patches in the two quilt classes are spaced differently and in general, crazy quilts but not regular quilts are well-fitted by a random Strauss process. These results indicate that, within the constraints of the quilt format, Victorian quilters indeed achieved their goal of generating random structures.

  16. Human choice among five alternatives when reinforcers decay.

    PubMed

    Rothstein, Jacob B; Jensen, Greg; Neuringer, Allen

    2008-06-01

    Human participants played a computer game in which choices among five alternatives were concurrently reinforced according to dependent random-ratio schedules. "Dependent" indicates that choices to any of the wedges activated the random-number generators governing reinforcers on all five alternatives. Two conditions were compared. In the hold condition, once scheduled, a reinforcer - worth a constant five points - remained available until it was collected. In the decay condition, point values decreased with intervening responses, i.e., rapid collection was differentially reinforced. Slopes of matching functions were higher in the decay than in the hold condition. However, inter-subject variability was high in both conditions.

  17. Employing the Components of the Human Development Index to Drive Resources to Educational Policies

    ERIC Educational Resources Information Center

    Sant'Anna, Annibal Parracho; de Araujo Ribeiro, Rodrigo Otavio; Dutt-Ross, Steven

    2011-01-01

    A new form of composition of the indicators employed to generate the United Nations Human Development Index (HDI) is presented here. This form of composition is based on the assumption that random errors affect the measurement of each indicator. This assumption allows for replacing the vector of evaluations according to each indicator by vectors…

  18. Planckian Information (Ip): A New Measure of Order in Atoms, Enzymes, Cells, Brains, Human Societies, and the Cosmos

    NASA Astrophysics Data System (ADS)

    Ji, Sungchul

    A new mathematical formula referred to as the Planckian distribution equation (PDE) has been found to fit long-tailed histograms generated in various fields of studies, ranging from atomic physics to single-molecule enzymology, cell biology, brain neurobiology, glottometrics, econophysics, and to cosmology. PDE can be derived from a Gaussian-like equation (GLE) by non-linearly transforming its variable, x, while keeping the y coordinate constant. Assuming that GLE represents a random distribution (due to its symmetry), it is possible to define a binary logarithm of the ratio between the areas under the curves of PDE and GLE as a measure of the non-randomness (or order) underlying the biophysicochemical processes generating long-tailed histograms that fit PDE. This new function has been named the Planckian information, IP, which (i) may be a new measure of order that can be applied widely to both natural and human sciences and (ii) can serve as the opposite of the Boltzmann-Gibbs entropy, S, which is a measure of disorder. The possible rationales for the universality of PDE may include (i) the universality of the wave-particle duality embedded in PDE, (ii) the selection of subsets of random processes (thereby breaking the symmetry of GLE) as the basic mechanism of generating order, organization, and function, and (iii) the quantity-quality complementarity as the connection between PDE and Peircean semiotics.

  19. Host-Associated Metagenomics: A Guide to Generating Infectious RNA Viromes

    PubMed Central

    Robert, Catherine; Pascalis, Hervé; Michelle, Caroline; Jardot, Priscilla; Charrel, Rémi; Raoult, Didier; Desnues, Christelle

    2015-01-01

    Background: Metagenomic analyses have been widely used in the last decade to describe viral communities in various environments or to identify the etiology of human, animal, and plant pathologies. Here, we present a simple and standardized protocol that allows for the purification and sequencing of RNA viromes from complex biological samples with an important reduction of host DNA and RNA contaminants, while preserving the infectivity of viral particles. Principal Findings: We evaluated different viral purification steps, random reverse transcriptions and sequence-independent amplifications of a pool of representative RNA viruses. Viruses remained infectious after the purification process. We then validated the protocol by sequencing the RNA virome of human body lice engorged in vitro with artificially contaminated human blood. The full genomes of the most abundant viruses absorbed by the lice during the blood meal were successfully sequenced. Interestingly, random amplifications differed in the genome coverage of segmented RNA viruses. Moreover, the majority of reads were taxonomically identified, and only 7–15% of all reads were classified as “unknown”, depending on the random amplification method. Conclusion: The protocol reported here could easily be applied to generate RNA viral metagenomes from complex biological samples of different origins. Our protocol allows further virological characterizations of the described viral communities because it preserves the infectivity of viral particles and allows for the isolation of viruses. PMID:26431175

  20. Human brain detects short-time nonlinear predictability in the temporal fine structure of deterministic chaotic sounds

    NASA Astrophysics Data System (ADS)

    Itoh, Kosuke; Nakada, Tsutomu

    2013-04-01

    Deterministic nonlinear dynamical processes are ubiquitous in nature. Chaotic sounds generated by such processes may appear irregular and random in waveform, but these sounds are mathematically distinguished from random stochastic sounds in that they contain deterministic short-time predictability in their temporal fine structures. We show that the human brain distinguishes deterministic chaotic sounds from spectrally matched stochastic sounds in neural processing and perception. Deterministic chaotic sounds, even without being attended to, elicited greater cerebral cortical responses than the surrogate control sounds after about 150 ms in latency after sound onset. Listeners also clearly discriminated these sounds in perception. The results support the hypothesis that the human auditory system is sensitive to the subtle short-time predictability embedded in the temporal fine structure of sounds.

  1. Proteasix: a tool for automated and large-scale prediction of proteases involved in naturally occurring peptide generation.

    PubMed

    Klein, Julie; Eales, James; Zürbig, Petra; Vlahou, Antonia; Mischak, Harald; Stevens, Robert

    2013-04-01

    In this study, we have developed Proteasix, an open-source peptide-centric tool that can be used to predict in silico the proteases involved in naturally occurring peptide generation. We developed a curated cleavage site (CS) database, containing 3500 entries about human protease/CS combinations. On top of this database, we built a tool, Proteasix, which allows CS retrieval and protease associations from a list of peptides. To establish the proof of concept of the approach, we used a list of 1388 peptides identified from human urine samples, and compared the prediction to the analysis of 1003 randomly generated amino acid sequences. Metalloprotease activity was predominantly involved in urinary peptide generation, particularly for peptides associated with extracellular matrix remodelling, compared with proteins from other origins. In comparison, random sequences returned almost no results, highlighting the specificity of the prediction. This study provides a tool that can facilitate the linking of identified protein fragments to predicted protease activity, and therefore to presumed mechanisms of disease. Experiments are needed to confirm the in silico hypotheses; nevertheless, this approach may be of great help to better understand molecular mechanisms of disease and to define new biomarkers and therapeutic targets. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Mice with megabase humanization of their immunoglobulin genes generate antibodies as efficiently as normal mice.

    PubMed

    Murphy, Andrew J; Macdonald, Lynn E; Stevens, Sean; Karow, Margaret; Dore, Anthony T; Pobursky, Kevin; Huang, Tammy T; Poueymirou, William T; Esau, Lakeisha; Meola, Melissa; Mikulka, Warren; Krueger, Pamela; Fairhurst, Jeanette; Valenzuela, David M; Papadopoulos, Nicholas; Yancopoulos, George D

    2014-04-08

    Mice genetically engineered to be humanized for their Ig genes allow for human antibody responses within a mouse background (HumAb mice), providing a valuable platform for the generation of fully human therapeutic antibodies. Unfortunately, existing HumAb mice do not have fully functional immune systems, perhaps because of the manner in which their genetic humanization was carried out. Heretofore, HumAb mice have been generated by disrupting the endogenous mouse Ig genes and simultaneously introducing human Ig transgenes at a different and random location; KO-plus-transgenic humanization. As we describe in the companion paper, we attempted to make mice that more efficiently use human variable region segments in their humoral responses by precisely replacing 6 Mb of mouse Ig heavy and kappa light variable region germ-line gene segments with their human counterparts while leaving the mouse constant regions intact, using a unique in situ humanization approach. We reasoned the introduced human variable region gene segments would function indistinguishably in their new genetic location, whereas the retained mouse constant regions would allow for optimal interactions and selection of the resulting antibodies within the mouse environment. We show that these mice, termed VelocImmune mice because they were generated using VelociGene technology, efficiently produce human:mouse hybrid antibodies (that are rapidly convertible to fully human antibodies) and have fully functional humoral immune systems indistinguishable from those of WT mice. The efficiency of the VelocImmune approach is confirmed by the rapid progression of 10 different fully human antibodies into human clinical trials.

  3. Reduction of display artifacts by random sampling

    NASA Technical Reports Server (NTRS)

    Ahumada, A. J., Jr.; Nagel, D. C.; Watson, A. B.; Yellott, J. I., Jr.

    1983-01-01

    The application of random-sampling techniques to remove visible artifacts (such as flicker, moire patterns, and paradoxical motion) introduced in TV-type displays by discrete sequential scanning is discussed and demonstrated. Sequential-scanning artifacts are described; the window of visibility defined in spatiotemporal frequency space by Watson and Ahumada (1982 and 1983) and Watson et al. (1983) is explained; the basic principles of random sampling are reviewed and illustrated by the case of the human retina; and it is proposed that the sampling artifacts can be replaced by random noise, which can then be shifted to frequency-space regions outside the window of visibility. Vertical sequential, single-random-sequence, and continuously renewed random-sequence plotting displays generating 128 points at update rates up to 130 Hz are applied to images of stationary and moving lines, and best results are obtained with the single random sequence for the stationary lines and with the renewed random sequence for the moving lines.

  4. Enzymatic synthesis of random sequences of RNA and RNA analogues by DNA polymerase theta mutants for the generation of aptamer libraries.

    PubMed

    Randrianjatovo-Gbalou, Irina; Rosario, Sandrine; Sismeiro, Odile; Varet, Hugo; Legendre, Rachel; Coppée, Jean-Yves; Huteau, Valérie; Pochet, Sylvie; Delarue, Marc

    2018-05-21

    Nucleic acid aptamers, especially RNA, exhibit valuable advantages compared to protein therapeutics in terms of size, affinity and specificity. However, the synthesis of libraries of large random RNAs is still difficult and expensive. The engineering of polymerases able to directly generate these libraries has the potential to replace the chemical synthesis approach. Here, we start with a DNA polymerase that already displays a significant template-free nucleotidyltransferase activity, human DNA polymerase theta, and we mutate it based on the knowledge of its three-dimensional structure as well as previous mutational studies on members of the same polA family. One mutant exhibited a high tolerance towards ribonucleotides (NTPs) and displayed an efficient ribonucleotidyltransferase activity that resulted in the assembly of long RNA polymers. HPLC analysis and RNA sequencing of the products were used to quantify the incorporation of the four NTPs as a function of initial NTP concentrations and established the randomness of each generated nucleic acid sequence. The same mutant revealed a propensity to accept other modified nucleotides and to extend them in long fragments. Hence, this mutant can deliver random natural and modified RNA polymers libraries ready to use for SELEX, with custom lengths and balanced or unbalanced ratios.

  5. LTE-advanced random access mechanism for M2M communication: A review

    NASA Astrophysics Data System (ADS)

    Mustafa, Rashid; Sarowa, Sandeep; Jaglan, Reena Rathee; Khan, Mohammad Junaid; Agrawal, Sunil

    2016-03-01

    Machine Type Communications (MTC) enables one or more self-sufficient machines to communicate directly with one another without human intervention. MTC applications include smart grid, security, e-Health and intelligent automation systems. To support huge numbers of MTC devices, one of the challenging issues is to provide an efficient way to handle massive access to the network and to minimize network overload. In this article, the different overload control mechanisms for random access are reviewed to avoid congestion on the random access channel (RACH) caused by MTC devices. However, past and present wireless technologies have been engineered for Human-to-Human (H2H) communications, in particular, for transmission of voice. Consequently, Long Term Evolution (LTE)-Advanced, which was engineered with H2H communications in mind, is now expected to play a central role in Machine-to-Machine (M2M) communication as well. Distinct and unique characteristics of M2M communications create challenges different from those in H2H communications. In this article, we investigate the impact of massive numbers of M2M terminals attempting random access to LTE-Advanced all at once. We discuss and review the solutions proposed by the Third Generation Partnership Project (3GPP) to alleviate the overload problem. We then evaluate and compare these solutions, which can effectively eliminate congestion on the random access channel for M2M communications without affecting H2H communications.

  6. At least some errors are randomly generated (Freud was wrong)

    NASA Technical Reports Server (NTRS)

    Sellen, A. J.; Senders, J. W.

    1986-01-01

    An experiment was carried out to expose something about human error generating mechanisms. In the context of the experiment, an error was made when a subject pressed the wrong key on a computer keyboard or pressed no key at all in the time allotted. These might be considered, respectively, errors of substitution and errors of omission. Each of seven subjects saw a sequence of three digital numbers, made an easily learned binary judgement about each, and was to press the appropriate one of two keys. Each session consisted of 1,000 presentations of randomly permuted, fixed numbers broken into 10 blocks of 100. One of two keys should have been pressed within one second of the onset of each stimulus. These data were subjected to statistical analyses in order to probe the nature of the error generating mechanisms. Goodness of fit tests for a Poisson distribution for the number of errors per 50 trial interval and for an exponential distribution of the length of the intervals between errors were carried out. There is evidence for an endogenous mechanism that may best be described as a random error generator. Furthermore, an item analysis of the number of errors produced per stimulus suggests the existence of a second mechanism operating on task driven factors producing exogenous errors. Some errors, at least, are the result of constant probability generating mechanisms with error rate idiosyncratically determined for each subject.
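
    As an illustration of the kind of goodness-of-fit check described above (errors per fixed-size block of trials against a Poisson distribution with the observed mean), the sketch below runs a chi-square test on synthetic counts; the experiment's own data are not reproduced.

    ```python
    # Minimal sketch of a Poisson goodness-of-fit check on errors-per-block counts.
    # The counts below are synthetic stand-ins, not the experiment's data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    errors_per_block = rng.poisson(lam=2.0, size=200)      # e.g. errors per 50-trial block

    mean_rate = errors_per_block.mean()                    # MLE of the Poisson rate
    values, observed = np.unique(errors_per_block, return_counts=True)
    expected = stats.poisson.pmf(values, mean_rate) * len(errors_per_block)
    expected *= observed.sum() / expected.sum()            # renormalise so the totals match

    chi2, p = stats.chisquare(observed, expected, ddof=1)  # 1 df lost to the estimated rate
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}")               # large p: consistent with Poisson
    ```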

  7. Non-random Mis-segregation of Human Chromosomes.

    PubMed

    Worrall, Joseph Thomas; Tamura, Naoka; Mazzagatti, Alice; Shaikh, Nadeem; van Lingen, Tineke; Bakker, Bjorn; Spierings, Diana Carolina Johanna; Vladimirou, Elina; Foijer, Floris; McClelland, Sarah Elizabeth

    2018-06-12

    A common assumption is that human chromosomes carry equal chances of mis-segregation during compromised cell division. Human chromosomes vary in multiple parameters that might generate bias, but technological limitations have precluded a comprehensive analysis of chromosome-specific aneuploidy. Here, by imaging specific centromeres coupled with high-throughput single-cell analysis as well as single-cell sequencing, we show that aneuploidy occurs non-randomly following common treatments to elevate chromosome mis-segregation. Temporary spindle disruption leads to elevated mis-segregation and aneuploidy of a subset of chromosomes, particularly affecting chromosomes 1 and 2. Unexpectedly, we find that a period of mitotic delay weakens centromeric cohesion and promotes chromosome mis-segregation and that chromosomes 1 and 2 are particularly prone to suffer cohesion fatigue. Our findings demonstrate that inherent properties of individual chromosomes can bias chromosome mis-segregation and aneuploidy rates, with implications for studies on aneuploidy in human disease. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.

  8. A Single Mechanism Can Account for Human Perception of Depth in Mixed Correlation Random Dot Stereograms

    PubMed Central

    Cumming, Bruce G.

    2016-01-01

    In order to extract retinal disparity from a visual scene, the brain must match corresponding points in the left and right retinae. This computationally demanding task is known as the stereo correspondence problem. The initial stage of the solution to the correspondence problem is generally thought to consist of a correlation-based computation. However, recent work by Doi et al suggests that human observers can see depth in a class of stimuli where the mean binocular correlation is 0 (half-matched random dot stereograms). Half-matched random dot stereograms are made up of an equal number of correlated and anticorrelated dots, and the binocular energy model—a well-known model of V1 binocular complex cells—fails to signal disparity here. This has led to the proposition that a second, match-based computation must be extracting disparity in these stimuli. Here we show that a straightforward modification to the binocular energy model—adding a point output nonlinearity—is by itself sufficient to produce cells that are disparity-tuned to half-matched random dot stereograms. We then show that a simple decision model using this single mechanism can reproduce psychometric functions generated by human observers, including reduced performance to large disparities and rapidly updating dot patterns. The model makes predictions about how performance should change with dot size in half-matched stereograms and temporal alternation in correlation, which we test in human observers. We conclude that a single correlation-based computation, based directly on already-known properties of V1 neurons, can account for the literature on mixed correlation random dot stereograms. PMID:27196696

  9. Challenging local realism with human choices.

    PubMed

    2018-05-01

    A Bell test is a randomized trial that compares experimental observations against the philosophical worldview of local realism [1], in which the properties of the physical world are independent of our observation of them and no signal travels faster than light. A Bell test requires spatially distributed entanglement, fast and high-efficiency detection and unpredictable measurement settings [2,3]. Although technology can satisfy the first two of these requirements [4-7], the use of physical devices to choose settings in a Bell test involves making assumptions about the physics that one aims to test. Bell himself noted this weakness in using physical setting choices and argued that human 'free will' could be used rigorously to ensure unpredictability in Bell tests [8]. Here we report a set of local-realism tests using human choices, which avoids assumptions about predictability in physics. We recruited about 100,000 human participants to play an online video game that incentivizes fast, sustained input of unpredictable selections and illustrates Bell-test methodology [9]. The participants generated 97,347,490 binary choices, which were directed via a scalable web platform to 12 laboratories on five continents, where 13 experiments tested local realism using photons [5,6], single atoms [7], atomic ensembles [10] and superconducting devices [11]. Over a 12-hour period on 30 November 2016, participants worldwide provided a sustained data flow of over 1,000 bits per second to the experiments, which used different human-generated data to choose each measurement setting. The observed correlations strongly contradict local realism and other realistic positions in bipartite and tripartite [12] scenarios. Project outcomes include closing the 'freedom-of-choice loophole' (the possibility that the setting choices are influenced by 'hidden variables' to correlate with the particle properties [13]), the utilization of video-game methods [14] for rapid collection of human-generated randomness, and the use of networking techniques for global participation in experimental science.

  10. Development of a Humanized Monoclonal Antibody with Therapeutic Potential against West Nile Virus

    PubMed Central

    Oliphant, Theodore; Engle, Michael; Nybakken, Grant E.; Doane, Chris; Johnson, Syd; Huang, Ling; Gorlatov, Sergey; Mehlhop, Erin; Marri, Anantha; Chung, Kyung Min; Ebel, Gregory D.; Kramer, Laura D.; Fremont, Daved H.; Diamond, Michael S.

    2006-01-01

    Neutralization of West Nile virus (WNV) in vivo correlates with the development of an antibody response against the viral envelope (E) protein. Using random mutagenesis and yeast surface display, we defined individual contact residues of 14 newly generated mAbs against domain III of the WNV E protein. MAbs that strongly neutralized WNV localized to a surface patch on the lateral face of domain III. Convalescent antibodies from human patients who had recovered from WNV infection also detected this epitope. One mAb, E16, neutralized 10 different strains in vitro, and demonstrated therapeutic efficacy in mice, even when administered as a single dose 5 d after infection. A humanized version of E16 was generated that retained antigen specificity, avidity, and neutralizing activity. In post-exposure therapeutic trials in mice, a single dose of humanized E16 protected mice against WNV-induced mortality, and thus, may be a viable treatment option against WNV infection in humans. PMID:15852016

  11. The use of crop rotation for mapping soil organic content in farmland

    NASA Astrophysics Data System (ADS)

    Yang, Lin; Song, Min; Zhu, A.-Xing; Qin, Chengzhi

    2017-04-01

    Most current digital soil mapping uses natural environmental covariates. However, human activities have significantly influenced the development of soil properties over the past half-century and have therefore become an important factor affecting soil spatial variability. Many studies have used field experiments to show how soil properties are affected and changed by human activities; however, spatial variation data on human activities have rarely been used as environmental covariates in digital soil mapping. In this paper, we took crop rotation as an example of agricultural activity and explored its effectiveness in characterizing and mapping the spatial variability of soil. The cultivated area of Xuanzhou City and Langxi County in Anhui Province was chosen as the study area. Three main crop rotations, including double-rice, wheat-rice, and oilseed rape-cotton, were observed through field investigation in 2010. The spatial distribution of the three crop rotations in the study area was obtained by multi-phase remote sensing image interpretation using a supervised classification method. A one-way analysis of variance (ANOVA) for topsoil organic content in the three crop rotation groups was performed. The importance of seven natural environmental covariates, crop rotation, land use and NDVI was calculated using the variable importance criterion of Random Forest. Different combinations of environmental covariates were selected according to the importance rankings for predicting SOC using Random Forest and the Soil Landscape Inference Model (SOLIM). Cross-validation was used to evaluate the mapping accuracies. The results showed that there were significant differences in topsoil organic content among the three crop rotation groups. Crop rotation is more important than parent material, land use or NDVI according to the importance ranking calculated by Random Forest. In addition, crop rotation improved the mapping accuracy, especially for the flat cultivated area. This study demonstrates the usefulness of human activity data in digital soil mapping and thus indicates the need to include human activity factors in digital soil mapping studies.
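
    A minimal sketch of the Random Forest variable-importance step described above, using scikit-learn; the covariate names and synthetic data are placeholders (crop rotation, in particular, is a categorical variable in the study but a numeric stand-in here), not the study's soil dataset.

    ```python
    # Minimal sketch of the Random Forest variable-importance step; the covariates
    # and data below are synthetic placeholders, not the study's soil dataset.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    covariates = ["elevation", "slope", "ndvi", "land_use", "crop_rotation"]  # illustrative
    X = pd.DataFrame(rng.normal(size=(300, len(covariates))), columns=covariates)
    # Synthetic SOC target in which crop rotation and NDVI carry most of the signal
    soc = 2.0 * X["crop_rotation"] + 0.5 * X["ndvi"] + rng.normal(scale=0.5, size=300)

    rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, soc)
    for name, imp in sorted(zip(covariates, rf.feature_importances_), key=lambda t: -t[1]):
        print(f"{name:>13s}  importance = {imp:.3f}")
    ```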

  12. Deficiencies of effectiveness of intervention studies in veterinary medicine: a cross-sectional survey of ten leading veterinary and medical journals

    PubMed Central

    Meursinge Reynders, Reint

    2016-01-01

    The validity of studies that assess the effectiveness of an intervention (EoI) depends on variables such as the type of study design, the quality of their methodology, and the participants enrolled. Five leading veterinary journals and 5 leading human medical journals were hand-searched for EoI studies for the year 2013. We assessed (1) the prevalence of randomized controlled trials (RCTs) among EoI studies, (2) the type of participants enrolled, and (3) the methodological quality of the selected studies. Of 1707 eligible articles, 590 were EoI articles and 435 RCTs. Random allocation to the intervention was performed in 52% (114/219; 95%CI:45.2–58.8%) of veterinary EoI articles, against 87% (321/371; 82.5–89.7%) of human EoI articles (adjusted OR:9.2; 3.4–24.8). Veterinary RCTs were smaller (median: 26 animals versus 465 humans) and less likely to enroll real patients, compared with human RCTs (OR:331; 45–2441). Only 2% of the veterinary RCTs, versus 77% of the human RCTs, reported power calculations, primary outcomes, random sequence generation, allocation concealment and estimation methods. Currently, internal and external validity of veterinary EoI studies is limited compared to human medical ones. To address these issues, veterinary interventional research needs to improve its methodology, increase the number of published RCTs and enroll real clinical patients. PMID:26835187

  13. Analysis of Mycobacterium avium subsp. paratuberculosis mutant libraries reveals loci-dependent transcription biases and strategies to novel mutant discovery

    USDA-ARS?s Scientific Manuscript database

    Mycobacterium avium subsp. paratuberculosis (MAP) is the etiologic agent of Johne’s disease in ruminants and it has been implicated as a cause of Crohn’s disease in humans. The generation of comprehensive random mutant banks by transposon mutagenesis is a fundamental wide genomic technology utilized...

  14. Self-correcting random number generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Pooser, Raphael C.

    2016-09-06

    A system and method for generating random numbers. The system may include a random number generator (RNG), such as a quantum random number generator (QRNG) configured to self-correct or adapt in order to substantially achieve randomness from the output of the RNG. By adapting, the RNG may generate a random number that may be considered random regardless of whether the random number itself is tested as such. As an example, the RNG may include components to monitor one or more characteristics of the RNG during operation, and may use the monitored characteristics as a basis for adapting, or self-correcting, to provide a random number according to one or more performance criteria.
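
    The patent abstract does not specify the monitoring or adaptation method. As one illustration of the general idea only, the sketch below monitors the bit bias of a raw source and switches on classic von Neumann debiasing when the bias drifts past a threshold; the source, threshold and correction scheme are all assumptions, not the patented design.

    ```python
    # Illustration only (not the patented method): monitor a raw bit source's bias
    # and apply von Neumann debiasing when the bias drifts past a threshold.
    import random

    def raw_source(n, p_one=0.6):
        """Stand-in for a biased physical source."""
        return [1 if random.random() < p_one else 0 for _ in range(n)]

    def von_neumann(bits):
        """Classic debiaser: 01 -> 0, 10 -> 1, 00/11 discarded."""
        return [a for a, b in zip(bits[::2], bits[1::2]) if a != b]

    raw = raw_source(10_000)
    bias = abs(sum(raw) / len(raw) - 0.5)                   # monitored characteristic
    corrected = von_neumann(raw) if bias > 0.01 else raw    # hypothetical threshold
    print(f"raw bias = {bias:.3f}, "
          f"corrected bias = {abs(sum(corrected) / len(corrected) - 0.5):.3f}")
    ```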

  15. Real-time fast physical random number generator with a photonic integrated circuit.

    PubMed

    Ugajin, Kazusa; Terashima, Yuta; Iwakawa, Kento; Uchida, Atsushi; Harayama, Takahisa; Yoshimura, Kazuyuki; Inubushi, Masanobu

    2017-03-20

    Random number generators are essential for applications in information security and numerical simulations. Most optical-chaos-based random number generators produce random bit sequences by offline post-processing with large optical components. We demonstrate a real-time hardware implementation of a fast physical random number generator with a photonic integrated circuit and a field programmable gate array (FPGA) electronic board. We generate 1-Tbit random bit sequences and evaluate their statistical randomness using NIST Special Publication 800-22 and TestU01. All of the BigCrush tests in TestU01 are passed using 410-Gbit random bit sequences. A maximum real-time generation rate of 21.1 Gb/s is achieved for random bit sequences in binary format stored in a computer, which can be directly used for applications involving secret keys in cryptography and random seeds in large-scale numerical simulations.
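
    NIST SP 800-22 is a large battery of tests; as an illustration of the kind of statistical check applied to such bit sequences, the sketch below implements its simplest member, the monobit frequency test, on software-generated bits.

    ```python
    # Minimal sketch of the NIST SP 800-22 monobit (frequency) test: the fraction
    # of ones in a random bit sequence should be close to one half.
    import math
    import random

    def monobit_p_value(bits):
        s = sum(2 * b - 1 for b in bits)            # map 0/1 -> -1/+1 and sum
        s_obs = abs(s) / math.sqrt(len(bits))
        return math.erfc(s_obs / math.sqrt(2))      # p >= 0.01 passes at NIST's default level

    bits = [random.getrandbits(1) for _ in range(1_000_000)]
    print(f"monobit p-value: {monobit_p_value(bits):.4f}")
    ```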

  16. Cooperation and charity in spatial public goods game under different strategy update rules

    NASA Astrophysics Data System (ADS)

    Li, Yixiao; Jin, Xiaogang; Su, Xianchuang; Kong, Fansheng; Peng, Chengbin

    2010-03-01

    Human cooperation can be influenced by other human behaviors, and recent years have witnessed a flourishing of studies on the coevolution of cooperation and punishment, yet the common behavior of charity is seldom considered in game-theoretical models. In this article, we investigate the coevolution of altruistic cooperation and egalitarian charity in the spatial public goods game, by considering charity as the behavior of reducing inter-individual payoff differences. Our model is that, in each generation of the evolution, individuals play games first and accumulate payoff benefits, and then each egalitarian makes a charity donation by payoff transfer in its neighborhood. To study the individual-level evolutionary dynamics, we adopt different strategy update rules and investigate their effects on charity and cooperation. These rules can be classified into two global rules: random selection, in which individuals randomly update strategies, and threshold selection, where only those with payoffs below a threshold update strategies. Simulation results show that random selection enhances the cooperation level, while threshold selection lowers the threshold of the multiplication factor needed to maintain cooperation. When charity is considered, it is incapable of promoting cooperation under random selection, whereas it promotes cooperation under threshold selection. Interestingly, the evolution of charity strongly depends on the dispersion of payoff acquisitions of the population, which agrees with previous results. Our work may shed light on understanding human egalitarianism.

  17. Quantum random number generator

    DOEpatents

    Pooser, Raphael C.

    2016-05-10

    A quantum random number generator (QRNG) and a photon generator for a QRNG are provided. The photon generator may be operated in a spontaneous mode below a lasing threshold to emit photons. Photons emitted from the photon generator may have at least one random characteristic, which may be monitored by the QRNG to generate a random number. In one embodiment, the photon generator may include a photon emitter and an amplifier coupled to the photon emitter. The amplifier may enable the photon generator to be used in the QRNG without introducing significant bias in the random number and may enable multiplexing of multiple random numbers. The amplifier may also desensitize the photon generator to fluctuations in power supplied thereto while operating in the spontaneous mode. In one embodiment, the photon emitter and amplifier may be a tapered diode amplifier.

  18. Quantum random number generation

    DOE PAGES

    Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; ...

    2016-06-28

    Quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness -- coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness on the device and the random number generation speed.

  19. Fractal attractors in economic growth models with random pollution externalities

    NASA Astrophysics Data System (ADS)

    La Torre, Davide; Marsiglio, Simone; Privileggi, Fabio

    2018-05-01

    We analyze a discrete time two-sector economic growth model where the production technologies in the final and human capital sectors are affected by random shocks both directly (via productivity and factor shares) and indirectly (via a pollution externality). We determine the optimal dynamics in the decentralized economy and show how these dynamics can be described in terms of a two-dimensional affine iterated function system with probability. This allows us to identify a suitable parameter configuration capable of generating exactly the classical Barnsley's fern as the attractor of the log-linearized optimal dynamical system.
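
    The classical Barnsley fern mentioned above is generated by a two-dimensional affine iterated function system with probabilities. A standard chaos-game sketch is given below for reference; the paper's own log-linearized economic IFS coefficients are not reproduced.

    ```python
    # Standard chaos-game construction of Barnsley's fern: a two-dimensional affine
    # iterated function system with probabilities (not the paper's economic IFS).
    import random

    # (a, b, c, d, e, f, probability) for x' = a*x + b*y + e, y' = c*x + d*y + f
    MAPS = [
        (0.00,  0.00,  0.00, 0.16, 0.0, 0.00, 0.01),
        (0.85,  0.04, -0.04, 0.85, 0.0, 1.60, 0.85),
        (0.20, -0.26,  0.23, 0.22, 0.0, 1.60, 0.07),
        (-0.15, 0.28,  0.26, 0.24, 0.0, 0.44, 0.07),
    ]

    def barnsley_fern(n_points=100_000, seed=0):
        random.seed(seed)
        x, y, points = 0.0, 0.0, []
        for _ in range(n_points):
            a, b, c, d, e, f, _p = random.choices(MAPS, weights=[m[-1] for m in MAPS])[0]
            x, y = a * x + b * y + e, c * x + d * y + f
            points.append((x, y))
        return points   # the point cloud converges to the fern-shaped attractor

    pts = barnsley_fern()
    print(len(pts), pts[-1])
    ```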

  20. Who is the research subject in cluster randomized trials in health research?

    PubMed Central

    2011-01-01

    This article is part of a series of papers examining ethical issues in cluster randomized trials (CRTs) in health research. In the introductory paper in this series, we set out six areas of inquiry that must be addressed if the CRT is to be set on a firm ethical foundation. This paper addresses the first of the questions posed, namely, who is the research subject in a CRT in health research? The identification of human research subjects is logically prior to the application of protections as set out in research ethics and regulation. Aspects of CRT design, including the fact that in a single study the units of randomization, experimentation, and observation may differ, complicate the identification of human research subjects. But the proper identification of human research subjects is important if they are to be protected from harm and exploitation, and if research ethics committees are to review CRTs efficiently. We examine the research ethics literature and international regulations to identify the core features of human research subjects, and then unify these features under a single, comprehensive definition of human research subject. We define a human research subject as any person whose interests may be compromised as a result of interventions in a research study. Individuals are only human research subjects in CRTs if: (1) they are directly intervened upon by investigators; (2) they interact with investigators; (3) they are deliberately intervened upon via a manipulation of their environment that may compromise their interests; or (4) their identifiable private information is used to generate data. Individuals who are indirectly affected by CRT study interventions, including patients of healthcare providers participating in knowledge translation CRTs, are not human research subjects unless at least one of these conditions is met. PMID:21791064

  1. Random and independent sampling of endogenous tryptic peptides from normal human EDTA plasma by liquid chromatography micro electrospray ionization and tandem mass spectrometry.

    PubMed

    Dufresne, Jaimie; Florentinus-Mefailoski, Angelique; Ajambo, Juliet; Ferwa, Ammara; Bowden, Peter; Marshall, John

    2017-01-01

    Normal human EDTA plasma samples were collected on ice, processed ice cold, and stored in a freezer at -80 °C prior to experiments. Plasma test samples from the -80 °C freezer were thawed on ice or intentionally warmed to room temperature. Protein content was measured by CBBR binding, and the release of alcohol-soluble amines by the Cd ninhydrin assay. Plasma peptides released over time were collected over C18 for random and independent sampling by liquid chromatography micro electrospray ionization and tandem mass spectrometry (LC-ESI-MS/MS) and correlated with X!TANDEM. Correlation of fully tryptic peptides in X!TANDEM returned a similar set of proteins to, but was more computationally efficient than, "no enzyme" correlations. Plasma samples maintained on ice, or on ice with a cocktail of protease inhibitors, showed lower background amounts of plasma peptides than samples incubated at room temperature. Regression analysis indicated that warming plasma to room temperature, versus keeping it ice cold, resulted in a roughly twofold increase in the frequency of peptide identification over hours to days of incubation at room temperature. The type I error rate of protein identification from the X!TANDEM algorithm was estimated to be low compared with a null model of computer-generated random MS/MS spectra. The peptides of human plasma were identified and quantified with low error rates by random and independent sampling, revealing thousands of peptides from hundreds of human plasma proteins via endogenous tryptic peptides.

  2. Low-Energy Truly Random Number Generation with Superparamagnetic Tunnel Junctions for Unconventional Computing

    NASA Astrophysics Data System (ADS)

    Vodenicarevic, D.; Locatelli, N.; Mizrahi, A.; Friedman, J. S.; Vincent, A. F.; Romera, M.; Fukushima, A.; Yakushiji, K.; Kubota, H.; Yuasa, S.; Tiwari, S.; Grollier, J.; Querlioz, D.

    2017-11-01

    Low-energy random number generation is critical for many emerging computing schemes proposed to complement or replace von Neumann architectures. However, current random number generators are always associated with an energy cost that is prohibitive for these computing schemes. We introduce random number bit generation based on specific nanodevices: superparamagnetic tunnel junctions. We experimentally demonstrate high-quality random bit generation that represents an orders-of-magnitude improvement in energy efficiency over current solutions. We show that the random generation speed improves with nanodevice scaling, and we investigate the impact of temperature, magnetic field, and cross talk. Finally, we show how alternative computing schemes can be implemented using superparamagnetic tunnel junctions as random number generators. These results open the way for fabricating efficient hardware computing devices leveraging stochasticity, and they highlight an alternative use for emerging nanodevices.

  3. Generation and dietary modulation of anti-inflammatory electrophilic omega-3 fatty acid derivatives.

    PubMed

    Cipollina, Chiara; Salvatore, Sonia R; Muldoon, Matthew F; Freeman, Bruce A; Schopfer, Francisco J

    2014-01-01

    Dietary ω-3 polyunsaturated fatty acids (PUFAs) decrease cardiovascular risk via suppression of inflammation. The generation of electrophilic α,β-unsaturated ketone derivatives of the ω-3 PUFAs docosahexaenoic acid (DHA) and docosapentaenoic acid (DPA) in activated human macrophages is catalyzed by cyclooxygenase-2 (Cox-2). These derivatives are potent pleiotropic anti-inflammatory signaling mediators that act via mechanisms including the activation of Nrf2-dependent phase 2 gene expression and suppression of pro-inflammatory NF-κB-driven gene expression. Herein, the endogenous generation of ω-3 PUFA electrophilic ketone derivatives and their hydroxy precursors was evaluated in human neutrophils. In addition, their dietary modulation was assessed through a randomized clinical trial. Endogenous generation of electrophilic omega-3 PUFAs and their hydroxy precursors was evaluated by mass spectrometry in neutrophils isolated from healthy subjects, both at baseline and upon stimulation with calcium ionophore. For the clinical trial, participants were healthy adults 30-55 years of age with a reported EPA+DHA consumption of ≤300 mg/day randomly assigned to parallel groups receiving daily oil capsule supplements for a period of 4 months containing either 1.4 g of EPA+DHA (active condition, n = 24) or identical appearing soybean oil (control condition, n = 21). Participants and laboratory technicians remained blinded to treatment assignments. 5-lipoxygenase-dependent endogenous generation of 7-oxo-DHA, 7-oxo-DPA and 5-oxo-EPA and their hydroxy precursors is reported in human neutrophils stimulated with calcium ionophore and phorbol 12-myristate 13-acetate (PMA). Dietary EPA+DHA supplementation significantly increased the formation of 7-oxo-DHA and 5-oxo-EPA, with no significant modulation of arachidonic acid (AA) metabolite levels. The endogenous detection of these electrophilic ω-3 fatty acid ketone derivatives supports the precept that the benefit of ω-3 PUFA-rich diets can be attributed to the generation of electrophilic oxygenated metabolites that transduce anti-inflammatory actions rather than the suppression of pro-inflammatory AA metabolites. ClinicalTrials.gov NCT00663871.

  4. Animal research as a basis for clinical trials.

    PubMed

    Faggion, Clovis M

    2015-04-01

    Animal experiments are critical for the development of new human therapeutics because they provide mechanistic information, as well as important information on efficacy and safety. Some evidence suggests that authors of animal research in dentistry do not observe important methodological issues when planning animal experiments, for example sample-size calculation. Low-quality animal research directly interferes with development of the research process in which multiple levels of research are interconnected. For example, high-quality animal experiments generate sound information for the further planning and development of randomized controlled trials in humans. These randomized controlled trials are the main source for the development of systematic reviews and meta-analyses, which will generate the best evidence for the development of clinical guidelines. Therefore, adequate planning of animal research is a sine qua non condition for increasing efficacy and efficiency in research. Ethical concerns arise when animal research is not performed with high standards. This Focus article presents the latest information on the standards of animal research in dentistry, more precisely in the field of implant dentistry. Issues on precision and risk of bias are discussed, and strategies to reduce risk of bias in animal research are reported. © 2015 Eur J Oral Sci.

  5. Pseudo-Random Number Generator Based on Coupled Map Lattices

    NASA Astrophysics Data System (ADS)

    Lü, Huaping; Wang, Shihong; Hu, Gang

    A one-way coupled chaotic map lattice is used for generating pseudo-random numbers. It is shown that with suitable cooperative applications of both chaotic and conventional approaches, the output of the spatiotemporally chaotic system can easily meet the practical requirements of random numbers, i.e., excellent random statistical properties, long periodicity of computer realizations, and fast speed of random number generation. This pseudo-random number generator can be used as an ideal synchronous and self-synchronizing stream cipher system for secure communications.
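
    A minimal sketch of the general idea of drawing pseudo-random bits from a one-way coupled chaotic map lattice is given below. The lattice size, coupling strength, logistic-map parameter, and bit-extraction rule are illustrative assumptions and not the construction used by the authors.

      # Illustrative one-way coupled logistic-map lattice used as a pseudo-random bit source.
      # All parameters (lattice size, coupling strength eps, map parameter r, thresholding)
      # are assumptions for this sketch, not the cited generator's design.

      def logistic(x, r=3.99):
          return r * x * (1.0 - x)

      def cml_bits(n_bits, size=8, eps=0.95, seed=0.123456789):
          """One-way coupling: each site is driven by its left neighbour (cyclically)."""
          state = [(seed + 0.1 * i) % 1.0 for i in range(size)]
          bits = []
          while len(bits) < n_bits:
              new = []
              for i in range(size):
                  drive = state[(i - 1) % size]
                  new.append((1 - eps) * logistic(state[i]) + eps * logistic(drive))
              state = new
              bits.append(1 if state[-1] > 0.5 else 0)   # one bit per iteration, by thresholding
          return bits

      if __name__ == "__main__":
          print("".join(str(b) for b in cml_bits(64)))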

  6. Generating random numbers by means of nonlinear dynamic systems

    NASA Astrophysics Data System (ADS)

    Zang, Jiaqi; Hu, Haojie; Zhong, Juhua; Luo, Duanbin; Fang, Yi

    2018-07-01

    To introduce students to the randomness of a physical process, a chaotic pendulum experiment was set up at the undergraduate level in the physics department of East China University of Science and Technology (ECUST). It was shown that chaotic motion could be initiated by adjusting the operation of the pendulum. Using the angular-displacement data of the chaotic motion, random binary numerical arrays can be generated. To check the randomness of the generated arrays, the NIST Special Publication 800-20 test method was adopted. All the random arrays generated from the chaotic motion passed the validity criteria, and some exceeded the quality of pseudo-random numbers generated by a computer. The experiments demonstrate that a chaotic pendulum can serve as an efficient mechanical facility for generating random numbers and can be used in teaching random motion to students.

  7. Extracting random numbers from quantum tunnelling through a single diode.

    PubMed

    Bernardo-Gavito, Ramón; Bagci, Ibrahim Ethem; Roberts, Jonathan; Sexton, James; Astbury, Benjamin; Shokeir, Hamzah; McGrath, Thomas; Noori, Yasir J; Woodhead, Christopher S; Missous, Mohamed; Roedig, Utz; Young, Robert J

    2017-12-19

    Random number generation is crucial in many aspects of everyday life, as online security and privacy depend ultimately on the quality of random numbers. Many current implementations are based on pseudo-random number generators, but information security requires true random numbers for sensitive applications like key generation in banking, defence or even social media. True random number generators are systems whose outputs cannot be determined, even if their internal structure and response history are known. Sources of quantum noise are thus ideal for this application due to their intrinsic uncertainty. In this work, we propose using resonant tunnelling diodes as practical true random number generators based on a quantum mechanical effect. The output of the proposed devices can be directly used as a random stream of bits or can be further distilled using randomness extraction algorithms, depending on the application.
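
    The abstract notes that the raw output of such a device can be further distilled with a randomness extraction algorithm. The sketch below shows one standard choice, von Neumann debiasing, purely as an example of this post-processing step; it is not claimed to be the extractor used by the authors.

      import random

      # Von Neumann debiasing: read bits in pairs, map "01" -> 0 and "10" -> 1, discard "00"/"11".
      # This removes bias from independent but biased bits at the cost of throughput; it is shown
      # only as a generic example of randomness extraction.

      def von_neumann_extract(raw_bits):
          out = []
          for b1, b2 in zip(raw_bits[0::2], raw_bits[1::2]):
              if b1 != b2:
                  out.append(b1)        # keep the first bit of each unequal pair
          return out

      if __name__ == "__main__":
          rng = random.Random(1)
          biased = [1 if rng.random() < 0.7 else 0 for _ in range(10_000)]   # simulated biased source
          clean = von_neumann_extract(biased)
          print(f"kept {len(clean)} of {len(biased)} bits, ones fraction = {sum(clean) / len(clean):.3f}")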

  8. Generating and using truly random quantum states in Mathematica

    NASA Astrophysics Data System (ADS)

    Miszczak, Jarosław Adam

    2012-01-01

    The problem of generating random quantum states is of great interest from the quantum information theory point of view. In this paper we present a package for the Mathematica computing system harnessing a specific piece of hardware, namely the Quantis quantum random number generator (QRNG), for investigating statistical properties of quantum states. The described package implements a number of functions for generating random states, which use the Quantis QRNG as a source of randomness. It also provides procedures which can be used in simulations not related directly to quantum information processing.
    Program summary:
    Program title: TRQS
    Catalogue identifier: AEKA_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKA_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 7924
    No. of bytes in distributed program, including test data, etc.: 88 651
    Distribution format: tar.gz
    Programming language: Mathematica, C
    Computer: Any computer supporting a recent version of Mathematica; requires a Quantis quantum random number generator (QRNG, http://www.idquantique.com/true-random-number-generator/products-overview.html)
    Operating system: Any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit)
    RAM: Case dependent
    Classification: 4.15
    Nature of problem: Generation of random density matrices.
    Solution method: Use of a physical quantum random number generator.
    Running time: Generating 100 random numbers takes about 1 second; generating 1000 random density matrices takes more than a minute.

  9. Neural Mechanism for Stochastic Behavior During a Competitive Game

    PubMed Central

    Soltani, Alireza; Lee, Daeyeol; Wang, Xiao-Jing

    2006-01-01

    Previous studies have shown that non-human primates can generate highly stochastic choice behavior, especially when this is required during a competitive interaction with another agent. To understand the neural mechanism of such dynamic choice behavior, we propose a biologically plausible model of decision making endowed with synaptic plasticity that follows a reward-dependent stochastic Hebbian learning rule. This model constitutes a biophysical implementation of reinforcement learning, and it reproduces salient features of behavioral data from an experiment with monkeys playing a matching pennies game. Due to interaction with an opponent and learning dynamics, the model generates quasi-random behavior robustly in spite of intrinsic biases. Furthermore, non-random choice behavior can also emerge when the model plays against a non-interactive opponent, as observed in the monkey experiment. Finally, when combined with a meta-learning algorithm, our model accounts for the slow drift in the animal’s strategy based on a process of reward maximization. PMID:17015181

  10. Deterministic binary vectors for efficient automated indexing of MEDLINE/PubMed abstracts.

    PubMed

    Wahle, Manuel; Widdows, Dominic; Herskovic, Jorge R; Bernstam, Elmer V; Cohen, Trevor

    2012-01-01

    The need to maintain accessibility of the biomedical literature has led to development of methods to assist human indexers by recommending index terms for newly encountered articles. Given the rapid expansion of this literature, it is essential that these methods be scalable. Document vector representations are commonly used for automated indexing, and Random Indexing (RI) provides the means to generate them efficiently. However, RI is difficult to implement in real-world indexing systems, as (1) efficient nearest-neighbor search requires retaining all document vectors in RAM, and (2) it is necessary to maintain a store of randomly generated term vectors to index future documents. Motivated by these concerns, this paper documents the development and evaluation of a deterministic binary variant of RI. The increased capacity demonstrated by binary vectors has implications for information retrieval, and the elimination of the need to retain term vectors facilitates distributed implementations, enhancing the scalability of RI.

  11. Deterministic Binary Vectors for Efficient Automated Indexing of MEDLINE/PubMed Abstracts

    PubMed Central

    Wahle, Manuel; Widdows, Dominic; Herskovic, Jorge R.; Bernstam, Elmer V.; Cohen, Trevor

    2012-01-01

    The need to maintain accessibility of the biomedical literature has led to development of methods to assist human indexers by recommending index terms for newly encountered articles. Given the rapid expansion of this literature, it is essential that these methods be scalable. Document vector representations are commonly used for automated indexing, and Random Indexing (RI) provides the means to generate them efficiently. However, RI is difficult to implement in real-world indexing systems, as (1) efficient nearest-neighbor search requires retaining all document vectors in RAM, and (2) it is necessary to maintain a store of randomly generated term vectors to index future documents. Motivated by these concerns, this paper documents the development and evaluation of a deterministic binary variant of RI. The increased capacity demonstrated by binary vectors has implications for information retrieval, and the elimination of the need to retain term vectors facilitates distributed implementations, enhancing the scalability of RI. PMID:23304369

  12. Quantum random number generation for loophole-free Bell tests

    NASA Astrophysics Data System (ADS)

    Mitchell, Morgan; Abellan, Carlos; Amaya, Waldimar

    2015-05-01

    We describe the generation of quantum random numbers at multi-Gbps rates, combined with real-time randomness extraction, to give very high purity random numbers based on quantum events at most tens of ns in the past. The system satisfies the stringent requirements of quantum non-locality tests that aim to close the timing loophole. We describe the generation mechanism using spontaneous-emission-driven phase diffusion in a semiconductor laser, digitization, and extraction by parity calculation using multi-GHz logic chips. We pay special attention to experimental proof of the quality of the random numbers and analysis of the randomness extraction. In contrast to widely-used models of randomness generators in the computer science literature, we argue that randomness generation by spontaneous emission can be extracted from a single source.

  13. Combined glyco- and protein-Fc engineering simultaneously enhance cytotoxicity and half-life of a therapeutic antibody.

    PubMed

    Monnet, Céline; Jorieux, Sylvie; Souyris, Nathalie; Zaki, Ouafa; Jacquet, Alexandra; Fournier, Nathalie; Crozet, Fabien; de Romeuf, Christophe; Bouayadi, Khalil; Urbain, Rémi; Behrens, Christian K; Mondon, Philippe; Fontayne, Alexandre

    2014-01-01

    While glyco-engineered monoclonal antibodies (mAbs) with improved antibody-dependent cell-mediated cytotoxicity (ADCC) are reaching the market, extensive efforts have also been made to improve their pharmacokinetic properties to generate biologically superior molecules. Most therapeutic mAbs are human or humanized IgG molecules whose half-life is dependent on the neonatal Fc receptor FcRn. FcRn reduces IgG catabolism by binding to the Fc domain of endocytosed IgG in acidic lysosomal compartments, allowing them to be recycled into the blood. Fc-engineered mAbs with increased FcRn affinity resulted in longer in vivo half-life in animal models, but also in healthy humans. These Fc-engineered mAbs were obtained by alanine scanning, directed mutagenesis or in silico approach of the FcRn binding site. In our approach, we applied a random mutagenesis technology (MutaGen™) to generate mutations evenly distributed over the whole Fc sequence of human IgG1. IgG variants with improved FcRn-binding were then isolated from these Fc-libraries using a pH-dependent phage display selection process. Two successive rounds of mutagenesis and selection were performed to identify several mutations that dramatically improve FcRn binding. Notably, many of these mutations were unpredictable by rational design as they were located distantly from the FcRn binding site, validating our random molecular approach. When produced on the EMABling(®) platform allowing effector function increase, our IgG variants retained both higher ADCC and higher FcRn binding. Moreover, these IgG variants exhibited longer half-life in human FcRn transgenic mice. These results clearly demonstrate that glyco-engineering to improve cytotoxicity and protein-engineering to increase half-life can be combined to further optimize therapeutic mAbs.

  14. CrowdPhase: crowdsourcing the phase problem

    PubMed Central

    Jorda, Julien; Sawaya, Michael R.; Yeates, Todd O.

    2014-01-01

    The human mind innately excels at some complex tasks that are difficult to solve using computers alone. For complex problems amenable to parallelization, strategies can be developed to exploit human intelligence in a collective form: such approaches are sometimes referred to as ‘crowdsourcing’. Here, a first attempt at a crowdsourced approach for low-resolution ab initio phasing in macromolecular crystallography is proposed. A collaborative online game named CrowdPhase was designed, which relies on a human-powered genetic algorithm, where players control the selection mechanism during the evolutionary process. The algorithm starts from a population of ‘individuals’, each with a random genetic makeup, in this case a map prepared from a random set of phases, and tries to cause the population to evolve towards individuals with better phases based on Darwinian survival of the fittest. Players apply their pattern-recognition capabilities to evaluate the electron-density maps generated from these sets of phases and to select the fittest individuals. A user-friendly interface, a training stage and a competitive scoring system foster a network of well trained players who can guide the genetic algorithm towards better solutions from generation to generation via gameplay. CrowdPhase was applied to two synthetic low-resolution phasing puzzles and it was shown that players could successfully obtain phase sets in the 30° phase error range and corresponding molecular envelopes showing agreement with the low-resolution models. The successful preliminary studies suggest that with further development the crowdsourcing approach could fill a gap in current crystallographic methods by making it possible to extract meaningful information in cases where limited resolution might otherwise prevent initial phasing. PMID:24914965

  15. An On-Demand Optical Quantum Random Number Generator with In-Future Action and Ultra-Fast Response

    PubMed Central

    Stipčević, Mario; Ursin, Rupert

    2015-01-01

    Random numbers are essential for our modern information-based society, e.g., in cryptography. Unlike frequently used pseudo-random generators, physical random number generators do not depend on complex algorithms but rather on a physical process to provide true randomness. Quantum random number generators (QRNG) rely on a process which, even in principle, can be described only by a probabilistic theory. Here we present a conceptually simple implementation, which offers 100% efficiency of producing a random bit upon request and simultaneously exhibits an ultra-low latency. A careful technical and statistical analysis demonstrates its robustness against imperfections of the actually implemented technology and enables quick estimation of the randomness of very long sequences. Generated random numbers pass standard statistical tests without any post-processing. The setup described, as well as the theory presented here, demonstrate the maturity and overall understanding of the technology. PMID:26057576

  16. Employing online quantum random number generators for generating truly random quantum states in Mathematica

    NASA Astrophysics Data System (ADS)

    Miszczak, Jarosław Adam

    2013-01-01

    The presented package for the Mathematica computing system allows the harnessing of quantum random number generators (QRNG) for investigating the statistical properties of quantum states. The described package implements a number of functions for generating random states. The new version of the package adds the ability to use the on-line quantum random number generator service and implements new functions for retrieving lists of random numbers. Thanks to the introduced improvements, the new version provides faster access to high-quality sources of random numbers and can be used in simulations requiring large amounts of random data.
    New version program summary:
    Program title: TRQS
    Catalogue identifier: AEKA_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKA_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 18 134
    No. of bytes in distributed program, including test data, etc.: 2 520 49
    Distribution format: tar.gz
    Programming language: Mathematica, C.
    Computer: Any computer supporting Mathematica in version 7 or higher.
    Operating system: Any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit).
    RAM: Case-dependent
    Supplementary material: Fig. 1 mentioned below can be downloaded.
    Classification: 4.15.
    External routines: Quantis software library (http://www.idquantique.com/support/quantis-trng.html)
    Catalogue identifier of previous version: AEKA_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 118
    Does the new version supersede the previous version?: Yes
    Nature of problem: Generation of random density matrices and utilization of high-quality random numbers for the purpose of computer simulation.
    Solution method: Use of a physical quantum random number generator and an on-line service providing access to the source of true random numbers generated by a quantum random number generator.
    Reasons for new version: Added support for the high-speed on-line quantum random number generator and improved methods for retrieving lists of random numbers.
    Summary of revisions: The presented version provides two significant improvements. The first is the ability to use the on-line Quantum Random Number Generation service developed by PicoQuant GmbH and the Nano-Optics groups at the Department of Physics of Humboldt University. The on-line service supported in version 2.0 of the TRQS package provides faster access to true randomness sources constructed using the laws of quantum physics. The service is freely available at https://qrng.physik.hu-berlin.de/. The use of this service allows using the presented package without the need for a physical quantum random number generator. The second improvement introduced in this version is the ability to retrieve arrays of random data directly from the used source. This increases the speed of the random number generation, especially in the case of the on-line service, where it reduces the time necessary to establish the connection. Thanks to the speed improvement of the presented version, the package can now be used in simulations requiring larger amounts of random data. Moreover, the functions for generating random numbers provided by the current version of the package more closely follow the pattern of the functions for generating pseudo-random numbers provided in Mathematica.
    Additional comments: Speed comparison: the implementation of the support for the QRNG on-line service provides a noticeable improvement in the speed of random number generation. For samples of real numbers of size 10^1, 10^2, ..., 10^7, the times required to generate these samples using the Quantis USB device and the QRNG service are compared in Fig. 1. The presented results show that the use of the on-line service provides faster access to random numbers. One should note, however, that the speed gain can increase or decrease depending on the connection speed between the computer and the server providing random numbers.
    Running time: Depends on the used source of randomness and the amount of random data used in the experiment.
    References: [1] M. Wahl, M. Leifgen, M. Berlin, T. Röhlicke, H.-J. Rahn, O. Benson, An ultrafast quantum random number generator with provably bounded output bias based on photon arrival time measurements, Applied Physics Letters 98, 171105 (2011). http://dx.doi.org/10.1063/1.3578456.

  17. Security of practical private randomness generation

    NASA Astrophysics Data System (ADS)

    Pironio, Stefano; Massar, Serge

    2013-01-01

    Measurements on entangled quantum systems necessarily yield outcomes that are intrinsically unpredictable if they violate a Bell inequality. This property can be used to generate certified randomness in a device-independent way, i.e., without making detailed assumptions about the internal working of the quantum devices used to generate the random numbers. Furthermore these numbers are also private; i.e., they appear random not only to the user but also to any adversary that might possess a perfect description of the devices. Since this process requires a small initial random seed to sample the behavior of the quantum devices and to extract uniform randomness from the raw outputs of the devices, one usually speaks of device-independent randomness expansion. The purpose of this paper is twofold. First, we point out that in most real, practical situations, where the concept of device independence is used as a protection against unintentional flaws or failures of the quantum apparatuses, it is sufficient to show that the generated string is random with respect to an adversary that holds only classical side information; i.e., proving randomness against quantum side information is not necessary. Furthermore, the initial random seed does not need to be private with respect to the adversary, provided that it is generated in a way that is independent from the measured systems. The devices, however, will generate cryptographically secure randomness that cannot be predicted by the adversary, and thus one can, given access to free public randomness, talk about private randomness generation. The theoretical tools to quantify the generated randomness according to these criteria were already introduced in S. Pironio et al. [Nature (London) 464, 1021 (2010)], but the final results were improperly formulated. The second aim of this paper is to correct this inaccurate formulation and therefore lay out a precise theoretical framework for practical device-independent randomness generation.

  18. The RANDOM computer program: A linear congruential random number generator

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) The RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) The RANCYCLE and the ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
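
    A linear congruential generator produces the sequence X_{n+1} = (a*X_n + c) mod m. The sketch below illustrates the recurrence in Python with the widely quoted Numerical Recipes constants; these parameters are only an example and are not the ones selected in the RANDOM report.

      # Minimal linear congruential generator X_{n+1} = (a*X_n + c) mod m.
      # The constants a, c, m below are the common "Numerical Recipes" choice and are used only
      # to illustrate the recurrence; the report derives its own parameters for a microcomputer.

      class LCG:
          def __init__(self, seed, a=1664525, c=1013904223, m=2**32):
              self.state = seed % m
              self.a, self.c, self.m = a, c, m

          def next_int(self):
              self.state = (self.a * self.state + self.c) % self.m
              return self.state

          def next_float(self):
              return self.next_int() / self.m     # uniform variate in [0, 1)

      if __name__ == "__main__":
          gen = LCG(seed=42)
          print([round(gen.next_float(), 4) for _ in range(5)])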

  19. The Effect of Trimethoprim on Serum Folate Levels in Humans: A Randomized, Double-Blind, Placebo-Controlled Trial.

    PubMed

    Meidahl Petersen, Kasper; Eplov, Kasper; Kjær Nielsen, Torben; Jimenez-Solem, Espen; Petersen, Morten; Broedbaek, Kasper; Daugaard Popik, Sara; Kallehave Hansen, Lina; Enghusen Poulsen, Henrik; Trærup Andersen, Jon

    2016-01-01

    Trimethoprim antagonizes the actions of folate by inhibition of dihydrofolate reductase. This could diminish serum folate levels in humans and cause folate deficiency in some patients. We conducted a randomized, double-blind, placebo-controlled trial to investigate the effect of trimethoprim on serum folate levels in healthy participants after a 7-day trial period. Thirty young, healthy males were randomly allocated to receive trimethoprim, 200 mg twice daily, and 30 were randomly allocated to placebo. Before trial initiation, participant numbers were given randomly generated treatment allocations within sealed opaque envelopes. Participants and all staff were kept blinded to treatment allocations during the trial. Serum folate was measured at baseline and at end of trial. In the 58 participants analyzed (30 in the trimethoprim group and 28 in the placebo group), 8 had folate deficiency at baseline. Within the trimethoprim group, serum folate was significantly decreased (P = 0.018) after the trial. We found a mean decrease in serum folate among trimethoprim-exposed participants of 1.95 nmol/L, compared with a 0.21 nmol/L mean increase in the placebo group (P = 0.040). The proportion of folate-deficient participants increased significantly within the trimethoprim group (P = 0.034). No serious adverse events were observed. In conclusion, we found that a daily dose of 400 mg trimethoprim for 7 days significantly lowered serum folate levels in healthy study participants.

  20. Connectivity in the human brain dissociates entropy and complexity of auditory inputs☆

    PubMed Central

    Nastase, Samuel A.; Iacovella, Vittorio; Davis, Ben; Hasson, Uri

    2015-01-01

    Complex systems are described according to two central dimensions: (a) the randomness of their output, quantified via entropy; and (b) their complexity, which reflects the organization of a system's generators. Whereas some approaches hold that complexity can be reduced to uncertainty or entropy, an axiom of complexity science is that signals with very high or very low entropy are generated by relatively non-complex systems, while complex systems typically generate outputs with entropy peaking between these two extremes. In understanding their environment, individuals would benefit from coding for both input entropy and complexity; entropy indexes uncertainty and can inform probabilistic coding strategies, whereas complexity reflects a concise and abstract representation of the underlying environmental configuration, which can serve independent purposes, e.g., as a template for generalization and rapid comparisons between environments. Using functional neuroimaging, we demonstrate that, in response to passively processed auditory inputs, functional integration patterns in the human brain track both the entropy and complexity of the auditory signal. Connectivity between several brain regions scaled monotonically with input entropy, suggesting sensitivity to uncertainty, whereas connectivity between other regions tracked entropy in a convex manner consistent with sensitivity to input complexity. These findings suggest that the human brain simultaneously tracks the uncertainty of sensory data and effectively models their environmental generators. PMID:25536493

  1. Generating Adaptive Behaviour within a Memory-Prediction Framework

    PubMed Central

    Rawlinson, David; Kowadlo, Gideon

    2012-01-01

    The Memory-Prediction Framework (MPF) and its Hierarchical-Temporal Memory implementation (HTM) have been widely applied to unsupervised learning problems, for both classification and prediction. To date, there has been no attempt to incorporate MPF/HTM in reinforcement learning or other adaptive systems; that is, to use knowledge embodied within the hierarchy to control a system, or to generate behaviour for an agent. This problem is interesting because the human neocortex is believed to play a vital role in the generation of behaviour, and the MPF is a model of the human neocortex. We propose some simple and biologically-plausible enhancements to the Memory-Prediction Framework. These cause it to explore and interact with an external world, while trying to maximize a continuous, time-varying reward function. All behaviour is generated and controlled within the MPF hierarchy. The hierarchy develops from a random initial configuration by interaction with the world and reinforcement learning only. Among other demonstrations, we show that a 2-node hierarchy can learn to successfully play “rocks, paper, scissors” against a predictable opponent. PMID:22272231

  2. Generation of physical random numbers by using homodyne detection

    NASA Astrophysics Data System (ADS)

    Hirakawa, Kodai; Oya, Shota; Oguri, Yusuke; Ichikawa, Tsubasa; Eto, Yujiro; Hirano, Takuya; Tsurumaru, Toyohiro

    2016-10-01

    Physical random numbers generated by quantum measurements are, in principle, impossible to predict. We have demonstrated the generation of physical random numbers by using a high-speed balanced photodetector to measure the quadrature amplitudes of vacuum states. Using this method, random numbers were generated at 500 Mbps, which is more than one order of magnitude faster than previously reported [Gabriel et al., Nature Photonics 4, 711-715 (2010)]. The Crush test battery of the TestU01 suite consists of 31 tests in 144 variations, and we used them to statistically analyze these numbers. The generated random numbers passed 14 of the 31 tests. To improve the randomness, we performed a hash operation, in which each random number was multiplied by a random Toeplitz matrix; the resulting numbers passed all of the tests in the TestU01 Crush battery.
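
    The hashing step described above, multiplication of the raw data by a random binary Toeplitz matrix, can be sketched over GF(2) as follows. The block sizes and the source of the seed bits are assumptions for illustration; in practice the compression ratio is fixed by an entropy estimate of the measured data.

      import random

      # Toeplitz hashing over GF(2): compress n raw bits into m < n output bits by multiplying
      # with a random binary Toeplitz matrix T, which is fully defined by n + m - 1 seed bits
      # (constant along each diagonal). Sizes and seed here are illustrative only.

      def toeplitz_hash(raw_bits, m, seed_bits):
          n = len(raw_bits)
          assert len(seed_bits) == n + m - 1, "a Toeplitz matrix needs n + m - 1 defining bits"
          out = []
          for i in range(m):
              acc = 0
              for j in range(n):
                  acc ^= seed_bits[i + n - 1 - j] & raw_bits[j]   # T[i][j] = seed_bits[(n-1) + i - j]
              out.append(acc)
          return out

      if __name__ == "__main__":
          rng = random.Random(7)
          raw = [rng.getrandbits(1) for _ in range(64)]             # raw (imperfect) bits
          seed = [rng.getrandbits(1) for _ in range(64 + 32 - 1)]   # public random seed for T
          print(toeplitz_hash(raw, 32, seed))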

  3. Unbiased All-Optical Random-Number Generator

    NASA Astrophysics Data System (ADS)

    Steinle, Tobias; Greiner, Johannes N.; Wrachtrup, Jörg; Giessen, Harald; Gerhardt, Ilja

    2017-10-01

    The generation of random bits is of enormous importance in modern information science. Cryptographic security is based on random numbers which require a physical process for their generation. This is commonly performed by hardware random-number generators. These often exhibit a number of problems, namely experimental bias, memory in the system, and other technical subtleties, which reduce the reliability in the entropy estimation. Further, the generated outcome has to be postprocessed to "iron out" such spurious effects. Here, we present a purely optical randomness generator, based on the bistable output of an optical parametric oscillator. Detector noise plays no role and postprocessing is reduced to a minimum. Upon entering the bistable regime, initially the resulting output phase depends on vacuum fluctuations. Later, the phase is rigidly locked and can be well determined versus a pulse train, which is derived from the pump laser. This delivers an ambiguity-free output, which is reliably detected and associated with a binary outcome. The resulting random bit stream resembles a perfect coin toss and passes all relevant randomness measures. The random nature of the generated binary outcome is furthermore confirmed by an analysis of resulting conditional entropies.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu

    Quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness -- coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness on the device and the random number generation speed.

  5. Analysis of Uniform Random Numbers Generated by RANDU and URN Using Ten Different Seeds.

    DTIC Science & Technology

    The statistical properties of the numbers generated by two uniform random number generators, RANDU and URN, each using ten different seeds are... The testing is performed on a sequence of 50,000 numbers generated by each uniform random number generator using each of the ten seeds. (Author)
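
    For reference, RANDU is the classic multiplicative congruential generator V_{n+1} = 65539*V_n mod 2^31 (started from an odd seed). A minimal reimplementation is sketched below so the kind of generator under test is concrete; the URN generator and the report's specific test battery are not reproduced here.

      # RANDU: the infamous multiplicative congruential generator V_{n+1} = 65539 * V_n mod 2^31.
      # Successive triples (V_n, V_{n+1}, V_{n+2}) are known to fall on a small number of planes
      # in 3-D, which is why RANDU fails modern statistical tests.

      def randu(seed, count):
          v = seed                       # seed should be odd
          for _ in range(count):
              v = (65539 * v) % 2**31
              yield v / 2**31            # scale to a uniform variate in (0, 1)

      if __name__ == "__main__":
          print([round(u, 6) for u in randu(seed=1, count=5)])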

  6. Multi-Disciplinary Knowledge Synthesis for Human Health Assessment on Earth and in Space

    NASA Astrophysics Data System (ADS)

    Christakos, G.

    We discuss methodological developments in multi-disciplinary knowledge synthesis (KS) of human health assessment. A theoretical KS framework can provide the rational means for the assimilation of various information bases (general, site-specific etc.) that are relevant to the life system of interest. KS-based techniques produce a realistic representation of the system, provide a rigorous assessment of the uncertainty sources, and generate informative health state predictions across space-time. The underlying epistemic cognition methodology is based on teleologic criteria and stochastic logic principles. The mathematics of KS involves a powerful and versatile spatiotemporal random field model that accounts rigorously for the uncertainty features of the life system and imposes no restriction on the shape of the probability distributions or the form of the predictors. KS theory is instrumental in understanding natural heterogeneities, assessing crucial human exposure correlations and laws of physical change, and explaining toxicokinetic mechanisms and dependencies in a spatiotemporal life system domain. It is hoped that a better understanding of KS fundamentals would generate multi-disciplinary models that are useful for the maintenance of human health on Earth and in Space.

  7. PUFKEY: A High-Security and High-Throughput Hardware True Random Number Generator for Sensor Networks

    PubMed Central

    Li, Dongfang; Lu, Zhaojun; Zou, Xuecheng; Liu, Zhenglin

    2015-01-01

    Random number generators (RNG) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard national institute of standards and technology (NIST) randomness tests and is resilient to a wide range of security attacks. PMID:26501283

  8. PUFKEY: a high-security and high-throughput hardware true random number generator for sensor networks.

    PubMed

    Li, Dongfang; Lu, Zhaojun; Zou, Xuecheng; Liu, Zhenglin

    2015-10-16

    Random number generators (RNG) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard national institute of standards and technology (NIST) randomness tests and is resilient to a wide range of security attacks.
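
    A toy sketch of the general pattern, deriving a seed from SRAM power-up noise and passing it through a conditioning function, is given below. The simulated bit-flip model, the XOR-of-two-snapshots rule, and the use of SHA-256 as the conditioner are illustrative assumptions; PUFKEY specifies its own PUF elements and conditioning algorithm.

      import hashlib
      import random

      def power_up(bias, flip_prob, rng):
          """One toy SRAM power-up read: the stable cell pattern plus occasional noisy flips."""
          return [b ^ (1 if rng.random() < flip_prob else 0) for b in bias]

      def condition_seed(snapshot_a, snapshot_b):
          """XOR two reads to isolate the noisy cells, then hash them as a generic conditioner."""
          noisy = bytes(a ^ b for a, b in zip(snapshot_a, snapshot_b))
          return hashlib.sha256(noisy).digest()      # 256-bit conditioned seed

      if __name__ == "__main__":
          rng = random.Random(42)
          bias = [rng.getrandbits(1) for _ in range(4096)]   # device-specific stable start-up pattern
          seed = condition_seed(power_up(bias, 0.05, rng), power_up(bias, 0.05, rng))
          print(seed.hex())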

  9. High-throughput sequencing of complete human mtDNA genomes from the Caucasus and West Asia: high diversity and demographic inferences.

    PubMed

    Schönberg, Anna; Theunert, Christoph; Li, Mingkun; Stoneking, Mark; Nasidze, Ivan

    2011-09-01

    To investigate the demographic history of human populations from the Caucasus and surrounding regions, we used high-throughput sequencing to generate 147 complete mtDNA genome sequences from random samples of individuals from three groups from the Caucasus (Armenians, Azeri and Georgians), and one group each from Iran and Turkey. Overall diversity is very high, with 144 different sequences that fall into 97 different haplogroups found among the 147 individuals. Bayesian skyline plots (BSPs) of population size change through time show a population expansion around 40-50 kya, followed by a constant population size, and then another expansion around 15-18 kya for the groups from the Caucasus and Iran. The BSP for Turkey differs the most from the others, with an increase from 35 to 50 kya followed by a prolonged period of constant population size, and no indication of a second period of growth. An approximate Bayesian computation approach was used to estimate divergence times between each pair of populations; the oldest divergence times were between Turkey and the other four groups from the South Caucasus and Iran (~400-600 generations), while the divergence time of the three Caucasus groups from each other was comparable to their divergence time from Iran (average of ~360 generations). These results illustrate the value of random sampling of complete mtDNA genome sequences that can be obtained with high-throughput sequencing platforms.

  10. A rule-based software test data generator

    NASA Technical Reports Server (NTRS)

    Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

    1991-01-01

    Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests showing that even the primitive rule-based test data generation prototype is significantly better than random data generation are performed. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.

  11. The metabolic and endocrine response and health implications of consuming sugar-sweetened beverages: findings from recent randomized controlled trials.

    PubMed

    Rippe, James M

    2013-11-01

    Fructose-containing sugars, including fructose itself, high fructose corn syrup (HFCS), and sucrose have engendered considerable controversy. The effects of HFCS and sucrose in sugar-sweetened beverages, in particular, have generated intense scientific debate that has spilled over to the public. This controversy is related to well-known differences in metabolism between fructose and glucose in the liver. In addition, research studies have often been conducted comparing pure fructose and pure glucose even though neither is consumed to any appreciable degree in isolation in the human diet. Other evidence has been drawn from animal studies and epidemiologic or cohort studies. Few randomized controlled trials (RCTs) have compared HFCS with sucrose (the 2 sugars most commonly consumed in the human diet) at dosage amounts within the normal human consumption range. This review compares results of recently concluded RCTs with other forms of evidence related to fructose, HFCS, and sucrose. We conclude that great caution must be used when suggesting adverse health effects of consuming these sugars in the normal way they are consumed and at the normal amounts in the human diet, because RCTs do not support adverse health consequences at these doses when employing these sugars.

  12. Folded Elastic Strip-Based Triboelectric Nanogenerator for Harvesting Human Motion Energy for Multiple Applications.

    PubMed

    Kang, Yue; Wang, Bo; Dai, Shuge; Liu, Guanlin; Pu, Yanping; Hu, Chenguo

    2015-09-16

    A folded elastic strip-based triboelectric nanogenerator (FS-TENG) made from two folded double-layer elastic strips of Al/PET and PTFE/PET can achieve multiple functions by low frequency mechanical motion. A single FS-TENG with strip width of 3 cm and length of 27 cm can generate a maximum output current, open-circuit voltage, and peak power of 55 μA, 840 V, and 7.33 mW at deformation frequency of 4 Hz with amplitude of 2.5 cm, respectively. This FS-TENG can work as a weight sensor due to its good elasticity. An integrated generator assembled by four FS-TENGs (IFS-TENG) can harvest the energy of human motion like flapping hands and walking steps. In addition, the IFS-TENG combined with electromagnetically induced electricity can achieve a completely self-driven doorbell with flashing lights. Moreover, a box-like generator integrated by four IFS-TENGs inside can work in horizontal or random motion modes and can be improved to harvest energy in all directions. This work promotes the research of completely self-driven systems and energy harvesting of human motion for applications in our daily life.

  13. Video quality assessment method motivated by human visual perception

    NASA Astrophysics Data System (ADS)

    He, Meiling; Jiang, Gangyi; Yu, Mei; Song, Yang; Peng, Zongju; Shao, Feng

    2016-11-01

    Research on video quality assessment (VQA) plays a crucial role in improving the efficiency of video coding and the performance of video processing. It is well acknowledged that the motion energy model generates motion energy responses in a middle temporal area by simulating the receptive field of neurons in V1 for the motion perception of the human visual system. Motivated by the biological evidence for visual motion perception, a VQA method is proposed in this paper, which comprises a motion perception quality index and a spatial index. To be more specific, the motion energy model is applied to evaluate the temporal distortion severity of each frequency component generated from the difference-of-Gaussian filter bank, which produces the motion perception quality index, and the gradient similarity measure is used to evaluate the spatial distortion of the video sequence to get the spatial quality index. The experimental results on the LIVE, CSIQ, and IVP video databases demonstrate that the random forests regression technique trained on the generated quality indices corresponds closely to human visual perception and offers significant improvements over comparable well-performing methods. The proposed method has higher consistency with subjective perception and higher generalization capability.

  14. A model-based analysis of a display for helicopter landing approach. [control theoretical model of human pilot

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Wheat, L. W.

    1975-01-01

    A control theoretic model of the human pilot was used to analyze a baseline electronic cockpit display in a helicopter landing approach task. The head down display was created on a stroke written cathode ray tube and the vehicle was a UH-1H helicopter. The landing approach task consisted of maintaining prescribed groundspeed and glideslope in the presence of random vertical and horizontal turbulence. The pilot model was also used to generate and evaluate display quickening laws designed to improve pilot vehicle performance. A simple fixed base simulation provided comparative tracking data.

  15. TTC25 Deficiency Results in Defects of the Outer Dynein Arm Docking Machinery and Primary Ciliary Dyskinesia with Left-Right Body Asymmetry Randomization.

    PubMed

    Wallmeier, Julia; Shiratori, Hidetaka; Dougherty, Gerard W; Edelbusch, Christine; Hjeij, Rim; Loges, Niki T; Menchen, Tabea; Olbrich, Heike; Pennekamp, Petra; Raidt, Johanna; Werner, Claudius; Minegishi, Katsura; Shinohara, Kyosuke; Asai, Yasuko; Takaoka, Katsuyoshi; Lee, Chanjae; Griese, Matthias; Memari, Yasin; Durbin, Richard; Kolb-Kokocinski, Anja; Sauer, Sascha; Wallingford, John B; Hamada, Hiroshi; Omran, Heymut

    2016-08-04

    Multiprotein complexes referred to as outer dynein arms (ODAs) develop the main mechanical force to generate the ciliary and flagellar beat. ODA defects are the most common cause of primary ciliary dyskinesia (PCD), a congenital disorder of ciliary beating, characterized by recurrent infections of the upper and lower airways, as well as by progressive lung failure and randomization of left-right body asymmetry. Using a whole-exome sequencing approach, we identified recessive loss-of-function mutations within TTC25 in three individuals from two unrelated families affected by PCD. Mice generated by CRISPR/Cas9 technology and carrying a deletion of exons 2 and 3 in Ttc25 presented with laterality defects. Consistently, we observed immotile nodal cilia and missing leftward flow via particle image velocimetry. Furthermore, transmission electron microscopy (TEM) analysis in TTC25-deficient mice revealed an absence of ODAs. Consistent with our findings in mice, we were able to show loss of the ciliary ODAs in humans via TEM and immunofluorescence (IF) analyses. Additionally, IF analyses revealed an absence of the ODA docking complex (ODA-DC), along with its known components CCDC114, CCDC151, and ARMC4. Co-immunoprecipitation revealed interaction between the ODA-DC component CCDC114 and TTC25. Thus, here we report TTC25 as a new member of the ODA-DC machinery in humans and mice. Copyright © 2016 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  16. A fast ergodic algorithm for generating ensembles of equilateral random polygons

    NASA Astrophysics Data System (ADS)

    Varela, R.; Hinson, K.; Arsuaga, J.; Diao, Y.

    2009-03-01

    Knotted structures are commonly found in circular DNA and along the backbone of certain proteins. In order to properly estimate properties of these three-dimensional structures it is often necessary to generate large ensembles of simulated closed chains (i.e. polygons) of equal edge lengths (such polygons are called equilateral random polygons). However, finding efficient algorithms that properly sample the space of equilateral random polygons is a difficult problem. Currently there are no proven algorithms that generate equilateral random polygons according to their theoretical distribution. In this paper we propose a method that generates equilateral random polygons in a 'step-wise uniform' way. We prove that this method is ergodic in the sense that any given equilateral random polygon can be generated by this method, and we show that the time needed to generate an equilateral random polygon of length n is linear in terms of n. These two properties make this algorithm a significant improvement over the existing generating methods. Detailed numerical comparisons of our algorithm with other widely used algorithms are provided.

  17. Using Computer-Generated Random Numbers to Calculate the Lifetime of a Comet.

    ERIC Educational Resources Information Center

    Danesh, Iraj

    1991-01-01

    An educational technique to calculate the lifetime of a comet using software-generated random numbers is introduced to undergraduate physics and astronomy students. Discussed are the generation and eligibility of the required random numbers, background literature related to the problem, and the solution to the problem using random numbers.…

  18. A Comparison of Three Random Number Generators for Aircraft Dynamic Modeling Applications

    NASA Technical Reports Server (NTRS)

    Grauer, Jared A.

    2017-01-01

    Three random number generators, which produce Gaussian white noise sequences, were compared to assess their suitability in aircraft dynamic modeling applications. The first generator considered was the MATLAB (registered) implementation of the Mersenne-Twister algorithm. The second generator was a website called Random.org, which processes atmospheric noise measured using radios to create the random numbers. The third generator was based on synthesis of the Fourier series, where the random number sequences are constructed from prescribed amplitude and phase spectra. A total of 200 sequences, each having 601 random numbers, for each generator were collected and analyzed in terms of the mean, variance, normality, autocorrelation, and power spectral density. These sequences were then applied to two problems in aircraft dynamic modeling, namely estimating stability and control derivatives from simulated onboard sensor data, and simulating flight in atmospheric turbulence. In general, each random number generator had good performance and is well-suited for aircraft dynamic modeling applications. Specific strengths and weaknesses of each generator are discussed. For Monte Carlo simulation, the Fourier synthesis method is recommended because it most accurately and consistently approximated Gaussian white noise and can be implemented with reasonable computational effort.
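
    The Fourier-synthesis approach mentioned above can be sketched as follows: a flat (white) amplitude spectrum is paired with independent, uniformly random phases and transformed back to the time domain, which yields an approximately Gaussian white sequence. The sequence length, normalization, and use of NumPy's FFT are assumptions for illustration rather than the report's implementation.

      import numpy as np

      # Gaussian white noise by Fourier synthesis: prescribe a flat amplitude spectrum with
      # uniformly random phases and invert it; the sum of many random-phase cosines is
      # approximately Gaussian. Length, scaling and FFT routine are illustrative choices.

      def fourier_white_noise(n, sigma=1.0, seed=0):
          rng = np.random.default_rng(seed)
          n_freq = n // 2 + 1                          # one-sided spectrum of a real signal
          phases = rng.uniform(0.0, 2.0 * np.pi, n_freq)
          spectrum = np.exp(1j * phases)               # unit amplitude, random phase
          spectrum[0] = 0.0                            # remove the DC component (zero mean)
          if n % 2 == 0:
              spectrum[-1] = np.cos(phases[-1])        # Nyquist bin of a real signal must be real
          x = np.fft.irfft(spectrum, n=n)
          return sigma * x / np.std(x)                 # rescale to the requested standard deviation

      if __name__ == "__main__":
          noise = fourier_white_noise(601, seed=1)
          print(f"mean = {noise.mean():.3f}, std = {noise.std():.3f}")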

  19. Realization of a Quantum Random Generator Certified with the Kochen-Specker Theorem

    NASA Astrophysics Data System (ADS)

    Kulikov, Anatoly; Jerger, Markus; Potočnik, Anton; Wallraff, Andreas; Fedorov, Arkady

    2017-12-01

    Random numbers are required for a variety of applications from secure communications to Monte Carlo simulation. Yet randomness is an asymptotic property, and no output string generated by a physical device can be strictly proven to be random. We report an experimental realization of a quantum random number generator (QRNG) with randomness certified by quantum contextuality and the Kochen-Specker theorem. The certification is not performed in a device-independent way but through a rigorous theoretical proof of each outcome being value indefinite even in the presence of experimental imperfections. The analysis of the generated data confirms the incomputable nature of our QRNG.

  20. Realization of a Quantum Random Generator Certified with the Kochen-Specker Theorem.

    PubMed

    Kulikov, Anatoly; Jerger, Markus; Potočnik, Anton; Wallraff, Andreas; Fedorov, Arkady

    2017-12-15

    Random numbers are required for a variety of applications from secure communications to Monte Carlo simulation. Yet randomness is an asymptotic property, and no output string generated by a physical device can be strictly proven to be random. We report an experimental realization of a quantum random number generator (QRNG) with randomness certified by quantum contextuality and the Kochen-Specker theorem. The certification is not performed in a device-independent way but through a rigorous theoretical proof of each outcome being value indefinite even in the presence of experimental imperfections. The analysis of the generated data confirms the incomputable nature of our QRNG.

  1. Source-Independent Quantum Random Number Generation

    NASA Astrophysics Data System (ADS)

    Cao, Zhu; Zhou, Hongyi; Yuan, Xiao; Ma, Xiongfeng

    2016-01-01

    Quantum random number generators can provide genuine randomness by appealing to the fundamental principles of quantum mechanics. In general, a physical generator contains two parts: a randomness source and its readout. The source is essential to the quality of the resulting random numbers; hence, it needs to be carefully calibrated and modeled to achieve information-theoretically provable randomness. However, in practice, the source is a complicated physical system, such as a light source or an atomic ensemble, and any deviations in the real-life implementation from the theoretical model may affect the randomness of the output. To close this gap, we propose a source-independent scheme for quantum random number generation in which output randomness can be certified, even when the source is uncharacterized and untrusted. In our randomness analysis, we make no assumptions about the dimension of the source. For instance, multiphoton emissions are allowed in optical implementations. Our analysis takes into account the finite-key effect with the composable security definition. In the limit of large data size, the length of the input random seed is exponentially small compared to that of the output random bits. In addition, by modifying a quantum key distribution system, we experimentally demonstrate our scheme and achieve a randomness generation rate of over 5 × 10³ bits/s.

  2. Quantum Random Number Generation Using a Quanta Image Sensor

    PubMed Central

    Amri, Emna; Felk, Yacine; Stucki, Damien; Ma, Jiaju; Fossum, Eric R.

    2016-01-01

    A new quantum random number generation method is proposed. The method is based on the randomness of the photon emission process and the single photon counting capability of the Quanta Image Sensor (QIS). It has the potential to generate high-quality random numbers at a remarkable output data rate. In this paper, the principles of photon statistics and the theory of entropy are discussed. Sample data were collected with a QIS jot device, and their randomness quality was analyzed. The randomness assessment method and results are discussed. PMID:27367698

  3. A random forest approach for predicting the presence of Echinococcus multilocularis intermediate host Ochotona spp. presence in relation to landscape characteristics in western China

    PubMed Central

    Marston, Christopher G.; Danson, F. Mark; Armitage, Richard P.; Giraudoux, Patrick; Pleydell, David R.J.; Wang, Qian; Qui, Jiamin; Craig, Philip S.

    2014-01-01

    Understanding distribution patterns of hosts implicated in the transmission of zoonotic disease remains a key goal of parasitology. Here, random forests are employed to model spatial patterns of the presence of the plateau pika (Ochotona spp.) small mammal intermediate host for the parasitic tapeworm Echinococcus multilocularis which is responsible for a significant burden of human zoonoses in western China. Landsat ETM+ satellite imagery and digital elevation model data were utilized to generate quantified measures of environmental characteristics across a study area in Sichuan Province, China. Land cover maps were generated identifying the distribution of specific land cover types, with landscape metrics employed to describe the spatial organisation of land cover patches. Random forests were used to model spatial patterns of Ochotona spp. presence, enabling the relative importance of the environmental characteristics in relation to Ochotona spp. presence to be ranked. An index of habitat aggregation was identified as the most important variable in influencing Ochotona spp. presence, with area of degraded grassland the most important land cover class variable. 71% of the variance in Ochotona spp. presence was explained, with a 90.98% accuracy rate as determined by ‘out-of-bag’ error assessment. Identification of the environmental characteristics influencing Ochotona spp. presence enables us to better understand distribution patterns of hosts implicated in the transmission of Em. The predictive mapping of this Em host enables the identification of human populations at increased risk of infection, enabling preventative strategies to be adopted. PMID:25386042

  4. Nanofiber Orientation and Surface Functionalization Modulate Human Mesenchymal Stem Cell Behavior In Vitro

    PubMed Central

    Kolambkar, Yash M.; Bajin, Mehmet; Wojtowicz, Abigail; Hutmacher, Dietmar W.; García, Andrés J.

    2014-01-01

    Electrospun nanofiber meshes have emerged as a new generation of scaffold membranes possessing a number of features suitable for tissue regeneration. One of these features is the flexibility to modify their structure and composition to orchestrate specific cellular responses. In this study, we investigated the effects of nanofiber orientation and surface functionalization on human mesenchymal stem cell (hMSC) migration and osteogenic differentiation. We used an in vitro model to examine hMSC migration into a cell-free zone on nanofiber meshes and mitomycin C treatment to assess the contribution of proliferation to the observed migration. Poly (ɛ-caprolactone) meshes with oriented topography were created by electrospinning aligned nanofibers on a rotating mandrel, while randomly oriented controls were collected on a stationary collector. Both aligned and random meshes were coated with a triple-helical, type I collagen-mimetic peptide, containing the glycine-phenylalanine-hydroxyproline-glycine-glutamate-arginine (GFOGER) motif. Our results indicate that nanofiber GFOGER peptide functionalization and orientation modulate cellular behavior, individually, and in combination. GFOGER significantly enhanced the migration, proliferation, and osteogenic differentiation of hMSCs on nanofiber meshes. Aligned nanofiber meshes displayed increased cell migration along the direction of fiber orientation compared to random meshes; however, fiber alignment did not influence osteogenic differentiation. Compared to each other, GFOGER coating resulted in a higher proliferation-driven cell migration, whereas fiber orientation appeared to generate a larger direct migratory effect. This study demonstrates that peptide surface modification and topographical cues associated with fiber alignment can be used to direct cellular behavior on nanofiber mesh scaffolds, which may be exploited for tissue regeneration. PMID:24020454

  5. Is walking a random walk? Evidence for long-range correlations in stride interval of human gait

    NASA Technical Reports Server (NTRS)

    Hausdorff, Jeffrey M.; Peng, C.-K.; Ladin, Zvi; Wei, Jeanne Y.; Goldberger, Ary L.

    1995-01-01

    Complex fluctuations of unknown origin appear in the normal gait pattern. These fluctuations might be described as being (1) uncorrelated white noise, (2) short-range correlations, or (3) long-range correlations with power-law scaling. To test these possibilities, the stride interval of 10 healthy young men was measured as they walked for 9 min at their usual rate. From these time series we calculated scaling indexes by using a modified random walk analysis and power spectral analysis. Both indexes indicated the presence of long-range self-similar correlations extending over hundreds of steps; the stride interval at any time depended on the stride interval at remote previous times, and this dependence decayed in a scale-free (fractal-like) power-law fashion. These scaling indexes were significantly different from those obtained after random shuffling of the original time series, indicating the importance of the sequential ordering of the stride interval. We demonstrate that conventional models of gait generation fail to reproduce the observed scaling behavior and introduce a new type of central pattern generator model that successfully accounts for the experimentally observed long-range correlations.
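
    The "modified random walk analysis" used for the scaling index is closely related to detrended fluctuation analysis (DFA). The sketch below is an illustration of that general technique, not the authors' code: it integrates the series, detrends it in windows of several sizes, and reads the exponent alpha off the log-log slope (alpha near 0.5 for uncorrelated noise, near 1 for 1/f-like long-range correlations of the kind reported for stride intervals).

    ```python
    import numpy as np

    def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
        """Detrended fluctuation analysis: slope of log F(n) versus log n."""
        y = np.cumsum(x - np.mean(x))               # integrated series ("random walk")
        fluctuations = []
        for n in scales:
            n_seg = len(y) // n
            segments = y[:n_seg * n].reshape(n_seg, n)
            t = np.arange(n)
            residuals = []
            for seg in segments:
                coef = np.polyfit(t, seg, 1)        # least-squares line in each window
                residuals.append(seg - np.polyval(coef, t))
            fluctuations.append(np.sqrt(np.mean(np.concatenate(residuals) ** 2)))
        return np.polyfit(np.log(scales), np.log(fluctuations), 1)[0]

    rng = np.random.default_rng(2)
    print(dfa_alpha(rng.normal(size=512)))          # close to 0.5 for white noise
    ```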

  6. Image encryption using random sequence generated from generalized information domain

    NASA Astrophysics Data System (ADS)

    Xia-Yan, Zhang; Guo-Ji, Zhang; Xuan, Li; Ya-Zhou, Ren; Jie-Hua, Wu

    2016-05-01

    A novel image encryption method based on the random sequence generated from the generalized information domain and permutation-diffusion architecture is proposed. The random sequence is generated by reconstruction from the generalized information file and discrete trajectory extraction from the data stream. The trajectory address sequence is used to generate a P-box to shuffle the plain image while random sequences are treated as keystreams. A new factor called drift factor is employed to accelerate and enhance the performance of the random sequence generator. An initial value is introduced to make the encryption method an approximately one-time pad. Experimental results show that the random sequences pass the NIST statistical test with a high ratio and extensive analysis demonstrates that the new encryption scheme has superior security.
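
    The keystream in the paper comes from the generalized information domain, which is not reproduced here; the sketch below only illustrates the generic permutation-diffusion skeleton the abstract refers to, with a seeded NumPy generator standing in for that keystream, a permutation acting as the P-box, and a plain XOR as a toy, non-chained diffusion step.

    ```python
    import numpy as np

    def encrypt(image, seed):
        """Toy permutation-diffusion: shuffle pixel positions, then XOR a keystream."""
        rng = np.random.default_rng(seed)
        flat = image.flatten()
        perm = rng.permutation(flat.size)                       # P-box: position shuffle
        keystream = rng.integers(0, 256, flat.size, dtype=np.uint8)
        return (flat[perm] ^ keystream).reshape(image.shape), perm, keystream

    def decrypt(cipher, perm, keystream):
        shuffled = cipher.flatten() ^ keystream
        out = np.empty_like(shuffled)
        out[perm] = shuffled                                    # invert the permutation
        return out.reshape(cipher.shape)

    img = np.random.default_rng(3).integers(0, 256, (8, 8), dtype=np.uint8)
    enc, perm, ks = encrypt(img, seed=42)
    assert np.array_equal(decrypt(enc, perm, ks), img)
    ```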

  7. A generator for unique quantum random numbers based on vacuum states

    NASA Astrophysics Data System (ADS)

    Gabriel, Christian; Wittmann, Christoffer; Sych, Denis; Dong, Ruifang; Mauerer, Wolfgang; Andersen, Ulrik L.; Marquardt, Christoph; Leuchs, Gerd

    2010-10-01

    Random numbers are a valuable component in diverse applications that range from simulations over gambling to cryptography. The quest for true randomness in these applications has engendered a large variety of different proposals for producing random numbers based on the foundational unpredictability of quantum mechanics. However, most approaches do not consider that a potential adversary could have knowledge about the generated numbers, so the numbers are not verifiably random and unique. Here we present a simple experimental setup based on homodyne measurements that uses the purity of a continuous-variable quantum vacuum state to generate unique random numbers. We use the intrinsic randomness in measuring the quadratures of a mode in the lowest energy vacuum state, which cannot be correlated to any other state. The simplicity of our source, combined with its verifiably unique randomness, are important attributes for achieving high-reliability, high-speed and low-cost quantum random number generators.

  8. Random ambience using high fidelity images

    NASA Astrophysics Data System (ADS)

    Abu, Nur Azman; Sahib, Shahrin

    2011-06-01

    Most secure communication today requires true random keys as input. These operations are mostly designed and handled by the developers of the cryptosystem. Due to the nature of confidential crypto development today, pseudorandom keys are typically designed and still preferred by cryptosystem developers. However, these pseudorandom keys are predictable, periodic and repeatable, and hence carry minimal entropy. True random keys are believed to be obtainable only from hardware random number generators, and careful statistical analysis is still required to gain confidence that the process and apparatus generate numbers sufficiently random for cryptographic use. In this research, each moment in life is considered unique in itself: the random key is unique to the moment at which the user generates it, whenever random keys are needed in practical secure communication. The ambience captured in a high fidelity digital image is tested for randomness according to the NIST Statistical Test Suite, and a recommendation for generating simple random cryptographic keys live at 4 megabits per second is reported.

  9. Superparamagnetic perpendicular magnetic tunnel junctions for true random number generators

    NASA Astrophysics Data System (ADS)

    Parks, Bradley; Bapna, Mukund; Igbokwe, Julianne; Almasi, Hamid; Wang, Weigang; Majetich, Sara A.

    2018-05-01

    Superparamagnetic perpendicular magnetic tunnel junctions are fabricated and analyzed for use in random number generators. Time-resolved resistance measurements are used as streams of bits in statistical tests for randomness. Voltage control of the thermal stability enables tuning the average speed of random bit generation up to 70 kHz in a 60 nm diameter device. In its most efficient operating mode, the device generates random bits at an energy cost of 600 fJ/bit. A narrow range of magnetic field tunes the probability of a given state from 0 to 1, offering a means of probabilistic computing.

  10. Brownian motion properties of optoelectronic random bit generators based on laser chaos.

    PubMed

    Li, Pu; Yi, Xiaogang; Liu, Xianglian; Wang, Yuncai; Wang, Yongge

    2016-07-11

    The nondeterministic properties of the optoelectronic random bit generator (RBG) based on laser chaos are experimentally analyzed from the two aspects of the central limit theorem and the law of the iterated logarithm. The random bits are extracted from an optical feedback chaotic laser diode using a multi-bit extraction technique in the electrical domain. Our experimental results demonstrate that the generated random bits have no statistical distance from Brownian motion, in addition to passing the state-of-the-art industry-benchmark statistical test suite (NIST SP800-22). Together these results provide mathematically grounded evidence that the ultrafast random bit generator based on laser chaos can be used as a nondeterministic random bit source.

  11. Connectivity in the human brain dissociates entropy and complexity of auditory inputs.

    PubMed

    Nastase, Samuel A; Iacovella, Vittorio; Davis, Ben; Hasson, Uri

    2015-03-01

    Complex systems are described according to two central dimensions: (a) the randomness of their output, quantified via entropy; and (b) their complexity, which reflects the organization of a system's generators. Whereas some approaches hold that complexity can be reduced to uncertainty or entropy, an axiom of complexity science is that signals with very high or very low entropy are generated by relatively non-complex systems, while complex systems typically generate outputs with entropy peaking between these two extremes. In understanding their environment, individuals would benefit from coding for both input entropy and complexity; entropy indexes uncertainty and can inform probabilistic coding strategies, whereas complexity reflects a concise and abstract representation of the underlying environmental configuration, which can serve independent purposes, e.g., as a template for generalization and rapid comparisons between environments. Using functional neuroimaging, we demonstrate that, in response to passively processed auditory inputs, functional integration patterns in the human brain track both the entropy and complexity of the auditory signal. Connectivity between several brain regions scaled monotonically with input entropy, suggesting sensitivity to uncertainty, whereas connectivity between other regions tracked entropy in a convex manner consistent with sensitivity to input complexity. These findings suggest that the human brain simultaneously tracks the uncertainty of sensory data and effectively models their environmental generators.

  12. High power transcranial beam steering for ultrasonic brain therapy

    PubMed Central

    Pernot, Mathieu; Aubry, Jean-François; Tanter, Mickaël; Thomas, Jean-Louis; Fink, Mathias

    2003-01-01

    A sparse phased array is specially designed for non-invasive ultrasound transskull brain therapy. The array is made of 200 single elements corresponding to a new generation of high power transducers developed in collaboration with Imasonic (Besançon, France). Each element has a surface of 0.5 cm² and works at 0.9 MHz central frequency with a maximum intensity of 20 W cm⁻² on the transducer surface. In order to optimize the steering capabilities of the array, several transducer distributions on a spherical surface are simulated: hexagonal, annular, and quasi-random distributions. Using a quasi-random distribution significantly reduces the grating lobes. Furthermore, the simulations show the capability of the quasi-random array to electronically move the focal spot in the vicinity of the geometrical focus (up to ±15 mm). Based on the simulation study, the array is constructed and tested. The skull aberrations are corrected by using a time reversal mirror with amplitude correction achieved thanks to an implantable hydrophone, and a sharp focus is obtained through a human skull. Several lesions are induced in fresh liver and brain samples through human skulls, demonstrating the accuracy and the steering capabilities of the system. PMID:12974575

  13. Disaggregating census data for population mapping using random forests with remotely-sensed and ancillary data.

    PubMed

    Stevens, Forrest R; Gaughan, Andrea E; Linard, Catherine; Tatem, Andrew J

    2015-01-01

    High resolution, contemporary data on human population distributions are vital for measuring impacts of population growth, monitoring human-environment interactions and for planning and policy development. Many methods are used to disaggregate census data and predict population densities for finer scale, gridded population data sets. We present a new semi-automated dasymetric modeling approach that incorporates detailed census and ancillary data in a flexible, "Random Forest" estimation technique. We outline the combination of widely available, remotely-sensed and geospatial data that contribute to the modeled dasymetric weights and then use the Random Forest model to generate a gridded prediction of population density at ~100 m spatial resolution. This prediction layer is then used as the weighting surface to perform dasymetric redistribution of the census counts at a country level. As a case study we compare the new algorithm and its products for three countries (Vietnam, Cambodia, and Kenya) with other common gridded population data production methodologies. We discuss the advantages of the new method and its gains in accuracy and flexibility over those previous approaches. Finally, we outline how this algorithm will be extended to provide freely-available gridded population data sets for Africa, Asia and Latin America.
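
    A minimal sketch of the dasymetric workflow just described, using scikit-learn's random forest on synthetic covariates; the feature set, unit labels and redistribution loop are illustrative assumptions rather than the actual production code. A forest predicts a per-pixel density weight, and each census unit's count is then split across its pixels in proportion to those weights.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    # Synthetic per-pixel covariates (stand-ins for land cover, lights, elevation, ...)
    n_pixels = 5000
    X = rng.normal(size=(n_pixels, 3))
    log_density = 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(scale=0.3, size=n_pixels)

    # Fit the forest and predict a dasymetric weight for every pixel
    rf = RandomForestRegressor(n_estimators=200, oob_score=True, random_state=0)
    rf.fit(X, log_density)
    weights = np.exp(rf.predict(X))

    # Redistribute each census unit's total in proportion to the predicted weights
    census_unit = rng.integers(0, 10, n_pixels)       # which unit each pixel belongs to
    unit_totals = {u: 1000.0 for u in range(10)}      # population count per unit
    population = np.zeros(n_pixels)
    for u, total in unit_totals.items():
        mask = census_unit == u
        population[mask] = total * weights[mask] / weights[mask].sum()

    print("out-of-bag R^2:", round(rf.oob_score_, 3))
    ```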

  14. High power transcranial beam steering for ultrasonic brain therapy

    NASA Astrophysics Data System (ADS)

    Pernot, M.; Aubry, J.-F.; Tanter, M.; Thomas, J.-L.; Fink, M.

    2003-08-01

    A sparse phased array is specially designed for non-invasive ultrasound transskull brain therapy. The array is made of 200 single elements corresponding to a new generation of high power transducers developed in collaboration with Imasonic (Besançon, France). Each element has a surface of 0.5 cm² and works at 0.9 MHz central frequency with a maximum 20 W cm⁻² intensity on the transducer surface. In order to optimize the steering capabilities of the array, several transducer distributions on a spherical surface are simulated: hexagonal, annular and quasi-random distributions. Using a quasi-random distribution significantly reduces the grating lobes. Furthermore, the simulations show the capability of the quasi-random array to electronically move the focal spot in the vicinity of the geometrical focus (up to ±15 mm). Based on the simulation study, the array is constructed and tested. The skull aberrations are corrected by using a time reversal mirror with amplitude correction achieved thanks to an implantable hydrophone, and a sharp focus is obtained through a human skull. Several lesions are induced in fresh liver and brain samples through human skulls, demonstrating the accuracy and the steering capabilities of the system.

  15. Application of a haematopoetic progenitor cell-targeted adeno-associated viral (AAV) vector established by selection of an AAV random peptide library on a leukaemia cell line

    PubMed Central

    Stiefelhagen, Marius; Sellner, Leopold; Kleinschmidt, Jürgen A; Jauch, Anna; Laufs, Stephanie; Wenz, Frederik; Zeller, W Jens; Fruehauf, Stefan; Veldwijk, Marlon R

    2008-01-01

    Background For many promising target cells (e.g. haematopoietic progenitors), the susceptibility to standard adeno-associated viral (AAV) vectors is low. Advancements in vector development now allow the generation of target cell-selected AAV capsid mutants. Methods To determine its suitability, the method was applied to a chronic myelogenous leukaemia (CML) cell line (K562) to obtain a CML-targeted vector, and the resulting vectors were tested on leukaemia, non-leukaemia, primary human CML and CD34+ peripheral blood progenitor cells (PBPC); standard AAV2 and a random capsid mutant vector served as controls. Results Transduction of CML (BV173, EM3, K562 and Lama84) and AML (HL60 and KG1a) cell lines with the capsid mutants resulted in an up to 36-fold increase in CML transduction efficiency (K562: 2-fold, 60% ± 2% green fluorescent protein (GFP)+ cells; BV173: 9-fold, 37% ± 2% GFP+ cells; Lama84: 36-fold, 29% ± 2% GFP+ cells) compared to controls. For AML (KG1a, HL60) and one CML cell line (EM3), no significant transduction (<1% GFP+ cells) was observed for any vector. Although the capsid mutant clone was established on a cell line, proof-of-principle experiments using primary human cells were performed. For CML (3.2-fold, mutant: 1.75% ± 0.45% GFP+ cells, p = 0.03) and PBPC (3.5-fold, mutant: 4.21% ± 3.40% GFP+ cells) a moderate increase in gene transfer of the capsid mutant compared to control vectors was observed. Conclusion Using an AAV random peptide library on a CML cell line, we were able to generate a capsid mutant, which transduced CML cell lines and primary human haematopoietic progenitor cells with higher efficiency than standard recombinant AAV vectors. PMID:18789140

  16. Towards a high-speed quantum random number generator

    NASA Astrophysics Data System (ADS)

    Stucki, Damien; Burri, Samuel; Charbon, Edoardo; Chunnilall, Christopher; Meneghetti, Alessio; Regazzoni, Francesco

    2013-10-01

    Randomness is of fundamental importance in various fields, such as cryptography, numerical simulations, or the gaming industry. Quantum physics, which is fundamentally probabilistic, is the best option for a physical random number generator. In this article, we will present the work carried out in various projects in the context of the development of a commercial and certified high speed random number generator.

  17. Generating variable and random schedules of reinforcement using Microsoft Excel macros.

    PubMed

    Bancroft, Stacie L; Bourret, Jason C

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time. Generating schedule values for variable and random reinforcement schedules can be difficult. The present article describes the steps necessary to write macros in Microsoft Excel that will generate variable-ratio, variable-interval, variable-time, random-ratio, random-interval, and random-time reinforcement schedule values.
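
    The same schedule values can of course be produced outside Excel. The short sketch below is a hedged Python equivalent for two of the schedule types named above: a variable-ratio list whose values scatter around a target mean, and a random-interval list drawn from an exponential distribution so that reinforcement becomes available with a constant probability per unit time. Function names and parameter choices are illustrative, not taken from the article's macros.

    ```python
    import numpy as np

    def variable_ratio(mean_ratio, n, rng):
        """VR schedule: integer ratios in 1 .. 2*mean_ratio - 1, averaging mean_ratio."""
        return rng.integers(1, 2 * mean_ratio, size=n)

    def random_interval(mean_interval_s, n, rng):
        """RI schedule: exponential gaps give a constant probability of availability
        per unit time (the defining property of a random-interval schedule)."""
        return rng.exponential(mean_interval_s, size=n)

    rng = np.random.default_rng(0)
    print(variable_ratio(5, 10, rng))
    print(np.round(random_interval(30.0, 10, rng), 1))
    ```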

  18. Markov and non-Markov processes in complex systems by the dynamical information entropy

    NASA Astrophysics Data System (ADS)

    Yulmetyev, R. M.; Gafarov, F. M.

    1999-12-01

    We consider the Markov and non-Markov processes in complex systems by the dynamical information Shannon entropy (DISE) method. The influence and important role of the two mutually dependent channels of entropy alternation (creation or generation of correlation) and anti-correlation (destroying or annihilation of correlation) have been discussed. The developed method has been used for the analysis of complex systems of various natures: slow neutron scattering in liquid cesium, psychology (short-term numeral and pattern human memory and the effect of stress on the dynamical tapping test), random dynamics of RR-intervals in human ECG (the problem of diagnosing various diseases of the human cardiovascular system), and the chaotic dynamics of the parameters of financial markets and ecological systems.

  19. Utilisation of ISA Reverse Genetics and Large-Scale Random Codon Re-Encoding to Produce Attenuated Strains of Tick-Borne Encephalitis Virus within Days.

    PubMed

    de Fabritus, Lauriane; Nougairède, Antoine; Aubry, Fabien; Gould, Ernest A; de Lamballerie, Xavier

    2016-01-01

    Large-scale codon re-encoding is a new method of attenuating RNA viruses. However, the use of infectious clones to generate attenuated viruses has inherent technical problems. We previously developed a bacterium-free reverse genetics protocol, designated ISA, and have now combined it with a large-scale random codon re-encoding method to produce attenuated tick-borne encephalitis virus (TBEV), a pathogenic flavivirus which causes febrile illness and encephalitis in humans. We produced wild-type (WT) and two re-encoded TBEVs, containing 273 or 273+284 synonymous mutations in the NS5 and NS5+NS3 coding regions respectively. Both re-encoded viruses were attenuated when compared with WT virus using a laboratory mouse model, and the relative level of attenuation increased with the degree of re-encoding. Moreover, all infected animals produced neutralizing antibodies. This novel, rapid and efficient approach to engineering attenuated viruses could potentially expedite the development of safe and effective new-generation live attenuated vaccines.

  20. Dynamic Loads Generation for Multi-Point Vibration Excitation Problems

    NASA Technical Reports Server (NTRS)

    Shen, Lawrence

    2011-01-01

    A random-force method has been developed to predict dynamic loads produced by rocket-engine random vibrations for new rocket-engine designs. The method develops random forces at multiple excitation points based on random vibration environments scaled from accelerometer data obtained during hot-fire tests of existing rocket engines. This random-force method applies random forces to the model and creates expected dynamic response in a manner that simulates the way the operating engine applies self-generated random vibration forces (random pressure acting on an area) with the resulting responses that we measure with accelerometers. This innovation includes the methodology (implementation sequence), the computer code, two methods to generate the random-force vibration spectra, and two methods to reduce some of the inherent conservatism in the dynamic loads. This methodology would be implemented to generate the random-force spectra at excitation nodes without requiring the use of artificial boundary conditions in a finite element model. More accurate random dynamic loads than those predicted by current industry methods can then be generated using the random force spectra. The scaling method used to develop the initial power spectral density (PSD) environments for deriving the random forces for the rocket engine case is based on the Barrett Criteria developed at Marshall Space Flight Center in 1963. This invention approach can be applied in the aerospace, automotive, and other industries to obtain reliable dynamic loads and responses from a finite element model for any structure subject to multipoint random vibration excitations.

  1. Secure uniform random-number extraction via incoherent strategies

    NASA Astrophysics Data System (ADS)

    Hayashi, Masahito; Zhu, Huangjun

    2018-01-01

    To guarantee the security of uniform random numbers generated by a quantum random-number generator, we study secure extraction of uniform random numbers when the environment of a given quantum state is controlled by the third party, the eavesdropper. Here we restrict our operations to incoherent strategies that are composed of the measurement on the computational basis and incoherent operations (or incoherence-preserving operations). We show that the maximum secure extraction rate is equal to the relative entropy of coherence. By contrast, the coherence of formation gives the extraction rate when a certain constraint is imposed on the eavesdropper's operations. The condition under which the two extraction rates coincide is then determined. Furthermore, we find that the exponential decreasing rate of the leaked information is characterized by Rényi relative entropies of coherence. These results clarify the power of incoherent strategies in random-number generation, and can be applied to guarantee the quality of random numbers generated by a quantum random-number generator.
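
    A brief aside on the central quantity, stated here from the standard textbook definition rather than transcribed from the paper: the relative entropy of coherence of a state ρ with respect to the computational (incoherent) basis, which the abstract identifies with the maximum secure extraction rate, is

    ```latex
    C_r(\rho) = S\bigl(\Delta(\rho)\bigr) - S(\rho),
    \qquad
    \Delta(\rho) = \sum_i \langle i|\rho|i\rangle\, |i\rangle\langle i| ,
    \qquad
    S(\sigma) = -\operatorname{Tr}(\sigma \log \sigma),
    ```

    where Δ is the completely dephasing map onto the computational basis and S is the von Neumann entropy.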

  2. Chemical name extraction based on automatic training data generation and rich feature set.

    PubMed

    Yan, Su; Spangler, W Scott; Chen, Ying

    2013-01-01

    The automation of extracting chemical names from text has significant value to biomedical and life science research. A major barrier in this task is the difficulty of obtaining a sizable, good-quality dataset to train a reliable entity extraction model. Another difficulty is the selection of informative features of chemical names, since comprehensive domain knowledge on chemistry nomenclature is required. Leveraging random text generation techniques, we explore the idea of automatically creating training sets for the task of chemical name extraction. Assuming the availability of an incomplete list of chemical names, called a dictionary, we are able to generate well-controlled, random, yet realistic chemical-like training documents. We statistically analyze the construction of chemical names based on the incomplete dictionary, and propose a series of new features, without relying on any domain knowledge. Compared to state-of-the-art models learned from manually labeled data and domain knowledge, our solution shows better or comparable results in annotating real-world data with less human effort. Moreover, we report an interesting observation about the language for chemical names. That is, both the structural and semantic components of chemical names follow a Zipfian distribution, which resembles many natural languages.

  3. Certified randomness in quantum physics.

    PubMed

    Acín, Antonio; Masanes, Lluis

    2016-12-07

    The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.

  4. The BIG Bell Test: quantum physics experiments with direct public participation

    NASA Astrophysics Data System (ADS)

    Mitchell, Morgan; Abellan, Carlos; Tura, Jordi; Garcia Matos, Marta; Hirschmann, Alina; Beduini, Federica; Pruneri, Valerio; Acin, Antonio; Marti, Maria; BIG Bell Test Collaboration

    The BIG Bell Test is a suite of physics experiments - tests of quantum nonlocality, quantum communications, and related experiments - that use crowd-sourced human randomness as an experimental resource. By connecting participants - anyone with an internet connection - to state-of-the-art experiments on five continents, the project aims at two complementary goals: 1) to provide bits generated directly from human choices, a unique information resource, to physics experiments, and 2) to give the world public the opportunity to contribute in a meaningful way to quantum physics research. We also describe related outreach and educational efforts to spread awareness of quantum physics and its applications.

  5. Problems with the random number generator RANF implemented on the CDC cyber 205

    NASA Astrophysics Data System (ADS)

    Kalle, Claus; Wansleben, Stephan

    1984-10-01

    We show that using RANF may lead to wrong results when lattice models are simulated by Monte Carlo methods. We present a shift-register sequence random number generator which generates two random numbers per cycle on a two pipe CDC Cyber 205.
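
    A generalized feedback shift-register generator of the kind proposed can be sketched in a few lines of Python; the (250, 103) lags below are the classic Kirkpatrick-Stoll choice and, like the 32-bit word size and the seeding, are illustrative assumptions rather than the parameters used on the Cyber 205.

    ```python
    import random

    class ShiftRegisterRNG:
        """R(p, q) generalized feedback shift register: x[n] = x[n-p] XOR x[n-q]."""

        def __init__(self, p=250, q=103, seed=12345, bits=32):
            seeder = random.Random(seed)              # used only to fill the initial register
            self.state = [seeder.getrandbits(bits) for _ in range(p)]
            self.p, self.q = p, q
            self.norm = float(1 << bits)

        def next_word(self):
            x = self.state[-self.p] ^ self.state[-self.q]
            self.state.append(x)
            del self.state[0]                         # keep only the last p words
            return x

    gen = ShiftRegisterRNG()
    uniforms = [gen.next_word() / gen.norm for _ in range(5)]   # floats in [0, 1)
    ```

    As the warning about RANF illustrates, any such generator should still be validated against the specific lattice observables being simulated.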

  6. Recommendations and illustrations for the evaluation of photonic random number generators

    NASA Astrophysics Data System (ADS)

    Hart, Joseph D.; Terashima, Yuta; Uchida, Atsushi; Baumgartner, Gerald B.; Murphy, Thomas E.; Roy, Rajarshi

    2017-09-01

    The never-ending quest to improve the security of digital information combined with recent improvements in hardware technology has caused the field of random number generation to undergo a fundamental shift from relying solely on pseudo-random algorithms to employing optical entropy sources. Despite these significant advances on the hardware side, commonly used statistical measures and evaluation practices remain ill-suited to understand or quantify the optical entropy that underlies physical random number generation. We review the state of the art in the evaluation of optical random number generation and recommend a new paradigm: quantifying entropy generation and understanding the physical limits of the optical sources of randomness. In order to do this, we advocate for the separation of the physical entropy source from deterministic post-processing in the evaluation of random number generators and for the explicit consideration of the impact of the measurement and digitization process on the rate of entropy production. We present the Cohen-Procaccia estimate of the entropy rate h(ε, τ) as one way to do this. In order to provide an illustration of our recommendations, we apply the Cohen-Procaccia estimate as well as the entropy estimates from the new NIST draft standards for physical random number generators to evaluate and compare three common optical entropy sources: single photon time-of-arrival detection, chaotic lasers, and amplified spontaneous emission.

  7. Human behavioral complexity peaks at age 25

    PubMed Central

    Brugger, Peter

    2017-01-01

    Random Item Generation tasks (RIG) are commonly used to assess high cognitive abilities such as inhibition or sustained attention. They also draw upon our approximate sense of complexity. A detrimental effect of aging on pseudo-random productions has been demonstrated for some tasks, but little is as yet known about the developmental curve of cognitive complexity over the lifespan. We investigate the complexity trajectory across the lifespan of human responses to five common RIG tasks, using a large sample (n = 3429). Our main finding is that the developmental curve of the estimated algorithmic complexity of responses is similar to what may be expected of a measure of higher cognitive abilities, with a performance peak around 25 and a decline starting around 60, suggesting that RIG tasks yield good estimates of such cognitive abilities. Our study illustrates that very short strings, of only 10 items, are sufficient to have their complexity reliably estimated and to allow the documentation of an age-dependent decline in the approximate sense of complexity. PMID:28406953

  8. Improved methods of DNA extraction from human spermatozoa that mitigate experimentally-induced oxidative DNA damage.

    PubMed

    Xavier, Miguel J; Nixon, Brett; Roman, Shaun D; Aitken, Robert John

    2018-01-01

    Current approaches for DNA extraction and fragmentation from mammalian spermatozoa present several challenges for the investigation of the oxidative stress burden carried in the genome of male gametes. Indeed, the potential introduction of oxidative DNA damage induced by reactive oxygen species, reducing agents (dithiothreitol or beta-mercaptoethanol), and DNA shearing techniques used in the preparation of samples for chromatin immunoprecipitation and next-generation sequencing serve to confound the reliability and accuracy of the results obtained. Here we report optimised methodology that minimises, or completely eliminates, exposure to DNA damaging compounds during extraction and fragmentation procedures. Specifically, we show that Micrococcal nuclease (MNase) digestion prior to cellular lysis generates a greater DNA yield with minimal collateral oxidation while randomly fragmenting the entire paternal genome. This modified methodology represents a significant improvement over traditional fragmentation achieved via sonication in the preparation of genomic DNA from human spermatozoa for downstream applications, such as next-generation sequencing. We also present a redesigned bioinformatic pipeline framework adjusted to correctly analyse this form of data and detect statistically relevant targets of oxidation.

  9. Generating Variable and Random Schedules of Reinforcement Using Microsoft Excel Macros

    PubMed Central

    Bancroft, Stacie L; Bourret, Jason C

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time. Generating schedule values for variable and random reinforcement schedules can be difficult. The present article describes the steps necessary to write macros in Microsoft Excel that will generate variable-ratio, variable-interval, variable-time, random-ratio, random-interval, and random-time reinforcement schedule values. PMID:18595286

  10. The Metabolic and Endocrine Response and Health Implications of Consuming Sugar-Sweetened Beverages: Findings From Recent Randomized Controlled Trials

    PubMed Central

    Rippe, James M.

    2013-01-01

    Fructose-containing sugars, including fructose itself, high fructose corn syrup (HFCS), and sucrose have engendered considerable controversy. The effects of HFCS and sucrose in sugar-sweetened beverages, in particular, have generated intense scientific debate that has spilled over to the public. This controversy is related to well-known differences in metabolism between fructose and glucose in the liver. In addition, research studies have often been conducted comparing pure fructose and pure glucose even though neither is consumed to any appreciable degree in isolation in the human diet. Other evidence has been drawn from animal studies and epidemiologic or cohort studies. Few randomized controlled trials (RCTs) have compared HFCS with sucrose (the 2 sugars most commonly consumed in the human diet) at dosage amounts within the normal human consumption range. This review compares results of recently concluded RCTs with other forms of evidence related to fructose, HFCS, and sucrose. We conclude that great caution must be used when suggesting adverse health effects of consuming these sugars in the normal way they are consumed and at the normal amounts in the human diet, because RCTs do not support adverse health consequences at these doses when employing these sugars. PMID:24228199

  11. Programmable random interval generator

    NASA Technical Reports Server (NTRS)

    Lindsey, R. S., Jr.

    1973-01-01

    Random pulse generator can supply constant-amplitude randomly distributed pulses with average rate ranging from a few counts per second to more than one million counts per second. Generator requires no high-voltage power supply or any special thermal cooling apparatus. Device is uniquely versatile and provides wide dynamic range of operation.
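
    In software terms, constant-amplitude pulses that are randomly distributed in time at a prescribed average rate form a Poisson process. The sketch below (a simulation of the output statistics, not of the hardware itself) draws exponential inter-pulse gaps and accumulates them into pulse times.

    ```python
    import numpy as np

    def poisson_pulse_times(rate_hz, duration_s, rng=None):
        """Arrival times of a Poisson pulse train: exponential gaps at the mean rate."""
        rng = np.random.default_rng() if rng is None else rng
        n_draw = int(rate_hz * duration_s * 1.5) + 10   # over-draw, then trim to duration
        gaps = rng.exponential(1.0 / rate_hz, size=n_draw)
        times = np.cumsum(gaps)
        return times[times < duration_s]

    pulses = poisson_pulse_times(rate_hz=1e4, duration_s=0.01, rng=np.random.default_rng(7))
    ```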

  12. Low-contrast lesion detection in tomosynthetic breast imaging using a realistic breast phantom

    NASA Astrophysics Data System (ADS)

    Zhou, Lili; Oldan, Jorge; Fisher, Paul; Gindi, Gene

    2006-03-01

    Tomosynthesis mammography is a potentially valuable technique for detection of breast cancer. In this simulation study, we investigate the efficacy of three different tomographic reconstruction methods, EM, SART and Backprojection, in the context of an especially difficult mammographic detection task. The task is the detection of a very low-contrast mass embedded in very dense fibro-glandular tissue - a clinically useful task for which tomosynthesis may be well suited. The project uses an anatomically realistic 3D digital breast phantom whose normal anatomic variability limits lesion conspicuity. In order to capture anatomical object variability, we generate an ensemble of phantoms, each of which comprises random instances of various breast structures. We construct medium-sized 3D breast phantoms which model random instances of ductal structures, fibrous connective tissue, Cooper's ligaments and power law structural noise for small scale object variability. Random instances of 7-8 mm irregular masses are generated by a 3D random walk algorithm and placed in very dense fibro-glandular tissue. Several other components of the breast phantom are held fixed, i.e. not randomly generated. These include the fixed breast shape and size, nipple structure, fixed lesion location, and a pectoralis muscle. We collect low-dose data using an isocentric tomosynthetic geometry at 11 angles over 50 degrees and add Poisson noise. The data is reconstructed using the three algorithms. Reconstructed slices through the center of the lesion are presented to human observers in a 2AFC (two-alternative-forced-choice) test that measures detectability by computing AUC (area under the ROC curve). The data collected in each simulation includes two sources of variability, that due to the anatomical variability of the phantom and that due to the Poisson data noise. We found that for this difficult task the AUC value for EM (0.89) was greater than that for SART (0.83) and Backprojection (0.66).

  13. Experimentally generated randomness certified by the impossibility of superluminal signals.

    PubMed

    Bierhorst, Peter; Knill, Emanuel; Glancy, Scott; Zhang, Yanbao; Mink, Alan; Jordan, Stephen; Rommal, Andrea; Liu, Yi-Kai; Christensen, Bradley; Nam, Sae Woo; Stevens, Martin J; Shalm, Lynden K

    2018-04-01

    From dice to modern electronic circuits, there have been many attempts to build better devices to generate random numbers. Randomness is fundamental to security and cryptographic systems and to safeguarding privacy. A key challenge with random-number generators is that it is hard to ensure that their outputs are unpredictable [1-3]. For a random-number generator based on a physical process, such as a noisy classical system or an elementary quantum measurement, a detailed model that describes the underlying physics is necessary to assert unpredictability. Imperfections in the model compromise the integrity of the device. However, it is possible to exploit the phenomenon of quantum non-locality with a loophole-free Bell test to build a random-number generator that can produce output that is unpredictable to any adversary that is limited only by general physical principles, such as special relativity [1-11]. With recent technological developments, it is now possible to carry out such a loophole-free Bell test [12-14,22]. Here we present certified randomness obtained from a photonic Bell experiment and extract 1,024 random bits that are uniformly distributed to within 10⁻¹². These random bits could not have been predicted according to any physical theory that prohibits faster-than-light (superluminal) signalling and that allows independent measurement choices. To certify and quantify the randomness, we describe a protocol that is optimized for devices that are characterized by a low per-trial violation of Bell inequalities. Future random-number generators based on loophole-free Bell tests may have a role in increasing the security and trust of our cryptographic systems and infrastructure.

  14. Emergence of encounter networks due to human mobility.

    PubMed

    Riascos, A P; Mateos, José L

    2017-01-01

    There is a burst of work on human mobility and encounter networks. However, the connection between these two important fields has only recently begun to be explored. It is clear that both are closely related: Mobility generates encounters, and these encounters might give rise to contagion phenomena or even friendship. We model a set of random walkers that visit locations in space following a strategy akin to Lévy flights. We measure the encounters in space and time and establish a link between walkers after they coincide several times. This generates a temporal network that is characterized by global quantities. We compare this dynamics with real data for two cities: New York City and Tokyo. We use data from the location-based social network Foursquare and obtain the emergent temporal encounter network, for these two cities, that we compare with our model. We found long-range (Lévy-like) distributions for traveled distances and time intervals that characterize the emergent social network due to human mobility. Studying this connection is important for several fields like epidemics, social influence, voting, contagion models, behavioral adoption and diffusion of ideas.
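
    A hedged sketch of the mobility ingredient described above: walkers taking Lévy-flight-like steps with power-law (Pareto) step lengths and uniformly random directions, plus a naive count of pairwise encounters within a radius. The exponent, radius and walker count are illustrative choices, not values fitted to the Foursquare data.

    ```python
    import numpy as np

    def levy_walk(n_steps, alpha=1.6, rng=None):
        """2-D walk with heavy-tailed step lengths, P(l) ~ l^(-1-alpha), l >= 1."""
        rng = np.random.default_rng() if rng is None else rng
        lengths = rng.pareto(alpha, n_steps) + 1.0
        angles = rng.uniform(0.0, 2.0 * np.pi, n_steps)
        steps = lengths[:, None] * np.column_stack([np.cos(angles), np.sin(angles)])
        return np.cumsum(steps, axis=0)

    rng = np.random.default_rng(11)
    walkers = [levy_walk(1000, rng=rng) for _ in range(5)]

    # Encounters: time steps at which a pair of walkers is within a given radius.
    radius = 2.0
    encounters = sum(
        int(np.sum(np.linalg.norm(walkers[i] - walkers[j], axis=1) < radius))
        for i in range(5) for j in range(i + 1, 5)
    )
    ```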

  15. Is urbanisation scrambling the genetic structure of human populations? A case study

    PubMed Central

    Ashrafian-Bonab, Maziar; Handley, Lori Lawson; Balloux, François

    2007-01-01

    Recent population expansion and increased migration linked to urbanisation are assumed to be eroding the genetic structure of human populations. We investigated change in population structure over three generations by analysing both demographic and mitochondrial DNA (mtDNA) data from a random sample of 2351 men from twenty-two Iranian populations. Potential changes in genetic diversity (θ) and genetic distance (FST) over the last three generations were analysed by assigning mtDNA sequences to populations based on the individual's place of birth or that of their mother or grandmother. Despite the fact that several areas included cities of over one million inhabitants, we detected no change in genetic diversity, and only a small decrease in population structure, except in the capital city (Tehran), which was characterised by massive immigration, increased θ and a large decrease in FST over time. Our results suggest that recent erosion of human population structure might not be as important as previously thought, except in some large conurbations, and this clearly has important implications for future sampling strategies. PMID:17106453

  16. Pseudo-random number generator for the Sigma 5 computer

    NASA Technical Reports Server (NTRS)

    Carroll, S. N.

    1983-01-01

    A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
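
    The construction described (a prime modulus with a primitive-root multiplier) is the Lehmer form of the linear congruential generator. The sketch below uses the well-known Park-Miller "minimal standard" parameters m = 2^31 - 1 and a = 16807 as a stand-in, since the Sigma 5 word length and the root selected in the report are not given here.

    ```python
    class LehmerRNG:
        """Multiplicative linear congruential generator: x <- a*x mod m,
        with m prime and a a primitive root modulo m."""

        M = 2**31 - 1      # the Mersenne prime 2147483647
        A = 16807          # a primitive root mod M (Park-Miller "minimal standard")

        def __init__(self, seed=1):
            if not 0 < seed < self.M:
                raise ValueError("seed must lie strictly between 0 and M")
            self.x = seed

        def next_uniform(self):
            self.x = (self.A * self.x) % self.M
            return self.x / self.M          # uniform float in (0, 1)

    gen = LehmerRNG(seed=42)
    values = [gen.next_uniform() for _ in range(3)]
    ```

    Marsaglia's lattice test, mentioned above, examines how k-tuples of successive outputs fall on families of hyperplanes; different primitive roots give lattices of different quality, which is the basis for the selection described in the report.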

  17. Implementation of a quantum random number generator based on the optimal clustering of photocounts

    NASA Astrophysics Data System (ADS)

    Balygin, K. A.; Zaitsev, V. I.; Klimov, A. N.; Kulik, S. P.; Molotkov, S. N.

    2017-10-01

    To implement quantum random number generators, it is fundamentally important to have a mathematically provable and experimentally testable process of measurements of a system from which an initial random sequence is generated. This ensures that the randomness indeed has a quantum nature. A quantum random number generator has been implemented with the use of the detection of quasi-single-photon radiation by a silicon photomultiplier (SiPM) matrix, which makes it possible to reliably reach the Poisson statistics of photocounts. The choice and use of the optimal clustering of photocounts for the initial sequence of photodetection events, together with a method of extraction of a random sequence of 0's and 1's that is polynomial in the length of the sequence, have made it possible to reach a yield of 64 Mbit/s for the provably random output sequence.
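
    The optimal clustering step itself is not described in the abstract, so the sketch below shows only a generic back end under stated assumptions: Poisson-distributed photocounts are turned into raw bits by parity, and residual bias is removed with von Neumann's pairing rule. Both are standard techniques and are not the authors' polynomial-time extractor.

    ```python
    import numpy as np

    def von_neumann(bits):
        """Von Neumann debiasing: emit 0 for a (0,1) pair, 1 for a (1,0) pair, drop the rest."""
        return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

    rng = np.random.default_rng(5)
    counts = rng.poisson(lam=4.2, size=100_000)     # simulated photocounts per time bin
    raw_bits = (counts % 2).tolist()                # parity of each count (slightly biased)
    out_bits = von_neumann(raw_bits)
    ```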

  18. Spontaneous Generation of Infectious Prion Disease in Transgenic Mice

    PubMed Central

    Castilla, Joaquín; Pintado, Belén; Gutiérrez-Adan, Alfonso; Andréoletti, Olivier; Aguilar-Calvo, Patricia; Arroba, Ana-Isabel; Parra-Arrondo, Beatriz; Ferrer, Isidro; Manzanares, Jorge; Espinosa, Juan-Carlos

    2013-01-01

    We generated transgenic mice expressing bovine cellular prion protein (PrPC) with a leucine substitution at codon 113 (113L). This protein is homologous to human protein with mutation 102L, and its genetic link with Gerstmann–Sträussler–Scheinker syndrome has been established. This mutation in bovine PrPC causes a fully penetrant, lethal, spongiform encephalopathy. This genetic disease was transmitted by intracerebral inoculation of brain homogenate from ill mice expressing mutant bovine PrP to mice expressing wild-type bovine PrP, which indicated de novo generation of infectious prions. Our findings demonstrate that a single amino acid change in the PrPC sequence can induce spontaneous generation of an infectious prion disease that differs from all others identified in hosts expressing the same PrPC sequence. These observations support the view that a variety of infectious prion strains might spontaneously emerge in hosts displaying random genetic PrPC mutations. PMID:24274622

  19. CrowdPhase: crowdsourcing the phase problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jorda, Julien; Sawaya, Michael R.; Yeates, Todd O., E-mail: yeates@mbi.ucla.edu

    The idea of attacking the phase problem by crowdsourcing is introduced. Using an interactive, multi-player, web-based system, participants work simultaneously to select phase sets that correspond to better electron-density maps in order to solve low-resolution phasing problems. The human mind innately excels at some complex tasks that are difficult to solve using computers alone. For complex problems amenable to parallelization, strategies can be developed to exploit human intelligence in a collective form: such approaches are sometimes referred to as ‘crowdsourcing’. Here, a first attempt at a crowdsourced approach for low-resolution ab initio phasing in macromolecular crystallography is proposed. A collaborative online game named CrowdPhase was designed, which relies on a human-powered genetic algorithm, where players control the selection mechanism during the evolutionary process. The algorithm starts from a population of ‘individuals’, each with a random genetic makeup, in this case a map prepared from a random set of phases, and tries to cause the population to evolve towards individuals with better phases based on Darwinian survival of the fittest. Players apply their pattern-recognition capabilities to evaluate the electron-density maps generated from these sets of phases and to select the fittest individuals. A user-friendly interface, a training stage and a competitive scoring system foster a network of well trained players who can guide the genetic algorithm towards better solutions from generation to generation via gameplay. CrowdPhase was applied to two synthetic low-resolution phasing puzzles and it was shown that players could successfully obtain phase sets in the 30° phase error range and corresponding molecular envelopes showing agreement with the low-resolution models. The successful preliminary studies suggest that with further development the crowdsourcing approach could fill a gap in current crystallographic methods by making it possible to extract meaningful information in cases where limited resolution might otherwise prevent initial phasing.

  20. Monitoring fibrous scaffold guidance of three-dimensional collagen organisation using minimally-invasive second harmonic generation.

    PubMed

    Delaine-Smith, Robin M; Green, Nicola H; Matcher, Stephen J; MacNeil, Sheila; Reilly, Gwendolen C

    2014-01-01

    The biological and mechanical function of connective tissues is largely determined by controlled cellular alignment and therefore it seems appropriate that tissue-engineered constructs should be architecturally similar to the in vivo tissue targeted for repair or replacement. Collagen organisation dictates the tensile properties of most tissues and so monitoring the deposition of cell-secreted collagen as the construct develops is essential for understanding tissue formation. In this study, electrospun fibres with a random or high degree of orientation, mimicking two types of tissue architecture found in the body, were used to culture human fibroblasts for controlling cell alignment. The minimally-invasive technique of second harmonic generation was used with the aim of monitoring and profiling the deposition and organisation of collagen at different construct depths over time while construct mechanical properties were also determined over the culture period. It was seen that scaffold fibre organisation affected cell migration and orientation up to 21 days which in turn had an effect on collagen organisation. Collagen in random fibrous constructs was deposited in alternating configurations at different depths however a high degree of organisation was observed throughout aligned fibrous constructs orientated in the scaffold fibre direction. Three-dimensional second harmonic generation images showed that deposited collagen was more uniformly distributed in random constructs but aligned constructs were more organised and had higher intensities. The tensile properties of all constructs increased with increasing collagen deposition and were ultimately dictated by collagen organisation. This study highlights the importance of scaffold architecture for controlling the development of well-organised tissue engineered constructs and the usefulness of second harmonic generation imaging for monitoring collagen maturation in a minimally invasive manner.

  1. Study of Randomness in AES Ciphertexts Produced by Randomly Generated S-Boxes and S-Boxes with Various Modulus and Additive Constant Polynomials

    NASA Astrophysics Data System (ADS)

    Das, Suman; Sadique Uz Zaman, J. K. M.; Ghosh, Ranjan

    2016-06-01

    In the Advanced Encryption Standard (AES), the standard S-Box is conventionally generated by using a particular irreducible polynomial {11B} in GF(2⁸) as the modulus and a particular additive constant polynomial {63} in GF(2), though it can be generated by many other polynomials. In this paper, it is shown that it is possible to generate secure AES S-Boxes by using other selected modulus and additive polynomials, and that such S-Boxes can also be generated randomly, using a PRNG like BBS. A comparative study has been made on the randomness of the corresponding AES ciphertexts, using these S-Boxes, by the NIST Test Suite coded for this paper. It has been found that, besides the standard one, other moduli and additive constants are also able to generate equally random or more random ciphertexts; the same is true for random S-Boxes. As these new types of S-Boxes are user-defined, hence unknown, they are able to resist linear and differential cryptanalysis. Moreover, they act as additional key inputs to AES, thus increasing the key space.
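
    A hedged sketch of the S-box construction being varied here: each byte is replaced by its multiplicative inverse in GF(2⁸), computed modulo a chosen irreducible polynomial, followed by the AES affine transformation with a chosen additive constant. With modulus {11B} and constant {63} this reproduces the standard AES S-box; substituting another irreducible modulus or constant yields the alternative S-boxes studied in the paper (the brute-force inverse below is for clarity only).

    ```python
    def gf_mul(a, b, modulus):
        """Multiply two bytes as polynomials over GF(2), reduced modulo `modulus`."""
        result = 0
        while b:
            if b & 1:
                result ^= a
            a <<= 1
            if a & 0x100:
                a ^= modulus              # reduce whenever the degree reaches 8
            b >>= 1
        return result

    def gf_inverse(a, modulus):
        """Multiplicative inverse in GF(2^8), found by exhaustive search (256 elements)."""
        if a == 0:
            return 0                      # AES maps 0 to 0 before the affine step
        for x in range(1, 256):
            if gf_mul(a, x, modulus) == 1:
                return x
        raise ValueError("modulus is not irreducible over GF(2)")

    def make_sbox(modulus=0x11B, constant=0x63):
        """Build an AES-style S-box: GF(2^8) inverse, then the AES affine transform."""
        sbox = []
        for byte in range(256):
            inv = gf_inverse(byte, modulus)
            y = 0
            for bit in range(8):          # b'_i = b_i ^ b_{i+4} ^ b_{i+5} ^ b_{i+6} ^ b_{i+7} ^ c_i
                y |= (((inv >> bit) ^ (inv >> ((bit + 4) % 8)) ^ (inv >> ((bit + 5) % 8))
                       ^ (inv >> ((bit + 6) % 8)) ^ (inv >> ((bit + 7) % 8))
                       ^ (constant >> bit)) & 1) << bit
            sbox.append(y)
        return sbox

    assert make_sbox()[0x53] == 0xED      # matches the standard AES S-box entry for {53}
    ```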

  2. A Human-Automation Interface Model to Guide Automation of System Functions: A Way to Achieve Manning Goals in New Systems

    DTIC Science & Technology

    2006-06-01

    … models generated for this thesis were set to run for 60 minutes. To run the simulation for the set time, the analyst provides a random number seed … The IMPRINT workload value of 60 has been used by a consensus of workload modeling SMEs to represent the 'high' threshold …

  3. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  4. Harvesting Entropy for Random Number Generation for Internet of Things Constrained Devices Using On-Board Sensors

    PubMed Central

    Pawlowski, Marcin Piotr; Jara, Antonio; Ogorzalek, Maciej

    2015-01-01

    Entropy in computer security is associated with the unpredictability of a source of randomness. The random source with high entropy tends to achieve a uniform distribution of random values. Random number generators are one of the most important building blocks of cryptosystems. In constrained devices of the Internet of Things ecosystem, high entropy random number generators are hard to achieve due to hardware limitations. For the purpose of the random number generation in constrained devices, this work proposes a solution based on the least-significant bits concatenation entropy harvesting method. As a potential source of entropy, on-board integrated sensors (i.e., temperature, humidity and two different light sensors) have been analyzed. Additionally, the costs (i.e., time and memory consumption) of the presented approach have been measured. The results obtained from the proposed method with statistical fine tuning achieved a Shannon entropy of around 7.9 bits per byte of data for temperature and humidity sensors. The results showed that sensor-based random number generators are a valuable source of entropy with very small RAM and Flash memory requirements for constrained devices of the Internet of Things. PMID:26506357
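
    A minimal sketch of the least-significant-bits concatenation idea, assuming a hypothetical read_sensor() that returns a raw ADC sample; the number of bits kept per sample is arbitrary here, and the statistical fine tuning described in the abstract is omitted.

```python
import random  # stand-in for real sensor I/O on a constrained device

def read_sensor():
    """Hypothetical raw 12-bit ADC reading (e.g., temperature or humidity)."""
    return random.getrandbits(12)

def harvest_bytes(n_bytes, bits_per_sample=2):
    """Concatenate the least-significant bits of successive samples into bytes."""
    out = bytearray()
    acc, n_acc = 0, 0
    while len(out) < n_bytes:
        sample = read_sensor()
        acc = (acc << bits_per_sample) | (sample & ((1 << bits_per_sample) - 1))
        n_acc += bits_per_sample
        if n_acc >= 8:
            n_acc -= 8
            out.append((acc >> n_acc) & 0xFF)
    return bytes(out)
```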

  5. Harvesting entropy for random number generation for internet of things constrained devices using on-board sensors.

    PubMed

    Pawlowski, Marcin Piotr; Jara, Antonio; Ogorzalek, Maciej

    2015-10-22

    Entropy in computer security is associated with the unpredictability of a source of randomness. The random source with high entropy tends to achieve a uniform distribution of random values. Random number generators are one of the most important building blocks of cryptosystems. In constrained devices of the Internet of Things ecosystem, high entropy random number generators are hard to achieve due to hardware limitations. For the purpose of the random number generation in constrained devices, this work proposes a solution based on the least-significant bits concatenation entropy harvesting method. As a potential source of entropy, on-board integrated sensors (i.e., temperature, humidity and two different light sensors) have been analyzed. Additionally, the costs (i.e., time and memory consumption) of the presented approach have been measured. The results obtained from the proposed method with statistical fine tuning achieved a Shannon entropy of around 7.9 bits per byte of data for temperature and humidity sensors. The results showed that sensor-based random number generators are a valuable source of entropy with very small RAM and Flash memory requirements for constrained devices of the Internet of Things.

  6. Truly random number generation: an example

    NASA Astrophysics Data System (ADS)

    Frauchiger, Daniela; Renner, Renato

    2013-10-01

    Randomness is crucial for a variety of applications, ranging from gambling to computer simulations, and from cryptography to statistics. However, many of the currently used methods for generating randomness do not meet the criteria that are necessary for these applications to work properly and safely. A common problem is that a sequence of numbers may look random but nevertheless not be truly random. In fact, the sequence may pass all standard statistical tests and yet be perfectly predictable. This renders it useless for many applications. For example, in cryptography, the predictability of a "randomly" chosen password is obviously undesirable. Here, we review a recently developed approach to generating true, and hence unpredictable, randomness.

  7. A data-driven wavelet-based approach for generating jumping loads

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Li, Guo; Racic, Vitomir

    2018-06-01

    This paper suggests an approach to generate human jumping loads using wavelet transform and a database of individual jumping force records. A total of 970 individual jumping force records of various frequencies were first collected by three experiments from 147 test subjects. For each record, every jumping pulse was extracted and decomposed into seven levels by wavelet transform. All the decomposition coefficients were stored in an information database. Probability distributions of jumping cycle period, contact ratio and energy of the jumping pulse were statistically analyzed. Inspired by the theory of DNA recombination, an approach was developed by interchanging the wavelet coefficients between different jumping pulses. To generate a jumping force time history with N pulses, wavelet coefficients were first selected randomly from the database at each level. They were then used to reconstruct N pulses by the inverse wavelet transform. Jumping cycle periods and contact ratios were then generated randomly based on their probability distributions. These parameters were assigned to each of the N pulses, which were in turn scaled by the amplitude factors βi to account for the energy relationship between successive pulses. The final jumping force time history was obtained by linking all the N cycles end to end. This simulation approach preserves the non-stationary features of the jumping load in the time-frequency domain. Application indicates that this approach can be used to generate jumping force time histories due to a single person jumping and can be extended further to stochastic jumping loads due to groups and crowds.
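
    A minimal sketch of the coefficient-recombination step, assuming all pulses have been resampled to a common length (e.g., 1024 samples) and using the 'db4' wavelet; neither choice comes from the paper, and the random cycle-period, contact-ratio and amplitude-factor steps are omitted.

```python
import numpy as np
import pywt  # PyWavelets

def recombine_pulse(pulses, wavelet='db4', level=7, rng=None):
    """Synthesise one pulse by mixing wavelet coefficients of recorded pulses,
    in the spirit of the DNA-recombination idea described in the abstract."""
    rng = rng or np.random.default_rng()
    bank = [pywt.wavedec(p, wavelet, level=level) for p in pulses]
    # for each decomposition level, take the coefficients of a randomly chosen pulse
    mixed = [bank[rng.integers(len(bank))][lvl] for lvl in range(level + 1)]
    return pywt.waverec(mixed, wavelet)
```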

  8. DNA-based random number generation in security circuitry.

    PubMed

    Gearheart, Christy M; Arazi, Benjamin; Rouchka, Eric C

    2010-06-01

    DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. This research focuses on further developing DNA-based methodologies to mimic digital data manipulation. While exhibiting fundamental principles, this work was done in conjunction with the vision that DNA-based circuitry, when the technology matures, will form the basis for a tamper-proof security module, revolutionizing the meaning and concept of tamper-proofing and possibly preventing it altogether based on accurate scientific observations. A paramount part of such a solution would be self-generation of random numbers. A novel prototype schema employs solid phase synthesis of oligonucleotides for random construction of DNA sequences; temporary storage and retrieval is achieved through plasmid vectors. A discussion of how to evaluate sequence randomness is included, as well as how these techniques are applied to a simulation of the random number generation circuitry. Simulation results show generated sequences successfully pass three selected NIST random number generation tests specified for security applications.

  9. Fast physical-random number generation using laser diode's frequency noise: influence of frequency discriminator

    NASA Astrophysics Data System (ADS)

    Matsumoto, Kouhei; Kasuya, Yuki; Yumoto, Mitsuki; Arai, Hideaki; Sato, Takashi; Sakamoto, Shuichi; Ohkawa, Masashi; Ohdaira, Yasuo

    2018-02-01

    Not so long ago, pseudo-random numbers generated by numerical formulae were considered adequate for encrypting important data files, because of the time needed to decode them. With today's ultra-high-speed processors, however, this is no longer true. So, in order to thwart ever-more advanced attempts to breach a system's protections, cryptologists have devised a method that is considered virtually impossible to decode and that uses what is effectively a limitless supply of physical random numbers. This research describes a method whereby a laser diode's frequency noise generates large quantities of physical random numbers. Using two types of photodetectors (APD and PIN-PD), we tested the abilities of two types of lasers (FP-LD and VCSEL) to generate random numbers. In all instances, an etalon served as the frequency discriminator, the examination pass rates were determined using the NIST FIPS 140-2 test at each bit, and the Random Number Generation (RNG) speed was noted.

  10. Generation of pseudo-random numbers

    NASA Technical Reports Server (NTRS)

    Howell, L. W.; Rheinfurth, M. H.

    1982-01-01

    Practical methods for generating acceptable random numbers from a variety of probability distributions which are frequently encountered in engineering applications are described. The speed, accuracy, and guarantee of statistical randomness of the various methods are discussed.
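
    The report's specific recipes are not given in the abstract; the sketch below shows two textbook techniques of the kind it surveys, inverse-transform sampling for the exponential distribution and the Box-Muller transform for the normal distribution.

```python
import math
import random

def exponential(lam):
    """Inverse-transform sampling: if U ~ Uniform(0,1), then -ln(U)/lam ~ Exp(lam)."""
    u = 1.0 - random.random()  # in (0, 1], avoids log(0)
    return -math.log(u) / lam

def standard_normal():
    """Box-Muller transform: two independent uniforms yield a standard normal deviate."""
    u1 = 1.0 - random.random()
    u2 = random.random()
    return math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)
```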

  11. Ultra-fast quantum randomness generation by accelerated phase diffusion in a pulsed laser diode.

    PubMed

    Abellán, C; Amaya, W; Jofre, M; Curty, M; Acín, A; Capmany, J; Pruneri, V; Mitchell, M W

    2014-01-27

    We demonstrate a high bit-rate quantum random number generator by interferometric detection of phase diffusion in a gain-switched DFB laser diode. Gain switching at few-GHz frequencies produces a train of bright pulses with nearly equal amplitudes and random phases. An unbalanced Mach-Zehnder interferometer is used to interfere subsequent pulses and thereby generate strong random-amplitude pulses, which are detected and digitized to produce a high-rate random bit string. Using established models of semiconductor laser field dynamics, we predict a regime of high visibility interference and nearly complete vacuum-fluctuation-induced phase diffusion between pulses. These are confirmed by measurement of pulse power statistics at the output of the interferometer. Using a 5.825 GHz excitation rate and 14-bit digitization, we observe 43 Gbps quantum randomness generation.

  12. Self-balanced real-time photonic scheme for ultrafast random number generation

    NASA Astrophysics Data System (ADS)

    Li, Pu; Guo, Ya; Guo, Yanqiang; Fan, Yuanlong; Guo, Xiaomin; Liu, Xianglian; Shore, K. Alan; Dubrova, Elena; Xu, Bingjie; Wang, Yuncai; Wang, Anbang

    2018-06-01

    We propose a real-time self-balanced photonic method for extracting ultrafast random numbers from broadband randomness sources. In place of electronic analog-to-digital converters (ADCs), the balanced photo-detection technology is used to directly quantize optically sampled chaotic pulses into a continuous random number stream. Benefitting from ultrafast photo-detection, our method can efficiently eliminate the generation rate bottleneck from electronic ADCs which are required in nearly all the available fast physical random number generators. A proof-of-principle experiment demonstrates that using our approach 10 Gb/s real-time and statistically unbiased random numbers are successfully extracted from a bandwidth-enhanced chaotic source. The generation rate achieved experimentally here is being limited by the bandwidth of the chaotic source. The method described has the potential to attain a real-time rate of 100 Gb/s.

  13. Learning stochastic reward distributions in a speeded pointing task.

    PubMed

    Seydell, Anna; McCann, Brian C; Trommershäuser, Julia; Knill, David C

    2008-04-23

    Recent studies have shown that humans effectively take into account task variance caused by intrinsic motor noise when planning fast hand movements. However, previous evidence suggests that humans have greater difficulty accounting for arbitrary forms of stochasticity in their environment, both in economic decision making and sensorimotor tasks. We hypothesized that humans can learn to optimize movement strategies when environmental randomness can be experienced and thus implicitly learned over several trials, especially if it mimics the kinds of randomness for which subjects might have generative models. We tested the hypothesis using a task in which subjects had to rapidly point at a target region partly covered by three stochastic penalty regions introduced as "defenders." At movement completion, each defender jumped to a new position drawn randomly from fixed probability distributions. Subjects earned points when they hit the target, unblocked by a defender, and lost points otherwise. Results indicate that after approximately 600 trials, subjects approached optimal behavior. We further tested whether subjects simply learned a set of stimulus-contingent motor plans or the statistics of defenders' movements by training subjects with one penalty distribution and then testing them on a new penalty distribution. Subjects immediately changed their strategy to achieve the same average reward as subjects who had trained with the second penalty distribution. These results indicate that subjects learned the parameters of the defenders' jump distributions and used this knowledge to optimally plan their hand movements under conditions involving stochastic rewards and penalties.

  14. Autocorrelation peaks in congruential pseudorandom number generators

    NASA Technical Reports Server (NTRS)

    Neuman, F.; Merrick, R. B.

    1976-01-01

    The complete correlation structure of several congruential pseudorandom number generators (PRNGs) of the same type and small cycle length was studied to deal with the problem of congruential PRNGs almost repeating themselves at intervals smaller than their cycle lengths during simulation of bandpass-filtered normal random noise. Maximum-period multiplicative and mixed congruential generators were studied, with inferences drawn from examination of several tractable members of a class of random number generators, with moduli from 2^5 to 2^9. High correlation is shown to exist in mixed and multiplicative congruential random number generators and prime-modulus Lehmer generators for shifts that are a fraction of their cycle length. The random noise sequences in question are required when simulating electrical noise, air turbulence, or time variation of wind parameters.
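
    The sketch below shows how autocorrelation at a given shift can be computed for a mixed congruential generator; the constants are the classic ANSI C ones, not the small-modulus generators examined in the report.

```python
import numpy as np

def lcg(n, a=1103515245, c=12345, m=2**31, seed=1):
    """Mixed congruential generator x_{k+1} = (a*x_k + c) mod m, mapped to [0, 1)."""
    x, out = seed, np.empty(n)
    for i in range(n):
        x = (a * x + c) % m
        out[i] = x / m
    return out

def autocorr(x, lag):
    """Sample autocorrelation of a sequence at a given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

u = lcg(2**15)
correlations = [(lag, autocorr(u, lag)) for lag in (1, 2**10, 2**14)]
```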

  15. A simple model for the generation of the vestibular evoked myogenic potential (VEMP).

    PubMed

    Wit, Hero P; Kingma, Charlotte M

    2006-06-01

    To describe the mechanism by which the vestibular evoked myogenic potential is generated. Vestibular evoked myogenic potential generation is modeled by adding a large number of muscle motor unit action potentials. These action potentials occur randomly in time along a 100 ms long time axis. But because between approximately 15 and 20 ms after a loud short sound stimulus (almost) no action potentials are generated during VEMP measurements in human subjects, no action potentials are present in the model during this time. The evoked potential is the result of the lack of amplitude cancellation in the averaged surface electromyogram at the edges of this 5 ms long time interval. The relatively simple model describes generation and some properties of the vestibular evoked myogenic potential very well. It is shown that, in contrast with other evoked potentials (BAEPs, VERs), the vestibular evoked myogenic potential is the result of an interruption of activity and not that of summed synchronized neural action potentials.

  16. One Dimensional Turing-Like Handshake Test for Motor Intelligence

    PubMed Central

    Karniel, Amir; Avraham, Guy; Peles, Bat-Chen; Levy-Tzedek, Shelly; Nisky, Ilana

    2010-01-01

    In the Turing test, a computer model is deemed to "think intelligently" if it can generate answers that are not distinguishable from those of a human. However, this test is limited to the linguistic aspects of machine intelligence. A salient function of the brain is the control of movement, and the movement of the human hand is a sophisticated demonstration of this function. Therefore, we propose a Turing-like handshake test, for machine motor intelligence. We administer the test through a telerobotic system in which the interrogator is engaged in a task of holding a robotic stylus and interacting with another party (human or artificial). Instead of asking the interrogator whether the other party is a person or a computer program, we employ a two-alternative forced choice method and ask which of two systems is more human-like. We extract a quantitative grade for each model according to its resemblance to the human handshake motion and name it "Model Human-Likeness Grade" (MHLG). We present three methods to estimate the MHLG. (i) By calculating the proportion of subjects' answers that the model is more human-like than the human; (ii) By comparing two weighted sums of human and model handshakes we fit a psychometric curve and extract the point of subjective equality (PSE); (iii) By comparing a given model with a weighted sum of human and random signal, we fit a psychometric curve to the answers of the interrogator and extract the PSE for the weight of the human in the weighted sum. Altogether, we provide a protocol to test computational models of the human handshake. We believe that building a model is a necessary step in understanding any phenomenon and, in this case, in understanding the neural mechanisms responsible for the generation of the human handshake. PMID:21206462

  17. Note: Fully integrated 3.2 Gbps quantum random number generator with real-time extraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiao-Guang; Nie, You-Qi; Liang, Hao

    2016-07-15

    We present a real-time and fully integrated quantum random number generator (QRNG) based on measuring laser phase fluctuations. The QRNG scheme based on laser phase fluctuations is notable for its capability of generating ultra-high-speed random numbers. However, the speed bottleneck of a practical QRNG lies in the limited speed of randomness extraction. To close the gap between the fast randomness generation and the slow post-processing, we propose a pipeline extraction algorithm based on Toeplitz matrix hashing and implement it in a high-speed field-programmable gate array. Further, all the QRNG components are integrated into a module, including a compact and actively stabilized interferometer, high-speed data acquisition, and real-time data post-processing and transmission. The final generation rate of the QRNG module with real-time extraction can reach 3.2 Gbps.
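
    A didactic, dense-matrix version of Toeplitz-hashing extraction is sketched below; the paper's implementation is a pipelined FPGA design, and the input/output sizes here are purely illustrative.

```python
import numpy as np

def toeplitz_extract(raw_bits, seed_bits, m):
    """Multiply the raw bit vector (length n) by an m x n binary Toeplitz matrix
    built from n + m - 1 seed bits; all arithmetic is over GF(2)."""
    raw = np.asarray(raw_bits, dtype=np.int64)
    seed = np.asarray(seed_bits, dtype=np.int64)
    n = raw.size
    assert seed.size == n + m - 1
    # T[i, j] depends only on i - j, the defining property of a Toeplitz matrix
    idx = np.arange(m)[:, None] - np.arange(n)[None, :] + (n - 1)
    return (seed[idx] @ raw) % 2

rng = np.random.default_rng(1)
raw = rng.integers(0, 2, size=1024)             # raw bits from the physical source
seed = rng.integers(0, 2, size=1024 + 512 - 1)  # publicly known random seed
extracted = toeplitz_extract(raw, seed, m=512)  # 2:1 compression, illustrative only
```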

  18. Disaggregating Census Data for Population Mapping Using Random Forests with Remotely-Sensed and Ancillary Data

    PubMed Central

    Stevens, Forrest R.; Gaughan, Andrea E.; Linard, Catherine; Tatem, Andrew J.

    2015-01-01

    High resolution, contemporary data on human population distributions are vital for measuring impacts of population growth, monitoring human-environment interactions and for planning and policy development. Many methods are used to disaggregate census data and predict population densities for finer scale, gridded population data sets. We present a new semi-automated dasymetric modeling approach that incorporates detailed census and ancillary data in a flexible, “Random Forest” estimation technique. We outline the combination of widely available, remotely-sensed and geospatial data that contribute to the modeled dasymetric weights and then use the Random Forest model to generate a gridded prediction of population density at ~100 m spatial resolution. This prediction layer is then used as the weighting surface to perform dasymetric redistribution of the census counts at a country level. As a case study we compare the new algorithm and its products for three countries (Vietnam, Cambodia, and Kenya) with other common gridded population data production methodologies. We discuss the advantages of the new method and increases over the accuracy and flexibility of those previous approaches. Finally, we outline how this algorithm will be extended to provide freely-available gridded population data sets for Africa, Asia and Latin America. PMID:25689585
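
    A minimal sketch of the Random Forest weighting plus dasymetric redistribution steps; the covariates, unit counts, and log-density target below are synthetic placeholders, and scikit-learn merely stands in for whatever implementation the authors used.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Placeholder covariates (land cover, night lights, slope, ...) and a placeholder
# log10 population-density target observed at census-unit level.
X_units, y_units = rng.random((500, 6)), rng.random(500) * 4

rf = RandomForestRegressor(n_estimators=500, n_jobs=-1, random_state=0)
rf.fit(X_units, y_units)

# Predict a relative weighting surface on the fine (~100 m) grid, then redistribute
# each census unit's count proportionally to the predicted weights.
X_grid = rng.random((20000, 6))
weights = 10 ** rf.predict(X_grid)
unit_of_pixel = rng.integers(0, 500, size=20000)      # unit each pixel falls in
unit_counts = rng.integers(1_000, 100_000, size=500)  # census count per unit
pixel_pop = np.zeros(20000)
for u in range(500):
    mask = unit_of_pixel == u
    if mask.any():
        pixel_pop[mask] = unit_counts[u] * weights[mask] / weights[mask].sum()
```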

  19. Experimental nonlocality-based randomness generation with nonprojective measurements

    NASA Astrophysics Data System (ADS)

    Gómez, S.; Mattar, A.; Gómez, E. S.; Cavalcanti, D.; Farías, O. Jiménez; Acín, A.; Lima, G.

    2018-04-01

    We report on an optical setup generating more than one bit of randomness from one entangled bit (i.e., a maximally entangled state of two qubits). The amount of randomness is certified through the observation of Bell nonlocal correlations. To attain this result we implemented a high-purity entanglement source and a nonprojective three-outcome measurement. Our implementation achieves a gain of 27% of randomness as compared with the standard methods using projective measurements. Additionally, we estimate the amount of randomness certified in a one-sided device-independent scenario, through the observation of Einstein-Podolsky-Rosen steering. Our results prove that nonprojective quantum measurements allow extending the limits for nonlocality-based certified randomness generation using current technology.

  20. The cosmic microwave background radiation power spectrum as a random bit generator for symmetric- and asymmetric-key cryptography.

    PubMed

    Lee, Jeffrey S; Cleaver, Gerald B

    2017-10-01

    In this note, the Cosmic Microwave Background (CMB) Radiation is shown to be capable of functioning as a Random Bit Generator, and constitutes an effectively infinite supply of truly random one-time pad values of arbitrary length. It is further argued that the CMB power spectrum potentially conforms to the FIPS 140-2 standard. Additionally, its applicability to the generation of a (n × n) random key matrix for a Vernam cipher is established.

  1. Pseudo CT estimation from MRI using patch-based random forest

    NASA Astrophysics Data System (ADS)

    Yang, Xiaofeng; Lei, Yang; Shu, Hui-Kuo; Rossi, Peter; Mao, Hui; Shim, Hyunsuk; Curran, Walter J.; Liu, Tian

    2017-02-01

    Recently, MR simulators have gained popularity because they avoid the radiation exposure associated with the CT simulators used in radiation therapy planning. We propose a method for pseudo CT estimation from MR images based on a patch-based random forest. Patient-specific anatomical features are extracted from the aligned training images and adopted as signatures for each voxel. The most robust and informative features are identified using feature selection to train the random forest. The well-trained random forest is used to predict the pseudo CT of a new patient. This prediction technique was tested with human brain images and the prediction accuracy was assessed using the original CT images. Peak signal-to-noise ratio (PSNR) and feature similarity (FSIM) indexes were used to quantify the differences between the pseudo and original CT images. The experimental results showed the proposed method could accurately generate pseudo CT images from MR images. In summary, we have developed a new pseudo CT prediction method based on patch-based random forest, demonstrated its clinical feasibility, and validated its prediction accuracy. This pseudo CT prediction technique could be a useful tool for MRI-based radiation treatment planning and attenuation correction in a PET/MRI scanner.

  2. A new simple technique for improving the random properties of chaos-based cryptosystems

    NASA Astrophysics Data System (ADS)

    Garcia-Bosque, M.; Pérez-Resa, A.; Sánchez-Azqueta, C.; Celma, S.

    2018-03-01

    A new technique for improving the security of chaos-based stream ciphers has been proposed and tested experimentally. This technique improves the randomness properties of the generated keystream by preventing the system from falling into short-period cycles caused by digitization. In order to test this technique, a stream cipher based on a Skew Tent Map algorithm has been implemented on a Virtex 7 FPGA. The randomness of the keystream generated by this system has been compared to the randomness of the keystream generated by the same system with the proposed randomness-enhancement technique. By subjecting both keystreams to the National Institute of Standards and Technology (NIST) tests, we have shown that our method can considerably improve the randomness of the generated keystreams. In order to incorporate our randomness-enhancement technique, only 41 extra slices were needed, proving that, apart from being effective, this method is also efficient in terms of area and hardware resources.
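
    A sketch of a skew tent map keystream with a small periodic state perturbation standing in for the (unspecified here) randomness-enhancement step; the parameters are arbitrary, and a real hardware implementation would use fixed-point arithmetic, which is exactly what introduces the short cycles the paper addresses.

```python
def skew_tent_keystream(x0=0.37, p=0.61, n_bits=1024, perturb_every=64, eps=1e-12):
    """Generate keystream bits from a skew tent map; every perturb_every steps the
    state is nudged slightly as a stand-in for a randomness-enhancement step."""
    x, bits = x0, []
    for i in range(n_bits):
        x = x / p if x < p else (1.0 - x) / (1.0 - p)
        bits.append(1 if x >= 0.5 else 0)
        if perturb_every and (i + 1) % perturb_every == 0:
            x = (x + eps) % 1.0  # tiny perturbation to help avoid short cycles
    return bits
```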

  3. Doing better by getting worse: posthypnotic amnesia improves random number generation.

    PubMed

    Terhune, Devin Blair; Brugger, Peter

    2011-01-01

    Although forgetting is often regarded as a deficit that we need to control to optimize cognitive functioning, it can have beneficial effects in a number of contexts. We examined whether disrupting memory for previous numerical responses would attenuate repetition avoidance (the tendency to avoid repeating the same number) during random number generation and thereby improve the randomness of responses. Low suggestible and low dissociative and high dissociative highly suggestible individuals completed a random number generation task in a control condition, following a posthypnotic amnesia suggestion to forget previous numerical responses, and in a second control condition following the cancellation of the suggestion. High dissociative highly suggestible participants displayed a selective increase in repetitions during posthypnotic amnesia, with equivalent repetition frequency to a random system, whereas the other two groups exhibited repetition avoidance across conditions. Our results demonstrate that temporarily disrupting memory for previous numerical responses improves random number generation.

  4. Doing Better by Getting Worse: Posthypnotic Amnesia Improves Random Number Generation

    PubMed Central

    Terhune, Devin Blair; Brugger, Peter

    2011-01-01

    Although forgetting is often regarded as a deficit that we need to control to optimize cognitive functioning, it can have beneficial effects in a number of contexts. We examined whether disrupting memory for previous numerical responses would attenuate repetition avoidance (the tendency to avoid repeating the same number) during random number generation and thereby improve the randomness of responses. Low suggestible and low dissociative and high dissociative highly suggestible individuals completed a random number generation task in a control condition, following a posthypnotic amnesia suggestion to forget previous numerical responses, and in a second control condition following the cancellation of the suggestion. High dissociative highly suggestible participants displayed a selective increase in repetitions during posthypnotic amnesia, with equivalent repetition frequency to a random system, whereas the other two groups exhibited repetition avoidance across conditions. Our results demonstrate that temporarily disrupting memory for previous numerical responses improves random number generation. PMID:22195022

  5. Method and apparatus for in-situ characterization of energy storage and energy conversion devices

    DOEpatents

    Christophersen, Jon P [Idaho Falls, ID; Motloch, Chester G [Idaho Falls, ID; Morrison, John L [Butte, MT; Albrecht, Weston [Layton, UT

    2010-03-09

    Disclosed are methods and apparatuses for determining an impedance of an energy-output device using a random noise stimulus applied to the energy-output device. A random noise signal is generated and converted to a random noise stimulus as a current source correlated to the random noise signal. A bias-reduced response of the energy-output device to the random noise stimulus is generated by comparing a voltage at the energy-output device terminal to an average voltage signal. The random noise stimulus and bias-reduced response may be periodically sampled to generate a time-varying current stimulus and a time-varying voltage response, which may be correlated to generate an autocorrelated stimulus, an autocorrelated response, and a cross-correlated response. Finally, the autocorrelated stimulus, the autocorrelated response, and the cross-correlated response may be combined to determine at least one of impedance amplitude, impedance phase, and complex impedance.
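
    A generic cross-spectral version of the idea (impedance as the ratio of the stimulus-response cross-spectrum to the stimulus auto-spectrum) is sketched below on a synthetic device response; it is not the patented apparatus, and the filter standing in for the device is invented for illustration.

```python
import numpy as np
from scipy import signal

fs = 10_000                                   # sample rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
i_stim = np.random.randn(t.size)              # random-noise current stimulus

# Fake a voltage response: a resistive term plus a low-pass filtered term,
# a crude stand-in for an energy-storage device under test.
b, a = signal.butter(2, 200, fs=fs)
v_resp = 0.010 * i_stim + 0.005 * signal.lfilter(b, a, i_stim)

f, S_ii = signal.welch(i_stim, fs=fs, nperseg=4096)        # stimulus auto-spectrum
_, S_iv = signal.csd(i_stim, v_resp, fs=fs, nperseg=4096)  # cross-spectrum
Z = S_iv / S_ii                                            # complex impedance estimate
magnitude, phase_deg = np.abs(Z), np.angle(Z, deg=True)
```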

  6. Universality of accelerating change

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Shlesinger, Michael F.

    2018-03-01

    On large time scales the progress of human technology follows an exponential growth trend that is termed accelerating change. The exponential growth trend is commonly considered to be the amalgamated effect of consecutive technology revolutions - where the progress carried in by each technology revolution follows an S-curve, and where the aging of each technology revolution drives humanity to push for the next technology revolution. Thus, as a collective, mankind is the 'intelligent designer' of accelerating change. In this paper we establish that the exponential growth trend - and only this trend - emerges universally, on large time scales, from systems that combine together two elements: randomness and amalgamation. Hence, the universal generation of accelerating change can be attained by systems with no 'intelligent designer'.

  7. Leveraging Random Number Generation for Mastery of Learning in Teaching Quantitative Research Courses via an E-Learning Method

    ERIC Educational Resources Information Center

    Boonsathorn, Wasita; Charoen, Danuvasin; Dryver, Arthur L.

    2014-01-01

    E-Learning brings access to a powerful but often overlooked teaching tool: random number generation. Using random number generation, a practically infinite number of quantitative problem-solution sets can be created. In addition, within the e-learning context, in the spirit of the mastery of learning, it is possible to assign online quantitative…

  8. Random numbers certified by Bell's theorem.

    PubMed

    Pironio, S; Acín, A; Massar, S; de la Giroday, A Boyer; Matsukevich, D N; Maunz, P; Olmschenk, S; Hayes, D; Luo, L; Manning, T A; Monroe, C

    2010-04-15

    Randomness is a fundamental feature of nature and a valuable resource for applications ranging from cryptography and gambling to numerical simulation of physical and biological systems. Random numbers, however, are difficult to characterize mathematically, and their generation must rely on an unpredictable physical process. Inaccuracies in the theoretical modelling of such processes or failures of the devices, possibly due to adversarial attacks, limit the reliability of random number generators in ways that are difficult to control and detect. Here, inspired by earlier work on non-locality-based and device-independent quantum information processing, we show that the non-local correlations of entangled quantum particles can be used to certify the presence of genuine randomness. It is thereby possible to design a cryptographically secure random number generator that does not require any assumption about the internal working of the device. Such a strong form of randomness generation is impossible classically and possible in quantum systems only if certified by a Bell inequality violation. We carry out a proof-of-concept demonstration of this proposal in a system of two entangled atoms separated by approximately one metre. The observed Bell inequality violation, featuring near perfect detection efficiency, guarantees that 42 new random numbers are generated with 99 per cent confidence. Our results lay the groundwork for future device-independent quantum information experiments and for addressing fundamental issues raised by the intrinsic randomness of quantum theory.

  9. Adeno-Associated Virus Type 2 Wild-Type and Vector-Mediated Genomic Integration Profiles of Human Diploid Fibroblasts Analyzed by Third-Generation PacBio DNA Sequencing

    PubMed Central

    Hüser, Daniela; Gogol-Döring, Andreas; Chen, Wei

    2014-01-01

    ABSTRACT Genome-wide analysis of adeno-associated virus (AAV) type 2 integration in HeLa cells has shown that wild-type AAV integrates at numerous genomic sites, including AAVS1 on chromosome 19q13.42. Multiple GAGY/C repeats, resembling consensus AAV Rep-binding sites are preferred, whereas rep-deficient AAV vectors (rAAV) regularly show a random integration profile. This study is the first study to analyze wild-type AAV integration in diploid human fibroblasts. Applying high-throughput third-generation PacBio-based DNA sequencing, integration profiles of wild-type AAV and rAAV are compared side by side. Bioinformatic analysis reveals that both wild-type AAV and rAAV prefer open chromatin regions. Although genomic features of AAV integration largely reproduce previous findings, the pattern of integration hot spots differs from that described in HeLa cells before. DNase-Seq data for human fibroblasts and for HeLa cells reveal variant chromatin accessibility at preferred AAV integration hot spots that correlates with variant hot spot preferences. DNase-Seq patterns of these sites in human tissues, including liver, muscle, heart, brain, skin, and embryonic stem cells further underline variant chromatin accessibility. In summary, AAV integration is dependent on cell-type-specific, variant chromatin accessibility leading to random integration profiles for rAAV, whereas wild-type AAV integration sites cluster near GAGY/C repeats. IMPORTANCE Adeno-associated virus type 2 (AAV) is assumed to establish latency by chromosomal integration of its DNA. This is the first genome-wide analysis of wild-type AAV2 integration in diploid human cells and the first to compare wild-type to recombinant AAV vector integration side by side under identical experimental conditions. Major determinants of wild-type AAV integration represent open chromatin regions with accessible consensus AAV Rep-binding sites. The variant chromatin accessibility of different human tissues or cell types will have impact on vector targeting to be considered during gene therapy. PMID:25031342

  10. High throughput mutagenesis for identification of residues regulating human prostacyclin (hIP) receptor expression and function.

    PubMed

    Bill, Anke; Rosethorne, Elizabeth M; Kent, Toby C; Fawcett, Lindsay; Burchell, Lynn; van Diepen, Michiel T; Marelli, Anthony; Batalov, Sergey; Miraglia, Loren; Orth, Anthony P; Renaud, Nicole A; Charlton, Steven J; Gosling, Martin; Gaither, L Alex; Groot-Kormelink, Paul J

    2014-01-01

    The human prostacyclin receptor (hIP receptor) is a seven-transmembrane G protein-coupled receptor (GPCR) that plays a critical role in vascular smooth muscle relaxation and platelet aggregation. hIP receptor dysfunction has been implicated in numerous cardiovascular abnormalities, including myocardial infarction, hypertension, thrombosis and atherosclerosis. Genomic sequencing has discovered several genetic variations in the PTGIR gene coding for the hIP receptor; however, its structure-function relationship has not been sufficiently explored. Here we set out to investigate the applicability of high throughput random mutagenesis to study the structure-function relationship of hIP receptor. While chemical mutagenesis was not suitable to generate a mutagenesis library with sufficient coverage, our data demonstrate error-prone PCR (epPCR) mediated mutagenesis as a valuable method for the unbiased screening of residues regulating hIP receptor function and expression. Here we describe the generation and functional characterization of an epPCR derived mutagenesis library comprising >4000 mutants of the hIP receptor. We introduce next generation sequencing as a useful tool to validate the quality of mutagenesis libraries by providing information about the coverage, mutation rate and mutational bias. We identified 18 mutants of the hIP receptor that were expressed at the cell surface, but demonstrated impaired receptor function. A total of 38 non-synonymous mutations were identified within the coding region of the hIP receptor, mapping to 36 distinct residues, including several mutations previously reported to affect the signaling of the hIP receptor. Thus, our data demonstrate epPCR mediated random mutagenesis as a valuable and practical method to study the structure-function relationship of GPCRs.

  11. High Throughput Mutagenesis for Identification of Residues Regulating Human Prostacyclin (hIP) Receptor Expression and Function

    PubMed Central

    Kent, Toby C.; Fawcett, Lindsay; Burchell, Lynn; van Diepen, Michiel T.; Marelli, Anthony; Batalov, Sergey; Miraglia, Loren; Orth, Anthony P.; Renaud, Nicole A.; Charlton, Steven J.; Gosling, Martin; Gaither, L. Alex; Groot-Kormelink, Paul J.

    2014-01-01

    The human prostacyclin receptor (hIP receptor) is a seven-transmembrane G protein-coupled receptor (GPCR) that plays a critical role in vascular smooth muscle relaxation and platelet aggregation. hIP receptor dysfunction has been implicated in numerous cardiovascular abnormalities, including myocardial infarction, hypertension, thrombosis and atherosclerosis. Genomic sequencing has discovered several genetic variations in the PTGIR gene coding for the hIP receptor; however, its structure-function relationship has not been sufficiently explored. Here we set out to investigate the applicability of high throughput random mutagenesis to study the structure-function relationship of hIP receptor. While chemical mutagenesis was not suitable to generate a mutagenesis library with sufficient coverage, our data demonstrate error-prone PCR (epPCR) mediated mutagenesis as a valuable method for the unbiased screening of residues regulating hIP receptor function and expression. Here we describe the generation and functional characterization of an epPCR derived mutagenesis library comprising >4000 mutants of the hIP receptor. We introduce next generation sequencing as a useful tool to validate the quality of mutagenesis libraries by providing information about the coverage, mutation rate and mutational bias. We identified 18 mutants of the hIP receptor that were expressed at the cell surface, but demonstrated impaired receptor function. A total of 38 non-synonymous mutations were identified within the coding region of the hIP receptor, mapping to 36 distinct residues, including several mutations previously reported to affect the signaling of the hIP receptor. Thus, our data demonstrate epPCR mediated random mutagenesis as a valuable and practical method to study the structure-function relationship of GPCRs. PMID:24886841

  12. Identification of Multiple Novel Viruses, Including a Parvovirus and a Hepevirus, in Feces of Red Foxes

    PubMed Central

    van der Giessen, Joke; Haagmans, Bart L.; Osterhaus, Albert D. M. E.; Smits, Saskia L.

    2013-01-01

    Red foxes (Vulpes vulpes) are the most widespread members of the order of Carnivora. Since they often live in (peri)urban areas, they are a potential reservoir of viruses that transmit from wildlife to humans or domestic animals. Here we evaluated the fecal viral microbiome of 13 red foxes by random PCR in combination with next-generation sequencing. Various novel viruses, including a parvovirus, bocavirus, adeno-associated virus, hepevirus, astroviruses, and picobirnaviruses, were identified. PMID:23616657

  13. Human X-chromosome inactivation pattern distributions fit a model of genetically influenced choice better than models of completely random choice

    PubMed Central

    Renault, Nisa K E; Pritchett, Sonja M; Howell, Robin E; Greer, Wenda L; Sapienza, Carmen; Ørstavik, Karen Helene; Hamilton, David C

    2013-01-01

    In eutherian mammals, one X-chromosome in every XX somatic cell is transcriptionally silenced through the process of X-chromosome inactivation (XCI). Females are thus functional mosaics, where some cells express genes from the paternal X, and the others from the maternal X. The relative abundance of the two cell populations (X-inactivation pattern, XIP) can have significant medical implications for some females. In mice, the ‘choice' of which X to inactivate, maternal or paternal, in each cell of the early embryo is genetically influenced. In humans, the timing of XCI choice and whether choice occurs completely randomly or under a genetic influence is debated. Here, we explore these questions by analysing the distribution of XIPs in large populations of normal females. Models were generated to predict XIP distributions resulting from completely random or genetically influenced choice. Each model describes the discrete primary distribution at the onset of XCI, and the continuous secondary distribution accounting for changes to the XIP as a result of development and ageing. Statistical methods are used to compare models with empirical data from Danish and Utah populations. A rigorous data treatment strategy maximises information content and allows for unbiased use of unphased XIP data. The Anderson–Darling goodness-of-fit statistics and likelihood ratio tests indicate that a model of genetically influenced XCI choice better fits the empirical data than models of completely random choice. PMID:23652377

  14. Social Noise: Generating Random Numbers from Twitter Streams

    NASA Astrophysics Data System (ADS)

    Fernández, Norberto; Quintas, Fernando; Sánchez, Luis; Arias, Jesús

    2015-12-01

    Due to the multiple applications of random numbers in computer systems (cryptography, online gambling, computer simulation, etc.) it is important to have mechanisms to generate these numbers. True Random Number Generators (TRNGs) are commonly used for this purpose. TRNGs rely on non-deterministic sources to generate randomness. Physical processes (like noise in semiconductors, quantum phenomenon, etc.) play this role in state of the art TRNGs. In this paper, we depart from previous work and explore the possibility of defining social TRNGs using the stream of public messages of the microblogging service Twitter as randomness source. Thus, we define two TRNGs based on Twitter stream information and evaluate them using the National Institute of Standards and Technology (NIST) statistical test suite. The results of the evaluation confirm the feasibility of the proposed approach.

  15. Fast generation of sparse random kernel graphs

    DOE PAGES

    Hagberg, Aric; Lemons, Nathan; Du, Wen -Bo

    2015-09-10

    The development of kernel-based inhomogeneous random graphs has provided models that are flexible enough to capture many observed characteristics of real networks, and that are also mathematically tractable. We specify a class of inhomogeneous random graph models, called random kernel graphs, that produces sparse graphs with tunable graph properties, and we develop an efficient generation algorithm to sample random instances from this model. As real-world networks are usually large, it is essential that the run-time of generation algorithms scales better than quadratically in the number of vertices n. We show that for many practical kernels our algorithm runs in time at most O(n(log n)²). As an example, we show how to generate samples of power-law degree distribution graphs with tunable assortativity.

  16. Compact Quantum Random Number Generator with Silicon Nanocrystals Light Emitting Device Coupled to a Silicon Photomultiplier

    NASA Astrophysics Data System (ADS)

    Bisadi, Zahra; Acerbi, Fabio; Fontana, Giorgio; Zorzi, Nicola; Piemonte, Claudio; Pucker, Georg; Pavesi, Lorenzo

    2018-02-01

    A small photonic quantum random number generator that is easy to implement in small electronic devices for secure data encryption and other applications is in high demand nowadays. Here, we propose a compact configuration with a silicon nanocrystals large-area light-emitting device (LED) coupled to a silicon photomultiplier to generate random numbers. The random number generation methodology is based on photon arrival times and is robust against non-idealities of the detector and of the source of quantum entropy. The raw data show high-quality randomness and pass all the statistical tests in the National Institute of Standards and Technology (NIST) test suite without a post-processing algorithm. The highest bit rate is 0.5 Mbps with an efficiency of 4 bits per detected photon.
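
    One common way to turn photon arrival times into bits, shown here on simulated Poisson arrivals, is to compare consecutive inter-arrival intervals; this is an illustration of the general idea, not necessarily the scheme used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated photon detections: exponentially distributed inter-arrival intervals.
intervals = rng.exponential(scale=1.0, size=100_000)

# Compare consecutive intervals pairwise; shorter-then-longer gives 1, else 0.
pairs = intervals.reshape(-1, 2)
bits = (pairs[:, 0] < pairs[:, 1]).astype(np.uint8)
```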

  17. Partial characterization of normal and Haemophilus influenzae-infected mucosal complementary DNA libraries in chinchilla middle ear mucosa.

    PubMed

    Kerschner, Joseph E; Erdos, Geza; Hu, Fen Ze; Burrows, Amy; Cioffi, Joseph; Khampang, Pawjai; Dahlgren, Margaret; Hayes, Jay; Keefe, Randy; Janto, Benjamin; Post, J Christopher; Ehrlich, Garth D

    2010-04-01

    We sought to construct and partially characterize complementary DNA (cDNA) libraries prepared from the middle ear mucosa (MEM) of chinchillas to better understand pathogenic aspects of infection and inflammation, particularly with respect to leukotriene biogenesis and response. Chinchilla MEM was harvested from controls and after middle ear inoculation with nontypeable Haemophilus influenzae. RNA was extracted to generate cDNA libraries. Randomly selected clones were subjected to sequence analysis to characterize the libraries and to provide DNA sequence for phylogenetic analyses. Reverse transcription-polymerase chain reaction of the RNA pools was used to generate cDNA sequences corresponding to genes associated with leukotriene biosynthesis and metabolism. Sequence analysis of 921 randomly selected clones from the uninfected MEM cDNA library produced approximately 250,000 nucleotides of almost entirely novel sequence data. Searches of the GenBank database with the Basic Local Alignment Search Tool provided for identification of 515 unique genes expressed in the MEM and not previously described in chinchillas. In almost all cases, the chinchilla cDNA sequences displayed much greater homology to human or other primate genes than with rodent species. Genes associated with leukotriene metabolism were present in both normal and infected MEM. Based on both phylogenetic comparisons and gene expression similarities with humans, chinchilla MEM appears to be an excellent model for the study of middle ear inflammation and infection. The higher degree of sequence similarity between chinchillas and humans compared to chinchillas and rodents was unexpected. The cDNA libraries from normal and infected chinchilla MEM will serve as useful molecular tools in the study of otitis media and should yield important information with respect to middle ear pathogenesis.

  18. Partial Characterization of Normal and Haemophilus influenzae–Infected Mucosal Complementary DNA Libraries in Chinchilla Middle Ear Mucosa

    PubMed Central

    Kerschner, Joseph E.; Erdos, Geza; Hu, Fen Ze; Burrows, Amy; Cioffi, Joseph; Khampang, Pawjai; Dahlgren, Margaret; Hayes, Jay; Keefe, Randy; Janto, Benjamin; Post, J. Christopher; Ehrlich, Garth D.

    2010-01-01

    Objectives We sought to construct and partially characterize complementary DNA (cDNA) libraries prepared from the middle ear mucosa (MEM) of chinchillas to better understand pathogenic aspects of infection and inflammation, particularly with respect to leukotriene biogenesis and response. Methods Chinchilla MEM was harvested from controls and after middle ear inoculation with nontypeable Haemophilus influenzae. RNA was extracted to generate cDNA libraries. Randomly selected clones were subjected to sequence analysis to characterize the libraries and to provide DNA sequence for phylogenetic analyses. Reverse transcription–polymerase chain reaction of the RNA pools was used to generate cDNA sequences corresponding to genes associated with leukotriene biosynthesis and metabolism. Results Sequence analysis of 921 randomly selected clones from the uninfected MEM cDNA library produced approximately 250,000 nucleotides of almost entirely novel sequence data. Searches of the GenBank database with the Basic Local Alignment Search Tool provided for identification of 515 unique genes expressed in the MEM and not previously described in chinchillas. In almost all cases, the chinchilla cDNA sequences displayed much greater homology to human or other primate genes than with rodent species. Genes associated with leukotriene metabolism were present in both normal and infected MEM. Conclusions Based on both phylogenetic comparisons and gene expression similarities with humans, chinchilla MEM appears to be an excellent model for the study of middle ear inflammation and infection. The higher degree of sequence similarity between chinchillas and humans compared to chinchillas and rodents was unexpected. The cDNA libraries from normal and infected chinchilla MEM will serve as useful molecular tools in the study of otitis media and should yield important information with respect to middle ear pathogenesis. PMID:20433028

  19. Understanding spatial connectivity of individuals with non-uniform population density.

    PubMed

    Wang, Pu; González, Marta C

    2009-08-28

    We construct a two-dimensional geometric graph connecting individuals placed in space within a given contact distance. The individuals are distributed using a measured country's density of population. We observe that while large clusters (group of individuals connected) emerge within some regions, they are trapped in detached urban areas owing to the low population density of the regions bordering them. To understand the emergence of a giant cluster that connects the entire population, we compare the empirical geometric graph with the one generated by placing the same number of individuals randomly in space. We find that, for small contact distances, the empirical distribution of population dominates the growth of connected components, but no critical percolation transition is observed in contrast to the graph generated by a random distribution of population. Our results show that contact distances from real-world situations as for WIFI and Bluetooth connections drop in a zone where a fully connected cluster is not observed, hinting that human mobility must play a crucial role in contact-based diseases and wireless viruses' large-scale spreading.
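
    A sketch of the uniform-placement baseline using networkx; the radius and node count are arbitrary, and reproducing the paper's comparison would require passing empirical population-density positions via the pos argument, which is not done here.

```python
import networkx as nx

n, radius = 2000, 0.035                             # illustrative values only
G_uniform = nx.random_geometric_graph(n, radius)    # uniform random placement
largest = max(nx.connected_components(G_uniform), key=len)
frac_in_giant = len(largest) / n
# Supplying pos={node: (x, y), ...} drawn from an empirical population density
# would give the non-uniform case studied in the paper.
```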

  20. Fine-scale population structure and the era of next-generation sequencing.

    PubMed

    Henn, Brenna M; Gravel, Simon; Moreno-Estrada, Andres; Acevedo-Acevedo, Suehelay; Bustamante, Carlos D

    2010-10-15

    Fine-scale population structure characterizes most continents and is especially pronounced in non-cosmopolitan populations. Roughly half of the world's population remains non-cosmopolitan and even populations within cities often assort along ethnic and linguistic categories. Barriers to random mating can be ecologically extreme, such as the Sahara Desert, or cultural, such as the Indian caste system. In either case, subpopulations accumulate genetic differences if the barrier is maintained over multiple generations. Genome-wide polymorphism data, initially with only a few hundred autosomal microsatellites, have clearly established differences in allele frequency not only among continental regions, but also within continents and within countries. We review recent evidence from the analysis of genome-wide polymorphism data for genetic boundaries delineating human population structure and the main demographic and genomic processes shaping variation, and discuss the implications of population structure for the distribution and discovery of disease-causing genetic variants, in the light of the imminent availability of sequencing data for a multitude of diverse human genomes.

  1. CK-2127107 amplifies skeletal muscle response to nerve activation in humans.

    PubMed

    Andrews, Jinsy A; Miller, Timothy M; Vijayakumar, Vipin; Stoltz, Randall; James, Joyce K; Meng, Lisa; Wolff, Andrew A; Malik, Fady I

    2018-05-01

    Three studies evaluated safety, tolerability, pharmacokinetics, and pharmacodynamics of CK-2127107 (CK-107), a next-generation fast skeletal muscle troponin activator (FSTA), in healthy participants. We tested the hypothesis that CK-107 would amplify the force-frequency response of muscle in humans. To assess the force-frequency response, participants received single doses of CK-107 and placebo in a randomized, double-blind, 4-period, crossover study. The force-frequency response of foot dorsiflexion following stimulation of the deep fibular nerve to activate the tibialis anterior muscle was assessed. CK-107 significantly increased tibialis anterior muscle response with increasing dose and plasma concentration in a frequency-dependent manner; the largest increase in peak force was ∼60% at 10 Hz. CK-107 appears more potent and produced larger increases in force than tirasemtiv, a first-generation FSTA, in a similar pharmacodynamic study, thereby supporting its development for improving muscle function in patients. Muscle Nerve 57: 729-734, 2018. © 2017 The Authors. Muscle & Nerve published by Wiley Periodicals, Inc.

  2. Response Rates in Random-Digit-Dialed Telephone Surveys: Estimation vs. Measurement.

    ERIC Educational Resources Information Center

    Franz, Jennifer D.

    The efficacy of the random digit dialing method in telephone surveys was examined. Random digit dialing (RDD) generates a pure random sample and provides the advantage of including unlisted phone numbers, as well as numbers which are too new to be listed. Its disadvantage is that it generates a major proportion of nonworking and business…

  3. Revisiting sample size: are big trials the answer?

    PubMed

    Lurati Buse, Giovanna A L; Botto, Fernando; Devereaux, P J

    2012-07-18

    The superiority of the evidence generated in randomized controlled trials over observational data is not conditional on randomization alone. Randomized controlled trials require proper design and implementation to provide a reliable effect estimate. Adequate random sequence generation, allocation implementation, analyses based on the intention-to-treat principle, and sufficient power are crucial to the quality of a randomized controlled trial. Power, or the probability that a trial detects a difference when a real difference between treatments exists, strongly depends on sample size. The quality of orthopaedic randomized controlled trials is frequently threatened by a limited sample size. This paper reviews basic concepts and pitfalls in sample-size estimation and focuses on the importance of large trials in the generation of valid evidence.
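
    As an illustration of the power and sample-size relationship discussed here, the statsmodels call below gives the per-arm sample size needed to detect a small standardized effect; the numbers are illustrative, not taken from the paper.

```python
from statsmodels.stats.power import NormalIndPower

# Per-arm sample size for a standardized effect size of 0.2,
# 80% power, two-sided alpha of 0.05 (roughly 393 participants per arm).
n_per_arm = NormalIndPower().solve_power(effect_size=0.2, alpha=0.05,
                                         power=0.8, alternative='two-sided')
```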

  4. Heterogeneous Suppression of Sequential Effects in Random Sequence Generation, but Not in Operant Learning.

    PubMed

    Shteingart, Hanan; Loewenstein, Yonatan

    2016-01-01

    There is a long history of experiments in which participants are instructed to generate a long sequence of binary random numbers. The scope of this line of research has shifted over the years from identifying the basic psychological principles and/or the heuristics that lead to deviations from randomness, to one of predicting future choices. In this paper, we used generalized linear regression and the framework of Reinforcement Learning in order to address both points. In particular, we used logistic regression analysis in order to characterize the temporal sequence of participants' choices. Surprisingly, a population analysis indicated that the contribution of the most recent trial has only a weak effect on behavior compared with trials further back, a result that seems irreconcilable with standard sequential effects that decay monotonically with the delay. However, when considering each participant separately, we found that the magnitudes of the sequential effect are a monotonically decreasing function of the delay, yet these individual sequential effects are largely averaged out in a population analysis because of heterogeneity. The substantial behavioral heterogeneity in this task is further demonstrated quantitatively by considering the predictive power of the model. We show that a heterogeneous model of sequential dependencies captures the structure available in random sequence generation. Finally, we show that the results of the logistic regression analysis can be interpreted in the framework of reinforcement learning, allowing us to compare the sequential effects in the random sequence generation task to those in an operant learning task. We show that in contrast to the random sequence generation task, sequential effects in operant learning are far more homogeneous across the population. These results suggest that in the random sequence generation task, different participants adopt different cognitive strategies to suppress sequential dependencies when generating the "random" sequences.

  5. Interactive learning in 2×2 normal form games by neural network agents

    NASA Astrophysics Data System (ADS)

    Spiliopoulos, Leonidas

    2012-11-01

    This paper models the learning process of populations of randomly rematched tabula rasa neural network (NN) agents playing randomly generated 2×2 normal form games of all strategic classes. This approach has greater external validity than the existing models in the literature, each of which is usually applicable to narrow subsets of classes of games (often a single game) and/or to fixed matching protocols. The learning prowess of NNs with hidden layers was impressive as they learned to play unique pure strategy equilibria with near certainty, adhered to principles of dominance and iterated dominance, and exhibited a preference for risk-dominant equilibria. In contrast, perceptron NNs were found to perform significantly worse than hidden layer NN agents and human subjects in experimental studies.

  6. Fast Human Detection for Intelligent Monitoring Using Surveillance Visible Sensors

    PubMed Central

    Ko, Byoung Chul; Jeong, Mira; Nam, JaeYeal

    2014-01-01

    Human detection using visible surveillance sensors is an important and challenging task for intruder detection and safety management. The biggest barrier to real-time human detection is the computational time required for dense image scaling and for the scanning windows extracted from an entire image. This paper proposes fast human detection by selecting optimal levels of image scale using each level's adaptive region-of-interest (ROI). To estimate the image-scaling level, we generate a Hough windows map (HWM) and select a few optimal image scales based on the strength of the HWM and a divide-and-conquer algorithm. Furthermore, adaptive ROIs are arranged per image scale to provide different search areas. We employ a cascade random forests classifier to separate candidate windows into human and nonhuman classes. The proposed algorithm has been successfully applied to real-world surveillance video sequences, and its detection accuracy and computational speed are better than those of other related methods. PMID:25393782

  7. Solution-Processed Carbon Nanotube True Random Number Generator.

    PubMed

    Gaviria Rojas, William A; McMorrow, Julian J; Geier, Michael L; Tang, Qianying; Kim, Chris H; Marks, Tobin J; Hersam, Mark C

    2017-08-09

    With the growing adoption of interconnected electronic devices in consumer and industrial applications, there is an increasing demand for robust security protocols when transmitting and receiving sensitive data. Toward this end, hardware true random number generators (TRNGs), commonly used to create encryption keys, offer significant advantages over software pseudorandom number generators. However, the vast network of devices and sensors envisioned for the "Internet of Things" will require small, low-cost, and mechanically flexible TRNGs with low computational complexity. These rigorous constraints position solution-processed semiconducting single-walled carbon nanotubes (SWCNTs) as leading candidates for next-generation security devices. Here, we demonstrate the first TRNG using static random access memory (SRAM) cells based on solution-processed SWCNTs that digitize thermal noise to generate random bits. This bit generation strategy can be readily implemented in hardware with minimal transistor and computational overhead, resulting in an output stream that passes standardized statistical tests for randomness. By using solution-processed semiconducting SWCNTs in a low-power, complementary architecture to achieve TRNG, we demonstrate a promising approach for improving the security of printable and flexible electronics.

  8. 640-Gbit/s fast physical random number generation using a broadband chaotic semiconductor laser

    NASA Astrophysics Data System (ADS)

    Zhang, Limeng; Pan, Biwei; Chen, Guangcan; Guo, Lu; Lu, Dan; Zhao, Lingjuan; Wang, Wei

    2017-04-01

    An ultra-fast physical random number generator is demonstrated utilizing a photonic-integrated-device-based broadband chaotic source with a simple post-processing method. The compact chaotic source is implemented using a monolithically integrated dual-mode amplified feedback laser (AFL) with self-injection, generating a robust chaotic signal with RF frequency coverage above 50 GHz and flatness of ±3.6 dB. By retaining the 4 least significant bits (LSBs) of the 8-bit digitization of the chaotic waveform, random sequences with a bit rate of up to 640 Gbit/s (160 GS/s × 4 bits) are realized. The generated random bits passed all fifteen NIST statistical tests (NIST SP800-22), indicating their suitability for practical applications.
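
    A minimal sketch of the 4-LSB post-processing step described above, with pseudo-random bytes standing in for the 8-bit digitized chaotic waveform (the real input would come from the oscilloscope/ADC):

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for the 8-bit digitized chaotic waveform.
samples = rng.integers(0, 256, size=1_000_000, dtype=np.uint8)

# Keep only the 4 least significant bits of every sample, so each 8-bit sample
# contributes 4 output bits, as in the post-processing described above.
lsb4 = samples & 0x0F
bits = np.unpackbits(lsb4[:, None], axis=1)[:, 4:].ravel()

print("bits per sample:", bits.size / samples.size)   # 4.0
print("ones fraction:", bits.mean())                  # close to 0.5 for good data
```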

  9. A Comparison of One Time Pad Random Key Generation using Linear Congruential Generator and Quadratic Congruential Generator

    NASA Astrophysics Data System (ADS)

    Apdilah, D.; Harahap, M. K.; Khairina, N.; Husein, A. M.; Harahap, M.

    2018-04-01

    The One Time Pad algorithm always requires a key paired with the plaintext. If the key is shorter than the plaintext, the key is repeated until its length matches that of the plaintext. In this research, we use a Linear Congruential Generator (LCG) and a Quadratic Congruential Generator (QCG) to generate random numbers, which the One Time Pad uses as the key for encryption and decryption. Key generation starts from the first letter of the plaintext. We compare the two algorithms in terms of encryption speed, and the result is that the combination of OTP with LCG is faster than the combination of OTP with QCG.
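
    The following sketch illustrates the scheme described above: a congruential generator (linear or quadratic) supplies the One Time Pad keystream that is XOR-ed with the plaintext. The moduli, multipliers, and increments are illustrative choices, not values from the paper.

```python
# Illustrative sketch (not the authors' code): OTP-style XOR encryption with a
# keystream from either a Linear or a Quadratic Congruential Generator.

def lcg(seed, a=1103515245, c=12345, m=2**31):
    x = seed
    while True:
        x = (a * x + c) % m          # linear recurrence x_{n+1} = (a*x_n + c) mod m
        yield x & 0xFF               # keep one byte per step for the keystream

def qcg(seed, a=3, b=7, c=11, m=2**31):
    x = seed
    while True:
        x = (a * x * x + b * x + c) % m   # quadratic recurrence
        yield x & 0xFF

def otp_xor(data: bytes, keystream) -> bytes:
    # XOR is its own inverse, so the same call encrypts and decrypts.
    return bytes(d ^ next(keystream) for d in data)

msg = b"ONE TIME PAD"
ct = otp_xor(msg, lcg(seed=42))
pt = otp_xor(ct, lcg(seed=42))
assert pt == msg
```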

  10. Early stage hot spot analysis through standard cell base random pattern generation

    NASA Astrophysics Data System (ADS)

    Jeon, Joong-Won; Song, Jaewan; Kim, Jeong-Lim; Park, Seongyul; Yang, Seung-Hune; Lee, Sooryong; Kang, Hokyu; Madkour, Kareem; ElManhawy, Wael; Lee, SeungJo; Kwan, Joe

    2017-04-01

    Because of the limited availability of DRC-clean patterns during process and RET recipe development, OPC recipes are not tested with high pattern coverage. A greater variety of patterns can help OPC engineers detect patterns that are sensitive to lithographic effects, so random pattern generation is needed to secure a robust OPC recipe. However, simple random patterns that do not reflect real product layout styles cannot cover the patterning hotspots seen at production level and are of limited use for OPC optimization; it is therefore important to generate random patterns similar to real product patterns. This paper presents a strategy for generating random patterns based on design architecture information and for preventing hotspots in the early process development stage, using a tool called Layout Schema Generator (LSG). Using LSG, we generate standard-cell-based random patterns reflecting real design cell structure: fin pitch, gate pitch, and cell height. The output standard cells from LSG are fed into an analysis methodology that assesses their hotspot severity by assigning a score based on optical image parameters (NILS, MEEF, %PV band), so that potential hotspots can be defined by their ranking. This flow is demonstrated on Samsung 7nm technology, optimizing the OPC recipe early enough in the process to avoid using problematic patterns.

  11. Immunology of scorpion toxins and perspectives for generation of anti-venom vaccines.

    PubMed

    Gazarian, Karlen G; Gazarian, Tatiana; Hernández, Ricardo; Possani, Lourival D

    2005-05-16

    Scorpions and other venomous animals carry concentrated mixtures of biologically active substances that have evolved to block vital physiological and biochemical functions of their victims. These substances raise contrasting human health concerns: they provide important pharmacological raw material, yet they pose a serious threat to human life and health in tropical and subtropical regions. Because only occasional and minor quantities of venom are introduced into the human organism by a scorpion sting, and because their lethal effect is an acute phenomenon, these substances are unknown to the immune defense system and no immunity against them has arisen during evolution. Antidotes prepared from animal anti-sera are effective against some species of scorpions but depend on the manufacturer and on the availability of product to the medical community. Although significant progress has been made in immunological studies of certain groups of toxins, few centers are dedicated to this research. Information is still insufficient to generate a comprehensive picture of the subject and to propose vaccines against venoms. A novel approach based on mimotopes selected from phage-displayed random peptide libraries shows potential to drive further progress in toxin immunology and to provide putative vaccine resources. In this report we review the state of the art in the field.

  12. Antibody VH and VL recombination using phage and ribosome display technologies reveals distinct structural routes to affinity improvements with VH-VL interface residues providing important structural diversity

    PubMed Central

    Groves, Maria AT; Amanuel, Lily; Campbell, Jamie I; Rees, D Gareth; Sridharan, Sudharsan; Finch, Donna K; Lowe, David C; Vaughan, Tristan J

    2014-01-01

    In vitro selection technologies are an important means of affinity maturing antibodies to generate the optimal therapeutic profile for a particular disease target. Here, we describe the isolation of a parent antibody, KENB061 using phage display and solution phase selections with soluble biotinylated human IL-1R1. KENB061 was affinity matured using phage display and targeted mutagenesis of VH and VL CDR3 using NNS randomization. Affinity matured VHCDR3 and VLCDR3 library blocks were recombined and selected using phage and ribosome display protocol. A direct comparison of the phage and ribosome display antibodies generated was made to determine their functional characteristics. PMID:24256948

  13. Device-independent randomness generation from several Bell estimators

    NASA Astrophysics Data System (ADS)

    Nieto-Silleras, Olmo; Bamps, Cédric; Silman, Jonathan; Pironio, Stefano

    2018-02-01

    Device-independent randomness generation and quantum key distribution protocols rely on a fundamental relation between the non-locality of quantum theory and its random character. This relation is usually expressed in terms of a trade-off between the probability of guessing correctly the outcomes of measurements performed on quantum systems and the amount of violation of a given Bell inequality. However, a more accurate assessment of the randomness produced in Bell experiments can be obtained if the value of several Bell expressions is simultaneously taken into account, or if the full set of probabilities characterizing the behavior of the device is considered. We introduce protocols for device-independent randomness generation secure against classical side information, that rely on the estimation of an arbitrary number of Bell expressions or even directly on the experimental frequencies of measurement outcomes. Asymptotically, this results in an optimal generation of randomness from experimental data (as measured by the min-entropy), without having to assume beforehand that the devices violate a specific Bell inequality.

  14. A novel image encryption algorithm based on synchronized random bit generated in cascade-coupled chaotic semiconductor ring lasers

    NASA Astrophysics Data System (ADS)

    Li, Jiafu; Xiang, Shuiying; Wang, Haoning; Gong, Junkai; Wen, Aijun

    2018-03-01

    In this paper, a novel image encryption algorithm based on the synchronization of physical random bits generated in a cascade-coupled semiconductor ring laser (CCSRL) system is proposed, and its security is analyzed. In both the transmitter and the receiver, the CCSRL system is a master-slave configuration consisting of a master semiconductor ring laser (M-SRL) with cross-feedback and a solitary SRL (S-SRL). The proposed image encryption algorithm includes image preprocessing based on conventional chaotic maps, pixel confusion based on a control matrix extracted from the physical random bits, and pixel diffusion based on a random bit stream extracted from the physical random bits. First, the preprocessing method is used to eliminate the correlation between adjacent pixels. Second, physical random bits with verified randomness are generated from chaos in the CCSRL system and are used to simultaneously generate the control matrix and the random bit stream. Finally, the control matrix and the random bit stream are used in the encryption algorithm to change the positions and the values of pixels, respectively. Simulation results and security analysis demonstrate that the proposed algorithm is effective and able to resist various typical attacks, and is thus an excellent candidate for secure image communication applications.
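
    A toy version of the confusion/diffusion stages described above, with a seeded NumPy generator standing in for the physical random bits from the CCSRL system (the preprocessing stage and the chaotic-map details are omitted):

```python
import numpy as np

def encrypt(img, seed):
    """Toy confusion + diffusion on an 8-bit grayscale image."""
    rng = np.random.default_rng(seed)            # stand-in for the physical random bits
    flat = img.ravel()
    perm = rng.permutation(flat.size)            # confusion: shuffle pixel positions
    confused = flat[perm]
    stream = rng.integers(0, 256, flat.size, dtype=np.uint8)
    diffused = confused ^ stream                 # diffusion: XOR with a random byte stream
    return diffused.reshape(img.shape), perm, stream

def decrypt(cipher, perm, stream):
    flat = cipher.ravel() ^ stream               # undo the diffusion
    restored = np.empty_like(flat)
    restored[perm] = flat                        # undo the permutation
    return restored.reshape(cipher.shape)

img = np.random.default_rng(1).integers(0, 256, (64, 64), dtype=np.uint8)
cipher, perm, stream = encrypt(img, seed=2024)
assert np.array_equal(decrypt(cipher, perm, stream), img)
```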

  15. Probabilistic generation of random networks taking into account information on motifs occurrence.

    PubMed

    Bois, Frederic Y; Gayraud, Ghislaine

    2015-01-01

    Because of the huge number of graphs possible even with a small number of nodes, inference on network structure is known to be a challenging problem. Generating large random directed graphs with prescribed probabilities of occurrences of some meaningful patterns (motifs) is also difficult. We show how to generate such random graphs according to a formal probabilistic representation, using fast Markov chain Monte Carlo methods to sample them. As an illustration, we generate realistic graphs with several hundred nodes mimicking a gene transcription interaction network in Escherichia coli.

  16. Probabilistic Generation of Random Networks Taking into Account Information on Motifs Occurrence

    PubMed Central

    Bois, Frederic Y.

    2015-01-01

    Because of the huge number of graphs possible even with a small number of nodes, inference on network structure is known to be a challenging problem. Generating large random directed graphs with prescribed probabilities of occurrences of some meaningful patterns (motifs) is also difficult. We show how to generate such random graphs according to a formal probabilistic representation, using fast Markov chain Monte Carlo methods to sample them. As an illustration, we generate realistic graphs with several hundred nodes mimicking a gene transcription interaction network in Escherichia coli. PMID:25493547

  17. Human tracking in thermal images using adaptive particle filters with online random forest learning

    NASA Astrophysics Data System (ADS)

    Ko, Byoung Chul; Kwak, Joon-Young; Nam, Jae-Yeal

    2013-11-01

    This paper presents a fast and robust human tracking method for use with a moving long-wave infrared thermal camera under poor illumination, in the presence of shadows and cluttered backgrounds. To improve tracking performance while minimizing computation time, this study proposes online learning of classifiers based on particle filters and a combination of a local intensity distribution (LID) with oriented center-symmetric local binary patterns (OCS-LBP). Specifically, we design a real-time random forest (RF), an ensemble of decision trees for confidence estimation, and the RF confidences are converted into a likelihood function of the target state. First, the target model is selected by the user and particles are sampled. Then, RFs are generated by online learning using positive and negative examples with LID and OCS-LBP features. In the next stage, the learned RF classifiers are used to detect the most likely target position in the subsequent frame. The RFs are then relearned by fast retraining with the tracked object and background appearance in the new frame. The proposed algorithm has been successfully applied to various thermal videos, and its tracking performance is better than that of other methods.

  18. Extracting motor synergies from random movements for low-dimensional task-space control of musculoskeletal robots.

    PubMed

    Fu, Kin Chung Denny; Dalla Libera, Fabio; Ishiguro, Hiroshi

    2015-10-08

    In the field of human motor control, the motor synergy hypothesis explains how humans simplify body control dimensionality by coordinating groups of muscles, called motor synergies, instead of controlling muscles independently. In most applications of motor synergies to low-dimensional control in robotics, motor synergies are extracted from given optimal control signals. In this paper, we address the problems of how to extract motor synergies when optimal data are not given, and how to apply motor synergies to achieve low-dimensional task-space tracking control of a human-like robotic arm actuated by redundant muscles, without prior knowledge of the robot. We propose to extract motor synergies from a subset of randomly generated reaching-like movement data. The essence is to first approximate the corresponding optimal control signals, using estimates of the robot's forward dynamics, and then to extract the motor synergies. In order to avoid modeling difficulties, a learning-based control approach is adopted such that control is accomplished via estimates of the robot's inverse dynamics. We present a kernel-based regression formulation to estimate the forward and the inverse dynamics, and a sliding controller to cope with estimation error. Numerical evaluations show that the proposed method enables the extraction of motor synergies for low-dimensional task-space control.
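
    The paper's exact extraction procedure is not reproduced here; the sketch below shows one common way to extract synergies from non-negative muscle-activation data, namely non-negative matrix factorization with multiplicative updates, on synthetic data:

```python
import numpy as np

def extract_synergies(V, k, iters=500, eps=1e-9, seed=0):
    """Factor non-negative muscle activations V (muscles x samples) into k synergies.

    Returns W (muscles x k synergy vectors) and H (k x samples activation coefficients)
    using standard multiplicative NMF updates.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Synthetic example: 8 "muscles" driven by 3 underlying synergies plus noise.
rng = np.random.default_rng(1)
true_W = rng.random((8, 3))
true_H = rng.random((3, 200))
V = true_W @ true_H + 0.01 * rng.random((8, 200))
W, H = extract_synergies(V, k=3)
print("relative reconstruction error:", np.linalg.norm(V - W @ H) / np.linalg.norm(V))
```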

  19. Pure random search for ambient sensor distribution optimisation in a smart home environment.

    PubMed

    Poland, Michael P; Nugent, Chris D; Wang, Hui; Chen, Liming

    2011-01-01

    Smart homes are living spaces equipped with technology that allows individuals to remain in their own homes for longer rather than being institutionalised. Sensors are the fundamental physical layer within any smart home, as the data they generate inform decision support systems, which in turn trigger appropriate actuator actions. Sensor positioning is therefore a fundamental characteristic of a smart home. Contemporary smart home sensor distribution follows either (a) a total-coverage approach or (b) a human assessment approach. These methods for sensor arrangement are not data-driven strategies and are often unempirical and irrational. This study hypothesised that sensor deployment directed by an optimisation method that uses inhabitants' spatial frequency data as the search space would produce better sensor distributions than the current method of deployment by engineers. Seven human engineers were tasked with creating sensor distributions based on perceived utility for 9 deployment scenarios. A Pure Random Search (PRS) algorithm was then tasked with creating matched sensor distributions. The PRS method produced superior distributions in 98.4% of test cases (n=64) when the engineers had no access to the spatial frequency data, and in 92.0% of test cases (n=64) when they had full access to these data. These results confirmed the hypothesis.
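
    A minimal sketch of a Pure Random Search over a spatial frequency map, assuming (as an illustration only) that a layout's score is the total visit frequency covered by the chosen sensor cells:

```python
import numpy as np

def pure_random_search(spatial_freq, n_sensors, n_iters=10_000, seed=0):
    """Place sensors by repeatedly sampling random layouts and keeping the best one.

    spatial_freq: 2-D array of inhabitant visit frequencies (the search space).
    The score of a layout is the total frequency covered by the chosen cells.
    """
    rng = np.random.default_rng(seed)
    flat = spatial_freq.ravel()
    best_idx, best_score = None, -np.inf
    for _ in range(n_iters):
        idx = rng.choice(flat.size, size=n_sensors, replace=False)  # one random layout
        score = flat[idx].sum()
        if score > best_score:
            best_idx, best_score = idx, score
    return np.unravel_index(best_idx, spatial_freq.shape), best_score

freq = np.random.default_rng(42).random((20, 20))   # stand-in frequency map
positions, covered = pure_random_search(freq, n_sensors=5)
print(list(zip(*positions)), covered)
```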

  20. Randomization in clinical trials in orthodontics: its significance in research design and methods to achieve it.

    PubMed

    Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore

    2011-12-01

    Randomization is a key step in reducing selection bias during the treatment allocation phase of randomized clinical trials. The process of randomization follows specific steps: generation of the randomization list, allocation concealment, and implementation of randomization. In the dental and orthodontic literature, treatment allocation is frequently characterized as random, yet the randomization procedures actually followed are often inappropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. Randomization entails generation of the random allocation, allocation concealment, and the actual methodology for implementing treatment allocation randomly and unpredictably. The most popular randomization methods include some form of restricted and/or stratified randomization. This article explains why randomization is an integral part of sound clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
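
    As an illustration of restricted randomization, the sketch below generates a permuted-block randomization list; the block size, arm labels, and seed are arbitrary choices, not a recommendation from the article.

```python
import random

def block_randomization(n_participants, arms=("A", "B"), block_size=4, seed=2024):
    """Generate a restricted (permuted-block) randomization list.

    Each block contains every arm an equal number of times, keeping group sizes
    balanced throughout recruitment while the order within each block stays unpredictable.
    """
    assert block_size % len(arms) == 0
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_participants:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)
        allocation.extend(block)
    return allocation[:n_participants]

print(block_randomization(10))   # a balanced, shuffled allocation list for 10 participants
```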

  1. A hybrid-type quantum random number generator

    NASA Astrophysics Data System (ADS)

    Hai-Qiang, Ma; Wu, Zhu; Ke-Jin, Wei; Rui-Xue, Li; Hong-Wei, Liu

    2016-05-01

    This paper proposes a well-performing hybrid-type truly quantum random number generator based on the time interval between two independent single-photon detection signals, which is practical and intuitive, and generates the initial random number sources from a combination of multiple existing random number sources. A time-to-amplitude converter and multichannel analyzer are used for qualitative analysis to demonstrate that each and every step is random. Furthermore, a carefully designed data acquisition system is used to obtain a high-quality random sequence. Our scheme is simple and proves that the random number bit rate can be dramatically increased to satisfy practical requirements. Project supported by the National Natural Science Foundation of China (Grant Nos. 61178010 and 11374042), the Fund of State Key Laboratory of Information Photonics and Optical Communications (Beijing University of Posts and Telecommunications), China, and the Fundamental Research Funds for the Central Universities of China (Grant No. bupt2014TS01).

  2. High-speed true random number generation based on paired memristors for security electronics

    NASA Astrophysics Data System (ADS)

    Zhang, Teng; Yin, Minghui; Xu, Changmin; Lu, Xiayan; Sun, Xinhao; Yang, Yuchao; Huang, Ru

    2017-11-01

    True random number generator (TRNG) is a critical component in hardware security that is increasingly important in the era of mobile computing and internet of things. Here we demonstrate a TRNG using intrinsic variation of memristors as a natural source of entropy that is otherwise undesirable in most applications. The random bits were produced by cyclically switching a pair of tantalum oxide based memristors and comparing their resistance values in the off state, taking advantage of the more pronounced resistance variation compared with that in the on state. Using an alternating read scheme in the designed TRNG circuit, the unbiasedness of the random numbers was significantly improved, and the bitstream passed standard randomness tests. The Pt/TaO x /Ta memristors fabricated in this work have fast programming/erasing speeds of ˜30 ns, suggesting a high random number throughput. The approach proposed here thus holds great promise for physically-implemented random number generation.

  3. High-speed true random number generation based on paired memristors for security electronics.

    PubMed

    Zhang, Teng; Yin, Minghui; Xu, Changmin; Lu, Xiayan; Sun, Xinhao; Yang, Yuchao; Huang, Ru

    2017-11-10

    True random number generator (TRNG) is a critical component in hardware security that is increasingly important in the era of mobile computing and internet of things. Here we demonstrate a TRNG using intrinsic variation of memristors as a natural source of entropy that is otherwise undesirable in most applications. The random bits were produced by cyclically switching a pair of tantalum oxide based memristors and comparing their resistance values in the off state, taking advantage of the more pronounced resistance variation compared with that in the on state. Using an alternating read scheme in the designed TRNG circuit, the unbiasedness of the random numbers was significantly improved, and the bitstream passed standard randomness tests. The Pt/TaO x /Ta memristors fabricated in this work have fast programming/erasing speeds of ∼30 ns, suggesting a high random number throughput. The approach proposed here thus holds great promise for physically-implemented random number generation.
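
    A simulation-only sketch of the comparison idea: log-normally distributed off-state resistances stand in for cycle-to-cycle device variation, and a generic von Neumann debiasing step is used in place of the paper's alternating read scheme.

```python
import numpy as np

rng = np.random.default_rng(7)
n_cycles = 100_000

# Stand-in for cycle-to-cycle off-state resistance variation of the two memristors;
# one device is given a deliberate systematic offset to mimic bias between the pair.
r_a = rng.lognormal(mean=np.log(1e6), sigma=0.3, size=n_cycles) * 1.05
r_b = rng.lognormal(mean=np.log(1e6), sigma=0.3, size=n_cycles)

raw_bits = (r_a > r_b).astype(np.uint8)          # comparator output per switching cycle

# Generic von Neumann debiasing (a stand-in for the paper's alternating read scheme):
# inspect non-overlapping pairs and keep a bit only when the two bits differ.
pairs = raw_bits[: len(raw_bits) // 2 * 2].reshape(-1, 2)
keep = pairs[:, 0] != pairs[:, 1]
debiased = pairs[keep, 0]

print("raw ones fraction:", raw_bits.mean(), "debiased ones fraction:", debiased.mean())
```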

  4. Synthetic single domain antibodies for the conformational trapping of membrane proteins

    PubMed Central

    Arnold, Fabian M; Stohler, Peter; Bocquet, Nicolas; Hug, Melanie N; Huber, Sylwia; Siegrist, Martin; Hetemann, Lisa; Gera, Jennifer; Gmür, Samira; Spies, Peter; Gygax, Daniel

    2018-01-01

    Mechanistic and structural studies of membrane proteins require their stabilization in specific conformations. Single domain antibodies are potent reagents for this purpose, but their generation relies on immunizations, which impedes selections in the presence of ligands typically needed to populate defined conformational states. To overcome this key limitation, we developed an in vitro selection platform based on synthetic single domain antibodies named sybodies. To target the limited hydrophilic surfaces of membrane proteins, we designed three sybody libraries that exhibit different shapes and moderate hydrophobicity of the randomized surface. A robust binder selection cascade combining ribosome and phage display enabled the generation of conformation-selective, high affinity sybodies against an ABC transporter and two previously intractable human SLC transporters, GlyT1 and ENT1. The platform does not require access to animal facilities and builds exclusively on commercially available reagents, thus enabling every lab to rapidly generate binders against challenging membrane proteins. PMID:29792401

  5. Nanodevices for generating power from molecules and batteryless sensing

    DOEpatents

    Wang, Yinmin; Wang, Xianying; Hamza, Alex V.

    2017-01-03

    A nanoconverter or nanosensor is disclosed capable of directly generating electricity through physisorption interactions with molecules that are dipole containing organic species in a molecule interaction zone. High surface-to-volume ratio semiconductor nanowires or nanotubes (such as ZnO, silicon, carbon, etc.) are grown either aligned or randomly-aligned on a substrate. Epoxy or other nonconductive polymers are used to seal portions of the nanowires or nanotubes to create molecule noninteraction zones. By correlating certain molecule species to voltages generated, a nanosensor may quickly identify which species is detected. Nanoconverters in a series parallel arrangement may be constructed in planar, stacked, or rolled arrays to supply power to nano- and micro-devices without use of external batteries. In some cases breath, from human or other life forms, contain sufficient molecules to power a nanoconverter. A membrane permeable to certain molecules around the molecule interaction zone increases specific molecule nanosensor selectivity response.

  6. Nanodevices for generating power from molecules and batteryless sensing

    DOEpatents

    Wang, Yinmin; Wang, Xianying; Hamza, Alex V.

    2015-06-09

    A nanoconverter or nanosensor is disclosed capable of directly generating electricity through physisorption interactions with molecules that are dipole containing organic species in a molecule interaction zone. High surface-to-volume ratio semiconductor nanowires or nanotubes (such as ZnO, silicon, carbon, etc.) are grown either aligned or randomly-aligned on a substrate. Epoxy or other nonconductive polymers are used to seal portions of the nanowires or nanotubes to create molecule noninteraction zones. By correlating certain molecule species to voltages generated, a nanosensor may quickly identify which species is detected. Nanoconverters in a series parallel arrangement may be constructed in planar, stacked, or rolled arrays to supply power to nano- and micro-devices without use of external batteries. In some cases breath, from human or other life forms, contain sufficient molecules to power a nanoconverter. A membrane permeable to certain molecules around the molecule interaction zone increases specific molecule nanosensor selectivity response.

  7. Nanodevices for generating power from molecules and batteryless sensing

    DOEpatents

    Wang, Yinmin; Wang, Xianying; Hamza, Alex V.

    2014-07-15

    A nanoconverter or nanosensor is disclosed capable of directly generating electricity through physisorption interactions with molecules that are dipole containing organic species in a molecule interaction zone. High surface-to-volume ratio semiconductor nanowires or nanotubes (such as ZnO, silicon, carbon, etc.) are grown either aligned or randomly-aligned on a substrate. Epoxy or other nonconductive polymers are used to seal portions of the nanowires or nanotubes to create molecule noninteraction zones. By correlating certain molecule species to voltages generated, a nanosensor may quickly identify which species is detected. Nanoconverters in a series parallel arrangement may be constructed in planar, stacked, or rolled arrays to supply power to nano- and micro-devices without use of external batteries. In some cases breath, from human or other life forms, contain sufficient molecules to power a nanoconverter. A membrane permeable to certain molecules around the molecule interaction zone increases specific molecule nanosensor selectivity response.

  8. On the design of henon and logistic map-based random number generator

    NASA Astrophysics Data System (ADS)

    Magfirawaty; Suryadi, M. T.; Ramli, Kalamullah

    2017-10-01

    The key sequence is one of the main elements of a cryptosystem, and True Random Number Generators (TRNGs) are one approach to generating it. The randomness sources of TRNGs fall into three main groups: electrical-noise based, jitter based, and chaos based. The chaos-based approach uses a non-linear dynamic system (in continuous or discrete time) as an entropy source. In this study, a new design of TRNG based on a discrete-time chaotic system is proposed and simulated in LabVIEW. The principle of the design is to combine a 2D and a 1D chaotic system. A mathematical model is implemented for numerical simulations. We use a comparator process as the harvester to obtain the series of random bits. Without any post-processing, the proposed design generates random bit sequences with high entropy that pass all NIST 800.22 statistical tests.
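
    A guess at the comparator-style harvesting described above (not the authors' design): a Henon map and a logistic map are iterated, each output is thresholded at its median, and the two bit streams are XOR-combined. The map parameters are the classic chaotic choices.

```python
import numpy as np

def henon_logistic_bits(n, a=1.4, b=0.3, r=3.99, seed=(0.1, 0.3, 0.5)):
    """Harvest bits by thresholding a 2D Henon map and a 1D logistic map and XOR-ing them."""
    x, y, z = seed
    xs, zs = np.empty(n), np.empty(n)
    for i in range(n):
        x, y = 1.0 - a * x * x + y, b * x      # 2-D Henon map
        z = r * z * (1.0 - z)                  # 1-D logistic map
        xs[i], zs[i] = x, z
    bits_h = (xs > np.median(xs)).astype(np.uint8)   # comparator against the median
    bits_l = (zs > np.median(zs)).astype(np.uint8)
    return bits_h ^ bits_l                           # combine the two chaotic sources

bits = henon_logistic_bits(100_000)
print("ones fraction:", bits.mean())
```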

  9. Pseudo-random bit generator based on lag time series

    NASA Astrophysics Data System (ADS)

    García-Martínez, M.; Campos-Cantón, E.

    2014-12-01

    In this paper, we present a pseudo-random bit generator (PRBG) based on two lag time series of the logistic map, using positive and negative values of the bifurcation parameter. In order to hide the map used to build the pseudo-random series, a delay is introduced in the generation of the time series. When these new series are plotted as xn against xn+1, they present a cloud of points unrelated to the logistic map. Finally, the pseudo-random sequences were tested with the NIST suite, giving satisfactory results for use in stream ciphers.
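
    A toy lagged-logistic-map PRBG along the lines described above; the negative-parameter variant and the exact bit-harvesting rule of the paper are not reproduced, and the lag and seeds are arbitrary.

```python
from collections import deque

def lag_logistic_prbg(n_bits, r=3.99, lag=7, seeds=(0.123, 0.654)):
    """Toy lagged-logistic-map PRBG.

    Two logistic series are iterated, their outputs are delayed by `lag` steps,
    and a bit is emitted by comparing the two delayed values.
    """
    x, y = seeds
    buf_x, buf_y = deque(maxlen=lag), deque(maxlen=lag)
    bits = []
    while len(bits) < n_bits:
        x = r * x * (1.0 - x)
        y = r * y * (1.0 - y)
        buf_x.append(x)
        buf_y.append(y)
        if len(buf_x) == lag:                     # only emit once the delay is filled
            bits.append(1 if buf_x[0] > buf_y[0] else 0)
    return bits

print(sum(lag_logistic_prbg(10_000)) / 10_000)    # ones fraction, expected near 0.5
```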

  10. Multiple ECG Fiducial Points-Based Random Binary Sequence Generation for Securing Wireless Body Area Networks.

    PubMed

    Zheng, Guanglou; Fang, Gengfa; Shankaran, Rajan; Orgun, Mehmet A; Zhou, Jie; Qiao, Li; Saleem, Kashif

    2017-05-01

    Generating random binary sequences (BSes) is a fundamental requirement in cryptography. A BS is a sequence of N bits, and each bit has a value of 0 or 1. For securing sensors within wireless body area networks (WBANs), electrocardiogram (ECG)-based BS generation methods have been widely investigated in which interpulse intervals (IPIs) from each heartbeat cycle are processed to produce BSes. Using these IPI-based methods to generate a 128-bit BS in real time normally takes around half a minute. In order to improve the time efficiency of such methods, this paper presents an ECG multiple fiducial-points based binary sequence generation (MFBSG) algorithm. The technique of discrete wavelet transforms is employed to detect arrival time of these fiducial points, such as P, Q, R, S, and T peaks. Time intervals between them, including RR, RQ, RS, RP, and RT intervals, are then calculated based on this arrival time, and are used as ECG features to generate random BSes with low latency. According to our analysis on real ECG data, these ECG feature values exhibit the property of randomness and, thus, can be utilized to generate random BSes. Compared with the schemes that solely rely on IPIs to generate BSes, this MFBSG algorithm uses five feature values from one heart beat cycle, and can be up to five times faster than the solely IPI-based methods. So, it achieves a design goal of low latency. According to our analysis, the complexity of the algorithm is comparable to that of fast Fourier transforms. These randomly generated ECG BSes can be used as security keys for encryption or authentication in a WBAN system.
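
    A schematic illustration of drawing several bits per heartbeat from multiple fiducial intervals, using synthetic fiducial times and a simple least-significant-bit quantization; the MFBSG algorithm's actual detection (wavelet-based) and encoding steps may differ.

```python
import numpy as np

rng = np.random.default_rng(3)
n_beats = 64

# Synthetic fiducial timing: R-peak arrival times plus per-beat P, Q, S, T interval
# values with jitter, standing in for fiducial points detected from real ECG.
r_times = np.cumsum(0.8 + 0.05 * rng.standard_normal(n_beats))
offsets = {"P": 0.16, "Q": 0.04, "S": 0.04, "T": 0.30}   # nominal |R-X| gaps in seconds

intervals = {"RR": np.diff(r_times)}
for name, off in offsets.items():
    intervals["R" + name] = off + 0.01 * rng.standard_normal(n_beats)

# Quantize each interval to milliseconds and keep its 2 least significant bits; the
# point is only that several fiducial intervals per heartbeat yield several bits per beat.
bits = []
for vals in intervals.values():
    ms = np.round(np.asarray(vals) * 1000).astype(np.int64)
    for k in range(2):
        bits.extend(((ms >> k) & 1).tolist())

print(len(bits), "bits from", n_beats, "beats; ones fraction:", float(np.mean(bits)))
```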

  11. Random Item Generation Is Affected by Age

    ERIC Educational Resources Information Center

    Multani, Namita; Rudzicz, Frank; Wong, Wing Yiu Stephanie; Namasivayam, Aravind Kumar; van Lieshout, Pascal

    2016-01-01

    Purpose: Random item generation (RIG) involves central executive functioning. Measuring aspects of random sequences can therefore provide a simple method to complement other tools for cognitive assessment. We examine the extent to which RIG relates to specific measures of cognitive function, and whether those measures can be estimated using RIG…

  12. On the limiting characteristics of quantum random number generators at various clusterings of photocounts

    NASA Astrophysics Data System (ADS)

    Molotkov, S. N.

    2017-03-01

    Various methods for the clustering of photocounts constituting a sequence of random numbers are considered. It is shown that the clustering of photocounts resulting in the Fermi-Dirac distribution makes it possible to achieve the theoretical limit of the random number generation rate.

  13. The correlation structure of several popular pseudorandom number generators

    NASA Technical Reports Server (NTRS)

    Neuman, F.; Merrick, R.; Martin, C. F.

    1973-01-01

    One of the desirable properties of a pseudorandom number generator is that the sequence of numbers it generates should have very low autocorrelation for all shifts except zero shift and multiples of its cycle length. Because of the simple methods used to construct random numbers, this ideal is often not quite fulfilled. A simple method of examining any random number generator for previously unsuspected regularities is discussed. Once such regularities are discovered, it is often easy to derive the mathematical relationships that describe the regular behavior. As examples, it is shown that high correlation exists in mixed and multiplicative congruential random number generators and in prime-modulus Lehmer generators for shifts that are a fraction of their cycle lengths.
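
    The kind of examination described above can be reproduced with a short autocorrelation check; the sketch below uses a deliberately small-modulus mixed congruential generator so that shifts amounting to fractions of the full cycle can be inspected directly (parameters are illustrative, not from the report).

```python
import numpy as np

def lcg_sequence(n, a=69069, c=1, m=2**16, seed=1):
    """Outputs of a mixed congruential generator, scaled to [0, 1)."""
    out = np.empty(n)
    x = seed
    for i in range(n):
        x = (a * x + c) % m
        out[i] = x / m
    return out

def autocorr(u, lag):
    """Sample autocorrelation of the sequence at the given shift."""
    v = u - u.mean()
    return float(np.dot(v[:-lag], v[lag:]) / np.dot(v, v))

# A small modulus keeps the full cycle (2^16) short enough that shifts near
# fractions of the cycle can be examined alongside ordinary short lags.
u = lcg_sequence(2**16)
for lag in (1, 7, 2**14, 2**15):
    print(f"lag {lag:>5}: autocorrelation {autocorr(u, lag):+.4f}")
```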

  14. Randomized Trial of the Effect of Four Second-Generation Antipsychotics and One First-Generation Antipsychotic on Cigarette Smoking, Alcohol, and Drug Use in Chronic Schizophrenia.

    PubMed

    Mohamed, Somaia; Rosenheck, Robert A; Lin, Haiqun; Swartz, Marvin; McEvoy, Joseph; Stroup, Scott

    2015-07-01

    No large-scale randomized trial has compared the effect of different second-generation antipsychotic drugs and any first-generation drug on alcohol, drug, and nicotine use in patients with schizophrenia. The Clinical Antipsychotic Trials of Intervention Effectiveness study randomly assigned 1432 patients formally diagnosed with schizophrenia to four second-generation antipsychotic drugs (olanzapine, risperidone, quetiapine, and ziprasidone) and one first-generation antipsychotic (perphenazine) and followed them for up to 18 months. Secondary outcome data documented cigarettes smoked in the past week and alcohol and drug use severity ratings. At baseline, 61% of patients smoked, 35% used alcohol, and 23% used illicit drugs. Although there were significant effects of time showing reduction in substance use over the 18 months (all p < 0.0001), this study found no evidence that any antipsychotic was robustly superior to any other in a secondary analysis of data on substance use outcomes from a large 18-month randomized schizophrenia trial.

  15. High-Speed Device-Independent Quantum Random Number Generation without a Detection Loophole

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Yuan, Xiao; Li, Ming-Han; Zhang, Weijun; Zhao, Qi; Zhong, Jiaqiang; Cao, Yuan; Li, Yu-Huai; Chen, Luo-Kan; Li, Hao; Peng, Tianyi; Chen, Yu-Ao; Peng, Cheng-Zhi; Shi, Sheng-Cai; Wang, Zhen; You, Lixing; Ma, Xiongfeng; Fan, Jingyun; Zhang, Qiang; Pan, Jian-Wei

    2018-01-01

    Quantum mechanics provides the means of generating genuine randomness that is impossible with deterministic classical processes. Remarkably, the unpredictability of randomness can be certified in a manner that is independent of implementation devices. Here, we present an experimental study of device-independent quantum random number generation based on a detection-loophole-free Bell test with entangled photons. In the randomness analysis, without the independent identical distribution assumption, we consider the worst case scenario that the adversary launches the most powerful attacks against the quantum adversary. After considering statistical fluctuations and applying an 80 Gb × 45.6 Mb Toeplitz matrix hashing, we achieve a final random bit rate of 114 bits/s, with a failure probability less than 10⁻⁵. This marks a critical step towards realistic applications in cryptography and fundamental physics tests.
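
    A small-scale illustration of Toeplitz-matrix hashing as a randomness extractor (the experiment above uses an 80 Gb × 45.6 Mb matrix; the sizes and bit sources here are tiny placeholders):

```python
import numpy as np

def toeplitz_extract(raw_bits, out_len, seed_bits):
    """Hash raw bits down to out_len bits with a seeded binary Toeplitz matrix.

    seed_bits (length out_len + len(raw_bits) - 1) defines the constant diagonals:
    T[i, j] = seed_bits[i - j + n - 1]; the output is (T @ raw) mod 2.
    """
    raw = np.asarray(raw_bits, dtype=np.int64)
    seed = np.asarray(seed_bits, dtype=np.int64)
    n = raw.size
    assert seed.size == out_len + n - 1
    i = np.arange(out_len)[:, None]
    j = np.arange(n)[None, :]
    T = seed[i - j + n - 1]                 # Toeplitz structure: constant along diagonals
    return (T @ raw) % 2

rng = np.random.default_rng(0)
raw = rng.integers(0, 2, 1024)              # stand-in for raw, possibly biased measurement bits
seed = rng.integers(0, 2, 512 + 1024 - 1)   # public random seed defining the matrix
extracted = toeplitz_extract(raw, 512, seed)
print(extracted[:16], float(extracted.mean()))
```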

  16. KMCLib 1.1: Extended random number support and technical updates to the KMCLib general framework for kinetic Monte-Carlo simulations

    NASA Astrophysics Data System (ADS)

    Leetmaa, Mikael; Skorodumova, Natalia V.

    2015-11-01

    We here present a revised version, v1.1, of the KMCLib general framework for kinetic Monte-Carlo (KMC) simulations. The generation of random numbers in KMCLib now relies on the C++11 standard library implementation, and support has been added for the user to choose from a set of C++11 implemented random number generators. The Mersenne-twister, the 24 and 48 bit RANLUX and a 'minimal-standard' PRNG are supported. We have also included the possibility to use true random numbers via the C++11 std::random_device generator. This release also includes technical updates to support the use of an extended range of operating systems and compilers.

  17. Minimalist design of a robust real-time quantum random number generator

    NASA Astrophysics Data System (ADS)

    Kravtsov, K. S.; Radchenko, I. V.; Kulik, S. P.; Molotkov, S. N.

    2015-08-01

    We present a simple and robust construction of a real-time quantum random number generator (QRNG). Our minimalist approach ensures stable operation of the device as well as its simple and straightforward hardware implementation as a stand-alone module. As a source of randomness the device uses measurements of time intervals between clicks of a single-photon detector. The obtained raw sequence is then filtered and processed by a deterministic randomness extractor, which is realized as a look-up table. This enables high speed on-the-fly processing without the need of extensive computations. The overall performance of the device is around 1 random bit per detector click, resulting in 1.2 Mbit/s generation rate in our implementation.

  18. Anomalous Anticipatory Responses in Networked Random Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Roger D.; Bancel, Peter A.

    2006-10-16

    We examine an 8-year archive of synchronized, parallel time series of random data from a world-spanning network of physical random event generators (REGs). The archive is a publicly accessible matrix of normally distributed 200-bit sums recorded at 1 Hz which extends from August 1998 to the present. The primary question is whether these data show non-random structure associated with major events such as natural or man-made disasters, terrible accidents, or grand celebrations. Secondarily, we examine the time course of apparently correlated responses. Statistical analyses of the data reveal consistent evidence that events which strongly affect people engender small but significant effects. These include suggestions of anticipatory responses in some cases, leading to a series of specialized analyses to assess possible non-random structure preceding precisely timed events. A focused examination of data collected around the time of earthquakes with Richter magnitude 6 and greater reveals non-random structure with a number of intriguing, potentially important features. Anomalous effects in the REG data are seen only when the corresponding earthquakes occur in populated areas. No structure is found if they occur in the oceans. We infer that an important contributor to the effect is the relevance of the earthquake to humans. Epoch averaging reveals evidence for changes in the data some hours prior to the main temblor, suggestive of reverse causation.

  19. Spline methods for approximating quantile functions and generating random samples

    NASA Technical Reports Server (NTRS)

    Schiess, J. R.; Matthews, C. G.

    1985-01-01

    Two cubic spline formulations are presented for representing the quantile function (inverse cumulative distribution function) of a random sample of data. Both B-spline and rational spline approximations are compared with analytic representations of the quantile function. It is also shown how these representations can be used to generate random samples for use in simulation studies. Comparisons are made on samples generated from known distributions and on a sample of experimental data. The spline representations are more accurate for multimodal and skewed samples and require much less time to generate samples than the analytic representation.
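
    A hedged sketch of the same idea using a monotone PCHIP cubic instead of the paper's B-spline or rational spline: fit the empirical quantile function on a probability grid, then generate new samples by feeding uniform variates through it.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

# A skewed "experimental" sample standing in for the data whose quantile function
# we want to represent.
rng = np.random.default_rng(0)
data = rng.gamma(shape=2.0, scale=1.5, size=5_000)

# Build the empirical quantile function Q(p) on a probability grid and fit a
# monotone piecewise-cubic spline to it.
p_grid = np.linspace(0.001, 0.999, 200)
q_grid = np.quantile(data, p_grid)
quantile_spline = PchipInterpolator(p_grid, q_grid)

# Inverse-transform sampling: feed uniforms through the fitted quantile function.
u = rng.uniform(0.001, 0.999, size=10_000)
samples = quantile_spline(u)
print("original mean/std:", data.mean(), data.std())
print("resampled mean/std:", samples.mean(), samples.std())
```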

  20. Random Number Generation and Executive Functions in Parkinson's Disease: An Event-Related Brain Potential Study.

    PubMed

    Münte, Thomas F; Joppich, Gregor; Däuper, Jan; Schrader, Christoph; Dengler, Reinhard; Heldmann, Marcus

    2015-01-01

    The generation of random sequences is considered to tax executive functions and has previously been reported to be impaired in Parkinson's disease (PD). Our aim was to assess the neurophysiological markers of random number generation in PD. Event-related potentials (ERPs) were recorded in 12 PD patients and 12 age-matched normal controls (NC) while they either engaged in random number generation (RNG), pressing the number keys on a computer keyboard in a random sequence, or in ordered number generation (ONG), requiring key presses in the canonical order. Key presses were paced by an external auditory stimulus at a rate of one tone every 1800 ms. As a secondary task, subjects had to monitor the tone sequence for a particular target tone, to which the number "0" key had to be pressed. This target tone occurred randomly and infrequently, thus creating a secondary oddball task. Behaviorally, PD patients showed an increased tendency to count in steps of one as well as a tendency towards repetition avoidance. Electrophysiologically, the amplitude of the P3 component of the ERP to the target tone of the secondary task was reduced during RNG in PD but not in NC. The behavioral findings indicate less random behavior in PD, while the ERP findings suggest that this impairment arises because attentional resources are depleted in PD.

  1. Anderson localization for radial tree-like random quantum graphs

    NASA Astrophysics Data System (ADS)

    Hislop, Peter D.; Post, Olaf

    We prove that certain random models associated with radial, tree-like, rooted quantum graphs exhibit Anderson localization at all energies. The two main examples are the random length model (RLM) and the random Kirchhoff model (RKM). In the RLM, the lengths of each generation of edges form a family of independent, identically distributed random variables (iid). For the RKM, the iid random variables are associated with each generation of vertices and moderate the current flow through the vertex. We consider extensions to various families of decorated graphs and prove stability of localization with respect to decoration. In particular, we prove Anderson localization for the random necklace model.

  2. The Reliability of Randomly Generated Math Curriculum-Based Measurements

    ERIC Educational Resources Information Center

    Strait, Gerald G.; Smith, Bradley H.; Pender, Carolyn; Malone, Patrick S.; Roberts, Jarod; Hall, John D.

    2015-01-01

    "Curriculum-Based Measurement" (CBM) is a direct method of academic assessment used to screen and evaluate students' skills and monitor their responses to academic instruction and intervention. Interventioncentral.org offers a math worksheet generator at no cost that creates randomly generated "math curriculum-based measures"…

  3. Quantum random number generator based on quantum nature of vacuum fluctuations

    NASA Astrophysics Data System (ADS)

    Ivanova, A. E.; Chivilikhin, S. A.; Gleim, A. V.

    2017-11-01

    A quantum random number generator (QRNG) allows true random bit sequences to be obtained. In QRNGs based on the quantum nature of vacuum, an optical beam splitter with two inputs and two outputs is normally used. We compare the mathematical descriptions of a spatial beam splitter and a fiber Y-splitter in the quantum model for a QRNG based on homodyne detection. The descriptions are identical, which allows fiber Y-splitters to be used in practical QRNG schemes, simplifying the setup. We also derive relations between the input radiation and the resulting differential current of the homodyne detector. We experimentally demonstrate the possibility of true random bit generation using a QRNG based on homodyne detection with a Y-splitter.

  4. A revision of the subtract-with-borrow random number generators

    NASA Astrophysics Data System (ADS)

    Sibidanov, Alexei

    2017-12-01

    The most popular and widely used subtract-with-borrow generator, also known as RANLUX, is reimplemented as a linear congruential generator using large-integer arithmetic with a modulus size of 576 bits. Modern computers, as well as the specific structure of the modulus inferred from RANLUX, allow for the development of a fast modular multiplication, the core of the procedure, which was previously believed to be slow and too costly in terms of computing resources. Our tests show a significant gain in generation speed, comparable with other fast, high-quality random number generators. An additional feature is the fast skipping of generator states, leading to a seeding scheme that guarantees the uniqueness of random number sequences. Licensing provisions: GPLv3. Programming languages: C++, C, Assembler.

  5. Graphene based widely-tunable and singly-polarized pulse generation with random fiber lasers

    PubMed Central

    Yao, B. C.; Rao, Y. J.; Wang, Z. N.; Wu, Y.; Zhou, J. H.; Wu, H.; Fan, M. Q.; Cao, X. L.; Zhang, W. L.; Chen, Y. F.; Li, Y. R.; Churkin, D.; Turitsyn, S.; Wong, C. W.

    2015-01-01

    Pulse generation often requires a stabilized cavity and its corresponding mode structure for initial phase-locking. Contrastingly, modeless cavity-free random lasers provide new possibilities for high quantum efficiency lasing that could potentially be widely tunable spectrally and temporally. Pulse generation in random lasers, however, has remained elusive since the discovery of modeless gain lasing. Here we report coherent pulse generation with modeless random lasers based on the unique polarization selectivity and broadband saturable absorption of monolayer graphene. Simultaneous temporal compression of cavity-free pulses are observed with such a polarization modulation, along with a broadly-tunable pulsewidth across two orders of magnitude down to 900 ps, a broadly-tunable repetition rate across three orders of magnitude up to 3 MHz, and a singly-polarized pulse train at 41 dB extinction ratio, about an order of magnitude larger than conventional pulsed fiber lasers. Moreover, our graphene-based pulse formation also demonstrates robust pulse-to-pulse stability and wide-wavelength operation due to the cavity-less feature. Such a graphene-based architecture not only provides a tunable pulsed random laser for fiber-optic sensing, speckle-free imaging, and laser-material processing, but also a new way for the non-random CW fiber lasers to generate widely tunable and singly-polarized pulses. PMID:26687730

  6. Graphene based widely-tunable and singly-polarized pulse generation with random fiber lasers.

    PubMed

    Yao, B C; Rao, Y J; Wang, Z N; Wu, Y; Zhou, J H; Wu, H; Fan, M Q; Cao, X L; Zhang, W L; Chen, Y F; Li, Y R; Churkin, D; Turitsyn, S; Wong, C W

    2015-12-21

    Pulse generation often requires a stabilized cavity and its corresponding mode structure for initial phase-locking. Contrastingly, modeless cavity-free random lasers provide new possibilities for high quantum efficiency lasing that could potentially be widely tunable spectrally and temporally. Pulse generation in random lasers, however, has remained elusive since the discovery of modeless gain lasing. Here we report coherent pulse generation with modeless random lasers based on the unique polarization selectivity and broadband saturable absorption of monolayer graphene. Simultaneous temporal compression of cavity-free pulses are observed with such a polarization modulation, along with a broadly-tunable pulsewidth across two orders of magnitude down to 900 ps, a broadly-tunable repetition rate across three orders of magnitude up to 3 MHz, and a singly-polarized pulse train at 41 dB extinction ratio, about an order of magnitude larger than conventional pulsed fiber lasers. Moreover, our graphene-based pulse formation also demonstrates robust pulse-to-pulse stability and wide-wavelength operation due to the cavity-less feature. Such a graphene-based architecture not only provides a tunable pulsed random laser for fiber-optic sensing, speckle-free imaging, and laser-material processing, but also a new way for the non-random CW fiber lasers to generate widely tunable and singly-polarized pulses.

  7. High-Speed Device-Independent Quantum Random Number Generation without a Detection Loophole.

    PubMed

    Liu, Yang; Yuan, Xiao; Li, Ming-Han; Zhang, Weijun; Zhao, Qi; Zhong, Jiaqiang; Cao, Yuan; Li, Yu-Huai; Chen, Luo-Kan; Li, Hao; Peng, Tianyi; Chen, Yu-Ao; Peng, Cheng-Zhi; Shi, Sheng-Cai; Wang, Zhen; You, Lixing; Ma, Xiongfeng; Fan, Jingyun; Zhang, Qiang; Pan, Jian-Wei

    2018-01-05

    Quantum mechanics provides the means of generating genuine randomness that is impossible with deterministic classical processes. Remarkably, the unpredictability of randomness can be certified in a manner that is independent of implementation devices. Here, we present an experimental study of device-independent quantum random number generation based on a detection-loophole-free Bell test with entangled photons. In the randomness analysis, without the independent identical distribution assumption, we consider the worst case scenario that the adversary launches the most powerful attacks against the quantum adversary. After considering statistical fluctuations and applying an 80  Gb×45.6  Mb Toeplitz matrix hashing, we achieve a final random bit rate of 114  bits/s, with a failure probability less than 10^{-5}. This marks a critical step towards realistic applications in cryptography and fundamental physics tests.

  8. Flexible and multi-directional piezoelectric energy harvester for self-powered human motion sensor

    NASA Astrophysics Data System (ADS)

    Kim, Min-Ook; Pyo, Soonjae; Oh, Yongkeun; Kang, Yunsung; Cho, Kyung-Ho; Choi, Jungwook; Kim, Jongbaeg

    2018-03-01

    A flexible piezoelectric strain energy harvester that is responsive to multi-directional input forces produced by various human motions is proposed. The structure of the harvester, which includes a polydimethylsiloxane (PDMS) bump, facilitates the effective conversion of strain energy, produced by input forces applied in random directions, into electrical energy. The structural design of the PDMS bump and frame as well as the slits in the piezoelectric polyvinylidene fluoride (PVDF) film provide mechanical flexibility and enhance the strain induced in the PVDF film under input forces applied at various angles. The amount and direction of the strain induced in PVDF can be changed by the direction of the applied force; thus, the generated output power can be varied. The measured maximum output peak voltage is 1.75, 1.29, and 0.98 V when an input force of 4 N (2 Hz) is applied at angles of 0°, 45°, and 90°, and the corresponding maximum output power is 0.064, 0.026, and 0.02 μW, respectively. Moreover, the harvester stably generates output voltage over 1.4 × 10⁴ cycles. Thus, the proposed harvester successfully identifies and converts strain energy produced by multi-directional input forces by various human motions into electrical energy. We demonstrate the potential utility of the proposed flexible energy harvester as a self-powered human motion sensor for wireless healthcare systems.

  9. Neutron monitor generated data distributions in quantum variational Monte Carlo

    NASA Astrophysics Data System (ADS)

    Kussainov, A. S.; Pya, N.

    2016-08-01

    We have assessed the potential of neutron monitor hardware as a random number generator for normal and uniform distributions. Data tables from acquisition channels without extreme changes in signal level were chosen as the retrospective model. The stochastic component was extracted by fitting the raw data with splines and then subtracting the fit. Scaling the extracted data to zero mean and unit variance is sufficient to obtain a stable standard normal random variate. The distributions under consideration pass all available normality tests. Inverse transform sampling is suggested as a source of uniform random numbers. The variational Monte Carlo method for the quantum harmonic oscillator was used to test the quality of our random numbers. If the data delivery rate is important and the conventional one-minute-resolution neutron count is insufficient, we can always settle for an efficient seed generator to feed a faster algorithmic random number generator, or create a buffer.
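
    A minimal sketch of the processing chain described above on synthetic counts (real input would be the monitor's data tables): spline detrending, standardization to an approximately normal variate, and a probability-integral transform to uniforms.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.stats import norm

# Synthetic stand-in for one day of 1-minute neutron counts: a slow diurnal drift
# plus Poisson counting noise.
rng = np.random.default_rng(0)
t = np.arange(1440.0)
counts = rng.poisson(lam=100 + 5 * np.sin(2 * np.pi * t / 1440)).astype(float)

# Fit a smoothing spline to capture the slow trend and subtract it, keeping only
# the stochastic component, then scale to zero mean and unit variance.
trend = UnivariateSpline(t, counts, s=len(t) * counts.var())(t)
residual = counts - trend
z = (residual - residual.mean()) / residual.std()

# Uniform variates follow from the probability integral transform (the inverse
# transform route mentioned above).
u = norm.cdf(z)
print("mean/std of z:", z.mean(), z.std(), "range of u:", u.min(), u.max())
```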

  10. Random bits, true and unbiased, from atmospheric turbulence

    PubMed Central

    Marangon, Davide G.; Vallone, Giuseppe; Villoresi, Paolo

    2014-01-01

    Random numbers are a fundamental ingredient for secure communications and numerical simulation, as well as for games and, more generally, for information science. Physical processes with intrinsic unpredictability may be exploited to generate genuine random numbers. Optical propagation in strong atmospheric turbulence is exploited here for this purpose, by observing a laser beam after a 143 km free-space path. In addition, we developed an algorithm to extract the randomness of the beam images at the receiver without post-processing. The numbers passed very selective randomness tests for qualification as genuine random numbers. The extraction algorithm can be easily generalized to random images generated by different physical processes. PMID:24976499

  11. Randomized trial of exclusive human milk versus preterm formula diets in extremely premature infants

    USDA-ARS?s Scientific Manuscript database

    Our objective was to compare the duration of parenteral nutrition, growth, and morbidity in extremely premature infants fed exclusive diets of either bovine milk-based preterm formula (BOV) or donor human milk and human milk-based human milk fortifier (HUM), in a randomized trial of formula vs human...

  12. Asynchronous Replication and Autosome-Pair Non-Equivalence in Human Embryonic Stem Cells

    PubMed Central

    Dutta, Devkanya; Ensminger, Alexander W.; Zucker, Jacob P.; Chess, Andrew

    2009-01-01

    A number of mammalian genes exhibit the unusual properties of random monoallelic expression and random asynchronous replication. Such exceptional genes include genes subject to X inactivation and autosomal genes including odorant receptors, immunoglobulins, interleukins, pheromone receptors, and p120 catenin. In differentiated cells, random asynchronous replication of interspersed autosomal genes is coordinated at the whole chromosome level, indicative of chromosome-pair non-equivalence. Here we have investigated the replication pattern of the random asynchronously replicating genes in undifferentiated human embryonic stem cells, using a fluorescence in situ hybridization-based assay. We show that allele-specific replication of X-linked genes and random monoallelic autosomal genes occurs in human embryonic stem cells. The direction of replication is coordinated at the whole chromosome level and can cross the centromere, indicating the existence of autosome-pair non-equivalence in human embryonic stem cells. These results suggest that epigenetic mechanism(s) that randomly distinguish between two parental alleles are emerging in the cells of the inner cell mass, the source of human embryonic stem cells. PMID:19325893

  13. Scope of Various Random Number Generators in Ant System Approach for TSP

    NASA Technical Reports Server (NTRS)

    Sen, S. K.; Shaykhian, Gholam Ali

    2007-01-01

    Several quasi- and pseudo-random number generators are tested within a heuristic based on an ant system approach to the traveling salesman problem. The experiment explores whether any particular generator is most desirable. Such an experiment on large samples has the potential to rank the performance of the generators for the foregoing heuristic, and thereby to shed light on the controversial question of ranking generators in a probabilistic/statistical sense.

  14. True randomness from an incoherent source

    NASA Astrophysics Data System (ADS)

    Qi, Bing

    2017-11-01

    Quantum random number generators (QRNGs) harness the intrinsic randomness in measurement processes: the measurement outputs are truly random, given the input state is a superposition of the eigenstates of the measurement operators. In the case of trusted devices, true randomness could be generated from a mixed state ρ so long as the system entangled with ρ is well protected. We propose a random number generation scheme based on measuring the quadrature fluctuations of a single mode thermal state using an optical homodyne detector. By mixing the output of a broadband amplified spontaneous emission (ASE) source with a single mode local oscillator (LO) at a beam splitter and performing differential photo-detection, we can selectively detect the quadrature fluctuation of a single mode output of the ASE source, thanks to the filtering function of the LO. Experimentally, a quadrature variance about three orders of magnitude larger than the vacuum noise has been observed, suggesting this scheme can tolerate much higher detector noise in comparison with QRNGs based on measuring the vacuum noise. The high quality of this entropy source is evidenced by the small correlation coefficients of the acquired data. A Toeplitz-hashing extractor is applied to generate unbiased random bits from the Gaussian distributed raw data, achieving an efficiency of 5.12 bits per sample. The output of the Toeplitz extractor successfully passes all the NIST statistical tests for random numbers.

  15. Templated sequence insertion polymorphisms in the human genome

    NASA Astrophysics Data System (ADS)

    Onozawa, Masahiro; Aplan, Peter

    2016-11-01

    Templated Sequence Insertion Polymorphism (TSIP) is a recently described form of polymorphism recognized in the human genome, in which a sequence that is templated from a distant genomic region is inserted into the genome, seemingly at random. TSIPs can be grouped into two classes based on nucleotide sequence features at the insertion junctions: class 1 TSIPs show features of insertions that are mediated via the LINE-1 ORF2 protein, including 1) target-site duplication (TSD), 2) polyadenylation 10-30 nucleotides downstream of a “cryptic” polyadenylation signal, and 3) preference for insertion at a 5’-TTTT/A-3’ sequence. In contrast, class 2 TSIPs show features consistent with repair of a DNA double-strand break via insertion of a DNA “patch” that is derived from a distant genomic region. A survey of a large number of normal human volunteers demonstrates that most individuals have 25-30 TSIPs, and that these TSIPs track with specific geographic regions. Similar to other forms of human polymorphism, we suspect that these TSIPs may be important for the generation of human diversity and genetic diseases.

  16. Golden Ratio Versus Pi as Random Sequence Sources for Monte Carlo Integration

    NASA Technical Reports Server (NTRS)

    Sen, S. K.; Agarwal, Ravi P.; Shaykhian, Gholam Ali

    2007-01-01

    We discuss here the relative merits of the golden ratio and pi as possible random sequence sources. The quality of these sequences is not judged directly based on the outcome of all known tests for the randomness of a sequence. Instead, it is determined implicitly by the accuracy of the Monte Carlo integration in a statistical sense. Since our main motive in using a random sequence is to solve real-world problems, it is more desirable to compare the quality of the sequences based on their performance on these problems, in terms of the quality/accuracy of the output. We also compare these sources against those generated by a popular pseudo-random generator, viz., the Matlab rand, and the quasi-random Halton generator, both in terms of error and time complexity. Our study demonstrates that consecutive blocks of digits of each of these numbers produce a good random sequence source. It is observed that randomly chosen blocks of digits do not have any remarkable advantage over consecutive blocks for the accuracy of the Monte Carlo integration. Also, it reveals that pi is a better source of a random sequence than the golden ratio where the accuracy of the integration is concerned.
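
    The "consecutive blocks of digits" idea can be sketched roughly as below; the block length, sample size, test integrand, and the use of mpmath to obtain digits of pi are illustrative assumptions, not the paper's setup.

      import numpy as np
      from mpmath import mp

      def pi_digit_block_uniforms(n_samples, block=10):
          """Interpret consecutive blocks of decimal digits of pi as points in [0, 1)."""
          mp.dps = n_samples * block + 10          # request enough decimal digits
          frac = str(mp.pi)[2:]                    # digits after "3."
          return np.array([int(frac[i * block:(i + 1) * block]) / 10**block
                           for i in range(n_samples)])

      # Monte Carlo estimate of the integral of x^2 over [0, 1] (exact value 1/3).
      n = 2000
      u_pi = pi_digit_block_uniforms(n)
      u_prng = np.random.default_rng(0).random(n)
      print("pi-digit estimate:", np.mean(u_pi**2))
      print("PRNG estimate:   ", np.mean(u_prng**2))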

  17. Analysis of entropy extraction efficiencies in random number generation systems

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu

    2016-05-01

    Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.

  18. Quantum random bit generation using energy fluctuations in stimulated Raman scattering.

    PubMed

    Bustard, Philip J; England, Duncan G; Nunn, Josh; Moffatt, Doug; Spanner, Michael; Lausten, Rune; Sussman, Benjamin J

    2013-12-02

    Random number sequences are a critical resource in modern information processing systems, with applications in cryptography, numerical simulation, and data sampling. We introduce a quantum random number generator based on the measurement of pulse energy quantum fluctuations in Stokes light generated by spontaneously-initiated stimulated Raman scattering. Bright Stokes pulse energy fluctuations up to five times the mean energy are measured with fast photodiodes and converted to unbiased random binary strings. Since the pulse energy is a continuous variable, multiple bits can be extracted from a single measurement. Our approach can be generalized to a wide range of Raman active materials; here we demonstrate a prototype using the optical phonon line in bulk diamond.
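
    Extracting several bits from a single continuous measurement can be illustrated as below; the binning into equiprobable cells via empirical quantiles, the bit depth, and the synthetic "pulse energies" are assumptions for illustration (a real system would calibrate the bins against a model of the source and then apply randomness extraction).

      import numpy as np

      def multibit_from_samples(samples, bits_per_sample=3):
          """Encode each continuous sample by the index of the equiprobable bin
          (defined by empirical quantiles) into which it falls."""
          k = bits_per_sample
          edges = np.quantile(samples, np.linspace(0, 1, 2**k + 1)[1:-1])
          idx = np.searchsorted(edges, samples)          # bin index in [0, 2**k)
          return [format(int(i), '0{}b'.format(k)) for i in idx]

      # Toy usage with exponential-like values standing in for pulse energies.
      rng = np.random.default_rng(2)
      energies = rng.exponential(1.0, size=10000)
      bitstring = ''.join(multibit_from_samples(energies, bits_per_sample=3))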

  19. Experimental study of a quantum random-number generator based on two independent lasers

    NASA Astrophysics Data System (ADS)

    Sun, Shi-Hai; Xu, Feihu

    2017-12-01

    A quantum random-number generator (QRNG) can produce true randomness by utilizing the inherent probabilistic nature of quantum mechanics. Recently, the spontaneous-emission quantum phase noise of the laser has been widely deployed for quantum random-number generation, due to its high rate, its low cost, and the feasibility of chip-scale integration. Here, we perform a comprehensive experimental study of a phase-noise-based QRNG with two independent lasers, each of which operates in either continuous-wave (CW) or pulsed mode. We implement the QRNG by operating the two lasers in three configurations, namely, CW + CW, CW + pulsed, and pulsed + pulsed, and demonstrate their trade-offs, strengths, and weaknesses.

  20. Answer (in part) blowing in the wind. Comment on "Liberating Lévy walk research from the shackles of optimal foraging" by A. Reynolds

    NASA Astrophysics Data System (ADS)

    Cheng, Ken

    2015-09-01

    In a perspective in this issue, based on a thorough review, Andy Reynolds [1] tackles the issue of how the by now ubiquitously found Lévy walks can be generated by animals, by organisms other than animals, and by other forms of life below the level of organisms, such as cells. The answer comes not in a single whole cloth, but rather in a patchwork of generating factors. Lévy-like movements arise in objects blowing in the wind, or from travelers encountering turbulence in the seas or being repelled by boundaries. A variety of desiderata in movements, not related to achieving optimal foraging, may also engender Lévy-like movements. These include avoiding other organisms or not crossing one's traveled path. Adding to that plethora are ways in which variations on the theme of garden-variety random walks can at least approach a Lévy walk, if not capture the mathematical form perfectly. Such variations include executing random walks on multiple scales, a strategy exhibited by desert ants [2,3], mussels [4], and quite likely extant hunter-gatherer humans as well [5]. It is possible that fossil tracks over 50 million years old also show this strategy, as the curve fitting with multiple random walks, characterized by multiple exponential distributions, is as good as or better than curve fits having the power-law distribution characteristic of Lévy walks [6]. Another variation is to have a random walk search whose scale expands over time. In great detail and based on an extensive literature - the review has over 200 references - a range of other ways in which Lévy-like movements might come about are also discussed.

  1. Minimal-post-processing 320-Gbps true random bit generation using physical white chaos.

    PubMed

    Wang, Anbang; Wang, Longsheng; Li, Pu; Wang, Yuncai

    2017-02-20

    A chaotic external-cavity semiconductor laser (ECL) is a promising entropy source for the generation of high-speed physical random bits or digital keys. The rate and randomness are unfortunately limited by laser relaxation oscillation and external-cavity resonance, and are usually improved by complicated post-processing. Here, we propose using physical broadband white chaos generated by optical heterodyning of two ECLs as an entropy source to construct high-speed random bit generation (RBG) with minimal post-processing. The optical heterodyne chaos not only has a white spectrum without signatures of relaxation oscillation and external-cavity resonance but also has a symmetric amplitude distribution. Thus, after quantization with a multi-bit analog-to-digital converter (ADC), random bits can be obtained by extracting several least significant bits (LSBs) without any other processing. In experiments, a white chaos with a 3-dB bandwidth of 16.7 GHz is generated. Its entropy rate is estimated as 16 Gbps by single-bit quantization, which corresponds to a spectral efficiency of 96%. With quantization using an 8-bit ADC, 320-Gbps physical RBG is achieved by directly extracting 4 LSBs at an 80-GHz sampling rate.
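
    The post-processing-free quantization step can be sketched as follows; the ideal-ADC model, input range, and the Gaussian stand-in for the white-chaos waveform are illustrative assumptions, not the experimental hardware.

      import numpy as np

      def lsb_bits(samples, adc_bits=8, keep_lsbs=4):
          """Quantize samples in [-1, 1) with an ideal ADC and keep only the
          least significant bits of each output code."""
          levels = 2**adc_bits
          codes = np.clip(((samples + 1.0) / 2.0 * levels).astype(int), 0, levels - 1)
          lsbs = codes & ((1 << keep_lsbs) - 1)          # mask off the upper bits
          return ''.join(format(int(c), '0{}b'.format(keep_lsbs)) for c in lsbs)

      # Toy usage with Gaussian noise standing in for the heterodyne chaos signal.
      rng = np.random.default_rng(3)
      waveform = np.clip(rng.normal(0.0, 0.3, 4096), -1.0, 0.999)
      bits = lsb_bits(waveform, adc_bits=8, keep_lsbs=4)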

  2. The connectivity of the brain: multi-level quantitative analysis.

    PubMed

    Murre, J M; Sturdy, D P

    1995-11-01

    We develop a mathematical formalism for calculating connectivity volumes generated by specific topologies with various physical packing strategies. We consider four topologies (full, random, nearest-neighbor, and modular connectivity) and three physical models: (i) interior packing, where neurons and connection fibers are intermixed, (ii) sheeted packing, where neurons are located on a sheet with fibers running underneath, and (iii) exterior packing, where the neurons are located at the surfaces of a cube or sphere with fibers taking up the internal volume. By extensive cross-referencing of available human neuroanatomical data we produce a consistent set of parameters for the whole brain, the cerebral cortex, and the cerebellar cortex. By comparing these inferred values with those predicted by the expressions, we draw the following general conclusions for the human brain, cortex, and cerebellum: (i) Interior packing is less efficient than exterior packing (in a sphere). (ii) Fully and randomly connected topologies are extremely inefficient. More specifically we find evidence that different topologies and physical packing strategies might be used at different scales. (iii) For the human brain at a macro-structural level, modular topologies on an exterior sphere approach the data most closely. (iv) On a mesostructural level, laminarization and columnarization are evidence of the superior efficiency of organizing the wiring as sheets. (v) Within sheets, microstructures emerge in which interior models are shown to be the most efficient. With regard to interspecies similarities and differences we conjecture (vi) that the remarkable constancy of the number of neurons per underlying square millimeter of cortex may be the result of evolution minimizing interneuron distance in grey matter, and (vii) that the topologies that best fit the human brain data should not be assumed to apply to other mammals, such as the mouse, for which we show that a random topology may be feasible for the cortex.

  3. Postural control model interpretation of stabilogram diffusion analysis

    NASA Technical Reports Server (NTRS)

    Peterka, R. J.

    2000-01-01

    Collins and De Luca [Collins JJ. De Luca CJ (1993) Exp Brain Res 95: 308-318] introduced a new method known as stabilogram diffusion analysis that provides a quantitative statistical measure of the apparently random variations of center-of-pressure (COP) trajectories recorded during quiet upright stance in humans. This analysis generates a stabilogram diffusion function (SDF) that summarizes the mean square COP displacement as a function of the time interval between COP comparisons. SDFs have a characteristic two-part form that suggests the presence of two different control regimes: a short-term open-loop control behavior and a longer-term closed-loop behavior. This paper demonstrates that a very simple closed-loop control model of upright stance can generate realistic SDFs. The model consists of an inverted pendulum body with torque applied at the ankle joint. This torque includes a random disturbance torque and a control torque. The control torque is a function of the deviation (error signal) between the desired upright body position and the actual body position, and is generated in proportion to the error signal, the derivative of the error signal, and the integral of the error signal [i.e. a proportional, integral and derivative (PID) neural controller]. The control torque is applied with a time delay representing conduction, processing, and muscle activation delays. Variations in the PID parameters and the time delay generate variations in SDFs that mimic real experimental SDFs. This model analysis allows one to interpret experimentally observed changes in SDFs in terms of variations in neural controller and time delay parameters rather than in terms of open-loop versus closed-loop behavior.
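
    The model class described above lends itself to a compact numerical sketch. The following is only a rough illustration under assumed parameter values (the gains, delay, noise level, and body parameters are placeholders, not Peterka's fitted values): an inverted pendulum with a delayed PID ankle torque and a random disturbance torque, from which a stabilogram diffusion function is computed as the mean square displacement versus time interval.

      import numpy as np

      def simulate_cop(T=60.0, dt=0.005, Kp=1200.0, Ki=50.0, Kd=400.0,
                       delay=0.15, noise=1.0, m=70.0, h=0.9, g=9.81, seed=4):
          """Inverted pendulum with delayed PID ankle torque plus a random
          disturbance torque; returns a COP-like horizontal displacement trace."""
          J = m * h**2                        # moment of inertia about the ankle
          n = int(T / dt)
          d = int(delay / dt)                 # feedback delay in samples
          theta = np.zeros(n)
          omega = np.zeros(n)
          integ = 0.0
          rng = np.random.default_rng(seed)
          for k in range(1, n):
              j = max(k - 1 - d, 0)                           # delayed sample index
              err = -theta[j]                                 # error w.r.t. upright
              integ += err * dt
              Tc = Kp * err + Ki * integ + Kd * (-omega[j])   # PID control torque
              Td = noise * rng.normal()                       # disturbance torque
              alpha = (m * g * h * np.sin(theta[k - 1]) + Tc + Td) / J
              omega[k] = omega[k - 1] + alpha * dt
              theta[k] = theta[k - 1] + omega[k] * dt
          return h * np.sin(theta)            # horizontal displacement proxy

      def stabilogram_diffusion(x, dt, max_lag_s=10.0):
          """Mean square displacement of x as a function of the time interval."""
          lags = np.arange(1, int(max_lag_s / dt))
          msd = np.array([np.mean((x[l:] - x[:-l])**2) for l in lags])
          return lags * dt, msd

      cop = simulate_cop()
      intervals, sdf = stabilogram_diffusion(cop, dt=0.005)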

  4. Empirical Analysis and Refinement of Expert System Knowledge Bases

    DTIC Science & Technology

    1988-08-31

    refinement. Both a simulated case generation program and a random rule basher were developed to enhance rule refinement experimentation. Substantial ... the second fiscal year 88 objective was fully met. Rule Refinement System; Simulated Rule Basher; Case Generator; Stored Cases; Expert System Knowledge ... generated until the rule is satisfied. Cases may be randomly generated for a given rule or hypothesis. Rule Basher: given that one has a correct ...

  5. Reliability Overhaul Model

    DTIC Science & Technology

    1989-08-01

    Random variables for the conditional exponential distribution are generated using the inverse transform method: (1) generate U ~ U(0,1); (2) set s = -λ ln ... exp(-[(x+s-γ)/η]^β + [(x-γ)/η]^β) ... Random variables from the conditional Weibull distribution are generated using the inverse transform method: (1) ... using a standard normal transformation and the inverse transform method. Appendix 3, Distributions Supported by the Model: (1) generate Y ...
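
    The inverse transform method named in this record can be sketched as follows for the unconditional exponential and Weibull cases; the parameter values are arbitrary, and the model's conditional "remaining life given survival to age x" variants would additionally condition the CDF before inverting.

      import numpy as np

      def exponential_inverse_transform(lam, size, rng):
          """Exponential(rate lam) variates: U ~ U(0,1)  ->  X = -ln(U) / lam."""
          u = rng.random(size)
          return -np.log(u) / lam

      def weibull_inverse_transform(beta, eta, size, rng):
          """Weibull(shape beta, scale eta) variates: X = eta * (-ln(U))**(1/beta)."""
          u = rng.random(size)
          return eta * (-np.log(u)) ** (1.0 / beta)

      rng = np.random.default_rng(5)
      x_exp = exponential_inverse_transform(0.5, 10000, rng)
      x_wbl = weibull_inverse_transform(1.8, 2.0, 10000, rng)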

  6. Applying a weighted random forests method to extract karst sinkholes from LiDAR data

    NASA Astrophysics Data System (ADS)

    Zhu, Junfeng; Pierskalla, William P.

    2016-02-01

    Detailed mapping of sinkholes provides critical information for mitigating sinkhole hazards and understanding groundwater and surface water interactions in karst terrains. LiDAR (Light Detection and Ranging) measures the earth's surface at high resolution and high density and has shown great potential to drastically improve locating and delineating sinkholes. However, processing LiDAR data to extract sinkholes requires separating sinkholes from other depressions, which can be laborious because of the sheer number of depressions commonly generated from LiDAR data. In this study, we applied random forests, a machine learning method, to automatically separate sinkholes from other depressions in a karst region in central Kentucky. The sinkhole-extraction random forest was grown on a training dataset built from an area where LiDAR-derived depressions were manually classified through a visual inspection and field verification process. Based on the geometry of depressions, as well as natural and human factors related to sinkholes, 11 parameters were selected as predictive variables to form the dataset. Because the training dataset was imbalanced, with the majority of depressions being non-sinkholes, a weighted random forests method was used to improve the accuracy of predicting sinkholes. The weighted random forest achieved an average accuracy of 89.95% for the training dataset, demonstrating that the random forest can be an effective sinkhole classifier. Testing of the random forest in another area, however, resulted in moderate success with an average accuracy rate of 73.96%. This study suggests that an automatic sinkhole extraction procedure like the random forest classifier can significantly reduce time and labor costs and makes it more tractable to map sinkholes using LiDAR data for large areas. However, the random forests method cannot totally replace manual procedures, such as visual inspection and field verification.
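
    One common way to realize a weighted random forest for imbalanced data is sketched below with scikit-learn; the feature matrix, class ratio, and hyperparameters are hypothetical stand-ins for the 11 depression attributes and training data described above, and class weighting is only one of several possible weighting schemes.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(6)
      n = 5000
      X = rng.normal(size=(n, 11))                    # 11 predictor variables (placeholder data)
      y = (rng.random(n) < 0.15).astype(int)          # ~15% sinkholes: imbalanced labels

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      # class_weight='balanced' up-weights the rare sinkhole class during tree growing.
      clf = RandomForestClassifier(n_estimators=500, class_weight='balanced',
                                   random_state=0)
      clf.fit(X_tr, y_tr)
      print("held-out accuracy:", clf.score(X_te, y_te))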

  7. Some practical problems in implementing randomization.

    PubMed

    Downs, Matt; Tucker, Kathryn; Christ-Schmidt, Heidi; Wittes, Janet

    2010-06-01

    While often theoretically simple, implementing randomization to treatment in a masked, but confirmable, fashion can prove difficult in practice. At least three categories of problems occur in randomization: (1) bad judgment in the choice of method, (2) design and programming errors in implementing the method, and (3) human error during the conduct of the trial. This article focuses on these latter two types of errors, dealing operationally with what can go wrong after trial designers have selected the allocation method. We offer several case studies and corresponding recommendations for lessening the frequency of problems in allocating treatment or for mitigating the consequences of errors. Recommendations include: (1) reviewing the randomization schedule before starting a trial, (2) being especially cautious of systems that use on-demand random number generators, (3) drafting unambiguous randomization specifications, (4) performing thorough testing before entering a randomization system into production, (5) maintaining a dataset that captures the values investigators used to randomize participants, thereby allowing the process of treatment allocation to be reproduced and verified, (6) resisting the urge to correct errors that occur in individual treatment assignments, (7) preventing inadvertent unmasking to treatment assignments in kit allocations, and (8) checking a sample of study drug kits to allow detection of errors in drug packaging and labeling. Although we performed a literature search of documented randomization errors, the examples that we provide and the resultant recommendations are based largely on our own experience in industry-sponsored clinical trials. We do not know how representative our experience is or how common errors of the type we have seen occur. Our experience underscores the importance of verifying the integrity of the treatment allocation process before and during a trial. Clinical Trials 2010; 7: 235-245. http://ctj.sagepub.com.

  8. The Status, Quality, and Expansion of the NIH Full-Length cDNA Project: The Mammalian Gene Collection (MGC)

    PubMed Central

    2004-01-01

    The National Institutes of Health's Mammalian Gene Collection (MGC) project was designed to generate and sequence a publicly accessible cDNA resource containing a complete open reading frame (ORF) for every human and mouse gene. The project initially used a random strategy to select clones from a large number of cDNA libraries from diverse tissues. Candidate clones were chosen based on 5′-EST sequences, and then fully sequenced to high accuracy and analyzed by algorithms developed for this project. Currently, more than 11,000 human and 10,000 mouse genes are represented in MGC by at least one clone with a full ORF. The random selection approach is now reaching a saturation point, and a transition to protocols targeted at the missing transcripts is now required to complete the mouse and human collections. Comparison of the sequence of the MGC clones to reference genome sequences reveals that most cDNA clones are of very high sequence quality, although it is likely that some cDNAs may carry missense variants as a consequence of experimental artifact, such as PCR, cloning, or reverse transcriptase errors. Recently, a rat cDNA component was added to the project, and ongoing frog (Xenopus) and zebrafish (Danio) cDNA projects were expanded to take advantage of the high-throughput MGC pipeline. PMID:15489334

  9. Model for interevent times with long tails and multifractality in human communications: An application to financial trading

    NASA Astrophysics Data System (ADS)

    Perelló, Josep; Masoliver, Jaume; Kasprzak, Andrzej; Kutner, Ryszard

    2008-09-01

    Social, technological, and economic time series are divided by events which are usually assumed to be random, albeit with some hierarchical structure. It is well known that the interevent statistics observed in these contexts differs from the Poissonian profile by being long-tailed distributed with resting and active periods interwoven. Understanding mechanisms generating consistent statistics has therefore become a central issue. The approach we present is taken from the continuous-time random-walk formalism and represents an analytical alternative to models of nontrivial priority that have been recently proposed. Our analysis also goes one step further by looking at the multifractal structure of the interevent times of human decisions. We here analyze the intertransaction time intervals of several financial markets. We observe that empirical data describe a subtle multifractal behavior. Our model explains this structure by taking the pausing-time density in the form of a superstatistics where the integral kernel quantifies the heterogeneous nature of the executed tasks. A stretched exponential kernel provides a multifractal profile valid for a certain limited range. A suggested heuristic analytical profile is capable of covering a broader region.

  10. Preliminary investigation of human exhaled breath for tuberculosis diagnosis by multidimensional gas chromatography - Time of flight mass spectrometry and machine learning.

    PubMed

    Beccaria, Marco; Mellors, Theodore R; Petion, Jacky S; Rees, Christiaan A; Nasir, Mavra; Systrom, Hannah K; Sairistil, Jean W; Jean-Juste, Marc-Antoine; Rivera, Vanessa; Lavoile, Kerline; Severe, Patrice; Pape, Jean W; Wright, Peter F; Hill, Jane E

    2018-02-01

    Tuberculosis (TB) remains a global public health malady that claims almost 1.8 million lives annually. Diagnosis of TB represents perhaps one of the most challenging aspects of tuberculosis control. Gold standards for diagnosis of active TB (culture and nucleic acid amplification) are sputum-dependent; however, in up to a third of TB cases, an adequate biological sputum sample is not readily available. The analysis of exhaled breath, as an alternative to sputum-dependent tests, has the potential to provide a simple, fast, non-invasive, and readily available diagnostic service that could positively change TB detection. Human breath has been evaluated in the setting of active tuberculosis using thermal desorption-comprehensive two-dimensional gas chromatography-time of flight mass spectrometry methodology. From the entire spectrum of volatile metabolites in breath, three random forest machine learning models were applied, leading to the generation of a panel of 46 breath features. The twenty-two features common to all three random forest models were selected as a set that could distinguish subjects with confirmed pulmonary M. tuberculosis infection from people with pathologies other than TB. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Practical quantum random number generator based on measuring the shot noise of vacuum states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen Yong; Zou Hongxin; Tian Liang

    2010-06-15

    The shot noise of vacuum states is a kind of quantum noise and is totally random. In this paper a nondeterministic random number generation scheme based on measuring the shot noise of vacuum states is presented and experimentally demonstrated. We use a homodyne detector to measure the shot noise of vacuum states. Considering that the frequency bandwidth of our detector is limited, we derive the optimal sampling rate so that sampling points have the least correlation with each other. We also choose a method to extract random numbers from sampling values, and prove that the influence of classical noise can be avoided with this method, so that the detector does not have to be shot-noise limited. The random numbers generated with this scheme have passed the ENT and DIEHARD tests.

  12. Compact quantum random number generator based on superluminescent light-emitting diodes

    NASA Astrophysics Data System (ADS)

    Wei, Shihai; Yang, Jie; Fan, Fan; Huang, Wei; Li, Dashuang; Xu, Bingjie

    2017-12-01

    By measuring the amplified spontaneous emission (ASE) noise of superluminescent light-emitting diodes, we propose and realize a practical quantum random number generator (QRNG). In the QRNG, after detection and amplification of the ASE noise, the data acquisition and randomness extraction, which are integrated in a field-programmable gate array (FPGA), are both implemented in real time, and the final random bit sequences are delivered to a host computer at a real-time generation rate of 1.2 Gbps. Further, to achieve compactness, all the components of the QRNG are integrated on three independent printed circuit boards with a compact design, and the QRNG is packed in a small enclosure sized 140 mm × 120 mm × 25 mm. The final random bit sequences pass all the NIST-STS and DIEHARD tests.

  13. Pseudo-random properties of a linear congruential generator investigated by b-adic diaphony

    NASA Astrophysics Data System (ADS)

    Stoev, Peter; Stoilova, Stanislava

    2017-12-01

    In this paper we continue the study of the diaphony defined in the b-adic number system and extend it in different directions. We investigate this diaphony as a tool for assessing the pseudorandom properties of some of the most widely used random number generators. This is done by evaluating the distribution of specially constructed two-dimensional nets built from the generated random numbers. The aim is to see how suitable the generated numbers are for calculations in numerical methods such as Monte Carlo.
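
    A linear congruential generator of the kind examined above, together with the two-dimensional net of consecutive pairs on which such a figure of merit would be evaluated, can be sketched as follows; the multiplier and increment are the familiar Numerical Recipes constants, chosen only as an example of the generator class.

      import numpy as np

      def lcg(seed, a=1664525, c=1013904223, m=2**32, n=10000):
          """Linear congruential generator x_{k+1} = (a * x_k + c) mod m, scaled to [0, 1)."""
          x = seed
          out = np.empty(n)
          for i in range(n):
              x = (a * x + c) % m
              out[i] = x / m
          return out

      # Two-dimensional net of consecutive pairs (u_k, u_{k+1}); lattice structure
      # in this point set is a symptom of poor pseudorandomness.
      u = lcg(seed=12345)
      net = np.column_stack((u[:-1], u[1:]))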

  14. Generating constrained randomized sequences: item frequency matters.

    PubMed

    French, Robert M; Perruchet, Pierre

    2009-11-01

    All experimental psychologists understand the importance of randomizing lists of items. However, randomization is generally constrained, and these constraints (in particular, not allowing immediately repeated items), which are designed to eliminate particular biases, frequently engender others. We describe a simple Monte Carlo randomization technique that solves a number of these problems. However, in many experimental settings, we are concerned not only with the number and distribution of items but also with the number and distribution of transitions between items. The algorithm mentioned above provides no control over this. We therefore introduce a simple technique that uses transition tables for generating correctly randomized sequences. We present an analytic method of producing item-pair frequency tables and item-pair transitional probability tables when immediate repetitions are not allowed. We illustrate these difficulties and how to overcome them, with reference to a classic article on word segmentation in infants. Finally, we provide free access to an Excel file that allows users to generate transition tables with up to 10 different item types, as well as to generate appropriately distributed randomized sequences of any length without immediately repeated elements. This file is freely available from http://leadserv.u-bourgogne.fr/IMG/xls/TransitionMatrix.xls.
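
    A toy version of constrained randomization without immediate repetitions is sketched below; it is a simple restart-on-dead-end scheme for illustration (the item set and counts are arbitrary), not the Monte Carlo or transition-table algorithms of the article.

      import random

      def constrained_sequence(counts, rng=None):
          """Randomized sequence with the given item counts and no immediate repeats.
          Assumes the counts admit such a sequence (no item exceeds half the total,
          rounded up); restarts whenever the construction reaches a dead end."""
          rng = rng or random.Random(0)
          while True:
              remaining = dict(counts)
              seq, prev = [], None
              while remaining:
                  choices = [x for x in remaining if x != prev]
                  if not choices:
                      break                           # dead end: only `prev` is left
                  weights = [remaining[x] for x in choices]
                  item = rng.choices(choices, weights=weights)[0]
                  remaining[item] -= 1
                  if remaining[item] == 0:
                      del remaining[item]
                  seq.append(item)
                  prev = item
              if not remaining:                       # all items placed successfully
                  return seq

      print(constrained_sequence({'A': 10, 'B': 6, 'C': 4}))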

  15. A random spatial network model based on elementary postulates

    USGS Publications Warehouse

    Karlinger, Michael R.; Troutman, Brent M.

    1989-01-01

    A model for generating random spatial networks that is based on elementary postulates comparable to those of the random topology model is proposed. In contrast to the random topology model, this model ascribes a unique spatial specification to generated drainage networks, a distinguishing property of some network growth models. The simplicity of the postulates creates an opportunity for potential analytic investigations of the probabilistic structure of the drainage networks, while the spatial specification enables analyses of spatially dependent network properties. In the random topology model all drainage networks, conditioned on magnitude (number of first-order streams), are equally likely, whereas in this model all spanning trees of a grid, conditioned on area and drainage density, are equally likely. As a result, link lengths in the generated networks are not independent, as usually assumed in the random topology model. For a preliminary model evaluation, scale-dependent network characteristics, such as geometric diameter and link length properties, and topologic characteristics, such as bifurcation ratio, are computed for sets of drainage networks generated on square and rectangular grids. Statistics of the bifurcation and length ratios fall within the range of values reported for natural drainage networks, but geometric diameters tend to be relatively longer than those for natural networks.

  16. Landscape-scale accessibility of livestock to tigers: implications of spatial grain for modeling predation risk to mitigate human-carnivore conflict.

    PubMed

    Miller, Jennifer R B; Jhala, Yadvendradev V; Jena, Jyotirmay; Schmitz, Oswald J

    2015-03-01

    Innovative conservation tools are greatly needed to reduce livelihood losses and wildlife declines resulting from human-carnivore conflict. Spatial risk modeling is an emerging method for assessing the spatial patterns of predator-prey interactions, with applications for mitigating carnivore attacks on livestock. Large carnivores that ambush prey attack and kill over small areas, requiring models at fine spatial grains to predict livestock depredation hot spots. To detect the best resolution for predicting where carnivores access livestock, we examined the spatial attributes associated with livestock killed by tigers in Kanha Tiger Reserve, India, using risk models generated at 20, 100, and 200-m spatial grains. We analyzed land-use, human presence, and vegetation structure variables at 138 kill sites and 439 random sites to identify key landscape attributes where livestock were vulnerable to tigers. Land-use and human presence variables contributed strongly to predation risk models, with most variables showing high relative importance (≥0.85) at all spatial grains. The risk of a tiger killing livestock increased near dense forests and near the boundary of the park core zone where human presence is restricted. Risk was nonlinearly related to human infrastructure and open vegetation, with the greatest risk occurring 1.2 km from roads, 1.1 km from villages, and 8.0 km from scrubland. Kill sites were characterized by denser, patchier, and more complex vegetation with lower visibility than random sites. Risk maps revealed high-risk hot spots inside of the core zone boundary and in several patches in the human-dominated buffer zone. Validation against known kills revealed predictive accuracy for only the 20 m model, the resolution best representing the kill stage of hunting for large carnivores that ambush prey, like the tiger. Results demonstrate that risk models developed at fine spatial grains can offer accurate guidance on landscape attributes livestock should avoid to minimize human-carnivore conflict.

  17. Method and apparatus for determining position using global positioning satellites

    NASA Technical Reports Server (NTRS)

    Ward, John (Inventor); Ward, William S. (Inventor)

    1998-01-01

    A global positioning satellite receiver having an antenna for receiving a L1 signal from a satellite. The L1 signal is processed by a preamplifier stage including a band pass filter and a low noise amplifier and output as a radio frequency (RF) signal. A mixer receives and de-spreads the RF signal in response to a pseudo-random noise code, i.e., Gold code, generated by an internal pseudo-random noise code generator. A microprocessor enters a code tracking loop, such that during the code tracking loop, it addresses the pseudo-random code generator to cause the pseudo-random code generator to sequentially output pseudo-random codes corresponding to satellite codes used to spread the L1 signal, until correlation occurs. When an output of the mixer is indicative of the occurrence of correlation between the RF signal and the generated pseudo-random codes, the microprocessor enters an operational state which slows the receiver code sequence to stay locked with the satellite code sequence. The output of the mixer is provided to a detector which, in turn, controls certain routines of the microprocessor. The microprocessor will output pseudo range information according to an interrupt routine in response to detection of correlation. The pseudo range information is to be telemetered to a ground station which determines the position of the global positioning satellite receiver.
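
    The Gold-code style pseudo-random noise generation referred to above can be sketched with two 10-stage linear feedback shift registers, as in the standard GPS C/A code definition; the register taps and the PRN-1 phase-selector pair used here follow that textbook definition and are meant as an illustration, not a description of the patent's circuitry.

      def ca_code(prn_taps=(2, 6)):
          """One 1023-chip period of a C/A-style Gold code from two 10-stage LFSRs.
          prn_taps selects the pair of G2 stages that sets the satellite PRN
          (the pair (2, 6) corresponds to PRN 1 in the usual tabulation)."""
          g1 = [1] * 10                      # both registers start as all ones
          g2 = [1] * 10
          chips = []
          for _ in range(1023):
              out = g1[9] ^ g2[prn_taps[0] - 1] ^ g2[prn_taps[1] - 1]
              chips.append(out)
              fb1 = g1[2] ^ g1[9]                                   # G1 taps 3, 10
              fb2 = g2[1] ^ g2[2] ^ g2[5] ^ g2[7] ^ g2[8] ^ g2[9]   # G2 taps 2,3,6,8,9,10
              g1 = [fb1] + g1[:9]
              g2 = [fb2] + g2[:9]
          return chips

      code = ca_code()
      assert len(code) == 1023 and set(code) == {0, 1}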

  18. Narrow-band generation in random distributed feedback fiber laser.

    PubMed

    Sugavanam, Srikanth; Tarasov, Nikita; Shu, Xuewen; Churkin, Dmitry V

    2013-07-15

    Narrow-band emission of spectral width down to ~0.05 nm line-width is achieved in the random distributed feedback fiber laser employing narrow-band fiber Bragg grating or fiber Fabry-Perot interferometer filters. The observed line-width is ~10 times less than line-width of other demonstrated up to date random distributed feedback fiber lasers. The random DFB laser with Fabry-Perot interferometer filter provides simultaneously multi-wavelength and narrow-band (within each line) generation with possibility of further wavelength tuning.

  19. RAQ–A Random Forest Approach for Predicting Air Quality in Urban Sensing Systems

    PubMed Central

    Yu, Ruiyun; Yang, Yu; Yang, Leyou; Han, Guangjie; Move, Oguti Ann

    2016-01-01

    Air quality information such as the concentration of PM2.5 is of great significance for human health and city management. It affects the way of traveling, urban planning, government policies and so on. However, in major cities there is typically only a limited number of air quality monitoring stations. In the meantime, air quality varies across urban areas and there can be large differences, even between closely neighboring regions. In this paper, a random forest approach for predicting air quality (RAQ) is proposed for urban sensing systems. The data generated by urban sensing include meteorology data, road information, real-time traffic status and point of interest (POI) distribution. The random forest algorithm is exploited for data training and prediction. The performance of RAQ is evaluated with real city data. Compared with three other algorithms, this approach achieves better prediction precision. The experiments show that air quality can be inferred with remarkably high accuracy from the data obtained through urban sensing. PMID:26761008

  20. Programmable quantum random number generator without postprocessing.

    PubMed

    Nguyen, Lac; Rehain, Patrick; Sua, Yong Meng; Huang, Yu-Ping

    2018-02-15

    We demonstrate a viable source of unbiased quantum random numbers whose statistical properties can be arbitrarily programmed without the need for any postprocessing such as randomness distillation or distribution transformation. It is based on measuring the arrival time of single photons in shaped temporal modes that are tailored with an electro-optical modulator. We show that quantum random numbers can be created directly in customized probability distributions and pass all randomness tests of the NIST and Dieharder test suites without any randomness extraction. The min-entropies of such generated random numbers are measured close to the theoretical limits, indicating their near-ideal statistics and ultrahigh purity. Easy to implement and arbitrarily programmable, this technique can find versatile uses in a multitude of data analysis areas.

  1. Vehicle classification in WAMI imagery using deep network

    NASA Astrophysics Data System (ADS)

    Yi, Meng; Yang, Fan; Blasch, Erik; Sheaff, Carolyn; Liu, Kui; Chen, Genshe; Ling, Haibin

    2016-05-01

    Humans have always had a keen interest in understanding activities and the surrounding environment for mobility, communication, and survival. Thanks to recent progress in photography and breakthroughs in aviation, we are now able to capture tens of megapixels of ground imagery, namely Wide Area Motion Imagery (WAMI), at multiple frames per second from unmanned aerial vehicles (UAVs). WAMI serves as a great source for many applications, including security, urban planning and route planning. These applications require fast and accurate image understanding, which is time consuming for humans due to the large data volume and city-scale area coverage. Therefore, automatic processing and understanding of WAMI imagery has been gaining attention in both industry and the research community. This paper focuses on an essential step in WAMI imagery analysis, namely vehicle classification. That is, deciding whether a certain image patch contains a vehicle or not. We collect a set of positive and negative sample image patches for training and testing the detector. Positive samples are 64 × 64 image patches centered on annotated vehicles. We generate two sets of negative images. The first set is generated from positive images with some location shift. The second set of negative patches is generated from randomly sampled patches; we discard such patches if a vehicle happens to be located at the center. Both positive and negative samples are randomly divided into 9000 training images and 3000 testing images. We propose to train a deep convolutional network to classify these patches. The classifier is based on a pre-trained AlexNet model in the Caffe library, with an adapted loss function for vehicle classification. The performance of our classifier is compared to several traditional image classifiers using Support Vector Machine (SVM) and Histogram of Oriented Gradients (HOG) features. While the SVM+HOG method achieves an accuracy of 91.2%, the accuracy of our deep network-based classifier reaches 97.9%.

  2. The role of ferroelectric domain structure in second harmonic generation in random quadratic media.

    PubMed

    Roppo, Vito; Wang, W; Kalinowski, K; Kong, Y; Cojocaru, C; Trull, J; Vilaseca, R; Scalora, M; Krolikowski, W; Kivshar, Yu

    2010-03-01

    We study theoretically and numerically the second harmonic generation in a nonlinear crystal with a random distribution of ferroelectric domains. We show that the specific features of the disordered domain structure greatly affect the emission pattern of the generated harmonics. This phenomenon can be used to characterize the degree of disorder in nonlinear photonic structures.

  3. Genetic relatedness between oral and intestinal isolates of Porphyromonas endodontalis by analysis of random amplified polymorphic DNA.

    PubMed

    Gonçalves, R B; Väisänen, M L; Van Steenbergen, T J; Sundqvist, G; Mouton, C

    1999-01-01

    Genomic fingerprints from the DNA of 27 strains of Porphyromonas endodontalis from diverse clinical and geographic origins were generated as random amplified polymorphic DNA (RAPD) using the technique of PCR amplification with a single primer of arbitrary sequence. Cluster analysis of the combined RAPD data obtained with three selected 9- or 10-mer-long primers identified 25 distinct RAPD types which clustered as three main groups identifying three genogroups. Genogroups I and II included exclusively P. endodontalis isolates of oral origin, while 7/9 human intestinal strains of genogroup III which linked at a similarity level of 52% constituted the most homogeneous group in our study. Genotypic diversity within P. endodontalis, as shown by RAPD analysis, suggests that the taxon is composed of two oral genogroups and one intestinal genogroup. This hypothesis remains to be confirmed.

  4. Comparing vector-based and Bayesian memory models using large-scale datasets: User-generated hashtag and tag prediction on Twitter and Stack Overflow.

    PubMed

    Stanley, Clayton; Byrne, Michael D

    2016-12-01

    The growth of social media and user-created content on online sites provides unique opportunities to study models of human declarative memory. By framing the task of choosing a hashtag for a tweet and tagging a post on Stack Overflow as a declarative memory retrieval problem, 2 cognitively plausible declarative memory models were applied to millions of posts and tweets and evaluated on how accurately they predict a user's chosen tags. An ACT-R based Bayesian model and a random permutation vector-based model were tested on the large data sets. The results show that past user behavior of tag use is a strong predictor of future behavior. Furthermore, past behavior was successfully incorporated into the random permutation model that previously used only context. Also, ACT-R's attentional weight term was linked to an entropy-weighting natural language processing method used to attenuate high-frequency words (e.g., articles and prepositions). Word order was not found to be a strong predictor of tag use, and the random permutation model performed comparably to the Bayesian model without including word order. This shows that the strength of the random permutation model is not in the ability to represent word order, but rather in the way in which context information is successfully compressed. The results of the large-scale exploration show how the architecture of the 2 memory models can be modified to significantly improve accuracy, and may suggest task-independent general modifications that can help improve model fit to human data in a much wider range of domains. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  5. Using Neural Networks to Generate Inferential Roles for Natural Language

    PubMed Central

    Blouw, Peter; Eliasmith, Chris

    2018-01-01

    Neural networks have long been used to study linguistic phenomena spanning the domains of phonology, morphology, syntax, and semantics. Of these domains, semantics is somewhat unique in that there is little clarity concerning what a model needs to be able to do in order to provide an account of how the meanings of complex linguistic expressions, such as sentences, are understood. We argue that one thing such models need to be able to do is generate predictions about which further sentences are likely to follow from a given sentence; these define the sentence's “inferential role.” We then show that it is possible to train a tree-structured neural network model to generate very simple examples of such inferential roles using the recently released Stanford Natural Language Inference (SNLI) dataset. On an empirical front, we evaluate the performance of this model by reporting entailment prediction accuracies on a set of test sentences not present in the training data. We also report the results of a simple study that compares human plausibility ratings for both human-generated and model-generated entailments for a random selection of sentences in this test set. On a more theoretical front, we argue in favor of a revision to some common assumptions about semantics: understanding a linguistic expression is not only a matter of mapping it onto a representation that somehow constitutes its meaning; rather, understanding a linguistic expression is mainly a matter of being able to draw certain inferences. Inference should accordingly be at the core of any model of semantic cognition. PMID:29387031

  6. A Micro-Computer Model for Army Air Defense Training.

    DTIC Science & Technology

    1985-03-01

    generator. The period is 32763 numbers generated before a repetitive sequence is encountered on the development system. Chi-squared tests for frequency ... Tests for periodicity ... positions in the test array. This was done with several different random number seeds; in each case 32763 random numbers were generated before a repetitive sequence was encountered.

  7. Experimentally Generated Random Numbers Certified by the Impossibility of Superluminal Signaling

    NASA Astrophysics Data System (ADS)

    Bierhorst, Peter; Shalm, Lynden K.; Mink, Alan; Jordan, Stephen; Liu, Yi-Kai; Rommal, Andrea; Glancy, Scott; Christensen, Bradley; Nam, Sae Woo; Knill, Emanuel

    Random numbers are an important resource for applications such as numerical simulation and secure communication. However, it is difficult to certify whether a physical random number generator is truly unpredictable. Here, we exploit the phenomenon of quantum nonlocality in a loophole-free photonic Bell test experiment to obtain data containing randomness that cannot be predicted by any theory that does not also allow the sending of signals faster than the speed of light. To certify and quantify the randomness, we develop a new protocol that performs well in an experimental regime characterized by low violation of Bell inequalities. Applying an extractor function to our data, we obtain 256 new random bits, uniform to within 10^{-3}.

  8. Random pulse generator

    NASA Technical Reports Server (NTRS)

    Lindsey, R. S., Jr. (Inventor)

    1975-01-01

    An exemplary embodiment of the present invention provides a source of random width and random spaced rectangular voltage pulses whose mean or average frequency of operation is controllable within prescribed limits of about 10 hertz to 1 megahertz. A pair of thin-film metal resistors are used to provide a differential white noise voltage pulse source. Pulse shaping and amplification circuitry provide relatively short duration pulses of constant amplitude which are applied to anti-bounce logic circuitry to prevent ringing effects. The pulse outputs from the anti-bounce circuits are then used to control two one-shot multivibrators whose output comprises the random length and random spaced rectangular pulses. Means are provided for monitoring, calibrating and evaluating the relative randomness of the generator.

  9. Polymorphisms in ERAP1 and ERAP2 are shared by Caninae and segregate within and between random- and pure-breeds of dogs.

    PubMed

    Pedersen, N C; Dhanota, J K; Liu, H

    2016-10-15

    Specific polymorphisms in the endoplasmic reticulum amino peptidase genes ERAP1 and ERAP2, when present with certain MHC class receptor types, have been associated with increased risk for specific cancers, infectious diseases and autoimmune disorders in humans. This increased risk has been linked to distinct polymorphisms in both ERAPs and MHC class I receptors that affect the way cell-generated peptides are screened for antigenicity. The incidence of cancer, infectious disease and autoimmune disorders differs greatly among pure breeds of dogs, as it does in humans, and it is possible that this heightened susceptibility is also due to specific polymorphisms in ERAP1 and ERAP2. In order to determine if such polymorphisms exist, the ERAP1 and ERAP2 genes of 10 dogs of nine diverse breeds were sequenced, and SNPs causing synonymous or non-synonymous amino acid changes, deletions, or insertions were identified. Eight ERAP1 and 10 ERAP2 SNPs were used to create a Sequenom MassARRAY iPLEX based test panel which defined 24 ERAP1, 36 ERAP2 and 128 ERAP1/2 haplotypes. The prevalence of these haplotypes was then measured among dog, wolf, coyote, jackal and red fox populations. Some haplotypes were species specific, while others were shared across species, especially between dog, wolf, coyote and jackal. The prevalence of these haplotypes was then compared among various canid populations, and in particular between various populations of random- and pure-bred dogs. Human-directed positive selection has led to loss of ERAP diversity and segregation of certain haplotypes among various dog breeds. A phylogenetic tree generated from 45 of the most common ERAP1/2 haplotypes demonstrated three distinct clades, all of which were rooted with haplotypes either shared among species or specific to contemporary dogs, coyote and wolf. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  10. Effect of egg ingestion on trimethylamine-N-oxide production in humans: a randomized, controlled, dose-response study

    PubMed Central

    Miller, Carolyn A; Corbin, Karen D; da Costa, Kerry-Ann; Zhang, Shucha; Zhao, Xueqing; Galanko, Joseph A; Blevins, Tondra; Bennett, Brian J; O'Connor, Annalouise; Zeisel, Steven H

    2014-01-01

    Background: It is important to understand whether eating eggs, which are a major source of dietary choline, results in increased exposure to trimethylamine-N-oxide (TMAO), which is purported to be a risk factor for developing heart disease. Objective: We determined whether humans eating eggs generate TMAO and, if so, whether there is an associated increase in a marker for inflammation [ie, high-sensitivity C-reactive protein (hsCRP)] or increased oxidation of low-density lipoprotein (LDL). Design: In a longitudinal, double-blind, randomized dietary intervention, 6 volunteers were fed breakfast doses of 0, 1, 2, 4, or 6 egg yolks. Diets were otherwise controlled on the day before and day of each egg dose with a standardized low-choline menu. Plasma TMAO at timed intervals (immediately before and 1, 2, 4, 8, and 24 h after each dose), 24-h urine TMAO, predose and 24-h postdose serum hsCRP, and plasma oxidized LDL were measured. Volunteers received all 5 doses with each dose separated by >2-wk washout periods. Results: The consumption of eggs was associated with increased plasma and urine TMAO concentrations (P < 0.01), with ∼14% of the total choline in eggs having been converted to TMAO. There was considerable variation between individuals in the TMAO response. There was no difference in hsCRP or oxidized LDL concentrations after egg doses. Conclusions: The consumption of ≥2 eggs results in an increased formation of TMAO. Choline is an essential nutrient that is required for normal human liver and muscle functions and important for normal fetal development. Additional study is needed to both confirm the association between TMAO and atherosclerosis and identify factors, microbiota and genetic, that influence the generation of TMAO before policy and medical recommendations are made that suggest reduced dietary choline intake. This trial was registered at clinicaltrials.gov as NCT01906554. PMID:24944063

  11. Detecting targets hidden in random forests

    NASA Astrophysics Data System (ADS)

    Kouritzin, Michael A.; Luo, Dandan; Newton, Fraser; Wu, Biao

    2009-05-01

    Military tanks, cargo or troop carriers, missile carriers or rocket launchers often hide themselves from detection in forests. This complicates the problem of locating these hidden targets. An electro-optic camera mounted on a surveillance aircraft or unmanned aerial vehicle is used to capture images of the forests with possible hidden targets, e.g., rocket launchers. We consider random forests with longitudinal and latitudinal correlations. Specifically, foliage coverage is encoded with a binary representation (i.e., foliage or no foliage), and is correlated in adjacent regions. We address the detection problem of camouflaged targets hidden in random forests by building memory into the observations. In particular, we propose an efficient algorithm to generate random forests, ground, and camouflage of hidden targets with two-dimensional correlations. The observations are a sequence of snapshots consisting of foliage-obscured ground or target. Theoretically, detection is possible because there are subtle differences in the correlations of the ground and the camouflage of the rocket launcher. However, these differences are well beyond human perception. To detect the presence of hidden targets automatically, we develop a Markov representation for these sequences and modify the classical filtering equations to allow the Markov chain observation. Particle filters are used to estimate the position of the targets in combination with a novel random weighting technique. Furthermore, we give positive proof-of-concept simulations.
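
    A correlated binary "foliage" field of the kind described above can be mocked up cheaply by smoothing and thresholding white noise, as sketched below; this Gaussian-smoothing shortcut, the grid size, correlation length, and coverage fraction are illustrative assumptions and not the authors' Gibbs-style construction.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def correlated_foliage(shape=(256, 256), corr_len=6.0, coverage=0.6, seed=7):
          """Binary foliage map with two-dimensional spatial correlation:
          smooth white noise with a Gaussian kernel, then threshold so that
          roughly `coverage` of the cells are foliage."""
          rng = np.random.default_rng(seed)
          field = gaussian_filter(rng.normal(size=shape), sigma=corr_len)
          thresh = np.quantile(field, 1.0 - coverage)
          return (field > thresh).astype(np.uint8)      # 1 = foliage, 0 = open ground

      forest = correlated_foliage()
      print("foliage fraction:", forest.mean())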

  12. Precise and in situ genetic humanization of 6 Mb of mouse immunoglobulin genes.

    PubMed

    Macdonald, Lynn E; Karow, Margaret; Stevens, Sean; Auerbach, Wojtek; Poueymirou, William T; Yasenchak, Jason; Frendewey, David; Valenzuela, David M; Giallourakis, Cosmas C; Alt, Frederick W; Yancopoulos, George D; Murphy, Andrew J

    2014-04-08

    Genetic humanization, which involves replacing mouse genes with their human counterparts, can create powerful animal models for the study of human genes and diseases. One important example of genetic humanization involves mice humanized for their Ig genes, allowing for human antibody responses within a mouse background (HumAb mice) and also providing a valuable platform for the generation of fully human antibodies as therapeutics. However, existing HumAb mice do not have fully functional immune systems, perhaps because of the manner in which they were genetically humanized. Heretofore, most genetic humanizations have involved disruption of the endogenous mouse gene with simultaneous introduction of a human transgene at a new and random location (so-called KO-plus-transgenic humanization). More recent efforts have attempted to replace mouse genes with their human counterparts at the same genetic location (in situ humanization), but such efforts involved laborious procedures and were limited in size and precision. We describe a general and efficient method for very large, in situ, and precise genetic humanization using large compound bacterial artificial chromosome-based targeting vectors introduced into mouse ES cells. We applied this method to genetically humanize 3-Mb segments of both the mouse heavy and κ light chain Ig loci, by far the largest genetic humanizations ever described. This paper provides a detailed description of our genetic humanization approach, and the companion paper reports that the humoral immune systems of mice bearing these genetically humanized loci function as efficiently as those of WT mice.

  13. True Randomness from Big Data.

    PubMed

    Papakonstantinou, Periklis A; Woodruff, David P; Yang, Guang

    2016-09-26

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
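
    As a concrete illustration of the post-processing step involved in randomness extraction, the sketch below applies a generic seeded extractor (Toeplitz hashing) to a biased bit stream. This is not the construction introduced in the paper; the input length, output length, and bias are illustrative assumptions.

    ```python
    # Generic seeded extractor sketch (Toeplitz hashing), not the paper's
    # construction: compress n weakly random input bits into m nearly uniform
    # output bits using a uniformly random seed.
    import numpy as np

    def toeplitz_extract(input_bits, seed_bits, m):
        """Multiply the input bit vector (length n) by an m x n Toeplitz
        matrix defined by the first n + m - 1 seed bits, over GF(2)."""
        n = len(input_bits)
        assert len(seed_bits) >= n + m - 1
        x = np.array(input_bits, dtype=np.uint8)
        out = np.zeros(m, dtype=np.uint8)
        for i in range(m):
            # entry (i, j) of the Toeplitz matrix is seed_bits[i - j + n - 1]
            row = np.array([seed_bits[i - j + n - 1] for j in range(n)], dtype=np.uint8)
            out[i] = np.bitwise_xor.reduce(row & x)  # inner product mod 2
        return out

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        raw = (rng.random(4096) < 0.7).astype(int)      # biased raw source
        seed = rng.integers(0, 2, size=4096 + 512 - 1)  # uniform seed (assumed)
        out = toeplitz_extract(raw, seed, m=512)
        print("output ones fraction:", out.mean())
    ```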

  14. True Randomness from Big Data

    NASA Astrophysics Data System (ADS)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.

  15. True Randomness from Big Data

    PubMed Central

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-01-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514

  16. Source-Device-Independent Ultrafast Quantum Random Number Generation.

    PubMed

    Marangon, Davide G; Vallone, Giuseppe; Villoresi, Paolo

    2017-02-10

    Secure random numbers are a fundamental element of many applications in science, statistics, cryptography and, more generally, security protocols. We present a method that enables the generation of high-speed unpredictable random numbers from the quadratures of an electromagnetic field without any assumption on the input state. The method allows us to eliminate the numbers that can be predicted due to the presence of classical and quantum side information. In particular, we introduce a procedure to estimate a bound on the conditional min-entropy based on the entropic uncertainty principle for position and momentum observables of infinite-dimensional quantum systems. Using this method, we experimentally demonstrated the generation of secure true random bits at a rate greater than 1.7 Gbit/s.
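
    The notion of a min-entropy bound can be illustrated with a toy calculation: for a discretized measurement record, the extractable randomness per sample is at most the negative base-2 logarithm of the most probable outcome. The sketch below computes this naive empirical bound; it is not the entropic-uncertainty-based estimate used in the paper, and the bin count and simulated quadrature data are assumptions.

    ```python
    # Toy illustration of min-entropy: H_min = -log2(max_x Pr[X = x]) for a
    # discretized record. Not the entropic-uncertainty bound from the paper.
    import numpy as np

    def min_entropy_per_sample(samples, bins=256):
        counts, _ = np.histogram(samples, bins=bins)
        p_max = counts.max() / counts.sum()
        return -np.log2(p_max)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        quadratures = rng.normal(0.0, 1.0, size=100_000)  # stand-in for quadrature data
        print("min-entropy bound (bits/sample):",
              round(min_entropy_per_sample(quadratures), 3))
    ```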

  17. Random motor generation in a finger tapping task: influence of spatial contingency and of cortical and subcortical hemispheric brain lesions

    PubMed Central

    Annoni, J.; Pegna, A.

    1997-01-01

    OBJECTIVE—To test the hypothesis that, during random motor generation, the spatial contingencies inherent to the task would induce additional preferences in normal subjects, shifting their performances farther from randomness. By contrast, perceptual or executive dysfunction could alter these task related biases in patients with brain damage.
METHODS—Two groups of patients, with right and left focal brain lesions, as well as 25 right handed subjects matched for age and handedness were asked to execute a random choice motor task—namely, to generate a random series of 180 button presses from a set of 10 keys placed vertically in front of them.
RESULTS—In the control group, as in the left brain lesion group, motor generation was subject to deviations from theoretical expected randomness, similar to those when numbers are generated mentally, as immediate repetitions (successive presses on the same key) are avoided. However, the distribution of button presses was also contingent on the topographic disposition of the keys: the central keys were chosen more often than those placed at extreme positions. Small distances were favoured, particularly with the left hand. These patterns were influenced by implicit strategies and task related contingencies.
 By contrast, right brain lesion patients with frontal involvement tended to show a more square distribution of key presses—that is, the number of key presses tended to be more equally distributed. The strategies were also altered by brain lesions: the number of immediate repetitions was more frequent when the lesion involved the right frontal areas yielding a random generation nearer to expected theoretical randomness. The frequency of adjacent key presses was increased by right anterior and left posterior cortical as well as by right subcortical lesions, but decreased by left subcortical lesions.
CONCLUSIONS—Depending on the side of the lesion and the degree of cortical-subcortical involvement, the deficits take on different forms, and immediate repetitions and adjacent key presses show different patterns of alteration. Random motor generation is therefore a complex task which seems to necessitate the participation of numerous cerebral structures, among which those situated in the right frontal, left posterior, and subcortical regions have a predominant role.

 PMID:9408109
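
    The two deviation-from-randomness measures emphasized above (immediate repetitions and adjacent key presses) can be computed directly from a key-press sequence. The sketch below does so for a simulated uniform random presser on 10 keys; it is written in the spirit of the study's measures rather than as a reproduction of its exact statistics.

    ```python
    # Sketch of two simple deviation-from-randomness measures for a key-press
    # sequence on 10 vertically arranged keys (illustrative, not the study's
    # exact statistics).
    import random

    def repetition_and_adjacency_rates(presses):
        pairs = list(zip(presses, presses[1:]))
        rep = sum(a == b for a, b in pairs) / len(pairs)           # immediate repetitions
        adj = sum(abs(a - b) == 1 for a, b in pairs) / len(pairs)  # adjacent keys
        return rep, adj

    if __name__ == "__main__":
        random.seed(0)
        uniform = [random.randrange(10) for _ in range(180)]  # 180 presses, as in the task
        rep, adj = repetition_and_adjacency_rates(uniform)
        # For a uniform random source the expected rates are 0.10 and 0.18.
        print(f"uniform source: repetitions {rep:.2f}, adjacencies {adj:.2f}")
    ```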

  18. Demonstration of Numerical Equivalence of Ensemble and Spectral Averaging in Electromagnetic Scattering by Random Particulate Media

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Dlugach, Janna M.; Zakharova, Nadezhda T.

    2016-01-01

    The numerically exact superposition T-matrix method is used to model far-field electromagnetic scattering by two types of particulate object. Object 1 is a fixed configuration which consists of N identical spherical particles (with N = 200 or 400) quasi-randomly populating a spherical volume V having a median size parameter of 50. Object 2 is a true discrete random medium (DRM) comprising the same number N of particles randomly moving throughout V. The median particle size parameter is fixed at 4. We show that if Object 1 is illuminated by a quasi-monochromatic parallel beam then it generates a typical speckle pattern having no resemblance to the scattering pattern generated by Object 2. However, if Object 1 is illuminated by a parallel polychromatic beam with a 10% bandwidth then it generates a scattering pattern that is largely devoid of speckles and closely reproduces the quasi-monochromatic pattern generated by Object 2. This result serves to illustrate the capacity of the concept of electromagnetic scattering by a DRM to encompass fixed quasi-random particulate samples provided that they are illuminated by polychromatic light.

  19. N-state random switching based on quantum tunnelling

    NASA Astrophysics Data System (ADS)

    Bernardo Gavito, Ramón; Jiménez Urbanos, Fernando; Roberts, Jonathan; Sexton, James; Astbury, Benjamin; Shokeir, Hamzah; McGrath, Thomas; Noori, Yasir J.; Woodhead, Christopher S.; Missous, Mohamed; Roedig, Utz; Young, Robert J.

    2017-08-01

    In this work, we show how the hysteretic behaviour of resonant tunnelling diodes (RTDs) can be exploited for new functionalities. In particular, the RTDs exhibit a stochastic 2-state switching mechanism that could be useful for random number generation and cryptographic applications. This behaviour can be scaled to N-bit switching by connecting various RTDs in series. The InGaAs/AlAs RTDs used in our experiments display very sharp negative differential resistance (NDR) peaks at room temperature which show hysteresis cycles that, rather than having a fixed switching threshold, show a probability distribution about a central value. We propose to use this intrinsic uncertainty emerging from the quantum nature of the RTDs as a source of randomness. We show that a combination of two RTDs in series results in devices with three-state outputs and discuss the possibility of scaling to N-state devices by subsequent series connections of RTDs, which we demonstrate up to the 4-state case. We suggest that the intrinsic uncertainty in the conduction paths of resonant tunnelling diodes can serve as a source of randomness that can be integrated into current electronics to produce on-chip true random number generators. The N-shaped I-V characteristic of RTDs results in a two-level random voltage output when driven with current pulse trains. Electrical characterisation and randomness testing of the devices were conducted in order to determine the validity of the true randomness assumption. Based on the results obtained for the single RTD case, we suggest the possibility of using multi-well devices to generate N-state random switching devices for their use in random number generation or multi-valued logic devices.
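
    A common post-processing step for such two-level switching records is to threshold the voltage trace into raw bits and then remove bias, for example with von Neumann debiasing. The sketch below shows that generic pipeline on simulated data; it is not the processing used by the authors, and the threshold, bias, and noise level are assumed values.

    ```python
    # Generic post-processing sketch (not from the paper): threshold a noisy
    # two-level switching record into raw bits, then apply von Neumann
    # debiasing to remove bias from the raw stream.
    import random

    def threshold_bits(voltages, threshold):
        return [1 if v > threshold else 0 for v in voltages]

    def von_neumann(bits):
        """Map non-overlapping pairs 01 -> 0 and 10 -> 1; discard 00 and 11."""
        out = []
        for a, b in zip(bits[0::2], bits[1::2]):
            if a != b:
                out.append(a)
        return out

    if __name__ == "__main__":
        random.seed(0)
        # Simulated switching record biased towards the upper level
        volts = [random.gauss(1.0 if random.random() < 0.7 else 0.0, 0.1)
                 for _ in range(20000)]
        raw = threshold_bits(volts, 0.5)
        unbiased = von_neumann(raw)
        print("raw ones:", sum(raw) / len(raw),
              "debiased ones:", sum(unbiased) / len(unbiased))
    ```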

  20. Very late stent thrombosis with second generation drug eluting stents compared to bare metal stents: Network meta-analysis of randomized primary percutaneous coronary intervention trials.

    PubMed

    Philip, Femi; Stewart, Susan; Southard, Jeffrey A

    2016-07-01

    The relative safety of drug-eluting stents (DES) and bare-metal stents (BMS) in primary percutaneous coronary intervention (PPCI) in ST elevation myocardial infarction (STEMI) continues to be debated. We compared long-term clinical outcomes between second generation DES and BMS for primary percutaneous coronary intervention (PCI) using network meta-analysis. Randomized controlled trials comparing stent types (first generation DES, second generation DES, or BMS) were considered for inclusion. A search strategy used Medline, Embase, Cochrane databases, and proceedings of international meetings. Information about study design, inclusion criteria, and sample characteristics was extracted. Network meta-analysis was used to pool direct (comparison of second generation DES to BMS) and indirect evidence (first generation DES with BMS and second generation DES) from the randomized trials. Twelve trials comparing all stent types including 9,673 patients randomly assigned to treatment groups were analyzed. Second generation DES was associated with significantly lower incidence of definite or probable ST (OR 0.59, 95% CI 0.39-0.89), MI (OR 0.59, 95% CI 0.39-0.89), and TVR at 3 years (OR 0.50, 95% CI 0.31-0.81) compared with BMS. In addition, there was a significantly lower incidence of MACE with second generation DES versus BMS (OR 0.54, 95% CI 0.34-0.74) at 3 years. These were driven by a higher rate of TVR, MI and stent thrombosis in the BMS group at 3 years. There was a non-significant reduction in the overall and cardiac mortality [OR 0.83, 95% CI (0.60-1.14), OR 0.88, 95% CI (0.6-1.28)] with the use of second generation DES versus BMS at 3 years. Network meta-analysis of randomized trials of primary PCI demonstrated lower incidence of MACE, MI, TVR, and stent thrombosis with second generation DES compared with BMS. © 2016 Wiley Periodicals, Inc.

  1. A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing

    NASA Technical Reports Server (NTRS)

    Takaki, Mitsuo; Cavalcanti, Diego; Gheyi, Rohit; Iyoda, Juliano; dAmorim, Marcelo; Prudencio, Ricardo

    2009-01-01

    The complexity of constraints is a major obstacle for constraint-based software verification. Automatic constraint solvers are fundamentally incomplete: input constraints often build on some undecidable theory or some theory the solver does not support. This paper proposes and evaluates several randomized solvers to address this issue. We compare the effectiveness of a symbolic solver (CVC3), a random solver, three hybrid solvers (i.e., a mix of random and symbolic), and two heuristic search solvers. We evaluate the solvers on two benchmarks: one consisting of manually generated constraints and another generated with a concolic execution of 8 subjects. In addition to fully decidable constraints, the benchmarks include constraints with non-linear integer arithmetic, integer modulo and division, bitwise arithmetic, and floating-point arithmetic. As expected, symbolic solving (in particular, CVC3) subsumes the other solvers for the concolic execution of subjects that only generate decidable constraints. For the remaining subjects, the solvers are complementary.
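
    The "random solver" baseline mentioned above can be sketched very simply: sample candidate assignments from the variables' ranges and keep any assignment that satisfies the constraint. The example below is a minimal illustration, not the paper's implementation; the constraint, ranges, and trial budget are assumptions.

    ```python
    # Minimal sketch of a random solver (not the paper's implementation):
    # sample candidate assignments uniformly from the variables' ranges and
    # return the first one that satisfies the constraint.
    import random

    def random_solve(constraint, var_ranges, trials=100_000, seed=0):
        rng = random.Random(seed)
        for _ in range(trials):
            candidate = {v: rng.uniform(lo, hi) for v, (lo, hi) in var_ranges.items()}
            if constraint(candidate):
                return candidate
        return None  # no satisfying assignment found within the trial budget

    if __name__ == "__main__":
        # a non-linear constraint mixing floating-point and integer-like terms
        c = lambda a: a["x"] ** 2 + a["y"] ** 2 < 1.0 and int(a["x"] * 10) % 3 == 1
        print(random_solve(c, {"x": (-2.0, 2.0), "y": (-2.0, 2.0)}))
    ```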

  2. Megahertz-Rate Semi-Device-Independent Quantum Random Number Generators Based on Unambiguous State Discrimination

    NASA Astrophysics Data System (ADS)

    Brask, Jonatan Bohr; Martin, Anthony; Esposito, William; Houlmann, Raphael; Bowles, Joseph; Zbinden, Hugo; Brunner, Nicolas

    2017-05-01

    An approach to quantum random number generation based on unambiguous quantum state discrimination is developed. We consider a prepare-and-measure protocol, where two nonorthogonal quantum states can be prepared, and a measurement device aims at unambiguously discriminating between them. Because the states are nonorthogonal, this necessarily leads to a minimal rate of inconclusive events whose occurrence must be genuinely random and which provide the randomness source that we exploit. Our protocol is semi-device-independent in the sense that the output entropy can be lower bounded based on experimental data and a few general assumptions about the setup alone. It is also practically relevant, which we demonstrate by realizing a simple optical implementation, achieving rates of 16.5 Mbit/s. Combining ease of implementation, a high rate, and a real-time entropy estimation, our protocol represents a promising approach intermediate between fully device-independent protocols and commercial quantum random number generators.

  3. Secure self-calibrating quantum random-bit generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiorentino, M.; Santori, C.; Spillane, S. M.

    2007-03-15

    Random-bit generators (RBGs) are key components of a variety of information processing applications ranging from simulations to cryptography. In particular, cryptographic systems require 'strong' RBGs that produce high-entropy bit sequences, but traditional software pseudo-RBGs have very low entropy content and therefore are relatively weak for cryptography. Hardware RBGs yield entropy from chaotic or quantum physical systems and therefore are expected to exhibit high entropy, but in current implementations their exact entropy content is unknown. Here we report a quantum random-bit generator (QRBG) that harvests entropy by measuring single-photon and entangled two-photon polarization states. We introduce and implement a quantum tomographic method to measure a lower bound on the 'min-entropy' of the system, and we employ this value to distill a truly random-bit sequence. This approach is secure: even if an attacker takes control of the source of optical states, a secure random sequence can be distilled.

  4. Markov Chain Analysis of Musical Dice Games

    NASA Astrophysics Data System (ADS)

    Volchenkov, D.; Dawin, J. R.

    2012-07-01

    A system for using dice to compose music randomly is known as the musical dice game. The discrete-time MIDI models of 804 pieces of classical music written by 29 composers have been encoded into transition matrices and studied as Markov chains. In contrast to human languages, entropy dominates over redundancy in the musical dice games based on these classical compositions. The maximum complexity is achieved for blocks consisting of just a few notes (8 notes for the musical dice games generated over Bach's compositions). First-passage times to notes can be used to resolve tonality and to characterize a composer.
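
    A minimal version of the encoding described above is to estimate a first-order transition matrix from a note sequence and then sample a new "dice game" realization from it. The sketch below does this for a toy melody; it is illustrative only and does not reproduce the authors' MIDI processing.

    ```python
    # Minimal sketch of the idea (not the authors' MIDI pipeline): estimate a
    # first-order Markov transition matrix from a note sequence and sample a
    # new "musical dice game" realization from it.
    import numpy as np

    def transition_matrix(notes, n_states):
        counts = np.zeros((n_states, n_states))
        for a, b in zip(notes, notes[1:]):
            counts[a, b] += 1
        counts += 1e-9                      # avoid zero rows for unseen notes
        return counts / counts.sum(axis=1, keepdims=True)

    def sample_chain(P, start, length, seed=0):
        rng = np.random.default_rng(seed)
        seq, state = [start], start
        for _ in range(length - 1):
            state = rng.choice(len(P), p=P[state])
            seq.append(int(state))
        return seq

    if __name__ == "__main__":
        melody = [0, 2, 4, 5, 4, 2, 0, 4, 7, 4, 0, 2, 4, 2, 0]  # toy pitch classes
        P = transition_matrix(melody, n_states=12)
        print(sample_chain(P, start=0, length=16))
    ```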

  5. Musical Markov Chains

    NASA Astrophysics Data System (ADS)

    Volchenkov, Dima; Dawin, Jean René

    A system for using dice to compose music randomly is known as the musical dice game. The discrete-time MIDI models of 804 pieces of classical music written by 29 composers have been encoded into transition matrices and studied as Markov chains. In contrast to human languages, entropy dominates over redundancy in the musical dice games based on these classical compositions. The maximum complexity is achieved for blocks consisting of just a few notes (8 notes for the musical dice games generated over Bach's compositions). First-passage times to notes can be used to resolve tonality and to characterize a composer.

  6. Biological cell controllable patch-clamp microchip

    NASA Astrophysics Data System (ADS)

    Penmetsa, Siva; Nagrajan, Krithika; Gong, Zhongcheng; Mills, David; Que, Long

    2010-12-01

    A patch-clamp (PC) microchip with cell sorting and positioning functions is reported, which avoids the drawbacks of random cell selection and positioning in conventional PC microchips. The cell sorting and positioning are enabled by air bubble (AB) actuators. AB actuators are pneumatic actuators, in which air pressure is generated by microheaters within sealed microchambers. The sorting, positioning, and capturing of 3T3 cells by this type of microchip have been demonstrated. Using human breast cancer MDA-MB-231 cells as a model, the microchip has also been demonstrated as a label-free technical platform for real-time monitoring of cell viability.

  7. Linguistic Analysis of the Human Heartbeat Using Frequency and Rank Order Statistics

    NASA Astrophysics Data System (ADS)

    Yang, Albert C.-C.; Hseu, Shu-Shya; Yien, Huey-Wen; Goldberger, Ary L.; Peng, C.-K.

    2003-03-01

    Complex physiologic signals may carry unique dynamical signatures that are related to their underlying mechanisms. We present a method based on rank order statistics of symbolic sequences to investigate the profile of different types of physiologic dynamics. We apply this method to heart rate fluctuations, the output of a central physiologic control system. The method robustly discriminates patterns generated from healthy and pathologic states, as well as aging. Furthermore, we observe increased randomness in the heartbeat time series with physiologic aging and pathologic states and also uncover nonrandom patterns in the ventricular response to atrial fibrillation.
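
    The flavour of such symbolic-sequence analysis can be conveyed with a short sketch: map successive interval changes to binary symbols, count short words, and inspect the rank-ordered word frequencies. This is written in the spirit of the described approach, not as the authors' exact statistic; the surrogate interval data and word length are assumptions.

    ```python
    # In the spirit of the described approach (not the authors' exact
    # statistic): map successive heartbeat-interval changes to binary symbols,
    # count short "words", and inspect the rank-ordered word frequencies.
    from collections import Counter
    import random

    def symbolize(intervals):
        return [1 if b > a else 0 for a, b in zip(intervals, intervals[1:])]

    def word_rank_frequencies(symbols, word_len=3):
        words = ["".join(map(str, symbols[i:i + word_len]))
                 for i in range(len(symbols) - word_len + 1)]
        counts = Counter(words)
        total = sum(counts.values())
        return [(w, c / total) for w, c in counts.most_common()]

    if __name__ == "__main__":
        random.seed(0)
        rr = [0.8 + 0.05 * random.random() for _ in range(2000)]  # surrogate RR intervals (s)
        for word, freq in word_rank_frequencies(symbolize(rr))[:4]:
            print(word, round(freq, 3))
    ```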

  8. A second-generation blood substitute (perfluorodichlorooctane emulsion) does not activate complement during an ex vivo circulation model of bypass.

    PubMed

    Rosoff, J D; Soltow, L O; Vocelka, C R; Schmer, G; Chandler, W L; Cochran, R P; Kunzelman, K S; Spiess, B D

    1998-08-01

    To examine whether a second-generation perfluorocarbon (PFC) blood substitute added to the cardiopulmonary bypass (CPB) prime influences complement production. A prospective, randomized, single-blinded, ex vivo model. A university hospital, laboratory, and clinics. Ten healthy consenting adult volunteer blood donors (five men, five women). An ex vivo closed-loop extracorporeal circuit including a membrane oxygenator, tubing, and filter, primed with crystalloid or crystalloid plus PFC, was circulated for 1 hour with the addition of 500 mL of heparinized fresh human whole blood. Laboratory specimens were drawn from the circuit at 10-minute intervals for 1 hour and measured for complement (C3a, Bb fragment) concentrations, blood gases, fibrinogen concentration, platelet count, and hematocrit. In the PFC group, C3a and Bb fragments were equal to or less than those in the group that received crystalloid alone. The second-generation PFC added to the prime of a CPB circuit does not independently increase complement production.

  9. Pilot study of large-scale production of mutant pigs by ENU mutagenesis.

    PubMed

    Hai, Tang; Cao, Chunwei; Shang, Haitao; Guo, Weiwei; Mu, Yanshuang; Yang, Shulin; Zhang, Ying; Zheng, Qiantao; Zhang, Tao; Wang, Xianlong; Liu, Yu; Kong, Qingran; Li, Kui; Wang, Dayu; Qi, Meng; Hong, Qianlong; Zhang, Rui; Wang, Xiupeng; Jia, Qitao; Wang, Xiao; Qin, Guosong; Li, Yongshun; Luo, Ailing; Jin, Weiwu; Yao, Jing; Huang, Jiaojiao; Zhang, Hongyong; Li, Menghua; Xie, Xiangmo; Zheng, Xuejuan; Guo, Kenan; Wang, Qinghua; Zhang, Shibin; Li, Liang; Xie, Fei; Zhang, Yu; Weng, Xiaogang; Yin, Zhi; Hu, Kui; Cong, Yimei; Zheng, Peng; Zou, Hailong; Xin, Leilei; Xia, Jihan; Ruan, Jinxue; Li, Hegang; Zhao, Weiming; Yuan, Jing; Liu, Zizhan; Gu, Weiwang; Li, Ming; Wang, Yong; Wang, Hongmei; Yang, Shiming; Liu, Zhonghua; Wei, Hong; Zhao, Jianguo; Zhou, Qi; Meng, Anming

    2017-06-22

    N-ethyl-N-nitrosourea (ENU) mutagenesis is a powerful tool to generate mutants on a large scale efficiently, and to discover genes with novel functions at the whole-genome level in Caenorhabditis elegans, flies, zebrafish and mice, but it has never been tried in large model animals. We describe a successful systematic three-generation ENU mutagenesis screening in pigs with the establishment of the Chinese Swine Mutagenesis Consortium. A total of 6,770 G1 and 6,800 G3 pigs were screened, 36 dominant and 91 recessive novel pig families with various phenotypes were established. The causative mutations in 10 mutant families were further mapped. As examples, the mutation of SOX10 (R109W) in pig causes inner ear malfunctions and mimics human Mondini dysplasia, and upregulated expression of FBXO32 is associated with congenital splay legs. This study demonstrates the feasibility of artificial random mutagenesis in pigs and opens an avenue for generating a reservoir of mutants for agricultural production and biomedical research.

  10. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...

  11. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...

  12. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...

  13. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...

  14. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...
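
    The records above prescribe dividing a surface into square areas and selecting sampling locations by random number generation. The sketch below shows the general idea of grid-based random sample selection only; it does not implement the specific procedure or requirements of 40 CFR 761.308, and the surface dimensions and sample count are assumed values.

    ```python
    # Minimal sketch of grid-based random sample selection (not the specific
    # procedure required by 40 CFR 761.308).
    import random

    def select_grid_samples(width_cm, height_cm, cell_cm, n_samples, seed=None):
        """Divide a rectangular surface into square cells and pick cells
        uniformly at random without replacement."""
        rng = random.Random(seed)
        n_cols = width_cm // cell_cm
        n_rows = height_cm // cell_cm
        cells = [(r, c) for r in range(n_rows) for c in range(n_cols)]
        return rng.sample(cells, k=min(n_samples, len(cells)))

    if __name__ == "__main__":
        for row, col in select_grid_samples(300, 200, 10, n_samples=5, seed=42):
            print(f"sample cell: row {row}, column {col}")
    ```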

  15. A Western-like fat diet is sufficient to induce a gradual enhancement in fat mass over generations.

    PubMed

    Massiera, Florence; Barbry, Pascal; Guesnet, Philippe; Joly, Aurélie; Luquet, Serge; Moreilhon-Brest, Chimène; Mohsen-Kanson, Tala; Amri, Ez-Zoubir; Ailhaud, Gérard

    2010-08-01

    The prevalence of obesity has steadily increased over the last few decades. During this time, populations of industrialized countries have been exposed to diets rich in fat with a high content of linoleic acid and a low content of alpha-linolenic acid compared with recommended intake. To assess the contribution of dietary fatty acids, male and female mice fed a high-fat diet (35% energy as fat, linoleic acid:alpha-linolenic acid ratio of 28) were mated randomly and maintained after breeding on the same diet for successive generations. Offspring showed, over four generations, a gradual enhancement in fat mass due to combined hyperplasia and hypertrophy with no change in food intake. Transgenerational alterations in adipokine levels were accompanied by hyperinsulinemia. Gene expression analyses of the stromal vascular fraction of adipose tissue, over generations, revealed discrete and steady changes in certain important players, such as CSF3 and Nocturnin. Thus, under conditions of genome stability and with no change in the regimen over four generations, we show that a Western-like fat diet induces a gradual fat mass enhancement, in accordance with the increasing prevalence of obesity observed in humans.

  16. Accelerating Pseudo-Random Number Generator for MCNP on GPU

    NASA Astrophysics Data System (ADS)

    Gong, Chunye; Liu, Jie; Chi, Lihua; Hu, Qingfeng; Deng, Li; Gong, Zhenghu

    2010-09-01

    Pseudo-random number generators (PRNG) are intensively used in many stochastic algorithms in particle simulations, artificial neural networks and other scientific computation. The PRNG in the Monte Carlo N-Particle Transport Code (MCNP) requires a long period, high quality, flexible jump-ahead, and high speed. In this paper, we implement such a PRNG for MCNP on NVIDIA's GTX200 Graphics Processor Units (GPU) using the CUDA programming model. Results show that speedups of 3.80 to 8.10 times are achieved compared with 4- to 6-core CPUs, and more than 679.18 million double-precision random numbers can be generated per second on the GPU.
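
    The "flexible jump" requirement refers to the ability to advance a generator's state by k steps in O(log k) time, which is standard for linear congruential generators. The sketch below shows that skip-ahead technique; the modulus, multiplier, and increment are illustrative placeholders, not MCNP's actual parameters, and this is not the GPU implementation described in the paper.

    ```python
    # Sketch of LCG skip-ahead, the kind of "flexible jump" referred to above.
    # The constants below are illustrative placeholders, not MCNP's parameters.
    M = 1 << 63                   # modulus (assumed power of two)
    A = 3512401965023503517       # multiplier (placeholder)
    C = 1                         # increment (placeholder)

    def lcg_next(x):
        return (A * x + C) % M

    def lcg_jump(x, k):
        """Advance the state by k steps in O(log k) by repeatedly squaring
        the affine map x -> a*x + c (mod M)."""
        a, c = A, C
        acc_a, acc_c = 1, 0       # accumulated map, initially the identity
        while k > 0:
            if k & 1:
                acc_a, acc_c = (a * acc_a) % M, (a * acc_c + c) % M
            a, c = (a * a) % M, (a * c + c) % M
            k >>= 1
        return (acc_a * x + acc_c) % M

    if __name__ == "__main__":
        x = 1
        stepped = x
        for _ in range(1000):
            stepped = lcg_next(stepped)
        assert lcg_jump(x, 1000) == stepped
        print("jump-ahead matches sequential stepping")
    ```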

  17. Digital-Analog Hybrid Scheme and Its Application to Chaotic Random Number Generators

    NASA Astrophysics Data System (ADS)

    Yuan, Zeshi; Li, Hongtao; Miao, Yunchi; Hu, Wen; Zhu, Xiaohua

    2017-12-01

    Practical random number generation (RNG) circuits are typically achieved with analog devices or digital approaches. Digital techniques, which use field-programmable gate arrays (FPGAs), graphics processing units (GPUs), and similar hardware, usually perform better than analog methods because they are programmable, efficient, and robust. However, digital realizations suffer from the effect of finite precision. Accordingly, the generated random numbers (RNs) are actually periodic rather than truly random. To tackle this limitation, in this paper we propose a novel digital-analog hybrid scheme that employs a digital unit as the main body and a minimal number of analog devices to generate physical RNs. Moreover, the possibility of realizing the proposed scheme with only one memory element is discussed. Without loss of generality, we use the capacitor and the memristor along with FPGA to construct the proposed hybrid system, and a chaotic true random number generator (TRNG) circuit is realized, producing physical RNs at a throughput of Gbit/s scale. These RNs successfully pass all the tests in the NIST SP800-22 package, confirming the significance of the scheme in practical applications. In addition, the use of this new scheme is not restricted to RNGs; it also provides a strategy to mitigate the effect of finite precision in other digital systems.
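
    The finite-precision issue raised above can be demonstrated with a toy fixed-point chaotic map: because the state space is finite, the orbit must eventually cycle, and the cycle can be found with Floyd's algorithm. The sketch below demonstrates only that effect and is not a model of the authors' hybrid circuit; the map, precision levels, and seed are assumptions.

    ```python
    # Toy demonstration of the finite-precision issue (not the authors'
    # circuit): a logistic map with limited fixed-point precision must
    # eventually enter a cycle, which Floyd's algorithm detects.
    def logistic_fixed(x, bits=16):
        # fixed-point version of x -> 4 x (1 - x) with `bits` bits of state
        scale = 1 << bits
        return (4 * x * (scale - x) // scale) % scale

    def cycle_length(seed, bits=16):
        tortoise = logistic_fixed(seed, bits)
        hare = logistic_fixed(tortoise, bits)
        while tortoise != hare:                  # find a meeting point in the cycle
            tortoise = logistic_fixed(tortoise, bits)
            hare = logistic_fixed(logistic_fixed(hare, bits), bits)
        length, hare = 1, logistic_fixed(tortoise, bits)
        while tortoise != hare:                  # measure the cycle length
            hare = logistic_fixed(hare, bits)
            length += 1
        return length

    if __name__ == "__main__":
        for bits in (8, 12, 16):
            seed = 12345 % (1 << bits)
            print(f"{bits}-bit state: cycle length {cycle_length(seed, bits)}")
    ```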

  18. Persistence of Lactobacillus fermentum RC-14 and Lactobacillus rhamnosus GR-1 but Not L. rhamnosus GG in the Human Vagina as Demonstrated by Randomly Amplified Polymorphic DNA

    PubMed Central

    Gardiner, Gillian E.; Heinemann, Christine; Bruce, Andrew W.; Beuerman, Dee; Reid, Gregor

    2002-01-01

    Lactobacillus rhamnosus GR-1 and L. fermentum RC-14 are well-characterized probiotic strains with efficacy in the prevention and treatment of urogenital infections in women. The aim of the present study was to apply a molecular biology-based methodology for the detection of these strains and L. rhamnosus GG (a commercially available intestinal probiotic) in the human vagina in order to assess probiotic persistence at this site. Ten healthy women inserted vaginally a capsule containing either a combination of strains GR-1 and RC-14 or the GG strain for 3 consecutive nights. Vaginal swabs taken before and at various time points after probiotic insertion were analyzed, and the Lactobacillus flora was assessed by randomly amplified polymorphic DNA (RAPD) analysis. This method generated discrete DNA fingerprints for GR-1, RC-14, and GG and enabled successful detection of these strains in the vagina. Strain GR-1 and/or strain RC-14 was found to persist in the vaginal tract for up to 19 days after vaginal instillation, while L. rhamnosus GG was detectable for up to 5 days postadministration. In conclusion, the fates of probiotic L. rhamnosus and L. fermentum strains were successfully monitored in the human vagina by RAPD analysis. This technique provides molecular biology-based evidence that RC-14 and GR-1, strains selected as urogenital probiotics, persist in the human vagina and may be more suited to vaginal colonization than L. rhamnosus GG. This highlights the importance of proper selection of strains for urogenital probiotic applications. PMID:11777835

  19. Learning-based stochastic object models for characterizing anatomical variations

    NASA Astrophysics Data System (ADS)

    Dolly, Steven R.; Lou, Yang; Anastasio, Mark A.; Li, Hua

    2018-03-01

    It is widely known that the optimization of imaging systems based on objective, task-based measures of image quality via computer simulation requires the use of a stochastic object model (SOM). However, the development of computationally tractable SOMs that can accurately model the statistical variations in human anatomy within a specified ensemble of patients remains a challenging task. Previously reported numerical anatomic models lack the ability to accurately model inter-patient and inter-organ variations in human anatomy among a broad patient population, mainly because they are built from image data corresponding to only a few patients and individual anatomic organs. This may introduce phantom-specific bias into computer-simulation studies, where the study result is heavily dependent on which phantom is used. In certain applications, however, databases of high-quality volumetric images and organ contours are available that can facilitate this SOM development. In this work, a novel and tractable methodology for learning a SOM and generating numerical phantoms from a set of volumetric training images is developed. The proposed methodology learns geometric attribute distributions (GAD) of human anatomic organs from a broad patient population, which characterize both centroid relationships between neighboring organs and anatomic shape similarity of individual organs among patients. By randomly sampling the learned centroid and shape GADs with the constraints of the respective principal attribute variations learned from the training data, an ensemble of stochastic objects can be created. The randomness in organ shape and position reflects the learned variability of human anatomy. To demonstrate the methodology, a SOM of an adult male pelvis is computed and examples of corresponding numerical phantoms are created.
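
    A minimal sketch of the sampling step described above is to fit the mean and covariance of geometric attribute vectors and then draw new attribute vectors along the principal components with truncated coefficients. This is illustrative only and does not reproduce the paper's full GAD model; the attribute dimensionality, truncation limit, and surrogate training data are assumptions.

    ```python
    # Minimal sketch of sampling new objects from learned geometric-attribute
    # statistics via principal components (not the paper's full GAD model).
    import numpy as np

    def fit_attribute_model(training_attrs):
        """training_attrs: (n_patients, n_attributes) array of, e.g., organ
        centroid offsets and shape descriptors."""
        mean = training_attrs.mean(axis=0)
        cov = np.cov(training_attrs, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(cov)
        return mean, eigvals.clip(min=0), eigvecs

    def sample_attributes(mean, eigvals, eigvecs, n_samples=1, limit_sd=2.0, seed=0):
        rng = np.random.default_rng(seed)
        # principal-component coefficients, truncated to +/- limit_sd
        z = rng.standard_normal((n_samples, len(eigvals))).clip(-limit_sd, limit_sd)
        return mean + (z * np.sqrt(eigvals)) @ eigvecs.T

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        train = rng.normal(size=(40, 6))            # surrogate attribute vectors
        model = fit_attribute_model(train)
        print(sample_attributes(*model, n_samples=2))
    ```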

  20. Anonymous authenticated communications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaver, Cheryl L; Schroeppel, Richard C; Snyder, Lillian A

    2007-06-19

    A method of performing electronic communications between members of a group wherein the communications are authenticated as being from a member of the group and have not been altered, comprising: generating a plurality of random numbers; distributing in a digital medium the plurality of random numbers to the members of the group; publishing a hash value of contents of the digital medium; distributing to the members of the group public-key-encrypted messages each containing a same token comprising a random number; and encrypting a message with a key generated from the token and the plurality of random numbers.
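
    The key-derivation step described above (a key generated from a shared token and the distributed random numbers, with a published hash of the distributed medium) can be sketched with standard-library primitives. This is not the full patented construction; a real deployment would use an authenticated cipher for message confidentiality, and the token and randomness sizes below are assumptions.

    ```python
    # Sketch of the key-derivation step described in the abstract, using only
    # standard-library primitives; not the full patented construction.
    import hashlib
    import hmac
    import secrets

    def publish_digest(shared_randomness: bytes) -> str:
        """Hash of the distributed digital medium, published for verification."""
        return hashlib.sha256(shared_randomness).hexdigest()

    def derive_group_key(token: bytes, shared_randomness: bytes) -> bytes:
        """Key generated from the per-session token and the shared random numbers."""
        return hashlib.sha256(token + shared_randomness).digest()

    def authenticate_message(key: bytes, message: bytes) -> bytes:
        """Group members can verify the tag; outsiders cannot forge it, and the
        tag does not identify which member produced it."""
        return hmac.new(key, message, hashlib.sha256).digest()

    if __name__ == "__main__":
        shared = secrets.token_bytes(1024)   # random numbers distributed to the group
        token = secrets.token_bytes(32)      # per-session token sent encrypted to members
        print("published digest:", publish_digest(shared)[:16], "...")
        key = derive_group_key(token, shared)
        tag = authenticate_message(key, b"status: all clear")
        assert hmac.compare_digest(tag, authenticate_message(key, b"status: all clear"))
        print("message authenticated as coming from a group member")
    ```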

  1. NullSeq: A Tool for Generating Random Coding Sequences with Desired Amino Acid and GC Contents.

    PubMed

    Liu, Sophia S; Hockenberry, Adam J; Lancichinetti, Andrea; Jewett, Michael C; Amaral, Luís A N

    2016-11-01

    The existence of over- and under-represented sequence motifs in genomes provides evidence of selective evolutionary pressures on biological mechanisms such as transcription, translation, ligand-substrate binding, and host immunity. In order to accurately identify motifs and other genome-scale patterns of interest, it is essential to be able to generate accurate null models that are appropriate for the sequences under study. While many tools have been developed to create random nucleotide sequences, protein coding sequences are subject to a unique set of constraints that complicates the process of generating appropriate null models. There are currently no tools available that allow users to create random coding sequences with specified amino acid composition and GC content for the purpose of hypothesis testing. Using the principle of maximum entropy, we developed a method that generates unbiased random sequences with pre-specified amino acid and GC content, which we have released as a Python package. Our method is the simplest way to obtain maximally unbiased random sequences that are subject to GC usage and primary amino acid sequence constraints. Furthermore, this approach can easily be expanded to create unbiased random sequences that incorporate more complicated constraints such as individual nucleotide usage or even di-nucleotide frequencies. The ability to generate correctly specified null models will allow researchers to accurately identify sequence motifs, which will lead to a better understanding of biological processes as well as more effective engineering of biological systems.
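
    The core idea can be conveyed with a toy version: sample synonymous codons with maximum-entropy (Boltzmann) weights on GC content and tune a single parameter until the target GC fraction is approached. The sketch below is not the NullSeq package itself; the codon table is abbreviated, and the target GC, protein sequence, and tuning procedure are illustrative assumptions.

    ```python
    # Toy version of the idea (not the NullSeq package): sample synonymous
    # codons with Boltzmann weights on GC content, tuning one parameter to
    # approach a target GC fraction. The codon table is abbreviated.
    import math
    import random

    CODONS = {  # abbreviated synonymous-codon table for illustration
        "K": ["AAA", "AAG"],
        "N": ["AAT", "AAC"],
        "G": ["GGT", "GGC", "GGA", "GGG"],
        "L": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
    }

    def gc_count(seq):
        return sum(base in "GC" for base in seq)

    def sample_sequence(protein, lam, rng):
        seq = []
        for aa in protein:
            codons = CODONS[aa]
            weights = [math.exp(lam * gc_count(c)) for c in codons]
            seq.append(rng.choices(codons, weights=weights)[0])
        return "".join(seq)

    def tune_lambda(protein, target_gc, rng, trials=200):
        lo, hi = -5.0, 5.0
        for _ in range(30):                  # bisection on the monotone GC response
            mid = (lo + hi) / 2
            gc = sum(gc_count(s) for s in (sample_sequence(protein, mid, rng)
                                           for _ in range(trials)))
            gc /= 3 * len(protein) * trials
            lo, hi = (mid, hi) if gc < target_gc else (lo, mid)
        return (lo + hi) / 2

    if __name__ == "__main__":
        rng = random.Random(0)
        protein = "KNGLLKNG" * 10
        lam = tune_lambda(protein, target_gc=0.55, rng=rng)
        seq = sample_sequence(protein, lam, rng)
        print("achieved GC:", round(gc_count(seq) / len(seq), 3))
    ```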

  2. Generation of Aptamers from A Primer-Free Randomized ssDNA Library Using Magnetic-Assisted Rapid Aptamer Selection

    NASA Astrophysics Data System (ADS)

    Tsao, Shih-Ming; Lai, Ji-Ching; Horng, Horng-Er; Liu, Tu-Chen; Hong, Chin-Yih

    2017-04-01

    Aptamers are oligonucleotides that can bind to specific target molecules. Most aptamers are generated using random libraries in the standard systematic evolution of ligands by exponential enrichment (SELEX). Each random library contains oligonucleotides with a randomized central region and two fixed primer regions at both ends. The fixed primer regions are necessary for amplifying target-bound sequences by PCR. However, these extra-sequences may cause non-specific bindings, which potentially interfere with good binding for random sequences. The Magnetic-Assisted Rapid Aptamer Selection (MARAS) is a newly developed protocol for generating single-strand DNA aptamers. No repeat selection cycle is required in the protocol. This study proposes and demonstrates a method to isolate aptamers for C-reactive proteins (CRP) from a randomized ssDNA library containing no fixed sequences at 5′ and 3′ termini using the MARAS platform. Furthermore, the isolated primer-free aptamer was sequenced and binding affinity for CRP was analyzed. The specificity of the obtained aptamer was validated using blind serum samples. The result was consistent with monoclonal antibody-based nephelometry analysis, which indicated that a primer-free aptamer has high specificity toward targets. MARAS is a feasible platform for efficiently generating primer-free aptamers for clinical diagnoses.

  3. A Comparative Study of Random Patterns for Digital Image Correlation

    NASA Astrophysics Data System (ADS)

    Stoilov, G.; Kavardzhikov, V.; Pashkouleva, D.

    2012-06-01

    Digital Image Correlation (DIC) is a computer-based image analysis technique utilizing random patterns, which finds applications in experimental mechanics of solids and structures. In this paper, a comparative study of three simulated random patterns is presented. One of them is generated with a new algorithm introduced by the authors. A criterion for the quantitative evaluation of random patterns, based on their autocorrelation functions, is introduced. The patterns' deformations are simulated numerically and realized experimentally. Displacements are measured using the DIC method. Tensile tests are performed after printing the generated random patterns on surfaces of standard iron sheet specimens. The newly designed random pattern is found to retain relatively good quality up to 20% deformation.
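
    A simulated random pattern and its autocorrelation can be produced along the following lines: scatter Gaussian dots at random positions and evaluate the autocorrelation width via the FFT. This sketch does not reproduce the authors' generation algorithm or their evaluation criterion; the pattern size, dot count, and dot radius are assumed values.

    ```python
    # Sketch of generating a random speckle pattern and inspecting its
    # autocorrelation width (the paper's quality criterion is not reproduced).
    import numpy as np

    def speckle_pattern(size=256, n_dots=800, radius=3.0, seed=0):
        rng = np.random.default_rng(seed)
        y, x = np.mgrid[0:size, 0:size]
        img = np.zeros((size, size))
        for cx, cy in rng.uniform(0, size, size=(n_dots, 2)):
            img += np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * radius ** 2))
        return img / img.max()

    def autocorrelation(img):
        f = np.fft.fft2(img - img.mean())
        acf = np.fft.ifft2(f * np.conj(f)).real   # Wiener-Khinchin theorem
        return np.fft.fftshift(acf) / acf.max()

    if __name__ == "__main__":
        img = speckle_pattern()
        acf = autocorrelation(img)
        centre = acf.shape[0] // 2
        half_width = int(np.argmax(acf[centre, centre:] < 0.5))
        print("autocorrelation half-width (pixels):", half_width)
    ```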

  4. SNP selection and classification of genome-wide SNP data using stratified sampling random forests.

    PubMed

    Wu, Qingyao; Ye, Yunming; Liu, Yang; Ng, Michael K

    2012-09-01

    In high-dimensional genome-wide association (GWA) case-control data for complex diseases, a large portion of single-nucleotide polymorphisms (SNPs) are usually irrelevant to the disease. A simple random sampling method in a random forest, using the default mtry parameter to choose the feature subspace, will select too many subspaces without informative SNPs. An exhaustive search for an optimal mtry is often required in order to include useful and relevant SNPs and discard the vast number of non-informative ones, but such a search is too time-consuming for high-dimensional GWA data. The main aim of this paper is to propose a stratified sampling method for feature subspace selection to generate decision trees in a random forest for GWA high-dimensional data. Our idea is to design an equal-width discretization scheme for informativeness to divide SNPs into multiple groups. In feature subspace selection, we randomly select the same number of SNPs from each group and combine them to form a subspace to generate a decision tree. This stratified sampling procedure ensures that each subspace contains enough useful SNPs, avoids the very high computational cost of an exhaustive search for an optimal mtry, and maintains the randomness of a random forest. We employ two genome-wide SNP data sets (Parkinson case-control data comprising 408 803 SNPs and Alzheimer case-control data comprising 380 157 SNPs) to demonstrate that the proposed stratified sampling method is effective, and it can generate better random forests, with higher accuracy and a lower error bound, than those produced by Breiman's random forest generation method. For the Parkinson data, we also report some interesting genes identified by the method that may be associated with neurological disorders and warrant further biological investigation.
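
    The stratified subspace-selection step described above can be sketched as follows: bin SNPs by an informativeness score into equal-width groups and draw the same number of SNPs from every group for each tree. This is illustrative only, not the full random-forest implementation; the number of groups, SNPs per group, and surrogate scores are assumptions.

    ```python
    # Sketch of the stratified feature-subspace selection step (not the full
    # random forest): bin SNPs by an informativeness score into equal-width
    # groups and draw the same number of SNPs from every group.
    import numpy as np

    def stratified_subspace(scores, n_groups=5, per_group=10, seed=0):
        """scores: per-SNP informativeness values (e.g. chi-square statistics)."""
        rng = np.random.default_rng(seed)
        edges = np.linspace(scores.min(), scores.max(), n_groups + 1)
        groups = np.clip(np.digitize(scores, edges[1:-1]), 0, n_groups - 1)
        subspace = []
        for g in range(n_groups):
            members = np.flatnonzero(groups == g)
            take = min(per_group, len(members))
            subspace.extend(rng.choice(members, size=take, replace=False))
        return np.array(subspace)

    if __name__ == "__main__":
        rng = np.random.default_rng(42)
        snp_scores = rng.exponential(scale=1.0, size=100_000)  # surrogate informativeness
        idx = stratified_subspace(snp_scores)
        print("subspace size:", len(idx))
    ```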

  5. A system performance throughput model applicable to advanced manned telescience systems

    NASA Technical Reports Server (NTRS)

    Haines, Richard F.

    1990-01-01

    As automated space systems become more complex, autonomous, and opaque to the flight crew, it becomes increasingly difficult to determine whether the total system is performing as it should. Some of the complex and interrelated human-performance measurement issues related to total system validation are addressed. An evaluative throughput model is presented which can be used to generate a human operator-related benchmark or figure of merit for a given system which involves humans at the input and output ends as well as other automated intelligent agents. The concept of sustained and accurate command/control data information transfer is introduced. The first two input parameters of the model involve nominal and off-nominal predicted events. The first of these calls for a detailed task analysis, while the second calls for a contingency event assessment. The last two required input parameters involve actual (measured) events, namely human performance and continuous semi-automated system performance. An expression combining these four parameters was found, using digital simulations and identical, representative random data, to yield the smallest variance.

  6. EPHRIN-B1 Mosaicism Drives Cell Segregation in Craniofrontonasal Syndrome hiPSC-Derived Neuroepithelial Cells.

    PubMed

    Niethamer, Terren K; Larson, Andrew R; O'Neill, Audrey K; Bershteyn, Marina; Hsiao, Edward C; Klein, Ophir D; Pomerantz, Jason H; Bush, Jeffrey O

    2017-03-14

    Although human induced pluripotent stem cells (hiPSCs) hold great potential for the study of human diseases affecting disparate cell types, they have been underutilized in seeking mechanistic insights into the pathogenesis of congenital craniofacial disorders. Craniofrontonasal syndrome (CFNS) is a rare X-linked disorder caused by mutations in EFNB1 and characterized by craniofacial, skeletal, and neurological anomalies. Heterozygous females are more severely affected than hemizygous males, a phenomenon termed cellular interference that involves mosaicism for EPHRIN-B1 function. Although the mechanistic basis for cellular interference in CFNS has been hypothesized to involve Eph/ephrin-mediated cell segregation, no direct evidence for this has been demonstrated. Here, by generating hiPSCs from CFNS patients, we demonstrate that mosaicism for EPHRIN-B1 expression induced by random X inactivation in heterozygous females results in robust cell segregation in human neuroepithelial cells, thus supplying experimental evidence that Eph/ephrin-mediated cell segregation is relevant to pathogenesis in human CFNS patients. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Development of an errorable car-following driver model

    NASA Astrophysics Data System (ADS)

    Yang, H.-H.; Peng, H.

    2010-06-01

    An errorable car-following driver model is presented in this paper. An errorable driver model is one that emulates a human driver's functions and can generate both nominal (error-free) and devious (with-error) behaviours. This model was developed for the evaluation and design of active safety systems. The car-following data used for developing and validating the model were obtained from a large-scale naturalistic driving database. The stochastic car-following behaviour was first analysed and modelled as a random process. Three error-inducing behaviours were then introduced. First, human perceptual limitation was studied and implemented. Distraction due to non-driving tasks was then identified based on the statistical analysis of the driving data. Finally, the time delay of human drivers was estimated through a recursive least-squares identification process. By including these three error-inducing behaviours, rear-end collisions with the lead vehicle could occur. The simulated crash rate was found to be similar to, but somewhat higher than, that reported in traffic statistics.
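
    A toy discrete-time car-following loop with the three error-inducing ingredients named above (perception noise, occasional distraction, and reaction delay) might look as follows. It is illustrative only, not the model identified from the naturalistic driving data; all gains, noise levels, and probabilities are assumed values.

    ```python
    # Toy car-following loop with perception noise, occasional distraction,
    # and reaction delay; illustrative only, not the identified model.
    import random

    def simulate(steps=600, dt=0.1, delay_steps=8, seed=0):
        rng = random.Random(seed)
        lead_v, follow_v = 15.0, 15.0            # speeds in m/s
        gap = 30.0                               # bumper-to-bumper gap in m
        perceived = [gap] * delay_steps          # buffer implementing reaction delay
        for t in range(steps):
            lead_v = 15.0 - 10.0 * (t > 300)     # lead vehicle brakes mid-run
            distracted = rng.random() < 0.05     # occasional non-driving task
            # perceptual limitation: the gap is observed with additive noise;
            # a distracted driver simply reuses the last perceived value
            perceived.append(perceived[-1] if distracted else gap + rng.gauss(0.0, 1.0))
            delayed_gap = perceived.pop(0)       # value perceived delay_steps ago
            accel = 0.3 * (delayed_gap - 25.0)   # proportional control toward a 25 m gap
            follow_v = max(0.0, follow_v + accel * dt)
            gap += (lead_v - follow_v) * dt
            if gap <= 0.0:
                return t * dt, gap               # rear-end collision
        return None, gap

    if __name__ == "__main__":
        crash_time, final_gap = simulate()
        if crash_time is None:
            print(f"no collision; final gap {final_gap:.1f} m")
        else:
            print(f"rear-end collision at t = {crash_time:.1f} s")
    ```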

  8. Wilhelm Weinberg’s Early Contribution to Segregation Analysis

    PubMed Central

    Stark, Alan; Seneta, Eugene

    2013-01-01

    Wilhelm Weinberg (1862–1937) is a largely forgotten pioneer of human and medical genetics. His name is linked with that of the English mathematician G. H. Hardy in the Hardy–Weinberg law, pervasive in textbooks on population genetics since it expresses stability over generations of zygote frequencies AA, Aa, aa under random mating. One of Weinberg’s signal contributions, in an article whose centenary we celebrate, was to verify that Mendel’s segregation law still held in the setting of human heredity, contrary to the then-prevailing view of William Bateson (1861–1926), the leading Mendelian geneticist of the time. Specifically, Weinberg verified that the proportion of recessive offspring genotypes aa in human parental crossings Aa × Aa (that is, the segregation ratio for such a setting) was indeed p = 1/4. We focus in a nontechnical way on his procedure, called the simple sib method, and on the heated controversy with Felix Bernstein (1878–1956) in the 1920s and 1930s over work stimulated by Weinberg’s article. PMID:24018765
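
    For reference, the segregation ratio Weinberg verified follows from independent transmission of alleles in an Aa × Aa cross; the short derivation below restates this standard result.

    ```latex
    % Segregation ratio for an Aa x Aa cross under Mendelian inheritance:
    % each parent transmits allele a with probability 1/2, independently, so
    P(\text{offspring genotype } aa) = \tfrac{1}{2} \times \tfrac{1}{2} = \tfrac{1}{4},
    \qquad
    P(AA) : P(Aa) : P(aa) = \tfrac{1}{4} : \tfrac{1}{2} : \tfrac{1}{4}.
    ```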

  9. Single-electron random-number generator (RNG) for highly secure ubiquitous computing applications

    NASA Astrophysics Data System (ADS)

    Uchida, Ken; Tanamoto, Tetsufumi; Fujita, Shinobu

    2007-11-01

    Since the security of all modern cryptographic techniques relies on unpredictable and irreproducible digital keys generated by random-number generators (RNGs), the realization of high-quality RNG is essential for secure communications. In this report, a new RNG, which utilizes single-electron phenomena, is proposed. A room-temperature operating silicon single-electron transistor (SET) having nearby an electron pocket is used as a high-quality, ultra-small RNG. In the proposed RNG, stochastic single-electron capture/emission processes to/from the electron pocket are detected with high sensitivity by the SET, and result in giant random telegraphic signals (GRTS) on the SET current. It is experimentally demonstrated that the single-electron RNG generates extremely high-quality random digital sequences at room temperature, in spite of its simple configuration. Because of its small-size and low-power properties, the single-electron RNG is promising as a key nanoelectronic device for future ubiquitous computing systems with highly secure mobile communication capabilities.

  10. Development of the piggyBac transposable system for Plasmodium berghei and its application for random mutagenesis in malaria parasites

    PubMed Central

    2011-01-01

    Background The genome of a number of species of malaria parasites (Plasmodium spp.) has been sequenced in the hope of identifying new drug and vaccine targets. However, almost one-half of predicted Plasmodium genes are annotated as hypothetical and are difficult to analyse in bulk due to the inefficiency of current reverse genetic methodologies for Plasmodium. Recently, it has been shown that the transposase piggyBac integrates at random into the genome of the human malaria parasite P. falciparum offering the possibility to develop forward genetic screens to analyse Plasmodium gene function. This study reports the development and application of the piggyBac transposition system for the rodent malaria parasite P. berghei and the evaluation of its potential as a tool in forward genetic studies. P. berghei is the most frequently used malaria parasite model in gene function analysis since phenotype screens throughout the complete Plasmodium life cycle are possible both in vitro and in vivo. Results We demonstrate that piggyBac based gene inactivation and promoter-trapping is both easier and more efficient in P. berghei than in the human malaria parasite, P. falciparum. Random piggyBac-mediated insertion into genes was achieved after parasites were transfected with the piggyBac donor plasmid either when transposase was expressed either from a helper plasmid or a stably integrated gene in the genome. Characterization of more than 120 insertion sites demonstrated that more than 70 most likely affect gene expression classifying their protein products as non-essential for asexual blood stage development. The non-essential nature of two of these genes was confirmed by targeted gene deletion one of which encodes P41, an ortholog of a human malaria vaccine candidate. Importantly for future development of whole genome phenotypic screens the remobilization of the piggyBac element in parasites that stably express transposase was demonstrated. Conclusion These data demonstrate that piggyBac behaved as an efficient and random transposon in P. berghei. Remobilization of piggyBac element shows that with further development the piggyBac system can be an effective tool to generate random genome-wide mutation parasite libraries, for use in large-scale phenotype screens in vitro and in vivo. PMID:21418605

  11. Depletion of polycistronic transcripts using short interfering RNAs: cDNA synthesis method affects levels of non-targeted genes determined by quantitative PCR.

    PubMed

    Hanning, Jennifer E; Groves, Ian J; Pett, Mark R; Coleman, Nicholas

    2013-05-21

    Short interfering RNAs (siRNAs) are often used to deplete viral polycistronic transcripts, such as those encoded by human papillomavirus (HPV). There are conflicting data in the literature concerning how siRNAs targeting one HPV gene can affect levels of other genes in the polycistronic transcripts. We hypothesised that the conflict might be partly explained by the method of cDNA synthesis used prior to transcript quantification. We treated HPV16-positive cervical keratinocytes with siRNAs targeting the HPV16 E7 gene and used quantitative PCR to compare transcript levels of E7 with those of E6 and E2, viral genes located upstream and downstream of the target site respectively. We compared our findings from cDNA generated using oligo-dT primers alone with those from cDNA generated using a combination of random hexamer and oligo-dT primers. Our data show that when polycistronic transcripts are targeted by siRNAs, there is a period when untranslatable cleaved mRNA upstream of the siRNA binding site remains detectable by PCR, if cDNA is generated using random hexamer primers. Such false indications of mRNA abundance are avoided using oligo-dT primers. The period corresponds to the time taken for siRNA activity and degradation of the cleaved transcripts. Genes downstream of the siRNA binding site are detectable during this interval, regardless of how the cDNA is generated. These data emphasise the importance of the cDNA synthesis method used when measuring transcript abundance following siRNA depletion of polycistronic transcripts. They provide a partial explanation for erroneous reports suggesting that siRNAs targeting HPV E7 can have gene-specific effects.

  12. Depletion of polycistronic transcripts using short interfering RNAs: cDNA synthesis method affects levels of non-targeted genes determined by quantitative PCR

    PubMed Central

    2013-01-01

    Background Short interfering RNAs (siRNAs) are often used to deplete viral polycistronic transcripts, such as those encoded by human papillomavirus (HPV). There are conflicting data in the literature concerning how siRNAs targeting one HPV gene can affect levels of other genes in the polycistronic transcripts. We hypothesised that the conflict might be partly explained by the method of cDNA synthesis used prior to transcript quantification. Findings We treated HPV16-positive cervical keratinocytes with siRNAs targeting the HPV16 E7 gene and used quantitative PCR to compare transcript levels of E7 with those of E6 and E2, viral genes located upstream and downstream of the target site respectively. We compared our findings from cDNA generated using oligo-dT primers alone with those from cDNA generated using a combination of random hexamer and oligo-dT primers. Our data show that when polycistronic transcripts are targeted by siRNAs, there is a period when untranslatable cleaved mRNA upstream of the siRNA binding site remains detectable by PCR, if cDNA is generated using random hexamer primers. Such false indications of mRNA abundance are avoided using oligo-dT primers. The period corresponds to the time taken for siRNA activity and degradation of the cleaved transcripts. Genes downstream of the siRNA binding site are detectable during this interval, regardless of how the cDNA is generated. Conclusions These data emphasise the importance of the cDNA synthesis method used when measuring transcript abundance following siRNA depletion of polycistronic transcripts. They provide a partial explanation for erroneous reports suggesting that siRNAs targeting HPV E7 can have gene-specific effects. PMID:23693071

  13. Design of Peptide Immunotherapies for MHC Class-II-Associated Autoimmune Disorders

    PubMed Central

    2013-01-01

    Autoimmune disorders, that occur when autoreactive immune cells are induced to activate their responses against self-tissues, affect one percent of the world population and represent one of the top 10 leading causes of death. The major histocompatibility complex (MHC) is a principal susceptibility locus for many human autoimmune diseases, in which self-tissue antigens providing targets for pathogenic lymphocytes are bound to HLA molecules encoded by disease-associated alleles. In spite of the attempts to design strategies for inhibition of antigen presentation targeting the MHC-peptide/TCR complex via generation of blocking antibodies, altered peptide ligands (APL), or inhibitors of costimulatory molecules, potent therapies with minimal side effects have yet to be developed. Copaxone (glatiramer acetate, GA) is a random synthetic amino acid copolymer that reduces the relapse rate by about 30% in relapsing-remitting multiple sclerosis (MS) patients. Based on the elucidated binding motifs of Copaxone and of the anchor residues of the immunogenic myelin basic protein (MBP) peptide to HLA-DR molecules, novel copolymers have been designed and proved to be more effective in suppressing MS-like disease in mice. In this report, we describe the rationale for design of second-generation synthetic random copolymers as candidate drugs for a number of MHC class-II-associated autoimmune disorders. PMID:24324511

  14. Viral metagenomic analysis of feces of wild small carnivores

    PubMed Central

    2014-01-01

    Background Recent studies have clearly demonstrated the enormous virus diversity that exists among wild animals. This exemplifies the required expansion of our knowledge of the virus diversity present in wildlife, as well as the potential transmission of these viruses to domestic animals or humans. Methods In the present study we evaluated the viral diversity of fecal samples (n = 42) collected from 10 different species of wild small carnivores inhabiting the northern part of Spain using random PCR in combination with next-generation sequencing. Samples were collected from American mink (Neovison vison), European mink (Mustela lutreola), European polecat (Mustela putorius), European pine marten (Martes martes), stone marten (Martes foina), Eurasian otter (Lutra lutra) and Eurasian badger (Meles meles) of the family of Mustelidae; common genet (Genetta genetta) of the family of Viverridae; red fox (Vulpes vulpes) of the family of Canidae and European wild cat (Felis silvestris) of the family of Felidae. Results A number of sequences of possible novel viruses or virus variants were detected, including a theilovirus, phleboviruses, an amdovirus, a kobuvirus and picobirnaviruses. Conclusions Using random PCR in combination with next generation sequencing, sequences of various novel viruses or virus variants were detected in fecal samples collected from Spanish carnivores. Detected novel viruses highlight the viral diversity that is present in fecal material of wild carnivores. PMID:24886057

  15. Weighted social networks for a large scale artificial society

    NASA Astrophysics Data System (ADS)

    Fan, Zong Chen; Duan, Wei; Zhang, Peng; Qiu, Xiao Gang

    2016-12-01

    The method of artificial society has provided a powerful way to study and explain how individual behaviors at the micro level give rise to the emergence of global social phenomena. It also creates the need for an appropriate representation of social structure, which usually has a significant influence on human behaviors. It has been widely acknowledged that social networks are the main paradigm for describing social structure and reflecting social relationships within a population. To generate social networks for a population of interest, considering physical distance and social distance among people, we propose a generation model of social networks for a large-scale artificial society based on human choice behavior theory under the principle of random utility maximization. As a premise, we first build an artificial society by constructing a synthetic population with a series of attributes in line with the statistical (census) data for Beijing. Then the generation model is applied to assign social relationships to each individual in the synthetic population. Compared with previous empirical findings, the results show that our model can reproduce the general characteristics of social networks, such as a high clustering coefficient, significant community structure and the small-world property. Our model can also be extended to serve as the initial input of a larger social micro-simulation, which will facilitate research into, and prediction of, social phenomena such as epidemic transmission and rumor spreading.
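
    As a toy illustration of the random-utility idea described above (not the authors' model), the sketch below assigns ties by giving each candidate contact a utility that decreases with physical distance plus Gumbel noise, and keeping the highest-utility candidates; all parameter values and the synthetic coordinates are made up.

      # Toy sketch (not the authors' model): assign ties by random utility
      # maximization, where utility falls with physical distance and includes
      # Gumbel noise, so tie choice follows a logit-style choice rule.
      import numpy as np

      rng = np.random.default_rng(0)
      n_people, k_ties, beta = 200, 5, 2.0
      coords = rng.uniform(0, 10, size=(n_people, 2))      # synthetic home locations

      edges = set()
      for i in range(n_people):
          dist = np.linalg.norm(coords - coords[i], axis=1)
          utility = -beta * dist + rng.gumbel(size=n_people)   # random utility
          utility[i] = -np.inf                                  # no self-ties
          for j in np.argsort(utility)[-k_ties:]:               # keep k best candidates
              edges.add((min(i, j), max(i, j)))

      degrees = np.zeros(n_people, dtype=int)
      for i, j in edges:
          degrees[i] += 1
          degrees[j] += 1
      print(len(edges), "ties, mean degree", round(degrees.mean(), 1))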

  16. Random number generation in bilingual Balinese and German students: preliminary findings from an exploratory cross-cultural study.

    PubMed

    Strenge, Hans; Lesmana, Cokorda Bagus Jaya; Suryani, Luh Ketut

    2009-08-01

    Verbal random number generation is a procedurally simple task to assess executive function and appears ideally suited for use in diverse settings in cross-cultural research. The objective of this study was to examine ethnic group differences between young adults in Bali (Indonesia) and Kiel (Germany): 50 bilingual healthy students, 30 Balinese and 20 Germans, attempted to generate a random sequence of the digits 1 to 9. Balinese participants performed the randomization in Balinese (native language L1) and Indonesian (first foreign language L2), and German subjects in German (L1) and English (L2). 10 of 30 Balinese (33%), but no Germans, were unable to inhibit habitual counting in more than half of the responses. The Balinese produced significantly more nonrandom responses than the Germans, with higher rates of counting and significantly less occurrence of the digits 2 and 3 in L1 compared with L2. Repetition and cycling behavior did not differ between the four languages. The findings highlight the importance of taking into account culture-bound psychosocial factors for Balinese individuals when administering and interpreting a random number generation test.
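
    The study above scores generated sequences for habitual counting and repetition. A minimal sketch of one plausible scoring of a digit sequence follows; the specific measures (fraction of adjacent pairs differing by one, fraction of immediate repeats) are an illustrative assumption, not the authors' published scoring procedure.

      # Minimal sketch of one way to flag habitual counting in a digit sequence:
      # the fraction of adjacent pairs that differ by +/-1 (ascending or descending
      # runs) and the fraction of immediate repetitions. Scoring details are an
      # assumption, not the study's actual analysis.
      import random

      def counting_and_repeat_rates(seq):
          pairs = list(zip(seq, seq[1:]))
          counting = sum(abs(b - a) == 1 for a, b in pairs) / len(pairs)
          repeats = sum(a == b for a, b in pairs) / len(pairs)
          return counting, repeats

      human_like = [1, 2, 3, 4, 5, 9, 8, 7, 2, 3, 4, 5, 6, 7, 8]   # heavy counting
      machine = [random.randint(1, 9) for _ in range(100)]

      print(counting_and_repeat_rates(human_like))   # high counting rate
      print(counting_and_repeat_rates(machine))      # near-chance rates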

  17. Random number generators tested on quantum Monte Carlo simulations.

    PubMed

    Hongo, Kenta; Maezono, Ryo; Miura, Kenichi

    2010-08-01

    We have tested and compared several (pseudo) random number generators (RNGs) applied to a practical application, ground state energy calculations of molecules using variational and diffusion Monte Carlo methods. A new multiple recursive generator with 8th-order recursion (MRG8) and the Mersenne twister generator (MT19937) are tested and compared with the RANLUX generator with five luxury levels (RANLUX-[0-4]). Both MRG8 and MT19937 are found to give the same total energy as that evaluated with RANLUX-4 (highest luxury level) within the statistical error bars, at a lower computational cost for generating the sequence. We also tested the notorious implementation of the linear congruential generator (LCG), RANDU, for comparison. (c) 2010 Wiley Periodicals, Inc.
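
    For readers unfamiliar with RANDU, the sketch below reproduces its recursion (x_{k+1} = 65539 x_k mod 2^31) and runs a toy Monte Carlo estimate of pi with it and with NumPy's default generator. This only illustrates the kind of generator comparison described above, not the paper's variational and diffusion Monte Carlo tests.

      # Sketch: compare the flawed RANDU linear congruential generator with a
      # modern generator on a toy Monte Carlo estimate of pi. Illustrative only;
      # the paper's tests use variational/diffusion Monte Carlo, not this.
      import numpy as np

      def randu(seed, n):
          """RANDU: x_{k+1} = 65539 * x_k mod 2^31, returned as floats in (0, 1)."""
          out, x = np.empty(n), seed
          for i in range(n):
              x = (65539 * x) % 2**31
              out[i] = x / 2**31
          return out

      def mc_pi(u):
          x, y = u[0::2], u[1::2]
          return 4.0 * np.mean(x * x + y * y < 1.0)

      n = 200_000
      print("RANDU :", mc_pi(randu(1, n)))                         # odd seed required
      print("PCG64 :", mc_pi(np.random.default_rng(1).random(n)))  # NumPy default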

  18. An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution

    NASA Technical Reports Server (NTRS)

    Campbell, C. W.

    1983-01-01

    An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired values of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact, and in practice its accuracy is limited only by the quality of the uniform distribution random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
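
    The record does not spell out the algorithm here, but a standard exact construction for correlated normal pairs (the conditional/Cholesky form sketched below) shows how any such method reduces to a transform of two independent standard normals. Treat it as an illustration rather than the report's specific routine.

      # Sketch of a standard exact construction for correlated normal pairs
      # (conditional / Cholesky form); the NASA report's specific algorithm may
      # differ, but any such method is a transform of independent normals.
      import numpy as np

      def bivariate_normal_pairs(n, mu1, mu2, sd1, sd2, rho, rng=None):
          rng = rng or np.random.default_rng()
          z1 = rng.standard_normal(n)
          z2 = rng.standard_normal(n)
          x = mu1 + sd1 * z1
          y = mu2 + sd2 * (rho * z1 + np.sqrt(1.0 - rho**2) * z2)
          return x, y

      x, y = bivariate_normal_pairs(100_000, 1.0, -2.0, 3.0, 0.5, rho=0.7,
                                    rng=np.random.default_rng(42))
      print(np.corrcoef(x, y)[0, 1])   # should be close to the requested 0.7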

  19. Fidelity under isospectral perturbations: a random matrix study

    NASA Astrophysics Data System (ADS)

    Leyvraz, F.; García, A.; Kohler, H.; Seligman, T. H.

    2013-07-01

    The set of Hamiltonians generated by all unitary transformations from a single Hamiltonian is the largest set of isospectral Hamiltonians we can form. Taking advantage of the fact that the unitary group can be generated from Hermitian matrices we can take the ones generated by the Gaussian unitary ensemble with a small parameter as small perturbations. Similarly, the transformations generated by Hermitian antisymmetric matrices from orthogonal matrices form isospectral transformations among symmetric matrices. Based on this concept we can obtain the fidelity decay of a system that decays under a random isospectral perturbation with well-defined properties regarding time-reversal invariance. If we choose the Hamiltonian itself also from a classical random matrix ensemble, then we obtain solutions in terms of form factors in the limit of large matrices.

  20. Investigation of estimators of probability density functions

    NASA Technical Reports Server (NTRS)

    Speed, F. M.

    1972-01-01

    Four research projects are summarized which include: (1) the generation of random numbers on the IBM 360/44, (2) statistical tests used to check out random number generators, (3) Specht density estimators, and (4) use of estimators of probability density functions in analyzing large amounts of data.

  1. DNA based random key generation and management for OTP encryption.

    PubMed

    Zhang, Yunpeng; Liu, Xin; Sun, Manhui

    2017-09-01

    One-time pad (OTP) is a principle of key generation applied to the stream ciphering method which offers total privacy. The OTP encryption scheme has been proved unbreakable in theory, but is difficult to realize in practical applications. Because OTP encryption specifically requires an absolutely random key, its development has suffered from dense constraints. DNA cryptography is a new and promising technology in the field of information security. The storage capability of DNA chromosomes can be used to build one-time-pad structures, with pseudo-random number generation and indexing used to encrypt plaintext messages. In this paper, we present a feasible solution to the OTP symmetric key generation and transmission problem with DNA at the molecular level. Through recombinant DNA technology, using restriction enzymes known only to the sender and receiver to combine the secure key represented by a DNA sequence with the T vector, we generate the DNA bio-hiding secure key and then place the recombinant plasmid in implanted bacteria for secure key transmission. The designed bio experiments and simulation results show that the security of the key transmission is further improved and the environmental requirements of key transmission are reduced. Analysis has demonstrated that the proposed DNA-based random key generation and management solutions are marked by high security and usability. Published by Elsevier B.V.
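
    The DNA encoding, recombination, and plasmid transport steps are wet-lab procedures, but the digital principle the scheme relies on is the one-time pad itself, sketched below with an ordinary software-generated random key; the DNA representation and transport of the key are not modeled.

      # Sketch of the one-time-pad principle the scheme builds on: a random key
      # as long as the message, used once, XORed for both encryption and
      # decryption. The DNA-based key representation is not modeled here.
      import secrets

      def otp_xor(data: bytes, key: bytes) -> bytes:
          assert len(key) == len(data), "OTP key must match message length"
          return bytes(d ^ k for d, k in zip(data, key))

      message = b"polycistronic"                  # any plaintext
      key = secrets.token_bytes(len(message))     # one-time random key
      ciphertext = otp_xor(message, key)
      assert otp_xor(ciphertext, key) == message  # XOR with the same key decrypts
      print(ciphertext.hex())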

  2. Spectral turning bands for efficient Gaussian random fields generation on GPUs and accelerators

    NASA Astrophysics Data System (ADS)

    Hunger, L.; Cosenza, B.; Kimeswenger, S.; Fahringer, T.

    2015-11-01

    A random field (RF) is a set of correlated random variables associated with different spatial locations. RF generation algorithms are of crucial importance for many scientific areas, such as astrophysics, geostatistics, computer graphics, and many others. Current approaches commonly make use of 3D fast Fourier transform (FFT), which does not scale well for RF bigger than the available memory; they are also limited to regular rectilinear meshes. We introduce random field generation with the turning band method (RAFT), an RF generation algorithm based on the turning band method that is optimized for massively parallel hardware such as GPUs and accelerators. Our algorithm replaces the 3D FFT with a lower-order, one-dimensional FFT followed by a projection step and is further optimized with loop unrolling and blocking. RAFT can easily generate RF on non-regular (non-uniform) meshes and efficiently produce fields with mesh sizes bigger than the available device memory by using a streaming, out-of-core approach. Our algorithm generates RF with the correct statistical behavior and is tested on a variety of modern hardware, such as NVIDIA Tesla, AMD FirePro and Intel Phi. RAFT is faster than the traditional methods on regular meshes and has been successfully applied to two real case scenarios: planetary nebulae and cosmological simulations.
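
    RAFT itself implements the turning band method on GPUs; as a much smaller illustration of why a one-dimensional FFT can serve as the core building block, the sketch below generates a 1D stationary Gaussian random field by circulant embedding. The Gaussian covariance model and all parameters are illustrative assumptions, not part of RAFT.

      # Sketch of a 1D spectral building block: a stationary Gaussian random
      # field generated by FFT (circulant embedding). RAFT's turning-band
      # projection onto 3D meshes and its GPU optimizations are not shown.
      import numpy as np

      def gaussian_field_1d(n, dx, sigma, corr_len, rng):
          # Covariance evaluated on circular lags, so the covariance matrix is
          # circulant and is diagonalized by the FFT.
          lag = dx * np.minimum(np.arange(n), n - np.arange(n))
          cov = sigma**2 * np.exp(-(lag / corr_len) ** 2)
          lam = np.clip(np.fft.fft(cov).real, 0.0, None)    # circulant eigenvalues
          eps = rng.standard_normal(n) + 1j * rng.standard_normal(n)
          field = np.sqrt(n) * np.fft.ifft(np.sqrt(lam) * eps)
          return field.real                                 # real field with covariance cov

      rng = np.random.default_rng(7)
      f = gaussian_field_1d(4096, dx=1.0, sigma=2.0, corr_len=20.0, rng=rng)
      print(round(f.std(), 2))   # roughly sigma = 2.0 for corr_len << n * dx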

  3. A high-speed on-chip pseudo-random binary sequence generator for multi-tone phase calibration

    NASA Astrophysics Data System (ADS)

    Gommé, Liesbeth; Vandersteen, Gerd; Rolain, Yves

    2011-07-01

    An on-chip reference generator is conceived by adopting the technique of decimating a pseudo-random binary sequence (PRBS) signal into parallel sequences. This is of great benefit when high-speed generation of PRBS and PRBS-derived signals is the objective. The design is implemented in standard CMOS logic available in commercial libraries, which provides the logic functions for the generator. The design allows the user to select the periodicity of the PRBS and the PRBS-derived signals. Characterization of the on-chip generator benchmarks its performance and reveals promising specifications.

  4. Scope of Various Random Number Generators in ant System Approach for TSP

    NASA Technical Reports Server (NTRS)

    Sen, S. K.; Shaykhian, Gholam Ali

    2007-01-01

    Several quasi- and pseudo-random number generators are tested on a heuristic based on an ant system approach to the traveling salesman problem. The experiment explores whether any particular generator is most desirable. Such an experiment on large samples has the potential to rank the performance of the generators for the foregoing heuristic. It mainly seeks an answer to the controversial question of which generator is best in terms of quality of the result (accuracy) as well as the cost of producing the result (time/computational complexity) in a probabilistic/statistical sense.

  5. Participation in international human rights NGOs: The effect of democracy and state capacity.

    PubMed

    Zhou, Min

    2012-09-01

    This study examines the effect of the state on participation in human rights international non-governmental organizations (INGOs) from 1966 through 2006, using random effects negative binomial models. Civic participation in human rights INGOs is not responsive to human rights abuses of the state, but is affected by the institutional environment provided by the state. Two intertwined dimensions within the state, democracy and state capacity, are found to be important in explaining cross-national variation in this participation. Strong state capacity magnifies the effect of democracy. A strong democratic state generates the most favorable condition. A strong but authoritarian state, however, is the most unfavorable, because it has both the motive and the capacity to restrain its citizens' global civic engagement. In contrast, an authoritarian but weak state lacks adequate capacity to intervene, and thus tolerates more participation than its strong counterpart. Over time, differential participation across different types of states has not diminished. This study reveals the role of the state in mediating between local citizens and global civil society, and develops a state-centered explanation for unequal participation in human rights INGOs across countries. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. Production and perception rules underlying visual patterns: effects of symmetry and hierarchy.

    PubMed

    Westphal-Fitch, Gesche; Huber, Ludwig; Gómez, Juan Carlos; Fitch, W Tecumseh

    2012-07-19

    Formal language theory has been extended to two-dimensional patterns, but little is known about two-dimensional pattern perception. We first examined spontaneous two-dimensional visual pattern production by humans, gathered using a novel touch screen approach. Both spontaneous creative production and subsequent aesthetic ratings show that humans prefer ordered, symmetrical patterns over random patterns. We then further explored pattern-parsing abilities in different human groups, and compared them with pigeons. We generated visual plane patterns based on rules varying in complexity. All human groups tested, including children and individuals diagnosed with autism spectrum disorder (ASD), were able to detect violations of all production rules tested. Our ASD participants detected pattern violations with the same speed and accuracy as matched controls. Children's ability to detect violations of a relatively complex rotational rule correlated with age, whereas their ability to detect violations of a simple translational rule did not. By contrast, even with extensive training, pigeons were unable to detect orientation-based structural violations, suggesting that, unlike humans, they did not learn the underlying structural rules. Visual two-dimensional patterns offer a promising new formally-grounded way to investigate pattern production and perception in general, widely applicable across species and age groups.

  7. Production and perception rules underlying visual patterns: effects of symmetry and hierarchy

    PubMed Central

    Westphal-Fitch, Gesche; Huber, Ludwig; Gómez, Juan Carlos; Fitch, W. Tecumseh

    2012-01-01

    Formal language theory has been extended to two-dimensional patterns, but little is known about two-dimensional pattern perception. We first examined spontaneous two-dimensional visual pattern production by humans, gathered using a novel touch screen approach. Both spontaneous creative production and subsequent aesthetic ratings show that humans prefer ordered, symmetrical patterns over random patterns. We then further explored pattern-parsing abilities in different human groups, and compared them with pigeons. We generated visual plane patterns based on rules varying in complexity. All human groups tested, including children and individuals diagnosed with autism spectrum disorder (ASD), were able to detect violations of all production rules tested. Our ASD participants detected pattern violations with the same speed and accuracy as matched controls. Children's ability to detect violations of a relatively complex rotational rule correlated with age, whereas their ability to detect violations of a simple translational rule did not. By contrast, even with extensive training, pigeons were unable to detect orientation-based structural violations, suggesting that, unlike humans, they did not learn the underlying structural rules. Visual two-dimensional patterns offer a promising new formally-grounded way to investigate pattern production and perception in general, widely applicable across species and age groups. PMID:22688636

  8. Characteristics of allelic gene expression in human brain cells from single-cell RNA-seq data analysis.

    PubMed

    Zhao, Dejian; Lin, Mingyan; Pedrosa, Erika; Lachman, Herbert M; Zheng, Deyou

    2017-11-10

    Monoallelic expression of autosomal genes has been implicated in human psychiatric disorders. However, there is a paucity of allelic expression studies in human brain cells at the single-cell and genome-wide levels. In this report, we reanalyzed a previously published single-cell RNA-seq dataset from several postmortem human brains and observed pervasive monoallelic expression in individual cells, largely in a random manner. Examining single nucleotide variants with a predicted functional disruption, we found that the "damaged" alleles were overall expressed in fewer brain cells than their counterparts, and at a lower level in cells where their expression was detected. We also identified many brain cell type-specific monoallelically expressed genes. Interestingly, many of these cell type-specific monoallelically expressed genes were enriched for functions important for those brain cell types. In addition, functional analysis showed that genes displaying monoallelic expression and correlated expression across neuronal cells from different individual brains were implicated in the regulation of synaptic function. Our findings suggest that monoallelic gene expression is prevalent in human brain cells, which may play a role in generating cellular identity and neuronal diversity and thus increasing the complexity and diversity of brain cell functions.

  9. Owls see in stereo much like humans do.

    PubMed

    van der Willigen, Robert F

    2011-06-10

    While 3D experiences through binocular disparity sensitivity have acquired special status in the understanding of human stereo vision, much remains to be learned about how binocularity is put to use in animals. The owl provides an exceptional model to study stereo vision as it displays one of the highest degrees of binocular specialization throughout the animal kingdom. In a series of six behavioral experiments, equivalent to hallmark human psychophysical studies, I compiled an extensive body of stereo performance data from two trained owls. Computer-generated, binocular random-dot patterns were used to ensure pure stereo performance measurements. In all cases, I found that owls perform much like humans do, viz.: (1) disparity alone can evoke figure-ground segmentation; (2) selective use of "relative" rather than "absolute" disparity; (3) hyperacute sensitivity; (4) disparity processing allows for the avoidance of monocular feature detection prior to object recognition; (5) large binocular disparities are not tolerated; (6) disparity guides the perceptual organization of 2D shape. The robustness and very nature of these binocular disparity-based perceptual phenomena bear out that owls, like humans, exploit the third dimension to facilitate early figure-ground segmentation of tangible objects.

  10. Block randomization versus complete randomization of human perception stimuli: is there a difference?

    NASA Astrophysics Data System (ADS)

    Moyer, Steve; Uhl, Elizabeth R.

    2015-05-01

    For more than 50 years, the U.S. Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) has been studying and modeling the human visual discrimination process as it pertains to military imaging systems. In order to develop sensor performance models, human observers are trained to expert levels in the identification of military vehicles. From 1998 until 2006, the experimental stimuli were block randomized, meaning that stimuli with similar difficulty levels (for example, in terms of distance from target, blur, noise, etc.) were presented together in blocks of approximately 24 images, but the order of images within the block was random. Starting in 2006, complete randomization came into vogue, meaning that difficulty could change from image to image. It was thought that this would provide a more statistically robust result. In this study we investigated the impact of the two types of randomization on performance in two groups of observers matched for skill to create equivalent groups. It is hypothesized that Soldiers in the completely randomized condition will have to shift their decision criterion more frequently than Soldiers in the block-randomized condition, and that this shifting will impede performance, so that Soldiers in the block-randomized group perform better.
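
    The two presentation schemes compared above can be illustrated in a few lines: block randomization shuffles images only within same-difficulty blocks, whereas complete randomization shuffles the entire stimulus set. The block size of 24 follows the text; the difficulty levels and image labels below are invented for the example.

      # Illustration of the two presentation schemes described above. Block size
      # of 24 follows the text; stimulus labels and difficulty levels are made up.
      import random

      rng = random.Random(0)
      blocks = {level: [f"{level}-img{i:02d}" for i in range(24)]
                for level in ("easy", "medium", "hard")}

      # Block randomization: random order within each block, blocks kept intact.
      block_randomized = []
      for imgs in blocks.values():
          imgs = imgs[:]
          rng.shuffle(imgs)
          block_randomized.extend(imgs)

      # Complete randomization: difficulty can change from image to image.
      completely_randomized = [img for imgs in blocks.values() for img in imgs]
      rng.shuffle(completely_randomized)

      print(block_randomized[:5])
      print(completely_randomized[:5])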

  11. A Statistical Method to Distinguish Functional Brain Networks

    PubMed Central

    Fujita, André; Vidal, Maciel C.; Takahashi, Daniel Y.

    2017-01-01

    One major problem in neuroscience is the comparison of functional brain networks of different populations, e.g., distinguishing the networks of controls and patients. Traditional algorithms are based on search for isomorphism between networks, assuming that they are deterministic. However, biological networks present randomness that cannot be well modeled by those algorithms. For instance, functional brain networks of distinct subjects of the same population can be different due to individual characteristics. Moreover, networks of subjects from different populations can be generated through the same stochastic process. Thus, a better hypothesis is that networks are generated by random processes. In this case, subjects from the same group are samples from the same random process, whereas subjects from different groups are generated by distinct processes. Using this idea, we developed a statistical test called ANOGVA to test whether two or more populations of graphs are generated by the same random graph model. Our simulations' results demonstrate that we can precisely control the rate of false positives and that the test is powerful to discriminate random graphs generated by different models and parameters. The method was also shown to be robust to unbalanced data. As an example, we applied ANOGVA to an fMRI dataset composed of controls and patients diagnosed with autism or Asperger. ANOGVA identified the cerebellar functional sub-network as statistically different between controls and autism (p < 0.001). PMID:28261045
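
    ANOGVA itself is not reproduced here, but the underlying idea of testing whether two populations of graphs come from the same random graph model can be illustrated with a bare-bones permutation test on a simple graph statistic; the edge-density statistic, the G(n, p) models and all parameters below are illustrative assumptions.

      # Not ANOGVA: a bare-bones permutation test on a simple graph statistic
      # (edge density) to illustrate testing whether two populations of random
      # graphs come from the same generating model.
      import numpy as np

      rng = np.random.default_rng(1)

      def gnp_density(n_nodes, p):
          upper = np.triu(rng.random((n_nodes, n_nodes)) < p, 1)
          return upper.sum() / (n_nodes * (n_nodes - 1) / 2)

      group_a = np.array([gnp_density(50, 0.10) for _ in range(20)])   # model A
      group_b = np.array([gnp_density(50, 0.12) for _ in range(20)])   # model B

      observed = abs(group_a.mean() - group_b.mean())
      pooled = np.concatenate([group_a, group_b])
      count = 0
      for _ in range(5000):
          rng.shuffle(pooled)
          count += abs(pooled[:20].mean() - pooled[20:].mean()) >= observed
      print("permutation p-value:", count / 5000)   # small when the models differ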

  12. A Statistical Method to Distinguish Functional Brain Networks.

    PubMed

    Fujita, André; Vidal, Maciel C; Takahashi, Daniel Y

    2017-01-01

    One major problem in neuroscience is the comparison of functional brain networks of different populations, e.g., distinguishing the networks of controls and patients. Traditional algorithms are based on search for isomorphism between networks, assuming that they are deterministic. However, biological networks present randomness that cannot be well modeled by those algorithms. For instance, functional brain networks of distinct subjects of the same population can be different due to individual characteristics. Moreover, networks of subjects from different populations can be generated through the same stochastic process. Thus, a better hypothesis is that networks are generated by random processes. In this case, subjects from the same group are samples from the same random process, whereas subjects from different groups are generated by distinct processes. Using this idea, we developed a statistical test called ANOGVA to test whether two or more populations of graphs are generated by the same random graph model. Our simulations' results demonstrate that we can precisely control the rate of false positives and that the test is powerful to discriminate random graphs generated by different models and parameters. The method was also shown to be robust to unbalanced data. As an example, we applied ANOGVA to an fMRI dataset composed of controls and patients diagnosed with autism or Asperger. ANOGVA identified the cerebellar functional sub-network as statistically different between controls and autism (p < 0.001).

  13. Note: The design of thin gap chamber simulation signal source based on field programmable gate array.

    PubMed

    Hu, Kun; Lu, Houbing; Wang, Xu; Li, Feng; Liang, Futian; Jin, Ge

    2015-01-01

    The Thin Gap Chamber (TGC) is an important part of the ATLAS detector at the LHC accelerator. Targeting the features of the TGC detector output signal, we have designed a simulation signal source. The core of the design is based on a field programmable gate array, randomly outputting 256-channel simulation signals. The signal is generated by a true random number generator; the source of randomness is the timing jitter in ring oscillators. The experimental results show that the random numbers have a uniform histogram and the whole system has high reliability.

  14. Note: The design of thin gap chamber simulation signal source based on field programmable gate array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Kun; Wang, Xu; Li, Feng

    The Thin Gap Chamber (TGC) is an important part of the ATLAS detector at the LHC accelerator. Targeting the features of the TGC detector output signal, we have designed a simulation signal source. The core of the design is based on a field programmable gate array, randomly outputting 256-channel simulation signals. The signal is generated by a true random number generator; the source of randomness is the timing jitter in ring oscillators. The experimental results show that the random numbers have a uniform histogram and the whole system has high reliability.

  15. A Prospective, Randomized, Double-blind, Split-face Clinical Trial Comparing the Efficacy of Two Topical Human Growth Factors for the Rejuvenation of the Aging Face

    PubMed Central

    Goldman, Mitchel P.

    2017-01-01

    Background: Cosmeceutical products represent an increasingly important therapeutic option for anti-aging and rejuvenation, either used alone or in combination with dermatologic surgical procedures. Among this group of products, topical growth factors have demonstrated efficacy in randomized, controlled clinical trials. However, comparisons between different products remain uncommon. Objective: The objective of this randomized, double-blind, split-face clinical trial was to compare two different topical growth factor formulations derived from either human fibroblasts or human adipose tissue derived mesenchymal stem cells. Methods: This was an institutional review board-approved, randomized, double-blind, split-face clinical trial involving 20 healthy subjects with moderate-to-severe facial wrinkling secondary to photodamage. One half of the face was randomized to receive topical human fibroblast growth factors and the other topical human mesenchymal stem cell growth factors. Treatment was continued for three months, and evaluations were performed in a double-blind fashion. Results: Both growth factor formulations achieved significant improvement in facial wrinkling. Blinded investigator and subject evaluations did not detect any significant differences between the two formulations in terms of efficacy, safety, or tolerability. Conclusion: Both human fibroblast growth factors and human mesenchymal stem cell growth factors are effective at facial rejuvenation. Topical growth factors represent a useful therapeutic modality. PMID:28670356

  16. A Prospective, Randomized, Double-blind, Split-face Clinical Trial Comparing the Efficacy of Two Topical Human Growth Factors for the Rejuvenation of the Aging Face.

    PubMed

    Wu, Douglas C; Goldman, Mitchel P

    2017-05-01

    Background: Cosmeceutical products represent an increasingly important therapeutic option for anti-aging and rejuvenation, either used alone or in combination with dermatologic surgical procedures. Among this group of products, topical growth factors have demonstrated efficacy in randomized, controlled clinical trials. However, comparisons between different products remain uncommon. Objective: The objective of this randomized, double-blind, split-face clinical trial was to compare two different topical growth factor formulations derived from either human fibroblasts or human adipose tissue derived mesenchymal stem cells. Methods: This was an institutional review board-approved, randomized, double-blind, split-face clinical trial involving 20 healthy subjects with moderate-to-severe facial wrinkling secondary to photodamage. One half of the face was randomized to receive topical human fibroblast growth factors and the other topical human mesenchymal stem cell growth factors. Treatment was continued for three months, and evaluations were performed in a double-blind fashion. Results: Both growth factor formulations achieved significant improvement in facial wrinkling. Blinded investigator and subject evaluations did not detect any significant differences between the two formulations in terms of efficacy, safety, or tolerability. Conclusion: Both human fibroblast growth factors and human mesenchymal stem cell growth factors are effective at facial rejuvenation. Topical growth factors represent a useful therapeutic modality.

  17. Random sampling of constrained phylogenies: conducting phylogenetic analyses when the phylogeny is partially known.

    PubMed

    Housworth, E A; Martins, E P

    2001-01-01

    Statistical randomization tests in evolutionary biology often require a set of random, computer-generated trees. For example, earlier studies have shown how large numbers of computer-generated trees can be used to conduct phylogenetic comparative analyses even when the phylogeny is uncertain or unknown. These methods were limited, however, in that (in the absence of molecular sequence or other data) they allowed users to assume that no phylogenetic information was available or that all possible trees were known. Intermediate situations where only a taxonomy or other limited phylogenetic information (e.g., polytomies) are available are technically more difficult. The current study describes a procedure for generating random samples of phylogenies while incorporating limited phylogenetic information (e.g., four taxa belong together in a subclade). The procedure can be used to conduct comparative analyses when the phylogeny is only partially resolved or can be used in other randomization tests in which large numbers of possible phylogenies are needed.

  18. Real World Testing Of A Piezoelectric Rotational Energy Harvester For Human Motion

    NASA Astrophysics Data System (ADS)

    Pillatsch, P.; Yeatman, E. M.; Holmes, A. S.

    2013-12-01

    Harvesting energy from human motion is challenging because the frequencies are generally low and random compared to industrial machinery that vibrates at much higher frequencies. One of the most promising and popular strategies to overcome this is frequency up-conversion. The transducing element is actuated at its optimal frequency of operation, higher than the source excitation frequency, through some kind of catch and release mechanism. This is beneficial for efficient power generation. Such devices have now been investigated for a few years and this paper takes a previously introduced piezoelectric rotational harvester, relying on beam plucking for the energy conversion, to the next step by testing the device during a half marathon race. The prototype and data acquisition system are described in detail and the experimental results presented. A comparison of the input excitation, based on an accelerometer readout, and the output voltage of the piezoelectric beam, recorded at the same time, confirm the successful implementation of the system. For a device functional volume of 1.85 cm3, a maximum power output of 7 μW was achieved when the system was worn on the upper arm. However, degradation of the piezoelectric material meant that the performance dropped rapidly from this initial level; this requires further research. Furthermore, the need for intermediate energy storage solutions is discussed, as human motion harvesters only generate power as long as the wearer is actually moving.

  19. Physical layer one-time-pad data encryption through synchronized semiconductor laser networks

    NASA Astrophysics Data System (ADS)

    Argyris, Apostolos; Pikasis, Evangelos; Syvridis, Dimitris

    2016-02-01

    Semiconductor lasers (SLs) have proven to be a key device in the generation of ultrafast true random bit streams. Their potential to emit chaotic signals under conditions with desirable statistics establishes them as a low-cost solution to cover various needs, from large-volume key generation to real-time encrypted communications. Usually, only undemanding post-processing is needed to convert the acquired analog time series into digital sequences that pass all established tests of randomness. A novel architecture that can generate and exploit these true random sequences is a fiber network in which the nodes are semiconductor lasers coupled and synchronized to a central hub laser. In this work we show experimentally that laser nodes in such a star network topology can synchronize with each other through complex broadband signals that are the seed of true random bit sequences (TRBS) generated at several Gb/s. The ability of each node to access, through the fiber-optic network, random bit streams generated in real time and synchronized with the rest of the nodes allows the implementation of a one-time-pad encryption protocol that mixes the synchronized true random bit sequence with real data at Gb/s rates. Forward-error correction methods are used to reduce the errors in the TRBS and the final error rate at the data decoding level. An appropriate selection of the sampling methodology and properties, as well as of the physical properties of the chaotic seed signal through which the network locks into synchronization, allows error-free performance.

  20. Security under Uncertainty: Adaptive Attackers Are More Challenging to Human Defenders than Random Attackers

    PubMed Central

    Moisan, Frédéric; Gonzalez, Cleotilde

    2017-01-01

    Game Theory is a common approach used to understand attacker and defender motives, strategies, and allocation of limited security resources. For example, many defense algorithms are based on game-theoretic solutions that conclude that randomization of defense actions assures unpredictability, creating difficulties for a human attacker. However, many game-theoretic solutions often rely on idealized assumptions of decision making that underplay the role of human cognition and information uncertainty. The consequence is that we know little about how effective these algorithms are against human players. Using a simplified security game, we study the type of attack strategy and the uncertainty about an attacker's strategy in a laboratory experiment where participants play the role of defenders against a simulated attacker. Our goal is to compare a human defender's behavior in three levels of uncertainty (Information Level: Certain, Risky, Uncertain) and three types of attacker's strategy (Attacker's strategy: Minimax, Random, Adaptive) in a between-subjects experimental design. Best defense performance is achieved when defenders play against a minimax and a random attack strategy compared to an adaptive strategy. Furthermore, when payoffs are certain, defenders are as efficient against random attack strategy as they are against an adaptive strategy, but when payoffs are uncertain, defenders have most difficulties defending against an adaptive attacker compared to a random attacker. We conclude that given conditions of uncertainty in many security problems, defense algorithms would be more efficient if they are adaptive to the attacker actions, taking advantage of the attacker's human inefficiencies. PMID:28690557

  1. An analysis of the effect of funding source in randomized clinical trials of second generation antipsychotics for the treatment of schizophrenia.

    PubMed

    Montgomery, John H; Byerly, Matthew; Carmody, Thomas; Li, Baitao; Miller, Daniel R; Varghese, Femina; Holland, Rhiannon

    2004-12-01

    The effect of funding source on the outcome of randomized controlled trials has been investigated in several medical disciplines; however, psychiatry has been largely excluded from such analyses. In this article, randomized controlled trials of second generation antipsychotics in schizophrenia are reviewed and analyzed with respect to funding source (industry vs. non-industry funding). A literature search was conducted for randomized, double-blind trials in which at least one of the tested treatments was a second generation antipsychotic. In each study, design quality and study outcome were assessed quantitatively according to rating scales. Mean quality and outcome scores were compared in the industry-funded studies and non-industry-funded studies. An analysis of the primary author's affiliation with industry was similarly performed. Results of industry-funded studies significantly favored second generation over first generation antipsychotics when compared to non-industry-funded studies. Non-industry-funded studies showed a trend toward higher quality than industry-funded studies; however, the difference between the two was not significant. Also, within the industry-funded studies, outcomes of trials involving first authors employed by industry sponsors demonstrated a trend toward second generation over first generation antipsychotics to a greater degree than did trials involving first authors employed outside the industry (p=0.05). While the retrospective design of the study limits the strength of the findings, the data suggest that industry bias may occur in randomized controlled trials in schizophrenia. There appear to be several sources through which bias may enter clinical research, including trial design, control of data analysis, and multiplicity/redundancy of trials.

  2. An investigation of the uniform random number generator

    NASA Technical Reports Server (NTRS)

    Temple, E. C.

    1982-01-01

    Most random number generators that are in use today are of the congruential form X(i+1) = (A*X(i) + C) mod M, where A, C, and M are nonnegative integers. If C = 0, the generator is called the multiplicative type, and those for which C ≠ 0 are called mixed congruential generators. It is easy to see that congruential generators will repeat a sequence of numbers after a maximum of M values have been generated. The number of values that a procedure generates before restarting the sequence is called the length or the period of the generator. Generally, it is desirable to make the period as long as possible. A detailed discussion of congruential generators is given. Also, several promising procedures that differ from the multiplicative and mixed procedures are discussed.
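
    A minimal sketch of the congruential recursion discussed above, together with a brute-force measurement of the period for a deliberately tiny modulus, is given below; the particular multiplier and increment are illustrative choices, not values recommended by the report.

      # Minimal sketch of the congruential recursion x_{i+1} = (a*x_i + c) mod m
      # and a brute-force period measurement for a deliberately tiny modulus.
      def lcg(a, c, m, seed):
          x = seed
          while True:
              x = (a * x + c) % m
              yield x

      def period(a, c, m, seed=1):
          seen = {}
          for i, x in enumerate(lcg(a, c, m, seed)):
              if x in seen:
                  return i - seen[x]
              seen[x] = i

      print(period(a=1103515245, c=12345, m=2**15))   # mixed generator, full period 2^15
      print(period(a=65539, c=0, m=2**15))            # multiplicative (C = 0), shorter period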

  3. Scaling up the evaluation of psychotherapy: evaluating motivational interviewing fidelity via statistical text classification

    PubMed Central

    2014-01-01

    Background Behavioral interventions such as psychotherapy are leading, evidence-based practices for a variety of problems (e.g., substance abuse), but the evaluation of provider fidelity to behavioral interventions is limited by the need for human judgment. The current study evaluated the accuracy of statistical text classification in replicating human-based judgments of provider fidelity in one specific psychotherapy—motivational interviewing (MI). Method Participants (n = 148) came from five previously conducted randomized trials and were either primary care patients at a safety-net hospital or university students. To be eligible for the original studies, participants met criteria for either problematic drug or alcohol use. All participants received a type of brief motivational interview, an evidence-based intervention for alcohol and substance use disorders. The Motivational Interviewing Skills Code is a standard measure of MI provider fidelity based on human ratings that was used to evaluate all therapy sessions. A text classification approach called a labeled topic model was used to learn associations between human-based fidelity ratings and MI session transcripts. It was then used to generate codes for new sessions. The primary comparison was the accuracy of model-based codes with human-based codes. Results Receiver operating characteristic (ROC) analyses of model-based codes showed reasonably strong sensitivity and specificity with those from human raters (range of area under ROC curve (AUC) scores: 0.62 – 0.81; average AUC: 0.72). Agreement with human raters was evaluated based on talk turns as well as code tallies for an entire session. Generated codes had higher reliability with human codes for session tallies and also varied strongly by individual code. Conclusion To scale up the evaluation of behavioral interventions, technological solutions will be required. The current study demonstrated preliminary, encouraging findings regarding the utility of statistical text classification in bridging this methodological gap. PMID:24758152

  4. Toward DNA-based Security Circuitry: First Step - Random Number Generation.

    PubMed

    Bogard, Christy M; Arazi, Benjamin; Rouchka, Eric C

    2008-08-10

    DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. Our team investigates the implications of DNA-based circuit design in serving security applications. As an initial step we develop a random number generation circuitry. A novel prototype schema employs solid-phase synthesis of oligonucleotides for random construction of DNA sequences. Temporary storage and retrieval is achieved through plasmid vectors.

  5. Simulations Using Random-Generated DNA and RNA Sequences

    ERIC Educational Resources Information Center

    Bryce, C. F. A.

    1977-01-01

    Using a very simple computer program written in BASIC, a very large number of random-generated DNA or RNA sequences are obtained. Students use these sequences to predict complementary sequences and translational products, evaluate base compositions, determine frequencies of particular triplet codons, and suggest possible secondary structures.…

  6. Generating "Random" Integers

    ERIC Educational Resources Information Center

    Griffiths, Martin

    2011-01-01

    One of the author's undergraduate students recently asked him whether it was possible to generate a random positive integer. After some thought, the author realised that there were plenty of interesting mathematical ideas inherent in her question. So much so in fact, that the author decided to organise a workshop, open both to undergraduates and…

  7. Some Aspects of the Investigation of Random Vibration Influence on Ride Comfort

    NASA Astrophysics Data System (ADS)

    DEMIĆ, M.; LUKIĆ, J.; MILIĆ, Ž.

    2002-05-01

    Contemporary vehicles must satisfy high ride comfort criteria. This paper attempts to develop criteria for ride comfort improvement. The highest loading levels have been found to be in the vertical direction and the lowest in the lateral direction in passenger cars and trucks. These results have formed the basis for further laboratory and field investigations. An investigation of the human body behaviour under random vibrations is reported in this paper. The research included two phases: biodynamic research and ride comfort investigation. A group of 30 subjects was tested. The influence of broadband random vibrations on the human body was examined through the seat-to-head transmissibility function (STHT). Initially, vertical and fore-and-aft vibrations were considered. Multi-directional vibration was also investigated. In the biodynamic research, subjects were exposed to 0.55, 1.75 and 2.25 m/s2 r.m.s. vibration levels in the 0.5-40 Hz frequency domain. The influence of sitting position on human body behaviour under two-axial vibrations was also examined. Data analysis showed that the human body behaviour under two-directional random vibrations could not be approximated by superposition of one-directional random vibrations. Non-linearity of the seated human body in the vertical and fore-and-aft directions was observed. Seat-backrest angle also influenced the STHT. In the second phase of experimental research, a new method for the assessment of the influence of narrowband random vibration on the human body was formulated and tested. It included determination of equivalent comfort curves in the vertical and fore-and-aft directions under one- and two-directional narrowband random vibrations. Equivalent comfort curves for durations of 2.5, 4 and 8 h were determined.
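
    One conventional way to estimate a seat-to-head transmissibility curve is the H1 estimator, the cross-spectrum of seat and head acceleration divided by the seat auto-spectrum. The sketch below applies it to synthetic signals, with a resonant filter standing in for the seated body; it illustrates the quantity only and is not the authors' measurement procedure.

      # H1 estimate of a seat-to-head transmissibility (STHT) curve on synthetic
      # signals: a resonant filter stands in for the seated human body.
      import numpy as np
      from scipy import signal

      fs = 200.0                                    # Hz, illustrative sampling rate
      rng = np.random.default_rng(3)
      n = int(120 * fs)                             # 2 minutes of synthetic data
      seat = rng.standard_normal(n)                 # broadband random seat acceleration
      b, a = signal.iirpeak(w0=5.0, Q=2.0, fs=fs)   # resonance near 5 Hz
      head = signal.lfilter(b, a, seat) + 0.1 * rng.standard_normal(n)

      f, p_ss = signal.welch(seat, fs=fs, nperseg=1024)       # seat auto-spectrum
      _, p_sh = signal.csd(seat, head, fs=fs, nperseg=1024)   # seat-head cross-spectrum
      stht = np.abs(p_sh) / p_ss                              # |H1| transmissibility
      print("STHT peak near", f[np.argmax(stht)], "Hz")       # close to the 5 Hz resonance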

  8. GASPRNG: GPU accelerated scalable parallel random number generator library

    NASA Astrophysics Data System (ADS)

    Gao, Shuang; Peterson, Gregory D.

    2013-04-01

    Graphics processors represent a promising technology for accelerating computational science applications. Many computational science applications require fast and scalable random number generation with good statistical properties, so they use the Scalable Parallel Random Number Generators library (SPRNG). We present the GPU Accelerated SPRNG library (GASPRNG) to accelerate SPRNG in GPU-based high performance computing systems. GASPRNG includes code for a host CPU and CUDA code for execution on NVIDIA graphics processing units (GPUs) along with a programming interface to support various usage models for pseudorandom numbers and computational science applications executing on the CPU, GPU, or both. This paper describes the implementation approach used to produce high performance and also describes how to use the programming interface. The programming interface allows a user to use GASPRNG the same way as SPRNG on traditional serial or parallel computers as well as to develop tightly coupled programs executing primarily on the GPU. We also describe how to install GASPRNG and use it. To help illustrate linking with GASPRNG, various demonstration codes are included for the different usage models. GASPRNG on a single GPU shows up to 280x speedup over SPRNG on a single CPU core and is able to scale for larger systems in the same manner as SPRNG. Because GASPRNG generates identical streams of pseudorandom numbers as SPRNG, users can be confident about the quality of GASPRNG for scalable computational science applications.
    Catalogue identifier: AEOI_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOI_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: UTK license
    No. of lines in distributed program, including test data, etc.: 167900
    No. of bytes in distributed program, including test data, etc.: 1422058
    Distribution format: tar.gz
    Programming language: C and CUDA
    Computer: Any PC or workstation with an NVIDIA GPU (tested on Fermi GTX480, Tesla C1060, Tesla M2070)
    Operating system: Linux with CUDA version 4.0 or later; should also run on MacOS, Windows, or UNIX
    Has the code been vectorized or parallelized?: Yes, parallelized using MPI directives
    RAM: 512 MB to 732 MB main memory on the host CPU (depending on the data type of the random numbers); 512 MB GPU global memory
    Classification: 4.13, 6.5
    Nature of problem: Many computational science applications consume large numbers of random numbers. For example, Monte Carlo simulations can consume limitless random numbers as long as computing resources are available. Moreover, parallel computational science applications require independent streams of random numbers to attain statistically significant results. The SPRNG library provides this capability, but at a significant computational cost. The GASPRNG library presented here accelerates the generators of independent streams of random numbers using graphical processing units (GPUs).
    Solution method: Multiple copies of random number generators in GPUs allow a computational science application to consume large numbers of random numbers from independent, parallel streams. GASPRNG is a random number generator library that allows a computational science application to employ multiple copies of random number generators to boost performance. Users can interface GASPRNG with software code executing on microprocessors and/or GPUs.
    Running time: The tests provided take a few minutes to run.

  9. Multiwavelength generation in a random distributed feedback fiber laser using an all fiber Lyot filter.

    PubMed

    Sugavanam, S; Yan, Z; Kamynin, V; Kurkov, A S; Zhang, L; Churkin, D V

    2014-02-10

    Multiwavelength lasing in the random distributed feedback fiber laser is demonstrated by employing an all fiber Lyot filter. Stable multiwavelength generation is obtained, with each line exhibiting sub-nanometer line-widths. A flat power distribution over multiple lines is obtained, which indicates that the power between lines is redistributed in nonlinear mixing processes. The multiwavelength generation is observed both in first and second Stokes waves.

  10. Small-scale seismic inversion using surface waves extracted from noise cross correlation.

    PubMed

    Gouédard, Pierre; Roux, Philippe; Campillo, Michel

    2008-03-01

    Green's functions can be retrieved between receivers from the correlation of ambient seismic noise or with an appropriate set of randomly distributed sources. This principle is demonstrated in small-scale geophysics using noise sources generated by human steps during a 10-min walk along the line of a 14-m-long accelerometer array. The time-domain correlation of the records yields two surface wave modes extracted from the Green's function between each pair of accelerometers. A frequency-wavenumber Fourier analysis yields each mode's contribution and its dispersion curve. These dispersion curves are then inverted to provide the one-dimensional shear velocity of the near surface.
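
    A toy version of the correlation principle used above: two receivers record the same random noise with a relative travel-time delay, and the peak of their cross-correlation recovers that delay. The sampling rate, delay, and noise levels below are invented, and the field processing steps (whitening, stacking, dispersion inversion) are not shown.

      # Toy cross-correlation example: recover the relative delay between two
      # receivers that record the same random "footstep" noise plus site noise.
      import numpy as np

      fs = 200.0                                        # Hz, illustrative sampling rate
      rng = np.random.default_rng(0)
      source = rng.standard_normal(int(30 * fs))        # 30 s of random noise

      true_delay = 24                                   # samples of travel-time difference
      rec_a = source[:-true_delay] + 0.2 * rng.standard_normal(source.size - true_delay)
      rec_b = source[true_delay:] + 0.2 * rng.standard_normal(source.size - true_delay)

      xcorr = np.correlate(rec_a, rec_b, mode="full")
      lag = np.argmax(xcorr) - (rec_b.size - 1)         # peak position gives the delay
      print("recovered delay:", lag, "samples =", lag / fs, "s")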

  11. On framing the research question and choosing the appropriate research design.

    PubMed

    Parfrey, Patrick S; Ravani, Pietro

    2015-01-01

    Clinical epidemiology is the science of human disease investigation with a focus on diagnosis, prognosis, and treatment. The generation of a reasonable question requires definition of patients, interventions, controls, and outcomes. The goal of research design is to minimize error, to ensure adequate samples, to measure input and output variables appropriately, to consider external and internal validities, to limit bias, and to address clinical as well as statistical relevance. The hierarchy of evidence for clinical decision-making places randomized controlled trials (RCT) or systematic review of good quality RCTs at the top of the evidence pyramid. Prognostic and etiologic questions are best addressed with longitudinal cohort studies.

  12. On framing the research question and choosing the appropriate research design.

    PubMed

    Parfrey, Patrick; Ravani, Pietro

    2009-01-01

    Clinical epidemiology is the science of human disease investigation with a focus on diagnosis, prognosis, and treatment. The generation of a reasonable question requires the definition of patients, interventions, controls, and outcomes. The goal of research design is to minimize error, ensure adequate samples, measure input and output variables appropriately, consider external and internal validities, limit bias, and address clinical as well as statistical relevance. The hierarchy of evidence for clinical decision making places randomized controlled trials (RCT) or systematic review of good quality RCTs at the top of the evidence pyramid. Prognostic and etiologic questions are best addressed with longitudinal cohort studies.

  13. Single-shot stand-off chemical identification of powders using random Raman lasing

    PubMed Central

    Hokr, Brett H.; Bixler, Joel N.; Noojin, Gary D.; Thomas, Robert J.; Rockwell, Benjamin A.; Yakovlev, Vladislav V.; Scully, Marlan O.

    2014-01-01

    The task of identifying explosives, hazardous chemicals, and biological materials from a safe distance is the subject we consider. Much of the prior work on stand-off spectroscopy using light has been devoted to generating a backward-propagating beam of light that can be used to drive further spectroscopic processes. The discovery of random lasing and, more recently, random Raman lasing provides a mechanism for remotely generating copious amounts of chemically specific Raman-scattered light. The bright nature of random Raman lasing renders directionality unnecessary, allowing for the detection and identification of chemicals from large distances in real time. In this article, the single-shot remote identification of chemicals at kilometer-scale distances is experimentally demonstrated using random Raman lasing. PMID:25114231

  14. Optimized random phase only holograms.

    PubMed

    Zea, Alejandro Velez; Barrera Ramirez, John Fredy; Torroba, Roberto

    2018-02-15

    We propose a simple and efficient technique capable of generating Fourier phase only holograms with a reconstruction quality similar to the results obtained with the Gerchberg-Saxton (G-S) algorithm. Our proposal is to use the traditional G-S algorithm to optimize a random phase pattern for the resolution, pixel size, and target size of the general optical system without any specific amplitude data. This produces an optimized random phase (ORAP), which is used for fast generation of phase only holograms of arbitrary amplitude targets. This ORAP needs to be generated only once for a given optical system, avoiding the need for costly iterative algorithms for each new target. We show numerical and experimental results confirming the validity of the proposal.
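
    A compact NumPy version of the standard Gerchberg-Saxton loop for a Fourier phase-only hologram is sketched below; the ORAP refinement described above (optimizing a reusable random phase without a specific target amplitude) is not reproduced, and the square test target is an arbitrary choice.

      # Standard Gerchberg-Saxton loop for a Fourier phase-only hologram;
      # this is not the ORAP procedure itself, only the underlying G-S iteration.
      import numpy as np

      def gerchberg_saxton(target_amplitude, n_iter=50, rng=None):
          rng = rng or np.random.default_rng()
          phase = rng.uniform(0, 2 * np.pi, target_amplitude.shape)   # random start
          field = target_amplitude * np.exp(1j * phase)
          for _ in range(n_iter):
              holo = np.fft.fft2(field)
              holo = np.exp(1j * np.angle(holo))                      # phase-only constraint
              field = np.fft.ifft2(holo)
              field = target_amplitude * np.exp(1j * np.angle(field)) # amplitude constraint
          return np.angle(holo)                                       # phase-only hologram

      target = np.zeros((128, 128))
      target[48:80, 48:80] = 1.0                                      # simple square target
      holo_phase = gerchberg_saxton(target, n_iter=100, rng=np.random.default_rng(0))
      recon = np.abs(np.fft.ifft2(np.exp(1j * holo_phase)))
      print(recon[64, 64] / recon.mean())   # reconstruction is bright inside the square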

  15. Critical side channel effects in random bit generation with multiple semiconductor lasers in a polarization-based quantum key distribution system.

    PubMed

    Ko, Heasin; Choi, Byung-Seok; Choe, Joong-Seon; Kim, Kap-Joong; Kim, Jong-Hoi; Youn, Chun Ju

    2017-08-21

    Most polarization-based BB84 quantum key distribution (QKD) systems utilize multiple lasers to generate one of four polarization quantum states randomly. However, random bit generation with multiple lasers can potentially open critical side channels that significantly endanger the security of QKD systems. In this paper, we show unnoticed side channels of temporal disparity and intensity fluctuation, which possibly exist in the operation of multiple semiconductor laser diodes. Experimental results show that these side channels can enormously degrade the security performance of QKD systems. An important system issue for the improvement of the quantum bit error rate (QBER), related to the laser driving conditions, is further addressed with experimental results.

  16. RandomSpot: A web-based tool for systematic random sampling of virtual slides.

    PubMed

    Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E

    2015-01-01

    This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.
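
    The grid-plus-random-offset construction that underlies systematic random sampling can be sketched in a few lines; this is not RandomSpot's own code, and the rectangular region of interest and spacing below are illustrative. A single random offset is drawn, after which points are placed at equal spacing across the region.

      # Systematic random sampling: equidistant points with one random offset,
      # clipped to a rectangular region of interest. Not RandomSpot's implementation.
      import random

      def srs_points(x0, y0, x1, y1, spacing, rng=random):
          off_x = rng.uniform(0, spacing)
          off_y = rng.uniform(0, spacing)
          points = []
          y = y0 + off_y
          while y < y1:
              x = x0 + off_x
              while x < x1:
                  points.append((x, y))
                  x += spacing
              y += spacing
          return points

      pts = srs_points(0, 0, 2000, 1000, spacing=250)   # e.g., slide coordinates in microns
      print(len(pts), pts[:3])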

  17. Direct generation of all-optical random numbers from optical pulse amplitude chaos.

    PubMed

    Li, Pu; Wang, Yun-Cai; Wang, An-Bang; Yang, Ling-Zhen; Zhang, Ming-Jiang; Zhang, Jian-Zhong

    2012-02-13

    We propose and theoretically demonstrate an all-optical method for directly generating all-optical random numbers from pulse amplitude chaos produced by a mode-locked fiber ring laser. Under an appropriate pump intensity, the mode-locked laser can experience a quasi-periodic route to chaos. Such chaos consists of a stream of pulses with a fixed repetition frequency but random intensities. This method requires neither a sampling procedure nor externally triggered clocks; instead, the chaotic pulse stream is directly quantized into a random number sequence via an all-optical flip-flop. Moreover, our simulation results show that the pulse amplitude chaos has no periodicity and possesses a highly symmetric amplitude distribution. Thus, in theory, the obtained random number sequence passes industry-standard statistical tests without any post-processing.

  18. Toward an evolutionary-predictive foundation for creativity : Commentary on "Human creativity, evolutionary algorithms, and predictive representations: The mechanics of thought trials" by Arne Dietrich and Hilde Haider, 2014 (Accepted pending minor revisions for publication in Psychonomic Bulletin & Review).

    PubMed

    Gabora, Liane; Kauffman, Stuart

    2016-04-01

    Dietrich and Haider (Psychonomic Bulletin & Review, 21 (5), 897-915, 2014) justify their integrative framework for creativity founded on evolutionary theory and prediction research on the grounds that "theories and approaches guiding empirical research on creativity have not been supported by the neuroimaging evidence." Although this justification is controversial, the general direction holds promise. This commentary clarifies points of disagreement and unresolved issues, and addresses mis-applications of evolutionary theory that lead the authors to adopt a Darwinian (versus Lamarckian) approach. To say that creativity is Darwinian is not to say that it consists of variation plus selection - in the everyday sense of the term - as the authors imply; it is to say that evolution is occurring because selection is affecting the distribution of randomly generated heritable variation across generations. In creative thought the distribution of variants is not key, i.e., one is not inclined toward idea A because 60 % of one's candidate ideas are variants of A while only 40 % are variants of B; one is inclined toward whichever seems best. The authors concede that creative variation is partly directed; however, the greater the extent to which variants are generated non-randomly, the greater the extent to which the distribution of variants can reflect not selection but the initial generation bias. Since each thought in a creative process can alter the selective criteria against which the next is evaluated, there is no demarcation into generations as assumed in a Darwinian model. We address the authors' claim that reduced variability and individuality are more characteristic of Lamarckism than Darwinian evolution, and note that a Lamarckian approach to creativity has addressed the challenge of modeling the emergent features associated with insight.

  19. Noise-Induced Synchronization among Sub-RF CMOS Analog Oscillators for Skew-Free Clock Distribution

    NASA Astrophysics Data System (ADS)

    Utagawa, Akira; Asai, Tetsuya; Hirose, Tetsuya; Amemiya, Yoshihito

    We present on-chip oscillator arrays synchronized by random noise, aiming at skew-free clock distribution on synchronous digital systems. Nakao et al. recently reported that independent neural oscillators can be synchronized by applying temporal random impulses to the oscillators [1], [2]. We regard neural oscillators as independent clock sources on LSIs; i.e., clock sources are distributed on LSIs and are forced to synchronize through the use of random noise. We designed neuron-based clock generators operating in the sub-RF region (<1 GHz) by modifying the original neuron model into a new model suitable for CMOS implementation with 0.25-μm CMOS parameters. Through circuit simulations, we demonstrate that i) the clock generators are indeed synchronized by pseudo-random noise and ii) the clock generators exhibit phase-locked oscillations even when they have small device mismatches.

  20. Human Inferences about Sequences: A Minimal Transition Probability Model

    PubMed Central

    2016-01-01

    The brain constantly infers the causes of the inputs it receives and uses these inferences to generate statistical expectations about future observations. Experimental evidence for these expectations and their violations include explicit reports, sequential effects on reaction times, and mismatch or surprise signals recorded in electrophysiology and functional MRI. Here, we explore the hypothesis that the brain acts as a near-optimal inference device that constantly attempts to infer the time-varying matrix of transition probabilities between the stimuli it receives, even when those stimuli are in fact fully unpredictable. This parsimonious Bayesian model, with a single free parameter, accounts for a broad range of findings on surprise signals, sequential effects and the perception of randomness. Notably, it explains the pervasive asymmetry between repetitions and alternations encountered in those studies. Our analysis suggests that a neural machinery for inferring transition probabilities lies at the core of human sequence knowledge. PMID:28030543
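
    A minimal sketch of a leaky transition-probability learner of this kind is given below; the exponential forgetting parameter and the pseudo-count prior are illustrative assumptions, not the published model's exact formulation.

```python
import numpy as np

def surprise_trace(stimuli, leak=0.1, prior=1.0):
    """Track transition probabilities between two stimuli (0/1) with exponential
    forgetting and return the surprise (-log2 p) of each observation."""
    counts = np.full((2, 2), prior)          # pseudo-counts for transitions prev -> next
    surprises = []
    prev = stimuli[0]
    for s in stimuli[1:]:
        p = counts[prev, s] / counts[prev].sum()   # predicted probability of this transition
        surprises.append(-np.log2(p))
        counts *= (1.0 - leak)                      # forget old evidence (time-varying world)
        counts[prev, s] += 1.0                      # update with the new observation
        prev = s
    return surprises

# Even for a fully unpredictable sequence, the learner produces graded surprise signals
seq = np.random.default_rng(1).integers(0, 2, 200)
print(surprise_trace(seq)[:10])
```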

  1. An Ethyl-Nitrosourea-Induced Point Mutation in Phex Causes Exon Skipping, X-Linked Hypophosphatemia, and Rickets

    PubMed Central

    Carpinelli, Marina R.; Wicks, Ian P.; Sims, Natalie A.; O’Donnell, Kristy; Hanzinikolas, Katherine; Burt, Rachel; Foote, Simon J.; Bahlo, Melanie; Alexander, Warren S.; Hilton, Douglas J.

    2002-01-01

    We describe the clinical, genetic, biochemical, and molecular characterization of a mouse that arose in the first generation (G1) of a random mutagenesis screen with the chemical mutagen ethyl-nitrosourea. The mouse was observed to have skeletal abnormalities inherited with an X-linked dominant pattern of inheritance. The causative mutation, named Skeletal abnormality 1 (Ska1), was shown to be a single base pair mutation in a splice donor site immediately following exon 8 of the Phex (phosphate-regulating gene with homologies to endopeptidases located on the X-chromosome) gene. This point mutation caused skipping of exon 8 from Phex mRNA, hypophosphatemia, and features of rickets. This experimentally induced phenotype mirrors the human condition X-linked hypophosphatemia; directly confirms the role of Phex in phosphate homeostasis, normal skeletal development, and rickets; and illustrates the power of mutagenesis in exploring animal models of human disease. PMID:12414538

  2. An ethyl-nitrosourea-induced point mutation in phex causes exon skipping, x-linked hypophosphatemia, and rickets.

    PubMed

    Carpinelli, Marina R; Wicks, Ian P; Sims, Natalie A; O'Donnell, Kristy; Hanzinikolas, Katherine; Burt, Rachel; Foote, Simon J; Bahlo, Melanie; Alexander, Warren S; Hilton, Douglas J

    2002-11-01

    We describe the clinical, genetic, biochemical, and molecular characterization of a mouse that arose in the first generation (G(1)) of a random mutagenesis screen with the chemical mutagen ethyl-nitrosourea. The mouse was observed to have skeletal abnormalities inherited with an X-linked dominant pattern of inheritance. The causative mutation, named Skeletal abnormality 1 (Ska1), was shown to be a single base pair mutation in a splice donor site immediately following exon 8 of the Phex (phosphate-regulating gene with homologies to endopeptidases located on the X-chromosome) gene. This point mutation caused skipping of exon 8 from Phex mRNA, hypophosphatemia, and features of rickets. This experimentally induced phenotype mirrors the human condition X-linked hypophosphatemia; directly confirms the role of Phex in phosphate homeostasis, normal skeletal development, and rickets; and illustrates the power of mutagenesis in exploring animal models of human disease.

  3. Mouse forward genetics in the study of the peripheral nervous system and human peripheral neuropathy

    PubMed Central

    Douglas, Darlene S.; Popko, Brian

    2009-01-01

    Forward genetics, the phenotype-driven approach to investigating gene identity and function, has a long history in mouse genetics. Random mutations in the mouse transcend bias about gene function and provide avenues towards unique discoveries. The study of the peripheral nervous system is no exception; from historical strains such as the trembler mouse, which led to the identification of PMP22 as a human disease gene causing multiple forms of peripheral neuropathy, to the more recent identification of the claw paw and sprawling mutations, forward genetics has long been a tool for probing the physiology, pathogenesis, and genetics of the PNS. Even as spontaneous and mutagenized mice continue to enable the identification of novel genes, provide allelic series for detailed functional studies, and generate models useful for clinical research, new methods, such as the piggyBac transposon, are being developed to further harness the power of forward genetics. PMID:18481175

  4. Fused cerebral organoids model interactions between brain regions.

    PubMed

    Bagley, Joshua A; Reumann, Daniel; Bian, Shan; Lévi-Strauss, Julie; Knoblich, Juergen A

    2017-07-01

    Human brain development involves complex interactions between different regions, including long-distance neuronal migration or formation of major axonal tracts. Different brain regions can be cultured in vitro within 3D cerebral organoids, but the random arrangement of regional identities limits the reliable analysis of complex phenotypes. Here, we describe a coculture method combining brain regions of choice within one organoid tissue. By fusing organoids of dorsal and ventral forebrain identities, we generate a dorsal-ventral axis. Using fluorescent reporters, we demonstrate CXCR4-dependent GABAergic interneuron migration from ventral to dorsal forebrain and describe methodology for time-lapse imaging of human interneuron migration. Our results demonstrate that cerebral organoid fusion cultures can model complex interactions between different brain regions. Combined with reprogramming technology, fusions should offer researchers the possibility to analyze complex neurodevelopmental defects using cells from neurological disease patients and to test potential therapeutic compounds.

  5. Second generation codon optimized minicircle (CoMiC) for nonviral reprogramming of human adult fibroblasts.

    PubMed

    Diecke, Sebastian; Lisowski, Leszek; Kooreman, Nigel G; Wu, Joseph C

    2014-01-01

    The ability to induce pluripotency in somatic cells is one of the most important scientific achievements in the fields of stem cell research and regenerative medicine. This technique allows researchers to obtain pluripotent stem cells without the controversial use of embryos, providing a novel and powerful tool for disease modeling and drug screening approaches. However, using viruses for the delivery of reprogramming genes and transcription factors may result in integration into the host genome and cause random mutations within the target cell, thus limiting the use of these cells for downstream applications. To overcome this limitation, various non-integrating techniques, including Sendai virus, mRNA, minicircle, and plasmid-based methods, have recently been developed. Utilizing a newly developed codon optimized 4-in-1 minicircle (CoMiC), we were able to reprogram human adult fibroblasts using chemically defined media and without the need for feeder cells.

  6. Cascaded Raman lasing in a PM phosphosilicate fiber with random distributed feedback

    NASA Astrophysics Data System (ADS)

    Lobach, Ivan A.; Kablukov, Sergey I.; Babin, Sergey A.

    2018-02-01

    We report on the first demonstration of a linearly polarized cascaded Raman fiber laser based on a simple half-open cavity with a broadband composite reflector and random distributed feedback in a polarization-maintaining phosphosilicate fiber operating beyond the zero-dispersion wavelength (1400 nm). With increasing pump power from a Yb-doped fiber laser at 1080 nm, the random laser subsequently generates 8 W at 1262 nm and 9 W at 1515 nm with a polarization extinction ratio of 27 dB. The generation linewidths amount to about 1 nm and 3 nm, respectively, and are almost independent of power, in agreement with the theory of cascaded random lasing.

  7. Random phase encoding for optical security

    NASA Astrophysics Data System (ADS)

    Wang, RuiKang K.; Watson, Ian A.; Chatwin, Christopher R.

    1996-09-01

    A new optical encoding method for security applications is proposed. The encoded image (encrypted into the security products) is merely a random phase image statistically and randomly generated by a random number generator using a computer, which contains no information from the reference pattern (stored for verification) or the frequency plane filter (a phase-only function for decoding). The phase function in the frequency plane is obtained using a modified phase retrieval algorithm. The proposed method uses two phase-only functions (images) at both the input and frequency planes of the optical processor leading to maximum optical efficiency. Computer simulation shows that the proposed method is robust for optical security applications.

  8. Multi-peak structure of generation spectrum of random distributed feedback fiber Raman lasers.

    PubMed

    Vatnik, I D; Zlobina, E A; Kablukov, S I; Babin, S A

    2017-02-06

    We study spectral features of the generation of a random distributed feedback fiber Raman laser arising from the two-peak shape of the Raman gain spectral profile in germanosilicate fibers. We demonstrate that the number of peaks can be calculated using a power balance model that considers different subcomponents within each Stokes component.

  9. Physical Principle for Generation of Randomness

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2009-01-01

    A physical principle (more precisely, a principle that incorporates mathematical models used in physics) has been conceived as the basis of a method of generating randomness in Monte Carlo simulations. The principle eliminates the need for conventional random-number generators. The Monte Carlo simulation method is among the most powerful computational methods for solving high-dimensional problems in physics, chemistry, economics, and information processing. The Monte Carlo simulation method is especially effective for solving problems in which computational complexity increases exponentially with dimensionality. The main advantage of the Monte Carlo simulation method over other methods is that the demand on computational resources becomes independent of dimensionality. As augmented by the present principle, the Monte Carlo simulation method becomes an even more powerful computational method that is especially useful for solving problems associated with dynamics of fluids, planning, scheduling, and combinatorial optimization. The present principle is based on coupling of dynamical equations with the corresponding Liouville equation. The randomness is generated by non-Lipschitz instability of dynamics triggered and controlled by feedback from the Liouville equation. (In non-Lipschitz dynamics, the derivatives of solutions of the dynamical equations are not required to be bounded.)

  10. The acute effect of beta-guanidinopropionic acid versus creatine or placebo in healthy men (ABC-Trial): A randomized controlled first-in-human trial.

    PubMed

    Karamat, Fares A; Horjus, Deborah L; Haan, Yentl C; van der Woude, Lisa; Schaap, Marianne C; Oudman, Inge; van Montfrans, Gert A; Nieuwland, Rienk; Salomons, Gajja S; Clark, Joseph F; Brewster, Lizzy M

    2017-12-01

    Increasing evidence indicates that the ATP-generating enzyme creatine kinase (CK) is involved in hypertension. CK rapidly regenerates ATP from creatine phosphate and ADP. Recently, it has been shown that beta-guanidinopropionic acid (GPA), a kidney-synthesized creatine analogue and competitive CK inhibitor, reduced blood pressure in spontaneously hypertensive rats. To further develop the substance as a potential blood pressure-lowering agent, we assessed the tolerability of a sub-therapeutic GPA dose in healthy men. In this active and placebo-controlled, triple-blind, single-centre trial, we recruited 24 healthy men (18-50 years old, BMI 18.5-29.9 kg m⁻²) in the Netherlands. Participants were randomized (1:1:1) to one week of daily oral administration of GPA 100 mg, creatine 5 g, or matching placebo. The primary outcome was the tolerability of GPA, in an intent-to-treat analysis. Twenty-four randomized participants received the allocated intervention and 23 completed the study. One participant in the placebo arm dropped out for personal reasons. GPA was well tolerated, without serious or severe adverse events. No abnormalities were reported with GPA use in clinical safety parameters, including physical examination, laboratory studies, or 12-lead ECG. At day 8, mean plasma GPA was 213.88 (SE 0.07) nmol l⁻¹ in the GPA arm vs. 32.75 (0.00) nmol l⁻¹ in the placebo arm, a mean difference of 181.13 (95% CI 26.53-335.72). In this first-in-human trial, low-dose GPA was safe and well tolerated when used during 1 week in healthy men. Subsequent studies should focus on human pharmacokinetic and pharmacodynamic assessments with different doses. © 2017 The Authors. British Journal of Clinical Pharmacology published by John Wiley & Sons Ltd on behalf of British Pharmacological Society.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cryns, Jackson W.; Hatchell, Brian K.; Santiago-Rojas, Emiliano

    Experimental analysis of a piezoelectric energy harvesting system for harmonic, random, and sine on random vibration. Abstract: Harvesting power with a piezoelectric vibration-powered generator using a full-wave rectifier conditioning circuit is experimentally compared for varying sinusoidal, random and sine on random (SOR) input vibration scenarios. Additionally, the implications of source vibration characteristics on harvester design are discussed. Studies in vibration harvesting have yielded numerous alternatives for harvesting electrical energy from vibrations, but piezoceramics arose as the most compact, energy-dense means of energy transduction. The rise in popularity of harvesting energy from ambient vibrations has made piezoelectric generators commercially available. Much of the available literature focuses on maximizing harvested power through nonlinear processing circuits that require accurate knowledge of generator internal mechanical and electrical characteristics and idealization of the input vibration source, which cannot be assumed in general application. In this manuscript, variations in source vibration and load resistance are explored for a commercially available piezoelectric generator. We characterize the source vibration by its acceleration response for repeatability and transcription to general application. The results agree with numerical and theoretical predictions in previous literature that optimal load resistance varies with transducer natural frequency and source type, and the findings demonstrate that significant gains are seen with lower tuned transducer natural frequencies for similar source amplitudes. Going beyond idealized steady-state sinusoidal and simplified random vibration input, SOR testing allows for a more accurate representation of real-world ambient vibration. It is shown that characteristic interactions from more complex vibrational sources significantly alter power generation and power processing requirements by increasing harvested power, shifting optimal conditioning impedance, inducing significant voltage supply fluctuations, and ultimately rendering idealized sinusoidal and random analyses insufficient.

  12. Immortality of cancers

    PubMed Central

    Duesberg, Peter; McCormack, Amanda

    2013-01-01

    Immortality is a common characteristic of cancers, but its origin and purpose are still unclear. Here we advance a karyotypic theory of immortality based on the theory that carcinogenesis is a form of speciation. Accordingly, cancers are generated from normal cells by random karyotypic rearrangements and selection for cancer-specific reproductive autonomy. Since such rearrangements unbalance long-established mitosis genes, cancer karyotypes vary spontaneously but are stabilized perpetually by clonal selections for autonomy. To test this theory we have analyzed neoplastic clones, presumably immortalized by transfection with overexpressed telomerase or with SV40 tumor virus, for the predicted clonal yet flexible karyotypes. The following results were obtained: (1) All immortal tumorigenic lines from cells transfected with overexpressed telomerase had clonal and flexible karyotypes; (2) Searching for the origin of such karyotypes, we found spontaneously increasing, random aneuploidy in human fibroblasts early after transfection with overexpressed telomerase; (3) Late after transfection, new immortal tumorigenic clones with new clonal and flexible karyotypes were found; (4) Testing immortality of one clone during 848 unselected generations showed the chromosome number was stable, but the copy numbers of 36% of chromosomes drifted ± 1; (5) Independent immortal tumorigenic clones with individual, flexible karyotypes arose after individual latencies; (6) Immortal tumorigenic clones with new flexible karyotypes also arose late from cells of a telomerase-deficient mouse rendered aneuploid by SV40 virus. Because immortality and tumorigenicity: (1) correlated exactly with individual clonal but flexible karyotypes; (2) originated simultaneously with such karyotypes; and (3) arose in the absence of telomerase, we conclude that clonal and flexible karyotypes generate the immortality of cancers. PMID:23388461

  13. Learning accurate and interpretable models based on regularized random forests regression

    PubMed Central

    2014-01-01

    Background Many biology-related research works combine data from multiple sources in an effort to understand the underlying problems. It is important to identify and interpret the most relevant information from these sources. Thus it will be beneficial to have an effective algorithm that can simultaneously extract decision rules and select critical features for good interpretation while preserving the prediction performance. Methods In this study, we focus on regression problems for biological data where target outcomes are continuous. In general, models constructed from linear regression approaches are relatively easy to interpret. However, many practical biological applications are nonlinear in essence, where we can hardly find a direct linear relationship between input and output. Nonlinear regression techniques can reveal nonlinear relationships in data, but are generally hard for humans to interpret. We propose a rule-based regression algorithm that uses 1-norm regularized random forests. The proposed approach simultaneously extracts a small number of rules from generated random forests and eliminates unimportant features. Results We tested the approach on some biological data sets. The proposed approach is able to construct a significantly smaller set of regression rules using a subset of attributes while achieving prediction performance comparable to that of random forests regression. Conclusion It demonstrates high potential in aiding prediction and interpretation of nonlinear relationships of the subject being studied. PMID:25350120
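
    The general idea (though not the authors' exact algorithm) can be illustrated by encoding forest leaves as binary rule indicators and selecting a sparse subset with a 1-norm (Lasso) penalty; all data and parameter values below are placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.preprocessing import OneHotEncoder
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))
y = X[:, 0] * X[:, 1] + np.sin(X[:, 2]) + 0.1 * rng.normal(size=300)  # nonlinear target

forest = RandomForestRegressor(n_estimators=30, max_depth=4, random_state=0).fit(X, y)

# Each leaf of each tree defines a rule; encode leaf membership as binary rule features
leaves = forest.apply(X)                       # shape (n_samples, n_trees)
rules = OneHotEncoder(handle_unknown="ignore").fit_transform(leaves).toarray()

# 1-norm regularization keeps only a small set of rules
selector = Lasso(alpha=0.01).fit(rules, y)
kept = np.flatnonzero(selector.coef_)
print(f"{kept.size} rules kept out of {rules.shape[1]}")
```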

  14. The Microbial Detection Array Combined with Random Phi29-Amplification Used as a Diagnostic Tool for Virus Detection in Clinical Samples

    PubMed Central

    Erlandsson, Lena; Rosenstierne, Maiken W.; McLoughlin, Kevin; Jaing, Crystal; Fomsgaard, Anders

    2011-01-01

    A common technique used for sensitive and specific diagnostic virus detection in clinical samples is PCR that can identify one or several viruses in one assay. However, a diagnostic microarray containing probes for all human pathogens could replace hundreds of individual PCR-reactions and remove the need for a clear clinical hypothesis regarding a suspected pathogen. We have established such a diagnostic platform for random amplification and subsequent microarray identification of viral pathogens in clinical samples. We show that Phi29 polymerase-amplification of a diverse set of clinical samples generates enough viral material for successful identification by the Microbial Detection Array, demonstrating the potential of the microarray technique for broad-spectrum pathogen detection. We conclude that this method detects both DNA and RNA virus, present in the same sample, as well as differentiates between different virus subtypes. We propose this assay for diagnostic analysis of viruses in clinical samples. PMID:21853040

  15. Autonomous Byte Stream Randomizer

    NASA Technical Reports Server (NTRS)

    Paloulian, George K.; Woo, Simon S.; Chow, Edward T.

    2013-01-01

    Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, the data transmission has to be efficient without any redundant or exuberant metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security, requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity, unreadable in its own right, but when combined with all N pieces, can be reconstructed into one. Reconstruction requires possession of the key used for randomizing the bytes, leading to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. This software is a cornerstone capability, able to generate the same cryptographically secure sequence on different machines and at different time intervals, thus allowing it to be used more heavily in net-centric environments where data transfer bandwidth is limited.
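
    A minimal sketch of the seeded, reversible Fisher-Yates idea described above (not the NASA implementation): the same seed regenerates the same swap sequence, so the byte stream can be reconstructed. Python's default PRNG stands in for a cryptographically secure generator here.

```python
import secrets
import random

def randomize(data: bytes, seed: int) -> bytes:
    """In-place Fisher-Yates shuffle of the byte stream, driven by a seeded PRNG.
    (Python's Mersenne Twister is NOT cryptographically secure; it is a stand-in.)"""
    buf = bytearray(data)
    rng = random.Random(seed)
    for i in range(len(buf) - 1, 0, -1):
        j = rng.randrange(i + 1)
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

def derandomize(data: bytes, seed: int) -> bytes:
    """Reverse the shuffle by regenerating the same swap sequence and replaying it backwards."""
    buf = bytearray(data)
    rng = random.Random(seed)
    swaps = [(i, rng.randrange(i + 1)) for i in range(len(buf) - 1, 0, -1)]
    for i, j in reversed(swaps):
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

seed = secrets.randbits(128)            # cryptographically secure seed (illustrative)
scrambled = randomize(b"example payload", seed)
assert derandomize(scrambled, seed) == b"example payload"
```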

  16. Numerical Generation of Dense Plume Fingers in Unsaturated Homogeneous Porous Media

    NASA Astrophysics Data System (ADS)

    Cremer, C.; Graf, T.

    2012-04-01

    In nature, the migration of dense plumes typically results in the formation of vertical plume fingers. Flow direction in fingers is downwards, which is counterbalanced by upwards flow of less dense fluid between fingers. In heterogeneous media, heterogeneity itself is known to trigger the formation of fingers. In homogeneous media, however, fingers are also created even if all grains had the same diameter. The reason is that pore-scale heterogeneity leading to different flow velocities also exists in homogeneous media due to two effects: (i) Grains of identical size may randomly arrange differently, e.g. forming tetrahedrons, hexahedrons or octahedrons. Each arrangement creates pores of varying diameter, thus resulting in different average flow velocities. (ii) Random variations of solute concentration lead to varying buoyancy effects, thus also resulting in different velocities. As a continuation of previously made efforts to incorporate pore-scale heterogeneity into fully saturated soil such that dense fingers are realistically generated (Cremer and Graf, EGU Assembly, 2011), the current paper extends the research scope from saturated to unsaturated soil. Perturbation methods are evaluated by numerically re-simulating a laboratory-scale experiment of plume transport in homogeneous unsaturated sand (Simmons et al., Transp. Porous Media, 2002). The following 5 methods are being discussed: (i) homogeneous sand, (ii) initial perturbation of solute concentration, (iii) spatially random, time-constant perturbation of solute source, (iv) spatially and temporally random noise of simulated solute concentration, and (v) random K-field that introduces physically insignificant but numerically significant heterogeneity. Results demonstrate that, as opposed to saturated flow, perturbing the solute source will not result in plume fingering. This is because the location of the perturbed source (domain top) and the location of finger generation (groundwater surface) do not coincide. Alternatively, similar to saturated flow, applying either a random concentration noise (iv) or a random K-field (v) generates realistic plume fingering. Future work will focus on the generation mechanisms of plume finger splitting.

  17. Pilot study of large-scale production of mutant pigs by ENU mutagenesis

    PubMed Central

    Hai, Tang; Cao, Chunwei; Shang, Haitao; Guo, Weiwei; Mu, Yanshuang; Yang, Shulin; Zhang, Ying; Zheng, Qiantao; Zhang, Tao; Wang, Xianlong; Liu, Yu; Kong, Qingran; Li, Kui; Wang, Dayu; Qi, Meng; Hong, Qianlong; Zhang, Rui; Wang, Xiupeng; Jia, Qitao; Wang, Xiao; Qin, Guosong; Li, Yongshun; Luo, Ailing; Jin, Weiwu; Yao, Jing; Huang, Jiaojiao; Zhang, Hongyong; Li, Menghua; Xie, Xiangmo; Zheng, Xuejuan; Guo, Kenan; Wang, Qinghua; Zhang, Shibin; Li, Liang; Xie, Fei; Zhang, Yu; Weng, Xiaogang; Yin, Zhi; Hu, Kui; Cong, Yimei; Zheng, Peng; Zou, Hailong; Xin, Leilei; Xia, Jihan; Ruan, Jinxue; Li, Hegang; Zhao, Weiming; Yuan, Jing; Liu, Zizhan; Gu, Weiwang; Li, Ming; Wang, Yong; Wang, Hongmei; Yang, Shiming; Liu, Zhonghua; Wei, Hong; Zhao, Jianguo; Zhou, Qi; Meng, Anming

    2017-01-01

    N-ethyl-N-nitrosourea (ENU) mutagenesis is a powerful tool to generate mutants on a large scale efficiently, and to discover genes with novel functions at the whole-genome level in Caenorhabditis elegans, flies, zebrafish and mice, but it has never been tried in large model animals. We describe a successful systematic three-generation ENU mutagenesis screening in pigs with the establishment of the Chinese Swine Mutagenesis Consortium. A total of 6,770 G1 and 6,800 G3 pigs were screened, 36 dominant and 91 recessive novel pig families with various phenotypes were established. The causative mutations in 10 mutant families were further mapped. As examples, the mutation of SOX10 (R109W) in pig causes inner ear malfunctions and mimics human Mondini dysplasia, and upregulated expression of FBXO32 is associated with congenital splay legs. This study demonstrates the feasibility of artificial random mutagenesis in pigs and opens an avenue for generating a reservoir of mutants for agricultural production and biomedical research. DOI: http://dx.doi.org/10.7554/eLife.26248.001 PMID:28639938

  18. Radiation Transport in Random Media With Large Fluctuations

    NASA Astrophysics Data System (ADS)

    Olson, Aaron; Prinja, Anil; Franke, Brian

    2017-09-01

    Neutral particle transport in media exhibiting large and complex spatial variation of material properties is modeled by representing cross sections as lognormal random functions of space, generated through a nonlinear memory-less transformation of a Gaussian process with covariance uniquely determined by the covariance of the cross section. A Karhunen-Loève decomposition of the Gaussian process is implemented to efficiently generate realizations of the random cross sections, and Woodcock Monte Carlo is used to transport particles on each realization and generate benchmark solutions for the mean and variance of the particle flux as well as probability densities of the particle reflectance and transmittance. A computationally efficient stochastic collocation method is implemented to directly compute statistical moments such as the mean and variance, while a polynomial chaos expansion in conjunction with stochastic collocation provides a convenient surrogate model that also produces probability densities of output quantities of interest. Extensive numerical testing demonstrates that use of stochastic reduced-order modeling provides an accurate and cost-effective alternative to random sampling for particle transport in random media.
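
    A minimal sketch of the cross-section generation step, assuming an exponential covariance and arbitrary grid parameters: a truncated Karhunen-Loève expansion of the underlying Gaussian process is exponentiated to give one lognormal realization.

```python
import numpy as np

def lognormal_xs_realization(n_pts=200, length=10.0, corr_len=1.0,
                             mean_log=0.0, std_log=0.5, n_modes=20, seed=None):
    """Sample one lognormal cross-section realization sigma(x) = exp(g(x)), where g is a
    Gaussian process with exponential covariance, via a truncated KL expansion."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, length, n_pts)
    # Exponential covariance of the underlying Gaussian process
    cov = std_log**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    # Discrete KL expansion: eigendecomposition of the covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    idx = np.argsort(eigvals)[::-1][:n_modes]
    lam, phi = eigvals[idx], eigvecs[:, idx]
    xi = rng.standard_normal(n_modes)                 # independent standard normal modes
    g = mean_log + phi @ (np.sqrt(np.maximum(lam, 0.0)) * xi)
    return x, np.exp(g)                               # lognormal cross section

x, sigma = lognormal_xs_realization(seed=3)
print(sigma[:5])
```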

  19. Synchronization of random bit generators based on coupled chaotic lasers and application to cryptography.

    PubMed

    Kanter, Ido; Butkovski, Maria; Peleg, Yitzhak; Zigzag, Meital; Aviad, Yaara; Reidler, Igor; Rosenbluh, Michael; Kinzel, Wolfgang

    2010-08-16

    Random bit generators (RBGs) constitute an important tool in cryptography, stochastic simulations and secure communications. The later in particular has some difficult requirements: high generation rate of unpredictable bit strings and secure key-exchange protocols over public channels. Deterministic algorithms generate pseudo-random number sequences at high rates, however, their unpredictability is limited by the very nature of their deterministic origin. Recently, physical RBGs based on chaotic semiconductor lasers were shown to exceed Gbit/s rates. Whether secure synchronization of two high rate physical RBGs is possible remains an open question. Here we propose a method, whereby two fast RBGs based on mutually coupled chaotic lasers, are synchronized. Using information theoretic analysis we demonstrate security against a powerful computational eavesdropper, capable of noiseless amplification, where all parameters are publicly known. The method is also extended to secure synchronization of a small network of three RBGs.

  20. Dipolar eddies in a decaying stratified turbulent flow

    NASA Astrophysics Data System (ADS)

    Voropayev, S. I.; Fernando, H. J. S.; Morrison, R.

    2008-02-01

    Laboratory experiments on the evolution of dipolar (momentum) eddies in a stratified fluid in the presence of random background motions are described. A turbulent jet puff was used to generate the momentum eddies, and a decaying field of ambient random vortical motions was generated by a towed grid. Data on vorticity/velocity fields of momentum eddies, those of background motions, and their interactions were collected in the presence and absence of the other, and the main characteristics thereof were parametrized. Similarity arguments predict that dipolar eddies in stratified fluids may preserve their identity in decaying grid-generated stratified turbulence, which was verified experimentally. Possible applications of the results include mushroomlike currents and other naturally/artificially generated large dipolar eddies in strongly stratified layers of the ocean, the longevity of which is expected to be determined by the characteristics of the eddies and random background motions.

  1. 25 CFR 547.14 - What are the minimum technical standards for electronic random number generation?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... CLASS II GAMES § 547.14 What are the minimum technical standards for electronic random number generation... rules of the game. For example, if a bingo game with 75 objects with numbers or other designations has a... serial correlation (outcomes shall be independent from the previous game); and (x) Test on subsequences...

  2. 25 CFR 547.14 - What are the minimum technical standards for electronic random number generation?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... CLASS II GAMES § 547.14 What are the minimum technical standards for electronic random number generation... rules of the game. For example, if a bingo game with 75 objects with numbers or other designations has a... serial correlation (outcomes shall be independent from the previous game); and (x) Test on subsequences...

  3. 25 CFR 547.14 - What are the minimum technical standards for electronic random number generation?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... CLASS II GAMES § 547.14 What are the minimum technical standards for electronic random number generation... rules of the game. For example, if a bingo game with 75 objects with numbers or other designations has a... serial correlation (outcomes shall be independent from the previous game); and (x) Test on subsequences...

  4. Monte Carlo Simulation Using HyperCard and Lotus 1-2-3.

    ERIC Educational Resources Information Center

    Oulman, Charles S.; Lee, Motoko Y.

    Monte Carlo simulation is a computer modeling procedure for mimicking observations on a random variable. A random number generator is used in generating the outcome for the events that are being modeled. The simulation can be used to obtain results that otherwise require extensive testing or complicated computations. This paper describes how Monte…
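
    For illustration only, a minimal sketch of the Monte Carlo idea described above: a random number generator mimics observations on a random variable, and repeated trials estimate a quantity that would otherwise require analysis or extensive testing. The exponential-gap example is an arbitrary choice.

```python
import random

def estimate_probability(arrival_rate=2.0, service_time=0.4, n_trials=100_000, seed=1):
    """Estimate P(inter-arrival gap < service_time) for an exponential random variable
    by mimicking observations with a pseudo-random number generator."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_trials) if rng.expovariate(arrival_rate) < service_time)
    return hits / n_trials

print(estimate_probability())   # should be close to 1 - exp(-2.0 * 0.4) ≈ 0.551
```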

  5. System and method for key generation in security tokens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Philip G.; Humble, Travis S.; Paul, Nathanael R.

    Functional randomness in security tokens (FRIST) may achieve improved security in two-factor authentication hardware tokens by improving on the algorithms used to securely generate random data. A system and method in one embodiment according to the present invention may allow for token security based on storage cost and computational security. This approach may enable communication where security is no longer based solely on one-time pads (OTPs) generated from a single cryptographic function (e.g., SHA-256).
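
    As a point of reference for the single-function baseline mentioned above (one-time codes derived from SHA-256 alone), a generic hash-chain sketch is shown below; it is illustrative only and is not the FRIST construction.

```python
import hashlib
import secrets

def hash_chain_codes(seed: bytes, n_codes: int):
    """Generate a sequence of one-time codes by repeatedly hashing a shared secret seed
    with SHA-256 -- the single-cryptographic-function baseline the abstract contrasts against."""
    codes, state = [], seed
    for _ in range(n_codes):
        state = hashlib.sha256(state).digest()
        codes.append(state[:4].hex())        # truncate to a short, human-usable code
    return codes

shared_seed = secrets.token_bytes(32)        # provisioned to both token and server (illustrative)
print(hash_chain_codes(shared_seed, 5))
```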

  6. Beat-to-beat control of human optokinetic nystagmus slow phase durations

    PubMed Central

    Furman, Joseph M.

    2016-01-01

    This study provides the first clear evidence that the generation of optokinetic nystagmus fast phases (FPs) is a decision process that is influenced by performance of a concurrent disjunctive reaction time task (DRT). Ten subjects performed an auditory DRT during constant velocity optokinetic stimulation. Eye movements were measured in three dimensions with a magnetic search coil. Slow phase (SP) durations were defined as the interval between FPs. There were three main findings. Firstly, human optokinetic nystagmus SP durations are consistent with a model of a Gaussian basic interval generator (a type of biological clock), such that FPs can be triggered randomly at the end of a clock cycle (mean duration: 200–250 ms). Kolmogorov-Smirnov tests could not reject the modeled cumulative distribution for any data trials. Secondly, the FP need not be triggered at the end of a clock cycle, so that individual SP durations represent single or multiple clock cycles. Thirdly, the probability of generating a FP at the end of each interval generator cycle decreases significantly during performance of a DRT. These findings indicate that the alternation between SPs and FPs of optokinetic nystagmus is not purely reflexive. Rather, the triggering of the next FP is postponed more frequently if a recently presented DRT trial is pending action when the timing cycle expires. Hence, optokinetic nystagmus FPs show dual-task interference in a manner usually attributed to voluntary movements, including saccades. NEW & NOTEWORTHY This study provides the first clear evidence that the generation of optokinetic nystagmus (OKN) fast phases is a decision process that is influenced by performance of a concurrent disjunctive reaction time task (DRT). The slow phase (SP) durations are consistent with a Gaussian basic interval generator and multiple interval SP durations occur more frequently in the presence of the DRT. Hence, OKN shows dual-task interference in a manner observed in voluntary movements, such as saccades. PMID:27760815
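
    The interval-generator account lends itself to a short simulation sketch; the clock mean, standard deviation, and per-cycle trigger probability below are illustrative values chosen within the ranges the abstract reports.

```python
import numpy as np

def simulate_sp_durations(n=1000, clock_mean=0.225, clock_sd=0.03,
                          p_trigger=0.7, seed=0):
    """Simulate optokinetic slow-phase durations as sums of Gaussian clock cycles,
    with a fast phase triggered at the end of each cycle with probability p_trigger."""
    rng = np.random.default_rng(seed)
    durations = []
    for _ in range(n):
        d = 0.0
        while True:
            d += rng.normal(clock_mean, clock_sd)   # one basic interval-generator cycle
            if rng.random() < p_trigger:            # fast phase may fire at cycle end
                break
        durations.append(d)
    return np.array(durations)

sp = simulate_sp_durations()
print(sp.mean(), np.median(sp))
```
    Lowering p_trigger mimics the dual-task condition, producing more multi-cycle slow-phase durations.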

  7. Beat-to-beat control of human optokinetic nystagmus slow phase durations.

    PubMed

    Balaban, Carey D; Furman, Joseph M

    2017-01-01

    This study provides the first clear evidence that the generation of optokinetic nystagmus fast phases (FPs) is a decision process that is influenced by performance of a concurrent disjunctive reaction time task (DRT). Ten subjects performed an auditory DRT during constant velocity optokinetic stimulation. Eye movements were measured in three dimensions with a magnetic search coil. Slow phase (SP) durations were defined as the interval between FPs. There were three main findings. Firstly, human optokinetic nystagmus SP durations are consistent with a model of a Gaussian basic interval generator (a type of biological clock), such that FPs can be triggered randomly at the end of a clock cycle (mean duration: 200-250 ms). Kolmogorov-Smirnov tests could not reject the modeled cumulative distribution for any data trials. Secondly, the FP need not be triggered at the end of a clock cycle, so that individual SP durations represent single or multiple clock cycles. Thirdly, the probability of generating a FP at the end of each interval generator cycle decreases significantly during performance of a DRT. These findings indicate that the alternation between SPs and FPs of optokinetic nystagmus is not purely reflexive. Rather, the triggering of the next FP is postponed more frequently if a recently presented DRT trial is pending action when the timing cycle expires. Hence, optokinetic nystagmus FPs show dual-task interference in a manner usually attributed to voluntary movements, including saccades. This study provides the first clear evidence that the generation of optokinetic nystagmus (OKN) fast phases is a decision process that is influenced by performance of a concurrent disjunctive reaction time task (DRT). The slow phase (SP) durations are consistent with a Gaussian basic interval generator and multiple interval SP durations occur more frequently in the presence of the DRT. Hence, OKN shows dual-task interference in a manner observed in voluntary movements, such as saccades. Copyright © 2017 the American Physiological Society.

  8. Random numbers from vacuum fluctuations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Yicheng; Kurtsiefer, Christian, E-mail: christian.kurtsiefer@gmail.com; Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543

    2016-07-25

    We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.
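
    A minimal software sketch of LFSR-based whitening of thresholded samples, in the spirit of the extraction scheme described above; the register length, tap positions, and Gaussian stand-in for the digitized vacuum noise are illustrative assumptions, not the paper's hardware design.

```python
import random

def lfsr_bits(state=0xACE1, taps=(15, 13, 12, 10)):
    """Infinite Fibonacci LFSR bit stream over a 16-bit register (tap positions are
    illustrative; this is a generic whitening sketch, not the paper's circuit)."""
    while True:
        bit = 0
        for t in taps:
            bit ^= (state >> t) & 1
        state = ((state << 1) | bit) & 0xFFFF
        yield bit

def extract_bits(samples, threshold, lfsr):
    """Digitize samples by thresholding, then XOR with the LFSR output to whiten."""
    return [((1 if s > threshold else 0) ^ next(lfsr)) for s in samples]

raw = [random.gauss(0.0, 1.0) for _ in range(32)]      # stand-in for digitized vacuum noise
bits = extract_bits(raw, threshold=0.0, lfsr=lfsr_bits())
print(bits)
```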

  9. Hundred-watt-level high power random distributed feedback Raman fiber laser at 1150 nm and its application in mid-infrared laser generation.

    PubMed

    Zhang, Hanwei; Zhou, Pu; Wang, Xiong; Du, Xueyuan; Xiao, Hu; Xu, Xiaojun

    2015-06-29

    Two kinds of hundred-watt-level random distributed feedback Raman fiber lasers have been demonstrated. The optical efficiency reaches as high as 84.8%. To our knowledge, the reported power and efficiency of the random laser are the highest to date. We have also demonstrated that the developed random laser can be used to pump a Ho-doped fiber laser for mid-infrared laser generation; finally, 23 W of 2050 nm laser output is achieved. The presented laser can deliver high power output efficiently and conveniently and opens a new direction for high power laser sources at designed wavelengths.

  10. [A magnetic therapy apparatus with an adaptable electromagnetic spectrum for the treatment of prostatitis and gynecopathies].

    PubMed

    Kuz'min, A A; Meshkovskiĭ, D V; Filist, S A

    2008-01-01

    Problems of engineering and algorithm development of magnetic therapy apparatuses with pseudo-random radiation spectrum within the audio range for treatment of prostatitis and gynecopathies are considered. A typical design based on a PIC 16F microcontroller is suggested. It includes a keyboard, LCD indicator, audio amplifier, inducer, and software units. The problem of pseudo-random signal generation within the audio range is considered. A series of rectangular pulses is generated on a random-length interval on the basis of a three-component random vector. This series provides the required spectral characteristics of the therapeutic magnetic field and their adaptation to the therapeutic conditions and individual features of the patient.

  11. Random functions via Dyson Brownian Motion: progress and problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Gaoyuan; Battefeld, Thorsten

    2016-09-05

    We develop a computationally efficient extension of the Dyson Brownian Motion (DBM) algorithm to generate random functions in C^2 locally. We further explain that random functions generated via DBM show an unstable growth as the traversed distance increases. This feature restricts the use of such functions considerably if they are to be used to model globally defined ones. The latter is the case if one uses random functions to model landscapes in string theory. We provide a concrete example, based on a simple axionic potential often used in cosmology, to highlight this problem, and also offer an ad hoc modification of DBM that suppresses this growth to some degree.

  12. Recruitment methods and costs for a randomized, placebo-controlled trial of chiropractic care for lumbar spinal stenosis: a single-site pilot study.

    PubMed

    Cambron, Jerrilyn A; Dexheimer, Jennifer M; Chang, Mabel; Cramer, Gregory D

    2010-01-01

    The purpose of this article is to describe the methods for recruitment in a clinical trial on chiropractic care for lumbar spinal stenosis. This randomized, placebo-controlled pilot study investigated the efficacy of different amounts of total treatment dosage over 6 weeks in 60 volunteer subjects with lumbar spinal stenosis. Subjects were recruited for this study through several media venues, focusing on successful and cost-effective strategies. Included in our efforts were radio advertising, newspaper advertising, direct mail, and various other low-cost initiatives. Of the 1211 telephone screens, 60 responders (5.0%) were randomized into the study. The most successful recruitment method was radio advertising, generating more than 64% of the calls (776 subjects). Newspaper and magazine advertising generated approximately 9% of all calls (108 subjects), and direct mail generated less than 7% (79 subjects). The total direct cost for recruitment was $40 740 or $679 per randomized patient. The costs per randomization were highest for direct mail ($995 per randomization) and lowest for newspaper/magazine advertising ($558 per randomization). Success of recruitment methods may vary based on target population and location. Planning of recruitment efforts is essential to the success of any clinical trial. Copyright 2010 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.

  13. Creative Test Generators

    ERIC Educational Resources Information Center

    Vickers, F. D.

    1973-01-01

    A brief description of a test-generating program that generates questions concerning the Fortran programming language in a random but guided fashion, without resorting to an item bank. (Author/AK)

  14. Motion planning and synchronized control of the dental arch generator of the tooth-arrangement robot.

    PubMed

    Jiang, Jin-Gang; Zhang, Yong-De

    2013-03-01

    The traditional, manual method of reproducing the dental arch form is prone to numerous random errors caused by human factors. The purpose of this study was to investigate the automatic acquisition of the dental arch and to implement the motion planning and synchronized control of the dental arch generator of the multi-manipulator tooth-arrangement robot for use in full denture manufacture. First, the mathematical model of the dental arch generator was derived. Then the kinematics and control point positions of the dental arch generator of the tooth-arrangement robot were calculated and the motion planning of each control point was analysed. A hardware control scheme is presented, based on an industrial personal computer and the control card PC6401. To achieve precise single-axis control of the dental arch generator, we studied control-pulse generation with high-resolution timing. Real-time, closed-loop, synchronous control was applied to the dental arch generator. Experimental control of the dental arch generator and preliminary tooth arrangement were achieved using the multi-manipulator tooth-arrangement robotic system. The dental arch generator can automatically generate a dental arch to fit a patient according to the patient's arch parameters. The repeated positioning accuracy of the slipways that drive the dental arch generator is 0.12 mm. The maximum single-point error is 1.83 mm, at -33.29 mm along the arc-width direction (x axis). A novel system that generates the dental arch has been developed. The traditional method of manually determining the dental arch may soon be replaced by a robot assisting in generating a more individualized dental arch. The system can be used to fabricate full dentures and bend orthodontic wires. Copyright © 2012 John Wiley & Sons, Ltd.

  15. Mutation as a Stress Response and the Regulation of Evolvability

    PubMed Central

    Galhardo, Rodrigo S.; Hastings, P. J.; Rosenberg, Susan M.

    2010-01-01

    Our concept of a stable genome is evolving to one in which genomes are plastic and responsive to environmental changes. Growing evidence shows that a variety of environmental stresses induce genomic instability in bacteria, yeast, and human cancer cells, generating occasional fitter mutants and potentially accelerating adaptive evolution. The emerging molecular mechanisms of stress-induced mutagenesis vary but share telling common components that underscore two common themes. The first is the regulation of mutagenesis in time by cellular stress responses, which promote random mutations specifically when cells are poorly adapted to their environments, i.e., when they are stressed. A second theme is the possible restriction of random mutagenesis in genomic space, achieved via coupling of mutation-generating machinery to local events such as DNA-break repair or transcription. Such localization may minimize accumulation of deleterious mutations in the genomes of rare fitter mutants, and promote local concerted evolution. Although mutagenesis induced by stresses other than direct damage to DNA was previously controversial, evidence for the existence of various stress-induced mutagenesis programs is now overwhelming and widespread. Such mechanisms probably fuel evolution of microbial pathogenesis and antibiotic-resistance, and tumor progression and chemotherapy resistance, all of which occur under stress, driven by mutations. The emerging commonalities in stress-induced-mutation mechanisms provide hope for new therapeutic interventions for all of these processes. PMID:17917874

  16. Color image encryption based on gyrator transform and Arnold transform

    NASA Astrophysics Data System (ADS)

    Sui, Liansheng; Gao, Bo

    2013-06-01

    A color image encryption scheme using the gyrator transform and the Arnold transform is proposed, which has two security levels. In the first level, the color image is separated into three components: red, green and blue, which are normalized and scrambled using the Arnold transform. The green component is combined with the first random phase mask and transformed to an interim using the gyrator transform. The first random phase mask is generated with the sum of the blue component and a logistic map. Similarly, the red component is combined with the second random phase mask and transformed to three-channel-related data. The second random phase mask is generated with the sum of the phase of the interim and an asymmetrical tent map. In the second level, the three-channel-related data are scrambled again and combined with the third random phase mask, generated with the sum of the previous chaotic maps, and then encrypted into a grayscale ciphertext. The encryption result has a stationary white-noise distribution and, to some extent, a camouflage property. In the process of encryption and decryption, the rotation angle of the gyrator transform, the iteration numbers of the Arnold transform, the parameters of the chaotic maps and the generated accompanying phase function serve as encryption keys, and hence enhance the security of the system. Simulation results and security analysis are presented to confirm the security, validity and feasibility of the proposed scheme.
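
    The Arnold-transform scrambling step used in this scheme can be sketched directly; the map below is the standard Arnold cat map on a square channel, and the iteration count stands in for the key parameter mentioned in the abstract.

```python
import numpy as np

def arnold_scramble(channel, iterations):
    """Scramble a square image channel with the Arnold cat map:
    (x, y) -> ((x + y) mod N, (x + 2y) mod N), applied `iterations` times."""
    n = channel.shape[0]
    assert channel.shape[0] == channel.shape[1], "Arnold transform needs a square image"
    out = channel.copy()
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                # The map is a bijection on the N x N grid, so every pixel is relocated
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

green = np.random.default_rng(0).random((64, 64))   # stand-in for a normalized channel
scrambled = arnold_scramble(green, iterations=5)
```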

  17. An adaptive random search for short term generation scheduling with network constraints.

    PubMed

    Marmolejo, J A; Velasco, Jonás; Selley, Héctor J

    2017-01-01

    This paper presents an adaptive random search approach to address short term generation scheduling with network constraints, which determines the startup and shutdown schedules of thermal units over a given planning horizon. In this model, we consider the transmission network through capacity limits and line losses. The mathematical model is stated in the form of a Mixed Integer Non Linear Problem with binary variables. The proposed heuristic is a population-based method that generates a set of new potential solutions via a random search strategy. The random search is based on the Markov Chain Monte Carlo method. The main key of the proposed method is that the noise level of the random search is adaptively controlled in order to explore and exploit the entire search space. In order to improve the solutions, we consider coupling a local search into the random search process. Several test systems are presented to evaluate the performance of the proposed heuristic. We use a commercial optimizer to compare the quality of the solutions provided by the proposed method. The proposed algorithm achieved a significant reduction in computational effort with respect to the full-scale outer approximation commercial solver. Numerical results show the potential and robustness of our approach.
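
    A generic sketch of an adaptive random search with a Metropolis-style acceptance rule and noise-level adaptation, in the spirit described above; the test function, adaptation schedule, and acceptance target are illustrative, and the unit-commitment and network constraints of the actual model are not represented.

```python
import numpy as np

def adaptive_random_search(f, x0, n_iter=5000, step0=1.0, target_accept=0.3, seed=0):
    """Minimize f with a random-walk search whose noise level (step size) adapts so that
    the long-run acceptance rate approaches target_accept; worse moves are sometimes
    accepted, Metropolis-style, to keep exploring the search space."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    best_x, best_f = x.copy(), fx
    step = step0
    for _ in range(n_iter):
        cand = x + rng.normal(scale=step, size=x.shape)
        fc = f(cand)
        accept = fc < fx or rng.random() < np.exp(-(fc - fx) / max(step, 1e-12))
        if accept:
            x, fx = cand, fc
            if fc < best_f:
                best_x, best_f = cand.copy(), fc
        # Adapt the noise level: the step equilibrates where the acceptance rate
        # equals target_accept (grow on accept, shrink on reject).
        step *= np.exp(0.05 * (1.0 - target_accept)) if accept else np.exp(-0.05 * target_accept)
    return best_x, best_f

rosenbrock = lambda v: (1.0 - v[0])**2 + 100.0 * (v[1] - v[0]**2)**2
print(adaptive_random_search(rosenbrock, [-1.5, 2.0]))
```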

  18. Transposon mutagenesis identifies chromatin modifiers cooperating with Ras in thyroid tumorigenesis and detects ATXN7 as a cancer gene.

    PubMed

    Montero-Conde, Cristina; Leandro-Garcia, Luis J; Chen, Xu; Oler, Gisele; Ruiz-Llorente, Sergio; Ryder, Mabel; Landa, Iñigo; Sanchez-Vega, Francisco; La, Konnor; Ghossein, Ronald A; Bajorin, Dean F; Knauf, Jeffrey A; Riordan, Jesse D; Dupuy, Adam J; Fagin, James A

    2017-06-20

    Oncogenic RAS mutations are present in 15-30% of thyroid carcinomas. Endogenous expression of mutant Ras is insufficient to initiate thyroid tumorigenesis in murine models, indicating that additional genetic alterations are required. We used Sleeping Beauty (SB) transposon mutagenesis to identify events that cooperate with Hras G12V in thyroid tumor development. Random genomic integration of SB transposons primarily generated loss-of-function events that significantly increased thyroid tumor penetrance in Tpo-Cre/homozygous FR-Hras G12V mice. The thyroid tumors closely phenocopied the histological features of human RAS-driven, poorly differentiated thyroid cancers. Characterization of transposon insertion sites in the SB-induced tumors identified 45 recurrently mutated candidate cancer genes. These mutation profiles were remarkably concordant with mutated cancer genes identified in a large series of human poorly differentiated and anaplastic thyroid cancers screened by next-generation sequencing using the MSK-IMPACT panel of cancer genes, which we modified to include all SB candidates. The disrupted genes primarily clustered in chromatin remodeling functional nodes and in the PI3K pathway. ATXN7 , a component of a multiprotein complex with histone acetylase activity, scored as a significant SB hit. It was recurrently mutated in advanced human cancers and significantly co-occurred with RAS or NF1 mutations. Expression of ATXN7 mutants cooperated with oncogenic RAS to induce thyroid cell proliferation, pointing to ATXN7 as a previously unrecognized cancer gene.

  19. Transposon mutagenesis identifies chromatin modifiers cooperating with Ras in thyroid tumorigenesis and detects ATXN7 as a cancer gene

    PubMed Central

    Montero-Conde, Cristina; Leandro-Garcia, Luis J.; Chen, Xu; Oler, Gisele; Ruiz-Llorente, Sergio; Ryder, Mabel; Landa, Iñigo; Sanchez-Vega, Francisco; La, Konnor; Ghossein, Ronald A.; Bajorin, Dean F.; Knauf, Jeffrey A.; Riordan, Jesse D.; Dupuy, Adam J.; Fagin, James A.

    2017-01-01

    Oncogenic RAS mutations are present in 15–30% of thyroid carcinomas. Endogenous expression of mutant Ras is insufficient to initiate thyroid tumorigenesis in murine models, indicating that additional genetic alterations are required. We used Sleeping Beauty (SB) transposon mutagenesis to identify events that cooperate with HrasG12V in thyroid tumor development. Random genomic integration of SB transposons primarily generated loss-of-function events that significantly increased thyroid tumor penetrance in Tpo-Cre/homozygous FR-HrasG12V mice. The thyroid tumors closely phenocopied the histological features of human RAS-driven, poorly differentiated thyroid cancers. Characterization of transposon insertion sites in the SB-induced tumors identified 45 recurrently mutated candidate cancer genes. These mutation profiles were remarkably concordant with mutated cancer genes identified in a large series of human poorly differentiated and anaplastic thyroid cancers screened by next-generation sequencing using the MSK-IMPACT panel of cancer genes, which we modified to include all SB candidates. The disrupted genes primarily clustered in chromatin remodeling functional nodes and in the PI3K pathway. ATXN7, a component of a multiprotein complex with histone acetylase activity, scored as a significant SB hit. It was recurrently mutated in advanced human cancers and significantly co-occurred with RAS or NF1 mutations. Expression of ATXN7 mutants cooperated with oncogenic RAS to induce thyroid cell proliferation, pointing to ATXN7 as a previously unrecognized cancer gene. PMID:28584132

  20. Which Melodic Universals Emerge from Repeated Signaling Games? A Note on Lumaca and Baggio (2017) ‡.

    PubMed

    Ravignani, Andrea; Verhoef, Tessa

    2018-01-01

    Music is a peculiar human behavior, yet we still know little as to why and how music emerged. For centuries, the study of music has been the sole prerogative of the humanities. Lately, however, music is being increasingly investigated by psychologists, neuroscientists, biologists, and computer scientists. One approach to studying the origins of music is to empirically test hypotheses about the mechanisms behind this structured behavior. Recent lab experiments show how musical rhythm and melody can emerge via the process of cultural transmission. In particular, Lumaca and Baggio (2017) tested the emergence of a sound system at the boundary between music and language. In this study, participants were given random pairs of signal-meanings; when participants negotiated their meaning and played a "game of telephone" with them, these pairs became more structured and systematic. Over time, the small biases introduced in each artificial transmission step accumulated, displaying quantitative trends, including the emergence, over the course of artificial human generations, of features resembling properties of language and music. In this Note, we highlight the importance of Lumaca and Baggio's experiment, place it in the broader literature on the evolution of language and music, and suggest refinements for future experiments. We conclude that, while psychological evidence for the emergence of proto-musical features is accumulating, complementary work is needed: Mathematical modeling and computer simulations should be used to test the internal consistency of experimentally generated hypotheses and to make new predictions.

  1. Speed in Information Processing with a Computer Driven Visual Display in a Real-time Digital Simulation. M.S. Thesis - Virginia Polytechnic Inst.

    NASA Technical Reports Server (NTRS)

    Kyle, R. G.

    1972-01-01

    Information transfer between the operator and computer-generated display systems is an area where the human factors engineer discovers little useful design data relating human performance to system effectiveness. This study utilized a computer-driven, cathode-ray-tube graphic display to quantify human response speed in a sequential information processing task. The performance criterion was response time to the sixteen cell elements of a square matrix display. A stimulus signal instruction specified selected cell locations by both row and column identification. An equally probable number code, from one to four, was assigned at random to the sixteen cells of the matrix and correspondingly required one of four matched keyed-response alternatives. The display format corresponded to a sequence of diagnostic system maintenance events that enabled the operator to verify prime system status, engage backup redundancy for failed subsystem components, and exercise alternate decision-making judgements. The experimental task bypassed the skilled decision-making element and computer processing time, in order to determine a lower bound on the basic response speed for the given stimulus/response hardware arrangement.

  2. Pseudo-Random Number Generation in Children with High-Functioning Autism and Asperger's Disorder: Further Evidence for a Dissociation in Executive Functioning?

    ERIC Educational Resources Information Center

    Rinehart, Nicole J.; Bradshaw, John L.; Moss, Simon A.; Brereton, Avril V.; Tonge, Bruce J.

    2006-01-01

    The repetitive, stereotyped and obsessive behaviours, which are core diagnostic features of autism, are thought to be underpinned by executive dysfunction. This study examined executive impairment in individuals with autism and Asperger's disorder using a verbal equivalent of an established pseudo-random number generating task. Different patterns…

  3. Theory and implementation of a very high throughput true random number generator in field programmable gate array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yonggang, E-mail: wangyg@ustc.edu.cn; Hui, Cong; Liu, Chong

    The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.

  4. Theory and implementation of a very high throughput true random number generator in field programmable gate array.

    PubMed

    Wang, Yonggang; Hui, Cong; Liu, Chong; Xu, Chao

    2016-04-01

    The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.
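
    A purely software analogue of the entropy-harvesting idea, sampling a jittery oscillator edge at several phases and extracting a parity bit, can be sketched as follows. The jitter model, phase count, and parity extractor are illustrative assumptions, not the FPGA design described in the paper.

    ```python
    import numpy as np

    def jitter_trng_bits(n_bits, period=1.0, jitter_sigma=0.05, phases=8, seed=None):
        """Toy multi-phase jitter sampling: a noisy oscillator edge is compared
        against several equally spaced sampling phases, and the parity of the
        comparisons yields one raw bit per cycle (illustrative only)."""
        rng = np.random.default_rng(seed)
        thresholds = np.linspace(0.0, period, phases, endpoint=False)
        bits = np.empty(n_bits, dtype=np.uint8)
        edge = 0.0
        for i in range(n_bits):
            edge = (edge + period + rng.normal(0.0, jitter_sigma)) % period  # jittery edge
            count = int(np.sum(edge > thresholds))
            bits[i] = count & 1          # parity acts as a cheap entropy extractor
        return bits

    raw = jitter_trng_bits(10_000, seed=1)
    print("ones fraction:", raw.mean())
    ```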

  5. Design of high-throughput and low-power true random number generator utilizing perpendicularly magnetized voltage-controlled magnetic tunnel junction

    NASA Astrophysics Data System (ADS)

    Lee, Hochul; Ebrahimi, Farbod; Amiri, Pedram Khalili; Wang, Kang L.

    2017-05-01

    A true random number generator based on perpendicularly magnetized voltage-controlled magnetic tunnel junction devices (MRNG) is presented. Unlike MTJs used in memory applications where a stable bit is needed to store information, in this work, the MTJ is intentionally designed with small perpendicular magnetic anisotropy (PMA). This allows one to take advantage of the thermally activated fluctuations of its free layer as a stochastic noise source. Furthermore, we take advantage of the voltage dependence of anisotropy to temporarily change the MTJ state into an unstable state when a voltage is applied. Since the MTJ has two energetically stable states, the final state is randomly chosen by thermal fluctuation. The voltage controlled magnetic anisotropy (VCMA) effect is used to generate the metastable state of the MTJ by lowering its energy barrier. The proposed MRNG achieves a high throughput (32 Gbps) by implementing a 64×64 MTJ array into CMOS circuits and executing operations in a parallel manner. Furthermore, the circuit consumes very low energy to generate a random bit (31.5 fJ/bit) due to the high energy efficiency of the voltage-controlled MTJ switching.

  6. Shaping the spectrum of random-phase radar waveforms

    DOEpatents

    Doerry, Armin W.; Marquette, Brandeis

    2017-05-09

    The various technologies presented herein relate to generation of a desired waveform profile in the form of a spectrum of apparently random noise (e.g., white noise or colored noise), but with precise spectral characteristics. Hence, a waveform profile that could be readily determined (e.g., by a spoofing system) is effectively obscured. Obscuration is achieved by dividing the waveform into a series of chips, each with an assigned frequency, wherein the sequence of chips is subsequently randomized. Randomization can be a function of the application of a key to the chip sequence. During processing of the echo pulse, a copy of the randomized transmitted pulse is recovered or regenerated against which the received echo is correlated. Hence, with the echo energy range-compressed in this manner, it is possible to generate a radar image with precise impulse response.
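
    The chip-and-key idea described above can be illustrated with a toy sketch: each chip is assigned one frequency from a grid, the chip order is permuted by a key, and the received echo is range-compressed by correlation against the same key-shuffled waveform. The waveform parameters and the use of a seeded permutation as the "key" are assumptions for illustration only.

    ```python
    import numpy as np

    def random_chip_waveform(n_chips=32, chip_len=64, bandwidth=1.0, key=1234):
        """Sketch of a key-shuffled stepped-frequency waveform: each chip gets one
        frequency from a linear grid, and the chip order is permuted by the key."""
        rng = np.random.default_rng(key)                # the key seeds the permutation
        freqs = np.linspace(-bandwidth / 2, bandwidth / 2, n_chips)
        order = rng.permutation(n_chips)
        t = np.arange(chip_len)
        chips = [np.exp(2j * np.pi * freqs[k] * t / chip_len) for k in order]
        return np.concatenate(chips)

    tx = random_chip_waveform()
    echo = np.roll(tx, 40)                              # toy delayed echo
    compressed = np.abs(np.correlate(echo, tx, mode="full"))  # range compression
    print("peak lag:", compressed.argmax() - (len(tx) - 1))   # recovers the 40-sample delay
    ```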

  7. True random bit generators based on current time series of contact glow discharge electrolysis

    NASA Astrophysics Data System (ADS)

    Rojas, Andrea Espinel; Allagui, Anis; Elwakil, Ahmed S.; Alawadhi, Hussain

    2018-05-01

    Random bit generators (RBGs) in today's digital information and communication systems employ high-rate physical entropy sources such as electronic, photonic, or thermal time-series signals. However, the proper functioning of such physical systems is bound by specific constraints that in some cases make them weak and susceptible to external attacks. In this study, we show that the electrical current time series of contact glow discharge electrolysis, a dc voltage-powered micro-plasma in liquids, can be used for generating random bit sequences over a wide range of high dc voltages. The current signal is quantized into a binary stream by first applying a simple moving average function, which centers the distribution around zero, and then applying logical operations, which enable the binarized data to pass all tests in the industry-standard randomness test suite of the National Institute of Standards and Technology. Furthermore, the robustness of this RBG against power supply attacks has been examined and verified.
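
    A minimal sketch of the described quantization chain might look like this: a simple moving average to center the current signal around zero, a sign threshold, and an XOR step standing in for the paper's unspecified logical operations. The window length and the synthetic test signal are assumptions.

    ```python
    import numpy as np

    def current_to_bits(current, window=32):
        """Subtract a moving-average baseline, threshold the sign, then XOR
        adjacent bits as a simple whitening step (illustrative stand-in)."""
        kernel = np.ones(window) / window
        baseline = np.convolve(current, kernel, mode="same")   # simple moving average
        centered = current - baseline                          # distribution centered at zero
        raw_bits = (centered > 0).astype(np.uint8)
        return raw_bits[:-1] ^ raw_bits[1:]                    # XOR whitening

    noisy_current = np.cumsum(np.random.randn(5000)) + 0.5 * np.random.randn(5000)
    bits = current_to_bits(noisy_current)
    print("bias from 0.5:", abs(bits.mean() - 0.5))
    ```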

  8. A New Quantum Gray-Scale Image Encoding Scheme

    NASA Astrophysics Data System (ADS)

    Naseri, Mosayeb; Abdolmaleky, Mona; Parandin, Fariborz; Fatahi, Negin; Farouk, Ahmed; Nazari, Reza

    2018-02-01

    In this paper, a new quantum image encoding scheme is proposed. The proposed scheme mainly consists of four different encoding algorithms. The idea behind the scheme is a binary key generated randomly for each pixel of the original image. Afterwards, the encoding algorithm to employ is selected according to the qubit pair of the generated randomized binary key. The security analysis of the proposed scheme shows that security is enhanced both by the randomization of the generated binary image key and by altering the gray-scale value of the image pixels using the qubits of the randomized binary key. Simulation of the proposed scheme confirms that the final encoded image cannot be recognized visually. Moreover, the histogram of the encoded image is flatter than that of the original. The Shannon entropies of the final encoded images are significantly higher than that of the original, which indicates that an attacker cannot gain any information about the encoded images. Supported by Kermanshah Branch, Islamic Azad University, Kermanshah, IRAN

  9. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a famous quantum computation model. It is found that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further, we discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is suitable for privacy amplification in quantum key distribution, pseudo-random number generation, and image encryption in terms of various hash tests and randomness tests. This extends the scope of application of quantum computation and quantum information.

  10. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption

    PubMed Central

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a famous quantum computation model. It is found that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further, we discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is suitable for privacy amplification in quantum key distribution, pseudo-random number generation, and image encryption in terms of various hash tests and randomness tests. This extends the scope of application of quantum computation and quantum information. PMID:26823196

  11. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption.

    PubMed

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-29

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a famous quantum computation model. It is found that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further, we discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is suitable for privacy amplification in quantum key distribution, pseudo-random number generation, and image encryption in terms of various hash tests and randomness tests. This extends the scope of application of quantum computation and quantum information.

  12. A Hybrid Color Space for Skin Detection Using Genetic Algorithm Heuristic Search and Principal Component Analysis Technique

    PubMed Central

    2015-01-01

    Color is one of the most prominent features of an image and is used in many skin and face detection applications. Color space transformation is widely used by researchers to improve face and skin detection performance. Despite the substantial research efforts in this area, choosing a proper color space in terms of skin and face classification performance, one which can address issues like illumination variations, various camera characteristics, and diversity in skin color tones, has remained an open issue. This research proposes a new three-dimensional hybrid color space termed SKN by employing the Genetic Algorithm heuristic and Principal Component Analysis to find the optimal representation of human skin color in over seventeen existing color spaces. The Genetic Algorithm heuristic is used to find the optimal color component combination setup in terms of skin detection accuracy, while Principal Component Analysis projects the optimal Genetic Algorithm solution to a less complex dimension. Pixel-wise skin detection was used to evaluate the performance of the proposed color space. We employed four classifiers, including Random Forest, Naïve Bayes, Support Vector Machine, and Multilayer Perceptron, in order to generate the human skin color predictive model. The proposed color space was compared to some existing color spaces and shows superior results in terms of pixel-wise skin detection accuracy. Experimental results show that by using the Random Forest classifier, the proposed SKN color space obtained an average F-score and True Positive Rate of 0.953 and a False Positive Rate of 0.0482, which outperformed the existing color spaces in terms of pixel-wise skin detection accuracy. The results also indicate that among the classifiers used in this study, Random Forest is the most suitable classifier for pixel-wise skin detection applications. PMID:26267377

  13. Sampling large random knots in a confined space

    NASA Astrophysics Data System (ADS)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (such as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.
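
    A minimal sketch of the uniform random polygon model, n vertices drawn uniformly in the unit cube and joined in order, together with a brute-force count of crossings in a planar projection as a rough proxy for diagram complexity, is given below; it does not compute knot determinants or colorings, and the vertex count is an arbitrary choice.

    ```python
    import numpy as np

    def uniform_random_polygon(n, rng=None):
        """Uniform random polygon in the unit cube: n vertices drawn independently
        and uniformly, connected in the order drawn and closed up."""
        rng = rng if rng is not None else np.random.default_rng()
        return rng.random((n, 3))

    def planar_crossings(polygon):
        """Brute-force count of crossings in the xy-projection of the closed polygon."""
        pts = polygon[:, :2]
        n = len(pts)
        edges = [(pts[i], pts[(i + 1) % n]) for i in range(n)]

        def cross2(a, b):
            return a[0] * b[1] - a[1] * b[0]

        def crosses(p, q, r, s):
            d = cross2(q - p, s - r)
            if abs(d) < 1e-12:
                return False
            t = cross2(r - p, s - r) / d
            u = cross2(r - p, q - p) / d
            return 0 < t < 1 and 0 < u < 1

        count = 0
        for i in range(n):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:       # adjacent edges share a vertex
                    continue
                if crosses(*edges[i], *edges[j]):
                    count += 1
        return count

    poly = uniform_random_polygon(50, np.random.default_rng(0))
    print("projected crossings:", planar_crossings(poly))
    ```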

  14. Looking away from faces: influence of high-level visual processes on saccade programming.

    PubMed

    Morand, Stéphanie M; Grosbras, Marie-Hélène; Caldara, Roberto; Harvey, Monika

    2010-03-30

    Human faces capture attention more than other visual stimuli. Here we investigated whether such face-specific biases rely on automatic (involuntary) or voluntary orienting responses. To this end, we used an anti-saccade paradigm, which requires the ability to inhibit a reflexive automatic response and to generate a voluntary saccade in the opposite direction of the stimulus. To control for potential low-level confounds in the eye-movement data, we manipulated the high-level visual properties of the stimuli while normalizing their global low-level visual properties. Eye movements were recorded in 21 participants who performed either pro- or anti-saccades to a face, car, or noise pattern, randomly presented to the left or right of a fixation point. For each trial, a symbolic cue instructed the observer to generate either a pro-saccade or an anti-saccade. We report a significant increase in anti-saccade error rates for faces compared to cars and noise patterns, as well as faster pro-saccades to faces and cars in comparison to noise patterns. These results indicate that human faces induce stronger involuntary orienting responses than other visual objects, i.e., responses that are beyond the control of the observer. Importantly, this involuntary processing cannot be accounted for by global low-level visual factors.

  15. Efficacy of killed whole-parasite vaccines in the prevention of leishmaniasis: a meta-analysis.

    PubMed

    Noazin, Sassan; Khamesipour, Ali; Moulton, Lawrence H; Tanner, Marcel; Nasseri, Kiumarss; Modabber, Farrokh; Sharifi, Iraj; Khalil, E A G; Bernal, Ivan Dario Velez; Antunes, Carlos M F; Smith, Peter G

    2009-07-30

    Despite decades of investigation in countries on three continents, an efficacious vaccine against Leishmania infections has not been developed. Although some indication of protection was observed in some of the controlled trials conducted with "first-generation" whole, inactivated Leishmania parasite vaccines, convincing evidence of protection was lacking. After reviewing all previously published or unpublished randomized, controlled field efficacy clinical trials of prophylactic candidate vaccines, a meta-analysis of qualified trials was conducted to evaluate whether there was some evidence of protection revealed by considering the results of all trials together. The findings indicate that the whole-parasite vaccine candidates tested do not confer significant protection against human leishmaniasis.

  16. Comparison of the effect of aspirin and choline magnesium trisalicylate on thromboxane biosynthesis in human platelets: role of the acetyl moiety.

    PubMed

    Danesh, B J; McLaren, M; Russell, R I; Lowe, G D; Forbes, C D

    1989-01-01

    Parameters of platelet thromboxane biosynthesis were measured 24 h after ingestion of equivalent salicylate doses (500 mg) of aspirin (ASA) and choline magnesium trisalicylate (CMT), a non-acetylated salicylate. In random order, 10 healthy volunteers received these drugs on 2 separate days, 2 weeks apart. While ASA significantly prolonged bleeding time, and decreased plasma thromboxane generation and serum thromboxane B2 levels, CMT failed to produce such effects. Thus CMT, which lacks an acetyl moiety in its structure, has no inhibitory effect on platelet thromboxane biosynthesis, and may therefore be considered safer than ASA for therapeutic use, when inhibition of platelet function can be hazardous.

  17. Automatic Nanodesign Using Evolutionary Techniques

    NASA Technical Reports Server (NTRS)

    Globus, Al; Saini, Subhash (Technical Monitor)

    1998-01-01

    Many problems associated with the development of nanotechnology require custom-designed molecules. We use genetic graph software, a new development, to automatically evolve molecules of interest when only the requirements are known. Genetic graph software designs molecules, and potentially nanoelectronic circuits, given a fitness function that determines which of two molecules is better. A set of molecules, the first generation, is generated at random and then tested with the fitness function. Subsequent generations are created by randomly choosing two parent molecules with a bias towards high-scoring molecules, tearing each molecule in two at random, and mating parts from the mother and father to create two children. This procedure is repeated until a satisfactory molecule is found. An atom-pair similarity test is currently used as the fitness function to evolve molecules similar to existing pharmaceuticals.
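
    The generational loop described above (random first generation, fitness-biased parent choice, tearing each parent in two and mating the pieces) is the classic genetic-algorithm recipe. The sketch below applies it to character strings purely as an illustration, since the actual genetic graph software operates on molecular graphs; the target string, mutation rate, and population size are arbitrary assumptions.

    ```python
    import random

    def evolve(fitness, alphabet, length, pop_size=100, generations=200, seed=0):
        """Generic 'tear and mate' loop: parents are drawn with a bias toward high
        fitness, torn at a random point, and recombined into two children."""
        rng = random.Random(seed)
        pop = ["".join(rng.choice(alphabet) for _ in range(length)) for _ in range(pop_size)]
        for _ in range(generations):
            scored = sorted(pop, key=fitness, reverse=True)
            if fitness(scored[0]) == length:
                break
            next_gen = scored[:2]                          # keep the two best
            while len(next_gen) < pop_size:
                mom, dad = rng.choices(scored, weights=range(pop_size, 0, -1), k=2)
                cut = rng.randrange(1, length)             # tear each parent in two
                kids = [mom[:cut] + dad[cut:], dad[:cut] + mom[cut:]]
                next_gen += ["".join(c if rng.random() > 0.01 else rng.choice(alphabet)
                                     for c in kid) for kid in kids]   # rare mutations
            pop = next_gen[:pop_size]
        return max(pop, key=fitness)

    target = "NANODESIGN"
    best = evolve(lambda s: sum(a == b for a, b in zip(s, target)),
                  alphabet="ABCDEFGHIJKLMNOPQRSTUVWXYZ", length=len(target))
    print(best)
    ```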

  18. High efficiency family shuffling based on multi-step PCR and in vivo DNA recombination in yeast: statistical and functional analysis of a combinatorial library between human cytochrome P450 1A1 and 1A2.

    PubMed

    Abécassis, V; Pompon, D; Truan, G

    2000-10-15

    The design of a family shuffling strategy (CLERY: Combinatorial Libraries Enhanced by Recombination in Yeast) associating PCR-based and in vivo recombination and expression in yeast is described. This strategy was tested using human cytochrome P450 CYP1A1 and CYP1A2 as templates, which share 74% nucleotide sequence identity. Construction of highly shuffled libraries of mosaic structures and reduction of parental gene contamination were two major goals. Library characterization involved multiprobe hybridization on DNA macro-arrays. The statistical analysis of randomly selected clones revealed a high proportion of chimeric genes (86%) and a homogeneous representation of the parental contribution among the sequences (55.8 +/- 2.5% for parental sequence 1A2). A microtiter plate screening system was designed to achieve colorimetric detection of polycyclic hydrocarbon hydroxylation by transformed yeast cells. Full sequences of five randomly picked and five functionally selected clones were analyzed. Results confirmed the shuffling efficiency and allowed calculation of the average length of sequence exchange and mutation rates. The efficient and statistically representative generation of mosaic structures by this type of family shuffling in a yeast expression system constitutes a novel and promising tool for structure-function studies and tuning enzymatic activities of multicomponent eucaryote complexes involving non-soluble enzymes.

  19. Early vision and focal attention

    NASA Astrophysics Data System (ADS)

    Julesz, Bela

    1991-07-01

    At the thirty-year anniversary of the introduction of the technique of computer-generated random-dot stereograms and random-dot cinematograms into psychology, the impact of the technique on brain research and on the study of artificial intelligence is reviewed. The main finding - that stereoscopic depth perception (stereopsis), motion perception, and preattentive texture discrimination are basically bottom-up processes, which occur without the help of the top-down processes of cognition and semantic memory - greatly simplifies the study of these processes of early vision and permits the linking of human perception with monkey neurophysiology. Particularly interesting are the unexpected findings that stereopsis (assumed to be local) is a global process, while texture discrimination (assumed to be a global process, governed by statistics) is local, based on some conspicuous local features (textons). It is shown that the top-down process of "shape (depth) from shading" does not affect stereopsis, and some of the models of machine vision are evaluated. The asymmetry effect of human texture discrimination is discussed, together with recent nonlinear spatial filter models and a novel extension of the texton theory that can cope with the asymmetry problem. This didactic review attempts to introduce the physicist to the field of psychobiology and its problems - including metascientific problems of brain research, problems of scientific creativity, the state of artificial intelligence research (including connectionist neural networks) aimed at modeling brain activity, and the fundamental role of focal attention in mental events.

  20. Statistical aspects of evolution under natural selection, with implications for the advantage of sexual reproduction.

    PubMed

    Crouch, Daniel J M

    2017-10-27

    The prevalence of sexual reproduction remains mysterious, as it poses clear evolutionary drawbacks compared to reproducing asexually. Several possible explanations exist, with one of the most likely being that finite population size causes linkage disequilibria to be generated at random and to impede the progress of natural selection, and that these are eroded by recombination via sexual reproduction. Previous investigations have either analysed this phenomenon in detail for small numbers of loci, or performed population simulations for many loci. Here we present a quantitative genetic model for fitness, based on the Price Equation, in order to examine the theoretical consequences of randomly generated linkage disequilibria when there are many loci. In addition, most previous work has been concerned with the long-term consequences of deleterious linkage disequilibria for population fitness. The expected change in mean fitness between consecutive generations, a measure of short-term evolutionary success, is shown under random environmental influences to be related to the autocovariance in mean fitness between the generations, capturing the effects of stochastic forces such as genetic drift. Interaction between genetic drift and natural selection, due to randomly generated linkage disequilibria, is demonstrated to be one possible source of mean fitness autocovariance. This suggests a possible role for sexual reproduction in reducing the negative effects of genetic drift, thereby improving the short-term efficacy of natural selection. Copyright © 2017 Elsevier Ltd. All rights reserved.
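
    For reference, the quantitative genetic model mentioned above builds on the Price Equation; its standard textbook form (with w_i the fitness and z_i the trait value of lineage i, and bars denoting population means) is reproduced below. This is the general identity, not the paper's specific derivation.

    ```latex
    \Delta \bar{z}
      = \frac{\operatorname{Cov}(w_i, z_i)}{\bar{w}}
      + \frac{\operatorname{E}\!\left[w_i \, \Delta z_i\right]}{\bar{w}}
    ```

    The first (selection) term captures the covariance between fitness and the trait; the second (transmission) term captures average within-lineage change.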

  1. Design and implementation of the NaI(Tl)/CsI(Na) detectors output signal generator

    NASA Astrophysics Data System (ADS)

    Zhou, Xu; Liu, Cong-Zhan; Zhao, Jian-Ling; Zhang, Fei; Zhang, Yi-Fei; Li, Zheng-Wei; Zhang, Shuo; Li, Xu-Fang; Lu, Xue-Feng; Xu, Zhen-Ling; Lu, Fang-Jun

    2014-02-01

    We designed and implemented a signal generator that can simulate the output of the NaI(Tl)/CsI(Na) detectors' pre-amplifier onboard the Hard X-ray Modulation Telescope (HXMT). By developing FPGA (Field Programmable Gate Array) firmware in the VHDL language and adding a random constituent, we produced a double-exponential random pulse signal generator. The statistical distribution of the signal amplitude is programmable, and the time intervals between adjacent signals statistically follow a negative exponential distribution.
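
    A software analogue of such a generator, double-exponential pulse shapes, a programmable amplitude distribution, and negative exponential intervals between pulses, can be sketched as below. All time constants, rates, and the gamma amplitude law are illustrative assumptions, not the HXMT pre-amplifier parameters.

    ```python
    import numpy as np

    def detector_pulse_train(n_pulses=100, rate=1e4, tau_rise=1e-6, tau_fall=1e-5,
                             fs=1e7, seed=0):
        """Double-exponential pulses with programmable amplitudes and exponentially
        distributed arrival intervals (all parameter values are illustrative)."""
        rng = np.random.default_rng(seed)
        arrivals = np.cumsum(rng.exponential(1.0 / rate, n_pulses))   # Poisson-like arrivals
        amplitudes = rng.standard_gamma(2.0, n_pulses)                # programmable amplitude law
        t = np.arange(0.0, arrivals[-1] + 10 * tau_fall, 1.0 / fs)
        signal = np.zeros_like(t)
        for t0, a in zip(arrivals, amplitudes):
            m = t >= t0
            dt = t[m] - t0
            signal[m] += a * (np.exp(-dt / tau_fall) - np.exp(-dt / tau_rise))  # double exponential
        return t, signal

    t, sig = detector_pulse_train()
    print(len(sig), sig.max())
    ```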

  2. Evidence for a bimodal distribution in human communication.

    PubMed

    Wu, Ye; Zhou, Changsong; Xiao, Jinghua; Kurths, Jürgen; Schellnhuber, Hans Joachim

    2010-11-02

    Interacting human activities underlie the patterns of many social, technological, and economic phenomena. Here we present clear empirical evidence from Short Message correspondence that observed human actions are the result of the interplay of three basic ingredients: Poisson initiation of tasks and decision making for task execution in individual humans as well as interaction among individuals. This interplay leads to new types of interevent time distribution, neither completely Poisson nor power-law, but a bimodal combination of them. We show that the events can be separated into independent bursts which are generated by frequent mutual interactions in short times following random initiations of communications in longer times by the individuals. We introduce a minimal model of two interacting priority queues incorporating the three basic ingredients which fits well the distributions using the parameters extracted from the empirical data. The model can also embrace a range of realistic social interacting systems such as e-mail and letter communications when taking the time scale of processing into account. Our findings provide insight into various human activities both at the individual and network level. Our analysis and modeling of bimodal activity in human communication from the viewpoint of the interplay between processes of different time scales is likely to shed light on bimodal phenomena in other complex systems, such as interevent times in earthquakes, rainfall, forest fire, and economic systems, etc.

  3. Evidence for a bimodal distribution in human communication

    PubMed Central

    Wu, Ye; Zhou, Changsong; Xiao, Jinghua; Kurths, Jürgen; Schellnhuber, Hans Joachim

    2010-01-01

    Interacting human activities underlie the patterns of many social, technological, and economic phenomena. Here we present clear empirical evidence from Short Message correspondence that observed human actions are the result of the interplay of three basic ingredients: Poisson initiation of tasks and decision making for task execution in individual humans as well as interaction among individuals. This interplay leads to new types of interevent time distribution, neither completely Poisson nor power-law, but a bimodal combination of them. We show that the events can be separated into independent bursts which are generated by frequent mutual interactions in short times following random initiations of communications in longer times by the individuals. We introduce a minimal model of two interacting priority queues incorporating the three basic ingredients which fits well the distributions using the parameters extracted from the empirical data. The model can also embrace a range of realistic social interacting systems such as e-mail and letter communications when taking the time scale of processing into account. Our findings provide insight into various human activities both at the individual and network level. Our analysis and modeling of bimodal activity in human communication from the viewpoint of the interplay between processes of different time scales is likely to shed light on bimodal phenomena in other complex systems, such as interevent times in earthquakes, rainfall, forest fire, and economic systems, etc. PMID:20959414

  4. Anisotropic microfibrous scaffolds enhance the organization and function of cardiomyocytes derived from induced pluripotent stem cells.

    PubMed

    Wanjare, Maureen; Hou, Luqia; Nakayama, Karina H; Kim, Joseph J; Mezak, Nicholas P; Abilez, Oscar J; Tzatzalos, Evangeline; Wu, Joseph C; Huang, Ngan F

    2017-07-25

    Engineering of myocardial tissue constructs is a promising approach for treatment of coronary heart disease. To engineer myocardial tissues that better mimic the highly ordered physiological arrangement and function of native cardiomyocytes, we generated electrospun microfibrous polycaprolactone scaffolds with either randomly oriented (14 μm fiber diameter) or parallel-aligned (7 μm fiber diameter) microfiber arrangement and co-seeded the scaffolds with human induced pluripotent stem cell-derived cardiomyocytes (iCMs) and endothelial cells (iECs) for up to 12 days after iCM seeding. Here we demonstrated that aligned microfibrous scaffolds induced iCM alignment along the direction of the aligned microfibers after 2 days of iCM seeding, as well as promoted greater iCM maturation by increasing the sarcomeric length and gene expression of myosin heavy chain adult isoform (MYH7), in comparison to randomly oriented scaffolds. Furthermore, the benefit of scaffold anisotropy was evident in the significantly higher maximum contraction velocity of iCMs on the aligned scaffolds, compared to randomly oriented scaffolds, at 12 days of culture. Co-seeding of iCMs with iECs led to reduced contractility, compared to when iCMs were seeded alone. These findings demonstrate a dominant role of scaffold anisotropy in engineering cardiovascular tissues that maintain iCM organization and contractile function.

  5. Prediction of truly random future events using analysis of prestimulus electroencephalographic data

    NASA Astrophysics Data System (ADS)

    Baumgart, Stephen L.; Franklin, Michael S.; Jimbo, Hiroumi K.; Su, Sharon J.; Schooler, Jonathan

    2017-05-01

    Our hypothesis is that pre-stimulus physiological data can be used to predict truly random events tied to perceptual stimuli (e.g., lights and sounds). Our experiment presents light and sound stimuli to a passive human subject while recording electrocortical potentials using a 32-channel Electroencephalography (EEG) system. For every trial a quantum random number generator (qRNG) chooses from three possible selections with equal probability: a light stimulus, a sound stimulus, and no stimulus. Time epochs are defined preceding and following each stimulus, for which mean potentials were computed across all trials for the three possible stimulus types. Data from three regions of the brain are examined. In all three regions mean potential for light stimuli was generally enhanced relative to baseline during the period starting approximately 2 seconds before the stimulus. For sound stimuli, mean potential decreased relative to baseline during the period starting approximately 2 seconds before the stimulus. These changes from baseline may indicate the presence of evoked potentials arising from the stimulus. A P200 peak was observed in data recorded from frontal electrodes. The P200 is a well-known potential arising from the brain's processing of visual stimuli and its presence represents a replication of a known neurological phenomenon.

  6. Fast and secure encryption-decryption method based on chaotic dynamics

    DOEpatents

    Protopopescu, Vladimir A.; Santoro, Robert T.; Tolliver, Johnny S.

    1995-01-01

    A method and system for the secure encryption of information. The method comprises the steps of dividing a message of length L into its character components; generating m chaotic iterates from m independent chaotic maps; producing an "initial" value based upon the m chaotic iterates; transforming the "initial" value to create a pseudo-random integer; repeating the steps of generating, producing and transforming until a pseudo-random integer sequence of length L is created; and encrypting the message as ciphertext based upon the pseudo-random integer sequence. A system for accomplishing the invention is also provided.
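
    The outlined steps (iterate m independent chaotic maps, combine the iterates into an "initial" value, map it to a pseudo-random integer, repeat to the message length, then encrypt) can be sketched with logistic maps and an XOR stream cipher. The specific maps, the combination rule, and the byte mapping here are assumptions, not the patented method.

    ```python
    def chaotic_keystream(length, seeds=(0.123, 0.456, 0.789), r=3.9999):
        """m independent logistic maps are iterated, their states combined into one
        value per step, and that value mapped to a pseudo-random byte (the
        combination rule here is an assumption)."""
        states = list(seeds)
        stream = bytearray()
        for _ in range(length):
            states = [r * x * (1 - x) for x in states]          # m chaotic iterates
            combined = sum(states) % 1.0                        # "initial" value
            stream.append(int(combined * 256) & 0xFF)           # pseudo-random integer
        return bytes(stream)

    def chaotic_xor(message: bytes, seeds=(0.123, 0.456, 0.789)) -> bytes:
        key = chaotic_keystream(len(message), seeds)
        return bytes(m ^ k for m, k in zip(message, key))       # encrypt or decrypt

    ciphertext = chaotic_xor(b"attack at dawn")
    assert chaotic_xor(ciphertext) == b"attack at dawn"         # XOR is its own inverse
    ```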

  7. Computer methods for sampling from the gamma distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, M.E.; Tadikamalla, P.R.

    1978-01-01

    Considerable attention has recently been directed at developing ever faster algorithms for generating gamma random variates on digital computers. This paper surveys the current state of the art including the leading algorithms of Ahrens and Dieter, Atkinson, Cheng, Fishman, Marsaglia, Tadikamalla, and Wallace. General random variate generation techniques are explained with reference to these gamma algorithms. Computer simulation experiments on IBM and CDC computers are reported.
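
    As a compact illustration of gamma variate generation by acceptance-rejection, the sketch below uses the Marsaglia-Tsang method; this particular algorithm postdates the 1978 survey and stands in only as a representative of the family of methods discussed (Ahrens-Dieter, Cheng, and the others are not reproduced here).

    ```python
    import math, random

    def gamma_variate(alpha, rng=random):
        """Marsaglia-Tsang rejection sampler, with the standard boosting trick
        for alpha < 1 (shown only as a compact modern illustration)."""
        if alpha < 1.0:
            return gamma_variate(alpha + 1.0, rng) * rng.random() ** (1.0 / alpha)
        d = alpha - 1.0 / 3.0
        c = 1.0 / math.sqrt(9.0 * d)
        while True:
            x = rng.gauss(0.0, 1.0)
            v = (1.0 + c * x) ** 3
            if v <= 0.0:
                continue
            u = rng.random()
            if math.log(u) < 0.5 * x * x + d - d * v + d * math.log(v):
                return d * v

    sample = [gamma_variate(2.5) for _ in range(10_000)]
    print(sum(sample) / len(sample))   # should be near alpha = 2.5
    ```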

  8. Preference for point-light human biological motion in newborns: contribution of translational displacement.

    PubMed

    Bidet-Ildei, Christel; Kitromilides, Elenitsa; Orliaguet, Jean-Pierre; Pavlova, Marina; Gentaz, Edouard

    2014-01-01

    In human newborns, spontaneous visual preference for biological motion is reported to occur at birth, but the factors underpinning this preference are still in debate. Using a standard visual preferential looking paradigm, 4 experiments were carried out in 3-day-old human newborns to assess the influence of translational displacement on perception of human locomotion. Experiment 1 shows that human newborns prefer a point-light walker display representing human locomotion as if on a treadmill over random motion. However, no preference for biological movement is observed in Experiment 2 when both biological and random motion displays are presented with translational displacement. Experiments 3 and 4 show that newborns exhibit preference for translated biological motion (Experiment 3) and random motion (Experiment 4) displays over the same configurations moving without translation. These findings reveal that human newborns have a preference for the translational component of movement independently of the presence of biological kinematics. The outcome suggests that translation constitutes the first step in development of visual preference for biological motion. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  9. A Random Variable Transformation Process.

    ERIC Educational Resources Information Center

    Scheuermann, Larry

    1989-01-01

    Provides a short BASIC program, RANVAR, which generates random variates for various theoretical probability distributions. The seven variates include: uniform, exponential, normal, binomial, Poisson, Pascal, and triangular. (MVL)
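
    Two of the listed variates, exponential and triangular, can be generated by the inverse-transform method in a few lines; this Python sketch only mirrors the idea of the BASIC program, not its code.

    ```python
    import math, random

    def exponential_variate(rate):
        """Inverse-transform sampling for the exponential distribution."""
        u = random.random()
        return -math.log(1.0 - u) / rate

    def triangular_variate(a, c, b):
        """Triangular(a, mode c, b) via the inverse CDF."""
        u = random.random()
        if u < (c - a) / (b - a):
            return a + math.sqrt(u * (b - a) * (c - a))
        return b - math.sqrt((1.0 - u) * (b - a) * (b - c))

    print(exponential_variate(2.0), triangular_variate(0.0, 0.3, 1.0))
    ```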

  10. The conditional power of randomization tests for single-case effect sizes in designs with randomized treatment order: A Monte Carlo simulation study.

    PubMed

    Michiels, Bart; Heyvaert, Mieke; Onghena, Patrick

    2018-04-01

    The conditional power (CP) of the randomization test (RT) was investigated in a simulation study in which three different single-case effect size (ES) measures were used as the test statistics: the mean difference (MD), the percentage of nonoverlapping data (PND), and the nonoverlap of all pairs (NAP). Furthermore, we studied the effect of the experimental design on the RT's CP for three different single-case designs with rapid treatment alternation: the completely randomized design (CRD), the randomized block design (RBD), and the restricted randomized alternation design (RRAD). As a third goal, we evaluated the CP of the RT for three types of simulated data: data generated from a standard normal distribution, data generated from a uniform distribution, and data generated from a first-order autoregressive Gaussian process. The results showed that the MD and NAP perform very similarly in terms of CP, whereas the PND performs substantially worse. Furthermore, the RRAD yielded marginally higher power in the RT, followed by the CRD and then the RBD. Finally, the power of the RT was almost unaffected by the type of the simulated data. On the basis of the results of the simulation study, we recommend at least 20 measurement occasions for single-case designs with a randomized treatment order that are to be evaluated with an RT using a 5% significance level. Furthermore, we do not recommend use of the PND, because of its low power in the RT.
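
    A minimal randomization test with the mean-difference (MD) statistic for a completely randomized treatment order might look like the sketch below; the data-generating step, effect size, and number of permutations are illustrative assumptions.

    ```python
    import numpy as np

    def randomization_test_md(scores, labels, n_perm=5000, seed=0):
        """Randomization test with the mean-difference statistic: the treatment
        labels are repeatedly re-randomized (CRD-style) and the observed MD is
        compared against the resulting reference distribution."""
        rng = np.random.default_rng(seed)
        scores, labels = np.asarray(scores, float), np.asarray(labels)
        observed = scores[labels == "B"].mean() - scores[labels == "A"].mean()
        count = 0
        for _ in range(n_perm):
            shuffled = rng.permutation(labels)             # re-randomize treatment order
            md = scores[shuffled == "B"].mean() - scores[shuffled == "A"].mean()
            count += abs(md) >= abs(observed)
        return observed, (count + 1) / (n_perm + 1)        # Monte Carlo p-value

    rng = np.random.default_rng(1)
    labels = rng.permutation(["A"] * 10 + ["B"] * 10)      # 20 measurement occasions
    scores = np.where(labels == "B", 1.0, 0.0) + rng.normal(0, 1, 20)
    md, p = randomization_test_md(scores, labels)
    print(f"MD={md:.2f}, p={p:.3f}")
    ```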

  11. Threshold matrix for digital halftoning by genetic algorithm optimization

    NASA Astrophysics Data System (ADS)

    Alander, Jarmo T.; Mantere, Timo J.; Pyylampi, Tero

    1998-10-01

    Digital halftoning is used in both low- and high-resolution, high-quality printing technologies. Our method is designed mainly for low-resolution ink jet marking machines, to produce both gray-tone and color images. The main problem with digital halftoning is pink noise caused by the human eye's visual transfer function. To compensate for this, the random dot patterns used are optimized to contain more blue than pink noise. Several such dot pattern generator threshold matrices have been created automatically by using genetic algorithm optimization, a non-deterministic global optimization method imitating natural evolution and genetics. A hybrid of a genetic algorithm and a search method based on local backtracking was developed, together with several fitness functions evaluating dot patterns on rectangular grids. By modifying the fitness function, a family of dot generators results, each with its particular statistical features. Several versions of genetic algorithms, backtracking, and fitness functions were tested to find a reasonable combination. The generated threshold matrices have been tested by simulating a set of test images using the Khoros image processing system. Even though the work was focused on developing low-resolution marking technology, the resulting family of dot generators can also be applied in other halftoning application areas, including high-resolution printing technology.
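
    Once a threshold matrix exists, halftoning itself is ordered dithering: the matrix is tiled across the image and each pixel is compared with its local threshold. The sketch below shows that application step with a random stand-in matrix; the GA optimization of the matrix, which is the paper's contribution, is not reproduced here.

    ```python
    import numpy as np

    def halftone(gray, threshold_matrix):
        """Ordered dithering: tile the threshold matrix over the image and turn a
        pixel on when its gray level exceeds the local threshold."""
        h, w = gray.shape
        th, tw = threshold_matrix.shape
        tiled = np.tile(threshold_matrix, (h // th + 1, w // tw + 1))[:h, :w]
        return (gray > tiled).astype(np.uint8)

    rng = np.random.default_rng(0)
    matrix = rng.permutation(64).reshape(8, 8) / 64.0     # stand-in for a GA-optimized matrix
    ramp = np.tile(np.linspace(0, 1, 256), (64, 1))       # horizontal gray ramp test image
    dots = halftone(ramp, matrix)
    print("ink coverage:", dots.mean())
    ```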

  12. Autologous blood cell therapies from pluripotent stem cells

    PubMed Central

    Lengerke, Claudia; Daley, George Q.

    2010-01-01

    The discovery of human embryonic stem cells (hESCs) raised promises for a universal resource for cell-based therapies in regenerative medicine. Recently, fast-paced progress has been made towards the generation of pluripotent stem cells (PSCs) amenable to clinical applications, culminating in reprogramming of adult somatic cells to autologous PSCs that can be indefinitely expanded in vitro. However, besides the efficient generation of bona fide, clinically safe PSCs (e.g. without the use of oncoproteins and gene transfer based on viruses inserting randomly into the genome), a major challenge in the field remains how to efficiently differentiate PSCs to specific lineages and how to select for cells that will function normally upon transplantation in adults. In this review, we analyse the in vitro differentiation potential of PSCs to the hematopoietic lineage, discussing blood cell types that can currently be obtained, limitations in the derivation of adult-type HSCs, and prospects for clinical application of PSC-derived blood cells. PMID:19910091

  13. [Screening specific recognition motif of RNA-binding proteins by SELEX in combination with next-generation sequencing technique].

    PubMed

    Zhang, Lu; Xu, Jinhao; Ma, Jinbiao

    2016-07-25

    RNA-binding proteins exert important biological functions by specifically recognizing RNA motifs. SELEX (Systematic evolution of ligands by exponential enrichment), an in vitro selection method, can obtain consensus motifs with high affinity and specificity for many target molecules from DNA or RNA libraries. Here, we combined SELEX with next-generation sequencing to study protein-RNA interactions in vitro. A pool of RNAs with 20 bp random sequences was transcribed from a T7 promoter, and the target protein was cloned into a plasmid containing an SBP tag, which can be captured by streptavidin beads. Through only one cycle, the specific RNA motif can be obtained, which dramatically improves the selection efficiency. Using this method, we found that the human hnRNP A1 RRMs domain (UP1 domain) binds RNA motifs containing AGG and AG sequences. The EMSA experiment confirmed that the hnRNP A1 RRMs could bind the obtained RNA motif. Taken together, this approach provides a rapid and effective method to study the RNA-binding specificity of proteins.

  14. Current Progress of Genetically Engineered Pig Models for Biomedical Research

    PubMed Central

    Gün, Gökhan

    2014-01-01

    The first transgenic pigs were generated for agricultural purposes about three decades ago. Since then, the micromanipulation techniques for pig oocytes and embryos have expanded from pronuclear injection of foreign DNA to somatic cell nuclear transfer, intracytoplasmic sperm injection-mediated gene transfer, lentiviral transduction, and cytoplasmic injection. Mechanistically, the passive transgenesis approach based on random integration of foreign DNA developed into active genetic engineering techniques based on the transient activity of ectopic enzymes, such as transposases, recombinases, and programmable nucleases. Whole-genome sequencing and annotation of advanced genome maps of the pig complemented these developments. The full implementation of these tools promises to immensely increase the efficiency and, in parallel, to reduce the costs of generating genetically engineered pigs. Today, the major application of genetically engineered pigs is found in the field of biomedical disease modeling. It is anticipated that genetically engineered pigs will increasingly be used in biomedical research, since this model shows several similarities to humans with regard to physiology, metabolism, genome organization, pathology, and aging. PMID:25469311

  15. Scalable hierarchical PDE sampler for generating spatially correlated random fields using nonmatching meshes: Scalable hierarchical PDE sampler using nonmatching meshes

    DOE PAGES

    Osborn, Sarah; Zulian, Patrick; Benson, Thomas; ...

    2018-01-30

    This work describes a domain embedding technique between two nonmatching meshes used for generating realizations of spatially correlated random fields with applications to large-scale sampling-based uncertainty quantification. The goal is to apply the multilevel Monte Carlo (MLMC) method for the quantification of output uncertainties of PDEs with random input coefficients on general and unstructured computational domains. We propose a highly scalable, hierarchical sampling method to generate realizations of a Gaussian random field on a given unstructured mesh by solving a reaction–diffusion PDE with a stochastic right-hand side. The stochastic PDE is discretized using the mixed finite element method on an embedded domain with a structured mesh, and then, the solution is projected onto the unstructured mesh. This work describes implementation details on how to efficiently transfer data from the structured and unstructured meshes at coarse levels, assuming that this can be done efficiently on the finest level. We investigate the efficiency and parallel scalability of the technique for the scalable generation of Gaussian random fields in three dimensions. An application of the MLMC method is presented for quantifying uncertainties of subsurface flow problems. Here, we demonstrate the scalability of the sampling method with nonmatching mesh embedding, coupled with a parallel forward model problem solver, for large-scale 3D MLMC simulations with up to 1.9·10^9 unknowns.

  16. Scalable hierarchical PDE sampler for generating spatially correlated random fields using nonmatching meshes: Scalable hierarchical PDE sampler using nonmatching meshes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osborn, Sarah; Zulian, Patrick; Benson, Thomas

    This work describes a domain embedding technique between two nonmatching meshes used for generating realizations of spatially correlated random fields with applications to large-scale sampling-based uncertainty quantification. The goal is to apply the multilevel Monte Carlo (MLMC) method for the quantification of output uncertainties of PDEs with random input coefficients on general and unstructured computational domains. We propose a highly scalable, hierarchical sampling method to generate realizations of a Gaussian random field on a given unstructured mesh by solving a reaction–diffusion PDE with a stochastic right-hand side. The stochastic PDE is discretized using the mixed finite element method on an embedded domain with a structured mesh, and then, the solution is projected onto the unstructured mesh. This work describes implementation details on how to efficiently transfer data from the structured and unstructured meshes at coarse levels, assuming that this can be done efficiently on the finest level. We investigate the efficiency and parallel scalability of the technique for the scalable generation of Gaussian random fields in three dimensions. An application of the MLMC method is presented for quantifying uncertainties of subsurface flow problems. Here, we demonstrate the scalability of the sampling method with nonmatching mesh embedding, coupled with a parallel forward model problem solver, for large-scale 3D MLMC simulations with up to 1.9·10^9 unknowns.

  17. All about Eve: Secret Sharing using Quantum Effects

    NASA Technical Reports Server (NTRS)

    Jackson, Deborah J.

    2005-01-01

    This document discusses the nature of light (including classical light and photons), encryption, quantum key distribution (QKD), light polarization, and beamsplitters, and their application to information communication. A quantum of light, called a photon, represents the smallest possible subdivision of radiant energy. The QKD key generation sequence is outlined: the receiver broadcasts an initial signal indicating reception availability; the sender provides timing pulses as a reference for gated photon detection; the sender generates photons with random polarization while the receiver detects them with random polarization; and the two parties communicate via a data link to mutually establish random keys. The QKD network vision includes inter-SATCOM, point-to-point Gnd Fiber, and SATCOM-fiber nodes. QKD offers an unconditionally secure method of exchanging encryption keys. Ongoing research will focus on how to increase the key generation rate.
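
    The key-generation sequence summarized above follows the familiar BB84 pattern of random bits, random polarization bases, and sifting of the positions where the bases match. The toy simulation below illustrates only that sifting logic, with no channel noise, eavesdropper, or privacy amplification; photon counts and basis labels are arbitrary.

    ```python
    import random

    def bb84_sift(n_photons=1000, seed=0):
        """Toy BB84-style sifting: the sender picks random bits and random bases,
        the receiver measures in random bases, and both keep only the positions
        where the bases happened to match."""
        rng = random.Random(seed)
        send_bits  = [rng.randint(0, 1) for _ in range(n_photons)]
        send_basis = [rng.choice("+x") for _ in range(n_photons)]   # rectilinear / diagonal
        recv_basis = [rng.choice("+x") for _ in range(n_photons)]
        # matching bases return the sent bit; mismatched bases give a random result
        recv_bits = [b if sb == rb else rng.randint(0, 1)
                     for b, sb, rb in zip(send_bits, send_basis, recv_basis)]
        sifted = [(s, r) for s, r, sb, rb in zip(send_bits, recv_bits, send_basis, recv_basis)
                  if sb == rb]
        key_alice = [s for s, _ in sifted]
        key_bob   = [r for _, r in sifted]
        return key_alice, key_bob

    alice, bob = bb84_sift()
    print(len(alice), alice == bob)   # roughly half the photons survive sifting; keys agree
    ```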

  18. A dose optimization method for electron radiotherapy using randomized aperture beams

    NASA Astrophysics Data System (ADS)

    Engel, Konrad; Gauer, Tobias

    2009-09-01

    The present paper describes the entire optimization process of creating a radiotherapy treatment plan for advanced electron irradiation. Special emphasis is devoted to the selection of beam incidence angles and beam energies as well as to the choice of appropriate subfields generated by a refined version of intensity segmentation and a novel random aperture approach. The algorithms have been implemented in a stand-alone programme using dose calculations from a commercial treatment planning system. For this study, the treatment planning system Pinnacle from Philips has been used and connected to the optimization programme using an ASCII interface. Dose calculations in Pinnacle were performed by Monte Carlo simulations for a remote-controlled electron multileaf collimator (MLC) from Euromechanics. As a result, treatment plans for breast cancer patients could be significantly improved when using randomly generated aperture beams. The combination of beams generated through segmentation and randomization achieved the best results in terms of target coverage and sparing of critical organs. The treatment plans could be further improved by use of a field reduction algorithm. Without a relevant loss in dose distribution, the total number of MLC fields and monitor units could be reduced by up to 20%. In conclusion, using randomized aperture beams is a promising new approach in radiotherapy and exhibits potential for further improvements in dose optimization through a combination of randomized electron and photon aperture beams.

  19. Quasirandom geometric networks from low-discrepancy sequences

    NASA Astrophysics Data System (ADS)

    Estrada, Ernesto

    2017-08-01

    We define quasirandom geometric networks using low-discrepancy sequences, such as Halton, Sobol, and Niederreiter. The networks are built in d dimensions by considering the d-tuples of digits generated by these sequences as the coordinates of the vertices of the networks in a d-dimensional unit hypercube I^d. Then, two vertices are connected by an edge if they are at a distance smaller than a connection radius. We investigate computationally 11 network-theoretic properties of two-dimensional quasirandom networks and compare them with analogous random geometric networks. We also study their degree distribution and their spectral density distributions. We conclude from this intensive computational study that in terms of the uniformity of the distribution of the vertices in the unit square, the quasirandom networks look more random than the random geometric networks. We include an analysis of potential strategies for generating higher-dimensional quasirandom networks, where it is known that some of the low-discrepancy sequences are highly correlated. In this respect, we conclude that up to dimension 20, the use of scrambling, skipping, and leaping strategies generates quasirandom networks with the desired uniformity properties. Finally, we consider a diffusive process taking place on the nodes and edges of the quasirandom and random geometric graphs. We show that the diffusion time is shorter in the quasirandom graphs as a consequence of their larger structural homogeneity. In the random geometric graphs the diffusion produces clusters of concentration that make the process slower. Such clusters are a direct consequence of the heterogeneous and irregular distribution of the nodes in the unit square on which the generation of random geometric graphs is based.
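
    A minimal construction of a quasirandom geometric network, Halton-sequence coordinates in the unit square plus a connection radius, alongside an ordinary random geometric graph for comparison, can be sketched with SciPy as below; the radius and network sizes are arbitrary illustrative choices, not the paper's settings.

    ```python
    import numpy as np
    from scipy.stats import qmc
    from scipy.spatial import cKDTree

    def quasirandom_geometric_graph(n, radius, d=2, seed=0):
        """Vertices from a Halton low-discrepancy sequence in the unit hypercube;
        two vertices are joined when they lie within the connection radius."""
        points = qmc.Halton(d=d, scramble=True, seed=seed).random(n)
        edges = cKDTree(points).query_pairs(r=radius)
        return points, edges

    def random_geometric_graph(n, radius, d=2, seed=0):
        """Ordinary random geometric graph for comparison (uniform i.i.d. vertices)."""
        points = np.random.default_rng(seed).random((n, d))
        edges = cKDTree(points).query_pairs(r=radius)
        return points, edges

    _, e_quasi = quasirandom_geometric_graph(500, 0.08)
    _, e_rand = random_geometric_graph(500, 0.08)
    print(len(e_quasi), len(e_rand))   # similar edge counts; Halton vertices spread more evenly
    ```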

  20. Resveratrol and Clinical Trials: The Crossroad from In Vitro Studies to Human Evidence

    PubMed Central

    Tomé-Carneiro, Joao; Larrosa, Mar; González-Sarrías, Antonio; Tomás-Barberán, Francisco A.; García-Conesa, María Teresa; Espín, Juan Carlos

    2013-01-01

    Resveratrol (3,5,4’-trihydroxy-trans-stilbene) is a non-flavonoid polyphenol that may be present in a limited number of food-stuffs such as grapes and red wine. Resveratrol has been reported to exert a plethora of health benefits through many different mechanisms of action. This versatility and presence in the human diet have drawn the worldwide attention of many research groups over the past twenty years, which has resulted in a huge output of in vitro and animal (preclinical) studies. In line with this expectation, many resveratrol-based nutraceuticals are consumed all over the world with questionable clinical/scientific support. In fact, the confirmation of these benefits in humans through randomized clinical trials is still very limited. The vast majority of preclinical studies have been performed using assay conditions with a questionable extrapolation to humans, i.e. too high concentrations with potential safety concerns (adverse effects and drug interactions), short-term exposures, in vitro tests carried out with non-physiological metabolites and/or concentrations, etc. Unfortunately, all these hypothesis-generating studies have contributed to increasing the number of ‘potential’ benefits and mechanisms of resveratrol, but confirmation in humans is very limited. Therefore, there are many issues that should be addressed to avoid an apparent endless loop in resveratrol research. The so-called ‘Resveratrol Paradox’, i.e., low bioavailability but high bioactivity, is a conundrum not yet solved in which the final responsible actor (if any) for the exerted effects has not yet been unequivocally identified. It is becoming evident that resveratrol exerts cardioprotective benefits through the improvement of inflammatory markers, atherogenic profile, glucose metabolism and endothelial function. However, safety concerns remain unresolved regarding chronic consumption of high resveratrol doses, especially in medicated people. This review will focus on the currently available evidence regarding resveratrol’s effects on humans obtained from randomized clinical trials. In addition, we will provide a critical outlook for further research on this molecule that is evolving from a minor dietary compound to a possible multi-target therapeutic drug. PMID:23448440

  1. Randomized comparison of next-generation sequencing and array comparative genomic hybridization for preimplantation genetic screening: a pilot study.

    PubMed

    Yang, Zhihong; Lin, James; Zhang, John; Fong, Wai Ieng; Li, Pei; Zhao, Rong; Liu, Xiaohong; Podevin, William; Kuang, Yanping; Liu, Jiaen

    2015-06-23

    Recent advances in next-generation sequencing (NGS) have provided new methods for preimplantation genetic screening (PGS) of human embryos from in vitro fertilization (IVF) cycles. However, there is still limited information about clinical applications of NGS in IVF and PGS (IVF-PGS) treatments. The present study aimed to investigate the effects of NGS screening on clinical pregnancy and implantation outcomes for PGS patients in comparison to array comparative genomic hybridization (aCGH) screening. This study was performed in two phases. Phase I study evaluated the accuracy of NGS for aneuploidy screening in comparison to aCGH. Whole-genome amplification (WGA) products (n = 164) derived from previous IVF-PGS cycles (n = 38) were retrospectively analyzed with NGS. The NGS results were then compared with those of aCGH. Phase II study further compared clinical pregnancy and implantation outcomes between NGS and aCGH for IVF-PGS patients. A total of 172 patients at mean age 35.2 ± 3.5 years were randomized into two groups: 1) NGS (Group A): patients (n = 86) had embryos screened with NGS and 2) aCGH (Group B): patients (n = 86) had embryos screened with aCGH. For both groups, blastocysts were vitrified after trophectoderm biopsy. One to two euploid blastocysts were thawed and transferred to individual patients primarily based on the PGS results. Ongoing pregnancy and implantation rates were compared between the two study groups. NGS detected all types of aneuploidies of human blastocysts accurately and provided a 100 % 24-chromosome diagnosis consistency with the highly validated aCGH method. Moreover, NGS screening identified euploid blastocysts for transfer and resulted in similarly high ongoing pregnancy rates for PGS patients compared to aCGH screening (74.7 % vs. 69.2 %, respectively, p >0.05). The observed implantation rates were also comparable between the NGS and aCGH groups (70.5 % vs. 66.2 %, respectively, p >0.05). While NGS screening has been recently introduced to assist IVF patients, this is the first randomized clinical study on the efficiency of NGS for preimplantation genetic screening in comparison to aCGH. With the observed high accuracy of 24-chromosome diagnosis and the resulting high ongoing pregnancy and implantation rates, NGS has demonstrated an efficient, robust high-throughput technology for PGS.

  2. Human norovirus inactivation in oysters by high hydrostatic pressure processing: A randomized double-blinded study

    USDA-ARS?s Scientific Manuscript database

    This randomized, double-blinded, clinical trial assessed the effect of high hydrostatic pressure processing (HPP) on genogroup I.1 human norovirus (HuNoV) inactivation in virus-seeded oysters when ingested by subjects. The safety and efficacy of HPP treatments were assessed in three study phases wi...

  3. Variation of mutational burden in healthy human tissues suggests non-random strand segregation and allows measuring somatic mutation rates.

    PubMed

    Werner, Benjamin; Sottoriva, Andrea

    2018-06-01

    The immortal strand hypothesis posits that stem cells could produce differentiated progeny while conserving the original template strand, thus avoiding accumulating somatic mutations. However, quantitating the extent of non-random DNA strand segregation in human stem cells remains difficult in vivo. Here we show that the change of the mean and variance of the mutational burden with age in healthy human tissues allows estimating strand segregation probabilities and somatic mutation rates. We analysed deep sequencing data from healthy human colon, small intestine, liver, skin and brain. We found highly effective non-random DNA strand segregation in all adult tissues (mean strand segregation probability: 0.98, standard error bounds (0.97, 0.99)). In contrast, non-random strand segregation efficiency is reduced to 0.87 (0.78, 0.88) in neural tissue during early development, suggesting stem cell pool expansions due to symmetric self-renewal. Healthy somatic mutation rates differed across tissue types, ranging from 3.5 × 10⁻⁹/bp/division in small intestine to 1.6 × 10⁻⁷/bp/division in skin.

  4. Evaluation of Inventory Reduction Strategies: Balad Air Base Case Study

    DTIC Science & Technology

    2012-03-01

    produced by conducting individual simulations using a unique random seed generated by the default AnyLogic© random number generator. The...develops an agent-based simulation model of the sustainment supply chain supporting Balad AB during its closure using the software AnyLogic®. The...research. The goal of USAF Stockage Policy is to maximize customer support while minimizing inventory costs (DAF, 2011:1). USAF stocking decisions

  5. Generating equilateral random polygons in confinement

    NASA Astrophysics Data System (ADS)

    Diao, Y.; Ernst, C.; Montemayor, A.; Ziegler, U.

    2011-10-01

    One challenging problem in biology is to understand the mechanism of DNA packing in a confined volume such as a cell. It is known that confined circular DNA is often knotted and hence the topology of the extracted (and relaxed) circular DNA can be used as a probe of the DNA packing mechanism. However, in order to properly estimate the topological properties of the confined circular DNA structures using mathematical models, it is necessary to generate large ensembles of simulated closed chains (i.e. polygons) of equal edge lengths that are confined in a volume such as a sphere of certain fixed radius. Finding efficient algorithms that properly sample the space of such confined equilateral random polygons is a difficult problem. In this paper, we propose a method that generates confined equilateral random polygons based on their probability distribution. This method requires the creation of a large database initially. However, once the database has been created, a confined equilateral random polygon of length n can be generated in linear time in terms of n. The errors introduced by the method can be controlled and reduced by the refinement of the database. Furthermore, our numerical simulations indicate that these errors are unbiased and tend to cancel each other in a long polygon.

  6. A critical evaluation of random copolymer mimesis of homogeneous antimicrobial peptides.

    PubMed

    Hu, Kan; Schmidt, Nathan W; Zhu, Rui; Jiang, Yunjiang; Lai, Ghee Hwee; Wei, Gang; Palermo, Edmund F; Kuroda, Kenichi; Wong, Gerard C L; Yang, Lihua

    2013-01-01

    Polymeric synthetic mimics of antimicrobial peptides (SMAMPs) have recently demonstrated antimicrobial activity similar to that of natural antimicrobial peptides (AMPs) from innate immunity. This is surprising, since polymeric SMAMPs are heterogeneous in terms of chemical structure (random sequence) and conformation (random coil), in contrast to the defined amino acid sequences and intrinsic secondary structures of AMPs. To understand this better, we compare AMPs with a 'minimal' mimic, a well characterized family of polydisperse cationic methacrylate-based random copolymer SMAMPs. Specifically, we focus on a comparison between the quantifiable membrane curvature generating capacity, charge density, and hydrophobicity of the polymeric SMAMPs and AMPs. Synchrotron small-angle X-ray scattering (SAXS) results indicate that typical AMPs and these methacrylate SMAMPs generate similar amounts of membrane negative Gaussian curvature (NGC), which is topologically necessary for a variety of membrane-destabilizing processes. Moreover, the curvature generating ability of SMAMPs is more tolerant of changes in the lipid composition than that of natural AMPs with similar chemical groups, consistent with the lower specificity of SMAMPs. We find that, although the amounts of NGC generated by these SMAMPs and AMPs are similar, the SMAMPs require significantly higher levels of hydrophobicity and cationic charge to achieve the same level of membrane deformation. We propose an explanation for these differences, which has implications for new synthetic strategies aimed at improved mimesis of AMPs.

  7. On Digital Simulation of Multicorrelated Random Processes and Its Applications. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Sinha, A. K.

    1973-01-01

    Two methods are described to simulate, on a digital computer, a set of correlated, stationary, and Gaussian time series with zero mean from the given matrix of power spectral densities and cross spectral densities. The first method is based upon trigonometric series with random amplitudes and deterministic phase angles. The random amplitudes are generated by using a standard random number generator subroutine. An example is given which corresponds to three components of wind velocities at two different spatial locations for a total of six correlated time series. In the second method, the whole process is carried out using the Fast Fourier Transform approach. This method gives more accurate results and works about twenty times faster for a set of six correlated time series.
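
    A common modern rendering of the first method (trigonometric series with frequency-domain Cholesky factorization and random phases) is sketched below; the normalization follows one standard spectral-representation convention and is not taken from the thesis, and the two-series cross-spectral matrix is a toy example.

      # Spectral-representation sketch: correlated stationary Gaussian series from
      # a one-sided cross-spectral density matrix via frequency-by-frequency
      # Cholesky factors and independent random phases.
      import numpy as np

      def simulate_correlated_series(S, freqs, t, rng):
          """S: (K, m, m) Hermitian one-sided cross-spectral matrices at `freqs` [Hz]."""
          K, m, _ = S.shape
          df = freqs[1] - freqs[0]
          H = np.linalg.cholesky(S)                               # (K, m, m) lower-triangular factors
          phases = rng.uniform(0.0, 2.0 * np.pi, size=(K, m))     # one phase per frequency and source
          x = np.zeros((m, t.size))
          for k in range(K):
              for src in range(m):
                  amp = np.abs(H[k, :, src]) * np.sqrt(2.0 * df)
                  arg = 2.0 * np.pi * freqs[k] * t + np.angle(H[k, :, src])[:, None] + phases[k, src]
                  x += amp[:, None] * np.cos(arg)
          return x

      rng = np.random.default_rng(0)
      freqs = np.linspace(0.1, 5.0, 50)
      S = np.empty((freqs.size, 2, 2))
      S[:, 0, 0] = S[:, 1, 1] = 1.0 / (1.0 + freqs**2)            # toy auto-spectra
      S[:, 0, 1] = S[:, 1, 0] = 0.6 / (1.0 + freqs**2)            # toy cross-spectrum
      t = np.arange(0.0, 60.0, 0.05)
      x = simulate_correlated_series(S, freqs, t, rng)            # x.shape == (2, t.size)

    Each returned series is approximately zero-mean Gaussian, with variance close to the integral of its auto-spectrum over the simulated band.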

  8. A Stochastic Version of the Noether Theorem

    NASA Astrophysics Data System (ADS)

    González Lezcano, Alfredo; Cabo Montes de Oca, Alejandro

    2018-06-01

    A stochastic version of the Noether theorem is derived for systems under the action of external random forces. The concept of a moment-generating functional is employed to describe the symmetry of the stochastic forces. The theorem is applied to two kinds of random covariant forces: one is generated in an electrodynamic way and the other is defined in the rest frame of the particle as a function of the proper time. For both of them, the conservation of the mean value of a random drift momentum is shown. The validity of the theorem makes clear that random systems can produce causal stochastic correlations between two far-separated systems that had interacted in the past. In addition, possible connections of the discussion with Yves Couder's experimental results are remarked upon.

  9. Strategic Use of Random Subsample Replication and a Coefficient of Factor Replicability

    ERIC Educational Resources Information Center

    Katzenmeyer, William G.; Stenner, A. Jackson

    1975-01-01

    The problem of demonstrating replicability of factor structure across random variables is addressed. Procedures are outlined which combine the use of random subsample replication strategies with the correlations between factor score estimates across replicate pairs to generate a coefficient of replicability and confidence intervals associated with…

  10. Comparative Emissions of Random Orbital Sanding between Conventional and Self-Generated Vacuum Systems

    PubMed Central

    Liverseed, David R.

    2013-01-01

    Conventional abrasive sanding generates high concentrations of particles. Depending on the substrate being abraded and exposure duration, overexposure to the particles can cause negative health effects ranging from respiratory irritation to cancer. The goal of this study was to understand the differences in particle emissions between a conventional random orbital sanding system and a self-generated vacuum random orbital sanding system with attached particle filtration bag. Particle concentrations were sampled for each system in a controlled test chamber for oak wood, chromate painted (hexavalent chromium) steel panels, and gel-coated (titanium dioxide) fiberglass panels using a Gesamtstaub-Probenahmesystem (GSP) sampler at three different locations adjacent to the sanding. Elevated concentrations were reported for all particles in the samples collected during conventional sanding. The geometric mean concentration ratios for the three substrates ranged from 320 to 4640 times greater for the conventional sanding system than the self-generated vacuum sanding system. The differences in the particle concentration generated by the two sanding systems were statistically significant with the two sample t-test (P < 0.0001) for all three substances. The data suggest that workers using conventional sanding systems could utilize the self-generated vacuum sanding system technology to potentially reduce exposure to particles and mitigate negative health effects. PMID:23065674

  11. Comparative emissions of random orbital sanding between conventional and self-generated vacuum systems.

    PubMed

    Liverseed, David R; Logan, Perry W; Johnson, Carl E; Morey, Sandy Z; Raynor, Peter C

    2013-03-01

    Conventional abrasive sanding generates high concentrations of particles. Depending on the substrate being abraded and exposure duration, overexposure to the particles can cause negative health effects ranging from respiratory irritation to cancer. The goal of this study was to understand the differences in particle emissions between a conventional random orbital sanding system and a self-generated vacuum random orbital sanding system with attached particle filtration bag. Particle concentrations were sampled for each system in a controlled test chamber for oak wood, chromate painted (hexavalent chromium) steel panels, and gel-coated (titanium dioxide) fiberglass panels using a Gesamtstaub-Probenahmesystem (GSP) sampler at three different locations adjacent to the sanding. Elevated concentrations were reported for all particles in the samples collected during conventional sanding. The geometric mean concentration ratios for the three substrates ranged from 320 to 4640 times greater for the conventional sanding system than the self-generated vacuum sanding system. The differences in the particle concentration generated by the two sanding systems were statistically significant with the two sample t-test (P < 0.0001) for all three substances. The data suggest that workers using conventional sanding systems could utilize the self-generated vacuum sanding system technology to potentially reduce exposure to particles and mitigate negative health effects.

  12. Remote Effects of Electromagnetic Millimeter Waves on Experimentally Induced Cold Pain: A Double-Blinded Crossover Investigation in Healthy Volunteers.

    PubMed

    Partyla, Tomasz; Hacker, Henriette; Edinger, Hardy; Leutzow, Bianca; Lange, Joern; Usichenko, Taras

    2017-03-01

    The hypoalgesic effect of electromagnetic millimeter waves (MW) is well studied in animal models; however, the results of human research are controversial. The aim of this study was to evaluate the effects of various frequency ranges of MW on hypoalgesia using the cold pressor test (CPT). Experimental pain was induced using standardized CPT protocols in 20 healthy male volunteers. The skin of the lower part of the sternum was exposed to MW with a frequency of 42.25 GHz (active generator); MW within the 50-75 GHz frequency range (noise generator); or an inactive MW device (placebo generator) in a random crossover double-blinded manner. Pain threshold, measured using the CPT, was the primary outcome. Other CPT parameters, heart rate, blood pressure, incidence of subjective sensations (paresthesia) during exposure, as well as quality of volunteers' blinding were also recorded. The end points of the condition with exposure to the 42.25 GHz generator were compared with baseline, with exposure to the 50-75 GHz noise generator, and with the placebo generator. Pain threshold increased during exposure to the 42.25 GHz generator when compared with baseline: median difference (MD), 1.97 seconds (95% confidence interval [CI], 0.35-3.73) and noise generator: MD, 1.27 seconds (95% CI, 0.05-2.33) but not compared with the placebo generator. Time to onset of cold and increasing pain sensations as well as diastolic blood pressure increased under the exposure to the 42.25 GHz generator when compared with baseline and noise generator. Other outcome measures were comparable among the study conditions. We were able to partially confirm the previously suggested hypoalgesic effects of low-intensity electromagnetic MW. However, the effect was indistinguishable from the placebo condition in our investigation.

  13. Cluster randomization and political philosophy.

    PubMed

    Chwang, Eric

    2012-11-01

    In this paper, I will argue that, while the ethical issues raised by cluster randomization can be challenging, they are not new. My thesis divides neatly into two parts. In the first, easier part I argue that many of the ethical challenges posed by cluster randomized human subjects research are clearly present in other types of human subjects research, and so are not novel. In the second, more difficult part I discuss the thorniest ethical challenge for cluster randomized research--cases where consent is genuinely impractical to obtain. I argue that once again these cases require no new analytic insight; instead, we should look to political philosophy for guidance. In other words, the most serious ethical problem that arises in cluster randomized research also arises in political philosophy. © 2011 Blackwell Publishing Ltd.

  14. Parallel Immunizations of Rabbits Using the Same Antigen Yield Antibodies with Similar, but Not Identical, Epitopes

    PubMed Central

    Hjelm, Barbara; Forsström, Björn; Löfblom, John; Rockberg, Johan; Uhlén, Mathias

    2012-01-01

    A problem for the generation of polyclonal antibodies is the potential difficulties for obtaining a renewable resource due to batch-to-batch variations when the same antigen is immunized into several separate animals. Here, we have investigated this issue by determining the epitopes of antibodies generated from parallel immunizations of rabbits with recombinant antigens corresponding to ten human protein targets. The epitopes were mapped by both a suspension bead array approach using overlapping synthetic 15-mer peptides and a bacterial display approach using expression of random fragments of the antigen on the surface of bacteria. Both methods determined antibody binding with the aid of fluorescent-based analysis. In addition, one polyclonal antibody was fractionated by peptide-specific affinity capture for in-depth comparison of epitopes. The results show that the same antigen immunized in several rabbits yields polyclonal antibodies with similar epitopes, but with larger differences in the relative amounts of antibodies to the different epitopes. In some cases, unique epitopes were observed for one of the immunizations. The results suggest that polyclonal antibodies generated by repeated immunizations do not display an identical epitope pattern, although many of the epitopes are similar. PMID:23284606

  15. Modulating ectopic gene expression levels by using retroviral vectors equipped with synthetic promoters.

    PubMed

    Ferreira, Joshua P; Peacock, Ryan W S; Lawhorn, Ingrid E B; Wang, Clifford L

    2011-12-01

    The human cytomegalovirus and elongation factor 1α promoters are constitutive promoters commonly employed by mammalian expression vectors. These promoters generally produce high levels of expression in many types of cells and tissues. To generate a library of synthetic promoters capable of generating a range of low, intermediate, and high expression levels, the TATA and CAAT box elements of these promoters were mutated. Other promoter variants were also generated by random mutagenesis. Evaluation using plasmid vectors integrated at a single site in the genome revealed that these various synthetic promoters were capable of expression levels spanning a 40-fold range. Retroviral vectors were equipped with the synthetic promoters and evaluated for their ability to reproduce the graded expression demonstrated by plasmid integration. A vector with a self-inactivating long terminal repeat could neither reproduce the full range of expression levels nor produce stable expression. Using a second vector design, the different synthetic promoters enabled stable expression over a broad range of expression levels in different cell lines.

  16. Shot noise in radiobiological systems.

    PubMed

    Datesman, A

    2016-11-01

    As a model for human tissue, this report considers the rate of free radical generation in a dilute aqueous solution in which a beta-emitting radionuclide is uniformly dispersed. Each decay dissipates a discrete quantity of energy, creating a large number of free radicals in a short time within a small volume determined by the beta particle range. Representing the instantaneous dissipated power as a train of randomly-spaced pulses, the time-averaged dissipated power p̄ and rate of free radical generation ḡ are derived. The analogous result in the theory of electrical circuits is known as the shot noise theorem. The reference dose of X-rays D_ref producing an identical rate of free radical generation and level of oxidative stress is shown a) to increase with the square root of the absorbed dose, D, and b) to be far larger than D. This finding may have important consequences for public health in cases where the level of shot noise exceeds some noise floor corresponding to equilibrium biological processes. An estimate of this noise floor is made using the example of potassium-40, a beta-emitting radioisotope universally present in living tissue. Copyright © 2016 Elsevier Ltd. All rights reserved.
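
    For reference, the "shot noise theorem" analogy invoked above corresponds to the textbook Campbell relations for a Poisson train of pulses and to the classical circuit formula; the lines below are those standard statements (in LaTeX notation), not the paper's derivation of D_ref.

      % Campbell's theorem for p(t) = \sum_k F(t - t_k) with Poisson arrival rate \nu:
      \bar{p} = \nu \int_{-\infty}^{\infty} F(t)\, dt ,
      \qquad
      \operatorname{Var}[p(t)] = \nu \int_{-\infty}^{\infty} F(t)^{2}\, dt .
      % Circuit analogue (shot-noise theorem) for carriers of charge q forming a
      % mean current \bar{I} observed in a bandwidth \Delta f:
      \overline{i_{n}^{2}} = 2\, q\, \bar{I}\, \Delta f .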

  17. Catalytic micromotor generating self-propelled regular motion through random fluctuation.

    PubMed

    Yamamoto, Daigo; Mukai, Atsushi; Okita, Naoaki; Yoshikawa, Kenichi; Shioi, Akihisa

    2013-07-21

    Most of the current studies on nano/microscale motors to generate regular motion have adopted the strategy of fabricating a composite from different materials. In this paper, we report that a simple object solely made of platinum generates regular motion driven by a catalytic chemical reaction with hydrogen peroxide. Depending on the morphological symmetry of the catalytic particles, a rich variety of random and regular motions is observed. The experimental trend is well reproduced by a simple theoretical model that takes into account the anisotropic viscous effect on the self-propelled active Brownian fluctuation.

  18. Catalytic micromotor generating self-propelled regular motion through random fluctuation

    NASA Astrophysics Data System (ADS)

    Yamamoto, Daigo; Mukai, Atsushi; Okita, Naoaki; Yoshikawa, Kenichi; Shioi, Akihisa

    2013-07-01

    Most of the current studies on nano/microscale motors to generate regular motion have adopted the strategy of fabricating a composite from different materials. In this paper, we report that a simple object solely made of platinum generates regular motion driven by a catalytic chemical reaction with hydrogen peroxide. Depending on the morphological symmetry of the catalytic particles, a rich variety of random and regular motions is observed. The experimental trend is well reproduced by a simple theoretical model that takes into account the anisotropic viscous effect on the self-propelled active Brownian fluctuation.

  19. The influence of an uncertain force environment on reshaping trial-to-trial motor variability.

    PubMed

    Izawa, Jun; Yoshioka, Toshinori; Osu, Rieko

    2014-09-10

    Motor memory is updated to generate ideal movements in a novel environment. When the environment changes randomly from trial to trial, how does the brain incorporate this uncertainty into motor memory? To investigate how the brain adapts to an uncertain environment, we considered a reach adaptation protocol in which individuals practiced moving in a force field into which noise was injected. After they had adapted, we measured the trial-to-trial variability in the temporal profiles of the produced hand force. We found that the motor variability was significantly magnified by the adaptation to the random force field. Temporal profiles of the motor variance were significantly dissociable between the two different types of random force field experienced. A model-based analysis suggests that the variability is generated by noise in the gains of the internal model. It further suggests that the trial-to-trial motor variability magnified by the adaptation in a random force field is generated by the uncertainty of the internal model formed in the brain as a result of the adaptation.

  20. Mining the human gut microbiota for effector strains that shape the immune system

    PubMed Central

    Ahern, Philip P.; Faith, Jeremiah J.; Gordon, Jeffrey I.

    2014-01-01

    Summary The gut microbiota co-develops with the immune system beginning at birth. Mining the microbiota for bacterial strains responsible for shaping the structure and dynamic operations of the innate and adaptive arms of the immune system represents a formidable combinatorial problem but one that needs to be overcome to advance mechanistic understanding of microbial community-immune system co-regulation, and in order to develop new diagnostic and therapeutic approaches that promote health. Here, we discuss a scalable, less biased approach for identifying effector strains in complex microbial communities that impact immune function. The approach begins by identifying uncultured human fecal microbiota samples that transmit immune phenotypes to germ-free mice. Clonally-arrayed sequenced collections of bacterial strains are constructed from representative donor microbiota. If the collection transmits phenotypes, effector strains are identified by testing randomly generated subsets with overlapping membership in individually-housed germ-free animals. Detailed mechanistic studies of effector strain-host interactions can then be performed. PMID:24950201
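
    The subset-testing step lends itself to a small combinatorial sketch. The code below (illustrative only, with hypothetical strain names and group sizes; not the authors' pipeline) builds random subsets with overlapping membership so that each strain is tested in several germ-free recipients.

      # Assign each strain to several randomly chosen test groups, producing
      # overlapping subsets of the clonally arrayed culture collection.
      import random

      def overlapping_subsets(strains, n_subsets=8, copies_per_strain=3, seed=42):
          rng = random.Random(seed)
          subsets = [set() for _ in range(n_subsets)]
          for strain in strains:
              for idx in rng.sample(range(n_subsets), copies_per_strain):
                  subsets[idx].add(strain)
          return subsets

      strains = [f"strain_{i:02d}" for i in range(20)]      # hypothetical collection
      groups = overlapping_subsets(strains)
      # Each group would be introduced into a separate germ-free mouse; comparing
      # which groups transmit the phenotype narrows down candidate effector strains.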

  1. Do gender gaps in education and health affect economic growth? A cross-country study from 1975 to 2010.

    PubMed

    Mandal, Bidisha; Batina, Raymond G; Chen, Wen

    2018-05-01

    We use system-generalized method-of-moments to estimate the effect of gender-specific human capital on economic growth in a cross-country panel of 127 countries between 1975 and 2010. There are several benefits of using this methodology. First, a dynamic lagged dependent econometric model is suitable to address persistence in per capita output. Second, the generalized method-of-moments estimator uses dynamic properties of the data to generate appropriate instrumental variables to address joint endogeneity of the explanatory variables. Third, we allow the measurement error to include unobserved country-specific effects and random noise. We include two gender-disaggregated measures of human capital: education and health. We find that the gender gap in health plays a critical role in explaining economic growth in developing countries. Our results provide aggregate evidence that returns to investments in health systematically differ across gender and between low-income and high-income countries. Copyright © 2018 John Wiley & Sons, Ltd.

  2. Response Surface Analysis of Stochastic Network Performance

    DTIC Science & Technology

    1988-12-01

    [OCR-garbled FORTRAN excerpt: a RANDOM number-generator FUNCTION implementing a multiplicative congruential generator with constants A = 16807, P = 2147483647, B15 = 32768 and B16 = 65536, returning RANDOM = FLOAT(IX)*4.656612875E-10, followed by NETWORK ENTRY and PATHSET AND CUTSET GENERATION subroutines.]
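
    The constants visible in the garbled listing (A = 16807, P = 2147483647, B15 = 32768, B16 = 65536, and the scale factor 4.656612875E-10 ≈ 2⁻³¹) identify the classic Lehmer "minimal standard" generator written with a 16-bit splitting trick to avoid 32-bit overflow. A modern Python rendering of that generator is sketched below; it is a reconstruction for readability, not a transcription of the original FORTRAN.

      # Lehmer generator x <- (16807 * x) mod (2**31 - 1), using the same
      # high/low 16-bit splitting as the garbled FORTRAN to stay within 32 bits.
      A, P, B15, B16 = 16807, 2**31 - 1, 2**15, 2**16

      def lehmer(ix):
          xhi = ix // B16                      # high 16 bits of the seed
          xalo = (ix - xhi * B16) * A          # low 16 bits times the multiplier
          leftlo = xalo // B16
          fhi = xhi * A + leftlo
          k = fhi // B15
          ix = (((xalo - leftlo * B16) - P) + (fhi - k * B15) * B16) + k
          if ix < 0:
              ix += P
          return ix, ix * 4.656612875e-10      # new integer state and a uniform (0, 1) draw

      seed = 123457
      seed, u = lehmer(seed)                   # u is the next pseudorandom number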

  3. Implicit learning in cotton-top tamarins (Saguinus oedipus) and pigeons (Columba livia).

    PubMed

    Locurto, Charles; Fox, Maura; Mazzella, Andrea

    2015-06-01

    There is considerable interest in the conditions under which human subjects learn patterned information without explicit instructions to learn that information. This form of learning, termed implicit or incidental learning, can be approximated in nonhumans by exposing subjects to patterned information but delivering reinforcement randomly, thereby not requiring the subjects to learn the information in order to be reinforced. Following acquisition, nonhuman subjects are queried as to what they have learned about the patterned information. In the present experiment, we extended the study of implicit learning in nonhumans by comparing two species, cotton-top tamarins (Saguinus oedipus) and pigeons (Columba livia), on an implicit learning task that used an artificial grammar to generate the patterned elements for training. We equated the conditions of training and testing as much as possible between the two species. The results indicated that both species demonstrated approximately the same magnitude of implicit learning, judged both by a random test and by choice tests between pairs of training elements. This finding suggests that the ability to extract patterned information from situations in which such learning is not demanded is of longstanding origin.

  4. Some problems of molecular biology of poliovirus infection relevant to pathogenesis, viral spread and evolution.

    PubMed

    Agol, V I; Belov, G A; Cherkasova, E A; Gavrilin, G V; Kolesnikova, M S; Romanova, L I; Tolskaya, E A

    2001-01-01

    Molecular mechanisms of poliovirus reproduction in the human gut remain largely unexplored. Nevertheless, there are grounds to believe that the virus spreads from cell to cell, like that from person to person during natural circulation, and involves a relatively small proportion of the highly heterogeneous viral population generated by the previous host. This mechanism of random sampling is responsible for the majority of fixed mutations, and contributes to the maintenance of a certain level of viral fitness (virulence). In the long term, random sampling may lead to the decrease in fitness and even to extinction of some viral evolutionary branches, explaining cases of self-limiting poliovirus infection in immunodeficient patients. A low propensity of the Sabin viruses for natural circulation may also be a related phenomenon. The trend to decrease in fitness may be interrupted by the appearance of rare, fitter (more virulent) variants, which may be responsible for poliomyelitis outbreaks caused by wild type virus, and for the development of paralytic disease in chronic carriers of the Sabin vaccine. All these evolutionary events are largely stochastic and hence are unpredictable in principle.

  5. Stimulus novelty, task relevance and the visual evoked potential in man

    NASA Technical Reports Server (NTRS)

    Courchesne, E.; Hillyard, S. A.; Galambos, R.

    1975-01-01

    The effect of task relevance on P3 waves (a component of the human evoked potential) and the methodologies used to deal with them are outlined. Visual evoked potentials (VEPs) were recorded from normal adult subjects performing in a visual discrimination task. Subjects counted the number of presentations of the numeral 4, which was interposed rarely and randomly within a sequence of tachistoscopically flashed background stimuli (the numeral 2). Intrusive, task-irrelevant (not counted) stimuli were also interspersed rarely and randomly in the sequence of 2s; these stimuli were of two types: simples, which were easily recognizable, and novels, which were completely unrecognizable. It was found that the simples and the counted 4s evoked posteriorly distributed P3 waves while the irrelevant novels evoked large, frontally distributed P3 waves. These large, frontal P3 waves to novels were also found to be preceded by large N2 waves. These findings indicate that the P3 wave is not a unitary phenomenon but should be considered in terms of a family of waves, differing in their brain generators and in their psychological correlates.

  6. The role of ECT in posttraumatic stress disorder: A systematic review.

    PubMed

    Youssef, Nagy A; McCall, W Vaughn; Andrade, Chittaranjan

    2017-02-01

    Posttraumatic stress disorder (PTSD) is associated with a high burden of disability and mortality and frequently is treatment resistant. There is little to offer patients who are not responding to standard interventions. Thus, the objective of this report is to systematically review human data on whether electroconvulsive therapy (ECT) is effective in PTSD. We performed a systematic literature review from 1958 through August 2016 for clinical studies and case reports published in English examining the efficacy of ECT in improving PTSD symptoms. The literature search generated 3 retrospective studies, 1 prospective uncontrolled clinical trial, and 5 case reports. It is not clear, given the small sample size and lack of a large randomized trial, whether favorable outcomes were attributed to improvement in depression (as opposed to core PTSD symptoms). Current efficacy data do not separate conclusively the effects of ECT on PTSD symptoms from those on depression. Randomized controlled trials are necessary to examine the use of ECT in medication-refractory PTSD patients with and without comorbid depression. Subsequent studies may address response in PTSD subtypes, and the use of novel techniques, such as memory reactivation, before ECT.

  7. Banknote authentication using chaotic elements technology

    NASA Astrophysics Data System (ADS)

    Ambadiyil, Sajan; P. S., Krishnendu; Mahadevan Pillai, V. P.; Prabhu, Radhakrishna

    2017-10-01

    Counterfeit banknotes are a growing threat to society, since advances in computers, scanners and photocopiers have made the duplication of banknotes much simpler. The fake note detection systems developed so far have many drawbacks such as high cost, poor accuracy, unavailability, lack of user-friendliness and low effectiveness. One possible solution to this problem could be the use of a system uniquely linked to the banknote itself. In this paper, we present a unique identification and authentication process for the banknote using chaotic elements embedded in it. A chaotic element means that the physical element is formed by a random process independent of human intervention. The chaotic elements used in this paper are the random distribution patterns of security fibres set into the paper pulp. A unique ID is generated from the fibre pattern obtained from a UV image of the note, which can be verified by any person who receives the banknote to decide whether the banknote is authentic or not. A performance analysis of the system is also presented in this paper.
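
    One way to picture the ID-generation step is the toy sketch below: threshold the UV image, record which cells of a coarse grid contain bright fibre pixels, and hash that occupancy pattern. The grid size, threshold and hashing choice are assumptions for illustration; the authors' actual feature extraction and matching procedure is not reproduced here.

      # Toy fibre-pattern identifier: threshold a UV image, coarse-grid the
      # bright-fibre occupancy, and hash the pattern into a fixed-length ID.
      import hashlib
      import numpy as np

      def fibre_id(uv_image, threshold=0.7, grid=(16, 32)):
          bright = uv_image >= threshold                          # candidate fibre pixels
          rows = np.linspace(0, uv_image.shape[0], grid[0] + 1).astype(int)
          cols = np.linspace(0, uv_image.shape[1], grid[1] + 1).astype(int)
          occupancy = np.zeros(grid, dtype=np.uint8)
          for i in range(grid[0]):
              for j in range(grid[1]):
                  occupancy[i, j] = bright[rows[i]:rows[i + 1], cols[j]:cols[j + 1]].any()
          return hashlib.sha256(occupancy.tobytes()).hexdigest()

      rng = np.random.default_rng(7)
      print(fibre_id(rng.random((160, 320))))                     # stand-in for a real UV capture

    A practical scheme would also need image registration and some tolerance to noise, since an exact hash changes completely if even one grid cell flips.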

  8. Multiplicative processes in visual cognition

    NASA Astrophysics Data System (ADS)

    Credidio, H. F.; Teixeira, E. N.; Reis, S. D. S.; Moreira, A. A.; Andrade, J. S.

    2014-03-01

    The Central Limit Theorem (CLT) is certainly one of the most important results in the field of statistics. The simple fact that the addition of many random variables can generate the same probability curve, elucidated the underlying process for a broad spectrum of natural systems, ranging from the statistical distribution of human heights to the distribution of measurement errors, to mention a few. An extension of the CLT can be applied to multiplicative processes, where a given measure is the result of the product of many random variables. The statistical signature of these processes is rather ubiquitous, appearing in a diverse range of natural phenomena, including the distributions of incomes, body weights, rainfall, and fragment sizes in a rock crushing process. Here we corroborate results from previous studies which indicate the presence of multiplicative processes in a particular type of visual cognition task, namely, the visual search for hidden objects. Precisely, our results from eye-tracking experiments show that the distribution of fixation times during visual search obeys a log-normal pattern, while the fixational radii of gyration follow a power-law behavior.
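
    The log-normal signature mentioned above follows from applying the Central Limit Theorem to the logarithm of a product; the one-line argument (a standard result, not specific to the eye-tracking data) is:

      % A product of many positive, weakly dependent factors becomes log-normal:
      X = \prod_{i=1}^{n} Y_i
      \quad\Longrightarrow\quad
      \ln X = \sum_{i=1}^{n} \ln Y_i \;\approx\; \mathcal{N}(\mu, \sigma^{2})
      \quad\text{for large } n \text{ (CLT)},
      % hence X itself is approximately log-normally distributed.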

  9. Generation of cell lines for drug discovery through random activation of gene expression: application to the human histamine H3 receptor.

    PubMed

    Song, J; Doucette, C; Hanniford, D; Hunady, K; Wang, N; Sherf, B; Harrington, J J; Brunden, K R; Stricker-Krongrad, A

    2005-06-01

    Target-based high-throughput screening (HTS) plays an integral role in drug discovery. The implementation of HTS assays generally requires high expression levels of the target protein, and this is typically accomplished using recombinant cDNA methodologies. However, the isolated gene sequences of many drug targets have intellectual property claims that restrict the ability to implement drug discovery programs. The present study describes the pharmacological characterization of the human histamine H3 receptor that was expressed using random activation of gene expression (RAGE), a technology that over-expresses proteins by up-regulating endogenous genes rather than introducing cDNA expression vectors into the cell. Saturation binding analysis using [125I]iodoproxyfan and RAGE-H3 membranes revealed a single class of binding sites with a K(D) value of 0.77 nM and a B(max) equal to 756 fmol/mg of protein. Competition binding studies showed that the rank order of potency for H3 agonists was N(alpha)-methylhistamine ≈ (R)-alpha-methylhistamine > histamine and that the rank order of potency for H3 antagonists was clobenpropit > iodophenpropit > thioperamide. The same rank order of potency for H3 agonists and antagonists was observed in the functional assays as in the binding assays. The Fluorometric Imaging Plate Reader assays in RAGE-H3 cells gave high Z' values for both agonist and antagonist screening. These results reveal that the human H3 receptor expressed with the RAGE technology is pharmacologically comparable to that expressed through recombinant methods. Moreover, the level of expression of the H3 receptor in the RAGE-H3 cells is suitable for HTS and secondary assays.

  10. Crowdsourcing reproducible seizure forecasting in human and canine epilepsy

    PubMed Central

    Wagenaar, Joost; Abbot, Drew; Adkins, Phillip; Bosshard, Simone C.; Chen, Min; Tieng, Quang M.; He, Jialune; Muñoz-Almaraz, F. J.; Botella-Rocamora, Paloma; Pardo, Juan; Zamora-Martinez, Francisco; Hills, Michael; Wu, Wei; Korshunova, Iryna; Cukierski, Will; Vite, Charles; Patterson, Edward E.; Litt, Brian; Worrell, Gregory A.

    2016-01-01

    Accurate forecasting of epileptic seizures has the potential to transform clinical epilepsy care. However, progress toward reliable seizure forecasting has been hampered by lack of open access to long duration recordings with an adequate number of seizures for investigators to rigorously compare algorithms and results. A seizure forecasting competition was conducted on kaggle.com using open access chronic ambulatory intracranial electroencephalography from five canines with naturally occurring epilepsy and two humans undergoing prolonged wide bandwidth intracranial electroencephalographic monitoring. Data were provided to participants as 10-min interictal and preictal clips, with approximately half of the 60 GB data bundle labelled (interictal/preictal) for algorithm training and half unlabelled for evaluation. The contestants developed custom algorithms and uploaded their classifications (interictal/preictal) for the unknown testing data, and a randomly selected 40% of data segments were scored and results broadcasted on a public leader board. The contest ran from August to November 2014, and 654 participants submitted 17 856 classifications of the unlabelled test data. The top performing entry scored 0.84 area under the classification curve. Following the contest, additional held-out unlabelled data clips were provided to the top 10 participants and they submitted classifications for the new unseen data. The resulting areas under the classification curve were well above chance forecasting, but did show a mean 6.54 ± 2.45% (min, max: 0.30, 20.2) decline in performance. The kaggle.com model using open access data and algorithms generated reproducible research that advanced seizure forecasting. The overall performance from multiple contestants on unseen data was better than a random predictor, and demonstrates the feasibility of seizure forecasting in canine and human epilepsy. PMID:27034258

  11. Simulation of flight maneuver-load distributions by utilizing stationary, non-Gaussian random load histories

    NASA Technical Reports Server (NTRS)

    Leybold, H. A.

    1971-01-01

    Random numbers were generated with the aid of a digital computer and transformed such that the probability density function of a discrete random load history composed of these random numbers had one of the following non-Gaussian distributions: Poisson, binomial, log-normal, Weibull, and exponential. The resulting random load histories were analyzed to determine their peak statistics and were compared with cumulative peak maneuver-load distributions for fighter and transport aircraft in flight.
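
    A standard way to carry out such transformations today is inverse-transform sampling: map uniform draws through the inverse cumulative distribution function of the target law. The sketch below shows the exponential and Weibull cases from the list above with arbitrary illustrative parameters; the report's own transformation formulas and aircraft load parameters are not reproduced.

      # Inverse-transform sampling: u ~ U(0, 1) mapped through the inverse CDF.
      import numpy as np

      rng = np.random.default_rng(0)
      u = rng.random(100_000)

      lam = 2.0                                        # exponential rate (illustrative)
      x_exp = -np.log1p(-u) / lam                      # F^-1(u) = -ln(1 - u) / lambda

      k, c = 1.5, 3.0                                  # Weibull shape and scale (illustrative)
      x_weib = c * (-np.log1p(-u)) ** (1.0 / k)        # F^-1(u) = c * (-ln(1 - u))^(1/k)

      print(x_exp.mean(), 1.0 / lam)                   # sample mean vs. theoretical 1/lambda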

  12. Two spatial light modulator system for laboratory simulation of random beam propagation in random media.

    PubMed

    Wang, Fei; Toselli, Italo; Korotkova, Olga

    2016-02-10

    An optical system consisting of a laser source and two independent consecutive phase-only spatial light modulators (SLMs) is shown to accurately simulate a generated random beam (first SLM) after interaction with a stationary random medium (second SLM). To illustrate the range of possibilities, a recently introduced class of random optical frames is examined on propagation in free space and several weak turbulent channels with Kolmogorov and non-Kolmogorov statistics.

  13. Evolving random fractal Cantor superlattices for the infrared using a genetic algorithm

    PubMed Central

    Bossard, Jeremy A.; Lin, Lan; Werner, Douglas H.

    2016-01-01

    Ordered and chaotic superlattices have been identified in Nature that give rise to a variety of colours reflected by the skin of various organisms. In particular, organisms such as silvery fish possess superlattices that reflect a broad range of light from the visible to the UV. Such superlattices have previously been identified as ‘chaotic’, but we propose that apparent ‘chaotic’ natural structures, which have been previously modelled as completely random structures, should have an underlying fractal geometry. Fractal geometry, often described as the geometry of Nature, can be used to mimic structures found in Nature, but deterministic fractals produce structures that are too ‘perfect’ to appear natural. Introducing variability into fractals produces structures that appear more natural. We suggest that the ‘chaotic’ (purely random) superlattices identified in Nature are more accurately modelled by multi-generator fractals. Furthermore, we introduce fractal random Cantor bars as a candidate for generating both ordered and ‘chaotic’ superlattices, such as the ones found in silvery fish. A genetic algorithm is used to evolve optimal fractal random Cantor bars with multiple generators targeting several desired optical functions in the mid-infrared and the near-infrared. We present optimized superlattices demonstrating broadband reflection as well as single and multiple pass bands in the near-infrared regime. PMID:26763335
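
    The multi-generator random Cantor bar idea can be sketched compactly: at every recursion level each kept segment is subdivided according to a generator drawn at random from a small set of keep/discard patterns. The generators, depth and seed below are arbitrary illustrations, not the optimized generators evolved in the paper.

      # Random Cantor bar with multiple generators: each kept interval is
      # recursively replaced using a randomly chosen keep(1)/discard(0) pattern.
      import random

      GENERATORS = [
          (1, 0, 1),         # classic triadic Cantor step
          (1, 1, 0, 1),      # a 4-part generator
          (1, 0, 0, 1, 1),   # a 5-part generator
      ]

      def random_cantor_bar(depth, rng, x0=0.0, x1=1.0):
          """Return the (start, end) intervals kept after `depth` subdivision levels."""
          if depth == 0:
              return [(x0, x1)]
          gen = rng.choice(GENERATORS)
          width = (x1 - x0) / len(gen)
          kept = []
          for i, keep in enumerate(gen):
              if keep:
                  kept += random_cantor_bar(depth - 1, rng, x0 + i * width, x0 + (i + 1) * width)
          return kept

      layers = random_cantor_bar(depth=4, rng=random.Random(3))
      # The kept intervals would set the positions of high-index layers in a superlattice.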

  14. Random field assessment of nanoscopic inhomogeneity of bone

    PubMed Central

    Dong, X. Neil; Luo, Qing; Sparkman, Daniel M.; Millwater, Harry R.; Wang, Xiaodu

    2010-01-01

    Bone quality is significantly correlated with the inhomogeneous distribution of material and ultrastructural properties (e.g., modulus and mineralization) of the tissue. Current techniques for quantifying inhomogeneity consist of descriptive statistics such as mean, standard deviation and coefficient of variation. However, these parameters do not describe the spatial variations of bone properties. The objective of this study was to develop a novel statistical method to characterize and quantitatively describe the spatial variation of bone properties at ultrastructural levels. To do so, a random field defined by an exponential covariance function was used to represent the spatial uncertainty of the elastic modulus by delineating the correlation of the modulus at different locations in bone lamellae. The correlation length, a characteristic parameter of the covariance function, was employed to estimate the fluctuation of the elastic modulus in the random field. Using this approach, two distribution maps of the elastic modulus within bone lamellae were generated using simulation and compared with those obtained experimentally by a combination of atomic force microscopy and nanoindentation techniques. The simulation-generated maps of elastic modulus were in close agreement with the experimental ones, thus validating the random field approach in defining the inhomogeneity of elastic modulus in lamellae of bone. Indeed, generation of such random fields will facilitate multi-scale modeling of bone in more pragmatic detail. PMID:20817128
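
    A one-dimensional toy version of such a random field can be simulated directly from the exponential covariance function by Cholesky factorization; the grid, mean modulus, variance and correlation length below are illustrative values, not the bone measurements.

      # Gaussian random field with covariance C(d) = s2 * exp(-d / corr_len),
      # sampled on a 1-D grid via Cholesky factorization of the covariance matrix.
      import numpy as np

      x = np.linspace(0.0, 10.0, 200)                      # positions along a line (e.g., microns)
      s2, corr_len = 4.0, 1.5                              # variance and correlation length
      D = np.abs(x[:, None] - x[None, :])                  # pairwise distances
      C = s2 * np.exp(-D / corr_len)                       # exponential covariance matrix
      L = np.linalg.cholesky(C + 1e-10 * np.eye(x.size))   # small jitter for numerical stability

      rng = np.random.default_rng(0)
      field = 20.0 + L @ rng.standard_normal(x.size)       # e.g., a modulus map with mean 20 GPa

    Shorter correlation lengths yield rougher, more rapidly fluctuating modulus maps, which is exactly the behaviour the correlation length is meant to summarize.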

  15. RANDOM MATRIX DIAGONALIZATION--A COMPUTER PROGRAM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuchel, K.; Greibach, R.J.; Porter, C.E.

    A computer program is described which generates random matrices, diagonalizes them, and appropriately sorts the resulting eigenvalues and eigenvector components. FAP and FORTRAN listings for the IBM 7090 computer are included. (auth)
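
    A present-day analogue of what the program does (in Python/NumPy rather than FAP/FORTRAN on an IBM 7090) fits in a few lines:

      # Generate a random real-symmetric (GOE-like) matrix, diagonalize it, and
      # keep eigenvalues sorted together with their eigenvectors.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 200
      A = rng.standard_normal((n, n))
      H = (A + A.T) / 2.0                             # random real-symmetric matrix

      eigenvalues, eigenvectors = np.linalg.eigh(H)   # eigh returns eigenvalues in ascending order
      spacings = np.diff(eigenvalues)                 # nearest-neighbour spacings, a standard statistic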

  16. A Layered Approach for Robust Spatial Virtual Human Pose Reconstruction Using a Still Image

    PubMed Central

    Guo, Chengyu; Ruan, Songsong; Liang, Xiaohui; Zhao, Qinping

    2016-01-01

    Pedestrian detection and human pose estimation are instructive for reconstructing a three-dimensional scenario and for robot navigation, particularly when large amounts of vision data are captured using various data-recording techniques. Using an unrestricted capture scheme, which produces occlusions or breezing, the information describing each part of a human body and the relationship between each part or even different pedestrians must be present in a still image. Using this framework, a multi-layered, spatial, virtual, human pose reconstruction framework is presented in this study to recover any deficient information in planar images. In this framework, a hierarchical parts-based deep model is used to detect body parts by using the available restricted information in a still image and is then combined with spatial Markov random fields to re-estimate the accurate joint positions in the deep network. Then, the planar estimation results are mapped onto a virtual three-dimensional space using multiple constraints to recover any deficient spatial information. The proposed approach can be viewed as a general pre-processing method to guide the generation of continuous, three-dimensional motion data. The experiment results of this study are used to describe the effectiveness and usability of the proposed approach. PMID:26907289

  17. Immunoglobulin gene usage in the human anti-pathogen response.

    PubMed

    Newkirk, M M; Rioux, J D

    1995-09-01

    The human antibody response to foreign pathogens is generated to a relatively small number of target surface proteins and carbohydrates that nonetheless have an extensive array of epitopes. The study of human monoclonal antibodies to different pathogens shows that there are a diversity of mechanisms used to generate a sufficient repertoire of antibodies to combat the invading pathogens. Although many different immunoglobulin gene elements are used to construct the anti-pathogen response, some elements are used more often than would be expected if all elements were used randomly. For example, the immune response to Haemophilus influenzae polysaccharide appears to be quite narrow, being restricted primarily to a specific heavy-chain gene, 3-15, and a lambda light-chain family II member, 4A. In contrast, for the immune response to cytomegalovirus proteins, a wider group of gene elements is needed. It is also surprising that despite an investigator bias for IgG- rather than IgM-secreting immortal B cells (because of their high affinity and neutralizing abilities), 26% of light chains and 13% of heavy chains showed a very low level of somatic mutation, equivalent to an IgM molecule that has not undergone affinity maturation. Although some highly mutated IgG molecules are present in the anti-pathogen response, most of the monoclonal antibodies specific for viruses or bacteria have a level of somatic hypermutation similar to that of the adult IgM repertoire. A number of studies have shown that there are similarities in the antibody responses to pathogens and to self (autoantibodies).(ABSTRACT TRUNCATED AT 250 WORDS)

  18. Private randomness expansion with untrusted devices

    NASA Astrophysics Data System (ADS)

    Colbeck, Roger; Kent, Adrian

    2011-03-01

    Randomness is an important resource for many applications, from gambling to secure communication. However, guaranteeing that the output from a candidate random source could not have been predicted by an outside party is a challenging task, and many supposedly random sources used today provide no such guarantee. Quantum solutions to this problem exist, for example a device which internally sends a photon through a beamsplitter and observes on which side it emerges, but, presently, such solutions require the user to trust the internal workings of the device. Here, we seek to go beyond this limitation by asking whether randomness can be generated using untrusted devices—even ones created by an adversarial agent—while providing a guarantee that no outside party (including the agent) can predict it. Since this is easily seen to be impossible unless the user has an initially private random string, the task we investigate here is private randomness expansion. We introduce a protocol for private randomness expansion with untrusted devices which is designed to take as input an initially private random string and produce as output a longer private random string. We point out that private randomness expansion protocols are generally vulnerable to attacks that can render the initial string partially insecure, even though that string is used only inside a secure laboratory; our protocol is designed to remove this previously unconsidered vulnerability by privacy amplification. We also discuss extensions of our protocol designed to generate an arbitrarily long random string from a finite initially private random string. The security of these protocols against the most general attacks is left as an open question.

  19. Determining who responds better to a computer- vs. human-delivered physical activity intervention: results from the community health advice by telephone (CHAT) trial

    PubMed Central

    2013-01-01

    Background: Little research has explored who responds better to an automated vs. human advisor for health behaviors in general, and for physical activity (PA) promotion in particular. The purpose of this study was to explore baseline factors (i.e., demographics, motivation, interpersonal style, and external resources) that moderate intervention efficacy delivered by either a human or automated advisor. Methods: Data were from the CHAT Trial, a 12-month randomized controlled trial to increase PA among underactive older adults (full trial N = 218) via a human advisor or automated interactive voice response advisor. Trial results indicated significant increases in PA in both interventions by 12 months that were maintained at 18-months. Regression was used to explore moderation of the two interventions. Results: Results indicated amotivation (i.e., lack of intent in PA) moderated 12-month PA (d = 0.55, p < 0.01) and private self-consciousness (i.e., tendency to attune to one’s own inner thoughts and emotions) moderated 18-month PA (d = 0.34, p < 0.05) but a variety of other factors (e.g., demographics) did not (p > 0.12). Conclusions: Results provide preliminary evidence for generating hypotheses about pathways for supporting later clinical decision-making with regard to the use of either human- vs. computer-delivered interventions for PA promotion. PMID:24053756

  20. Machine vs. human translation of SNOMED CT terms.

    PubMed

    Schulz, Stefan; Bernhardt-Melischnig, Johannes; Kreuzthaler, Markus; Daumke, Philipp; Boeker, Martin

    2013-01-01

    In the context of past and current SNOMED CT translation projects we compare three kinds of SNOMED CT translations from English to German by: (t1) professional medical translators; (t2) a free Web-based machine translation service; (t3) medical students. 500 SNOMED CT fully specified names from the (English) International release were randomly selected. Based on this, German translations t1, t2, and t3 were generated. A German and an Austrian physician rated the translations for linguistic correctness and content fidelity. Kappa for inter-rater reliability was 0.4 for linguistic correctness and 0.23 for content fidelity. Average ratings of linguistic correctness did not differ significantly between human translation scenarios. Content fidelity was rated slightly better for student translators compared to professional translators. Comparing machine to human translation, the linguistic correctness differed about 0.5 scale units in favour of the human translation and about 0.25 regarding content fidelity, equally in favour of the human translation. The results demonstrate that low-cost translation solutions of medical terms may produce surprisingly good results. Although we would not recommend low-cost translation for producing standardized preferred terms, this approach can be useful for creating additional language-specific entry terms. This may serve several important use cases. We also recommend testing this method to bootstrap a crowdsourcing process, by which term translations are gathered, improved, maintained, and rated by the user community.

  1. Objective Model Selection for Identifying the Human Feedforward Response in Manual Control.

    PubMed

    Drop, Frank M; Pool, Daan M; van Paassen, Marinus Rene M; Mulder, Max; Bulthoff, Heinrich H

    2018-01-01

    Realistic manual control tasks typically involve predictable target signals and random disturbances. The human controller (HC) is hypothesized to use a feedforward control strategy for target-following, in addition to feedback control for disturbance-rejection. Little is known about human feedforward control, partly because common system identification methods have difficulty in identifying whether, and (if so) how, the HC applies a feedforward strategy. In this paper, an identification procedure is presented that aims at an objective model selection for identifying the human feedforward response, using linear time-invariant autoregressive with exogenous input models. A new model selection criterion is proposed to decide on the model order (number of parameters) and the presence of feedforward in addition to feedback. For a range of typical control tasks, it is shown by means of Monte Carlo computer simulations that the classical Bayesian information criterion (BIC) leads to selecting models that contain a feedforward path from data generated by a pure feedback model: "false-positive" feedforward detection. To eliminate these false-positives, the modified BIC includes an additional penalty on model complexity. The appropriate weighting is found through computer simulations with a hypothesized HC model prior to performing a tracking experiment. Experimental human-in-the-loop data will be considered in future work. With appropriate weighting, the method correctly identifies the HC dynamics in a wide range of control tasks, without false-positive results.
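
    The selection criterion described above weighs goodness of fit against model complexity; the authors' exact formulation is not reproduced here. The sketch below fits candidate ARX structures by least squares and scores them with a BIC carrying an extra complexity weight; the weight value, the toy feedback system and the candidate set are illustrative assumptions only.

    import numpy as np

    def fit_arx(y, u, na, nb):
        """Least-squares fit of y[k] = sum_i a_i*y[k-i] + sum_j b_j*u[k-j] + e[k]; returns residual variance."""
        k0 = max(na, nb, 1)
        rows = [np.r_[[y[k - i] for i in range(1, na + 1)],
                      [u[k - j] for j in range(1, nb + 1)]] for k in range(k0, len(y))]
        Phi, target = np.asarray(rows), y[k0:]
        theta, *_ = np.linalg.lstsq(Phi, target, rcond=None)
        return np.mean((target - Phi @ theta) ** 2)

    def weighted_bic(sigma2, n_samples, n_params, weight=3.0):
        # weight > 1 strengthens the complexity penalty, discouraging spurious
        # feedforward (input) terms; it would be tuned in prior simulations.
        return n_samples * np.log(sigma2) + weight * n_params * np.log(n_samples)

    # Toy data generated by a pure feedback-like ARX(2,1) system.
    rng = np.random.default_rng(1)
    u = rng.standard_normal(2000)
    y = np.zeros_like(u)
    for k in range(2, len(u)):
        y[k] = 0.6 * y[k - 1] - 0.2 * y[k - 2] + 0.5 * u[k - 1] + 0.05 * rng.standard_normal()

    scores = {(na, nb): weighted_bic(fit_arx(y, u, na, nb), len(y), na + nb)
              for na in (1, 2, 3) for nb in (0, 1, 2)}   # nb = 0 means "no input path"
    print(min(scores, key=scores.get))                   # expected to land near (2, 1)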

  2. The Microcosm within: An interview with William B. Miller, Jr., on the Extended Hologenome theory of evolution.

    PubMed

    Hunt, Tam

    2015-01-01

    There is a singular unifying reality underlying every biologic interaction on our planet. In immunology, that which does not kill you makes you different. -William B. Miller, Jr. We are experiencing a revolution in our understanding of inner space on a par with our exponentially increasing understanding of outer space. In biology, we are learning that the genetic and epigenetic complexity within organisms is far deeper than suspected. This is a key theme in William B. Miller Jr.'s book, The Microcosm Within: Evolution and Extinction in the Hologenome. We are learning also that a focus on the human genome alone is misleading when it comes to who we really are as biological entities, and in terms of how we and other creatures have evolved. Rather than being defined by the human genome alone, we are instead defined by the "hologenome," the sum of the human genome and the far larger genetic endowment of the microbiome and symbiotic communities that reside within and around us. Miller is a medical doctor previously in private practice in Pennsylvania and Phoenix, Arizona. This book is his first foray into evolutionary theory. His book could have been titled "The Origin of Variation" because this is his primary focus. He accepts that natural selection plays a role in evolution, but he demotes this mechanism to a less important role than the Modern Synthesis suggests. His main gripe, however, concerns random variation. He argues that random variation is unable to explain the origin and evolution of biological forms that we see in the world around us and in the historical record. Miller suggests that, rather than random variation as the engine of novelty, there is a creative impulse at the heart of cellular life, and even at the level of the genetic aggregate, that generates novelty on a regular basis. I probe this assertion in the interview below. He also highlights the strong role of "exogenous genetic assault" in variation and in his immunological model of evolution.

  3. Evidence for attractors in English intonation.

    PubMed

    Braun, Bettina; Kochanski, Greg; Grabe, Esther; Rosner, Burton S

    2006-06-01

    Although the pitch of the human voice is continuously variable, some linguists contend that intonation in speech is restricted to a small, limited set of patterns. This claim is tested by asking subjects to mimic a block of 100 randomly generated intonation contours and then to imitate themselves in several successive sessions. The produced f0 contours gradually converge towards a limited set of distinct, previously recognized basic English intonation patterns. These patterns are "attractors" in the space of possible English intonation contours. The convergence does not occur immediately. Seven of the ten participants show continued convergence toward their attractors after the first iteration. Subjects retain and use information beyond phonological contrasts, suggesting that intonational phonology is not a complete description of their mental representation of intonation.

  4. Acute effects of cigarette smoke exposure on experimental skin flaps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nolan, J.; Jenkins, R.A.; Kurihara, K.

    1985-04-01

    Random vascular patterned caudally based McFarlane-type skin flaps were elevated in groups of Fischer 344 rats. Groups of rats were then acutely exposed on an intermittent basis to smoke generated from well-characterized research filter cigarettes. Previously developed smoke inhalation exposure protocols were employed using a Maddox-ORNL inhalation exposure system. Rats that continued smoke exposure following surgery showed a significantly greater mean percent area of flap necrosis compared with sham-exposed groups or control groups not exposed. The possible pathogenesis of this observation as well as considerations and correlations with chronic human smokers are discussed. Increased risks of flap necrosis by smoking in the perioperative period are suggested by this study.

  5. A random set scoring model for prioritization of disease candidate genes using protein complexes and data-mining of GeneRIF, OMIM and PubMed records.

    PubMed

    Jiang, Li; Edwards, Stefan M; Thomsen, Bo; Workman, Christopher T; Guldbrandtsen, Bernt; Sørensen, Peter

    2014-09-24

    Prioritizing genetic variants is a challenge because disease susceptibility loci are often located in genes of unknown function or the relationship with the corresponding phenotype is unclear. A global data-mining exercise on the biomedical literature can establish the phenotypic profile of genes with respect to their connection to disease phenotypes. The importance of protein-protein interaction networks in the genetic heterogeneity of common diseases or complex traits is becoming increasingly recognized. Thus, the development of a network-based approach combined with phenotypic profiling would be useful for disease gene prioritization. We developed a random-set scoring model and implemented it to quantify phenotype relevance in a network-based disease gene-prioritization approach. We validated our approach based on different gene phenotypic profiles, which were generated from PubMed abstracts, OMIM, and GeneRIF records. We also investigated the validity of several vocabulary filters and different likelihood thresholds for predicted protein-protein interactions in terms of their effect on the network-based gene-prioritization approach, which relies on text-mining of the phenotype data. Our method demonstrated good precision and sensitivity compared with those of two alternative complex-based prioritization approaches. We then conducted a global ranking of all human genes according to their relevance to a range of human diseases. The resulting accurate ranking of known causal genes supported the reliability of our approach. Moreover, these data suggest many promising novel candidate genes for human disorders that have a complex mode of inheritance. We have implemented and validated a network-based approach to prioritize genes for human diseases based on their phenotypic profile. We have devised a powerful and transparent tool to identify and rank candidate genes. Our global gene prioritization provides a unique resource for the biological interpretation of data from genome-wide association studies, and will help in the understanding of how the associated genetic variants influence disease or quantitative phenotypes.

  6. Rationale and Design of a Clinical Trial to Evaluate the Safety and Efficacy of Intracoronary Infusion of Allogeneic Human Cardiac Stem Cells in Patients With Acute Myocardial Infarction and Left Ventricular Dysfunction: The Randomized Multicenter Double-Blind Controlled CAREMI Trial (Cardiac Stem Cells in Patients With Acute Myocardial Infarction).

    PubMed

    Sanz-Ruiz, Ricardo; Casado Plasencia, Ana; Borlado, Luis R; Fernández-Santos, María Eugenia; Al-Daccak, Reem; Claus, Piet; Palacios, Itziar; Sádaba, Rafael; Charron, Dominique; Bogaert, Jan; Mulet, Miguel; Yotti, Raquel; Gilaberte, Immaculada; Bernad, Antonio; Bermejo, Javier; Janssens, Stefan; Fernández-Avilés, Franciso

    2017-06-23

    Stem cell therapy has increased the therapeutic armamentarium in the fight against ischemic heart disease and heart failure. The administration of exogenous stem cells has been investigated in patients suffering an acute myocardial infarction, with the final aim of salvaging jeopardized myocardium and preventing left ventricular adverse remodeling and functional deterioration. However, phase I and II clinical trials with autologous and first-generation stem cells have yielded inconsistent benefits and mixed results. In the search for new and more efficient cellular regenerative products, interesting cardioprotective, immunoregulatory, and cardioregenerative properties have been demonstrated for human cardiac stem cells. On the other hand, allogeneic cells show several advantages over autologous sources: they can be produced in large quantities, easily administered off-the-shelf early after an acute myocardial infarction, comply with stringent criteria for product homogeneity, potency, and quality control, and may exhibit a distinctive immunologic behavior. With a promising preclinical background, CAREMI (Cardiac Stem Cells in Patients With Acute Myocardial Infarction) has been designed as a double-blind, 2:1 randomized, controlled, and multicenter clinical trial that will evaluate the safety, feasibility, and efficacy of intracoronary delivery of allogeneic human cardiac stem cell in 55 patients with large acute myocardial infarction, left ventricular dysfunction, and at high risk of developing heart failure. This phase I/II clinical trial represents a novel experience in humans with allogeneic cardiac stem cell in a rigorously imaging-based selected group of acute myocardial infarction patients, with detailed safety immunologic assessments and magnetic resonance imaging-based efficacy end points. URL: http://www.clinicaltrials.gov. Unique identifier: NCT02439398. © 2017 American Heart Association, Inc.

  7. Pigeons' Choices between Fixed-Interval and Random-Interval Schedules: Utility of Variability?

    ERIC Educational Resources Information Center

    Andrzejewski, Matthew E.; Cardinal, Claudia D.; Field, Douglas P.; Flannery, Barbara A.; Johnson, Michael; Bailey, Kathleen; Hineline, Philip N.

    2005-01-01

    Pigeons' choosing between fixed-interval and random-interval schedules of reinforcement was investigated in three experiments using a discrete-trial procedure. In all three experiments, the random-interval schedule was generated by sampling a probability distribution at an interval (and in multiples of the interval) equal to that of the…

  8. Theoretical deposition of carcinogenic particle aggregates in the upper respiratory tract.

    PubMed

    Sturm, Robert

    2013-10-01

    Numerous particles suspended in the atmosphere are composed of smaller particular components that form aggregates with highly irregular shape. Such aggregates, among which dusts and soot are the most prominent examples, may be taken up into the respiratory tract and, in the worst case, initiate a malignant transformation of lung cells. Particle aggregates were theoretically modelled by using small spheres with equal diameters (1 nm) and arranging them randomly. This procedure resulted in the generation of various aggregate shapes (chain-like, loose, compact), for which essential parameters such as dynamic shape factors, χ, and aerodynamic diameters, dae , were computed. Deposition of aggregates consisting of 10, 50, 100, and 1,000 nano-spheres was simulated for the uppermost parts of the human respiratory system (extrathoracic region and airway generation 0 to 4), thereby distinguishing between sitting and light-work breathing as well as between nasal and oral inhalation. Based upon the modelling results, aggregate deposition in the human respiratory system can be described as a function of (I) aerodynamic diameter; (II) inhaled particle position within the airway system; and (III) breathing conditions. Therefore, highest deposition values were obtained for nano-scale aggregates (<10 nm), whereas larger aggregates exhibited slightly to significantly reduced deposition probabilities. Extrathoracic regions and uppermost bronchi (generations 0 to 1) were marked by most effective particle capture. Any increase of inhaled air volumes and reduction of breathing times resulted in an enhancement of deposition probabilities of larger particles. Based on the results derived from this study it may be concluded that small particle aggregates are accumulated in the uppermost compartments of the human respiratory tract, where they may unfold their unwholesome potential. In the case of carcinogenic particles being stored in epithelial cells for a longer time span, malignant transformations starting with the formation of cancerous cells and ending with the growth of a tumour have to be assumed.
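
    As a rough illustration of the aggregate-construction step (random attachment of equal 1 nm spheres into chain-like, loose or compact clusters), the sketch below grows a cluster by sequential random attachment and reports its radius of gyration; the study's shape factors and aerodynamic diameters are not computed here, and all parameters are illustrative.

    import numpy as np

    def grow_aggregate(n_spheres, d=1.0, seed=0):
        """Grow a random aggregate of equal spheres (diameter d) by sequential attachment.

        Each new sphere is placed in contact with a randomly chosen existing sphere,
        in a random direction, and rejected if it overlaps any other sphere.
        Returns an (n_spheres, 3) array of centre coordinates.
        """
        rng = np.random.default_rng(seed)
        centres = [np.zeros(3)]
        while len(centres) < n_spheres:
            anchor = centres[rng.integers(len(centres))]
            v = rng.standard_normal(3)
            candidate = anchor + d * v / np.linalg.norm(v)
            dists = np.linalg.norm(np.asarray(centres) - candidate, axis=1)
            if np.all(dists >= d - 1e-9):     # touching is allowed, overlap is not
                centres.append(candidate)
        return np.asarray(centres)

    agg = grow_aggregate(100)
    r_g = np.sqrt(np.mean(np.sum((agg - agg.mean(axis=0)) ** 2, axis=1)))
    print(f"radius of gyration ~ {r_g:.2f} sphere diameters")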

  9. The origin of human complex diversity: Stochastic epistatic modules and the intrinsic compatibility between distributional robustness and phenotypic changeability.

    PubMed

    Ijichi, Shinji; Ijichi, Naomi; Ijichi, Yukina; Imamura, Chikako; Sameshima, Hisami; Kawaike, Yoichi; Morioka, Hirofumi

    2018-01-01

    The continuing prevalence of a highly heritable and hypo-reproductive extreme tail of a human neurobehavioral quantitative diversity suggests the possibility that the reproductive majority retains the genetic mechanism for the extremes. From the perspective of stochastic epistasis, the effect of an epistatic modifier variant can randomly vary in both phenotypic value and effect direction among carriers depending on the genetic individuality, and modifier carriers are ubiquitous in the population distribution. The neutrality of the mean genetic effect among carriers warrants the survival of the variant under selection pressures. Functionally or metabolically related modifier variants form an epistatic network module, and dozens of modules may be involved in the phenotype. To assess the significance of stochastic epistasis, a simplified module-based model was employed. The individual repertoire of the modifier variants in a module also participates in the genetic individuality which determines the genetic contribution of each modifier in the carrier. Because the entire contribution of a module to the phenotypic outcome is consequently unpredictable in the model, the module effect represents the total contribution of the related modifiers as a stochastic unit in the simulations. As a result, the intrinsic compatibility between distributional robustness and quantitative changeability could mathematically be simulated using the model. The artificial normal distribution shape in large-sized simulations was preserved in each generation even if the lowest fitness tail was un-reproductive. The robustness of normality beyond generations is analogous to the real situations of human complex diversity including neurodevelopmental conditions. The repeated regeneration of the un-reproductive extreme tail may be inevitable for the reproductive majority's competence to survive and change, suggesting implications of the extremes for others. Further model-simulations to illustrate how the fitness of extreme individuals can be low through generations may be warranted to increase the credibility of this stochastic epistasis model.

  10. The effect of atomoxetine on random and directed exploration in humans.

    PubMed

    Warren, Christopher M; Wilson, Robert C; van der Wee, Nic J; Giltay, Eric J; van Noorden, Martijn S; Cohen, Jonathan D; Nieuwenhuis, Sander

    2017-01-01

    The adaptive regulation of the trade-off between pursuing a known reward (exploitation) and sampling lesser-known options in search of something better (exploration) is critical for optimal performance. Theory and recent empirical work suggest that humans use at least two strategies for solving this dilemma: a directed strategy in which choices are explicitly biased toward information seeking, and a random strategy in which decision noise leads to exploration by chance. Here we examined the hypothesis that random exploration is governed by the neuromodulatory locus coeruleus-norepinephrine system. We administered atomoxetine, a norepinephrine transporter blocker that increases extracellular levels of norepinephrine throughout the cortex, to 22 healthy human participants in a double-blind crossover design. We examined the effect of treatment on performance in a gambling task designed to produce distinct measures of directed exploration and random exploration. In line with our hypothesis we found an effect of atomoxetine on random, but not directed exploration. However, contrary to expectation, atomoxetine reduced rather than increased random exploration. We offer three potential explanations of our findings, involving the non-linear relationship between tonic NE and cognitive performance, the interaction of atomoxetine with other neuromodulators, and the possibility that atomoxetine affected phasic norepinephrine activity more so than tonic norepinephrine activity.

  11. Searching for Survivors through Random Human-Body Movement Outdoors by Continuous-Wave Radar Array

    PubMed Central

    Liu, Miao; Li, Zhao; Liang, Fulai; Jing, Xijing; Lu, Guohua; Wang, Jianqi

    2016-01-01

    It is a major challenge to search for survivors after chemical or nuclear leakage or explosions. At present, biological radar can be used to achieve this goal by detecting the survivor’s respiration signal. However, owing to the random posture of an injured person at a rescue site, the radar wave may directly irradiate the person’s head or feet, in which it is difficult to detect the respiration signal. This paper describes a multichannel-based antenna array technology, which forms an omnidirectional detection system via 24-GHz Doppler biological radar, to address the random positioning relative to the antenna of an object to be detected. Furthermore, since the survivors often have random body movement such as struggling and twitching, the slight movements of the body caused by breathing are obscured by these movements. Therefore, a method is proposed to identify random human-body movement by utilizing multichannel information to calculate the background variance of the environment in combination with a constant-false-alarm-rate detector. The conducted outdoor experiments indicate that the system can realize the omnidirectional detection of random human-body movement and distinguish body movement from environmental interference such as movement of leaves and grass. The methods proposed in this paper will be a promising way to search for survivors outdoors. PMID:27073860

  12. Searching for Survivors through Random Human-Body Movement Outdoors by Continuous-Wave Radar Array.

    PubMed

    Li, Chuantao; Chen, Fuming; Qi, Fugui; Liu, Miao; Li, Zhao; Liang, Fulai; Jing, Xijing; Lu, Guohua; Wang, Jianqi

    2016-01-01

    It is a major challenge to search for survivors after chemical or nuclear leakage or explosions. At present, biological radar can be used to achieve this goal by detecting the survivor's respiration signal. However, owing to the random posture of an injured person at a rescue site, the radar wave may directly irradiate the person's head or feet, in which it is difficult to detect the respiration signal. This paper describes a multichannel-based antenna array technology, which forms an omnidirectional detection system via 24-GHz Doppler biological radar, to address the random positioning relative to the antenna of an object to be detected. Furthermore, since the survivors often have random body movement such as struggling and twitching, the slight movements of the body caused by breathing are obscured by these movements. Therefore, a method is proposed to identify random human-body movement by utilizing multichannel information to calculate the background variance of the environment in combination with a constant-false-alarm-rate detector. The conducted outdoor experiments indicate that the system can realize the omnidirectional detection of random human-body movement and distinguish body movement from environmental interference such as movement of leaves and grass. The methods proposed in this paper will be a promising way to search for survivors outdoors.
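
    The detection step described above combines a background-variance estimate from the multichannel data with a constant-false-alarm-rate (CFAR) test. The papers do not give the exact statistic, so the sketch below uses a generic cell-averaging CFAR over a per-bin variance map; the channel count, threshold formula and target position are illustrative assumptions.

    import numpy as np

    def ca_cfar(power, guard=2, train=8, pfa=1e-3):
        """Generic 1-D cell-averaging CFAR detector over a power (variance) profile."""
        n = power.size
        n_train = 2 * train
        alpha = n_train * (pfa ** (-1.0 / n_train) - 1.0)  # threshold factor (exponential noise model)
        hits = np.zeros(n, dtype=bool)
        for i in range(guard + train, n - guard - train):
            lead = power[i - guard - train:i - guard]
            lag = power[i + guard + 1:i + guard + train + 1]
            hits[i] = power[i] > alpha * np.mean(np.r_[lead, lag])
        return hits

    # Toy data: 24 radar channels x 128 range bins of clutter, plus body movement at bin 60.
    rng = np.random.default_rng(2)
    channels = rng.standard_normal((24, 128))
    channels[:, 60] += 4.0 * rng.standard_normal(24)
    variance_map = channels.var(axis=0)           # background variance across channels, per bin
    print(np.flatnonzero(ca_cfar(variance_map)))  # expected to flag bin 60 only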

  13. Pulmonary rehabilitation after total laryngectomy: a randomized cross-over clinical trial comparing two different heat and moisture exchangers (HMEs).

    PubMed

    Herranz, Jesús; Espiño, María Alvarez; Morado, Carolina Ogen

    2013-09-01

    Post-laryngectomy heat and moisture exchanger (HME) use is known to have a beneficial effect on tracheal climate, pulmonary symptoms and related aspects. This study aims to investigate differences in clinical effects between the first and second generation Provox HMEs. The second generation (Provox XtraHME) has better humidification properties than the first generation (Provox HME), and has been shown to further improve tracheal climate. Forty-five laryngectomized patients, who were already using an HME, participated in a prospective, randomized cross-over clinical study in which each HME was used for 6 weeks. Results showed that for most parameters studied, the second generation HME performed equally well or better than the first generation HME. The improvement in tracheal climate translated into patients reporting significantly less tracheal dryness with the second generation than with the first generation (p = 0.039). Using an HME with better humidification properties is related to a reduction in tracheal dryness in our study population.

  14. Dynamic analysis of a pumped-storage hydropower plant with random power load

    NASA Astrophysics Data System (ADS)

    Zhang, Hao; Chen, Diyi; Xu, Beibei; Patelli, Edoardo; Tolo, Silvia

    2018-02-01

    This paper analyzes the dynamic response of a pumped-storage hydropower plant in generating mode. Considering the elastic water column effects in the penstock, a linearized reduced order dynamic model of the pumped-storage hydropower plant is used in this paper. As the power load is always random, a set of random generator electric power output is introduced to research the dynamic behaviors of the pumped-storage hydropower plant. Then, the influences of the PI gains on the dynamic characteristics of the pumped-storage hydropower plant with the random power load are analyzed. In addition, the effects of initial power load and PI parameters on the stability of the pumped-storage hydropower plant are studied in depth. All of the above results will provide theoretical guidance for the study and analysis of the pumped-storage hydropower plant.

  15. Random Telegraph Signal-Like Fluctuation Created by Fowler-Nordheim Stress in Gate Induced Drain Leakage Current of the Saddle Type Dynamic Random Access Memory Cell Transistor

    NASA Astrophysics Data System (ADS)

    Kim, Heesang; Oh, Byoungchan; Kim, Kyungdo; Cha, Seon-Yong; Jeong, Jae-Goan; Hong, Sung-Joo; Lee, Jong-Ho; Park, Byung-Gook; Shin, Hyungcheol

    2010-09-01

    We generated traps inside the gate oxide in the gate-drain overlap region of a recess-channel-type dynamic random access memory (DRAM) cell transistor through Fowler-Nordheim (FN) stress, and observed the gate-induced drain leakage (GIDL) current both in the time domain and in the frequency domain. It was found that a trap inside the gate oxide could generate random telegraph signal (RTS)-like fluctuation in the GIDL current. The characteristics of this fluctuation were similar to those of the RTS-like fluctuation in GIDL current observed in the non-stressed device. This result shows the possibility that the trap causing variable retention time (VRT) in DRAM data retention can be located inside the gate oxide, like the channel RTS of metal-oxide-semiconductor field-effect transistors (MOSFETs).

  16. Seminar on Understanding Digital Control and Analysis in Vibration Test Systems, part 2

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A number of techniques for dealing with important technical aspects of the random vibration control problem are described. These include the generation of pseudo-random and true random noise, the control spectrum estimation problem, the accuracy/speed tradeoff, and control correction strategies. System hardware, the operator-system interface, safety features, and operational capabilities of sophisticated digital random vibration control systems are also discussed.
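
    One common way to generate the pseudo-random drive signals mentioned above is to assign random phases to a prescribed magnitude spectrum and inverse-FFT the result; the sketch below does this for a flat band, with the sampling rate, band edges and block length chosen arbitrarily for illustration.

    import numpy as np

    def band_limited_pseudorandom(n=4096, fs=8192.0, f_lo=20.0, f_hi=2000.0, seed=0):
        """Periodic pseudo-random noise with a flat spectrum in [f_lo, f_hi].

        Random phases are applied to a prescribed magnitude spectrum and the result
        is inverse-FFT'd; repeating the same seed reproduces the same drive block.
        """
        rng = np.random.default_rng(seed)
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        mag = ((freqs >= f_lo) & (freqs <= f_hi)).astype(float)  # target spectrum shape
        phases = rng.uniform(0.0, 2.0 * np.pi, size=freqs.size)
        spectrum = mag * np.exp(1j * phases)
        spectrum[0] = 0.0                                        # no DC component
        x = np.fft.irfft(spectrum, n=n)
        return x / np.std(x)                                     # normalise to unit RMS

    drive = band_limited_pseudorandom()
    print(drive.shape, round(float(drive.std()), 3))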

  17. Examining the Preparatory Effects of Problem Generation and Solution Generation on Learning from Instruction

    ERIC Educational Resources Information Center

    Kapur, Manu

    2018-01-01

    The goal of this paper is to isolate the preparatory effects of problem-generation from solution generation in problem-posing contexts, and their underlying mechanisms on learning from instruction. Using a randomized-controlled design, students were assigned to one of two conditions: (a) problem-posing with solution generation, where they…

  18. Recurrence of random walks with long-range steps generated by fractional Laplacian matrices on regular networks and simple cubic lattices

    NASA Astrophysics Data System (ADS)

    Michelitsch, T. M.; Collet, B. A.; Riascos, A. P.; Nowakowski, A. F.; Nicolleau, F. C. G. A.

    2017-12-01

    We analyze a Markovian random walk strategy on undirected regular networks involving power matrix functions of the type L^{α/2}, where L indicates a ‘simple’ Laplacian matrix. We refer to such walks as ‘fractional random walks’ with admissible interval 0 < α ≤ 2. We deduce probability-generating functions (network Green’s functions) for the fractional random walk. From these analytical results we establish a generalization of Polya’s recurrence theorem for fractional random walks on d-dimensional infinite lattices: the fractional random walk is transient for dimensions d > α (recurrent for d ≤ α) of the lattice. As a consequence, for 0 < α < 1 the fractional random walk is transient for all lattice dimensions d = 1, 2, ... and in the range 1 ≤ α < 2 for dimensions d ≥ 2. Finally, for α = 2, Polya’s classical recurrence theorem is recovered, namely the walk is transient only for lattice dimensions d ≥ 3. The generalization of Polya’s recurrence theorem remains valid for the class of random walks with Lévy flight asymptotics for long-range steps. We also analyze the mean first passage probabilities, mean residence times, mean first passage times and global mean first passage times (Kemeny constant) for the fractional random walk. For an infinite 1D lattice (infinite ring) we obtain for the transient regime 0 < α < 1 closed-form expressions for the fractional lattice Green’s function matrix containing the escape and ever-passage probabilities. The ever-passage probabilities (fractional lattice Green’s functions) in the transient regime fulfil Riesz potential power-law decay asymptotic behavior for nodes far from the departure node. The non-locality of the fractional random walk is generated by the non-diagonality of the fractional Laplacian matrix with Lévy-type heavy-tailed inverse power-law decay for the probability of long-range moves. This non-local and asymptotic behavior of the fractional random walk introduces small-world properties with the emergence of Lévy flights on large (infinite) lattices.
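
    As a small numerical companion to the construction above (not the authors' code), the sketch below forms L^{α/2} for a finite ring by eigendecomposition and converts it into the transition matrix of the fractional random walk, whose long-range jump probabilities decay as an inverse power law; the ring size and α value are arbitrary.

    import numpy as np

    def fractional_walk_matrix(n_nodes, alpha):
        """Transition matrix of a fractional random walk on a ring of n_nodes sites.

        L is the combinatorial Laplacian of the cycle graph; L^{alpha/2} is formed by
        eigendecomposition, and transitions are w_ij = -(L^{a/2})_ij / (L^{a/2})_ii.
        """
        A = np.zeros((n_nodes, n_nodes))
        idx = np.arange(n_nodes)
        A[idx, (idx + 1) % n_nodes] = 1.0
        A[idx, (idx - 1) % n_nodes] = 1.0
        L = np.diag(A.sum(axis=1)) - A
        evals, evecs = np.linalg.eigh(L)
        L_frac = evecs @ np.diag(np.clip(evals, 0.0, None) ** (alpha / 2)) @ evecs.T
        W = -L_frac / np.diag(L_frac)[:, None]
        np.fill_diagonal(W, 0.0)
        return W

    W = fractional_walk_matrix(101, alpha=1.5)
    # Jump probabilities from node 0 decay with distance as a heavy-tailed power law.
    print(np.round(W[0, 1:6], 4), round(float(W[0].sum()), 6))   # rows sum to 1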

  19. Change blindness in pigeons (Columba livia): the effects of change salience and timing

    PubMed Central

    Herbranson, Walter T.

    2015-01-01

    Change blindness is a well-established phenomenon in humans, in which plainly visible changes in the environment go unnoticed. Recently a parallel change blindness phenomenon has been demonstrated in pigeons. The reported experiment follows up on this finding by investigating whether change salience affects change blindness in pigeons the same way it affects change blindness in humans. Birds viewed alternating displays of randomly generated lines back-projected onto three response keys, with one or more line features on a single key differing between consecutive displays. Change salience was manipulated by varying the number of line features that changed on the critical response key. Results indicated that change blindness is reduced if a change is made more salient, and this matches previous human results. Furthermore, accuracy patterns indicate that pigeons’ effective search area expanded over the course of a trial to encompass a larger portion of the stimulus environment. Thus, the data indicate two important aspects of temporal cognition. First, the timing of a change has a profound influence on whether or not that change will be perceived. Second, pigeons appear to engage in a serial search for changes, in which additional time is required to search additional locations. PMID:26284021

  20. Iteration of ultrasound aberration correction methods

    NASA Astrophysics Data System (ADS)

    Maasoey, Svein-Erik; Angelsen, Bjoern; Varslot, Trond

    2004-05-01

    Aberration in ultrasound medical imaging is usually modeled by time-delay and amplitude variations concentrated on the transmitting/receiving array. This filter process is here denoted a TDA filter. The TDA filter is an approximation to the physical aberration process, which occurs over an extended part of the human body wall. Estimation of the TDA filter, and performing correction on transmit and receive, has proven difficult. It has yet to be shown that this method works adequately for severe aberration. Estimation of the TDA filter can be iterated by retransmitting a corrected signal and re-estimate until a convergence criterion is fulfilled (adaptive imaging). Two methods for estimating time-delay and amplitude variations in receive signals from random scatterers have been developed. One method correlates each element signal with a reference signal. The other method use eigenvalue decomposition of the receive cross-spectrum matrix, based upon a receive energy-maximizing criterion. Simulations of iterating aberration correction with a TDA filter have been investigated to study its convergence properties. A weak and strong human-body wall model generated aberration. Both emulated the human abdominal wall. Results after iteration improve aberration correction substantially, and both estimation methods converge, even for the case of strong aberration.
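
    The first estimation method mentioned above correlates each element signal with a reference signal; as a hedged illustration of that idea only, the sketch below estimates per-element delays from the peak of the cross-correlation against a beamsum reference, using a synthetic echo and arbitrary parameters.

    import numpy as np

    def estimate_delays(element_signals, reference):
        """Estimate per-element time delays (in samples) from the peak of the cross-correlation."""
        delays = []
        for sig in element_signals:
            xc = np.correlate(sig, reference, mode="full")
            delays.append(int(np.argmax(xc)) - (len(reference) - 1))
        return np.asarray(delays)

    # Toy array data: a common echo envelope arriving with element-dependent delays plus noise.
    rng = np.random.default_rng(3)
    t = np.arange(512)
    pulse = np.exp(-0.5 * ((t - 256) / 8.0) ** 2)
    true_delays = rng.integers(-10, 11, size=16)
    elements = np.array([np.roll(pulse, d) + 0.05 * rng.standard_normal(t.size) for d in true_delays])
    reference = elements.mean(axis=0)             # beamsum used as the reference signal
    est = estimate_delays(elements, reference)
    print(est - true_delays)                      # small residual errors, up to a common offset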

  1. A replication defective recombinant Ad5 vaccine expressing Ebola virus GP is safe and immunogenic in healthy adults.

    PubMed

    Ledgerwood, J E; Costner, P; Desai, N; Holman, L; Enama, M E; Yamshchikov, G; Mulangu, S; Hu, Z; Andrews, C A; Sheets, R A; Koup, R A; Roederer, M; Bailer, R; Mascola, J R; Pau, M G; Sullivan, N J; Goudsmit, J; Nabel, G J; Graham, B S

    2010-12-16

    Ebola virus causes irregular outbreaks of severe hemorrhagic fever in equatorial Africa. Case mortality remains high; there is no effective treatment and outbreaks are sporadic and unpredictable. Studies of Ebola virus vaccine platforms in non-human primates have established that the induction of protective immunity is possible, and safety and human immunogenicity have been demonstrated in a previous Phase I clinical trial of a 1st generation Ebola DNA vaccine. We now report the safety and immunogenicity of a recombinant adenovirus serotype 5 (rAd5) vaccine encoding the envelope glycoprotein (GP) from the Zaire and Sudan Ebola virus species, in a randomized, placebo-controlled, double-blinded, dose escalation, Phase I human study. Thirty-one healthy adults received vaccine at 2×10⁹ (n=12) or 2×10¹⁰ (n=11) viral particles, or placebo (n=8), as an intramuscular injection. Antibody responses were assessed by ELISA and neutralizing assays, and T cell responses were assessed by ELISpot and intracellular cytokine staining assays. This recombinant Ebola virus vaccine was safe and subjects developed antigen specific humoral and cellular immune responses. Published by Elsevier Ltd.

  2. Origin of Somatic Mutations in β-Catenin versus Adenomatous Polyposis Coli in Colon Cancer: Random Mutagenesis in Animal Models versus Nonrandom Mutagenesis in Humans.

    PubMed

    Yang, Da; Zhang, Min; Gold, Barry

    2017-07-17

    Wnt signaling is compromised early in the development of human colorectal cancer (CRC) due to truncating nonsense mutations in adenomatous polyposis coli (APC). CRC induced by chemical carcinogens, such as heterocyclic aromatic amines and azoxymethane, in mice also involves dysregulation of Wnt signaling but via activating missense mutations in the β-catenin oncogene despite the fact that genetically modified mice harboring an inactive APC allele efficiently develop CRC. In contrast, activating mutations in β-catenin are rarely observed in human CRC. Dysregulation of the Wnt signaling pathway by the two distinct mechanisms reveals insights into the etiology of human CRC. On the basis of calculations related to DNA adduct levels produced in mouse CRC models using mutagens, and the number of stem cells in the mouse colon, we show that two nonsense mutations required for biallelic disruption of APC are statistically unlikely to produce CRC in experiments using small numbers of mice. We calculate that an activating mutation in one allele near the critical GSK3β phosphorylation site on β-catenin is >10⁵ times more likely to produce CRC by random mutagenesis due to chemicals than inactivating two alleles in APC, yet it does not occur in humans. Therefore, the mutagenesis mechanism in human CRC cannot be random. We explain that nonsense APC mutations predominate in human CRC because of deamination at 5-methylcytosine at CGA and CAG codons, coupled with the number of human colonic stem cells and lifespan. Our analyses, including a comparison of mutation type and age at CRC diagnosis in U.S. and Chinese patients, also indicate that APC mutations in CRC are not due to environmental mutagens that randomly damage DNA.

  3. A critical evaluation of random copolymer mimesis of homogeneous antimicrobial peptides

    PubMed Central

    Hu, Kan; Schmidt, Nathan W.; Zhu, Rui; Jiang, Yunjiang; Lai, Ghee Hwee; Wei, Gang; Palermo, Edmund F.; Kuroda, Kenichi; Wong, Gerard C. L.; Yang, Lihua

    2013-01-01

    Polymeric synthetic mimics of antimicrobial peptides (SMAMPs) have recently demonstrated similar antimicrobial activity as natural antimicrobial peptides (AMPs) from innate immunity. This is surprising, since polymeric SMAMPs are heterogeneous in terms of chemical structure (random sequence) and conformation (random coil), in contrast to defined amino acid sequence and intrinsic secondary structure. To understand this better, we compare AMPs with a ‘minimal’ mimic, a well characterized family of polydisperse cationic methacrylate-based random copolymer SMAMPs. Specifically, we focus on a comparison between the quantifiable membrane curvature generating capacity, charge density, and hydrophobicity of the polymeric SMAMPs and AMPs. Synchrotron small angle x-ray scattering (SAXS) results indicate that typical AMPs and these methacrylate SMAMPs generate similar amounts of membrane negative Gaussian curvature (NGC), which is topologically necessary for a variety of membrane-destabilizing processes. Moreover, the curvature generating ability of SMAMPs is more tolerant of changes in the lipid composition than that of natural AMPs with similar chemical groups, consistent with the lower specificity of SMAMPs. We find that, although the amount of NGC generated by these SMAMPs and AMPs are similar, the SMAMPs require significantly higher levels of hydrophobicity and cationic charge to achieve the same level of membrane deformation. We propose an explanation for these differences, which has implications for new synthetic strategies aimed at improved mimesis of AMPs. PMID:23750051

  4. Tailpulse signal generator

    DOEpatents

    Baker, John [Walnut Creek, CA; Archer, Daniel E [Knoxville, TN; Luke, Stanley John [Pleasanton, CA; Decman, Daniel J [Livermore, CA; White, Gregory K [Livermore, CA

    2009-06-23

    A tailpulse signal generating/simulating apparatus, system, and method designed to produce electronic pulses which simulate tailpulses produced by a gamma radiation detector, including the pileup effect caused by the characteristic exponential decay of the detector pulses, and the random Poisson distribution pulse timing for radioactive materials. A digital signal process (DSP) is programmed and configured to produce digital values corresponding to pseudo-randomly selected pulse amplitudes and pseudo-randomly selected Poisson timing intervals of the tailpulses. Pulse amplitude values are exponentially decayed while outputting the digital value to a digital to analog converter (DAC). And pulse amplitudes of new pulses are added to decaying pulses to simulate the pileup effect for enhanced realism in the simulation.
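
    The patent text above describes pulses with Poisson-distributed arrival times, pseudo-random amplitudes, exponential decay and pileup; the short simulation below reproduces those ingredients digitally, with the event rate, decay constant and amplitude range chosen arbitrarily rather than taken from the patent.

    import numpy as np

    def simulate_tailpulses(duration_s=1e-3, fs=10e6, rate_hz=50e3, tau_s=5e-6, seed=0):
        """Simulate a detector-like tailpulse train with Poisson timing and pileup.

        Each event adds an exponentially decaying step; overlapping decays sum,
        reproducing the pileup effect described above.
        """
        rng = np.random.default_rng(seed)
        n = int(duration_s * fs)
        t = np.arange(n) / fs
        signal = np.zeros(n)
        arrival = rng.exponential(1.0 / rate_hz)        # Poisson process: exponential inter-arrival times
        while arrival < duration_s:
            amp = rng.uniform(0.1, 1.0)                 # pseudo-random pulse amplitude
            k0 = int(arrival * fs)
            signal[k0:] += amp * np.exp(-(t[k0:] - arrival) / tau_s)
            arrival += rng.exponential(1.0 / rate_hz)
        return t, signal

    t, v = simulate_tailpulses()
    print(round(float(v.max()), 2), "max amplitude, reflecting pileup of overlapping decays")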

  5. Discovery of common sequences absent in the human reference genome using pooled samples from next generation sequencing.

    PubMed

    Liu, Yu; Koyutürk, Mehmet; Maxwell, Sean; Xiang, Min; Veigl, Martina; Cooper, Richard S; Tayo, Bamidele O; Li, Li; LaFramboise, Thomas; Wang, Zhenghe; Zhu, Xiaofeng; Chance, Mark R

    2014-08-16

    Sequences up to several megabases in length have been found to be present in individual genomes but absent in the human reference genome. These sequences may be common in populations, and their absence in the reference genome may indicate rare variants in the genomes of individuals who served as donors for the human genome project. As the reference genome is used in probe design for microarray technology and mapping short reads in next generation sequencing (NGS), this missing sequence could be a source of bias in functional genomic studies and variant analysis. One End Anchor (OEA) and/or orphan reads from paired-end sequencing have been used to identify novel sequences that are absent in reference genome. However, there is no study to investigate the distribution, evolution and functionality of those sequences in human populations. To systematically identify and study the missing common sequences (micSeqs), we extended the previous method by pooling OEA reads from large number of individuals and applying strict filtering methods to remove false sequences. The pipeline was applied to data from phase 1 of the 1000 Genomes Project. We identified 309 micSeqs that are present in at least 1% of the human population, but absent in the reference genome. We confirmed 76% of these 309 micSeqs by comparison to other primate genomes, individual human genomes, and gene expression data. Furthermore, we randomly selected fifteen micSeqs and confirmed their presence using PCR validation in 38 additional individuals. Functional analysis using published RNA-seq and ChIP-seq data showed that eleven micSeqs are highly expressed in human brain and three micSeqs contain transcription factor (TF) binding regions, suggesting they are functional elements. In addition, the identified micSeqs are absent in non-primates and show dynamic acquisition during primate evolution culminating with most micSeqs being present in Africans, suggesting some micSeqs may be important sources of human diversity. 76% of micSeqs were confirmed by a comparative genomics approach. Fourteen micSeqs are expressed in human brain or contain TF binding regions. Some micSeqs are primate-specific, conserved and may play a role in the evolution of primates.

  6. A nonparametric method to generate synthetic populations to adjust for complex sampling design features.

    PubMed

    Dong, Qi; Elliott, Michael R; Raghunathan, Trivellore E

    2014-06-01

    Outside of the survey sampling literature, samples are often assumed to be generated by a simple random sampling process that produces independent and identically distributed (IID) samples. Many statistical methods are developed largely in this IID world. Application of these methods to data from complex sample surveys without making allowance for the survey design features can lead to erroneous inferences. Hence, much time and effort have been devoted to developing statistical methods to analyze complex survey data and account for the sample design. This issue is particularly important when generating synthetic populations using finite population Bayesian inference, as is often done in missing data or disclosure risk settings, or when combining data from multiple surveys. By extending previous work in the finite population Bayesian bootstrap literature, we propose a method to generate synthetic populations from a posterior predictive distribution in a fashion that inverts the complex sampling design features and generates simple random samples from a superpopulation point of view, adjusting the complex data so that they can be analyzed as simple random samples. We consider a simulation study with a stratified, clustered unequal-probability of selection sample design, and use the proposed nonparametric method to generate synthetic populations for the 2006 National Health Interview Survey (NHIS), and the Medical Expenditure Panel Survey (MEPS), which are stratified, clustered unequal-probability of selection sample designs.

  7. A nonparametric method to generate synthetic populations to adjust for complex sampling design features

    PubMed Central

    Dong, Qi; Elliott, Michael R.; Raghunathan, Trivellore E.

    2017-01-01

    Outside of the survey sampling literature, samples are often assumed to be generated by a simple random sampling process that produces independent and identically distributed (IID) samples. Many statistical methods are developed largely in this IID world. Application of these methods to data from complex sample surveys without making allowance for the survey design features can lead to erroneous inferences. Hence, much time and effort have been devoted to developing statistical methods to analyze complex survey data and account for the sample design. This issue is particularly important when generating synthetic populations using finite population Bayesian inference, as is often done in missing data or disclosure risk settings, or when combining data from multiple surveys. By extending previous work in the finite population Bayesian bootstrap literature, we propose a method to generate synthetic populations from a posterior predictive distribution in a fashion that inverts the complex sampling design features and generates simple random samples from a superpopulation point of view, adjusting the complex data so that they can be analyzed as simple random samples. We consider a simulation study with a stratified, clustered unequal-probability of selection sample design, and use the proposed nonparametric method to generate synthetic populations for the 2006 National Health Interview Survey (NHIS), and the Medical Expenditure Panel Survey (MEPS), which are stratified, clustered unequal-probability of selection sample designs. PMID:29200608
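
    As a rough sketch of the general idea only (a weight-driven Polya-urn resampling in the spirit of the finite population Bayesian bootstrap, not the authors' exact algorithm), the code below regenerates a synthetic population from a weighted sample; strata, clusters and the specific posterior predictive distribution are omitted, and all values are illustrative.

    import numpy as np

    def weighted_fpbb(sample_ids, weights, pop_size, seed=0):
        """Simplified weighted Polya-urn resampling of a synthetic population.

        The urn starts with mass equal to each unit's survey weight, and a unit's
        mass grows by one each time it is drawn, until pop_size draws are made.
        The result can then be analysed as if it were a simple random sample census.
        """
        rng = np.random.default_rng(seed)
        mass = np.asarray(weights, dtype=float).copy()
        synthetic = []
        for _ in range(pop_size):
            j = rng.choice(len(sample_ids), p=mass / mass.sum())
            synthetic.append(sample_ids[j])
            mass[j] += 1.0
        return np.asarray(synthetic)

    # Toy stratified sample: three strata with unequal selection probabilities.
    ids = np.arange(9)
    w = np.array([10, 10, 10, 4, 4, 4, 2, 2, 2], dtype=float)   # design weights
    pop = weighted_fpbb(ids, w, pop_size=int(w.sum()))
    print(np.bincount(pop, minlength=9))   # high-weight units dominate the synthetic population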

  8. Impact of Health Research Systems on Under-5 Mortality Rate: A Trend Analysis.

    PubMed

    Yazdizadeh, Bahareh; Parsaeian, Mahboubeh; Majdzadeh, Reza; Nikooee, Sima

    2016-11-26

    Between 1990 and 2015, the under-5 mortality rate (U5MR) declined by 53% globally, from an estimated 91 deaths per 1000 live births to 43. The aim of this study was to determine the share of health research systems in this decrease alongside other influential factors. We used random effect regression models, including 'random intercept' and 'random intercept and random slope' models, to analyze panel data from 1990 to 2010. We selected the countries with U5MRs falling between the first and third quartiles in 1990. We used both the total number of articles (TA) and the number of child-specific articles (CSA) as proxies of the health research system. To account for the impact of other factors, measles vaccination coverage (MVC) (as a proxy of health system performance), gross domestic product (GDP), human development index (HDI), and corruption perception index (CPI) (as proxies of development) were embedded in the model. Among all the models, the 'random intercept and random slope' models had lower residuals. The variables CSA, HDI, and time were significant, and the coefficient of CSA was estimated at -0.17, meaning that with every additional 100 CSA, the U5MR decreased by 17 per 1000 live births. Although the number of CSA has contributed to the reduction of U5MR, the amount of its contribution is negligible compared to the countries' development. We recommend entering different types of research into the model separately in future work and including a variable for 'exchange between knowledge generator and user.'
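
    The 'random intercept and random slope' specification referred to above can be written, for illustration only, with statsmodels' MixedLM; the variables below (u5mr, csa, hdi, time, country) are hypothetical stand-ins for the study's panel data, and the simulated coefficients are arbitrary.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical country-year panel, 1990-2010.
    rng = np.random.default_rng(4)
    rows = [(c, y, rng.poisson(3) * 10, rng.uniform(0.3, 0.9))
            for c in range(40) for y in range(1990, 2011)]
    df = pd.DataFrame(rows, columns=["country", "year", "csa", "hdi"])
    df["time"] = df["year"] - 1990
    df["u5mr"] = 90 - 1.5 * df["time"] - 0.17 * df["csa"] - 30 * df["hdi"] + rng.normal(0, 5, len(df))

    # Random intercept and random slope (on time) for each country.
    model = smf.mixedlm("u5mr ~ csa + hdi + time", df, groups=df["country"], re_formula="~time")
    result = model.fit()
    print(result.fe_params)    # fixed-effect estimates; csa should come out near -0.17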

  9. Understanding and predicting binding between human leukocyte antigens (HLAs) and peptides by network analysis.

    PubMed

    Luo, Heng; Ye, Hao; Ng, Hui; Shi, Leming; Tong, Weida; Mattes, William; Mendrick, Donna; Hong, Huixiao

    2015-01-01

    As the major histocompatibility complex (MHC), human leukocyte antigens (HLAs) are one of the most polymorphic genes in humans. Patients carrying certain HLA alleles may develop adverse drug reactions (ADRs) after taking specific drugs. Peptides play an important role in HLA related ADRs as they are the necessary co-binders of HLAs with drugs. A large amount of experimental data has been generated for understanding HLA-peptide binding. However, efficiently utilizing the data for understanding and accurately predicting HLA-peptide binding is challenging. Therefore, we developed a network analysis based method to understand and predict HLA-peptide binding. Qualitative Class I HLA-peptide binding data were harvested and prepared from four major databases. An HLA-peptide binding network was constructed from this dataset and modules were identified by the fast greedy modularity optimization algorithm. To examine the significance of signals in the yielded models, the modularity was compared with the modularity values generated from 1,000 random networks. The peptides and HLAs in the modules were characterized by similarity analysis. The neighbor-edges based and unbiased leverage algorithm (Nebula) was developed for predicting HLA-peptide binding. Leave-one-out (LOO) validations and two-fold cross-validations were conducted to evaluate the performance of Nebula using the constructed HLA-peptide binding network. Nine modules were identified from analyzing the HLA-peptide binding network, with a higher modularity than any of the random networks. Peptide length and functional side chains of amino acids at certain positions of the peptides were different among the modules. HLA sequences were module dependent to some extent. Nebula achieved an overall prediction accuracy of 0.816 in the LOO validations and an average accuracy of 0.795 in the two-fold cross-validations and outperformed the method reported in the literature. Network analysis is a useful approach for analyzing large and sparse datasets such as the HLA-peptide binding dataset. The modules identified from the network analysis clustered peptides and HLAs with similar sequences and properties of amino acids. Nebula performed well in the predictions of HLA-peptide binding. We demonstrated that network analysis coupled with Nebula is an efficient approach to understand and predict HLA-peptide binding interactions and thus could further our understanding of ADRs.

  10. Understanding and predicting binding between human leukocyte antigens (HLAs) and peptides by network analysis

    PubMed Central

    2015-01-01

    Background As the major histocompatibility complex (MHC), human leukocyte antigens (HLAs) are one of the most polymorphic genes in humans. Patients carrying certain HLA alleles may develop adverse drug reactions (ADRs) after taking specific drugs. Peptides play an important role in HLA related ADRs as they are the necessary co-binders of HLAs with drugs. A large amount of experimental data has been generated for understanding HLA-peptide binding. However, efficiently utilizing the data for understanding and accurately predicting HLA-peptide binding is challenging. Therefore, we developed a network analysis based method to understand and predict HLA-peptide binding. Methods Qualitative Class I HLA-peptide binding data were harvested and prepared from four major databases. An HLA-peptide binding network was constructed from this dataset and modules were identified by the fast greedy modularity optimization algorithm. To examine the significance of signals in the yielded models, the modularity was compared with the modularity values generated from 1,000 random networks. The peptides and HLAs in the modules were characterized by similarity analysis. The neighbor-edges based and unbiased leverage algorithm (Nebula) was developed for predicting HLA-peptide binding. Leave-one-out (LOO) validations and two-fold cross-validations were conducted to evaluate the performance of Nebula using the constructed HLA-peptide binding network. Results Nine modules were identified from analyzing the HLA-peptide binding network, with a higher modularity than any of the random networks. Peptide length and functional side chains of amino acids at certain positions of the peptides were different among the modules. HLA sequences were module dependent to some extent. Nebula achieved an overall prediction accuracy of 0.816 in the LOO validations and an average accuracy of 0.795 in the two-fold cross-validations and outperformed the method reported in the literature. Conclusions Network analysis is a useful approach for analyzing large and sparse datasets such as the HLA-peptide binding dataset. The modules identified from the network analysis clustered peptides and HLAs with similar sequences and properties of amino acids. Nebula performed well in the predictions of HLA-peptide binding. We demonstrated that network analysis coupled with Nebula is an efficient approach to understand and predict HLA-peptide binding interactions and thus could further our understanding of ADRs. PMID:26424483
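
    The module-detection step described above (fast greedy modularity optimization, compared against random networks) can be illustrated with networkx on a toy HLA-peptide binding graph; the random bipartite data, the degree-preserving rewiring used as the null model and the number of replicates are all assumptions of this sketch, not the paper's settings.

    import numpy as np
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities, modularity

    rng = np.random.default_rng(5)

    # Toy qualitative binding data: edges between HLA alleles and peptides.
    hlas = [f"HLA-{i}" for i in range(6)]
    peptides = [f"pep{j}" for j in range(60)]
    G = nx.Graph()
    G.add_edges_from((h, p) for h in hlas for p in peptides if rng.random() < 0.15)

    communities = greedy_modularity_communities(G)     # fast greedy optimisation
    q_observed = modularity(G, communities)

    # Null distribution: modularity of degree-preserving randomized networks.
    q_random = []
    for k in range(100):
        R = G.copy()
        nx.double_edge_swap(R, nswap=5 * R.number_of_edges(), max_tries=100000, seed=k)
        q_random.append(modularity(R, greedy_modularity_communities(R)))

    print(len(communities), round(q_observed, 3), round(float(np.mean(q_random)), 3))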

  11. You Look Human, But Act Like a Machine: Agent Appearance and Behavior Modulate Different Aspects of Human-Robot Interaction.

    PubMed

    Abubshait, Abdulaziz; Wiese, Eva

    2017-01-01

    Gaze following occurs automatically in social interactions, but the degree to which gaze is followed depends on whether an agent is perceived to have a mind, making its behavior socially more relevant for the interaction. Mind perception also modulates the attitudes we have toward others, and determines the degree of empathy, prosociality, and morality invested in social interactions. Seeing mind in others is not exclusive to human agents, but mind can also be ascribed to non-human agents like robots, as long as their appearance and/or behavior allows them to be perceived as intentional beings. Previous studies have shown that human appearance and reliable behavior induce mind perception to robot agents, and positively affect attitudes and performance in human-robot interaction. What has not been investigated so far is whether different triggers of mind perception have an independent or interactive effect on attitudes and performance in human-robot interaction. We examine this question by manipulating agent appearance (human vs. robot) and behavior (reliable vs. random) within the same paradigm and examine how congruent (human/reliable vs. robot/random) versus incongruent (human/random vs. robot/reliable) combinations of these triggers affect performance (i.e., gaze following) and attitudes (i.e., agent ratings) in human-robot interaction. The results show that both appearance and behavior affect human-robot interaction but that the two triggers seem to operate in isolation, with appearance more strongly impacting attitudes, and behavior more strongly affecting performance. The implications of these findings for human-robot interaction are discussed.

  12. A Random Forest-based ensemble method for activity recognition.

    PubMed

    Feng, Zengtao; Mo, Lingfei; Li, Meng

    2015-01-01

    This paper presents a multi-sensor ensemble approach to human physical activity (PA) recognition, using random forest. We designed an ensemble learning algorithm, which integrates several independent Random Forest classifiers based on different sensor feature sets to build a more stable, more accurate and faster classifier for human activity recognition. To evaluate the algorithm, PA data collected from the PAMAP (Physical Activity Monitoring for Aging People), which is a standard, publicly available database, were utilized for training and testing. The experimental results show that the algorithm is able to correctly recognize 19 PA types with an accuracy of 93.44%, while training is faster than that of comparable methods. The ensemble classifier system based on the RF (Random Forest) algorithm can achieve high recognition accuracy and fast calculation.
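
    A minimal scikit-learn sketch of the per-sensor ensemble idea: one Random Forest per synthetic "sensor" feature set, fused by majority vote. The synthetic data, number of sensors and voting rule are placeholders for the paper's PAMAP features and fusion scheme.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(6)
    n, n_classes = 3000, 19
    y = rng.integers(0, n_classes, size=n)
    # Three synthetic "sensor" feature sets (e.g. wrist, chest and ankle units).
    sensors = [rng.standard_normal((n, 20)) + y[:, None] * rng.uniform(0.1, 0.3, 20) for _ in range(3)]

    idx_train, idx_test = train_test_split(np.arange(n), test_size=0.3, random_state=0)
    votes = []
    for X in sensors:
        clf = RandomForestClassifier(n_estimators=100, random_state=0, n_jobs=-1)
        clf.fit(X[idx_train], y[idx_train])
        votes.append(clf.predict(X[idx_test]))

    # Majority vote across the independent per-sensor forests.
    stacked = np.vstack(votes)
    ensemble_pred = np.apply_along_axis(lambda col: np.bincount(col, minlength=n_classes).argmax(), 0, stacked)
    print("ensemble accuracy:", round(float(np.mean(ensemble_pred == y[idx_test])), 3))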

  13. Unified underpinning of human mobility in the real world and cyberspace

    NASA Astrophysics Data System (ADS)

    Zhao, Yi-Ming; Zeng, An; Yan, Xiao-Yong; Wang, Wen-Xu; Lai, Ying-Cheng

    2016-05-01

    Human movements in the real world and in cyberspace affect not only dynamical processes such as epidemic spreading and information diffusion but also social and economical activities such as urban planning and personalized recommendation in online shopping. Despite recent efforts in characterizing and modeling human behaviors in both the real and cyber worlds, the fundamental dynamics underlying human mobility have not been well understood. We develop a minimal, memory-based random walk model in limited space for reproducing, with a single parameter, the key statistical behaviors characterizing human movements in both cases. The model is validated using relatively big data from mobile phone and online commerce, suggesting memory-based random walk dynamics as the unified underpinning for human mobility, regardless of whether it occurs in the real world or in cyberspace.
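
    The abstract does not spell out the model, so the sketch below implements a generic memory-based (preferential return) random walk in a limited space: with probability q the walker revisits a previously visited site in proportion to its past visit count, otherwise it moves to a uniformly random site. The single parameter q and all other settings here are illustrative.

    import numpy as np
    from collections import Counter

    def memory_walk(n_steps, n_sites, q, seed=0):
        """Memory-based random walk over a limited space of n_sites locations."""
        rng = np.random.default_rng(seed)
        visits = Counter({0: 1})
        trajectory = [0]
        for _ in range(n_steps):
            if rng.random() < q:
                sites, counts = zip(*visits.items())          # return: weighted by memory
                probs = np.asarray(counts, dtype=float)
                nxt = int(rng.choice(sites, p=probs / probs.sum()))
            else:
                nxt = int(rng.integers(n_sites))              # explore: uniformly random site
            visits[nxt] += 1
            trajectory.append(nxt)
        return trajectory

    traj = memory_walk(20000, n_sites=500, q=0.8)
    freq = np.sort(np.bincount(traj, minlength=500))[::-1]
    print(freq[:10])   # heavy-tailed visitation frequencies, as in empirical mobility data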

  14. [Development of the next generation humanized mouse for drug discovery].

    PubMed

    Ito, Ryoji

    A humanized mouse, which is efficiently engrafted with human cells and tissues, is an important tool for mimicking human physiology in biomedical research. Since the 2000s, severe combined immunodeficient mouse strains such as NOG, BRG, and NSG have been generated. Compared with earlier immunodeficient strains, they are excellent recipients for creating humanized mouse models because of their combined defects in innate and acquired immunity. In particular, transfer of human hematopoietic stem cells into these immunodeficient mice can reconstitute human immune systems, because the mice show high engraftment levels of human leukocytes in peripheral blood (~50%), spleen, and bone marrow (60-90%) and generate well-differentiated, multilineage human immune cells of both lymphoid and myeloid lineages. Using these mice, several human disease models, such as cancer, allergy, and graft-versus-host disease (GVHD), have been established to understand the pathogenic mechanisms of these diseases and to evaluate the efficacy and safety of novel drugs. In this review, I provide an overview of recent advances in humanized mouse technology, including the generation of novel platforms of genetically modified NOG (next-generation NOG) mice and some of their applications for creating human disease models for drug discovery in preclinical research.

  15. Using Maximum Entropy to Find Patterns in Genomes

    NASA Astrophysics Data System (ADS)

    Liu, Sophia; Hockenberry, Adam; Lancichinetti, Andrea; Jewett, Michael; Amaral, Luis

    The existence of over- and under-represented sequence motifs in genomes provides evidence of selective evolutionary pressures on biological mechanisms such as transcription, translation, ligand-substrate binding, and host immunity. To accurately identify motifs and other genome-scale patterns of interest, it is essential to be able to generate accurate null models that are appropriate for the sequences under study. There are currently no tools available that allow users to create random coding sequences with specified amino acid composition and GC content. Using the principle of maximum entropy, we developed a method that generates unbiased random sequences with pre-specified amino acid and GC content. Our method is the simplest way to obtain maximally unbiased random sequences subject to GC-usage and primary amino acid sequence constraints. This approach can also easily be expanded to create unbiased random sequences that incorporate more complicated constraints such as individual nucleotide usage or even di-nucleotide frequencies. The ability to generate correctly specified null models will allow researchers to accurately identify sequence motifs, which will lead to a better understanding of biological processes. National Institute of General Medical Sciences, Northwestern University Presidential Fellowship, National Science Foundation, David and Lucile Packard Foundation, Camille Dreyfus Teacher-Scholar Award.
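
    As a hedged illustration of the maximum-entropy idea (not the authors' published tool), the sketch below tilts each amino acid's synonymous-codon probabilities exponentially by their G+C count and tunes the single tilt parameter by bisection so that the expected GC content matches a target. The truncated codon table and the example protein are placeholders.

      # Maximum-entropy codon sampling: probabilities within each synonymous codon
      # family are exponentially tilted by G+C count, and the single tilt parameter
      # beta is tuned by bisection so the expected GC content matches the target.
      import math, random

      CODONS = {  # truncated standard codon table, for illustration only
          'M': ['ATG'], 'W': ['TGG'],
          'F': ['TTT', 'TTC'], 'K': ['AAA', 'AAG'],
          'G': ['GGT', 'GGC', 'GGA', 'GGG'],
          'L': ['TTA', 'TTG', 'CTT', 'CTC', 'CTA', 'CTG'],
      }

      def gc(codon):
          return sum(base in 'GC' for base in codon)

      def codon_probs(aa, beta):
          weights = [math.exp(beta * gc(c)) for c in CODONS[aa]]
          total = sum(weights)
          return [w / total for w in weights]

      def expected_gc_fraction(protein, beta):
          mean_gc = sum(sum(p * gc(c) for p, c in zip(codon_probs(aa, beta), CODONS[aa]))
                        for aa in protein)
          return mean_gc / (3 * len(protein))

      def sample_sequence(protein, target_gc, seed=0):
          lo, hi = -10.0, 10.0                  # bisection on the tilt parameter
          for _ in range(60):
              mid = (lo + hi) / 2
              if expected_gc_fraction(protein, mid) < target_gc:
                  lo = mid
              else:
                  hi = mid
          beta = (lo + hi) / 2
          rng = random.Random(seed)
          return ''.join(rng.choices(CODONS[aa], weights=codon_probs(aa, beta))[0]
                         for aa in protein)

      print(sample_sequence('MGLKF', target_gc=0.50))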

  16. Chaotic oscillation and random-number generation based on nanoscale optical-energy transfer.

    PubMed

    Naruse, Makoto; Kim, Song-Ju; Aono, Masashi; Hori, Hirokazu; Ohtsu, Motoichi

    2014-08-12

    By using nanoscale energy-transfer dynamics and density-matrix formalism, we demonstrate theoretically and numerically that chaotic oscillation and random-number generation occur in a nanoscale system. The physical system consists of a pair of quantum dots (QDs), with one QD smaller than the other, between which energy is transferred via optical near-field interactions. When the system is pumped by continuous-wave radiation and incorporates a timing delay between two energy transfers within the system, it emits optical pulses. We refer to such QD pairs as nano-optical pulsers (NOPs). Irradiating an NOP with external periodic optical pulses causes the oscillating frequency of the NOP to synchronize with the external stimulus. We find that chaotic oscillation occurs in a population of NOPs when they are coupled through an external time delay. Moreover, by evaluating the time-domain signals with statistical-test suites, we confirm that the signals are sufficiently random to qualify the system as a random-number generator (RNG). This study reveals that even relatively simple nanodevices that interact locally with each other through optical energy transfer at scales far below the wavelength of irradiating light can exhibit complex oscillatory dynamics. These findings are significant for applications such as ultrasmall RNGs.
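
    Reproducing the density-matrix model of the QD pair is beyond a short example, so the sketch below substitutes a generic chaotic signal (the fully chaotic logistic map) thresholded into bits and checked with a NIST-style monobit frequency test. It only illustrates the "chaotic time series to statistical randomness test" workflow, not the nano-optical physics of the paper.

      # Stand-in for the nano-optical system: a fully chaotic logistic map thresholded
      # into bits, followed by the NIST-style monobit (frequency) test.
      import math

      def chaotic_bits(n, x0=0.123456789):
          x, bits = x0, []
          for _ in range(n):
              x = 4.0 * x * (1.0 - x)            # logistic map at r = 4 (chaotic)
              bits.append(1 if x > 0.5 else 0)
          return bits

      def monobit_p_value(bits):
          s = sum(2 * b - 1 for b in bits)       # map 0/1 -> -1/+1 and sum
          s_obs = abs(s) / math.sqrt(len(bits))
          return math.erfc(s_obs / math.sqrt(2.0))

      bits = chaotic_bits(100_000)
      print("monobit p-value:", monobit_p_value(bits))   # > 0.01 is the usual pass level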

  17. Random field assessment of nanoscopic inhomogeneity of bone.

    PubMed

    Dong, X Neil; Luo, Qing; Sparkman, Daniel M; Millwater, Harry R; Wang, Xiaodu

    2010-12-01

    Bone quality is significantly correlated with the inhomogeneous distribution of material and ultrastructural properties (e.g., modulus and mineralization) of the tissue. Current techniques for quantifying inhomogeneity consist of descriptive statistics such as mean, standard deviation and coefficient of variation. However, these parameters do not describe the spatial variations of bone properties. The objective of this study was to develop a novel statistical method to characterize and quantitatively describe the spatial variation of bone properties at ultrastructural levels. To do so, a random field defined by an exponential covariance function was used to represent the spatial uncertainty of elastic modulus by delineating the correlation of the modulus at different locations in bone lamellae. The correlation length, a characteristic parameter of the covariance function, was employed to estimate the fluctuation of the elastic modulus in the random field. Using this approach, two distribution maps of the elastic modulus within bone lamellae were generated by simulation and compared with those obtained experimentally by a combination of atomic force microscopy and nanoindentation techniques. The simulation-generated maps of elastic modulus were in close agreement with the experimental ones, thus validating the random field approach in defining the inhomogeneity of elastic modulus in lamellae of bone. Indeed, generation of such random fields will facilitate multi-scale modeling of bone in more pragmatic detail.
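
    The covariance model lends itself to a compact sketch: the code below draws a Gaussian random field of elastic modulus on a small grid with covariance C(r) = sigma^2 exp(-r/L) via a Cholesky factorization. The mean modulus, sigma, and correlation length L used here are illustrative placeholders; the paper estimates such quantities from AFM and nanoindentation data.

      # Gaussian random field of elastic modulus on a grid, with exponential
      # covariance C(r) = sigma**2 * exp(-r / corr_len); parameter values are
      # illustrative placeholders.
      import numpy as np

      def exponential_random_field(nx=32, ny=32, mean=20.0, sigma=2.0, corr_len=5.0, seed=0):
          xs, ys = np.meshgrid(np.arange(nx), np.arange(ny), indexing='ij')
          pts = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
          dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
          cov = sigma**2 * np.exp(-dist / corr_len)
          # Cholesky factor maps i.i.d. standard normals to correlated samples.
          chol = np.linalg.cholesky(cov + 1e-10 * np.eye(len(pts)))
          rng = np.random.default_rng(seed)
          field = mean + chol @ rng.standard_normal(len(pts))
          return field.reshape(nx, ny)

      modulus_map = exponential_random_field()   # larger corr_len -> smoother modulus map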

  18. Generating Phenotypical Erroneous Human Behavior to Evaluate Human-automation Interaction Using Model Checking

    PubMed Central

    Bolton, Matthew L.; Bass, Ellen J.; Siminiceanu, Radu I.

    2012-01-01

    Breakdowns in complex systems often occur as a result of system elements interacting in unanticipated ways. In systems with human operators, human-automation interaction associated with both normative and erroneous human behavior can contribute to such failures. Model-driven design and analysis techniques provide engineers with formal methods tools and techniques capable of evaluating how human behavior can contribute to system failures. This paper presents a novel method for automatically generating task analytic models encompassing both normative and erroneous human behavior from normative task models. The generated erroneous behavior is capable of replicating Hollnagel’s zero-order phenotypes of erroneous action for omissions, jumps, repetitions, and intrusions. Multiple phenotypical acts can occur in sequence, thus allowing for the generation of higher order phenotypes. The task behavior model pattern capable of generating erroneous behavior can be integrated into a formal system model so that system safety properties can be formally verified with a model checker. This allows analysts to prove that a human-automation interactive system (as represented by the model) will or will not satisfy safety properties with both normative and generated erroneous human behavior. We present benchmarks related to state-space size and verification time to show how the erroneous human behavior generation process scales. We demonstrate the method with a case study: the operation of a radiation therapy machine. A potential problem resulting from a generated erroneous human action is discovered. A design intervention is presented that prevents this problem from occurring. We discuss how our method could be used to evaluate larger applications and recommend future paths of development. PMID:23105914
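
    As a simplified, list-based analogue of the zero-order phenotypes (the paper embeds an equivalent pattern in formal task-analytic models for model checking), the sketch below enumerates omission, repetition, intrusion, and adjacent-swap "jump" variants of a normative action sequence. The action names are hypothetical and only loosely inspired by the radiation-therapy case study.

      # List-based analogue of the zero-order phenotypes; action names are hypothetical.
      import itertools

      def omissions(seq):
          return [seq[:i] + seq[i + 1:] for i in range(len(seq))]

      def repetitions(seq):
          return [seq[:i + 1] + [seq[i]] + seq[i + 1:] for i in range(len(seq))]

      def intrusions(seq, foreign_actions):
          return [seq[:i] + [a] + seq[i:]
                  for i in range(len(seq) + 1) for a in foreign_actions]

      def jumps(seq):
          # simplified jump: perform the next action one step too early (adjacent swap)
          return [seq[:i] + [seq[i + 1], seq[i]] + seq[i + 2:] for i in range(len(seq) - 1)]

      normative = ["select_mode", "enter_dose", "confirm", "deliver_treatment"]
      erroneous = list(itertools.chain(omissions(normative), repetitions(normative),
                                       intrusions(normative, ["cancel"]), jumps(normative)))
      # Each variant could then be checked against safety properties with a model checker.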

  19. Generation and performance of special quasirandom structures for studying the elastic properties of random alloys: Application to Al-Ti

    NASA Astrophysics Data System (ADS)

    von Pezold, Johann; Dick, Alexey; Friák, Martin; Neugebauer, Jörg

    2010-03-01

    The performance of special quasirandom structures (SQSs) for describing the elastic properties of random alloys was evaluated. A set of system-independent 32-atom fcc SQSs spanning the entire concentration range was generated and used to determine C11, C12, and C44 of binary random substitutional Al-Ti alloys. The elastic properties of these alloys could be described using the set of SQSs with an accuracy comparable to that achievable by statistical sampling of the configurational space of 3×3×3 (108-atom, C44) and 4×4×4 (256-atom, C11 and C12) fcc supercells, irrespective of the impurity concentration. The smaller system size makes the proposed SQSs ideal candidates for the ab initio determination of the elastic constants of random substitutional alloys. The set of optimized SQSs is provided.
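
    The SQS construction reduces to matching the correlation functions of a small cell to those of the ideal random alloy. The following toy sketch does this on a periodic one-dimensional chain using pair correlations over the first three shells; it only illustrates the matching objective, whereas the paper's SQSs are 32-atom fcc cells typically generated against many more pair and multisite correlation functions.

      # Toy correlation-matching search on a periodic 1-D chain of "spins" (+1 = A, -1 = B):
      # keep the configuration whose first-shell pair correlations are closest to the
      # ideal random-alloy value (2x - 1)**2 at concentration x.
      import numpy as np

      def pair_correlation(spins, shell):
          return float(np.mean(spins * np.roll(spins, shell)))

      def sqs_score(spins, x, shells=(1, 2, 3)):
          target = (2 * x - 1) ** 2
          return sum(abs(pair_correlation(spins, s) - target) for s in shells)

      def search_sqs(n_sites=32, x=0.25, n_trials=20000, seed=0):
          rng = np.random.default_rng(seed)
          n_b = int(round(x * n_sites))
          best, best_score = None, np.inf
          for _ in range(n_trials):
              spins = np.array([1] * (n_sites - n_b) + [-1] * n_b)
              rng.shuffle(spins)
              score = sqs_score(spins, x)
              if score < best_score:
                  best, best_score = spins.copy(), score
          return best, best_score

      config, score = search_sqs()   # small cell mimicking a random A0.75B0.25 alloy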

  20. Antibiotic Resistance in Salmonella Enteritidis Isolates Recovered from Chicken, Chicken Breast, and Humans Through National Antimicrobial Resistance Monitoring System Between 1996 and 2014.

    PubMed

    Paudyal, Narayan; Pan, Hang; Li, Xiaoliang; Fang, Weihuan; Yue, Min

    2018-06-21

    Salmonella enterica subspecies enterica serotype Enteritidis (S. Enteritidis) is one of the leading causes of human salmonellosis all over the world. We analyzed 18 years of surveillance data on antimicrobial resistance profiling of S. Enteritidis collected and isolated by the National Antimicrobial Resistance Monitoring System (NARMS) from humans, chicken, and chicken breasts. A statistical tool based on individual antibiotic minimum inhibitory concentration (MIC) profiles was used to compare antimicrobial resistance among the isolates. A machine-learning algorithm, Random Forest, segregated a collection of 6819 S. Enteritidis isolates into multiple populations. Taking the MIC values of 13 common antibiotics for each isolate as the best classifier resulted in two distinct groups, represented herein as Population-I and Population-II. Population-I, which formed a small, tight cluster, comprised all of the chicken and chicken-breast isolates as well as about 13.4% of the human isolates, whereas Population-II consisted of human isolates only and spread over a wider area away from Population-I (p < 0.001). The few overlapping yet diverse clusters shared between human and chicken isolates, together with the higher resistance of chicken-breast isolates to third-generation cephalosporins and tetracyclines compared with human isolates, highlight differences in their population structures. These findings indicate complex drivers, beyond those of chicken origin, for the enrichment of antibiotic-resistant Salmonella in the food chain and warrant strategies in addition to the judicious/restricted use of antibiotics to mitigate the threat of antimicrobial resistance.
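
    As an illustrative, supervised variant of the analysis (the paper's population delineation may differ in detail), the sketch below fits a scikit-learn Random Forest to a placeholder MIC matrix of 13 antibiotics with the isolation source as label. The column names, MIC values, and labels are synthetic stand-ins, not NARMS data.

      # Supervised Random Forest on a placeholder MIC matrix (13 antibiotics, labels by
      # source); column names and values are synthetic, not NARMS data.
      import numpy as np
      import pandas as pd
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      antibiotics = [f"drug_{i}" for i in range(13)]       # placeholder antibiotic names
      mic = pd.DataFrame(rng.choice([0.25, 0.5, 1, 2, 4, 8, 16, 32], size=(500, 13)),
                         columns=antibiotics)
      source = rng.choice(["human", "chicken"], size=500)  # placeholder labels

      X_train, X_test, y_train, y_test = train_test_split(mic, source, random_state=0)
      rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
      print("held-out accuracy:", rf.score(X_test, y_test))   # ~0.5 here: labels are random
      print("most informative MIC columns:",
            [antibiotics[i] for i in np.argsort(rf.feature_importances_)[::-1][:3]])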
