Implementation of a quantum random number generator based on the optimal clustering of photocounts
NASA Astrophysics Data System (ADS)
Balygin, K. A.; Zaitsev, V. I.; Klimov, A. N.; Kulik, S. P.; Molotkov, S. N.
2017-10-01
To implement quantum random number generators, it is fundamentally important to have a mathematically provable and experimentally testable process of measurement of the system from which the initial random sequence is generated. This ensures that the randomness indeed has a quantum nature. A quantum random number generator has been implemented with the use of the detection of quasi-single-photon radiation by a silicon photomultiplier (SiPM) matrix, which makes it possible to reliably reach the Poisson statistics of photocounts. The choice and use of the optimal clustering of photocounts for the initial sequence of photodetection events, together with a method of extraction of a random sequence of 0's and 1's that is polynomial in the length of the sequence, have made it possible to reach an output rate of 64 Mbit/s for the certified random sequence.
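The abstract does not spell out the clustering rule itself, but one generic, provably unbiased way to turn i.i.d. Poisson photocounts into bits is to compare counts in non-overlapping pairs and discard ties. A minimal sketch under that assumption (the SiPM data are simulated here, not measured):

```python
import random

def bits_from_counts(counts):
    """Compare non-overlapping pairs of photocounts: emit 1 if the first
    count exceeds the second, 0 if it is smaller, and discard ties.
    For i.i.d. counts both orderings are equally likely, so the output
    bits are unbiased whatever the Poisson mean."""
    bits = []
    for a, b in zip(counts[0::2], counts[1::2]):
        if a != b:
            bits.append(1 if a > b else 0)
    return bits

# Stand-in for SiPM photocount data: i.i.d. binomial counts,
# approximately Poisson for a small per-trial probability.
rng = random.Random(1)
counts = [sum(rng.random() < 0.1 for _ in range(50)) for _ in range(10000)]
bits = bits_from_counts(counts)
```

The comparison trick costs throughput (ties are discarded) but requires no knowledge of the photocount distribution beyond independence.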
Security of practical private randomness generation
NASA Astrophysics Data System (ADS)
Pironio, Stefano; Massar, Serge
2013-01-01
Measurements on entangled quantum systems necessarily yield outcomes that are intrinsically unpredictable if they violate a Bell inequality. This property can be used to generate certified randomness in a device-independent way, i.e., without making detailed assumptions about the internal working of the quantum devices used to generate the random numbers. Furthermore these numbers are also private; i.e., they appear random not only to the user but also to any adversary that might possess a perfect description of the devices. Since this process requires a small initial random seed to sample the behavior of the quantum devices and to extract uniform randomness from the raw outputs of the devices, one usually speaks of device-independent randomness expansion. The purpose of this paper is twofold. First, we point out that in most real, practical situations, where the concept of device independence is used as a protection against unintentional flaws or failures of the quantum apparatuses, it is sufficient to show that the generated string is random with respect to an adversary that holds only classical side information; i.e., proving randomness against quantum side information is not necessary. Furthermore, the initial random seed does not need to be private with respect to the adversary, provided that it is generated in a way that is independent from the measured systems. The devices, however, will generate cryptographically secure randomness that cannot be predicted by the adversary, and thus one can, given access to free public randomness, talk about private randomness generation. The theoretical tools to quantify the generated randomness according to these criteria were already introduced in S. Pironio et al. [Nature (London) 464, 1021 (2010)], but the final results were improperly formulated.
The second aim of this paper is to correct this inaccurate formulation and therefore lay out a precise theoretical framework for practical device-independent randomness generation.
Generating random numbers by means of nonlinear dynamic systems
NASA Astrophysics Data System (ADS)
Zang, Jiaqi; Hu, Haojie; Zhong, Juhua; Luo, Duanbin; Fang, Yi
2018-07-01
To introduce the randomness of a physical process to students, a chaotic pendulum experiment was opened at East China University of Science and Technology (ECUST) at the undergraduate level in the physics department. It was shown that chaotic motion could be initiated by adjusting the operation of a chaotic pendulum. By using the angular-displacement data of the chaotic motion, random binary numerical arrays can be generated. To check the randomness of the generated arrays, the NIST Special Publication 800-22 test method was adopted. As a result, all the random arrays generated by the chaotic motion passed the validity criteria, and some even exceeded the quality of pseudo-random numbers generated by a computer. The experiments demonstrate that a chaotic pendulum can serve as an efficient mechanical facility for generating random numbers and can be applied in teaching random motion to students.
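No pendulum data are available here, so the sketch below substitutes the logistic map as the chaotic signal; the digitization step (thresholding each recorded value against the sample median) is the same one would apply to measured angular displacements:

```python
def chaotic_bits(x0=0.123, n=1000, r=3.99):
    """Iterate the logistic map as a stand-in for a chaotic time
    series, then threshold each value against the sample median,
    which balances the number of 0s and 1s by construction."""
    xs, x = [], x0
    for _ in range(n + 100):
        x = r * x * (1 - x)
        xs.append(x)
    xs = xs[100:]                 # discard the initial transient
    med = sorted(xs)[n // 2]      # sample median as threshold
    return [1 if v > med else 0 for v in xs]

bits = chaotic_bits()
```

A real experiment would still need statistical testing of the output, since deterministic chaos only approximates randomness over finite observation precision.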
Image encryption using random sequence generated from generalized information domain
NASA Astrophysics Data System (ADS)
Xia-Yan, Zhang; Guo-Ji, Zhang; Xuan, Li; Ya-Zhou, Ren; Jie-Hua, Wu
2016-05-01
A novel image encryption method based on the random sequence generated from the generalized information domain and permutation-diffusion architecture is proposed. The random sequence is generated by reconstruction from the generalized information file and discrete trajectory extraction from the data stream. The trajectory address sequence is used to generate a P-box to shuffle the plain image while random sequences are treated as keystreams. A new factor called drift factor is employed to accelerate and enhance the performance of the random sequence generator. An initial value is introduced to make the encryption method an approximately one-time pad. Experimental results show that the random sequences pass the NIST statistical test with a high ratio and extensive analysis demonstrates that the new encryption scheme has superior security.
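As a hedged illustration of the permutation-diffusion architecture only (not the paper's generalized-information-domain sequence generator), the sketch below uses a key-seeded pseudorandom permutation as the P-box and a keystream XOR as the diffusion stage:

```python
import random

def encrypt(img, key):
    """Permutation-diffusion sketch: a key-seeded P-box shuffles pixel
    positions, then a keystream XOR-diffuses the pixel values."""
    rng = random.Random(key)
    n = len(img)
    pbox = list(range(n))
    rng.shuffle(pbox)                          # P-box: permutation of positions
    stream = [rng.randrange(256) for _ in range(n)]
    shuffled = [img[pbox[i]] for i in range(n)]
    return [p ^ k for p, k in zip(shuffled, stream)]

def decrypt(cipher, key):
    """Invert the diffusion (XOR is self-inverse), then the P-box."""
    rng = random.Random(key)
    n = len(cipher)
    pbox = list(range(n))
    rng.shuffle(pbox)
    stream = [rng.randrange(256) for _ in range(n)]
    shuffled = [c ^ k for c, k in zip(cipher, stream)]
    img = [0] * n
    for i in range(n):
        img[pbox[i]] = shuffled[i]
    return img
```

A seeded `random.Random` is used purely for clarity; the paper's scheme derives both the P-box and the keystream from its own random sequence generator with a drift factor.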
Fast and secure encryption-decryption method based on chaotic dynamics
Protopopescu, Vladimir A.; Santoro, Robert T.; Tolliver, Johnny S.
1995-01-01
A method and system for the secure encryption of information. The method comprises the steps of dividing a message of length L into its character components; generating m chaotic iterates from m independent chaotic maps; producing an "initial" value based upon the m chaotic iterates; transforming the "initial" value to create a pseudo-random integer; repeating the steps of generating, producing and transforming until a pseudo-random integer sequence of length L is created; and encrypting the message as ciphertext based upon the pseudo-random integer sequence. A system for accomplishing the invention is also provided.
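The patent's exact combining and transforming functions are not given in the abstract; a minimal sketch of the stated steps, using logistic maps as the m independent chaotic maps and an illustrative combining transform:

```python
def chaotic_stream(seeds, length):
    """Generate a byte stream by iterating m independent logistic maps
    and combining the m iterates into one pseudo-random integer per
    step (the combining transform here is illustrative only)."""
    xs = list(seeds)                      # m independent map states
    out = []
    for _ in range(length):
        xs = [3.99 * x * (1 - x) for x in xs]
        mixed = sum(xs) % 1.0             # "initial" value from m iterates
        out.append(int(mixed * 256) % 256)  # transform to an integer
    return out

def crypt(data, seeds):
    """XOR the message with the chaotic keystream; applying the same
    seeds twice decrypts, since XOR is self-inverse."""
    stream = chaotic_stream(seeds, len(data))
    return bytes(b ^ k for b, k in zip(data, stream))
```

Note that naive chaotic-map ciphers like this sketch are known to be cryptanalyzable; the patent's security rests on details beyond what the abstract states.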
Knowledge Representation for Decision Making Agents
2013-07-15
knowledge map. The knowledge map is a dictionary data structure, called tmap in the code; it represents a network of locations, each holding a belief value in [0, 1]. Initialization options include fillRandom(), which produces an informed initial tmap distribution generated randomly per node, and fillCenter() (selected by initialBelief = 3), which uses a normal distribution; initialization is triggered on AllMyFMsHaveBeenInitialized. Execution of main.py initializes the knowledge map labeled tmap and calls initialize_search(), which resets distanceTot.
Kouritzin, Michael A; Newton, Fraser; Wu, Biao
2013-04-01
Herein, we propose generating CAPTCHAs through random field simulation and give a novel, effective and efficient algorithm to do so. Indeed, we demonstrate that sufficient information about word tests for easy human recognition is contained in the site marginal probabilities and the site-to-nearby-site covariances and that these quantities can be embedded directly into certain conditional probabilities, designed for effective simulation. The CAPTCHAs are then partial random realizations of the random CAPTCHA word. We start with an initial random field (e.g., randomly scattered letter pieces) and use Gibbs resampling to re-simulate portions of the field repeatedly using these conditional probabilities until the word becomes human-readable. The residual randomness from the initial random field together with the random implementation of the CAPTCHA word provide significant resistance to attack. This results in a CAPTCHA, which is unrecognizable to modern optical character recognition but is recognized about 95% of the time in a human readability study.
Private randomness expansion with untrusted devices
NASA Astrophysics Data System (ADS)
Colbeck, Roger; Kent, Adrian
2011-03-01
Randomness is an important resource for many applications, from gambling to secure communication. However, guaranteeing that the output from a candidate random source could not have been predicted by an outside party is a challenging task, and many supposedly random sources used today provide no such guarantee. Quantum solutions to this problem exist, for example a device which internally sends a photon through a beamsplitter and observes on which side it emerges, but, presently, such solutions require the user to trust the internal workings of the device. Here, we seek to go beyond this limitation by asking whether randomness can be generated using untrusted devices—even ones created by an adversarial agent—while providing a guarantee that no outside party (including the agent) can predict it. Since this is easily seen to be impossible unless the user has an initially private random string, the task we investigate here is private randomness expansion. We introduce a protocol for private randomness expansion with untrusted devices which is designed to take as input an initially private random string and produce as output a longer private random string. We point out that private randomness expansion protocols are generally vulnerable to attacks that can render the initial string partially insecure, even though that string is used only inside a secure laboratory; our protocol is designed to remove this previously unconsidered vulnerability by privacy amplification. We also discuss extensions of our protocol designed to generate an arbitrarily long random string from a finite initially private random string. The security of these protocols against the most general attacks is left as an open question.
Unbiased All-Optical Random-Number Generator
NASA Astrophysics Data System (ADS)
Steinle, Tobias; Greiner, Johannes N.; Wrachtrup, Jörg; Giessen, Harald; Gerhardt, Ilja
2017-10-01
The generation of random bits is of enormous importance in modern information science. Cryptographic security is based on random numbers which require a physical process for their generation. This is commonly performed by hardware random-number generators. These often exhibit a number of problems, namely experimental bias, memory in the system, and other technical subtleties, which reduce the reliability in the entropy estimation. Further, the generated outcome has to be postprocessed to "iron out" such spurious effects. Here, we present a purely optical randomness generator, based on the bistable output of an optical parametric oscillator. Detector noise plays no role and postprocessing is reduced to a minimum. Upon entering the bistable regime, initially the resulting output phase depends on vacuum fluctuations. Later, the phase is rigidly locked and can be well determined versus a pulse train, which is derived from the pump laser. This delivers an ambiguity-free output, which is reliably detected and associated with a binary outcome. The resulting random bit stream resembles a perfect coin toss and passes all relevant randomness measures. The random nature of the generated binary outcome is furthermore confirmed by an analysis of resulting conditional entropies.
Pseudorandom number generation using chaotic true orbits of the Bernoulli map
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saito, Asaki; Yamaguchi, Akihiro
We devise a pseudorandom number generator that exactly computes chaotic true orbits of the Bernoulli map on quadratic algebraic integers. Moreover, we describe a way to select the initial points (seeds) for generating multiple pseudorandom binary sequences. This selection method distributes the initial points almost uniformly (equidistantly) in the unit interval, and latter parts of the generated sequences are guaranteed not to coincide. We also demonstrate through statistical testing that the generated sequences possess good randomness properties.
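The construction can be illustrated exactly for the single seed x0 = sqrt(2) - 1: representing the orbit as u + v*sqrt(2) with integer coefficients makes the doubling map and the mod-1 comparison exact integer operations, so the true orbit is computed with no floating-point rounding (a sketch of the idea, not the authors' full multi-seed selection scheme):

```python
def bernoulli_bits(n):
    """Exact Bernoulli-map orbit of x0 = sqrt(2) - 1, a quadratic
    algebraic number, represented as u + v*sqrt(2) with exact integer
    coefficients; no rounding error can ever occur."""
    u, v = -1, 1                      # x0 = -1 + 1*sqrt(2) in [0, 1)
    bits = []
    for _ in range(n):
        u, v = 2 * u, 2 * v           # x -> 2x
        # x >= 1  <=>  v*sqrt(2) >= 1 - u   (v > 0 throughout)
        if 1 - u <= 0 or 2 * v * v >= (1 - u) ** 2:
            bits.append(1)
            u -= 1                    # x -> x - 1, i.e. take 2x mod 1
        else:
            bits.append(0)
    return bits
```

The emitted bits are exactly the binary expansion of sqrt(2) - 1; the coefficients grow linearly in bit length per step, which is what makes the generator polynomial in the sequence length.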
Toward DNA-based Security Circuitry: First Step - Random Number Generation.
Bogard, Christy M; Arazi, Benjamin; Rouchka, Eric C
2008-08-10
DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. Our team investigates the implications of DNA-based circuit design in serving security applications. As an initial step, we develop random number generation circuitry. A novel prototype schema employs solid-phase synthesis of oligonucleotides for random construction of DNA sequences. Temporary storage and retrieval are achieved through plasmid vectors.
On grey levels in random CAPTCHA generation
NASA Astrophysics Data System (ADS)
Newton, Fraser; Kouritzin, Michael A.
2011-06-01
A CAPTCHA is an automatically generated test designed to distinguish between humans and computer programs; specifically, they are designed to be easy for humans but difficult for computer programs to pass in order to prevent the abuse of resources by automated bots. They are commonly seen guarding webmail registration forms, online auction sites, and preventing brute force attacks on passwords. In the following, we address the question: How does adding a grey level to random CAPTCHA generation affect the utility of the CAPTCHA? We treat the problem of generating the random CAPTCHA as one of random field simulation: An initial state of background noise is evolved over time using Gibbs sampling and an efficient algorithm for generating correlated random variables. This approach has already been found to yield highly-readable yet difficult-to-crack CAPTCHAs. We detail how the requisite parameters for introducing grey levels are estimated and how we generate the random CAPTCHA. The resulting CAPTCHA will be evaluated in terms of human readability as well as its resistance to automated attacks in the forms of character segmentation and optical character recognition.
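A minimal version of the Gibbs-resampling step can be sketched on a binary field with an Ising-style nearest-neighbour conditional; the CAPTCHA-specific marginals, covariances, and grey levels are omitted, so this shows only the evolution mechanism:

```python
import math
import random

def gibbs_pass(field, coupling=0.8, rng=None):
    """One Gibbs sweep over a binary random field: each site is
    resampled from its conditional distribution given its four
    nearest neighbours (periodic boundaries), which pulls sites
    toward local agreement."""
    rng = rng or random.Random(0)
    h, w = len(field), len(field[0])
    for i in range(h):
        for j in range(w):
            s = sum(field[(i + di) % h][(j + dj) % w]
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
            # logistic conditional: two neighbours set -> p = 1/2
            p = 1.0 / (1.0 + math.exp(-coupling * (2 * s - 4)))
            field[i][j] = 1 if rng.random() < p else 0
    return field

rng = random.Random(3)
field = [[rng.randrange(2) for _ in range(8)] for _ in range(8)]
field = gibbs_pass(field)
```

In the CAPTCHA application the conditional probabilities additionally encode the target word's site marginals, so repeated sweeps make the word emerge from the initial noise.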
Quantum random bit generation using energy fluctuations in stimulated Raman scattering.
Bustard, Philip J; England, Duncan G; Nunn, Josh; Moffatt, Doug; Spanner, Michael; Lausten, Rune; Sussman, Benjamin J
2013-12-02
Random number sequences are a critical resource in modern information processing systems, with applications in cryptography, numerical simulation, and data sampling. We introduce a quantum random number generator based on the measurement of pulse energy quantum fluctuations in Stokes light generated by spontaneously-initiated stimulated Raman scattering. Bright Stokes pulse energy fluctuations up to five times the mean energy are measured with fast photodiodes and converted to unbiased random binary strings. Since the pulse energy is a continuous variable, multiple bits can be extracted from a single measurement. Our approach can be generalized to a wide range of Raman active materials; here we demonstrate a prototype using the optical phonon line in bulk diamond.
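Because each pulse energy is a continuous sample, several bits per pulse can be obtained by binning into equiprobable quantile cells; a sketch with simulated energies (the paper's actual digitization procedure is not specified in the abstract):

```python
import random

def energy_to_bits(samples, k=3):
    """Map each continuous sample into one of 2**k equiprobable
    quantile bins estimated from the data itself, then emit the bin
    index as k bits, so every measurement yields multiple bits."""
    m = 2 ** k
    srt = sorted(samples)
    edges = [srt[len(srt) * i // m] for i in range(1, m)]
    bits = []
    for x in samples:
        idx = sum(x >= e for e in edges)            # bin index, 0 .. m-1
        bits.extend((idx >> b) & 1 for b in reversed(range(k)))
    return bits

# Stand-in for Stokes pulse-energy measurements (heavy-tailed positives).
rng = random.Random(5)
energies = [rng.gammavariate(2.0, 1.0) for _ in range(4096)]
bits = energy_to_bits(energies)
```

Estimating the quantile edges from the same data that is digitized is a simplification; a deployed extractor would calibrate the bins on independent data or use a seeded randomness extractor.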
A hybrid-type quantum random number generator
NASA Astrophysics Data System (ADS)
Hai-Qiang, Ma; Wu, Zhu; Ke-Jin, Wei; Rui-Xue, Li; Hong-Wei, Liu
2016-05-01
This paper proposes a well-performing hybrid-type truly quantum random number generator based on the time interval between two independent single-photon detection signals, which is practical and intuitive, and generates the initial random number sources from a combination of multiple existing random number sources. A time-to-amplitude converter and multichannel analyzer are used for qualitative analysis to demonstrate that each and every step is random. Furthermore, a carefully designed data acquisition system is used to obtain a high-quality random sequence. Our scheme is simple and proves that the random number bit rate can be dramatically increased to satisfy practical requirements. Project supported by the National Natural Science Foundation of China (Grant Nos. 61178010 and 11374042), the Fund of State Key Laboratory of Information Photonics and Optical Communications (Beijing University of Posts and Telecommunications), China, and the Fundamental Research Funds for the Central Universities of China (Grant No. bupt2014TS01).
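For an ideal Poisson photon stream the interdetection intervals are exponentially distributed, so applying the CDF, u = 1 - exp(-rate * t), maps each interval to a uniform variate whose leading binary digits give several bits per detection pair. A hedged sketch with simulated intervals (the count rate and bit depth are illustrative assumptions):

```python
import math
import random

def intervals_to_bits(intervals, rate, bits_per=4):
    """CDF-transform exponential interarrival times to uniform [0, 1)
    variates, then peel off their leading binary digits."""
    out = []
    for t in intervals:
        u = 1.0 - math.exp(-rate * t)   # uniform if t ~ Exp(rate)
        for _ in range(bits_per):
            u *= 2.0
            out.append(int(u))          # next binary digit of u
            u -= int(u)
    return out

rng = random.Random(7)
rate = 1e6                              # assumed mean detection rate, Hz
ts = [rng.expovariate(rate) for _ in range(2000)]
bits = intervals_to_bits(ts, rate)
```

In practice the rate must be estimated and detector dead time corrected for, which is part of what the paper's data acquisition system addresses.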
Yao, Shengnan; Zeng, Weiming; Wang, Nizhuan; Chen, Lei
2013-07-01
Independent component analysis (ICA) has been proven effective for functional magnetic resonance imaging (fMRI) data analysis. However, ICA decomposition requires iteratively optimizing the unmixing matrix, whose initial values are generated randomly; the randomness of the initialization thus leads to different decomposition results, so a single decomposition is not usually reliable. Under this circumstance, several methods for repeated decompositions with ICA (RDICA) were proposed to reveal the stability of ICA decomposition. Although RDICA has achieved satisfying results in validating the performance of ICA decomposition, it costs considerable computing time. To mitigate this problem, we propose a method, named ATGP-ICA, for fMRI data analysis. This method generates fixed initial values with an automatic target generation process (ATGP) instead of producing them randomly. We performed experimental tests on both hybrid data and fMRI data to show the effectiveness of the new method and compared the performance of traditional one-time decomposition with ICA (ODICA), RDICA, and ATGP-ICA. The proposed method not only eliminates the randomness of ICA decomposition, but also saves considerable computing time compared to RDICA. Furthermore, ROC (receiver operating characteristic) power analysis indicated better signal reconstruction performance for ATGP-ICA than for RDICA.
Dynamic Loads Generation for Multi-Point Vibration Excitation Problems
NASA Technical Reports Server (NTRS)
Shen, Lawrence
2011-01-01
A random-force method has been developed to predict dynamic loads produced by rocket-engine random vibrations for new rocket-engine designs. The method develops random forces at multiple excitation points based on random vibration environments scaled from accelerometer data obtained during hot-fire tests of existing rocket engines. This random-force method applies random forces to the model and creates expected dynamic response in a manner that simulates the way the operating engine applies self-generated random vibration forces (random pressure acting on an area) with the resulting responses that we measure with accelerometers. This innovation includes the methodology (implementation sequence), the computer code, two methods to generate the random-force vibration spectra, and two methods to reduce some of the inherent conservatism in the dynamic loads. This methodology would be implemented to generate the random-force spectra at excitation nodes without requiring the use of artificial boundary conditions in a finite element model. More accurate random dynamic loads than those predicted by current industry methods can then be generated using the random force spectra. The scaling method used to develop the initial power spectral density (PSD) environments for deriving the random forces for the rocket engine case is based on the Barrett Criteria developed at Marshall Space Flight Center in 1963. This invention approach can be applied in the aerospace, automotive, and other industries to obtain reliable dynamic loads and responses from a finite element model for any structure subject to multipoint random vibration excitations.
Quantumness-generating capability of quantum dynamics
NASA Astrophysics Data System (ADS)
Li, Nan; Luo, Shunlong; Mao, Yuanyuan
2018-04-01
We study quantumness-generating capability of quantum dynamics, where quantumness refers to the noncommutativity between the initial state and the evolving state. In terms of the commutator of the square roots of the initial state and the evolving state, we define a measure to quantify the quantumness-generating capability of quantum dynamics with respect to initial states. Quantumness-generating capability is absent in classical dynamics and hence is a fundamental characteristic of quantum dynamics. For qubit systems, we present an analytical form for this measure, by virtue of which we analyze several prototypical dynamics such as unitary dynamics, phase damping dynamics, amplitude damping dynamics, and random unitary dynamics (Pauli channels). Necessary and sufficient conditions for the monotonicity of quantumness-generating capability are also identified. Finally, we compare these conditions for the monotonicity of quantumness-generating capability with those for various Markovianities and illustrate that quantumness-generating capability and quantum Markovianity are closely related, although they capture different aspects of quantum dynamics.
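For qubits the measure can be evaluated directly; the sketch below uses the Hilbert-Schmidt norm of the commutator of the matrix square roots of the two states (one plausible norm choice; the paper's exact normalization may differ):

```python
import numpy as np

def sqrtm_psd(rho):
    """Matrix square root of a positive semidefinite density matrix
    via its eigendecomposition."""
    w, v = np.linalg.eigh(rho)
    return v @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ v.conj().T

def quantumness(rho0, rho_t):
    """Noncommutativity between the initial and evolved states,
    measured by the Hilbert-Schmidt norm of [sqrt(rho0), sqrt(rho_t)].
    It vanishes exactly when the two states commute."""
    a, b = sqrtm_psd(rho0), sqrtm_psd(rho_t)
    return np.linalg.norm(a @ b - b @ a)

rho0 = np.diag([0.7, 0.3])                    # initial qubit state
rho_x = np.array([[0.5, 0.3], [0.3, 0.5]])    # noncommuting evolved state
```

Classical (commuting) evolution gives zero, matching the paper's point that quantumness-generating capability is absent in classical dynamics.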
Graphene based widely-tunable and singly-polarized pulse generation with random fiber lasers
Yao, B. C.; Rao, Y. J.; Wang, Z. N.; Wu, Y.; Zhou, J. H.; Wu, H.; Fan, M. Q.; Cao, X. L.; Zhang, W. L.; Chen, Y. F.; Li, Y. R.; Churkin, D.; Turitsyn, S.; Wong, C. W.
2015-01-01
Pulse generation often requires a stabilized cavity and its corresponding mode structure for initial phase-locking. Contrastingly, modeless cavity-free random lasers provide new possibilities for high quantum efficiency lasing that could potentially be widely tunable spectrally and temporally. Pulse generation in random lasers, however, has remained elusive since the discovery of modeless gain lasing. Here we report coherent pulse generation with modeless random lasers based on the unique polarization selectivity and broadband saturable absorption of monolayer graphene. Simultaneous temporal compression of cavity-free pulses are observed with such a polarization modulation, along with a broadly-tunable pulsewidth across two orders of magnitude down to 900 ps, a broadly-tunable repetition rate across three orders of magnitude up to 3 MHz, and a singly-polarized pulse train at 41 dB extinction ratio, about an order of magnitude larger than conventional pulsed fiber lasers. Moreover, our graphene-based pulse formation also demonstrates robust pulse-to-pulse stability and wide-wavelength operation due to the cavity-less feature. Such a graphene-based architecture not only provides a tunable pulsed random laser for fiber-optic sensing, speckle-free imaging, and laser-material processing, but also a new way for the non-random CW fiber lasers to generate widely tunable and singly-polarized pulses.
Signs of universality in the structure of culture
NASA Astrophysics Data System (ADS)
Băbeanu, Alexandru-Ionuţ; Talman, Leandros; Garlaschelli, Diego
2017-11-01
Understanding the dynamics of opinions, preferences and of culture as a whole requires more use of empirical data than has been done so far. It is clear that an important role in driving this dynamics is played by social influence, which is the essential ingredient of many quantitative models. Such models require that all traits are fixed when specifying the "initial cultural state". Typically, this initial state is randomly generated, from a uniform distribution over the set of possible combinations of traits. However, recent work has shown that the outcome of social influence dynamics strongly depends on the nature of the initial state. If the latter is sampled from empirical data instead of being generated in a uniformly random way, a higher level of cultural diversity is found after long-term dynamics, for the same level of propensity towards collective behavior in the short term. Moreover, if the initial state is randomized by shuffling the empirical traits among people, the level of long-term cultural diversity is in between those obtained for the empirical and uniformly random counterparts. The current study repeats the analysis for multiple empirical data sets, showing that the results are remarkably similar, although the matrix of correlations between cultural variables clearly differs across data sets. This points towards robust structural properties inherent in empirical cultural states, possibly due to universal laws governing the dynamics of culture in the real world. The results also suggest that this dynamics might be characterized by criticality and involve mechanisms beyond social influence.
Analysis of entropy extraction efficiencies in random number generation systems
NASA Astrophysics Data System (ADS)
Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu
2016-05-01
Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.
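The simplest extraction scheme in this family is von Neumann's debiasing, which trades throughput for provable unbiasedness on i.i.d. sources:

```python
def von_neumann(bits):
    """Classic unbiasing: read non-overlapping pairs of raw bits;
    01 -> 0, 10 -> 1, while 00 and 11 are discarded.  The output is
    exactly unbiased for any fixed bias, at the cost of throwing
    away at least half of the raw bits."""
    out = []
    for a, b in zip(bits[0::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out
```

The schemes reviewed in the paper aim to do better than this baseline in extraction efficiency while remaining robust to drifting source parameters.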
True Randomness from Big Data.
Papakonstantinou, Periklis A; Woodruff, David P; Yang, Guang
2016-09-26
Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
Generating Random Numbers by Means of Nonlinear Dynamic Systems
ERIC Educational Resources Information Center
Zang, Jiaqi; Hu, Haojie; Zhong, Juhua; Luo, Duanbin; Fang, Yi
2018-01-01
To introduce the randomness of a physical process to students, a chaotic pendulum experiment was opened at East China University of Science and Technology (ECUST) at the undergraduate level in the physics department. It was shown that chaotic motion could be initiated by adjusting the operation of a chaotic pendulum. By using the data of the…
All about Eve: Secret Sharing using Quantum Effects
NASA Technical Reports Server (NTRS)
Jackson, Deborah J.
2005-01-01
This document discusses the nature of light (including classical light and photons), encryption, quantum key distribution (QKD), light polarization, and beamsplitters, and their application to information communication. A quantum of light, called a photon, represents the smallest possible subdivision of radiant energy (light). The QKD key-generation sequence is outlined: the receiver broadcasts an initial signal indicating reception availability; timing pulses from the sender provide a reference for gated detection of photons; the sender generates photons with random polarization while the receiver detects photons with random polarization; and the two parties communicate via a data link to mutually establish random keys. The QKD network vision includes inter-SATCOM, point-to-point Gnd Fiber, and SATCOM-fiber nodes. QKD offers an unconditionally secure method of exchanging encryption keys. Ongoing research will focus on how to increase the key generation rate.
Homogeneous buoyancy-generated turbulence
NASA Technical Reports Server (NTRS)
Batchelor, G. K.; Canuto, V. M.; Chasnov, J. R.
1992-01-01
Using a theoretical analysis of the fundamental equations and a numerical simulation of the flow field, the statistically homogeneous motion generated by buoyancy forces after the creation of homogeneous random fluctuations in the density of an infinite fluid at an initial instant is examined. It is shown that the analytical results together with the numerical results provide a comprehensive description of the 'birth, life, and death' of buoyancy-generated turbulence. The numerical simulations yielded the mean-square density and mean-square velocity fluctuations and the associated spectra as functions of time for various initial conditions, and the time required for the mean-square density fluctuation to fall to a specified small value was estimated.
Dynamic analysis of a pumped-storage hydropower plant with random power load
NASA Astrophysics Data System (ADS)
Zhang, Hao; Chen, Diyi; Xu, Beibei; Patelli, Edoardo; Tolo, Silvia
2018-02-01
This paper analyzes the dynamic response of a pumped-storage hydropower plant in generating mode. Considering the elastic water column effects in the penstock, a linearized reduced-order dynamic model of the pumped-storage hydropower plant is used. As the power load is always random, a set of random generator electric power outputs is introduced to investigate the dynamic behavior of the plant. The influences of the PI gains on the dynamic characteristics of the plant under the random power load are then analyzed. In addition, the effects of the initial power load and the PI parameters on the stability of the plant are studied in depth. These results provide theoretical guidance for the study and analysis of pumped-storage hydropower plants.
NASA Astrophysics Data System (ADS)
Weng, Jingmeng; Wen, Weidong; Cui, Haitao; Chen, Bo
2018-06-01
A new method to generate the random distribution of fibers in the transverse cross-section of fiber-reinforced composites with high fiber volume fraction is presented in this paper. Based on microscopy observation of the transverse cross-sections of unidirectional composite laminates, a hexagonal arrangement is set as the initial arrangement, each fiber is given an arbitrary initial velocity in an arbitrary direction, and the micro-scale representative volume element (RVE) is established by simulating perfectly elastic collisions. Combined with the proposed periodic boundary conditions, which are suitable for multi-axial loading, the effective elastic properties of composite materials can be predicted; the predicted properties show reasonable agreement with experimental results. Comparing the stress fields of an RVE with randomly distributed fibers and an RVE with periodically distributed fibers shows that the predicted elastic modulus of the former is greater than that of the latter.
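For contrast with the collision-based generator described above, a common baseline is random sequential addition, which simply rejects overlapping fibre placements; it stalls at the high volume fractions that motivate the paper's method. The function below is a hypothetical illustration of that baseline, not the paper's algorithm:

```python
import math
import random

def rsa_fibers(n, radius, size, max_tries=100000):
    """Random sequential addition (NOT the collision method from the
    text): drop fibre centres one by one into a size x size square,
    rejecting any placement that overlaps an existing fibre. Works only
    for moderate volume fractions."""
    centres = []
    tries = 0
    while len(centres) < n and tries < max_tries:
        tries += 1
        c = (random.uniform(radius, size - radius),
             random.uniform(radius, size - radius))
        if all(math.dist(c, o) >= 2 * radius for o in centres):
            centres.append(c)
    return centres
```

At low packing fractions this succeeds quickly; as the fraction rises, the rejection rate explodes, which is why collision- or perturbation-based schemes are preferred for realistic composites.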
Cambron, Jerrilyn A; Dexheimer, Jennifer M; Chang, Mabel; Cramer, Gregory D
2010-01-01
The purpose of this article is to describe the methods for recruitment in a clinical trial on chiropractic care for lumbar spinal stenosis. This randomized, placebo-controlled pilot study investigated the efficacy of different amounts of total treatment dosage over 6 weeks in 60 volunteer subjects with lumbar spinal stenosis. Subjects were recruited for this study through several media venues, focusing on successful and cost-effective strategies. Included in our efforts were radio advertising, newspaper advertising, direct mail, and various other low-cost initiatives. Of the 1211 telephone screens, 60 responders (5.0%) were randomized into the study. The most successful recruitment method was radio advertising, generating more than 64% of the calls (776 subjects). Newspaper and magazine advertising generated approximately 9% of all calls (108 subjects), and direct mail generated less than 7% (79 subjects). The total direct cost for recruitment was $40 740 or $679 per randomized patient. The costs per randomization were highest for direct mail ($995 per randomization) and lowest for newspaper/magazine advertising ($558 per randomization). Success of recruitment methods may vary based on target population and location. Planning of recruitment efforts is essential to the success of any clinical trial. Copyright 2010 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.
Pseudo-random dynamic address configuration (PRDAC) algorithm for mobile ad hoc networks
NASA Astrophysics Data System (ADS)
Wu, Shaochuan; Tan, Xuezhi
2007-11-01
By analyzing various address configuration algorithms, this paper proposes a new pseudo-random dynamic address configuration (PRDAC) algorithm for mobile ad hoc networks. In PRDAC, the first node that initiates the network randomly chooses a nonlinear shift register that generates an m-sequence. When another node joins the network, the initial node acts as an IP address configuration server: it computes an IP address from this shift register, allocates the address, and tells the new node the generator polynomial of the register. In this way, once any node has obtained an IP address, it can act as a server and allocate addresses to subsequent joining nodes. PRDAC also handles network partition and merging as efficiently as prophet address (PA) allocation and the dynamic configuration and distribution protocol (DCDP). Furthermore, PRDAC has lower algorithmic and computational complexity than PA and rests on milder assumptions. In addition, PRDAC radically avoids address conflicts and maximizes the utilization rate of IP addresses. Analysis and simulation results show that PRDAC converges rapidly, has low overhead, and is insensitive to topological structure.
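The m-sequence machinery can be sketched with a plain linear-feedback shift register (the paper specifies a nonlinear register; the linear version below, with a primitive polynomial, is just the textbook way to obtain an m-sequence):

```python
def lfsr_sequence(state, taps, n):
    """Fibonacci LFSR: output the last stage, feed back the XOR of the
    tapped stages. With a primitive feedback polynomial (here x^4+x+1,
    i.e. a 4-stage register with taps (3, 2)) the output is an
    m-sequence of period 2**len(state) - 1."""
    state = list(state)
    out = []
    for _ in range(n):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]  # shift in the feedback bit
    return out
```

Successive register states within one period are all distinct, which is what lets addresses derived from the state avoid collisions until the sequence wraps; a real deployment would use a much longer register than this 4-stage toy.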
Generating equilateral random polygons in confinement
NASA Astrophysics Data System (ADS)
Diao, Y.; Ernst, C.; Montemayor, A.; Ziegler, U.
2011-10-01
One challenging problem in biology is to understand the mechanism of DNA packing in a confined volume such as a cell. It is known that confined circular DNA is often knotted and hence the topology of the extracted (and relaxed) circular DNA can be used as a probe of the DNA packing mechanism. However, in order to properly estimate the topological properties of the confined circular DNA structures using mathematical models, it is necessary to generate large ensembles of simulated closed chains (i.e. polygons) of equal edge lengths that are confined in a volume such as a sphere of certain fixed radius. Finding efficient algorithms that properly sample the space of such confined equilateral random polygons is a difficult problem. In this paper, we propose a method that generates confined equilateral random polygons based on their probability distribution. This method requires the creation of a large database initially. However, once the database has been created, a confined equilateral random polygon of length n can be generated in linear time in terms of n. The errors introduced by the method can be controlled and reduced by the refinement of the database. Furthermore, our numerical simulations indicate that these errors are unbiased and tend to cancel each other in a long polygon.
A method to generate the surface cell layer of the 3D virtual shoot apex from apical initials.
Kucypera, Krzysztof; Lipowczan, Marcin; Piekarska-Stachowiak, Anna; Nakielski, Jerzy
2017-01-01
The development of the cell pattern in the surface cell layer of the shoot apex can be investigated in vivo by means of time-lapse confocal imaging, showing the naked meristem in 3D at successive times. However, how this layer originates from apical initials and develops through the growth and divisions of their descendants remains unknown. This is an open area for computer modelling. A method to generate the surface cell layer is presented using the example of a 3D paraboloidal shoot apical dome. In the model, the layer originates from three apical initials that meet at the dome summit and develops through growth and cell division under isotropic surface growth defined by the growth tensor. The cells, described as polyhedrons, divide anticlinally with the smallest division plane, which, depending on the mode used, passes through the cell center or through a point found randomly near this center. The formation of the surface cell pattern is described with attention to the activity of the apical initials and the fates of their descendants. The computer-generated surface layer, which included about 350 cells, required about 1200 divisions of the apical initials and their derivatives. The derivatives were arranged into three roughly equal clonal sectors composed of cellular clones of different ages. Each apical initial renewed itself 7-8 times to produce its sector. The successive divisions of each initial were manifested in the shape and location of the cellular clones. Applying the random factor produced a more realistic cell pattern than the purely central mode. Cell divisions were analyzed statistically in top view. When all division walls were considered, their angular distribution was uniform, whereas the distribution limited to the apical initials alone showed some preferences related to their arrangement at the dome summit. A realistic surface cell pattern was obtained.
The present method is a useful tool for generating the surface cell layer, studying the activity of initial cells and their derivatives, and examining how cell expansion and division are coordinated during growth. We expect its further application to clarify the number and the permanence or impermanence of initial cells, and a possible relationship between their shape and oriented divisions, both within the growth tensor approach.
Obeso, Ignacio; Wilkinson, Leonora; Casabona, Enrique; Bringas, Maria Luisa; Álvarez, Mario; Álvarez, Lázaro; Pavón, Nancy; Rodríguez-Oroz, Maria-Cruz; Macías, Raúl; Obeso, Jose A; Jahanshahi, Marjan
2011-07-01
Recent imaging studies in healthy controls with a conditional stop signal reaction time (RT) task have implicated the subthalamic nucleus (STN) in response inhibition and the pre-supplementary motor area (pre-SMA) in conflict resolution. Parkinson's disease (PD) is characterized by striatal dopamine deficiency, overactivity of the STN, and underactivation of the pre-SMA during movement. We used the conditional stop signal RT task to investigate whether PD produced similar or dissociable effects on response initiation, response inhibition, and response initiation under conflict. In addition, we examined inhibition of prepotent responses on three cognitive tasks: the Stroop, random number generation, and Hayling sentence completion. PD patients were impaired on the conditional stop signal RT task, with response initiation (in situations both with and without conflict) and response inhibition all significantly delayed, and they had significantly greater difficulty than controls in suppressing prepotent or habitual responses on the Stroop, Hayling, and random number generation tasks. These results demonstrate the existence of a generalized inhibitory deficit in PD, which suggests that PD is a disorder of inhibition as well as activation and that, in situations of conflict, executive control over responses is compromised.
Numerical Modeling of S-Wave Generation by Fracture Damage in Underground Nuclear Explosions
2009-09-30
Element Package, ABAQUS. A user-defined subroutine, VUMAT, was written that incorporates the micro-mechanics based damage constitutive law described...dynamic damage evolution on the elastic and anelastic response. 2) whereas the Ashby/Sammis model was only applicable to the case where the initial cracks...are all parallel and the same size, we can now include a specified distribution of initial crack sizes with random azimuthal orientation about the
Numerical Generation of Dense Plume Fingers in Unsaturated Homogeneous Porous Media
NASA Astrophysics Data System (ADS)
Cremer, C.; Graf, T.
2012-04-01
In nature, the migration of dense plumes typically results in the formation of vertical plume fingers. Flow direction in fingers is downwards, which is counterbalanced by upwards flow of less dense fluid between fingers. In heterogeneous media, heterogeneity itself is known to trigger the formation of fingers. In homogeneous media, however, fingers are also created even if all grains had the same diameter. The reason is that pore-scale heterogeneity leading to different flow velocities also exists in homogeneous media due to two effects: (i) Grains of identical size may randomly arrange differently, e.g. forming tetrahedrons, hexahedrons or octahedrons. Each arrangement creates pores of varying diameter, thus resulting in different average flow velocities. (ii) Random variations of solute concentration lead to varying buoyancy effects, thus also resulting in different velocities. As a continuation of previously made efforts to incorporate pore-scale heterogeneity into fully saturated soil such that dense fingers are realistically generated (Cremer and Graf, EGU Assembly, 2011), the current paper extends the research scope from saturated to unsaturated soil. Perturbation methods are evaluated by numerically re-simulating a laboratory-scale experiment of plume transport in homogeneous unsaturated sand (Simmons et al., Transp. Porous Media, 2002). The following 5 methods are being discussed: (i) homogeneous sand, (ii) initial perturbation of solute concentration, (iii) spatially random, time-constant perturbation of solute source, (iv) spatially and temporally random noise of simulated solute concentration, and (v) random K-field that introduces physically insignificant but numerically significant heterogeneity. Results demonstrate that, as opposed to saturated flow, perturbing the solute source will not result in plume fingering. 
This is because the location of the perturbed source (domain top) and the location of finger generation (groundwater surface) do not coincide. Alternatively, similar to saturated flow, applying either a random concentration noise (iv) or a random K-field (v) generates realistic plume fingering. Future work will focus on the generation mechanisms of plume finger splitting.
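Method (iv) above amounts to adding independent random noise to the simulated concentration field at every time step. A minimal sketch follows; the amplitude and the clipping range are illustrative choices, not values from the study:

```python
import random

def perturb_concentration(field, amplitude, seed=None):
    """Add spatially and temporally random noise to a 2D concentration
    field (method iv): each cell receives an independent uniform
    perturbation every call, clipped to the physical range [0, 1].
    Calling this once per time step makes the noise vary in both
    space and time."""
    rng = random.Random(seed)
    return [[min(1.0, max(0.0, c + rng.uniform(-amplitude, amplitude)))
             for c in row] for row in field]
```

The perturbation must be applied at (or advected to) the location where fingers form, which is exactly the point the abstract makes about why perturbing only the solute source fails in the unsaturated case.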
Use of Genetic Algorithms to solve Inverse Problems in Relativistic Hydrodynamics
NASA Astrophysics Data System (ADS)
Guzmán, F. S.; González, J. A.
2018-04-01
We present the use of Genetic Algorithms (GAs) as a strategy to solve inverse problems associated with models of relativistic hydrodynamics. The signal we consider to emulate an observation is the density of a relativistic gas, measured at a point where a shock is traveling. This shock is generated numerically from a Riemann problem with mildly relativistic conditions. The inverse problem we propose is the prediction of the initial conditions of density, velocity, and pressure of the Riemann problem that gave origin to that signal. For this we use the density, velocity, and pressure of the gas on both sides of the discontinuity as the six genes of an organism, initially with random values within a tolerance. We then prepare an initial population of N such organisms and evolve them using methods based on GAs. In the end, the organism with the best fitness of each generation is compared to the signal, and the process ends when the set of initial conditions of the organisms of a later generation fits the signal within a tolerance.
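The GA loop described above can be sketched as follows; the forward model (a full Riemann solver in the paper) is passed in as a black box, and the genetic operators shown (elitist selection, one-point crossover, uniform-reset mutation) are generic textbook choices, not necessarily the paper's:

```python
import random

def evolve(signal, forward_model, pop_size=50, n_gen=100, bounds=(0.1, 1.0)):
    """Minimal GA sketch: each organism is 6 genes -- density, velocity,
    and pressure on each side of the Riemann discontinuity. Fitness is
    the negative squared error between the model output and the signal."""
    def fitness(org):
        pred = forward_model(org)
        return -sum((p - s) ** 2 for p, s in zip(pred, signal))

    pop = [[random.uniform(*bounds) for _ in range(6)] for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 5]            # keep the fittest fifth
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, 6)        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:           # uniform-reset mutation
                i = random.randrange(6)
                child[i] = random.uniform(*bounds)
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)
```

With a trivial identity "forward model" and a known target signal, the loop recovers the six genes to within a small error, which makes a convenient smoke test before plugging in an actual hydrodynamic solver.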
NASA Astrophysics Data System (ADS)
Li, Xiayue; Curtis, Farren S.; Rose, Timothy; Schober, Christoph; Vazquez-Mayagoitia, Alvaro; Reuter, Karsten; Oberhofer, Harald; Marom, Noa
2018-06-01
We present Genarris, a Python package that performs configuration space screening for molecular crystals of rigid molecules by random sampling with physical constraints. For fast energy evaluations, Genarris employs a Harris approximation, whereby the total density of a molecular crystal is constructed via superposition of single molecule densities. Dispersion-inclusive density functional theory is then used for the Harris density without performing a self-consistency cycle. Genarris uses machine learning for clustering, based on a relative coordinate descriptor developed specifically for molecular crystals, which is shown to be robust in identifying packing motif similarity. In addition to random structure generation, Genarris offers three workflows based on different sequences of successive clustering and selection steps: the "Rigorous" workflow is an exhaustive exploration of the potential energy landscape, the "Energy" workflow produces a set of low energy structures, and the "Diverse" workflow produces a maximally diverse set of structures. The latter is recommended for generating initial populations for genetic algorithms. Here, the implementation of Genarris is reported and its application is demonstrated for three test cases.
NASA Astrophysics Data System (ADS)
Ding, Jian; Li, Li
2018-05-01
We initiate the study on chemical distances of percolation clusters for level sets of two-dimensional discrete Gaussian free fields as well as loop clusters generated by two-dimensional random walk loop soups. One of our results states that the chemical distance between two macroscopic annuli away from the boundary for the random walk loop soup at the critical intensity is of dimension 1 with positive probability. Our proof method is based on an interesting combination of a theorem of Makarov, isomorphism theory, and an entropic repulsion estimate for Gaussian free fields in the presence of a hard wall.
Legleye, Stéphane; Khlat, Myriam; Mayet, Aurélie; Beck, François; Falissard, Bruno; Chau, Nearkasen; Peretti-Watel, Patrick
2016-10-01
The diffusion of cannabis initiation has been accompanied by a reversal in the educational gradient: contrary to older generations, the less educated in recent generations are more likely to initiate than the more educated. We tested whether the educational gradient for the transition from initiation to daily use evolved in the same way. A French telephone random survey conducted in 2010 (21 818 respondents aged 15-64 years), asking interviewees about their ages at initiation to daily use, if any. A total of 6824 cannabis initiators aged 18-64 years at data collection. Three birth cohort groups (generations) were compared: 1946-60 (n = 767), 1961-75 (n = 2632) and 1976-92 (n = 3425) with, respectively, 47, 42 and 45% of women. Risks of transition to daily use from ages 11-34 were compared through time-discrete logistic regressions and educational gradients were quantified through a relative index of inequality (RII). Control variables include age and time-varying variables (ages at tobacco daily use, at first drunkenness and at first other use of an illicit drug in a list of 13 products). Twenty-four per cent of the initiators reported daily use before age 35, the proportions tripling from the oldest to the youngest generation (from 11.7 to 38.6% in men, from 7.7 to 22.2% in women). Whatever the generation, the less educated initiators more often shifted to daily use than the most educated: from the oldest to the youngest generation, RII = 2.13, 95% confidence interval (CI) = [0.65, 7.02]; 2.19 95% CI = [1.33, 3.63]; and 2.24, 95% CI = [1.60, 3.15] in men; RII = 3.31, 95% CI = [0.75, 14.68]; 3.17, 95% CI = [1.49, 6.76]; and 3.56, 95% CI = [2.07, 6.14] in women, respectively. In France, the risk of transition from cannabis initiation to daily use has remained consistently higher among less educated cannabis initiators over three generations (1946-60, 1961-75, 1976-92), in contrast to what is observed for initiation. © 2016 Society for the Study of Addiction.
Randrianjatovo-Gbalou, Irina; Rosario, Sandrine; Sismeiro, Odile; Varet, Hugo; Legendre, Rachel; Coppée, Jean-Yves; Huteau, Valérie; Pochet, Sylvie; Delarue, Marc
2018-05-21
Nucleic acid aptamers, especially RNA, exhibit valuable advantages compared to protein therapeutics in terms of size, affinity and specificity. However, the synthesis of libraries of large random RNAs is still difficult and expensive. The engineering of polymerases able to directly generate these libraries has the potential to replace the chemical synthesis approach. Here, we start with a DNA polymerase that already displays a significant template-free nucleotidyltransferase activity, human DNA polymerase theta, and we mutate it based on the knowledge of its three-dimensional structure as well as previous mutational studies on members of the same polA family. One mutant exhibited a high tolerance towards ribonucleotides (NTPs) and displayed an efficient ribonucleotidyltransferase activity that resulted in the assembly of long RNA polymers. HPLC analysis and RNA sequencing of the products were used to quantify the incorporation of the four NTPs as a function of initial NTP concentrations and established the randomness of each generated nucleic acid sequence. The same mutant revealed a propensity to accept other modified nucleotides and to extend them in long fragments. Hence, this mutant can deliver random natural and modified RNA polymers libraries ready to use for SELEX, with custom lengths and balanced or unbalanced ratios.
Cruse-Sanders, J. M.; Hamrick, J.L.; Ahumada, J.A.
2005-01-01
American ginseng, Panax quinquefolius L., is one of the most heavily traded medicinal plants in North America. The effect of harvest on genetic diversity in ginseng was measured with a single-generation culling simulation program. Culling scenarios included random harvest at varying levels, legal-limit random harvest, and legal-limit mature-plant harvest. The legal limit was determined by the proportion of legally harvestable plants per population (% mature plants per population). Random harvest at varying levels resulted in significant loss of genetic diversity, especially allelic richness. Relative to initial levels, average within-population genetic diversity (He) was significantly lower when plants were culled randomly at the legal limit (Mann-Whitney U = 430, p < 0.001) or when only mature plants were culled (Mann-Whitney U = 394, p < 0.01). Within-population genetic diversity was significantly higher with legal-limit mature-plant harvest (He = 0.068) than when plants were culled randomly at the legal limit (He = 0.064; U = 202, p < 0.01). Based on these simulations of harvest over one generation, we recommend that harvesting fewer than the proportion of mature plants could reduce the negative genetic effects of harvest on ginseng populations. © Springer 2005.
NASA Astrophysics Data System (ADS)
Tseng, Po-Hao; Hsu, Kai-Chieh; Lin, Yu-Yu; Lee, Feng-Min; Lee, Ming-Hsiu; Lung, Hsiang-Lan; Hsieh, Kuang-Yeu; Chung Wang, Keh; Lu, Chih-Yuan
2018-04-01
A high-performance physically unclonable function (PUF) implemented with WO3 resistive random access memory (ReRAM) is presented in this paper. This robust ReRAM-PUF eliminates the bit-flipping problem at very high temperature (up to 250 °C) thanks to the plentiful read margin between the initial resistance state and the set resistance state. It also promises 10-year retention at temperatures up to 210 °C. These two stable resistance states enable stable operation in automotive environments from -40 to 125 °C without the need for a temperature compensation circuit. High uniqueness of the PUF is achieved by implementing a proposed identification (ID)-generation method: an optimized forming condition moves 50% of the cells to the low resistance state while the remaining 50% stay at the initial high resistance state. Inter- and intra-PUF evaluations with unlimited separation of Hamming distance (HD) are successfully demonstrated even under corner conditions. The number of reproductions was measured to exceed 10^7 with 0% bit error rate (BER) at read voltages from 0.4 to 0.7 V.
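The inter/intra-HD figures of merit quoted above are straightforward to compute from raw ID readouts. This hedged sketch assumes IDs are given as equal-length bit strings:

```python
def hamming(a, b):
    """Fractional Hamming distance between two equal-length bit strings."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

def evaluate_puf(responses, reads):
    """responses: one n-bit ID per device (uniqueness: mean inter-HD
    should be near 0.5). reads: repeated reads of a single device
    (reliability: mean intra-HD should be near 0)."""
    inter = [hamming(responses[i], responses[j])
             for i in range(len(responses))
             for j in range(i + 1, len(responses))]
    intra = [hamming(reads[0], r) for r in reads[1:]]
    return sum(inter) / len(inter), sum(intra) / len(intra)
```

"Unlimited separation" in the abstract means the empirical inter- and intra-HD distributions computed this way do not overlap, so a single threshold distinguishes re-reads of one chip from reads of different chips.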
Properties of promoters cloned randomly from the Saccharomyces cerevisiae genome.
Santangelo, G M; Tornow, J; McLaughlin, C S; Moldave, K
1988-01-01
Promoters were isolated at random from the genome of Saccharomyces cerevisiae by using a plasmid that contains a divergently arrayed pair of promoterless reporter genes. A comprehensive library was constructed by inserting random (DNase I-generated) fragments into the intergenic region upstream from the reporter genes. Simple in vivo assays for either reporter gene product (alcohol dehydrogenase or beta-galactosidase) allowed the rapid identification of promoters from among these random fragments. Poly(dA-dT) homopolymer tracts were present in three of five randomly cloned promoters. With two exceptions, each RNA start site detected was 40 to 100 base pairs downstream from a TATA element. All of the randomly cloned promoters were capable of activating reporter gene transcription bidirectionally. Interestingly, one of the promoter fragments originated in a region of the S. cerevisiae rDNA spacer; regulated divergent transcription (presumably by RNA polymerase II) initiated in the same region. PMID:2847031
Morse, Melvin L; Beem, Lance W
2011-12-01
Reiki therapy is documented for relief of pain and stress. Energetic healing has been documented to alter biologic markers of illness such as hematocrit. True random number generators are reported to be affected by energy healers and spiritually oriented conscious awareness. The patient was a then 54-year-old severely ill man who had hepatitis C types 1 and 2 and who did not improve with conventional therapy. He also suffered from obesity, the metabolic syndrome, asthma, and hypertension. He was treated with experimental high-dose interferon/ribavirin therapy with resultant profound anemia and neutropenia. Energetic healing and Reiki therapy were administered initially to enhance the patient's sense of well-being and to relieve anxiety. Possible effects on the patient's absolute neutrophil count and hematocrit were incidentally noted. Reiki therapy was then initiated at times of profound neutropenia to assess its possible effect on the patient's absolute neutrophil count (ANC). Reiki and other energetic healing sessions were monitored with a true random number generator (RNG). Statistically significant relationships were documented between Reiki therapy, a quieting of the electronically created white noise of the RNG during healing sessions, and improvement in the patient's ANC. The immediate clinical result was that the patient could tolerate the high-dose interferon regimen without missing doses because of absolute neutropenia. The patient was initially a late responder to interferon and had been given a 5% chance of clearing the virus. He remains clear of the virus 1 year after treatment. The association between changes in the RNG, Reiki therapy, and a patient's ANC is the first to the authors' knowledge in the medical literature. Future studies assessing the effects of energetic healing on specific biologic markers of disease are anticipated. Concurrent use of a true RNG may prove to correlate with the effectiveness of energetic therapy.
Security and composability of randomness expansion from Bell inequalities
NASA Astrophysics Data System (ADS)
Fehr, Serge; Gelles, Ran; Schaffner, Christian
2013-01-01
The nonlocal behavior of quantum mechanics can be used to generate guaranteed fresh randomness from an untrusted device that consists of two nonsignalling components; since the generation process requires some initial fresh randomness to act as a catalyst, one also speaks of randomness expansion. R. Colbeck and A. Kent [J. Phys. A1751-811310.1088/1751-8113/44/9/095305 44, 095305 (2011)] proposed the first method for generating randomness from untrusted devices, but without providing a rigorous analysis. This was addressed subsequently by S. Pironio [Nature (London)NATUAS0028-083610.1038/nature09008 464, 1021 (2010)], who aimed at deriving a lower bound on the min-entropy of the data extracted from an untrusted device based only on the observed nonlocal behavior of the device. Although that article succeeded in developing important tools for reaching the stated goal, the proof itself contained a bug, and the given formal claim on the guaranteed amount of min-entropy needs to be revisited. In this paper we build on the tools provided by Pironio and obtain a meaningful lower bound on the min-entropy of the data produced by an untrusted device based on the observed nonlocal behavior of the device. Our main result confirms the essence of the (improperly formulated) claims of Pironio and puts them on solid ground. We also address the question of composability and show that different untrusted devices can be composed in an alternating manner under the assumption that they are not entangled. This enables superpolynomial randomness expansion based on two untrusted yet unentangled devices.
NASA Astrophysics Data System (ADS)
Zhuang, Yufei; Huang, Haibin
2014-02-01
A hybrid algorithm combining the particle swarm optimization (PSO) algorithm with the Legendre pseudospectral method (LPM) is proposed for solving the time-optimal trajectory planning problem of underactuated spacecraft. In the initial phase of the search, an initialization generator is constructed by the PSO algorithm owing to its strong global searching ability and robustness to random initial values; however, the PSO algorithm converges slowly near the global optimum. Therefore, when the change in the fitness function falls below a predefined value, the search is switched to the LPM to accelerate convergence. Thus, with the solutions obtained by the PSO algorithm as a set of proper initial guesses, the hybrid algorithm can find a global optimum more quickly and accurately. Results from 200 Monte Carlo simulations demonstrate that the proposed hybrid PSO-LPM algorithm outperforms both the PSO algorithm and the LPM alone in terms of global searching capability and convergence rate. Moreover, the PSO-LPM algorithm is also robust to random initial values.
Lent, Michelle R.; Veur, Stephanie S. Vander; McCoy, Tara A.; Wojtanowski, Alexis C.; Sandoval, Brianna; Sherman, Sandy; Komaroff, Eugene; Foster, Gary D.
2014-01-01
Objective Although many initiatives exist to improve the availability of healthy foods in corner stores, few randomized trials have assessed their effects. This study evaluated, in a randomized, controlled trial, the effects of a first-generation healthy corner store intervention on students’ food and beverage purchases over a two-year period. Design and Methods Participants (n=767) were 4th-6th grade students. Ten schools and their nearby corner stores (n=24) were randomly assigned to the healthy corner store intervention or an assessment-only control. Intercept surveys directly assessed the nutritional characteristics of students’ corner store purchases at baseline, 1 and 2 years. Students’ weights and heights were measured at baseline, 1 and 2 years. Results There were no differences in energy content per intercept purchase between control and intervention schools at year 1 (p=0.12) or year 2 (p=0.58). There were no differences between control and intervention students in BMI z-score (year 1, p=0.83; year 2, p=0.98) or obesity prevalence (year 1, p=0.96; year 2, p=0.58). Conclusions A healthy corner store initiative did not result in significant changes in the energy content of corner store purchases or in continuous or categorical measures of obesity. These data will help to inform future interventions. PMID:25311881
Design automation techniques for custom LSI arrays
NASA Technical Reports Server (NTRS)
Feller, A.
1975-01-01
The standard cell design automation technique is described as an approach for generating random logic PMOS, CMOS or CMOS/SOS custom large scale integration arrays with low initial nonrecurring costs and quick turnaround time or design cycle. The system is composed of predesigned circuit functions or cells and computer programs capable of automatic placement and interconnection of the cells in accordance with an input data net list. The program generates a set of instructions to drive an automatic precision artwork generator. A series of support design automation and simulation programs are described, including programs for verifying correctness of the logic on the arrays, performing dc and dynamic analysis of MOS devices, and generating test sequences.
Analysis of force profile during a maximum voluntary isometric contraction task.
Househam, Elizabeth; McAuley, John; Charles, Thompson; Lightfoot, Timothy; Swash, Michael
2004-03-01
This study analyses maximum voluntary isometric contraction (MVIC) and its measurement by recording the force profile during maximal-effort, 7-s hand-grip contractions. Six healthy subjects each performed three trials repeated at short intervals to study variation from fatigue. These three trials were performed during three separate sessions at daily intervals to assess random variation. A pattern of force development during a trial was identified: an initiation phase, with or without an initiation peak, was followed by a maintenance phase, sometimes with secondary pulses and an underlying decline in force. Of these three MVIC parameters, maximum force during the maintenance phase showed less random variability relative to intertrial fatigue variability than did maximum force during the initiation phase or the absolute maximum force. Analysis of MVIC as a task, rather than as a single maximal value, reveals deeper levels of motor control in its generation. Thus, force parameters other than the absolute maximum force may be better suited to quantification of muscle performance in health and disease.
Long period pseudo random number sequence generator
NASA Technical Reports Server (NTRS)
Wang, Charles C. (Inventor)
1989-01-01
A circuit for generating a sequence of pseudo random numbers (A sub K). There is an exponentiator in GF(2 sup m) for the normal basis representation of elements in a finite field GF(2 sup m), each represented by m binary digits, having two inputs and an output from which the sequence (A sub K) of pseudo random numbers is taken. One of the two inputs is connected to receive the output (E sub K) of a maximal length shift register of n stages. There is a switch having a pair of inputs and an output. The switch output is connected to the other of the two inputs of the exponentiator. One of the switch inputs is connected for initially receiving a primitive element (A sub O) in GF(2 sup m). Finally, there is a delay circuit having an input and an output. The delay circuit output is connected to the other of the switch inputs, and the delay circuit input is connected to the output of the exponentiator. After the exponentiator initially receives the primitive element (A sub O) in GF(2 sup m) through the switch, the switch can be switched to cause the exponentiator to receive as its input the delayed output A(K-1) from the exponentiator, thereby generating (A sub K) continuously at the output of the exponentiator. The exponentiator in GF(2 sup m) is novel and comprises a cyclic-shift circuit, a Massey-Omura multiplier, and a control logic circuit, all operably connected to compute the square-and-multiply factors U(sub i) = A(sup 2(sup i)) (for n(sub i) = 1) or U(sub i) = 1 (for n(sub i) = 0).
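The patented recurrence A(K) = A(K-1)^(E K) can be sketched in software. The following is a hedged toy model: the patent uses a normal-basis Massey-Omura multiplier in hardware, whereas this sketch uses a polynomial basis over the small field GF(2^4) with modulus x^4 + x + 1; the field size, shift-register taps, and seeds are illustrative assumptions, not the patent's parameters.

```python
# Toy model of the generator: A_k = A_{k-1}^{E_k} in GF(2^m), exponents E_k
# drawn from a maximal-length shift register. Polynomial basis over GF(2^4)
# for simplicity (the patent uses a normal basis).

M = 4
MOD = 0b10011  # x^4 + x + 1, irreducible over GF(2)

def gf_mul(a, b):
    """Carry-less multiplication modulo the field polynomial."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & (1 << M):
            a ^= MOD
        b >>= 1
    return r

def gf_pow(a, e):
    """Square-and-multiply: the factor U_i = a^(2^i) enters when bit n_i of e is 1."""
    r = 1
    while e:
        if e & 1:
            r = gf_mul(r, a)
        a = gf_mul(a, a)  # a -> a^(2^i); in a normal basis this is a cyclic shift
        e >>= 1
    return r

def lfsr(state, taps=0b1001, n=4):
    """Toy 4-stage Fibonacci shift register supplying the exponents E_k."""
    while True:
        yield state
        fb = bin(state & taps).count("1") & 1
        state = ((state << 1) | fb) & ((1 << n) - 1)

def sequence(a0=0b0010, seed=0b1000, count=8):
    """Generate A_1..A_count from a primitive element a0 (the switch's initial input)."""
    out, a = [], a0
    gen = lfsr(seed)
    for _ in range(count):
        e = next(gen)
        a = gf_pow(a, e if e else 1)  # a maximal-length LFSR never emits 0; guard anyway
        out.append(a)
    return out

print(sequence())
```

Note the comment in `gf_pow`: in a normal-basis representation, squaring is exactly the cyclic shift the patent's cyclic-shift circuit provides, which is why the decomposition into factors U(sub i) = A(sup 2(sup i)) is cheap in hardware.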
NASA Astrophysics Data System (ADS)
Shemer, L.; Sergeeva, A.
2009-12-01
The statistics of a random water wave field determine the probability of appearance of extremely high (freak) waves. This probability is strongly related to the spectral characteristics of the wave field. Laboratory investigation of the spatial variation of random wave-field statistics for various initial conditions is thus of substantial practical importance. Unidirectional nonlinear random wave groups are investigated experimentally in the 300 m long Large Wave Channel (GWK) in Hannover, Germany, the biggest facility of its kind in Europe. Numerous realizations of a wave field with the prescribed frequency power spectrum, yet randomly distributed initial phases of each harmonic, were generated by a computer-controlled piston-type wavemaker. Several initial spectral shapes with identical dominant wave length but different width were considered. For each spectral shape, the total duration of sampling over all realizations was long enough to yield a sufficient sample size for reliable statistics. Throughout all experiments, an effort was made to retain the characteristic wave height value and thus the degree of nonlinearity of the wave field. Spatial evolution of numerous statistical wave field parameters (skewness, kurtosis and probability distributions) is studied using about 25 wave gauges distributed along the tank. It is found that, depending on the initial spectral shape, the frequency spectrum of the wave field may undergo significant modification in the course of its evolution along the tank; the values of all statistical wave parameters are strongly related to the local spectral width. A sample of the measured wave height probability functions (scaled by the variance of surface elevation) is plotted in Fig. 1 for the initially narrow rectangular spectrum. The results in Fig. 1 resemble findings obtained in [1] for the initial Gaussian spectral shape. 
The probability of large waves notably surpasses that predicted by the Rayleigh distribution and is highest at a distance of about 100 m. Acknowledgement: This study was carried out in the framework of the EC-supported project "Transnational access to large-scale tests in the Large Wave Channel (GWK) of Forschungszentrum Küste" (Contract HYDRALAB III - No. 022441). [1] L. Shemer and A. Sergeeva, J. Geophys. Res. Oceans 114, C01015 (2009). Figure 1. Variation along the tank of the measured wave height distribution for the rectangular initial spectral shape; carrier wave period T0 = 1.5 s.
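The generation scheme described above, a prescribed power spectrum with randomly distributed initial phases for each harmonic, can be illustrated with a minimal sketch. The spectrum shape, record length, and sampling step below are illustrative assumptions, not the GWK wavemaker settings; only the carrier period T0 = 1.5 s follows the caption.

```python
# Synthesize one realization of a random wave field from a prescribed
# one-sided power spectrum S(f) with uniformly random component phases.
import numpy as np

rng = np.random.default_rng(0)

def realization(spectrum, dt):
    """Surface elevation time series from a one-sided spectrum sampled at f_k."""
    n = len(spectrum)
    df = 1.0 / (2 * n * dt)                       # frequency resolution
    f = (np.arange(n) + 1) * df                   # harmonic frequencies
    amps = np.sqrt(2 * spectrum * df)             # component amplitudes
    phases = rng.uniform(0, 2 * np.pi, n)         # randomly distributed phases
    t = np.arange(2 * n) * dt
    return np.sum(amps[:, None] * np.cos(2 * np.pi * f[:, None] * t + phases[:, None]),
                  axis=0)

# Narrow rectangular spectrum centered on the dominant frequency f0 = 1/T0
n, dt = 256, 0.05
f = (np.arange(n) + 1) / (2 * n * dt)
S = np.where(np.abs(f - 1 / 1.5) < 0.1, 1.0, 0.0)
eta = realization(S, dt)
```

Re-running with fresh phases yields another realization with the same spectrum, mirroring the ensemble of wavemaker runs used to accumulate statistics.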
Simulation of Stochastic Processes by Coupled ODE-PDE
NASA Technical Reports Server (NTRS)
Zak, Michail
2008-01-01
A document discusses, as a new phenomenon, the emergence of randomness in solutions of coupled, fully deterministic ODE-PDE (ordinary differential equation-partial differential equation) systems due to failure of the Lipschitz condition. It is possible to exploit the special properties of ordinary differential equations (represented by an arbitrarily chosen dynamical system) coupled with the corresponding Liouville equations (used to describe the evolution of initial uncertainties in terms of a joint probability distribution) in order to simulate stochastic processes with prescribed probability distributions. The important advantage of the proposed approach is that the simulation does not require a random-number generator.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trull, J.; Wang, B.; Parra, A.
2015-06-01
Pulse compression in a dispersive strontium barium niobate crystal with a random size and distribution of antiparallel-oriented nonlinear domains is observed via transverse second harmonic generation. The dependence of the transverse width of the second harmonic trace on position along the propagation direction allows determination of the initial chirp and duration of pulses in the femtosecond regime. This technique permits real-time analysis of the pulse evolution and facilitates fast in-situ correction of pulse chirp acquired in propagation through an optical system.
Manns, Braden; Barrett, Brendan; Evans, Michael; Garg, Amit; Hemmelgarn, Brenda; Kappel, Joanne; Klarenbach, Scott; Madore, Francois; Parfrey, Patrick; Samuel, Susan; Soroka, Steven; Suri, Rita; Tonelli, Marcello; Wald, Ron; Walsh, Michael; Zappitelli, Michael
2014-01-01
Patients with chronic kidney disease (CKD) do not always receive care consistent with guidelines, in part due to complexities in CKD management, lack of randomized trial data to inform care, and a failure to disseminate best practice. At a 2007 conference of key Canadian stakeholders in kidney disease, attendees noted that the impact of Canadian Society of Nephrology (CSN) guidelines was attenuated given limited formal linkages between the CSN Clinical Practice Guidelines Group, kidney researchers, decision makers and knowledge users, and that further knowledge was required to guide care in patients with kidney disease. The idea for the Canadian Kidney Knowledge Translation and Generation Network (CANN-NET) developed from this meeting. CANN-NET is a pan-Canadian network established in partnership with CSN, the Kidney Foundation of Canada and other professional societies to improve the care and outcomes of patients with and at risk for kidney disease. The initial priority areas for knowledge translation include improving optimal timing of dialysis initiation, and increasing the appropriate use of home dialysis. Given the urgent need for new knowledge, CANN-NET has also brought together a national group of experienced Canadian researchers to address knowledge gaps by encouraging and supporting multicentre randomized trials in priority areas, including management of cardiovascular disease in patients with kidney failure.
ERIC Educational Resources Information Center
Hendra, Richard; Greenberg, David H.; Hamilton, Gayle; Oppenheim, Ari; Pennington, Alexandra; Schaberg, Kelsey; Tessler, Betsy L.
2016-01-01
This report summarizes the two-year findings of a rigorous random assignment evaluation of the WorkAdvance model, a sectoral training and advancement initiative. Launched in 2011, WorkAdvance goes beyond the previous generation of employment programs by introducing demand-driven skills training and a focus on jobs that have career pathways. The…
Self-synchronization in an ensemble of nonlinear oscillators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ostrovsky, L. A., E-mail: lev.ostrovsky@gmail.com; Galperin, Y. V.; Skirta, E. A.
2016-06-15
The paper describes the results of study of a system of coupled nonlinear, Duffing-type oscillators, from the viewpoint of their self-synchronization, i.e., generation of a coherent field (order parameter) via instability of an incoherent (random-phase) initial state. We consider both the cases of dissipative coupling (e.g., via the joint radiation) and reactive coupling in a Hamiltonian system.
ERIC Educational Resources Information Center
Bowen, Michelle; Laurion, Suzanne
A study documented, using a telephone survey, the incidence rates of sexual harassment of mass communication interns, and compared those rates to student and professional rates. A probability sample of 44 male and 52 female mass communications professionals was generated using several random sampling techniques from among professionals who work in…
Random Sequence for Optimal Low-Power Laser Generated Ultrasound
NASA Astrophysics Data System (ADS)
Vangi, D.; Virga, A.; Gulino, M. S.
2017-08-01
Low-power laser-generated ultrasound has lately been gaining importance in the research world, thanks to the possibility of investigating a mechanical component's structural integrity through a non-contact, Non-Destructive Testing (NDT) procedure. The ultrasounds are, however, very low in amplitude, making it necessary to apply pre-processing and post-processing operations to the signals to detect them. The cross-correlation technique is used in this work, meaning that a random signal must be used as the laser input. For this purpose, a highly random and simple-to-create code called the T sequence, capable of enhancing ultrasound detectability, is introduced (not previously available in the state of the art). Several important parameters characterize the T sequence and can influence the process: the number of pulses Npulses, the pulse duration δ, and the distance between pulses dpulses. A finite element (FE) model of a 3 mm steel disk was initially developed to study the longitudinal ultrasound generation mechanism and the obtainable outputs. Later, experimental tests showed that the T sequence is highly flexible for ultrasound detection purposes, making it optimal to use high Npulses and δ but low dpulses. Besides describing the phenomena that arise in the low-power laser generation process, the results of this study are also important for setting up an effective NDT procedure using this technology.
Super-stable Poissonian structures
NASA Astrophysics Data System (ADS)
Eliazar, Iddo
2012-10-01
In this paper we characterize classes of Poisson processes whose statistical structures are super-stable. We consider a flow generated by a one-dimensional ordinary differential equation, and an ensemble of particles ‘surfing’ the flow. The particles start from random initial positions, and are propagated along the flow by stochastic ‘wave processes’ with general statistics and general cross correlations. Setting the initial positions to be Poisson processes, we characterize the classes of Poisson processes that render the particles’ positions—at all times, and invariantly with respect to the wave processes—statistically identical to their initial positions. These Poisson processes are termed ‘super-stable’ and facilitate the generalization of the notion of stationary distributions far beyond the realm of Markov dynamics.
Towards component-based validation of GATE: aspects of the coincidence processor
Moraes, Eder R.; Poon, Jonathan K.; Balakrishnan, Karthikayan; Wang, Wenli; Badawi, Ramsey D.
2014-01-01
GATE is public domain software widely used for Monte Carlo simulation in emission tomography. Validations of GATE have primarily been performed on a whole-system basis, leaving the possibility that errors in one sub-system may be offset by errors in others. We assess the accuracy of the GATE PET coincidence generation sub-system in isolation, focusing on the options most closely modeling the majority of commercially available scanners. Independent coincidence generators were coded by teams at Toshiba Medical Research Unit (TMRU) and UC Davis. A model similar to the Siemens mCT scanner was created in GATE. Annihilation photons interacting with the detectors were recorded. Coincidences were generated using GATE, TMRU and UC Davis code and results compared to “ground truth” obtained from the history of the photon interactions. GATE was tested twice, once with every qualified single event opening a time window and initiating a coincidence check (the “multiple window method”), and once where a time window is opened and a coincidence check initiated only by the first single event to occur after the end of the prior time window (the “single window method”). True, scattered and random coincidences were compared. Noise equivalent count rates were also computed and compared. The TMRU and UC Davis coincidence generators agree well with ground truth. With GATE, reasonable accuracy can be obtained if the single window method option is chosen and random coincidences are estimated without use of the delayed coincidence option. However in this GATE version, other parameter combinations can result in significant errors. PMID:25240897
NASA Astrophysics Data System (ADS)
Senavirathne, Gayan; Bertram, Jeffrey G.; Jaszczur, Malgorzata; Chaurasiya, Kathy R.; Pham, Phuong; Mak, Chi H.; Goodman, Myron F.; Rueda, David
2015-12-01
Activation-induced deoxycytidine deaminase (AID) generates antibody diversity in B cells by initiating somatic hypermutation (SHM) and class-switch recombination (CSR) during transcription of immunoglobulin variable (IgV) and switch region (IgS) DNA. Using single-molecule FRET, we show that AID binds to transcribed dsDNA and translocates unidirectionally in concert with RNA polymerase (RNAP) on moving transcription bubbles, while increasing the fraction of stalled bubbles. AID scans randomly when constrained in an 8 nt model bubble. When unconstrained on single-stranded (ss) DNA, AID moves in random bidirectional short slides/hops over the entire molecule while remaining bound for ~5 min. Our analysis distinguishes dynamic scanning from static ssDNA creasing. That AID alone can track along with RNAP during transcription and scan within stalled transcription bubbles suggests a mechanism by which AID can initiate SHM and CSR when properly regulated, yet when unregulated can access non-Ig genes and cause cancer.
Jet meandering by a foil pitching in quiescent fluid
NASA Astrophysics Data System (ADS)
Shinde, Sachin Y.; Arakeri, Jaywant H.
2013-04-01
The flow produced by a rigid symmetric NACA0015 airfoil purely pitching at a fixed location in quiescent fluid (the limiting case of infinite Strouhal number) is studied using visualizations and particle image velocimetry. A weak jet is generated whose inclination changes continually with time. This meandering is observed to be random and independent of the initial conditions, over a wide range of pitching parameters.
NASA Astrophysics Data System (ADS)
Shi, Xizhi; He, Chaoyu; Pickard, Chris J.; Tang, Chao; Zhong, Jianxin
2018-01-01
A method is introduced to stochastically generate crystal structures with defined structural characteristics. Reasonable quotient graphs for symmetric crystals are constructed using a random strategy combined with space group and graph theory. Our algorithm enables the search for large-size and complex crystal structures with a specified connectivity, such as threefold sp2 carbons, fourfold sp3 carbons, as well as mixed sp2-sp3 carbons. To demonstrate the method, we randomly construct initial structures adhering to space groups from 75 to 230 and a range of lattice constants, and we identify 281 new sp3 carbon crystals. First-principles optimization of these structures shows that most of them are dynamically and mechanically stable and energetically comparable to those previously proposed. Some of the new structures can be considered candidates to explain the experimental cold compression of graphite.
Thermodynamic method for generating random stress distributions on an earthquake fault
Barall, Michael; Harris, Ruth A.
2012-01-01
This report presents a new method for generating random stress distributions on an earthquake fault, suitable for use as initial conditions in a dynamic rupture simulation. The method employs concepts from thermodynamics and statistical mechanics. A pattern of fault slip is considered to be analogous to a micro-state of a thermodynamic system. The energy of the micro-state is taken to be the elastic energy stored in the surrounding medium. Then, the Boltzmann distribution gives the probability of a given pattern of fault slip and stress. We show how to decompose the system into independent degrees of freedom, which makes it computationally feasible to select a random state. However, due to the equipartition theorem, straightforward application of the Boltzmann distribution leads to a divergence which predicts infinite stress. To avoid equipartition, we show that the finite strength of the fault acts to restrict the possible states of the system. By analyzing a set of earthquake scaling relations, we derive a new formula for the expected power spectral density of the stress distribution, which allows us to construct a computer algorithm free of infinities. We then present a new technique for controlling the extent of the rupture by generating a random stress distribution thousands of times larger than the fault surface, and selecting a portion which, by chance, has a positive stress perturbation of the desired size. Finally, we present a new two-stage nucleation method that combines a small zone of forced rupture with a larger zone of reduced fracture energy.
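The report's core spectral idea, drawing a random field whose expected power spectral density follows a prescribed form and then selecting a sub-region that happens to carry a positive perturbation, can be sketched in one dimension. The power-law exponent `beta` and the window-selection rule below are illustrative assumptions, not the formula derived in the report.

```python
# Generate a 1-D zero-mean random "stress" field with PSD ~ k^(-beta) by
# assigning random phases to a prescribed amplitude spectrum, then pick a
# sub-segment with a positive mean perturbation.
import numpy as np

rng = np.random.default_rng(1)

def random_stress(n, beta=2.0):
    """Zero-mean random field with power-law spectral density."""
    k = np.fft.rfftfreq(n)
    amp = np.zeros_like(k)
    amp[1:] = k[1:] ** (-beta / 2)                # sqrt of the target PSD
    phase = rng.uniform(0, 2 * np.pi, len(k))
    field = np.fft.irfft(amp * np.exp(1j * phase), n)
    return field - field.mean()

def positive_window(field, w):
    """Pick the length-w segment with the largest mean stress perturbation."""
    means = np.convolve(field, np.ones(w) / w, mode="valid")
    i = int(np.argmax(means))
    return field[i:i + w]

stress = random_stress(1024)
patch = positive_window(stress, 64)
```

Generating a field much larger than the region of interest and selecting a favorable window echoes the report's technique of oversampling the fault surface to control rupture extent.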
Saelens, Brian E; Scholz, Kelley; Walters, Kelly; Simoni, Jane M; Wright, Davene R
2017-08-01
To examine feasibility and initial efficacy of having previously treated parents serve as peer interventionists in family-based behavioral weight management treatment (FBT). Children aged 7-11 years with overweight/obesity and parents (n = 59 families) were enrolled in one of two pilot trials, the EPICH (Engaging Parents in Child Health) randomized trial comparing professional versus peer FBT delivery or the Parent Partnership trial, which provided professionally delivered FBT to families (first generation) and then randomly assigned first generation parents to either be or not be peer interventionists for subsequent families (second generation). Efficacy (child zBMI change), feasibility, and costs for delivering FBT, and impacts of being a peer interventionist were examined. In EPICH, families receiving professional versus peer intervention had similar decreases in child zBMI and parent BMI, with markedly lower costs for peer versus professional delivery. In Parent Partnership, families receiving peer intervention significantly decreased weight status, with very preliminary evidence suggesting better maintenance of child zBMI changes if parents served as peer interventionists. Previously treated parents were willing, highly confident, and able to serve as peer interventionists in FBT. Two pilot randomized clinical trials suggest parents-as-peer interventionists in FBT may be feasible, efficacious, and delivered at lower costs, with perhaps some additional benefits to serving as a peer interventionist. More robust investigation is warranted of peer treatment delivery models for pediatric weight management.
An Image Encryption Algorithm Utilizing Julia Sets and Hilbert Curves
Sun, Yuanyuan; Chen, Lina; Xu, Rudan; Kong, Ruiqing
2014-01-01
Image encryption is an important and effective technique to protect image security. In this paper, a novel image encryption algorithm combining Julia sets and Hilbert curves is proposed. The algorithm utilizes Julia sets’ parameters to generate a random sequence as the initial keys and gets the final encryption keys by scrambling the initial keys through the Hilbert curve. The final cipher image is obtained by modulo arithmetic and diffuse operation. In this method, it needs only a few parameters for the key generation, which greatly reduces the storage space. Moreover, because of the Julia sets’ properties, such as infiniteness and chaotic characteristics, the keys have high sensitivity even to a tiny perturbation. The experimental results indicate that the algorithm has large key space, good statistical property, high sensitivity for the keys, and effective resistance to the chosen-plaintext attack. PMID:24404181
Seidavi, Alireza; Goldsmith, Marian R.
2014-01-01
Abstract The experiments reported here were conducted to investigate the effect of selection on three quantitative traits, namely cocoon weight, cocoon shell weight, and cocoon shell percentage, during four generations by rearing six pure breeds of domesticated silkworm, Bombyx mori L. (Lepidoptera: Bombycidae) of Chinese and Japanese origin compared with random unselected groups as controls. All stages of rearing and data recording were performed over four rearing periods, with generations 1–3 during successive spring seasons and generation 4 during the autumn season in year 3. Each pure line contained two groups of selected and random (control) groups. Comparisons included the effect of selection methods, pure line, and generation on the phenotypic values. We found strong main effects of pure line, generation, sex, and group and support for nearly all interactions between these main effects for all three response traits. The results indicated that cocoon weight and cocoon shell weight in the selected group were higher than in the control or nonselected group. Both selected and nonselected groups had the lowest cocoon weight, cocoon shell weight, and cocoon shell percentage in the fourth generation when environmental conditions during the autumn season were less favorable than spring. The cocoon weight and cocoon shell weight averages were higher for nonselected groups in the second and third generations, and for the selected group in the first generation due to the direct effect of selection. PMID:25527593
NASA Astrophysics Data System (ADS)
Aksoy, A.; Lee, J. H.; Kitanidis, P. K.
2016-12-01
Heterogeneity in hydraulic conductivity (K) impacts the transport and fate of contaminants in the subsurface, as well as the design and operation of managed aquifer recharge (MAR) systems. Recently, improvements in computational resources and the availability of big data through electrical resistivity tomography (ERT) and remote sensing have provided opportunities to better characterize the subsurface. Yet, there is a need to improve prediction and evaluation methods in order to extract information from field measurements for better field characterization. In this study, genetic algorithm optimization, which has been widely used in optimal aquifer remediation designs, was used to determine the spatial distribution of K. A hypothetical 2 km by 2 km aquifer was considered. A genetic algorithm library, PGAPack, was linked with a fast Fourier transform based random field generator as well as a groundwater flow and contaminant transport simulation model (BIO2D-KE). The objective of the optimization model was to minimize the total squared error between measured and predicted field values. It was assumed that measured K values were available through ERT. The performance of the genetic algorithm in predicting the distribution of K was tested for different cases. In the first, it was assumed that observed K values were evaluated using the random field generator only as the forward model. In the second case, in addition to K values obtained through ERT, measured head values were incorporated into the evaluation, in which BIO2D-KE and the random field generator were used as the forward models. Lastly, tracer concentrations were used as additional information in the optimization model. Initial results indicated enhanced performance when the random field generator and BIO2D-KE are used in combination in predicting the spatial distribution of K.
Zhao, Youxuan; Li, Feilong; Cao, Peng; Liu, Yaolu; Zhang, Jianyu; Fu, Shaoyun; Zhang, Jun; Hu, Ning
2017-08-01
Since the identification of micro-cracks in engineering materials is very valuable in understanding the initial and slight changes in mechanical properties of materials under complex working environments, numerical simulations of the propagation of the low-frequency S0 Lamb wave in thin plates with randomly distributed micro-cracks were performed to study the behavior of nonlinear Lamb waves. The results showed that while the influence of the randomly distributed micro-cracks on the phase velocity of the low-frequency S0 fundamental waves could be neglected, significant ultrasonic nonlinear effects caused by the randomly distributed micro-cracks were discovered, mainly presenting as second harmonic generation. Using a Monte Carlo simulation method, we found that the acoustic nonlinear parameter increased linearly with the micro-crack density and the size of the micro-crack zone, and that it was also related to the excitation frequency and the friction coefficient of the micro-crack surfaces. In addition, it was found that the nonlinear effect of waves reflected by the micro-cracks was more noticeable than that of the transmitted waves. This study theoretically reveals that the low-frequency S0 mode of Lamb waves can be used as the fundamental wave to quantitatively identify micro-cracks in thin plates. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Keylock, C. J.
2017-03-01
An algorithm is described that can generate random variants of a time series while preserving the probability distribution of original values and the pointwise Hölder regularity. Thus, it preserves the multifractal properties of the data. Our algorithm is similar in principle to well-known algorithms based on the preservation of the Fourier amplitude spectrum and original values of a time series. However, it is underpinned by a dual-tree complex wavelet transform rather than a Fourier transform. Our method, which we term the iterated amplitude adjusted wavelet transform can be used to generate bootstrapped versions of multifractal data, and because it preserves the pointwise Hölder regularity but not the local Hölder regularity, it can be used to test hypotheses concerning the presence of oscillating singularities in a time series, an important feature of turbulence and econophysics data. Because the locations of the data values are randomized with respect to the multifractal structure, hypotheses about their mutual coupling can be tested, which is important for the velocity-intermittency structure of turbulence and self-regulating processes.
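Since the abstract describes the method as similar in principle to well-known Fourier-based algorithms, a hedged sketch of that classical analogue (the iterated amplitude adjusted Fourier transform, IAAFT) is shown below; the authors' IAAWT replaces the Fourier transform with a dual-tree complex wavelet transform, which is not reproduced here.

```python
# IAAFT surrogate: alternately enforce the original Fourier amplitude
# spectrum and the original value distribution (by rank-order remapping).
import numpy as np

rng = np.random.default_rng(2)

def iaaft(x, iterations=100):
    """Surrogate preserving the Fourier amplitude spectrum and the original values."""
    amp = np.abs(np.fft.rfft(x))
    sorted_x = np.sort(x)
    s = rng.permutation(x)                        # random initial shuffle
    for _ in range(iterations):
        spec = np.fft.rfft(s)
        s = np.fft.irfft(amp * np.exp(1j * np.angle(spec)), len(x))
        s = sorted_x[np.argsort(np.argsort(s))]   # rank-order remap to original values
    return s

x = np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.1 * rng.standard_normal(256)
surrogate = iaaft(x)
```

The surrogate keeps exactly the original values (hence the probability distribution) while randomizing their arrangement, which is the bootstrapping use case the abstract describes; the wavelet-domain version additionally preserves pointwise Hölder regularity.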
On predicting receptivity to surface roughness in a compressible infinite swept wing boundary layer
NASA Astrophysics Data System (ADS)
Thomas, Christian; Mughal, Shahid; Ashworth, Richard
2017-03-01
The receptivity of crossflow disturbances on an infinite swept wing is investigated using solutions of the adjoint linearised Navier-Stokes equations. The adjoint based method for predicting the magnitude of stationary disturbances generated by randomly distributed surface roughness is described, with the analysis extended to include both surface curvature and compressible flow effects. Receptivity is predicted for a broad spectrum of spanwise wavenumbers, variable freestream Reynolds numbers, and subsonic Mach numbers. Curvature is found to play a significant role in the receptivity calculations, while compressible flow effects are only found to marginally affect the initial size of the crossflow instability. A Monte Carlo type analysis is undertaken to establish the mean amplitude and variance of crossflow disturbances generated by the randomly distributed surface roughness. Mean amplitudes are determined for a range of flow parameters that are maximised for roughness distributions containing a broad spectrum of roughness wavelengths, including those that are most effective in generating stationary crossflow disturbances. A control mechanism is then developed where the short scale roughness wavelengths are damped, leading to significant reductions in the receptivity amplitude.
Self-correcting random number generator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humble, Travis S.; Pooser, Raphael C.
2016-09-06
A system and method for generating random numbers. The system may include a random number generator (RNG), such as a quantum random number generator (QRNG), configured to self-correct or adapt in order to substantially achieve randomness from the output of the RNG. By adapting, the RNG may generate a random number that may be considered random regardless of whether the random number itself is tested as such. As an example, the RNG may include components to monitor one or more characteristics of the RNG during operation, and may use the monitored characteristics as a basis for adapting, or self-correcting, to provide a random number according to one or more performance criteria.
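The patent does not disclose source code; as a toy illustration of the monitor-and-adapt idea, the sketch below watches the bias of a raw bit source over a sliding window and applies von Neumann extraction whenever the monitored bias exceeds a threshold. All names, the window size, and the threshold are hypothetical:

```python
import random

def von_neumann_extract(bits):
    """Unbias a bit stream: map 01 -> 0, 10 -> 1, discard 00 and 11."""
    return [a for a, b in zip(bits[::2], bits[1::2]) if a != b]

def self_correcting_stream(raw_source, n, window=1000, threshold=0.01):
    """Emit n bits from raw_source(), monitoring the empirical bias of
    each window and self-correcting (via von Neumann extraction) when
    the monitored characteristic drifts past the threshold."""
    out = []
    while len(out) < n:
        buf = [raw_source() for _ in range(window)]
        bias = abs(sum(buf) / window - 0.5)
        if bias > threshold:        # monitored characteristic triggers correction
            out.extend(von_neumann_extract(buf))
        else:
            out.extend(buf)
    return out[:n]
```

Von Neumann extraction removes the bias of independent raw bits at the cost of discarding a fraction of them, which matches the patent's theme of trading throughput for certified output quality.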
Creating targeted initial populations for genetic product searches in heterogeneous markets
NASA Astrophysics Data System (ADS)
Foster, Garrett; Turner, Callaway; Ferguson, Scott; Donndelinger, Joseph
2014-12-01
Genetic searches often use randomly generated initial populations to maximize diversity and enable a thorough sampling of the design space. While many of these initial configurations perform poorly, the trade-off between population diversity and solution quality is typically acceptable for small-scale problems. Navigating complex design spaces, however, often requires computationally intelligent approaches that improve solution quality. This article draws on research advances in market-based product design and heuristic optimization to strategically construct 'targeted' initial populations. Targeted initial designs are created using respondent-level part-worths estimated from discrete choice models. These designs are then integrated into a traditional genetic search. Two case study problems of differing complexity are presented to illustrate the benefits of this approach. In both problems, targeted populations lead to computational savings and product configurations with improved market share of preferences. Future research efforts to tailor this approach and extend it towards multiple objectives are also discussed.
Superdiffusion in a non-Markovian random walk model with a Gaussian memory profile
NASA Astrophysics Data System (ADS)
Borges, G. M.; Ferreira, A. S.; da Silva, M. A. A.; Cressoni, J. C.; Viswanathan, G. M.; Mariz, A. M.
2012-09-01
Most superdiffusive non-Markovian random walk models assume that correlations are maintained at all time scales, e.g., fractional Brownian motion, Lévy walks, the Elephant walk and the Alzheimer walk models. In the latter two models the random walker can always "remember" the initial times near t = 0. Assuming jump size distributions with finite variance, the question naturally arises: is superdiffusion possible if the walker is unable to recall the initial times? We give a conclusive answer to this general question by studying a non-Markovian model in which the walker's memory of the past is weighted by a Gaussian centered at time t/2, when the walker had one half its present age, with a standard deviation σ(t) that grows linearly as the walker ages. For large widths we find that the model behaves similarly to the Elephant model, but for small widths this Gaussian memory profile model behaves like the Alzheimer walk model. We also report that the phenomenon of amnestically induced persistence, known to occur in the Alzheimer walk model, extends to the Gaussian memory profile. We conclude that memory of the initial times is not a necessary condition for generating (log-periodic) superdiffusion.
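A Gaussian memory profile of this kind is straightforward to simulate. The sketch below is a simplified rendering of such a walk, assuming (as an illustration, not the paper's exact update rule) that a recalled past step is repeated with probability p and reversed otherwise:

```python
import numpy as np

def gaussian_memory_walk(n_steps, width_frac=0.1, p=0.5, seed=0):
    """Non-Markovian walk: at time t the walker recalls a past step at a
    time drawn from a Gaussian centred at t/2 with standard deviation
    width_frac * t (growing linearly with age), then repeats it with
    probability p or reverses it otherwise."""
    rng = np.random.default_rng(seed)
    steps = np.empty(n_steps, dtype=int)
    steps[0] = 1 if rng.random() < 0.5 else -1
    for t in range(1, n_steps):
        # recall a past time from the Gaussian memory profile
        tau = int(np.clip(rng.normal(t / 2, max(width_frac * t, 1.0)), 0, t - 1))
        steps[t] = steps[tau] if rng.random() < p else -steps[tau]
    return np.cumsum(steps)
```

Scanning `width_frac` interpolates between Elephant-like behaviour (broad memory) and Alzheimer-like behaviour (narrow memory), which is the regime comparison the paper studies.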
Real-time fast physical random number generator with a photonic integrated circuit.
Ugajin, Kazusa; Terashima, Yuta; Iwakawa, Kento; Uchida, Atsushi; Harayama, Takahisa; Yoshimura, Kazuyuki; Inubushi, Masanobu
2017-03-20
Random number generators are essential for applications in information security and numerical simulations. Most optical-chaos-based random number generators produce random bit sequences by offline post-processing with large optical components. We demonstrate a real-time hardware implementation of a fast physical random number generator with a photonic integrated circuit and a field programmable gate array (FPGA) electronic board. We generate 1-Tbit random bit sequences and evaluate their statistical randomness using NIST Special Publication 800-22 and TestU01. All of the BigCrush tests in TestU01 are passed using 410-Gbit random bit sequences. A maximum real-time generation rate of 21.1 Gb/s is achieved for random bit sequences in binary format stored in a computer, which can be directly used for applications involving secret keys in cryptography and random seeds in large-scale numerical simulations.
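The statistical batteries cited above (NIST SP 800-22, TestU01) are extensive; as a flavor of what they check, here is the simplest NIST SP 800-22 test, the frequency (monobit) test, in a short Python sketch:

```python
import math

def monobit_test(bits):
    """NIST SP 800-22 frequency (monobit) test: returns the p-value for
    the hypothesis that ones and zeros are equally likely. The sequence
    is considered random (at the 1% level) if p >= 0.01."""
    n = len(bits)
    s = abs(sum(1 if b else -1 for b in bits))   # partial sum of +/-1
    return math.erfc(s / math.sqrt(2 * n))
```

A strongly one-sided sequence yields a p-value near zero, while a balanced sequence yields a p-value near one; the full suite applies many such tests to different statistics of the bit stream.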
Towards component-based validation of GATE: aspects of the coincidence processor.
Moraes, Eder R; Poon, Jonathan K; Balakrishnan, Karthikayan; Wang, Wenli; Badawi, Ramsey D
2015-02-01
GATE is public domain software widely used for Monte Carlo simulation in emission tomography. Validations of GATE have primarily been performed on a whole-system basis, leaving the possibility that errors in one sub-system may be offset by errors in others. We assess the accuracy of the GATE PET coincidence generation sub-system in isolation, focusing on the options that most closely model the majority of commercially available scanners. Independent coincidence generators were coded by teams at the Toshiba Medical Research Unit (TMRU) and UC Davis. A model similar to the Siemens mCT scanner was created in GATE. Annihilation photons interacting with the detectors were recorded. Coincidences were generated using the GATE, TMRU and UC Davis code, and the results were compared to "ground truth" obtained from the history of the photon interactions. GATE was tested twice: once with every qualified single event opening a time window and initiating a coincidence check (the "multiple window method"), and once with a time window opened and a coincidence check initiated only by the first single event to occur after the end of the prior time window (the "single window method"). True, scattered and random coincidences were compared. Noise equivalent count rates were also computed and compared. The TMRU and UC Davis coincidence generators agree well with ground truth. With GATE, reasonable accuracy can be obtained if the single window method option is chosen and random coincidences are estimated without use of the delayed coincidence option. However, in this GATE version, other parameter combinations can result in significant errors. Copyright © 2014 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
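As an illustration of the single window method described above, here is a minimal sketch of a coincidence sorter over time-sorted single events; the data layout and names are assumptions, not GATE's internals:

```python
def single_window_coincidences(events, tau):
    """Single window method: the first single event after the previous
    window closes opens a new window of length tau; all singles falling
    inside it form one coincidence group (groups of >= 2 are kept).
    events: time-sorted list of (time, detector_id) tuples."""
    groups, i = [], 0
    while i < len(events):
        t0 = events[i][0]          # this single opens the window
        group = [events[i]]
        i += 1
        while i < len(events) and events[i][0] - t0 <= tau:
            group.append(events[i])
            i += 1
        if len(group) >= 2:
            groups.append(group)
    return groups
```

The multiple window method differs in that every qualified single opens its own window, so a single event can participate in more than one coincidence check, which is the behavioural difference the validation probes.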
Quantum random number generator
Pooser, Raphael C.
2016-05-10
A quantum random number generator (QRNG) and a photon generator for a QRNG are provided. The photon generator may be operated in a spontaneous mode below a lasing threshold to emit photons. Photons emitted from the photon generator may have at least one random characteristic, which may be monitored by the QRNG to generate a random number. In one embodiment, the photon generator may include a photon emitter and an amplifier coupled to the photon emitter. The amplifier may enable the photon generator to be used in the QRNG without introducing significant bias in the random number and may enable multiplexing of multiple random numbers. The amplifier may also desensitize the photon generator to fluctuations in power supplied thereto while operating in the spontaneous mode. In one embodiment, the photon emitter and amplifier may be a tapered diode amplifier.
Food-web complexity emerging from ecological dynamics on adaptive networks.
Garcia-Domingo, Josep L; Saldaña, Joan
2007-08-21
Food webs are complex networks describing trophic interactions in ecological communities. Since Robert May's seminal work on random structured food webs, the complexity-stability debate has been a central issue in ecology: does network complexity increase or decrease food-web persistence? A multi-species predator-prey model incorporating adaptive predation shows that the action of ecological dynamics on the topology of a food web (whose initial configuration is generated either by the cascade model or by the niche model) renders, when a significant fraction of adaptive predators is present, hyperbolic complexity-persistence relationships similar to those observed in empirical food webs. It is also shown that the apparent positive relation between complexity and persistence in food webs generated under the cascade model, which has been pointed out in previous papers, disappears when the final connectance is used instead of the initial one to explain species persistence.
Quantum random number generation
Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; ...
2016-06-28
Quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals coherence, an inherent feature of quantumness that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness of the devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness of the device and the random number generation speed.
Evolution of the magnetorotational instability on initially tangled magnetic fields
NASA Astrophysics Data System (ADS)
Bhat, Pallavi; Ebrahimi, Fatima; Blackman, Eric G.; Subramanian, Kandaswamy
2017-12-01
The initial magnetic field of previous magnetorotational instability (MRI) simulations has always included a significant system-scale component, even if stochastic. However, it is of conceptual and practical interest to assess whether the MRI can grow when the initial field is turbulent. The ubiquitous presence of turbulent or random flows in astrophysical plasmas generically leads to a small-scale dynamo (SSD), which would provide initial seed turbulent velocity and magnetic fields in the plasma that becomes an accretion disc. Can the MRI grow from these more realistic initial conditions? To address this, we supply a standard shearing box with isotropically forced SSD generated magnetic and velocity fields as initial conditions and remove the forcing. We find that if the initially supplied fields are too weak or too incoherent, they decay from the initial turbulent cascade faster than they can grow via the MRI. When the initially supplied fields are sufficient to allow MRI growth and sustenance, the saturated stresses, large-scale fields and power spectra match those of the standard zero net flux MRI simulation with an initial large-scale vertical field.
Continuous time quantum random walks in free space
NASA Astrophysics Data System (ADS)
Eichelkraut, Toni; Vetter, Christian; Perez-Leija, Armando; Christodoulides, Demetrios; Szameit, Alexander
2014-05-01
We show theoretically and experimentally that two-dimensional continuous time coherent random walks are possible in free space, that is, in the absence of any external potential, by properly tailoring the associated initial wave function. These effects are experimentally demonstrated using classical paraxial light. Evidently, the usage of classical beams to explore the dynamics of point-like quantum particles is possible since both phenomena are mathematically equivalent. This in turn makes our approach suitable for the realization of random walks using different quantum particles, including electrons and photons. To study the spatial evolution of a wavefunction theoretically, we consider the one-dimensional paraxial wave equation (i∂_z + (1/2)∂_x²)Ψ = 0. Starting with the initially localized wavefunction Ψ(x, 0) = exp[-x²/(2σ²)] J₀(αx), one can show that the evolution of such Gaussian-apodized Bessel envelopes within a region of validity resembles the probability pattern of a quantum walker traversing a uniform lattice. In order to generate the desired input field in our experimental setting, we shape the amplitude and phase of a collimated light beam originating from a classical HeNe laser (633 nm) using a spatial light modulator.
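The free-space evolution governed by the paraxial equation above can be reproduced numerically: each Fourier mode k simply acquires the phase exp(-ik²z/2). A small NumPy sketch follows; the grid and beam parameters are illustrative, and J₀ is approximated by its integral representation to avoid external dependencies:

```python
import numpy as np

def j0(x):
    """Bessel J0 via its integral representation,
    J0(x) = (1/pi) * integral_0^pi cos(x sin(theta)) dtheta,
    evaluated as a numerical average over theta."""
    theta = np.linspace(0.0, np.pi, 2001)
    return np.cos(np.outer(np.atleast_1d(x), np.sin(theta))).mean(axis=1)

def propagate_paraxial(psi0, x, z):
    """Exact spectral solution of i dPsi/dz + (1/2) d^2Psi/dx^2 = 0:
    each Fourier mode k picks up the phase exp(-i k^2 z / 2)."""
    k = 2.0 * np.pi * np.fft.fftfreq(len(x), d=x[1] - x[0])
    return np.fft.ifft(np.fft.fft(psi0) * np.exp(-0.5j * k**2 * z))
```

Propagating the Gaussian-apodized Bessel input Ψ(x, 0) = exp[-x²/(2σ²)] J₀(αx) with this routine reproduces, within the region of validity, the lattice-like interference pattern described above.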
Kuklinski, Margaret R; Fagan, Abigail A; Hawkins, J David; Briney, John S; Catalano, Richard F
2015-06-01
To determine whether the Communities That Care (CTC) prevention system is a cost-beneficial intervention. Data were from a longitudinal panel of 4,407 youth participating in a randomized controlled trial including 24 towns in 7 states, matched in pairs within state and randomly assigned to condition. Significant differences favoring intervention youth in sustained abstinence from delinquency, alcohol use, and tobacco use through Grade 12 were monetized and compared to economic investment in CTC. CTC was estimated to produce $4,477 in benefits per youth (discounted 2011 dollars). It cost $556 per youth to implement CTC for 5 years. The net present benefit was $3,920. The benefit-cost ratio was $8.22 per dollar invested. The internal rate of return was 21%. Risk that investment would exceed benefits was minimal. Investment was expected to be recouped within 9 years. Sensitivity analyses in which effects were halved yielded positive cost-beneficial results. CTC is a cost-beneficial, community-based approach to preventing initiation of delinquency, alcohol use, and tobacco use. CTC is estimated to generate economic benefits that exceed implementation costs when disseminated with fidelity in communities.
MATIN: a random network coding based framework for high quality peer-to-peer live video streaming.
Barekatain, Behrang; Khezrimotlagh, Dariush; Aizaini Maarof, Mohd; Ghaeini, Hamid Reza; Salleh, Shaharuddin; Quintana, Alfonso Ariza; Akbari, Behzad; Cabrera, Alicia Triviño
2013-01-01
In recent years, Random Network Coding (RNC) has emerged as a promising solution for efficient Peer-to-Peer (P2P) video multicasting over the Internet. This is largely because RNC noticeably increases the error resiliency and throughput of the network. However, the high transmission overhead arising from sending a large coefficients vector as a header has been the most important challenge of RNC. Moreover, because the Gauss-Jordan elimination method is employed to decode the encoded blocks and to check linear dependency among the coefficients vectors, considerable computational complexity can be imposed on peers. In order to address these challenges, this study introduces MATIN, a random-network-coding-based framework for efficient P2P video streaming. MATIN includes a novel coefficients matrix generation method that guarantees no linear dependency in the generated coefficients matrix. Using the proposed framework, each peer encapsulates one coefficient entry instead of n into the generated encoded packet, which results in very low transmission overhead. It is also possible to obtain the inverted coefficients matrix using a small number of simple arithmetic operations. As a result, peers incur very low computational complexity, and MATIN permits random network coding to be more efficient in P2P video streaming systems. The results obtained from simulation using OMNET++ show that it substantially outperforms RNC based on the Gauss-Jordan elimination method, providing better video quality on peers in terms of four important performance metrics: video distortion, dependency distortion, end-to-end delay and initial startup delay.
NASA Astrophysics Data System (ADS)
Vodenicarevic, D.; Locatelli, N.; Mizrahi, A.; Friedman, J. S.; Vincent, A. F.; Romera, M.; Fukushima, A.; Yakushiji, K.; Kubota, H.; Yuasa, S.; Tiwari, S.; Grollier, J.; Querlioz, D.
2017-11-01
Low-energy random number generation is critical for many emerging computing schemes proposed to complement or replace von Neumann architectures. However, current random number generators are always associated with an energy cost that is prohibitive for these computing schemes. We introduce random number bit generation based on specific nanodevices: superparamagnetic tunnel junctions. We experimentally demonstrate high-quality random bit generation that represents an orders-of-magnitude improvement in energy efficiency over current solutions. We show that the random generation speed improves with nanodevice scaling, and we investigate the impact of temperature, magnetic field, and cross talk. Finally, we show how alternative computing schemes can be implemented using superparamagentic tunnel junctions as random number generators. These results open the way for fabricating efficient hardware computing devices leveraging stochasticity, and they highlight an alternative use for emerging nanodevices.
Konias, Sokratis; Chouvarda, Ioanna; Vlahavas, Ioannis; Maglaveras, Nicos
2005-09-01
Current approaches for mining association rules usually assume that the mining is performed in a static database, where the problem of missing attribute values does not practically exist. However, these assumptions do not hold in some medical databases, such as a home care system. In this paper, a novel uncertainty rule algorithm, URG-2 (Uncertainty Rule Generator), is presented, which addresses the problem of mining dynamic databases containing missing values. This algorithm requires only one pass over the initial dataset in order to generate the item set, while new metrics corresponding to the notions of Support and Confidence are used. URG-2 was evaluated over two medical databases, randomly introducing missing values into each record's attributes (rates: 5-20%, in 5% increments) in the initial dataset. Compared with the classical approach (records with missing values are ignored), the proposed algorithm was more robust in mining rules from datasets containing missing values. In all cases, the difference in preserving the initial rules ranged between 30% and 60% in favour of URG-2. Moreover, due to its incremental nature, URG-2 saved over 90% of the time required for thorough re-mining. Thus, the proposed algorithm can offer a preferable solution for mining in dynamic relational databases.
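URG-2's uncertainty metrics are not reproduced in the abstract; as a baseline illustration of Support and Confidence in the presence of missing values, the sketch below simply excludes records missing a needed attribute (a deliberate simplification of the classical approach, not the URG-2 metric):

```python
def support_confidence(records, antecedent, consequent):
    """Classical support and confidence for the rule antecedent -> consequent
    over records (dicts) that may contain None for missing attributes.
    Records missing any attribute needed by the rule are excluded from
    that rule's counts (URG-2 instead weights such records)."""
    attrs = {**antecedent, **consequent}
    usable = [r for r in records if all(r.get(a) is not None for a in attrs)]
    if not usable:
        return 0.0, 0.0
    both = sum(all(r[a] == v for a, v in attrs.items()) for r in usable)
    ante = sum(all(r[a] == v for a, v in antecedent.items()) for r in usable)
    support = both / len(usable)
    confidence = both / ante if ante else 0.0
    return support, confidence
```

The paper's point is that simply dropping incomplete records, as above, distorts the mined rules as the missing-value rate grows, which motivates URG-2's uncertainty-aware counterparts of these two metrics.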
Owens, Scott R; Wiehagen, Luke T; Kelly, Susan M; Piccoli, Anthony L; Lassige, Karen; Yousem, Samuel A; Dhir, Rajiv; Parwani, Anil V
2010-09-01
We recently implemented a novel pre-sign-out quality assurance tool in our subspecialty-based surgical pathology practice at the University of Pittsburgh Medical Center. It randomly selects an adjustable percentage of cases for review by a second pathologist at the time the originating pathologist's electronic signature is entered and requires that the review be completed within 24 hours, before release of the final report. The tool replaced a retrospective audit system and it has been in successful use since January 2009. We report our initial experience for the first 14 months of its service. During this time, the disagreement numbers and levels were similar to those identified using the retrospective system, case turnaround time was not significantly affected, and the number of case amendments generated decreased. The tool is a useful quality assurance instrument and its prospective nature allows for the potential prevention of some serious errors.
Senavirathne, Gayan; Bertram, Jeffrey G.; Jaszczur, Malgorzata; Chaurasiya, Kathy R.; Pham, Phuong; Mak, Chi H.; Goodman, Myron F.; Rueda, David
2015-01-01
Activation-induced deoxycytidine deaminase (AID) generates antibody diversity in B cells by initiating somatic hypermutation (SHM) and class-switch recombination (CSR) during transcription of immunoglobulin variable (IgV) and switch region (IgS) DNA. Using single-molecule FRET, we show that AID binds to transcribed dsDNA and translocates unidirectionally in concert with RNA polymerase (RNAP) on moving transcription bubbles, while increasing the fraction of stalled bubbles. AID scans randomly when constrained in an 8 nt model bubble. When unconstrained on single-stranded (ss) DNA, AID moves in random bidirectional short slides/hops over the entire molecule while remaining bound for ∼5 min. Our analysis distinguishes dynamic scanning from static ssDNA creasing. That AID alone can track along with RNAP during transcription and scan within stalled transcription bubbles suggests a mechanism by which AID can initiate SHM and CSR when properly regulated, yet when unregulated can access non-Ig genes and cause cancer. PMID:26681117
Pseudo-Random Number Generator Based on Coupled Map Lattices
NASA Astrophysics Data System (ADS)
Lü, Huaping; Wang, Shihong; Hu, Gang
A one-way coupled chaotic map lattice is used for generating pseudo-random numbers. It is shown that with suitable cooperative applications of both chaotic and conventional approaches, the output of the spatiotemporally chaotic system can easily meet the practical requirements of random numbers, i.e., excellent random statistical properties, long periodicity of computer realizations, and fast speed of random number generations. This pseudo-random number generator system can be used as ideal synchronous and self-synchronizing stream cipher systems for secure communications.
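A minimal sketch of bit generation from a coupled logistic-map lattice follows; the ring-shaped one-way coupling and threshold readout are illustrative choices, not the paper's exact configuration (which also combines chaotic and conventional post-processing to reach cryptographic quality):

```python
def cml_prng_bits(n_bits, size=8, eps=0.95, seed=0.123):
    """Pseudo-random bits from a one-way coupled logistic-map lattice:
    x_i(t+1) = (1 - eps) * f(x_i(t)) + eps * f(x_{i-1}(t)),
    with f(x) = 4x(1 - x) and ring coupling (site 0 driven by the last
    site). One bit per step is read from the last site by thresholding."""
    f = lambda x: 4.0 * x * (1.0 - x)
    # spread the scalar seed across the lattice sites
    lattice = [(seed + 0.61803 * i) % 1.0 or 0.5 for i in range(size)]
    bits = []
    for _ in range(n_bits):
        lattice = [(1 - eps) * f(lattice[i]) + eps * f(lattice[i - 1])
                   for i in range(size)]
        bits.append(1 if lattice[-1] > 0.5 else 0)
    return bits
```

The logistic map's invariant density is symmetric about 0.5, which is why simple thresholding is a plausible readout; the paper's conventional post-processing stage is what secures long periods and uniform statistics in finite-precision arithmetic.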
Nonlinear consolidation in randomly heterogeneous highly compressible aquitards
NASA Astrophysics Data System (ADS)
Zapata-Norberto, Berenice; Morales-Casique, Eric; Herrera, Graciela S.
2018-05-01
Severe land subsidence due to groundwater extraction may occur in multiaquifer systems where highly compressible aquitards are present. The highly compressible nature of the aquitards leads to nonlinear consolidation where the groundwater flow parameters are stress-dependent. The case is further complicated by the heterogeneity of the hydrogeologic and geotechnical properties of the aquitards. The effect of realistic vertical heterogeneity of hydrogeologic and geotechnical parameters on the consolidation of highly compressible aquitards is investigated by means of one-dimensional Monte Carlo numerical simulations where the lower boundary represents the effect of an instant drop in hydraulic head due to groundwater pumping. Two thousand realizations are generated for each of the following parameters: hydraulic conductivity (K), compression index (Cc), void ratio (e) and m (an empirical parameter relating hydraulic conductivity and void ratio). The correlation structure, the mean and the variance for each parameter were obtained from a literature review about field studies in the lacustrine sediments of Mexico City. The results indicate that among the parameters considered, random K has the largest effect on the ensemble average behavior of the system when compared to a nonlinear consolidation model with deterministic initial parameters. The deterministic solution underestimates the ensemble average of total settlement when initial K is random. In addition, random K leads to the largest variance (and therefore largest uncertainty) of total settlement, groundwater flux and time to reach steady-state conditions.
Robot path planning using a genetic algorithm
NASA Technical Reports Server (NTRS)
Cleghorn, Timothy F.; Baffes, Paul T.; Wang, Liu
1988-01-01
Robot path planning can refer either to a mobile vehicle such as a Mars Rover, or to an end effector on an arm moving through a cluttered workspace. In both instances there may exist many solutions, some of which are better than others, either in terms of distance traversed, energy expended, or joint angle or reach capabilities. A path planning program has been developed based upon a genetic algorithm. This program assumes global knowledge of the terrain or workspace, and provides a family of good paths between the initial and final points. Initially, a set of valid random paths is constructed. Successive generations of valid paths are obtained using one of several possible reproduction strategies similar to those found in biological communities. A fitness function is defined to describe the goodness of the path, in this case including length, slope, and obstacle avoidance considerations. It was found that with some reproduction strategies, the average value of the fitness function improved for successive generations, and that by saving the best paths of each generation, one could quite rapidly obtain a collection of good candidate solutions.
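The steps above (random initial paths, fitness-ranked reproduction, mutation, saving the best of each generation) can be sketched as a compact genetic search. Obstacle and slope terms are omitted here and all operators and parameters are illustrative:

```python
import random

def ga_paths(start, goal, fitness, n_pop=30, n_gen=50, n_way=5, seed=1):
    """Genetic search over paths (lists of 2-D waypoints) from start to
    goal. fitness(path) returns higher-is-better. Crossover swaps
    waypoint tails between two parents; mutation jitters one waypoint."""
    rng = random.Random(seed)

    def rand_path():
        return [start] + [(rng.uniform(0, 10), rng.uniform(0, 10))
                          for _ in range(n_way)] + [goal]

    pop = [rand_path() for _ in range(n_pop)]
    best = max(pop, key=fitness)
    for _ in range(n_gen):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:n_pop // 2]              # elitist selection
        children = []
        while len(children) < n_pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_way + 1)   # keep both endpoints fixed
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:              # mutate one interior waypoint
                i = rng.randrange(1, len(child) - 1)
                x, y = child[i]
                child[i] = (x + rng.gauss(0, 0.5), y + rng.gauss(0, 0.5))
            children.append(child)
        pop = parents + children
        best = max(best, max(pop, key=fitness), key=fitness)
    return best
```

With a length-penalizing fitness, successive generations shorten the best path toward the straight line, mirroring the improvement in average fitness the paper reports.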
Extracting random numbers from quantum tunnelling through a single diode.
Bernardo-Gavito, Ramón; Bagci, Ibrahim Ethem; Roberts, Jonathan; Sexton, James; Astbury, Benjamin; Shokeir, Hamzah; McGrath, Thomas; Noori, Yasir J; Woodhead, Christopher S; Missous, Mohamed; Roedig, Utz; Young, Robert J
2017-12-19
Random number generation is crucial in many aspects of everyday life, as online security and privacy depend ultimately on the quality of random numbers. Many current implementations are based on pseudo-random number generators, but information security requires true random numbers for sensitive applications like key generation in banking, defence or even social media. True random number generators are systems whose outputs cannot be determined, even if their internal structure and response history are known. Sources of quantum noise are thus ideal for this application due to their intrinsic uncertainty. In this work, we propose using resonant tunnelling diodes as practical true random number generators based on a quantum mechanical effect. The output of the proposed devices can be directly used as a random stream of bits or can be further distilled using randomness extraction algorithms, depending on the application.
Generating and using truly random quantum states in Mathematica
NASA Astrophysics Data System (ADS)
Miszczak, Jarosław Adam
2012-01-01
The problem of generating random quantum states is of great interest from the quantum information theory point of view. In this paper we present a package for the Mathematica computing system harnessing a specific piece of hardware, namely the Quantis quantum random number generator (QRNG), for investigating statistical properties of quantum states. The described package implements a number of functions for generating random states, which use the Quantis QRNG as a source of randomness. It also provides procedures which can be used in simulations not related directly to quantum information processing.
Program summary
Program title: TRQS
Catalogue identifier: AEKA_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKA_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 7924
No. of bytes in distributed program, including test data, etc.: 88 651
Distribution format: tar.gz
Programming language: Mathematica, C
Computer: Requires a Quantis quantum random number generator (QRNG, http://www.idquantique.com/true-random-number-generator/products-overview.html) and a recent version of Mathematica
Operating system: Any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit)
RAM: Case dependent
Classification: 4.15
Nature of problem: Generation of random density matrices.
Solution method: Use of a physical quantum random number generator.
Running time: Generating 100 random numbers takes about 1 second; generating 1000 random density matrices takes more than a minute.
Mechanism for generation of left isomerism in Ccdc40 mutant embryos
Sugrue, Kelsey F.
2017-01-01
Leftward fluid flow in the mouse node is generated by cilia and is critical for initiating asymmetry of the left-right axis. Coiled-coil domain containing-40 (Ccdc40) plays an evolutionarily conserved role in the assembly of motile cilia and establishment of the left-right axis. Approximately one-third of Ccdc40lnks mutant embryos display situs defects and here we investigate the underlying mechanism. Ccdc40lnks mutants show delayed induction of markers of the left-lateral plate mesoderm (L-LPM) including Lefty1, Lefty2 and Nodal. Consistent with defective cilia motility compromising fluid flow across the node, initiation of asymmetric perinodal Cerberus like-2 (Cerl2) expression is delayed and then randomized. This is followed by delayed and then randomized asymmetric Nodal expression around the node. We propose a model to explain how left isomerism arises in a proportion of Ccdc40lnks mutants. We postulate that with defective motile cilia, Cerl2 expression remains symmetric and Nodal is antagonized equally on both sides of the node. This effectively reduces Nodal activation bilaterally, leading to reduced and delayed activation of Nodal and its antagonists in the LPM. This model is further supported by the failure to establish Nodal expression in the left-LPM with reduced Nodal gene dosage in Ccdc40lnks/lnks;NodalLacZ/+ mutants causing a predominance of right not left isomerism. Together these results suggest a model where cilia generated fluid flow in the node functions to ensure robust Nodal activation and a timely left-sided developmental program in the LPM. PMID:28182636
NASA Astrophysics Data System (ADS)
Fei, Pengzhan; Cavicchi, Kevin
2011-03-01
A new ABA triblock copolymer, poly(styrene-block-methyl acrylate-random-octadecyl acrylate-block-styrene) (PS-b-PMA-r-PODA-b-PS), was synthesized by reversible addition-fragmentation chain transfer polymerization. The triblock copolymer can generate a three-dimensional, physically crosslinked network by self-assembly, where the glassy PS domains physically crosslink the midblock chains. The side chain crystallization of the poly(octadecyl acrylate) (PODA) side chains generates a second reversible network enabling shape memory properties. Shape memory tests by uniaxial deformation and recovery of molded dog-bone-shaped samples demonstrate that shape fixities above 96% and shape recoveries above 98% were obtained for extensional strains up to 300%. An outstanding advantage of this shape memory material is that it can be easily shaped and remolded by elevating the temperature to 140 °C, and after remolding the initial shape memory properties are fully recovered, eliminating the defects introduced by the previous deformation cycling.
PCA-LBG-based algorithms for VQ codebook generation
NASA Astrophysics Data System (ADS)
Tsai, Jinn-Tsong; Yang, Po-Yuan
2015-04-01
Vector quantisation (VQ) codebooks are generated by combining principal component analysis (PCA) algorithms with Linde-Buzo-Gray (LBG) algorithms. All training vectors are grouped according to the projected values of the principal components. The PCA-LBG-based algorithms include (1) PCA-LBG-Median, which selects the median vector of each group, (2) PCA-LBG-Centroid, which adopts the centroid vector of each group, and (3) PCA-LBG-Random, which randomly selects a vector from each group. The LBG algorithm then refines the codebook, starting from the initial codewords supplied by the PCA-based grouping. The PCA performs an orthogonal transformation to convert a set of potentially correlated variables into a set of variables that are not linearly correlated. Because the orthogonal transformation efficiently distinguishes test image vectors, the proposed PCA-LBG-based algorithms are expected to outperform conventional algorithms in designing VQ codebooks. The experimental results confirm that the proposed PCA-LBG-based algorithms indeed obtain better results than existing methods reported in the literature.
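As a rough illustration of the scheme above, the following sketch (an assumption of the general idea, not the authors' code; the grouping granularity, data, and iteration counts are made up) seeds a codebook with group centroids along the first principal component and then refines it with LBG (k-means style) passes:

```python
import numpy as np

def pca_lbg_centroid(train, n_codewords, n_iters=20):
    """Sketch of PCA-LBG-Centroid: group training vectors by their
    projection onto the first principal component, seed the codebook
    with each group's centroid, then refine with LBG passes."""
    X = train - train.mean(axis=0)
    # First principal component via eigendecomposition of the covariance.
    _, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
    pc1 = vecs[:, -1]
    proj = X @ pc1
    # Split vectors into n_codewords equally sized groups along pc1.
    order = np.argsort(proj)
    groups = np.array_split(order, n_codewords)
    codebook = np.array([train[g].mean(axis=0) for g in groups])
    # LBG refinement: assign vectors to the nearest codeword, recompute centroids.
    for _ in range(n_iters):
        d = ((train[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        nearest = d.argmin(axis=1)
        for k in range(n_codewords):
            members = train[nearest == k]
            if len(members):
                codebook[k] = members.mean(axis=0)
    return codebook

train = np.random.default_rng(1).normal(size=(400, 8))
cb = pca_lbg_centroid(train, n_codewords=16)
```

The PCA-LBG-Median and PCA-LBG-Random variants differ only in how the seed codeword is picked from each group.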
Quantum random number generation for loophole-free Bell tests
NASA Astrophysics Data System (ADS)
Mitchell, Morgan; Abellan, Carlos; Amaya, Waldimar
2015-05-01
We describe the generation of quantum random numbers at multi-Gbps rates, combined with real-time randomness extraction, to give very high purity random numbers based on quantum events at most tens of ns in the past. The system satisfies the stringent requirements of quantum non-locality tests that aim to close the timing loophole. We describe the generation mechanism using spontaneous-emission-driven phase diffusion in a semiconductor laser, digitization, and extraction by parity calculation using multi-GHz logic chips. We pay special attention to experimental proof of the quality of the random numbers and to the analysis of the randomness extraction. In contrast to widely used models of randomness generators in the computer science literature, we argue that randomness produced by spontaneous emission can be extracted from a single source.
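The parity-extraction step can be illustrated with a toy software sketch (hypothetical block size and bias; the real system does this in multi-GHz logic): by the piling-up lemma, XOR-ing a block of slightly biased bits shrinks the bias exponentially in the block size.

```python
import random

def parity_extract(raw_bits, block=8):
    """Each output bit is the XOR (parity) of a block of raw bits.
    If each raw bit carries bias eps, the parity's bias is of order
    2**(block-1) * eps**block, i.e. exponentially suppressed."""
    out = []
    for i in range(0, len(raw_bits) - block + 1, block):
        p = 0
        for b in raw_bits[i:i + block]:
            p ^= b
        out.append(p)
    return out

# Simulated biased raw source: P(1) = 0.6 instead of 0.5.
rng = random.Random(42)
raw = [1 if rng.random() < 0.6 else 0 for _ in range(80000)]
extracted = parity_extract(raw, block=8)
bias = abs(sum(extracted) / len(extracted) - 0.5)
```

With a bias of 0.1 per raw bit and blocks of 8, the residual parity bias is of order 10^-6, far below what a sample of this size can resolve.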
Some design issues of strata-matched non-randomized studies with survival outcomes.
Mazumdar, Madhu; Tu, Donsheng; Zhou, Xi Kathy
2006-12-15
Non-randomized studies for the evaluation of a medical intervention are useful for quantitative hypothesis generation before the initiation of a randomized trial and also when randomized clinical trials are difficult to conduct. A strata-matched non-randomized design is often utilized where subjects treated by a test intervention are matched to a fixed number of subjects treated by a standard intervention within covariate based strata. In this paper, we consider the issue of sample size calculation for this design. Based on the asymptotic formula for the power of a stratified log-rank test, we derive a formula to calculate the minimum number of subjects in the test intervention group that is required to detect a given relative risk between the test and standard interventions. When this minimum number of subjects in the test intervention group is available, an equation is also derived to find the multiple that determines the number of subjects in the standard intervention group within each stratum. The methodology developed is applied to two illustrative examples in gastric cancer and sarcoma.
A random forest algorithm for nowcasting of intense precipitation events
NASA Astrophysics Data System (ADS)
Das, Saurabh; Chakraborty, Rohit; Maitra, Animesh
2017-09-01
Automatic nowcasting of convective initiation and thunderstorms has potential applications in several sectors including aviation planning and disaster management. In this paper, a random forest based machine learning algorithm is tested for nowcasting of convective rain with a ground-based radiometer. Brightness temperatures measured at 14 frequencies (7 frequencies in the 22-31 GHz band and 7 frequencies in the 51-58 GHz band) are utilized as the inputs of the model. The lower frequency band is associated with water vapor absorption, whereas the upper frequency band relates to oxygen absorption, and hence they provide information on the temperature and humidity of the atmosphere. The synthetic minority over-sampling technique is used to balance the data set, and 10-fold cross-validation is used to assess the performance of the model. Results indicate that the random forest algorithm with fixed alarm generation times of 30 min and 60 min performs quite well (probability of detection of all types of weather conditions ∼90%) with low false alarms. It is, however, also observed that reducing the alarm generation time improves the threat score significantly and further decreases false alarms. The proposed model is found to be very sensitive to boundary layer instability, as indicated by the variable importance measure. The study shows the suitability of a random forest algorithm for nowcasting applications utilizing a large number of input parameters from diverse sources, and the approach can be utilized in other forecasting problems.
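The class-balancing step can be sketched as follows (a generic SMOTE-style interpolation in pure Python; the paper's actual settings and the radiometer data are not reproduced here). Each synthetic minority sample is an interpolation between a minority sample and one of its k nearest minority neighbours:

```python
import random

def smote_oversample(minority, n_new, k=5, seed=0):
    """Minimal sketch of synthetic minority over-sampling: new samples
    are drawn on the line segments joining minority samples to their
    k nearest minority neighbours. Parameters here are illustrative."""
    rng = random.Random(seed)

    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    synthetic = []
    for _ in range(n_new):
        a = rng.choice(minority)
        neighbours = sorted((m for m in minority if m is not a),
                            key=lambda m: dist2(a, m))[:k]
        b = rng.choice(neighbours)
        t = rng.random()
        synthetic.append(tuple(x + t * (y - x) for x, y in zip(a, b)))
    return synthetic

# Toy 2-D minority class (e.g. the rare "convective rain" events).
minority = [(1.0 + 0.1 * i, 2.0 - 0.1 * i) for i in range(10)]
new_pts = smote_oversample(minority, n_new=30)
```

The balanced set (original plus synthetic samples) would then be fed to the random forest classifier.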
Takeuchi, Hiroshi
2018-05-08
Since searching for the global minimum on the potential energy surface of a cluster is very difficult, many geometry optimization methods have been proposed, in which initial geometries are randomly generated and subsequently improved with different algorithms. In this study, a size-guided multi-seed heuristic method is developed and applied to benzene clusters. It produces initial configurations of the cluster with n molecules from the lowest-energy configurations of the cluster with n - 1 molecules (seeds). The initial geometries are further optimized with the geometrical perturbations previously used for molecular clusters. These steps are repeated until the size n reaches a predefined value. The method locates putative global minima of benzene clusters with up to 65 molecules. The performance of the method is discussed in terms of the computational cost, the rates of locating the global minima, and the energies of the initial geometries. © 2018 Wiley Periodicals, Inc.
An On-Demand Optical Quantum Random Number Generator with In-Future Action and Ultra-Fast Response
Stipčević, Mario; Ursin, Rupert
2015-01-01
Random numbers are essential for our modern information-based society, e.g. in cryptography. Unlike frequently used pseudo-random generators, physical random number generators do not depend on complex algorithms but rather on a physical process to provide true randomness. Quantum random number generators (QRNG) rely on a process which can be described only by a probabilistic theory, even in principle. Here we present a conceptually simple implementation, which offers a 100% efficiency of producing a random bit upon request and simultaneously exhibits an ultra low latency. A careful technical and statistical analysis demonstrates its robustness against imperfections of the actual implemented technology and enables quick estimation of the randomness of very long sequences. Generated random numbers pass standard statistical tests without any post-processing. The setup described, as well as the theory presented here, demonstrate the maturity and overall understanding of the technology. PMID:26057576
NASA Astrophysics Data System (ADS)
Miszczak, Jarosław Adam
2013-01-01
The presented package for the Mathematica computing system allows the harnessing of quantum random number generators (QRNG) for investigating the statistical properties of quantum states. The described package implements a number of functions for generating random states. The new version of the package adds the ability to use the on-line quantum random number generator service and implements new functions for retrieving lists of random numbers. Thanks to the introduced improvements, the new version provides faster access to high-quality sources of random numbers and can be used in simulations requiring large amount of random data. New version program summaryProgram title: TRQS Catalogue identifier: AEKA_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEKA_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 18 134 No. of bytes in distributed program, including test data, etc.: 2 520 49 Distribution format: tar.gz Programming language: Mathematica, C. Computer: Any supporting Mathematica in version 7 or higher. Operating system: Any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit). RAM: Case-dependent Supplementary material: Fig. 1 mentioned below can be downloaded. Classification: 4.15. External routines: Quantis software library (http://www.idquantique.com/support/quantis-trng.html) Catalogue identifier of previous version: AEKA_v1_0 Journal reference of previous version: Comput. Phys. Comm. 183(2012)118 Does the new version supersede the previous version?: Yes Nature of problem: Generation of random density matrices and utilization of high-quality random numbers for the purpose of computer simulation. 
Solution method: Use of a physical quantum random number generator and an on-line service providing access to a source of true random numbers generated by a quantum random number generator. Reasons for new version: Added support for the high-speed on-line quantum random number generator and improved methods for retrieving lists of random numbers. Summary of revisions: The presented version provides two significant improvements. The first one is the ability to use the on-line Quantum Random Number Generation service developed by PicoQuant GmbH and the Nano-Optics group at the Department of Physics of Humboldt University. The on-line service supported in version 2.0 of the TRQS package provides faster access to true randomness sources constructed using the laws of quantum physics. The service is freely available at https://qrng.physik.hu-berlin.de/. The use of this service allows using the presented package without the need for a physical quantum random number generator. The second improvement introduced in this version is the ability to retrieve arrays of random data directly from the source used. This increases the speed of the random number generation, especially in the case of the on-line service, where it reduces the time necessary to establish the connection. Thanks to the speed improvement of the presented version, the package can now be used in simulations requiring larger amounts of random data. Moreover, the functions for generating random numbers provided by the current version of the package more closely follow the pattern of the functions for generating pseudo-random numbers provided in Mathematica. Additional comments: Speed comparison: The implementation of the support for the QRNG on-line service provides a noticeable improvement in the speed of random number generation. For samples of real numbers of size 10^1, 10^2, …, 10^7, the times required to generate these samples using the Quantis USB device and the QRNG service are compared in Fig. 1.
The presented results show that the use of the on-line service provides faster access to random numbers. One should note, however, that the speed gain can increase or decrease depending on the connection speed between the computer and the server providing random numbers. Running time: Depends on the source of randomness used and the amount of random data used in the experiment. References: [1] M. Wahl, M. Leifgen, M. Berlin, T. Röhlicke, H.-J. Rahn, O. Benson, An ultrafast quantum random number generator with provably bounded output bias based on photon arrival time measurements, Applied Physics Letters 98, 171105 (2011). http://dx.doi.org/10.1063/1.3578456.
Computer-generated reminders and quality of pediatric HIV care in a resource-limited setting.
Were, Martin C; Nyandiko, Winstone M; Huang, Kristin T L; Slaven, James E; Shen, Changyu; Tierney, William M; Vreeman, Rachel C
2013-03-01
To evaluate the impact of clinician-targeted computer-generated reminders on compliance with HIV care guidelines in a resource-limited setting. We conducted this randomized, controlled trial in an HIV referral clinic in Kenya caring for HIV-infected and HIV-exposed children (<14 years of age). For children randomly assigned to the intervention group, printed patient summaries containing computer-generated patient-specific reminders for overdue care recommendations were provided to the clinician at the time of the child's clinic visit. For children in the control group, clinicians received the summaries, but no computer-generated reminders. We compared differences between the intervention and control groups in completion of overdue tasks, including HIV testing, laboratory monitoring, initiating antiretroviral therapy, and making referrals. During the 5-month study period, 1611 patients (49% female, 70% HIV-infected) were eligible to receive at least 1 computer-generated reminder (ie, had an overdue clinical task). We observed a fourfold increase in the completion of overdue clinical tasks when reminders were made available to providers over the course of the study (68% intervention vs 18% control, P < .001). Orders also occurred earlier for the intervention group (77 days, SD 2.4 days) compared with the control group (104 days, SD 1.2 days) (P < .001). Response rates to reminders varied significantly by type of reminder and between clinicians. Clinician-targeted, computer-generated clinical reminders are associated with a significant increase in completion of overdue clinical tasks for HIV-infected and exposed children in a resource-limited setting.
The RANDOM computer program: A linear congruential random number generator
NASA Technical Reports Server (NTRS)
Miles, R. F., Jr.
1986-01-01
The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
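An LCG of the form the program tests can be sketched in a few lines (the constants below are the classic ANSI C ones, chosen purely for illustration; selecting good parameters is exactly what RANCYCLE and ARITH assist with):

```python
def lcg(seed, a=1103515245, c=12345, m=2**31):
    """Minimal linear congruential generator:
    x_{n+1} = (a * x_n + c) mod m."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(42)
sample = [next(gen) for _ in range(5)]
```

The quality of the stream depends entirely on the choice of a, c and m (full period, spectral properties), which is why parameter-selection tools accompany the generator itself.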
Rational group decision making: A random field Ising model at T = 0
NASA Astrophysics Data System (ADS)
Galam, Serge
1997-02-01
A modified version of a finite random field Ising ferromagnetic model in an external magnetic field at zero temperature is presented to describe group decision making. Fields may have a non-zero average. A postulate of minimum inter-individual conflict is assumed. Interactions then produce a group polarization along one particular choice, which is, however, randomly selected. A small external social pressure is shown to have a drastic effect on the polarization. Individual biases related to personal backgrounds, cultural values and past experiences are introduced via quenched local competing fields. They are shown to be instrumental in generating a larger spectrum of collective new choices beyond the initial ones. In particular, compromise is found to result from the existence of competing individual biases. Conflict is shown to weaken group polarization. The model yields new psychosociological insights about consensus and compromise in groups.
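A minimal zero-temperature sketch of such a model (a fully connected toy with assumed parameter values, not the paper's exact setup): each "individual" spin repeatedly aligns with its local field, combining peer coupling, external social pressure, and a quenched personal bias, until no one wants to flip.

```python
import random

def rfim_zero_t(n, J=1.0, h_ext=0.0, sigma=0.5, seed=0):
    """Zero-temperature dynamics of a mean-field random-field Ising model:
    each spin aligns with its local field until a stable state is reached.
    J: coupling (peer pressure), h_ext: social pressure, sigma: bias spread."""
    rng = random.Random(seed)
    h = [rng.gauss(0.0, sigma) for _ in range(n)]   # quenched individual biases
    s = [rng.choice([-1, 1]) for _ in range(n)]     # initial opinions
    changed = True
    while changed:
        changed = False
        for i in range(n):
            # Mean-field coupling to all other individuals.
            local = J * (sum(s) - s[i]) / (n - 1) + h_ext + h[i]
            new = 1 if local >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
    return s

opinions = rfim_zero_t(50, h_ext=0.2)
```

Each flip lowers the total energy, so the sequential dynamics terminates in a local minimum; with coupling dominating the bias spread, the final state is strongly polarized along one choice.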
Key Aspects of Nucleic Acid Library Design for in Vitro Selection
Vorobyeva, Maria A.; Davydova, Anna S.; Vorobjev, Pavel E.; Pyshnyi, Dmitrii V.; Venyaminova, Alya G.
2018-01-01
Nucleic acid aptamers capable of selectively recognizing their target molecules have nowadays been established as powerful and tunable tools for biospecific applications, be it therapeutics, drug delivery systems or biosensors. It is now generally acknowledged that in vitro selection enables one to generate aptamers to almost any target of interest. However, the success of selection and the affinity of the resulting aptamers depend to a large extent on the nature and design of an initial random nucleic acid library. In this review, we summarize and discuss the most important features of the design of nucleic acid libraries for in vitro selection such as the nature of the library (DNA, RNA or modified nucleotides), the length of a randomized region and the presence of fixed sequences. We also compare and contrast different randomization strategies and consider computer methods of library design and some other aspects. PMID:29401748
Nierenberg, Andrew A.; Sylvia, Louisa G.; Leon, Andrew C.; Reilly-Harrington, Noreen; Shesler, Leah W.; McElroy, Susan L.; Friedman, Edward S.; Thase, Michael E.; Shelton, Richard C.; Bowden, Charles; Tohen, Mauricio; Singh, Vivek; Deckersbach, Thilo; Ketter, Terence; Kocsis, James; McInnis, Melvin G.; Schoenfeld, David; Bobo, William V.; Calabrese, Joseph R.
2015-01-01
Background: Classic and second generation antipsychotic mood stabilizers are recommended for treatment of bipolar disorder, yet there are no randomized comparative effectiveness studies that have examined the “real-world” advantages and disadvantages of these medications. Purpose: We describe the strategic decisions in the design of the Clinical and Health Outcomes Initiative in Comparative Effectiveness for Bipolar Disorder (Bipolar CHOICE). This paper outlines the key issues and solutions the investigators faced in designing a clinical trial that would maximize generalizability and inform real-world clinical treatment of bipolar disorder. Methods: Bipolar CHOICE was a 6-month, multi-site, prospective, randomized clinical trial of outpatients with bipolar disorder. This study compares the effectiveness of quetiapine versus lithium, each with adjunctive personalized treatments. The co-primary outcomes selected are the overall benefits and harms of the study medications (as measured by the Clinical Global Impression-Efficacy Index) and the Necessary Clinical Adjustments (a measure of the number of medication changes). Secondary outcomes are continuous measures of mood, the Framingham General Cardiovascular Risk Score and the Longitudinal Interval Follow up Evaluation Range of Impaired Functioning Tool. Results: The final study design consisted of a single-blind, randomized comparative effectiveness trial of quetiapine versus lithium, plus adjunctive personalized treatment (APT), across ten sites. Other important study considerations included limited exclusion criteria to maximize generalizability, flexible dosing of APT medications to mimic real-world treatment, and an intent-to-treat analysis plan. 482 participants were randomized to the study and 364 completed. Limitations: The potential limitations of the study include the heterogeneity of APT, selection of study medications, lack of a placebo-control group, and participants’ ability to pay for study medications. 
Conclusion: We expect that this study will inform our understanding of the benefits and harms of lithium, a classic mood stabilizer, compared to quetiapine, a second generation antipsychotic with broad-spectrum activity in bipolar disorder and will provide an example of a well-designed and well-conducted randomized comparative effectiveness clinical trial. PMID:24346608
Nierenberg, Andrew A; Sylvia, Louisa G; Leon, Andrew C; Reilly-Harrington, Noreen A; Shesler, Leah W; McElroy, Susan L; Friedman, Edward S; Thase, Michael E; Shelton, Richard C; Bowden, Charles L; Tohen, Mauricio; Singh, Vivek; Deckersbach, Thilo; Ketter, Terence A; Kocsis, James H; McInnis, Melvin G; Schoenfeld, David; Bobo, William V; Calabrese, Joseph R
2014-02-01
Classic and second-generation antipsychotic mood stabilizers are recommended for treatment of bipolar disorder, yet there are no randomized comparative effectiveness studies that have examined the 'real-world' advantages and disadvantages of these medications. We describe the strategic decisions in the design of the Clinical and Health Outcomes Initiative in Comparative Effectiveness for Bipolar Disorder (Bipolar CHOICE). This article outlines the key issues and solutions the investigators faced in designing a clinical trial that would maximize generalizability and inform real-world clinical treatment of bipolar disorder. Bipolar CHOICE was a 6-month, multi-site, prospective, randomized clinical trial of outpatients with bipolar disorder. This study compares the effectiveness of quetiapine versus lithium, each with adjunctive personalized treatments (APTs). The co-primary outcomes selected are the overall benefits and harms of the study medications (as measured by the Clinical Global Impression-Efficacy Index) and the Necessary Clinical Adjustments (a measure of the number of medication changes). Secondary outcomes are continuous measures of mood, the Framingham General Cardiovascular Risk Score, and the Longitudinal Interval Follow up Evaluation Range of Impaired Functioning Tool (LIFE-RIFT). The final study design consisted of a single-blind, randomized comparative effectiveness trial of quetiapine versus lithium, plus APT, across 10 sites. Other important study considerations included limited exclusion criteria to maximize generalizability, flexible dosing of APT medications to mimic real-world treatment, and an intent-to-treat analysis plan. In all, 482 participants were randomized to the study, and 364 completed the study. The potential limitations of the study include the heterogeneity of APT, selection of study medications, lack of a placebo-control group, and participants' ability to pay for study medications. 
We expect that this study will inform our understanding of the benefits and harms of lithium, a classic mood stabilizer, compared to quetiapine, a second-generation antipsychotic with broad-spectrum activity in bipolar disorder, and will provide an example of a well-designed and well-conducted randomized comparative effectiveness clinical trial.
Generation of physical random numbers by using homodyne detection
NASA Astrophysics Data System (ADS)
Hirakawa, Kodai; Oya, Shota; Oguri, Yusuke; Ichikawa, Tsubasa; Eto, Yujiro; Hirano, Takuya; Tsurumaru, Toyohiro
2016-10-01
Physical random numbers generated by quantum measurements are, in principle, impossible to predict. We have demonstrated the generation of physical random numbers by using a high-speed balanced photodetector to measure the quadrature amplitudes of vacuum states. Using this method, random numbers were generated at 500 Mbps, which is more than one order of magnitude faster than previously reported [Gabriel et al., Nature Photonics 4, 711-715 (2010)]. The Crush test battery of the TestU01 suite consists of 31 tests in 144 variations, and we used it to statistically analyze these numbers. The generated random numbers passed 14 of the 31 tests. To improve the randomness, we performed a hash operation, in which each block of random numbers was multiplied by a random Toeplitz matrix; the resulting numbers passed all of the tests in the TestU01 Crush battery.
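Toeplitz-matrix hashing can be sketched as a GF(2) matrix-vector product (toy sizes below; the experiment uses far longer blocks). Because a Toeplitz matrix is constant along each diagonal, it is fully defined by its first row and first column, so only n + m - 1 random bits are needed to hash n raw bits into m output bits:

```python
import random

def toeplitz_hash(raw_bits, out_len, seed=7):
    """Multiply a random Toeplitz matrix (over GF(2)) with the raw-bit
    vector. Entry T[i][j] depends only on i - j, so the whole matrix is
    determined by the diagonal values diag[0 .. n + out_len - 2]."""
    n = len(raw_bits)
    rng = random.Random(seed)
    diag = [rng.randint(0, 1) for _ in range(n + out_len - 1)]
    out = []
    for i in range(out_len):
        bit = 0
        for j in range(n):
            bit ^= diag[i - j + n - 1] & raw_bits[j]
        out.append(bit)
    return out

raw = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
hashed = toeplitz_hash(raw, out_len=4)
```

A quick sanity check of the construction is its linearity over GF(2): with the matrix fixed by the seed, hashing the XOR of two inputs equals the XOR of their hashes.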
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu
Quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness -- coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness of the device and the random number generation speed.
Analysis of Uniform Random Numbers Generated by RANDU and URN Using Ten Different Seeds.
The statistical properties of the numbers generated by two uniform random number generators, RANDU and URN, each using ten different seeds, are analyzed. The testing is performed on a sequence of 50,000 numbers generated by each uniform random number generator using each of the ten seeds. (Author)
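RANDU itself is a good illustration of why such testing matters. Its multiplier 65539 = 2^16 + 3 satisfies a^2 = 6a - 9 (mod 2^31), which forces the exact relation x_{n+2} = 6x_{n+1} - 9x_n (mod 2^31) on every output triple, confining consecutive triples to a small number of planes in the unit cube:

```python
def randu(seed):
    """The infamous RANDU generator: x_{n+1} = 65539 * x_n mod 2**31.
    Despite passing naive one-dimensional frequency tests, its triples
    are strongly correlated, as verified below."""
    x = seed
    while True:
        x = (65539 * x) % 2**31
        yield x

gen = randu(1)
xs = [next(gen) for _ in range(100)]
# Every consecutive triple satisfies x[n+2] = 6*x[n+1] - 9*x[n] (mod 2**31).
planar = all((xs[i + 2] - 6 * xs[i + 1] + 9 * xs[i]) % 2**31 == 0
             for i in range(98))
```

Seed-by-seed statistical batteries of the kind described above are designed to expose exactly this sort of structural defect.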
MATIN: A Random Network Coding Based Framework for High Quality Peer-to-Peer Live Video Streaming
Barekatain, Behrang; Khezrimotlagh, Dariush; Aizaini Maarof, Mohd; Ghaeini, Hamid Reza; Salleh, Shaharuddin; Quintana, Alfonso Ariza; Akbari, Behzad; Cabrera, Alicia Triviño
2013-01-01
In recent years, Random Network Coding (RNC) has emerged as a promising solution for efficient Peer-to-Peer (P2P) video multicasting over the Internet, largely because RNC noticeably increases the error resiliency and throughput of the network. However, the high transmission overhead arising from sending a large coefficients vector as a header has been the most important challenge of RNC. Moreover, due to employing the Gauss-Jordan elimination method, considerable computational complexity can be imposed on peers in decoding the encoded blocks and checking linear dependency among the coefficients vectors. In order to address these challenges, this study introduces MATIN, a random network coding based framework for efficient P2P video streaming. MATIN includes a novel coefficients matrix generation method that guarantees no linear dependency in the generated coefficients matrix. Using the proposed framework, each peer encapsulates one coefficient entry instead of n into the generated encoded packet, which results in very low transmission overhead. It is also possible to obtain the inverted coefficients matrix using a small number of simple arithmetic operations. In this regard, peers incur very low computational complexity. As a result, MATIN permits random network coding to be more efficient in P2P video streaming systems. The results obtained from simulation using OMNET++ show that it substantially outperforms RNC with Gauss-Jordan elimination by providing better video quality on peers in terms of four important performance metrics: video distortion, dependency distortion, end-to-end delay and initial startup delay. PMID:23940530
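The baseline that MATIN improves on, RNC with Gauss-Jordan decoding, can be sketched over GF(2) (bitwise XOR coding for simplicity; practical systems typically work over GF(2^8)). Each coded packet carries its coefficients vector, and the receiver decodes once it has gathered k linearly independent packets:

```python
import random

def rnc_encode(blocks, n_coded, seed=3):
    """Each coded packet is a random XOR combination of the original
    blocks, shipped together with its coefficients vector (the header
    overhead MATIN aims to shrink)."""
    rng = random.Random(seed)
    k = len(blocks)
    coded = []
    for _ in range(n_coded):
        coeffs = [rng.randint(0, 1) for _ in range(k)]
        payload = 0
        for c, b in zip(coeffs, blocks):
            if c:
                payload ^= b
        coded.append((coeffs, payload))
    return coded

def rnc_decode(coded, k):
    """Gauss-Jordan elimination over GF(2): the costly step whose removal
    motivates MATIN's invertible-by-construction coefficient matrices."""
    rows = [[list(c), p] for c, p in coded]
    used = []
    for col in range(k):
        pivot = None
        for r in rows:
            if r[0][col] == 1 and not any(r is u for u in used):
                pivot = r
                break
        if pivot is None:
            return None  # not enough independent combinations yet
        for r in rows:
            if r is not pivot and r[0][col] == 1:
                r[0] = [a ^ b for a, b in zip(r[0], pivot[0])]
                r[1] ^= pivot[1]
        used.append(pivot)
    return [u[1] for u in used]

blocks = [0b10110010, 0b01101001, 0b11110000, 0b00011111]
seed, decoded = 3, None
while decoded is None:  # a real receiver would simply wait for more packets
    coded = rnc_encode(blocks, n_coded=10, seed=seed)
    decoded = rnc_decode(coded, k=len(blocks))
    seed += 1
```

Random coefficient matrices occasionally lack full rank, which is why the loop above may need another batch; a generation method guaranteeing linear independence removes both that retry and the elimination cost.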
MatchingLand, geospatial data testbed for the assessment of matching methods.
Xavier, Emerson M A; Ariza-López, Francisco J; Ureña-Cámara, Manuel A
2017-12-05
This article presents datasets prepared with the aim of helping the evaluation of geospatial matching methods for vector data. These datasets were built up from mapping data produced by official Spanish mapping agencies. The testbed supplied encompasses the three geometry types: point, line and area. Initial datasets were submitted to geometric transformations in order to generate synthetic datasets. These transformations represent factors that might influence the performance of geospatial matching methods, like the morphology of linear or areal features, systematic transformations, and random disturbance over initial data. We call our 11 GiB benchmark data 'MatchingLand' and we hope it can be useful for the geographic information science research community.
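The synthetic-dataset generation can be sketched as a systematic transform plus random disturbance of each vertex (hypothetical parameter values, not MatchingLand's actual settings):

```python
import math
import random

def perturb(points, angle=0.01, scale=1.001, dx=5.0, dy=-3.0, noise=0.5, seed=0):
    """Derive a synthetic test dataset from an initial one: a similarity
    transform (rotation, scale, shift) models systematic error, and
    Gaussian jitter models random disturbance. All values illustrative."""
    rng = random.Random(seed)
    c, s = math.cos(angle), math.sin(angle)
    out = []
    for x, y in points:
        xt = scale * (c * x - s * y) + dx + rng.gauss(0, noise)
        yt = scale * (s * x + c * y) + dy + rng.gauss(0, noise)
        out.append((xt, yt))
    return out

pts = [(100.0, 200.0), (150.0, 180.0), (130.0, 260.0)]
synthetic = perturb(pts)
```

A matching method is then evaluated by how reliably it re-links each perturbed feature to its source feature as the transform and noise magnitudes grow.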
Search of exploration opportunity for near earth objects based on analytical gradients
NASA Astrophysics Data System (ADS)
Ren, Y.; Cui, P. Y.; Luan, E. J.
2008-01-01
The problem of searching for exploration opportunities for near-Earth objects is investigated. For rendezvous missions, the analytical gradients of the performance index with respect to the free parameters are derived by combining the calculus of variations with the theory of the state-transition matrix. Some initial guesses are then generated randomly in the search space, and the performance index is optimized from these initial guesses under the guidance of the analytical gradients. This method not only keeps the global-search property of the traditional method, but also avoids the blindness of the traditional exploration opportunity search; hence, the computing speed can be increased greatly. Furthermore, by using this method, the search precision can be controlled effectively.
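The strategy reduces to gradient-guided multistart optimization, which can be sketched on a toy objective (a convex stand-in so the answer is checkable; the real performance index and its variationally derived gradients are mission-specific):

```python
import random

def grad_descent(grad, x0, lr=0.1, steps=200):
    """Plain gradient descent from one initial guess."""
    x = list(x0)
    for _ in range(steps):
        g = grad(x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

def multistart_search(f, grad, bounds, n_starts=20, seed=0):
    """Random initial guesses give global coverage of the search space;
    analytical gradients give fast local convergence from each guess."""
    rng = random.Random(seed)
    best, best_val = None, float("inf")
    for _ in range(n_starts):
        x0 = [rng.uniform(lo, hi) for lo, hi in bounds]
        x = grad_descent(grad, x0)
        val = f(x)
        if val < best_val:
            best, best_val = x, val
    return best, best_val

# Toy performance index with minimum at (1, -2).
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
grad = lambda x: [2.0 * (x[0] - 1.0), 2.0 * (x[1] + 2.0)]
best, best_val = multistart_search(f, grad, bounds=[(-10, 10), (-10, 10)])
```

The paper's contribution is precisely that the gradient is available analytically (via the state-transition matrix) rather than by finite differences, which is what removes the "blindness" of a purely random search.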
NASA Astrophysics Data System (ADS)
Ferguson, Kevin; Sewell, Everest; Krivets, Vitaliy; Greenough, Jeffrey; Jacobs, Jeffrey
2016-11-01
Initial conditions for the Richtmyer-Meshkov instability (RMI) are measured in three dimensions in the University of Arizona Vertical Shock Tube using a moving magnet galvanometer system. The resulting volumetric data are used as initial conditions for the simulation of the RMI using ARES at Lawrence Livermore National Laboratory (LLNL). The heavy gas is sulfur hexafluoride (SF6), and the light gas is air. The perturbations are generated by harmonically oscillating the gases vertically using two loudspeakers mounted to the shock tube, which causes Faraday resonance, producing a random short-wavelength perturbation on the interface. Planar Mie scattering is used to illuminate the flow field through the addition of propylene glycol particles seeded in the heavy gas. An M=1.2 shock impulsively accelerates the interface, initiating instability growth. Images of the initial condition and instability growth are captured at a rate of 6 kHz using high speed cameras. Comparisons between experimental and simulation results, mixing diagnostics, and mixing zone growth are presented.
Caustics, counting maps and semi-classical asymptotics
NASA Astrophysics Data System (ADS)
Ercolani, N. M.
2011-02-01
This paper develops a deeper understanding of the structure and combinatorial significance of the partition function for Hermitian random matrices. The coefficients of the large N expansion of the logarithm of this partition function, also known as the genus expansion (and its derivatives), are generating functions for a variety of graphical enumeration problems. The main results are to prove that these generating functions are, in fact, specific rational functions of a distinguished irrational (algebraic) function, z0(t). This distinguished function is itself the generating function for the Catalan numbers (or generalized Catalan numbers, depending on the choice of weight of the parameter t). It is also a solution of the inviscid Burgers equation for certain initial data. The shock formation, or caustic, of the Burgers characteristic solution is directly related to the poles of the rational forms of the generating functions. As an intriguing application, one gains new insights into the relation between certain derivatives of the genus expansion, in a double-scaling limit, and the asymptotic expansion of the first Painlevé transcendent. This provides a precise expression of the Painlevé asymptotic coefficients directly in terms of the coefficients of the partial fractions expansion of the rational form of the generating functions established in this paper. Moreover, these insights point towards a more general program relating the first Painlevé hierarchy to the higher order structure of the double-scaling limit through the specific rational structure of generating functions in the genus expansion. The paper closes with a discussion of the relation of this work to recent developments in understanding the asymptotics of graphical enumeration. 
As a by-product, these results also yield new information about the asymptotics of recurrence coefficients for orthogonal polynomials with respect to exponential weights, the calculation of correlation functions for certain tied random walks on a 1D lattice, and the large time asymptotics of random matrix partition functions.
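For reference, with the standard weight normalization (the paper's z0(t) may differ by a rescaling of the parameter t), the Catalan generating function is the algebraic function

```latex
z_0(t) \;=\; \sum_{n\ge 0} C_n\, t^n \;=\; \frac{1-\sqrt{1-4t}}{2t},
\qquad
z_0(t) \;=\; 1 + t\, z_0(t)^2,
\qquad
C_n \;=\; \frac{1}{n+1}\binom{2n}{n},
```

where the quadratic relation in the middle is the source of its algebraic (irrational) character, and the square-root branch point at t = 1/4 is where the Burgers caustic forms in this normalization.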
Li, Dongfang; Lu, Zhaojun; Zou, Xuecheng; Liu, Zhenglin
2015-01-01
Random number generators (RNGs) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard National Institute of Standards and Technology (NIST) randomness tests and is resilient to a wide range of security attacks. PMID:26501283
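The conditioning algorithm itself is not described in the abstract; as a minimal illustration of how biased raw bits (such as SRAM start-up noise) can be debiased before use as seeds, here is the classic von Neumann corrector. This is an assumed stand-in for exposition, not PUFKEY's actual conditioning algorithm:

```python
def von_neumann(bits):
    """Debias a raw bit sequence: map pairs 01 -> 0, 10 -> 1, drop 00 and 11.

    Output bits are unbiased whenever input bits are independent with a
    constant (possibly unknown) bias, at the cost of a reduced output rate.
    """
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out
```

Note the throughput cost: on average at most a quarter of the raw bits survive, which is one reason real designs (PUFKEY included) pair a seed extractor with a faster downstream RNG.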
A rule-based software test data generator
NASA Technical Reports Server (NTRS)
Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II
1991-01-01
Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests are performed, showing that even the primitive rule-based test data generation prototype is significantly better than random data generation. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.
Independent tasks scheduling in cloud computing via improved estimation of distribution algorithm
NASA Astrophysics Data System (ADS)
Sun, Haisheng; Xu, Rui; Chen, Huaping
2018-04-01
To minimize makespan for scheduling independent tasks in cloud computing, an improved estimation of distribution algorithm (IEDA) is proposed to tackle the investigated problem in this paper. Because the problem is a multi-dimensional discrete one, an improved population-based incremental learning (PBIL) algorithm is applied, in which the probability parameter for each component is independent of the other components. To improve the performance of PBIL, on the one hand, an integer encoding scheme is used and the probability calculation of PBIL is improved by using the task average processing time; on the other hand, an effective adaptive learning rate function related to the number of iterations is constructed to trade off the exploration and exploitation of IEDA. In addition, enhanced Max-Min and Min-Min algorithms are introduced to form two initial individuals. In the proposed IEDA, an improved genetic algorithm (IGA) is applied to generate part of the initial population by evolving these two initial individuals, while the rest of the initial individuals are generated at random. Finally, the sampling process is divided into two parts: sampling by the probabilistic model and by the IGA, respectively. The experimental results show that the proposed IEDA not only obtains better solutions but also converges faster.
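A minimal PBIL sketch of the per-task independent probability model with an adaptive learning rate is given below. It is illustrative only: the parameter values, the learning-rate schedule, and the fitness interface are our assumptions, and the paper's enhanced Max-Min/Min-Min seeding and IGA sampling stage are omitted.

```python
import random

def pbil(fitness, n_tasks, n_machines, pop_size=50, iters=200, lr0=0.1):
    """Sketch of PBIL for task-to-machine assignment (integer encoding).

    Each task keeps its own independent probability vector over machines.
    `fitness` maps an assignment (list of machine indices) to a cost
    (e.g. makespan); lower is better.
    """
    # one independent probability vector per task over machines
    prob = [[1.0 / n_machines] * n_machines for _ in range(n_tasks)]
    best, best_fit = None, float('inf')
    for it in range(iters):
        # hypothetical adaptive learning rate, growing with iteration count
        lr = lr0 * (1 + it / iters)
        pop = [[random.choices(range(n_machines), weights=prob[t])[0]
                for t in range(n_tasks)] for _ in range(pop_size)]
        pop.sort(key=fitness)
        if fitness(pop[0]) < best_fit:
            best, best_fit = pop[0], fitness(pop[0])
        # shift each task's probabilities toward the iteration's best individual
        for t in range(n_tasks):
            for m in range(n_machines):
                target = 1.0 if pop[0][t] == m else 0.0
                prob[t][m] = (1 - lr) * prob[t][m] + lr * target
    return best, best_fit
```

Because the update is a convex combination with a one-hot target, each task's probability vector stays normalized throughout.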
Kuklinski, Margaret R.; Fagan, Abigail A.; Hawkins, J. David; Briney, John S.; Catalano, Richard F.
2015-01-01
Objective: To determine whether the Communities That Care (CTC) prevention system is a cost-beneficial intervention. Methods: Data were from a longitudinal panel of 4,407 youth participating in a randomized controlled trial including 24 towns in 7 states, matched in pairs within state and randomly assigned to condition. Significant differences favoring intervention youth in sustained abstinence from delinquency, alcohol use, and tobacco use through Grade 12 were monetized and compared to economic investment in CTC. Results: CTC was estimated to produce $4,477 in benefits per youth (discounted 2011 dollars). It cost $556 per youth to implement CTC for 5 years. The net present benefit was $3,920. The benefit-cost ratio was $8.22 per dollar invested. The internal rate of return was 21%. Risk that investment would exceed benefits was minimal. Investment was expected to be recouped within 9 years. Sensitivity analyses in which effects were halved yielded positive cost-beneficial results. Conclusions: CTC is a cost-beneficial, community-based approach to preventing initiation of delinquency, alcohol use, and tobacco use. CTC is estimated to generate economic benefits that exceed implementation costs when disseminated with fidelity in communities. PMID:26213527
Alternative Approaches to Land Initialization for Seasonal Precipitation and Temperature Forecasts
NASA Technical Reports Server (NTRS)
Koster, Randal; Suarez, Max; Liu, Ping; Jambor, Urszula
2004-01-01
The seasonal prediction system of the NASA Global Modeling and Assimilation Office is used to generate ensembles of summer forecasts utilizing realistic soil moisture initialization. To derive the realistic land states, we drive the system's land model offline with realistic meteorological forcing over the period 1979-1993 (in cooperation with the Global Land Data Assimilation System project at GSFC) and then extract the state variables' values on the chosen forecast start dates. A parallel series of forecast ensembles is performed with a random (though climatologically consistent) set of land initial conditions; by comparing the two sets of ensembles, we can isolate the impact of land initialization on forecast skill from that of the imposed SSTs. The base initialization experiment is supplemented with several forecast ensembles that use alternative initialization techniques. One ensemble addresses the impact of minimizing climate drift in the system through the scaling of the initial conditions, and another is designed to isolate the importance of the precipitation signal from that of all other signals in the antecedent offline forcing. A third ensemble includes a more realistic initialization of the atmosphere along with the land initialization. The impact of each variation on forecast skill is quantified.
A fast ergodic algorithm for generating ensembles of equilateral random polygons
NASA Astrophysics Data System (ADS)
Varela, R.; Hinson, K.; Arsuaga, J.; Diao, Y.
2009-03-01
Knotted structures are commonly found in circular DNA and along the backbone of certain proteins. In order to properly estimate properties of these three-dimensional structures it is often necessary to generate large ensembles of simulated closed chains (i.e. polygons) of equal edge lengths (such polygons are called equilateral random polygons). However, finding efficient algorithms that properly sample the space of equilateral random polygons is a difficult problem. Currently there are no proven algorithms that generate equilateral random polygons according to their theoretical distribution. In this paper we propose a method that generates equilateral random polygons in a 'step-wise uniform' way. We prove that this method is ergodic in the sense that any given equilateral random polygon can be generated by this method, and we show that the time needed to generate an equilateral random polygon of length n is linear in n. These two properties make this algorithm a substantial improvement over existing generating methods. Detailed numerical comparisons of our algorithm with other widely used algorithms are provided.
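The paper's 'step-wise uniform' method is not reproduced in the abstract; as an illustration of the kind of move used by the existing sampling algorithms it is compared against, here is a crankshaft rotation, which preserves both closure and unit edge lengths. Function names and parameters are ours, and this is a sketch of a standard move, not the paper's algorithm:

```python
import math

def crankshaft(poly, i, j, theta):
    """One crankshaft move on a closed polygon (list of 3D vertex tuples):
    rotate the vertices strictly between indices i and j (cyclically) by
    angle theta about the axis through poly[i] and poly[j].

    Distances to both axis endpoints are preserved, so all edge lengths
    and the closure of the polygon are unchanged.
    """
    ax, ay, az = poly[i]
    ux, uy, uz = poly[j][0] - ax, poly[j][1] - ay, poly[j][2] - az
    norm = math.sqrt(ux * ux + uy * uy + uz * uz)
    u = (ux / norm, uy / norm, uz / norm)
    c, s = math.cos(theta), math.sin(theta)
    new = list(poly)
    k = (i + 1) % len(poly)
    while k != j:
        px, py, pz = poly[k][0] - ax, poly[k][1] - ay, poly[k][2] - az
        dot = px * u[0] + py * u[1] + pz * u[2]
        # Rodrigues' rotation formula: v*c + (u x v)*s + u*(u.v)*(1-c)
        cr = (u[1] * pz - u[2] * py, u[2] * px - u[0] * pz, u[0] * py - u[1] * px)
        rx = px * c + cr[0] * s + u[0] * dot * (1 - c)
        ry = py * c + cr[1] * s + u[1] * dot * (1 - c)
        rz = pz * c + cr[2] * s + u[2] * dot * (1 - c)
        new[k] = (rx + ax, ry + ay, rz + az)
        k = (k + 1) % len(poly)
    return new
```

Repeated crankshaft moves with random (i, j, theta) give an ergodic Markov chain on the space of equilateral polygons, but mixing can be slow, which is the efficiency gap the paper's linear-time method addresses.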
Using Computer-Generated Random Numbers to Calculate the Lifetime of a Comet.
ERIC Educational Resources Information Center
Danesh, Iraj
1991-01-01
An educational technique to calculate the lifetime of a comet using software-generated random numbers is introduced to undergraduate physics and astronomy students. Discussed are the generation and eligibility of the required random numbers, background literature related to the problem, and the solution to the problem using random numbers.…
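The abstract does not give the model details; one common classroom formulation assumes a comet disintegrates with a fixed probability at each perihelion passage, so its lifetime in passages is geometrically distributed. A minimal Monte Carlo sketch under that assumption (the parameter value is hypothetical, not the article's):

```python
import random

def comet_lifetime_trials(p_loss=0.1, n_comets=100_000, seed=1):
    """Monte Carlo estimate of the mean comet lifetime, in perihelion
    passages, assuming a fixed disintegration probability p_loss per
    passage. The exact answer is 1 / p_loss (geometric distribution)."""
    random.seed(seed)
    total = 0
    for _ in range(n_comets):
        passages = 1
        while random.random() > p_loss:  # comet survives this passage
            passages += 1
        total += passages
    return total / n_comets
```

Comparing the simulated mean against the closed-form value 1/p_loss is itself a useful check on the quality of the random number source, in the spirit of the article.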
A Comparison of Three Random Number Generators for Aircraft Dynamic Modeling Applications
NASA Technical Reports Server (NTRS)
Grauer, Jared A.
2017-01-01
Three random number generators, which produce Gaussian white noise sequences, were compared to assess their suitability in aircraft dynamic modeling applications. The first generator considered was the MATLAB (registered) implementation of the Mersenne-Twister algorithm. The second generator was a website called Random.org, which processes atmospheric noise measured using radios to create the random numbers. The third generator was based on synthesis of the Fourier series, where the random number sequences are constructed from prescribed amplitude and phase spectra. A total of 200 sequences, each having 601 random numbers, for each generator were collected and analyzed in terms of the mean, variance, normality, autocorrelation, and power spectral density. These sequences were then applied to two problems in aircraft dynamic modeling, namely estimating stability and control derivatives from simulated onboard sensor data, and simulating flight in atmospheric turbulence. In general, each random number generator had good performance and is well-suited for aircraft dynamic modeling applications. Specific strengths and weaknesses of each generator are discussed. For Monte Carlo simulation, the Fourier synthesis method is recommended because it most accurately and consistently approximated Gaussian white noise and can be implemented with reasonable computational effort.
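As a sketch of the Fourier-synthesis idea, Gaussian-like white noise can be built as a sum of sinusoids; since the report's prescribed amplitude and phase spectra are not given here, equal amplitudes and independent uniform random phases are assumed:

```python
import math
import random

def fourier_synthesis_noise(n=601, n_freqs=300, seed=7):
    """Approximate a Gaussian white noise sequence of length n by summing
    equal-amplitude sinusoids with independent uniform random phases.

    By the central limit theorem the sum of many random-phase cosines is
    approximately Gaussian; the amplitude is scaled for unit variance.
    """
    random.seed(seed)
    phases = [random.uniform(0.0, 2.0 * math.pi) for _ in range(n_freqs)]
    amp = math.sqrt(2.0 / n_freqs)  # each cosine has variance amp**2 / 2
    return [sum(amp * math.cos(2.0 * math.pi * (k + 1) * t / n + phases[k])
                for k in range(n_freqs))
            for t in range(n)]
```

Because the frequencies are exact harmonics of the record length, the sample mean is zero and the sample variance is one up to rounding, which is why the report finds this method approximates Gaussian white noise most consistently.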
1981-01-01
Abstracts of papers from a 1981 IEEE International Symposium on Information Theory, co-sponsored by the Union Radio Scientifique Internationale (IEEE Catalog Number 81 CH 1609-7). Only fragments of this record are recoverable: one abstract concerns permutation codes as a special case, where such a code is generated by an initial vector x and a group G of orthogonal n-by-n matrices; another introduces and studies a scheme with random-access components in which the network stations are divided into groups.
Realization of a Quantum Random Generator Certified with the Kochen-Specker Theorem
NASA Astrophysics Data System (ADS)
Kulikov, Anatoly; Jerger, Markus; Potočnik, Anton; Wallraff, Andreas; Fedorov, Arkady
2017-12-01
Random numbers are required for a variety of applications from secure communications to Monte Carlo simulation. Yet randomness is an asymptotic property, and no output string generated by a physical device can be strictly proven to be random. We report an experimental realization of a quantum random number generator (QRNG) with randomness certified by quantum contextuality and the Kochen-Specker theorem. The certification is not performed in a device-independent way but through a rigorous theoretical proof of each outcome being value indefinite even in the presence of experimental imperfections. The analysis of the generated data confirms the incomputable nature of our QRNG.
Bor, Jacob; Geldsetzer, Pascal; Venkataramani, Atheendar; Bärnighausen, Till
2015-01-01
Purpose of review: Randomized, population-representative trials of clinical interventions are rare. Quasi-experiments have been used successfully to generate causal evidence on the cascade of HIV care in a broad range of real-world settings. Recent findings: Quasi-experiments exploit exogenous, or quasi-random, variation occurring naturally in the world or because of an administrative rule or policy change to estimate causal effects. Well designed quasi-experiments have greater internal validity than typical observational research designs. At the same time, quasi-experiments may also have potential for greater external validity than experiments and can be implemented when randomized clinical trials are infeasible or unethical. Quasi-experimental studies have established the causal effects of HIV testing and initiation of antiretroviral therapy on health, economic outcomes and sexual behaviors, as well as indirect effects on other community members. Recent quasi-experiments have evaluated specific interventions to improve patient performance in the cascade of care, providing causal evidence to optimize clinical management of HIV. Summary: Quasi-experiments have generated important data on the real-world impacts of HIV testing and treatment and on interventions to improve the cascade of care. With the growth in large-scale clinical and administrative data, quasi-experiments enable rigorous evaluation of policies implemented in real-world settings. PMID:26371463
Secure communications using quantum cryptography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, R.J.; Buttler, W.T.; Kwiat, P.G.
1997-08-01
The secure distribution of the secret random bit sequences known as "key" material is an essential precursor to their use for the encryption and decryption of confidential communications. Quantum cryptography is an emerging technology for secure key distribution with single-photon transmissions: an adversary can neither successfully tap the key transmissions nor evade detection (eavesdropping raises the key error rate above a threshold value). We have developed experimental quantum cryptography systems based on the transmission of non-orthogonal single-photon states to generate shared key material over multi-kilometer optical fiber paths and over line-of-sight links. In both cases, key material is built up using the transmission of a single photon per bit of an initial secret random sequence. A quantum-mechanically random subset of this sequence is identified, becoming the key material after a data reconciliation stage with the sender. In our optical fiber experiment we have performed quantum key distribution over 24 km of underground optical fiber using single-photon interference states, demonstrating that secure, real-time key generation over "open" multi-km node-to-node optical fiber communications links is possible. We have also constructed a quantum key distribution system for free-space, line-of-sight transmission using single-photon polarization states, which is currently undergoing laboratory testing. 7 figs.
Source-Independent Quantum Random Number Generation
NASA Astrophysics Data System (ADS)
Cao, Zhu; Zhou, Hongyi; Yuan, Xiao; Ma, Xiongfeng
2016-01-01
Quantum random number generators can provide genuine randomness by appealing to the fundamental principles of quantum mechanics. In general, a physical generator contains two parts—a randomness source and its readout. The source is essential to the quality of the resulting random numbers; hence, it needs to be carefully calibrated and modeled to achieve information-theoretical provable randomness. However, in practice, the source is a complicated physical system, such as a light source or an atomic ensemble, and any deviations in the real-life implementation from the theoretical model may affect the randomness of the output. To close this gap, we propose a source-independent scheme for quantum random number generation in which output randomness can be certified, even when the source is uncharacterized and untrusted. In our randomness analysis, we make no assumptions about the dimension of the source. For instance, multiphoton emissions are allowed in optical implementations. Our analysis takes into account the finite-key effect with the composable security definition. In the limit of large data size, the length of the input random seed is exponentially small compared to that of the output random bit. In addition, by modifying a quantum key distribution system, we experimentally demonstrate our scheme and achieve a randomness generation rate of over 5 × 10³ bits/s.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, K. S.; Nakae, L. F.; Prasad, M. K.
Here, we solve a simple theoretical model of time evolving fission chains due to Feynman that generalizes and asymptotically approaches the point model theory. The point model theory has been used to analyze thermal neutron counting data. This extension of the theory underlies fast counting data for both neutrons and gamma rays from metal systems. Fast neutron and gamma-ray counting is now possible using liquid scintillator arrays with nanosecond time resolution. For individual fission chains, the differential equations describing three correlated probability distributions are solved: the time-dependent internal neutron population, accumulation of fissions in time, and accumulation of leaked neutrons in time. Explicit analytic formulas are given for correlated moments of the time evolving chain populations. The equations for random time gate fast neutron and gamma-ray counting distributions, due to randomly initiated chains, are presented. Correlated moment equations are given for both random time gate and triggered time gate counting. Explicit formulas are given for all correlated moments up to third order, for all combinations of correlated fast neutrons and gamma rays. The nonlinear differential equations for probabilities of time dependent fission chain populations have a remarkably simple Monte Carlo realization. A Monte Carlo code was developed for this theory and is shown to statistically realize the solutions to the fission chain theory probability distributions. Combined with random initiation of chains and detection of external quanta, the Monte Carlo code generates time tagged data for neutron and gamma-ray counting, and from these data the counting distributions.
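The "remarkably simple Monte Carlo realization" can be illustrated with a minimal branching-process sketch of a single fission chain. The probabilities and multiplicity distribution below are hypothetical, and gamma rays, detector response, and time tagging are omitted:

```python
import random

def fission_chain(p_fission=0.3, multiplicity=(0.2, 0.5, 0.3), seed=None):
    """Simulate one fission chain started by a single neutron.

    Each neutron either induces a fission (probability p_fission), emitting
    nu new neutrons with nu drawn from `multiplicity` = (P(nu=1), P(nu=2),
    P(nu=3)), or leaks out of the system. The chain is subcritical here
    (mean offspring 0.3 * 2.1 = 0.63 < 1), so it terminates.
    Returns (number of fissions, number of leaked neutrons).
    """
    rng = random.Random(seed)
    neutrons, fissions, leaked = 1, 0, 0
    while neutrons:
        neutrons -= 1
        if rng.random() < p_fission:
            fissions += 1
            r, cum, nu = rng.random(), 0.0, len(multiplicity)
            for k, pk in enumerate(multiplicity, start=1):
                cum += pk
                if r < cum:
                    nu = k
                    break
            neutrons += nu
        else:
            leaked += 1
    return fissions, leaked
```

Averaging leaked counts over many randomly initiated chains reproduces the branching-process expectation 0.7 / (1 - 0.63) ≈ 1.89 leaked neutrons per chain, the kind of moment the analytic formulas give in closed form.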
Quantum Random Number Generation Using a Quanta Image Sensor
Amri, Emna; Felk, Yacine; Stucki, Damien; Ma, Jiaju; Fossum, Eric R.
2016-01-01
A new quantum random number generation method is proposed. The method is based on the randomness of the photon emission process and the single photon counting capability of the Quanta Image Sensor (QIS). It has the potential to generate high-quality random numbers with remarkable data output rate. In this paper, the principle of photon statistics and theory of entropy are discussed. Sample data were collected with QIS jot device, and its randomness quality was analyzed. The randomness assessment method and results are discussed. PMID:27367698
NASA Astrophysics Data System (ADS)
Holmquist, J. R.; Crooks, S.; Windham-Myers, L.; Megonigal, P.; Weller, D.; Lu, M.; Bernal, B.; Byrd, K. B.; Morris, J. T.; Troxler, T.; McCombs, J.; Herold, N.
2017-12-01
Stable coastal wetlands can store substantial amounts of carbon (C) that can be released when they are degraded or eroded. The EPA recently incorporated coastal wetland net-storage and emissions within the Agricultural Forested and Other Land Uses category of the U.S. National Greenhouse Gas Inventory (NGGI). This was a seminal analysis, but its quantification of uncertainty needs improvement. We provide a value-added analysis by estimating that uncertainty, focusing initially on the most basic assumption, the area of coastal wetlands. We considered three sources: uncertainty in the areas of vegetation and salinity subclasses, uncertainty in the areas of changing or stable wetlands, and uncertainty in the inland extent of coastal wetlands. The areas of vegetation and salinity subtypes, as well as stable or changing, were estimated from 2006 and 2010 maps derived from Landsat imagery by the Coastal Change Analysis Program (C-CAP). We generated unbiased area estimates and confidence intervals for C-CAP, taking into account mapped area, proportional areas of commission and omission errors, as well as the number of observations. We defined the inland extent of wetlands as all land below the current elevation of twice monthly highest tides. We generated probabilistic inundation maps integrating wetland-specific bias and random error in light-detection and ranging elevation maps, with the spatially explicit random error in tidal surfaces generated from tide gauges. This initial uncertainty analysis will be extended to calculate total propagated uncertainty in the NGGI by including the uncertainties in the amount of C lost from eroded and degraded wetlands, stored annually in stable wetlands, and emitted in the form of methane by tidal freshwater wetlands.
Age patterns of smoking initiation among Kuwait university male students.
Sugathan, T N; Moody, P M; Bustan, M A; Elgerges, N S
1998-12-01
The present study is a detailed evaluation of age at smoking initiation among university male students in Kuwait based on a random sample of 664 students selected from all students during 1993. The actuarial life table analysis revealed that almost one tenth of the students initiated cigarette smoking between ages 16 and 17, with the rate of initiation increasing rapidly thereafter and reaching 30% by age 20 and almost 50% by the time they celebrate their 24th birthday. The most important environmental risk factor positively associated with smoking initiation was observed to be the history of smoking among siblings, with a relative risk of 1.4. Compared to students of medicine and engineering, the students of other faculties revealed a higher risk of smoking initiation, with an RR = 1.77 for sciences and commerce and 1.61 for other faculties (arts, law, education and Islamic studies). The analysis revealed a rising generation trend in cigarette smoking. There is a need for reduction of this trend among young adults in Kuwait and throughout other countries in the region.
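The life-table logic behind figures like "30% by age 20" can be sketched directly: cumulative initiation is one minus the product of interval-specific non-initiation probabilities. The rates below are illustrative, not the study's estimates:

```python
def actuarial_cumulative(initiation_rates):
    """Cumulative probability of having initiated smoking by the end of
    each age interval, from per-interval conditional initiation rates q_i
    (the standard life-table identity: 1 - prod(1 - q_i))."""
    surv, out = 1.0, []
    for q in initiation_rates:
        surv *= (1.0 - q)
        out.append(1.0 - surv)
    return out
```

This is why interval rates cannot simply be summed: conditional rates apply only to those still never-smokers at the start of each interval.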
A generator for unique quantum random numbers based on vacuum states
NASA Astrophysics Data System (ADS)
Gabriel, Christian; Wittmann, Christoffer; Sych, Denis; Dong, Ruifang; Mauerer, Wolfgang; Andersen, Ulrik L.; Marquardt, Christoph; Leuchs, Gerd
2010-10-01
Random numbers are a valuable component in diverse applications that range from simulations over gambling to cryptography. The quest for true randomness in these applications has engendered a large variety of different proposals for producing random numbers based on the foundational unpredictability of quantum mechanics. However, most approaches do not consider that a potential adversary could have knowledge about the generated numbers, so the numbers are not verifiably random and unique. Here we present a simple experimental setup based on homodyne measurements that uses the purity of a continuous-variable quantum vacuum state to generate unique random numbers. We use the intrinsic randomness in measuring the quadratures of a mode in the lowest energy vacuum state, which cannot be correlated to any other state. The simplicity of our source, combined with its verifiably unique randomness, are important attributes for achieving high-reliability, high-speed and low-cost quantum random number generators.
Generating synthetic wave climates for coastal modelling: a linear mixed modelling approach
NASA Astrophysics Data System (ADS)
Thomas, C.; Lark, R. M.
2013-12-01
Numerical coastline morphological evolution models require wave climate properties to drive morphological change through time. Wave climate properties (typically wave height, period and direction) may be temporally fixed, culled from real wave buoy data, or allowed to vary in some way defined by a Gaussian or other pdf. However, to examine sensitivity of coastline morphologies to wave climate change, it seems desirable to be able to modify wave climate time series from a current to some new state along a trajectory, but in a way consistent with, or initially conditioned by, the properties of existing data, or to generate fully synthetic data sets with realistic time series properties. For example, mean or significant wave height time series may have underlying periodicities, as revealed in numerous analyses of wave data. Our motivation is to develop a simple methodology to generate synthetic wave climate time series that can change in some stochastic way through time. We wish to use such time series in a coastline evolution model to test sensitivities of coastal landforms to changes in wave climate over decadal and centennial scales. We have worked initially on time series of significant wave height, based on data from a Waverider III buoy located off the coast of Yorkshire, England. The statistical framework for the simulation is the linear mixed model. The target variable, perhaps after transformation (Box-Cox), is modelled as a multivariate Gaussian, the mean modelled as a function of a fixed effect, and two random components, one of which is independently and identically distributed (iid) and the second of which is temporally correlated. The model was fitted to the data by likelihood methods. We considered the option of a periodic mean, the period either fixed (e.g. at 12 months) or estimated from the data. We considered two possible correlation structures for the second random effect. In one the correlation decays exponentially with time. 
In the second (spherical) model, it cuts off at a temporal range. Having fitted the model, multiple realisations were generated; the random effects were simulated by specifying a covariance matrix for the simulated values, with the estimated parameters. The Cholesky factorisation of the covariance matrix was computed and realisations of the random component of the model generated by pre-multiplying a vector of iid standard Gaussian variables by the lower triangular factor. The resulting random variate was added to the mean value computed from the fixed effects, and the result back-transformed to the original scale of the measurement. Realistic simulations result from the approach described above. Background exploratory data analysis was undertaken on 20-day sets of 30-minute buoy data, selected from days 5-24 of months January, April, July, October, 2011, to elucidate daily to weekly variations, and to keep numerical analysis tractable computationally. Work remains to be undertaken to develop suitable models for synthetic directional data. We suggest that the general principles of the method will have applications in other geomorphological modelling endeavours requiring time series of stochastically variable environmental parameters.
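The simulation recipe described above (fixed-effect mean, temporally correlated Gaussian component, iid nugget, Cholesky factorisation) can be sketched as follows, using the exponential correlation option; the parameter values are illustrative, not the fitted Waverider estimates, and the Box-Cox step is omitted:

```python
import math
import random

def chol(a):
    """Lower-triangular Cholesky factor of a symmetric positive-definite matrix."""
    n = len(a)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(a[i][i] - s) if i == j else (a[i][j] - s) / L[j][j]
    return L

def simulate_wave_series(n=100, sigma2_corr=1.0, sigma2_iid=0.2,
                         range_days=5.0, mean=2.0, seed=3):
    """One realisation of: mean (fixed effect) + correlated Gaussian
    component with exponential correlogram + iid nugget, simulated by
    pre-multiplying iid standard Gaussians by the Cholesky factor."""
    random.seed(seed)
    cov = [[sigma2_corr * math.exp(-abs(i - j) / range_days)
            + (sigma2_iid if i == j else 0.0)
            for j in range(n)] for i in range(n)]
    L = chol(cov)
    z = [random.gauss(0.0, 1.0) for _ in range(n)]
    return [mean + sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)]
```

A periodic fixed-effect mean (e.g. a 12-month sinusoid) would simply replace the constant `mean` term; the spherical correlogram would replace the exponential term in `cov`.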
Random ambience using high fidelity images
NASA Astrophysics Data System (ADS)
Abu, Nur Azman; Sahib, Shahrin
2011-06-01
Most secure communication nowadays mandates true random keys as an input. These operations are mostly designed and taken care of by the developers of the cryptosystem. Due to the nature of confidential crypto development today, pseudorandom keys are typically designed and still preferred by the developers of the cryptosystem. However, pseudorandom keys are predictable, periodic and repeatable, and hence carry minimal entropy. True random keys are believed to be generated only via hardware random number generators. Careful statistical analysis is still required to have any confidence that the process and apparatus generate numbers sufficiently random for cryptographic use. In the underlying research, each moment in life is considered unique in itself. The random key is unique for the given moment, generated by the user whenever he or she needs random keys in practical secure communication. The ambience captured in a high-fidelity digital image is tested for randomness according to the NIST Statistical Test Suite. A recommendation on generating simple 4-megabit-per-second random cryptographic keys live is reported.
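As one example of the NIST suite mentioned above, the frequency (monobit) test reduces to a single closed form; a minimal sketch applicable to bits extracted from image data:

```python
import math

def monobit_pvalue(bits):
    """NIST SP 800-22 frequency (monobit) test.

    Map bits to +/-1, sum them to S, and compute
    p = erfc(|S| / sqrt(2 n)). Small p-values (e.g. below 0.01) indicate
    the sequence is biased toward zeros or ones.
    """
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)
    return math.erfc(abs(s) / math.sqrt(2.0 * n))
```

The full suite applies many further tests (runs, longest run, approximate entropy, and so on); passing the monobit test alone says nothing about serial correlation in the image-derived bits.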
Usefulness of image morphing techniques in cancer treatment by conformal radiotherapy
NASA Astrophysics Data System (ADS)
Atoui, Hussein; Sarrut, David; Miguet, Serge
2004-05-01
Conformal radiotherapy is a cancer treatment technique that targets high-energy X-rays to tumors with minimal exposure to surrounding healthy tissues. Irradiation ballistics is calculated based on an initial 3D Computerized Tomography (CT) scan. At every treatment session, the random positioning of the patient, compared to the reference position defined by the initial 3D CT scan, can generate treatment inaccuracies. Positioning errors potentially predispose to dangerous exposure to healthy tissues as well as insufficient irradiation of the tumor. A proposed solution would be the use of portal images generated by Electronic Portal Imaging Devices (EPID). Portal images (PI) allow a comparison with reference images retained by physicians, namely Digitally Reconstructed Radiographs (DRRs). At present, physicians must estimate patient positional errors by visual inspection. However, this may be inaccurate and time-consuming. The automation of this task has been the subject of much research. Unfortunately, the intensive use of DRRs and the high computing time required have prevented real time implementation. We are currently investigating a new method for DRR generation that calculates intermediate DRRs by 2D deformation of previously computed DRRs. We approach this investigation with the use of a morphing-based technique named mesh warping.
NASA Astrophysics Data System (ADS)
Sui, Liansheng; Xu, Minjie; Tian, Ailing
2017-04-01
A novel optical image encryption scheme is proposed based on a quick response (QR) code and a high-dimensional chaotic system, where only the intensity distribution of the encoded information is recorded as the ciphertext. Initially, the QR code is generated from the plain image and placed in the input plane of the double random phase encoding architecture. The code is then encrypted to a ciphertext with a noise-like distribution by two cascaded gyrator transforms. In the encryption process, parameters such as the rotation angles and random phase masks are generated as interim variables and functions based on the Chen system. A new phase retrieval algorithm is designed to reconstruct the initial QR code during decryption, in which a priori information such as the three position detection patterns is used as the support constraint. The original image can be obtained without any energy loss by scanning the decrypted code with a mobile device. The ciphertext image is a real-valued function, which is more convenient to store and transmit. Meanwhile, the security of the proposed scheme is greatly enhanced by the high sensitivity of the Chen system to its initial values. Extensive cryptanalysis and simulations have been performed to demonstrate the feasibility and effectiveness of the proposed scheme.
Quantum transport with long-range steps on Watts-Strogatz networks
NASA Astrophysics Data System (ADS)
Wang, Yan; Xu, Xin-Jian
2016-07-01
We study the transport dynamics of quantum systems with long-range steps on the Watts-Strogatz network (WSN), which is generated by rewiring links of a regular ring. First, we probe physical systems modeled by the discrete nonlinear Schrödinger (DNLS) equation. Using a localized initial condition, we compute the time-averaged occupation probability of the initial site, which depends on the nonlinearity, the long-range steps, and the rewired links. Self-trapping transitions occur at large (small) nonlinear parameters for coupling ɛ=-1 (1) as long-range interactions are intensified. The structural disorder induced by random rewiring, however, has dual effects for ɛ=-1 and inhibits self-trapping for ɛ=1. Second, we investigate continuous-time quantum walks (CTQW) on the regular ring governed by the discrete linear Schrödinger (DLS) equation. We find that long-range steps alone do not affect the efficiency of coherent exciton transport, whereas random rewiring alone enhances partial localization. When both factors act simultaneously, localization is greatly strengthened and the transport becomes worse.
A random walk model for evaluating clinical trials involving serial observations.
Hopper, J L; Young, G P
1988-05-01
For clinical trials where the variable of interest is ordered and categorical (for example, disease severity or a symptom scale), and where measurements are taken at intervals, it may be possible to achieve greater discrimination between the efficacy of treatments by modelling each patient's progress as a stochastic process. The random walk is a simple, easily interpreted model that can be fitted by maximum likelihood using a standard maximization routine, with inference based on standard likelihood theory. In general the model can allow for randomly censored data, incorporates measured prognostic factors, and makes inference conditional on the (possibly non-random) allocation of patients. Tests of fit and of model assumptions are proposed, and applications to two therapeutic trials of gastroenterological disorders are presented. The model gave measures of the rate of, and variability in, improvement for patients under different treatments. A small simulation study suggested that the model is more powerful than comparing the difference between initial and final scores, even when applied to data generated by a mechanism other than the random walk assumed in the analysis. It thus provides a useful additional statistical method for evaluating clinical trials.
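The paper's model can be illustrated with a minimal sketch, under assumptions of my own choosing (a bounded walk on a 0-10 symptom scale with up/down/stay step probabilities, no censoring or covariates): the maximum-likelihood estimates of the step probabilities are simply the observed transition fractions, and they separate a drifting "treated" arm from a driftless "control" arm.

```python
import random

def simulate_patient(p_up, p_down, steps, start=5, lo=0, hi=10):
    """One patient's ordinal symptom score as a bounded random walk:
    step +1 with probability p_up, -1 with p_down, else stay."""
    path = [start]
    for _ in range(steps):
        u = random.random()
        step = 1 if u < p_up else (-1 if u < p_up + p_down else 0)
        path.append(min(hi, max(lo, path[-1] + step)))
    return path

def mle_step_probs(paths):
    """Maximum-likelihood estimates of the up/down step probabilities,
    read off from the observed transition counts."""
    up = down = total = 0
    for path in paths:
        for a, b in zip(path, path[1:]):
            total += 1
            up += b > a
            down += b < a
    return up / total, down / total

random.seed(0)
treated = [simulate_patient(0.20, 0.50, 8) for _ in range(200)]  # improving arm
control = [simulate_patient(0.35, 0.35, 8) for _ in range(200)]  # driftless arm
p_up_t, p_down_t = mle_step_probs(treated)
print(p_up_t, p_down_t)
```

Using the whole trajectory rather than just initial and final scores is what gives the approach its extra power in the paper's simulations.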
Superparamagnetic perpendicular magnetic tunnel junctions for true random number generators
NASA Astrophysics Data System (ADS)
Parks, Bradley; Bapna, Mukund; Igbokwe, Julianne; Almasi, Hamid; Wang, Weigang; Majetich, Sara A.
2018-05-01
Superparamagnetic perpendicular magnetic tunnel junctions are fabricated and analyzed for use in random number generators. Time-resolved resistance measurements are used as streams of bits in statistical tests for randomness. Voltage control of the thermal stability enables tuning the average speed of random bit generation up to 70 kHz in a 60 nm diameter device. In its most efficient operating mode, the device generates random bits at an energy cost of 600 fJ/bit. A narrow range of magnetic field tunes the probability of a given state from 0 to 1, offering a means of probabilistic computing.
Brownian motion properties of optoelectronic random bit generators based on laser chaos.
Li, Pu; Yi, Xiaogang; Liu, Xianglian; Wang, Yuncai; Wang, Yongge
2016-07-11
The nondeterministic properties of optoelectronic random bit generators (RBGs) based on laser chaos are experimentally analyzed from two aspects: the central limit theorem and the law of the iterated logarithm. The random bits are extracted from an optical-feedback chaotic laser diode using a multi-bit extraction technique in the electrical domain. Our experimental results demonstrate that the generated random bits have no statistical distance from Brownian motion and, in addition, pass the state-of-the-art industry-benchmark statistical test suite (NIST SP800-22). Together these results give mathematically provable evidence that an ultrafast random bit generator based on laser chaos can be used as a nondeterministic random bit source.
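The central-limit-theorem test used here has a simple intuition that can be sketched as follows (a toy sketch with a pseudorandom stand-in for the physical bit stream, not the authors' procedure): map bits to ±1 steps, accumulate a random walk, and check that walk endpoints have mean near 0 and variance near the number of steps, as Brownian motion predicts.

```python
import random

def bits_to_walk(bits):
    """Map bits 0/1 to steps -1/+1 and accumulate a random walk."""
    pos, walk = 0, []
    for b in bits:
        pos += 1 if b else -1
        walk.append(pos)
    return walk

# Stand-in for an extracted physical bit stream.
random.seed(42)
n, trials = 1000, 500
endpoints = []
for _ in range(trials):
    bits = [random.getrandbits(1) for _ in range(n)]
    endpoints.append(bits_to_walk(bits)[-1])

mean = sum(endpoints) / trials
var = sum((x - mean) ** 2 for x in endpoints) / trials
print(mean, var)  # CLT predicts mean near 0 and variance near n
```

The law of the iterated logarithm sharpens this by bounding how far the walk may stray, at sqrt(2 n log log n), infinitely often.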
Random deposition of particles of different sizes.
Forgerini, F L; Figueiredo, W
2009-04-01
We study the surface growth generated by the random deposition of particles of different sizes. A model is proposed in which particles aggregate on an initially flat surface, giving rise to a rough interface and a porous bulk. Using Monte Carlo simulations in (1+1) dimensions, surfaces are grown by adding particles of different sizes, as well as identical particles, to the substrate. In the case of deposition of particles of different sizes, the sizes are selected from a Poisson distribution and may vary by an order of magnitude. For the deposition of identical particles, only particles larger than one lattice parameter of the substrate are considered. We calculate the usual scaling exponents (the roughness, growth, and dynamic exponents alpha, beta, and z, respectively) as well as the porosity of the bulk, determining the porosity as a function of particle size. Our simulations show that the roughness evolves in time through three regimes: at early times it behaves as in the random deposition model; at intermediate times it grows slowly; and at long times it enters the saturation regime. The bulk formed by depositing large particles shows a porosity that increases very quickly at early times and also reaches a saturation value. Except for the case where particles have the size of one lattice spacing, we always find that the surface roughness and porosity reach limiting values at long times. Surprisingly, we find that the scaling exponents are the same as those predicted by the Villain-Lai-Das Sarma equation.
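A minimal Monte Carlo sketch of this kind of model is below. The deposition rule (each particle rests on the local maximum of the columns it covers, leaving pores beneath) is one plausible reading of the abstract, not the paper's exact rule, and all parameter values are illustrative.

```python
import math, random

def poisson(lam, rng):
    """Poisson draw via Knuth's method (adequate for small means)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def deposit(width, n_particles, mean_size=3, rng=None):
    """Random deposition of square particles on a 1D substrate with periodic
    boundaries. Edge length is 1 + Poisson(mean_size - 1); each particle
    lands on the local maximum of the columns it covers, leaving pores."""
    rng = rng or random.Random(0)
    h = [0] * width
    for _ in range(n_particles):
        size = 1 + poisson(mean_size - 1, rng)
        x = rng.randrange(width)
        cols = [(x + i) % width for i in range(size)]
        top = max(h[c] for c in cols)
        for c in cols:
            h[c] = top + size
    return h

def roughness(h):
    """RMS width of the interface."""
    m = sum(h) / len(h)
    return math.sqrt(sum((x - m) ** 2 for x in h) / len(h))

h = deposit(256, 5000)
print(roughness(h))
```

Tracking `roughness` as a function of deposited particles would expose the three regimes (random-deposition-like growth, slow growth, saturation) described in the abstract.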
Towards a high-speed quantum random number generator
NASA Astrophysics Data System (ADS)
Stucki, Damien; Burri, Samuel; Charbon, Edoardo; Chunnilall, Christopher; Meneghetti, Alessio; Regazzoni, Francesco
2013-10-01
Randomness is of fundamental importance in various fields, such as cryptography, numerical simulation, and the gaming industry. Quantum physics, which is fundamentally probabilistic, is the best option for a physical random number generator. In this article, we present the work carried out in various projects in the context of the development of a commercial, certified high-speed random number generator.
Generating variable and random schedules of reinforcement using Microsoft Excel macros.
Bancroft, Stacie L; Bourret, Jason C
2008-01-01
Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time. Generating schedule values for variable and random reinforcement schedules can be difficult. The present article describes the steps necessary to write macros in Microsoft Excel that will generate variable-ratio, variable-interval, variable-time, random-ratio, random-interval, and random-time reinforcement schedule values.
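The article implements these generators as Excel VBA macros; the same logic can be sketched in Python (a hedged stand-in, with my own choice of spread for the variable-ratio case). A random-ratio schedule reinforces each response with constant probability 1/mean, so its values are geometric; a variable-ratio schedule draws requirements spread around the mean.

```python
import random

def random_ratio(mean, n, rng=None):
    """Random-ratio schedule values: reinforcement with constant probability
    1/mean per response, so each value is geometrically distributed."""
    rng = rng or random.Random(0)
    p = 1.0 / mean
    out = []
    for _ in range(n):
        k = 1
        while rng.random() >= p:
            k += 1
        out.append(k)
    return out

def variable_ratio(mean, n, spread=0.5, rng=None):
    """Variable-ratio schedule values: response requirements drawn uniformly
    from an interval centred on the mean (spread is illustrative)."""
    rng = rng or random.Random(0)
    lo = max(1, int(mean * (1 - spread)))
    hi = int(mean * (1 + spread))
    return [rng.randint(lo, hi) for _ in range(n)]

rr = random_ratio(10, 1000)
vr = variable_ratio(10, 1000)
print(sum(rr) / len(rr), sum(vr) / len(vr))
```

Interval and time schedules (VI, RI, VT, RT) follow the same pattern with durations in place of response counts.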
Distributed optical fiber-based monitoring approach of spatial seepage behavior in dike engineering
NASA Astrophysics Data System (ADS)
Su, Huaizhi; Ou, Bin; Yang, Lifu; Wen, Zhiping
2018-07-01
Failure caused by seepage is the most common failure mode in dike engineering. Seepage in a dike, a longitudinally extended structure, is random, strongly concealed, and initially small in magnitude. Using a distributed fiber temperature sensing system (DTS) with an improved optical-fiber layout scheme, the location of the initial interpolation point of the saturation line is obtained. The saturation surface over the full dike cross section is then generated with the barycentric Lagrange interpolation collocation method (BLICM). Combined with the linear optical-fiber seepage monitoring method, BLICM is applied to an engineering case, demonstrating a real-time technique for monitoring seepage over the full cross section of a dike.
Damage Propagation Modeling for Aircraft Engine Prognostics
NASA Technical Reports Server (NTRS)
Saxena, Abhinav; Goebel, Kai; Simon, Don; Eklund, Neil
2008-01-01
This paper describes how damage propagation can be modeled within the modules of aircraft gas turbine engines. To that end, response surfaces of all sensors are generated via a thermodynamic simulation model of the engine as a function of variations in the flow and efficiency of the modules of interest. An exponential rate of change for flow and efficiency loss was imposed for each data set, starting at a randomly chosen initial deterioration set point. The rate of change of flow and efficiency denotes an otherwise unspecified fault with an increasingly worsening effect. The rates of change of the faults were constrained to an upper threshold but were otherwise chosen randomly. Damage propagation was allowed to continue until a failure criterion was reached. A health index was defined as the minimum of several superimposed operational margins at any given time instant; the failure criterion is reached when the health index reaches zero. The output of the model was the time series (in cycles) of sensed measurements typically available from aircraft gas turbine engines. The data generated were used as challenge data for the Prognostics and Health Management (PHM) data competition at PHM '08.
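The degradation-to-failure logic can be sketched in a few lines. This is a simplified single-margin stand-in for the paper's multi-margin health index; the rate constants and the random set-point range are illustrative, not taken from the paper.

```python
import math, random

def degradation_series(a=0.001, b=0.02, t_max=500, rng=None):
    """Exponential efficiency-loss model: loss(t) = a * exp(b * (t - t0))
    after a random initial deterioration set point t0. The health index is
    1 - loss, clipped at zero; failure is declared when it reaches zero."""
    rng = rng or random.Random(7)
    t0 = rng.randrange(50)            # random initial deterioration set point
    series = []
    for t in range(t_max):
        loss = a * math.exp(b * max(0, t - t0))
        health = max(0.0, 1.0 - loss)
        series.append(health)
        if health == 0.0:
            break                     # failure criterion reached
    return series

s = degradation_series()
print(len(s), s[0], s[-1])
```

In the actual challenge data, the health index is the minimum over several such margins, and the observable is a vector of noisy sensor readings rather than the index itself.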
Hayes, Timothy; Usami, Satoshi; Jacobucci, Ross; McArdle, John J
2015-12-01
In this article, we describe a recent development in the analysis of attrition: using classification and regression trees (CART) and random forest methods to generate inverse sampling weights. These flexible machine learning techniques have the potential to capture complex nonlinear, interactive selection models, yet to our knowledge, their performance in the missing data analysis context has never been evaluated. To assess the potential benefits of these methods, we compare their performance with commonly employed multiple imputation and complete case techniques in 2 simulations. These initial results suggest that weights computed from pruned CART analyses performed well in terms of both bias and efficiency when compared with other methods. We discuss the implications of these findings for applied researchers.
Ocean biogeochemistry modeled with emergent trait-based genomics
NASA Astrophysics Data System (ADS)
Coles, V. J.; Stukel, M. R.; Brooks, M. T.; Burd, A.; Crump, B. C.; Moran, M. A.; Paul, J. H.; Satinsky, B. M.; Yager, P. L.; Zielinski, B. L.; Hood, R. R.
2017-12-01
Marine ecosystem models have advanced to incorporate metabolic pathways discovered with genomic sequencing, but direct comparisons between models and “omics” data are lacking. We developed a model that directly simulates metagenomes and metatranscriptomes for comparison with observations. Model microbes were randomly assigned genes for specialized functions, and communities of 68 species were simulated in the Atlantic Ocean. Unfit organisms were replaced, and the model self-organized to develop community genomes and transcriptomes. Emergent communities from simulations that were initialized with different cohorts of randomly generated microbes all produced realistic vertical and horizontal ocean nutrient, genome, and transcriptome gradients. Thus, the library of gene functions available to the community, rather than the distribution of functions among specific organisms, drove community assembly and biogeochemical gradients in the model ocean.
Ensemble of Chaotic and Naive Approaches for Performance Enhancement in Video Encryption.
Chandrasekaran, Jeyamala; Thiruvengadam, S J
2015-01-01
Owing to the growth of high performance network technologies, multimedia applications over the Internet are increasing exponentially. Applications like video conferencing, video-on-demand, and pay-per-view depend upon encryption algorithms for providing confidentiality. Video communication is characterized by distinct features such as large volume, high redundancy between adjacent frames, video codec compliance, syntax compliance, and application specific requirements. Naive approaches for video encryption encrypt the entire video stream with conventional text based cryptographic algorithms. Although naive approaches are the most secure for video encryption, the computational cost associated with them is very high. This research work aims at enhancing the speed of naive approaches through chaos based S-box design. Chaotic equations are popularly known for randomness, extreme sensitivity to initial conditions, and ergodicity. The proposed methodology employs two-dimensional discrete Henon map for (i) generation of dynamic and key-dependent S-box that could be integrated with symmetric algorithms like Blowfish and Data Encryption Standard (DES) and (ii) generation of one-time keys for simple substitution ciphers. The proposed design is tested for randomness, nonlinearity, avalanche effect, bit independence criterion, and key sensitivity. Experimental results confirm that chaos based S-box design and key generation significantly reduce the computational cost of video encryption with no compromise in security.
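The Henon-map S-box construction can be sketched as follows. The map itself (x' = 1 - a*x² + y, y' = b*x, with the classic parameters a=1.4, b=0.3) is as stated; the rank-ordering step used to turn chaotic samples into a bijective table is a common construction and an assumption here, since the abstract does not detail it.

```python
def henon_sbox(x0=0.1, y0=0.1, a=1.4, b=0.3, size=256, burn=1000):
    """Key-dependent 8-bit S-box from the 2D Henon map. The initial
    condition (x0, y0) plays the role of the key; the rank order of the
    chaotic x-values yields a permutation of 0..size-1."""
    x, y = x0, y0
    for _ in range(burn):                 # discard the transient
        x, y = 1 - a * x * x + y, b * x
    samples = []
    for _ in range(size):
        x, y = 1 - a * x * x + y, b * x
        samples.append(x)
    # sort indices by chaotic value -> bijective substitution table
    return sorted(range(size), key=lambda i: samples[i])

sbox = henon_sbox()
print(sbox[:8])
```

Because the map is extremely sensitive to initial conditions, a tiny change in the key (x0, y0) yields a completely different table, which is the key-dependence property the paper exploits.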
Secure uniform random-number extraction via incoherent strategies
NASA Astrophysics Data System (ADS)
Hayashi, Masahito; Zhu, Huangjun
2018-01-01
To guarantee the security of uniform random numbers generated by a quantum random-number generator, we study secure extraction of uniform random numbers when the environment of a given quantum state is controlled by the third party, the eavesdropper. Here we restrict our operations to incoherent strategies that are composed of the measurement on the computational basis and incoherent operations (or incoherence-preserving operations). We show that the maximum secure extraction rate is equal to the relative entropy of coherence. By contrast, the coherence of formation gives the extraction rate when a certain constraint is imposed on the eavesdropper's operations. The condition under which the two extraction rates coincide is then determined. Furthermore, we find that the exponential decreasing rate of the leaked information is characterized by Rényi relative entropies of coherence. These results clarify the power of incoherent strategies in random-number generation, and can be applied to guarantee the quality of random numbers generated by a quantum random-number generator.
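For reference, the relative entropy of coherence that the authors identify with the maximum secure extraction rate has the standard closed form (this is the usual definition from the resource theory of coherence, not a formula quoted from the paper; here $\Delta$ denotes full dephasing in the computational basis $\{|i\rangle\}$ and $S$ the von Neumann entropy):

\[
C_r(\rho) = S\big(\Delta(\rho)\big) - S(\rho), \qquad \Delta(\rho) = \sum_i \langle i|\rho|i\rangle\, |i\rangle\langle i| .
\]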
Entanglement routers via a wireless quantum network based on arbitrary two qubit systems
NASA Astrophysics Data System (ADS)
Metwally, N.
2014-12-01
A wireless quantum network is generated between multiple hops, where each hop consists of two entangled nodes. These nodes randomly share a finite number of entangled two-qubit systems. Different types of wireless quantum bridges (WQBs) are generated between the non-connected nodes. The efficiency of these WQBs when used as quantum channels between their terminals for quantum teleportation is investigated. We suggest a theoretical wireless quantum communication protocol to teleport unknown quantum signals from one node to another, where the more powerful WQBs are used as quantum channels. It is shown that, by increasing the efficiency of the sources that emit the initial partially entangled states, one can increase the efficiency of the wireless quantum communication protocol.
Certified randomness in quantum physics.
Acín, Antonio; Masanes, Lluis
2016-12-07
The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.
Random deflections of a string on an elastic foundation.
NASA Technical Reports Server (NTRS)
Sanders, J. L., Jr.
1972-01-01
The paper is concerned with the problem of a taut string on a random elastic foundation subjected to random loads. The boundary value problem is transformed into an initial value problem by the method of invariant imbedding. Fokker-Planck equations for the random initial value problem are formulated and solved in some special cases. The analysis leads to a complete characterization of the random deflection function.
Problems with the random number generator RANF implemented on the CDC cyber 205
NASA Astrophysics Data System (ADS)
Kalle, Claus; Wansleben, Stephan
1984-10-01
We show that using RANF may lead to wrong results when lattice models are simulated by Monte Carlo methods. We present a shift-register sequence random number generator which generates two random numbers per cycle on a two pipe CDC Cyber 205.
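A generalized feedback shift-register (GFSR) generator of the kind proposed can be sketched as follows; the R250-style lags (p=250, q=103) are a classic choice and an assumption here, and the two-numbers-per-cycle throughput on the Cyber 205 came from its two vector pipes, which this scalar sketch does not model.

```python
import random

class GFSR:
    """Shift-register sequence generator x[n] = x[n-p] ^ x[n-q] over a
    circular buffer of p words, seeded from a conventional PRNG."""
    def __init__(self, p=250, q=103, bits=32, seed=12345):
        rng = random.Random(seed)
        self.p, self.q = p, q
        self.state = [rng.getrandbits(bits) for _ in range(p)]
        self.i = 0

    def next(self):
        # state[i] currently holds x[n-p]; (i - q) wraps via negative indexing
        v = self.state[self.i] ^ self.state[self.i - self.q]
        self.state[self.i] = v
        self.i = (self.i + 1) % self.p
        return v

g = GFSR()
xs = [g.next() for _ in range(1000)]
print(xs[:4])
```

With a primitive trinomial such as x^250 + x^103 + 1, each bit position follows a maximal-length shift-register sequence, which is what made these generators attractive for vectorized Monte Carlo work.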
Recommendations and illustrations for the evaluation of photonic random number generators
NASA Astrophysics Data System (ADS)
Hart, Joseph D.; Terashima, Yuta; Uchida, Atsushi; Baumgartner, Gerald B.; Murphy, Thomas E.; Roy, Rajarshi
2017-09-01
The never-ending quest to improve the security of digital information combined with recent improvements in hardware technology has caused the field of random number generation to undergo a fundamental shift from relying solely on pseudo-random algorithms to employing optical entropy sources. Despite these significant advances on the hardware side, commonly used statistical measures and evaluation practices remain ill-suited to understand or quantify the optical entropy that underlies physical random number generation. We review the state of the art in the evaluation of optical random number generation and recommend a new paradigm: quantifying entropy generation and understanding the physical limits of the optical sources of randomness. In order to do this, we advocate for the separation of the physical entropy source from deterministic post-processing in the evaluation of random number generators and for the explicit consideration of the impact of the measurement and digitization process on the rate of entropy production. We present the Cohen-Procaccia estimate of the entropy rate h(ε, τ) as one way to do this. In order to provide an illustration of our recommendations, we apply the Cohen-Procaccia estimate as well as the entropy estimates from the new NIST draft standards for physical random number generators to evaluate and compare three common optical entropy sources: single photon time-of-arrival detection, chaotic lasers, and amplified spontaneous emission.
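For readers unfamiliar with the quantity, the (ε, τ)-entropy rate advocated above is commonly written as the large-block limit of coarse-grained pattern entropies (the standard form; the paper's exact notation may differ):

\[
h(\epsilon, \tau) = \lim_{d \to \infty} \frac{H_{d+1}(\epsilon, \tau) - H_d(\epsilon, \tau)}{\tau},
\]

where $H_d(\epsilon, \tau)$ is the Shannon entropy of length-$d$ sequences of measurements taken every $\tau$ at amplitude resolution $\epsilon$; the Cohen-Procaccia procedure estimates the pattern probabilities from recurrence counts around randomly chosen reference points.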
NASA Astrophysics Data System (ADS)
Kandrup, Henry E.
1988-06-01
This paper reexamines the statistical quantum field theory of a free, minimally coupled, real scalar field Φ in a statically bounded, classical Friedmann cosmology, where the time-dependent scale factor Ω(t) tends to constant values Ω1 and Ω2 for t
NASA Technical Reports Server (NTRS)
Ham, Yoo-Geun; Schubert, Siegfried; Chang, Yehui
2012-01-01
An initialization strategy tailored to the prediction of the Madden-Julian oscillation (MJO) is evaluated using the Goddard Earth Observing System Model, version 5 (GEOS-5), coupled general circulation model (CGCM). The approach is based on the empirical singular vectors (ESVs) of a reduced-space, statistically determined linear approximation of the full nonlinear CGCM. The initial ESV, extracted using 10 years (1990-99) of boreal winter hindcast data, has zonal wind anomalies over the western Indian Ocean, while the final ESV (at a forecast lead time of 10 days) reflects a propagation of the zonal wind anomalies eastward over the Maritime Continent, an evolution characteristic of the MJO. A new set of ensemble hindcasts is produced for the boreal winter seasons from 1990 to 1999 in which the leading ESV provides the initial perturbations. The results are compared with those from a set of control hindcasts generated using random perturbations. It is shown that the ESV-based predictions have systematically higher bivariate correlation skill in predicting the MJO than those using random perturbations. Furthermore, the improvement in skill depends on the phase of the MJO. The ESV is particularly effective in increasing the forecast skill during those phases of the MJO in which the control has low skill (with correlations increasing by as much as 0.2 at 20-25-day lead times), as well as during times in which the MJO is weak.
Bauer, Mark S; Miller, Christopher J; Li, Mingfei; Bajor, Laura A; Lee, Austin
2016-09-01
Numerous antimanic treatments have been introduced over the past two decades, particularly second-generation antipsychotics (SGAs). However, it is not clear whether such newer agents provide any advantage over older treatments. A historical cohort design investigated the nationwide population of outpatients with bipolar disorder treated in the Department of Veterans Affairs who were newly initiated on an antimanic agent between 2003 and 2010 (N=27,727). The primary outcome was the likelihood of all-cause hospitalization during the year after initiation, controlling for numerous demographic, clinical, and treatment characteristics. Potential correlates of effect were explored by investigating time to initiation of a second antimanic agent or an antidepressant. After controlling for covariates, those initiated on lithium or valproate monotherapy, compared to those beginning SGA monotherapy, were significantly less likely to be hospitalized, had a longer time to hospitalization, and had fewer hospitalizations in the subsequent year. Those on combination treatment had a significantly higher likelihood of hospitalization, although they also had a longer time to the addition of another antimanic agent or antidepressant. The present analysis of a large, unselected nationwide population provides important complementary data to those from controlled trials. Although various mechanisms may be responsible for the results, the data support the use of lithium or valproate, rather than SGAs, as the initial antimanic treatment in bipolar disorder. A large-scale, prospective, randomized, pragmatic clinical trial comparing the initiation of SGA monotherapy to that of lithium or valproate monotherapy is a logical next step.
Programmable random interval generator
NASA Technical Reports Server (NTRS)
Lindsey, R. S., Jr.
1973-01-01
The random pulse generator supplies constant-amplitude, randomly distributed pulses at average rates ranging from a few counts per second to more than one million counts per second. The generator requires no high-voltage power supply or special thermal cooling apparatus. The device is uniquely versatile and provides a wide dynamic range of operation.
Novel layered clustering-based approach for generating ensemble of classifiers.
Rahman, Ashfaqur; Verma, Brijesh
2011-05-01
This paper introduces a novel concept for creating an ensemble of classifiers, based on clustering data at multiple layers. The ensemble classifier model generates a set of alternative clusterings of a dataset at different layers by randomly initializing the clustering parameters, and trains a set of base classifiers on the patterns in the different clusters of each layer. A test pattern is classified by first finding the appropriate cluster at each layer and then using the corresponding base classifier. The decisions obtained at different layers are fused into a final verdict using majority voting. Because the base classifiers are trained on overlapping patterns at different layers, the proposed approach achieves diversity among the individual classifiers. Identification of difficult-to-classify patterns through clustering, together with the diversity achieved through layering, leads to better classification results, as evidenced by the experimental results.
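The layered scheme can be sketched end to end. This is a heavily simplified stand-in for the paper's pipeline: a minimal k-means provides each randomly initialized clustering layer, and instead of a trained base classifier each cluster simply memorizes its majority training label; all parameter values are illustrative.

```python
import random

def dist(a, b):
    """Squared Euclidean distance."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, rng, iters=10):
    """Minimal k-means; random initial centroids make each call one
    alternative clustering layer."""
    cents = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda c: dist(p, cents[c]))].append(p)
        cents = [tuple(sum(v) / len(g) for v in zip(*g)) if g else cents[j]
                 for j, g in enumerate(groups)]
    return cents

def train_layers(X, y, k=4, layers=5, seed=0):
    """One base 'classifier' per cluster per layer: the cluster's majority
    training label (a stand-in for a real base learner)."""
    rng = random.Random(seed)
    model = []
    for _ in range(layers):
        cents = kmeans(X, k, rng)
        votes = [{} for _ in range(k)]
        for p, label in zip(X, y):
            j = min(range(k), key=lambda c: dist(p, cents[c]))
            votes[j][label] = votes[j].get(label, 0) + 1
        model.append((cents, [max(v, key=v.get) if v else y[0] for v in votes]))
    return model

def predict(model, p):
    """Classify at each layer via its nearest cluster, then fuse the layer
    decisions by majority vote."""
    ballots = {}
    for cents, labels in model:
        j = min(range(len(cents)), key=lambda c: dist(p, cents[c]))
        ballots[labels[j]] = ballots.get(labels[j], 0) + 1
    return max(ballots, key=ballots.get)

# Toy data: two well-separated 2D classes.
rng = random.Random(1)
X = ([(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(100)]
     + [(rng.gauss(5, 1), rng.gauss(5, 1)) for _ in range(100)])
y = [0] * 100 + [1] * 100
model = train_layers(X, y)
acc = sum(predict(model, p) == t for p, t in zip(X, y)) / len(X)
print(acc)
```

The diversity the paper relies on comes from the random initializations: each layer partitions the data differently, so the fused vote is more robust than any single layer.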
Experimentally generated randomness certified by the impossibility of superluminal signals.
Bierhorst, Peter; Knill, Emanuel; Glancy, Scott; Zhang, Yanbao; Mink, Alan; Jordan, Stephen; Rommal, Andrea; Liu, Yi-Kai; Christensen, Bradley; Nam, Sae Woo; Stevens, Martin J; Shalm, Lynden K
2018-04-01
From dice to modern electronic circuits, there have been many attempts to build better devices to generate random numbers. Randomness is fundamental to security and cryptographic systems and to safeguarding privacy. A key challenge with random-number generators is that it is hard to ensure that their outputs are unpredictable [1-3]. For a random-number generator based on a physical process, such as a noisy classical system or an elementary quantum measurement, a detailed model that describes the underlying physics is necessary to assert unpredictability. Imperfections in the model compromise the integrity of the device. However, it is possible to exploit the phenomenon of quantum non-locality with a loophole-free Bell test to build a random-number generator that can produce output that is unpredictable to any adversary that is limited only by general physical principles, such as special relativity [1-11]. With recent technological developments, it is now possible to carry out such a loophole-free Bell test [12-14,22]. Here we present certified randomness obtained from a photonic Bell experiment and extract 1,024 random bits that are uniformly distributed to within 10^-12. These random bits could not have been predicted according to any physical theory that prohibits faster-than-light (superluminal) signalling and that allows independent measurement choices. To certify and quantify the randomness, we describe a protocol that is optimized for devices that are characterized by a low per-trial violation of Bell inequalities. Future random-number generators based on loophole-free Bell tests may have a role in increasing the security and trust of our cryptographic systems and infrastructure.
Pseudo-random number generator for the Sigma 5 computer
NASA Technical Reports Server (NTRS)
Carroll, S. N.
1983-01-01
A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
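The linear congruential form described here can be sketched directly. The constants below are the well-known "minimal standard" pair (Mersenne prime m = 2^31 - 1 with primitive root a = 16807), chosen because they fit the paper's recipe for a 32-bit word; the constants actually selected for the Sigma 5 are not given in the abstract.

```python
class LehmerRNG:
    """Multiplicative linear congruential generator x' = (a * x) mod m,
    with m prime and a a primitive root mod m, so the sequence visits
    every value in 1..m-1 before repeating."""
    M = 2**31 - 1      # largest prime representable in a signed 32-bit word
    A = 16807          # a primitive root mod M

    def __init__(self, seed=1):
        if not 0 < seed < self.M:
            raise ValueError("seed must lie strictly between 0 and m")
        self.x = seed

    def next(self):
        self.x = (self.A * self.x) % self.M
        return self.x

    def uniform(self):
        """Next draw scaled into (0, 1)."""
        return self.next() / self.M

g = LehmerRNG(seed=1)
for _ in range(10000):
    v = g.next()
print(v)
```

The standard self-check for this parameter pair is that 10,000 steps from seed 1 reach a specific known value, which makes ports to new machines easy to validate.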
Katriel, G.; Yaari, R.; Huppert, A.; Roll, U.; Stone, L.
2011-01-01
This paper presents new computational and modelling tools for studying the dynamics of an epidemic in its initial stages that use both available incidence time series and data describing the population's infection network structure. The work is motivated by data collected at the beginning of the H1N1 pandemic outbreak in Israel in the summer of 2009. We formulated a new discrete-time stochastic epidemic SIR (susceptible-infected-recovered) model that explicitly takes into account the disease's specific generation-time distribution and the intrinsic demographic stochasticity inherent to the infection process. Moreover, in contrast with many other modelling approaches, the model allows direct analytical derivation of estimates for the effective reproductive number (Re) and of their credible intervals, by maximum likelihood and Bayesian methods. The basic model can be extended to include age-class structure, and a maximum likelihood methodology allows us to estimate the model's next-generation matrix by combining two types of data: (i) the incidence series of each age group, and (ii) infection network data that provide partial information of 'who-infected-who'. Unlike other approaches for estimating the next-generation matrix, the method developed here does not require making a priori assumptions about the structure of the next-generation matrix. We show, using a simulation study, that even a relatively small amount of information about the infection network greatly improves the accuracy of estimation of the next-generation matrix. The method is applied in practice to estimate the next-generation matrix from the Israeli H1N1 pandemic data. The tools developed here should be of practical importance for future investigations of epidemics during their initial stages. However, they require the availability of data which represent a random sample of the real epidemic process.
We discuss the conditions under which reporting rates may or may not influence our estimated quantities and the effects of bias. PMID:21247949
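The core of the model above is a renewal equation with demographic stochasticity: new cases on day t are Poisson-distributed with a mean set by past incidence weighted by the generation-time distribution. A minimal sketch (the parameter values and function name are illustrative, not the paper's fitted model):

```python
import numpy as np

def simulate_outbreak(R0, gen_dist, N, I0, T, rng):
    """Discrete-time stochastic SIR-type renewal simulation.

    New cases on day t are drawn as Poisson with mean
        R0 * (S_t / N) * sum_k gen_dist[k-1] * incidence[t-k],
    where gen_dist is the generation-time distribution (sums to 1);
    the Poisson draw supplies the demographic stochasticity.
    """
    incidence = [I0]
    S = N - I0
    for t in range(1, T):
        force = sum(w * incidence[t - k]
                    for k, w in enumerate(gen_dist, start=1) if t - k >= 0)
        new = min(int(rng.poisson(R0 * (S / N) * force)), S)
        incidence.append(new)
        S -= new
    return incidence

inc = simulate_outbreak(R0=1.5, gen_dist=[0.2, 0.5, 0.3], N=10_000,
                        I0=5, T=60, rng=np.random.default_rng(7))
```

Maximum-likelihood estimation of Re then amounts to maximizing the product of these Poisson likelihoods over the observed incidence series.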
Time Evolving Fission Chain Theory and Fast Neutron and Gamma-Ray Counting Distributions
Kim, K. S.; Nakae, L. F.; Prasad, M. K.; ...
2015-11-01
Here, we solve a simple theoretical model of time evolving fission chains due to Feynman that generalizes and asymptotically approaches the point model theory. The point model theory has been used to analyze thermal neutron counting data. This extension of the theory underlies fast counting data for both neutrons and gamma rays from metal systems. Fast neutron and gamma-ray counting is now possible using liquid scintillator arrays with nanosecond time resolution. For individual fission chains, the differential equations describing three correlated probability distributions are solved: the time-dependent internal neutron population, accumulation of fissions in time, and accumulation of leaked neutrons in time. Explicit analytic formulas are given for correlated moments of the time evolving chain populations. The equations for random time gate fast neutron and gamma-ray counting distributions, due to randomly initiated chains, are presented. Correlated moment equations are given for both random time gate and triggered time gate counting. Explicit formulas for all correlated moments are given up to triple order, for all combinations of correlated fast neutrons and gamma rays. The nonlinear differential equations for probabilities for time dependent fission chain populations have a remarkably simple Monte Carlo realization. A Monte Carlo code was developed for this theory and is shown to statistically realize the solutions to the fission chain theory probability distributions. Combined with random initiation of chains and detection of external quanta, the Monte Carlo code generates time tagged data for neutron and gamma-ray counting and, from these data, the counting distributions.
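The "remarkably simple Monte Carlo realization" mentioned above can be illustrated with a toy branching-process version of a single fission chain. This sketch omits the paper's time dependence and gamma rays; the probabilities and multiplicity distribution are made-up subcritical values chosen only so that chains terminate.

```python
import random

def fission_chain(p_fission, p_leak, nu_dist, rng):
    """Monte Carlo realization of one fission chain started by a
    single neutron. Each neutron leaks with probability p_leak,
    induces a fission with probability p_fission (releasing nu new
    neutrons drawn from nu_dist, indexed by multiplicity), and is
    otherwise captured. Returns (fissions, leaked neutrons)."""
    neutrons, fissions, leaked = 1, 0, 0
    while neutrons:
        neutrons -= 1
        u = rng.random()
        if u < p_leak:
            leaked += 1
        elif u < p_leak + p_fission:
            fissions += 1
            neutrons += rng.choices(range(len(nu_dist)), weights=nu_dist)[0]
    return fissions, leaked

rng = random.Random(3)
# Subcritical: p_fission * mean(nu) = 0.2 * 1.7 < 1, so chains die out.
samples = [fission_chain(0.2, 0.5, [0.1, 0.3, 0.4, 0.2], rng)
           for _ in range(10_000)]
```

Histogramming `samples` over many randomly initiated chains gives the counting distributions whose moments the analytic theory predicts.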
NASA Astrophysics Data System (ADS)
Das, Suman; Sadique Uz Zaman, J. K. M.; Ghosh, Ranjan
2016-06-01
In the Advanced Encryption Standard (AES), the standard S-Box is conventionally generated by using a particular irreducible polynomial {11B} in GF(2^8) as the modulus and a particular additive constant polynomial {63} in GF(2), though it can be generated by many other polynomials. In this paper, it has been shown that it is possible to generate secure AES S-Boxes by using other selected modulus and additive polynomials, and that S-Boxes can also be generated randomly, using a PRNG such as BBS. A comparative study has been made of the randomness of the corresponding AES ciphertexts generated using these S-Boxes, by means of the NIST Test Suite coded for this paper. It has been found that, besides the standard one, other moduli and additive constants are also able to generate equally or more random ciphertexts; the same is true for random S-Boxes. As these new types of S-Boxes are user-defined, hence unknown, they are able to prevent linear and differential cryptanalysis. Moreover, they act as additional key inputs to AES, thus increasing the key space.
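The standard S-box construction referred to above is the multiplicative inverse in GF(2^8) followed by a fixed affine transformation. A minimal sketch with the conventional modulus {11B} and constant {63} as defaults; passing a different irreducible modulus or constant yields the alternative S-boxes the paper studies.

```python
def gf_mul(a, b, modulus=0x11B):
    """Multiply two bytes as polynomials over GF(2), reduced mod `modulus`."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= modulus
        b >>= 1
    return r

def gf_inv(x, modulus=0x11B):
    """Inverse in GF(2^8)* via x^254 (group order is 255); 0 maps to 0."""
    r, a, e = 1, x, 254
    while e:
        if e & 1:
            r = gf_mul(r, a, modulus)
        a = gf_mul(a, a, modulus)
        e >>= 1
    return r if x else 0

def make_sbox(modulus=0x11B, const=0x63):
    """AES-style S-box: field inverse, then the standard affine map."""
    sbox = []
    for x in range(256):
        v, b = gf_inv(x, modulus), 0
        for i in range(8):
            bit = ((v >> i) ^ (v >> ((i + 4) % 8)) ^ (v >> ((i + 5) % 8))
                   ^ (v >> ((i + 6) % 8)) ^ (v >> ((i + 7) % 8))
                   ^ (const >> i)) & 1
            b |= bit << i
        sbox.append(b)
    return sbox

sbox = make_sbox()
```

With the default parameters this reproduces the standard AES S-box (e.g. S(0x00) = 0x63); any modulus must be irreducible over GF(2) for the inverse to exist.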
A package of Linux scripts for the parallelization of Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Badal, Andreu; Sempau, Josep
2006-09-01
Despite the fact that fast computers are nowadays available at low cost, there are many situations where obtaining a reasonably low statistical uncertainty in a Monte Carlo (MC) simulation involves a prohibitively large amount of time. This limitation can be overcome by having recourse to parallel computing. Most tools designed to facilitate this approach require modification of the source code and the installation of additional software, which may be inconvenient for some users. We present a set of tools, named clonEasy, that implement a parallelization scheme of a MC simulation that is free from these drawbacks. In clonEasy, which is designed to run under Linux, a set of "clone" CPUs is governed by a "master" computer by taking advantage of the capabilities of the Secure Shell (ssh) protocol. Any Linux computer on the Internet that can be ssh-accessed by the user can be used as a clone. A key ingredient for the parallel calculation to be reliable is the availability of an independent string of random numbers for each CPU. Many generators—such as RANLUX, RANECU or the Mersenne Twister—can readily produce these strings by initializing them appropriately and, hence, they are suitable to be used with clonEasy. This work was primarily motivated by the need to find a straightforward way to parallelize PENELOPE, a code for MC simulation of radiation transport that (in its current 2005 version) employs the generator RANECU, which uses a combination of two multiplicative linear congruential generators (MLCGs). Thus, this paper is focused on this class of generators and, in particular, we briefly present an extension of RANECU that increases its period up to ˜5×10 and we introduce seedsMLCG, a tool that provides the information necessary to initialize disjoint sequences of an MLCG to feed different CPUs. 
This program, in combination with clonEasy, allows one to run PENELOPE in parallel easily, without requiring specific libraries or significant alterations of the sequential code.
Program summary 1
Title of program: clonEasy
Catalogue identifier: ADYD_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADYD_v1_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, Northern Ireland
Computer for which the program is designed and others in which it is operable: Any computer with a Unix-style shell (bash), support for the Secure Shell protocol and a FORTRAN compiler
Operating systems under which the program has been tested: Linux (RedHat 8.0, SuSe 8.1, Debian Woody 3.1)
Compilers: GNU FORTRAN g77 (Linux); g95 (Linux); Intel Fortran Compiler 7.1 (Linux)
Programming language used: Linux shell (bash) script, FORTRAN 77
No. of bits in a word: 32
No. of lines in distributed program, including test data, etc.: 1916
No. of bytes in distributed program, including test data, etc.: 18 202
Distribution format: tar.gz
Nature of the physical problem: There are many situations where a Monte Carlo simulation involves a huge amount of CPU time. The parallelization of such calculations is a simple way of obtaining a relatively low statistical uncertainty using a reasonable amount of time.
Method of solution: The presented collection of Linux scripts and auxiliary FORTRAN programs implement Secure Shell-based communication between a "master" computer and a set of "clones". The aim of this communication is to execute a code that performs a Monte Carlo simulation on all the clones simultaneously. The code is unique, but each clone is fed with a different set of random seeds. Hence, clonEasy effectively permits the parallelization of the calculation.
Restrictions on the complexity of the program: clonEasy can only be used with programs that produce statistically independent results using the same code, but with a different sequence of random numbers.
Users must choose the initialization values for the random number generator on each computer and combine the output from the different executions. A FORTRAN program to combine the final results is also provided.
Typical running time: The execution time of each script largely depends on the number of computers that are used, the actions that are to be performed and, to a lesser extent, on the network connection bandwidth.
Unusual features of the program: Any computer on the Internet with a Secure Shell client/server program installed can be used as a node of a virtual computer cluster for parallel calculations with the sequential source code. The simplicity of the parallelization scheme makes the use of this package a straightforward task, which does not require installing any additional libraries.
Program summary 2
Title of program: seedsMLCG
Catalogue identifier: ADYE_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADYE_v1_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, Northern Ireland
Computer for which the program is designed and others in which it is operable: Any computer with a FORTRAN compiler
Operating systems under which the program has been tested: Linux (RedHat 8.0, SuSe 8.1, Debian Woody 3.1), MS Windows (2000, XP)
Compilers: GNU FORTRAN g77 (Linux and Windows); g95 (Linux); Intel Fortran Compiler 7.1 (Linux); Compaq Visual Fortran 6.1 (Windows)
Programming language used: FORTRAN 77
No. of bits in a word: 32
Memory required to execute with typical data: 500 kilobytes
No. of lines in distributed program, including test data, etc.: 492
No. of bytes in distributed program, including test data, etc.: 5582
Distribution format: tar.gz
Nature of the physical problem: Statistically independent results from different runs of a Monte Carlo code can be obtained using uncorrelated sequences of random numbers on each execution.
Multiplicative linear congruential generators (MLCG), or other generators that are based on them such as RANECU, can be adapted to produce these sequences.
Method of solution: For a given MLCG, the presented program calculates initialization values that produce disjoint, consecutive sequences of pseudo-random numbers. The calculated values initiate the generator in distant positions of the random number cycle and can be used, for instance, in a parallel simulation. The values are found using the formula S_J = (a^J S) MOD m, which gives the random value that will be generated after J iterations of the MLCG.
Restrictions on the complexity of the program: The 32-bit length restriction for the integer variables in standard FORTRAN 77 limits the produced seeds to be separated by a distance smaller than 2^31 when the distance J is expressed as an integer value. The program allows the user to input the distance as a power of 10, for the purpose of efficiently splitting the sequence of generators with a very long period.
Typical running time: The execution time depends on the parameters of the used MLCG and the distance between the generated seeds. The generation of 10^6 seeds separated by 10^12 units in the sequential cycle, for one of the MLCGs found in the RANECU generator, takes 3 s on a 2.4 GHz Intel Pentium 4 using the g77 compiler.
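The skip-ahead idea behind seedsMLCG, jumping J steps at once via S_J = (a^J S) mod m instead of iterating, can be sketched in a few lines. The default constants below are the first of the two MLCGs combined in RANECU (L'Ecuyer's a = 40014, m = 2147483563); the function names are illustrative.

```python
def mlcg_next(s, a=40014, m=2147483563):
    """One step of a multiplicative LCG: s' = (a * s) mod m."""
    return (a * s) % m

def disjoint_seeds(s0, jump, count, a=40014, m=2147483563):
    """Seeds spaced `jump` steps apart along the MLCG cycle, computed
    directly from S_J = (a^J * S) mod m via modular exponentiation,
    so generating each seed costs O(log jump) instead of O(jump)."""
    aJ = pow(a, jump, m)          # a^J mod m, computed once
    seeds, s = [s0], s0
    for _ in range(count - 1):
        s = (aJ * s) % m
        seeds.append(s)
    return seeds

seeds = disjoint_seeds(12345, jump=10**6, count=8)
```

Each clone in a clonEasy-style run would then be initialized with one entry of `seeds`, guaranteeing disjoint subsequences as long as each clone consumes fewer than `jump` numbers.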
Pawlowski, Marcin Piotr; Jara, Antonio; Ogorzalek, Maciej
2015-01-01
Entropy in computer security is associated with the unpredictability of a source of randomness. The random source with high entropy tends to achieve a uniform distribution of random values. Random number generators are one of the most important building blocks of cryptosystems. In constrained devices of the Internet of Things ecosystem, high entropy random number generators are hard to achieve due to hardware limitations. For the purpose of the random number generation in constrained devices, this work proposes a solution based on the least-significant bits concatenation entropy harvesting method. As a potential source of entropy, on-board integrated sensors (i.e., temperature, humidity and two different light sensors) have been analyzed. Additionally, the costs (i.e., time and memory consumption) of the presented approach have been measured. The results obtained from the proposed method with statistical fine tuning achieved a Shannon entropy of around 7.9 bits per byte of data for temperature and humidity sensors. The results showed that sensor-based random number generators are a valuable source of entropy with very small RAM and Flash memory requirements for constrained devices of the Internet of Things. PMID:26506357
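The least-significant-bits concatenation method described above can be sketched compactly: keep only the noisiest low-order bits of each raw sensor/ADC reading and pack them into bytes. The function name and the choice of two bits per sample are illustrative assumptions, not the paper's exact configuration.

```python
def harvest_lsb(readings, bits_per_sample=2):
    """Concatenate the `bits_per_sample` least-significant bits of each
    raw integer sensor reading into a byte stream. The low-order bits
    are kept because they carry most of the measurement noise."""
    acc = bitcount = 0
    out = bytearray()
    mask = (1 << bits_per_sample) - 1
    for r in readings:
        acc = (acc << bits_per_sample) | (r & mask)
        bitcount += bits_per_sample
        while bitcount >= 8:
            bitcount -= 8
            out.append((acc >> bitcount) & 0xFF)
    return bytes(out)

# Hypothetical 10-bit ADC readings from a temperature sensor:
stream = harvest_lsb([307, 1022, 513, 768, 300, 1001, 514, 771], 2)
```

On a constrained device the statistical fine-tuning step (e.g. von Neumann debiasing or hashing) would follow before the bytes are used as key material.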
Truly random number generation: an example
NASA Astrophysics Data System (ADS)
Frauchiger, Daniela; Renner, Renato
2013-10-01
Randomness is crucial for a variety of applications, ranging from gambling to computer simulations, and from cryptography to statistics. However, many of the currently used methods for generating randomness do not meet the criteria that are necessary for these applications to work properly and safely. A common problem is that a sequence of numbers may look random but nevertheless not be truly random. In fact, the sequence may pass all standard statistical tests and yet be perfectly predictable. This renders it useless for many applications. For example, in cryptography, the predictability of a "randomly" chosen password is obviously undesirable. Here, we review a recently developed approach to generating true, and hence unpredictable, randomness.
Almirall, Daniel; DiStefano, Charlotte; Chang, Ya-Chih; Shire, Stephanie; Kaiser, Ann; Lu, Xi; Nahum-Shani, Inbal; Landa, Rebecca; Mathy, Pamela; Kasari, Connie
2016-01-01
Objective: There are limited data on the effects of adaptive social communication interventions with a speech-generating device in autism. This study is the first to compare growth in communication outcomes among three adaptive interventions in school-aged children with autism spectrum disorder (ASD) who are minimally verbal. Methods: Sixty-one children, aged 5-8 years, participated in a sequential, multiple-assignment randomized trial (SMART). All children received a developmental communication intervention: joint attention, symbolic play, engagement and regulation (JASP) with enhanced milieu teaching (EMT). The SMART included three two-stage, 24-week adaptive interventions with different provisions of a speech-generating device (SGD) in the context of JASP+EMT. The first adaptive intervention, with no SGD, initially assigned JASP+EMT alone, then intensified JASP+EMT for slow responders. In the second adaptive intervention, slow responders to JASP+EMT were assigned JASP+EMT+SGD. The third adaptive intervention initially assigned JASP+EMT+SGD, then intensified JASP+EMT+SGD for slow responders. Analyses examined between-group differences in change in outcomes from baseline to week 36. Verbal outcomes included spontaneous communicative utterances and novel words. Non-linguistic communication outcomes included initiating joint attention and behavior regulation, and play. Results: The adaptive intervention beginning with JASP+EMT+SGD was estimated as superior. There were significant (P<0.05) between-group differences in change in spontaneous communicative utterances and initiating joint attention. Conclusions: School-aged children with ASD who are minimally verbal make significant gains in communication outcomes with an adaptive intervention beginning with JASP+EMT+SGD. Future research should explore mediators and moderators of the adaptive intervention effects and second-stage intervention options that further capitalize on early gains in treatment. PMID:26954267
DNA-based random number generation in security circuitry.
Gearheart, Christy M; Arazi, Benjamin; Rouchka, Eric C
2010-06-01
DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. This research focuses on further developing DNA-based methodologies to mimic digital data manipulation. While exhibiting fundamental principles, this work was done in conjunction with the vision that DNA-based circuitry, when the technology matures, will form the basis for a tamper-proof security module, revolutionizing the meaning and concept of tamper-proofing and possibly preventing it altogether based on accurate scientific observations. A paramount part of such a solution would be self-generation of random numbers. A novel prototype schema employs solid phase synthesis of oligonucleotides for random construction of DNA sequences; temporary storage and retrieval is achieved through plasmid vectors. A discussion of how to evaluate sequence randomness is included, as well as how these techniques are applied to a simulation of the random number generation circuitry. Simulation results show generated sequences successfully pass three selected NIST random number generation tests specified for security applications.
NASA Astrophysics Data System (ADS)
Matsumoto, Kouhei; Kasuya, Yuki; Yumoto, Mitsuki; Arai, Hideaki; Sato, Takashi; Sakamoto, Shuichi; Ohkawa, Masashi; Ohdaira, Yasuo
2018-02-01
Not so long ago, pseudo-random numbers generated by numerical formulae were considered adequate for encrypting important data files, because of the time needed to decode them. With today's ultra-high-speed processors, however, this is no longer true. So, in order to thwart ever-more advanced attempts to breach a system's protections, cryptologists have devised a method that is considered virtually impossible to decode and uses what is a limitless supply of physical random numbers. This research describes a method whereby a laser diode's frequency noise generates large quantities of physical random numbers. Using two types of photodetectors (APD and PIN-PD), we tested the abilities of two types of lasers (FP-LD and VCSEL) to generate random numbers. In all instances, an etalon served as the frequency discriminator, the pass rates were determined using the NIST FIPS 140-2 test at each bit, and the random number generation (RNG) speed was noted.
Generation of pseudo-random numbers
NASA Technical Reports Server (NTRS)
Howell, L. W.; Rheinfurth, M. H.
1982-01-01
Practical methods for generating acceptable random numbers from a variety of probability distributions which are frequently encountered in engineering applications are described. The speed, accuracy, and guarantee of statistical randomness of the various methods are discussed.
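One of the standard practical methods alluded to above is inverse-transform sampling: apply the inverse cumulative distribution function to a uniform variate. A minimal sketch for the exponential distribution (the function name is illustrative):

```python
import math
import random

def sample_exponential(lam, rng):
    """Inverse-transform sampling: if U ~ Uniform(0, 1), then
    -ln(1 - U) / lam follows an Exponential(lam) distribution,
    because the exponential CDF F(x) = 1 - exp(-lam * x) inverts to
    F^{-1}(u) = -ln(1 - u) / lam."""
    u = rng.random()
    return -math.log(1.0 - u) / lam

rng = random.Random(42)
xs = [sample_exponential(2.0, rng) for _ in range(100_000)]
mean = sum(xs) / len(xs)   # should be close to 1 / lam = 0.5
```

Distributions without a closed-form inverse CDF (e.g. the normal) are usually handled instead by transformation or rejection methods.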
Ultra-fast quantum randomness generation by accelerated phase diffusion in a pulsed laser diode.
Abellán, C; Amaya, W; Jofre, M; Curty, M; Acín, A; Capmany, J; Pruneri, V; Mitchell, M W
2014-01-27
We demonstrate a high bit-rate quantum random number generator by interferometric detection of phase diffusion in a gain-switched DFB laser diode. Gain switching at few-GHz frequencies produces a train of bright pulses with nearly equal amplitudes and random phases. An unbalanced Mach-Zehnder interferometer is used to interfere subsequent pulses and thereby generate strong random-amplitude pulses, which are detected and digitized to produce a high-rate random bit string. Using established models of semiconductor laser field dynamics, we predict a regime of high visibility interference and nearly complete vacuum-fluctuation-induced phase diffusion between pulses. These are confirmed by measurement of pulse power statistics at the output of the interferometer. Using a 5.825 GHz excitation rate and 14-bit digitization, we observe 43 Gbps quantum randomness generation.
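The principle above, interfering each pulse with its predecessor so that a uniformly random phase difference becomes a random intensity, can be caricatured in a few lines. This is a toy statistical model only (unit-amplitude pulses, ideal interference, median thresholding), not the laser physics or digitization scheme of the experiment.

```python
import math
import random

def phase_diffusion_bits(n, rng):
    """Toy model of interferometric phase-diffusion RNG: pulse k
    carries a uniformly random phase phi_k; an unbalanced
    interferometer overlaps pulse k with pulse k-1, giving intensity
    I_k = 1 + cos(phi_k - phi_{k-1}). Thresholding at the median
    (I = 1) yields one raw bit per pulse pair."""
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n + 1)]
    bits = []
    for k in range(1, n + 1):
        intensity = 1.0 + math.cos(phases[k] - phases[k - 1])
        bits.append(1 if intensity > 1.0 else 0)
    return bits

bits = phase_diffusion_bits(10_000, random.Random(5))
frac_ones = sum(bits) / len(bits)
```

Because adjacent bits share a phase, a real device follows this raw stage with randomness extraction before the bits are used.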
Self-balanced real-time photonic scheme for ultrafast random number generation
NASA Astrophysics Data System (ADS)
Li, Pu; Guo, Ya; Guo, Yanqiang; Fan, Yuanlong; Guo, Xiaomin; Liu, Xianglian; Shore, K. Alan; Dubrova, Elena; Xu, Bingjie; Wang, Yuncai; Wang, Anbang
2018-06-01
We propose a real-time self-balanced photonic method for extracting ultrafast random numbers from broadband randomness sources. In place of electronic analog-to-digital converters (ADCs), the balanced photo-detection technology is used to directly quantize optically sampled chaotic pulses into a continuous random number stream. Benefitting from ultrafast photo-detection, our method can efficiently eliminate the generation rate bottleneck from electronic ADCs which are required in nearly all the available fast physical random number generators. A proof-of-principle experiment demonstrates that using our approach 10 Gb/s real-time and statistically unbiased random numbers are successfully extracted from a bandwidth-enhanced chaotic source. The generation rate achieved experimentally here is being limited by the bandwidth of the chaotic source. The method described has the potential to attain a real-time rate of 100 Gb/s.
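The balanced-detection quantization described above amounts to subtracting two detector arms and keeping only the sign, which cancels common-mode intensity. A toy numerical sketch (the drift model and sample counts are invented for illustration):

```python
import random

def balanced_bits(arm_a, arm_b):
    """Mimic balanced photo-detection: subtract the two detector arms
    and keep only the sign of the difference. Power fluctuations that
    are common to both arms cancel, so the bit stream stays unbiased
    even when the source intensity drifts."""
    return [1 if a > b else 0 for a, b in zip(arm_a, arm_b)]

rng = random.Random(11)
# Slow common-mode drift multiplying both arms equally:
drift = [1.0 + 0.5 * (t / 5000) for t in range(5000)]
arm_a = [d * rng.random() for d in drift]
arm_b = [d * rng.random() for d in drift]
bits = balanced_bits(arm_a, arm_b)
frac = sum(bits) / len(bits)
```

In the actual scheme this comparison happens optically at GHz rates, which is what removes the electronic-ADC bottleneck.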
Saltaji, Humam; Armijo-Olivo, Susan; Cummings, Greta G.; Amin, Maryam; Flores-Mir, Carlos
2017-01-01
Objectives To examine the risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions and the development of these aspects over time. Methods We included 540 randomized clinical trials from 64 selected systematic reviews. We extracted, in duplicate, details from each of the selected randomized clinical trials with respect to publication and trial characteristics, reporting and methodologic characteristics, and Cochrane risk of bias domains. We analyzed data using logistic regression and Chi-square statistics. Results Sequence generation was assessed to be inadequate (at unclear or high risk of bias) in 68% (n = 367) of the trials, while allocation concealment was inadequate in the majority of trials (n = 464; 85.9%). Blinding of participants and blinding of the outcome assessment were judged to be inadequate in 28.5% (n = 154) and 40.5% (n = 219) of the trials, respectively. A sample size calculation before the initiation of the study was not performed/reported in 79.1% (n = 427) of the trials, while the sample size was assessed as adequate in only 17.6% (n = 95) of the trials. Two thirds of the trials were not described as double blinded (n = 358; 66.3%), while the method of blinding was appropriate in 53% (n = 286) of the trials. We identified a significant decrease over time (1955–2013) in the proportion of trials assessed as having inadequately addressed methodological quality items (P < 0.05) in 30 out of the 40 quality criteria, or as being inadequate (at high or unclear risk of bias) in five domains of the Cochrane risk of bias tool: sequence generation, allocation concealment, incomplete outcome data, other sources of bias, and overall risk of bias. 
Conclusions The risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions have improved over time; however, further efforts that contribute to the development of more stringent methodology and detailed reporting of trials are still needed. PMID:29272315
Autocorrelation peaks in congruential pseudorandom number generators
NASA Technical Reports Server (NTRS)
Neuman, F.; Merrick, R. B.
1976-01-01
The complete correlation structure of several congruential pseudorandom number generators (PRNGs) of the same type and small cycle length was studied to deal with the problem of congruential PRNGs almost repeating themselves at intervals smaller than their cycle lengths during simulation of bandpass-filtered normal random noise. Maximum-period multiplicative and mixed congruential generators were studied, with inferences drawn from examination of several tractable members of a class of random number generators with moduli from 2^5 to 2^9. High correlation is shown to exist in mixed and multiplicative congruential random number generators and prime-moduli Lehmer generators for shifts that are a fraction of their cycle length. The random noise sequences in question are required when simulating electrical noise, air turbulence, or time variation of wind parameters.
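The correlation structure studied above can be probed directly: generate a full cycle of a small congruential generator and compute its normalized autocorrelation at each shift. The modulus 2^9 matches the range studied; the particular multiplier and increment below are illustrative choices for a full-period mixed generator.

```python
def lcg_sequence(a, c, m, seed, n):
    """Mixed congruential sequence x_{k+1} = (a*x_k + c) mod m,
    normalized to [0, 1)."""
    xs, x = [], seed
    for _ in range(n):
        x = (a * x + c) % m
        xs.append(x / m)
    return xs

def autocorr(xs, lag):
    """Normalized autocovariance of xs at the given lag."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((v - mu) ** 2 for v in xs) / n
    cov = sum((xs[i] - mu) * (xs[i + lag] - mu)
              for i in range(n - lag)) / (n - lag)
    return cov / var

# a = 5 (a ≡ 1 mod 4) and odd c give full period m for a mixed LCG.
xs = lcg_sequence(a=5, c=1, m=2**9, seed=1, n=512)
corrs = [autocorr(xs, lag) for lag in range(1, 256)]
```

Plotting `corrs` against the lag reveals the autocorrelation peaks at shifts well below the cycle length that the paper warns about.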
Transport of secondary electrons and reactive species in ion tracks
NASA Astrophysics Data System (ADS)
Surdutovich, Eugene; Solov'yov, Andrey V.
2015-08-01
The transport of reactive species brought about by ions traversing a tissue-like medium is analysed analytically. Secondary electrons ejected by ions are capable of ionizing other molecules; the transport of these generations of electrons is studied using the random walk approximation for as long as these electrons remain ballistic. Then, the distribution of solvated electrons produced as a result of the interaction of low-energy electrons with water molecules is obtained. The radial distribution of energy loss by ions and secondary electrons to the medium yields the initial radial dose distribution, which can be used as the initial condition for the predicted shock waves. The formation, diffusion, and chemical evolution of hydroxyl radicals in liquid water are studied as well. COST Action Nano-IBCT: Nano-scale Processes Behind Ion-Beam Cancer Therapy.
Rosenheck, Robert A; Leslie, Douglas L; Sindelar, Jody L; Miller, Edward A; Tariot, Peter N; Dagerman, Karen S; Davis, Sonia M; Lebowitz, Barry D; Rabins, Peter; Hsiao, John K; Lieberman, Jeffery A; Schneider, Lon S
2007-11-01
Second-generation antipsychotics (SGAs) are prescribed for psychosis, aggression, and agitation in Alzheimer disease (AD). To conduct a cost-benefit analysis of SGAs and placebo (taken to represent a "watchful waiting" treatment strategy) for psychosis and aggression in outpatients with AD. Randomized placebo-controlled trial of alternative SGA initiation strategies. Forty-two outpatient clinics. Outpatients with AD and psychosis, aggression, or agitation (N = 421). Intervention: Participants were randomly assigned to treatment with olanzapine, quetiapine fumarate, risperidone, or placebo with the option of double-blind rerandomization to another antipsychotic or citalopram hydrobromide or open treatment over 9 months. Monthly interviews documented health service use and costs. The economic perspective addressed total health care and medication costs. Costs of study drugs were estimated from wholesale prices with adjustment for discounts and rebates. Quality-adjusted life-years (QALYs) were assessed with the Health Utilities Index Mark 3 and were supplemented with measures of functioning, activities of daily living, and quality of life. Primary analyses were conducted using all available data. Secondary analyses excluded observations after the first medication change (ie, phase 1 only). Cost-benefit analysis was conducted using the net health benefits approach in a sensitivity analysis in which QALYs were valued at $50,000 per year and $100,000 per year. Average total health costs, including medications, were significantly lower for placebo than for SGAs, by $50 to $100 per month. There were no differences between treatments in QALYs or other measures of function. Phase 1-only analyses were broadly similar. Net-benefit analysis showed greater net health benefits for placebo as compared with other treatments, with probabilities ranging from 50% to 90%.
There were no differences in measures of effectiveness between initiation of active treatments or placebo (which represented watchful waiting) but the placebo group had significantly lower health care costs. clinicaltrials.gov Identifier: NCT00015548.
Efficient convex-elastic net algorithm to solve the Euclidean traveling salesman problem.
Al-Mulhem, M; Al-Maghrabi, T
1998-01-01
This paper describes a hybrid algorithm that combines an adaptive-type neural network algorithm and a nondeterministic iterative algorithm to solve the Euclidean traveling salesman problem (E-TSP). It begins with a brief introduction to the TSP and the E-TSP. Then, it presents the proposed algorithm with its two major components: the convex-elastic net (CEN) algorithm and the nondeterministic iterative improvement (NII) algorithm. These two algorithms are combined into the efficient convex-elastic net (ECEN) algorithm. The CEN algorithm integrates the convex-hull property and the elastic net algorithm to generate an initial tour for the E-TSP. The NII algorithm uses two rearrangement operators to improve the initial tour given by the CEN algorithm. The paper presents simulation results for two instances of the E-TSP: randomly generated tours and tours for well-known problems in the literature. Experimental results show that the proposed algorithm can find near-optimal solutions for the E-TSP and outperforms many similar algorithms reported in the literature. The paper concludes with the advantages of the new algorithm and possible extensions.
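The NII stage improves an initial tour with rearrangement operators. As an illustrative stand-in (the paper's exact operators are not specified here), a minimal 2-opt improvement loop, which repeatedly reverses tour segments while that shortens the route, can be sketched as:

```python
import math
import random

def tour_length(points, tour):
    # Total Euclidean length of the closed tour.
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(points, tour):
    # Reverse segments as long as doing so shortens the tour (2-opt).
    tour, improved = list(tour), True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(points, cand) < tour_length(points, tour) - 1e-12:
                    tour, improved = cand, True
    return tour

random.seed(1)
pts = [(random.random(), random.random()) for _ in range(30)]
init = list(range(30))
best = two_opt(pts, init)
print(tour_length(pts, init), tour_length(pts, best))
```

In the ECEN scheme the starting tour would come from the CEN stage rather than the identity ordering used here.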
NASA Astrophysics Data System (ADS)
Bakhtiar, Nurizatul Syarfinas Ahmad; Abdullah, Farah Aini; Hasan, Yahya Abu
2017-08-01
In this paper, we consider the dynamical behaviour of a random field on the pulsating and snaking solitons in dissipative systems described by the one-dimensional cubic-quintic complex Ginzburg-Landau equation (cqCGLE). The dynamical behaviour of the random field was simulated by adding a random field to the initial pulse. Then, we solve it numerically by fixing the initial amplitude profile for the pulsating and snaking solitons without loss of generality. In order to create the random field, we choose 0 ≤ ɛ ≤ 1.0. As a result, multiple soliton trains are formed when the random field is applied to a pulse-like initial profile for the parameters of the pulsating and snaking solitons. The results also show the effects of varying the random field on the transient energy peaks in pulsating and snaking solitons.
Note: Fully integrated 3.2 Gbps quantum random number generator with real-time extraction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xiao-Guang; Nie, You-Qi; Liang, Hao
2016-07-15
We present a real-time and fully integrated quantum random number generator (QRNG) by measuring laser phase fluctuations. The QRNG scheme based on laser phase fluctuations is featured for its capability of generating ultra-high-speed random numbers. However, the speed bottleneck of a practical QRNG lies in the limited speed of randomness extraction. To close the gap between the fast randomness generation and the slow post-processing, we propose a pipeline extraction algorithm based on Toeplitz matrix hashing and implement it in a high-speed field-programmable gate array. Further, all the QRNG components are integrated into a module, including a compact and actively stabilized interferometer, high-speed data acquisition, and real-time data post-processing and transmission. The final generation rate of the QRNG module with real-time extraction can reach 3.2 Gbps.
NASA Technical Reports Server (NTRS)
Eck, Marshall; Mukunda, Meera
1988-01-01
A calculational method is described which provides a powerful tool for predicting solid rocket motor (SRM) casing and liquid rocket tankage fragmentation response. The approach properly partitions the available impulse to each major system-mass component. It uses the Pisces code developed by Physics International to couple the forces generated by an Eulerian-modeled gas flow field to a Lagrangian-modeled fuel and casing system. The details of the predictive analytical modeling process and the development of normalized relations for momentum partition as a function of SRM burn time and initial geometry are discussed. Methods for applying similar modeling techniques to liquid-tankage-overpressure failures are also discussed. Good agreement between predictions and observations is obtained for five specific events.
Experimental nonlocality-based randomness generation with nonprojective measurements
NASA Astrophysics Data System (ADS)
Gómez, S.; Mattar, A.; Gómez, E. S.; Cavalcanti, D.; Farías, O. Jiménez; Acín, A.; Lima, G.
2018-04-01
We report on an optical setup generating more than one bit of randomness from one entangled bit (i.e., a maximally entangled state of two qubits). The amount of randomness is certified through the observation of Bell nonlocal correlations. To attain this result we implemented a high-purity entanglement source and a nonprojective three-outcome measurement. Our implementation achieves a gain of 27% of randomness as compared with the standard methods using projective measurements. Additionally, we estimate the amount of randomness certified in a one-sided device-independent scenario, through the observation of Einstein-Podolsky-Rosen steering. Our results prove that nonprojective quantum measurements allow extending the limits for nonlocality-based certified randomness generation using current technology.
Lee, Jeffrey S; Cleaver, Gerald B
2017-10-01
In this note, the Cosmic Microwave Background (CMB) Radiation is shown to be capable of functioning as a Random Bit Generator, and constitutes an effectively infinite supply of truly random one-time pad values of arbitrary length. It is further argued that the CMB power spectrum potentially conforms to the FIPS 140-2 standard. Additionally, its applicability to the generation of a (n × n) random key matrix for a Vernam cipher is established.
A new simple technique for improving the random properties of chaos-based cryptosystems
NASA Astrophysics Data System (ADS)
Garcia-Bosque, M.; Pérez-Resa, A.; Sánchez-Azqueta, C.; Celma, S.
2018-03-01
A new technique for improving the security of chaos-based stream ciphers has been proposed and tested experimentally. This technique improves the randomness properties of the generated keystream by preventing the system from falling into short-period cycles due to digitization. In order to test this technique, a stream cipher based on a skew tent map algorithm has been implemented on a Virtex 7 FPGA. The randomness of the keystream generated by this system has been compared to the randomness of the keystream generated by the same system with the proposed randomness-enhancement technique. By subjecting both keystreams to the National Institute of Standards and Technology (NIST) tests, we have shown that our method can considerably improve the randomness of the generated keystreams. In order to incorporate our randomness-enhancement technique, only 41 extra slices were needed, proving that, apart from being effective, this method is also efficient in terms of area and hardware resources.
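The core keystream generator can be sketched in a few lines: iterate the skew tent map and threshold the state to get one bit per step. The optional perturbation hook below stands in for the randomness-enhancement idea of disturbing the trajectory so it cannot settle into a short cycle; the paper's exact enhancement scheme and FPGA fixed-point details are not reproduced here.

```python
def skew_tent(x, a):
    # Skew tent map on [0, 1] with control parameter a in (0, 1).
    return x / a if x < a else (1.0 - x) / (1.0 - a)

def keystream_bits(x0, a, n, perturb=None):
    # One bit per iteration via thresholding the chaotic state.
    # 'perturb' optionally injects a small disturbance each step, a
    # stand-in for a randomness-enhancement step (assumed, not the
    # paper's exact mechanism).
    x, bits = x0, []
    for i in range(n):
        x = skew_tent(x, a)
        if perturb is not None:
            x = (x + perturb(i)) % 1.0
        bits.append(1 if x >= 0.5 else 0)
    return bits

bits = keystream_bits(0.3141, a=0.4999, n=64)
print(bits[:16])
```

The keystream is fully determined by the key material (x0, a), which is what makes short cycles in a finite-precision implementation a security concern in the first place.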
Marra, Carlo A; Grubisic, Maja; Cibere, Jolanda; Grindrod, Kelly A; Woolcott, John C; Gastonguay, Louise; Esdaile, John M
2014-06-01
To determine if a pharmacist-initiated multidisciplinary strategy provides value for money compared to usual care in participants with previously undiagnosed knee osteoarthritis. Pharmacies were randomly allocated to provide either 1) usual care and a pamphlet or 2) intervention care, which consisted of education, pain medication management by a pharmacist, physiotherapy-guided exercise, and communication with the primary care physician. Costs and quality-adjusted life-years (QALYs) were determined for patients assigned to each treatment and incremental cost-effectiveness ratios (ICERs) were determined. From the Ministry of Health perspective, the average patient in the intervention group generated slightly higher costs compared with usual care. Similar findings were obtained when using the societal perspective. The intervention resulted in ICERs of $232 (95% confidence interval [95% CI] -1,530, 2,154) per QALY gained from the Ministry of Health perspective and $14,395 (95% CI 7,826, 23,132) per QALY gained from the societal perspective, compared with usual care. A pharmacist-initiated, multidisciplinary program was good value for money from both the societal and Ministry of Health perspectives. Copyright © 2014 by the American College of Rheumatology.
Doing better by getting worse: posthypnotic amnesia improves random number generation.
Terhune, Devin Blair; Brugger, Peter
2011-01-01
Although forgetting is often regarded as a deficit that we need to control to optimize cognitive functioning, it can have beneficial effects in a number of contexts. We examined whether disrupting memory for previous numerical responses would attenuate repetition avoidance (the tendency to avoid repeating the same number) during random number generation and thereby improve the randomness of responses. Low suggestible and low dissociative and high dissociative highly suggestible individuals completed a random number generation task in a control condition, following a posthypnotic amnesia suggestion to forget previous numerical responses, and in a second control condition following the cancellation of the suggestion. High dissociative highly suggestible participants displayed a selective increase in repetitions during posthypnotic amnesia, with equivalent repetition frequency to a random system, whereas the other two groups exhibited repetition avoidance across conditions. Our results demonstrate that temporarily disrupting memory for previous numerical responses improves random number generation.
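The repetition statistic at issue is easy to make concrete: for uniform random digits 1-10, the expected immediate-repetition rate is 10%, whereas a strict repetition avoider scores 0%. A small sketch (the simulated "human" is a caricature of repetition avoidance, not the study's data):

```python
import random

def repetition_rate(seq):
    # Fraction of responses that repeat the immediately preceding one.
    reps = sum(1 for a, b in zip(seq, seq[1:]) if a == b)
    return reps / (len(seq) - 1)

random.seed(0)
# A truly random digit generator repeats about 10% of the time.
machine = [random.randint(1, 10) for _ in range(10000)]
# A caricature of human repetition avoidance: never repeat the last digit.
human = [1]
for _ in range(9999):
    human.append(random.choice([d for d in range(1, 11) if d != human[-1]]))

print(repetition_rate(machine), repetition_rate(human))
```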
Method and apparatus for in-situ characterization of energy storage and energy conversion devices
Christophersen, Jon P [Idaho Falls, ID; Motloch, Chester G [Idaho Falls, ID; Morrison, John L [Butte, MT; Albrecht, Weston [Layton, UT
2010-03-09
Disclosed are methods and apparatuses for determining an impedance of an energy-output device using a random noise stimulus applied to the energy-output device. A random noise signal is generated and converted to a random noise stimulus as a current source correlated to the random noise signal. A bias-reduced response of the energy-output device to the random noise stimulus is generated by comparing a voltage at the energy-output device terminal to an average voltage signal. The random noise stimulus and bias-reduced response may be periodically sampled to generate a time-varying current stimulus and a time-varying voltage response, which may be correlated to generate an autocorrelated stimulus, an autocorrelated response, and a cross-correlated response. Finally, the autocorrelated stimulus, the autocorrelated response, and the cross-correlated response may be combined to determine at least one of impedance amplitude, impedance phase, and complex impedance.
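The correlation approach admits a compact spectral reading: the impedance is the cross-spectrum of stimulus and response divided by the auto-spectrum of the stimulus, Z(f) = S_iv(f)/S_ii(f). A minimal numpy sketch, validated on an ideal 50-ohm resistive "device" (the patent's bias-reduction, averaging, and sampling details are omitted):

```python
import numpy as np

def impedance_spectrum(i_t, v_t):
    # Z(f) = S_iv(f) / S_ii(f): cross-spectrum of current stimulus and
    # voltage response over the auto-spectrum of the stimulus.
    I = np.fft.rfft(i_t - i_t.mean())
    V = np.fft.rfft(v_t - v_t.mean())
    s_iv = np.conj(I) * V   # cross-spectral density (up to a constant)
    s_ii = np.conj(I) * I   # auto-spectral density of the stimulus
    return s_iv / s_ii

rng = np.random.default_rng(42)
i_t = rng.standard_normal(4096)   # random-noise current stimulus
v_t = 50.0 * i_t                  # ideal 50-ohm device: v = R * i
Z = impedance_spectrum(i_t, v_t)
print(np.abs(Z[1:]).mean())       # recovers ~50 ohms at every frequency
```

For a real cell the response would include phase lag, and |Z(f)| and arg Z(f) together give the impedance amplitude and phase the patent describes.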
Unmet Need: Improving mHealth Evaluation Rigor to Build the Evidence Base.
Mookherji, Sangeeta; Mehl, Garrett; Kaonga, Nadi; Mechael, Patricia
2015-01-01
mHealth-the use of mobile technologies for health-is a growing element of health system activity globally, but evaluation of those activities remains quite scant, and remains an important knowledge gap for advancing mHealth activities. In 2010, the World Health Organization and Columbia University implemented a small-scale survey to generate preliminary data on evaluation activities used by mHealth initiatives. The authors describe self-reported data from 69 projects in 29 countries. The majority (74%) reported some sort of evaluation activity, primarily nonexperimental in design (62%). The authors developed a 6-point scale of evaluation rigor comprising information on use of comparison groups, sample size calculation, data collection timing, and randomization. The mean score was low (2.4); half (47%) were conducting evaluations with a minimum threshold (4+) of rigor, indicating use of a comparison group, while less than 20% had randomized the mHealth intervention. The authors were unable to assess whether the rigor score was appropriate for the type of mHealth activity being evaluated. What was clear was that although most data came from mHealth pilot projects aimed at scale-up, few had designed evaluations that would support crucial decisions on whether to scale up and how. Whether the mHealth activity is a strategy to improve health or a tool for achieving intermediate outcomes that should lead to better health, mHealth evaluations must be improved to generate robust evidence for cost-effectiveness assessment and to allow for accurate identification of the contribution of mHealth initiatives to health systems strengthening and the impact on actual health outcomes.
NASA Astrophysics Data System (ADS)
Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad; Janssen, Hans
2015-02-01
The majority of literature regarding optimized Latin hypercube sampling (OLHS) is devoted to increasing the efficiency of these sampling strategies through the development of new algorithms based on the combination of innovative space-filling criteria and specialized optimization schemes. However, little attention has been given to the impact of the initial design that is fed into the optimization algorithm on the efficiency of OLHS strategies. Previous studies, as well as codes developed for OLHS, have relied on one of the following two approaches for the selection of the initial design in OLHS: (1) the use of random points in the hypercube intervals (random LHS), and (2) the use of midpoints in the hypercube intervals (midpoint LHS). Both approaches have been extensively used, but no attempt has been previously made to compare the efficiency and robustness of their resulting sample designs. In this study we compare the two approaches and show that the space-filling characteristics of OLHS designs are sensitive to the initial design that is fed into the optimization algorithm. It is also illustrated that the space-filling characteristics of OLHS designs based on midpoint LHS are significantly better than those based on random LHS. The two approaches are compared by incorporating their resulting sample designs in Monte Carlo simulation (MCS) for uncertainty propagation analysis, and then, by employing the sample designs in the selection of the training set for constructing non-intrusive polynomial chaos expansion (NIPCE) meta-models which subsequently replace the original full model in MCSs. The analysis is based on two case studies involving numerical simulation of density dependent flow and solute transport in porous media within the context of seawater intrusion in coastal aquifers. We show that the use of midpoint LHS as the initial design increases the efficiency and robustness of the resulting MCSs and NIPCE meta-models.
The study also illustrates that this relative improvement decreases with increasing number of sample points and input parameter dimensions. Since the computational time and efforts for generating the sample designs in the two approaches are identical, the use of midpoint LHS as the initial design in OLHS is thus recommended.
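The two initial-design choices differ only in where each point sits inside its stratum: midpoint LHS places every point at the center of its interval, random LHS places it uniformly at random within the interval. A minimal sketch of the plain LHS construction (not the full OLHS optimization):

```python
import numpy as np

def lhs(n, d, rng, midpoint=False):
    # Latin hypercube: in each dimension, exactly one point per stratum
    # [k/n, (k+1)/n). midpoint=True uses stratum centers (midpoint LHS);
    # otherwise points are uniform within their stratum (random LHS).
    u = np.full((n, d), 0.5) if midpoint else rng.random((n, d))
    samples = np.empty((n, d))
    for j in range(d):
        perm = rng.permutation(n)
        samples[:, j] = (perm + u[:, j]) / n
    return samples

rng = np.random.default_rng(0)
X_mid = lhs(8, 2, rng, midpoint=True)
X_rnd = lhs(8, 2, rng, midpoint=False)
# Either way, each column occupies each of the n strata exactly once.
print(np.sort(np.floor(X_mid[:, 0] * 8).astype(int)))
```

Either design would then be fed into the space-filling optimization; the study's point is that the midpoint variant gives the optimizer a better starting point.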
ERIC Educational Resources Information Center
Boonsathorn, Wasita; Charoen, Danuvasin; Dryver, Arthur L.
2014-01-01
E-Learning brings access to a powerful but often overlooked teaching tool: random number generation. Using random number generation, a practically infinite number of quantitative problem-solution sets can be created. In addition, within the e-learning context, in the spirit of the mastery of learning, it is possible to assign online quantitative…
Random numbers certified by Bell's theorem.
Pironio, S; Acín, A; Massar, S; de la Giroday, A Boyer; Matsukevich, D N; Maunz, P; Olmschenk, S; Hayes, D; Luo, L; Manning, T A; Monroe, C
2010-04-15
Randomness is a fundamental feature of nature and a valuable resource for applications ranging from cryptography and gambling to numerical simulation of physical and biological systems. Random numbers, however, are difficult to characterize mathematically, and their generation must rely on an unpredictable physical process. Inaccuracies in the theoretical modelling of such processes or failures of the devices, possibly due to adversarial attacks, limit the reliability of random number generators in ways that are difficult to control and detect. Here, inspired by earlier work on non-locality-based and device-independent quantum information processing, we show that the non-local correlations of entangled quantum particles can be used to certify the presence of genuine randomness. It is thereby possible to design a cryptographically secure random number generator that does not require any assumption about the internal working of the device. Such a strong form of randomness generation is impossible classically and possible in quantum systems only if certified by a Bell inequality violation. We carry out a proof-of-concept demonstration of this proposal in a system of two entangled atoms separated by approximately one metre. The observed Bell inequality violation, featuring near perfect detection efficiency, guarantees that 42 new random numbers are generated with 99 per cent confidence. Our results lay the groundwork for future device-independent quantum information experiments and for addressing fundamental issues raised by the intrinsic randomness of quantum theory.
NASA Astrophysics Data System (ADS)
Kwon, Sungchul; Kim, Jin Min
2015-01-01
For a fixed-energy (FE) Manna sandpile model in one dimension, we investigate the effects of random initial conditions on the dynamical scaling behavior of an order parameter. In the FE Manna model, the density ρ of total particles is conserved, and an absorbing phase transition occurs at ρc as ρ varies. In this work, we show that, for a given ρ , random initial distributions of particles lead to the domain structure in which domains with particle densities higher and lower than ρc alternate with each other. In the domain structure, the dominant length scale is the average domain length, which increases via the coalescence of adjacent domains. At ρc, the domain structure slows down the decay of an order parameter and also causes anomalous finite-size effects, i.e., power-law decay followed by an exponential one before the quasisteady state. As a result, the interplay of particle conservation and random initial conditions causes the domain structure, which is the origin of the anomalous dynamical scaling behaviors for random initial conditions.
Social Noise: Generating Random Numbers from Twitter Streams
NASA Astrophysics Data System (ADS)
Fernández, Norberto; Quintas, Fernando; Sánchez, Luis; Arias, Jesús
2015-12-01
Due to the multiple applications of random numbers in computer systems (cryptography, online gambling, computer simulation, etc.) it is important to have mechanisms to generate these numbers. True Random Number Generators (TRNGs) are commonly used for this purpose. TRNGs rely on non-deterministic sources to generate randomness. Physical processes (like noise in semiconductors, quantum phenomenon, etc.) play this role in state of the art TRNGs. In this paper, we depart from previous work and explore the possibility of defining social TRNGs using the stream of public messages of the microblogging service Twitter as randomness source. Thus, we define two TRNGs based on Twitter stream information and evaluate them using the National Institute of Standards and Technology (NIST) statistical test suite. The results of the evaluation confirm the feasibility of the proposed approach.
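One plausible construction of such a social TRNG (illustrative only; the paper's two generators may differ in detail) hashes each public message to distill its entropy into bits. The example messages below are hypothetical stand-ins for stream content:

```python
import hashlib

def bits_from_messages(messages):
    # Hash each message and unpack the digest into bits. Hashing acts as
    # an entropy distiller: output quality still depends on the stream
    # actually being unpredictable to an observer.
    out = []
    for msg in messages:
        digest = hashlib.sha256(msg.encode("utf-8")).digest()
        for byte in digest:
            out.extend((byte >> k) & 1 for k in range(8))
    return out

# Hypothetical stand-ins for public stream messages.
stream = ["first message 12:00:01", "second message 12:00:02"]
bits = bits_from_messages(stream)
print(len(bits))  # 2 messages x 256 bits = 512
```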
Fast generation of sparse random kernel graphs
Hagberg, Aric; Lemons, Nathan; Du, Wen -Bo
2015-09-10
The development of kernel-based inhomogeneous random graphs has provided models that are flexible enough to capture many observed characteristics of real networks, and that are also mathematically tractable. We specify a class of inhomogeneous random graph models, called random kernel graphs, that produces sparse graphs with tunable graph properties, and we develop an efficient generation algorithm to sample random instances from this model. As real-world networks are usually large, it is essential that the run-time of generation algorithms scales better than quadratically in the number of vertices n. We show that for many practical kernels our algorithm runs in time at most O(n(log n)²). As an example, we show how to generate samples of power-law degree distribution graphs with tunable assortativity.
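The model itself is easy to state even though sampling it subquadratically is the hard part. A naive O(n²) reference sampler makes the definition concrete (the kernel below is a hypothetical example, and the paper's fast algorithm is not reproduced here):

```python
import random

def naive_kernel_graph(n, kappa, seed=0):
    # Reference O(n^2) sampler for an inhomogeneous random graph:
    # vertex i gets a type x_i in (0, 1), and edge {i, j} appears
    # independently with probability min(1, kappa(x_i, x_j) / n).
    # The paper's contribution is a much faster sampler for sparse
    # kernels; this quadratic loop just defines the model.
    rng = random.Random(seed)
    x = [(i + 0.5) / n for i in range(n)]
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < min(1.0, kappa(x[i], x[j]) / n):
                edges.append((i, j))
    return edges

# Hypothetical kernel: heavier connectivity for small-type vertices,
# which produces a skewed (power-law-like) degree distribution.
edges = naive_kernel_graph(200, lambda a, b: 0.5 / (a * b) ** 0.5)
print(len(edges))
```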
NASA Astrophysics Data System (ADS)
Bisadi, Zahra; Acerbi, Fabio; Fontana, Giorgio; Zorzi, Nicola; Piemonte, Claudio; Pucker, Georg; Pavesi, Lorenzo
2018-02-01
A small-sized photonic quantum random number generator, easy to implement in small electronic devices for secure data encryption and other applications, is in high demand nowadays. Here, we propose a compact configuration in which a silicon-nanocrystal large-area light-emitting device (LED) is coupled to a silicon photomultiplier to generate random numbers. The random number generation methodology is based on the photon arrival time and is robust against the non-idealities of the detector and the source of quantum entropy. The raw data show high quality of randomness and pass all the statistical tests in the National Institute of Standards and Technology (NIST) suite without a post-processing algorithm. The highest bit rate is 0.5 Mbps with an efficiency of 4 bits per detected photon.
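A generic arrival-time-to-bits scheme can be sketched by comparing consecutive inter-arrival intervals of a Poisson photon stream: for exponential waiting times, "first gap shorter" and "first gap longer" are equally likely, so the comparison yields unbiased bits. This is a standard illustrative scheme, not necessarily the paper's multi-bit-per-photon encoding:

```python
import random

def bits_from_arrival_times(times):
    # Emit 0 if the first of two consecutive inter-arrival gaps is
    # shorter, 1 if longer; discard ties. Unbiased for i.i.d. gaps.
    gaps = [b - a for a, b in zip(times, times[1:])]
    bits = []
    for g1, g2 in zip(gaps[::2], gaps[1::2]):
        if g1 != g2:
            bits.append(0 if g1 < g2 else 1)
    return bits

# Simulated Poisson process standing in for photon detections.
rng = random.Random(7)
t, times = 0.0, []
for _ in range(2001):
    t += rng.expovariate(1.0)   # exponential inter-arrival gaps
    times.append(t)
bits = bits_from_arrival_times(times)
print(len(bits), sum(bits) / len(bits))
```

Schemes like this trade throughput for robustness, which is consistent with the abstract's claim of randomness quality that does not depend on detector non-idealities.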
NASA Astrophysics Data System (ADS)
Sui, Liansheng; Liu, Benqing; Wang, Qiang; Li, Ye; Liang, Junli
2015-12-01
A color image encryption scheme is proposed based on the Yang-Gu mixture amplitude-phase retrieval algorithm and a two-coupled logistic map in the gyrator transform domain. First, the color plaintext image is decomposed into red, green and blue components, which are scrambled individually by three random sequences generated by using the two-dimensional Sine logistic modulation map. Second, each scrambled component is encrypted into a real-valued function with stationary white noise distribution in the iterative amplitude-phase retrieval process in the gyrator transform domain, and then the three obtained functions are considered as red, green and blue channels to form the color ciphertext image. Obviously, the ciphertext image is a real-valued function and is more convenient to store and transmit. In the encryption and decryption processes, the chaotic random phase mask generated based on the logistic map is employed as the phase key, which means that only the initial values are used as the private key and key management is highly convenient. Meanwhile, the security of the cryptosystem is enhanced greatly because of the high sensitivity to the private keys. Simulation results are presented to prove the security and robustness of the proposed scheme.
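The scrambling step can be sketched with a plain 1-D logistic map standing in for the paper's 2-D Sine logistic modulation map: the sort order of a chaotic sequence defines a permutation, so only the initial value (the private key) needs to be stored to regenerate and invert the scrambling.

```python
def logistic_sequence(x0, r, n):
    # Iterate the logistic map x -> r*x*(1-x) to get a chaotic sequence.
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

def scramble(data, x0=0.7, r=3.99):
    # The sort order of the chaotic sequence defines the permutation.
    chaos = logistic_sequence(x0, r, len(data))
    perm = sorted(range(len(data)), key=chaos.__getitem__)
    return [data[p] for p in perm], perm

def unscramble(scrambled, perm):
    # Invert the permutation to recover the original ordering.
    out = [None] * len(scrambled)
    for k, p in enumerate(perm):
        out[p] = scrambled[k]
    return out

pixels = list(range(16))          # toy stand-in for one color channel
mixed, perm = scramble(pixels)
print(unscramble(mixed, perm) == pixels)  # True
```

The sensitivity claim follows from the chaotic dynamics: a tiny change in x0 yields a completely different permutation, so a near-miss key does not partially decrypt.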
Caustics and Rogue Waves in an Optical Sea.
Mathis, Amaury; Froehly, Luc; Toenger, Shanti; Dias, Frédéric; Genty, Goëry; Dudley, John M
2015-08-06
There are many examples in physics of systems showing rogue wave behaviour, the generation of high amplitude events at low probability. Although initially studied in oceanography, rogue waves have now been seen in many other domains, with particular recent interest in optics. Although most studies in optics have focussed on how nonlinearity can drive rogue wave emergence, purely linear effects have also been shown to induce extreme wave amplitudes. In this paper, we report a detailed experimental study of linear rogue waves in an optical system, using a spatial light modulator to impose random phase structure on a coherent optical field. After free space propagation, different random intensity patterns are generated, including partially-developed speckle, a broadband caustic network, and an intermediate pattern with characteristics of both speckle and caustic structures. Intensity peaks satisfying statistical criteria for rogue waves are seen especially in the case of the caustic network, and are associated with broader spatial spectra. In addition, the electric field statistics of the intermediate pattern shows properties of an "optical sea" with near-Gaussian statistics in elevation amplitude, and trough-to-crest statistics that are near-Rayleigh distributed but with an extended tail where a number of rogue wave events are observed.
Araújo, Ricardo de A
2010-12-01
This paper presents a hybrid intelligent methodology to design increasing translation invariant morphological operators applied to Brazilian stock market prediction (overcoming the random walk dilemma). The proposed Translation Invariant Morphological Robust Automatic phase-Adjustment (TIMRAA) method consists of a hybrid intelligent model composed of a Modular Morphological Neural Network (MMNN) with a Quantum-Inspired Evolutionary Algorithm (QIEA), which searches for the best time lags to reconstruct the phase space of the time series generator phenomenon and determines the initial (sub-optimal) parameters of the MMNN. Each individual of the QIEA population is further trained by the Back Propagation (BP) algorithm to improve the MMNN parameters supplied by the QIEA. Also, for each prediction model generated, it uses a behavioral statistical test and a phase fix procedure to adjust time phase distortions observed in stock market time series. Furthermore, an experimental analysis is conducted with the proposed method through four Brazilian stock market time series, and the achieved results are discussed and compared to results found with random walk models and the previously introduced Time-delay Added Evolutionary Forecasting (TAEF) and Morphological-Rank-Linear Time-lag Added Evolutionary Forecasting (MRLTAEF) methods. Copyright © 2010 Elsevier Ltd. All rights reserved.
Selecting the best design for nonstandard toxicology experiments.
Webb, Jennifer M; Smucker, Byran J; Bailer, A John
2014-10-01
Although many experiments in environmental toxicology use standard statistical experimental designs, there are situations that arise where no such standard design is natural or applicable because of logistical constraints. For example, the layout of a laboratory may suggest that each shelf serve as a block, with the number of experimental units per shelf either greater than or less than the number of treatments in a way that precludes the use of a typical block design. In such cases, an effective and powerful alternative is to employ optimal experimental design principles, a strategy that produces designs with precise statistical estimates. Here, a D-optimal design was generated for an experiment in environmental toxicology that has 2 factors, 16 treatments, and constraints similar to those described above. After initial consideration of a randomized complete block design and an intuitive cyclic design, it was decided to compare a D-optimal design and a slightly more complicated version of the cyclic design. Simulations were conducted generating random responses under a variety of scenarios that reflect conditions motivated by a similar toxicology study, and the designs were evaluated via D-efficiency as well as by a power analysis. The cyclic design performed well compared to the D-optimal design. © 2014 SETAC.
Hinton, Devon E; Hofmann, Stefan G; Pollack, Mark H; Otto, Michael W
2009-01-01
Based on the results of a randomized controlled trial, we examined a model of the mechanisms of efficacy of culturally adapted cognitive-behavior therapy (CBT) for Cambodian refugees with pharmacology-resistant posttraumatic stress disorder (PTSD) and comorbid orthostatic panic attacks (PAs). Twelve patients were in the initial treatment condition, 12 in the delayed treatment condition. The patients randomized to CBT had much greater improvement than patients in the waitlist condition on all psychometric measures and on one physiological measure, the systolic blood pressure response to orthostasis (d = 1.31), as evaluated by repeated-measures MANOVA and planned contrasts. After receiving CBT, the Delayed Treatment Group improved on all measures, including the systolic blood pressure response to orthostasis. The CBT treatment's reduction of PTSD severity was significantly mediated by improvement in orthostatic panic and emotion regulation ability. The current study supports our model of the generation of PTSD in the Cambodian population, and suggests a key role of decreased vagal tone in the generation of orthostatic panic and PTSD in this population. It also suggests that vagal tone is involved in emotion regulation, and that both vagal tone and emotion regulation improve across treatment.
Mobile access to virtual randomization for investigator-initiated trials.
Deserno, Thomas M; Keszei, András P
2017-08-01
Background/aims: Randomization is indispensable in clinical trials in order to provide unbiased treatment allocation and a valid statistical inference. Improper handling of allocation lists can be avoided using central systems, for example, human-based services. However, central systems are unaffordable for investigator-initiated trials and might be inaccessible from some places, where study subjects need allocations. We propose mobile access to virtual randomization, where the randomization lists are non-existent and the appropriate allocation is computed on demand. Methods: The core of the system architecture is an electronic data capture system or a clinical trial management system, which is extended by an R interface connecting the R server using the Java R Interface. Mobile devices communicate via the representational state transfer web services. Furthermore, a simple web-based setup allows configuring the appropriate statistics by non-statisticians. Our comprehensive R script supports simple randomization, restricted randomization using a random allocation rule, block randomization, and stratified randomization for un-blinded, single-blinded, and double-blinded trials. For each trial, the electronic data capture system or the clinical trial management system stores the randomization parameters and the subject assignments. Results: Apps are provided for iOS and Android and subjects are randomized using smartphones. After logging onto the system, the user selects the trial and the subject, and the allocation number and treatment arm are displayed instantaneously and stored in the core system. So far, 156 subjects have been allocated from mobile devices serving five investigator-initiated trials. Conclusion: Transforming pre-printed allocation lists into virtual ones ensures the correct conduct of trials and guarantees a strictly sequential processing in all trial sites.
Covering 88% of all randomization models that are used in recent trials, virtual randomization becomes available for investigator-initiated trials and potentially for large multi-center trials.
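The record above lists block randomization among the supported models. As an illustration only (the function name, arm labels, and block size below are hypothetical and not taken from the paper), permuted-block allocation can be sketched as:

```python
import random

def block_randomize(n_subjects, arms=("A", "B"), block_size=4, seed=None):
    """Permuted-block randomization: every complete block contains each arm
    equally often, so arm counts stay balanced throughout accrual."""
    rng = random.Random(seed)
    assert block_size % len(arms) == 0, "block size must be a multiple of the number of arms"
    allocations = []
    while len(allocations) < n_subjects:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)  # random order within the block
        allocations.extend(block)
    return allocations[:n_subjects]

alloc = block_randomize(10, seed=1)
print(alloc)
```

Within every complete block the arm counts are equal, so even if a trial stops mid-block the allocation never drifts far from balance.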
Response Rates in Random-Digit-Dialed Telephone Surveys: Estimation vs. Measurement.
ERIC Educational Resources Information Center
Franz, Jennifer D.
The efficacy of the random digit dialing method in telephone surveys was examined. Random digit dialing (RDD) generates a pure random sample and provides the advantage of including unlisted phone numbers, as well as numbers which are too new to be listed. Its disadvantage is that it generates a major proportion of nonworking and business…
Revisiting sample size: are big trials the answer?
Lurati Buse, Giovanna A L; Botto, Fernando; Devereaux, P J
2012-07-18
The superiority of the evidence generated in randomized controlled trials over observational data is not conditional on randomization alone. Randomized controlled trials require proper design and implementation to provide a reliable effect estimate. Adequate random sequence generation, allocation implementation, analyses based on the intention-to-treat principle, and sufficient power are crucial to the quality of a randomized controlled trial. Power, the probability that a trial detects a difference when a real difference between treatments exists, depends strongly on sample size. The quality of orthopaedic randomized controlled trials is frequently threatened by limited sample size. This paper reviews basic concepts and pitfalls in sample-size estimation and focuses on the importance of large trials in the generation of valid evidence.
Vertical decomposition with Genetic Algorithm for Multiple Sequence Alignment
2011-01-01
Background Many bioinformatics studies begin with a multiple sequence alignment as the foundation for their research, because multiple sequence alignment can be a useful technique for studying molecular evolution and analyzing sequence-structure relationships. Results In this paper, we have proposed a Vertical Decomposition with Genetic Algorithm (VDGA) for Multiple Sequence Alignment (MSA). In VDGA, we divide the sequences vertically into two or more subsequences, and then solve them individually using a guide tree approach. Finally, we combine all the subsequences to generate a new multiple sequence alignment. This technique is applied on the solutions of the initial generation and of each child generation within VDGA. We have used two mechanisms to generate an initial population in this research: the first mechanism is to generate guide trees with randomly selected sequences and the second is shuffling the sequences inside such trees. Two different genetic operators have been implemented with VDGA. To test the performance of our algorithm, we have compared it with existing well-known methods, namely PRRP, CLUSTALX, DIALIGN, HMMT, SB_PIMA, ML_PIMA, MULTALIGN, and PILEUP8, and also other methods, based on Genetic Algorithms (GA), such as SAGA, MSA-GA and RBT-GA, by solving a number of benchmark datasets from BAliBase 2.0. Conclusions The experimental results showed that the VDGA with three vertical divisions was the most successful variant for most of the test cases in comparison to other divisions considered with VDGA. The experimental results also confirmed that VDGA outperformed the other methods considered in this research. PMID:21867510
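As a toy illustration of the vertical decomposition idea described above (the equal-length cut points and the function name are assumptions for this sketch; VDGA's actual division and guide-tree alignment steps are more involved), splitting each sequence into vertical parts that can be aligned independently might look like:

```python
def vertical_split(sequences, parts=3):
    """Split each sequence at the same relative positions, yielding `parts`
    groups of subsequences that can be aligned independently and then
    concatenated back into a full alignment."""
    groups = [[] for _ in range(parts)]
    for seq in sequences:
        # proportional cut points, rounded to whole residues
        cut = [round(i * len(seq) / parts) for i in range(parts + 1)]
        for g in range(parts):
            groups[g].append(seq[cut[g]:cut[g + 1]])
    return groups

seqs = ["ACGTACGTA", "ACGTTT", "GGGACGT"]
print(vertical_split(seqs, 3))
```

Concatenating the per-group subsequences of any one sequence reconstructs the original, which is the invariant a vertical decomposition must preserve before the subalignments are recombined.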
Shteingart, Hanan; Loewenstein, Yonatan
2016-01-01
There is a long history of experiments in which participants are instructed to generate a long sequence of binary random numbers. The scope of this line of research has shifted over the years from identifying the basic psychological principles and/or the heuristics that lead to deviations from randomness, to one of predicting future choices. In this paper, we used generalized linear regression and the framework of Reinforcement Learning in order to address both points. In particular, we used logistic regression analysis in order to characterize the temporal sequence of participants' choices. Surprisingly, a population analysis indicated that the most recent trial has only a weak effect on behavior compared with trials further in the past, a result that seems irreconcilable with standard sequential effects, which decay monotonically with delay. However, when considering each participant separately, we found that the magnitudes of the sequential effects are a monotonically decreasing function of the delay, yet these individual sequential effects are largely averaged out in a population analysis because of heterogeneity. The substantial behavioral heterogeneity in this task is further demonstrated quantitatively by considering the predictive power of the model. We show that a heterogeneous model of sequential dependencies captures the structure available in random sequence generation. Finally, we show that the results of the logistic regression analysis can be interpreted in the framework of reinforcement learning, allowing us to compare the sequential effects in the random sequence generation task to those in an operant learning task. We show that in contrast to the random sequence generation task, sequential effects in operant learning are far more homogeneous across the population.
These results suggest that in the random sequence generation task, different participants adopt different cognitive strategies to suppress sequential dependencies when generating the "random" sequences.
Solution-Processed Carbon Nanotube True Random Number Generator.
Gaviria Rojas, William A; McMorrow, Julian J; Geier, Michael L; Tang, Qianying; Kim, Chris H; Marks, Tobin J; Hersam, Mark C
2017-08-09
With the growing adoption of interconnected electronic devices in consumer and industrial applications, there is an increasing demand for robust security protocols when transmitting and receiving sensitive data. Toward this end, hardware true random number generators (TRNGs), commonly used to create encryption keys, offer significant advantages over software pseudorandom number generators. However, the vast network of devices and sensors envisioned for the "Internet of Things" will require small, low-cost, and mechanically flexible TRNGs with low computational complexity. These rigorous constraints position solution-processed semiconducting single-walled carbon nanotubes (SWCNTs) as leading candidates for next-generation security devices. Here, we demonstrate the first TRNG using static random access memory (SRAM) cells based on solution-processed SWCNTs that digitize thermal noise to generate random bits. This bit generation strategy can be readily implemented in hardware with minimal transistor and computational overhead, resulting in an output stream that passes standardized statistical tests for randomness. By using solution-processed semiconducting SWCNTs in a low-power, complementary architecture to achieve TRNG, we demonstrate a promising approach for improving the security of printable and flexible electronics.
Wampler, Peter J; Rediske, Richard R; Molla, Azizur R
2013-01-18
A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel, which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes were initially mapped, and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed, and 16.4% of the homes visited were unoccupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field-locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent, and only rarely was local knowledge required to identify and locate households.
This method provides an important technique that can be applied to other developing countries where a randomized study design is needed but infrastructure is lacking to implement more traditional participant selection methods.
640-Gbit/s fast physical random number generation using a broadband chaotic semiconductor laser
NASA Astrophysics Data System (ADS)
Zhang, Limeng; Pan, Biwei; Chen, Guangcan; Guo, Lu; Lu, Dan; Zhao, Lingjuan; Wang, Wei
2017-04-01
An ultra-fast physical random number generator is demonstrated utilizing a photonic-integrated-device-based broadband chaotic source with a simple post data processing method. The compact chaotic source is implemented by using a monolithic integrated dual-mode amplified feedback laser (AFL) with self-injection, where a robust chaotic signal with RF frequency coverage above 50 GHz and flatness of ±3.6 dB is generated. By retaining the 4 least significant bits (LSBs) from the 8-bit digitization of the chaotic waveform, random sequences with a bit rate of up to 640 Gbit/s (160 GS/s × 4 bits) are realized. The generated random bits have passed each of the fifteen NIST statistical tests (NIST SP800-22), indicating the randomness required for practical applications.
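The 4-LSB retention step described above is simple bit masking on each digitized sample. A minimal sketch, assuming 8-bit samples (the helper name is illustrative, not from the paper):

```python
def lsb_bits(samples, keep=4):
    """Keep only the `keep` least significant bits of each 8-bit sample,
    concatenating them (most significant of the kept bits first) into
    the output bitstream."""
    bits = []
    for s in samples:
        for i in reversed(range(keep)):
            bits.append((s >> i) & 1)
    return bits

print(lsb_bits([0b10110101, 0b00001111], keep=4))  # → [0, 1, 0, 1, 1, 1, 1, 1]
```

Discarding the high-order bits removes the slowly varying (and therefore correlated) part of the chaotic waveform, which is why LSB retention is a common de-biasing step in chaos-based generators.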
Co-state initialization for the minimum-time low-thrust trajectory optimization
NASA Astrophysics Data System (ADS)
Taheri, Ehsan; Li, Nan I.; Kolmanovsky, Ilya
2017-05-01
This paper presents an approach for co-state initialization, which is a critical step in solving minimum-time low-thrust trajectory optimization problems using indirect optimal control numerical methods. Indirect methods used in determining optimal space trajectories typically result in two-point boundary-value problems and are solved by single- or multiple-shooting numerical methods. Accurate initialization of the co-state variables facilitates the numerical convergence of iterative boundary value problem solvers. In this paper, we propose a method which exploits the trajectory generated by the so-called pseudo-equinoctial and three-dimensional finite Fourier series shape-based methods to estimate the initial values of the co-states. The performance of the approach for two interplanetary rendezvous missions, from Earth to Mars and from Earth to asteroid Dionysus, is compared against three other approaches which, respectively, exploit random initialization of co-states, adjoint-control transformation, and a standard genetic algorithm. The results indicate that with our proposed approach the percentage of converged cases is higher for trajectories with a larger number of revolutions, while the computation time is lower. These features are advantageous for broad trajectory searches in the preliminary phase of mission design.
NASA Astrophysics Data System (ADS)
Apdilah, D.; Harahap, M. K.; Khairina, N.; Husein, A. M.; Harahap, M.
2018-04-01
The One Time Pad (OTP) algorithm always requires a key paired to the plaintext. If the key is shorter than the plaintext, the key is repeated until its length matches that of the plaintext. In this research, we use a Linear Congruential Generator (LCG) and a Quadratic Congruential Generator (QCG) to generate random numbers. One Time Pad uses a random number as the key for the encryption and decryption process. The key generation is seeded by the first letter of the plaintext. We compare these two algorithms in terms of encryption speed, and the result is that the combination of OTP with LCG is faster than the combination of OTP with QCG.
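A hedged sketch of the OTP-with-LCG combination described above: the LCG constants below are the common glibc-style values, not those of the paper, and reducing each LCG state to one key byte is an illustrative choice. Note that an LCG-derived keystream is predictable, so this is not a true one-time pad and is not cryptographically secure.

```python
def lcg(seed, n, a=1103515245, c=12345, m=2**31):
    """Linear congruential generator: x_{k+1} = (a*x_k + c) mod m.
    Returns n key bytes, one per LCG state."""
    out, x = [], seed
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x % 256)  # reduce each state to one byte (illustrative)
    return bytes(out)

def otp_xor(data: bytes, key: bytes) -> bytes:
    """XOR cipher in OTP style; decryption is the same operation."""
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"HELLO"
key = lcg(seed=42, n=len(msg))   # seed would come from the plaintext's first letter
ct = otp_xor(msg, key)
assert otp_xor(ct, key) == msg   # XOR is its own inverse
```

The speed comparison in the paper reduces to the cost of one keystream element: the LCG needs one multiply-add per byte, while a QCG additionally needs a squaring, which is consistent with the reported LCG advantage.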
Comparison of three controllers applied to helicopter vibration
NASA Technical Reports Server (NTRS)
Leyland, Jane A.
1992-01-01
A comparison was made of the applicability and suitability of the deterministic controller, the cautious controller, and the dual controller for the reduction of helicopter vibration by using higher harmonic blade pitch control. A randomly generated linear plant model was assumed and the performance index was defined to be a quadratic output metric of this linear plant. A computer code, designed to check out and evaluate these controllers, was implemented and used to accomplish this comparison. The effects of random measurement noise, the initial estimate of the plant matrix, and the plant matrix propagation rate were determined for each of the controllers. With few exceptions, the deterministic controller yielded the greatest vibration reduction (as characterized by the quadratic output metric) and operated with the greatest reliability. Theoretical limitations of these controllers were defined and appropriate candidate alternative methods, including one method particularly suitable to the cockpit, were identified.
Commercialization of NESSUS: Status
NASA Technical Reports Server (NTRS)
Thacker, Ben H.; Millwater, Harry R.
1991-01-01
A plan was initiated in 1988 to commercialize the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) probabilistic structural analysis software. The goal of the on-going commercialization effort is to begin the transfer of Probabilistic Structural Analysis Method (PSAM) developed technology into industry and to develop additional funding resources in the general area of structural reliability. The commercialization effort is summarized. The SwRI NESSUS Software System is a general-purpose probabilistic finite element computer program using state-of-the-art methods for predicting stochastic structural response due to random loads, material properties, part geometry, and boundary conditions. NESSUS can be used to assess structural reliability, to compute probability of failure, to rank the input random variables by importance, and to provide a more cost-effective design than traditional methods. The goal is to develop a general probabilistic structural analysis methodology to assist in the certification of critical components in the next generation Space Shuttle Main Engine.
Ocean biogeochemistry modeled with emergent trait-based genomics.
Coles, V J; Stukel, M R; Brooks, M T; Burd, A; Crump, B C; Moran, M A; Paul, J H; Satinsky, B M; Yager, P L; Zielinski, B L; Hood, R R
2017-12-01
Marine ecosystem models have advanced to incorporate metabolic pathways discovered with genomic sequencing, but direct comparisons between models and "omics" data are lacking. We developed a model that directly simulates metagenomes and metatranscriptomes for comparison with observations. Model microbes were randomly assigned genes for specialized functions, and communities of 68 species were simulated in the Atlantic Ocean. Unfit organisms were replaced, and the model self-organized to develop community genomes and transcriptomes. Emergent communities from simulations that were initialized with different cohorts of randomly generated microbes all produced realistic vertical and horizontal ocean nutrient, genome, and transcriptome gradients. Thus, the library of gene functions available to the community, rather than the distribution of functions among specific organisms, drove community assembly and biogeochemical gradients in the model ocean. Copyright © 2017 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
On a phase diagram for random neural networks with embedded spike timing dependent plasticity.
Turova, Tatyana S; Villa, Alessandro E P
2007-01-01
This paper presents an original mathematical framework based on graph theory which is a first attempt to investigate the dynamics of a model of neural networks with embedded spike timing dependent plasticity. The neurons correspond to integrate-and-fire units located at the vertices of a finite subset of a 2D lattice. There are two types of vertices, corresponding to the inhibitory and the excitatory neurons. The edges are directed and labelled by the discrete values of the synaptic strength. We assume that there is an initial firing pattern corresponding to a subset of units that generate a spike. The number of externally activated vertices is a small fraction of the entire network. The model presented here describes how such a pattern propagates throughout the network as a random walk on the graph. Several results are compared with computational simulations and new data are presented for identifying critical parameters of the model.
Optimal allocation of testing resources for statistical simulations
NASA Astrophysics Data System (ADS)
Quintana, Carolina; Millwater, Harry R.; Singh, Gulshan; Golden, Patrick
2015-07-01
Statistical estimates from simulation involve uncertainty caused by the variability in the input random variables due to limited data. Allocating resources to obtain more experimental data of the input variables to better characterize their probability distributions can reduce the variance of statistical estimates. The methodology proposed determines the optimal number of additional experiments required to minimize the variance of the output moments given single or multiple constraints. The method uses multivariate t-distribution and Wishart distribution to generate realizations of the population mean and covariance of the input variables, respectively, given an amount of available data. This method handles independent and correlated random variables. A particle swarm method is used for the optimization. The optimal number of additional experiments per variable depends on the number and variance of the initial data, the influence of the variable in the output function and the cost of each additional experiment. The methodology is demonstrated using a fretting fatigue example.
Averaging of random walks and shift-invariant measures on a Hilbert space
NASA Astrophysics Data System (ADS)
Sakbaev, V. Zh.
2017-06-01
We study random walks in a Hilbert space H and representations using them of solutions of the Cauchy problem for differential equations whose initial conditions are numerical functions on H. We construct a finitely additive analogue of the Lebesgue measure: a nonnegative finitely additive measure λ that is defined on a minimal subset ring of an infinite-dimensional Hilbert space H containing all infinite-dimensional rectangles with absolutely converging products of the side lengths and is invariant under shifts and rotations in H. We define the Hilbert space H of equivalence classes of complex-valued functions on H that are square integrable with respect to a shift-invariant measure λ. Using averaging of the shift operator in H over random vectors in H with a distribution given by a one-parameter semigroup (with respect to convolution) of Gaussian measures on H, we define a one-parameter semigroup of contracting self-adjoint transformations on H, whose generator is called the diffusion operator. We obtain a representation of solutions of the Cauchy problem for the Schrödinger equation whose Hamiltonian is the diffusion operator.
Compiling probabilistic, bio-inspired circuits on a field programmable analog array
Marr, Bo; Hasler, Jennifer
2014-01-01
A field programmable analog array (FPAA) is presented as an energy- and computation-efficient engine: a mixed-mode processor for which functions can be compiled at significantly lower energy cost using probabilistic computing circuits. More specifically, it will be shown that the core computation of any dynamical system can be computed on the FPAA at significantly lower energy per operation than a digital implementation. A stochastic system that is dynamically controllable via voltage-controlled amplifier and comparator thresholds is implemented, which computes Bernoulli random variables. It is shown that exponentially distributed random variables, and random variables of arbitrary distribution, can be computed from Bernoulli variables. The Gillespie algorithm is simulated to show the utility of this system by calculating the trajectory of a biological system stochastically with this probabilistic hardware, demonstrating over a 127X performance improvement over current software approaches. The relevance of this approach extends to any dynamical system. The initial circuits and ideas for this work were generated at the 2008 Telluride Neuromorphic Workshop. PMID:24847199
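One standard way to obtain exponentially distributed variates from Bernoulli primitives, as the abstract describes, is to count Bernoulli trials until the first success: the waiting time is geometric and converges to an exponential as the trial spacing shrinks. A software sketch of this idea (the paper's implementation is analog hardware; the names and parameters here are illustrative):

```python
import random

def bernoulli(p, rng):
    """One Bernoulli trial with success probability p."""
    return rng.random() < p

def approx_exponential(rate, dt, rng):
    """Count Bernoulli(rate*dt) trials until the first success; the waiting
    time t = k*dt is geometric and tends to Exponential(rate) as dt -> 0."""
    t = 0.0
    while not bernoulli(rate * dt, rng):
        t += dt
    return t

rng = random.Random(0)
samples = [approx_exponential(rate=2.0, dt=1e-3, rng=rng) for _ in range(2000)]
print(sum(samples) / len(samples))  # sample mean, expected near 1/rate = 0.5
```

Exponential inter-event times are exactly what the Gillespie algorithm consumes, which is why a cheap Bernoulli-to-exponential path makes the stochastic simulation loop inexpensive.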
Completely device-independent quantum key distribution
NASA Astrophysics Data System (ADS)
Aguilar, Edgar A.; Ramanathan, Ravishankar; Kofler, Johannes; Pawłowski, Marcin
2016-08-01
Quantum key distribution (QKD) is a provably secure way for two distant parties to establish a common secret key, which then can be used in a classical cryptographic scheme. Using quantum entanglement, one can reduce the necessary assumptions that the parties have to make about their devices, giving rise to device-independent QKD (DIQKD). However, in all existing protocols to date the parties need to have an initial (at least partially) random seed as a resource. In this work, we show that this requirement can be dropped. Using recent advances in the fields of randomness amplification and randomness expansion, we demonstrate that it is sufficient for the message the parties want to communicate to be (partially) unknown to the adversaries—an assumption without which any type of cryptography would be pointless to begin with. One party can use her secret message to locally generate a secret sequence of bits, which can then be openly used by herself and the other party in a DIQKD protocol. Hence our work reduces the requirements needed to perform secure DIQKD and establish safe communication.
Neural networks for structural design - An integrated system implementation
NASA Technical Reports Server (NTRS)
Berke, Laszlo; Hafez, Wassim; Pao, Yoh-Han
1992-01-01
The development of powerful automated procedures to aid the creative designer is becoming increasingly critical for complex design tasks. In the work described here, Artificial Neural Nets are applied to acquire structural analysis and optimization domain expertise. Based on initial instructions from the user, an automated procedure generates random instances of structural analysis and/or optimization 'experiences' that cover a desired domain. It extracts training patterns from the created instances, constructs and trains an appropriate network architecture, and checks the accuracy of the net's predictions. The final product is a trained neural net that can estimate analysis and/or optimization results instantaneously.
Early stage hot spot analysis through standard cell base random pattern generation
NASA Astrophysics Data System (ADS)
Jeon, Joong-Won; Song, Jaewan; Kim, Jeong-Lim; Park, Seongyul; Yang, Seung-Hune; Lee, Sooryong; Kang, Hokyu; Madkour, Kareem; ElManhawy, Wael; Lee, SeungJo; Kwan, Joe
2017-04-01
Due to the limited availability of DRC-clean patterns during process and RET recipe development, OPC recipes are not tested with high pattern coverage. A variety of patterns can help OPC engineers detect patterns sensitive to lithographic effects, and random pattern generation is needed to secure a robust OPC recipe. However, simple random patterns that do not reflect real product layout styles cannot cover patterning hotspots at production level, so it is not effective to use them for OPC optimization; it is therefore important to generate random patterns similar to real product patterns. This paper presents a strategy for generating random patterns based on design architecture information and preventing hotspots in the early process development stage through a tool called Layout Schema Generator (LSG). Using LSG, we generate standard-cell-based random patterns reflecting real design cell structure: fin pitch, gate pitch, and cell height. The output standard cells from LSG are fed into an analysis methodology that assesses their hotspot severity by assigning a score according to their optical image parameters (NILS, MEEF, %PV band), so that potential hotspots can be identified by ranking. This flow is demonstrated on Samsung 7 nm technology, optimizing the OPC recipe early enough in the process development to avoid using problematic patterns.
Interbank lending, network structure and default risk contagion
NASA Astrophysics Data System (ADS)
Zhang, Minghui; He, Jianmin; Li, Shouwei
2018-03-01
This paper studies the default risk contagion in banking systems based on a dynamic network model with two different kinds of lenders' selecting mechanisms, namely, endogenous selecting (ES) and random selecting (RS). From sensitivity analysis, we find that higher risk premium, lower initial proportion of net assets, higher liquid assets threshold, larger size of liquidity shocks, higher proportion of the initial investments and higher Central Bank interest rates all lead to more severe default risk contagion. Moreover, the autocorrelation of deposits and lenders' selecting probability have non-monotonic effects on the default risk contagion, and the effects differ under the two mechanisms. Generally, the default risk contagion is much more severe under the RS mechanism than under ES, because the multi-money-center structure generated by the ES mechanism enables borrowers to borrow from more liquid banks with lower interest rates.
Probing the solar core with low-degree p modes
NASA Astrophysics Data System (ADS)
Roxburgh, I. W.; Vorontsov, S. V.
2002-01-01
We address the question of what could be learned about the solar core structure if the seismic data were limited to low-degree modes only. The results of three different experiments are described. The first is the linearized structural inversion of the p-mode frequencies of a solar model modified slightly in the energy-generating core, using the original (unmodified) model as an initial guess. In the second experiment, we invert the solar p-mode frequencies measured in the 32-month subset of BiSON data (Chaplin et al. 1998), degraded with additional 0.1 μHz random errors, using a model of 2.6 Gyr age from the solar evolutionary sequence as an initial approximation. This second inversion is non-linear. In the third experiment, we compare the same set of BiSON frequencies with the current reference solar model.
Device-independent randomness generation from several Bell estimators
NASA Astrophysics Data System (ADS)
Nieto-Silleras, Olmo; Bamps, Cédric; Silman, Jonathan; Pironio, Stefano
2018-02-01
Device-independent randomness generation and quantum key distribution protocols rely on a fundamental relation between the non-locality of quantum theory and its random character. This relation is usually expressed in terms of a trade-off between the probability of guessing correctly the outcomes of measurements performed on quantum systems and the amount of violation of a given Bell inequality. However, a more accurate assessment of the randomness produced in Bell experiments can be obtained if the value of several Bell expressions is simultaneously taken into account, or if the full set of probabilities characterizing the behavior of the device is considered. We introduce protocols for device-independent randomness generation secure against classical side information, that rely on the estimation of an arbitrary number of Bell expressions or even directly on the experimental frequencies of measurement outcomes. Asymptotically, this results in an optimal generation of randomness from experimental data (as measured by the min-entropy), without having to assume beforehand that the devices violate a specific Bell inequality.
Sciandra, Matthew; Sanbonmatsu, Lisa; Duncan, Greg J.; Gennetian, Lisa A.; Katz, Lawrence F.; Kessler, Ronald C.; Kling, Jeffrey R.; Ludwig, Jens
2013-01-01
Objectives Using data from a randomized experiment, to examine whether moving youth out of areas of concentrated poverty, where a disproportionate amount of crime occurs, prevents involvement in crime. Methods We draw on new administrative data from the U.S. Department of Housing and Urban Development’s Moving to Opportunity (MTO) experiment. MTO families were randomized into an experimental group offered a housing voucher that could only be used to move to a low-poverty neighborhood, a Section 8 housing group offered a standard housing voucher, and a control group. This paper focuses on MTO youth ages 15–25 in 2001 (n = 4,643) and analyzes intention to treat effects on neighborhood characteristics and criminal behavior (number of violent- and property-crime arrests) through 10 years after randomization. Results We find the offer of a housing voucher generates large improvements in neighborhood conditions that attenuate over time and initially generates substantial reductions in violent-crime arrests and sizable increases in property-crime arrests for experimental group males. The crime effects attenuate over time along with differences in neighborhood conditions. Conclusions Our findings suggest that criminal behavior is more strongly related to current neighborhood conditions (situational neighborhood effects) than to past neighborhood conditions (developmental neighborhood effects). The MTO design makes it difficult to determine which specific neighborhood characteristics are most important for criminal behavior. Our administrative data analyses could be affected by differences across areas in the likelihood that a crime results in an arrest. PMID:24348277
Young, Robert C.; Schulberg, Herbert C.; Gildengers, Ariel G.; Sajatovic, Martha; Mulsant, Benoit H.; Gyulai, Laszlo; Beyer, John; Marangell, Lauren; Kunik, Mark; Have, Thomas Ten; Bruce, Martha L.; Gur, Ruben; Marino, Patricia; Evans, Jovier D.; Reynolds, Charles F.; Alexopoulos, George S.
2010-01-01
Aim This report considers the conceptual and methodological concerns confronting clinical investigators seeking to generate knowledge regarding the tolerability and benefits of pharmacotherapy in geriatric bipolar (BP) patients. Method There is a continuing need for evidence-based guidelines derived from randomized controlled trials that will enhance drug treatment of geriatric BP patients. We, therefore, present the complex conceptual and methodological choices encountered in designing a multi-site clinical trial and the decisions reached by the investigators with the intention that study findings are pertinent to, and can facilitate, routine treatment decisions. Results Guided by a literature review and input from peers, the tolerability and anti-manic effect of lithium and valproate were judged to be the key mood stabilizers to investigate with regard to treating BP I manic, mixed and hypomanic states. The patient selection criteria are intended to generate a sample that experiences common treatment needs but which also represents the variety of older patients seen in university-based clinical settings. The clinical protocol guides titration of lithium and valproate to target serum concentrations, with lower levels allowed when necessitated by limited tolerability. The protocol emphasizes initial monotherapy. However, augmentation with risperidone is permitted after three weeks when indicated by operational criteria. Conclusions A randomized controlled trial that both investigates commonly prescribed mood stabilizers and maximizes patient participation can meaningfully address high priority clinical concerns directly relevant to the routine pharmacologic treatment of geriatric BP patients. PMID:20148867
Sciandra, Matthew; Sanbonmatsu, Lisa; Duncan, Greg J; Gennetian, Lisa A; Katz, Lawrence F; Kessler, Ronald C; Kling, Jeffrey R; Ludwig, Jens
2013-12-01
Using data from a randomized experiment, we examine whether moving youth out of areas of concentrated poverty, where a disproportionate amount of crime occurs, prevents involvement in crime. We draw on new administrative data from the U.S. Department of Housing and Urban Development's Moving to Opportunity (MTO) experiment. MTO families were randomized into an experimental group offered a housing voucher that could only be used to move to a low-poverty neighborhood, a Section 8 housing group offered a standard housing voucher, and a control group. This paper focuses on MTO youth ages 15-25 in 2001 (n = 4,643) and analyzes intention-to-treat effects on neighborhood characteristics and criminal behavior (number of violent- and property-crime arrests) through 10 years after randomization. We find the offer of a housing voucher generates large improvements in neighborhood conditions that attenuate over time and initially generates substantial reductions in violent-crime arrests and sizable increases in property-crime arrests for experimental group males. The crime effects attenuate over time along with differences in neighborhood conditions. Our findings suggest that criminal behavior is more strongly related to current neighborhood conditions (situational neighborhood effects) than to past neighborhood conditions (developmental neighborhood effects). The MTO design makes it difficult to determine which specific neighborhood characteristics are most important for criminal behavior. Our administrative data analyses could be affected by differences across areas in the likelihood that a crime results in an arrest.
NASA Astrophysics Data System (ADS)
Li, Jiafu; Xiang, Shuiying; Wang, Haoning; Gong, Junkai; Wen, Aijun
2018-03-01
In this paper, a novel image encryption algorithm based on synchronization of physical random bits generated in a cascade-coupled semiconductor ring laser (CCSRL) system is proposed, and its security analysis is performed. In both the transmitter and receiver parts, the CCSRL system is a master-slave configuration consisting of a master semiconductor ring laser (M-SRL) with cross-feedback and a solitary SRL (S-SRL). The proposed image encryption algorithm includes image preprocessing based on conventional chaotic maps, pixel confusion based on a control matrix extracted from the physical random bits, and pixel diffusion based on a random bit stream extracted from the physical random bits. Firstly, the preprocessing method is used to eliminate the correlation between adjacent pixels. Secondly, physical random bits with verified randomness are generated based on chaos in the CCSRL system and are used to simultaneously generate the control matrix and the random bit stream. Finally, the control matrix and random bit stream are used in the encryption algorithm to change the positions and the values of pixels, respectively. Simulation results and security analysis demonstrate that the proposed algorithm is effective and able to resist various typical attacks, and thus is an excellent candidate for secure image communication applications.
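The confusion and diffusion stages described in this abstract can be sketched in a few lines. The following is a minimal illustration only: the physical random bits from the CCSRL system are stood in for by a seeded NumPy generator, and the permutation-based confusion and XOR-based diffusion rules are assumptions, not the authors' exact construction.

```python
import numpy as np

def encrypt(image, rng):
    flat = image.flatten()
    # Confusion: permute pixel positions (stand-in for the control matrix).
    perm = rng.permutation(flat.size)
    confused = flat[perm]
    # Diffusion: XOR pixel values with a random bit stream.
    keystream = rng.integers(0, 256, size=flat.size, dtype=np.uint8)
    cipher = confused ^ keystream
    return cipher.reshape(image.shape), perm, keystream

def decrypt(cipher, perm, keystream):
    flat = cipher.flatten() ^ keystream   # undo diffusion
    restored = np.empty_like(flat)
    restored[perm] = flat                 # undo confusion
    return restored.reshape(cipher.shape)

rng = np.random.default_rng(42)
img = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)
cipher, perm, ks = encrypt(img, rng)
```

Changing pixel positions (confusion) and pixel values (diffusion) with material derived from the same verified-random bit source mirrors the division of labor the abstract describes.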
Probabilistic generation of random networks taking into account information on motifs occurrence.
Bois, Frederic Y; Gayraud, Ghislaine
2015-01-01
Because of the huge number of graphs possible even with a small number of nodes, inference on network structure is known to be a challenging problem. Generating large random directed graphs with prescribed probabilities of occurrences of some meaningful patterns (motifs) is also difficult. We show how to generate such random graphs according to a formal probabilistic representation, using fast Markov chain Monte Carlo methods to sample them. As an illustration, we generate realistic graphs with several hundred nodes mimicking a gene transcription interaction network in Escherichia coli.
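The sampling idea above can be illustrated with a toy Metropolis-Hastings chain over directed graphs. This is a minimal sketch under stated assumptions: the motif (reciprocal edge pairs), the energy function, and all parameter values are illustrative choices, not the paper's formal probabilistic representation.

```python
import numpy as np

def n_reciprocal(adj):
    # Count mutual (i <-> j) edge pairs, a simple two-node motif.
    return int(np.sum(adj * adj.T) // 2)

def sample_graph(n_nodes, target_motifs, beta=2.0, steps=5000, seed=0):
    rng = np.random.default_rng(seed)
    adj = np.zeros((n_nodes, n_nodes), dtype=int)
    energy = lambda a: beta * abs(n_reciprocal(a) - target_motifs)
    e = energy(adj)
    for _ in range(steps):
        i, j = rng.integers(n_nodes, size=2)
        if i == j:
            continue
        adj[i, j] ^= 1                        # propose: flip one directed edge
        e_new = energy(adj)
        if rng.random() < np.exp(e - e_new):  # Metropolis acceptance rule
            e = e_new
        else:
            adj[i, j] ^= 1                    # reject: undo the flip
    return adj

g = sample_graph(n_nodes=20, target_motifs=5)
```

The stationary distribution of this chain favors graphs whose motif count is near the prescribed target, which is the essence of generating random graphs with prescribed motif occurrence probabilities.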
Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore
2011-12-01
Randomization is a key step in reducing selection bias during the treatment allocation phase in randomized clinical trials. The process of randomization follows specific steps, which include generation of the randomization list, allocation concealment, and implementation of randomization. In the dental and orthodontic literature, treatment allocation is frequently characterized as random; however, the randomization procedures followed are often not appropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. Randomization entails generation of the random allocation, allocation concealment, and the actual methodology of implementing treatment allocation randomly and unpredictably. The most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons that make randomization an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
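Generation of a randomization list with permuted blocks, one common form of the restricted randomization mentioned above, can be sketched as follows. The block size, seed, and two-arm labels are assumptions for the example, not prescriptions from the article.

```python
import random

def permuted_block_list(n_blocks, block=("A", "A", "B", "B"), seed=2024):
    rng = random.Random(seed)
    sequence = []
    for _ in range(n_blocks):
        b = list(block)   # each block holds equal numbers per arm
        rng.shuffle(b)    # random order within the block
        sequence.extend(b)
    return sequence

# 10 blocks of 4 -> 40 allocations, balanced 2:2 within every block.
alloc = permuted_block_list(n_blocks=10)
```

Blocking guarantees that the arms stay balanced throughout recruitment; in practice the list is then kept concealed from those enrolling participants (allocation concealment).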
Generating Adaptive Behaviour within a Memory-Prediction Framework
Rawlinson, David; Kowadlo, Gideon
2012-01-01
The Memory-Prediction Framework (MPF) and its Hierarchical-Temporal Memory implementation (HTM) have been widely applied to unsupervised learning problems, for both classification and prediction. To date, there has been no attempt to incorporate MPF/HTM in reinforcement learning or other adaptive systems; that is, to use knowledge embodied within the hierarchy to control a system, or to generate behaviour for an agent. This problem is interesting because the human neocortex is believed to play a vital role in the generation of behaviour, and the MPF is a model of the human neocortex. We propose some simple and biologically-plausible enhancements to the Memory-Prediction Framework. These cause it to explore and interact with an external world, while trying to maximize a continuous, time-varying reward function. All behaviour is generated and controlled within the MPF hierarchy. The hierarchy develops from a random initial configuration by interaction with the world and reinforcement learning only. Among other demonstrations, we show that a 2-node hierarchy can learn to successfully play “rocks, paper, scissors” against a predictable opponent. PMID:22272231
High-speed true random number generation based on paired memristors for security electronics
NASA Astrophysics Data System (ADS)
Zhang, Teng; Yin, Minghui; Xu, Changmin; Lu, Xiayan; Sun, Xinhao; Yang, Yuchao; Huang, Ru
2017-11-01
A true random number generator (TRNG) is a critical component in hardware security that is increasingly important in the era of mobile computing and the Internet of Things. Here we demonstrate a TRNG using the intrinsic variation of memristors as a natural source of entropy that is otherwise undesirable in most applications. The random bits were produced by cyclically switching a pair of tantalum oxide based memristors and comparing their resistance values in the off state, taking advantage of the more pronounced resistance variation compared with that in the on state. Using an alternating read scheme in the designed TRNG circuit, the unbiasedness of the random numbers was significantly improved, and the bitstream passed standard randomness tests. The Pt/TaOx/Ta memristors fabricated in this work have fast programming/erasing speeds of ~30 ns, suggesting a high random number throughput. The approach proposed here thus holds great promise for physically implemented random number generation.
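The pairing-and-comparison idea, and why the alternating read improves unbiasedness, can be seen in a toy simulation. All device parameters below are illustrative assumptions (not measured values): two off-state resistances fluctuate from cycle to cycle, one device carries a deliberate fixed offset, and swapping the comparator inputs on every other cycle cancels that offset on average.

```python
import random

def off_resistance(rng, mean, spread):
    # Off-state resistance with pronounced cycle-to-cycle variation.
    return rng.gauss(mean, spread)

def trng_bits(n, seed=7, offset=5e3):
    rng = random.Random(seed)
    bits = []
    for cycle in range(n):
        r1 = off_resistance(rng, 1e6 + offset, 2e5)  # device 1, slight bias
        r2 = off_resistance(rng, 1e6, 2e5)           # device 2
        if cycle % 2:                                # alternating read:
            r1, r2 = r2, r1                          # swap comparator inputs
        bits.append(1 if r1 > r2 else 0)
    return bits

bits = trng_bits(10000)
```

Without the swap, the fixed offset would tilt every comparison the same way; alternating the read direction makes the bias cancel between even and odd cycles, in the spirit of the circuit described in the abstract.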
Genetic hitchhiking can promote the initial spread of strong altruism
2008-01-01
Background The evolutionary origin of strong altruism (where the altruist pays an absolute cost in terms of fitness) towards non-kin has never been satisfactorily explained since no mechanism (except genetic drift) seems to be able to overcome the fitness disadvantage of the individual who practiced altruism in the first place. Results Here we consider a multilocus, single-generation random group model and demonstrate that with low, but realistic levels of recombination and social heterosis (selecting for allelic diversity within groups) altruism can evolve without invoking kin selection, because sampling effects in the formation of temporary groups and selection for complementary haplotypes generate nonrandom associations between alleles at polymorphic loci. Conclusion By letting altruism get off the ground, selection on other genes favourably interferes with the eventual fate of the altruistic trait due to genetic hitchhiking. PMID:18847475
Mairesse, Olivier; Hofmans, Joeri; Theuns, Peter
2008-05-01
We propose a free, easy-to-use computer program that does not require prior knowledge of computer programming to generate and run experiments using textual or pictorial stimuli. Although the FM Experiment Builder suite was initially programmed for building and conducting FM experiments, it can also be applied to non-FM experiments that necessitate randomized, single, or multifactorial designs. The program is highly configurable, allowing multilingual use and a wide range of different response formats. The outputs of the experiments are Microsoft Excel compatible .xls files that allow easy copy-paste of the results into Weiss's FM CalSTAT program (2006) or any other statistical package. Its Java-based structure is compatible with both Windows and Macintosh operating systems, and its compactness (< 1 MB) makes it easily distributable over the Internet.
The Role of Reticulate Evolution in Creating Innovation and Complexity
Swithers, Kristen S.; Soucy, Shannon M.; Gogarten, J. Peter
2012-01-01
Reticulate evolution encompasses processes that conflict with traditional Tree of Life efforts. These processes, which include horizontal gene transfer (HGT) and gene and whole-genome duplications through allopolyploidization, are some of the main driving forces for generating innovation and complexity. HGT has a profound impact on prokaryotic and eukaryotic evolution. HGTs can lead to the invention of new metabolic pathways and the expansion and enhancement of previously existing pathways. It allows for organismal adaptation into new ecological niches and new host ranges. Although many HGTs appear to be selected for because they provide some benefit to their recipient lineage, other HGTs may be maintained by chance through random genetic drift. Moreover, some HGTs that may initially seem parasitic in nature can cause complexity to arise through pathways of neutral evolution. Another mechanism for generating innovation and complexity, occurring more frequently in eukaryotes than in prokaryotes, is gene and genome duplication, which often occurs through allopolyploidization. We discuss how these different evolutionary processes contribute to generating innovation and complexity. PMID:22844638
Stepwise Loop Insertion Strategy for Active Site Remodeling to Generate Novel Enzyme Functions.
Hoque, Md Anarul; Zhang, Yong; Chen, Liuqing; Yang, Guangyu; Khatun, Mst Afroza; Chen, Haifeng; Hao, Liu; Feng, Yan
2017-05-19
The remodeling of active sites to generate novel biocatalysts is an attractive and challenging task. We developed a stepwise loop insertion strategy (StLois), in which randomized residue pairs are inserted into active site loops. The phosphotriesterase-like lactonase from Geobacillus kaustophilus (GkaP-PLL) was used to investigate StLois's potential for changing enzyme function. By inserting six residues into active site loop 7, the best variant ML7-B6 demonstrated a 16-fold further increase in catalytic efficiency toward ethyl-paraoxon compared with its initial template, amounting to a 609-fold increase and a >10^7-fold shift in substrate specificity relative to that of the wild-type lactonase. The remodeled variants displayed 760-fold greater organophosphate hydrolysis activity toward the organophosphates parathion, diazinon, and chlorpyrifos. Structure and docking computations support the source of the notably inverted enzyme specificity. Considering the fundamental importance of active site loops, the strategy has potential for the rapid generation of novel enzyme functions by loop remodeling.
On the design of henon and logistic map-based random number generator
NASA Astrophysics Data System (ADS)
Magfirawaty; Suryadi, M. T.; Ramli, Kalamullah
2017-10-01
The key sequence is one of the main elements in a cryptosystem. True random number generation (TRNG) is one approach to generating the key sequence. The randomness sources of TRNGs divide into three main groups: electrical noise based, jitter based, and chaos based. Chaos-based sources utilize a non-linear dynamic system (continuous time or discrete time) as an entropy source. In this study, a new design of TRNG based on a discrete time chaotic system is proposed, which is then simulated in LabVIEW. The principle of the design consists of combining 2D and 1D chaotic systems. A mathematical model is implemented for numerical simulations. We used a comparator process as a harvester method to obtain the series of random bits. Without any post-processing, the proposed design generated random bit sequences with high entropy value and passed all NIST 800.22 statistical tests.
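The "2D plus 1D" combination with a comparator harvester can be sketched as follows. This is only an illustration of the principle: a Hénon map (2D) and a logistic map (1D) are iterated in parallel, and one bit per step is harvested by comparing their rescaled states. The specific maps, parameter values, rescaling, and comparison rule are assumptions, not the paper's design.

```python
def henon_logistic_bits(n, x=0.1, y=0.3, z=0.4, a=1.4, b=0.3, r=3.99):
    bits = []
    for _ in range(n):
        x, y = 1 - a * x * x + y, b * x   # Henon map (2D chaotic system)
        z = r * z * (1 - z)               # logistic map (1D chaotic system)
        # Comparator harvester: rescale the Henon x-state to [0, 1]
        # and compare it against the logistic state.
        bits.append(1 if (x + 1.5) / 3.0 > z else 0)
    return bits

bits = henon_logistic_bits(5000)
```

In a hardware-oriented design such as the LabVIEW simulation described above, a post-processing or bias-correction stage would normally follow the comparator; it is omitted here for brevity.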
Pseudo-random bit generator based on lag time series
NASA Astrophysics Data System (ADS)
García-Martínez, M.; Campos-Cantón, E.
2014-12-01
In this paper, we present a pseudo-random bit generator (PRBG) based on two lag time series of the logistic map using positive and negative values of the bifurcation parameter. In order to hide the map used to build the pseudo-random series, we have introduced a delay in the generation of the time series. When these new series are plotted as x_n against x_{n+1}, they present a cloud of points unrelated to the logistic map. Finally, the pseudo-random sequences have been tested with the NIST suite, giving satisfactory results for use in stream ciphers.
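The lag idea can be illustrated with a small sketch. This is a loose stand-in for the paper's construction: two logistic time series with different (both positive) parameters are iterated, a delayed sample of one is compared with the current sample of the other, and the comparison is thresholded into bits; the parameters, lag, and harvesting rule are all illustrative assumptions.

```python
from collections import deque

def lagged_logistic_bits(n, x=0.37, y=0.62, r1=3.99, r2=3.77, lag=7):
    hist = deque()
    bits = []
    for _ in range(n + lag):
        x = r1 * x * (1 - x)   # first logistic time series
        y = r2 * y * (1 - y)   # second logistic time series
        hist.append(x)
        if len(hist) > lag:
            # Compare the delayed sample of one series with the current
            # sample of the other: plotted pairwise, the outputs no longer
            # trace the logistic parabola.
            bits.append(1 if hist.popleft() > y else 0)
    return bits

bits = lagged_logistic_bits(2000)
```

The delay buffer is what decouples consecutive outputs from the underlying map iteration, which is the property the abstract relies on to hide the generating map.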
Meidahl Petersen, Kasper; Eplov, Kasper; Kjær Nielsen, Torben; Jimenez-Solem, Espen; Petersen, Morten; Broedbaek, Kasper; Daugaard Popik, Sara; Kallehave Hansen, Lina; Enghusen Poulsen, Henrik; Trærup Andersen, Jon
2016-01-01
Trimethoprim antagonizes the action of folate by inhibition of dihydrofolate reductase. This could diminish serum folate levels in humans and cause folate deficiency in some patients. We conducted a randomized, double-blind, placebo-controlled trial to investigate the effect of trimethoprim on serum folate levels in healthy participants after a 7-day trial period. Thirty young, healthy males were randomly allocated to receive trimethoprim, 200 mg twice daily, and 30 were randomly allocated to placebo. Before trial initiation, participant numbers were given randomly generated treatment allocations within sealed opaque envelopes. Participants and all staff were kept blinded to treatment allocations during the trial. Serum folate was measured at baseline and at the end of the trial. Of the 58 participants analyzed (30 in the trimethoprim group and 28 in the placebo group), 8 had folate deficiency at baseline. Within the trimethoprim group, serum folate was significantly decreased (P = 0.018) after the trial. We found a mean decrease in serum folate among trimethoprim-exposed participants of 1.95 nmol/L, compared with a 0.21 nmol/L mean increase in the placebo group (P = 0.040). The proportion of folate-deficient participants increased significantly within the trimethoprim group (P = 0.034). No serious adverse events were observed. In conclusion, we found that a daily dose of 400 mg trimethoprim for 7 days significantly lowered serum folate levels in healthy study participants.
Work distributions for random sudden quantum quenches
NASA Astrophysics Data System (ADS)
Łobejko, Marcin; Łuczka, Jerzy; Talkner, Peter
2017-05-01
The statistics of work performed on a system by a sudden random quench is investigated. Considering systems with finite dimensional Hilbert spaces, we model a sudden random quench by randomly choosing elements from a Gaussian unitary ensemble (GUE) consisting of Hermitian matrices with identically Gaussian distributed matrix elements. A probability density function (pdf) of work in terms of initial and final energy distributions is derived and evaluated for a two-level system. Explicit results are obtained for quenches with a sharply given initial Hamiltonian, while the work pdfs for quenches between Hamiltonians from two independent GUEs can only be determined in explicit form in the limits of zero and infinite temperature. The same work distribution as for a sudden random quench is obtained for an adiabatic, i.e., infinitely slow, protocol connecting the same initial and final Hamiltonians.
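The setup with a sharply given initial Hamiltonian can be sampled numerically via the standard two-point measurement scheme (measure the initial energy, quench to a GUE-drawn Hamiltonian, measure the final energy). The matrix scaling, inverse temperature, and initial level splitting below are illustrative assumptions for a minimal sketch, not the paper's conventions.

```python
import numpy as np

def gue(n, rng):
    # Hermitian matrix with Gaussian-distributed entries (GUE-style draw).
    a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (a + a.conj().T) / 2

def work_samples(n_samples, beta=1.0, seed=1):
    rng = np.random.default_rng(seed)
    h0 = np.diag([-0.5, 0.5])                # sharply given initial Hamiltonian
    e0 = np.diag(h0).real
    p0 = np.exp(-beta * e0); p0 /= p0.sum()  # thermal initial occupations
    works = []
    for _ in range(n_samples):
        hf = gue(2, rng)                     # random final Hamiltonian
        ef, v = np.linalg.eigh(hf)
        i = rng.choice(2, p=p0)              # first energy measurement
        pf = np.abs(v[i, :]) ** 2            # |<f_k|i>|^2 transition weights
        pf /= pf.sum()
        k = rng.choice(2, p=pf)              # second energy measurement
        works.append(ef[k] - e0[i])          # work = E_f - E_i
    return np.array(works)

w = work_samples(2000)
```

A histogram of `w` approximates the work pdf for the two-level system; averaging over many GUE draws corresponds to the ensemble-averaged distribution discussed in the abstract.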
Kumar, Keshav
2017-11-01
Multivariate curve resolution alternating least squares (MCR-ALS) analysis is the most commonly used curve resolution technique. The MCR-ALS model is fitted using the alternating least squares (ALS) algorithm, which needs initialisation of either the contribution profiles or the spectral profiles of each factor. The contribution profiles can be initialised using evolving factor analysis; however, in principle, this approach requires that the data belong to a sequential process. The initialisation of the spectral profiles is usually carried out using a pure variable approach such as the SIMPLISMA algorithm; this approach demands that each factor have pure variables in the data set. Despite these limitations, the existing approaches have been quite successful for initiating the MCR-ALS analysis. However, the present work proposes an alternate approach for the initialisation of the spectral variables by generating random variables within the limits spanned by the maxima and minima of each spectral variable of the data set. The proposed approach does not require that there be pure variables for each component of the multicomponent system or that the concentration direction follow a sequential process. The proposed approach is successfully validated using excitation-emission matrix fluorescence data sets acquired for certain fluorophores with significant spectral overlap. The calculated contribution and spectral profiles of these fluorophores are found to correlate well with the experimental results. In summary, the present work proposes an alternate way to initiate the MCR-ALS analysis.
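The proposed initialisation step is straightforward to express in code: for each spectral variable (column of the data matrix), draw a random value between the minimum and maximum observed across all spectra, giving one random initial spectral profile per factor. The data matrix and factor count below are synthetic placeholders for illustration.

```python
import numpy as np

def random_spectral_init(data, n_factors, seed=0):
    rng = np.random.default_rng(seed)
    lo = data.min(axis=0)   # per-variable minima across all spectra
    hi = data.max(axis=0)   # per-variable maxima across all spectra
    # One random profile per factor, bounded by the observed limits.
    return lo + rng.random((n_factors, data.shape[1])) * (hi - lo)

# Example: 50 spectra over 100 wavelengths, initial guesses for 3 factors.
data = np.abs(np.random.default_rng(1).normal(size=(50, 100)))
s0 = random_spectral_init(data, n_factors=3)
```

These bounded random profiles would then seed the ALS iterations in place of SIMPLISMA-derived pure-variable estimates.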
Zheng, Guanglou; Fang, Gengfa; Shankaran, Rajan; Orgun, Mehmet A; Zhou, Jie; Qiao, Li; Saleem, Kashif
2017-05-01
Generating random binary sequences (BSes) is a fundamental requirement in cryptography. A BS is a sequence of N bits, and each bit has a value of 0 or 1. For securing sensors within wireless body area networks (WBANs), electrocardiogram (ECG)-based BS generation methods have been widely investigated, in which interpulse intervals (IPIs) from each heartbeat cycle are processed to produce BSes. Using these IPI-based methods to generate a 128-bit BS in real time normally takes around half a minute. In order to improve the time efficiency of such methods, this paper presents an ECG multiple fiducial-points based binary sequence generation (MFBSG) algorithm. The technique of discrete wavelet transforms is employed to detect the arrival times of fiducial points, such as the P, Q, R, S, and T peaks. Time intervals between them, including the RR, RQ, RS, RP, and RT intervals, are then calculated based on these arrival times and are used as ECG features to generate random BSes with low latency. According to our analysis on real ECG data, these ECG feature values exhibit the property of randomness and, thus, can be utilized to generate random BSes. Compared with schemes that solely rely on IPIs to generate BSes, the MFBSG algorithm uses five feature values from one heartbeat cycle, and can be up to five times faster than the solely IPI-based methods. It thus achieves the design goal of low latency. According to our analysis, the complexity of the algorithm is comparable to that of fast Fourier transforms. These randomly generated ECG BSes can be used as security keys for encryption or authentication in a WBAN system.
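The five-intervals-per-beat idea can be sketched as follows. The peak times here are synthetic placeholders (real use would detect them with a discrete wavelet transform, as the abstract describes), and the least-significant-bit quantization rule is an assumption for illustration rather than the paper's exact encoding.

```python
def beat_bits(prev_r, p, q, r, s, t, n_bits=4, scale=1000.0):
    # Five features from one beat: RR, RQ, RS, RP, RT intervals.
    intervals = [r - prev_r, r - q, s - r, r - p, t - r]
    bits = []
    for iv in intervals:
        # Quantize the interval (seconds -> ms) and keep its least
        # significant bits, where most of the beat-to-beat variability lives.
        code = int(iv * scale) & ((1 << n_bits) - 1)
        bits.extend((code >> b) & 1 for b in range(n_bits))
    return bits

# One synthetic beat, times in seconds: 4 bits x 5 intervals = 20 bits/beat.
bits = beat_bits(prev_r=0.0, p=0.71, q=0.79, r=0.82, s=0.86, t=1.05)
```

Harvesting several intervals per beat instead of one IPI is exactly what gives the claimed up-to-fivefold speedup over solely IPI-based schemes.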
Random Item Generation Is Affected by Age
ERIC Educational Resources Information Center
Multani, Namita; Rudzicz, Frank; Wong, Wing Yiu Stephanie; Namasivayam, Aravind Kumar; van Lieshout, Pascal
2016-01-01
Purpose: Random item generation (RIG) involves central executive functioning. Measuring aspects of random sequences can therefore provide a simple method to complement other tools for cognitive assessment. We examine the extent to which RIG relates to specific measures of cognitive function, and whether those measures can be estimated using RIG…
NASA Astrophysics Data System (ADS)
Molotkov, S. N.
2017-03-01
Various methods for the clustering of photocounts constituting a sequence of random numbers are considered. It is shown that the clustering of photocounts resulting in the Fermi-Dirac distribution makes it possible to achieve the theoretical limit of the random number generation rate.
NASA Astrophysics Data System (ADS)
Avery, Patrick; Zurek, Eva
2017-04-01
A new algorithm, RANDSPG, that can be used to generate trial crystal structures with specific space groups and compositions is described. The program has been designed for systems where the atoms are independent of one another, and it is therefore primarily suited towards inorganic systems. The structures that are generated adhere to user-defined constraints such as: the lattice shape and size, stoichiometry, set of space groups to be generated, and factors that influence the minimum interatomic separations. In addition, the user can optionally specify if the most general Wyckoff position is to be occupied or constrain select atoms to specific Wyckoff positions. Extensive testing indicates that the algorithm is efficient and reliable. The library is lightweight, portable, dependency-free and is published under a license recognized by the Open Source Initiative. A web interface for the algorithm is publicly accessible at http://xtalopt.openmolecules.net/randSpg/randSpg.html. RANDSPG has also been interfaced with the XTALOPT evolutionary algorithm for crystal structure prediction, and it is illustrated that the use of symmetric lattices in the first generation of randomly created individuals decreases the number of structures that need to be optimized to find the global energy minimum.
The correlation structure of several popular pseudorandom number generators
NASA Technical Reports Server (NTRS)
Neuman, F.; Merrick, R.; Martin, C. F.
1973-01-01
One of the desirable properties of a pseudorandom number generator is that the sequence of numbers it generates should have very low autocorrelation for all shifts except for zero shift and those that are multiples of its cycle length. Due to the simple methods of constructing random numbers, this ideal is often not quite fulfilled. A simple method of examining any random generator for previously unsuspected regularities is discussed. Once they are discovered, it is often easy to derive the mathematical relationships which describe the regular behavior. As examples, it is shown that high correlation exists in mixed and multiplicative congruential random number generators and prime moduli Lehmer generators for shifts that are a fraction of their cycle lengths.
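The kind of examination described above is easy to reproduce at small scale: generate a sequence from a deliberately tiny mixed congruential generator and measure its circular autocorrelation at chosen shifts. The constants below are illustrative (chosen to give a full period of 256); the paper's findings concern correlations at shifts that are a fraction of the cycle length.

```python
def lcg_sequence(n, seed=1, a=137, c=187, m=256):
    # Mixed congruential generator x' = (a*x + c) mod m, full period
    # for these constants since c is odd and a = 1 (mod 4).
    xs, x = [], seed
    for _ in range(n):
        x = (a * x + c) % m
        xs.append(x / m)   # normalize to [0, 1)
    return xs

def autocorr(xs, shift):
    # Circular autocorrelation at the given shift.
    n = len(xs)
    mean = sum(xs) / n
    var = sum((v - mean) ** 2 for v in xs) / n
    cov = sum((xs[i] - mean) * (xs[(i + shift) % n] - mean)
              for i in range(n)) / n
    return cov / var

xs = lcg_sequence(1024)          # four full cycles of the generator
r0 = autocorr(xs, 0)             # unity at zero shift, by construction
r_cycle = autocorr(xs, 256)      # unity again at the full cycle length
```

Scanning `autocorr` over intermediate shifts is the simple diagnostic the paper advocates: any shift with unexpectedly high correlation reveals a structural regularity of the generator.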
Mohamed, Somaia; Rosenheck, Robert A; Lin, Haiqun; Swartz, Marvin; McEvoy, Joseph; Stroup, Scott
2015-07-01
No large-scale randomized trial has compared the effect of different second-generation antipsychotic drugs and any first-generation drug on alcohol, drug and nicotine use in patients with schizophrenia. The Clinical Antipsychotic Trials of Intervention Effectiveness study randomly assigned 1432 patients formally diagnosed with schizophrenia to four second-generation antipsychotic drugs (olanzapine, risperidone, quetiapine, and ziprasidone) and one first-generation antipsychotic (perphenazine) and followed them for up to 18 months. Secondary outcome data documented cigarettes smoked in the past week and alcohol and drug use severity ratings. At baseline, 61% of patients smoked, 35% used alcohol, and 23% used illicit drugs. Although there were significant effects of time showing reduction in substance use over the 18 months (all p < 0.0001), this study found no evidence that any antipsychotic was robustly superior to any other in a secondary analysis of data on substance use outcomes from a large 18-month randomized schizophrenia trial.
High-Speed Device-Independent Quantum Random Number Generation without a Detection Loophole
NASA Astrophysics Data System (ADS)
Liu, Yang; Yuan, Xiao; Li, Ming-Han; Zhang, Weijun; Zhao, Qi; Zhong, Jiaqiang; Cao, Yuan; Li, Yu-Huai; Chen, Luo-Kan; Li, Hao; Peng, Tianyi; Chen, Yu-Ao; Peng, Cheng-Zhi; Shi, Sheng-Cai; Wang, Zhen; You, Lixing; Ma, Xiongfeng; Fan, Jingyun; Zhang, Qiang; Pan, Jian-Wei
2018-01-01
Quantum mechanics provides the means of generating genuine randomness that is impossible with deterministic classical processes. Remarkably, the unpredictability of randomness can be certified in a manner that is independent of implementation devices. Here, we present an experimental study of device-independent quantum random number generation based on a detection-loophole-free Bell test with entangled photons. In the randomness analysis, without the independent identical distribution assumption, we consider the worst-case scenario in which the adversary launches the most powerful quantum attacks. After considering statistical fluctuations and applying an 80 Gb × 45.6 Mb Toeplitz matrix hashing, we achieve a final random bit rate of 114 bits/s, with a failure probability less than 10^-5. This marks a critical step towards realistic applications in cryptography and fundamental physics tests.
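Toeplitz-matrix hashing, the extraction step mentioned above, works at any scale: a binary Toeplitz matrix is fully determined by its first row and first column, and multiplying the raw bit vector by it over GF(2) yields the shorter extracted output. The toy sizes and seed bits below are illustrative, a far cry from the 80 Gb × 45.6 Mb matrix used in the experiment.

```python
def toeplitz_hash(raw_bits, first_col, first_row):
    m, n = len(first_col), len(first_row)
    assert len(raw_bits) == n
    out = []
    for i in range(m):
        acc = 0
        for j in range(n):
            # Entry T[i][j] of a Toeplitz matrix depends only on i - j.
            t = first_col[i - j] if i >= j else first_row[j - i]
            acc ^= t & raw_bits[j]   # GF(2) dot product (XOR of AND terms)
        out.append(acc)
    return out

raw = [1, 0, 1, 1, 0, 1, 0, 0]    # 8 raw bits from the source
col = [1, 0, 1]                   # seed bits: first column (output length 3)
row = [1, 1, 0, 0, 1, 0, 1, 1]    # seed bits: first row (input length 8)
out = toeplitz_hash(raw, col, row)
```

Because only the first row and column need to be stored, the seed cost grows linearly with the matrix dimensions rather than with their product, which is what makes extraction at the gigabit scale reported above tractable.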
Utz, Bettina; Assarag, Bouchra; Essolbi, Amina; Barkat, Amina; El Ansari, Nawal; Fakhir, Bouchra; Delamou, Alexandre; De Brouwere, Vincent
2017-06-19
Morocco is facing a growing prevalence of diabetes and, according to the latest figures of the World Health Organization, 12.4% of the population are already affected. A similar prevalence has been reported for gestational diabetes (GDM) and, although it is not yet high on the national agenda, immediate and long-term complications threaten the health of mothers and future generations. A situational analysis of GDM conducted in 2015 revealed difficulties in access to screening and delays in receiving appropriate care. The objective of this implementation study is to evaluate a decentralized GDM detection and management approach through the primary level of care and assess its potential for scaling up. We will conduct hybrid effectiveness-implementation research using a cluster randomized controlled trial design in two districts of Morocco. Using the health center as the unit of randomization, we randomly selected 20 health centers, with 10 serving as intervention and 10 as control facilities. In the intervention arm, providers will screen pregnant women attending antenatal care for GDM by capillary glucose testing. Women who test positive will receive nutritional counselling and will be followed up through the health center. In the control facilities, screening and initial management of GDM will follow standard practice. The primary outcome will be birthweight, with weight gain during pregnancy, average glucose levels and pregnancy outcomes, including mode of delivery, presence or absence of obstetric or newborn complications and the prevalence of GDM at health center level, as secondary outcomes. Furthermore, we will assess the quality of life/care experienced by the women in both arms. Qualitative methods will be applied to evaluate the feasibility of the intervention at primary level and its adoption by the health care providers.
In Morocco, gestational diabetes screening and its initial management are fragmented and coupled with difficulties in access and treatment delays. Implementation of a strategy that enables detection, management and follow-up of affected women at primary health care level is expected to have a positive impact on access to care and medical outcomes. The trial has been registered on clinicaltrials.gov; identifier NCT02979756; retrospectively registered 22 November 2016.
NASA Astrophysics Data System (ADS)
Leetmaa, Mikael; Skorodumova, Natalia V.
2015-11-01
We here present a revised version, v1.1, of the KMCLib general framework for kinetic Monte-Carlo (KMC) simulations. The generation of random numbers in KMCLib now relies on the C++11 standard library implementation, and support has been added for the user to choose from a set of C++11 implemented random number generators. The Mersenne-twister, the 24 and 48 bit RANLUX and a 'minimal-standard' PRNG are supported. We have also included the possibility to use true random numbers via the C++11 std::random_device generator. This release also includes technical updates to support the use of an extended range of operating systems and compilers.
Minimalist design of a robust real-time quantum random number generator
NASA Astrophysics Data System (ADS)
Kravtsov, K. S.; Radchenko, I. V.; Kulik, S. P.; Molotkov, S. N.
2015-08-01
We present a simple and robust construction of a real-time quantum random number generator (QRNG). Our minimalist approach ensures stable operation of the device as well as its simple and straightforward hardware implementation as a stand-alone module. As a source of randomness the device uses measurements of time intervals between clicks of a single-photon detector. The obtained raw sequence is then filtered and processed by a deterministic randomness extractor, which is realized as a look-up table. This enables high speed on-the-fly processing without the need of extensive computations. The overall performance of the device is around 1 random bit per detector click, resulting in 1.2 Mbit/s generation rate in our implementation.
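The time-interval approach above can be modeled in a few lines. This toy sketch assumes exponentially distributed inter-click times and reduces each pair of consecutive intervals to one bit by comparison, a simple symmetric stand-in for the paper's look-up-table extractor (which achieves roughly one bit per click rather than per pair). The detector rate and seed are illustrative.

```python
import random

def click_intervals(n, rate=1.0e6, seed=3):
    # Poissonian photon arrivals give exponential inter-click intervals.
    rng = random.Random(seed)
    return [rng.expovariate(rate) for _ in range(n)]

def extract_bits(intervals):
    bits = []
    # Pair up intervals; "first shorter than second" -> 1, else 0.
    # The exponential distribution is memoryless, so both orderings
    # are equally likely and the output is unbiased by construction.
    for t1, t2 in zip(intervals[::2], intervals[1::2]):
        if t1 != t2:   # discard (vanishingly rare) exact ties
            bits.append(1 if t1 < t2 else 0)
    return bits

bits = extract_bits(click_intervals(20000))
```

A precomputed look-up table, as in the device described above, plays the same role as this comparison but maps raw interval codes to output bits with higher extraction efficiency and without run-time arithmetic.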
Promoting Physical Activity through Hand-Held Computer Technology
King, Abby C.; Ahn, David K.; Oliveira, Brian M.; Atienza, Audie A.; Castro, Cynthia M.; Gardner, Christopher D.
2009-01-01
Background: Efforts to achieve population-wide increases in walking and similar moderate-intensity physical activities potentially can be enhanced through relevant applications of state-of-the-art interactive communication technologies. Yet few systematic efforts to evaluate the efficacy of hand-held computers and similar devices for enhancing physical activity levels have occurred. The purpose of this first-generation study was to evaluate the efficacy of a hand-held computer (i.e., personal digital assistant [PDA]) for increasing moderate-intensity or more vigorous (MOD+) physical activity levels over 8 weeks in mid-life and older adults relative to a standard information control arm. Design: Randomized, controlled 8-week experiment. Data were collected in 2005 and analyzed in 2006-2007. Setting/Participants: Community-based study of 37 healthy, initially underactive adults aged 50 years and older who were randomized and completed the 8-week study (intervention=19, control=18). Intervention: Participants received an instructional session and a PDA programmed to monitor their physical activity levels twice per day and to provide daily and weekly individualized feedback, goal setting, and support. Controls received standard, age-appropriate written physical activity educational materials. Main Outcome Measure: Physical activity was assessed via the Community Healthy Activities Model Program for Seniors (CHAMPS) questionnaire at baseline and 8 weeks. Results: Relative to controls, intervention participants reported significantly greater 8-week mean estimated caloric expenditure levels and minutes per week in MOD+ activity (p<0.04). Satisfaction with the PDA was reasonably high in this largely PDA-naive sample. Conclusions: Results from this first-generation study indicate that hand-held computers may be effective tools for increasing initial physical activity levels among underactive adults. PMID:18201644
Ohayon, Elan L; Kalitzin, Stiliyan; Suffczynski, Piotr; Jin, Frank Y; Tsang, Paul W; Borrett, Donald S; Burnham, W McIntyre; Kwan, Hon C
2004-01-01
The problem of demarcating neural network space is formidable. A simple fully connected recurrent network of five units (binary activations, synaptic weight resolution of 10) has 3.2 × 10^26 possible initial states. The problem increases drastically with scaling. Here we consider three complementary approaches to help direct the exploration to distinguish epileptic from healthy networks. [1] First, we perform a gross mapping of the space of five-unit continuous recurrent networks using randomized weights and initial activations. The majority of weight patterns (>70%) were found to result in neural assemblies exhibiting periodic limit-cycle oscillatory behavior. [2] Next we examine the activation space of non-periodic networks, demonstrating that the emergence of paroxysmal activity does not require changes in connectivity. [3] The next challenge is to focus the search of network space to identify networks with more complex dynamics. Here we rely on a major available indicator critical to clinical assessment but largely ignored by epilepsy modelers, namely: behavioral states. To this end, we connected the above network layout to an external robot in which interactive states were evolved. The first random generation showed a distribution in line with approach [1]. That is, the predominant phenotypes were fixed-point or oscillatory with seizure-like motor output. As evolution progressed the profile changed markedly. Within 20 generations the entire population was able to navigate a simple environment, with all individuals exhibiting multiply-stable behaviors and no cases of default locked limit-cycle oscillatory motor behavior. The resultant population may thus afford us a view of the architectural principles demarcating healthy biological networks from the pathological.
The approach has an advantage over other epilepsy modeling techniques in providing a way to clarify whether observed dynamics or suggested therapies are pointing to computational viability or dead space.
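The gross-mapping step in approach [1] can be sketched as follows: draw random weights and initial activations for a five-unit continuous recurrent network, iterate it, and apply a crude dynamical label. This is a toy classifier under stated assumptions (tanh units, a drift threshold to separate fixed points from everything else), not the authors' analysis pipeline:

```python
import random, math

def simulate(weights, state, steps=500):
    """Iterate a fully connected tanh recurrent network."""
    n = len(state)
    for _ in range(steps):
        state = [math.tanh(sum(weights[i][j] * state[j] for j in range(n)))
                 for i in range(n)]
    return state

def classify(weights, state, eps=1e-6):
    """Crude label: 'fixed-point' if the state stops moving after a long
    transient, otherwise 'oscillatory' (which lumps in any non-fixed dynamics)."""
    s1 = simulate(weights, state)
    s2 = simulate(weights, s1, steps=1)
    drift = max(abs(a - b) for a, b in zip(s1, s2))
    return "fixed-point" if drift < eps else "oscillatory"

random.seed(0)
n = 5
W = [[random.uniform(-2, 2) for _ in range(n)] for _ in range(n)]
x0 = [random.uniform(-1, 1) for _ in range(n)]
label = classify(W, x0)
```

Running this over many random draws of `W` and `x0` and tallying the labels is the kind of census that produced the ">70% periodic" figure above.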
Phage display peptide libraries: deviations from randomness and correctives
Ryvkin, Arie; Ashkenazy, Haim; Weiss-Ottolenghi, Yael; Piller, Chen; Pupko, Tal; Gershoni, Jonathan M
2018-01-01
Peptide-expressing phage display libraries are widely used for the interrogation of antibodies. Affinity-selected peptides are then analyzed to discover epitope mimetics, or are subjected to computational algorithms for epitope prediction. A critical assumption for these applications is the random representation of amino acids in the initial naïve peptide library. In a previous study, we implemented next-generation sequencing to evaluate a naïve library and discovered severe deviations from randomness in UAG codon over-representation as well as in high G phosphoramidite abundance causing amino acid distribution biases. In this study, we demonstrate that the UAG over-representation can be attributed to the burden imposed on the phage upon the assembly of the recombinant Protein 8 subunits. This was corrected by constructing the libraries using supE44-containing bacteria, which suppress the UAG-driven abortive termination. We also demonstrate that the overabundance of G stems from variable synthesis efficiency and can be corrected using compensating oligonucleotide mixtures calibrated by mass spectrometry. Construction of libraries implementing these correctives results in markedly improved libraries that display random distribution of amino acids, thus ensuring that enriched peptides obtained in biopanning represent a genuine selection event, a fundamental assumption for phage display applications. PMID:29420788
Guschinskaya, Natalia; Brunel, Romain; Tourte, Maxime; Lipscomb, Gina L; Adams, Michael W W; Oger, Philippe; Charpentier, Xavier
2016-11-08
Transposition mutagenesis is a powerful tool to identify the function of genes, reveal essential genes and generally to unravel the genetic basis of living organisms. However, transposon-mediated mutagenesis has only been successfully applied to a limited number of archaeal species and has never been reported in Thermococcales. Here, we report random insertion mutagenesis in the hyperthermophilic archaeon Pyrococcus furiosus. The strategy takes advantage of the natural transformability of derivatives of the P. furiosus COM1 strain and of in vitro Mariner-based transposition. A transposon bearing a genetic marker is randomly transposed in vitro in genomic DNA that is then used for natural transformation of P. furiosus. A small-scale transposition reaction routinely generates several hundred and up to two thousand transformants. Southern analysis and sequencing showed that the obtained mutants contain a single, random genomic insertion. Polyploidy has been reported in Thermococcales and P. furiosus is suspected of being polyploid. Yet, about half of the mutants obtained on the first selection are homozygous for the transposon insertion. Two rounds of isolation on selective medium were sufficient to obtain gene conversion in initially heterozygous mutants. This transposition mutagenesis strategy will greatly facilitate functional exploration of the Thermococcales genomes.
Spline methods for approximating quantile functions and generating random samples
NASA Technical Reports Server (NTRS)
Schiess, J. R.; Matthews, C. G.
1985-01-01
Two cubic spline formulations are presented for representing the quantile function (inverse cumulative distribution function) of a random sample of data. Both B-spline and rational spline approximations are compared with analytic representations of the quantile function. It is also shown how these representations can be used to generate random samples for use in simulation studies. Comparisons are made on samples generated from known distributions and on a sample of experimental data. The spline representations are more accurate for multimodal and skewed samples and require much less time to generate samples than the analytic representation.
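The underlying sampling idea, fit the empirical quantile function once, then evaluate it at uniform random numbers, can be sketched with a piecewise-linear interpolant (a deliberate simplification; the paper fits cubic B-splines and rational splines, not linear pieces):

```python
import random, bisect

def make_quantile(sample):
    """Piecewise-linear interpolant of the empirical quantile function,
    a stand-in for the paper's spline fits."""
    xs = sorted(sample)
    n = len(xs)
    ps = [(i + 0.5) / n for i in range(n)]     # plotting positions
    def q(u):
        if u <= ps[0]:
            return xs[0]
        if u >= ps[-1]:
            return xs[-1]
        j = bisect.bisect_left(ps, u)          # ps[j-1] < u <= ps[j]
        t = (u - ps[j - 1]) / (ps[j] - ps[j - 1])
        return xs[j - 1] + t * (xs[j] - xs[j - 1])
    return q

random.seed(2)
data = [random.gauss(0.0, 1.0) for _ in range(2000)]
q = make_quantile(data)
resample = [q(random.random()) for _ in range(1000)]   # inverse-CDF sampling
```

Because the fit is done once, each new sample costs only a table lookup and an interpolation, which is the speed advantage the abstract refers to.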
Münte, Thomas F; Joppich, Gregor; Däuper, Jan; Schrader, Christoph; Dengler, Reinhard; Heldmann, Marcus
2015-01-01
The generation of random sequences is considered to tax executive functions and has previously been reported to be impaired in Parkinson's disease (PD). The aim was to assess the neurophysiological markers of random number generation in PD. Event-related potentials (ERPs) were recorded in 12 PD patients and 12 age-matched normal controls (NC) while they engaged either in random number generation (RNG), pressing the number keys on a computer keyboard in a random sequence, or in ordered number generation (ONG), which required key presses in the canonical order. Key presses were paced by an external auditory stimulus at a rate of one tone every 1800 ms. As a secondary task, subjects had to monitor the tone sequence for a particular target tone, in response to which the "0" key had to be pressed. This target tone occurred randomly and infrequently, thus creating a secondary oddball task. Behaviorally, PD patients showed an increased tendency to count in steps of one as well as a tendency towards repetition avoidance. Electrophysiologically, the amplitude of the P3 component of the ERP to the target tone of the secondary task was reduced during RNG in PD but not in NC. The behavioral findings indicate less random behavior in PD, while the ERP findings suggest that this impairment comes about because attentional resources are depleted in PD.
Anderson localization for radial tree-like random quantum graphs
NASA Astrophysics Data System (ADS)
Hislop, Peter D.; Post, Olaf
We prove that certain random models associated with radial, tree-like, rooted quantum graphs exhibit Anderson localization at all energies. The two main examples are the random length model (RLM) and the random Kirchhoff model (RKM). In the RLM, the lengths of each generation of edges form a family of independent, identically distributed random variables (iid). For the RKM, the iid random variables are associated with each generation of vertices and moderate the current flow through the vertex. We consider extensions to various families of decorated graphs and prove stability of localization with respect to decoration. In particular, we prove Anderson localization for the random necklace model.
The Reliability of Randomly Generated Math Curriculum-Based Measurements
ERIC Educational Resources Information Center
Strait, Gerald G.; Smith, Bradley H.; Pender, Carolyn; Malone, Patrick S.; Roberts, Jarod; Hall, John D.
2015-01-01
"Curriculum-Based Measurement" (CBM) is a direct method of academic assessment used to screen and evaluate students' skills and monitor their responses to academic instruction and intervention. Interventioncentral.org offers a math worksheet generator at no cost that creates randomly generated "math curriculum-based measures"…
Assortativity and leadership emerge from anti-preferential attachment in heterogeneous networks.
Sendiña-Nadal, I; Danziger, M M; Wang, Z; Havlin, S; Boccaletti, S
2016-02-18
Real-world networks have distinct topologies, with marked deviations from purely random networks. Many of them exhibit degree-assortativity, with nodes of similar degree more likely to link to one another. Though microscopic mechanisms have been suggested for the emergence of other topological features, assortativity has proven elusive. Assortativity can be artificially implanted in a network via degree-preserving link permutations, however this destroys the graph's hierarchical clustering and does not correspond to any microscopic mechanism. Here, we propose the first generative model which creates heterogeneous networks with scale-free-like properties in degree and clustering distributions and tunable realistic assortativity. Two distinct populations of nodes are incrementally added to an initial network by selecting a subgraph to connect to at random. One population (the followers) follows preferential attachment, while the other population (the potential leaders) connects via anti-preferential attachment: they link to lower degree nodes when added to the network. By selecting the lower degree nodes, the potential leader nodes maintain high visibility during the growth process, eventually growing into hubs. The evolution of links in Facebook empirically validates the connection between the initial anti-preferential attachment and long term high degree. In this way, our work sheds new light on the structure and evolution of social networks.
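A toy version of the growth rule can be sketched directly: at each step a new node is either a follower (attachment probability proportional to degree) or a potential leader (probability proportional to inverse degree). The parameter names and the seed graph below are illustrative assumptions, not the paper's calibrated model:

```python
import random

def grow(n_nodes, leader_prob=0.2, m=2, seed=0):
    """Toy mixed-population growth: followers attach preferentially
    (weight = degree), leaders anti-preferentially (weight = 1/degree)."""
    rng = random.Random(seed)
    deg = {0: 1, 1: 1}            # seed graph: a single edge
    edges = [(0, 1)]
    for new in range(2, n_nodes):
        nodes = list(deg)
        if rng.random() < leader_prob:
            weights = [1.0 / deg[v] for v in nodes]    # anti-preferential
        else:
            weights = [float(deg[v]) for v in nodes]   # preferential
        targets = set()
        while len(targets) < min(m, len(nodes)):
            targets.add(rng.choices(nodes, weights)[0])
        deg[new] = 0
        for t in targets:
            edges.append((new, t))
            deg[new] += 1
            deg[t] += 1
    return deg, edges

deg, edges = grow(300)
```

Tracking the final degree of nodes by the rule they entered under is how one would check the claim that early anti-preferential attachers end up as hubs.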
Yuldashev, Petr V; Ollivier, Sébastien; Karzova, Maria M; Khokhlova, Vera A; Blanc-Benon, Philippe
2017-12-01
Linear and nonlinear propagation of high amplitude acoustic pulses through a turbulent layer in air is investigated using a two-dimensional KZK-type (Khokhlov-Zabolotskaya-Kuznetsov) equation. Initial waves are symmetrical N-waves with shock fronts of finite width. A modified von Kármán spectrum model is used to generate random wind velocity fluctuations associated with the turbulence. Physical parameters in simulations correspond to previous laboratory scale experiments where N-waves with 1.4 cm wavelength propagated through a turbulence layer with the outer scale of about 16 cm. Mean value and standard deviation of peak overpressure and shock steepness, as well as cumulative probabilities to observe amplified peak overpressure and shock steepness, are analyzed. Nonlinear propagation effects are shown to enhance pressure level in random foci for moderate initial amplitudes of N-waves thus increasing the probability to observe highly peaked waveforms. Saturation of the pressure level is observed for stronger nonlinear effects. It is shown that in the linear propagation regime, the turbulence mainly leads to the smearing of shock fronts, thus decreasing the probability to observe high values of steepness, whereas nonlinear effects dramatically increase the probability to observe steep shocks.
Quantum random number generator based on quantum nature of vacuum fluctuations
NASA Astrophysics Data System (ADS)
Ivanova, A. E.; Chivilikhin, S. A.; Gleim, A. V.
2017-11-01
A quantum random number generator (QRNG) produces true random bit sequences. QRNGs based on the quantum nature of the vacuum normally use an optical beam splitter with two inputs and two outputs. We compare the mathematical descriptions of a spatial beam splitter and a fiber Y-splitter in the quantum model of a QRNG based on homodyne detection. The descriptions are identical, which allows fiber Y-splitters to be used in practical QRNG schemes, simplifying the setup. We also derive relations between the input radiation and the resulting differential current in the homodyne detector, and we experimentally demonstrate the generation of true random bits using a homodyne-detection QRNG with a Y-splitter.
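The bit-extraction step can be sketched with a purely classical simulation: the differential photocurrent measuring a vacuum quadrature is zero-mean Gaussian, and its sign yields one bit. The Gaussian draw below merely simulates the measurement (real randomness here comes from the optics, not from a PRNG):

```python
import random

def homodyne_bits(n, seed=None):
    """Simulated homodyne QRNG: sign of a zero-mean Gaussian 'current'."""
    rng = random.Random(seed)
    bits = []
    for _ in range(n):
        quadrature = rng.gauss(0.0, 1.0)   # stand-in for the measured current
        bits.append(1 if quadrature > 0 else 0)
    return bits

bits = homodyne_bits(10000, seed=7)
```

In a real device, detector imbalance shifts the mean away from zero, so a debiasing or hashing stage normally follows this thresholding.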
Grubber, J. M.; McVay, M. A.; Olsen, M. K.; Bolton, J.; Gierisch, J. M.; Taylor, S. S.; Maciejewski, M. L.; Yancy, W. S.
2016-01-01
Objective: A weight loss maintenance trial involving weight loss prior to randomization is challenging to implement due to the potential for dropout and insufficient weight loss. We examined rates and correlates of non-initiation, dropout, and insufficient weight loss during a weight loss maintenance trial. Methods: The MAINTAIN trial involved a 16-week weight loss program followed by randomization among participants losing at least 4 kg. Psychosocial measures were administered during a screening visit. Weight was obtained at the first group session and 16 weeks later to determine eligibility for randomization. Results: Of 573 patients who screened as eligible, 69 failed to initiate the weight loss program. In adjusted analyses, failure to initiate was associated with lower age, lack of a support person, and less encouragement for making dietary changes. Among participants who initiated, 200 dropped out, 82 lost insufficient weight, and 222 lost sufficient weight for randomization. Compared to losing sufficient weight, dropping out was associated with younger age and tobacco use, whereas losing insufficient weight was associated with non-White race and controlled motivation for physical activity. Conclusions: Studies should be conducted to evaluate strategies to maximize recruitment and retention of subgroups that are less likely to initiate and be retained in weight loss maintenance trials. PMID:28090340
A revision of the subtract-with-borrow random number generators
NASA Astrophysics Data System (ADS)
Sibidanov, Alexei
2017-12-01
The most popular and widely used subtract-with-borrow generator, also known as RANLUX, is reimplemented as a linear congruential generator using large-integer arithmetic with a modulus size of 576 bits. Modern computers, as well as the specific structure of the modulus inferred from RANLUX, allow for the development of a fast modular multiplication, the core of the procedure, which was previously believed to be slow and too costly in terms of computing resources. Our tests show a significant gain in generation speed, comparable with other fast, high-quality random number generators. An additional feature is fast skipping of generator states, leading to a seeding scheme which guarantees the uniqueness of random number sequences. Licensing provisions: GPLv3. Programming languages: C++, C, assembler.
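The base recurrence being reimplemented here is the classic Marsaglia-Zaman subtract-with-borrow, x_n = x_{n-s} - x_{n-r} - c (mod b), with the RANLUX parameters b = 2^24, r = 24, s = 10. A minimal sketch (crude LCG warm-up seeding of my own invention, and no luxury-level decimation, so this is not full RANLUX):

```python
class SubtractWithBorrow:
    """Minimal subtract-with-borrow generator with the RANLUX base
    parameters b = 2**24, r = 24, s = 10 (no decimation stage)."""
    B, R, S = 1 << 24, 24, 10

    def __init__(self, seed=1):
        # crude LCG warm-up to fill the lag table; not RANLUX's seeding
        lcg = seed
        self.state = []
        for _ in range(self.R):
            lcg = (69069 * lcg + 1) % (1 << 32)
            self.state.append(lcg % self.B)
        self.carry = 0
        self.i = 0

    def next(self):
        r, s = self.R, self.S
        j = self.i                                  # slot of x_{n-r}
        x = self.state[(j + r - s) % r] - self.state[j] - self.carry
        if x < 0:
            x += self.B
            self.carry = 1
        else:
            self.carry = 0
        self.state[j] = x                           # becomes x_n
        self.i = (j + 1) % r
        return x

g = SubtractWithBorrow(seed=12345)
vals = [g.next() for _ in range(5)]
```

The paper's observation is that this whole lagged recurrence is equivalent to one linear congruential step modulo a fixed 576-bit number, so a fast big-integer modular multiply can replace it and also enables cheap state skipping.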
High-Speed Device-Independent Quantum Random Number Generation without a Detection Loophole.
Liu, Yang; Yuan, Xiao; Li, Ming-Han; Zhang, Weijun; Zhao, Qi; Zhong, Jiaqiang; Cao, Yuan; Li, Yu-Huai; Chen, Luo-Kan; Li, Hao; Peng, Tianyi; Chen, Yu-Ao; Peng, Cheng-Zhi; Shi, Sheng-Cai; Wang, Zhen; You, Lixing; Ma, Xiongfeng; Fan, Jingyun; Zhang, Qiang; Pan, Jian-Wei
2018-01-05
Quantum mechanics provides the means of generating genuine randomness that is impossible with deterministic classical processes. Remarkably, the unpredictability of randomness can be certified in a manner that is independent of implementation devices. Here, we present an experimental study of device-independent quantum random number generation based on a detection-loophole-free Bell test with entangled photons. In the randomness analysis, without the independent identical distribution assumption, we consider the worst-case scenario in which the adversary launches the most powerful attacks against the quantum devices. After considering statistical fluctuations and applying an 80 Gb × 45.6 Mb Toeplitz-matrix hashing, we achieve a final random bit rate of 114 bits/s, with a failure probability less than 10^{-5}. This marks a critical step towards realistic applications in cryptography and fundamental physics tests.
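Toeplitz-matrix hashing compresses a long raw-bit vector into a shorter near-uniform one by multiplying it, over GF(2), with a matrix that is constant along diagonals and therefore specified by a single seed string. A small-scale sketch (the dimensions here are toy values, nothing like the 80 Gb × 45.6 Mb matrix of the experiment):

```python
import random

def toeplitz_hash(bits, out_len, diag):
    """Multiply the raw-bit vector by a Toeplitz matrix over GF(2).
    The matrix is fixed by `diag` of length out_len + len(bits) - 1,
    with entry T[i][j] = diag[i - j + len(bits) - 1]."""
    n = len(bits)
    out = []
    for i in range(out_len):
        acc = 0
        for j in range(n):
            acc ^= diag[i - j + n - 1] & bits[j]
        out.append(acc)
    return out

rng = random.Random(3)
raw = [rng.randint(0, 1) for _ in range(64)]              # raw, possibly biased bits
seed_bits = [rng.randint(0, 1) for _ in range(16 + 64 - 1)]  # Toeplitz seed
key = toeplitz_hash(raw, 16, seed_bits)
```

Because the map is linear, large instances are computed with FFT-based convolution rather than the explicit double loop shown here.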
A three-dimensional simulation of transition and early turbulence in a time-developing mixing layer
NASA Technical Reports Server (NTRS)
Cain, A. B.; Reynolds, W. C.; Ferziger, J. H.
1981-01-01
The physics of the transition and early turbulence regimes in the time-developing mixing layer was investigated. The sensitivity of the mixing layer to the disturbance field of the initial condition is considered. The growth of the momentum thickness, the mean velocity profile, the turbulence kinetic energy, the Reynolds stresses, the anisotropy tensor, and particle-track pictures of the computations are all examined in an effort to better understand the physics of these regimes. The amplitude, spectrum shape, and random phases of the initial disturbance field were varied. A scheme for generating discrete orthogonal function expansions on some nonuniform grids was developed. All cases address the early or near field of the mixing layer. The most significant result shows that the secondary instability of the mixing layer is produced by spanwise variations in the straining field of the primary vortex structures.
2014-01-01
Background: Antipsychotic medications, particularly second-generation antipsychotics, are increasingly being used to alleviate the symptoms of schizophrenia and other severe mental disorders in the pediatric population. While evidence-based approaches examining efficacy and safety outcomes have been reported, no review has evaluated prolactin-based adverse events for antipsychotic treatments in schizophrenia and schizophrenia spectrum disorders. Methods/design: Searches involving MEDLINE, EMBASE, CENTRAL, PsycINFO, and clinical trial registries (ClinicalTrials.gov, Drug Industry Document Archive [DIDA], International Clinical Trials Registry Platform [ICTRP]) will be used to identify relevant studies. Two reviewers will independently screen abstracts and relevant full-text articles of the papers identified by the initial search according to the prospectively defined eligibility criteria. Data extraction will be conducted in duplicate independently. Pairwise random-effects meta-analyses and network meta-analyses will be conducted on individual drug and class effects where appropriate. Discussion: This systematic review will evaluate prolactin-based adverse events of first- and second-generation antipsychotics in the pediatric population with schizophrenia and schizophrenia spectrum disorders. It will also seek to strengthen the evidence base of the safety of antipsychotics by incorporating both randomized controlled trials and observational studies. Systematic review registration: PROSPERO CRD42014009506 PMID:25312992
Neutron monitor generated data distributions in quantum variational Monte Carlo
NASA Astrophysics Data System (ADS)
Kussainov, A. S.; Pya, N.
2016-08-01
We have assessed the potential applications of neutron monitor hardware as a random number generator for normal and uniform distributions. The data tables from the acquisition channels with no extreme changes in the signal level were chosen as the retrospective model. The stochastic component was extracted by fitting the raw data with splines and then subtracting the fit. Scaling the extracted data to zero mean and unit variance is sufficient to obtain a stable standard normal random variate. The distributions under consideration pass all available normality tests. Inverse transform sampling is suggested as a source of uniform random numbers. The variational Monte Carlo method for the quantum harmonic oscillator was used to test the quality of our random numbers. If the data delivery rate is of importance and the conventional one-minute-resolution neutron count is insufficient, one can settle for an efficient seed generator to feed a faster algorithmic random number generator, or create a buffer.
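The quality test mentioned above, variational Monte Carlo for the harmonic oscillator, can be sketched compactly. With trial wavefunction ψ ∝ exp(−αx²) (ħ = m = ω = 1), the local energy is E_L(x) = α + x²(1/2 − 2α²), and α = 1/2 gives the exact ground-state energy 1/2. The sketch uses Python's stdlib PRNG in place of the neutron-monitor-derived numbers:

```python
import random, math

def vmc_energy(alpha, n_steps=20000, step=1.0, seed=4):
    """Variational Monte Carlo for the 1-D harmonic oscillator with
    trial psi ~ exp(-alpha * x**2); Metropolis sampling of |psi|^2."""
    rng = random.Random(seed)
    x, e_sum = 0.0, 0.0
    for _ in range(n_steps):
        xp = x + rng.uniform(-step, step)
        # accept with probability |psi(xp)/psi(x)|^2
        if rng.random() < math.exp(-2.0 * alpha * (xp * xp - x * x)):
            x = xp
        e_sum += alpha + x * x * (0.5 - 2.0 * alpha * alpha)
    return e_sum / n_steps

energy = vmc_energy(0.5)   # alpha = 1/2 is exact: E_0 = 1/2
```

A deficient random source shows up as a biased or slowly converging energy estimate, which is what makes this a convenient end-to-end randomness test.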
Random bits, true and unbiased, from atmospheric turbulence
Marangon, Davide G.; Vallone, Giuseppe; Villoresi, Paolo
2014-01-01
Random numbers are a fundamental ingredient for secure communications and numerical simulation, as well as for games and for information science in general. Physical processes with intrinsic unpredictability may be exploited to generate genuine random numbers. Optical propagation through strong atmospheric turbulence is exploited here for this purpose, by observing a laser beam after a 143 km free-space path. In addition, we developed an algorithm to extract the randomness of the beam images at the receiver without post-processing. The numbers passed very selective randomness tests for qualification as genuine random numbers. The extraction algorithm can be easily generalized to random images generated by different physical processes. PMID:24976499
Sandhu, Satpal Singh; Sandhu, Jasleen
2013-01-01
Objective: To investigate and compare the effects of superelastic nickel–titanium and multistranded stainless steel archwires on pain during the initial phase of orthodontic treatment. Design: A double-blind two-arm parallel design stratified randomized clinical trial. Setting: A single centre in India between December 2010 and June 2012. A total of 96 participants (48 males and 48 females; 14.1±2.1 years old) were randomized (stratified on age, sex and initial crowding) to superelastic nickel–titanium or multistranded stainless steel archwire groups using a computer-generated allocation sequence. Methods: We compared 0.016-inch superelastic nickel–titanium and 0.0175-inch multistranded stainless steel wires in 0.022-inch slot (Roth prescription) preadjusted edgewise appliances. The follow-up period was 14 days. Outcome was assessed with a visual analogue scale at baseline and 32 pre-specified follow-up points. Data were analyzed using mixed-effects model analysis. Results: One participant was lost to follow-up and 10 were excluded from the analysis due to bond failure or incomplete questionnaire answers. Ultimately, 85 participants (42 males and 43 females; 14.1±2.0 years old) were analysed for the final results. No statistically significant difference was found for overall pain [F value = 2.65, degrees of freedom (df) = 92.6; P = 0.1071]. However, compared to multistranded stainless steel wires, pain in subjects with superelastic nickel–titanium archwires was significantly greater at 12 h (t = 2.34; P = 0.0193), as well as at day 1 in the morning (t = 2.21, P = 0.0273), afternoon (t = 2.11, P = 0.0346) and at bedtime (t = 2.03, P = 0.042). Conclusion: For overall pain, there was no statistically significant difference between the two wires. However, subjects with superelastic nickel–titanium archwires had significantly higher pain at peak level. PMID:24297959
On Convergent Probability of a Random Walk
ERIC Educational Resources Information Center
Lee, Y.-F.; Ching, W.-K.
2006-01-01
This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrence relations is used to obtain the convergent probability of the random walk with different initial positions.
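The recurrence-relation method applies to the standard absorption problem: for a walk on {0, …, n} with right-step probability p, the probability u_i of reaching n before 0 satisfies u_i = p·u_{i+1} + (1−p)·u_{i−1} with u_0 = 0, u_n = 1. A sketch solving it numerically by fixed-point sweeps (the note itself solves the recurrence in closed form; this is just an illustrative substitute):

```python
def absorption_prob(start, n, p=0.5):
    """P(walk starting at `start` hits n before 0), by iterating the
    recurrence u_i = p*u_{i+1} + (1-p)*u_{i-1} with u_0 = 0, u_n = 1."""
    u = [i / n for i in range(n + 1)]    # initial guess
    u[0], u[n] = 0.0, 1.0
    for _ in range(10000):               # Gauss-Seidel-style sweeps
        for i in range(1, n):
            u[i] = p * u[i + 1] + (1 - p) * u[i - 1]
    return u[start]

prob = absorption_prob(3, 10)            # fair walk: exact answer is 3/10
```

For p ≠ 1/2 the closed-form solution is u_i = (r^i − 1)/(r^n − 1) with r = (1−p)/p, which the iteration reproduces.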
Efficient image projection by Fourier electroholography.
Makowski, Michał; Ducin, Izabela; Kakarenko, Karol; Kolodziejczyk, Andrzej; Siemion, Agnieszka; Siemion, Andrzej; Suszek, Jaroslaw; Sypek, Maciej; Wojnowski, Dariusz
2011-08-15
An improved, efficient projection of color images is presented. It uses a phase spatial light modulator with three iteratively optimized Fourier holograms displayed simultaneously, one for each primary color. This spatial division instead of time division provides stable images. The pixelated structure of the modulator and fluctuations of the liquid crystal molecules cause a zeroth-order peak, which is eliminated by additional wavelength-dependent phase factors that shift it in front of the image plane, where it is blocked with a matched filter. Speckles are suppressed by time integration of variable speckle patterns generated by additional randomizations of an initial phase and minor changes of the signal. © 2011 Optical Society of America
A Novel Color Image Encryption Algorithm Based on Quantum Chaos Sequence
NASA Astrophysics Data System (ADS)
Liu, Hui; Jin, Cong
2017-03-01
In this paper, a novel algorithm for image encryption based on quantum chaos is proposed. Keystreams are generated by the two-dimensional logistic map from given initial conditions and parameters. A generalized Arnold scrambling algorithm with keys is then exploited to permute the pixels of the color components. In the diffusion process, a novel encryption algorithm, the folding algorithm, is proposed to modify the values of the diffused pixels. To obtain high randomness and complexity, the two-dimensional logistic map and the quantum chaotic map are coupled with nearest-neighboring coupled-map lattices. Theoretical analyses and computer simulations confirm that the proposed algorithm has a high level of security.
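The chaotic-keystream idea at the core of such schemes can be sketched with the one-dimensional logistic map (a simplification: the paper couples a two-dimensional logistic map and a quantum chaotic map through coupled-map lattices, and adds scrambling and folding stages; the parameter values below are illustrative):

```python
def logistic_keystream(x0, r, n, burn=100):
    """Byte keystream from the logistic map x -> r*x*(1-x)."""
    x = x0
    for _ in range(burn):              # discard the transient
        x = r * x * (1.0 - x)
    stream = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        stream.append(int(x * 256) % 256)
    return stream

def xor_cipher(data, key_x0=0.37, r=3.99):
    """XOR the data with the chaotic keystream; (key_x0, r) act as the key."""
    ks = logistic_keystream(key_x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

ct = xor_cipher(b"pixel data")
pt = xor_cipher(ct)                    # XOR is its own inverse
```

Sensitivity to the initial condition x0 is what makes the keystream key-dependent; note that a bare logistic-map XOR like this is known to be cryptographically weak on its own, which is why the proposed scheme layers permutation and diffusion on top.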
A parallel time integrator for noisy nonlinear oscillatory systems
NASA Astrophysics Data System (ADS)
Subber, Waad; Sarkar, Abhijit
2018-06-01
In this paper, we adapt a parallel time integration scheme to track the trajectories of noisy nonlinear dynamical systems. Specifically, we formulate a parallel algorithm to generate sample paths of a nonlinear oscillator defined by stochastic differential equations (SDEs) using the so-called parareal method for ordinary differential equations (ODEs). The presence of the Wiener process in SDEs causes difficulties in the direct application of any numerical integration technique for ODEs, including the parareal algorithm. The parallel implementation of the algorithm involves two SDE solvers, namely a fine-level scheme to integrate the system in parallel and a coarse-level scheme to generate and correct the required initial conditions to start the fine-level integrators. For the numerical illustration, a randomly excited Duffing oscillator is investigated in order to study the performance of the stochastic parallel algorithm with respect to a range of system parameters. The distributed implementation of the algorithm exploits the Message Passing Interface (MPI).
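The parareal iteration itself is easiest to see on a deterministic test problem. For propagators G (coarse) and F (fine), the update is U_{n+1}^{k+1} = G(U_n^{k+1}) + F(U_n^k) − G(U_n^k). A serial sketch on y' = −y (a stand-in of my choosing; the paper applies the scheme to SDEs, where the Wiener increments need consistent handling across levels, and runs the F evaluations in parallel via MPI):

```python
import math

def coarse(y, dt):
    """One explicit-Euler step for y' = -y (the coarse propagator G)."""
    return y - dt * y

def fine(y, dt, substeps=100):
    """Many small Euler steps (the fine propagator F)."""
    h = dt / substeps
    for _ in range(substeps):
        y = y - h * y
    return y

def parareal(y0, T=1.0, n_slices=10, iters=5):
    dt = T / n_slices
    U = [y0]                               # initial guess: one coarse sweep
    for _ in range(n_slices):
        U.append(coarse(U[-1], dt))
    for _ in range(iters):
        F = [fine(U[n], dt) for n in range(n_slices)]      # parallelizable
        G_old = [coarse(U[n], dt) for n in range(n_slices)]
        V = [y0]
        for n in range(n_slices):          # sequential correction sweep
            V.append(coarse(V[-1], dt) + F[n] - G_old[n])
        U = V
    return U[-1]

y_end = parareal(1.0)                      # exact solution: exp(-1)
```

After k iterations the first k time slices agree exactly with the fine solver, so the iteration converges in at most n_slices iterations; the speedup comes from evaluating all F calls concurrently.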
Mirror Instability in the Turbulent Solar Wind
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hellinger, Petr; Landi, Simone; Verdini, Andrea
2017-04-01
The relationship between a decaying strong turbulence and the mirror instability in a slowly expanding plasma is investigated using two-dimensional hybrid expanding box simulations. We impose an initial ambient magnetic field perpendicular to the simulation box, and we start with a spectrum of large-scale, linearly polarized, random-phase Alfvénic fluctuations that have energy equipartition between kinetic and magnetic fluctuations and a vanishing correlation between the two fields. A turbulent cascade rapidly develops; magnetic field fluctuations exhibit a Kolmogorov-like power-law spectrum at large scales and a steeper spectrum at sub-ion scales. The imposed expansion (taking a strictly transverse ambient magnetic field) leads to the generation of an important perpendicular proton temperature anisotropy that eventually drives the mirror instability. This instability generates large-amplitude, nonpropagating, compressible, pressure-balanced magnetic structures in the form of magnetic enhancements/humps that reduce the perpendicular temperature anisotropy.
Measurement uncertainty evaluation of conicity error inspected on CMM
NASA Astrophysics Data System (ADS)
Wang, Dongxia; Song, Aiguo; Wen, Xiulan; Xu, Youxiong; Qiao, Guifang
2016-01-01
The cone is widely used in mechanical design for rotation, centering and fixing. Whether the conicity error can be measured and evaluated accurately will directly influence assembly accuracy and working performance. According to the new-generation geometrical product specification (GPS), the error and its measurement uncertainty should be evaluated together. The mathematical model of the minimum zone conicity error is established, and an improved immune evolutionary algorithm (IIEA) is proposed to search for the conicity error. In the IIEA, initial antibodies are first generated using quasi-random sequences and two kinds of affinities are calculated. Then, clones of each antibody are generated and self-adaptively mutated so as to maintain diversity. Similar antibodies are suppressed and new random antibodies are generated. Because the mathematical model of the conicity error is strongly nonlinear and the input quantities are not independent, it is difficult to use the Guide to the Expression of Uncertainty in Measurement (GUM) method to evaluate measurement uncertainty. An adaptive Monte Carlo method (AMCM) is proposed to estimate measurement uncertainty, in which the number of Monte Carlo trials is selected adaptively and the quality of the numerical results is directly controlled. The cone parts were machined on a CK6140 lathe and measured on a Miracle NC 454 Coordinate Measuring Machine (CMM). The experimental results confirm that the proposed method not only can search for the approximate solution of the minimum zone conicity error (MZCE) rapidly and precisely, but also can evaluate measurement uncertainty and give control variables with an expected numerical tolerance. The conicity errors computed by the proposed method are 20%-40% less than those computed by the NC454 CMM software, and the evaluation accuracy improves significantly.
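The adaptive Monte Carlo idea, adding trials until the delivered numerical tolerance is met, can be sketched generically. This is in the spirit of AMCM as standardized in GUM Supplement 1, not the paper's exact control scheme, and the measurement model below is a toy stand-in:

```python
import random
import statistics

def adaptive_mc(model, tol, batch=1000, max_batches=100, seed=0):
    # Adaptive Monte Carlo: keep adding batches of trials until the
    # standard error of the mean (the numerical tolerance actually
    # delivered) drops below the requested tolerance `tol`.
    rng = random.Random(seed)
    values = []
    for _ in range(max_batches):
        values.extend(model(rng) for _ in range(batch))
        stderr = statistics.stdev(values) / len(values) ** 0.5
        if stderr < tol:
            break
    return statistics.mean(values), stderr

# Toy measurement model (hypothetical): a conicity error of 0.05 mm
# observed through Gaussian measurement noise of 0.002 mm.
mean, stderr = adaptive_mc(lambda rng: rng.gauss(0.05, 0.002), tol=1e-4)
```

The batch loop is what lets the method control output quality directly: the trial count is not fixed in advance but grows until the estimate is demonstrably tight enough.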
Scope of Various Random Number Generators in Ant System Approach for TSP
NASA Technical Reports Server (NTRS)
Sen, S. K.; Shaykhian, Gholam Ali
2007-01-01
Several quasi- and pseudo-random number generators are tested on a heuristic based on an ant system approach to the traveling salesman problem. The experiment explores whether any particular generator is most desirable. Such an experiment on large samples has the potential to rank the performance of the generators for the foregoing heuristic, and thereby to seek an answer to the controversial question of ranking the generators in a probabilistic/statistical sense.
True randomness from an incoherent source
NASA Astrophysics Data System (ADS)
Qi, Bing
2017-11-01
Quantum random number generators (QRNGs) harness the intrinsic randomness of measurement processes: the measurement outputs are truly random provided the input state is a superposition of the eigenstates of the measurement operators. In the case of trusted devices, true randomness can be generated from a mixed state ρ so long as the system entangled with ρ is well protected. We propose a random number generation scheme based on measuring the quadrature fluctuations of a single-mode thermal state using an optical homodyne detector. By mixing the output of a broadband amplified spontaneous emission (ASE) source with a single-mode local oscillator (LO) at a beam splitter and performing differential photodetection, we can selectively detect the quadrature fluctuation of a single-mode output of the ASE source, thanks to the filtering function of the LO. Experimentally, a quadrature variance about three orders of magnitude larger than the vacuum noise has been observed, suggesting this scheme can tolerate much higher detector noise than QRNGs based on measuring the vacuum noise. The high quality of this entropy source is evidenced by the small correlation coefficients of the acquired data. A Toeplitz-hashing extractor is applied to generate unbiased random bits from the Gaussian-distributed raw data, achieving an efficiency of 5.12 bits per sample. The output of the Toeplitz extractor passes all the NIST statistical tests for random numbers.
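The Toeplitz-hashing step mentioned above admits a compact sketch: an m × n binary Toeplitz matrix, defined by n + m − 1 seed bits, is multiplied by the raw bit vector over GF(2). This is a generic illustration, not the paper's implementation; the seed bits and the compression ratio m/n (which in practice is set by the estimated min-entropy of the source) are placeholders:

```python
def toeplitz_extract(raw_bits, m, seed_bits):
    # Multiply an m x n binary Toeplitz matrix by the raw bit vector
    # over GF(2). The matrix is fully determined by n + m - 1 seed bits
    # via T[i][j] = seed[i - j + n - 1] (constant along diagonals).
    n = len(raw_bits)
    assert len(seed_bits) == n + m - 1
    out = []
    for i in range(m):
        acc = 0
        for j in range(n):
            acc ^= seed_bits[i - j + n - 1] & raw_bits[j]
        out.append(acc)
    return out

# Compress 4 raw bits to 2 nearly uniform bits using a 5-bit seed.
extracted = toeplitz_extract([1, 1, 0, 1], 2, [1, 0, 1, 1, 0])
```

Because the matrix depends only on i − j, a production implementation would store just the seed and use FFT-based multiplication rather than the explicit double loop shown here.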
Golden Ratio Versus Pi as Random Sequence Sources for Monte Carlo Integration
NASA Technical Reports Server (NTRS)
Sen, S. K.; Agarwal, Ravi P.; Shaykhian, Gholam Ali
2007-01-01
We discuss here the relative merits of these numbers as possible random sequence sources. The quality of these sequences is not judged directly by the outcome of the known tests for randomness of a sequence. Instead, it is determined implicitly by the accuracy of the Monte Carlo integration in a statistical sense. Since our main motive in using a random sequence is to solve real-world problems, it is more desirable to compare the quality of the sequences based on their performance on these problems, in terms of the quality/accuracy of the output. We also compare these sources against those generated by a popular pseudo-random generator, viz., the Matlab rand function, and the quasi-random Halton generator, both in terms of error and time complexity. Our study demonstrates that consecutive blocks of digits of each of these numbers produce a good random sequence source. We observe that randomly chosen blocks of digits have no remarkable advantage over consecutive blocks for the accuracy of the Monte Carlo integration. The study also reveals that pi is a better source of a random sequence than the golden ratio where the accuracy of the integration is concerned.
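The idea of mining consecutive digit blocks for Monte Carlo integration can be illustrated as follows. This sketch, which computes digits of pi with the well-known Chudnovsky series and estimates the integral of x² on [0, 1] (exact value 1/3), is an illustration of the approach, not the authors' experiment:

```python
from decimal import Decimal, getcontext

def pi_digits(n):
    # Chudnovsky series evaluated with the stdlib decimal module;
    # returns the first n digits of pi after the decimal point.
    getcontext().prec = n + 10
    C = 426880 * Decimal(10005).sqrt()
    M, L, X, K = 1, 13591409, 1, 6
    S = Decimal(13591409)
    for i in range(1, n // 14 + 2):      # ~14 digits per term
        M = M * (K**3 - 16 * K) // i**3
        L += 545140134
        X *= -262537412640768000
        S += Decimal(M * L) / X
        K += 12
    return str(C / S)[2:2 + n]

def mc_integrate_from_digits(f, digits, block=4):
    # Treat consecutive `block`-digit groups as uniform samples in
    # [0, 1) and average f over them: a Monte Carlo estimate on [0, 1].
    samples = [int(digits[i:i + block]) / 10**block
               for i in range(0, len(digits) - block + 1, block)]
    return sum(f(u) for u in samples) / len(samples)

# Estimate the integral of x^2 on [0, 1] from 400 digits of pi.
estimate = mc_integrate_from_digits(lambda x: x * x, pi_digits(400))
```

With 100 four-digit samples the statistical error is around 0.03, so the estimate lands near 1/3; more digits tighten it at the usual 1/√N Monte Carlo rate.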
Faggion, Clovis Mariano; Wu, Yun-Chun; Scheidgen, Moritz; Tu, Yu-Kang
2015-01-01
Risk of bias (ROB) may threaten the internal validity of a clinical trial by distorting the magnitude of treatment effect estimates, although some conflicting information on this assumption exists. The objective of this study was to evaluate the effect of ROB on the magnitude of treatment effect estimates in randomized controlled trials (RCTs) in periodontology and implant dentistry. A search for Cochrane systematic reviews (SRs), including meta-analyses of RCTs published in the periodontology and implant dentistry fields, was performed in the Cochrane Library in September 2014. Random-effect meta-analyses were performed by grouping RCTs with different levels of ROB in three domains (sequence generation, allocation concealment, and blinding of outcome assessment). To increase power and precision, only SRs with meta-analyses including at least 10 RCTs were included. Meta-regression was performed to investigate the association between ROB characteristics and the magnitudes of intervention effects in the meta-analyses. Of the 24 initially screened SRs, 21 were excluded because they did not include at least 10 RCTs in the meta-analyses. Three SRs (two from the periodontology field) generated information for conducting 27 meta-analyses. Meta-regression did not reveal significant differences in the relationship of the ROB level with the size of treatment effect estimates, although a trend toward inflated estimates was observed in domains with unclear ROB. In this sample of RCTs, high and (mainly) unclear risks of selection and detection biases did not seem to influence the size of treatment effect estimates, although several confounders might have influenced the strength of the association.
NASA Astrophysics Data System (ADS)
Sirait, Kamson; Tulus; Budhiarti Nababan, Erna
2017-12-01
Clustering methods with high accuracy and time efficiency are necessary for the filtering process. One method that has been widely known and applied in clustering is K-Means clustering. In its application, the choice of the initial cluster centers greatly affects the results of the K-Means algorithm. This research compares the results of K-Means clustering when the starting centroids are determined randomly versus with a KD-Tree method. On a data set of 1000 student academic records used to classify students at risk of dropping out, random initial centroid selection yields an SSE (sum of squared errors) of 952972 for the quality variable and 232.48 for the GPA variable, whereas KD-Tree initial centroid selection yields an SSE of 504302 for the quality variable and 214.37 for the GPA variable. The smaller SSE values indicate that K-Means clustering with KD-Tree initial centroid selection has better accuracy than K-Means clustering with random initial centroid selection.
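One common KD-Tree-style seeding, sketched below, recursively splits the data at the median of the widest dimension and uses the leaf means as initial centroids, so the seeds land in distinct dense regions rather than at arbitrary points. The paper's exact variant may differ, and the two-cluster toy data are hypothetical:

```python
def kdtree_centroids(points, k):
    # KD-Tree-style seeding (a common variant; the paper's exact method
    # may differ): repeatedly split the largest cell at the median of
    # its widest dimension until k cells remain, then return each
    # cell's mean as an initial centroid. Assumes 1 <= k <= len(points).
    cells = [list(points)]
    while len(cells) < k:
        cells.sort(key=len, reverse=True)
        cell = cells.pop(0)                      # split the largest cell
        dims = len(cell[0])
        d = max(range(dims),
                key=lambda j: max(p[j] for p in cell) - min(p[j] for p in cell))
        cell = sorted(cell, key=lambda p: p[d])  # median split along d
        mid = len(cell) // 2
        cells += [cell[:mid], cell[mid:]]
    return [tuple(sum(p[j] for p in c) / len(c) for j in range(len(c[0])))
            for c in cells]

# Two well-separated toy clusters: the seeds land one per cluster.
pts = [(0, 0), (0.1, 0), (0, 0.1), (10, 10), (10.1, 10), (10, 10.1)]
centroids = sorted(kdtree_centroids(pts, 2))
```

Feeding such seeds to standard K-Means typically lowers the final SSE relative to random seeding, which is the effect the abstract reports.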
Experimental study of a quantum random-number generator based on two independent lasers
NASA Astrophysics Data System (ADS)
Sun, Shi-Hai; Xu, Feihu
2017-12-01
A quantum random-number generator (QRNG) can produce true randomness by utilizing the inherent probabilistic nature of quantum mechanics. Recently, the spontaneous-emission quantum phase noise of the laser has been widely deployed for quantum random-number generation, due to its high rate, its low cost, and the feasibility of chip-scale integration. Here, we perform a comprehensive experimental study of a phase-noise-based QRNG with two independent lasers, each of which operates in either continuous-wave (CW) or pulsed mode. We implement the QRNG by operating the two lasers in three configurations, namely, CW + CW, CW + pulsed, and pulsed + pulsed, and demonstrate their trade-offs, strengths, and weaknesses.
Minimal-post-processing 320-Gbps true random bit generation using physical white chaos.
Wang, Anbang; Wang, Longsheng; Li, Pu; Wang, Yuncai
2017-02-20
Chaotic external-cavity semiconductor lasers (ECLs) are a promising entropy source for the generation of high-speed physical random bits or digital keys. The rate and randomness are unfortunately limited by laser relaxation oscillation and external-cavity resonance, and are usually improved by complicated post-processing. Here, we propose using physical broadband white chaos, generated by optical heterodyning of two ECLs, as an entropy source for high-speed random bit generation (RBG) with minimal post-processing. The optical heterodyne chaos not only has a white spectrum without signatures of relaxation oscillation and external-cavity resonance but also has a symmetric amplitude distribution. Thus, after quantization with a multi-bit analog-to-digital converter (ADC), random bits can be obtained by extracting several least significant bits (LSBs) without any other processing. In experiments, a white chaos with a 3-dB bandwidth of 16.7 GHz is generated. Its entropy rate is estimated as 16 Gbps by single-bit quantization, which corresponds to a spectral efficiency of 96%. With quantization using an 8-bit ADC, 320-Gbps physical RBG is achieved by directly extracting 4 LSBs at an 80-GHz sampling rate.
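The quantization step described above, keeping only a few least significant bits of each ADC sample, can be sketched as follows (the 8-bit sample value is illustrative):

```python
def extract_lsbs(samples, keep=4):
    # Keep the `keep` least significant bits of each ADC sample,
    # emitted MSB-first within each retained group. Discarding the
    # upper bits removes the slowly varying, predictable part of the
    # waveform, leaving the fast noise-driven fluctuations.
    mask = (1 << keep) - 1
    bits = []
    for s in samples:
        v = s & mask
        bits.extend((v >> b) & 1 for b in reversed(range(keep)))
    return bits

# One 8-bit sample 0b10110101 -> retained nibble 0101 -> bits 0, 1, 0, 1.
bits = extract_lsbs([0b10110101])
```

At the paper's operating point, 4 retained bits per sample at an 80-GHz sampling rate gives the quoted 320 Gbps.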
Edney, Sarah; Looyestyn, Jemma; Ryan, Jillian; Kernot, Jocelyn; Maher, Carol
2018-04-05
Social networking websites have attracted considerable attention as a delivery platform for physical activity interventions. Current evidence highlights a need to enhance user engagement with these interventions to actualize their potential. The purpose of this study was to determine which post type generates the most engagement from participants and whether engagement was related to change in physical activity in an intervention delivered via Facebook. Subgroup analysis of the intervention condition of a randomized controlled trial was conducted. The group moderator posted a new message to the private Facebook group each day of the program. The Facebook posts (n = 118) were categorized into the following types: moderator-initiated running program, multimedia, motivational, opinion polls, or discussion question and participant-initiated experience shares, or questions. Four metrics were used to measure volume of engagement with each post type, "likes," "comments," "poll votes," and "photo uploads." One-way ANOVA was used to determine whether engagement differed by post type and an independent samples t-test to determine differences in engagement between moderator and participant-initiated posts. Pearson correlation was used to examine associations between total engagement and change in physical activity. Engagement varied by post type. Polls elicited the greatest engagement (p ≤ .01). The most common form of engagement was "likes," and engagement was higher for moderator-initiated rather than participant-initiated posts (mean = 8.0 [SD 6.8] vs. 5.3 [SD 3.2]; p ≤ .01). Total engagement with the Facebook group was not directly associated with change in physical activity (r = -.13, p = .47). However, engagement was associated with compliance with the running program (r = .37, p = .04) and there was a nonsignificant positive association between compliance and change in physical activity (r = .32, p = .08). 
Posts requiring a simple response generated the most engagement. Intervention moderators should facilitate familiarity between participants at the intervention outset to encourage engagement between participants. Engagement appears related to change in physical activity only indirectly, through compliance with the running program, and these recommendations should be incorporated to enhance the engagement and efficacy of interventions.
Empirical Analysis and Refinement of Expert System Knowledge Bases
1988-08-31
refinement. Both a simulated case generation program and a random rule basher were developed to enhance rule refinement experimentation. *Substantial... the second fiscal year 88 objective was fully met. Rule Refinement System; Simulated Rule Basher; Case Generator; Stored Cases; Expert System Knowledge... generated until the rule is satisfied. Cases may be randomly generated for a given rule or hypothesis. Rule Basher: Given that one has a correct
1989-08-01
Random variables for the conditional exponential distribution are generated using the inverse transform method. (1) Generate U ~ U(0,1). (2) Set s = -λ ln... e^(-[(x+s-γ)/η]^β + [(x-γ)/η]^β). c. Random variables from the conditional Weibull distribution are generated using the inverse transform method. (1)... using a standard normal transformation and the inverse transform method. APPENDIX B: DISTRIBUTIONS SUPPORTED BY THE MODEL. (1) Generate Y ~ P(X ≤ ...
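A minimal sketch of the inverse transform step for the conditional exponential case, assuming λ denotes the mean (scale) parameter and using the memorylessness property. This is a reconstruction from the fragmentary text above, not the report's code:

```python
import math
import random

def conditional_exponential(x, lam, rng):
    # Exponential (mean lam) conditioned on exceeding x: by
    # memorylessness this is x plus a fresh exponential, and the
    # inverse transform gives that increment as s = -lam * ln(U),
    # U ~ U(0, 1].
    u = 1.0 - rng.random()        # in (0, 1], so log(u) is finite
    return x - lam * math.log(u)

rng = random.Random(1)
draws = [conditional_exponential(2.0, 1.5, rng) for _ in range(20000)]
mean = sum(draws) / len(draws)    # should approach x + lam = 3.5
```

The same recipe, solving F(x + s | X > x) = U for s, extends to the conditional Weibull case indicated by the reconstructed survival-ratio formula above.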
Practical quantum random number generator based on measuring the shot noise of vacuum states
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen Yong; Zou Hongxin; Tian Liang
2010-06-15
The shot noise of vacuum states is a kind of quantum noise and is totally random. In this paper, a nondeterministic random number generation scheme based on measuring the shot noise of vacuum states is presented and experimentally demonstrated. We use a homodyne detector to measure the shot noise of vacuum states. Considering that the frequency bandwidth of our detector is limited, we derive the optimal sampling rate so that sampling points have the least correlation with each other. We also choose a method to extract random numbers from the sampled values, and prove that the influence of classical noise can be avoided with this method, so that the detector does not have to be shot-noise limited. The random numbers generated with this scheme have passed the ENT and Diehard tests.
Compact quantum random number generator based on superluminescent light-emitting diodes
NASA Astrophysics Data System (ADS)
Wei, Shihai; Yang, Jie; Fan, Fan; Huang, Wei; Li, Dashuang; Xu, Bingjie
2017-12-01
By measuring the amplified spontaneous emission (ASE) noise of superluminescent light-emitting diodes, we propose and realize a quantum random number generator (QRNG) designed for practicality. In the QRNG, after detection and amplification of the ASE noise, both data acquisition and randomness extraction are implemented in real time in a field-programmable gate array (FPGA), and the final random bit sequences are delivered to a host computer at a real-time generation rate of 1.2 Gbps. Further, to achieve compactness, all components of the QRNG are integrated on three independent printed circuit boards with a compact design, and the QRNG is packed in a small enclosure measuring 140 mm × 120 mm × 25 mm. The final random bit sequences pass all the NIST-STS and DIEHARD tests.
Pseudo-random properties of a linear congruential generator investigated by b-adic diaphony
NASA Astrophysics Data System (ADS)
Stoev, Peter; Stoilova, Stanislava
2017-12-01
In the proposed paper we continue the study of the diaphony, defined in the b-adic number system, and extend it in different directions. We investigate this diaphony as a tool for estimating the pseudorandom properties of some of the most widely used random number generators. This is done by evaluating the distribution of specially constructed two-dimensional nets built from the generated random numbers. The aim is to see how suitable the generated numbers are for calculations in numerical methods (Monte Carlo, etc.).
Dynamic Simulation of Random Packing of Polydispersive Fine Particles
NASA Astrophysics Data System (ADS)
Ferraz, Carlos Handrey Araujo; Marques, Samuel Apolinário
2018-02-01
In this paper, we perform molecular dynamics (MD) simulations to study the two-dimensional packing process of both monosized and randomly sized particles with radii ranging from 1.0 to 7.0 μm. The initial positions, as well as the radii, of five thousand fine particles were defined inside a rectangular box by using a random number generator. Both the translational and rotational movements of each particle were considered in the simulations. In order to deal with interacting fine particles, we take into account both the contact forces and the long-range dispersive forces. We account for normal and static/sliding tangential friction forces between particles and between particle and wall by means of a linear model approach, while the long-range dispersive forces are computed by using a Lennard-Jones-like potential. The packing processes were studied assuming different long-range interaction strengths. We carry out statistical calculations of the different quantities studied, such as packing density, mean coordination number, kinetic energy, and radial distribution function, as the system evolves over time. We find that the long-range dispersive forces can strongly influence the packing process dynamics, as they might form large particle clusters, depending on the intensity of the long-range interaction strength.
Generating constrained randomized sequences: item frequency matters.
French, Robert M; Perruchet, Pierre
2009-11-01
All experimental psychologists understand the importance of randomizing lists of items. However, randomization is generally constrained, and these constraints (in particular, not allowing immediately repeated items), while designed to eliminate particular biases, frequently engender others. We describe a simple Monte Carlo randomization technique that solves a number of these problems. However, in many experimental settings, we are concerned not only with the number and distribution of items but also with the number and distribution of transitions between items. The algorithm mentioned above provides no control over this. We therefore introduce a simple technique that uses transition tables for generating correctly randomized sequences. We present an analytic method of producing item-pair frequency tables and item-pair transitional probability tables when immediate repetitions are not allowed. We illustrate these difficulties, and how to overcome them, with reference to a classic article on word segmentation in infants. Finally, we provide free access to an Excel file that allows users to generate transition tables with up to 10 different item types, as well as to generate appropriately distributed randomized sequences of any length without immediately repeated elements. This file is freely available from http://leadserv.u-bourgogne.fr/IMG/xls/TransitionMatrix.xls.
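A minimal sketch of constrained randomization without immediate repetitions, in the spirit of (but much simpler than) the transition-table technique the authors describe; the restart-on-dead-end strategy is an assumption of this sketch, not the paper's algorithm:

```python
import random

def constrained_sequence(counts, seed=0):
    # Shuffle items with the given frequencies so that no item
    # immediately repeats. Draws greedily among items still available
    # (excluding the last one placed) and restarts on the rare dead
    # ends. Assumes a feasible profile: no item exceeds half the total.
    rng = random.Random(seed)
    total = sum(counts.values())
    while True:
        remaining = dict(counts)
        seq = []
        while any(remaining.values()):
            options = [it for it, c in remaining.items()
                       if c > 0 and (not seq or it != seq[-1])]
            if not options:
                break                 # dead end: restart from scratch
            item = rng.choice(options)
            seq.append(item)
            remaining[item] -= 1
        if len(seq) == total:
            return seq

seq = constrained_sequence({"A": 5, "B": 5, "C": 5})
```

As the abstract notes, controlling only item frequencies this way leaves transition frequencies uncontrolled; the authors' transition-table method addresses exactly that gap.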
A random spatial network model based on elementary postulates
Karlinger, Michael R.; Troutman, Brent M.
1989-01-01
A model for generating random spatial networks that is based on elementary postulates comparable to those of the random topology model is proposed. In contrast to the random topology model, this model ascribes a unique spatial specification to generated drainage networks, a distinguishing property of some network growth models. The simplicity of the postulates creates an opportunity for potential analytic investigations of the probabilistic structure of the drainage networks, while the spatial specification enables analyses of spatially dependent network properties. In the random topology model all drainage networks, conditioned on magnitude (number of first-order streams), are equally likely, whereas in this model all spanning trees of a grid, conditioned on area and drainage density, are equally likely. As a result, link lengths in the generated networks are not independent, as usually assumed in the random topology model. For a preliminary model evaluation, scale-dependent network characteristics, such as geometric diameter and link length properties, and topologic characteristics, such as bifurcation ratio, are computed for sets of drainage networks generated on square and rectangular grids. Statistics of the bifurcation and length ratios fall within the range of values reported for natural drainage networks, but geometric diameters tend to be relatively longer than those for natural networks.
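The model's premise that all spanning trees of a grid (conditioned on area) are equally likely can be illustrated with Wilson's loop-erased random-walk algorithm, which samples uniformly from the spanning trees of any connected graph. This generic sketch is not the authors' construction:

```python
import random

def uniform_spanning_tree(rows, cols, seed=0):
    # Wilson's algorithm on a rows x cols grid: from each cell not yet
    # in the tree, run a random walk until it hits the tree; the
    # "last exit" dictionary implicitly performs the loop erasure.
    # The result is uniform over all spanning trees of the grid.
    rng = random.Random(seed)
    cells = [(r, c) for r in range(rows) for c in range(cols)]
    in_tree = {cells[0]}
    parent = {}
    for start in cells:
        if start in in_tree:
            continue
        last_exit = {}
        v = start
        while v not in in_tree:      # random walk until hitting the tree
            r, c = v
            nbrs = [(r + dr, c + dc)
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if 0 <= r + dr < rows and 0 <= c + dc < cols]
            last_exit[v] = rng.choice(nbrs)
            v = last_exit[v]
        v = start
        while v not in in_tree:      # retrace the loop-erased path
            in_tree.add(v)
            parent[v] = last_exit[v]
            v = last_exit[v]
    return [(v, p) for v, p in parent.items()]

edges = uniform_spanning_tree(3, 3)  # 8 edges spanning the 9 cells
```

Sampling many such trees and tabulating diameters or link-length statistics is one concrete way to carry out the kind of preliminary model evaluation the abstract describes.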
Method and apparatus for determining position using global positioning satellites
NASA Technical Reports Server (NTRS)
Ward, John (Inventor); Ward, William S. (Inventor)
1998-01-01
A global positioning satellite receiver having an antenna for receiving an L1 signal from a satellite. The L1 signal is processed by a preamplifier stage, including a band pass filter and a low noise amplifier, and output as a radio frequency (RF) signal. A mixer receives and de-spreads the RF signal in response to a pseudo-random noise code, i.e., a Gold code, generated by an internal pseudo-random noise code generator. A microprocessor enters a code tracking loop, during which it addresses the pseudo-random code generator to cause it to sequentially output pseudo-random codes corresponding to the satellite codes used to spread the L1 signal, until correlation occurs. When an output of the mixer indicates the occurrence of correlation between the RF signal and the generated pseudo-random codes, the microprocessor enters an operational state that slews the receiver code sequence to stay locked with the satellite code sequence. The output of the mixer is provided to a detector which, in turn, controls certain routines of the microprocessor. The microprocessor outputs pseudo-range information according to an interrupt routine in response to detection of correlation. The pseudo-range information is to be telemetered to a ground station, which determines the position of the global positioning satellite receiver.
Narrow-band generation in random distributed feedback fiber laser.
Sugavanam, Srikanth; Tarasov, Nikita; Shu, Xuewen; Churkin, Dmitry V
2013-07-15
Narrow-band emission with a spectral width down to ~0.05 nm is achieved in a random distributed feedback fiber laser employing narrow-band fiber Bragg grating or fiber Fabry-Perot interferometer filters. The observed line-width is ~10 times narrower than those of other random distributed feedback fiber lasers demonstrated to date. The random DFB laser with the Fabry-Perot interferometer filter provides simultaneously multi-wavelength and narrow-band (within each line) generation, with the possibility of further wavelength tuning.
Tassie, Jean-Michel; Malateste, Karen; Pujades-Rodríguez, Mar; Poulet, Elisabeth; Bennett, Diane; Harries, Anthony; Mahy, Mary; Schechter, Mauro; Souteyrand, Yves; Dabis, François
2010-11-10
Retention of patients on antiretroviral therapy (ART) over time is a proxy for quality of care and an outcome indicator to monitor ART programs. Using existing databases (Antiretroviral in Lower Income Countries of the International Databases to Evaluate AIDS and Médecins Sans Frontières), we evaluated three sampling approaches to simplify the generation of outcome indicators. We used individual patient data from 27 ART sites and included 27,201 ART-naive adults (≥15 years) who initiated ART in 2005. For each site, we generated two outcome indicators at 12 months, retention on ART and proportion of patients lost to follow-up (LFU), first using all patient data and then within a smaller group of patients selected using three sampling methods (random, systematic and consecutive sampling). For each method and each site, 500 samples were generated, and the average result was compared with the unsampled value. The 95% sampling distribution (SD) was expressed as the 2.5th and 97.5th percentile values from the 500 samples. Overall, retention on ART was 76.5% (range 58.9-88.6) and the proportion of patients LFU, 13.5% (range 0.8-31.9). Estimates of retention from sampling (n = 5696) were 76.5% (SD 75.4-77.7) for random, 76.5% (75.3-77.5) for systematic and 76.0% (74.1-78.2) for the consecutive method. Estimates for the proportion of patients LFU were 13.5% (12.6-14.5), 13.5% (12.6-14.3) and 14.0% (12.5-15.5), respectively. With consecutive sampling, 50% of sites had SD within ±5% of the unsampled site value. Our results suggest that random, systematic or consecutive sampling methods are feasible for monitoring ART indicators at national level. However, sampling may not produce precise estimates in some sites.
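The three sampling designs compared in the study can be sketched as follows; the cohort data and sample size here are illustrative stand-ins, not the study's databases:

```python
import random

def sample_indicator(status, n, method, seed=0):
    # Estimate a proportion (e.g. 12-month retention on ART) from a
    # sample of n patient records drawn by one of three designs:
    # simple random, systematic (every k-th record from a random
    # start), or consecutive (one random contiguous block).
    rng = random.Random(seed)
    N = len(status)
    if method == "random":
        idx = rng.sample(range(N), n)
    elif method == "systematic":
        step = N // n
        start = rng.randrange(step)
        idx = list(range(start, N, step))[:n]
    elif method == "consecutive":
        start = rng.randrange(N - n + 1)
        idx = range(start, start + n)
    else:
        raise ValueError(method)
    return sum(status[i] for i in idx) / n

# Hypothetical cohort of 1000 patients with true retention 76.5%.
cohort = [1] * 765 + [0] * 235
random.Random(42).shuffle(cohort)
estimates = {m: sample_indicator(cohort, 200, m)
             for m in ("random", "systematic", "consecutive")}
```

Repeating each draw many times and taking the 2.5th and 97.5th percentiles of the estimates reproduces the study's 95% sampling distribution; consecutive sampling is the design most vulnerable to enrollment-order effects, matching its wider intervals in the abstract.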
Inventory of Amphibians and Reptiles in Southern Colorado Plateau National Parks
Persons, Trevor B.; Nowak, Erika M.
2006-01-01
In fiscal year 2000, the National Park Service (NPS) initiated a nationwide program to inventory vertebrates and vascular plants within the National Parks, and an inventory plan was developed for the 19 park units in the Southern Colorado Plateau Inventory & Monitoring Network. We surveyed 12 parks in this network for reptiles and amphibians between 2001 and 2003. The overall goals of our herpetofaunal inventories were to document 90% of the species present, identify park-specific species of special concern, and, based on the inventory results, make recommendations for the development of an effective monitoring program. We used the following standardized herpetological methods to complete the inventories: time-area constrained searches, visual encounter ('general') surveys, and nighttime road cruising. We also recorded incidental species sightings and surveyed existing literature and museum specimen databases. We found 50 amphibian and reptile species during fieldwork. These included 1 salamander, 11 anurans, 21 lizards, and 17 snakes. Literature reviews, museum specimen data records, and personal communications with NPS staff added an additional eight species, including one salamander, one turtle, one lizard, and five snakes. It was necessary to use a variety of methods to detect all species in each park. Randomly generated 1-ha time-area constrained searches and night drives produced the fewest species and individuals of all the methods, while general surveys and randomly generated 10-ha time-area constrained searches produced the most. Inventory completeness was likely compromised by a severe drought across the region during our surveys. In most parks we did not come close to the goal of detecting 90% of the expected species present; however, we did document several species range extensions. Effective monitoring programs for herpetofauna on the Colorado Plateau should use a variety of methods to detect species, and focus on taxa-specific methods.
Randomly-generated plots must take into account microhabitat and aquatic features to be effective at sampling for herpetofauna.
Mendelow, A. David; Rowan, Elise N.; Francis, Richard; McColl, Elaine; McNamee, Paul; Chambers, Iain R.; Unterberg, Andreas; Boyers, Dwayne; Mitchell, Patrick M.
2015-01-01
Abstract Intraparenchymal hemorrhages occur in a proportion of severe traumatic brain injury (TBI) patients, but the role of surgery in their treatment is unclear. This international multi-center, patient-randomized, parallel-group trial compared early surgery (hematoma evacuation within 12 h of randomization) with initial conservative treatment (subsequent evacuation allowed if deemed necessary). Patients were randomized using an independent randomization service within 48 h of TBI. Patients were eligible if they had no more than two intraparenchymal hemorrhages of 10 mL or more and did not have an extradural or subdural hematoma that required surgery. The primary outcome measure was the traditional dichotomous split of the Glasgow Outcome Scale obtained by postal questionnaires sent directly to patients at 6 months. The trial was halted early by the UK funding agency (NIHR HTA) for failure to recruit sufficient patients from the UK (trial registration: ISRCTN19321911). A total of 170 patients were randomized from 31 of 59 registered centers worldwide. Of 82 patients randomized to early surgery with complete follow-up, 30 (37%) had an unfavorable outcome. Of 85 patients randomized to initial conservative treatment with complete follow-up, 40 (47%) had an unfavorable outcome (odds ratio, 0.65; 95% confidence interval (CI), 0.35, 1.21; p=0.17), with an absolute benefit of 10.5% (CI, −4.4-25.3%). There were significantly more deaths in the first 6 months in the initial conservative treatment group (33% vs. 15%; p=0.006). The 10.5% absolute benefit with early surgery was consistent with the initial power calculation. However, with the low sample size resulting from the premature termination, we cannot exclude the possibility that this could be a chance finding. A further trial is required urgently to assess whether this encouraging signal can be confirmed. PMID:25738794
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giaddui, T; Li, N; Moore, K
Purpose: To establish a workflow for NRG-GY006 IMRT pre-treatment reviews, incorporating advanced radiotherapy technologies being evaluated as part of the clinical trial. Methods: Pre-treatment reviews are required for every IMRT case as part of NRG-GY006 (a randomized phase II trial of radiation therapy and cisplatin, alone or in combination with intravenous triapine, in women with newly diagnosed bulky stage IB2, stage II, IIIB, or IVA cancer of the uterine cervix or stage II-IVA vaginal cancer). The pre-treatment review process includes structure review, generation of an active bone marrow (ABM) structure (to be used as an avoidance structure during IMRT optimization), and evaluation of initial IMRT plan quality using knowledge-engineering-based planning (KBP). Institutions will initially submit their simulation CT scan, structures file, and PET/CT to the IROC QA center for generation of the ABM. The ABM will be returned to the institution for use in planning. Institutions will then submit an initial IMRT plan for review and will receive information back following implementation of a KBP algorithm, for use in re-optimization, before submitting the final IMRT plan used for treatment. Results: The ABM structure is generated using MIM Vista software (version 6.5, MIM Corporation, Inc.). Here, the planning CT and the diagnostic PET/CT are fused, and a sub-threshold structure is auto-segmented above the mean SUV of the bone marrow. The generated ABM structures were compared with those generated with other software systems (e.g., Velocity, Varian), and a Dice coefficient (reflecting the overlap of structures) ranging between 80-90% was achieved. A KBP model was built in the Varian Eclipse TPS using the RapidPlan KBP software to perform plan quality assurance. Conclusion: The workflow for IMRT pre-treatment reviews has been established. It represents a major improvement in NRG Oncology clinical trial quality assurance and incorporates the latest radiotherapy technologies as part of NCI clinical trials.
This project was supported by grants U24CA180803 (IROC), UG1CA189867 (NCORP), U10CA180868 (NRG Oncology Operations), U10CA180822 (NRG Oncology SDMC) from the National Cancer Institute (NCI) and a PA CURE grant.
Programmable quantum random number generator without postprocessing.
Nguyen, Lac; Rehain, Patrick; Sua, Yong Meng; Huang, Yu-Ping
2018-02-15
We demonstrate a viable source of unbiased quantum random numbers whose statistical properties can be arbitrarily programmed without the need for any postprocessing such as randomness distillation or distribution transformation. It is based on measuring the arrival time of single photons in shaped temporal modes that are tailored with an electro-optical modulator. We show that quantum random numbers can be created directly in customized probability distributions and pass all randomness tests of the NIST and Dieharder test suites without any randomness extraction. The min-entropies of such generated random numbers are measured close to the theoretical limits, indicating their near-ideal statistics and ultrahigh purity. Easy to implement and arbitrarily programmable, this technique can find versatile uses in a multitude of data analysis areas.
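The paper programs output distributions optically, by shaping temporal modes with an electro-optic modulator. A classical software analogue of "creating random numbers directly in customized probability distributions" is inverse-transform sampling; the sketch below shows that idea only (the distribution and names are invented, and this is emphatically not the authors' postprocessing-free optical scheme):

```python
import bisect
import random

# Inverse-transform sampling: map a uniform random draw through the
# cumulative distribution of an arbitrary discrete target distribution.

def sample_custom(pmf, n, rng=None):
    """Draw n samples from a discrete pmf given as [(outcome, prob), ...]."""
    rng = rng or random.Random(42)
    outcomes, probs = zip(*pmf)
    cdf, total = [], 0.0
    for p in probs:
        total += p
        cdf.append(total)
    # bisect_left finds the first CDF entry >= the uniform draw.
    return [outcomes[bisect.bisect_left(cdf, rng.random())] for _ in range(n)]

# Target: a biased three-outcome distribution (probabilities sum to 1).
samples = sample_custom([("a", 0.5), ("b", 0.25), ("c", 0.25)], 10000)
freq_a = samples.count("a") / len(samples)
print(round(freq_a, 2))
```

The empirical frequency of "a" concentrates near its programmed probability of 0.5 as the sample count grows.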
The role of ferroelectric domain structure in second harmonic generation in random quadratic media.
Roppo, Vito; Wang, W; Kalinowski, K; Kong, Y; Cojocaru, C; Trull, J; Vilaseca, R; Scalora, M; Krolikowski, W; Kivshar, Yu
2010-03-01
We study theoretically and numerically the second harmonic generation in a nonlinear crystal with a random distribution of ferroelectric domains. We show that the specific features of the disordered domain structure greatly affect the emission pattern of the generated harmonics. This phenomenon can be used to characterize the degree of disorder in nonlinear photonic structures.
NASA Astrophysics Data System (ADS)
Zou, Guang'an; Wang, Qiang; Mu, Mu
2016-09-01
Sensitive areas for prediction of the Kuroshio large meander using a 1.5-layer, shallow-water ocean model were investigated using the conditional nonlinear optimal perturbation (CNOP) and first singular vector (FSV) methods. A series of sensitivity experiments were designed to test the sensitivity of sensitive areas within the numerical model. The following results were obtained: (1) the effect of initial CNOP and FSV patterns in their sensitive areas is greater than that of the same patterns in randomly selected areas, with the effect of the initial CNOP patterns in CNOP sensitive areas being the greatest; (2) both CNOP- and FSV-type initial errors grow more quickly than random errors; (3) the effect of random errors superimposed on the sensitive areas is greater than that of random errors introduced into randomly selected areas, and initial errors in the CNOP sensitive areas have greater effects on final forecasts. These results reveal that the sensitive areas determined using the CNOP are more sensitive than those of FSV and other randomly selected areas. In addition, ideal hindcasting experiments were conducted to examine the validity of the sensitive areas. The results indicate that reduction (or elimination) of CNOP-type errors in CNOP sensitive areas at the initial time has a greater forecast benefit than the reduction (or elimination) of FSV-type errors in FSV sensitive areas. These results suggest that the CNOP method is suitable for determining sensitive areas in the prediction of the Kuroshio large-meander path.
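For readers unfamiliar with the two methods, the CNOP and FSV can be stated compactly. This is the standard formulation from the predictability literature, with notation chosen here rather than quoted from the paper:

```latex
% CNOP: the initial perturbation \delta x_0^* of norm at most \beta that
% maximizes the fully nonlinear growth of the forecast difference at lead
% time \tau, where M_\tau is the nonlinear propagator of the model:
\delta x_0^{*} \;=\; \arg\max_{\|\delta x_0\| \le \beta}
  \bigl\| M_\tau(x_0 + \delta x_0) - M_\tau(x_0) \bigr\|
% FSV: the leading singular vector of the tangent-linear propagator
% L_\tau, i.e., the perturbation maximizing the linearized growth ratio:
\delta x_0^{\mathrm{FSV}} \;=\; \arg\max_{\delta x_0 \ne 0}
  \frac{\| L_\tau\, \delta x_0 \|}{\| \delta x_0 \|}
```

The experiments in the abstract compare how errors of these two optimal types, and plain random errors, grow when placed in the respective sensitive areas.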
Blencowe, Natalie S; Cook, Jonathan A; Pinkney, Thomas; Rogers, Chris; Reeves, Barnaby C; Blazeby, Jane M
2017-04-01
Randomized controlled trials in surgery are notoriously difficult to design and conduct due to numerous methodological and cultural challenges. Over the last 5 years, several UK-based surgical trial-related initiatives have been funded to address these issues. These include the development of Surgical Trials Centers and Surgical Specialty Leads (individual surgeons responsible for championing randomized controlled trials in their specialist fields), both funded by the Royal College of Surgeons of England; networks of research-active surgeons in training; and investment in methodological research relating to surgical randomized controlled trials (to address issues such as recruitment, blinding, and the selection and standardization of interventions). This article discusses these initiatives in more detail and provides exemplar cases to illustrate how the methodological challenges have been tackled. The initiatives have surpassed expectations, resulting in a renaissance in surgical research throughout the United Kingdom, such that the number of patients entering surgical randomized controlled trials has doubled.
Experimental and theoretical study of combustion jet ignition
NASA Technical Reports Server (NTRS)
Chen, D. Y.; Ghoniem, A. F.; Oppenheim, A. K.
1983-01-01
A combustion jet ignition system was developed to generate turbulent jets of combustion products containing free radicals and to discharge them as ignition sources into a combustible medium. In order to understand the ignition and inflammation processes caused by combustion jets, studies of the fluid mechanical properties of turbulent jets with and without combustion were conducted theoretically and experimentally. Experiments using a specially designed igniter, with a prechamber to build up and control the stagnation pressure upstream of the orifice, were conducted to investigate the formation processes of turbulent jets of combustion products. The penetration speed of combustion jets was found to be constant initially and then to decrease monotonically as the turbulent jets of combustion products travel closer to the wall. This initial penetration speed is proportional to the initial stagnation pressure upstream of the orifice for the same stoichiometric mixture. Computer simulations by Chorin's Random Vortex Method, implemented with a flame propagation algorithm for the theoretical model of turbulent jets with and without combustion, were performed to study the turbulent jet flow field. In the formation processes of the turbulent jets, the large-scale eddy structure of turbulence, the so-called coherent structure, dominates the entrainment and mixing processes. The large-scale eddy structure of turbulent jets in this study is constructed from a series of vortex pairs, which are organized in the form of a staggered array of vortex clouds generating local recirculation flow patterns.
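The core kinematic ingredient of Chorin's Random Vortex Method is the velocity induced at a point by a collection of regularized point vortices ("vortex blobs"). The sketch below shows only that building block for a single counter-rotating pair, the basic eddy unit mentioned in the abstract; it omits the random-walk diffusion step, the flame propagation algorithm, and all boundary treatment, and the numbers are invented:

```python
import math

def induced_velocity(x, y, vortices, delta=0.05):
    """Velocity at (x, y) induced by vortex blobs.
    Each vortex is (x0, y0, gamma); delta is a smoothing core radius
    that regularizes the 1/r singularity of an ideal point vortex."""
    u = v = 0.0
    for (x0, y0, gamma) in vortices:
        dx, dy = x - x0, y - y0
        r2 = dx * dx + dy * dy + delta * delta  # blob regularization
        u += -gamma * dy / (2.0 * math.pi * r2)
        v += gamma * dx / (2.0 * math.pi * r2)
    return u, v

# A counter-rotating vortex pair with circulations +1 and -1:
pair = [(0.0, 0.1, 1.0), (0.0, -0.1, -1.0)]
u, v = induced_velocity(1.0, 0.0, pair)
print(round(u, 4), round(v, 4))
```

On the symmetry axis of the pair the transverse velocity cancels and the streamwise velocity is positive, which is why such pairs self-propel and entrain fluid as the jet forms.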
Object-based change detection method using refined Markov random field
NASA Astrophysics Data System (ADS)
Peng, Daifeng; Zhang, Yongjun
2017-01-01
In order to fully consider the local spatial constraints between neighboring objects in object-based change detection (OBCD), an OBCD approach is presented by introducing a refined Markov random field (MRF). First, two periods of images are stacked and segmented to produce image objects. Second, object spectral and textural histogram features are extracted, and the G-statistic is implemented to measure the distance among different histogram distributions. Meanwhile, object heterogeneity is calculated by combining spectral and textural histogram distances using adaptive weights. Third, an expectation-maximization algorithm is applied to determine the change category of each object, and the initial change map is then generated. Finally, a refined change map is produced by employing the proposed refined object-based MRF method. Three experiments were conducted and compared with some state-of-the-art unsupervised OBCD methods to evaluate the effectiveness of the proposed method. Experimental results demonstrate that the proposed method obtains the highest accuracy among the methods used in this paper, which confirms its validness and effectiveness in OBCD.
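The G-statistic used to compare histogram features is a log-likelihood-ratio relative of the chi-squared statistic. A minimal sketch (treating the two object histograms as the rows of a 2 x k contingency table; function names invented, and the paper's exact weighting is not reproduced):

```python
import math

def g_statistic(hist_a, hist_b):
    """G-test distance between two count histograms:
    G = 2 * sum(O * ln(O / E)) over all cells with O > 0,
    where E is the expected count under homogeneity."""
    total_a, total_b = sum(hist_a), sum(hist_b)
    grand = total_a + total_b
    g = 0.0
    for oa, ob in zip(hist_a, hist_b):
        col = oa + ob
        for obs, row_total in ((oa, total_a), (ob, total_b)):
            if obs > 0:
                expected = row_total * col / grand
                g += 2.0 * obs * math.log(obs / expected)
    return g

same = g_statistic([10, 20, 30], [10, 20, 30])  # identical shapes -> 0
diff = g_statistic([10, 20, 30], [30, 20, 10])  # dissimilar shapes -> large
print(round(same, 6), round(diff, 2))
```

Identically shaped histograms give G = 0, and G grows with distributional dissimilarity, which is what makes it usable as an object-heterogeneity distance.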
Perrone, T M; Gonzatti, M I; Villamizar, G; Escalante, A; Aso, P M
2009-05-12
Nine Trypanosoma sp. Venezuelan isolates, initially presumed to be T. evansi, were collected from three different hosts, capybara (Apure state), horse (Apure state) and donkey (Guarico state), and compared by the random amplified polymorphic DNA (RAPD) technique. Thirty-one to 46 reproducible fragments were obtained with 12 of the 40 primers that were used. Most of the primers detected molecular profiles with few polymorphisms between the seven horse, capybara and donkey isolates. Quantitative analyses of the RAPD profiles of these isolates revealed a high degree of genetic conservation, with similarity coefficients between 85.7% and 98.5%. Ten of the primers generated polymorphic RAPD profiles with two of the three Trypanosoma sp. horse isolates, namely TeAp-N/D1 and TeGu-N/D1. The similarity coefficient between these two isolates and the rest ranged from 57.9% to 68.4%, and the corresponding dendrogram clustered TeAp-N/D1 and TeGu-N/D1 in a genetically distinct group.
A Numerical Study of New Logistic Map
NASA Astrophysics Data System (ADS)
Khmou, Youssef
In this paper, we propose a new logistic map based on the information entropy relation, and we study its bifurcation diagram in comparison with that of the standard logistic map. In the first part, we compare the diagram obtained by numerical simulations with that of the standard logistic map. It is found that the structures of both diagrams are similar when the range of the growth parameter is restricted to the interval [0, e]. In the second part, we present an application of the proposed map to traffic flow using a macroscopic model. It is found that the bifurcation diagram is an exact model of Greenberg's model of traffic flow, where the growth parameter corresponds to the optimal velocity and the random sequence corresponds to the density. In the last part, we present a second possible application of the proposed map, which consists of random number generation. The results of the analysis show that the excluded initial values of the sequences are (0,1).
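The comparison described above can be reproduced in a few lines. The sketch below iterates the standard logistic map alongside an entropy-based variant f(x) = -r*x*ln(x); that variant's exact form is an assumption made here, chosen only because it keeps orbits in [0, 1] precisely for r in [0, e], matching the growth-parameter range stated in the abstract:

```python
import math

def iterate(f, r, x0=0.3, burn=500, keep=50):
    """Discard a transient of `burn` steps, then return `keep` orbit points."""
    x = x0
    for _ in range(burn):
        x = f(r, x)
    orbit = []
    for _ in range(keep):
        x = f(r, x)
        orbit.append(x)
    return orbit

logistic = lambda r, x: r * x * (1.0 - x)
entropy_map = lambda r, x: -r * x * math.log(x) if x > 0.0 else 0.0

settled = iterate(logistic, 2.5)     # converges to the fixed point 1 - 1/r = 0.6
chaotic = iterate(logistic, 3.99)    # chaotic regime: many distinct values
e_orbit = iterate(entropy_map, 2.7)  # stays inside (0, 1) since 2.7 < e
print(round(settled[-1], 3), len(set(chaotic)) > 10, max(e_orbit) < 1.0)
```

Sweeping r over its range and plotting the retained orbit points against r yields the bifurcation diagram; in the chaotic regime the orbit itself serves as the random sequence discussed in the last part of the abstract.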
Meta-RaPS Algorithm for the Aerial Refueling Scheduling Problem
NASA Technical Reports Server (NTRS)
Kaplan, Sezgin; Arin, Arif; Rabadi, Ghaith
2011-01-01
The Aerial Refueling Scheduling Problem (ARSP) can be defined as determining the refueling completion times for each fighter aircraft (job) on multiple tankers (machines). ARSP assumes that jobs have different release times and due dates. The total weighted tardiness is used to evaluate a schedule's quality. Therefore, ARSP can be modeled as a parallel machine scheduling problem with release times and due dates in which the total weighted tardiness is minimized. Since ARSP is NP-hard, it is more appropriate to develop an approximate or heuristic algorithm to obtain solutions in reasonable computation times. In this paper, the Meta-RaPS-ATC algorithm is implemented to create high-quality solutions. Meta-RaPS (Meta-heuristic for Randomized Priority Search) is a recent and promising metaheuristic that is applied by introducing randomness into a construction heuristic. The Apparent Tardiness Cost (ATC) rule, which is a good rule for scheduling problems with a tardiness objective, is used to construct initial solutions, which are then improved by an exchange operation. Results are presented for generated instances.
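The combination of the ATC priority rule with Meta-RaPS-style randomized construction can be sketched compactly. The following is a single-machine simplification (the paper's problem is parallel-machine with release times; the job data, the improvement step's omission, and all parameter values here are invented for illustration):

```python
import math
import random

def atc_index(job, t, k, p_bar):
    """Apparent Tardiness Cost priority for job (w, p, d) =
    (weight, processing time, due date) at time t."""
    w, p, d = job
    return (w / p) * math.exp(-max(d - p - t, 0.0) / (k * p_bar))

def meta_raps_schedule(jobs, priority_pct=0.7, k=2.0, rng=None):
    """Meta-RaPS-style construction: with probability priority_pct take
    the best-ATC job, otherwise a random unscheduled one."""
    rng = rng or random.Random(7)
    p_bar = sum(p for _, p, _ in jobs) / len(jobs)
    remaining, t, order = list(jobs), 0.0, []
    while remaining:
        remaining.sort(key=lambda j: -atc_index(j, t, k, p_bar))
        pick = remaining[0] if rng.random() < priority_pct else rng.choice(remaining)
        remaining.remove(pick)
        order.append(pick)
        t += pick[1]
    return order

def total_weighted_tardiness(order):
    t = wt = 0.0
    for w, p, d in order:
        t += p
        wt += w * max(t - d, 0.0)
    return wt

jobs = [(3, 4, 6), (1, 2, 9), (2, 5, 5), (4, 3, 7)]
best = min((meta_raps_schedule(jobs, rng=random.Random(s)) for s in range(30)),
           key=total_weighted_tardiness)
print(total_weighted_tardiness(best))
```

Running the randomized construction many times and keeping the best schedule is the essence of Meta-RaPS; the full algorithm would additionally improve each constructed schedule with an exchange operation.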
A Micro-Computer Model for Army Air Defense Training.
1985-03-01
The random number generator's period is 32,763 numbers generated before a repetitive sequence is encountered on the development system. Chi-squared tests for frequency and periodicity were applied, tracking positions in a test array; this was done with several different random number seeds, and in each case 32,763 random numbers were generated before a repetitive sequence was encountered.
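The period measurement described in the report can be reproduced generically by recording generator states until one repeats. The sketch below uses a toy linear congruential generator with invented parameters, not those of the 1985 system (note its full period is the modulus 32,768, slightly different from the report's stated 32,763):

```python
def lcg(seed, a=1103515245, c=12345, m=2**15):
    """A toy linear congruential generator: x <- (a*x + c) mod m.
    Parameters are illustrative; by the Hull-Dobell theorem this choice
    achieves the full period m = 32768."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

def period(gen_factory, seed):
    """Length of the cycle reached from `seed`, found by recording
    each state's first occurrence index."""
    seen = {}
    for i, x in enumerate(gen_factory(seed)):
        if x in seen:
            return i - seen[x]
        seen[x] = i

print(period(lcg, seed=1))  # full period: 32768
```

Because an LCG's next state depends only on the current one, the first repeated state closes the cycle, so the index difference is exactly the period.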
Experimentally Generated Random Numbers Certified by the Impossibility of Superluminal Signaling
NASA Astrophysics Data System (ADS)
Bierhorst, Peter; Shalm, Lynden K.; Mink, Alan; Jordan, Stephen; Liu, Yi-Kai; Rommal, Andrea; Glancy, Scott; Christensen, Bradley; Nam, Sae Woo; Knill, Emanuel
Random numbers are an important resource for applications such as numerical simulation and secure communication. However, it is difficult to certify whether a physical random number generator is truly unpredictable. Here, we exploit the phenomenon of quantum nonlocality in a loophole-free photonic Bell test experiment to obtain data containing randomness that cannot be predicted by any theory that does not also allow the sending of signals faster than the speed of light. To certify and quantify the randomness, we develop a new protocol that performs well in an experimental regime characterized by low violation of Bell inequalities. Applying an extractor function to our data, we obtain 256 new random bits, uniform to within 10⁻³.
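The extractor applied in the experiment is a seeded randomness extractor with formal uniformity guarantees. For intuition only, the sketch below shows the far simpler von Neumann debiaser, which removes bias from independent coin flips by keeping one bit per unequal pair; it is not the extractor used in the paper:

```python
def von_neumann_extract(bits):
    """Von Neumann debiasing: map bit pairs 01 -> 0, 10 -> 1,
    and discard the pairs 00 and 11."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

raw = [1, 1, 0, 1, 1, 0, 0, 0, 0, 1]
print(von_neumann_extract(raw))  # -> [0, 1, 0]
```

For a biased but independent source, each surviving bit is exactly fair, at the cost of discarding a large fraction of the raw data; modern extractors achieve far better rates under weaker assumptions.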
NASA Technical Reports Server (NTRS)
Lindsey, R. S., Jr. (Inventor)
1975-01-01
An exemplary embodiment of the present invention provides a source of random width and random spaced rectangular voltage pulses whose mean or average frequency of operation is controllable within prescribed limits of about 10 hertz to 1 megahertz. A pair of thin-film metal resistors are used to provide a differential white noise voltage pulse source. Pulse shaping and amplification circuitry provide relatively short duration pulses of constant amplitude which are applied to anti-bounce logic circuitry to prevent ringing effects. The pulse outputs from the anti-bounce circuits are then used to control two one-shot multivibrators whose output comprises the random length and random spaced rectangular pulses. Means are provided for monitoring, calibrating and evaluating the relative randomness of the generator.
Source-Device-Independent Ultrafast Quantum Random Number Generation.
Marangon, Davide G; Vallone, Giuseppe; Villoresi, Paolo
2017-02-10
Secure random numbers are a fundamental element of many applications in science, statistics, cryptography and, more generally, in security protocols. We present a method that enables the generation of high-speed unpredictable random numbers from the quadratures of an electromagnetic field without any assumption on the input state. The method allows us to eliminate the numbers that can be predicted due to the presence of classical and quantum side information. In particular, we introduce a procedure to estimate a bound on the conditional min-entropy based on the entropic uncertainty principle for position and momentum observables of infinite-dimensional quantum systems. By the above method, we experimentally demonstrated the generation of secure true random bits at a rate greater than 1.7 Gbit/s.
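Min-entropy, the quantity bounded in the paper, measures randomness by the single most likely outcome. The sketch below computes the plug-in empirical estimate H_min = -log2(max_x Pr[x]) from samples; this is only the unconditional, observed-data version, not the conditional bound derived from the entropic uncertainty relation:

```python
import math
from collections import Counter

def min_entropy_per_symbol(samples):
    """Empirical min-entropy H_min = -log2(max_x Pr[x]) of the
    observed symbol distribution, in bits per symbol."""
    counts = Counter(samples)
    p_max = max(counts.values()) / len(samples)
    return -math.log2(p_max)

# A uniform 8-symbol source yields 3 bits/symbol; a fair-coin source, 1.
uniform = list(range(8)) * 100
biased = [0] * 500 + [1] * 500
print(min_entropy_per_symbol(uniform), min_entropy_per_symbol(biased))
```

The min-entropy bound tells the extractor how many nearly uniform bits may safely be distilled per raw sample.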
Annoni, J.; Pegna, A.
1997-01-01
OBJECTIVE: To test the hypothesis that, during random motor generation, the spatial contingencies inherent to the task would induce additional preferences in normal subjects, shifting their performances farther from randomness. By contrast, perceptual or executive dysfunction could alter these task-related biases in patients with brain damage. METHODS: Two groups of patients, with right and left focal brain lesions, as well as 25 right-handed subjects matched for age and handedness, were asked to execute a random choice motor task: namely, to generate a random series of 180 button presses from a set of 10 keys placed vertically in front of them. RESULTS: In the control group, as in the left brain lesion group, motor generation was subject to deviations from theoretical expected randomness, similar to those when numbers are generated mentally, as immediate repetitions (successive presses on the same key) are avoided. However, the distribution of button presses was also contingent on the topographic disposition of the keys: the central keys were chosen more often than those placed at extreme positions. Small distances were favoured, particularly with the left hand. These patterns were influenced by implicit strategies and task-related contingencies. By contrast, right brain lesion patients with frontal involvement tended to show a more square distribution of key presses; that is, the number of key presses tended to be more equally distributed. The strategies were also altered by brain lesions: the number of immediate repetitions was more frequent when the lesion involved the right frontal areas, yielding a random generation nearer to expected theoretical randomness. The frequency of adjacent key presses was increased by right anterior and left posterior cortical as well as by right subcortical lesions, but decreased by left subcortical lesions.
CONCLUSIONS: Depending on the side of the lesion and the degree of cortical-subcortical involvement, the deficits take on a different aspect, and direct repetitions and adjacent key presses show different patterns of alteration. Random motor generation is therefore a complex task which seems to require the participation of numerous cerebral structures, among which those situated in the right frontal, left posterior, and subcortical regions play a predominant role. PMID:9408109
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.; Dlugach, Janna M.; Zakharova, Nadezhda T.
2016-01-01
The numerically exact superposition T-matrix method is used to model far-field electromagnetic scattering by two types of particulate object. Object 1 is a fixed configuration which consists of N identical spherical particles (with N = 200 or 400) quasi-randomly populating a spherical volume V having a median size parameter of 50. Object 2 is a true discrete random medium (DRM) comprising the same number N of particles randomly moving throughout V. The median particle size parameter is fixed at 4. We show that if Object 1 is illuminated by a quasi-monochromatic parallel beam, then it generates a typical speckle pattern having no resemblance to the scattering pattern generated by Object 2. However, if Object 1 is illuminated by a parallel polychromatic beam with a 10% bandwidth, then it generates a scattering pattern that is largely devoid of speckles and closely reproduces the quasi-monochromatic pattern generated by Object 2. This result serves to illustrate the capacity of the concept of electromagnetic scattering by a DRM to encompass fixed quasi-random particulate samples, provided that they are illuminated by polychromatic light.
N-state random switching based on quantum tunnelling
NASA Astrophysics Data System (ADS)
Bernardo Gavito, Ramón; Jiménez Urbanos, Fernando; Roberts, Jonathan; Sexton, James; Astbury, Benjamin; Shokeir, Hamzah; McGrath, Thomas; Noori, Yasir J.; Woodhead, Christopher S.; Missous, Mohamed; Roedig, Utz; Young, Robert J.
2017-08-01
In this work, we show how the hysteretic behaviour of resonant tunnelling diodes (RTDs) can be exploited for new functionalities. In particular, the RTDs exhibit a stochastic two-state switching mechanism that could be useful for random number generation and cryptographic applications. This behaviour can be scaled to N-bit switching by connecting several RTDs in series. The InGaAs/AlAs RTDs used in our experiments display very sharp negative differential resistance (NDR) peaks at room temperature, with hysteresis cycles whose switching threshold is not fixed but follows a probability distribution about a central value. We propose to use this intrinsic uncertainty, emerging from the quantum nature of the RTDs, as a source of randomness. We show that a combination of two RTDs in series results in devices with three-state outputs and discuss the possibility of scaling to N-state devices by subsequent series connections of RTDs, which we demonstrate up to the 4-state case. We suggest that the intrinsic uncertainty in the conduction paths of resonant tunnelling diodes can serve as a source of randomness that can be integrated into current electronics to produce on-chip true random number generators. The N-shaped I-V characteristic of RTDs results in a two-level random voltage output when driven with current pulse trains. Electrical characterisation and randomness testing of the devices were conducted in order to assess the validity of the true-randomness assumption. Based on the results obtained for the single-RTD case, we suggest the possibility of using multi-well devices to generate N-state random switching devices for use in random number generation or multi-valued logic devices.
Hepatitis B in Moroccan-Dutch: a quantitative study into determinants of screening participation.
Hamdiui, Nora; Stein, Mart L; Timen, Aura; Timmermans, Danielle; Wong, Albert; van den Muijsenbergh, Maria E T C; van Steenbergen, Jim E
2018-03-29
In November 2016, the Dutch Health Council recommended hepatitis B (HBV) screening for first-generation immigrants from HBV endemic countries. However, these communities show relatively low attendance rates for screening programmes, and our knowledge of their participation behaviour is limited. We identified determinants associated with the intention to request an HBV screening test in first-generation Moroccan-Dutch immigrants. We also investigated the influence of non-refundable costs for HBV screening on their intention. Offline and online questionnaires were distributed among first- and second/third-generation Moroccan-Dutch immigrants using respondent-driven sampling. Random forest analyses were conducted to determine which determinants had the greatest impact on (1) the intention to request an HBV screening test on one's own initiative, and (2) the intention to participate in non-refundable HBV screening at €70,-. Of the 379 Moroccan-Dutch respondents, 49.3% intended to request a test on their own initiative, and 44.1% were willing to attend non-refundable screening for €70,-. Clarity regarding infection status, not having symptoms, fatalism, perceived self-efficacy, and perceived risk of having HBV were the strongest predictors of requesting a test. Shame and stigma, fatalism, perceived burden of screening participation, and social influence of Islamic religious leaders had the greatest predictive value for not intending to participate in screening at €70,- non-refundable costs. Perceived severity and possible health benefit were facilitators for this intention measure. These predictions were satisfactorily accurate, as the random forest method yielded area under the curve scores of 0.72 for intention to request a test and 0.67 for intention to participate in screening at €70,- non-refundable costs. By the use of respondent-driven sampling, we succeeded in studying screening behaviour among a hard-to-reach minority population.
Despite the limitations associated with correlated data and the sampling method, we recommend (1) incorporating clarity regarding HBV status, (2) stressing the risk of an asymptomatic infection, (3) emphasising mother-to-child transmission as the main transmission route, and (4) teaming up with Islamic religious leaders to help decrease elements of fatalism, shame, and stigma, in order to enhance screening uptake of Moroccan immigrants in the Netherlands.
Philip, Femi; Stewart, Susan; Southard, Jeffrey A
2016-07-01
The relative safety of drug-eluting stents (DES) and bare-metal stents (BMS) in primary percutaneous coronary intervention (PPCI) in ST elevation myocardial infarction (STEMI) continues to be debated. The long-term clinical outcomes of second generation DES and BMS for primary percutaneous coronary intervention (PCI) were compared using network meta-analysis. Randomized controlled trials comparing stent types (first generation DES, second generation DES, or BMS) were considered for inclusion. A search strategy used Medline, Embase, Cochrane databases, and proceedings of international meetings. Information about study design, inclusion criteria, and sample characteristics was extracted. Network meta-analysis was used to pool direct (comparison of second generation DES to BMS) and indirect evidence (first generation DES with BMS and second generation DES) from the randomized trials. Twelve trials comparing all stent types, including 9,673 patients randomly assigned to treatment groups, were analyzed. Second generation DES was associated with a significantly lower incidence of definite or probable ST (OR 0.59, 95% CI 0.39-0.89), MI (OR 0.59, 95% CI 0.39-0.89), and TVR at 3 years (OR 0.50, 95% CI 0.31-0.81) compared with BMS. In addition, there was a significantly lower incidence of MACE with second generation DES versus BMS (OR 0.54, 95% CI 0.34-0.74) at 3 years. These differences were driven by higher rates of TVR, MI, and stent thrombosis in the BMS group at 3 years. There was a non-significant reduction in overall and cardiac mortality [OR 0.83, 95% CI 0.60-1.14 and OR 0.88, 95% CI 0.6-1.28, respectively] with the use of second generation DES versus BMS at 3 years. Network meta-analysis of randomized trials of primary PCI demonstrated a lower incidence of MACE, MI, TVR, and stent thrombosis with second generation DES compared with BMS. © 2016 Wiley Periodicals, Inc.
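The elementary building block of pooling direct and indirect evidence is the Bucher adjusted indirect comparison: on the log odds-ratio scale, the indirect estimate of A vs B through a common comparator C is the difference of the two direct estimates, with variances adding. A sketch follows; the second input (first-generation DES vs BMS) is a hypothetical number, not a result from the abstract:

```python
import math

def bucher_indirect(or_ac, ci_ac, or_bc, ci_bc):
    """Bucher adjusted indirect comparison of A vs B through a common
    comparator C, from odds ratios and their 95% CIs.
    log OR_AB = log OR_AC - log OR_BC; variances add on the log scale."""
    def log_or_and_se(odds_ratio, ci):
        lo, hi = ci
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        return math.log(odds_ratio), se
    l_ac, se_ac = log_or_and_se(or_ac, ci_ac)
    l_bc, se_bc = log_or_and_se(or_bc, ci_bc)
    l_ab = l_ac - l_bc
    se_ab = math.sqrt(se_ac ** 2 + se_bc ** 2)
    return (math.exp(l_ab),
            (math.exp(l_ab - 1.96 * se_ab), math.exp(l_ab + 1.96 * se_ab)))

# A = 2nd-gen DES, B = 1st-gen DES, C = BMS; the B-vs-C figures are invented.
or_ab, ci_ab = bucher_indirect(0.59, (0.39, 0.89), 0.80, (0.55, 1.16))
print(round(or_ab, 2), tuple(round(x, 2) for x in ci_ab))
```

A full network meta-analysis combines many such contrasts simultaneously and weights direct against indirect evidence, but the log-scale arithmetic is the same.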
Advances in plant gene-targeted and functional markers: a review
2013-01-01
Public genomic databases have provided new directions for molecular marker development and initiated a shift in the types of PCR-based techniques commonly used in plant science. Alongside commonly used arbitrarily amplified DNA markers, other methods have been developed. Targeted fingerprinting marker techniques are based on the well-established practices of arbitrarily amplified DNA methods, but employ novel methodological innovations such as the incorporation of gene or promoter elements in the primers. These markers provide good reproducibility and increased resolution by the concurrent incidence of dominant and co-dominant bands. Despite their promising features, these semi-random markers suffer from possible problems of collision and non-homology analogous to those found with randomly generated fingerprints. Transposable elements, present in abundance in plant genomes, may also be used to generate fingerprints. These markers provide increased genomic coverage by utilizing specific targeted sites and produce bands that mostly seem to be homologous. The biggest drawback with most of these techniques is that prior genomic information about retrotransposons is needed for primer design, prohibiting universal applications. Another class of recently developed methods exploits length polymorphism present in arrays of multi-copy gene families such as cytochrome P450 and β-tubulin genes to provide cross-species amplification and transferability. A specific class of marker makes use of common features of plant resistance genes to generate bands linked to a given phenotype, or to reveal genetic diversity. Conserved DNA-based strategies have limited genome coverage and may fail to reveal genetic diversity, while resistance genes may be under specific evolutionary selection. Markers may also be generated from functional and/or transcribed regions of the genome using different gene-targeting approaches coupled with the use of RNA information. 
Such techniques have the potential to generate phenotypically linked functional markers, especially when fingerprints are generated from the transcribed or expressed region of the genome. It is to be expected that these recently developed techniques will generate larger datasets, but their shortcomings should also be acknowledged and carefully investigated. PMID:23406322
A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing
NASA Technical Reports Server (NTRS)
Takaki, Mitsuo; Cavalcanti, Diego; Gheyi, Rohit; Iyoda, Juliano; d'Amorim, Marcelo; Prudencio, Ricardo
2009-01-01
The complexity of constraints is a major obstacle for constraint-based software verification. Automatic constraint solvers are fundamentally incomplete: input constraints often build on some undecidable theory or some theory the solver does not support. This paper proposes and evaluates several randomized solvers to address this issue. We compare the effectiveness of a symbolic solver (CVC3), a random solver, three hybrid solvers (i.e., mixes of random and symbolic), and two heuristic search solvers. We evaluate the solvers on two benchmarks: one consisting of manually generated constraints and another generated with a concolic execution of 8 subjects. In addition to fully decidable constraints, the benchmarks include constraints with non-linear integer arithmetic, integer modulo and division, bitwise arithmetic, and floating-point arithmetic. As expected, symbolic solving (in particular, CVC3) subsumes the other solvers for the concolic execution of subjects that only generate decidable constraints. For the remaining subjects the solvers are complementary.
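The simplest of the compared approaches, a pure random solver, just samples candidate assignments until the constraint holds, which sidesteps decidability entirely at the cost of completeness. A minimal sketch (the constraint, variable ranges, and names are invented; the paper's solvers and benchmarks are more elaborate):

```python
import random

def random_solver(constraint, var_ranges, tries=100000, rng=None):
    """Pure random solving: sample integer assignments uniformly from the
    given ranges until `constraint` is satisfied or the budget runs out."""
    rng = rng or random.Random(0)
    for _ in range(tries):
        assignment = {v: rng.randint(lo, hi) for v, (lo, hi) in var_ranges.items()}
        if constraint(assignment):
            return assignment
    return None  # inconclusive: the constraint may still be satisfiable

# A nonlinear constraint with modulo, awkward for many symbolic theories:
sat = random_solver(lambda a: a["x"] * a["x"] % 7 == 2 and a["x"] > a["y"],
                    {"x": (0, 50), "y": (0, 50)})
print(sat)
```

Hybrid solvers in the paper's sense would first simplify the constraint symbolically and only randomize over the residual undecidable part.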
NASA Astrophysics Data System (ADS)
Brask, Jonatan Bohr; Martin, Anthony; Esposito, William; Houlmann, Raphael; Bowles, Joseph; Zbinden, Hugo; Brunner, Nicolas
2017-05-01
An approach to quantum random number generation based on unambiguous quantum state discrimination is developed. We consider a prepare-and-measure protocol, where two nonorthogonal quantum states can be prepared, and a measurement device aims at unambiguously discriminating between them. Because the states are nonorthogonal, this necessarily leads to a minimal rate of inconclusive events whose occurrence must be genuinely random and which provide the randomness source that we exploit. Our protocol is semi-device-independent in the sense that the output entropy can be lower bounded based on experimental data and a few general assumptions about the setup alone. It is also practically relevant, which we demonstrate by realizing a simple optical implementation, achieving a rate of 16.5 Mbit/s. Combining ease of implementation, a high rate, and real-time entropy estimation, our protocol represents a promising approach intermediate between fully device-independent protocols and commercial quantum random number generators.
Secure self-calibrating quantum random-bit generator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fiorentino, M.; Santori, C.; Spillane, S. M.
2007-03-15
Random-bit generators (RBGs) are key components of a variety of information processing applications ranging from simulations to cryptography. In particular, cryptographic systems require 'strong' RBGs that produce high-entropy bit sequences, but traditional software pseudo-RBGs have very low entropy content and therefore are relatively weak for cryptography. Hardware RBGs yield entropy from chaotic or quantum physical systems and therefore are expected to exhibit high entropy, but in current implementations their exact entropy content is unknown. Here we report a quantum random-bit generator (QRBG) that harvests entropy by measuring single-photon and entangled two-photon polarization states. We introduce and implement a quantum tomographic method to measure a lower bound on the 'min-entropy' of the system, and we employ this value to distill a truly random-bit sequence. This approach is secure: even if an attacker takes control of the source of optical states, a secure random sequence can be distilled.
Automated segmentation of dental CBCT image with prior-guided sequential random forests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Li; Gao, Yaozong; Shi, Feng
Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of the patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT image is an essential step to generate 3D models for the diagnosis and treatment planning of the patients with CMF deformities. However, due to the image artifacts caused by beam hardening, imaging noise, inhomogeneity, truncation, and maximal intercuspation, it is difficult to segment the CBCT. Methods: In this paper, the authors present a new automatic segmentation method to address these problems. Specifically, the authors first employ a majority voting method to estimate the initial segmentation probability maps of both mandible and maxilla based on multiple aligned expert-segmented CBCT images. These probability maps provide an important prior guidance for CBCT segmentation. The authors then extract both the appearance features from CBCTs and the context features from the initial probability maps to train the first-layer of random forest classifier that can select discriminative features for segmentation. Based on the first-layer of trained classifier, the probability maps are updated, which will be employed to further train the next layer of random forest classifier. By iteratively training the subsequent random forest classifier using both the original CBCT features and the updated segmentation probability maps, a sequence of classifiers can be derived for accurate segmentation of CBCT images. Results: Segmentation results on CBCTs of 30 subjects were both quantitatively and qualitatively validated based on manually labeled ground truth. The average Dice ratios of mandible and maxilla by the authors' method were 0.94 and 0.91, respectively, which are significantly better than the state-of-the-art method based on sparse representation (p-value < 0.001).
Conclusions: The authors have developed and validated a novel fully automated method for CBCT segmentation.
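The iterative "auto-context" cascade described above (train a classifier, feed its probability map back as an extra feature, retrain) can be sketched as follows. All data here is synthetic and the feature set is a stand-in, not the authors' actual CBCT pipeline.

```python
# Sketch of an auto-context random-forest cascade on synthetic "voxels".
# Requires scikit-learn; features and layer count are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_voxels = 2000
appearance = rng.normal(size=(n_voxels, 5))          # intensity-based features
labels = (appearance[:, 0] + 0.5 * appearance[:, 1] > 0).astype(int)

# Layer 0: a crude prior probability map (e.g., from majority voting of atlases)
prob_map = np.full(n_voxels, labels.mean())

for layer in range(3):
    # Context feature = current probability estimate at each voxel
    features = np.column_stack([appearance, prob_map])
    clf = RandomForestClassifier(n_estimators=50, random_state=layer)
    clf.fit(features, labels)
    prob_map = clf.predict_proba(features)[:, 1]     # updated probability map

pred = (prob_map > 0.5).astype(int)
dice = 2 * np.sum((pred == 1) & (labels == 1)) / (pred.sum() + labels.sum())
print(round(dice, 2))
```

In the real method each layer would also see spatial context features sampled around each voxel; here the probability map alone illustrates the feedback loop.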
Impact of Postapproval Evidence Generation on the Biopharmaceutical Industry.
Milne, Christopher-Paul; Cohen, Joshua P; Felix, Abigail; Chakravarthy, Ranjana
2015-08-01
Meeting marketplace demands for proving the value of new products requires more data than the industry has routinely produced. These data include evidence from comparative effectiveness research (CER), including randomized, controlled trials; pragmatic clinical trials; observational studies; and meta-analyses. We designed and conducted a survey to examine the industry's perceptions of new data requirements regarding CER evidence, the acceptability of postapproval study types, payer-specific issues related to CER, communication of data being generated postapproval, and methods used for facilitating postapproval evidence generation. CER is being used by payers for most types of postapproval decisions. Randomized, controlled trials were indicated as the most acceptable form of evidence. At the same time, there was support for the utility of other types of studies, such as pragmatic clinical trials and observational studies. Respondents indicated the use of multiple formats for communicating postapproval data with many different stakeholders, including regulators, payers, providers, and patients. Risk-sharing agreements with payers were unanimously supported by respondents with regard to certain products with unclear clinical and economic outcomes at launch. In these instances, conditional reimbursement through coverage with evidence development was considered a constructive option. The Food and Drug Administration's Regulatory Science initiative was considered by respondents to have the most impact on streamlining the generation of postapproval research-related evidence. The biopharmaceutical industry is faced with a broad and complex set of challenges related to evidence generation for postapproval decisions by a variety of health care system stakeholders. Uncertainty remains as to how the industry and payers use postapproval studies to guide decision making with regard to pricing and reimbursement status.
Correspondingly, there is uncertainty regarding whether the industry's investment in CER will have a positive return on investment in terms of reimbursement and market access. Copyright © 2015 Elsevier HS Journals, Inc. All rights reserved.
Gabora, Liane; Kauffman, Stuart
2016-04-01
Dietrich and Haider (Psychonomic Bulletin & Review, 21 (5), 897-915, 2014) justify their integrative framework for creativity founded on evolutionary theory and prediction research on the grounds that "theories and approaches guiding empirical research on creativity have not been supported by the neuroimaging evidence." Although this justification is controversial, the general direction holds promise. This commentary clarifies points of disagreement and unresolved issues, and addresses mis-applications of evolutionary theory that lead the authors to adopt a Darwinian (versus Lamarckian) approach. To say that creativity is Darwinian is not to say that it consists of variation plus selection - in the everyday sense of the term - as the authors imply; it is to say that evolution is occurring because selection is affecting the distribution of randomly generated heritable variation across generations. In creative thought the distribution of variants is not key, i.e., one is not inclined toward idea A because 60 % of one's candidate ideas are variants of A while only 40 % are variants of B; one is inclined toward whichever seems best. The authors concede that creative variation is partly directed; however, the greater the extent to which variants are generated non-randomly, the greater the extent to which the distribution of variants can reflect not selection but the initial generation bias. Since each thought in a creative process can alter the selective criteria against which the next is evaluated, there is no demarcation into generations as assumed in a Darwinian model. We address the authors' claim that reduced variability and individuality are more characteristic of Lamarckism than Darwinian evolution, and note that a Lamarckian approach to creativity has addressed the challenge of modeling the emergent features associated with insight.
40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.
Code of Federal Regulations, 2013 CFR
2013-07-01
... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...
40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.
Code of Federal Regulations, 2011 CFR
2011-07-01
... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...
40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.
Code of Federal Regulations, 2010 CFR
2010-07-01
... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...
40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.
Code of Federal Regulations, 2014 CFR
2014-07-01
... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...
40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.
Code of Federal Regulations, 2012 CFR
2012-07-01
... generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...
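The sampling scheme excerpted in the CFR sections above (divide the surface into square grid cells, then select sample locations by random number generation) can be sketched as follows. The dimensions, cell size, and sample count are illustrative, not values from the regulation.

```python
# Sketch of grid-based random sample selection on a rectangular surface.
import random

def grid_sample_points(width, height, cell, n_samples, seed=None):
    """Divide a width x height surface into square cells of side `cell`
    and select `n_samples` distinct cells by random number generation.
    Returns the (column, row) index of each selected cell."""
    rnd = random.Random(seed)
    cols = -(-width // cell)    # ceiling division: partial cells count too
    rows = -(-height // cell)
    cells = [(c, r) for r in range(rows) for c in range(cols)]
    return rnd.sample(cells, n_samples)

points = grid_sample_points(width=10, height=6, cell=2, n_samples=4, seed=1)
print(points)
```

Each returned index pair identifies one grid square to be wipe-sampled; multiplying by the cell side recovers physical coordinates.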
Accelerating Pseudo-Random Number Generator for MCNP on GPU
NASA Astrophysics Data System (ADS)
Gong, Chunye; Liu, Jie; Chi, Lihua; Hu, Qingfeng; Deng, Li; Gong, Zhenghu
2010-09-01
Pseudo-random number generators (PRNGs) are intensively used in many stochastic algorithms in particle simulations, artificial neural networks, and other scientific computations. The PRNG in the Monte Carlo N-Particle Transport Code (MCNP) must have a long period and high quality, support flexible jumps, and be fast enough. In this paper, we implement such a PRNG for MCNP on NVIDIA's GTX200 Graphics Processing Units (GPUs) using the CUDA programming model. Results show that speedups of 3.80 to 8.10 times are achieved compared with 4- to 6-core CPUs, and more than 679.18 million double-precision random numbers can be generated per second on the GPU.
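The "flexible jump" requirement means the generator can skip ahead k steps in O(log k) time, so each particle history (or GPU thread) gets its own independent substream. A minimal sketch with a linear congruential generator follows; the constants are illustrative, not MCNP's actual parameters.

```python
# Jump-ahead for an LCG x -> (A*x + C) mod M via square-and-multiply
# on the affine map. Constants are illustrative, not MCNP's.
M = 2 ** 63
A = 3512401965023503517   # odd multiplier, A % 4 == 1 (full period with odd C)
C = 1                     # increment

def lcg_next(s):
    return (A * s + C) % M

def lcg_jump(s, k):
    """Advance the generator k steps in O(log k) time by composing
    the affine step map with itself (square-and-multiply)."""
    a, c = 1, 0               # accumulated map, starts as the identity
    mul, add = A, C           # current 2^i-step map
    while k:
        if k & 1:
            a = (a * mul) % M
            c = (c * mul + add) % M
        add = ((mul + 1) * add) % M   # double the step map
        mul = (mul * mul) % M
        k >>= 1
    return (a * s + c) % M

# Each history/thread i starts at its own stride-separated substream.
seed, stride = 1, 10 ** 7
starts = [lcg_jump(seed, i * stride) for i in range(4)]
print(len(set(starts)) == 4)
```

The same composition trick is what lets a GPU implementation hand out millions of non-overlapping streams without iterating the generator serially.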
Digital-Analog Hybrid Scheme and Its Application to Chaotic Random Number Generators
NASA Astrophysics Data System (ADS)
Yuan, Zeshi; Li, Hongtao; Miao, Yunchi; Hu, Wen; Zhu, Xiaohua
2017-12-01
Practical random number generation (RNG) circuits are typically achieved with analog devices or digital approaches. Digital-based techniques, which use field-programmable gate arrays (FPGAs), graphics processing units (GPUs), etc., usually perform better than analog methods, as they are programmable, efficient, and robust. However, digital realizations suffer from the effect of finite precision. Accordingly, the generated random numbers (RNs) are actually periodic instead of truly random. To tackle this limitation, in this paper we propose a novel digital-analog hybrid scheme that employs a digital unit as the main body and a minimum of analog devices to generate physical RNs. Moreover, the possibility of realizing the proposed scheme with only one memory element is discussed. Without loss of generality, we use a capacitor and a memristor along with an FPGA to construct the proposed hybrid system, and a chaotic true random number generator (TRNG) circuit is realized, producing physical RNs at a throughput on the Gbit/s scale. These RNs successfully pass all the tests in the NIST SP800-22 package, confirming the significance of the scheme in practical applications. In addition, the use of this new scheme is not restricted to RNGs; it also provides a strategy for mitigating the effect of finite precision in other digital systems.
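The finite-precision problem the authors address can be demonstrated directly: a digitally implemented chaotic map with a b-bit state can only visit finitely many states, so it must eventually cycle. The sketch below (not the authors' circuit; the map and precision are illustrative) detects that cycle.

```python
# A logistic map rounded to b fractional bits must revisit a state:
# at most 2^b + 1 states exist, so the orbit is eventually periodic.
def logistic_fixed(x, bits):
    """One logistic-map step rounded to `bits` fractional bits."""
    scale = 1 << bits
    return round(3.99 * x * (1 - x) * scale) / scale

def cycle_length(x0, bits, max_iter=100000):
    """Iterate until a state repeats; return the period of the cycle."""
    seen = {}
    x = x0
    for i in range(max_iter):
        if x in seen:
            return i - seen[x]
        seen[x] = i
        x = logistic_fixed(x, bits)
    return None

cl = cycle_length(0.25, bits=10)   # low precision -> short cycle
print(cl)
```

A hybrid scheme avoids this by injecting analog (physical) perturbations each step, so the digital orbit never closes; that injection is the part the capacitor/memristor supplies in the paper's circuit.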
Anonymous authenticated communications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beaver, Cheryl L; Schroeppel, Richard C; Snyder, Lillian A
2007-06-19
A method of performing electronic communications between members of a group wherein the communications are authenticated as being from a member of the group and have not been altered, comprising: generating a plurality of random numbers; distributing in a digital medium the plurality of random numbers to the members of the group; publishing a hash value of contents of the digital medium; distributing to the members of the group public-key-encrypted messages each containing a same token comprising a random number; and encrypting a message with a key generated from the token and the plurality of random numbers.
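A minimal sketch of the flow in the claim above, using off-the-shelf primitives (SHA-256, HMAC): the patent text does not specify these exact algorithms, so treat every concrete choice here as an assumption.

```python
# Sketch: pool of distributed random numbers + published hash commitment
# + shared token -> group authentication key. Primitives are assumptions.
import hashlib
import hmac
import secrets

# 1. Generate a pool of random numbers and distribute it to the group.
pool = [secrets.token_bytes(32) for _ in range(8)]

# 2. Publish a hash of the distributed medium's contents (a commitment
#    that lets members check their copy was not altered).
medium_hash = hashlib.sha256(b"".join(pool)).hexdigest()

# 3. Distribute the same secret token to all members (per-member
#    public-key encryption of the token is elided here).
token = secrets.token_bytes(32)

# 4. Derive a key from the token and the pool, then authenticate a
#    message with it; any member can verify, outsiders cannot forge.
key = hashlib.sha256(token + b"".join(pool)).digest()
msg = b"group announcement"
tag = hmac.new(key, msg, hashlib.sha256).hexdigest()
print(medium_hash[:16], tag[:16])
```

Because every member holds the same pool and token, the MAC proves "some member of the group" authored the message without identifying which one, matching the anonymity goal.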
NullSeq: A Tool for Generating Random Coding Sequences with Desired Amino Acid and GC Contents.
Liu, Sophia S; Hockenberry, Adam J; Lancichinetti, Andrea; Jewett, Michael C; Amaral, Luís A N
2016-11-01
The existence of over- and under-represented sequence motifs in genomes provides evidence of selective evolutionary pressures on biological mechanisms such as transcription, translation, ligand-substrate binding, and host immunity. In order to accurately identify motifs and other genome-scale patterns of interest, it is essential to be able to generate accurate null models that are appropriate for the sequences under study. While many tools have been developed to create random nucleotide sequences, protein coding sequences are subject to a unique set of constraints that complicates the process of generating appropriate null models. There are currently no tools available that allow users to create random coding sequences with specified amino acid composition and GC content for the purpose of hypothesis testing. Using the principle of maximum entropy, we developed a method that generates unbiased random sequences with pre-specified amino acid and GC content, which we have developed into a Python package. Our method is the simplest way to obtain maximally unbiased random sequences that are subject to GC usage and primary amino acid sequence constraints. Furthermore, this approach can easily be expanded to create unbiased random sequences that incorporate more complicated constraints such as individual nucleotide usage or even di-nucleotide frequencies. The ability to generate correctly specified null models will allow researchers to accurately identify sequence motifs which will lead to a better understanding of biological processes as well as more effective engineering of biological systems.
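The core idea (fix the amino acid sequence exactly while tilting GC content toward a target) can be illustrated by weighting synonymous codons with a Boltzmann factor on their G+C count. This is a toy version: NullSeq's actual maximum-entropy fitting is more principled, and the codon table below is a small subset for illustration only.

```python
# Toy constrained null model: sample synonymous codons with weight
# exp(beta * GC(codon)); beta > 0 raises GC content, beta < 0 lowers it,
# while the encoded protein is unchanged. Subset codon table, for demo.
import math
import random

CODONS = {                     # amino acid -> synonymous codons (subset)
    "L": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
    "A": ["GCT", "GCC", "GCA", "GCG"],
    "G": ["GGT", "GGC", "GGA", "GGG"],
    "K": ["AAA", "AAG"],
}

def gc(seq):
    return sum(base in "GC" for base in seq)

def sample_cds(protein, beta, rnd):
    """Random coding sequence for `protein`, GC-tilted by beta."""
    out = []
    for aa in protein:
        codons = CODONS[aa]
        weights = [math.exp(beta * gc(c)) for c in codons]
        out.append(rnd.choices(codons, weights=weights)[0])
    return "".join(out)

rnd = random.Random(0)
protein = "LAGK" * 50
low = sample_cds(protein, beta=-2.0, rnd=rnd)
high = sample_cds(protein, beta=+2.0, rnd=rnd)
gc_frac = lambda s: gc(s) / len(s)
print(round(gc_frac(low), 2), round(gc_frac(high), 2))
```

In the real tool, beta would be solved for numerically so that the expected GC content matches the user's target, rather than being chosen by hand.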
NASA Astrophysics Data System (ADS)
Tsao, Shih-Ming; Lai, Ji-Ching; Horng, Horng-Er; Liu, Tu-Chen; Hong, Chin-Yih
2017-04-01
Aptamers are oligonucleotides that can bind to specific target molecules. Most aptamers are generated using random libraries in the standard systematic evolution of ligands by exponential enrichment (SELEX). Each random library contains oligonucleotides with a randomized central region and two fixed primer regions at both ends. The fixed primer regions are necessary for amplifying target-bound sequences by PCR. However, these extra sequences may cause non-specific binding, which can interfere with good binding by the random sequences. Magnetic-Assisted Rapid Aptamer Selection (MARAS) is a newly developed protocol for generating single-stranded DNA aptamers; no repeated selection cycles are required in the protocol. This study proposes and demonstrates a method to isolate aptamers for C-reactive protein (CRP) from a randomized ssDNA library containing no fixed sequences at the 5′ and 3′ termini using the MARAS platform. Furthermore, the isolated primer-free aptamer was sequenced, and its binding affinity for CRP was analyzed. The specificity of the obtained aptamer was validated using blind serum samples. The result was consistent with monoclonal antibody-based nephelometry analysis, which indicated that the primer-free aptamer has high specificity toward its target. MARAS is a feasible platform for efficiently generating primer-free aptamers for clinical diagnoses.
Ditcharles, Sébastien; Yiou, Eric; Delafontaine, Arnaud; Hamaoui, Alain
2017-01-01
Speed performance during gait initiation is known to be dependent on the capacity of the central nervous system to generate efficient anticipatory postural adjustments (APA). According to the posturo-kinetic capacity (PKC) concept, any factor enhancing postural chain mobility, and especially spine mobility, may facilitate the development of APA and thus speed performance. “Spinal Manipulative Therapy High-Velocity, Low-Amplitude” (SMT-HVLA) is a healing technique applied to the spine which is routinely used by healthcare practitioners to improve spine mobility. As such, it may have a positive effect on the PKC and therefore facilitate gait initiation. The present study aimed to investigate the short-term effect of thoracic SMT-HVLA on spine mobility, APA and speed performance during gait initiation. Healthy young adults (n = 22) performed a series of gait initiation trials on a force plate before (“pre-manipulation” condition) and after (“post-manipulation” condition) a sham manipulation or an HVLA manipulation applied to the ninth thoracic vertebra (T9). Participants were randomly assigned to the sham (n = 11) or the HVLA group (n = 11). The spine range of motion (ROM) was assessed in each participant immediately after the sham or HVLA manipulation using inclinometers. The results showed that the maximal thoracic flexion increased in the HVLA group after the manipulation, which was not the case in the sham group. In the HVLA group, results further showed that each of the following gait initiation variables reached a significantly lower mean value in the post-manipulation condition as compared to the pre-manipulation condition: APA duration, peak anticipatory backward center-of-pressure displacement, center-of-gravity velocity at foot-off, mechanical efficiency of APA, peak center-of-gravity velocity and step length.
In contrast, for the sham group, results showed that none of the gait initiation variables significantly differed between the pre- and post-manipulation conditions. It is concluded that HVLA manipulation applied to T9 has an immediate beneficial effect on spine mobility but a detrimental effect on APA development and speed performance during gait initiation. We suggest that a neural effect induced by SMT-HVLA, possibly mediated by a transient alteration in the early sensory-motor integration, might have masked the potential mechanical benefits associated with increased spine mobility. PMID:28713254
He, Yi; Xiao, Yi; Liwo, Adam; Scheraga, Harold A
2009-10-01
We explored the energy-parameter space of our coarse-grained UNRES force field for large-scale ab initio simulations of protein folding, to obtain good initial approximations for hierarchical optimization of the force field with new virtual-bond-angle bending and side-chain-rotamer potentials which we recently introduced to replace the statistical potentials. 100 sets of energy-term weights were generated randomly, and good sets were selected by carrying out replica-exchange molecular dynamics simulations of two peptides with a minimal alpha-helical and a minimal beta-hairpin fold, respectively: the tryptophan cage (PDB code: 1L2Y) and tryptophan zipper (PDB code: 1LE1). Eight sets of parameters produced native-like structures of these two peptides. These eight sets were tested on two larger proteins: the engrailed homeodomain (PDB code: 1ENH) and FBP WW domain (PDB code: 1E0L); two sets were found to produce native-like conformations of these proteins. These two sets were tested further on a larger set of nine proteins with alpha or alpha + beta structure and found to locate native-like structures of most of them. These results demonstrate that, in addition to finding reasonable initial starting points for optimization, an extensive search of parameter space is a powerful method to produce a transferable force field. Copyright 2009 Wiley Periodicals, Inc.
Chayapathi, Varsha; Kalra, Manas; Bakshi, Anita S; Mahajan, Amita
2018-05-04
Both ketamine-midazolam and propofol are frequently used in pediatric oncology units for procedural sedation. However, there are no prospective, randomized comparative trials (RCTs) comparing the two regimens when the procedure is performed by nonanesthesiologists. To compare ketamine + midazolam (group A) and propofol (group B) as sedative agents for intrathecal chemotherapy with regard to efficacy, side effects, time to induction, time to recovery, and smoothness of recovery. A partially blinded RCT was conducted between August 2015 and March 2017 after gaining institutional ethics committee approval. Children aged 1-12 years requiring intravenous sedation for intrathecal chemotherapy were included. Patients were allocated to two treatment arms using computer-generated randomization tables, after obtaining written consent. The initial doses used were: ketamine 2 mg/kg, midazolam 0.2 mg/kg, and propofol 2.5 mg/kg, as per standard recommendations. The patient, parents, and person analyzing the data were blinded. Time to sedation, dose required, depth of sedation, vital parameters, time and smoothness of recovery, and emergence phenomena were documented. We enrolled 152 patients (76 each in groups A and B). Nine patients had a failure of sedation (all in group B). Mean time to sedation and recovery was shorter in group B (P < 0.001). Transient drops in saturation were more frequent in group B, without statistical significance (P = 0.174). Mean depth of sedation was greater in group A (P < 0.001). Emergence symptoms were more frequently experienced in group A (P < 0.001). The ketamine-midazolam combination is safer and more effective. Propofol is faster in onset and recovery and has smoother emergence, but poor efficacy at the recommended initial doses. © 2018 Wiley Periodicals, Inc.
A Comparative Study of Random Patterns for Digital Image Correlation
NASA Astrophysics Data System (ADS)
Stoilov, G.; Kavardzhikov, V.; Pashkouleva, D.
2012-06-01
Digital Image Correlation (DIC) is a computer-based image analysis technique utilizing random patterns, which finds applications in the experimental mechanics of solids and structures. In this paper, a comparative study of three simulated random patterns is presented. One of them is generated according to a new algorithm introduced by the authors. A criterion for the quantitative evaluation of random patterns, based on the calculation of their autocorrelation functions, is introduced. The patterns' deformations are simulated numerically and realized experimentally. The displacements are measured by using the DIC method. Tensile tests are performed after printing the generated random patterns on the surfaces of standard iron sheet specimens. It is found that the newly designed random pattern retains relatively good quality up to 20% deformation.
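An autocorrelation-based criterion of the kind described above can be sketched numerically: patterns whose normalized autocorrelation decays quickly (a narrow central peak) localize displacements better in DIC. The 0.5 threshold and the two synthetic patterns below are illustrative choices, not the paper's exact metric.

```python
# Compare two speckle patterns by the width of their normalized
# autocorrelation peak (computed via FFT, Wiener-Khinchin theorem).
import numpy as np

def autocorr(pattern):
    """Normalized circular autocorrelation, zero lag shifted to center."""
    p = pattern - pattern.mean()
    f = np.fft.fft2(p)
    ac = np.fft.ifft2(f * np.conj(f)).real
    ac /= ac[0, 0]                       # normalize by zero-lag value
    return np.fft.fftshift(ac)

def peak_width(ac, level=0.5):
    """Number of lag pixels where the autocorrelation exceeds `level`."""
    return int((ac > level).sum())

rng = np.random.default_rng(0)
fine = (rng.random((64, 64)) > 0.5).astype(float)     # pixel-scale speckles
coarse = np.kron((rng.random((8, 8)) > 0.5).astype(float),
                 np.ones((8, 8)))                     # 8x8-pixel blobs

w_fine = peak_width(autocorr(fine))
w_coarse = peak_width(autocorr(coarse))
print(w_fine, w_coarse)
```

The fine pattern's peak is essentially a single pixel, while the blob pattern's peak spreads over many lags, so a smaller width score ranks a pattern as better suited for sub-pixel correlation.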
Epigenetic Transgenerational Actions of Vinclozolin on Promoter Regions of the Sperm Epigenome
Guerrero-Bosagna, Carlos; Settles, Matthew; Lucker, Ben; Skinner, Michael K.
2010-01-01
Previous observations have demonstrated that embryonic exposure to the endocrine disruptor vinclozolin during gonadal sex determination promotes transgenerational adult-onset diseases such as male infertility, kidney disease, prostate disease, immune abnormalities and tumor development. The current study investigates genome-wide promoter DNA methylation alterations in the sperm of F3 generation rats whose F0 generation mother was exposed to vinclozolin. A methylated DNA immunoprecipitation with methyl-cytosine antibody followed by a promoter tiling microarray (MeDIP-Chip) procedure was used to identify 52 different regions with statistically significant altered methylation in the sperm promoter epigenome. Mass spectrometry bisulfite analysis was used to map the CpG DNA methylation, and 16 differential DNA methylation regions were confirmed, while the remainder could not be analyzed due to bisulfite technical limitations. Analysis of these validated regions identified a consensus DNA sequence (motif) that was associated with 75% of the promoters. Interestingly, only 16.8% of a random set of 125 promoters contained this motif. One candidate promoter (Fam111a) was found to be due to a copy number variation (CNV) and not a methylation change, suggesting initial alterations in the germline epigenome may promote genetic abnormalities such as induced CNV in later generations. This study identifies differential DNA methylation sites in promoter regions three generations after the initial exposure and identifies common genome features present in these regions. In addition to primary epimutations, a potential indirect genetic abnormality was identified, and both are postulated to be involved in the epigenetic transgenerational inheritance observed. This study confirms that an environmental agent has the ability to induce epigenetic transgenerational changes in the sperm epigenome. PMID:20927350
Shultz, Mary
2006-01-01
Introduction: Given the common use of acronyms and initialisms in the health sciences, searchers may be entering these abbreviated terms rather than full phrases when searching online systems. The purpose of this study is to evaluate how various MEDLINE Medical Subject Headings (MeSH) interfaces map acronyms and initialisms to the MeSH vocabulary. Methods: The interfaces used in this study were: the PubMed MeSH database, the PubMed Automatic Term Mapping feature, the NLM Gateway Term Finder, and Ovid MEDLINE. Acronyms and initialisms were randomly selected from 2 print sources. The test data set included 415 randomly selected acronyms and initialisms whose related meanings were found to be MeSH terms. Each acronym and initialism was entered into each MEDLINE MeSH interface to determine if it mapped to the corresponding MeSH term. Separately, 46 commonly used acronyms and initialisms were tested. Results: While performance differed widely, the success rates were low across all interfaces for the randomly selected terms. The common acronyms and initialisms tested at higher success rates across the interfaces, but the differences between the interfaces remained. Conclusion: Online interfaces do not always map medical acronyms and initialisms to their corresponding MeSH phrases. This may lead to inaccurate results and missed information if acronyms and initialisms are used in search strategies. PMID:17082832
Finite GUE Distribution with Cut-Off at a Shock
NASA Astrophysics Data System (ADS)
Ferrari, P. L.
2018-03-01
We consider the totally asymmetric simple exclusion process with initial conditions generating a shock. The fluctuations of particle positions are asymptotically governed by the randomness around the two characteristic lines joining at the shock. Unlike in previous papers, we describe the correlation in space-time without employing the mapping to last passage percolation, which already fails to exist for the partially asymmetric model. We then consider a special case, where the asymptotic distribution is a cut-off of the distribution of the largest eigenvalue of a finite GUE matrix. Finally, we discuss the strength of the probabilistic and physically motivated approach and compare it with the mathematical difficulties of a direct computation.
Exploration Opportunity Search of Near-earth Objects Based on Analytical Gradients
NASA Astrophysics Data System (ADS)
Ren, Yuan; Cui, Ping-Yuan; Luan, En-Jie
2008-07-01
The problem of searching for launch opportunities for the exploration of near-earth minor objects is investigated. For rendezvous missions, the analytical gradients of the performance index with respect to the free parameters are derived using the calculus of variations and the theory of the state-transition matrix. After randomly generating some initial guesses in the search space, the performance index is optimized, guided by the analytical gradients, leading to the local minimum points representing the potential launch opportunities. This method not only keeps the global-search property of the traditional method, but also avoids the blindness of the latter, thereby greatly increasing the computing speed. Furthermore, with this method, the search precision can be controlled effectively.
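The strategy above (scatter random initial guesses, then run gradient descent with analytical gradients, collecting each local minimum as a candidate opportunity) can be sketched on a stand-in objective. The real performance index, over launch dates and transfer times, is replaced here by a simple multimodal 1-D function with a known gradient.

```python
# Multi-start local optimization guided by an analytical gradient.
# The objective f is a stand-in for the mission performance index.
import math
import random

def f(x):
    return x * x / 20.0 + math.cos(x)

def grad(x):                          # analytical gradient of f
    return x / 10.0 - math.sin(x)

def descend(x, step=0.1, tol=1e-8, iters=10000):
    """Plain gradient descent from a single initial guess."""
    for _ in range(iters):
        g = grad(x)
        if abs(g) < tol:
            break
        x -= step * g
    return x

rnd = random.Random(1)
starts = [rnd.uniform(-10, 10) for _ in range(30)]   # random initial guesses
minima = sorted({round(descend(x), 4) for x in starts})
print(minima)       # distinct local minima = candidate "opportunities"
```

Rounding the converged points before deduplicating collapses starts that fell into the same basin, which mirrors clustering the converged launch windows in the real search.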
SNP selection and classification of genome-wide SNP data using stratified sampling random forests.
Wu, Qingyao; Ye, Yunming; Liu, Yang; Ng, Michael K
2012-09-01
For high-dimensional genome-wide association (GWA) case-control data of complex disease, there is usually a large portion of single-nucleotide polymorphisms (SNPs) that are irrelevant to the disease. A simple random sampling method in a random forest, using the default mtry parameter to choose the feature subspace, will select too many subspaces without informative SNPs. An exhaustive search for an optimal mtry is often required in order to include useful and relevant SNPs and discard the vast number of non-informative SNPs, but it is too time-consuming and therefore unfavorable for high-dimensional GWA data. The main aim of this paper is to propose a stratified sampling method for feature subspace selection to generate decision trees in a random forest for GWA high-dimensional data. Our idea is to design an equal-width discretization scheme for informativeness to divide SNPs into multiple groups. In feature subspace selection, we randomly select the same number of SNPs from each group and combine them to form a subspace to generate a decision tree. This stratified sampling procedure ensures that each subspace contains enough useful SNPs, avoids the very high computational cost of an exhaustive search for an optimal mtry, and maintains the randomness of a random forest. We employ two genome-wide SNP data sets (Parkinson case-control data comprising 408 803 SNPs and Alzheimer case-control data comprising 380 157 SNPs) to demonstrate that the proposed stratified sampling method is effective, and that it can generate a better random forest with higher accuracy and a lower error bound than those produced by Breiman's random forest generation method. For the Parkinson data, we also show some interesting genes identified by the method, which may be associated with neurological disorders and warrant further biological investigation.
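The stratified subspace step can be sketched as follows: discretize a per-SNP informativeness score into equal-width bins, then draw the same number of features from each bin for every tree. The group count, subspace size, and uniform stand-in scores are illustrative choices, not the paper's settings.

```python
# Equal-width stratification of feature scores + per-stratum sampling,
# producing one feature subspace for one decision tree.
import numpy as np

def stratified_subspace(scores, n_groups, per_group, rng):
    """Return feature indices: `per_group` drawn without replacement
    from each of `n_groups` equal-width bins of the scores."""
    edges = np.linspace(scores.min(), scores.max(), n_groups + 1)
    bins = np.clip(np.digitize(scores, edges[1:-1]), 0, n_groups - 1)
    subspace = []
    for g in range(n_groups):
        members = np.flatnonzero(bins == g)
        take = min(per_group, members.size)
        subspace.extend(rng.choice(members, size=take, replace=False))
    return np.array(subspace)

rng = np.random.default_rng(0)
scores = rng.random(10000)          # stand-in informativeness per SNP
sub = stratified_subspace(scores, n_groups=5, per_group=10, rng=rng)
print(sub.size)                     # 50 features, spread across all strata
```

Because every stratum contributes equally, each tree is guaranteed a few high-informativeness SNPs even when they are a tiny fraction of the genome, which is exactly what plain uniform subspace sampling fails to do.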
Single-electron random-number generator (RNG) for highly secure ubiquitous computing applications
NASA Astrophysics Data System (ADS)
Uchida, Ken; Tanamoto, Tetsufumi; Fujita, Shinobu
2007-11-01
Since the security of all modern cryptographic techniques relies on unpredictable and irreproducible digital keys generated by random-number generators (RNGs), the realization of high-quality RNG is essential for secure communications. In this report, a new RNG, which utilizes single-electron phenomena, is proposed. A room-temperature operating silicon single-electron transistor (SET) having nearby an electron pocket is used as a high-quality, ultra-small RNG. In the proposed RNG, stochastic single-electron capture/emission processes to/from the electron pocket are detected with high sensitivity by the SET, and result in giant random telegraphic signals (GRTS) on the SET current. It is experimentally demonstrated that the single-electron RNG generates extremely high-quality random digital sequences at room temperature, in spite of its simple configuration. Because of its small-size and low-power properties, the single-electron RNG is promising as a key nanoelectronic device for future ubiquitous computing systems with highly secure mobile communication capabilities.
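One plausible way to turn a two-level random telegraph signal like the GRTS into output bits is to threshold and sample the waveform, then debias the resulting stream. The simulation and the von Neumann post-processing below are illustrative assumptions; the abstract does not describe the device's actual bit-extraction circuitry.

```python
# Simulated random telegraph signal -> bits, with von Neumann debiasing
# to remove bias from asymmetric capture/emission rates.
import random

def telegraph(n, p_up=0.3, rnd=None):
    """Stand-in two-level signal: each sample is high with prob p_up."""
    rnd = rnd or random.Random()
    return [1 if rnd.random() < p_up else 0 for _ in range(n)]

def von_neumann(bits):
    """Map non-overlapping pairs: 01 -> 0, 10 -> 1, discard 00 and 11."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

raw = telegraph(20000, p_up=0.3, rnd=random.Random(42))
unbiased = von_neumann(raw)
bias = abs(sum(unbiased) / len(unbiased) - 0.5)
print(len(unbiased), round(bias, 3))
```

Von Neumann's trick exploits the fact that, for independent samples, the pairs 01 and 10 are equally likely regardless of the underlying bias, at the cost of discarding most of the raw samples.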
NASA Astrophysics Data System (ADS)
Srinivas, Kadivendi; Vundavilli, Pandu R.; Manzoor Hussain, M.; Saiteja, M.
2016-09-01
Welding input parameters such as current, gas flow rate and torch angle play a significant role in determining the qualitative mechanical properties of a weld joint. Traditionally, it is necessary to determine the weld input parameters for every new welded product to obtain a quality weld joint, which is time consuming. In the present work, the effect of plasma arc welding parameters on mild steel was studied using a neural network approach. To obtain a response equation that governs the input-output relationships, conventional regression analysis was also performed. The experimental data were constructed based on a Taguchi design, and the training data required for the neural networks were randomly generated by varying the input variables within their respective ranges. The responses were calculated for each combination of input variables by using the response equations obtained through the conventional regression analysis. The performance of a Levenberg-Marquardt back-propagation neural network and a radial basis neural network (RBNN) was compared on various randomly generated test cases, which are different from the training cases. From the results, it is interesting to note that for these test cases the RBNN analysis gave improved results compared to the feed-forward back-propagation neural network analysis. Also, the RBNN analysis showed a pattern of increasing performance as the data points moved away from the initial input values.
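A minimal radial-basis network of the kind compared above can be sketched directly: Gaussian basis functions centered on randomly generated training inputs, with output weights fit by linear least squares. The "response equation" here is a stand-in; the paper's actual regression model and welding variables are not reproduced.

```python
# Radial basis network fit by least squares on randomly generated
# training data, then evaluated on unseen random test cases.
import numpy as np

rng = np.random.default_rng(0)

def response(x):                     # stand-in regression response equation
    return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2

X = rng.uniform(-2, 2, size=(200, 2))       # random training inputs
y = response(X)

centers = X[rng.choice(len(X), 30, replace=False)]   # RBF centers
width = 1.0

def design(X):
    """Gaussian basis matrix: one column per RBF center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

w, *_ = np.linalg.lstsq(design(X), y, rcond=None)    # output weights

X_test = rng.uniform(-2, 2, size=(100, 2))           # unseen test cases
err = np.abs(design(X_test) @ w - response(X_test)).mean()
print(round(err, 3))
```

Because only the linear output layer is trained, fitting reduces to one least-squares solve, which is part of why RBF networks train quickly compared with iterative back-propagation.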
NASA Astrophysics Data System (ADS)
Cao, Xiangyu; Fyodorov, Yan V.; Le Doussal, Pierre
2018-02-01
We address systematically an apparent nonphysical behavior of the free-energy moment generating function for several instances of the logarithmically correlated models: the fractional Brownian motion with Hurst index H =0 (fBm0) (and its bridge version), a one-dimensional model appearing in decaying Burgers turbulence with log-correlated initial conditions and, finally, the two-dimensional log-correlated random-energy model (logREM) introduced in Cao et al. [Phys. Rev. Lett. 118, 090601 (2017), 10.1103/PhysRevLett.118.090601] based on the two-dimensional Gaussian free field with background charges and directly related to the Liouville field theory. All these models share anomalously large fluctuations of the associated free energy, with a variance proportional to the log of the system size. We argue that a seemingly nonphysical vanishing of the moment generating function for some values of parameters is related to the termination point transition (i.e., prefreezing). We study the associated universal log corrections in the frozen phase, both for logREMs and for the standard REM, filling a gap in the literature. For the above-mentioned integrable instances of logREMs, we predict the nontrivial free-energy cumulants describing non-Gaussian fluctuations on top of the Gaussian with extensive variance. Some of the predictions are tested numerically.
DNA/RNA hybrid substrates modulate the catalytic activity of purified AID.
Abdouni, Hala S; King, Justin J; Ghorbani, Atefeh; Fifield, Heather; Berghuis, Lesley; Larijani, Mani
2018-01-01
Activation-induced cytidine deaminase (AID) converts cytidine to uridine at immunoglobulin (Ig) loci, initiating somatic hypermutation and class switching of antibodies. In vitro, AID acts on single-stranded DNA (ssDNA) but not on double-stranded DNA (dsDNA) oligonucleotides or RNA, and it is believed that transcription is the in vivo generator of the ssDNA targeted by AID. It is also known that the Ig loci, particularly the switch (S) regions targeted by AID, are rich in transcription-generated DNA/RNA hybrids. Here, we examined the binding and catalytic behavior of purified AID on DNA/RNA hybrid substrates bearing either random sequences or GC-rich sequences simulating Ig S regions. For substrates made up of a random sequence, AID preferred substrates composed entirely of DNA over DNA/RNA hybrids. In contrast, for substrates composed of S region sequences, AID preferred to mutate DNA/RNA hybrids over substrates composed entirely of DNA. Accordingly, AID exhibited a significantly higher binding affinity for DNA/RNA hybrid substrates composed specifically of S region sequences than for any other substrate composed of DNA. Thus, in the absence of any other cellular processes or factors, AID itself favors binding and mutating DNA/RNA hybrids composed of S region sequences. AID:DNA/RNA complex formation and supporting mutational analyses suggest that recognition of DNA/RNA hybrids is an inherent structural property of AID. Copyright © 2017 Elsevier Ltd. All rights reserved.
Regional SAR Image Segmentation Based on Fuzzy Clustering with Gamma Mixture Model
NASA Astrophysics Data System (ADS)
Li, X. L.; Zhao, Q. H.; Li, Y.
2017-09-01
Most stochastic fuzzy clustering algorithms are pixel-based and cannot effectively overcome the inherent speckle noise in SAR images. To deal with this problem, a regional SAR image segmentation algorithm based on fuzzy clustering with a Gamma mixture model is proposed in this paper. First, some generating points are initialized randomly on the image, and the image domain is divided into many sub-regions using the Voronoi tessellation technique. Each sub-region is regarded as a homogeneous area in which the pixels share the same cluster label. Then, the probability of each pixel is assumed to follow a Gamma mixture model with parameters corresponding to the cluster to which the pixel belongs. The negative logarithm of the probability represents the dissimilarity measure between the pixel and the cluster. The regional dissimilarity measure of one sub-region is defined as the sum of the measures of the pixels in the region. Furthermore, the Markov Random Field (MRF) model is extended from the pixel level to the Voronoi sub-regions, and the regional objective function is established under the framework of fuzzy clustering. The optimal segmentation results are obtained by solving for the model parameters and the generating points. Finally, the effectiveness of the proposed algorithm is demonstrated by qualitative and quantitative analysis of the segmentation results on simulated and real SAR images.
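The first step of the algorithm, partitioning the image domain by a Voronoi tessellation of random generating points, can be sketched directly. This toy Python version labels each pixel with its nearest site by brute force; it stands in for that step only, not for the Gamma mixture or MRF parts of the method.

```python
import random

def voronoi_labels(width, height, sites):
    """Label every pixel with the index of its nearest generating point,
    partitioning the image domain into Voronoi sub-regions."""
    labels = []
    for y in range(height):
        row = []
        for x in range(width):
            row.append(min(range(len(sites)),
                           key=lambda i: (x - sites[i][0]) ** 2
                                         + (y - sites[i][1]) ** 2))
        labels.append(row)
    return labels

rng = random.Random(2)
sites = [(rng.uniform(0, 32), rng.uniform(0, 32)) for _ in range(6)]
labels = voronoi_labels(32, 32, sites)
```

All pixels sharing a label form one sub-region, which the segmentation then treats as a homogeneous unit.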
Groupp, Elyse; Haas, Mitchell; Fairweather, Alisa; Ganger, Bonnie; Attwood, Michael
2005-02-01
To identify recruitment challenges and elucidate specific strategies that enabled recruitment of seniors for a randomized trial on low back pain comparing the Chronic Disease Self-management Program of Stanford University to a 6-month wait-list control group. Recruitment for a randomized controlled trial. Community-based program offered at 12 locations. Community-dwelling seniors 60 years and older with chronic low back pain of mechanical origin. Passive recruitment strategies included advertisement in local and senior newspapers, in senior e-mail newsletters and listservs, and in local community centers and businesses. Active strategies included meeting seniors at health fairs, lectures to the public and organizational meetings, and the help of trusted professionals in the community. A total of 100 white and 20 African American seniors were recruited. The program seemed to have the most appeal to white, middle-class older adults, educated through high school level. Advertisement failed to attract any participants to the program. Successful strategies included interaction with seniors at health fairs and lectures on health care, especially when the program was endorsed by a trusted community professional. Generating interest in the self-management program required keen communication skills because the idea of "self-management" was met with a myriad of responses, ranging from disinterest to disbelief. Generating interest also required active participation within the communities. Initial contacts had to be established with trusted professionals, whose endorsement enabled the project managers to present the concept of self-management to the seniors. More complex recruitment strategies were required for this study involving the self-management approach to back pain than for studies involving treatment.
Terçariol, César Augusto Sangaletti; Martinez, Alexandre Souto
2005-08-01
Consider a medium characterized by N points whose coordinates are randomly generated by a uniform distribution along the edges of a unitary d-dimensional hypercube. A walker leaves from each point of this disordered medium and moves according to the deterministic rule to go to the nearest point which has not been visited in the preceding mu steps (deterministic tourist walk). Each trajectory generated by this dynamics has an initial nonperiodic part of t steps (transient) and a final periodic part of p steps (attractor). The neighborhood rank probabilities are parametrized by the normalized incomplete beta function I_d = I_{1/4}[1/2, (d+1)/2]. The joint distribution S_N^{(mu,d)}(t,p) is relevant, and the marginal distributions previously studied are particular cases. We show that, for the memoryless deterministic tourist walk in Euclidean space, this distribution is S_infinity^{(1,d)}(t,p) = [Gamma(1 + I_d^{-1})(t + I_d^{-1})/Gamma(t + p + I_d^{-1})] delta_{p,2}, where t = 0, 1, 2, ..., Gamma(z) is the gamma function and delta_{i,j} is the Kronecker delta. The mean-field models are the random link models, which correspond to d -> infinity, and the random map model which, even for mu = 0, presents a nontrivial cycle distribution [S_N^{(0,rm)}(p) proportional to p^{-1}]: S_N^{(0,rm)}(t,p) = Gamma(N)/{Gamma[N + 1 - (t + p)] N^{t+p}}. The fundamental quantities are the number of explored points n_e = t + p and I_d. Although the obtained distributions are simple, they do not follow straightforwardly and they have been validated by numerical experiments.
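The walk itself is easy to simulate. Below is an illustrative Python sketch of the deterministic tourist walk: the dynamical state is the tuple of the last mu visited points, and the transient length t and attractor period p are read off when a state repeats. For the memoryless case mu = 1 the attractor is always a pair of mutually nearest neighbors, so p = 2, consistent with the delta_{p,2} factor in the abstract.

```python
import math
import random

def tourist_walk(points, start, mu=1):
    """Deterministic tourist walk: from the current point, move to the nearest
    point not visited in the preceding mu steps. Returns (t, p): the length of
    the initial transient and the period of the final attractor."""
    path = [start]
    seen = {}                          # state -> step index of first occurrence
    while True:
        state = tuple(path[-mu:])      # the last mu visited points fix the dynamics
        if state in seen:
            t = seen[state]
            p = len(path) - 1 - t
            return t, p
        seen[state] = len(path) - 1
        forbidden = set(state)
        current = path[-1]
        nxt = min((i for i in range(len(points)) if i not in forbidden),
                  key=lambda i: math.dist(points[current], points[i]))
        path.append(nxt)

rng = random.Random(42)
pts = [(rng.random(), rng.random()) for _ in range(50)]
t, p = tourist_walk(pts, start=0, mu=1)
```

With random continuous coordinates, distance ties have probability zero, so the rule is well defined.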
Faggion, Clovis Mariano; Wu, Yun-Chun; Scheidgen, Moritz; Tu, Yu-Kang
2015-01-01
Background Risk of bias (ROB) may threaten the internal validity of a clinical trial by distorting the magnitude of treatment effect estimates, although some conflicting information on this assumption exists. Objective The objective of this study was to evaluate the effect of ROB on the magnitude of treatment effect estimates in randomized controlled trials (RCTs) in periodontology and implant dentistry. Methods A search for Cochrane systematic reviews (SRs), including meta-analyses of RCTs published in the periodontology and implant dentistry fields, was performed in the Cochrane Library in September 2014. Random-effects meta-analyses were performed by grouping RCTs with different levels of ROB in three domains (sequence generation, allocation concealment, and blinding of outcome assessment). To increase power and precision, only SRs with meta-analyses including at least 10 RCTs were included. Meta-regression was performed to investigate the association between ROB characteristics and the magnitudes of intervention effects in the meta-analyses. Results Of the 24 initially screened SRs, 21 SRs were excluded because they did not include at least 10 RCTs in the meta-analyses. Three SRs (two from the periodontology field) generated information for conducting 27 meta-analyses. Meta-regression did not reveal significant differences in the relationship of the ROB level with the size of treatment effect estimates, although a trend for inflated estimates was observed in domains with unclear ROBs. Conclusion In this sample of RCTs, high and (mainly) unclear risks of selection and detection biases did not seem to influence the size of treatment effect estimates, although several confounders might have influenced the strength of the association. PMID:26422698
ERIC Educational Resources Information Center
Al Otaiba, Stephanie; Lake, Vickie E.; Greulich, Luana; Folsom, Jessica S.; Guidry, Lisa
2012-01-01
This randomized-control trial examined the learning of preservice teachers taking an initial Early Literacy course in an early childhood education program and of the kindergarten or first grade students they tutored in their field experience. Preservice teachers were randomly assigned to one of two tutoring programs: Book Buddies and Tutor…
Strenge, Hans; Lesmana, Cokorda Bagus Jaya; Suryani, Luh Ketut
2009-08-01
Verbal random number generation is a procedurally simple task to assess executive function and appears ideally suited for the use under diverse settings in cross-cultural research. The objective of this study was to examine ethnic group differences between young adults in Bali (Indonesia) and Kiel (Germany): 50 bilingual healthy students, 30 Balinese and 20 Germans, attempted to generate a random sequence of the digits 1 to 9. In Balinese participants, randomization was done in Balinese (native language L1) and Indonesian (first foreign language L2), in German subjects in the German (L1) and English (L2) languages. 10 of 30 Balinese (33%), but no Germans, were unable to inhibit habitual counting in more than half of the responses. The Balinese produced significantly more nonrandom responses than the Germans with higher rates of counting and significantly less occurrence of the digits 2 and 3 in L1 compared with L2. Repetition and cycling behavior did not differ between the four languages. The findings highlight the importance of taking into account culture-bound psychosocial factors for Balinese individuals when administering and interpreting a random number generation test.
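One simple nonrandomness measure used in such tasks, the rate of habitual counting (adjacent responses differing by one), can be computed as below. The function name and the comparison sequences are illustrative, not the study's actual scoring scheme.

```python
import random

def counting_rate(seq):
    """Fraction of adjacent response pairs that are +1/-1 steps, a simple
    index of habitual counting in a random number generation task."""
    steps = sum(1 for a, b in zip(seq, seq[1:]) if abs(a - b) == 1)
    return steps / (len(seq) - 1)

rng = random.Random(4)
random_seq = [rng.randint(1, 9) for _ in range(300)]   # unbiased responses
counted_seq = [1 + (i % 9) for i in range(300)]        # pure 1..9 counting
```

For uniform independent digits 1 to 9 the expected rate is 16/81, roughly 0.2, so rates far above that indicate counting.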
Random number generators tested on quantum Monte Carlo simulations.
Hongo, Kenta; Maezono, Ryo; Miura, Kenichi
2010-08-01
We have tested and compared several (pseudo) random number generators (RNGs) applied to a practical application, ground state energy calculations of molecules using variational and diffusion Monte Carlo methods. A new multiple recursive generator with 8th-order recursion (MRG8) and the Mersenne twister generator (MT19937) are tested and compared with the RANLUX generator with five luxury levels (RANLUX-[0-4]). Both MRG8 and MT19937 are shown to give the same total energy as that evaluated with RANLUX-4 (highest luxury level) within the statistical error bars, with less computational cost to generate the sequence. We also tested the notorious implementation of a linear congruential generator (LCG), RANDU, for comparison. (c) 2010 Wiley Periodicals, Inc.
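The comparison methodology, running the same stochastic estimate under different generators and checking agreement within statistical error bars, can be illustrated on a toy Monte Carlo integral. The minimal LCG below (Numerical Recipes constants) is a stand-in for a simple generator, not RANDU itself, and the pi estimate replaces the quantum Monte Carlo energies.

```python
import math
import random

def mc_pi(rng, n):
    """Monte Carlo estimate of pi: fraction of random points in the unit
    square that fall inside the quarter circle, times 4."""
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 < 1.0)
    return 4.0 * hits / n

class LCG:
    """Minimal 32-bit linear congruential generator, standing in for the
    low-quality generators discussed above."""
    def __init__(self, seed):
        self.state = seed & 0xFFFFFFFF
    def random(self):
        self.state = (1664525 * self.state + 1013904223) % 2 ** 32
        return self.state / 2 ** 32

n = 200_000
est_mt = mc_pi(random.Random(1), n)   # Mersenne twister (Python's default)
est_lcg = mc_pi(LCG(1), n)
```

For this easy integrand both generators agree with pi within a few statistical standard errors; harder observables are what separate good generators from bad ones.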
An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution
NASA Technical Reports Server (NTRS)
Campbell, C. W.
1983-01-01
An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired value of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact, and in practice its accuracy is limited only by the quality of the uniform distribution random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
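A standard way to realize such a generator (assumed here; the report's exact FORTRAN routine is not shown) is Box-Muller generation of two independent standard normals followed by a linear transform that imposes the desired means, standard deviations, and correlation:

```python
import math
import random

def bivariate_normal_pair(rng, mu1, mu2, s1, s2, rho):
    """Generate one (x, y) pair with means mu1, mu2, standard deviations
    s1, s2, and correlation coefficient rho."""
    # Box-Muller: two independent N(0, 1) variates from two uniforms.
    u1, u2 = rng.random(), rng.random()
    r = math.sqrt(-2.0 * math.log(1.0 - u1))
    z1 = r * math.cos(2.0 * math.pi * u2)
    z2 = r * math.sin(2.0 * math.pi * u2)
    # Correlate, then scale and shift.
    x = mu1 + s1 * z1
    y = mu2 + s2 * (rho * z1 + math.sqrt(1.0 - rho ** 2) * z2)
    return x, y

rng = random.Random(7)
pairs = [bivariate_normal_pair(rng, 1.0, -2.0, 2.0, 0.5, 0.8)
         for _ in range(50_000)]
```

The transform is exact, so, as the abstract notes, residual errors come only from the uniform generator and floating-point evaluation.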
Music therapy CD creation for initial pediatric radiation therapy: a mixed methods analysis.
Barry, Philippa; O'Callaghan, Clare; Wheeler, Greg; Grocke, Denise
2010-01-01
A mixed methods research design was used to investigate the effects of a music therapy CD (MTCD) creation intervention on pediatric oncology patients' distress and coping during their first radiation therapy treatment. The music therapy method involved children creating a music CD using interactive computer-based music software, which was "remixed" by the music therapist-researcher to extend the musical material. Eleven pediatric radiation therapy outpatients aged 6 to 13 years were randomly assigned to either an experimental group, in which they could create a music CD prior to their initial treatment to listen to during radiation therapy, or to a standard care group. Quantitative and qualitative analyses generated multiple perceptions from the pediatric patients, parents, radiation therapy staff, and music therapist-researcher. Ratings of distress during initial radiation therapy treatment were low for all children. The comparison between the two groups found that 67% of the children in the standard care group used social withdrawal as a coping strategy, compared to 0% of the children in the music therapy group; this trend approached significance (p = 0.076). MTCD creation was a fun, engaging, and developmentally appropriate intervention for pediatric patients, which offered a positive experience and aided their use of effective coping strategies to meet the demands of their initial radiation therapy treatment.
Fidelity under isospectral perturbations: a random matrix study
NASA Astrophysics Data System (ADS)
Leyvraz, F.; García, A.; Kohler, H.; Seligman, T. H.
2013-07-01
The set of Hamiltonians generated by all unitary transformations from a single Hamiltonian is the largest set of isospectral Hamiltonians we can form. Taking advantage of the fact that the unitary group can be generated from Hermitian matrices, we can take the ones generated by the Gaussian unitary ensemble with a small parameter as small perturbations. Similarly, the transformations generated by Hermitian antisymmetric matrices from orthogonal matrices form isospectral transformations among symmetric matrices. Based on this concept we can obtain the fidelity decay of a system that decays under a random isospectral perturbation with well-defined properties regarding time-reversal invariance. If we choose the Hamiltonian itself also from a classical random matrix ensemble, then we obtain solutions in terms of form factors in the limit of large matrices.
Investigation of estimators of probability density functions
NASA Technical Reports Server (NTRS)
Speed, F. M.
1972-01-01
Four research projects are summarized which include: (1) the generation of random numbers on the IBM 360/44, (2) statistical tests used to check out random number generators, (3) Specht density estimators, and (4) use of estimators of probability density functions in analyzing large amounts of data.
DNA based random key generation and management for OTP encryption.
Zhang, Yunpeng; Liu, Xin; Sun, Manhui
2017-09-01
One-time pad (OTP) is a principle of key generation applied to the stream ciphering method which offers total privacy. The OTP encryption scheme has proved to be unbreakable in theory, but difficult to realize in practical applications. Because OTP encryption specifically requires the absolute randomness of the key, its development has suffered from dense constraints. DNA cryptography is a new and promising technology in the field of information security. The storage capability of DNA chromosomes can be used to build one-time-pad structures, with pseudo-random number generation and indexing used to encrypt the plaintext messages. In this paper, we present a feasible solution to the OTP symmetric key generation and transmission problem with DNA at the molecular level. Through recombinant DNA technology, using restriction enzymes known only to the sender and receiver to combine the secure key, represented by a DNA sequence, with the T vector, we generate the DNA bio-hiding secure key and then place the recombinant plasmid in implanted bacteria for secure key transmission. The designed bio-experiments and simulation results show that the security of the key transmission is further improved and the environmental requirements of key transmission are reduced. Analysis has demonstrated that the proposed DNA-based random key generation and management solutions are marked by high security and usability. Published by Elsevier B.V.
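The OTP principle itself is a one-line XOR, sketched below with a cryptographic PRNG standing in for the DNA-derived key. The security guarantee holds only if the key is truly random, at least as long as the message, and never reused.

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each message byte with a fresh random key byte."""
    if len(key) < len(plaintext):
        raise ValueError("OTP key must be at least as long as the message")
    return bytes(p ^ k for p, k in zip(plaintext, key))

otp_decrypt = otp_encrypt  # XOR is its own inverse

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))  # stand-in for the DNA-derived key
ct = otp_encrypt(msg, key)
pt = otp_decrypt(ct, key)
```

Because XOR is self-inverse, the same function encrypts and decrypts; the entire difficulty, as the abstract stresses, lies in generating and transporting the key.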
Spectral turning bands for efficient Gaussian random fields generation on GPUs and accelerators
NASA Astrophysics Data System (ADS)
Hunger, L.; Cosenza, B.; Kimeswenger, S.; Fahringer, T.
2015-11-01
A random field (RF) is a set of correlated random variables associated with different spatial locations. RF generation algorithms are of crucial importance for many scientific areas, such as astrophysics, geostatistics, computer graphics, and many others. Current approaches commonly make use of the 3D fast Fourier transform (FFT), which does not scale well for RFs larger than the available memory; they are also limited to regular rectilinear meshes. We introduce random field generation with the turning band method (RAFT), an RF generation algorithm based on the turning band method that is optimized for massively parallel hardware such as GPUs and accelerators. Our algorithm replaces the 3D FFT with a lower-order, one-dimensional FFT followed by a projection step and is further optimized with loop unrolling and blocking. RAFT can easily generate RFs on non-regular (non-uniform) meshes and efficiently produce fields with mesh sizes larger than the available device memory by using a streaming, out-of-core approach. Our algorithm generates RFs with the correct statistical behavior and has been tested on a variety of modern hardware, such as NVIDIA Tesla, AMD FirePro and Intel Phi. RAFT is faster than the traditional methods on regular meshes and has been successfully applied to two real case scenarios: planetary nebulae and cosmological simulations.
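The spectral idea behind such generators can be illustrated with a randomized sum-of-cosines field: frequencies drawn from a Gaussian spectral density, phases uniform, so field values tend to N(0, 1) by the central limit theorem. This sketch is not the RAFT algorithm (which uses a 1D FFT plus a projection step); it only shows the spectral-simulation principle that turning-band methods build on.

```python
import math
import random

def spectral_gaussian_field(n_waves, rng):
    """Randomized spectral method: a stationary 2D Gaussian random field
    approximated as a sum of cosines with random frequencies and phases.
    Returns a callable field(x, y); values are approximately N(0, 1)."""
    waves = [((rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)),   # frequency vector
              rng.uniform(0.0, 2.0 * math.pi))              # phase
             for _ in range(n_waves)]
    amp = math.sqrt(2.0 / n_waves)

    def field(x, y):
        return amp * sum(math.cos(wx * x + wy * y + phi)
                         for (wx, wy), phi in waves)

    return field

field = spectral_gaussian_field(400, random.Random(3))
value = field(1.0, 2.0)
```

Being meshless, a field defined this way can be evaluated at arbitrary (non-rectilinear) locations, one of the properties the abstract highlights.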
Cumming, Bruce G.
2016-01-01
In order to extract retinal disparity from a visual scene, the brain must match corresponding points in the left and right retinae. This computationally demanding task is known as the stereo correspondence problem. The initial stage of the solution to the correspondence problem is generally thought to consist of a correlation-based computation. However, recent work by Doi et al suggests that human observers can see depth in a class of stimuli where the mean binocular correlation is 0 (half-matched random dot stereograms). Half-matched random dot stereograms are made up of an equal number of correlated and anticorrelated dots, and the binocular energy model—a well-known model of V1 binocular complex cells—fails to signal disparity here. This has led to the proposition that a second, match-based computation must be extracting disparity in these stimuli. Here we show that a straightforward modification to the binocular energy model—adding a point output nonlinearity—is by itself sufficient to produce cells that are disparity-tuned to half-matched random dot stereograms. We then show that a simple decision model using this single mechanism can reproduce psychometric functions generated by human observers, including reduced performance to large disparities and rapidly updating dot patterns. The model makes predictions about how performance should change with dot size in half-matched stereograms and temporal alternation in correlation, which we test in human observers. We conclude that a single correlation-based computation, based directly on already-known properties of V1 neurons, can account for the literature on mixed correlation random dot stereograms. PMID:27196696
A high-speed on-chip pseudo-random binary sequence generator for multi-tone phase calibration
NASA Astrophysics Data System (ADS)
Gommé, Liesbeth; Vandersteen, Gerd; Rolain, Yves
2011-07-01
An on-chip reference generator is conceived by adopting the technique of decimating a pseudo-random binary sequence (PRBS) signal into parallel sequences. This is of great benefit when high-speed generation of PRBS and PRBS-derived signals is the objective. The design is implemented in standard CMOS logic available in commercial libraries, which provides the logic functions for the generator. The design allows the user to select the periodicity of the PRBS and the PRBS-derived signals. The characterization of the on-chip generator marks its performance and reveals promising specifications.
Scope of Various Random Number Generators in ant System Approach for TSP
NASA Technical Reports Server (NTRS)
Sen, S. K.; Shaykhian, Gholam Ali
2007-01-01
Several quasi- and pseudo-random number generators are tested on a heuristic based on an ant system approach to the traveling salesman problem. The experiment explores whether any particular generator is most desirable. Such an experiment on large samples has the potential to rank the performance of the generators for the foregoing heuristic. The aim is mainly to seek an answer to the controversial issue "which generator is the best in terms of quality of the result (accuracy) as well as cost of producing the result (time/computational complexity) in a probabilistic/statistical sense."
A Statistical Method to Distinguish Functional Brain Networks
Fujita, André; Vidal, Maciel C.; Takahashi, Daniel Y.
2017-01-01
One major problem in neuroscience is the comparison of functional brain networks of different populations, e.g., distinguishing the networks of controls and patients. Traditional algorithms are based on search for isomorphism between networks, assuming that they are deterministic. However, biological networks present randomness that cannot be well modeled by those algorithms. For instance, functional brain networks of distinct subjects of the same population can be different due to individual characteristics. Moreover, networks of subjects from different populations can be generated through the same stochastic process. Thus, a better hypothesis is that networks are generated by random processes. In this case, subjects from the same group are samples from the same random process, whereas subjects from different groups are generated by distinct processes. Using this idea, we developed a statistical test called ANOGVA to test whether two or more populations of graphs are generated by the same random graph model. Our simulations' results demonstrate that we can precisely control the rate of false positives and that the test is powerful to discriminate random graphs generated by different models and parameters. The method also showed to be robust for unbalanced data. As an example, we applied ANOGVA to an fMRI dataset composed of controls and patients diagnosed with autism or Asperger. ANOGVA identified the cerebellar functional sub-network as statistically different between controls and autism (p < 0.001). PMID:28261045
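The flavor of such a test can be sketched with a permutation test on a summary statistic of graphs sampled from two Erdos-Renyi models. This toy version is far simpler than ANOGVA, which works with the generating graph model directly; it only illustrates the framing of subjects as samples from random graph processes.

```python
import random

def er_graph_density(n, p, rng):
    """Sample an Erdos-Renyi G(n, p) graph and return its edge density."""
    edges = sum(1 for i in range(n) for j in range(i + 1, n)
                if rng.random() < p)
    return edges / (n * (n - 1) / 2)

def permutation_test(a, b, rng, n_perm=2000):
    """Two-sample permutation test on the absolute difference of group means:
    the p-value is the fraction of label shuffles at least as extreme as the
    observed difference."""
    obs = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= obs:
            count += 1
    return count / n_perm

rng = random.Random(5)
controls = [er_graph_density(40, 0.10, rng) for _ in range(20)]   # model 1
patients = [er_graph_density(40, 0.20, rng) for _ in range(20)]   # model 2
p_diff = permutation_test(controls, patients, rng)
```

With densities 0.10 versus 0.20 the group difference dwarfs the sampling noise, so the test should reject decisively.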
Note: The design of thin gap chamber simulation signal source based on field programmable gate array
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Kun; Lu, Houbing; Wang, Xu; Li, Feng; Liang, Futian; Jin, Ge
2015-01-01
The Thin Gap Chamber (TGC) is an important part of ATLAS detector and LHC accelerator. Targeting the feature of the output signal of TGC detector, we have designed a simulation signal source. The core of the design is based on field programmable gate array, randomly outputting 256-channel simulation signals. The signal is generated by true random number generator. The source of randomness originates from the timing jitter in ring oscillators. The experimental results show that the random number is uniform in histogram, and the whole system has high reliability.
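The uniformity-histogram check mentioned above is commonly quantified with a Pearson chi-square statistic against the flat distribution. The sketch below applies it to a software PRNG as a stand-in for the FPGA bit stream; for a 256-bin histogram the statistic should land near its 255 degrees of freedom.

```python
import random

def chi2_uniformity(values, n_bins):
    """Pearson chi-square statistic of a histogram of integer values in
    [0, n_bins) against the uniform distribution."""
    counts = [0] * n_bins
    for v in values:
        counts[v] += 1
    expected = len(values) / n_bins
    return sum((c - expected) ** 2 / expected for c in counts)

rng = random.Random(9)
data = [rng.randrange(256) for _ in range(100_000)]  # stand-in byte stream
stat = chi2_uniformity(data, 256)
```

Values far above roughly 255 + a few times sqrt(510) would indicate a non-flat histogram; full TRNG validation would of course use a complete suite such as NIST SP 800-22 rather than this single statistic.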
Housworth, E A; Martins, E P
2001-01-01
Statistical randomization tests in evolutionary biology often require a set of random, computer-generated trees. For example, earlier studies have shown how large numbers of computer-generated trees can be used to conduct phylogenetic comparative analyses even when the phylogeny is uncertain or unknown. These methods were limited, however, in that (in the absence of molecular sequence or other data) they allowed users to assume that no phylogenetic information was available or that all possible trees were known. Intermediate situations where only a taxonomy or other limited phylogenetic information (e.g., polytomies) are available are technically more difficult. The current study describes a procedure for generating random samples of phylogenies while incorporating limited phylogenetic information (e.g., four taxa belong together in a subclade). The procedure can be used to conduct comparative analyses when the phylogeny is only partially resolved or can be used in other randomization tests in which large numbers of possible phylogenies are needed.
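The core idea, sampling random topologies while forcing a stated subclade, can be sketched with random coalescence: the clade members are joined among themselves first, and the resulting subtree then coalesces with the remaining taxa. This is an illustrative construction, not the authors' published procedure.

```python
import random

def random_tree(taxa, clade, rng):
    """Random rooted binary tree topology (nested tuples) in which the taxa
    in `clade` are constrained to form a monophyletic group."""
    def coalesce(lineages):
        lineages = list(lineages)
        while len(lineages) > 1:
            i, j = rng.sample(range(len(lineages)), 2)  # pick two lineages
            a, b = lineages[i], lineages[j]
            lineages = [x for k, x in enumerate(lineages) if k not in (i, j)]
            lineages.append((a, b))                     # join them
        return lineages[0]

    sub = coalesce(clade)                    # clade coalesces internally first
    rest = [t for t in taxa if t not in clade]
    return coalesce(rest + [sub])            # then joins the remaining taxa

rng = random.Random(11)
tree = random_tree(["A", "B", "C", "D", "E"], clade=["A", "B", "C"], rng=rng)
```

Repeating the call yields a sample of topologies, every one of which respects the partial phylogenetic information, which is the ingredient the randomization tests described above need.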
Cabral, Patricia; Wallander, Jan L; Song, Anna V; Elliott, Marc N; Tortolero, Susan R; Reisner, Sari L; Schuster, Mark A
2017-02-01
Examine the longitudinal association of generational status (first = child and parent born outside the United States; second = child born in the United States, parent born outside the United States; third = child and parent born in the United States) and parent and peer social factors considered in 5th grade with subsequent oral, vaginal, and anal intercourse initiation by 7th and 10th grade among Latino/a youth. Using data from Latino/a participants (N = 1,790) in the Healthy Passages™ study, the authors measured generational status (first = 18.4%, second = 57.3%, third-generation = 24.3%) and parental (i.e., monitoring, involvement, nurturance) and peer (i.e., friendship quality, social interaction, peer norms) influences in 5th grade and oral, vaginal, and anal intercourse initiation by 7th and 10th (retention = 89%) grade. Among girls, parental monitoring, social interaction, friendship quality, and peer norms predicted sexual initiation. Among boys, parental involvement, social interaction, and peer norms predicted sexual initiation (ps < .05). When ≥1 friend was perceived to have initiated sexual intercourse, third-generation Latinas were more than twice as likely as first- and second-generation Latinas (ps < .05) to initiate vaginal intercourse by 10th grade and almost 5 times as likely as first-generation Latinas to initiate oral intercourse by 7th grade. Among Latina youth, generational status plays a role in social influences on vaginal and oral intercourse initiation. Moreover, Latinas and Latinos differ in which social influences predict sexual intercourse initiation. Preventive efforts for Latino/a youth may need to differ by gender and generational status. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Physical layer one-time-pad data encryption through synchronized semiconductor laser networks
NASA Astrophysics Data System (ADS)
Argyris, Apostolos; Pikasis, Evangelos; Syvridis, Dimitris
2016-02-01
Semiconductor lasers (SLs) have been proven to be a key device in the generation of ultrafast true random bit streams. Their potential to emit chaotic signals under conditions with desirable statistics establishes them as a low-cost solution to cover various needs, from large-volume key generation to real-time encrypted communications. Usually, only undemanding post-processing is needed to convert the acquired analog timeseries to digital sequences that pass all established tests of randomness. A novel architecture that can generate and exploit these true random sequences is a fiber network in which the nodes are semiconductor lasers that are coupled and synchronized to a central hub laser. In this work we show experimentally that laser nodes in such a star network topology can synchronize with each other through complex broadband signals that are the seed to true random bit sequences (TRBS) generated at several Gb/s. The ability of each node to access real-time generated random bit streams, synchronized with the rest of the nodes through the fiber-optic network, makes it possible to implement a one-time-pad encryption protocol that mixes the synchronized true random bit sequence with real data at Gb/s rates. Forward-error-correction methods are used to reduce the errors in the TRBS and the final error rate at the data-decoding level. An appropriate selection of the sampling methodology and properties, as well as of the physical properties of the chaotic seed signal through which the network locks into synchronization, enables error-free performance.
Reaction times to weak test lights. [psychophysics biological model
NASA Technical Reports Server (NTRS)
Wandell, B. A.; Ahumada, P.; Welsh, D.
1984-01-01
Maloney and Wandell (1984) describe a model of the response of a single visual channel to weak test lights. The initial channel response is a linearly filtered version of the stimulus. The filter output is randomly sampled over time. Each time a sample occurs there is some probability, increasing with the magnitude of the sampled response, that a discrete detection event is generated. Maloney and Wandell derive the statistics of the detection events. In this paper the hypothesis is tested that reaction-time responses to the presence of a weak test light are initiated at the first detection event. This makes it possible to extend the application of the model to lights that are slightly above threshold but still within the linear operating range of the visual system. A parameter-free prediction of the model proposed by Maloney and Wandell for lights detected by this statistic is tested. The data are in agreement with the prediction.
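The sampling-and-thinning model above can be sketched as a small simulation: samples arrive as a Poisson process and each becomes a detection event with a response-dependent probability, the reaction time being the time of the first such event. The rate, gain, and response values below are illustrative, not from the paper:

```python
import math
import random

def first_detection_time(rate, response, gain, rng):
    """Random sampling of a channel response: samples arrive as a Poisson
    process (rate per second); each sample becomes a discrete detection
    event with a probability that grows with the response magnitude."""
    p_detect = 1.0 - math.exp(-gain * response)
    t = 0.0
    while True:
        t += rng.expovariate(rate)      # waiting time to the next sample
        if rng.random() < p_detect:     # sample promoted to a detection event
            return t

rng = random.Random(1)
times = [first_detection_time(rate=100.0, response=0.5, gain=2.0, rng=rng)
         for _ in range(5000)]
mean_rt = sum(times) / len(times)
# For a thinned Poisson process the mean first-event time is 1/(rate * p_detect),
# so stronger responses (higher p_detect) yield shorter mean reaction times.
```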
Meyer, Heather M; Teles, José; Formosa-Jordan, Pau; Refahi, Yassin; San-Bento, Rita; Ingram, Gwyneth; Jönsson, Henrik; Locke, James C W; Roeder, Adrienne H K
2017-01-01
Multicellular development produces patterns of specialized cell types. Yet, it is often unclear how individual cells within a field of identical cells initiate the patterning process. Using live imaging, quantitative image analyses and modeling, we show that during Arabidopsis thaliana sepal development, fluctuations in the concentration of the transcription factor ATML1 pattern a field of identical epidermal cells to differentiate into giant cells interspersed between smaller cells. We find that ATML1 is expressed in all epidermal cells. However, its level fluctuates in each of these cells. If ATML1 levels surpass a threshold during the G2 phase of the cell cycle, the cell will likely enter a state of endoreduplication and become giant. Otherwise, the cell divides. Our results demonstrate a fluctuation-driven patterning mechanism for how cell fate decisions can be initiated through a random yet tightly regulated process. DOI: http://dx.doi.org/10.7554/eLife.19131.001 PMID:28145865
Partial bisulfite conversion for unique template sequencing
Kumar, Vijay; Rosenbaum, Julie; Wang, Zihua; Forcier, Talitha; Ronemus, Michael; Wigler, Michael
2018-01-01
We introduce a new protocol, mutational sequencing or muSeq, which uses sodium bisulfite to randomly deaminate unmethylated cytosines at a fixed and tunable rate. The muSeq protocol marks each initial template molecule with a unique mutation signature that is present in every copy of the template, and in every fragmented copy of a copy. In the sequenced read data, this signature is observed as a unique pattern of C-to-T or G-to-A nucleotide conversions. Clustering reads with the same conversion pattern enables accurate counting and long-range assembly of initial template molecules from short-read sequence data. We explore counting and low-error sequencing by profiling 135,000 restriction fragments in a PstI representation, demonstrating that muSeq improves copy number inference and significantly reduces sporadic sequencer error. We explore long-range assembly in the context of cDNA, generating contiguous transcript clusters greater than 3,000 bp in length. The muSeq assemblies reveal transcriptional diversity not observable from short-read data alone. PMID:29161423
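The read-clustering step can be sketched in a few lines: group reads by the positions where the reference has C but the read shows T, i.e. by one strand's conversion signature. The toy reference and reads below are illustrative, not data from the paper:

```python
from collections import defaultdict

def conversion_signature(reference: str, read: str):
    # Positions where an unmethylated C was deaminated and read out as T.
    return tuple(i for i, (r, s) in enumerate(zip(reference, read))
                 if r == "C" and s == "T")

def cluster_by_signature(reference, reads):
    clusters = defaultdict(list)
    for read in reads:
        clusters[conversion_signature(reference, read)].append(read)
    return clusters

reference = "ACCGTCCA"
reads = ["ACTGTCCA", "ACTGTCCA", "ACCGTTCA", "ACTGTCCA"]
clusters = cluster_by_signature(reference, reads)
# Reads sharing a signature are treated as copies of one initial template
# molecule, which enables counting templates rather than counting reads.
```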
Li, Hai; Zhang, James J; Mao, Luke Lunhua; Min, Sophia D
2012-09-01
The purpose of this study was to identify and examine consumer perception of corporate social responsibility (CSR) in China's sports lottery industry, and the effect of perceived CSR initiatives on sports lottery consumption behavior. Research participants (N = 4,980), selected based on a computer-generated, randomly stratified multistage sampling process, comprised Chinese residents who had purchased sports lottery tickets in the past 12 months. They completed a questionnaire that was derived from a qualitative research process. A factor analysis extracted two factors associated with perceptions of CSR in China's sports lottery administration: Regulatory and Prevention Responsibilities and Product Development Responsibility. Logistic regression analyses revealed that these two factors influenced consumer behavior (i.e., relative and absolute expenditure, purchasing frequency, and time commitment). This study represents an initial effort to understand the dimensions of perceived CSR associated with the Chinese sports lottery. The findings signify the importance of enforcing CSR in sports lottery administration.
Montgomery, John H; Byerly, Matthew; Carmody, Thomas; Li, Baitao; Miller, Daniel R; Varghese, Femina; Holland, Rhiannon
2004-12-01
The effect of funding source on the outcome of randomized controlled trials has been investigated in several medical disciplines; however, psychiatry has been largely excluded from such analyses. In this article, randomized controlled trials of second generation antipsychotics in schizophrenia are reviewed and analyzed with respect to funding source (industry vs. non-industry funding). A literature search was conducted for randomized, double-blind trials in which at least one of the tested treatments was a second generation antipsychotic. In each study, design quality and study outcome were assessed quantitatively according to rating scales. Mean quality and outcome scores were compared between the industry-funded studies and non-industry-funded studies. An analysis of the primary author's affiliation with industry was similarly performed. Results of industry-funded studies significantly favored second generation over first generation antipsychotics when compared to non-industry-funded studies. Non-industry-funded studies showed a trend toward higher quality than industry-funded studies; however, the difference between the two was not significant. Also, within the industry-funded studies, outcomes of trials involving first authors employed by industry sponsors demonstrated a trend toward favoring second generation over first generation antipsychotics to a greater degree than did trials involving first authors employed outside the industry (p=0.05). While the retrospective design of the study limits the strength of the findings, the data suggest that industry bias may occur in randomized controlled trials in schizophrenia. There appear to be several sources by which bias may enter clinical research, including trial design, control of data analysis, and multiplicity/redundancy of trials.
Reliability analysis of structures under periodic proof tests in service
NASA Technical Reports Server (NTRS)
Yang, J.-N.
1976-01-01
A reliability analysis of structures subjected to random service loads and periodic proof tests treats gust loads and maneuver loads as random processes. Crack initiation, crack propagation, and strength degradation are treated as the fatigue process. The time to fatigue crack initiation and ultimate strength are random variables. Residual strength decreases during crack propagation, so that failure rate increases with time. When a structure fails under periodic proof testing, a new structure is built and proof-tested. The probability of structural failure in service is derived from treatment of all the random variables, strength degradations, service loads, proof tests, and the renewal of failed structures. Some numerical examples are worked out.
An investigation of the uniform random number generator
NASA Technical Reports Server (NTRS)
Temple, E. C.
1982-01-01
Most random number generators in use today are of the congruential form X(i+1) = (A·X(i) + C) mod M, where A, C, and M are nonnegative integers. If C = 0, the generator is called multiplicative; generators for which C ≠ 0 are called mixed congruential generators. It is easy to see that a congruential generator must repeat its sequence after at most M values have been generated. The number of values a procedure generates before the sequence restarts is called the length, or period, of the generator; generally, it is desirable to make the period as long as possible. A detailed discussion of congruential generators is given. Also, several promising procedures that differ from the multiplicative and mixed procedures are discussed.
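A minimal sketch of the mixed congruential recurrence, with small illustrative constants (not from the document) chosen to satisfy the Hull-Dobell conditions, so the generator attains its maximal period M:

```python
def lcg(seed, a, c, m, n):
    """Mixed congruential generator: X(i+1) = (a*X(i) + c) mod m."""
    xs, x = [], seed
    for _ in range(n):
        x = (a * x + c) % m
        xs.append(x)
    return xs

# Full period m = 16 by the Hull-Dobell theorem: gcd(c, m) = 1,
# a - 1 is divisible by every prime factor of m (here 2) and by 4.
out = lcg(seed=0, a=5, c=3, m=16, n=16)
assert sorted(out) == list(range(16))   # visits every residue before repeating
```

Real generators use a much larger modulus (e.g. M near 2^31 or 2^64); the tiny modulus here just makes the full period easy to see.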
NASA Astrophysics Data System (ADS)
Dong, Zhengcheng; Fang, Yanjun; Tian, Meng; Kong, Zhengmin
The k-core is a hierarchical structure common to various complex networks, and a real network typically has successive layers from the 1-core (the peripheral layer) to the km-core (the core layer). Nodes within the core layer have been shown to be the most influential spreaders, but there has been little work on how the depth of the k-core hierarchy (the value of km) affects robustness against cascading failures, in particular for interdependent networks. First, following preferential attachment, a novel method is proposed to generate a scale-free network with successive k-core layers (the KCBA network), and the KCBA network is shown to be more realistic than the traditional BA network. Then, with KCBA interdependent networks, the effect of the depth of the k-core hierarchy is investigated. Considering the load-based model, the loss of capacity on nodes, rather than the final number of functional nodes, is adopted to quantify robustness. We conduct two attacking strategies, i.e. the RO-attack (randomly remove only one node) and the RF-attack (randomly remove a fraction of nodes). Results show that the robustness of KCBA networks not only depends on the depth of the k-core hierarchy but is also slightly influenced by the initial load. Under RO-attack, networks with fewer k-core layers are more robust when the initial load is small. Under RF-attack, robustness improves for small km, but the improvement weakens as the initial load increases. In short, the lower the depth, the more robust the network.
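k-core decomposition itself, repeatedly stripping nodes of degree less than k, can be sketched in pure Python; this illustrates the layer structure only, not the paper's KCBA construction or its load-based cascade model:

```python
def k_core(adj, k):
    """Return the node set of the k-core of an undirected graph given as
    {node: set(neighbors)}: repeatedly remove nodes of degree < k."""
    core = {u: set(vs) for u, vs in adj.items()}
    changed = True
    while changed:
        changed = False
        for u in list(core):
            if len(core[u]) < k:
                for v in core[u]:
                    core[v].discard(u)   # neighbors lose one degree
                del core[u]
                changed = True
    return set(core)

# A triangle with a pendant node: the pendant (degree 1) is peeled away,
# so the 2-core is exactly the triangle.
adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
assert k_core(adj, 2) == {1, 2, 3}
```

The successive layers of the abstract correspond to running this peeling for k = 1, 2, ..., km and taking set differences between consecutive cores.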
Packaging of Dinoroseobacter shibae DNA into Gene Transfer Agent Particles Is Not Random.
Tomasch, Jürgen; Wang, Hui; Hall, April T K; Patzelt, Diana; Preusse, Matthias; Petersen, Jörn; Brinkmann, Henner; Bunk, Boyke; Bhuju, Sabin; Jarek, Michael; Geffers, Robert; Lang, Andrew S; Wagner-Döbler, Irene
2018-01-01
Gene transfer agents (GTAs) are phage-like particles which contain a fragment of genomic DNA of the bacterial or archaeal producer and deliver this to a recipient cell. GTA gene clusters are present in the genomes of almost all marine Rhodobacteraceae (Roseobacters) and might be important contributors to horizontal gene transfer in the world's oceans. For all organisms studied so far, no obvious evidence of sequence specificity or other nonrandom process responsible for packaging genomic DNA into GTAs has been found. Here, we show that knock-out of an autoinducer synthase gene of Dinoroseobacter shibae resulted in overproduction and release of functional GTA particles (DsGTA). Next-generation sequencing of the 4.2-kb DNA fragments isolated from DsGTAs revealed that packaging was not random. DNA from low-GC conjugative plasmids but not from high-GC chromids was excluded from packaging. Seven chromosomal regions were strongly overrepresented in DNA isolated from DsGTA. These packaging peaks lacked identifiable conserved sequence motifs that might represent recognition sites for the GTA terminase complex. Low-GC regions of the chromosome, including the origin and terminus of replication, were underrepresented in DNA isolated from DsGTAs. DNA methylation reduced packaging frequency while the level of gene expression had no influence. Chromosomal regions found to be over- and underrepresented in DsGTA-DNA were regularly spaced. We propose that a "headful" type of packaging is initiated at the sites of coverage peaks and, after linearization of the chromosomal DNA, proceeds in both directions from the initiation site. GC-content, DNA modifications, and chromatin structure might influence at which sites GTA packaging can be initiated. © The Author(s) 2018. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Simulations Using Random-Generated DNA and RNA Sequences
ERIC Educational Resources Information Center
Bryce, C. F. A.
1977-01-01
Using a very simple computer program written in BASIC, a very large number of randomly generated DNA or RNA sequences can be obtained. Students use these sequences to predict complementary sequences and translational products, evaluate base compositions, determine frequencies of particular triplet codons, and suggest possible secondary structures.…
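A Python sketch in the spirit of the BASIC exercise described above, covering random sequence generation, the complementary strand, and base composition; the helper names are illustrative:

```python
import random

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def random_dna(length, rng):
    # Each position is drawn uniformly from the four bases.
    return "".join(rng.choice("ACGT") for _ in range(length))

def complement(seq):
    # Base-pairing rule A-T, G-C applied position by position.
    return "".join(COMPLEMENT[b] for b in seq)

def gc_content(seq):
    return (seq.count("G") + seq.count("C")) / len(seq)

rng = random.Random(42)
seq = random_dna(30, rng)
comp = complement(seq)
assert complement(comp) == seq      # complementing twice restores the strand
```

Codon frequencies follow the same pattern: slice the sequence into triplets (`seq[i:i+3]`) and tally them with `collections.Counter`.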
ERIC Educational Resources Information Center
Griffiths, Martin
2011-01-01
One of the author's undergraduate students recently asked him whether it was possible to generate a random positive integer. After some thought, the author realised that there were plenty of interesting mathematical ideas inherent in her question. So much so in fact, that the author decided to organise a workshop, open both to undergraduates and…
Tang, Lixia; Wang, Xiong; Ru, Beibei; Sun, Hengfei; Huang, Jian; Gao, Hui
2014-06-01
Recent computational and bioinformatics advances have enabled the efficient creation of novel biocatalysts by reducing amino acid variability at hot spot regions. To further expand the utility of this strategy, we present here a tool called Multi-site Degenerate Codon Analyzer (MDC-Analyzer) for the automated design of intelligent mutagenesis libraries that can completely cover user-defined randomized sequences, especially when multiple contiguous and/or adjacent sites are targeted. By initially defining an objective function, the possible optimal degenerate PCR primer profiles could be automatically explored using the heuristic approach of Greedy Best-First-Search. Compared to the previously developed DC-Analyzer, MDC-Analyzer allows for the existence of a small amount of undesired sequences as a tradeoff between the number of degenerate primers and the encoded library size while still providing all the benefits of DC-Analyzer with the ability to randomize multiple contiguous sites. MDC-Analyzer was validated using a series of randomly generated mutation schemes and experimental case studies on the evolution of halohydrin dehalogenase, which proved that the MDC methodology is more efficient than other methods and is particularly well-suited to exploring the sequence space of proteins using data-driven protein engineering strategies.
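Expanding a degenerate codon into the concrete codons it encodes, using the standard IUPAC ambiguity codes, is the basic operation such library-design tools build on. A minimal sketch of that expansion (not MDC-Analyzer's actual algorithm):

```python
from itertools import product

# Standard IUPAC degenerate nucleotide codes and the bases each stands for.
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
         "R": "AG", "Y": "CT", "S": "CG", "W": "AT", "K": "GT", "M": "AC",
         "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT"}

def expand_degenerate(codon):
    """All concrete codons encoded by a degenerate codon such as NNK."""
    return ["".join(bases) for bases in product(*(IUPAC[b] for b in codon))]

codons = expand_degenerate("NNK")
assert len(codons) == 4 * 4 * 2     # N = 4 bases, N = 4, K = 2 -> 32 codons
```

Library-design tools like the one described search over such degenerate codons to cover a target set of variants while keeping the encoded library small.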
Quantum cryptography for secure free-space communications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, R.J.; Buttler, W.T.; Kwiat, P.G.
1999-03-01
The secure distribution of the secret random bit sequences known as key material is an essential precursor to their use for the encryption and decryption of confidential communications. Quantum cryptography is a new technique for secure key distribution with single-photon transmissions: Heisenberg's uncertainty principle ensures that an adversary can neither successfully tap the key transmissions nor evade detection (eavesdropping raises the key error rate above a threshold value). The authors have developed experimental quantum cryptography systems based on the transmission of non-orthogonal photon polarization states to generate shared key material over line-of-sight optical links. Key material is built up using the transmission of a single photon per bit of an initial secret random sequence. A quantum-mechanically random subset of this sequence is identified, becoming the key material after a data reconciliation stage with the sender. The authors have developed and tested a free-space quantum key distribution (QKD) system over an outdoor optical path of approximately 1 km at Los Alamos National Laboratory under nighttime conditions. Results show that free-space QKD can provide secure real-time key distribution between parties who have a need to communicate secretly. Finally, they examine the feasibility of surface-to-satellite QKD.
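The "quantum-mechanically random subset" arises because sender and receiver keep only the bits for which their independently, randomly chosen measurement bases agree. A toy classical simulation of that sifting step (illustrative only; it models none of the optics, error estimation, or reconciliation):

```python
import random

def sift_key(n, rng):
    """Toy BB84-style sifting: keep only the bits where sender and receiver
    happened to choose the same measurement basis (about half of them)."""
    bits = [rng.randrange(2) for _ in range(n)]          # sender's raw key bits
    send_basis = [rng.randrange(2) for _ in range(n)]    # sender's basis choices
    recv_basis = [rng.randrange(2) for _ in range(n)]    # receiver's basis choices
    return [b for b, s, r in zip(bits, send_basis, recv_basis) if s == r]

rng = random.Random(7)
key = sift_key(10000, rng)
# Roughly half the transmitted bits survive sifting.
```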
ERIC Educational Resources Information Center
Koo, Helen P.; Rose, Allison; El-Khorazaty, M. Nabil; Yao, Qing; Jenkins, Renee R.; Anderson, Karen M.; Davis, Maurice; Walker, Leslie R.
2011-01-01
US adolescents initiate sex at increasingly younger ages, yet few pregnancy prevention interventions for children as young as 10-12 years old have been evaluated. Sixteen Washington, DC schools were randomly assigned to intervention versus control conditions. Beginning in 2001/02 with fifth-grade students and continuing during the sixth grade,…
GASPRNG: GPU accelerated scalable parallel random number generator library
NASA Astrophysics Data System (ADS)
Gao, Shuang; Peterson, Gregory D.
2013-04-01
Graphics processors represent a promising technology for accelerating computational science applications. Many computational science applications require fast and scalable random number generation with good statistical properties, so they use the Scalable Parallel Random Number Generators library (SPRNG). We present the GPU Accelerated SPRNG library (GASPRNG) to accelerate SPRNG in GPU-based high performance computing systems. GASPRNG includes code for a host CPU and CUDA code for execution on NVIDIA graphics processing units (GPUs) along with a programming interface to support various usage models for pseudorandom numbers and computational science applications executing on the CPU, GPU, or both. This paper describes the implementation approach used to produce high performance and also describes how to use the programming interface. The programming interface allows a user to be able to use GASPRNG the same way as SPRNG on traditional serial or parallel computers as well as to develop tightly coupled programs executing primarily on the GPU. We also describe how to install GASPRNG and use it. To help illustrate linking with GASPRNG, various demonstration codes are included for the different usage models. GASPRNG on a single GPU shows up to 280x speedup over SPRNG on a single CPU core and is able to scale for larger systems in the same manner as SPRNG. Because GASPRNG generates identical streams of pseudorandom numbers as SPRNG, users can be confident about the quality of GASPRNG for scalable computational science applications. Catalogue identifier: AEOI_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEOI_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: UTK license. No. of lines in distributed program, including test data, etc.: 167900 No. of bytes in distributed program, including test data, etc.: 1422058 Distribution format: tar.gz Programming language: C and CUDA. 
Computer: Any PC or workstation with NVIDIA GPU (Tested on Fermi GTX480, Tesla C1060, Tesla M2070). Operating system: Linux with CUDA version 4.0 or later. Should also run on MacOS, Windows, or UNIX. Has the code been vectorized or parallelized?: Yes. Parallelized using MPI directives. RAM: 512 MB-732 MB (main memory on host CPU, depending on the data type of random numbers) / 512 MB (GPU global memory) Classification: 4.13, 6.5. Nature of problem: Many computational science applications are able to consume large numbers of random numbers. For example, Monte Carlo simulations can consume limitless random numbers as long as computing resources are available. Moreover, parallel computational science applications require independent streams of random numbers to attain statistically significant results. The SPRNG library provides this capability, but at a significant computational cost. The GASPRNG library presented here accelerates the generators of independent streams of random numbers using graphics processing units (GPUs). Solution method: Multiple copies of random number generators in GPUs allow a computational science application to consume large numbers of random numbers from independent, parallel streams. GASPRNG is a random number generator library that allows a computational science application to employ multiple copies of random number generators to boost performance. Users can interface GASPRNG with software code executing on microprocessors and/or GPUs. Running time: The tests provided take a few minutes to run.
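The core idea of independent, reproducible parallel streams can be sketched with the standard library alone; this illustrates per-stream seeding only, and is not the SPRNG/GASPRNG API:

```python
import random

def make_streams(n_streams, base_seed):
    """One independently seeded generator per stream; a fixed base seed makes
    every run reproduce the same set of streams (sketch of the independent
    parallel-streams idea, not the actual SPRNG/GASPRNG interface)."""
    return [random.Random(base_seed * 2**32 + i) for i in range(n_streams)]

streams = make_streams(4, base_seed=2013)
draws = [s.random() for s in streams]

# Re-creating the streams reproduces the same draws, stream by stream --
# the property that lets users verify GPU output against a CPU reference.
replay = [s.random() for s in make_streams(4, base_seed=2013)]
assert replay == draws
```

Production libraries use parameterized generators rather than ad hoc seed offsets to give stronger guarantees of stream independence; the sketch only shows the usage model.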
Sugavanam, S; Yan, Z; Kamynin, V; Kurkov, A S; Zhang, L; Churkin, D V
2014-02-10
Multiwavelength lasing in the random distributed feedback fiber laser is demonstrated by employing an all fiber Lyot filter. Stable multiwavelength generation is obtained, with each line exhibiting sub-nanometer line-widths. A flat power distribution over multiple lines is obtained, which indicates that the power between lines is redistributed in nonlinear mixing processes. The multiwavelength generation is observed both in first and second Stokes waves.
Fine-scale population structure and the era of next-generation sequencing.
Henn, Brenna M; Gravel, Simon; Moreno-Estrada, Andres; Acevedo-Acevedo, Suehelay; Bustamante, Carlos D
2010-10-15
Fine-scale population structure characterizes most continents and is especially pronounced in non-cosmopolitan populations. Roughly half of the world's population remains non-cosmopolitan and even populations within cities often assort along ethnic and linguistic categories. Barriers to random mating can be ecologically extreme, such as the Sahara Desert, or cultural, such as the Indian caste system. In either case, subpopulations accumulate genetic differences if the barrier is maintained over multiple generations. Genome-wide polymorphism data, initially with only a few hundred autosomal microsatellites, have clearly established differences in allele frequency not only among continental regions, but also within continents and within countries. We review recent evidence from the analysis of genome-wide polymorphism data for genetic boundaries delineating human population structure and the main demographic and genomic processes shaping variation, and discuss the implications of population structure for the distribution and discovery of disease-causing genetic variants, in the light of the imminent availability of sequencing data for a multitude of diverse human genomes.
Sensor-Based Optimization Model for Air Quality Improvement in Home IoT
Kim, Jonghyuk; Hwangbo, Hyunwoo
2018-01-01
We introduce current home Internet of Things (IoT) technology and present research on its various forms and applications in real life. In addition, we describe IoT marketing strategies as well as specific modeling techniques for improving air quality, a key home IoT service. To this end, we summarize the latest research on sensor-based home IoT, studies on indoor air quality, and technical studies on random data generation. In addition, we develop an air quality improvement model that can be readily applied to the market by acquiring initial analytical data and building infrastructures using spectrum/density analysis and the natural cubic spline method. Accordingly, we generate related data based on user behavioral values. We integrate the logic into the existing home IoT system to enable users to easily access the system through the Web or mobile applications. We expect that the present introduction of a practical marketing application method will contribute to enhancing the expansion of the home IoT market. PMID:29570684
Automated generation of radical species in crystalline carbohydrate using ab initio MD simulations.
Aalbergsjø, Siv G; Pauwels, Ewald; Van Yperen-De Deyne, Andy; Van Speybroeck, Veronique; Sagstuen, Einar
2014-08-28
As the chemical structures of radiation-damaged molecules may differ greatly from those of their undamaged counterparts, the investigation and description of radiation-damaged structures are commonly biased by the researcher. Radical formation induced by ionizing radiation in crystalline α-l-rhamnose monohydrate has been investigated using a new method in which the selection of radical structures is unbiased by the researcher. The method is based on ab initio molecular dynamics (MD) studies of how ionization damage can form, change and move. Diversity in the radical production is gained by using different points on the potential energy surface of the intact crystal as starting points for the ionizations and letting the initial velocities of the nuclei after ionization be generated randomly. 160 ab initio MD runs produced 12 unique radical structures for investigation. Of these, 7 of the potential products have never previously been discussed, and 3 products match radicals previously observed by electron magnetic resonance experiments.
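Random initial nuclear velocities are commonly drawn component-wise from a Maxwell-Boltzmann (Gaussian) distribution; a minimal sketch under that standard assumption (the abstract does not specify the exact sampling recipe used):

```python
import math
import random

def initial_velocities(masses, temperature, rng, k_b=1.380649e-23):
    """Randomly generated initial velocities: each Cartesian component is
    drawn from a Gaussian with sigma = sqrt(k_B * T / m) (Maxwell-Boltzmann)."""
    velocities = []
    for m in masses:
        sigma = math.sqrt(k_b * temperature / m)
        velocities.append(tuple(rng.gauss(0.0, sigma) for _ in range(3)))
    return velocities

rng = random.Random(0)
masses = [1.67e-27] * 500               # 500 hydrogen-mass nuclei, in kg
v = initial_velocities(masses, temperature=300.0, rng=rng)
mean_sq_speed = sum(vx*vx + vy*vy + vz*vz for vx, vy, vz in v) / len(v)
# Equipartition: the mean squared speed should be close to 3 * k_B * T / m.
```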
Role of rasagiline in treating Parkinson's disease: Effect on disease progression.
Malaty, Irene A; Fernandez, Hubert H
2009-08-01
Rasagiline is a second-generation, selective, irreversible monoamine oxidase type B (MAO-B) inhibitor. It has demonstrated efficacy as monotherapy for early Parkinson's disease (PD) patients in one large randomized, placebo-controlled trial (TVP-1012 in Early Monotherapy for Parkinson's Disease Outpatients), and has shown the ability to reduce off time in more advanced PD patients with motor fluctuations in two large placebo-controlled trials (Parkinson's Rasagiline: Efficacy and Safety in the Treatment of "Off", and Lasting Effect in Adjunct Therapy With Rasagiline Given Once Daily). Preclinical data abound to suggest this compound's potential for neuroprotection against a variety of neurotoxic insults in cell cultures and in animals. The lack of amphetamine metabolites provides an advantage over the first-generation MAO-B inhibitor selegiline. One large trial has investigated the potential for disease modification in PD patients (Attenuation of Disease progression with Azilect Given Once-daily), and preliminary results suggest some possible advantage to earlier initiation of the 1 mg/day dose. The clinical significance of the difference detected remains a consideration.
Dynamical models for the formation of elephant trunks in HII regions
NASA Astrophysics Data System (ADS)
Mackey, Jonathan; Lim, Andrew J.
2010-04-01
The formation of pillars of dense gas at the boundaries of HII regions is investigated with hydrodynamical numerical simulations including ionizing radiation from a point source. We show that shadowing of ionizing radiation by an inhomogeneous density field is capable of forming so-called elephant trunks (pillars of dense gas as in e.g. M16) without the assistance of self-gravity or of ionization front and cooling instabilities. A large simulation of a density field containing randomly generated clumps of gas is shown to naturally generate elephant trunks with certain clump configurations. These configurations are simulated in isolation and analysed in detail to show the formation mechanism and determine possible observational signatures. Pillars formed by the shadowing mechanism are shown to have rather different velocity profiles depending on the initial gas configuration, but asymmetries mean that the profiles also vary significantly with perspective, limiting their ability to discriminate between formation scenarios. Neutral and molecular gas cooling are shown to have a strong effect on these results.
Recording and reading of information on optical disks
NASA Astrophysics Data System (ADS)
Bouwhuis, G.; Braat, J. J. M.
In the storage of information, related to video programs, in a spiral track on a disk, difficulties arise because the bandwidth for video is much greater than for audio signals. An attractive solution was found in optical storage. The optical noncontact method is free of wear, and allows for fast random access. Initial problems regarding a suitable light source could be overcome with the aid of appropriate laser devices. The basic concepts of optical storage on disks are treated insofar as they are relevant for the optical arrangement. A general description is provided of a video, a digital audio, and a data storage system. Scanning spot microscopy for recording and reading of optical disks is discussed, giving attention to recording of the signal, the readout of optical disks, the readout of digitally encoded signals, and cross talk. Tracking systems are also considered, taking into account the generation of error signals for radial tracking and the generation of focus error signals.
PLNoise: a package for exact numerical simulation of power-law noises
NASA Astrophysics Data System (ADS)
Milotti, Edoardo
2006-08-01
Many simulations of stochastic processes require colored noises: here I describe a small program library that generates samples with a tunable power-law spectral density; the algorithm can be modified to generate more general colored noises, and is exact for all time steps, even when they are unevenly spaced (as may often happen in the case of astronomical data, see e.g. [N.R. Lomb, Astrophys. Space Sci. 39 (1976) 447]). The method is exact in the sense that it reproduces a process that is theoretically guaranteed to produce a range-limited power-law spectrum 1/f^β with -1 < β ⩽ 1. The algorithm has a well-behaved computational complexity, it produces a nearly perfect Gaussian noise, and its computational efficiency depends on the required degree of noise Gaussianity. Program summary Title of program: PLNoise Catalogue identifier: ADXV_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXV_v1_0.html Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Licensing provisions: none Programming language used: ANSI C Computer: Any computer with an ANSI C compiler: the package has been tested with gcc version 3.2.3 on Red Hat Linux 3.2.3-52 and gcc version 4.0.0 and 4.0.1 on Apple Mac OS X-10.4 Operating system: All operating systems capable of running an ANSI C compiler No. of lines in distributed program, including test data, etc.: 6238 No. of bytes in distributed program, including test data, etc.: 52 387 Distribution format: tar.gz RAM: The code of the test program is very compact (about 50 Kbytes), but the program works with list management and allocates memory dynamically; in a typical run (like the one discussed in Section 4 in the long write-up) with average list length 2·10, the RAM taken by the list is 200 Kbytes. External routines: The package needs external routines to generate uniform and exponential deviates. The implementation described here uses the random number generation library ranlib freely available from Netlib [B.W. 
Brown, J. Lovato, K. Russell, ranlib, available from Netlib, http://www.netlib.org/random/index.html, select the C version ranlib.c], but it has also been successfully tested with the random number routines in Numerical Recipes [W.H. Press, S.A. Teulkolsky, W.T. Vetterling, B.P. Flannery, Numerical Recipes in C: The Art of Scientific Computing, second ed., Cambridge Univ. Press, Cambridge, 1992, pp. 274-290]. Notice that ranlib requires a pair of routines from the linear algebra package LINPACK, and that the distribution of ranlib includes the C source of these routines, in case LINPACK is not installed on the target machine. Nature of problem: Exact generation of different types of Gaussian colored noise. Solution method: Random superposition of relaxation processes [E. Milotti, Phys. Rev. E 72 (2005) 056701]. Unusual features: The algorithm is theoretically guaranteed to be exact, and unlike all other existing generators it can generate samples with uneven spacing. Additional comments: The program requires an initialization step; for some parameter sets this may become rather heavy. Running time: Running time varies widely with different input parameters, however in a test run like the one in Section 4 in this work, the generation routine took on average about 7 ms for each sample.
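The solution method named in the summary, random superposition of relaxation processes, can be sketched as follows. This is a minimal illustration assuming Poisson-distributed impulses with log-uniform relaxation rates (a choice that yields roughly a 1/f spectrum between the two cutoff rates), not the actual PLNoise implementation; all parameter names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def colored_noise_samples(t_samples, rate=50.0, lam_min=0.1, lam_max=100.0):
    """Colored noise as a random superposition of relaxation processes:
    Poisson-distributed impulses, each decaying exponentially with a
    random rate; sample times may be unevenly spaced."""
    t_end = t_samples[-1]
    n_events = rng.poisson(rate * t_end)
    t_ev = np.sort(rng.uniform(0.0, t_end, n_events))   # impulse arrival times
    amp = rng.normal(0.0, 1.0, n_events)                # Gaussian impulse amplitudes
    # log-uniform relaxation rates between the two spectral cutoffs
    lam = np.exp(rng.uniform(np.log(lam_min), np.log(lam_max), n_events))
    x = np.empty_like(t_samples)
    for i, t in enumerate(t_samples):
        past = t_ev <= t  # each sample sums all impulses that already arrived
        x[i] = np.sum(amp[past] * np.exp(-lam[past] * (t - t_ev[past])))
    return x

# unevenly spaced sample times, as for astronomical data
t = np.sort(rng.uniform(0.0, 10.0, 500))
x = colored_noise_samples(t)
```

Because each sample is evaluated exactly from the analytic decay of past impulses, no interpolation onto a regular grid is needed, which is what makes uneven spacing unproblematic.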
NASA Astrophysics Data System (ADS)
Ilina, Olga; Bakker, Gert-Jan; Vasaturo, Angela; Hoffman, Robert M.; Friedl, Peter
2011-02-01
Cancer invasion into an extracellular matrix (ECM) results from a biophysical reciprocal interplay between the expanding cancer lesion and tissue barriers imposed by the adjacent microenvironment. In vivo, connective tissue provides both densely packed ECM barriers adjacent to channel/track-like spaces and loosely organized zones, both of which may impact cancer invasion mode and efficiency; however, little is known about how three-dimensional (3D) spaces and aligned tracks present in interstitial tissue guide cell invasion. Here we describe a two-photon laser ablation procedure to generate 3D microtracks in dense 3D collagen matrices that support and guide collective cancer cell invasion. Whereas collective invasion of mammary tumor (MMT) breast cancer cells into randomly organized collagen networks required matrix metalloproteinase (MMP) activity for cell-derived collagen breakdown, re-alignment and track generation, preformed tracks supported MMP-independent collective invasion down to a track caliber of 3 µm. Besides contact guidance along the track of least resistance and initial cell deformation (squeezing), MMP-independent collective cell strands led to secondary track expansion by a pushing mechanism. Thus, two-photon laser ablation is useful for generating barrier-free microtracks in a 3D ECM which guide collective invasion independently of pericellular proteolysis.
NASA Technical Reports Server (NTRS)
Weinberg, David H.; Gott, J. Richard, III; Melott, Adrian L.
1987-01-01
Many models for the formation of galaxies and large-scale structure assume a spectrum of random phase (Gaussian), small-amplitude density fluctuations as initial conditions. In such scenarios, the topology of the galaxy distribution on large scales relates directly to the topology of the initial density fluctuations. Here a quantitative measure of topology - the genus of contours in a smoothed density distribution - is described and applied to numerical simulations of galaxy clustering, to a variety of three-dimensional toy models, and to a volume-limited sample of the CfA redshift survey. For random phase distributions the genus of density contours exhibits a universal dependence on threshold density. The clustering simulations show that a smoothing length of 2-3 times the mass correlation length is sufficient to recover the topology of the initial fluctuations from the evolved galaxy distribution. Cold dark matter and white noise models retain a random phase topology at shorter smoothing lengths, but massive neutrino models develop a cellular topology.
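For reference, the universal dependence on threshold density referred to here is the standard Gaussian random-field result. With the threshold ν expressed in units of the standard deviation of the smoothed density field, the genus per unit volume takes the form

```latex
% Genus per unit volume of density-contour surfaces for a Gaussian
% (random-phase) field, at threshold \nu standard deviations:
g(\nu) = N\,(1-\nu^{2})\,e^{-\nu^{2}/2},
\qquad
N = \frac{1}{4\pi^{2}}\left(\frac{\langle k^{2}\rangle}{3}\right)^{3/2}
```

where the amplitude N is set by the power spectrum of the smoothed field. The shape is universal: g > 0 (sponge-like topology) near the median density, and g < 0 (isolated clusters or voids) at extreme thresholds.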
The potential of shifting recombination hotspots to increase genetic gain in livestock breeding.
Gonen, Serap; Battagin, Mara; Johnston, Susan E; Gorjanc, Gregor; Hickey, John M
2017-07-04
This study uses simulation to explore and quantify the potential effect of shifting recombination hotspots on genetic gain in livestock breeding programs. We simulated three scenarios that differed in the locations of quantitative trait nucleotides (QTN) and recombination hotspots in the genome. In scenario 1, QTN were randomly distributed along the chromosomes and recombination was restricted to occur within specific genomic regions (i.e. recombination hotspots). In the other two scenarios, both QTN and recombination hotspots were located in specific regions, but the scenarios differed in whether the QTN occurred outside of (scenario 2) or inside (scenario 3) recombination hotspots. We split each chromosome into 250, 500 or 1000 regions, of which 10% were recombination hotspots and/or contained QTN. The breeding program was run for 21 generations of selection, after which recombination hotspot regions were kept the same or were shifted to adjacent regions for a further 80 generations of selection. We evaluated the effect of shifting recombination hotspots on genetic gain, genetic variance and genic variance. Our results show that shifting recombination hotspots reduced the decline of genetic and genic variance by releasing standing allelic variation in the form of new allele combinations. This in turn resulted in larger increases in genetic gain. However, the benefit of shifting recombination hotspots for increased genetic gain was only observed when QTN were initially outside recombination hotspots. If QTN were initially inside recombination hotspots, then shifting them decreased genetic gain. Shifting recombination hotspots to regions of the genome where recombination had not occurred for 21 generations of selection (i.e. recombination deserts) released more of the standing allelic variation available in each generation and thus increased genetic gain.
However, whether and how much increase in genetic gain was achieved by shifting recombination hotspots depended on the distribution of QTN in the genome, the number of recombination hotspots and whether QTN were initially inside or outside recombination hotspots. Our findings show future scope for targeted modification of recombination hotspots e.g. through changes in zinc-finger motifs of the PRDM9 protein to increase genetic gain in production species.
Subjective randomness as statistical inference.
Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B
2018-06-01
Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.
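The core idea, randomness as the evidence a sequence provides for a random versus a regular generating process, can be illustrated with a toy log-likelihood ratio. The "regular" model below (a repetition-biased Markov chain) and its parameter are illustrative stand-ins, not the models developed in the paper.

```python
import math

def subjective_randomness(seq, p_repeat=0.8):
    """Log-likelihood ratio of a fair-coin model against a simple
    'regular' model: a Markov chain that prefers repeating the
    previous symbol with probability p_repeat (illustrative choice)."""
    log_p_random = len(seq) * math.log(0.5)
    log_p_regular = math.log(0.5)  # first symbol is uninformative
    for a, b in zip(seq, seq[1:]):
        log_p_regular += math.log(p_repeat if a == b else 1 - p_repeat)
    # higher value => more evidence for the random generating process
    return log_p_random - log_p_regular

print(subjective_randomness("HHHHHHHH"))  # negative: looks non-random
print(subjective_randomness("HTTHTHHT"))  # positive: looks more random
```

On this measure, eight heads in a row scores far below a mixed sequence of the same length, matching the intuition the abstract opens with.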
Single-shot stand-off chemical identification of powders using random Raman lasing
Hokr, Brett H.; Bixler, Joel N.; Noojin, Gary D.; Thomas, Robert J.; Rockwell, Benjamin A.; Yakovlev, Vladislav V.; Scully, Marlan O.
2014-01-01
We consider the task of identifying explosives, hazardous chemicals, and biological materials from a safe distance. Much of the prior work on stand-off spectroscopy using light has been devoted to generating a backward-propagating beam of light that can be used to drive further spectroscopic processes. The discovery of random lasing and, more recently, random Raman lasing provides a mechanism for remotely generating copious amounts of chemically specific Raman-scattered light. The bright nature of random Raman lasing renders directionality unnecessary, allowing for the detection and identification of chemicals from large distances in real time. In this article, the single-shot remote identification of chemicals at kilometer-scale distances is experimentally demonstrated using random Raman lasing. PMID:25114231
Optimized random phase only holograms.
Zea, Alejandro Velez; Barrera Ramirez, John Fredy; Torroba, Roberto
2018-02-15
We propose a simple and efficient technique capable of generating Fourier phase only holograms with a reconstruction quality similar to the results obtained with the Gerchberg-Saxton (G-S) algorithm. Our proposal is to use the traditional G-S algorithm to optimize a random phase pattern for the resolution, pixel size, and target size of the general optical system without any specific amplitude data. This produces an optimized random phase (ORAP), which is used for fast generation of phase only holograms of arbitrary amplitude targets. This ORAP needs to be generated only once for a given optical system, avoiding the need for costly iterative algorithms for each new target. We show numerical and experimental results confirming the validity of the proposal.
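As background, the classic G-S loop that the proposal builds on can be sketched as follows. This is a generic textbook version (target pattern, array sizes and iteration count are arbitrary), not the ORAP procedure itself, which runs such an optimization once per optical system on a random phase pattern rather than once per target.

```python
import numpy as np

def gerchberg_saxton(target_amp, n_iter=50, seed=0):
    """Fourier phase-only hologram via Gerchberg-Saxton: iterate between
    the hologram plane (unit amplitude, free phase) and the
    reconstruction plane (target amplitude, free phase)."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, target_amp.shape)  # random initial phase
    for _ in range(n_iter):
        field = np.fft.fft2(np.exp(1j * phase))            # to reconstruction plane
        field = target_amp * np.exp(1j * np.angle(field))  # impose target amplitude
        back = np.fft.ifft2(field)                         # back to hologram plane
        phase = np.angle(back)                             # keep phase only
    return phase  # the phase-only hologram

target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0                  # illustrative square target
holo = gerchberg_saxton(target)
recon = np.abs(np.fft.fft2(np.exp(1j * holo)))
```

The ORAP idea replaces the per-target loop above with a single precomputed optimized random phase, trading a small amount of reconstruction quality for the elimination of the iterative cost at display time.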
Ko, Heasin; Choi, Byung-Seok; Choe, Joong-Seon; Kim, Kap-Joong; Kim, Jong-Hoi; Youn, Chun Ju
2017-08-21
Most polarization-based BB84 quantum key distribution (QKD) systems utilize multiple lasers to generate one of four polarization quantum states randomly. However, random bit generation with multiple lasers can potentially open critical side channels that significantly endanger the security of QKD systems. In this paper, we show unnoticed side channels of temporal disparity and intensity fluctuation, which possibly exist in the operation of multiple semiconductor laser diodes. Experimental results show that these side channels can severely degrade the security performance of QKD systems. An important system issue for the improvement of the quantum bit error rate (QBER), related to the laser driving conditions, is further addressed with experimental results.
Rebello, Candida J; Chu, Yi-Fang; Johnson, William D; Martin, Corby K; Han, Hongmei; Bordenave, Nicolas; Shi, Yuhui; O'Shea, Marianne; Greenway, Frank L
2014-05-28
Foods that enhance satiety can help consumers to resist environmental cues to eat, and improve the nutritional quality of their diets. Viscosity generated by oat β-glucan influences gastrointestinal mechanisms that mediate satiety. Differences in the source, processing treatments, and interactions with other constituents in the food matrix affect the amount, solubility, molecular weight, and structure of the β-glucan in products, which in turn influences the viscosity. This study examined the effect of two types of oatmeal and an oat-based ready-to-eat breakfast cereal (RTEC) on appetite, and assessed differences in meal viscosity and β-glucan characteristics among the cereals. Forty-eight individuals were enrolled in a randomized crossover trial. Subjects consumed isocaloric breakfast meals containing instant oatmeal (IO), old-fashioned oatmeal (SO) or RTEC in random order at least a week apart. Each breakfast meal contained 218 kcal (150 kcal from cereal and 68 kcal from milk). Visual analogue scales measuring appetite were completed before breakfast and over four hours following the meal. Starch digestion kinetics, meal viscosities, and β-glucan characteristics for each meal were determined. Appetite responses were analyzed by area under the curve. Mixed models were used to analyze response changes over time. IO increased fullness (p = 0.04), suppressed desire to eat (p = 0.01) and reduced prospective intake (p < 0.01) more than the RTEC over four hours, and consistently at the 60-minute time point. SO reduced prospective intake (p = 0.04) more than the RTEC. Hunger scores were not significantly different except that IO reduced hunger more than the RTEC at the 60-minute time point. IO and SO had higher β-glucan content, molecular weight, gastric viscosity, and larger hydration spheres than the RTEC, and IO had greater viscosity after oral and initial gastric digestion (initial viscosity) than the RTEC.
IO and SO improved appetite control over four hours compared to RTEC. Initial viscosity of oatmeal may be especially important for reducing appetite.
Denno, Donna M; Hoopes, Andrea J; Chandra-Mouli, Venkatraman
2015-01-01
Access to youth friendly health services is vital for ensuring sexual and reproductive health (SRH) and well-being of adolescents. This study is a descriptive review of the effectiveness of initiatives to improve adolescent access to and utilization of sexual and reproductive health services (SRHS) in low- and middle-income countries. We examined four SRHS intervention types: (1) facility based, (2) out-of-facility based, (3) interventions to reach marginalized or vulnerable populations, (4) interventions to generate demand and/or community acceptance. Outcomes assessed across the four questions included uptake of SRHS or sexual and reproductive health commodities and sexual and reproductive health biologic outcomes. There is limited evidence to support the effectiveness of initiatives that simply provide adolescent friendliness training for health workers. Data are most ample (10 initiatives demonstrating weak but positive effects and one randomized controlled trial demonstrating strong positive results on some outcome measures) for approaches that use a combination of health worker training, adolescent-friendly facility improvements, and broad information dissemination via the community, schools, and mass media. We found a paucity of evidence on out-of-facility-based strategies, except for those delivered through mixed-use youth centers that demonstrated that SRHS in these centers are neither well used nor effective at improving SRH outcomes. There was an absence of studies or evaluations examining outcomes among vulnerable or marginalized adolescents. Findings from 17 of 21 initiatives assessing demand-generation activities demonstrated at least some association with adolescent SRHS use. Of 15 studies on parental and other community gatekeepers' approval of SRHS for adolescents, which assessed SRHS/commodity uptake and/or biologic outcomes, 11 showed positive results. 
Packages of interventions that train health workers, improve facility adolescent friendliness, and endeavor to generate demand through multiple channels are ready for large-scale implementation. However, further evaluation of these initiatives is needed to clarify mechanisms and impact, especially of specific program components. Quality research is needed to determine effective means to deliver services outside the facilities, to reach marginalized or vulnerable adolescents, and to determine effective approaches to increase community acceptance of adolescent SRHS. Copyright © 2015 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
RandomSpot: A web-based tool for systematic random sampling of virtual slides.
Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E
2015-01-01
This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool which provides a framework to quickly build an accurate estimate of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light-microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has generated over 21,000 sample sets and has supplied data for multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also have significant value for training image analysis algorithms, using the sample point coordinates and pathologist classifications.
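The core placement rule of SRS, equidistant points with a single random grid offset, which is what makes the resulting sample unbiased, can be sketched as follows. Coordinates, units and parameter names are illustrative, not RandomSpot's actual API.

```python
import random

def srs_points(x0, y0, width, height, spacing, seed=None):
    """Systematic random sampling over a rectangular region of interest:
    a regular grid of points shifted by one uniformly random offset, so
    every location has equal inclusion probability."""
    rng = random.Random(seed)
    ox = rng.uniform(0, spacing)  # a single random offset for the whole grid
    oy = rng.uniform(0, spacing)
    pts = []
    y = y0 + oy
    while y < y0 + height:
        x = x0 + ox
        while x < x0 + width:
            pts.append((x, y))
            x += spacing          # equidistant spacing in both directions
        y += spacing
    return pts

pts = srs_points(0, 0, 1000, 800, 100, seed=1)
```

Each returned point would then be classified by the pathologist, and class proportions estimated from the point counts.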
Direct generation of all-optical random numbers from optical pulse amplitude chaos.
Li, Pu; Wang, Yun-Cai; Wang, An-Bang; Yang, Ling-Zhen; Zhang, Ming-Jiang; Zhang, Jian-Zhong
2012-02-13
We propose and theoretically demonstrate an all-optical method for directly generating all-optical random numbers from the pulse amplitude chaos produced by a mode-locked fiber ring laser. Under an appropriate pump intensity, the mode-locked laser can experience a quasi-periodic route to chaos. Such chaos consists of a stream of pulses with a fixed repetition frequency but random intensities. In this method, we require no sampling procedure or externally triggered clock, but directly quantize the chaotic pulse stream into a random number sequence via an all-optical flip-flop. Moreover, our simulation results show that the pulse amplitude chaos has no periodicity and possesses a highly symmetric amplitude distribution. Thus, in theory, the obtained random number sequence has high-quality randomness without post-processing, as verified by industry-standard statistical tests.
Monte Carlo based NMR simulations of open fractures in porous media
NASA Astrophysics Data System (ADS)
Lukács, Tamás; Balázs, László
2014-05-01
According to the basic principles of nuclear magnetic resonance (NMR), a measurement's free induction decay curve has an exponential characteristic whose parameter is the transverse relaxation time, T2, given by the Bloch equations in the rotating frame. In our simulations we consider the particular case in which the bulk volume is negligible compared to the whole system and vertical movement is essentially zero, so the diffusion term of the T2 relation can be dropped. Such small-aperture situations are common in sedimentary layers, and the smallness of the observed volume allows us to work with just the bulk relaxation and the surface relaxation. The simulation uses the Monte Carlo method: it is based on a random-walk generator which produces the Brownian motion of the particles from uniformly distributed pseudorandom numbers. An attached differential equation accounts for the bulk relaxation, and the initial and iterated conditions guarantee the simulation's replicability and yield consistent estimates. We generate an initial geometry of a plane segment with known height and a given number of particles; the spatial distribution is set equal in each simulation, and the surface-to-volume ratio remains constant. It follows that, for a given thickness of the open fracture, the surface relaxivity is determinable from the fitted curve's parameter. The calculated T2 distribution curves also indicate the inconstancy in the observed fracture situations. Varying the height of the lamina at a constant diffusion coefficient also produces a characteristic anomaly; for comparison we have run the simulation with the same initial volume, number of particles and conditions in spherical bulks, whose profiles are clear and easy to understand.
The surface relaxation enables us to estimate the interaction between the boundary materials and these two geometrically well-defined bulks; the resulting distribution therefore serves as a basis for estimating porosity and can be used to identify fine-grained porous media.
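A minimal version of such a simulation can be sketched as follows, assuming a planar fracture bounded by two parallel walls, a fixed-step 1D random walk across the aperture, bulk decay with time constant T2b, and a kill probability on wall contact standing in for the surface relaxivity ρ. All parameter values and the form of the kill probability are illustrative, not those used in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def nmr_t2_decay(n_walkers=2000, n_steps=400, aperture=10e-6,
                 D=2.5e-9, dt=1e-4, rho=10e-6, t2_bulk=2.0):
    """Monte Carlo sketch of T2 decay in a planar open fracture:
    random walkers between two walls, bulk relaxation every step,
    and surface relaxation modeled as a kill probability on contact."""
    step = np.sqrt(2 * D * dt)               # rms step of the 1D walk
    z = rng.uniform(0, aperture, n_walkers)  # positions across the aperture
    alive = np.ones(n_walkers, bool)
    p_kill = rho * step / D                  # surface-relaxation probability (sketch)
    signal = []
    for k in range(n_steps):
        z[alive] += rng.choice([-step, step], alive.sum())
        hit = alive & ((z < 0) | (z > aperture))
        killed = hit & (rng.random(n_walkers) < p_kill)
        alive &= ~killed
        z = np.clip(z, 0, aperture)          # put survivors back at the wall
        # magnetization: surviving fraction times bulk exponential decay
        signal.append(alive.mean() * np.exp(-(k + 1) * dt / t2_bulk))
    return np.array(signal)

m = nmr_t2_decay()
```

Fitting an exponential to the returned curve and repeating the run for several apertures is the kind of procedure from which, at fixed surface-to-volume ratio, the surface relaxivity can be backed out.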
Dignity Therapy and Life Review for Palliative Care Patients: A Randomized Controlled Trial.
Vuksanovic, Dean; Green, Heather J; Dyck, Murray; Morrissey, Shirley A
2017-02-01
Dignity therapy (DT) is a psychotherapeutic intervention with increasing evidence of acceptability and utility in palliative care settings. The aim of this study was to evaluate the legacy creation component of DT by comparing this intervention with life review (LR) and waitlist control (WC) groups. Seventy adults with advanced terminal disease were randomly allocated to DT, LR, or WC followed by DT, of which 56 completed the study protocol. LR followed an identical protocol to DT except that no legacy document was created in LR. Primary outcome measures were the Brief Generativity and Ego-Integrity Questionnaire, Patient Dignity Inventory, Functional Assessment of Cancer Therapy-General, version 4, and treatment evaluation questionnaires. Unlike LR and WC groups, DT recipients demonstrated significantly increased generativity and ego-integrity scores at study completion. There were no significant changes for dignity-related distress or physical, social, emotional, and functional well-being among the three groups. There were also no significant changes in primary outcomes after the provision of DT after the waiting period in the WC group. High acceptability and satisfaction with interventions were noted for recipients of both DT and LR and family/carers of DT participants. This study provides initial evidence that the specific process of legacy creation is able to positively affect sense of generativity, meaning, and acceptance near end of life. High acceptability and satisfaction rates for both DT and LR and positive impacts on families/carers of DT participants provide additional support for clinical utility of these interventions. Further evaluation of specific mechanisms of change post-intervention is required given DT's uncertain efficacy on other primary outcomes. Copyright © 2016 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
Paparella, Domenico; Parolari, Alessandro; Rotunno, Crescenzia; Vincent, Jessica; Myasoedova, Veronica; Guida, Pietro; De Palo, Micaela; Margari, Vito; Devereaux, Philip J; Lamy, Andre; Alamanni, Francesco; Yusuf, Salim; Whitlock, Richard
2017-01-01
Cardiopulmonary bypass (CPB) surgery, despite heparin administration, elicits activation of the coagulation system, resulting in coagulopathy. Anti-inflammatory effects of steroid treatment have been demonstrated, but its effects on the coagulation system are unknown. The primary objective of this study is to assess the effects of methylprednisolone on coagulation function by evaluating thrombin generation, fibrinolysis, and platelet activation in high-risk patients undergoing cardiac surgery with CPB. The Steroids In caRdiac Surgery study is a double-blind, randomized, controlled trial performed on 7507 patients worldwide who were randomized to receive either intravenous methylprednisolone, 250 mg at anesthetic induction and 250 mg at initiation of CPB (n = 3755), or placebo (n = 3752). A substudy was conducted in 2 sites to collect blood samples perioperatively to measure prothrombin fragment 1.2 (PF1+2, thrombin generation), plasmin-antiplasmin complex (PAP, fibrinolysis), platelet factor 4 (PF4, platelet activation), and fibrinogen. Eighty-one patients were enrolled in the substudy (37 in the placebo vs 44 in the treatment group). No difference in clinical outcome was detected, including postoperative bleeding and need for blood product transfusion. All patients showed changes of all plasma biomarkers, with values greater than baseline in both groups. This reaction was significantly attenuated in the treatment group for PF1+2 (P = 0.040) and PAP (P = 0.042) values at the first intraoperative measurement. No difference between groups was detected for PF4. Methylprednisolone treatment attenuates activation of the coagulation system in high-risk patients undergoing CPB surgery. Reduction of thrombin generation and fibrinolysis activation may lead to reduced blood loss after surgery. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Thomas, R. N.; Ebigbo, A.; Paluszny, A.; Zimmerman, R. W.
2016-12-01
The macroscopic permeability of 3D anisotropic, geomechanically generated fractured rock masses is investigated. The explicitly computed permeabilities are compared to the predictions of classical inclusion-based effective medium theories, and to the permeability of networks of randomly oriented, stochastically generated fractures. Stochastically generated fracture networks lack features that arise from fracture interaction, such as non-planarity and termination of fractures upon intersection; recent discrete fracture network studies include heuristic rules that introduce these features to some extent. In this work, fractures grow and extend under tension from a finite set of initial flaws. The finite element method is used to compute displacements, and modal stress intensity factors are computed around each fracture tip using the interaction integral accumulated over a set of virtual discs. Fracture apertures emerge as a result of simulations that honour the constraints of stress equilibrium and mass conservation. The macroscopic permeabilities are explicitly calculated by solving the local cubic law in the fractures, on an element-by-element basis, coupled to Darcy's law in the matrix. The permeabilities are then compared to the estimates given by the symmetric and asymmetric versions of the self-consistent approximation, which, for randomly fractured volumes, were previously demonstrated to be the most accurate of the inclusion-based effective medium methods (Ebigbo et al., Transport in Porous Media, 2016). The permeabilities of several dozen geomechanical networks are computed as a function of density and in situ stresses. For anisotropic networks, we find that the asymmetric and symmetric self-consistent methods overestimate the effective permeability in the direction of the dominant fracture set. Effective permeabilities that are more strongly dependent on the connectivity of two or more fracture sets are more accurately captured by the effective medium models.
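For reference, the local cubic law invoked here relates the in-fracture volumetric flux per unit width to the local aperture b (the standard result for laminar flow between parallel plates; μ is the fluid viscosity):

```latex
% Local cubic law for flow in a fracture of aperture b:
\mathbf{q} = -\frac{b^{3}}{12\,\mu}\,\nabla p ,
\qquad
k_f = \frac{b^{2}}{12}
```

so each element's transmissivity follows directly from its emergent aperture, and the b³ dependence is what makes the computed permeability so sensitive to the geomechanically generated aperture field.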
CrowdPhase: crowdsourcing the phase problem
Jorda, Julien; Sawaya, Michael R.; Yeates, Todd O.
2014-01-01
The human mind innately excels at some complex tasks that are difficult to solve using computers alone. For complex problems amenable to parallelization, strategies can be developed to exploit human intelligence in a collective form: such approaches are sometimes referred to as ‘crowdsourcing’. Here, a first attempt at a crowdsourced approach for low-resolution ab initio phasing in macromolecular crystallography is proposed. A collaborative online game named CrowdPhase was designed, which relies on a human-powered genetic algorithm, where players control the selection mechanism during the evolutionary process. The algorithm starts from a population of ‘individuals’, each with a random genetic makeup, in this case a map prepared from a random set of phases, and tries to cause the population to evolve towards individuals with better phases based on Darwinian survival of the fittest. Players apply their pattern-recognition capabilities to evaluate the electron-density maps generated from these sets of phases and to select the fittest individuals. A user-friendly interface, a training stage and a competitive scoring system foster a network of well trained players who can guide the genetic algorithm towards better solutions from generation to generation via gameplay. CrowdPhase was applied to two synthetic low-resolution phasing puzzles and it was shown that players could successfully obtain phase sets in the 30° phase error range and corresponding molecular envelopes showing agreement with the low-resolution models. The successful preliminary studies suggest that with further development the crowdsourcing approach could fill a gap in current crystallographic methods by making it possible to extract meaningful information in cases where limited resolution might otherwise prevent initial phasing. PMID:24914965
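The underlying machinery, a genetic algorithm whose selection step is delegated to an external judge, can be sketched generically. Here a programmatic fitness function stands in for the human players (who, in CrowdPhase, judge electron-density maps), and bit strings stand in for coarse phase sets; all names, operators and parameters are illustrative.

```python
import random

def evolve(population, fitness, n_generations=50, mutation_rate=0.05, seed=0):
    """Generic genetic algorithm: rank by the externally supplied
    selection criterion, keep the fitter half, and refill the
    population with crossover + mutation offspring."""
    rng = random.Random(seed)
    n = len(population)
    for _ in range(n_generations):
        # selection: in CrowdPhase this ranking is done by the players
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: n // 2]
        children = []
        while len(children) < n - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(a))            # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g if rng.random() > mutation_rate else 1 - g
                     for g in child]                   # bit-flip mutation
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

# toy problem: evolve a 20-bit string toward all ones
pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
best = evolve(pop, fitness=lambda ind: sum(ind))
```

The point of the crowdsourced variant is precisely that `fitness` need not be computable: human pattern recognition supplies the ranking that no simple objective function captures at low resolution.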
Rousseau, Alexandra; Robert, Annie; Gerotziafas, Grigoris; Torchin, Dahlia; Zannad, Faiez; Lacut, Karine; Libersa, Christian; Dasque, Eric; Démolis, Jean-Louis; Elalamy, Ismail; Simon, Tabassome
2010-04-01
Oral hormone therapy is associated with an increased risk of venous thrombosis. Drug agencies recommend the use of the lowest efficient dose to treat menopausal symptoms for a better risk/benefit profile, although this profile has not been fully investigated yet. The aim of the study was to compare the effect of the standard dose of 17beta-estradiol to a lower one on thrombin generation (TG). In a 2-month study, healthy menopausal women were randomized to receive daily 1 mg or 2 mg of 17beta-estradiol (E1, n = 24 and E2, n = 26, respectively) with 10 mg dydrogesterone, or placebo (PL, n = 22). Plasma levels of factors VII, X, VIII and II were assessed before and after treatment, as well as tissue factor triggered TG, which allows the investigation of the different phases of the coagulation process. The peak of thrombin was higher in the hormone therapy groups (E1: 42.39 +/- 50.23 nM, E2: 31.08 +/- 85.86 nM vs. 10.52 +/- 40.63 nM in PL, P = 0.002 and P = 0.01). Time to reach the peak was also shortened (PL: 0.26 +/- 0.69 min vs. E1: -0.26 +/- 0.80 min, E2: -0.55 +/- 0.79 min, P < 10(-3) for both comparisons) and the mean rate index of the propagation phase of TG was significantly increased. Among the studied clotting factors, only the levels of FVII were significantly increased after treatment administration. The two doses of 17beta-estradiol induced to a similar degree an acceleration of the initiation and propagation phases of tissue factor triggered thrombin generation and a significant increase of FVII coagulant activity.
Noise-Induced Synchronization among Sub-RF CMOS Analog Oscillators for Skew-Free Clock Distribution
NASA Astrophysics Data System (ADS)
Utagawa, Akira; Asai, Tetsuya; Hirose, Tetsuya; Amemiya, Yoshihito
We present on-chip oscillator arrays synchronized by random noise, aiming at skew-free clock distribution on synchronous digital systems. Nakao et al. recently reported that independent neural oscillators can be synchronized by applying temporal random impulses to the oscillators [1], [2]. We regard neural oscillators as independent clock sources on LSIs; i.e., clock sources are distributed on LSIs and forced to synchronize through the use of random noise. We designed neuron-based clock generators operating in the sub-RF region (<1 GHz) by modifying the original neuron model into a new model suitable for CMOS implementation with 0.25-μm CMOS parameters. Through circuit simulations, we demonstrate that i) the clock generators are indeed synchronized by pseudo-random noise and ii) the clock generators exhibit phase-locked oscillations even when they have small device mismatches.
NASA Astrophysics Data System (ADS)
Nan, Hanqing; Liang, Long; Chen, Guo; Liu, Liyu; Liu, Ruchuan; Jiao, Yang
2018-03-01
Three-dimensional (3D) collective cell migration in a collagen-based extracellular matrix (ECM) is one of the most significant topics in developmental biology, cancer progression, tissue regeneration, and immune response. Recent studies have suggested that collagen-fiber-mediated force transmission in cellularized ECM plays an important role in stress homeostasis and regulation of collective cellular behaviors. Motivated by the recent in vitro observation that oriented collagen can significantly enhance the penetration of migrating breast cancer cells into dense Matrigel, which mimics the intravasation process in vivo [Han et al. Proc. Natl. Acad. Sci. USA 113, 11208 (2016), 10.1073/pnas.1610347113], we devise a procedure for generating realizations of highly heterogeneous 3D collagen networks with prescribed microstructural statistics via stochastic optimization. Specifically, a collagen network is represented via the graph (node-bond) model, and the microstructural statistics considered include the cross-link (node) density, valence distribution, fiber (bond) length distribution, as well as the fiber orientation distribution. An optimization problem is formulated in which the objective function is defined as the squared difference between a set of target microstructural statistics and the corresponding statistics for the simulated network. Simulated annealing is employed to solve the optimization problem by evolving an initial network via random perturbations to generate realizations of homogeneous networks with randomly oriented fibers, homogeneous networks with aligned fibers, heterogeneous networks with a continuous variation of fiber orientation along a prescribed direction, as well as a binary system containing a collagen region with aligned fibers and a dense Matrigel region with randomly oriented fibers.
The generation and propagation of active forces in the simulated networks due to polarized contraction of an embedded ellipsoidal cell and a small group of cells are analyzed by considering a nonlinear fiber model incorporating strain hardening upon large stretching and buckling upon compression. Our analysis shows that oriented fibers can significantly enhance long-range force transmission in the network. Moreover, in the oriented-collagen-Matrigel system, the forces generated by a polarized cell in collagen can penetrate deeply into the Matrigel region. The stressed Matrigel fibers could provide contact guidance for the migrating cells and thus enhance their penetration into Matrigel. This suggests a possible mechanism for the observed enhancement of intravasation by oriented collagen.
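The stochastic optimization procedure described above (simulated annealing toward target microstructural statistics) can be sketched on a toy node-bond network. Here only the valence (cross-link functionality) distribution is matched; the target distribution, cooling schedule and network size are illustrative assumptions, not the paper's parameters:

```python
import math, random

def valence_hist(adj, max_val=6):
    """Fraction of nodes at each valence (cross-link functionality)."""
    h = [0.0] * (max_val + 1)
    for nbrs in adj:
        h[min(len(nbrs), max_val)] += 1.0 / len(adj)
    return h

def energy(adj, target):
    """Squared difference between target and current valence statistics."""
    return sum((a - b) ** 2 for a, b in zip(valence_hist(adj), target))

def rewire(adj):
    """Random perturbation: move one endpoint of a random bond."""
    i = random.randrange(len(adj))
    if not adj[i]:
        return None
    j = random.choice(sorted(adj[i]))
    k = random.randrange(len(adj))
    if k in (i, j) or k in adj[i]:
        return None
    return i, j, k

def anneal(n=200, target=None, steps=20000, t0=0.01, seed=0):
    """Evolve a random network toward a target valence distribution."""
    random.seed(seed)
    # Target: mostly 3- and 4-valent cross-links (assumed, for illustration)
    target = target or [0.0, 0.05, 0.15, 0.50, 0.25, 0.05, 0.0]
    adj = [set() for _ in range(n)]
    for _ in range(int(1.55 * n)):              # random initial bonds
        a, b = random.sample(range(n), 2)
        adj[a].add(b); adj[b].add(a)
    e = energy(adj, target)
    for step in range(steps):
        move = rewire(adj)
        if move is None:
            continue
        i, j, k = move
        adj[i].discard(j); adj[j].discard(i)    # trial move: bond i-j -> i-k
        adj[i].add(k); adj[k].add(i)
        e_new = energy(adj, target)
        temp = t0 * (1 - step / steps) + 1e-9   # linear cooling schedule
        if e_new < e or random.random() < math.exp((e - e_new) / temp):
            e = e_new                           # accept (Metropolis rule)
        else:                                   # reject: undo the move
            adj[i].discard(k); adj[k].discard(i)
            adj[i].add(j); adj[j].add(i)
    return e
```

The full procedure additionally matches fiber length and orientation distributions and works with node coordinates in 3D, but the accept/reject structure is the same.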
Cascaded Raman lasing in a PM phosphosilicate fiber with random distributed feedback
NASA Astrophysics Data System (ADS)
Lobach, Ivan A.; Kablukov, Sergey I.; Babin, Sergey A.
2018-02-01
We report on the first demonstration of a linearly polarized cascaded Raman fiber laser based on a simple half-open cavity with a broadband composite reflector and random distributed feedback in a polarization-maintaining phosphosilicate fiber operating beyond the zero-dispersion wavelength (~1400 nm). With increasing pump power from a Yb-doped fiber laser at 1080 nm, the random laser subsequently generates 8 W at 1262 nm and 9 W at 1515 nm with a polarization extinction ratio of 27 dB. The generation linewidths amount to about 1 nm and 3 nm, respectively, and are almost independent of power, in correspondence with the theory of cascaded random lasing.
Random phase encoding for optical security
NASA Astrophysics Data System (ADS)
Wang, RuiKang K.; Watson, Ian A.; Chatwin, Christopher R.
1996-09-01
A new optical encoding method for security applications is proposed. The encoded image (encrypted into the security products) is merely a random phase image statistically and randomly generated by a random number generator using a computer, which contains no information from the reference pattern (stored for verification) or the frequency plane filter (a phase-only function for decoding). The phase function in the frequency plane is obtained using a modified phase retrieval algorithm. The proposed method uses two phase-only functions (images) at both the input and frequency planes of the optical processor leading to maximum optical efficiency. Computer simulation shows that the proposed method is robust for optical security applications.
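A minimal Gerchberg-Saxton-style sketch of the scheme: a statistically random phase-only input is fixed, and a phase-only frequency-plane filter is retrieved so that the pair reproduces the reference magnitude at the output. The paper's modified phase retrieval algorithm differs in its details; this illustrates only the alternating-projection idea:

```python
import numpy as np

def design_filter(reference, iters=200, seed=0):
    """Retrieve a frequency-plane phase-only filter H such that a random
    phase-only input, filtered by H, reproduces |reference| at the output.
    Alternating projections (a simplified stand-in for the paper's
    modified phase retrieval algorithm)."""
    rng = np.random.default_rng(seed)
    # Phase-only input: statistically random, carries no reference info
    g_in = np.exp(1j * rng.uniform(0, 2 * np.pi, reference.shape))
    G = np.fft.fft2(g_in)                       # fixed input spectrum
    H = np.exp(1j * rng.uniform(0, 2 * np.pi, reference.shape))
    target = np.abs(reference)
    for _ in range(iters):
        out = np.fft.ifft2(G * H)
        # Impose the desired output magnitude, keep the retrieved phase
        out = target * np.exp(1j * np.angle(out))
        # Back-propagate and project onto the phase-only filter constraint
        H = np.exp(1j * np.angle(np.fft.fft2(out) / (G + 1e-12)))
    out = np.abs(np.fft.ifft2(G * H))
    return H, out
```

Because both the input and the filter are phase-only, neither stored element reveals the reference pattern on its own, which is the security property the abstract emphasizes.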
Multi-peak structure of generation spectrum of random distributed feedback fiber Raman lasers.
Vatnik, I D; Zlobina, E A; Kablukov, S I; Babin, S A
2017-02-06
We study spectral features of the generation of a random distributed feedback fiber Raman laser arising from the two-peak shape of the Raman gain spectral profile realized in germanosilicate fibers. We demonstrate that the number of peaks can be calculated using a power balance model considering different subcomponents within each Stokes component.
Physical Principle for Generation of Randomness
NASA Technical Reports Server (NTRS)
Zak, Michail
2009-01-01
A physical principle (more precisely, a principle that incorporates mathematical models used in physics) has been conceived as the basis of a method of generating randomness in Monte Carlo simulations. The principle eliminates the need for conventional random-number generators. The Monte Carlo simulation method is among the most powerful computational methods for solving high-dimensional problems in physics, chemistry, economics, and information processing. The Monte Carlo simulation method is especially effective for solving problems in which computational complexity increases exponentially with dimensionality. The main advantage of the Monte Carlo simulation method over other methods is that the demand on computational resources becomes independent of dimensionality. As augmented by the present principle, the Monte Carlo simulation method becomes an even more powerful computational method that is especially useful for solving problems associated with dynamics of fluids, planning, scheduling, and combinatorial optimization. The present principle is based on coupling of dynamical equations with the corresponding Liouville equation. The randomness is generated by non-Lipschitz instability of dynamics triggered and controlled by feedback from the Liouville equation. (In non-Lipschitz dynamics, the derivatives of solutions of the dynamical equations are not required to be bounded.)
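The role of non-Lipschitz instability can be illustrated with the textbook flow dx/dt ∝ |x|^(1/3): the right-hand side has an unbounded derivative at x = 0, so the solution leaving the equilibrium is not unique, and an arbitrarily small perturbation selects the branch. The following toy sketch (not Zak's coupled Liouville construction) turns that branch choice into a bit:

```python
import math, random

def terminal_step(x, dt, a=1.0, eps=1e-12):
    """Euler step of dx/dt = a * sign(x) * |x|**(1/3), a non-Lipschitz
    flow: the derivative of |x|**(1/3) is unbounded at x = 0, so the
    solution leaving the equilibrium x = 0 is not unique."""
    if x == 0.0:
        # At the singular point the branch is not determined by the
        # equation; an arbitrarily small perturbation selects it.
        x = random.choice([-eps, eps])
    return x + dt * a * math.copysign(abs(x) ** (1.0 / 3.0), x)

def generate_bit(steps=2000, dt=1e-3):
    """One bit from the escape direction of the unstable flow: the
    trajectory runs away from x = 0 in finite time in either direction."""
    x = 0.0
    for _ in range(steps):
        x = terminal_step(x, dt)
    return 1 if x > 0 else 0
```

In the principle described above, the feedback from the Liouville equation (rather than an explicit coin flip) triggers and controls which branch is taken; the sketch only shows why the non-Lipschitz singularity makes the outcome undetermined by the deterministic equation alone.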
Students' Misconceptions about Random Variables
ERIC Educational Resources Information Center
Kachapova, Farida; Kachapov, Ilias
2012-01-01
This article describes some misconceptions about random variables and related counter-examples, and makes suggestions about teaching initial topics on random variables in general form instead of doing it separately for discrete and continuous cases. The focus is on post-calculus probability courses. (Contains 2 figures.)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cryns, Jackson W.; Hatchell, Brian K.; Santiago-Rojas, Emiliano
Experimental analysis of a piezoelectric energy harvesting system for harmonic, random, and sine on random vibration
Harvesting power with a piezoelectric vibration-powered generator using a full-wave rectifier conditioning circuit is experimentally compared for varying sinusoidal, random and sine on random (SOR) input vibration scenarios. Additionally, the implications of source vibration characteristics on harvester design are discussed. Studies in vibration harvesting have yielded numerous alternatives for harvesting electrical energy from vibrations, but piezoceramics arose as the most compact, energy-dense means of energy transduction. The rise in popularity of harvesting energy from ambient vibrations has made piezoelectric generators commercially available. Much of the available literature focuses on maximizing harvested power through nonlinear processing circuits that require accurate knowledge of generator internal mechanical and electrical characteristics and idealization of the input vibration source, which cannot be assumed in general application. In this manuscript, variations in source vibration and load resistance are explored for a commercially available piezoelectric generator. We characterize the source vibration by its acceleration response for repeatability and transcription to general application. The results agree with numerical and theoretical predictions in previous literature that optimal load resistance varies with transducer natural frequency and source type, and the findings demonstrate that significant gains are seen with lower tuned transducer natural frequencies for similar source amplitudes. Going beyond idealized steady-state sinusoidal and simplified random vibration input, SOR testing allows for more accurate representation of real-world ambient vibration.
It is shown that characteristic interactions from more complex vibrational sources significantly alter power generation and power processing requirements by increasing harvested power, shifting optimal conditioning impedance, inducing significant voltage supply fluctuations, and ultimately rendering idealized sinusoidal and random analyses insufficient.
NASA Technical Reports Server (NTRS)
Prasad, C. B.; Mei, Chuh
1988-01-01
The large deflection random response of symmetrically laminated cross-ply rectangular thin plates subjected to random excitation is studied. The out-of-plane boundary conditions are such that all the edges are rigidly supported against translation, but elastically restrained against rotation. The plate is also assumed to have a small initial imperfection. The assumed membrane boundary conditions are such that all the edges are free from normal and tangential forces in the plane of the plate. Mean-square deflections and mean-square strains are determined for a three-layered cross-ply laminate.
Maru, Duncan Smith-Rohrberg; Bruce, R Douglas; Walton, Mary; Mezger, Jo Anne; Springer, Sandra A; Shield, David; Altice, Frederick L
2008-03-01
Directly administered antiretroviral therapy (DAART) can improve health outcomes among HIV-infected drug users. An understanding of the utilization of DAART-initiation, adherence, and retention-is critical to successful program design. Here, we use the Behavioral Model to assess the enabling, predisposing, and need factors impacting adherence in our randomized, controlled trial of DAART versus self-administered therapy (SAT) among 141 HIV-infected drug users. Of 88 participants randomized to DAART, 74 (84%) initiated treatment, and 51 (69%) of those who initiated were retained in the program throughout the entire six-month period. Mean adherence to directly observed visits was 73%, and the mean overall composite adherence score was 77%. These results were seen despite the finding that 75% of participants indicated that they would prefer to take their own medications. Major causes of DAART discontinuation included hospitalization, incarceration, and entry into drug-treatment programs. The presence of depression and the lack of willingness to travel greater than four blocks to receive DAART predicted time-to-discontinuation.
Autonomous Byte Stream Randomizer
NASA Technical Reports Server (NTRS)
Paloulian, George K.; Woo, Simon S.; Chow, Edward T.
2013-01-01
Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, the data transmission has to be efficient without any redundant or exuberant metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity unreadable in its own right, but when combined with all N pieces, can be reconstructed into one. Reconstruction requires possession of the key used for randomizing the bytes, leading to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. This software is a cornerstone capability possessing the ability to generate the same cryptographically secure sequence on different machines and time intervals, thus allowing this software to be used more heavily in net-centric environments where data transfer bandwidth is limited.
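A sketch of the core mechanism: a seeded deterministic byte stream drives an unbiased in-place Fisher-Yates shuffle, and possession of the same seed regenerates the swap sequence for reconstruction. The SHA-256 counter-mode stream here is a stand-in for whatever cryptographically secure generator the software actually uses:

```python
import hashlib

class SeededStream:
    """Deterministic byte stream from a seed via SHA-256 in counter mode
    (a stand-in for a CSPRNG; both sides derive the identical stream
    from the shared seed)."""
    def __init__(self, seed: bytes):
        self.seed, self.counter, self.buf = seed, 0, b""

    def randbelow(self, n: int) -> int:
        """Uniform integer in [0, n) via rejection sampling (no modulo bias)."""
        while True:
            if len(self.buf) < 4:
                self.buf += hashlib.sha256(
                    self.seed + self.counter.to_bytes(8, "big")).digest()
                self.counter += 1
            v, self.buf = int.from_bytes(self.buf[:4], "big"), self.buf[4:]
            if v < (2**32 // n) * n:
                return v % n

def randomize(data: bytes, seed: bytes) -> bytearray:
    """In-place Fisher-Yates shuffle of the byte stream."""
    buf, rng = bytearray(data), SeededStream(seed)
    for i in range(len(buf) - 1, 0, -1):
        j = rng.randbelow(i + 1)
        buf[i], buf[j] = buf[j], buf[i]
    return buf

def reconstruct(shuffled: bytes, seed: bytes) -> bytearray:
    """Regenerate the same swap sequence and undo it in reverse order."""
    buf, rng = bytearray(shuffled), SeededStream(seed)
    swaps = [(i, rng.randbelow(i + 1)) for i in range(len(buf) - 1, 0, -1)]
    for i, j in reversed(swaps):
        buf[i], buf[j] = buf[j], buf[i]
    return buf
```

The distribution step then simply splits the shuffled buffer into N pieces; no single piece is meaningful because the structural information lives in the seed-derived permutation.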
Why the leopard got its spots: relating pattern development to ecology in felids
Allen, William L.; Cuthill, Innes C.; Scott-Samuel, Nicholas E.; Baddeley, Roland
2011-01-01
A complete explanation of the diversity of animal colour patterns requires an understanding of both the developmental mechanisms generating them and their adaptive value. However, only two previous studies, which involved computer-generated evolving prey, have attempted to make this link. This study examines variation in the camouflage patterns displayed on the flanks of many felids. After controlling for the effects of shared ancestry using a fully resolved molecular phylogeny, this study shows how phenotypes from plausible felid coat pattern generation mechanisms relate to ecology. We found that likelihood of patterning and pattern attributes, such as complexity and irregularity, were related to felids' habitats, arboreality and nocturnality. Our analysis also indicates that disruptive selection is a likely explanation for the prevalence of melanistic forms in Felidae. Furthermore, we show that there is little phylogenetic signal in the visual appearance of felid patterning, indicating that camouflage adapts to ecology over relatively short time scales. Our method could be applied to any taxon with colour patterns that can reasonably be matched to reaction–diffusion and similar models, where the kinetics of the reaction between two or more initially randomly dispersed morphogens determines the outcome of pattern development. PMID:20961899
Yoshioka, Craig; Pulokas, James; Fellmann, Denis; Potter, Clinton S.; Milligan, Ronald A.; Carragher, Bridget
2007-01-01
Visualization by electron microscopy has provided many insights into the composition, quaternary structure, and mechanism of macromolecular assemblies. By preserving samples in stain or vitreous ice it is possible to image them as discrete particles, and from these images generate three-dimensional structures. This ‘single-particle’ approach suffers from two major shortcomings: it requires an initial model to reconstitute 2D data into a 3D volume, and it often fails when faced with conformational variability. Random conical tilt (RCT) and orthogonal tilt (OTR) are methods developed to overcome these problems, but the data collection required, particularly for vitreous ice specimens, is difficult and tedious. In this paper we present an automated approach to RCT/OTR data collection that removes the burden of manual collection and offers higher quality and throughput than is otherwise possible. We show example datasets collected under stain and cryo conditions and provide statistics related to the efficiency and robustness of the process. Furthermore, we describe the new algorithms that make this method possible, which include new calibrations, improved targeting and feature-based tracking. PMID:17524663
Robustness and Vulnerability of Networks with Dynamical Dependency Groups.
Bai, Ya-Nan; Huang, Ning; Wang, Lei; Wu, Zhi-Xi
2016-11-28
The dependency property and self-recovery of failure nodes both have great effects on the robustness of networks during the cascading process. Existing investigations focused mainly on the failure mechanism of static dependency groups without considering the time-dependency of interdependent nodes and the recovery mechanism in reality. In this study, we present an evolving network model consisting of failure mechanisms and a recovery mechanism to explore network robustness, where the dependency relations among nodes vary over time. Based on generating function techniques, we provide an analytical framework for random networks with arbitrary degree distribution. In particular, we theoretically find that an abrupt percolation transition exists corresponding to the dynamical dependency groups for a wide range of topologies after initial random removal. Moreover, when the abrupt transition point is above the failure threshold of dependency groups, the evolving network with the larger dependency groups is more vulnerable; when below it, the larger dependency groups make the network more robust. Numerical simulations employing the Erdős-Rényi network and Barabási-Albert scale free network are performed to validate our theoretical results.
Damasco, O P; Graham, G C; Henry, R J; Adkins, S W; Smiths, M K; Godwin, I D
1996-11-01
A RAPD marker specific to the dwarf off-type (hereafter known as dwarf) from micropropagation of Cavendish banana (Musa spp. AAA) cultivars New Guinea Cavendish and Williams was identified following an analysis of 57 normal (true-to-type) and 59 dwarf plants generated from several different micropropagation events. Sixty-six random decamer primers were used in the initial screen, of which 19 (28.8%) revealed polymorphisms between normal and dwarf plants. Primer OPJ-04 (5'-CCGAACACGG-3') was found to amplify an approx. 1.5 kb band which was consistently present in all normal but absent in all dwarf plants of both cultivars. Reliable detection of dwarf plants was achieved using this marker, providing the only available means of in vitro detection of dwarfs. The use of this marker could facilitate early detection and elimination of dwarfs from batches of micropropagated bananas, and may be a useful tool in determining what factors in the tissue culture process lead to this off-type production. Other micropropagation-induced RAPD polymorphisms were observed but were not associated with the dwarf trait.
Fine-scale spatial genetic dynamics over the life cycle of the tropical tree Prunus africana.
Berens, D G; Braun, C; González-Martínez, S C; Griebeler, E M; Nathan, R; Böhning-Gaese, K
2014-11-01
Studying fine-scale spatial genetic patterns across life stages is a powerful approach to identify ecological processes acting within tree populations. We investigated spatial genetic dynamics across five life stages in the insect-pollinated and vertebrate-dispersed tropical tree Prunus africana in Kakamega Forest, Kenya. Using six highly polymorphic microsatellite loci, we assessed genetic diversity and spatial genetic structure (SGS) from seed rain and seedlings, and different sapling stages to adult trees. We found significant SGS in all stages, potentially caused by limited seed dispersal and high recruitment rates in areas with high light availability. SGS decreased from seed and early seedling stages to older juvenile stages. Interestingly, SGS was stronger in adults than in late juveniles. The initial decrease in SGS was probably driven by both random and non-random thinning of offspring clusters during recruitment. Intergenerational variation in SGS could have been driven by variation in gene flow processes, overlapping generations in the adult stage or local selection. Our study shows that complex sequential processes during recruitment contribute to SGS of tree populations.
Layers: A molecular surface peeling algorithm and its applications to analyze protein structures
Karampudi, Naga Bhushana Rao; Bahadur, Ranjit Prasad
2015-01-01
We present an algorithm, ‘Layers’, to peel the atoms of proteins as layers. Using Layers we show an efficient way to transform protein structures into a 2D pattern, named the residue transition pattern (RTP), which is independent of molecular orientation. RTP explains the folding patterns of proteins, and hence identification of similarity between proteins is simpler and more reliable using RTP than with standard sequence- or structure-based methods. Moreover, Layers generates a fine-tunable coarse model of the molecular surface by using non-random sampling. The coarse model can be used for shape comparison, protein recognition and ligand design. Additionally, Layers can be used to develop biased initial configurations of molecules for protein-folding simulations. We have developed a random forest classifier to predict the RTP of a given polypeptide sequence. Layers is a standalone application; however, it can be merged with other applications to reduce the computational load when working with large datasets of protein structures. Layers is available freely at http://www.csb.iitkgp.ernet.in/applications/mol_layers/main. PMID:26553411
NASA Astrophysics Data System (ADS)
Pichierri, Manuele; Hajnsek, Irena
2015-04-01
In this work, the potential of multi-baseline Pol-InSAR for crop parameter estimation (e.g. crop height and extinction coefficients) is explored. For this reason, a novel Oriented Volume over Ground (OVoG) inversion scheme is developed, which makes use of multi-baseline observables to estimate the whole stack of model parameters. The proposed algorithm has been initially validated on a set of randomly-generated OVoG scenarios, to assess its stability over crop structure changes and its robustness against volume decorrelation and other decorrelation sources. Then, it has been applied to a collection of multi-baseline repeat-pass SAR data, acquired over a rural area in Germany by DLR's F-SAR.
Turbulent mixing induced by Richtmyer-Meshkov instability
NASA Astrophysics Data System (ADS)
Krivets, V. V.; Ferguson, K. J.; Jacobs, J. W.
2017-01-01
Richtmyer-Meshkov instability is studied in shock tube experiments with an Atwood number of 0.7. The interface is formed in a vertical shock tube using opposed gas flows, and three-dimensional random initial interface perturbations are generated by the vertical oscillation of the gas column, producing Faraday waves. Planar laser Mie scattering is used for flow visualization and for measurements of the mixing process. Experimental image sequences are recorded at 6 kHz frequency and processed to obtain the time-dependent variation of the integral mixing layer width. Measurements of the mixing layer width are compared with Mikaelian's [1] model in order to extract the growth exponent θ, for which a fairly wide range of values is found, varying from θ ≈ 0.2 to 0.6.
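Extracting a growth exponent of the form h(t) ∝ t^θ reduces to a least-squares fit of log h against log t. A minimal sketch on synthetic data (the experimental analysis involves virtual-origin and offset corrections not shown here):

```python
import math

def fit_growth_exponent(times, widths):
    """Least-squares slope of log h vs log t, i.e. theta in h ~ t**theta."""
    xs = [math.log(t) for t in times]
    ys = [math.log(h) for h in widths]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

# Synthetic check: mixing widths generated with theta = 0.3 (assumed values)
times = [1e-3 * (1 + k) for k in range(100)]
widths = [0.01 * t ** 0.3 for t in times]
theta = fit_growth_exponent(times, widths)
```

On real image sequences the widths carry measurement noise, so the scatter of the fitted slope across runs is one source of the broad θ ≈ 0.2 to 0.6 range reported above.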
Method for nonlinear optimization for gas tagging and other systems
Chen, Ting; Gross, Kenny C.; Wegerich, Stephan
1998-01-01
A method and system for providing nuclear fuel rods with a configuration of isotopic gas tags. The method includes selecting a true location of a first gas tag node; selecting initial locations for the remaining n-1 nodes using target gas tag compositions; generating a set of random gene pools with L nodes; applying a Hopfield network to compute an energy, or cost, for each of the L gene pools; and using selected constraints to establish minimum-energy states that identify optimal gas tag nodes, with each energy compared to a convergence threshold. Upon identifying a gas tag node, the procedure continues to establish the next gas tag node until all remaining n nodes have been established.
Medicine Based Evidence and Personalized Care of Patients.
Horwitz, Ralph I; Charlson, Mary E; Singer, Burton H
2018-04-27
For the past 70 years, evidence generation for patient management in clinical medicine has been dominated by Evidence Based Medicine (EBM), with its emphasis on Randomized Controlled Trials (RCTs). EBM can tell us about the benefits of treatment in the average patient but not for the patient at hand; about how to initiate treatment but not how to adjust or modify therapy after treatment has started; about treatment efficacy compared to placebo but less often compared to other effective treatments; and about outcomes chosen as hard endpoints, but not the predominant concerns of patients such as physical limitations, social functioning or psychological distress. This article is protected by copyright. All rights reserved.
Radiation Transport in Random Media With Large Fluctuations
NASA Astrophysics Data System (ADS)
Olson, Aaron; Prinja, Anil; Franke, Brian
2017-09-01
Neutral particle transport in media exhibiting large and complex material property spatial variation is modeled by representing cross sections as lognormal random functions of space, generated through a nonlinear memory-less transformation of a Gaussian process with covariance uniquely determined by the covariance of the cross section. A Karhunen-Loève decomposition of the Gaussian process is implemented to efficiently generate realizations of the random cross sections, and Woodcock Monte Carlo is used to transport particles on each realization and generate benchmark solutions for the mean and variance of the particle flux as well as probability densities of the particle reflectance and transmittance. A computationally efficient stochastic collocation method is implemented to directly compute the statistical moments such as the mean and variance, while a polynomial chaos expansion in conjunction with stochastic collocation provides a convenient surrogate model that also produces probability densities of output quantities of interest. Extensive numerical testing demonstrates that use of stochastic reduced-order modeling provides an accurate and cost-effective alternative to random sampling for particle transport in random media.
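The first step of this pipeline, sampling lognormal cross-section realizations through a Karhunen-Loève expansion of the underlying Gaussian process, can be sketched as follows. The exponential covariance, correlation length and coefficient of variation are illustrative assumptions; the paper derives the Gaussian covariance from the prescribed cross-section covariance, whereas here it is simply prescribed:

```python
import numpy as np

def kl_lognormal_realizations(n_real=500, n_pts=128, L=1.0, corr_len=0.2,
                              mean_sigma=1.0, cv=0.5, seed=0):
    """Sample lognormal cross sections sigma(x) on [0, L] via a discrete
    Karhunen-Loeve expansion of the underlying Gaussian log-field."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, L, n_pts)
    # Covariance matrix of the Gaussian process g(x) (exponential kernel)
    var_g = np.log(1.0 + cv**2)        # gives sigma the target coeff. of var.
    C = var_g * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    # Discrete KL: eigendecomposition of the covariance matrix
    evals, evecs = np.linalg.eigh(C)
    evals = np.clip(evals, 0.0, None)  # guard tiny negative rounding errors
    # g(x) = sum_k sqrt(lambda_k) * xi_k * phi_k(x), with xi_k ~ N(0, 1)
    xi = rng.standard_normal((n_real, n_pts))
    g = xi @ (evecs * np.sqrt(evals)).T
    mu = np.log(mean_sigma) - 0.5 * var_g      # fixes E[sigma] = mean_sigma
    return np.exp(mu + g)              # memory-less transform: lognormal field
```

Each row is one realization on which a transport solve (Woodcock Monte Carlo in the paper) would be performed; in practice the expansion is truncated at the dominant eigenvalues, which is what stochastic collocation exploits.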
Kanter, Ido; Butkovski, Maria; Peleg, Yitzhak; Zigzag, Meital; Aviad, Yaara; Reidler, Igor; Rosenbluh, Michael; Kinzel, Wolfgang
2010-08-16
Random bit generators (RBGs) constitute an important tool in cryptography, stochastic simulations and secure communications. The latter in particular has some difficult requirements: a high generation rate of unpredictable bit strings and secure key-exchange protocols over public channels. Deterministic algorithms generate pseudo-random number sequences at high rates; however, their unpredictability is limited by the very nature of their deterministic origin. Recently, physical RBGs based on chaotic semiconductor lasers were shown to exceed Gbit/s rates. Whether secure synchronization of two high-rate physical RBGs is possible remains an open question. Here we propose a method whereby two fast RBGs based on mutually coupled chaotic lasers are synchronized. Using information-theoretic analysis we demonstrate security against a powerful computational eavesdropper, capable of noiseless amplification, where all parameters are publicly known. The method is also extended to secure synchronization of a small network of three RBGs.
Dipolar eddies in a decaying stratified turbulent flow
NASA Astrophysics Data System (ADS)
Voropayev, S. I.; Fernando, H. J. S.; Morrison, R.
2008-02-01
Laboratory experiments on the evolution of dipolar (momentum) eddies in a stratified fluid in the presence of random background motions are described. A turbulent jet puff was used to generate the momentum eddies, and a decaying field of ambient random vortical motions was generated by a towed grid. Data on the vorticity and velocity fields of the momentum eddies and of the background motions, and on their interactions, were collected with each present both alone and together, and their main characteristics were parametrized. Similarity arguments predict that dipolar eddies in stratified fluids may preserve their identity in decaying grid-generated stratified turbulence, which was verified experimentally. Possible applications of the results include mushroomlike currents and other naturally or artificially generated large dipolar eddies in strongly stratified layers of the ocean, the longevity of which is expected to be determined by the characteristics of the eddies and of the random background motions.
Puthanakit, Thanyawee; Ananworanich, Jintanat; Vonthanak, Saphonn; Kosalaraksa, Pope; Hansudewechakul, Rawiwan; van der Lugt, Jasper; Kerr, Stephen J; Kanjanavanit, Suparat; Ngampiyaskul, Chaiwat; Wongsawat, Jurai; Luesomboon, Wicharn; Vibol, Ung; Pruksakaew, Kanchana; Suwarnlerk, Tulathip; Apornpong, Tanakorn; Ratanadilok, Kattiya; Paul, Robert; Mofenson, Lynne M; Fox, Lawrence; Valcour, Victor; Brouwers, Pim; Ruxrungtham, Kiat
2013-05-01
We previously reported similar AIDS-free survival at 3 years in children who were >1 year old initiating antiretroviral therapy (ART) and randomized to early versus deferred ART in the Pediatric Randomized to Early versus Deferred Initiation in Cambodia and Thailand (PREDICT) study. We now report neurodevelopmental outcomes. Two hundred eighty-four HIV-infected Thai and Cambodian children aged 1-12 years with CD4 counts between 15% and 24% and no AIDS-defining illness were randomized to initiate ART at enrollment ("early," n = 139) or when CD4 count became <15% or a Centers for Disease Control (CDC) category C event developed ("deferred," n = 145). All underwent age-appropriate neurodevelopment testing including Beery Visual Motor Integration, Purdue Pegboard, Color Trails and Child Behavioral Checklist. Thai children (n = 170) also completed Wechsler Intelligence Scale (intelligence quotient) and Stanford Binet Memory test. We compared week 144 measures by randomized group and to HIV-uninfected children (n = 319). At week 144, the median age was 9 years and 69 (48%) of the deferred arm children had initiated ART. The early arm had a higher CD4 (33% versus 24%, P < 0.001) and a greater percentage of children with viral suppression (91% versus 40%, P < 0.001). Neurodevelopmental scores did not differ by arm, and there were no differences in changes between arms across repeated assessments in time-varying multivariate models. HIV-infected children performed worse than uninfected children on intelligence quotient, Beery Visual Motor Integration, Binet memory and Child Behavioral Checklist. In HIV-infected children surviving beyond 1 year of age without ART, neurodevelopmental outcomes were similar with ART initiation at CD4 15%-24% versus <15%, but both groups performed worse than HIV-uninfected children. The window of opportunity for a positive effect of ART initiation on neurodevelopment may remain in infancy.
25 CFR 547.14 - What are the minimum technical standards for electronic random number generation?
Code of Federal Regulations, 2011 CFR
2011-04-01
... CLASS II GAMES § 547.14 What are the minimum technical standards for electronic random number generation... rules of the game. For example, if a bingo game with 75 objects with numbers or other designations has a... serial correlation (outcomes shall be independent from the previous game); and (x) Test on subsequences...
25 CFR 547.14 - What are the minimum technical standards for electronic random number generation?
Code of Federal Regulations, 2012 CFR
2012-04-01
... CLASS II GAMES § 547.14 What are the minimum technical standards for electronic random number generation... rules of the game. For example, if a bingo game with 75 objects with numbers or other designations has a... serial correlation (outcomes shall be independent from the previous game); and (x) Test on subsequences...
25 CFR 547.14 - What are the minimum technical standards for electronic random number generation?
Code of Federal Regulations, 2010 CFR
2010-04-01
... CLASS II GAMES § 547.14 What are the minimum technical standards for electronic random number generation... rules of the game. For example, if a bingo game with 75 objects with numbers or other designations has a... serial correlation (outcomes shall be independent from the previous game); and (x) Test on subsequences...
Monte Carlo Simulation Using HyperCard and Lotus 1-2-3.
ERIC Educational Resources Information Center
Oulman, Charles S.; Lee, Motoko Y.
Monte Carlo simulation is a computer modeling procedure for mimicking observations on a random variable. A random number generator is used in generating the outcome for the events that are being modeled. The simulation can be used to obtain results that otherwise require extensive testing or complicated computations. This paper describes how Monte…
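The procedure this abstract describes (drive a model with a random number generator, then read off outcome frequencies) can be sketched in a few lines. The Python example below, which estimates π from random points, illustrates the general technique only; it is not from the paper, which used HyperCard and Lotus 1-2-3:

```python
import random

def estimate_pi(n_trials, seed=0):
    """Monte Carlo estimate of pi: the fraction of random points in the
    unit square falling inside the quarter circle approaches pi/4."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_trials)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n_trials

estimate = estimate_pi(100_000)
```

With enough trials the estimate settles near 3.14, replacing an analytically awkward computation with repeated random sampling.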
System and method for key generation in security tokens
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Philip G.; Humble, Travis S.; Paul, Nathanael R.
Functional randomness in security tokens (FRIST) may improve the security of two-factor authentication hardware tokens by strengthening the algorithms used to securely generate random data. A system and method in one embodiment according to the present invention may allow for security of a token based on storage cost and computational security. This approach may enable communication where security is no longer based solely on one-time pads (OTPs) generated from a single cryptographic function (e.g., SHA-256).
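For background on the scheme the abstract contrasts itself with, here is a minimal sketch of an iterated-hash OTP chain built from a single cryptographic function (SHA-256). The seed value and chain length are hypothetical; this is an illustration of the OTP-chain idea, not the FRIST design:

```python
import hashlib

def otp_chain(seed: bytes, length: int):
    """Build a one-time-password chain by iterating SHA-256: each entry
    is the hash of the previous one, so any entry can be checked against
    its successor without storing the whole chain."""
    chain = [hashlib.sha256(seed).digest()]
    for _ in range(length - 1):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain

def verifies(prev: bytes, nxt: bytes) -> bool:
    """Check that `nxt` is the SHA-256 hash of `prev`."""
    return hashlib.sha256(prev).digest() == nxt

chain = otp_chain(b"shared-secret", 4)  # seed value is hypothetical
```

Because each link depends only on the hash function, the entire token's security rests on that one primitive, which is exactly the limitation FRIST aims to move beyond.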
Random numbers from vacuum fluctuations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, Yicheng; Kurtsiefer, Christian, E-mail: christian.kurtsiefer@gmail.com; Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543
2016-07-25
We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read into a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.
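A linear feedback shift register of the kind used for randomness extraction here can be sketched compactly. The tap positions below (a maximal-length 4-bit register) are chosen for illustration and are not the authors' configuration:

```python
def lfsr_stream(state, taps, n_bits):
    """Fibonacci LFSR: output the low bit, then shift right and feed
    back the XOR of the tapped bit positions into the top bit."""
    out = []
    width = max(taps) + 1
    for _ in range(n_bits):
        out.append(state & 1)
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << (width - 1))
    return out

# Taps (0, 3) give the maximal-length polynomial x^4 + x + 1 (period 15)
bits = lfsr_stream(state=0b1001, taps=(0, 3), n_bits=15)
```

A maximal-length register of width n cycles through all 2^n - 1 nonzero states, so one period of this 4-bit example contains exactly eight 1s and seven 0s.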
Cognitive training in Alzheimer's disease: a controlled randomized study.
Giovagnoli, A R; Manfredi, V; Parente, A; Schifano, L; Oliveri, S; Avanzini, G
2017-08-01
This controlled randomized single-blind study evaluated the effects of cognitive training (CT), compared to active music therapy (AMT) and neuroeducation (NE), on initiative in patients with mild to moderate Alzheimer's disease (AD). Secondarily, we explored the effects of CT on episodic memory, mood, and social relationships. Thirty-nine AD patients were randomly assigned to CT, AMT, or NE. Each treatment lasted 3 months. Before, at the end of, and 3 months after treatment, neuropsychological tests and self-rated scales assessed initiative, episodic memory, depression, anxiety, and social relationships. At the end of CT, initiative significantly improved, whereas, at the end of AMT and NE, it was unchanged. Episodic memory showed no changes at the end of CT or AMT and a worsening after NE. The rates of patients with clinically significant improvement of initiative were greater after CT (about 62%) than after AMT (about 8%) or NE (none). At the 3-month follow-up, initiative and episodic memory declined in all patients. Mood and social relationships improved in all three groups, with greater changes after AMT or NE. In patients with mild to moderate AD, CT can improve initiative and stabilize memory, while the non-cognitive treatments can ameliorate psychosocial aspects. Combining CT and non-cognitive treatments may have useful clinical implications.
Yotebieng, Marcel; Behets, Frieda; Kawende, Bienvenu; Ravelomanana, Noro Lantoniaina Rosa; Tabala, Martine; Okitolonda, Emile W
2017-04-26
Despite the rapid adoption of the World Health Organization's 2013 guidelines, children continue to be infected with HIV perinatally because of sub-optimal adherence to the continuum of HIV care in maternal and child health (MCH) clinics. To achieve the UNAIDS goal of eliminating mother-to-child HIV transmission, multiple, adaptive interventions need to be implemented to improve adherence to the HIV continuum. The aim of this open-label, parallel, group-randomized trial is to evaluate the effectiveness of Continuous Quality Improvement (CQI) interventions implemented at facility and health district levels to improve retention in care and virological suppression through 24 months postpartum among pregnant and breastfeeding women receiving ART in MCH clinics in Kinshasa, Democratic Republic of Congo. Prior to randomization, the current monitoring and evaluation system will be strengthened to enable collection of the high-quality, individual patient-level data necessary for timely indicator production and program outcome monitoring to inform CQI interventions. Following randomization, in health districts randomized to CQI, quality improvement (QI) teams will be established at the district level and at the MCH clinic level. For 18 months, QI teams will be brought together quarterly to identify key bottlenecks in the care delivery system using data from the monitoring system, develop an action plan to address those bottlenecks, and implement the action plan at the level of their district or clinics. If proven to be effective, CQI as designed here could be scaled up rapidly in resource-scarce settings to accelerate progress towards the goal of an AIDS-free generation. The protocol was retrospectively registered on February 7, 2017. ClinicalTrials.gov Identifier: NCT03048669.
A meta-analysis of randomized controlled trials of azilsartan therapy for blood pressure reduction.
Takagi, Hisato; Mizuno, Yusuke; Niwa, Masao; Goto, Shin-Nosuke; Umemoto, Takuya
2014-05-01
Although there have been a number of azilsartan trials, no meta-analysis of the findings has been conducted to date. We performed the first meta-analysis of randomized controlled trials of azilsartan therapy for the reduction of blood pressure (BP) in patients with hypertension. MEDLINE, EMBASE and the Cochrane Central Register of Controlled Trials were searched from the beginning of the records through March 2013 using web-based search engines (PubMed and OVID). Eligible studies were prospective randomized controlled trials of azilsartan (including azilsartan medoxomil) vs. any control therapy that reported clinic or 24-h mean BP as an outcome. For each study, data for the changes from baseline to final clinic systolic BP (SBP) and diastolic BP (DBP) in both the azilsartan group and the control group were used to generate mean differences and 95% confidence intervals (CIs). Of 27 potentially relevant articles screened initially, 7 reports of randomized trials of azilsartan or azilsartan medoxomil therapy enrolling a total of 6152 patients with hypertension were identified and included. Pooled analysis suggested a significant reduction in BP changes among patients randomized to 40 mg of azilsartan vs. control therapy (clinic SBP: -4.20 mm Hg; 95% CI: -6.05 to -2.35 mm Hg; P<0.00001; clinic DBP: -2.58 mm Hg; 95% CI: -3.69 to -1.48 mm Hg; P<0.00001; 24-h mean SBP: -3.33 mm Hg; 95% CI: -4.74 to -1.93 mm Hg; P<0.00001; 24-h mean DBP: -2.12 mm Hg; 95% CI: -2.74 to -1.49 mm Hg; P<0.00001). In conclusion, azilsartan therapy appears to provide a greater reduction in BP than control therapy in patients with hypertension.
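The pooling step described above (per-trial mean differences with 95% CIs combined into a single estimate) is standard fixed-effect inverse-variance weighting. A minimal sketch with hypothetical trial numbers, not the paper's data:

```python
import math

def pooled_mean_difference(estimates):
    """Fixed-effect inverse-variance pooling.
    `estimates` is a list of (mean_difference, ci_low, ci_high) tuples;
    each trial's standard error is recovered from its 95% CI width."""
    weights, weighted = [], []
    for md, lo, hi in estimates:
        se = (hi - lo) / (2 * 1.96)     # CI half-width / 1.96
        w = 1.0 / se ** 2               # weight = inverse variance
        weights.append(w)
        weighted.append(w * md)
    pooled = sum(weighted) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical per-trial systolic BP differences (mm Hg)
md, ci = pooled_mean_difference([(-4.0, -6.0, -2.0), (-5.0, -8.0, -2.0)])
```

Trials with narrower CIs (smaller standard errors) receive larger weights, so the pooled difference sits closer to the more precise estimate.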
Hoffmann, Vivian; Jones, Kelly; Leroy, Jef
2015-12-03
While the few studies that have looked at the association between stunting and aflatoxin exposure have found surprisingly large effects, the results remain inconclusive due to a lack of randomized controlled studies. This protocol describes a non-blinded, cluster-randomized controlled trial with the specific objective of testing the impact of reduced aflatoxin exposure on (individual) child linear growth. Participants were recruited from among households containing women in the last 5 months of pregnancy in 28 maize-growing villages within Meru and Tharaka-Nithi Counties in Kenya. Households in villages assigned to the intervention group are offered rapid testing of their stored maize for the presence of aflatoxin each month; any maize found to contain more than 10 ppb aflatoxin is replaced with an equal amount of maize that contains less than this concentration of the toxin. They are also offered the opportunity to buy maize that has been tested and found to contain less than 10 ppb aflatoxin at local shops. Clusters (villages) were allocated to the intervention group (28 villages containing 687 participating households) or control group (28 villages containing 536 participating households) using a random number generator. The trial, which is funded by United Kingdom (UK) aid from the UK government, the Global Food Security Portal, and the Ministry for Foreign Affairs of Finland, is currently ongoing. This study is the first randomized controlled trial (RCT) to test for a causal impact of aflatoxin exposure on child growth. Whether or not this relationship is found, its results will have implications for the prioritization of aflatoxin control efforts by governments in affected regions, as well as international donors. American Economic Association RCT Registry # 0000105 . Initial registration date: 6 November 2013, last updated 30 December 2014.
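Allocating clusters "using a random number generator", as in this trial, amounts to a seeded shuffle of village identifiers followed by an even split. A minimal sketch (the village IDs and seed are hypothetical):

```python
import random

def allocate_clusters(clusters, seed=42):
    """Split cluster IDs into two equal arms with a seeded shuffle,
    so the allocation is random but reproducible from the seed."""
    rng = random.Random(seed)
    shuffled = list(clusters)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return sorted(shuffled[:half]), sorted(shuffled[half:])

villages = [f"village-{i:02d}" for i in range(56)]  # hypothetical IDs
intervention, control = allocate_clusters(villages)
```

Recording the seed makes the allocation auditable: anyone re-running the script reproduces the same arm assignments.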
Umasunthar, T; Procktor, A; Hodes, M; Smith, J G; Gore, C; Cox, H E; Marrs, T; Hanna, H; Phillips, K; Pinto, C; Turner, P J; Warner, J O; Boyle, R J
2015-07-01
Previous work has shown patients commonly misuse adrenaline autoinjectors (AAI). It is unclear whether this is due to inadequate training or poor device design. We undertook a prospective randomized controlled trial to evaluate the ability to administer adrenaline using different AAI devices. We allocated mothers of food-allergic children prescribed an AAI for the first time to Anapen or EpiPen using a computer-generated randomization list, with optimal training according to the manufacturer's instructions. After one year, participants were randomly allocated a new device (EpiPen, Anapen, new EpiPen, JEXT or Auvi-Q), without device-specific training. We assessed ability to deliver adrenaline using their AAI in a simulated anaphylaxis scenario six weeks and one year after initial training, and following the device switch. The primary outcome was successful adrenaline administration at six weeks, assessed by an independent expert. Secondary outcomes were success at one year, success after switching device, and adverse events. We randomized 158 participants. At six weeks, 30 of 71 (42%) participants allocated to Anapen and 31 of 73 (43%) participants allocated to EpiPen were successful: RR 1.00 (95% CI 0.68-1.46). Success rates at one year were also similar, but digital injection was more common at one year with EpiPen (8/59, 14%) than with Anapen (0/51, 0%, P = 0.007). When switched to a new device without specific training, success rates were higher with Auvi-Q (26/28, 93%) than with other devices (39/80, 49%; P < 0.001). AAI device design is a major determinant of successful adrenaline administration. Success rates were low with several devices, but were high using the audio-prompt device Auvi-Q. © 2015 The Authors Allergy Published by John Wiley & Sons Ltd.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-13
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-792] Certain Static Random Access Memories and Products Containing Same; Commission Determination Affirming a Final Initial Determination..., and the sale within the United States after importation of certain static random access memories and...
2011-01-01
Background Many newly screened people living with HIV (PLHIV) in Sub-Saharan Africa do not understand the importance of regular pre-antiretroviral (ARV) care because most of them have been counseled by staff who lack basic counseling skills. This results in low uptake of pre-ARV care and late treatment initiation in resource-poor settings. The effect of providing post-test counseling by staff equipped with basic counseling skills, combined with home visits by community support agents, on uptake of pre-ARV care for newly diagnosed PLHIV was evaluated through a randomized intervention trial in Uganda. Methods An intervention trial was performed consisting of post-test counseling by trained counselors, combined with monthly home visits by community support agents for continued counseling of newly screened PLHIV in Iganga district, Uganda, between July 2009 and June 2010. Participants (N = 400) from three public recruitment centres were randomized to receive either the intervention or the standard care (the existing post-test counseling by ARV clinic staff who lack basic training in counseling skills), the control arm. The outcome measure was the proportion of newly screened and counseled PLHIV in either arm who had been to their nearest health center for clinical check-up in the subsequent three months (+2 months). Treatment was randomly assigned using computer-generated random numbers. The statistical significance of differences between the two study arms was assessed using chi-square and t-tests for categorical and quantitative data respectively. Risk ratios and 95% confidence intervals were used to assess the effect of the intervention. Results Participants in the intervention arm were 80% more likely to accept (take up) pre-ARV care compared to those in the control arm (RR 1.8, 95% CI 1.4-2.1). No adverse events were reported.
Conclusions Provision of post-test counseling by staff trained in basic counseling skills, combined with home visits by community support agents had a significant effect on uptake of pre-ARV care and appears to be a cost-effective way to increase the prerequisites for timely ARV initiation. Trial registration The trial was registered by Current Controlled Trials Ltd C/OBioMed Central Ltd as ISRCTN94133652 and received financial support from Sida and logistical support from the European Commission. PMID:21794162
Origins of Protein Functions in Cells
NASA Technical Reports Server (NTRS)
Seelig, Burchard; Pohorille, Andrzej
2011-01-01
In modern organisms proteins perform a majority of cellular functions, such as chemical catalysis, energy transduction and transport of material across cell walls. Although great strides have been made towards understanding protein evolution, a meaningful extrapolation from contemporary proteins to their earliest ancestors is virtually impossible. In an alternative approach, the origin of water-soluble proteins was probed through the synthesis and in vitro evolution of very large libraries of random amino acid sequences. In combination with computer modeling and simulations, these experiments allow us to address a number of fundamental questions about the origins of proteins. Can functionality emerge from random sequences of proteins? How did the initial repertoire of functional proteins diversify to facilitate new functions? Did this diversification proceed primarily through drawing novel functionalities from random sequences or through evolution of already existing proto-enzymes? Did protein evolution start from a pool of proteins defined by a frozen accident, or could other collections of proteins have started different evolutionary pathways? Although we do not have definitive answers to these questions yet, important clues have been uncovered. In one example (Keefe and Szostak, 2001), novel ATP-binding proteins were identified that appear to be unrelated in both sequence and structure to any known ATP-binding proteins. One of these proteins was subsequently redesigned computationally to bind GTP through several mutations that introduce targeted structural changes to the protein, improve its binding to guanine and prevent water from accessing the active center. This study facilitates further investigations of individual evolutionary steps that lead to a change of function in primordial proteins. In a second study (Seelig and Szostak, 2007), novel enzymes were generated that can join two pieces of RNA in a reaction for which no natural enzymes are known.
Recently it was found that, as in the previous case, the proteins have a structure unknown among modern enzymes. In this case, in vitro evolution started from a small, non-enzymatic protein. A similar selection process initiated from a library of random polypeptides is in progress. These results not only allow for estimating the occurrence of function in random protein assemblies but also provide evidence for the possibility of alternative protein worlds. Extant proteins might simply represent a frozen accident in the world of possible proteins. Alternative collections of proteins, even with similar functions, could originate alternative evolutionary paths.
Zhang, Hanwei; Zhou, Pu; Wang, Xiong; Du, Xueyuan; Xiao, Hu; Xu, Xiaojun
2015-06-29
Two kinds of hundred-watt-level random distributed feedback Raman fiber lasers have been demonstrated. The optical efficiency reaches as high as 84.8%; to our knowledge, these are the highest power and efficiency reported for a random laser. We have also demonstrated that the developed random laser can be further used to pump a Ho-doped fiber laser for mid-infrared laser generation, achieving a 23 W laser output at 2050 nm. The presented laser obtains high power output efficiently and conveniently and opens a new direction for high-power laser sources at designed wavelengths.
Kuz'min, A A; Meshkovskiĭ, D V; Filist, S A
2008-01-01
Problems of engineering and algorithm development of magnetic therapy apparatuses with pseudo-random radiation spectrum within the audio range for treatment of prostatitis and gynecopathies are considered. A typical design based on a PIC 16F microcontroller is suggested. It includes a keyboard, LCD indicator, audio amplifier, inducer, and software units. The problem of pseudo-random signal generation within the audio range is considered. A series of rectangular pulses is generated on a random-length interval on the basis of a three-component random vector. This series provides the required spectral characteristics of the therapeutic magnetic field and their adaptation to the therapeutic conditions and individual features of the patient.
Random functions via Dyson Brownian Motion: progress and problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Gaoyuan; Battefeld, Thorsten
2016-09-05
We develop a computationally efficient extension of the Dyson Brownian Motion (DBM) algorithm to generate random functions in C^2 locally. We further explain that random functions generated via DBM show an unstable growth as the traversed distance increases. This feature restricts the use of such functions considerably if they are to be used to model globally defined ones. The latter is the case if one uses random functions to model landscapes in string theory. We provide a concrete example, based on a simple axionic potential often used in cosmology, to highlight this problem and also offer an ad hoc modification of DBM that suppresses this growth to some degree.
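For orientation, a single Euler-Maruyama step of Dyson Brownian Motion for a handful of eigenvalues can be sketched as follows. This toy sketch (beta = 1, arbitrary step size and starting spectrum) illustrates the noise-plus-pairwise-repulsion dynamics only, not the authors' C^2 extension:

```python
import math
import random

def dbm_step(eigs, dt, beta=1.0, rng=random):
    """One Euler-Maruyama step of Dyson Brownian Motion:
    d(lambda_i) = sqrt(2*dt/beta) * dB_i + sum_{j != i} dt / (lambda_i - lambda_j),
    i.e. Brownian noise plus pairwise eigenvalue repulsion."""
    n = len(eigs)
    new = []
    for i in range(n):
        drift = sum(dt / (eigs[i] - eigs[j]) for j in range(n) if j != i)
        noise = math.sqrt(2.0 * dt / beta) * rng.gauss(0.0, 1.0)
        new.append(eigs[i] + drift + noise)
    return sorted(new)

rng = random.Random(1)
eigs = [-2.0, -0.5, 0.5, 2.0]  # toy starting spectrum
for _ in range(100):
    eigs = dbm_step(eigs, dt=1e-4, rng=rng)
```

The repulsion term keeps the eigenvalues distinct while the Brownian term diffuses them, which is the mechanism behind locally generated random landscapes.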
ERIC Educational Resources Information Center
Vickers, F. D.
1973-01-01
A brief description of a test generating program which generates questions concerning the Fortran programming language in a random but guided fashion, without resorting to an item bank. (Author/AK)
NASA Technical Reports Server (NTRS)
Dean, Bruce H. (Inventor)
2009-01-01
A method of recovering unknown aberrations in an optical system includes: collecting intensity data produced by the optical system; generating an initial estimate of the phase of the optical system; iteratively performing a phase retrieval on the intensity data to generate a phase estimate, using an initial diversity function corresponding to the intensity data; generating a phase map from the phase-retrieval phase estimate; decomposing the phase map to generate a decomposition vector; generating an updated diversity function by combining the initial diversity function with the decomposition vector; and generating an updated estimate of the phase of the optical system by removing the initial diversity function from the phase map. The method may further include repeating the process, beginning with iteratively performing a phase retrieval on the intensity data, using the updated estimate of the phase of the optical system in place of the initial estimate and the updated diversity function in place of the initial diversity function, until a predetermined convergence is achieved.
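The iterative phase-retrieval step at the heart of this method can be illustrated with a minimal Gerchberg-Saxton-style loop. The sketch below alternates magnitude constraints between two planes on toy self-consistent data; it is a generic illustration of iterative phase retrieval, not the patented diversity-function algorithm:

```python
import numpy as np

def phase_retrieval(measured_mag, pupil_mag, n_iter=50, seed=0):
    """Alternate magnitude projections between the focal plane (where
    `measured_mag` is known) and the pupil plane (where `pupil_mag` is
    known), keeping only the phase from each transform."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(-np.pi, np.pi, measured_mag.shape)
    errors = []
    for _ in range(n_iter):
        focal = measured_mag * np.exp(1j * phase)         # impose focal magnitude
        pupil = np.fft.ifft2(focal)
        pupil = pupil_mag * np.exp(1j * np.angle(pupil))  # impose pupil magnitude
        estimate = np.fft.fft2(pupil)
        errors.append(float(np.linalg.norm(np.abs(estimate) - measured_mag)))
        phase = np.angle(estimate)
    return phase, errors

# Toy self-consistent data: magnitudes derived from a known pupil field
rng = np.random.default_rng(1)
true_pupil = np.exp(1j * rng.uniform(-0.5, 0.5, (16, 16)))
measured = np.abs(np.fft.fft2(true_pupil))
est_phase, errors = phase_retrieval(measured, np.abs(true_pupil))
```

Each iteration is an error-reduction step, so the magnitude mismatch is non-increasing; the patented method wraps such a loop in the diversity-function update described above.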
NASA Astrophysics Data System (ADS)
Xia, Zhiye; Xu, Lisheng; Chen, Hongbin; Wang, Yongqian; Liu, Jinbao; Feng, Wenlan
2017-06-01
Extended range forecasting of 10-30 days, which lies between medium-term and climate prediction in terms of timescale, plays a significant role in decision-making processes for the prevention and mitigation of disastrous meteorological events. The sensitivity of initial error, model parameter error, and random error in a nonlinear cross-prediction error (NCPE) model, and their stability over the prediction validity period in 10-30-day extended range forecasting, are analyzed quantitatively. The associated sensitivity of precipitable water, temperature, and geopotential height during cases of heavy rain and hurricane is also discussed. The results are summarized as follows. First, the initial error and random error interact. When the ratio of random error to initial error is small (10^-6 to 10^-2), minor variation in random error cannot significantly change the dynamic features of a chaotic system, and therefore random error has minimal effect on the prediction. When the ratio is in the range of 10^-1 to 10^2 (i.e., random error dominates), attention should be paid to the random error instead of only the initial error. When the ratio is around 10^-2 to 10^-1, both influences must be considered. Their mutual effects may bring considerable uncertainty to extended range forecasting, and de-noising is therefore necessary. Second, in terms of model parameter error, the embedding dimension m should be determined from the actual nonlinear time series. The dynamic features of a chaotic system cannot be depicted because of the incomplete structure of the attractor when m is small. When m is large, prediction indicators can vanish because of the scarcity of phase points in phase space. A method for overcoming the cut-off effect (m > 4) is proposed. Third, for heavy rains, precipitable water is more sensitive to the prediction validity period than temperature or geopotential height; however, for hurricanes, geopotential height is most sensitive, followed by precipitable water.
Accurate initial conditions in mixed dark matter-baryon simulations
NASA Astrophysics Data System (ADS)
Valkenburg, Wessel; Villaescusa-Navarro, Francisco
2017-06-01
We quantify the error in the results of mixed baryon-dark-matter hydrodynamic simulations, stemming from outdated approximations for the generation of initial conditions. The error at redshift 0 in contemporary large simulations is of the order of few to 10 per cent in the power spectra of baryons and dark matter, and their combined total-matter power spectrum. After describing how to properly assign initial displacements and peculiar velocities to multiple species, we review several approximations: (1) using the total-matter power spectrum to compute displacements and peculiar velocities of both fluids, (2) scaling the linear redshift-zero power spectrum back to the initial power spectrum using the Newtonian growth factor ignoring homogeneous radiation, (3) using a mix of general-relativistic gauges so as to approximate Newtonian gravity, namely longitudinal-gauge velocities with synchronous-gauge densities and (4) ignoring the phase-difference in the Fourier modes for the offset baryon grid, relative to the dark-matter grid. Three of these approximations do not take into account that dark matter and baryons experience a scale-dependent growth after photon decoupling, which results in directions of velocity that are not the same as their direction of displacement. We compare the outcome of hydrodynamic simulations with these four approximations to our reference simulation, all setup with the same random seed and simulated using gadget-III.
Evolutionary Initial Poses of Reduced D.O.F’s Quadruped Robot
NASA Astrophysics Data System (ADS)
Iida, Ken-Ichi; Nakata, Yoshitaka; Hira, Toshio; Kamano, Takuya; Suzuki, Takayuki
In this paper, an application of a genetic algorithm to the generation of evolutionary initial poses for a quadruped robot with reduced degrees of freedom is described. To reduce the degrees of freedom, each leg of the robot has a slider-crank mechanism and is driven by a single actuator. Furthermore, we introduce a forward-movement mode and a rotating mode to enable omnidirectional movement. To generate suitable initial poses, the initial angles of the four legs are encoded in Gray code and tuned by an evaluation function in each mode with the genetic algorithm. As a result, the cooperation of the legs is realized to move in any direction. The experimental results demonstrate that the proposed scheme is effective for generating suitable initial poses and that the robot can walk smoothly with the generated patterns.
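Gray coding, used above to encode the leg angles, maps integers so that adjacent values differ in exactly one bit, which keeps single-bit GA mutations from causing large jumps in the decoded angle. A standard encode/decode pair (the angle index below is a hypothetical chromosome segment, not from the paper):

```python
def to_gray(n: int) -> int:
    """Binary-reflected Gray code: consecutive integers differ in one bit."""
    return n ^ (n >> 1)

def from_gray(g: int) -> int:
    """Decode by XOR-folding successively shifted copies of the code."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Example: an 8-bit leg-angle index as a GA chromosome segment (hypothetical)
angle_index = 180
chromosome = to_gray(angle_index)
```

Because neighboring angle indices share all but one bit, the GA's mutation operator explores the pose space smoothly rather than in binary-carry jumps.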
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-02
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-792] Certain Static Random Access Memories and Products Containing Same; Commission Determination To Review in Part a Final Initial... States after importation of certain static random access memories and products containing the same by...
Color image encryption based on gyrator transform and Arnold transform
NASA Astrophysics Data System (ADS)
Sui, Liansheng; Gao, Bo
2013-06-01
A color image encryption scheme using gyrator transform and Arnold transform is proposed, which has two security levels. In the first level, the color image is separated into three components: red, green and blue, which are normalized and scrambled using the Arnold transform. The green component is combined with the first random phase mask and transformed to an interim using the gyrator transform. The first random phase mask is generated with the sum of the blue component and a logistic map. Similarly, the red component is combined with the second random phase mask and transformed to three-channel-related data. The second random phase mask is generated with the sum of the phase of the interim and an asymmetrical tent map. In the second level, the three-channel-related data are scrambled again and combined with the third random phase mask generated with the sum of the previous chaotic maps, and then encrypted into a gray scale ciphertext. The encryption result has stationary white noise distribution and camouflage property to some extent. In the process of encryption and decryption, the rotation angle of gyrator transform, the iterative numbers of Arnold transform, the parameters of the chaotic map and generated accompanied phase function serve as encryption keys, and hence enhance the security of the system. Simulation results and security analysis are presented to confirm the security, validity and feasibility of the proposed scheme.
An adaptive random search for short term generation scheduling with network constraints.
Marmolejo, J A; Velasco, Jonás; Selley, Héctor J
2017-01-01
This paper presents an adaptive random search approach to a short-term generation scheduling problem with network constraints, which determines the startup and shutdown schedules of thermal units over a given planning horizon. In this model, we consider the transmission network through capacity limits and line losses. The mathematical model is stated as a Mixed Integer Nonlinear Problem with binary variables. The proposed heuristic is a population-based method that generates a set of new potential solutions via a random search strategy based on the Markov Chain Monte Carlo method. The key feature of the proposed method is that the noise level of the random search is adaptively controlled in order to explore and exploit the entire search space. To improve the solutions, we couple a local search into the random search process. Several test systems are presented to evaluate the performance of the proposed heuristic. We use a commercial optimizer to compare the quality of the solutions provided by the proposed method. The solution of the proposed algorithm showed a significant reduction in computational effort with respect to the full-scale outer approximation commercial solver. Numerical results show the potential and robustness of our approach.
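The core idea of adaptively controlling the noise level of a random search can be sketched as follows. This toy minimizer (step-size growth/shrink factors and test function chosen arbitrarily) illustrates the explore/exploit control only, not the paper's MCMC-based unit-commitment heuristic:

```python
import random

def adaptive_random_search(f, x0, iters=500, seed=0):
    """Minimize f by Gaussian random perturbations whose scale adapts:
    grow the step after an improvement (explore), shrink it after a
    failure (exploit the current neighborhood)."""
    rng = random.Random(seed)
    x, fx, step = list(x0), f(x0), 1.0
    for _ in range(iters):
        cand = [xi + rng.gauss(0.0, step) for xi in x]
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
            step *= 1.2   # reward: search more boldly
        else:
            step *= 0.95  # penalty: narrow the search
    return x, fx

sphere = lambda v: sum(xi * xi for xi in v)  # toy objective
best, best_val = adaptive_random_search(sphere, [3.0, -4.0])
```

Widening the noise after successes and narrowing it after failures gives the search a crude annealing behavior: broad moves early, fine local refinement once improvements become rare.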
Labrecque, Michel; Ratté, Stéphane; Frémont, Pierre; Cauchon, Michel; Ouellet, Jérôme; Hogg, William; McGowan, Jessie; Gagnon, Marie-Pierre; Njoya, Merlin; Légaré, France
2013-10-01
To compare the ability of users of 2 medical search engines, InfoClinique and the Trip database, to provide correct answers to clinical questions and to explore the perceived effects of the tools on the clinical decision-making process. Randomized trial. Three family medicine units of the family medicine program of the Faculty of Medicine at Laval University in Quebec City, Que. Fifteen second-year family medicine residents. Residents generated 30 structured questions about therapy or preventive treatment (2 questions per resident) based on clinical encounters. Using an Internet platform designed for the trial, each resident answered 20 of these questions (their own 2, plus 18 of the questions formulated by other residents, selected randomly) before and after searching for information with 1 of the 2 search engines. For each question, 5 residents were randomly assigned to begin their search with InfoClinique and 5 with the Trip database. The ability of residents to provide correct answers to clinical questions using the search engines, as determined by third-party evaluation. After answering each question, participants completed a questionnaire to assess their perception of the engine's effect on the decision-making process in clinical practice. Of 300 possible pairs of answers (1 answer before and 1 after the initial search), 254 (85%) were produced by 14 residents. Of these, 132 (52%) and 122 (48%) pairs of answers concerned questions that had been assigned an initial search with InfoClinique and the Trip database, respectively. Both engines produced an important and similar absolute increase in the proportion of correct answers after searching (26% to 62% for InfoClinique, for an increase of 36%; 24% to 63% for the Trip database, for an increase of 39%; P = .68). For all 30 clinical questions, at least 1 resident produced the correct answer after searching with either search engine.
The mean (SD) time of the initial search for each question was 23.5 (7.6) minutes with InfoClinique and 22.3 (7.8) minutes with the Trip database (P = .30). Participants' perceptions of each engine's effect on the decision-making process were very positive and similar for both search engines. Family medicine residents' ability to provide correct answers to clinical questions increased dramatically and similarly with the use of both InfoClinique and the Trip database. These tools have strong potential to increase the quality of medical care.
ERIC Educational Resources Information Center
Rinehart, Nicole J.; Bradshaw, John L.; Moss, Simon A.; Brereton, Avril V.; Tonge, Bruce J.
2006-01-01
The repetitive, stereotyped and obsessive behaviours, which are core diagnostic features of autism, are thought to be underpinned by executive dysfunction. This study examined executive impairment in individuals with autism and Asperger's disorder using a verbal equivalent of an established pseudo-random number generating task. Different patterns…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yonggang, E-mail: wangyg@ustc.edu.cn; Hui, Cong; Liu, Chong; Xu, Chao
2016-04-01
This paper proposes a new entropy extraction mechanism based on sampling the phase jitter in ring oscillators, making a high-throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in the FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with maximum entropy and a fast sampling speed. This parameterized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are exploited to realize precise phase shifting. The generator circuit is simple and resource-saving, so multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method, in a purely digital fashion, can provide high-speed, high-quality random bit sequences for a variety of embedded applications.
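A software sketch of the multi-phase jitter-harvesting idea, with Gaussian edge jitter and XOR folding standing in for the FPGA circuitry (all names, waveform parameters and phase offsets are illustrative assumptions, not the paper's design):

```python
import random

def jittery_bit(rng, sample_time, period=1.0, jitter=0.05):
    """One raw sample: a square wave whose sampled edge position carries
    Gaussian timing jitter (the physical entropy source, simplified)."""
    phase = (sample_time + rng.gauss(0.0, jitter)) % period
    return 1 if phase < period / 2 else 0

def multi_phase_bit(rng, n_phases=8, base_time=12.3456):
    """XOR samples taken at several evenly spaced phase offsets, mimicking
    multi-phase sampling: at least one offset lands near an edge, so the
    folded bit inherits the jitter entropy."""
    b = 0
    for k in range(n_phases):
        b ^= jittery_bit(rng, base_time + k / n_phases)
    return b

rng = random.Random(1)
bits = [multi_phase_bit(rng) for _ in range(4096)]
ones = sum(bits)
```

In hardware the same effect is obtained with carry-chain phase shifting; the XOR fold here only illustrates why sampling many phases raises the harvested entropy per output bit.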
NASA Astrophysics Data System (ADS)
Lee, Hochul; Ebrahimi, Farbod; Amiri, Pedram Khalili; Wang, Kang L.
2017-05-01
A true random number generator based on perpendicularly magnetized voltage-controlled magnetic tunnel junction devices (MRNG) is presented. Unlike MTJs used in memory applications, where a stable bit is needed to store information, in this work the MTJ is intentionally designed with small perpendicular magnetic anisotropy (PMA). This allows one to take advantage of the thermally activated fluctuations of its free layer as a stochastic noise source. Furthermore, we take advantage of the voltage dependence of anisotropy to temporarily change the MTJ state into an unstable state when a voltage is applied. Since the MTJ has two energetically stable states, the final state is randomly chosen by thermal fluctuation. The voltage-controlled magnetic anisotropy (VCMA) effect is used to generate the metastable state of the MTJ by lowering its energy barrier. The proposed MRNG achieves a high throughput (32 Gbps) by implementing a 64 × 64 MTJ array in CMOS circuits and executing operations in parallel. Furthermore, the circuit consumes very little energy to generate a random bit (31.5 fJ/bit) due to the high energy efficiency of voltage-controlled MTJ switching.
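The thermally decided final state can be modelled as a Boltzmann-weighted coin flip. A minimal sketch (the helper name and energy-asymmetry parameter are modelling assumptions, not the paper's device physics):

```python
import math, random

def mtj_bit(rng, delta_e_kT=0.0):
    """One output bit: after the VCMA pulse lowers the energy barrier, the
    free layer relaxes into one of the two stable states with a
    Boltzmann-weighted probability. delta_e_kT is the energy asymmetry
    between the states in units of kT (0 means an unbiased device)."""
    p_one = 1.0 / (1.0 + math.exp(delta_e_kT))
    return 1 if rng.random() < p_one else 0

rng = random.Random(2024)
bits = [mtj_bit(rng) for _ in range(10000)]
ones = sum(bits)
```

A real device has a nonzero asymmetry that drifts with temperature and process variation, which is why such generators typically add a post-processing (whitening) stage.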
Shaping the spectrum of random-phase radar waveforms
Doerry, Armin W.; Marquette, Brandeis
2017-05-09
The various technologies presented herein relate to generation of a desired waveform profile in the form of a spectrum of apparently random noise (e.g., white noise or colored noise), but with precise spectral characteristics. Hence, a waveform profile that could otherwise be readily determined (e.g., by a spoofing system) is effectively obscured. Obscuration is achieved by dividing the waveform into a series of chips, each with an assigned frequency, wherein the sequence of chips is subsequently randomized. Randomization can be a function of the application of a key to the chip sequence. During processing of the echo pulse, a copy of the randomized transmitted pulse is recovered or regenerated, against which the received echo is correlated. Hence, with the echo energy range-compressed in this manner, it is possible to generate a radar image with a precise impulse response.
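A minimal sketch of the keyed chip randomization, with a seeded PRNG shuffle standing in for the key-driven permutation (function names, the frequency profile and the key are illustrative assumptions):

```python
import random

def build_chip_sequence(n_chips, freqs, key):
    """Assign each chip a frequency from an ordered spectral profile, then
    shuffle the chip order with a keyed PRNG: the spectrum keeps its exact
    shape while the time sequence looks like noise."""
    assert len(freqs) == n_chips
    order = list(range(n_chips))
    random.Random(key).shuffle(order)      # key-driven randomization
    return [freqs[i] for i in order]

def recover_profile(chips, key):
    """A receiver holding the key regenerates the permutation and undoes
    it, recovering the ordered profile for correlation processing."""
    n = len(chips)
    order = list(range(n))
    random.Random(key).shuffle(order)
    profile = [0.0] * n
    for pos, i in enumerate(order):
        profile[i] = chips[pos]
    return profile

freqs = [100.0 + 5.0 * k for k in range(16)]   # desired spectral profile
tx = build_chip_sequence(16, freqs, key=0xBEEF)
rx = recover_profile(tx, key=0xBEEF)
```

The shuffle changes only the chip order, so the transmitted multiset of frequencies (the spectrum envelope) is unchanged while the time-domain ordering is unpredictable without the key.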
Global solutions to random 3D vorticity equations for small initial data
NASA Astrophysics Data System (ADS)
Barbu, Viorel; Röckner, Michael
2017-11-01
One proves the existence and uniqueness in (L^p(R^3))^3, 3/2 < p < 2, of a global mild solution to random vorticity equations associated to stochastic 3D Navier-Stokes equations with linear multiplicative Gaussian noise of convolution type, for sufficiently small initial vorticity. This resembles some earlier deterministic results of T. Kato [16] and is obtained by treating the equation in vorticity form and reducing the latter to a random nonlinear parabolic equation. The solution has maximal regularity in the spatial variables and is weakly continuous in (L^3 ∩ L^{3p/(4p-6)})^3 with respect to the time variable. Furthermore, we obtain the pathwise continuous dependence of solutions with respect to the initial data. In particular, one gets a locally unique solution of the 3D stochastic Navier-Stokes equation in vorticity form up to some explosion stopping time τ adapted to the Brownian motion.
Sippola, S; Grönroos, J; Tuominen, R; Paajanen, H; Rautio, T; Nordström, P; Aarnio, M; Rantanen, T; Hurme, S; Salminen, P
2017-09-01
An increasing amount of evidence supports antibiotic therapy for treating uncomplicated acute appendicitis. The objective of this study was to compare the costs of antibiotics alone versus appendicectomy in treating uncomplicated acute appendicitis within the randomized controlled APPAC (APPendicitis ACuta) trial. The APPAC multicentre, non-inferiority RCT was conducted on patients with CT-confirmed uncomplicated acute appendicitis. Patients were assigned randomly to appendicectomy or antibiotic treatment. All costs were recorded, whether generated by the initial visit and subsequent treatment or possible recurrent appendicitis during the 1-year follow-up. The cost estimates were based on cost levels for the year 2012. Some 273 patients were assigned to the appendicectomy group and 257 to antibiotic treatment. Most patients randomized to antibiotic treatment did not require appendicectomy during the 1-year follow-up. In the operative group, overall societal costs (€5989·2, 95 per cent c.i. 5787·3 to 6191·1) were 1·6 times higher (€2244·8, 1940·5 to 2549·1) than those in the antibiotic group (€3744·4, 3514·6 to 3974·2). In both groups, productivity losses represented a slightly higher proportion of overall societal costs than all treatment costs together, with diagnostics and medicines having a minor role. Those in the operative group were prescribed significantly more sick leave than those in the antibiotic group (mean(s.d.) 17·0(8·3) (95 per cent c.i. 16·0 to 18·0) versus 9·2(6·9) (8·3 to 10·0) days respectively; P < 0·001). When the age and sex of the patient as well as the hospital were controlled for simultaneously, the operative treatment generated significantly more costs in all models. Patients receiving antibiotic therapy for uncomplicated appendicitis incurred lower costs than those who had surgery. © 2017 BJS Society Ltd Published by John Wiley & Sons Ltd.
True random bit generators based on current time series of contact glow discharge electrolysis
NASA Astrophysics Data System (ADS)
Rojas, Andrea Espinel; Allagui, Anis; Elwakil, Ahmed S.; Alawadhi, Hussain
2018-05-01
Random bit generators (RBGs) in today's digital information and communication systems employ high-rate physical entropy sources such as electronic, photonic, or thermal time-series signals. However, the proper functioning of such physical systems is bound by specific constraints that make them in some cases weak and susceptible to external attacks. In this study, we show that the electrical current time series of contact glow discharge electrolysis, a dc voltage-powered micro-plasma in liquids, can be used for generating random bit sequences over a wide range of high dc voltages. The current signal is quantized into a binary stream by first applying a simple moving-average function, which centers the distribution around zero, and then applying logical operations, which enable the binarized data to pass all tests in the industry-standard randomness test suite from the National Institute of Standards and Technology. Furthermore, the robustness of this RBG against power supply attacks has been examined and verified.
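The quantization chain can be sketched as moving-average centering, zero thresholding, and an XOR whitening step. The signal model and the specific logical operation are assumptions standing in for the authors' exact pipeline:

```python
import random

def binarize(signal, window=8):
    """Centre the signal with a simple moving average of the preceding
    samples, threshold the residual at zero, then XOR adjacent bits
    (a basic whitening step standing in for the paper's logical ops)."""
    bits = []
    for i in range(window, len(signal)):
        avg = sum(signal[i - window:i]) / window
        bits.append(1 if signal[i] - avg > 0 else 0)
    # XOR-decimate pairs to reduce residual bias
    return [bits[i] ^ bits[i + 1] for i in range(0, len(bits) - 1, 2)]

rng = random.Random(42)
# stand-in for the glow-discharge current: drifting baseline plus noise
signal = [50.0 + 0.01 * t + rng.gauss(0.0, 1.0) for t in range(2000)]
out = binarize(signal)
```

Subtracting the moving average removes the slow baseline drift so the threshold sees a roughly zero-mean residual, which is what makes a fixed threshold at zero workable.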
A New Quantum Gray-Scale Image Encoding Scheme
NASA Astrophysics Data System (ADS)
Naseri, Mosayeb; Abdolmaleky, Mona; Parandin, Fariborz; Fatahi, Negin; Farouk, Ahmed; Nazari, Reza
2018-02-01
In this paper, a new quantum image encoding scheme is proposed. The proposed scheme mainly consists of four different encoding algorithms. The idea behind the scheme is a binary key generated randomly for each pixel of the original image. The encoding algorithm applied to a pixel is then selected according to the corresponding qubit pair of the generated randomized binary key. The security analysis of the proposed scheme shows that security is enhanced through both the randomization of the generated binary image key and the alteration of the gray-scale values of the image pixels using the qubits of the randomized binary key. Simulation of the proposed scheme confirms that the final encoded image cannot be recognized visually. Moreover, the histogram of the encoded image is flatter than that of the original one. The Shannon entropies of the final encoded images are significantly higher than that of the original one, which indicates that an attacker cannot gain any information about the encoded images. Supported by Kermanshah Branch, Islamic Azad University, Kermanshah, IRAN
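A classical analogue of the per-pixel, key-selected encoding can be sketched with four invertible gray-scale transforms chosen by a 2-bit key. The four transforms and helper names here are illustrative, not the paper's quantum algorithms:

```python
import random

# four invertible per-pixel encodings, selected by a 2-bit key value
ENCODE = [
    lambda v, k: v ^ 0xFF,            # 0: bitwise complement
    lambda v, k: (v + k) % 256,       # 1: keyed addition mod 256
    lambda v, k: v ^ k,               # 2: keyed XOR
    lambda v, k: (256 - v) % 256,     # 3: negation mod 256
]
DECODE = [
    lambda v, k: v ^ 0xFF,
    lambda v, k: (v - k) % 256,
    lambda v, k: v ^ k,
    lambda v, k: (256 - v) % 256,
]

def encode_image(pixels, seed):
    """Draw a fresh (algorithm, key-byte) pair per pixel and apply it."""
    rng = random.Random(seed)
    keys = [(rng.randrange(4), rng.randrange(256)) for _ in pixels]
    cipher = [ENCODE[a](v, k) for v, (a, k) in zip(pixels, keys)]
    return cipher, keys

def decode_image(cipher, keys):
    """Invert each pixel with the matching decoding algorithm."""
    return [DECODE[a](v, k) for v, (a, k) in zip(cipher, keys)]

img = [10, 200, 37, 255, 0, 128]
ct, keys = encode_image(img, seed=7)
pt = decode_image(ct, keys)
```

Because the selector is drawn independently per pixel, identical plaintext pixels generally map to different ciphertext values, which is what flattens the histogram.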
NASA Astrophysics Data System (ADS)
Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min
2016-01-01
Quantum information and quantum computation have achieved great success in recent years. In this paper, we investigate the capability of a quantum Hash function, which can be constructed by subtly modifying quantum walks, a famous quantum computation model. It is found that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems, with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further, we discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is suitable for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information.
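The idea of hashing a message through a message-dependent quantum walk can be sketched with a small simulated coined walk on a cycle: each message bit selects one of two coin operators, and the final position distribution is quantised into a digest. All parameters (cycle size, coin angles, digest length) are toy assumptions, not the authors' construction:

```python
import numpy as np

def qw_hash(bits, n_nodes=17, out_bytes=8):
    """Toy hash from a coined quantum walk on a cycle of n_nodes sites."""
    def coin(theta):
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, s], [s, -c]])   # real, orthogonal coin
    C = [coin(np.pi / 4), coin(np.pi / 3)]   # coin for bit 0 / bit 1
    # state[pos, spin]; walker starts at node 0 with spin (1, 0)
    state = np.zeros((n_nodes, 2), dtype=complex)
    state[0, 0] = 1.0
    for b in bits:
        state = state @ C[b].T                      # coin acts on spin
        shifted = np.zeros_like(state)
        shifted[:, 0] = np.roll(state[:, 0], 1)     # spin 0 steps right
        shifted[:, 1] = np.roll(state[:, 1], -1)    # spin 1 steps left
        state = shifted
    probs = np.sum(np.abs(state) ** 2, axis=1)      # position distribution
    return bytes(int(p * 255.999) for p in probs[:out_bytes])

h1 = qw_hash([0, 1, 1, 0, 1, 0, 0, 1])
h2 = qw_hash([0, 1, 1, 0, 1, 0, 0, 0])
```

The walk's sensitivity to the coin sequence gives the avalanche-like behaviour: flipping one message bit changes the interference pattern, hence the digest.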
Sampling large random knots in a confined space
NASA Astrophysics Data System (ADS)
Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.
2007-09-01
DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (like those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of order O(n^2). Therefore, two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.
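Sampling a uniform random polygon and counting the crossings of its 2D projection, the O(n^2) quantity discussed above, can be sketched directly (a brute-force pairwise crossing count; helper names are illustrative):

```python
import random

def uniform_random_polygon(n, seed=None):
    """A closed 'uniform random polygon': n vertices drawn independently
    and uniformly from the unit cube, joined in order (and back to start)."""
    rng = random.Random(seed)
    return [(rng.random(), rng.random(), rng.random()) for _ in range(n)]

def count_2d_crossings(poly):
    """Count proper edge crossings of the xy-projection, checking all
    non-adjacent edge pairs (O(n^2) pairs)."""
    n = len(poly)
    def seg(i):
        a, b = poly[i], poly[(i + 1) % n]
        return (a[0], a[1]), (b[0], b[1])
    def ccw(p, q, r):
        return (q[0]-p[0])*(r[1]-p[1]) - (q[1]-p[1])*(r[0]-p[0])
    crossings = 0
    for i in range(n):
        for j in range(i + 2, n):
            if i == 0 and j == n - 1:
                continue  # edges adjacent through the closing edge
            (p1, p2), (q1, q2) = seg(i), seg(j)
            if ccw(p1, p2, q1) * ccw(p1, p2, q2) < 0 and \
               ccw(q1, q2, p1) * ccw(q1, q2, p2) < 0:
                crossings += 1
    return crossings

poly = uniform_random_polygon(40, seed=3)
x = count_2d_crossings(poly)
```

Averaging `count_2d_crossings` over many samples at increasing n is one way to observe the quadratic growth of the expected crossing number numerically.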
Automatic Nanodesign Using Evolutionary Techniques
NASA Technical Reports Server (NTRS)
Globus, Al; Saini, Subhash (Technical Monitor)
1998-01-01
Many problems associated with the development of nanotechnology require custom-designed molecules. We use genetic graph software, a new development, to automatically evolve molecules of interest when only the requirements are known. Genetic graph software designs molecules, and potentially nanoelectronic circuits, given a fitness function that determines which of two molecules is better. A set of molecules, the first generation, is generated at random and then tested with the fitness function. Subsequent generations are created by randomly choosing two parent molecules with a bias towards high-scoring molecules, tearing each molecule in two at random, and mating parts from the mother and father to create two children. This procedure is repeated until a satisfactory molecule is found. An atom pair similarity test is currently used as the fitness function to evolve molecules similar to existing pharmaceuticals.
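The generational loop described above (random first generation, biased parent choice, tearing and mating) can be sketched generically with strings standing in for molecules. The mutation step, elitism, and all parameters are assumptions added so the toy run converges, not features claimed of the genetic graph software:

```python
import random

def evolve(fitness, alphabet, length, pop_size=40, generations=60,
           mut_rate=0.1, seed=0):
    """Generic evolutionary loop: random first generation, parent choice
    biased toward high scorers, each parent torn at a random point, and
    halves from mother and father mated into two children."""
    rng = random.Random(seed)
    pop = ["".join(rng.choice(alphabet) for _ in range(length))
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        nxt = scored[:2]                              # keep the two best
        while len(nxt) < pop_size:
            mom = rng.choice(scored[:pop_size // 2])  # bias to top half
            dad = rng.choice(scored[:pop_size // 2])
            cut = rng.randrange(1, length)            # tear point
            for child in (mom[:cut] + dad[cut:], dad[:cut] + mom[cut:]):
                if rng.random() < mut_rate:           # occasional mutation
                    i = rng.randrange(length)
                    child = child[:i] + rng.choice(alphabet) + child[i + 1:]
                nxt.append(child)
        pop = nxt[:pop_size]
    return max(pop, key=fitness)

# usage: evolve toward a target string; fitness = matching positions,
# a crude stand-in for the atom pair similarity test
target = "NANOTUBE"
best = evolve(lambda s: sum(a == b for a, b in zip(s, target)),
              alphabet="ABENOTU", length=len(target))
score = sum(a == b for a, b in zip(best, target))
```

In the actual system the individuals are molecular graphs rather than strings, so tearing and mating operate on graph fragments instead of substrings.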
Attardo, Alessio; Calegari, Federico; Haubensak, Wulf; Wilsch-Bräuninger, Michaela; Huttner, Wieland B.
2008-01-01
The neurons of the mammalian brain are generated by progenitors dividing either at the apical surface of the ventricular zone (neuroepithelial and radial glial cells, collectively referred to as apical progenitors) or at its basal side (basal progenitors, also called intermediate progenitors). For apical progenitors, the orientation of the cleavage plane relative to their apical-basal axis is thought to be of critical importance for the fate of the daughter cells. For basal progenitors, the relationship between cell polarity, cleavage plane orientation and the fate of daughter cells is unknown. Here, we have investigated these issues at the very onset of cortical neurogenesis. To directly observe the generation of neurons from apical and basal progenitors, we established a novel transgenic mouse line in which membrane GFP is expressed from the beta-III-tubulin promoter, an early pan-neuronal marker, and crossed this line with a previously described knock-in line in which nuclear GFP is expressed from the Tis21 promoter, a pan-neurogenic progenitor marker. Mitotic Tis21-positive basal progenitors nearly always divided symmetrically, generating two neurons, but, in contrast to symmetrically dividing apical progenitors, lacked apical-basal polarity and showed a nearly randomized cleavage plane orientation. Moreover, the appearance of beta-III-tubulin–driven GFP fluorescence in basal progenitor-derived neurons, in contrast to that in apical progenitor-derived neurons, was so rapid that it suggested the initiation of the neuronal phenotype already in the progenitor. Our observations imply that (i) the loss of apical-basal polarity restricts neuronal progenitors to the symmetric mode of cell division, and that (ii) basal progenitors initiate the expression of neuronal phenotype already before mitosis, in contrast to apical progenitors. PMID:18545663
Crouch, Daniel J M
2017-10-27
The prevalence of sexual reproduction remains mysterious, as it poses clear evolutionary drawbacks compared to reproducing asexually. Several possible explanations exist, one of the most likely being that finite population size causes linkage disequilibria to arise at random and impede the progress of natural selection, and that these are eroded by recombination via sexual reproduction. Previous investigations have either analysed this phenomenon in detail for small numbers of loci, or performed population simulations for many loci. Here we present a quantitative genetic model for fitness, based on the Price Equation, in order to examine the theoretical consequences of randomly generated linkage disequilibria when there are many loci. In addition, most previous work has been concerned with the long-term consequences of deleterious linkage disequilibria for population fitness. The expected change in mean fitness between consecutive generations, a measure of short-term evolutionary success, is shown under random environmental influences to be related to the autocovariance in mean fitness between the generations, capturing the effects of stochastic forces such as genetic drift. Interaction between genetic drift and natural selection, due to randomly generated linkage disequilibria, is demonstrated to be one possible source of mean fitness autocovariance. This suggests a possible role for sexual reproduction in reducing the negative effects of genetic drift, thereby improving the short-term efficacy of natural selection. Copyright © 2017 Elsevier Ltd. All rights reserved.
Design and implementation of the NaI(Tl)/CsI(Na) detectors output signal generator
NASA Astrophysics Data System (ADS)
Zhou, Xu; Liu, Cong-Zhan; Zhao, Jian-Ling; Zhang, Fei; Zhang, Yi-Fei; Li, Zheng-Wei; Zhang, Shuo; Li, Xu-Fang; Lu, Xue-Feng; Xu, Zhen-Ling; Lu, Fang-Jun
2014-02-01
We designed and implemented a signal generator that can simulate the output of the pre-amplifiers of the NaI(Tl)/CsI(Na) detectors onboard the Hard X-ray Modulation Telescope (HXMT). Developing for an FPGA (Field Programmable Gate Array) in VHDL and adding a random component, we produced a double-exponential random pulse signal generator. The statistical distribution of the signal amplitude is programmable, and the time intervals between adjacent signals statistically follow a negative exponential distribution.
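The generator's statistics can be sketched in software: Poisson arrivals give negative-exponential inter-event intervals, and each event carries a double-exponential pulse shape. The time constants, event rate and amplitude law below are illustrative assumptions, not the HXMT instrument parameters:

```python
import math, random

def pulse(t, a, tau_rise=0.02e-6, tau_fall=1.0e-6):
    """Double-exponential pulse shape with amplitude scale a at time t >= 0:
    a fast rising edge subtracted from a slow falling tail."""
    if t < 0:
        return 0.0
    return a * (math.exp(-t / tau_fall) - math.exp(-t / tau_rise))

def generate_events(n, rate_hz=1e4, seed=0):
    """Event times with exponentially distributed intervals (Poisson
    arrivals) and a programmable amplitude distribution (uniform here)."""
    rng = random.Random(seed)
    t, events = 0.0, []
    for _ in range(n):
        t += rng.expovariate(rate_hz)      # negative-exponential gap
        events.append((t, rng.uniform(0.5, 1.0)))
    return events

events = generate_events(1000)
gaps = [b[0] - a[0] for a, b in zip(events, events[1:])]
mean_gap = sum(gaps) / len(gaps)
```

Summing `pulse(t - t_i, a_i)` over the generated events yields a synthetic pre-amplifier trace; in the FPGA the same shape is realised digitally with a pseudo-random interval and amplitude source.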