Neutron monitor generated data distributions in quantum variational Monte Carlo
NASA Astrophysics Data System (ADS)
Kussainov, A. S.; Pya, N.
2016-08-01
We have assessed the potential application of neutron monitor hardware as a random number generator for normal and uniform distributions. Data tables from acquisition channels with no extreme changes in signal level were chosen as the retrospective model. The stochastic component was extracted by fitting the raw data with splines and then subtracting the fit. Scaling the extracted data to zero mean and unit variance is sufficient to obtain a stable standard normal random variate. The distributions under consideration pass all available normality tests. Inverse transform sampling is suggested as a source of uniform random numbers. The variational Monte Carlo method for the quantum harmonic oscillator was used to test the quality of our random numbers. If the data delivery rate is of importance and the conventional one-minute-resolution neutron count is insufficient, we could always settle for an efficient seed generator to feed into a faster algorithmic random number generator, or create a buffer.
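A minimal Python sketch of the pipeline as described in the abstract, under stated assumptions: the counts below are synthetic stand-ins for a monitor channel, and the spline smoothing factor is a placeholder rather than the authors' setting.

```python
# Sketch of the detrending pipeline described above (my reading of the
# abstract, not the authors' code). `counts` stands in for a one-channel
# neutron-monitor time series; here it is synthetic.
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.stats import norm

rng = np.random.default_rng(0)
t = np.arange(1440.0)                      # one day of 1-min counts
counts = 100 + 5*np.sin(2*np.pi*t/1440) + rng.normal(0, 3, t.size)

# 1. Fit a smoothing spline and subtract it to isolate the stochastic part.
trend = UnivariateSpline(t, counts, s=len(t) * 9.0)   # smoothing factor is a guess
residual = counts - trend(t)

# 2. Scale to zero mean, unit variance -> approximately standard normal.
z = (residual - residual.mean()) / residual.std(ddof=1)

# 3. Probability integral transform: feed z through the normal CDF
#    to obtain (approximately) uniform variates on (0, 1).
u = norm.cdf(z)
print(z.mean(), z.std(ddof=1), u.min(), u.max())
```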
A Random Variable Transformation Process.
ERIC Educational Resources Information Center
Scheuermann, Larry
1989-01-01
Provides a short BASIC program, RANVAR, which generates random variates for various theoretical probability distributions. The seven variates are: uniform, exponential, normal, binomial, Poisson, Pascal, and triangular. (MVL)
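The BASIC listing itself is not reproduced in the abstract; as a hypothetical illustration, two of the seven variates can be generated by inverse transform sampling, sketched here in Python:

```python
# Hypothetical re-creation of two of RANVAR's variates via inverse
# transform sampling; the original BASIC listing is not reproduced here.
import random, math

def exponential(rate):
    # F^-1(u) = -ln(1 - u) / rate
    return -math.log(1.0 - random.random()) / rate

def triangular(a, c, b):
    # a = min, c = mode, b = max; piecewise inverse CDF
    u = random.random()
    if u < (c - a) / (b - a):
        return a + math.sqrt(u * (b - a) * (c - a))
    return b - math.sqrt((1.0 - u) * (b - a) * (b - c))

print(exponential(2.0), triangular(0.0, 1.0, 4.0))
```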
High-speed true random number generation based on paired memristors for security electronics
NASA Astrophysics Data System (ADS)
Zhang, Teng; Yin, Minghui; Xu, Changmin; Lu, Xiayan; Sun, Xinhao; Yang, Yuchao; Huang, Ru
2017-11-01
A true random number generator (TRNG) is a critical component in hardware security and is increasingly important in the era of mobile computing and the internet of things. Here we demonstrate a TRNG using the intrinsic variation of memristors as a natural source of entropy that is otherwise undesirable in most applications. The random bits were produced by cyclically switching a pair of tantalum-oxide-based memristors and comparing their resistance values in the off state, taking advantage of the more pronounced resistance variation compared with that in the on state. Using an alternating read scheme in the designed TRNG circuit, the unbiasedness of the random numbers was significantly improved, and the bitstream passed standard randomness tests. The Pt/TaOx/Ta memristors fabricated in this work have fast programming/erasing speeds of ~30 ns, suggesting a high random number throughput. The approach proposed here thus holds great promise for physically implemented random number generation.
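A toy numerical model (my illustration, not the authors' device physics) of why pairing plus an alternating read helps: comparing two noisy off-state resistances yields raw bits whose constant bias, caused by any fixed mismatch between the devices, is cancelled by flipping the comparison polarity on alternate cycles.

```python
# Toy model of the pairing idea: two "memristors" have off-state
# resistances drawn independently each cycle; one raw bit = (R_A > R_B).
# Alternating the read polarity on even/odd cycles cancels a fixed offset.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
# Device A has a slight systematic offset relative to device B.
R_A = rng.lognormal(mean=np.log(1.05e6), sigma=0.25, size=n)
R_B = rng.lognormal(mean=np.log(1.00e6), sigma=0.25, size=n)

raw = (R_A > R_B).astype(int)                 # biased toward 1
alt = raw.copy()
alt[1::2] ^= 1                                # alternate read polarity
print("raw bias:", raw.mean(), " alternating-read bias:", alt.mean())
```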
Computer methods for sampling from the gamma distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, M.E.; Tadikamalla, P.R.
1978-01-01
Considerable attention has recently been directed at developing ever faster algorithms for generating gamma random variates on digital computers. This paper surveys the current state of the art including the leading algorithms of Ahrens and Dieter, Atkinson, Cheng, Fishman, Marsaglia, Tadikamalla, and Wallace. General random variate generation techniques are explained with reference to these gamma algorithms. Computer simulation experiments on IBM and CDC computers are reported.
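All of the surveyed methods are acceptance-rejection schemes; to show the flavor of the family, here is a compact gamma generator in Python using a later algorithm of the same kind (Marsaglia and Tsang, 2000), which is not one of those surveyed in the paper:

```python
# Marsaglia-Tsang acceptance-rejection gamma generator (shape a >= 1,
# scale 1); postdates the survey but belongs to the same family.
import math, random

def gamma_variate(a):
    d = a - 1.0 / 3.0
    c = 1.0 / math.sqrt(9.0 * d)
    while True:
        x = random.gauss(0.0, 1.0)
        v = (1.0 + c * x) ** 3
        if v <= 0.0:
            continue                      # reject: outside the support
        u = random.random()
        if math.log(u) < 0.5 * x * x + d * (1.0 - v + math.log(v)):
            return d * v                  # accept

# Sample mean should approach the shape parameter (here 2.5).
print(sum(gamma_variate(2.5) for _ in range(100_000)) / 100_000)
```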
Generation of physical random numbers by using homodyne detection
NASA Astrophysics Data System (ADS)
Hirakawa, Kodai; Oya, Shota; Oguri, Yusuke; Ichikawa, Tsubasa; Eto, Yujiro; Hirano, Takuya; Tsurumaru, Toyohiro
2016-10-01
Physical random numbers generated by quantum measurements are, in principle, impossible to predict. We have demonstrated the generation of physical random numbers by using a high-speed balanced photodetector to measure the quadrature amplitudes of vacuum states. Using this method, random numbers were generated at 500 Mbps, which is more than one order of magnitude faster than previously reported [Gabriel et al., Nature Photonics 4, 711-715 (2010)]. The Crush test battery of the TestU01 suite consists of 31 tests in 144 variations, and we used it to statistically analyze these numbers. The generated random numbers passed 14 of the 31 tests. To improve the randomness, we performed a hash operation, in which each random number was multiplied by a random Toeplitz matrix; the resulting numbers passed all of the tests in the TestU01 Crush battery.
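The Toeplitz-hashing post-processing step can be sketched in a few lines; the sizes here are illustrative and the raw bits are simulated rather than measured.

```python
# Minimal sketch of Toeplitz hashing (randomness extraction), the
# post-processing step described above; parameters are illustrative.
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(7)
n, m = 256, 128                      # raw bits in, extracted bits out
raw = rng.integers(0, 2, size=n)     # stand-in for digitized vacuum noise

# An m x n binary Toeplitz matrix is fixed by its first column and row
# (n + m - 1 random seed bits in total).
col = rng.integers(0, 2, size=m)
row = np.concatenate(([col[0]], rng.integers(0, 2, size=n - 1)))
T = toeplitz(col, row)

extracted = (T @ raw) % 2            # matrix-vector product over GF(2)
print(extracted[:16])
```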
Random field assessment of nanoscopic inhomogeneity of bone
Dong, X. Neil; Luo, Qing; Sparkman, Daniel M.; Millwater, Harry R.; Wang, Xiaodu
2010-01-01
Bone quality is significantly correlated with the inhomogeneous distribution of material and ultrastructural properties (e.g., modulus and mineralization) of the tissue. Current techniques for quantifying inhomogeneity consist of descriptive statistics such as mean, standard deviation and coefficient of variation. However, these parameters do not describe the spatial variations of bone properties. The objective of this study was to develop a novel statistical method to characterize and quantitatively describe the spatial variation of bone properties at ultrastructural levels. To do so, a random field defined by an exponential covariance function was used to represent the spatial uncertainty of elastic modulus by delineating the correlation of the modulus at different locations in bone lamellae. The correlation length, a characteristic parameter of the covariance function, was employed to estimate the fluctuation of the elastic modulus in the random field. Using this approach, two distribution maps of the elastic modulus within bone lamellae were generated using simulation and compared with those obtained experimentally by a combination of atomic force microscopy and nanoindentation techniques. The simulation-generated maps of elastic modulus were in close agreement with the experimental ones, thus validating the random field approach in defining the inhomogeneity of elastic modulus in lamellae of bone. Indeed, generation of such random fields will facilitate multi-scale modeling of bone in more pragmatic detail.
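One standard way to sample such a random field, sketched for illustration (the grid size, correlation length, and mean modulus below are placeholders, not the paper's values): build the exponential covariance matrix over the grid points and color white noise with its Cholesky factor.

```python
# Sample a random field with exponential covariance
# C(d) = sigma^2 * exp(-d / L), L = correlation length, on a small 2-D
# grid via Cholesky factorization (illustrative, not the authors' code).
import numpy as np

rng = np.random.default_rng(3)
nx = ny = 20
sigma, L = 2.0, 5.0                      # field std dev and corr. length

xx, yy = np.meshgrid(np.arange(nx), np.arange(ny))
pts = np.column_stack([xx.ravel(), yy.ravel()]).astype(float)
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
C = sigma**2 * np.exp(-d / L)

A = np.linalg.cholesky(C + 1e-10 * np.eye(len(C)))   # jitter for stability
field = 20.0 + (A @ rng.standard_normal(len(C))).reshape(ny, nx)
print(field.mean(), field.std())       # 20.0 is an illustrative mean modulus
```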
Autocorrelation peaks in congruential pseudorandom number generators
NASA Technical Reports Server (NTRS)
Neuman, F.; Merrick, R. B.
1976-01-01
The complete correlation structure of several congruential pseudorandom number generators (PRNGs) of the same type and small cycle length was studied to address the problem of congruential PRNGs almost repeating themselves at intervals smaller than their cycle lengths during simulation of bandpass-filtered normal random noise. Maximum-period multiplicative and mixed congruential generators were studied, with inferences drawn from examination of several tractable members of a class of random number generators, with moduli from 2^5 to 2^9. High correlation is shown to exist in mixed and multiplicative congruential random number generators and in prime-moduli Lehmer generators for shifts that are a fraction of their cycle length. The random noise sequences in question are required when simulating electrical noise, air turbulence, or time variation of wind parameters.
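The effect is easy to reproduce at these small moduli; a quick sketch follows (the generator constants are chosen here only to give a full period, and are not taken from the paper):

```python
# Generate a full period of a mixed congruential generator with a small
# modulus and look at the lagged correlation of the normalized output.
import numpy as np

def lcg(a, c, m, seed, n):
    out, x = np.empty(n), seed
    for i in range(n):
        x = (a * x + c) % m
        out[i] = x / m
    return out

m = 2**9                       # small modulus, in the study's range 2^5..2^9
u = lcg(a=69, c=1, m=m, seed=1, n=m) - 0.5   # full period, centered

for lag in (1, m // 4, m // 2):
    r = np.corrcoef(u[:-lag], u[lag:])[0, 1]
    print(f"lag {lag:4d}: correlation {r:+.3f}")
```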
Generation of dense plume fingers in saturated-unsaturated homogeneous porous media
NASA Astrophysics Data System (ADS)
Cremer, Clemens J. M.; Graf, Thomas
2015-02-01
Flow under variable-density conditions is widespread, occurring in geothermal reservoirs, at waste disposal sites, or due to saltwater intrusion. The migration of dense plumes typically results in the formation of vertical plume fingers, which are known to be triggered by material heterogeneity or by variations in the source concentration that causes the density variation. Using a numerical groundwater model, six perturbation methods are tested under saturated and unsaturated flow conditions to mimic heterogeneity and concentration variations on the pore scale in order to realistically generate dense fingers. A laboratory-scale sand-tank experiment is numerically simulated, and the perturbation methods are evaluated by comparing plume fingers obtained from the laboratory experiment with numerically simulated fingers. Dense plume fingering for saturated flow can best be reproduced with a spatially random, time-constant perturbation of the solute source. For unsaturated flow, a spatially and temporally random noise of solute concentration or a random conductivity field adequately simulates plume fingering.
Variational Solutions and Random Dynamical Systems to SPDEs Perturbed by Fractional Gaussian Noise
Zeng, Caibin; Yang, Qigui; Cao, Junfei
2014-01-01
This paper deals with the following type of stochastic partial differential equations (SPDEs), perturbed by an infinite-dimensional fractional Brownian motion with a suitable volatility coefficient Φ: dX(t) = A(X(t))dt + Φ(t)dB^H(t), where A is a nonlinear operator satisfying some monotonicity conditions. Using the variational approach, we prove the existence and uniqueness of variational solutions to such systems. Moreover, we prove that this variational solution generates a random dynamical system. The main results are applied to a general type of nonlinear SPDEs and to the stochastic generalized p-Laplacian equation.
Design space exploration for early identification of yield limiting patterns
NASA Astrophysics Data System (ADS)
Li, Helen; Zou, Elain; Lee, Robben; Hong, Sid; Liu, Square; Wang, JinYan; Du, Chunshan; Zhang, Recco; Madkour, Kareem; Ali, Hussein; Hsu, Danny; Kabeel, Aliaa; ElManhawy, Wael; Kwan, Joe
2016-03-01
In order to resolve the causality dilemma of which comes first, accurate design rules or real designs, this paper presents a flow for exploring the layout design space to identify, early on, problematic patterns that will negatively affect yield. A new random layout generation method called the Layout Schema Generator (LSG) is reported; it generates realistic design-like layouts without any design rule violations. Lithography simulation is then used on the generated layout to discover potentially problematic patterns (hotspots). These hotspot patterns are further explored by randomly inducing feature and context variations in the identified hotspots through a flow called the Hotspot Variation Flow (HSV). Simulation is then performed on this expanded set of layout clips to identify further problematic patterns. These patterns are then classified into forbidden patterns, which should be included in the design rule checker, and legal patterns, which need better handling in the RET recipes and processes.
In Darwinian evolution, feedback from natural selection leads to biased mutations.
Caporale, Lynn Helena; Doyle, John
2013-12-01
Natural selection provides feedback through which information about the environment and its recurring challenges is captured, inherited, and accumulated within genomes in the form of variations that contribute to survival. The variation upon which natural selection acts is generally described as "random." Yet evidence has been mounting for decades, from such phenomena as mutation hotspots, horizontal gene transfer, and highly mutable repetitive sequences, that variation is far from the simplifying idealization of random processes as white (uniform in space and time and independent of the environment or context). This paper focuses on what is known about the generation and control of mutational variation, emphasizing that it is not uniform across the genome or in time, not unstructured with respect to survival, and is neither memoryless nor independent of the (also far from white) environment. We suggest that, as opposed to frequentist methods, Bayesian analysis could capture the evolution of nonuniform probabilities of distinct classes of mutation, and argue not only that the locations, styles, and timing of real mutations are not correctly modeled as generated by a white noise random process, but that such a process would be inconsistent with evolutionary theory.
Random number generators tested on quantum Monte Carlo simulations.
Hongo, Kenta; Maezono, Ryo; Miura, Kenichi
2010-08-01
We have tested and compared several (pseudo)random number generators (RNGs) applied to a practical application: ground-state energy calculations of molecules using variational and diffusion Monte Carlo methods. A new multiple recursive generator with 8th-order recursion (MRG8) and the Mersenne Twister generator (MT19937) are tested and compared with the RANLUX generator with five luxury levels (RANLUX-[0-4]). Both MRG8 and MT19937 are shown to give the same total energy as that evaluated with RANLUX-4 (highest luxury level) within the statistical error bars, with less computational cost to generate the sequence. We also tested the notorious linear congruential generator (LCG) implementation RANDU for comparison.
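RANDU's notoriety can be demonstrated in a few lines: its multiplier 65539 = 2^16 + 3 forces an exact three-term linear relation among successive outputs, which confines all triples to 15 planes in the unit cube.

```python
# RANDU's defect, checked directly: successive outputs satisfy
# x(k+2) = 6*x(k+1) - 9*x(k)  (mod 2^31).
def randu(seed, n):
    xs, x = [], seed
    for _ in range(n):
        x = (65539 * x) % 2**31
        xs.append(x)
    return xs

xs = randu(seed=1, n=1000)
assert all((xs[k + 2] - 6 * xs[k + 1] + 9 * xs[k]) % 2**31 == 0
           for k in range(len(xs) - 2))
print("every RANDU triple satisfies x(k+2) - 6*x(k+1) + 9*x(k) = 0 (mod 2^31)")
```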
Analysis of overdispersed count data by mixtures of Poisson variables and Poisson processes.
Hougaard, P; Lee, M L; Whitmore, G A
1997-12-01
Count data often show overdispersion compared to the Poisson distribution. Overdispersion is typically modeled by a random effect for the mean, based on the gamma distribution, leading to the negative binomial distribution for the count. This paper considers a larger family of mixture distributions, including the inverse Gaussian mixture distribution, and demonstrates that the latter gives a significantly better fit for a data set on the frequency of epileptic seizures. The same approach can be used to generate counting processes from Poisson processes, where the rate or the time is random. A random rate corresponds to variation between patients, whereas a random time corresponds to variation within patients.
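The gamma-mixed case mentioned above is easy to verify by simulation; a short sketch (parameter values are arbitrary):

```python
# A Poisson count whose rate is gamma-distributed is marginally negative
# binomial; quick simulation check of the variance inflation.
import numpy as np

rng = np.random.default_rng(5)
shape, scale = 2.0, 1.5                       # gamma mixing distribution
lam = rng.gamma(shape, scale, size=200_000)   # patient-specific rates
counts = rng.poisson(lam)

mean = counts.mean()
var = counts.var()
# Negative binomial: var = mean + mean^2 / shape  (overdispersed)
print(mean, var, mean + mean**2 / shape)
```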
Sunspot random walk and 22-year variation
Love, Jeffrey J.; Rigler, E. Joshua
2012-01-01
We examine two stochastic models for consistency with observed long-term secular trends in sunspot number and a faint, but semi-persistent, 22-yr signal: (1) a null hypothesis, a simple one-parameter random-walk model of sunspot-number cycle-to-cycle change, and, (2) an alternative hypothesis, a two-parameter random-walk model with an imposed 22-yr alternating amplitude. The observed secular trend in sunspots, seen from solar cycle 5 to 23, would not be an unlikely result of the accumulation of multiple random-walk steps. Statistical tests show that a 22-yr signal can be resolved in historical sunspot data; that is, the probability is low that it would be realized from random data. On the other hand, the 22-yr signal has a small amplitude compared to random variation, and so it has a relatively small effect on sunspot predictions. Many published predictions for cycle 24 sunspots fall within the dispersion of previous cycle-to-cycle sunspot differences. The probability is low that the Sun will, with the accumulation of random steps over the next few cycles, walk down to a Dalton-like minimum. Our models support published interpretations of sunspot secular variation and 22-yr variation resulting from cycle-to-cycle accumulation of dynamo-generated magnetic energy.
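As an illustration of the two hypotheses as generative models (the step size and alternating amplitude below are placeholders, not the paper's fitted values):

```python
# Null model: cycle-to-cycle change in cycle-maximum sunspot number is a
# random-walk step. Alternative: the same walk plus a 22-yr
# (every-other-cycle) alternating component.
import numpy as np

rng = np.random.default_rng(11)

def walk(n_cycles, step_sd, alt_amp=0.0, s0=100.0):
    s = [s0]
    for k in range(1, n_cycles):
        step = rng.normal(0.0, step_sd) + alt_amp * (-1) ** k
        s.append(max(s[-1] + step, 0.0))      # sunspot number stays >= 0
    return np.array(s)

null_model = walk(19, step_sd=30.0)                 # cycles 5..23
alt_model = walk(19, step_sd=30.0, alt_amp=10.0)    # with 22-yr signal
print(null_model.round(0))
print(alt_model.round(0))
```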
Theoretical principles for biology: Variation.
Montévil, Maël; Mossio, Matteo; Pocheville, Arnaud; Longo, Giuseppe
2016-10-01
Darwin introduced the concept that random variation generates new living forms. In this paper, we elaborate on Darwin's notion of random variation to propose that biological variation should be given the status of a fundamental theoretical principle in biology. We state that biological objects such as organisms are specific objects. Specific objects are special in that they are qualitatively different from each other. They can undergo unpredictable qualitative changes, some of which are not defined before they happen. We express the principle of variation in terms of symmetry changes, where symmetries underlie the theoretical determination of the object. We contrast the biological situation with the physical situation, where objects are generic (that is, different objects can be assumed to be identical) and evolve in well-defined state spaces. We derive several implications of the principle of variation, in particular, biological objects show randomness, historicity and contextuality. We elaborate on the articulation between this principle and the two other principles proposed in this special issue: the principle of default state and the principle of organization.
A principle of organization which facilitates broad Lamarckian-like adaptations by improvisation.
Soen, Yoav; Knafo, Maor; Elgart, Michael
2015-12-02
During the lifetime of an organism, every individual encounters many combinations of diverse changes in the somatic genome, epigenome and microbiome. This gives rise to many novel combinations of internal failures which are unique to each individual. How any individual can tolerate this high load of new, individual-specific scenarios of failure is not clear. While stress-induced plasticity and hidden variation have been proposed as potential mechanisms of tolerance, the main conceptual problem remains unaddressed, namely: how largely non-beneficial random variation can be rapidly and safely organized into net benefits to every individual. We propose an organizational principle which explains how every individual can alleviate a high load of novel stressful scenarios using many random variations in flexible and inherently less harmful traits. Random changes which happen to reduce stress benefit the organism and decrease the drive for additional changes. This adaptation (termed 'Adaptive Improvisation') can be further enhanced, propagated, stabilized and memorized when beneficial changes reinforce themselves by auto-regulatory mechanisms. This principle implicates stress not only in driving diverse variations in cells, tissues and organs, but also in organizing these variations into adaptive outcomes. Specific (but not exclusive) examples include stress reduction by rapid exchange of mobile genetic elements (or exosomes) in unicellular organisms, and rapid changes in the symbiotic microorganisms of animals. In all cases, adaptive changes can be transmitted across generations, allowing rapid improvement and assimilation in a few generations. We provide testable predictions derived from the hypothesis. The hypothesis raises a critical, but thus far overlooked adaptation problem and explains how random variation can self-organize to confer a wide range of individual-specific adaptations beyond the existing outcomes of natural selection. It portrays gene regulation as an inseparable synergy between natural selection and adaptation by improvisation. The latter provides a basis for Lamarckian adaptation that is not limited to a specific mechanism and readily accounts for the remarkable resistance of tumors to treatment.
Ratio index variables or ANCOVA? Fisher's cats revisited.
Tu, Yu-Kang; Law, Graham R; Ellison, George T H; Gilthorpe, Mark S
2010-01-01
Over 60 years ago Ronald Fisher demonstrated a number of potential pitfalls with statistical analyses using ratio variables. Nonetheless, these pitfalls are largely overlooked in contemporary clinical and epidemiological research, which routinely uses ratio variables in statistical analyses. This article aims to demonstrate how very different findings can be generated as a result of less than perfect correlations among the data used to generate ratio variables. These imperfect correlations result from measurement error and random biological variation. While the former can often be reduced by improvements in measurement, random biological variation is difficult to estimate and eliminate in observational studies. Moreover, wherever the underlying biological relationships among epidemiological variables are unclear, and hence the choice of statistical model is also unclear, the different findings generated by different analytical strategies can lead to contradictory conclusions. Caution is therefore required when interpreting analyses of ratio variables whenever the underlying biological relationships among the variables involved are unspecified or unclear.
El-Kassaby, Yousry A; Funda, Tomas; Lai, Ben S K
2010-01-01
The impact of female reproductive success on the mating system, gene flow, and genetic diversity of the filial generation was studied using a random sample of 801 bulk seeds from a 49-clone Pseudotsuga menziesii seed orchard. We used microsatellite DNA fingerprinting and pedigree reconstruction to assign each seed's maternal and paternal parents and directly estimated clonal reproductive success, the selfing rate, and the proportion of seed sired by outside pollen sources. Unlike most family-array mating system and gene flow studies conducted on natural and experimental populations, which use an equal number of seeds per maternal genotype and thus generate unbiased inferences only on male reproductive success, the random sample we used was representative of the entire seed crop and therefore provided a unique opportunity to draw unbiased inferences on both female and male reproductive success variation. The selfing rate and the number of seeds sired by outside pollen sources were found to be a function of female fertility variation. This variation also substantially and negatively affected the female effective population size. Additionally, the results provided convincing evidence that the use of clone size as a proxy for fertility is questionable and requires further consideration.
Postural control model interpretation of stabilogram diffusion analysis
NASA Technical Reports Server (NTRS)
Peterka, R. J.
2000-01-01
Collins and De Luca [Collins JJ, De Luca CJ (1993) Exp Brain Res 95: 308-318] introduced a new method known as stabilogram diffusion analysis that provides a quantitative statistical measure of the apparently random variations of center-of-pressure (COP) trajectories recorded during quiet upright stance in humans. This analysis generates a stabilogram diffusion function (SDF) that summarizes the mean square COP displacement as a function of the time interval between COP comparisons. SDFs have a characteristic two-part form that suggests the presence of two different control regimes: a short-term open-loop control behavior and a longer-term closed-loop behavior. This paper demonstrates that a very simple closed-loop control model of upright stance can generate realistic SDFs. The model consists of an inverted pendulum body with torque applied at the ankle joint. This torque includes a random disturbance torque and a control torque. The control torque is a function of the deviation (error signal) between the desired upright body position and the actual body position, and is generated in proportion to the error signal, the derivative of the error signal, and the integral of the error signal [i.e. a proportional, integral and derivative (PID) neural controller]. The control torque is applied with a time delay representing conduction, processing, and muscle activation delays. Variations in the PID parameters and the time delay generate variations in SDFs that mimic real experimental SDFs. This model analysis allows one to interpret experimentally observed changes in SDFs in terms of variations in neural controller and time delay parameters rather than in terms of open-loop versus closed-loop behavior.
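A stripped-down sketch of that model (my own Euler discretization; the body parameters, gains, and delay are plausible placeholders rather than the paper's fitted values): integrate the delayed-PID pendulum with a white disturbance torque and estimate the SDF from the COP trace.

```python
# Linearized inverted pendulum with delayed PID ankle torque plus white
# disturbance torque; SDF = mean-square COP displacement vs. time lag.
import numpy as np

rng = np.random.default_rng(2)
dt, T = 0.01, 200.0                        # step and duration, s
m, h, g, J = 70.0, 0.9, 9.81, 70.0 * 0.9**2
Kp, Kd, Ki, delay = 850.0, 300.0, 15.0, 0.1   # placeholder gains, delay (s)
lag = int(delay / dt)

n = int(T / dt)
theta, omega, cop = np.zeros(n), np.zeros(n), np.zeros(n)
integ = 0.0
for k in range(1, n):
    e = theta[k - 1 - lag] if k - 1 - lag >= 0 else 0.0
    de = omega[k - 1 - lag] if k - 1 - lag >= 0 else 0.0
    integ += e * dt
    Tc = -(Kp * e + Kd * de + Ki * integ)  # delayed PID control torque
    Td = rng.normal(0.0, 1.0)              # disturbance torque
    alpha = (m * g * h * theta[k - 1] + Tc + Td) / J
    omega[k] = omega[k - 1] + alpha * dt
    theta[k] = theta[k - 1] + omega[k] * dt
    cop[k] = -Tc / (m * g)                 # static COP approximation

for tau in (0.1, 0.5, 1.0, 5.0):           # stabilogram diffusion function
    s = int(tau / dt)
    print(tau, np.mean((cop[s:] - cop[:-s]) ** 2))
```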
NASA Astrophysics Data System (ADS)
Iz, H. Bâki
2018-05-01
This study provides additional information about the impact of atmospheric pressure on sea level variations. The observed regularity in sea level atmospheric pressure depends mainly on latitude and was verified to be dominantly random closer to the equator. It was demonstrated that almost all the annual and semiannual sea level variations at 27 globally distributed tide gauge stations can be attributed to regional/local atmospheric forcing as an inverted barometric effect. Statistically significant non-linearities were detected in the regional atmospheric pressure series, which in turn impacted other sea level variations as compounders in tandem with the lunar nodal forcing, generating lunar sub-harmonics with multidecadal periods. It was shown that the random component of regional atmospheric pressure tends to cluster at monthly intervals. The clusters are likely to be caused by intra-annual seasonal atmospheric temperature changes, which may also act as random beats in generating the sub-harmonics observed in sea level changes, as another mechanism. This study also affirmed that there are no statistically significant secular trends in the progression of regional atmospheric pressures; hence there was no contribution by atmospheric pressure to sea level trends during the 20th century. Meanwhile, the estimated nonuniform scale factors of the inverted barometer effects suggest that sea level atmospheric pressure will bias the sea level trends inferred from satellite altimetry measurements if their impact is accounted for as corrections without proper scaling.
Engen, Steinar; Saether, Bernt-Erik
2014-03-01
We analyze the stochastic components of the Robertson-Price equation for the evolution of quantitative characters, which enables decomposition of the selection differential into components due to demographic and environmental stochasticity. We show how these two types of stochasticity affect the evolution of multivariate quantitative characters by defining demographic and environmental variances as components of individual fitness. The exact covariance formula for selection is decomposed into three components: the deterministic mean value, and stochastic demographic and environmental components. We show that demographic and environmental stochasticity generate random genetic drift and fluctuating selection, respectively. This provides a common theoretical framework for linking ecological and evolutionary processes. Demographic stochasticity can cause random variation in selection differentials independent of fluctuating selection caused by environmental variation. We use this model of selection to illustrate that the effect on the expected selection differential of random variation in individual fitness is dependent on population size, and that the strength of fluctuating selection is affected by how environmental variation affects the covariance in Malthusian fitness between individuals with different phenotypes. Thus, our approach enables us to partition out the effects of fluctuating selection from the effects of selection due to random variation in individual fitness caused by demographic stochasticity.
Takahashi, M; Tango, T
2001-05-01
As methods for estimating excess mortality associated with influenza epidemics, Serfling's cyclical regression model and the Kawai and Fukutomi model with seasonal indices have been proposed. Excess mortality under the old definition (i.e., the number of deaths actually recorded in excess of the number expected on the basis of past seasonal experience) includes the random error for that portion of variation regarded as due to chance. In addition, it disregards the range of random variation of mortality with the season. In this paper, we propose a new definition of excess mortality associated with influenza epidemics and a new estimation method that address these questions within the Kawai and Fukutomi framework. The new definition and estimation method were generated as follows. Factors bringing about variation in mortality in months with influenza epidemics may be divided into two groups: (1) influenza itself, and (2) others (practically random variation). The range of variation of mortality due to the latter (the normal range) can be estimated from the range for months in the absence of influenza epidemics. Excess mortality is defined as deaths over the normal range. Because the new method considers variation in mortality in months without influenza epidemics, it provides reasonable estimates of excess mortality by separating out the portion due to random variation. Further, the proposed estimate can be used as a criterion for a test of statistical significance.
Radiation Transport in Random Media With Large Fluctuations
NASA Astrophysics Data System (ADS)
Olson, Aaron; Prinja, Anil; Franke, Brian
2017-09-01
Neutral particle transport in media exhibiting large and complex material property spatial variation is modeled by representing cross sections as lognormal random functions of space, generated through a nonlinear memory-less transformation of a Gaussian process with covariance uniquely determined by the covariance of the cross section. A Karhunen-Loève decomposition of the Gaussian process is implemented to efficiently generate realizations of the random cross sections, and Woodcock Monte Carlo is used to transport particles on each realization and generate benchmark solutions for the mean and variance of the particle flux as well as probability densities of the particle reflectance and transmittance. A computationally efficient stochastic collocation method is implemented to directly compute the statistical moments such as the mean and variance, while a polynomial chaos expansion in conjunction with stochastic collocation provides a convenient surrogate model that also produces probability densities of output quantities of interest. Extensive numerical testing demonstrates that use of stochastic reduced-order modeling provides an accurate and cost-effective alternative to random sampling for particle transport in random media.
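A sketch of the cross-section model (a discrete Karhunen-Loève expansion on a grid; the slab size, covariance parameters, and truncation level are illustrative, not the paper's):

```python
# Discrete KL expansion of a Gaussian process with exponential
# covariance, exponentiated to give a lognormal cross-section sample.
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0.0, 10.0, 200)                  # slab grid
sig_g, Lc = 0.5, 2.0                             # GP std dev, corr. length
C = sig_g**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / Lc)

w, v = np.linalg.eigh(C)                         # discrete KL modes
idx = np.argsort(w)[::-1][:20]                   # keep 20 largest modes
w, v = w[idx], v[:, idx]

xi = rng.standard_normal(20)                     # one realization
g = v @ (np.sqrt(w) * xi)                        # truncated GP sample
sigma_t = np.exp(g)                              # lognormal, median 1
print(sigma_t.min(), sigma_t.max())
```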
Free Vibration of Uncertain Unsymmetrically Laminated Beams
NASA Technical Reports Server (NTRS)
Kapania, Rakesh K.; Goyal, Vijay K.
2001-01-01
Monte Carlo simulation and stochastic FEA are used to predict randomness in the free vibration response of thin unsymmetrically laminated beams. For the present study, it is assumed that randomness in the response is caused only by uncertainties in the ply orientations. The ply orientations may become random or uncertain during the manufacturing process. A new 16-dof beam element, based on the first-order shear deformation beam theory, is used to study the stochastic nature of the natural frequencies. Using variational principles, the element stiffness matrix and mass matrix are obtained through analytical integration. Using a random sequence, a large data set containing possible random ply orientations is generated; these data are assumed to be symmetric. The stochastic-based finite element model for free vibrations predicts the relation between the randomness in fundamental natural frequencies and the randomness in ply orientation. The sensitivity derivatives are calculated numerically through an exact formulation. The squared fundamental natural frequencies are expressed in terms of deterministic and probabilistic quantities, allowing one to determine how sensitive they are to variations in ply angles. The predicted mean-valued fundamental natural frequency squared and the variance of the present model are in good agreement with the Monte Carlo simulation. Results also show that variations of plus or minus 5 degrees in ply angles can affect the free vibration response of unsymmetrically and symmetrically laminated beams.
NASA Astrophysics Data System (ADS)
Boning, Duane S.; Chung, James E.
1998-11-01
Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments including design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules are guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
GENOPT 2016: Design of a generalization-based challenge in global optimization
NASA Astrophysics Data System (ADS)
Battiti, Roberto; Sergeyev, Yaroslav; Brunato, Mauro; Kvasov, Dmitri
2016-10-01
While comparing results on benchmark functions is a widely used practice for demonstrating the competitiveness of global optimization algorithms, fixed benchmarks can lead to a negative data-mining process. To avoid this effect, the GENOPT contest benchmarks can be used; they are based on randomized function generators designed for scientific experiments, with fixed statistical characteristics but individual variation among the generated instances. The generators are available to participants for off-line tests and online tuning schemes, but the final competition is based on random seeds communicated in the last phase through a cooperative process. A brief presentation and discussion of the methods and results obtained in the framework of the GENOPT contest are given in this contribution.
K-Fold Crossvalidation in Canonical Analysis.
ERIC Educational Resources Information Center
Liang, Kun-Hsia; And Others
1995-01-01
A computer-assisted, K-fold cross-validation technique is discussed in the framework of canonical correlation analysis of randomly generated data sets. Analysis results suggest that this technique can effectively reduce the contamination of canonical variates and canonical correlations by sample-specific variance components. (Author/SLD)
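A small illustration of the technique (not the original analysis): on pure-noise data, the first canonical correlation looks large in-sample but collapses on held-out folds, which is exactly the sample-specific contamination the cross-validation is meant to expose. The sketch assumes scikit-learn's CCA.

```python
# K-fold cross-validation of the first canonical correlation on data
# with no true X-Y relation; held-out correlations should shrink.
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.model_selection import KFold

rng = np.random.default_rng(6)
n = 300
X = rng.standard_normal((n, 5))
Y = rng.standard_normal((n, 4))          # pure noise: no true relation

for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    cca = CCA(n_components=1).fit(X[train], Y[train])
    u_tr, v_tr = cca.transform(X[train], Y[train])
    u_te, v_te = cca.transform(X[test], Y[test])
    r_train = np.corrcoef(u_tr[:, 0], v_tr[:, 0])[0, 1]
    r_test = np.corrcoef(u_te[:, 0], v_te[:, 0])[0, 1]
    print(f"train r = {r_train:.2f}, held-out r = {r_test:.2f}")
```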
Packing microstructure and local density variations of experimental and computational pebble beds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Auwerda, G. J.; Kloosterman, J. L.; Lathouwers, D.
2012-07-01
In pebble bed type nuclear reactors the fuel is contained in graphite pebbles, which form a randomly stacked bed with a non-uniform packing density. These variations can influence local coolant flow and power density and are a possible cause of hotspots. To analyse local density variations, computational methods are needed that can generate randomly stacked pebble beds with a realistic packing structure on a pebble-to-pebble level. We first compare various properties of the local packing structure of a computed bed with those of an image made using computer-aided X-ray tomography, looking at properties in the bulk of the bed and near the wall separately. Especially for the bulk of the bed, properties of the computed bed compare well with the scanned bed and with the literature, giving confidence that our method generates beds with a realistic packing microstructure. Results also show the packing structure is different near the wall than in the bulk of the bed, with pebbles near the wall forming ordered layers similar to hexagonal close packing. Next, variations in the local packing density are investigated by comparing probability density functions of the packing fraction of small clusters of pebbles throughout the bed. Especially near the wall, large variations in local packing fraction exist, with a higher probability for clusters of pebbles with both low (<0.6) and high (>0.65) packing fractions, which could significantly affect flow rates and, together with higher power densities, could result in hotspots. (authors)
NASA Astrophysics Data System (ADS)
Cheng, Ken
2015-09-01
In a perspective in this issue based on a thorough review, Andy Reynolds [1] tackles the issue of how the by-now ubiquitously found Lévy walks can be generated by animals, by organisms other than animals, and by other forms of life below the level of organisms, such as cells. The answer comes not in a single whole cloth, but rather as a patchwork of generating factors. Lévy-like movements arise in objects blowing in the wind, or from travelers encountering turbulence in the seas or being repelled by boundaries. A variety of desiderata in movement, not related to achieving optimal foraging, may also engender Lévy-like movements; these include avoiding other organisms or not crossing one's traveled path. Adding to that plethora are ways in which variations on the theme of garden-variety random walks can at least approach a Lévy walk, if not capture the mathematical form perfectly. Such variations include executing random walks on multiple scales, a strategy exhibited by desert ants [2,3], mussels [4], and quite likely extant hunter-gatherer humans as well [5]. It is possible that fossil tracks over 50 million years old also show this strategy, as curve fitting with multiple random walks, characterized by multiple exponential distributions, is as good or better than curve fits having the power-law distribution characteristic of Lévy walks [6]. Another variation is a random-walk search whose scale expands over time. In great detail and based on extensive literature - the review has over 200 references - a range of other ways in which Lévy-like movements might come about are also discussed.
Simulation of random road microprofile based on specified correlation function
NASA Astrophysics Data System (ADS)
Rykov, S. P.; Rykova, O. A.; Koval, V. S.; Vlasov, V. G.; Fedotov, K. V.
2018-03-01
The paper aims to develop a numerical simulation method and an algorithm for generating a random microprofile of special roads based on a specified correlation function. The paper uses methods of correlation, spectral and numerical analysis. It shows that, given expressions for the input and output spectral characteristics, the transfer function of the generating (shaping) filter can be calculated using a theorem on the factorization of nonnegative fractional rational functions and an integral transformation. The model of a random function equivalent to the real road surface microprofile enables us to assess springing system parameters and identify their ranges of variation.
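For the common special case of an exponential correlation function, the shaping filter reduces to a first-order recursion; a sketch with illustrative parameters (not taken from the paper):

```python
# A road microprofile with correlation R(s) = sigma^2 * exp(-alpha*|s|)
# is the output of a first-order filter driven by white noise (an AR(1)
# recursion after spatial discretization).
import numpy as np

rng = np.random.default_rng(8)
ds = 0.1                  # spatial step, m
sigma, alpha = 0.01, 0.2  # profile std dev (m), inverse corr. length (1/m)
rho = np.exp(-alpha * ds)

n = 5000
q = np.zeros(n)
for k in range(1, n):
    q[k] = rho * q[k - 1] + sigma * np.sqrt(1 - rho**2) * rng.standard_normal()

# Empirical check of the specified correlation at a 1 m spatial lag:
lag = int(1.0 / ds)
print(np.corrcoef(q[:-lag], q[lag:])[0, 1], np.exp(-alpha * 1.0))
```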
Graham, John H; Robb, Daniel T; Poe, Amy R
2012-01-01
Distributed robustness is thought to influence the buffering of random phenotypic variation through the scale-free topology of gene regulatory, metabolic, and protein-protein interaction networks. If this hypothesis is true, then the phenotypic response to the perturbation of particular nodes in such a network should be proportional to the number of links those nodes make with neighboring nodes. This suggests a probability distribution approximating an inverse power-law of random phenotypic variation. Zero phenotypic variation, however, is impossible, because random molecular and cellular processes are essential to normal development. Consequently, a more realistic distribution should have a y-intercept close to zero in the lower tail, a mode greater than zero, and a long (fat) upper tail. The double Pareto-lognormal (DPLN) distribution is an ideal candidate distribution. It consists of a mixture of a lognormal body and upper and lower power-law tails. If our assumptions are true, the DPLN distribution should provide a better fit to random phenotypic variation in a large series of single-gene knockout lines than other skewed or symmetrical distributions. We fit a large published data set of single-gene knockout lines in Saccharomyces cerevisiae to seven different probability distributions: DPLN, right Pareto-lognormal (RPLN), left Pareto-lognormal (LPLN), normal, lognormal, exponential, and Pareto. The best model was judged by the Akaike Information Criterion (AIC). Phenotypic variation among gene knockouts in S. cerevisiae fits a double Pareto-lognormal (DPLN) distribution better than any of the alternative distributions, including the right Pareto-lognormal and lognormal distributions. A DPLN distribution is consistent with the hypothesis that developmental stability is mediated, in part, by distributed robustness, the resilience of gene regulatory, metabolic, and protein-protein interaction networks. Alternatively, multiplicative cell growth, and the mixing of lognormal distributions having different variances, may generate a DPLN distribution.
Efficient and robust quantum random number generation by photon number detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Applegate, M. J.; Thomas, O.
2015-08-17
We present an efficient and robust quantum random number generator based upon high-rate room-temperature photon number detection. We employ an electric-field-modulated silicon avalanche photodiode, a type of device particularly suited to high-rate photon number detection with excellent photon number resolution, to detect, without an applied dead-time, up to 4 photons from the optical pulses emitted by a laser. By both measuring and modeling the response of the detector to the incident photons, we are able to determine the illumination conditions that achieve an optimal bit rate, which we show is robust against variation in the photon flux. We extract random bits from the detected photon numbers with an efficiency of 99%, corresponding to 1.97 bits per detected photon number and yielding a bit rate of 143 Mbit/s, and verify that the extracted bits pass stringent statistical tests for randomness. Our scheme is highly scalable and has the potential of multi-Gbit/s bit rates.
Experimental analysis of a piezoelectric energy harvesting system for harmonic, random, and sine on random vibration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cryns, Jackson W.; Hatchell, Brian K.; Santiago-Rojas, Emiliano
Harvesting power with a piezoelectric vibration powered generator using a full-wave rectifier conditioning circuit is experimentally compared for varying sinusoidal, random, and sine on random (SOR) input vibration scenarios. Additionally, the implications of source vibration characteristics on harvester design are discussed. Studies in vibration harvesting have yielded numerous alternatives for harvesting electrical energy from vibrations, but piezoceramics arose as the most compact, energy-dense means of energy transduction. The rise in popularity of harvesting energy from ambient vibrations has made piezoelectric generators commercially available. Much of the available literature focuses on maximizing harvested power through nonlinear processing circuits that require accurate knowledge of generator internal mechanical and electrical characteristics and idealization of the input vibration source, which cannot be assumed in general application. In this manuscript, variations in source vibration and load resistance are explored for a commercially available piezoelectric generator. We characterize the source vibration by its acceleration response for repeatability and transcription to general application. The results agree with numerical and theoretical predictions in previous literature that optimal load resistance varies with transducer natural frequency and source type, and the findings demonstrate that significant gains are seen with lower-tuned transducer natural frequencies for similar source amplitudes. Going beyond idealized steady-state sinusoidal and simplified random vibration input, SOR testing allows for more accurate representation of real-world ambient vibration. It is shown that characteristic interactions from more complex vibration sources significantly alter power generation and power processing requirements by increasing harvested power, shifting optimal conditioning impedance, inducing significant voltage supply fluctuations, and ultimately rendering idealized sinusoidal and random analyses insufficient.
Localized motion in random matrix decomposition of complex financial systems
NASA Astrophysics Data System (ADS)
Jiang, Xiong-Fei; Zheng, Bo; Ren, Fei; Qiu, Tian
2017-04-01
Using random matrix theory, we decompose the multi-dimensional time series of complex financial systems into a set of orthogonal eigenmode functions, which are classified into the market mode, sector mode, and random mode. In particular, the localized motion generated by the business sectors plays an important role in financial systems. Both the business sectors and their impact on the stock market are identified from the localized motion. We clarify that the localized motion induces different characteristics in the time correlations of the stock-market index and individual stocks. With a variation of a two-factor model, we reproduce the return-volatility correlations of the eigenmodes.
Determination of Rolling-Element Fatigue Life From Computer Generated Bearing Tests
NASA Technical Reports Server (NTRS)
Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.
2003-01-01
Two types of rolling-element bearings, representing radially loaded and thrust-loaded bearings, were used for this study. Three hundred forty (340) virtual bearing sets totaling 31,400 bearings were randomly assembled and tested by Monte Carlo (random) number generation. The Monte Carlo results were compared with endurance data from 51 bearing sets comprising 5,321 bearings. A simple algebraic relation was established for the upper and lower L10 life limits as a function of the number of bearings failed for any bearing geometry. There is a 50-percent probability that the resultant bearing life will be less than that calculated. The maximum and minimum variation between the bearing resultant life and the calculated life correlate with the 90-percent confidence limits for a Weibull slope of 1.5. The calculated lives for bearings using a load-life exponent p of 4 for ball bearings and 5 for roller bearings correlated with the Monte Carlo generated bearing lives and the bearing data. STLE life factors for bearing steel and processing provide a reasonable accounting for differences between bearing life data and calculated life. Variations in Weibull slope from the Monte Carlo testing and the bearing data correlated. There was excellent agreement between the percentage of individual components failed in the Monte Carlo simulation and that predicted.
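A compact sketch of the virtual-testing idea (the set sizes and characteristic life below are illustrative, not the study's inputs): draw bearing lives from a Weibull distribution with slope 1.5, form sets, and compare empirical L10 values with the analytic one.

```python
# Monte Carlo "virtual bearing sets": Weibull lives with slope 1.5,
# empirical L10 per set vs. the analytic L10.
import numpy as np

rng = np.random.default_rng(9)
slope, eta = 1.5, 1000.0          # Weibull slope and characteristic life
n_sets, set_size = 340, 30

l10 = []
for _ in range(n_sets):
    lives = eta * rng.weibull(slope, size=set_size)
    l10.append(np.quantile(lives, 0.10))   # empirical 10%-failure life

l10 = np.array(l10)
calc = eta * (-np.log(0.90)) ** (1 / slope)   # analytic L10
print("analytic L10:", round(calc), "| simulated spread:",
      round(l10.min()), "-", round(l10.max()),
      "| fraction below analytic:", (l10 < calc).mean())
```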
Harrison, Xavier A
2015-01-01
Overdispersion is a common feature of models of biological data, but researchers often fail to model the excess variation driving the overdispersion, resulting in biased parameter estimates and standard errors. Quantifying and modeling overdispersion when it is present is therefore critical for robust biological inference. One means to account for overdispersion is to add an observation-level random effect (OLRE) to a model, where each data point receives a unique level of a random effect that can absorb the extra-parametric variation in the data. Although some studies have investigated the utility of OLRE to model overdispersion in Poisson count data, studies doing so for Binomial proportion data are scarce. Here I use a simulation approach to investigate the ability of both OLRE models and Beta-Binomial models to recover unbiased parameter estimates in mixed effects models of Binomial data under various degrees of overdispersion. In addition, as ecologists often fit random intercept terms to models when the random effect sample size is low (<5 levels), I investigate the performance of both model types under a range of random effect sample sizes when overdispersion is present. Simulation results revealed that the efficacy of OLRE depends on the process that generated the overdispersion; OLRE failed to cope with overdispersion generated from a Beta-Binomial mixture model, leading to biased slope and intercept estimates, but performed well for overdispersion generated by adding random noise to the linear predictor. Comparison of parameter estimates from an OLRE model with those from its corresponding Beta-Binomial model readily identified when OLRE were performing poorly due to disagreement between effect sizes, and this strategy should be employed whenever OLRE are used for Binomial data to assess their reliability. Beta-Binomial models performed well across all contexts, but showed a tendency to underestimate effect sizes when modelling non-Beta-Binomial data. Finally, both OLRE and Beta-Binomial models performed poorly when models contained <5 levels of the random intercept term, especially for estimating variance components, and this effect appeared independent of total sample size. These results suggest that OLRE are a useful tool for modelling overdispersion in Binomial data, but that they do not perform well in all circumstances and researchers should take care to verify the robustness of parameter estimates of OLRE models.
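The two data-generating processes compared in the simulations can be sketched as follows (generation only; the GLMM fitting itself is not reproduced here, and all parameter values are illustrative):

```python
# Two mechanisms for overdispersed Binomial data: (a) a Beta-Binomial
# mixture, and (b) logit-normal ("OLRE-type") noise on the linear
# predictor. Both inflate the variance above the plain Binomial.
import numpy as np

rng = np.random.default_rng(10)
n, trials, p = 1000, 20, 0.4

# (a) Beta-Binomial: per-observation p from a Beta with mean 0.4.
nu = 5.0
p_i = rng.beta(p * nu, (1 - p) * nu, size=n)
y_bb = rng.binomial(trials, p_i)

# (b) Logit-normal: random noise added to the linear predictor.
eta = np.log(p / (1 - p)) + rng.normal(0.0, 1.0, size=n)
y_ln = rng.binomial(trials, 1 / (1 + np.exp(-eta)))

binom_var = trials * p * (1 - p)
print("binomial var:", binom_var, "| beta-binomial:", y_bb.var(),
      "| logit-normal:", y_ln.var())
```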
A model for simulating random atmospheres as a function of latitude, season, and time
NASA Technical Reports Server (NTRS)
Campbell, J. W.
1977-01-01
An empirical stochastic computer model was developed with the capability of generating random thermodynamic profiles of the atmosphere below an altitude of 99 km which are characteristic of any given season, latitude, and time of day. Samples of temperature, density, and pressure profiles generated by the model are statistically similar to measured profiles in a data base of over 6000 rocket and high-altitude atmospheric soundings; that is, means and standard deviations of modeled profiles and their vertical gradients are in close agreement with data. Model-generated samples can be used for Monte Carlo simulations of aircraft or spacecraft trajectories to predict or account for the effects on a vehicle's performance of atmospheric variability. Other potential uses for the model are in simulating pollutant dispersion patterns, variations in sound propagation, and other phenomena which are dependent on atmospheric properties, and in developing data-reduction software for satellite monitoring systems.
Peng, Zhen; Genewein, Tim; Braun, Daniel A.
2014-01-01
Complexity is a hallmark of intelligent behavior, consisting both of regular patterns and random variation. To quantitatively assess the complexity and randomness of human motion, we designed a motor task in which we translated subjects' motion trajectories into strings of symbol sequences. In the first part of the experiment, participants were asked to perform self-paced movements to create repetitive patterns, copy pre-specified letter sequences, and generate random movements. To investigate whether the degree of randomness can be manipulated, in the second part of the experiment participants were asked to perform unpredictable movements in the context of a pursuit game, where they received feedback from an online Bayesian predictor guessing their next move. We analyzed symbol sequences representing subjects' motion trajectories with five common complexity measures: predictability, compressibility, approximate entropy, Lempel-Ziv complexity, and effective measure complexity. We found that subjects' self-created patterns were the most complex, followed by drawing movements of letters and self-paced random motion. We also found that participants could change the randomness of their behavior depending on context and feedback. Our results suggest that humans can adjust both complexity and regularity in different movement types and contexts, and that this can be assessed with information-theoretic measures of the symbolic sequences generated from movement trajectories.
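One of the five measures, Lempel-Ziv complexity, has a compact standard implementation (LZ76 phrase counting), sketched here on toy symbol strings:

```python
# Lempel-Ziv (LZ76) complexity: count distinct phrases while scanning a
# symbol sequence left to right; repetitive strings yield fewer phrases.
def lz76_complexity(s):
    i, c, n = 0, 0, len(s)
    while i < n:
        length = 1
        # Grow the phrase until it no longer occurs in the prefix seen so far.
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        c += 1
        i += length
    return c

print(lz76_complexity("ababababab"))        # low: repetitive pattern
print(lz76_complexity("abbabaabbaababba"))  # higher: less regular
```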
Numerical Generation of Dense Plume Fingers in Unsaturated Homogeneous Porous Media
NASA Astrophysics Data System (ADS)
Cremer, C.; Graf, T.
2012-04-01
In nature, the migration of dense plumes typically results in the formation of vertical plume fingers. Flow in the fingers is downward, counterbalanced by upward flow of less dense fluid between the fingers. In heterogeneous media, heterogeneity itself is known to trigger the formation of fingers. In homogeneous media, however, fingers are also created even if all grains have the same diameter. The reason is that pore-scale heterogeneity leading to different flow velocities also exists in homogeneous media, due to two effects: (i) grains of identical size may randomly arrange in different ways, e.g., forming tetrahedrons, hexahedrons or octahedrons; each arrangement creates pores of varying diameter, thus resulting in different average flow velocities. (ii) Random variations of solute concentration lead to varying buoyancy effects, thus also resulting in different velocities. As a continuation of previous efforts to incorporate pore-scale heterogeneity into fully saturated soil such that dense fingers are realistically generated (Cremer and Graf, EGU Assembly, 2011), the current paper extends the research scope from saturated to unsaturated soil. Perturbation methods are evaluated by numerically re-simulating a laboratory-scale experiment of plume transport in homogeneous unsaturated sand (Simmons et al., Transp. Porous Media, 2002). The following five methods are discussed: (i) homogeneous sand, (ii) initial perturbation of solute concentration, (iii) spatially random, time-constant perturbation of the solute source, (iv) spatially and temporally random noise of simulated solute concentration, and (v) a random K-field that introduces physically insignificant but numerically significant heterogeneity. Results demonstrate that, as opposed to saturated flow, perturbing the solute source will not result in plume fingering, because the location of the perturbed source (domain top) and the location of finger generation (the groundwater surface) do not coincide. Alternatively, similar to saturated flow, applying either a random concentration noise (iv) or a random K-field (v) generates realistic plume fingering. Future work will focus on the generation mechanisms of plume finger splitting.
Gabora, Liane; Kauffman, Stuart
2016-04-01
Dietrich and Haider (Psychonomic Bulletin & Review, 21 (5), 897-915, 2014) justify their integrative framework for creativity founded on evolutionary theory and prediction research on the grounds that "theories and approaches guiding empirical research on creativity have not been supported by the neuroimaging evidence." Although this justification is controversial, the general direction holds promise. This commentary clarifies points of disagreement and unresolved issues, and addresses mis-applications of evolutionary theory that lead the authors to adopt a Darwinian (versus Lamarckian) approach. To say that creativity is Darwinian is not to say that it consists of variation plus selection - in the everyday sense of the term - as the authors imply; it is to say that evolution is occurring because selection is affecting the distribution of randomly generated heritable variation across generations. In creative thought the distribution of variants is not key, i.e., one is not inclined toward idea A because 60 % of one's candidate ideas are variants of A while only 40 % are variants of B; one is inclined toward whichever seems best. The authors concede that creative variation is partly directed; however, the greater the extent to which variants are generated non-randomly, the greater the extent to which the distribution of variants can reflect not selection but the initial generation bias. Since each thought in a creative process can alter the selective criteria against which the next is evaluated, there is no demarcation into generations as assumed in a Darwinian model. We address the authors' claim that reduced variability and individuality are more characteristic of Lamarckism than Darwinian evolution, and note that a Lamarckian approach to creativity has addressed the challenge of modeling the emergent features associated with insight.
USDA-ARS?s Scientific Manuscript database
Background Circadian rhythms regulate key biological processes influencing metabolic pathways. Dysregulation is associated with type 2 diabetes (T2D) and cardiovascular diseases (CVD). Circadian rhythms are generated by a transcriptional autoregulatory feedback loop involving core clock genes. CLOCK...
Archer, Charles Jens; Musselman, Roy Glenn; Peters, Amanda; Pinnow, Kurt Walter; Swartz, Brent Allen; Wallenfelt, Brian Paul
2010-11-23
A massively parallel computer system contains an inter-nodal communications network of node-to-node links. Nodes vary a choice of routing policy for routing data in the network in a semi-random manner, so that similarly situated packets are not always routed along the same path. Semi-random variation of the routing policy tends to avoid certain local hot spots of network activity, which might otherwise arise using more consistent routing determinations. Preferably, the originating node chooses a routing policy for a packet, and all intermediate nodes in the path route the packet according to that policy. Policies may be rotated on a round-robin basis, selected by generating a random number, or otherwise varied.
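As a rough illustration of the policy-variation idea (not the patented implementation), the sketch below has the originating node choose a routing policy per packet, either round-robin or by random draw; the policy names are invented placeholders.

    import random

    POLICIES = ["minimal_adaptive", "dimension_ordered", "randomized_oblivious"]

    class OriginatingNode:
        def __init__(self, mode="round_robin", seed=None):
            self.mode = mode
            self.counter = 0
            self.rng = random.Random(seed)

        def choose_policy(self):
            """Pick a routing policy for one packet; all intermediate
            nodes would then route the packet according to this policy."""
            if self.mode == "round_robin":
                policy = POLICIES[self.counter % len(POLICIES)]
                self.counter += 1
            else:  # random selection
                policy = self.rng.choice(POLICIES)
            return policy

    node = OriginatingNode(mode="round_robin")
    packets = [{"dst": d, "policy": node.choose_policy()} for d in range(6)]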
Effect of aberration on the acoustic field in tissue harmonic imaging (THI)
NASA Astrophysics Data System (ADS)
Jing, Yuan; Cleveland, Robin
2003-10-01
A numerical simulation was used to study the impact of an aberrating layer on the generation of the fundamental and second-harmonic (SH) field in a tissue harmonic imaging scenario. The simulation used a three-dimensional time-domain code for solving the KZK equation and accounted for arbitrary spatial variations in all acoustic properties. The aberration effect was modeled by assuming that the tissue consisted of two layers whose interface had a spatial variation that acted like an effective phase screen. Initial experiments were carried out with sinusoidal-shaped interfaces. The sinusoidal interface produced grating lobes which were at least 6 dB larger for the fundamental signal than for the SH. The energy outside of the main lobe was found to increase linearly as the amplitude of the interface variation increased. The location of the grating lobes was affected by the spatial period of the interface variation. The inhomogeneous nature of tissue was modeled with an interface with a random spatial variation. With the random interface the average sidelobe level for the fundamental was -30 dB whereas the SH had an average sidelobe level of -36 dB. [Work supported by the NSF through the Center for Subsurface Sensing and Imaging Systems.]
ERIC Educational Resources Information Center
Carlson, Deven; Chingos, Matthew M.; Campbell, David E.
2017-01-01
In 1997, the New York School Choice Scholarships Foundation Program (SCSF) randomly offered three-year scholarships to attend private schools to approximately 1,000 low-income families in New York City. In this paper we leverage exogenous variation generated by the SCSF to estimate the causal effect of the private school voucher offer--and the…
Evolutionary constraints or opportunities?
Sharov, Alexei A
2014-09-01
Natural selection is traditionally viewed as a leading factor of evolution, whereas variation is assumed to be random and non-directional. Any order in variation is attributed to epigenetic or developmental constraints that can hinder the action of natural selection. In contrast I consider the positive role of epigenetic mechanisms in evolution because they provide organisms with opportunities for rapid adaptive change. Because the term "constraint" has negative connotations, I use the term "regulated variation" to emphasize the adaptive nature of phenotypic variation, which helps populations and species to survive and evolve in changing environments. The capacity to produce regulated variation is a phenotypic property, which is not described in the genome. Instead, the genome acts as a switchboard, where mostly random mutations switch "on" or "off" preexisting functional capacities of organism components. Thus, there are two channels of heredity: informational (genomic) and structure-functional (phenotypic). Functional capacities of organisms most likely emerged in a chain of modifications and combinations of more simple ancestral functions. The role of DNA has been to keep records of these changes (without describing the result) so that they can be reproduced in the following generations. Evolutionary opportunities include adjustments of individual functions, multitasking, connection between various components of an organism, and interaction between organisms. The adaptive nature of regulated variation can be explained by the differential success of lineages in macro-evolution. Lineages with more advantageous patterns of regulated variation are likely to produce more species and secure more resources (i.e., long-term lineage selection). Published by Elsevier Ireland Ltd.
Statistical error model for a solar electric propulsion thrust subsystem
NASA Technical Reports Server (NTRS)
Bantell, M. H.
1973-01-01
The solar electric propulsion thrust subsystem statistical error model was developed as a tool for investigating the effects of thrust subsystem parameter uncertainties on navigation accuracy. The model is currently being used to evaluate the impact of electric engine parameter uncertainties on navigation system performance for a baseline mission to Encke's Comet in the 1980s. The data given represent the next generation in statistical error modeling for low-thrust applications. Principal improvements include the representation of thrust uncertainties and random process modeling in terms of random parametric variations in the thrust vector process for a multi-engine configuration.
Underestimating extreme events in power-law behavior due to machine-dependent cutoffs
NASA Astrophysics Data System (ADS)
Radicchi, Filippo
2014-11-01
Power-law distributions are typical macroscopic features occurring in almost all complex systems observable in nature. As a result, researchers in quantitative analyses must often generate random synthetic variates obeying power-law distributions. The task is usually performed through standard methods that map uniform random variates into the desired probability space. Whereas all these algorithms are theoretically solid, in this paper we show that they are subject to severe machine-dependent limitations. As a result, two dramatic consequences arise: (i) the sampling in the tail of the distribution is not random but deterministic; (ii) the moments of the sample distribution, which are theoretically expected to diverge as functions of the sample sizes, converge instead to finite values. We provide quantitative indications for the range of distribution parameters that can be safely handled by standard libraries used in computational analyses. Whereas our findings indicate possible reinterpretations of numerical results obtained through flawed sampling methodologies, they also pave the way for the search for a concrete solution to this central issue shared by all quantitative sciences dealing with complexity.
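The mechanism is easy to reproduce. A minimal sketch, assuming the standard inverse-CDF sampler for p(x) ∝ x^(-α) on [xmin, ∞): because a double-precision uniform variate cannot approach 1 more closely than roughly one machine epsilon, the sampler has a hard, machine-dependent maximum, which is the cutoff discussed above.

    import numpy as np

    alpha, xmin = 2.5, 1.0
    rng = np.random.default_rng(0)

    u = rng.random(10**6)                         # uniform variates in [0, 1)
    x = xmin * (1.0 - u) ** (-1.0 / (alpha - 1.0))

    eps = np.finfo(float).eps                     # ~2.2e-16
    x_max = xmin * eps ** (-1.0 / (alpha - 1.0))  # approximate machine cutoff
    print(f"largest reachable variate: {x_max:.3e}")
    print(f"largest observed variate:  {x.max():.3e}")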
Borak, T B
1986-04-01
Periodic grab sampling in combination with time-of-occupancy surveys has been the accepted procedure for estimating the annual exposure of underground U miners to Rn daughters. Temporal variations in the concentration of potential alpha energy in the mine generate uncertainties in this process. A system to randomize the selection of locations for measurement is described which can reduce uncertainties and eliminate systematic biases in the data. In general, a sample frequency of 50 measurements per year is sufficient to satisfy the criteria that the annual exposure be determined in working level months to within +/- 50% of the true value with a 95% level of confidence. Suggestions for implementing this randomization scheme are presented.
Macroscopic Spatial Complexity of the Game of Life Cellular Automaton: A Simple Data Analysis
NASA Astrophysics Data System (ADS)
Hernández-Montoya, A. R.; Coronel-Brizio, H. F.; Rodríguez-Achach, M. E.
In this chapter we present a simple data analysis of an ensemble of 20 time series, generated by averaging the spatial positions of the living cells for each state of the Game of Life cellular automaton (GoL). We show that the complexity properties of GoL are also present at the macroscopic level described by these time series, and that the following emergent properties, typical of data extracted from complex systems such as financial or economic ones, come out: variations of the generated time series follow an asymptotic power-law distribution; large fluctuations tend to be followed by large fluctuations, and small fluctuations by small ones; and linear correlations decay fast, whereas the correlations of the absolute variations exhibit long-range memory. Finally, a Detrended Fluctuation Analysis (DFA) of the generated time series indicates that the GoL spatial macrostates described by the time series are neither completely ordered nor completely random, in a measurable and very interesting way.
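A minimal sketch of how such a macroscopic time series can be generated, under our own arbitrary choices of grid size, initial density and run length: run the Game of Life and record the centroid of the living cells at every step.

    import numpy as np

    def gol_step(grid):
        """One synchronous Game of Life update on a toroidal grid."""
        n = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if (dy, dx) != (0, 0))
        return ((n == 3) | ((grid == 1) & (n == 2))).astype(np.uint8)

    rng = np.random.default_rng(1)
    grid = (rng.random((128, 128)) < 0.3).astype(np.uint8)

    centroids = []
    for _ in range(2000):
        ys, xs = np.nonzero(grid)
        centroids.append((ys.mean(), xs.mean()))
        grid = gol_step(grid)

    series = np.diff(np.array(centroids)[:, 0])  # variations of one coordinate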
Onsongo, Getiria; Baughn, Linda B; Bower, Matthew; Henzler, Christine; Schomaker, Matthew; Silverstein, Kevin A T; Thyagarajan, Bharat
2016-11-01
Simultaneous detection of small copy number variations (CNVs) (<0.5 kb) and single-nucleotide variants in clinically significant genes is of great interest for clinical laboratories. The analytical variability in next-generation sequencing (NGS) and artifacts in coverage data because of issues with mappability along with lack of robust bioinformatics tools for CNV detection have limited the utility of targeted NGS data to identify CNVs. We describe the development and implementation of a bioinformatics algorithm, copy number variation-random forest (CNV-RF), that incorporates a machine learning component to identify CNVs from targeted NGS data. Using CNV-RF, we identified 12 of 13 deletions in samples with known CNVs, two cases with duplications, and identified novel deletions in 22 additional cases. Furthermore, no CNVs were identified among 60 genes in 14 cases with normal copy number and no CNVs were identified in another 104 patients with clinical suspicion of CNVs. All positive deletions and duplications were confirmed using a quantitative PCR method. CNV-RF also detected heterozygous deletions and duplications with a specificity of 50% across 4813 genes. The ability of CNV-RF to detect clinically relevant CNVs with a high degree of sensitivity along with confirmation using a low-cost quantitative PCR method provides a framework for providing comprehensive NGS-based CNV/single-nucleotide variant detection in a clinical molecular diagnostics laboratory. Copyright © 2016 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
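The published CNV-RF pipeline is not reproduced here, but the sketch below illustrates the general idea of random-forest CNV calling on per-interval coverage features; the feature set, class labels and synthetic training data are illustrative assumptions only.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(7)

    def make_intervals(n, ratio):
        """Synthetic per-interval features: mean coverage ratio, its
        standard deviation, and GC fraction."""
        return np.column_stack([
            rng.normal(ratio, 0.08, n),  # coverage ratio vs. a normal pool
            rng.normal(0.1, 0.02, n),    # within-interval variability
            rng.uniform(0.3, 0.7, n),    # GC content
        ])

    X = np.vstack([make_intervals(300, 0.5),   # heterozygous deletion
                   make_intervals(300, 1.0),   # normal copy number
                   make_intervals(300, 1.5)])  # duplication
    y = np.repeat(["del", "normal", "dup"], 300)

    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    print(clf.predict([[0.52, 0.1, 0.45]]))    # expected: 'del'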
Nonquadratic Variation of the Blum Blum Shub Pseudorandom Number Generator
2016-09-01
Cryptography is essential for secure online communications. Many different types of ciphers are implemented in modern-day cryptography, but they all have one common factor: all ciphers require a source of randomness, which makes them unpredictable.
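For reference, a toy sketch of the standard (quadratic) Blum Blum Shub generator that the thesis's nonquadratic variation starts from; the variation itself is not reproduced, and the primes below are far too small for real cryptographic use.

    def bbs_bits(seed, p=100003, q=1000003, nbits=32):
        """x_{i+1} = x_i^2 mod (p*q); emit each state's parity bit.
        p and q must be primes congruent to 3 mod 4, and the seed
        must be coprime to p*q (requirements are not checked here)."""
        m = p * q
        x = seed % m
        bits = []
        for _ in range(nbits):
            x = (x * x) % m
            bits.append(x & 1)  # least significant bit of the state
        return bits

    print(bbs_bits(seed=123456789))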
Estimation of population mean under systematic sampling
NASA Astrophysics Data System (ADS)
Noor-ul-amin, Muhammad; Javaid, Amjad
2017-11-01
In this study we propose a generalized ratio estimator under non-response for systematic random sampling. We also generate a class of estimators as special cases of the generalized estimator, using different combinations of the coefficients of correlation, kurtosis and variation. The mean square errors and mathematical conditions are derived to prove the efficiency of the proposed estimators. A numerical illustration using three populations is included to support the results.
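A minimal sketch of the classical ingredients, not the authors' generalized estimator: a 1-in-k systematic sample is drawn and the textbook ratio estimate of the population mean is formed from a known auxiliary mean. The synthetic population below is an assumption for illustration.

    import numpy as np

    rng = np.random.default_rng(42)
    N, k = 1000, 10
    x = rng.gamma(4.0, 2.0, N)               # auxiliary variable, mean X̄ known
    y = 3.0 * x + rng.normal(0.0, 2.0, N)    # study variable, correlated with x

    start = rng.integers(k)                  # random start in 0..k-1
    idx = np.arange(start, N, k)             # systematic sample indices
    ybar, xbar = y[idx].mean(), x[idx].mean()

    ratio_estimate = ybar * (x.mean() / xbar)  # ŷ_R = ȳ · X̄ / x̄
    print(ratio_estimate, y.mean())            # estimate vs. true mean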
Skinner, Michael K
2015-04-26
Environment has a critical role in the natural selection process for Darwinian evolution. The primary molecular component currently considered for neo-Darwinian evolution involves genetic alterations and random mutations that generate the phenotypic variation required for natural selection to act. The vast majority of environmental factors cannot directly alter DNA sequence. Epigenetic mechanisms directly regulate genetic processes and can be dramatically altered by environmental factors. Therefore, environmental epigenetics provides a molecular mechanism to directly alter phenotypic variation generationally. Lamarck proposed in 1802 the concept that environment can directly alter phenotype in a heritable manner. Environmental epigenetics and epigenetic transgenerational inheritance provide molecular mechanisms for this process. Therefore, the environment can directly influence phenotypic variation at the molecular level. The ability of environmental epigenetics to alter phenotypic and genotypic variation directly can significantly impact natural selection. This neo-Lamarckian concept can facilitate neo-Darwinian evolution. A unified theory of evolution is presented to describe the integration of environmental epigenetic and genetic aspects of evolution. © The Author(s) 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
ERIC Educational Resources Information Center
Gougis, Rebekka Darner; Stomberg, Janet F.; O'Hare, Alicia T.; O'Reilly, Catherine M.; Bader, Nicholas E.; Meixner, Thomas; Carey, Cayelan C.
2017-01-01
The concepts of randomness and variation are pervasive in science. The purpose of this study was to document how post-secondary life science students explain randomness and variation, to infer relationships between their explanations and their ability to describe and identify appropriate and inappropriate variation, and to determine if students can identify…
Evolutionary constraints or opportunities?
Sharov, Alexei A
2014-04-22
Natural selection is traditionally viewed as a leading factor of evolution, whereas variation is assumed to be random and non-directional. Any order in variation is attributed to epigenetic or developmental constraints that can hinder the action of natural selection. In contrast I consider the positive role of epigenetic mechanisms in evolution because they provide organisms with opportunities for rapid adaptive change. Because the term "constraint" has negative connotations, I use the term "regulated variation" to emphasize the adaptive nature of phenotypic variation, which helps populations and species to survive and evolve in changing environments. The capacity to produce regulated variation is a phenotypic property, which is not described in the genome. Instead, the genome acts as a switchboard, where mostly random mutations switch "on" or "off" preexisting functional capacities of organism components. Thus, there are two channels of heredity: informational (genomic) and structure-functional (phenotypic). Functional capacities of organisms most likely emerged in a chain of modifications and combinations of more simple ancestral functions. The role of DNA has been to keep records of these changes (without describing the result) so that they can be reproduced in the following generations. Evolutionary opportunities include adjustments of individual functions, multitasking, connection between various components of an organism, and interaction between organisms. The adaptive nature of regulated variation can be explained by the differential success of lineages in macro-evolution. Lineages with more advantageous patterns of regulated variation are likely to produce more species and secure more resources (i.e., long-term lineage selection). Copyright © 2014. Published by Elsevier Ireland Ltd.
Systematic and random variations in digital Thematic Mapper data
NASA Technical Reports Server (NTRS)
Duggin, M. J. (Principal Investigator); Sakhavat, H.
1985-01-01
Radiance recorded by any remote sensing instrument will contain noise which will consist of both systematic and random variations. Systematic variations may be due to sun-target-sensor geometry, atmospheric conditions, and the interaction of the spectral characteristics of the sensor with those of upwelling radiance. Random variations in the data may be caused by variations in the nature and in the heterogeneity of the ground cover, by variations in atmospheric transmission, and by the interaction of these variations with the sensing device. It is important to be aware of the extent of random and systematic errors in recorded radiance data across ostensibly uniform ground areas in order to assess the impact on quantitative image analysis procedures for both the single date and the multidate cases. It is the intention here to examine the systematic and the random variations in digital radiance data recorded in each band by the Thematic Mapper over crop areas which are ostensibly uniform and which are free from visible cloud.
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2014-01-01
This report describes a modeling and simulation approach for disturbance patterns representative of the environment experienced by a digital system in an electromagnetic reverberation chamber. The disturbance is modeled by a multi-variate statistical distribution based on empirical observations. Extended versions of the Rejection Sampling and Inverse Transform Sampling techniques are developed to generate multi-variate random samples of the disturbance. The results show that Inverse Transform Sampling returns samples with higher fidelity relative to the empirical distribution. This work is part of an ongoing effort to develop a resilience assessment methodology for complex safety-critical distributed systems.
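The report's multivariate extensions are not reproduced here, but the two baseline techniques are standard; a univariate sketch against an empirical distribution, with placeholder data standing in for the disturbance measurements, might look as follows.

    import numpy as np

    rng = np.random.default_rng(3)
    observations = rng.gumbel(2.0, 0.5, 5000)  # placeholder empirical data

    # Inverse transform sampling: map uniforms through the empirical
    # quantile function (the inverse of the empirical CDF).
    def inverse_transform(n):
        return np.quantile(observations, rng.random(n))

    # Rejection sampling: accept candidate x with probability f(x)/fmax,
    # here with a histogram estimate of f and a uniform proposal.
    hist, edges = np.histogram(observations, bins=100, density=True)

    def rejection(n):
        out = []
        lo, hi, fmax = edges[0], edges[-1], hist.max()
        while len(out) < n:
            x = rng.uniform(lo, hi)
            i = np.clip(np.searchsorted(edges, x, side="right") - 1,
                        0, hist.size - 1)
            if rng.random() * fmax < hist[i]:
                out.append(x)
        return np.array(out)

    samples_it, samples_rej = inverse_transform(10000), rejection(10000)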
Network Ecology and Adolescent Social Structure
McFarland, Daniel A.; Moody, James; Diehl, David; Smith, Jeffrey A.; Thomas, Reuben J.
2014-01-01
Adolescent societies—whether arising from weak, short-term classroom friendships or from close, long-term friendships—exhibit various levels of network clustering, segregation, and hierarchy. Some are rank-ordered caste systems and others are flat, cliquish worlds. Explaining the source of such structural variation remains a challenge, however, because global network features are generally treated as the agglomeration of micro-level tie-formation mechanisms, namely balance, homophily, and dominance. How do the same micro-mechanisms generate significant variation in global network structures? To answer this question we propose and test a network ecological theory that specifies the ways features of organizational environments moderate the expression of tie-formation processes, thereby generating variability in global network structures across settings. We develop this argument using longitudinal friendship data on schools (Add Health study) and classrooms (Classroom Engagement study), and by extending exponential random graph models to the study of multiple societies over time. PMID:25535409
Network Ecology and Adolescent Social Structure.
McFarland, Daniel A; Moody, James; Diehl, David; Smith, Jeffrey A; Thomas, Reuben J
2014-12-01
Adolescent societies-whether arising from weak, short-term classroom friendships or from close, long-term friendships-exhibit various levels of network clustering, segregation, and hierarchy. Some are rank-ordered caste systems and others are flat, cliquish worlds. Explaining the source of such structural variation remains a challenge, however, because global network features are generally treated as the agglomeration of micro-level tie-formation mechanisms, namely balance, homophily, and dominance. How do the same micro-mechanisms generate significant variation in global network structures? To answer this question we propose and test a network ecological theory that specifies the ways features of organizational environments moderate the expression of tie-formation processes, thereby generating variability in global network structures across settings. We develop this argument using longitudinal friendship data on schools (Add Health study) and classrooms (Classroom Engagement study), and by extending exponential random graph models to the study of multiple societies over time.
NASA Astrophysics Data System (ADS)
Nex, F.; Gerke, M.
2014-08-01
Image matching techniques can nowadays provide very dense point clouds and are often considered a valid alternative to LiDAR point clouds. However, photogrammetric point clouds are often characterized by a higher level of random noise compared to LiDAR data and by the presence of large outliers. These problems constitute a limitation in the practical use of photogrammetric data for many applications, but an effective way to enhance the generated point cloud has yet to be found. In this paper we concentrate on the restoration of Digital Surface Models (DSM) computed from dense image matching point clouds. A photogrammetric DSM, i.e. a 2.5D representation of the surface, is still one of the major products derived from point clouds. Four different algorithms devoted to DSM denoising are presented: a standard median filter approach, a bilateral filter, a variational approach (TGV: Total Generalized Variation), as well as a newly developed algorithm, which is embedded into a Markov Random Field (MRF) framework and optimized through graph-cuts. The ability of each algorithm to recover the original DSM has been quantitatively evaluated. To do that, a synthetic DSM has been generated and different typologies of noise have been added to mimic the typical errors of photogrammetric DSMs. The evaluation reveals that standard filters like the median and edge-preserving smoothing through a bilateral filter cannot sufficiently remove typical errors occurring in a photogrammetric DSM. The TGV-based approach removes random noise much better, but large areas with outliers still remain. Our own method, which explicitly models the degradation properties of those DSMs, outperforms the others in all aspects.
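Of the four algorithms, the two simplest baselines are easy to sketch. Below, a synthetic DSM with random noise and sparse outliers is cleaned with a median filter and a naive loop-based bilateral filter; the TGV and MRF/graph-cut methods are not reproduced.

    import numpy as np
    from scipy.ndimage import median_filter

    rng = np.random.default_rng(5)
    yy, xx = np.mgrid[0:128, 0:128]
    dsm = 0.05 * xx + 5.0 * (xx > 64)             # ramp plus a height jump
    noisy = dsm + rng.normal(0, 0.3, dsm.shape)   # random matching noise
    noisy[rng.random(dsm.shape) < 0.01] += 20.0   # sparse large outliers

    dsm_median = median_filter(noisy, size=5)

    def bilateral(img, radius=3, sigma_s=2.0, sigma_r=1.0):
        """Edge-preserving smoothing: weights combine spatial closeness
        and height similarity, so the height jump is not blurred."""
        out = np.empty_like(img)
        pad = np.pad(img, radius, mode="reflect")
        dy, dx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        ws = np.exp(-(dy**2 + dx**2) / (2 * sigma_s**2))
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                patch = pad[i:i + 2*radius + 1, j:j + 2*radius + 1]
                wr = np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
                w = ws * wr
                out[i, j] = (w * patch).sum() / w.sum()
        return out

    dsm_bilateral = bilateral(noisy)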
Multimodal biometric approach for cancelable face template generation
NASA Astrophysics Data System (ADS)
Paul, Padma Polash; Gavrilova, Marina
2012-06-01
Due to the rapid growth of biometric technology, template protection becomes crucial to secure the integrity of the biometric security system and prevent unauthorized access. Cancelable biometrics is emerging as one of the best solutions to secure biometric identification and verification systems. We present a novel technique for a robust cancelable template generation algorithm that takes advantage of multimodal biometrics using feature-level fusion. Feature-level fusion of different facial features is applied to generate the cancelable template. A proposed algorithm based on multi-fold random projection and a fuzzy commitment scheme is used for this purpose. In cancelable template generation, one of the main difficulties is preserving the interclass variance of the features. We have found that interclass variations of the features that are lost during multi-fold random projection can be recovered using fusion of different feature subsets and projection into a new feature domain. Applying the multimodal technique at the feature level, we enhance the interclass variability and hence improve the performance of the system. We have tested the system for classifier fusion for different feature subsets and different cancelable template fusions. Experiments have shown that the cancelable template improves the performance of the biometric system compared with the original template.
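A minimal sketch of the multi-fold random projection step, under our own assumptions about feature dimensions: each facial feature subset gets its own key-derived random projection, and the projections are fused into one revocable template. Feature extraction and the commitment stage are omitted.

    import numpy as np

    rng = np.random.default_rng(11)

    def cancelable_template(feature_subsets, user_key):
        """Project each feature subset with a key-derived random matrix
        and concatenate; issuing a new key revokes the old template."""
        key_rng = np.random.default_rng(user_key)
        parts = []
        for f in feature_subsets:
            P = key_rng.normal(size=(len(f) // 2, len(f)))  # fold: halve dim
            parts.append(P @ f)
        return np.concatenate(parts)

    # Illustrative feature subsets (e.g. eye, nose, mouth descriptors).
    subsets = [rng.normal(size=64) for _ in range(3)]
    t1 = cancelable_template(subsets, user_key=2012)
    t2 = cancelable_template(subsets, user_key=2013)  # revoked and reissued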
Latin Hypercube Sampling (LHS) UNIX Library/Standalone
DOE Office of Scientific and Technical Information (OSTI.GOV)
2004-05-13
The LHS UNIX Library/Standalone software provides the capability to draw random samples from over 30 distribution types. It performs the sampling by a stratified sampling method called Latin Hypercube Sampling (LHS). Multiple distributions can be sampled simultaneously, with user-specified correlations amongst the input distributions; LHS UNIX Library/Standalone thus provides a way to generate multi-variate samples. The LHS samples can be generated either through a callable library (e.g., from within the DAKOTA software framework) or as a standalone capability. LHS is a constrained Monte Carlo sampling scheme. In LHS, the range of each variable is divided into non-overlapping intervals on the basis of equal probability. A sample is selected at random with respect to the probability density in each interval. If multiple variables are sampled simultaneously, then the values obtained for each are paired in a random manner with the n values of the other variables. In some cases, the pairing is restricted to obtain specified correlations amongst the input variables. Many simulation codes have input parameters that are uncertain and can be specified by a distribution. To perform uncertainty analysis and sensitivity analysis, random values are drawn from the input parameter distributions, and the simulation is run with these values to obtain output values. If this is done repeatedly, with many input samples drawn, one can build up a distribution of the output as well as examine correlations between input and output variables.
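The core LHS scheme described above fits in a few lines; a standalone sketch (not the library's code), with an example mapping onto normal and uniform input parameters:

    import numpy as np
    from scipy.stats import norm

    def lhs(n_samples, n_vars, rng=None):
        """One point per equal-probability stratum in each dimension,
        with strata paired across dimensions by random permutation."""
        if rng is None:
            rng = np.random.default_rng()
        u = (np.arange(n_samples)[:, None]
             + rng.random((n_samples, n_vars))) / n_samples
        for j in range(n_vars):                # random pairing of strata
            u[:, j] = u[rng.permutation(n_samples), j]
        return u

    # Map the unit-cube sample onto target distributions via inverse CDFs.
    u = lhs(100, 2, np.random.default_rng(0))
    x_normal = norm.ppf(u[:, 0], loc=10.0, scale=2.0)
    x_uniform = 5.0 + 10.0 * u[:, 1]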
Analysis of force profile during a maximum voluntary isometric contraction task.
Househam, Elizabeth; McAuley, John; Charles, Thompson; Lightfoot, Timothy; Swash, Michael
2004-03-01
This study analyses maximum voluntary isometric contraction (MVIC) and its measurement by recording the force profile during maximal-effort, 7-s hand-grip contractions. Six healthy subjects each performed three trials repeated at short intervals to study variation from fatigue. These three trials were performed during three separate sessions at daily intervals to look at random variation. A pattern of force development during a trial was identified. An initiation phase, with or without an initiation peak, was followed by a maintenance phase, sometimes with secondary pulses and an underlying decline in force. Of these three MVIC parameters, maximum force during the maintenance phase showed less random variability relative to intertrial fatigue variability than did maximum force during the initiation phase or the absolute maximum force. Analysis of MVIC as a task, rather than as a single maximal value, reveals deeper levels of motor control in its generation. Thus, force parameters other than the absolute maximum force may be better suited to quantification of muscle performance in health and disease.
NASA Technical Reports Server (NTRS)
Englander, Jacob A.; Englander, Arnold C.
2014-01-01
Trajectory optimization methods using monotonic basin hopping (MBH) have become well developed during the past decade [1, 2, 3, 4, 5, 6]. An essential component of MBH is a controlled random search through the multi-dimensional space of possible solutions. Historically, the randomness has been generated by drawing random variables (RVs) from a uniform probability distribution. Here, we investigate generating the randomness by drawing the RVs from Cauchy and Pareto distributions, chosen because of their characteristic long tails. We demonstrate that using Cauchy distributions (as first suggested by J. Englander [3, 6]) significantly improves MBH performance, and that Pareto distributions provide even greater improvements. Improved performance is defined in terms of efficiency and robustness. Efficiency is finding better solutions in less time. Robustness is efficiency that is undiminished by (a) the boundary conditions and internal constraints of the optimization problem being solved, and (b) variations in the parameters of the probability distribution. Robustness is important for achieving performance improvements that are not problem specific. In this work we show that the performance improvements are the result of how these long-tailed distributions enable MBH to search the solution space faster and more thoroughly. In developing this explanation, we use the concepts of sub-diffusive, normally-diffusive, and super-diffusive random walks (RWs) originally developed in the field of statistical physics.
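A minimal sketch of MBH with long-tailed hops, on a placeholder objective rather than a trajectory optimization problem: the only change from the historical scheme is drawing the hop from a Cauchy distribution instead of a uniform one.

    import numpy as np
    from scipy.optimize import minimize

    def objective(x):
        return np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))  # many local minima

    def mbh(x0, hops=200, scale=0.1, seed=0):
        rng = np.random.default_rng(seed)
        best = minimize(objective, x0)
        for _ in range(hops):
            step = scale * rng.standard_cauchy(len(x0))  # long-tailed hop
            trial = minimize(objective, best.x + step)
            if trial.fun < best.fun:                     # monotonic: keep
                best = trial                             # only improvements
        return best

    result = mbh(np.full(4, 3.0))
    print(result.x, result.fun)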
Hooper, Lisa M.; Weinfurt, Kevin P.; Cooper, Lisa A.; Mensh, Julie; Harless, William; Kuhajda, Melissa C.; Epstein, Steven A.
2009-01-01
Background Some primary care physicians provide less than optimal care for depression (Kessler et al., Journal of the American Medical Association 291, 2581–90, 2004). However, the literature is not unanimous on the best method to use in order to investigate this variation in care. To capture variations in physician behaviour and decision making in primary care settings, 32 interactive CD-ROM vignettes were constructed and tested. Aim and method The primary aim of this methods-focused paper was to review the extent to which our study method – an interactive CD-ROM patient vignette methodology – was effective in capturing variation in physician behaviour. Specifically, we examined the following questions: (a) Did the interactive CD-ROM technology work? (b) Did we create believable virtual patients? (c) Did the research protocol enable interviews (data collection) to be completed as planned? (d) To what extent was the targeted study sample size achieved? and (e) Did the study interview protocol generate valid and reliable quantitative data and rich, credible qualitative data? Findings Among a sample of 404 randomly selected primary care physicians, our voice-activated interactive methodology appeared to be effective. Specifically, our methodology – combining interactive virtual patient vignette technology, experimental design, and expansive open-ended interview protocol – generated valid explanations for variations in primary care physician practice patterns related to depression care. PMID:20463864
ERIC Educational Resources Information Center
Carlson, Deven; Chingos, Matthew M.; Campbell, David E.
2016-01-01
In 1997, the New York School Choice Scholarships Foundation Program (SCSF) randomly offered three-year scholarships to attend private schools to approximately 1,000 low-income families in New York City. In this paper we leverage exogenous variation generated by the SCSF to estimate the causal effect of the private school voucher offer--and the…
Mwogi, Thomas S.; Biondich, Paul G.; Grannis, Shaun J.
2014-01-01
Motivated by the need for readily available data for testing an open-source health information exchange platform, we developed and evaluated two methods for generating synthetic messages. The methods used HL7 version 2 messages obtained from the Indiana Network for Patient Care. Data from both methods were analyzed to assess how effectively the output reflected the original 'real-world' data. The Markov Chain method (MCM) used an algorithm based on a transitional probability matrix, while the Music Box model (MBM) randomly selected messages of a particular trigger type from the original data to generate new messages. The MBM was faster, generated shorter messages and exhibited less variation in message length. The MCM required more computational power and generated longer messages with more variability in length. Both methods exhibited adequate coverage, producing a high proportion of messages consistent with the original messages. Both methods yielded similar rates of valid messages. PMID:25954458
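A toy sketch of the MCM idea: estimate a transitional probability matrix over message segment types from existing messages, then walk the chain to emit new synthetic segment sequences. The three-letter segment names mimic HL7 v2, but the data here are invented.

    import numpy as np

    real_messages = [["MSH", "PID", "PV1", "OBX"],
                     ["MSH", "PID", "OBX", "OBX"],
                     ["MSH", "PID", "PV1", "OBX", "OBX"]]

    states = sorted({s for m in real_messages for s in m}) + ["END"]
    ix = {s: i for i, s in enumerate(states)}
    T = np.zeros((len(states), len(states)))
    for m in real_messages:
        for a, b in zip(m, m[1:] + ["END"]):
            T[ix[a], ix[b]] += 1
    row = T.sum(axis=1, keepdims=True)
    T = np.divide(T, np.where(row == 0.0, 1.0, row))  # rows -> probabilities

    def synth_message(rng):
        seg, out = "MSH", []
        while seg != "END":
            out.append(seg)
            seg = rng.choice(states, p=T[ix[seg]])
        return out

    print(synth_message(np.random.default_rng(4)))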
Streaks and vortices in near-wall turbulence.
Chernyshenko, S I; Baig, M F
2005-05-15
This paper presents evidence that organization of wall-normal motions plays almost no role in the creation of streaks. This evidence consists of the theory of streak generation not requiring the existence of organized vortices, extensive quantitative comparisons between the theory and direct numerical simulations, including examples of large variation in average spacing of the streaks of different scalars simultaneously present in the flow, and an example of the scalar streaks in an artificially created purely random flow.
On the Use of the Beta Distribution in Probabilistic Resource Assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olea, Ricardo A., E-mail: olea@usgs.gov
2011-12-15
The triangular distribution is a popular choice when it comes to modeling bounded continuous random variables. Its wide acceptance derives mostly from its simple analytic properties and the ease with which modelers can specify its three parameters through the extremes and the mode. On the negative side, hardly any real process follows a triangular distribution, which from the outset puts at a disadvantage any model employing triangular distributions. At a time when numerical techniques such as the Monte Carlo method are displacing analytic approaches in stochastic resource assessments, easy specification remains the most attractive characteristic of the triangular distribution. The beta distribution is another continuous distribution defined within a finite interval offering wider flexibility in style of variation, thus allowing consideration of models in which the random variables closely follow the observed or expected styles of variation. Despite its more complex definition, generation of values following a beta distribution is as straightforward as generating values following a triangular distribution, leaving the selection of parameters as the main impediment to practically considering beta distributions. This contribution intends to promote the acceptance of the beta distribution by explaining its properties and offering several suggestions to facilitate the specification of its two shape parameters. In general, given the same distributional parameters, use of the beta distribution in stochastic modeling may yield significantly different results, yet better estimates, than the triangular distribution.
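A short sketch of the comparison: generate triangular variates and beta variates rescaled to the same support. The mode-matching rule used below to pick the beta shape parameters is one common convention, not necessarily the paper's suggestion.

    import numpy as np

    rng = np.random.default_rng(8)
    low, mode, high = 10.0, 25.0, 60.0
    n = 100_000

    tri = rng.triangular(low, mode, high, n)

    # One way to pick beta shapes from the same mode: fix a + b, then
    # mode = (a - 1) / (a + b - 2) on [0, 1] (for a, b > 1) pins down a.
    m = (mode - low) / (high - low)  # mode mapped to [0, 1]
    a_plus_b = 6.0                   # spread parameter (an assumption)
    a = 1.0 + m * (a_plus_b - 2.0)
    b = a_plus_b - a
    beta = low + (high - low) * rng.beta(a, b, n)

    print(tri.mean(), beta.mean())   # same support, different shapes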
Evolution viewed from physics, physiology and medicine.
Noble, Denis
2017-10-06
Stochasticity is harnessed by organisms to generate functionality. Randomness does not, therefore, necessarily imply lack of function or 'blind chance' at higher levels. In this respect, biology must resemble physics in generating order from disorder. This fact is contrary to Schrödinger's idea of biology generating phenotypic order from molecular-level order, which inspired the central dogma of molecular biology. The order originates at higher levels, which constrain the components at lower levels. We now know that this includes the genome, which is controlled by patterns of transcription factors and various epigenetic and reorganization mechanisms. These processes can occur in response to environmental stress, so that the genome becomes 'a highly sensitive organ of the cell' (McClintock). Organisms have evolved to be able to cope with many variations at the molecular level. Organisms also make use of physical processes in evolution and development when it is possible to arrive at functional development without the necessity to store all information in DNA sequences. This view of development and evolution differs radically from that of neo-Darwinism with its emphasis on blind chance as the origin of variation. Blind chance is necessary, but the origin of functional variation is not at the molecular level. These observations derive from and reinforce the principle of biological relativity, which holds that there is no privileged level of causation. They also have important implications for medical science.
Griffiths, Jason I.; Fronhofer, Emanuel A.; Garnier, Aurélie; Seymour, Mathew; Altermatt, Florian; Petchey, Owen L.
2017-01-01
The development of video-based monitoring methods allows for rapid, dynamic and accurate monitoring of individuals or communities, compared to slower traditional methods, with far-reaching ecological and evolutionary applications. Large amounts of data are generated using video-based methods, which can be effectively processed using machine learning (ML) algorithms into meaningful ecological information. ML uses user-defined classes (e.g. species), derived from a subset (i.e. training data) of video-observed quantitative features (e.g. phenotypic variation), to infer classes in subsequent observations. However, phenotypic variation often changes due to environmental conditions, which may lead to poor classification if environmentally induced variation in phenotypes is not accounted for. Here we describe a framework for classifying species under changing environmental conditions based on random forest classification. A sliding window approach was developed that restricts the temporal and environmental conditions used for training in order to improve the classification. We tested our approach by applying the classification framework to experimental data. The experiment used a set of six ciliate species to monitor changes in community structure and behavior over hundreds of generations, in dozens of species combinations and across a temperature gradient. Differences in biotic and abiotic conditions caused simplistic classification approaches to be unsuccessful. In contrast, the sliding window approach allowed classification to be highly successful, as phenotypic differences driven by environmental change could be captured by the classifier. Importantly, classification using the random forest algorithm showed comparable success when validated against traditional, slower, manual identification. Our framework allows for reliable classification in dynamic environments, and may help to improve strategies for long-term monitoring of species in changing environments. Our classification pipeline can be applied in fields assessing species community dynamics, such as eco-toxicology, ecology and evolutionary ecology. PMID:28472193
Parameterized reduced order models from a single mesh using hyper-dual numbers
NASA Astrophysics Data System (ADS)
Brake, M. R. W.; Fike, J. A.; Topping, S. D.
2016-06-01
In order to assess the predicted performance of a manufactured system, analysts must consider random variations (both geometric and material) in the development of a model, instead of a single deterministic model of an idealized geometry with idealized material properties. The incorporation of random geometric variations, however, could necessitate the development of thousands of nearly identical solid geometries that must be meshed and separately analyzed, which would require an impractical number of man-hours to complete. This research advances a recent approach to uncertainty quantification by developing parameterized reduced order models. These parameterizations are based upon Taylor series expansions of the system's matrices about the ideal geometry, and a component mode synthesis representation for each linear substructure is used to form an efficient basis with which to study the system. The numerical derivatives required for the Taylor series expansions are obtained via hyper-dual numbers, and are compared to parameterized models constructed with finite difference formulations. The advantage of using hyper-dual numbers is two-fold: accuracy of the derivatives to machine precision, and the need to generate only a single mesh of the system of interest. The theory is applied to a stepped beam system in order to demonstrate proof of concept. The results demonstrate that the hyper-dual number multivariate parameterization of geometric variations, which is largely neglected in the literature, is accurate for both sensitivity and optimization studies. As model and mesh generation can constitute the greatest expense of time in analyzing a system, the foundation to create a parameterized reduced order model based off of a single mesh is expected to reduce dramatically the time necessary to analyze multiple realizations of a component's possible geometry.
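The hyper-dual ingredient is compact enough to sketch: numbers of the form a + b·e1 + c·e2 + d·e1e2 with e1² = e2² = 0 carry exact first and second derivatives through arithmetic, with no finite-difference step-size error. Only the operations needed for a toy demo are implemented below; the function is a stand-in, not the paper's stiffness matrices.

    class HyperDual:
        def __init__(self, f, f1=0.0, f2=0.0, f12=0.0):
            self.f, self.f1, self.f2, self.f12 = f, f1, f2, f12

        def __add__(self, o):
            o = o if isinstance(o, HyperDual) else HyperDual(o)
            return HyperDual(self.f + o.f, self.f1 + o.f1,
                             self.f2 + o.f2, self.f12 + o.f12)
        __radd__ = __add__

        def __mul__(self, o):
            o = o if isinstance(o, HyperDual) else HyperDual(o)
            return HyperDual(self.f * o.f,
                             self.f * o.f1 + self.f1 * o.f,
                             self.f * o.f2 + self.f2 * o.f,
                             self.f * o.f12 + self.f1 * o.f2
                             + self.f2 * o.f1 + self.f12 * o.f)
        __rmul__ = __mul__

    def f(x):
        return x * x * x + 2.0 * x    # toy scalar function

    x = HyperDual(3.0, f1=1.0, f2=1.0)  # seed both perturbation directions
    y = f(x)
    print(y.f, y.f1, y.f12)             # f(3)=33, f'(3)=29, f''(3)=18, exactly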
Molecular Darwinism: The Contingency of Spontaneous Genetic Variation
Arber, Werner
2011-01-01
The availability of spontaneously occurring genetic variants is an important driving force of biological evolution. Largely thanks to experimental investigations by microbial geneticists, we know today that several different molecular mechanisms contribute to the overall genetic variations. These mechanisms can be assigned to three natural strategies to generate genetic variants: 1) local sequence changes, 2) intragenomic reshuffling of DNA segments, and 3) acquisition of a segment of foreign DNA. In these processes, specific gene products are involved in cooperation with different nongenetic elements. Some genetic variations occur fully at random along the DNA filaments, others rather with a statistical reproducibility, although at many possible sites. We have to be aware that evolution in natural ecosystems is of higher complexity than under most laboratory conditions, not least in view of symbiotic associations and the occurrence of horizontal gene transfer. The encountered contingency of genetic variation can possibly best ensure a long-term persistence of life under steadily changing living conditions. PMID:21979160
Zhou, Lecong; Bailey, K L; Chen, C Y; Keri, Mario
2005-01-01
Molecular and genetic approaches were used to evaluate the genetic relatedness among isolates of the fungus Phoma macrostoma Montagne originating from Canada and Europe and to other species in the genus Phoma. Distinct differences were observed in genetic variation among nine species of the genus Phoma. Randomly amplified polymorphic DNA (RAPD) revealed the presence of intraspecific genetic variation among the isolates of P. macrostoma, with the isolates being used for biological weed control being distributed in a distinct phylogenetic cluster. Additional variation within the biocontrol isolate cluster in P. macrostoma was revealed by pulsed field gel electrophoresis (PFGE), which showed that biocontrol isolates generated two different chromosomal profiles, although the profiles did not relate to their Canadian ecozone origin. Mating studies showed that biocontrol isolates of P. macrostoma from Canada did not produce sexual reproductive structures and were incapable of crossing. These studies also confirmed that no obvious differentiation exists among the biocontrol isolates of P. macrostoma from Canadian Ecozones 3 and 4.
Molecular Darwinism: the contingency of spontaneous genetic variation.
Arber, Werner
2011-01-01
The availability of spontaneously occurring genetic variants is an important driving force of biological evolution. Largely thanks to experimental investigations by microbial geneticists, we know today that several different molecular mechanisms contribute to the overall genetic variations. These mechanisms can be assigned to three natural strategies to generate genetic variants: 1) local sequence changes, 2) intragenomic reshuffling of DNA segments, and 3) acquisition of a segment of foreign DNA. In these processes, specific gene products are involved in cooperation with different nongenetic elements. Some genetic variations occur fully at random along the DNA filaments, others rather with a statistical reproducibility, although at many possible sites. We have to be aware that evolution in natural ecosystems is of higher complexity than under most laboratory conditions, not least in view of symbiotic associations and the occurrence of horizontal gene transfer. The encountered contingency of genetic variation can possibly best ensure a long-term persistence of life under steadily changing living conditions.
Range and stability of structural colors generated by Morpho-inspired color reflectors.
Chung, Kyungjae; Shin, Jung H
2013-05-01
The range and stability of structural colors generated by Morpho-inspired color reflectors are investigated. We find that despite the internal randomness of such structures, which gives rise to their Morpho-like angle-independent iridescence, their colors under ambient lighting conditions can be predicted by simple transfer-matrix calculations of corresponding planar multilayer structures. By calculating the possible range of colors generated by multilayers of different structures and material combinations using such transfer-matrix methods, we find that low-refractive-index multilayers with intrastructure absorption, such as the melanin-containing chitin/air multilayer structure of the Morpho butterflies, can provide not only the purest structural colors with the largest color gamut, but also the highest stability of color against variations in multilayer structure.
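A sketch of the kind of transfer-matrix calculation referred to, for normal incidence on a planar chitin-like/air stack; the thicknesses, indices and the neglect of absorption here are our own simplifying assumptions.

    import numpy as np

    def reflectance(wavelength, n_layers, d_layers, n_in=1.0, n_sub=1.56):
        """Characteristic-matrix method (normal incidence)."""
        M = np.eye(2, dtype=complex)
        for n, d in zip(n_layers, d_layers):
            delta = 2.0 * np.pi * n * d / wavelength
            layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                              [1j * n * np.sin(delta), np.cos(delta)]])
            M = M @ layer
        B, C = M @ np.array([1.0, n_sub])
        r = (n_in * B - C) / (n_in * B + C)
        return abs(r) ** 2

    # Quarter-wave-like chitin/air stack tuned near 480 nm (blue).
    stack_n = [1.56, 1.0] * 6
    stack_d = [480 / (4 * 1.56), 480 / (4 * 1.0)] * 6

    wavelengths = np.linspace(380, 700, 161)
    R = [reflectance(w, stack_n, stack_d) for w in wavelengths]
    print(wavelengths[int(np.argmax(R))])  # peak should sit near 480 nm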
Correcting Biases in a lower resolution global circulation model with data assimilation
NASA Astrophysics Data System (ADS)
Canter, Martin; Barth, Alexander
2016-04-01
With this work, we aim at developing a new method of bias correction using data assimilation. This method is based on the stochastic forcing of a model to correct bias. First, through a preliminary run, we estimate the bias of the model and its possible sources. Then, we establish a forcing term which is directly added inside the model's equations. We create an ensemble of runs and consider the forcing term as a control variable during the assimilation of observations. We then use this analysed forcing term to correct the bias of the model. Since the forcing is added inside the model, it acts as a source term, unlike external forcings such as wind. This procedure has been developed and successfully tested with a twin experiment on a Lorenz 95 model. It is currently being applied and tested on the sea ice ocean NEMO LIM model, which is used in the PredAntar project. NEMO LIM is a global and low-resolution (2 degrees) coupled model (hydrodynamic model and sea ice model) with long time steps allowing simulations over several decades. Due to its low resolution, the model is subject to bias in areas where strong currents are present. We aim at correcting this bias by using perturbed current fields from higher-resolution models and randomly generated perturbations. The random perturbations need to be constrained in order to respect the physical properties of the ocean and not create unwanted phenomena. To construct those random perturbations, we first create a random field with the Diva tool (Data-Interpolating Variational Analysis). Using a cost function, this tool penalizes abrupt variations in the field, while using a custom correlation length. It also decouples disconnected areas based on topography. Then, we filter the field to smooth it and remove small-scale variations. We use this field as a random stream function, and take its derivatives to get zonal and meridional velocity fields. We also constrain the stream function along the coasts so as not to have currents perpendicular to the coast. The randomly generated stochastic forcings are then directly injected into the NEMO LIM model's equations in order to force the model at each timestep, and not only during the assimilation step. Results from a twin experiment will be presented. This method is being applied to a real case, with observations of the sea surface height available from the mean dynamic topography of CNES (Centre national d'études spatiales). The model, the bias correction, and more extensive forcings, in particular with a three-dimensional structure and a time-varying component, will also be presented.
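The last construction step is easy to illustrate. A minimal sketch, with a Gaussian-smoothed random field standing in for the Diva analysis (which is not used here): treat the field as a stream function and differentiate it, which guarantees a non-divergent velocity perturbation.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(6)
    dx = 2.0e5                                  # grid spacing in metres (toy)
    psi = gaussian_filter(rng.normal(size=(90, 180)), sigma=6)  # smooth field
    psi *= 1.0e4                                # scale to a plausible magnitude

    dpsi_dy, dpsi_dx = np.gradient(psi, dx)     # rows ~ y, columns ~ x
    u, v = -dpsi_dy, dpsi_dx                    # u = -dψ/dy, v = dψ/dx

    # By construction the flow is (discretely) non-divergent:
    div = np.gradient(u, dx, axis=1) + np.gradient(v, dx, axis=0)
    print(abs(div).max())                       # ~ floating-point zero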
Learning-based stochastic object models for use in optimizing imaging systems
NASA Astrophysics Data System (ADS)
Dolly, Steven R.; Anastasio, Mark A.; Yu, Lifeng; Li, Hua
2017-03-01
It is widely known that the optimization of imaging systems based on objective, or task-based, measures of image quality via computer simulation requires use of a stochastic object model (SOM). However, the development of computationally tractable SOMs that can accurately model the statistical variations in anatomy within a specified ensemble of patients remains a challenging task. Because they are established by use of image data corresponding to a single patient, previously reported numerical anatomical models lack the ability to accurately model inter-patient variations in anatomy. In certain applications, however, databases of high-quality volumetric images are available that can facilitate this task. In this work, a novel and tractable methodology for learning a SOM from a set of volumetric training images is developed. The proposed method is based upon geometric attribute distribution (GAD) models, which characterize the inter-structural centroid variations and the intra-structural shape variations of each individual anatomical structure. The GAD models are scalable and deformable, and constrained by their respective principal attribute variations learned from training data. By use of the GAD models, random organ shapes and positions can be generated and integrated to form an anatomical phantom. The randomness in organ shape and position will reflect the variability of anatomy present in the training data. To demonstrate the methodology, a SOM corresponding to the pelvis of an adult male was computed and a corresponding ensemble of phantoms was created. Additionally, computer-simulated X-ray projection images corresponding to the phantoms were computed, from which tomographic images were reconstructed.
Clan-structured migration and phenotypic differentiation in the Jirels of Nepal.
Williams-Blangero, S
1989-04-01
This paper examines the impact of clan-structured migration on the between-village differentiation of the Jirels, a tribal population of eastern Nepal. The Jirel population is geographically restricted to nine villages, all of which were sampled to some extent for this study. Data on five head measurements, stature, and digital ridge counts are utilized to illustrate the patterns of phenotypic variation. Multivariate statistical techniques are used to assess the extent to which clan membership and associated patterns of marital exchange influence the population structure of the Jirels. The phenotypic characteristics of randomly generated migrant sets are compared to those of the observed clan-structured sets, demonstrating the clan-related phenotypic nonrandomness of migrants. The results indicate that clan-structured migration may significantly influence the amount of between-village variation. Clan structure may be a significant factor in determining patterns of variation and should not be ignored in studies of microdifferentiation in tribal populations.
Theory of inhomogeneous quantum systems. III. Variational wave functions for Fermi fluids
NASA Astrophysics Data System (ADS)
Krotscheck, E.
1985-04-01
We develop a general variational theory for inhomogeneous Fermi systems such as the electron gas in a metal surface, the surface of liquid 3He, or simple models of heavy nuclei. The ground-state wave function is expressed in terms of two-body correlations, a one-body attenuation factor, and a model-system Slater determinant. Massive partial summations of cluster expansions are performed by means of Born-Green-Yvon and hypernetted-chain techniques. An optimal single-particle basis is generated by a generalized Hartree-Fock equation in which the two-body correlations screen the bare interparticle interaction. The optimization of the pair correlations leads to a state-averaged random-phase-approximation equation and a strictly microscopic determination of the particle-hole interaction.
Winter Simulation Conference, Miami Beach, Fla., December 4-6, 1978, Proceedings. Volumes 1 & 2
NASA Technical Reports Server (NTRS)
Highland, H. J. (Editor); Nielsen, N. R.; Hull, L. G.
1978-01-01
The papers report on the various aspects of simulation such as random variate generation, simulation optimization, ranking and selection of alternatives, model management, documentation, data bases, and instructional methods. Simulation studies in a wide variety of fields are described, including system design and scheduling, government and social systems, agriculture, computer systems, the military, transportation, corporate planning, ecosystems, health care, manufacturing and industrial systems, computer networks, education, energy, production planning and control, financial models, behavioral models, information systems, and inventory control.
Turbulent, Extreme Multi-zone Model for Simulating Flux and Polarization Variability in Blazars
NASA Astrophysics Data System (ADS)
Marscher, Alan P.
2014-01-01
The author presents a model for variability of the flux and polarization of blazars in which turbulent plasma flowing at a relativistic speed down a jet crosses a standing conical shock. The shock compresses the plasma and accelerates electrons to energies up to γ_max ≳ 10^4 times their rest-mass energy, with the value of γ_max determined by the direction of the magnetic field relative to the shock front. The turbulence is approximated in a computer code as many cells, each with a uniform magnetic field whose direction is selected randomly. The density of high-energy electrons in the plasma changes randomly with time in a manner consistent with the power spectral density of flux variations derived from observations of blazars. The variations in flux and polarization are therefore caused by continuous noise processes rather than by singular events such as explosive injection of energy at the base of the jet. Sample simulations illustrate the behavior of flux and linear polarization versus time that such a model produces. The variations in γ-ray flux generated by the code are often, but not always, correlated with those at lower frequencies, and many of the flares are sharply peaked. The mean degree of polarization of synchrotron radiation is higher and its timescale of variability shorter toward higher frequencies, while the polarization electric vector sometimes randomly executes apparent rotations. The slope of the spectral energy distribution exhibits sharper breaks than can arise solely from energy losses. All of these results correspond to properties observed in blazars.
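The many-cell averaging is simple to sketch: give each turbulent cell a uniform field of random orientation and sum the cells' Stokes parameters. The mean polarization degree then falls off roughly as p_max/√N, consistent with the frequency dependence noted above if fewer cells contribute at higher frequencies; p_max ≈ 0.75 is the usual maximum for a power-law electron spectrum, and equal-flux cells are an assumption.

    import numpy as np

    rng = np.random.default_rng(9)
    p_max = 0.75

    def polarization(n_cells, n_draws=2000):
        chi = rng.uniform(0.0, np.pi, (n_draws, n_cells))  # random field angles
        Q = (p_max * np.cos(2 * chi)).sum(axis=1)          # equal-flux cells
        U = (p_max * np.sin(2 * chi)).sum(axis=1)
        return np.hypot(Q, U).mean() / n_cells             # mean degree

    for n in (10, 100, 1000):
        print(n, polarization(n), p_max / np.sqrt(n))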
Chen, Carla Chia-Ming; Schwender, Holger; Keith, Jonathan; Nunkesser, Robin; Mengersen, Kerrie; Macrossan, Paula
2011-01-01
Due to advancements in computational ability, enhanced technology and a reduction in the price of genotyping, more data are being generated for understanding genetic associations with diseases and disorders. However, with the availability of large data sets comes the inherent challenges of new methods of statistical analysis and modeling. Considering a complex phenotype may be the effect of a combination of multiple loci, various statistical methods have been developed for identifying genetic epistasis effects. Among these methods, logic regression (LR) is an intriguing approach incorporating tree-like structures. Various methods have built on the original LR to improve different aspects of the model. In this study, we review four variations of LR, namely Logic Feature Selection, Monte Carlo Logic Regression, Genetic Programming for Association Studies, and Modified Logic Regression-Gene Expression Programming, and investigate the performance of each method using simulated and real genotype data. We contrast these with another tree-like approach, namely Random Forests, and a Bayesian logistic regression with stochastic search variable selection.
Fine-scale spatial genetic dynamics over the life cycle of the tropical tree Prunus africana.
Berens, D G; Braun, C; González-Martínez, S C; Griebeler, E M; Nathan, R; Böhning-Gaese, K
2014-11-01
Studying fine-scale spatial genetic patterns across life stages is a powerful approach to identify ecological processes acting within tree populations. We investigated spatial genetic dynamics across five life stages in the insect-pollinated and vertebrate-dispersed tropical tree Prunus africana in Kakamega Forest, Kenya. Using six highly polymorphic microsatellite loci, we assessed genetic diversity and spatial genetic structure (SGS) from seed rain and seedlings, and different sapling stages to adult trees. We found significant SGS in all stages, potentially caused by limited seed dispersal and high recruitment rates in areas with high light availability. SGS decreased from seed and early seedling stages to older juvenile stages. Interestingly, SGS was stronger in adults than in late juveniles. The initial decrease in SGS was probably driven by both random and non-random thinning of offspring clusters during recruitment. Intergenerational variation in SGS could have been driven by variation in gene flow processes, overlapping generations in the adult stage or local selection. Our study shows that complex sequential processes during recruitment contribute to SGS of tree populations.
A stochastic-geometric model of soil variation in Pleistocene patterned ground
NASA Astrophysics Data System (ADS)
Lark, Murray; Meerschman, Eef; Van Meirvenne, Marc
2013-04-01
In this paper we examine the spatial variability of soil in parent material with complex spatial structure which arises from complex non-linear geomorphic processes. We show that this variability can be better modelled by a stochastic-geometric model than by a standard Gaussian random field. The benefits of the new model are seen in the reproduction of features of the target variable which influence processes like water movement and pollutant dispersal. Complex non-linear processes in the soil give rise to properties with non-Gaussian distributions. Even under a transformation to approximate marginal normality, such variables may have a more complex spatial structure than the Gaussian random field model of geostatistics can accommodate. In particular the extent to which extreme values of the variable are connected in spatially coherent regions may be misrepresented. As a result, for example, geostatistical simulation generally fails to reproduce the pathways for preferential flow in an environment where coarse infill of former fluvial channels or coarse alluvium of braided streams creates pathways for rapid movement of water. Multiple point geostatistics has been developed to deal with this problem. Multiple point methods proceed by sampling from a set of training images which can be assumed to reproduce the non-Gaussian behaviour of the target variable. The challenge is to identify appropriate sources of such images. In this paper we consider a mode of soil variation in which the soil varies continuously, exhibiting short-range lateral trends induced by local effects of the factors of soil formation which vary across the region of interest in an unpredictable way. The trends in soil variation are therefore only apparent locally, and the soil variation at regional scale appears random. We propose a stochastic-geometric model for this mode of soil variation called the Continuous Local Trend (CLT) model. We consider a case study of soil formed in relict patterned ground with pronounced lateral textural variations arising from the presence of infilled ice-wedges of Pleistocene origin. We show how knowledge of the pedogenetic processes in this environment, along with some simple descriptive statistics, can be used to select and fit a CLT model for the apparent electrical conductivity (ECa) of the soil. We use the model to simulate realizations of the CLT process, and compare these with realizations of a fitted Gaussian random field. We show how statistics that summarize the spatial coherence of regions with small values of ECa, which are expected to have coarse texture and so larger saturated hydraulic conductivity, are better reproduced by the CLT model than by the Gaussian random field. This suggests that the CLT model could be used to generate an unlimited supply of training images to allow multiple point geostatistical simulation or prediction of this or similar variables.
Population and allelic variation of A-to-I RNA editing in human transcriptomes.
Park, Eddie; Guo, Jiguang; Shen, Shihao; Demirdjian, Levon; Wu, Ying Nian; Lin, Lan; Xing, Yi
2017-07-28
A-to-I RNA editing is an important step in RNA processing in which specific adenosines in some RNA molecules are post-transcriptionally modified to inosines. RNA editing has emerged as a widespread mechanism for generating transcriptome diversity. However, there remain significant knowledge gaps about the variation and function of RNA editing. In order to determine the influence of genetic variation on A-to-I RNA editing, we integrate genomic and transcriptomic data from 445 human lymphoblastoid cell lines by combining an RNA editing QTL (edQTL) analysis with an allele-specific RNA editing (ASED) analysis. We identify 1054 RNA editing events associated with cis genetic polymorphisms. Additionally, we find that a subset of these polymorphisms is linked to genome-wide association study signals of complex traits or diseases. Finally, compared to random cis polymorphisms, polymorphisms associated with RNA editing variation are located closer spatially to their respective editing sites and have a more pronounced impact on RNA secondary structure. Our study reveals widespread cis variation in RNA editing among genetically distinct individuals and sheds light on possible phenotypic consequences of such variation on complex traits and diseases.
Incompleteness and limit of security theory of quantum key distribution
NASA Astrophysics Data System (ADS)
Hirota, Osamu; Murakami, Dan; Kato, Kentaro; Futami, Fumio
2012-10-01
Many papers claim that the trace distance d guarantees universally composable security in quantum key distribution (QKD) protocols such as BB84. In this introductory paper, we first explain explicitly the main misconception in the claim of unconditional security for QKD theory. In general terms, the cause of the misunderstanding about the security claim is a lemma in the paper of Renner. It suggests that the generation of a perfect random key is assured with probability (1-d), and that its failure probability is d. It thus concludes that the generated key is a perfectly random sequence when the protocol succeeds, so that QKD would provide perfect secrecy for the one-time pad. This is the reason for the composability claim. However, the trace distance (or variational distance) is not the probability of such an event. If d is not small enough, the generated key sequence is never uniform. The evaluation of the trace distance therefore needs to be reconstructed if one wants to use it. One should first go back to indistinguishability theory in the computational-complexity setting and clarify the meaning of the value of the variational distance; the same analysis is also necessary for the information-theoretic case. The recent serial papers by H. P. Yuen have given the answer to such questions. In this paper, we give a more concise description of Yuen's theory, and clarify that the upper-bound theories for the trace distance by Tomamichel et al. and Hayashi et al. are built on Renner's flawed reasoning and are unsuitable as security analyses. Finally, we introduce a new macroscopic quantum communication scheme to replace qubit QKD.
Super-resolution photoacoustic microscopy using joint sparsity
NASA Astrophysics Data System (ADS)
Burgholzer, P.; Haltmeier, M.; Berer, T.; Leiss-Holzinger, E.; Murray, T. W.
2017-07-01
We present an imaging method that uses the random optical speckle patterns that naturally emerge as light propagates through strongly scattering media as a structured illumination source for photoacoustic imaging. Our approach, termed blind structured illumination photoacoustic microscopy (BSIPAM), was inspired by recent work in fluorescence microscopy where super-resolution imaging was demonstrated using multiple unknown speckle illumination patterns. We extend this concept to the multiple scattering domain using photoacoustics (PA), with the speckle pattern serving to generate ultrasound. The optical speckle pattern that emerges as light propagates through diffuse media provides structured illumination to an object placed behind a scattering wall. The photoacoustic signal produced by such illumination is detected using a focused ultrasound transducer. We demonstrate through both simulation and experiment that by acquiring multiple photoacoustic images, each produced by a different random and unknown speckle pattern, an image of an absorbing object can be reconstructed with a spatial resolution far exceeding that of the ultrasound transducer. We experimentally and numerically demonstrate a gain in resolution of more than a factor of two by using multiple speckle illuminations. The variations in the photoacoustic signals generated with random speckle patterns are utilized in BSIPAM using a novel reconstruction algorithm. Exploiting joint sparsity, this algorithm is capable of reconstructing the absorbing structure from measured PA signals with a resolution close to the speckle size. Another way to generate random excitation for photoacoustic imaging is to use small absorbing particles, including contrast agents, flowing through small vessels. In such a set-up, joint sparsity arises from the fact that all the particles move within the same vessels, so structured illumination is not necessary in that case.
On the Use of the Beta Distribution in Probabilistic Resource Assessments
Olea, R.A.
2011-01-01
The triangular distribution is a popular choice when it comes to modeling bounded continuous random variables. Its wide acceptance derives mostly from its simple analytic properties and the ease with which modelers can specify its three parameters through the extremes and the mode. On the negative side, hardly any real process follows a triangular distribution, which from the outset puts at a disadvantage any model employing triangular distributions. At a time when numerical techniques such as the Monte Carlo method are displacing analytic approaches in stochastic resource assessments, easy specification remains the most attractive characteristic of the triangular distribution. The beta distribution is another continuous distribution defined within a finite interval offering wider flexibility in style of variation, thus allowing consideration of models in which the random variables closely follow the observed or expected styles of variation. Despite its more complex definition, generation of values following a beta distribution is as straightforward as generating values following a triangular distribution, leaving the selection of parameters as the main impediment to practically considering beta distributions. This contribution intends to promote the acceptance of the beta distribution by explaining its properties and offering several suggestions to facilitate the specification of its two shape parameters. In general, given the same distributional parameters, use of the beta distribution in stochastic modeling may yield significantly different results, yet better estimates, than the triangular distribution. © 2011 International Association for Mathematical Geology (outside the USA).
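To make the comparison concrete, the following sketch draws variates from a triangular distribution and from a rescaled beta distribution over the same bounds. The bounds, mode, and beta shape parameters are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical bounds and mode for a bounded resource variable;
# none of these values come from the paper
a, b, mode = 10.0, 100.0, 30.0

# Triangular variates: fully specified by minimum, mode, and maximum
tri = rng.triangular(a, mode, b, size=100_000)

# Beta variates rescaled to [a, b]; the shape parameters are
# illustrative choices controlling the style of variation
alpha, beta = 2.0, 5.0
bet = a + (b - a) * rng.beta(alpha, beta, size=100_000)

print(tri.mean(), bet.mean())
```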
Norris, Steven J.
2015-01-01
Spirochetes that cause Lyme borreliosis (also called Lyme disease) possess the vls locus, encoding an elaborate antigenic variation system. This locus contains the expression site vlsE as well as a contiguous array of vls silent cassettes, which contain variations of the central cassette region of vlsE. The locus is present on one of the many linear plasmids in the organism, e.g. plasmid lp28-1 in the strain B. burgdorferi B31. Changes in the sequence of vlsE occur continuously during mammalian infection and consist of random, segmental, unidirectional recombination events between the silent cassettes and the cassette region of vlsE. These gene conversion events do not occur during in vitro culture or the tick portion of the infection cycle of Borrelia burgdorferi or the other related Borrelia species that cause Lyme disease. The mechanism of recombination is largely unknown, but requires the RuvAB Holliday junction branch migrase. Other features of the vls locus also appear to be required, including cis locations of vlsE and the silent cassettes, high G+C content, and GC skew. The vls system is required for long-term survival of Lyme Borrelia in infected mammals and represents an important mechanism of immune evasion. In addition to sequence variation, immune selection also results in significant heterogeneity in the sequence of the surface lipoprotein VlsE. Despite antigenic variation, VlsE generates a robust antibody response, and both full-length VlsE and the C6 peptide (corresponding to invariant region 6) are widely used in immunodiagnostic tests for Lyme disease. PMID:26104445
Bor, Jacob; Geldsetzer, Pascal; Venkataramani, Atheendar; Bärnighausen, Till
2015-01-01
Purpose of review: Randomized, population-representative trials of clinical interventions are rare. Quasi-experiments have been used successfully to generate causal evidence on the cascade of HIV care in a broad range of real-world settings. Recent findings: Quasi-experiments exploit exogenous, or quasi-random, variation occurring naturally in the world or because of an administrative rule or policy change to estimate causal effects. Well designed quasi-experiments have greater internal validity than typical observational research designs. At the same time, quasi-experiments may also have potential for greater external validity than experiments and can be implemented when randomized clinical trials are infeasible or unethical. Quasi-experimental studies have established the causal effects of HIV testing and initiation of antiretroviral therapy on health, economic outcomes and sexual behaviors, as well as indirect effects on other community members. Recent quasi-experiments have evaluated specific interventions to improve patient performance in the cascade of care, providing causal evidence to optimize clinical management of HIV. Summary: Quasi-experiments have generated important data on the real-world impacts of HIV testing and treatment and on interventions to improve the cascade of care. With the growth in large-scale clinical and administrative data, quasi-experiments enable rigorous evaluation of policies implemented in real-world settings. PMID:26371463
NASA Technical Reports Server (NTRS)
Englander, Jacob; Englander, Arnold
2014-01-01
Trajectory optimization methods using monotonic basin hopping (MBH) have become well developed during the past decade. An essential component of MBH is a controlled random search through the multi-dimensional space of possible solutions. Historically, the randomness has been generated by drawing random variables (RVs) from a uniform probability distribution. Here, we investigate generating the randomness by drawing the RVs from Cauchy and Pareto distributions, chosen because of their characteristic long tails. We demonstrate that using Cauchy distributions (as first suggested by Englander) significantly improves MBH performance, and that Pareto distributions provide even greater improvements. Improved performance is defined in terms of efficiency and robustness, where efficiency is finding better solutions in less time, and robustness is efficiency that is undiminished by (a) the boundary conditions and internal constraints of the optimization problem being solved and (b) variations in the parameters of the probability distribution. Robustness is important for achieving performance improvements that are not problem specific. In this work we show that the performance improvements result from how these long-tailed distributions enable MBH to search the solution space faster and more thoroughly. In developing this explanation, we use the concepts of sub-diffusive, normally-diffusive, and super-diffusive random walks originally developed in the field of statistical physics.
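A minimal sketch of the perturbation step described above, assuming a 5-dimensional decision space and an arbitrary step scale; only the choice of uniform, Cauchy, and Pareto draws comes from the abstract. The long tails of the Cauchy and Pareto draws occasionally produce large jumps, which is what lets a basin-hopping search escape local basins more often than uniform perturbations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative perturbation draws for one basin-hopping step in a
# 5-dimensional decision space; the step scale is an assumption
dim, scale = 5, 0.1

uniform_step = rng.uniform(-1.0, 1.0, size=dim) * scale
cauchy_step = rng.standard_cauchy(size=dim) * scale        # long tails
pareto_step = (rng.pareto(1.5, size=dim) + 1.0) * scale    # heavy right tail
pareto_step *= rng.choice([-1.0, 1.0], size=dim)           # symmetrize

print(uniform_step, cauchy_step, pareto_step, sep="\n")
```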
Ali, S. M.; Mehmood, C. A; Khan, B.; Jawad, M.; Farid, U; Jadoon, J. K.; Ali, M.; Tareen, N. K.; Usman, S.; Majid, M.; Anwar, S. M.
2016-01-01
In the smart grid paradigm, consumer demands are random and time-dependent, tending towards stochastic behavior. The stochastically varying consumer demands have put policy makers and supplying agencies in a demanding position for optimal generation management. The utility revenue functions are highly dependent on deterministic and stochastic models of consumer demand. Sudden drifts in weather parameters affect the living standards of the consumers, which in turn influence the power demands. Considering the above, we analyzed stochastically and statistically the effect of random consumer demands on the fixed and variable revenues of electrical utilities. Our work presents a Multi-Variate Gaussian Distribution Function (MVGDF) probabilistic model of the utility revenues with time-dependent random consumer demands. Moreover, the Gaussian probability outcomes for the utility revenues are based on the varying consumer demand data patterns. Furthermore, standard Monte Carlo (SMC) simulations are performed to validate the accuracy of this probabilistic demand-revenue model. We critically analyzed the effect of weather parameters on consumer demands using correlation and multiple linear regression schemes. The statistical analysis of consumer demands provides a relationship between the dependent variable (demand) and independent variables (weather data) for utility load management, generation control, and network expansion. PMID:27314229
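The following is a minimal sketch of the kind of standard Monte Carlo check the abstract describes: demands are drawn from a multivariate Gaussian and mapped to revenue. The demand means, covariance, and tariff are invented toy values, not figures from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented toy parameters: mean demand (MW) in three daily periods and
# a covariance capturing correlated demand fluctuations
mu = np.array([50.0, 80.0, 65.0])
cov = np.array([[25.0, 10.0,  5.0],
                [10.0, 36.0, 12.0],
                [ 5.0, 12.0, 30.0]])
tariff = 40.0  # assumed flat price per MWh

demand = rng.multivariate_normal(mu, cov, size=10_000)  # SMC draws
revenue = demand.sum(axis=1) * tariff
print(revenue.mean(), revenue.std())
```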
Schadt, Eric E.; Banerjee, Onureena; Fang, Gang; Feng, Zhixing; Wong, Wing H.; Zhang, Xuegong; Kislyuk, Andrey; Clark, Tyson A.; Luong, Khai; Keren-Paz, Alona; Chess, Andrew; Kumar, Vipin; Chen-Plotkin, Alice; Sondheimer, Neal; Korlach, Jonas; Kasarskis, Andrew
2013-01-01
Current generation DNA sequencing instruments are moving closer to seamlessly sequencing genomes of entire populations as a routine part of scientific investigation. However, while significant inroads have been made identifying small nucleotide variation and structural variations in DNA that impact phenotypes of interest, progress has not been as dramatic regarding epigenetic changes and base-level damage to DNA, largely due to technological limitations in assaying all known and unknown types of modifications at genome scale. Recently, single-molecule real time (SMRT) sequencing has been reported to identify kinetic variation (KV) events that have been demonstrated to reflect epigenetic changes of every known type, providing a path forward for detecting base modifications as a routine part of sequencing. However, to date no statistical framework has been proposed to enhance the power to detect these events while also controlling for false-positive events. By modeling enzyme kinetics in the neighborhood of an arbitrary location in a genomic region of interest as a conditional random field, we provide a statistical framework for incorporating kinetic information at a test position of interest as well as at neighboring sites that help enhance the power to detect KV events. The performance of this and related models is explored, with the best-performing model applied to plasmid DNA isolated from Escherichia coli and mitochondrial DNA isolated from human brain tissue. We highlight widespread kinetic variation events, some of which strongly associate with known modification events, while others represent putative chemically modified sites of unknown types. PMID:23093720
Generating synthetic wave climates for coastal modelling: a linear mixed modelling approach
NASA Astrophysics Data System (ADS)
Thomas, C.; Lark, R. M.
2013-12-01
Numerical coastline morphological evolution models require wave climate properties to drive morphological change through time. Wave climate properties (typically wave height, period and direction) may be temporally fixed, culled from real wave buoy data, or allowed to vary in some way defined by a Gaussian or other pdf. However, to examine the sensitivity of coastline morphologies to wave climate change, it seems desirable to be able to modify wave climate time series from a current to some new state along a trajectory, but in a way consistent with, or initially conditioned by, the properties of existing data, or to generate fully synthetic data sets with realistic time series properties. For example, mean or significant wave height time series may have underlying periodicities, as revealed in numerous analyses of wave data. Our motivation is to develop a simple methodology to generate synthetic wave climate time series that can change in some stochastic way through time. We wish to use such time series in a coastline evolution model to test the sensitivity of coastal landforms to changes in wave climate over decadal and centennial scales. We have worked initially on time series of significant wave height, based on data from a Waverider III buoy located off the coast of Yorkshire, England. The statistical framework for the simulation is the linear mixed model. The target variable, perhaps after transformation (Box-Cox), is modelled as multivariate Gaussian, with the mean a function of a fixed effect, plus two random components, one of which is independently and identically distributed (iid) and the second of which is temporally correlated. The model was fitted to the data by likelihood methods. We considered the option of a periodic mean, the period either fixed (e.g. at 12 months) or estimated from the data. We considered two possible correlation structures for the second random effect. In one, the correlation decays exponentially with time. In the second (spherical) model, it cuts off at a temporal range. Having fitted the model, we generated multiple realisations; the random effects were simulated by specifying a covariance matrix for the simulated values, with the estimated parameters. The Cholesky factorisation of the covariance matrix was computed, and realizations of the random component of the model were generated by pre-multiplying a vector of iid standard Gaussian variables by the lower triangular factor. The resulting random variate was added to the mean value computed from the fixed effects, and the result back-transformed to the original scale of the measurement. Realistic simulations result from the approach described above. Background exploratory data analysis was undertaken on 20-day sets of 30-minute buoy data, selected from days 5-24 of January, April, July, and October 2011, to elucidate daily to weekly variations and to keep the numerical analysis computationally tractable. Work remains to be undertaken to develop suitable models for synthetic directional data. We suggest that the general principles of the method will have applications in other geomorphological modelling endeavours requiring time series of stochastically variable environmental parameters.
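The simulation recipe in the abstract (correlated random effect via Cholesky factorisation, added to a periodic fixed-effect mean, then back-transformed) can be sketched as follows. All parameter values are assumptions, and a log transform stands in for the Box-Cox transform.

```python
import numpy as np

rng = np.random.default_rng(7)

n = 48 * 5            # assumed: five days of 30-minute time steps
t = np.arange(n)

# Assumed covariance: exponentially decaying temporal correlation
# for the correlated random component, plus an iid (nugget) component
corr_range, sill, nugget = 12.0, 0.4, 0.1
cov = sill * np.exp(-np.abs(t[:, None] - t[None, :]) / corr_range)
cov += nugget * np.eye(n)

# Assumed periodic fixed-effect mean on the transformed scale
mean = 1.0 + 0.3 * np.sin(2.0 * np.pi * t / 48.0)

# Pre-multiply iid standard Gaussians by the lower Cholesky factor,
# add the mean, and back-transform (log stands in for Box-Cox)
chol = np.linalg.cholesky(cov)
hs = np.exp(mean + chol @ rng.standard_normal(n))
```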
Hunt, Tam
2015-01-01
"There is a singular unifying reality underlying every biologic interaction on our planet. In immunology, that which does not kill you makes you different." -William B. Miller, Jr. We are experiencing a revolution in our understanding of inner space on a par with our exponentially increasing understanding of outer space. In biology, we are learning that the genetic and epigenetic complexity within organisms is far deeper than suspected. This is a key theme in William B. Miller Jr.'s book, The Microcosm Within: Evolution and Extinction in the Hologenome. We are learning also that a focus on the human genome alone is misleading when it comes to who we really are as biological entities, and in terms of how we and other creatures have evolved. Rather than being defined by the human genome alone, we are instead defined by the "hologenome," the sum of the human genome and the far larger genetic endowment of the microbiome and symbiotic communities that reside within and around us. Miller is a medical doctor previously in private practice in Pennsylvania and Phoenix, Arizona. This book is his first foray into evolutionary theory. His book could have been titled "The Origin of Variation" because this is his primary focus. He accepts that natural selection plays a role in evolution, but he demotes this mechanism to a less important role than the Modern Synthesis suggests. His main gripe, however, concerns random variation. He argues that random variation is unable to explain the origin and evolution of biological forms that we see in the world around us and in the historical record. Miller suggests that, rather than random variation as the engine of novelty, there is a creative impulse at the heart of cellular life, and even at the level of the genetic aggregate, that generates novelty on a regular basis. I probe this assertion in the interview below. He also highlights the strong role of "exogenous genetic assault" in variation and in his immunological model of evolution.
Funk, Christopher S; Cohen, K Bretonnel; Hunter, Lawrence E; Verspoor, Karin M
2016-09-09
Gene Ontology (GO) terms represent the standard for annotation and representation of molecular functions, biological processes and cellular compartments, but a large gap exists between the way concepts are represented in the ontology and how they are expressed in natural language text. The construction of highly specific GO terms is formulaic, consisting of parts and pieces from more simple terms. We present two different types of manually generated rules to help capture the variation of how GO terms can appear in natural language text. The first set of rules takes into account the compositional nature of GO and recursively decomposes the terms into their smallest constituent parts. The second set of rules generates derivational variations of these smaller terms and compositionally combines all generated variants to form the original term. By applying both types of rules, new synonyms are generated for two-thirds of all GO terms and an increase in F-measure performance for recognition of GO on the CRAFT corpus from 0.498 to 0.636 is observed. Additionally, we evaluated the combination of both types of rules over one million full text documents from Elsevier; manual validation and error analysis show we are able to recognize GO concepts with reasonable accuracy (88 %) based on random sampling of annotations. In this work we present a set of simple synonym generation rules that utilize the highly compositional and formulaic nature of the Gene Ontology concepts. We illustrate how the generated synonyms aid in improving recognition of GO concepts on two different biomedical corpora. We discuss other applications of our rules for GO ontology quality assurance, explore the issue of overgeneration, and provide examples of how similar methodologies could be applied to other biomedical terminologies. Additionally, we provide all generated synonyms for use by the text-mining community.
How MAP kinase modules function as robust, yet adaptable, circuits.
Tian, Tianhai; Harding, Angus
2014-01-01
Genetic and biochemical studies have revealed that the diversity of cell types and developmental patterns evident within the animal kingdom is generated by a handful of conserved, core modules. Core biological modules must be robust, able to maintain functionality despite perturbations, and yet sufficiently adaptable for random mutations to generate phenotypic variation during evolution. Understanding how robust, adaptable modules have influenced the evolution of eukaryotes will inform both evolutionary and synthetic biology. One such system is the MAP kinase module, which consists of a 3-tiered kinase circuit configuration that has been evolutionarily conserved from yeast to man. MAP kinase signal transduction pathways are used across eukaryotic phyla to drive biological functions that are crucial for life. Here we ask the fundamental question, why do MAPK modules follow a conserved 3-tiered topology rather than some other number? Using computational simulations, we identify a fundamental 2-tiered circuit topology that can be readily reconfigured by feedback loops and scaffolds to generate diverse signal outputs. When this 2-kinase circuit is connected to proximal input kinases, a 3-tiered modular configuration is created that is both robust and adaptable, providing a biological circuit that can regulate multiple phenotypes and maintain functionality in an uncertain world. We propose that the 3-tiered signal transduction module has been conserved through positive selection, because it facilitated the generation of phenotypic variation during eukaryotic evolution.
Self-correcting random number generator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humble, Travis S.; Pooser, Raphael C.
2016-09-06
A system and method for generating random numbers. The system may include a random number generator (RNG), such as a quantum random number generator (QRNG), configured to self-correct or adapt in order to substantially achieve randomness from the output of the RNG. By adapting, the RNG may generate a random number that may be considered random regardless of whether the random number itself is tested as such. As an example, the RNG may include components to monitor one or more characteristics of the RNG during operation, and may use the monitored characteristics as a basis for adapting, or self-correcting, to provide a random number according to one or more performance criteria.
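The patent abstract does not disclose the correction algorithm, but the idea of monitoring a characteristic and correcting the output can be illustrated with a classic von Neumann corrector applied to a deliberately biased source; the bias value and source are assumptions standing in for physical RNG output.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed biased raw source standing in for a physical RNG output
raw = (rng.random(100_000) < 0.6).astype(np.uint8)  # P(1) = 0.6

# Monitor: estimate the bias of the raw stream
print("raw mean:", raw.mean())

# Von Neumann corrector: take non-overlapping pairs, keep the first
# bit of each (0,1) or (1,0) pair, and discard (0,0) and (1,1) pairs
pairs = raw[: len(raw) // 2 * 2].reshape(-1, 2)
keep = pairs[:, 0] != pairs[:, 1]
corrected = pairs[keep, 0]
print("corrected mean:", corrected.mean())  # ≈ 0.5 for an iid source
```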
Damage Propagation Modeling for Aircraft Engine Prognostics
NASA Technical Reports Server (NTRS)
Saxena, Abhinav; Goebel, Kai; Simon, Don; Eklund, Neil
2008-01-01
This paper describes how damage propagation can be modeled within the modules of aircraft gas turbine engines. To that end, response surfaces of all sensors are generated via a thermo-dynamical simulation model for the engine as a function of variations of flow and efficiency of the modules of interest. An exponential rate of change for flow and efficiency loss was imposed for each data set, starting at a randomly chosen initial deterioration set point. The rate of change of the flow and efficiency denotes an otherwise unspecified fault with increasingly worsening effect. The rates of change of the faults were constrained to an upper threshold but were otherwise chosen randomly. Damage propagation was allowed to continue until a failure criterion was reached. A health index was defined as the minimum of several superimposed operational margins at any given time instant and the failure criterion is reached when health index reaches zero. Output of the model was the time series (cycles) of sensed measurements typically available from aircraft gas turbine engines. The data generated were used as challenge data for the Prognostics and Health Management (PHM) data competition at PHM 08.
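A toy version of this data-generation scheme might look as follows; the cycle count, deterioration onset, rate bounds, and margin definitions are all invented stand-ins for the paper's engine model.

```python
import numpy as np

rng = np.random.default_rng(4)

n_cycles = 500
t = np.arange(n_cycles)

t0 = rng.integers(20, 120)        # randomly chosen deterioration onset
rate = rng.uniform(0.005, 0.02)   # random rate, capped by the bounds above

# Exponential flow/efficiency loss beginning at the onset cycle
loss = np.where(t < t0, 0.0, np.exp(rate * (t - t0)) - 1.0)

# Health index: minimum of several superimposed operational margins
# (toy linear margins in the imposed loss)
margins = np.stack([1.0 - 3.0 * loss, 1.2 - 4.0 * loss, 1.1 - 2.5 * loss])
health = margins.min(axis=0)

failure_cycle = int(np.argmax(health <= 0))  # first cycle with zero health
print(failure_cycle)
```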
Ponzi, Adam; Wickens, Jeff
2010-04-28
The striatum is composed of GABAergic medium spiny neurons with inhibitory collaterals forming a sparse random asymmetric network and receiving an excitatory glutamatergic cortical projection. Because the inhibitory collaterals are sparse and weak, their role in striatal network dynamics is puzzling. However, here we show by simulation of a striatal inhibitory network model composed of spiking neurons that cells form assemblies that fire in sequential coherent episodes and display complex identity-temporal spiking patterns even when cortical excitation is simply constant or fluctuating noisily. Strongly correlated large-scale firing rate fluctuations on slow behaviorally relevant timescales of hundreds of milliseconds are shown by members of the same assembly whereas members of different assemblies show strong negative correlation, and we show how randomly connected spiking networks can generate this activity. Cells display highly irregular spiking with high coefficients of variation, broadly distributed low firing rates, and interspike interval distributions that are consistent with exponentially tailed power laws. Although firing rates vary coherently on slow timescales, precise spiking synchronization is absent in general. Our model only requires the minimal but striatally realistic assumptions of sparse to intermediate random connectivity, weak inhibitory synapses, and sufficient cortical excitation so that some cells are depolarized above the firing threshold during up states. Our results are in good qualitative agreement with experimental studies, consistent with recently determined striatal anatomy and physiology, and support a new view of endogenously generated metastable state switching dynamics of the striatal network underlying its information processing operations.
Rate of de novo mutations and the importance of father's age to disease risk.
Kong, Augustine; Frigge, Michael L; Masson, Gisli; Besenbacher, Soren; Sulem, Patrick; Magnusson, Gisli; Gudjonsson, Sigurjon A; Sigurdsson, Asgeir; Jonasdottir, Aslaug; Jonasdottir, Adalbjorg; Wong, Wendy S W; Sigurdsson, Gunnar; Walters, G Bragi; Steinberg, Stacy; Helgason, Hannes; Thorleifsson, Gudmar; Gudbjartsson, Daniel F; Helgason, Agnar; Magnusson, Olafur Th; Thorsteinsdottir, Unnur; Stefansson, Kari
2012-08-23
Mutations generate sequence diversity and provide a substrate for selection. The rate of de novo mutations is therefore of major importance to evolution. Here we conduct a study of genome-wide mutation rates by sequencing the entire genomes of 78 Icelandic parent-offspring trios at high coverage. We show that in our samples, with an average father's age of 29.7, the average de novo mutation rate is 1.20 × 10(-8) per nucleotide per generation. Most notably, the diversity in mutation rate of single nucleotide polymorphisms is dominated by the age of the father at conception of the child. The effect is an increase of about two mutations per year. An exponential model estimates paternal mutations doubling every 16.5 years. After accounting for random Poisson variation, father's age is estimated to explain nearly all of the remaining variation in the de novo mutation counts. These observations shed light on the importance of the father's age on the risk of diseases such as schizophrenia and autism.
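As a rough numerical illustration of the quoted figures, the sketch below grows the expected paternal mutation count exponentially with the 16.5-year doubling time and adds random Poisson variation. The reference count at the mean paternal age is an assumed value, not taken from the paper, and the exact model form is an assumption.

```python
import numpy as np

rng = np.random.default_rng(5)

doubling_years = 16.5   # paternal doubling time quoted in the abstract
ref_age = 29.7          # mean father's age in the study
ref_paternal = 55.0     # ASSUMED paternal mutation count at the mean age

def expected_paternal(age):
    # Exponential growth consistent with the quoted doubling time
    return ref_paternal * 2.0 ** ((age - ref_age) / doubling_years)

for age in (20, 30, 40):
    lam = expected_paternal(age)
    counts = rng.poisson(lam, size=10_000)  # random Poisson variation
    print(age, round(lam, 1), counts.mean(), counts.var())
```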
Perceived Surface Slant Is Systematically Biased in the Actively-Generated Optic Flow
Fantoni, Carlo; Caudek, Corrado; Domini, Fulvio
2012-01-01
Humans make systematic errors in the 3D interpretation of the optic flow in both passive and active vision. These systematic distortions can be predicted by a biologically-inspired model which disregards self-motion information resulting from head movements (Caudek, Fantoni, & Domini 2011). Here, we tested two predictions of this model: (1) A plane that is stationary in an earth-fixed reference frame will be perceived as changing its slant if the movement of the observer's head causes a variation of the optic flow; (2) a surface that rotates in an earth-fixed reference frame will be perceived to be stationary, if the surface rotation is appropriately yoked to the head movement so as to generate a variation of the surface slant but not of the optic flow. Both predictions were corroborated by two experiments in which observers judged the perceived slant of a random-dot planar surface during egomotion. We found qualitatively similar biases for monocular and binocular viewing of the simulated surfaces, although, in principle, the simultaneous presence of disparity and motion cues allows for a veridical recovery of surface slant. PMID:22479473
Real-time fast physical random number generator with a photonic integrated circuit.
Ugajin, Kazusa; Terashima, Yuta; Iwakawa, Kento; Uchida, Atsushi; Harayama, Takahisa; Yoshimura, Kazuyuki; Inubushi, Masanobu
2017-03-20
Random number generators are essential for applications in information security and numerical simulations. Most optical-chaos-based random number generators produce random bit sequences by offline post-processing with large optical components. We demonstrate a real-time hardware implementation of a fast physical random number generator with a photonic integrated circuit and a field programmable gate array (FPGA) electronic board. We generate 1-Tbit random bit sequences and evaluate their statistical randomness using NIST Special Publication 800-22 and TestU01. All of the BigCrush tests in TestU01 are passed using 410-Gbit random bit sequences. A maximum real-time generation rate of 21.1 Gb/s is achieved for random bit sequences in binary format stored in a computer, which can be directly used for applications involving secret keys in cryptography and random seeds in large-scale numerical simulations.
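Statistical evaluation of such bitstreams can be illustrated with the simplest test in NIST SP 800-22, the frequency (monobit) test, sketched here on pseudorandom stand-in bits rather than the paper's physical bits.

```python
import math
import numpy as np

def monobit_pvalue(bits):
    # NIST SP 800-22 frequency (monobit) test: map bits to ±1, sum,
    # and compute the two-sided tail probability via erfc
    n = len(bits)
    s_obs = abs(2 * int(np.sum(bits)) - n) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

rng = np.random.default_rng(6)
bits = rng.integers(0, 2, size=1_000_000)
print(monobit_pvalue(bits))  # a sequence passes when p >= 0.01
```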
Acceleration techniques for dependability simulation. M.S. Thesis
NASA Technical Reports Server (NTRS)
Barnette, James David
1995-01-01
As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
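The random variate generation and statistics gathering mentioned above can be illustrated with a minimal discrete event loop: exponential interarrival times via inverse transform sampling, with a single counter as the gathered statistic. The arrival rate and horizon are arbitrary assumptions.

```python
import heapq
import numpy as np

rng = np.random.default_rng(8)

def exp_variate(rate):
    # Inverse-transform generation of an exponential variate:
    # F^-1(u) = -ln(1 - u) / rate
    return -np.log(1.0 - rng.random()) / rate

# Minimal event loop: Poisson arrivals over a horizon, with a simple
# statistic (the arrival count) gathered along the way
rate, horizon = 0.5, 100.0
events = [(exp_variate(rate), "arrival")]
arrivals = 0
while events:
    clock, kind = heapq.heappop(events)
    if clock > horizon:
        break
    if kind == "arrival":
        arrivals += 1
        heapq.heappush(events, (clock + exp_variate(rate), "arrival"))

print(arrivals)  # on average ≈ rate * horizon = 50
```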
All-fiber pyroelectric nanogenerator
NASA Astrophysics Data System (ADS)
Ghosh, Sujoy Kumar; Xie, Mengying; Bowen, Christopher Rhys; Mandal, Dipankar
2018-04-01
An all-fiber pyroelectric nanogenerator (PyNG) is fabricated in which both the active pyroelectric component and the electrodes are composed of fibers. The pyroelectric component was made with randomly organized electrospun PVDF nanofibers possessing ferroelectric β- and γ-phases. The PyNG possesses a high level of sensitivity and can detect temperature fluctuations as low as 2 K. In addition, the thermal energy harvesting ability of the PyNG under several temperature variations and cycling frequencies paves the way for next-generation thermal sensors and self-powered flexible microelectronics.
Statistical analysis of multivariate atmospheric variables. [cloud cover
NASA Technical Reports Server (NTRS)
Tubbs, J. D.
1979-01-01
Topics covered include: (1) estimation in discrete multivariate distributions; (2) a procedure to predict cloud cover frequencies in the bivariate case; (3) a program to compute conditional bivariate normal parameters; (4) the transformation of nonnormal multivariate to near-normal; (5) test of fit for the extreme value distribution based upon the generalized minimum chi-square; (6) test of fit for continuous distributions based upon the generalized minimum chi-square; (7) effect of correlated observations on confidence sets based upon chi-square statistics; and (8) generation of random variates from specified distributions.
Learning-based stochastic object models for characterizing anatomical variations
NASA Astrophysics Data System (ADS)
Dolly, Steven R.; Lou, Yang; Anastasio, Mark A.; Li, Hua
2018-03-01
It is widely known that the optimization of imaging systems based on objective, task-based measures of image quality via computer-simulation requires the use of a stochastic object model (SOM). However, the development of computationally tractable SOMs that can accurately model the statistical variations in human anatomy within a specified ensemble of patients remains a challenging task. Previously reported numerical anatomic models lack the ability to accurately model inter-patient and inter-organ variations in human anatomy among a broad patient population, mainly because they are established on image data corresponding to a few patients and individual anatomic organs. This may introduce phantom-specific bias into computer-simulation studies, where the study result is heavily dependent on which phantom is used. In certain applications, however, databases of high-quality volumetric images and organ contours are available that can facilitate this SOM development. In this work, a novel and tractable methodology for learning a SOM and generating numerical phantoms from a set of volumetric training images is developed. The proposed methodology learns geometric attribute distributions (GADs) of human anatomic organs from a broad patient population, which characterize both centroid relationships between neighboring organs and anatomic shape similarity of individual organs among patients. By randomly sampling the learned centroid and shape GADs under the constraints of the respective principal attribute variations learned from the training data, an ensemble of stochastic objects can be created. The randomness in organ shape and position reflects the learned variability of human anatomy. To demonstrate the methodology, a SOM of an adult male pelvis is computed and examples of corresponding numerical phantoms are created.
Quantum random number generator
Pooser, Raphael C.
2016-05-10
A quantum random number generator (QRNG) and a photon generator for a QRNG are provided. The photon generator may be operated in a spontaneous mode below a lasing threshold to emit photons. Photons emitted from the photon generator may have at least one random characteristic, which may be monitored by the QRNG to generate a random number. In one embodiment, the photon generator may include a photon emitter and an amplifier coupled to the photon emitter. The amplifier may enable the photon generator to be used in the QRNG without introducing significant bias in the random number and may enable multiplexing of multiple random numbers. The amplifier may also desensitize the photon generator to fluctuations in power supplied thereto while operating in the spontaneous mode. In one embodiment, the photon emitter and amplifier may be a tapered diode amplifier.
Quantum random number generation
Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; ...
2016-06-28
Quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness -- coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness of devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between trustworthiness of the device and the random number generation speed.
A high-fidelity weather time series generator using the Markov Chain process on a piecewise level
NASA Astrophysics Data System (ADS)
Hersvik, K.; Endrerud, O.-E. V.
2017-12-01
A method is developed for generating a set of unique weather time-series based on an existing weather series. The method allows statistically valid weather variations to take place within repeated simulations of offshore operations. The numerous generated time series need to share the same statistical qualities as the original time series. Statistical qualities here refer mainly to the distribution of weather windows available for work, including durations and frequencies of such weather windows, and seasonal characteristics. The method is based on the Markov chain process. The core new development lies in how the Markov Process is used, specifically by joining small pieces of random length time series together rather than joining individual weather states, each from a single time step, which is a common solution found in the literature. This new Markov model shows favorable characteristics with respect to the requirements set forth and all aspects of the validation performed.
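The piecewise joining idea can be caricatured in a few lines: segments of random length are appended wherever the observed series takes a value close to the current endpoint. This is a simplified stand-in, since the paper's actual state definitions and transition probabilities are not given in the abstract; the toy series and all parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy "observed" significant wave height series standing in for buoy data
t = np.arange(2000)
hs = 1.5 + 0.5 * np.sin(t / 50.0) + 0.2 * rng.standard_normal(t.size)

def synthetic_series(obs, n_out, min_len=6, max_len=48, tol=0.1):
    # Join random-length pieces of the observed series; each new piece
    # starts near the value where the previous piece ended, a simplified
    # stand-in for the paper's Markov-chain state matching
    out = [float(obs[0])]
    while len(out) < n_out:
        cand = np.nonzero(np.abs(obs[:-max_len] - out[-1]) < tol)[0]
        if cand.size == 0:
            cand = np.arange(obs.size - max_len)
        start = int(rng.choice(cand))
        length = int(rng.integers(min_len, max_len + 1))
        out.extend(obs[start + 1 : start + 1 + length].tolist())
    return np.array(out[:n_out])

synth = synthetic_series(hs, 2000)
```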
NASA Astrophysics Data System (ADS)
Lahmiri, Salim
2016-05-01
Detrended fluctuation analysis (DFA) is used to examine long-range dependence in the variations and volatilities of American treasury bills (TB) during periods of low and high movements in TB rates. Volatility series are estimated by the generalized autoregressive conditional heteroskedasticity (GARCH) model under Gaussian, Student, and generalized error distribution (GED) assumptions. The DFA-based Hurst exponents from 3-month, 6-month, and 1-year TB data indicate that, in general, the dynamics of the TB variation process is characterized by persistence during the stable period (before the 2008 international financial crisis) and anti-persistence during the unstable period (after the 2008 international financial crisis). For the volatility series, it is found that, for the stable period, the 3-month volatility process is most likely random, the 6-month volatility process is anti-persistent, and the 1-year volatility process is persistent. For the unstable period, estimation results show that the generating process is persistent for all maturities and all distributional assumptions.
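For reference, a standard first-order DFA estimator, which is the generic form of the analysis rather than the authors' exact implementation, can be written compactly:

```python
import numpy as np

def dfa_hurst(x):
    # First-order DFA: integrate the mean-removed series, remove a linear
    # trend in windows of each scale, and regress log F(s) on log s.
    # The resulting exponent reads: > 0.5 persistent, < 0.5 anti-persistent.
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())
    scales = np.unique(np.logspace(1, np.log10(len(x) // 4), 12).astype(int))
    fluct = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        coef = np.polyfit(t, segs.T, 1)            # per-window linear fits
        trend = np.outer(coef[0], t) + coef[1][:, None]
        fluct.append(np.sqrt(np.mean((segs - trend) ** 2)))
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

rng = np.random.default_rng(10)
print(dfa_hurst(rng.standard_normal(4096)))  # ≈ 0.5 for white noise
```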
NASA Astrophysics Data System (ADS)
Sawada, Takuya; Takata, Hidehiro; Nii, Koji; Nagata, Makoto
2013-04-01
Static random access memory (SRAM) cores are susceptible to power supply voltage variation. False operation is investigated among SRAM cells under sinusoidal voltage variation on power lines introduced by direct RF power injection. A standard 16-kbyte SRAM core in a 90 nm, 1.5 V technology is diagnosed with built-in self-test and on-die noise monitor techniques. The bit error rate is shown to be highly sensitive to the frequency of the injected voltage variation, while it is not greatly influenced by differences in frequency and phase relative to the SRAM clocking. It is also observed that the distribution of false bits is substantially random across the cell array.
NASA Astrophysics Data System (ADS)
Lu, B.; Darmon, M.; Leymarie, N.; Chatillon, S.; Potel, C.
2012-05-01
In-service inspection of Sodium-Cooled Fast Reactors (SFR) requires the development of non-destructive techniques adapted to the harsh environment conditions and the examination complexity. From past experience, ultrasonic techniques are considered suitable candidates. Ultrasonic telemetry is a technique used to constantly ensure the safe functioning of reactor inner components by determining their exact position: it consists in measuring the time of flight of the ultrasonic response obtained after propagation of a pulse emitted by a transducer and its interaction with the targets. While in service, the sodium flow creates turbulence that leads to temperature inhomogeneities, which translate into ultrasonic velocity inhomogeneities. These velocity variations could directly impact the accuracy of target location by introducing time-of-flight variations. A stochastic simulation model has been developed to calculate the propagation of ultrasonic waves in such an inhomogeneous medium. In this approach, the travel time is randomly generated by a stochastic process whose inputs are the statistical moments of travel times, which are known analytically. The stochastic model predicts beam deviations due to velocity inhomogeneities similar to those provided by a deterministic method, such as the ray method.
NASA Astrophysics Data System (ADS)
Vodenicarevic, D.; Locatelli, N.; Mizrahi, A.; Friedman, J. S.; Vincent, A. F.; Romera, M.; Fukushima, A.; Yakushiji, K.; Kubota, H.; Yuasa, S.; Tiwari, S.; Grollier, J.; Querlioz, D.
2017-11-01
Low-energy random number generation is critical for many emerging computing schemes proposed to complement or replace von Neumann architectures. However, current random number generators are always associated with an energy cost that is prohibitive for these computing schemes. We introduce random number bit generation based on specific nanodevices: superparamagnetic tunnel junctions. We experimentally demonstrate high-quality random bit generation that represents an orders-of-magnitude improvement in energy efficiency over current solutions. We show that the random generation speed improves with nanodevice scaling, and we investigate the impact of temperature, magnetic field, and cross talk. Finally, we show how alternative computing schemes can be implemented using superparamagnetic tunnel junctions as random number generators. These results open the way for fabricating efficient hardware computing devices leveraging stochasticity, and they highlight an alternative use for emerging nanodevices.
Bühnemann, Claudia; Li, Simon; Yu, Haiyue; Branford White, Harriet; Schäfer, Karl L; Llombart-Bosch, Antonio; Machado, Isidro; Picci, Piero; Hogendoorn, Pancras C W; Athanasou, Nicholas A; Noble, J Alison; Hassan, A Bassim
2014-01-01
Driven by genomic somatic variation, tumour tissues are typically heterogeneous, yet unbiased quantitative methods are rarely used to analyse heterogeneity at the protein level. Motivated by this problem, we developed automated image segmentation of images of multiple biomarkers in Ewing sarcoma to generate distributions of biomarkers between and within tumour cells. We further integrate high dimensional data with patient clinical outcomes utilising random survival forest (RSF) machine learning. Using material from cohorts of genetically diagnosed Ewing sarcoma with EWSR1 chromosomal translocations, confocal images of tissue microarrays were segmented with level sets and watershed algorithms. Each cell nucleus and cytoplasm were identified in relation to DAPI and CD99, respectively, and protein biomarkers (e.g. Ki67, pS6, Foxo3a, EGR1, MAPK) localised relative to nuclear and cytoplasmic regions of each cell in order to generate image feature distributions. The image distribution features were analysed with RSF in relation to known overall patient survival from three separate cohorts (185 informative cases). Variation in pre-analytical processing resulted in elimination of a high number of non-informative images that had poor DAPI localisation or biomarker preservation (67 cases, 36%). The distribution of image features for biomarkers in the remaining high quality material (118 cases, 104 features per case) were analysed by RSF with feature selection, and performance assessed using internal cross-validation, rather than a separate validation cohort. A prognostic classifier for Ewing sarcoma with low cross-validation error rates (0.36) was comprised of multiple features, including the Ki67 proliferative marker and a sub-population of cells with low cytoplasmic/nuclear ratio of CD99. Through elimination of bias, the evaluation of high-dimensionality biomarker distribution within cell populations of a tumour using random forest analysis in quality controlled tumour material could be achieved. Such an automated and integrated methodology has potential application in the identification of prognostic classifiers based on tumour cell heterogeneity.
Haider, Nadia
2017-01-01
Investigation of genetic variation and phylogenetic relationships among date palm (Phoenix dactylifera L.) cultivars is useful for their conservation and genetic improvement. Various molecular markers such as restriction fragment length polymorphisms (RFLPs), simple sequence repeat (SSR), representational difference analysis (RDA), and amplified fragment length polymorphism (AFLP) have been developed to molecularly characterize date palm cultivars. PCR-based markers random amplified polymorphic DNA (RAPD) and inter-simple sequence repeat (ISSR) are powerful tools to determine the relatedness of date palm cultivars that are difficult to distinguish morphologically. In this chapter, the principles, materials, and methods of RAPD and ISSR techniques are presented. Analysis of data generated from these two techniques and the use of these data to reveal phylogenetic relationships among date palm cultivars are also discussed.
Balderson, M J; Brown, D W; Quirk, S; Ghasroddashti, E; Kirkby, C
2012-07-01
Clinical outcome studies with clear and objective endpoints are necessary to make informed radiotherapy treatment decisions. Commonly, clinical outcomes are established after lengthy and costly clinical trials are performed and the data are analyzed and published. One of the challenges with obtaining meaningful data from clinical trials is that by the time the information reaches the medical profession the results may be less clinically relevant than when the trial began. An alternative approach is to estimate clinical outcomes through patient population modeling. We are developing a mathematical tool that uses Monte Carlo techniques to simulate variations in planned and delivered dose distributions of prostate patients receiving radiotherapy. Ultimately, our simulation will calculate a distribution of Tumor Control Probabilities (TCPs) for a population of patients treated under a given protocol. Such distributions can serve as a metric for comparing different treatment modalities, planning and setup approaches, and machine parameter settings or tolerances with respect to outcomes on broad patient populations. It may also help researchers understand differences one might expect to find before actually doing the clinical trial. As a first step, and as the focus of this abstract, we wanted to see if we could answer the question: "Can a population of dose distributions of prostate patients be accurately modeled by a set of randomly generated Gaussian functions?" Our results demonstrate that a set of randomly generated Gaussian functions can simulate a population of prostate patient dose distributions. © 2012 American Association of Physicists in Medicine.
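For flavor, a minimal population Monte Carlo of this kind can be sketched with the standard Poisson TCP model; the clonogen number, radiosensitivity, and dose spread below are hypothetical placeholders, not the authors' values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parameters for illustration only
N0, ALPHA = 1e8, 0.25          # clonogen number, radiosensitivity (1/Gy)
D_PLAN, SIGMA = 78.0, 2.0      # planned dose and per-patient spread (Gy)

def tcp(dose):
    """Poisson TCP model: probability that no clonogen survives."""
    return np.exp(-N0 * np.exp(-ALPHA * dose))

# Population of delivered doses modelled as random Gaussian perturbations
doses = rng.normal(D_PLAN, SIGMA, size=10_000)
tcps = tcp(doses)
print(f"mean TCP = {tcps.mean():.3f}, 5th-95th pct = "
      f"{np.percentile(tcps, [5, 95]).round(3)}")
```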
Pseudo-Random Number Generator Based on Coupled Map Lattices
NASA Astrophysics Data System (ADS)
Lü, Huaping; Wang, Shihong; Hu, Gang
A one-way coupled chaotic map lattice is used for generating pseudo-random numbers. It is shown that with suitable cooperative applications of both chaotic and conventional approaches, the output of the spatiotemporally chaotic system can easily meet the practical requirements of random numbers, i.e., excellent random statistical properties, long periodicity of computer realizations, and fast speed of random number generation. This pseudo-random number generator system can be used as an ideal synchronous and self-synchronizing stream cipher system for secure communications.
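A toy version of a one-way coupled logistic-map lattice is easy to write down; the lattice size, coupling strength, and bit-extraction rule below are illustrative assumptions rather than the parameters of the paper.

```python
import numpy as np

def cml_bits(n_bits, sites=8, eps=0.95, seed=0.123456):
    """Toy one-way coupled logistic-map lattice; emits one bit per site
    per iteration from a low-order digit of each state variable."""
    f = lambda x: 4.0 * x * (1.0 - x)
    x = (seed + 0.017 * np.arange(sites)) % 1.0   # distinct initial states
    out = []
    while len(out) < n_bits:
        fx = f(x)
        x = (1.0 - eps) * fx + eps * np.roll(fx, 1)   # one-way coupling
        # keep a low-order digit of each site's state as the output bit
        out.extend(((x * 2**20).astype(np.int64) & 1).tolist())
    return np.array(out[:n_bits], dtype=np.uint8)

bits = cml_bits(10_000)
print("fraction of ones:", bits.mean())
```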
Generating random numbers by means of nonlinear dynamic systems
NASA Astrophysics Data System (ADS)
Zang, Jiaqi; Hu, Haojie; Zhong, Juhua; Luo, Duanbin; Fang, Yi
2018-07-01
To introduce the randomness of a physical process to students, a chaotic pendulum experiment was introduced at East China University of Science and Technology (ECUST) at the undergraduate level in the physics department. It was shown that chaotic motion could be initiated by adjusting the operation of the chaotic pendulum. By using the data of the angular displacements of the chaotic motion, random binary numerical arrays can be generated. To check the randomness of the generated numerical arrays, the NIST Special Publication 800-22 method was adopted. As a result, it was found that all the random arrays generated by the chaotic motion could pass the validity criteria, and some were even of better quality than pseudo-random numbers generated by a computer. Through the experiments, it is demonstrated that a chaotic pendulum can be used as an efficient mechanical facility for generating random numbers, and can be applied in teaching random motion to students.
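The randomness check mentioned here can be illustrated with the simplest test in the NIST suite, the frequency (monobit) test; the sketch below follows the textbook formula and is not tied to the authors' data.

```python
import math
import numpy as np

def monobit_test(bits):
    """NIST-style frequency (monobit) test: p-value for the hypothesis
    that ones and zeros are equally likely (cf. NIST SP 800-22)."""
    bits = np.asarray(bits)
    s_obs = abs(2.0 * bits.sum() - bits.size) / math.sqrt(bits.size)
    return math.erfc(s_obs / math.sqrt(2.0))

rng = np.random.default_rng(42)
print("p =", monobit_test(rng.integers(0, 2, 100_000)))  # should exceed 0.01
```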
Between-User Reliability of Tier 1 Exposure Assessment Tools Used Under REACH.
Lamb, Judith; Galea, Karen S; Miller, Brian G; Hesse, Susanne; Van Tongeren, Martie
2017-10-01
When applying simple screening (Tier 1) tools to estimate exposure to chemicals in a given exposure situation under the Registration, Evaluation, Authorisation and restriction of CHemicals Regulation 2006 (REACH), users must select from several possible input parameters. Previous studies have suggested that results from exposure assessments using expert judgement and from the use of modelling tools can vary considerably between assessors. This study aimed to investigate the between-user reliability of Tier 1 tools. A remote-completion exercise and an in-person workshop were used to identify and evaluate tool parameters and factors such as user demographics that may be potentially associated with between-user variability. Participants (N = 146) generated dermal and inhalation exposure estimates (N = 4066) from specified workplace descriptions ('exposure situations') and Tier 1 tool combinations (N = 20). Interactions between users, tools, and situations were investigated and described. Systematic variation associated with individual users was minor compared with random between-user variation. Although variation was observed between choices made for the majority of input parameters, differing choices of Process Category ('PROC') code/activity descriptor and dustiness level impacted most on the resultant exposure estimates. Exposure estimates ranging over several orders of magnitude were generated for the same exposure situation by different tool users. Such unpredictable between-user variation will reduce consistency within REACH processes and could result in underestimation or overestimation of exposure, risking worker ill-health or the implementation of unnecessary risk controls, respectively. Implementation of additional support and quality control systems for all tool users is needed to reduce between-assessor variation and so ensure both the protection of worker health and avoidance of unnecessary business risk management expenditure. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
Extracting random numbers from quantum tunnelling through a single diode.
Bernardo-Gavito, Ramón; Bagci, Ibrahim Ethem; Roberts, Jonathan; Sexton, James; Astbury, Benjamin; Shokeir, Hamzah; McGrath, Thomas; Noori, Yasir J; Woodhead, Christopher S; Missous, Mohamed; Roedig, Utz; Young, Robert J
2017-12-19
Random number generation is crucial in many aspects of everyday life, as online security and privacy depend ultimately on the quality of random numbers. Many current implementations are based on pseudo-random number generators, but information security requires true random numbers for sensitive applications like key generation in banking, defence or even social media. True random number generators are systems whose outputs cannot be determined, even if their internal structure and response history are known. Sources of quantum noise are thus ideal for this application due to their intrinsic uncertainty. In this work, we propose using resonant tunnelling diodes as practical true random number generators based on a quantum mechanical effect. The output of the proposed devices can be directly used as a random stream of bits or can be further distilled using randomness extraction algorithms, depending on the application.
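One common distillation step of the kind alluded to here is von Neumann debiasing, sketched below on a synthetic biased bit stream; the 70% ones-probability is an arbitrary stand-in for raw detector statistics.

```python
import numpy as np

def von_neumann_extract(bits):
    """Classic von Neumann debiasing: read raw bits in non-overlapping
    pairs, emit 0 for a 01 pair and 1 for a 10 pair, drop 00 and 11.
    Removes bias from independent but biased raw bits, at reduced rate."""
    bits = np.asarray(bits)[: len(bits) // 2 * 2]
    a, b = bits[0::2], bits[1::2]
    return a[a != b]          # first bit of each unequal pair

rng = np.random.default_rng(0)
raw = (rng.random(200_000) < 0.7).astype(np.uint8)   # biased source, P(1)=0.7
out = von_neumann_extract(raw)
print(f"raw bias {raw.mean():.3f} -> extracted bias {out.mean():.3f}")
```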
Generating and using truly random quantum states in Mathematica
NASA Astrophysics Data System (ADS)
Miszczak, Jarosław Adam
2012-01-01
The problem of generating random quantum states is of great interest from the quantum information theory point of view. In this paper we present a package for the Mathematica computing system harnessing a specific piece of hardware, namely the Quantis quantum random number generator (QRNG), for investigating statistical properties of quantum states. The described package implements a number of functions for generating random states, which use Quantis QRNG as a source of randomness. It also provides procedures which can be used in simulations not related directly to quantum information processing. Program summary Program title: TRQS Catalogue identifier: AEKA_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKA_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 7924 No. of bytes in distributed program, including test data, etc.: 88 651 Distribution format: tar.gz Programming language: Mathematica, C Computer: Requires a Quantis quantum random number generator (QRNG, http://www.idquantique.com/true-random-number-generator/products-overview.html) and a computer supporting a recent version of Mathematica Operating system: Any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit) RAM: Case dependent Classification: 4.15 Nature of problem: Generation of random density matrices. Solution method: Use of a physical quantum random number generator. Running time: Generating 100 random numbers takes about 1 second, generating 1000 random density matrices takes more than a minute.
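For readers without the Quantis hardware, the core operation the package performs, generating random density matrices, can be sketched with one standard construction (the Hilbert-Schmidt ensemble), with NumPy's pseudo-random generator standing in for the QRNG:

```python
import numpy as np

def random_density_matrix(d, rng=np.random.default_rng()):
    """Draw a density matrix from the Hilbert-Schmidt ensemble:
    rho = G G^dagger / Tr(G G^dagger) with G a complex Ginibre matrix.
    NumPy's PRNG stands in here for the hardware QRNG the package uses."""
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

rho = random_density_matrix(4)
# trace 1 and positive semidefinite, as required of a density matrix
print(np.trace(rho).real, np.linalg.eigvalsh(rho).min() >= 0)
```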
Two-Way Selection for Growth Rate in the Common Carp (CYPRINUS CARPIO L.)
Moav, R.; Wohlfarth, G.
1976-01-01
The domesticated European carp was subjected to a two-way selection for growth rate. Five generations of mass selection for faster growth rate did not yield any response, but subsequent selection between groups (families) resulted in considerable progress while maintaining a large genetic variance. Selection for slow growth rate yielded relatively strong response for the first three generations. Random-bred control lines suffered from strong inbreeding depression and when two lines were crossed, the F1 showed a high degree of heterosis. Selection was performed on pond-raised fish, but growth rate was also tested in cages. A strong pond-cage genetic interaction was found. A theoretical explanation was suggested involving overdominance for fast growth rate and amplification through competition of intra-group but not inter-group variation. PMID:1248737
Quantum random number generation for loophole-free Bell tests
NASA Astrophysics Data System (ADS)
Mitchell, Morgan; Abellan, Carlos; Amaya, Waldimar
2015-05-01
We describe the generation of quantum random numbers at multi-Gbps rates, combined with real-time randomness extraction, to give very high purity random numbers based on quantum events at most tens of ns in the past. The system satisfies the stringent requirements of quantum non-locality tests that aim to close the timing loophole. We describe the generation mechanism using spontaneous-emission-driven phase diffusion in a semiconductor laser, digitization, and extraction by parity calculation using multi-GHz logic chips. We pay special attention to experimental proof of the quality of the random numbers and analysis of the randomness extraction. In contrast to widely-used models of randomness generators in the computer science literature, we argue that the randomness generated by spontaneous emission can be extracted from a single source.
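Parity extraction of the sort described can be mimicked in a few lines; the block length and raw bias below are arbitrary illustrative choices, not the paper's hardware parameters.

```python
import numpy as np

def parity_extract(raw_bits, k=4):
    """Parity extraction: XOR non-overlapping blocks of k raw bits into one
    output bit. For independent bits with bias e (= P(1) - 1/2), the output
    bias falls as 2**(k-1) * e**k (piling-up lemma); a software analogue of
    parity calculation in fast logic."""
    raw_bits = np.asarray(raw_bits)[: len(raw_bits) // k * k]
    return np.bitwise_xor.reduce(raw_bits.reshape(-1, k), axis=1)

rng = np.random.default_rng(3)
raw = (rng.random(400_000) < 0.55).astype(np.uint8)   # 5% raw bias
print(abs(parity_extract(raw, k=4).mean() - 0.5))      # far smaller bias
```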
An On-Demand Optical Quantum Random Number Generator with In-Future Action and Ultra-Fast Response
Stipčević, Mario; Ursin, Rupert
2015-01-01
Random numbers are essential for our modern information-based society, e.g. in cryptography. Unlike frequently used pseudo-random generators, physical random number generators do not depend on complex algorithms but rather on a physical process to provide true randomness. Quantum random number generators (QRNG) rely on a process which can be described by a probabilistic theory only, even in principle. Here we present a conceptually simple implementation, which offers a 100% efficiency of producing a random bit upon request and simultaneously exhibits an ultra low latency. A careful technical and statistical analysis demonstrates its robustness against imperfections of the actual implemented technology and enables the randomness of very long sequences to be estimated quickly. Generated random numbers pass standard statistical tests without any post-processing. The setup described, as well as the theory presented here, demonstrate the maturity and overall understanding of the technology. PMID:26057576
NASA Astrophysics Data System (ADS)
Miszczak, Jarosław Adam
2013-01-01
The presented package for the Mathematica computing system allows the harnessing of quantum random number generators (QRNG) for investigating the statistical properties of quantum states. The described package implements a number of functions for generating random states. The new version of the package adds the ability to use the on-line quantum random number generator service and implements new functions for retrieving lists of random numbers. Thanks to the introduced improvements, the new version provides faster access to high-quality sources of random numbers and can be used in simulations requiring large amounts of random data. New version program summary Program title: TRQS Catalogue identifier: AEKA_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKA_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 18 134 No. of bytes in distributed program, including test data, etc.: 252 049 Distribution format: tar.gz Programming language: Mathematica, C. Computer: Any computer supporting Mathematica in version 7 or higher. Operating system: Any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit). RAM: Case-dependent Supplementary material: Fig. 1 mentioned below can be downloaded. Classification: 4.15. External routines: Quantis software library (http://www.idquantique.com/support/quantis-trng.html) Catalogue identifier of previous version: AEKA_v1_0 Journal reference of previous version: Comput. Phys. Comm. 183(2012)118 Does the new version supersede the previous version?: Yes Nature of problem: Generation of random density matrices and utilization of high-quality random numbers for the purpose of computer simulation. Solution method: Use of a physical quantum random number generator and an on-line service providing access to a source of true random numbers generated by a quantum random number generator. Reasons for new version: Added support for the high-speed on-line quantum random number generator and improved methods for retrieving lists of random numbers. Summary of revisions: The presented version provides two significant improvements. The first one is the ability to use the on-line Quantum Random Number Generation service developed by PicoQuant GmbH and the Nano-Optics groups at the Department of Physics of Humboldt University. The on-line service supported in version 2.0 of the TRQS package provides faster access to true randomness sources constructed using the laws of quantum physics. The service is freely available at https://qrng.physik.hu-berlin.de/. The use of this service allows using the presented package without the need for a physical quantum random number generator. The second improvement introduced in this version is the ability to retrieve arrays of random data directly from the used source. This increases the speed of the random number generation, especially in the case of an on-line service, where it reduces the time necessary to establish the connection. Thanks to the speed improvement of the presented version, the package can now be used in simulations requiring larger amounts of random data. Moreover, the functions for generating random numbers provided by the current version of the package more closely follow the pattern of functions for generating pseudo-random numbers provided in Mathematica.
Additional comments: Speed comparison: The implementation of the support for the QRNG on-line service provides a noticeable improvement in the speed of random number generation. For samples of real numbers of size 10^1, 10^2, …, 10^7, the times required to generate these samples using the Quantis USB device and the QRNG service are compared in Fig. 1. The presented results show that the use of the on-line service provides faster access to random numbers. One should note, however, that the speed gain can increase or decrease depending on the connection speed between the computer and the server providing random numbers. Running time: Depends on the used source of randomness and the amount of random data used in the experiment. References: [1] M. Wahl, M. Leifgen, M. Berlin, T. Röhlicke, H.-J. Rahn, O. Benson, An ultrafast quantum random number generator with provably bounded output bias based on photon arrival time measurements, Applied Physics Letters, Vol. 98, 171105 (2011). http://dx.doi.org/10.1063/1.3578456.
Security of practical private randomness generation
NASA Astrophysics Data System (ADS)
Pironio, Stefano; Massar, Serge
2013-01-01
Measurements on entangled quantum systems necessarily yield outcomes that are intrinsically unpredictable if they violate a Bell inequality. This property can be used to generate certified randomness in a device-independent way, i.e., without making detailed assumptions about the internal working of the quantum devices used to generate the random numbers. Furthermore these numbers are also private; i.e., they appear random not only to the user but also to any adversary that might possess a perfect description of the devices. Since this process requires a small initial random seed to sample the behavior of the quantum devices and to extract uniform randomness from the raw outputs of the devices, one usually speaks of device-independent randomness expansion. The purpose of this paper is twofold. First, we point out that in most real, practical situations, where the concept of device independence is used as a protection against unintentional flaws or failures of the quantum apparatuses, it is sufficient to show that the generated string is random with respect to an adversary that holds only classical side information; i.e., proving randomness against quantum side information is not necessary. Furthermore, the initial random seed does not need to be private with respect to the adversary, provided that it is generated in a way that is independent from the measured systems. The devices, however, will generate cryptographically secure randomness that cannot be predicted by the adversary, and thus one can, given access to free public randomness, talk about private randomness generation. The theoretical tools to quantify the generated randomness according to these criteria were already introduced in S. Pironio et al. [Nature (London) 464, 1021 (2010)], but the final results were improperly formulated. The second aim of this paper is to correct this inaccurate formulation and therefore lay out a precise theoretical framework for practical device-independent randomness generation.
The RANDOM computer program: A linear congruential random number generator
NASA Technical Reports Server (NTRS)
Miles, R. F., Jr.
1986-01-01
The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
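A minimal LCG of the kind the program tests looks like the following; the multiplier and increment shown are the well-known Numerical Recipes values, chosen here for illustration, since the report's own parameter choices are not reproduced in the abstract.

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Minimal linear congruential generator: x_{n+1} = (a*x_n + c) mod m.
    Parameters a, c, m are illustrative (Numerical Recipes values)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m          # uniform variate in [0, 1)

g = lcg(12345)
print([round(next(g), 6) for _ in range(5)])
```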
A sibling method for identifying vQTLs
Domingue, Ben; Dawes, Christopher; Boardman, Jason; Siegal, Mark
2018-01-01
The propensity of a trait to vary within a population may have evolutionary, ecological, or clinical significance. In the present study we deploy sibling models to offer a novel and unbiased way to ascertain loci associated with the extent to which phenotypes vary (variance-controlling quantitative trait loci, or vQTLs). Previous methods for vQTL-mapping either exclude genetically related individuals or treat genetic relatedness among individuals as a complicating factor addressed by adjusting estimates for non-independence in phenotypes. The present method uses genetic relatedness as a tool to obtain unbiased estimates of variance effects rather than as a nuisance. The family-based approach, which utilizes random variation between siblings in minor allele counts at a locus, also allows controls for parental genotype, mean effects, and non-linear (dominance) effects that may spuriously appear to generate variation. Simulations show that the approach performs equally well as two existing methods (squared Z-score and DGLM) in controlling type I error rates when there is no unobserved confounding, and performs significantly better than these methods in the presence of small degrees of confounding. Using height and BMI as empirical applications, we investigate SNPs that alter within-family variation in height and BMI, as well as pathways that appear to be enriched. One significant SNP for BMI variability, in the MAST4 gene, replicated. Pathway analysis revealed one gene set, encoding members of several signaling pathways related to gap junction function, which appears significantly enriched for associations with within-family height variation in both datasets (while not enriched in analysis of mean levels). We recommend approximating laboratory random assignment of genotype using family data and more careful attention to the possible conflation of mean and variance effects. PMID:29617452
Unbiased All-Optical Random-Number Generator
NASA Astrophysics Data System (ADS)
Steinle, Tobias; Greiner, Johannes N.; Wrachtrup, Jörg; Giessen, Harald; Gerhardt, Ilja
2017-10-01
The generation of random bits is of enormous importance in modern information science. Cryptographic security is based on random numbers which require a physical process for their generation. This is commonly performed by hardware random-number generators. These often exhibit a number of problems, namely experimental bias, memory in the system, and other technical subtleties, which reduce the reliability in the entropy estimation. Further, the generated outcome has to be postprocessed to "iron out" such spurious effects. Here, we present a purely optical randomness generator, based on the bistable output of an optical parametric oscillator. Detector noise plays no role and postprocessing is reduced to a minimum. Upon entering the bistable regime, initially the resulting output phase depends on vacuum fluctuations. Later, the phase is rigidly locked and can be well determined versus a pulse train, which is derived from the pump laser. This delivers an ambiguity-free output, which is reliably detected and associated with a binary outcome. The resulting random bit stream resembles a perfect coin toss and passes all relevant randomness measures. The random nature of the generated binary outcome is furthermore confirmed by an analysis of resulting conditional entropies.
Variation of fan tone steadiness for several inflow conditions
NASA Technical Reports Server (NTRS)
Balombin, J. R.
1978-01-01
An amplitude probability density function analysis technique for quantifying the degree of fan noise tone steadiness has been applied to data from a fan tested under a variety of inflow conditions. The test conditions included typical static operation, inflow control by a honeycomb/screen device and forward velocity in a wind tunnel simulating flight. The ratio of mean square sinusoidal-to-random signal content in the fundamental and second harmonic tones was found to vary by more than an order-of-magnitude. Some implications of these results concerning the nature of fan noise generation mechanisms are discussed.
Effects of simulated turbulence on aircraft handling qualities
NASA Technical Reports Server (NTRS)
Jacobson, I. D.; Joshi, D. S.
1977-01-01
The influence of simulated turbulence on aircraft handling qualities is presented. Pilot opinions of the handling qualities of a light general aviation aircraft were evaluated in a motion-base simulator using a simulated turbulence environment. A realistic representation of turbulence disturbances is described in terms of rms intensity and scale length and their random variations with time. The time histories generated by the proposed turbulence models showed characteristics which are more similar to real turbulence than the frequently-used Gaussian turbulence model. The proposed turbulence models flexibly accommodate changes in atmospheric conditions and are easily implemented in flight simulator studies.
A 3D model of polarized dust emission in the Milky Way
NASA Astrophysics Data System (ADS)
Martínez-Solaeche, Ginés; Karakci, Ata; Delabrouille, Jacques
2018-05-01
We present a three-dimensional model of polarized galactic dust emission that takes into account the variation of the dust density, spectral index and temperature along the line of sight, and contains randomly generated small-scale polarization fluctuations. The model is constrained to match observed dust emission on large scales, and match on smaller scales extrapolations of observed intensity and polarization power spectra. This model can be used to investigate the impact of plausible complexity of the polarized dust foreground emission on the analysis and interpretation of future cosmic microwave background polarization observations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu
Quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness -- coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness on the device and the random number generation speed.
Analysis of Uniform Random Numbers Generated by RANDU and URN Using Ten Different Seeds.
The statistical properties of the numbers generated by two uniform random number generators, RANDU and URN, each using ten different seeds are... The testing is performed on a sequence of 50,000 numbers generated by each uniform random number generator using each of the ten seeds. (Author)
Li, Dongfang; Lu, Zhaojun; Zou, Xuecheng; Liu, Zhenglin
2015-01-01
Random number generators (RNG) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard national institute of standards and technology (NIST) randomness tests and is resilient to a wide range of security attacks. PMID:26501283
Li, Dongfang; Lu, Zhaojun; Zou, Xuecheng; Liu, Zhenglin
2015-10-16
Random number generators (RNG) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard national institute of standards and technology (NIST) randomness tests and is resilient to a wide range of security attacks.
Social Anxiety Modulates Subliminal Affective Priming
Paul, Elizabeth S.; Pope, Stuart A. J.; Fennell, John G.; Mendl, Michael T.
2012-01-01
Background: It is well established that there is anxiety-related variation between observers in the very earliest, pre-attentive stage of visual processing of images such as emotionally expressive faces, often leading to enhanced attention to threat in a variety of disorders and traits. Whether there is also variation in early-stage affective (i.e. valenced) responses resulting from such images, however, is not yet known. The present study used the subliminal affective priming paradigm to investigate whether people varying in trait social anxiety also differ in their affective responses to very briefly presented, emotionally expressive face images. Methodology/Principal Findings: Participants (n = 67) completed a subliminal affective priming task, in which briefly presented and smiling, neutral and angry faces were shown for 10 ms durations (below objective and subjective thresholds for visual discrimination), and immediately followed by a randomly selected Chinese character mask (2000 ms). Ratings of participants' liking for each Chinese character indicated the degree of valenced affective response made to the unseen emotive images. Participants' ratings of their liking for the Chinese characters were significantly influenced by the type of face image preceding them, with smiling faces generating more positive ratings than neutral and angry ones (F(2,128) = 3.107, p<0.05). Self-reported social anxiety was positively correlated with ratings of smiling relative to neutral-face primed characters (Pearson's r = .323, p<0.01). Individual variation in self-reported mood awareness was not associated with ratings. Conclusions: Trait social anxiety is associated with individual variation in affective responding, even in response to the earliest, pre-attentive stage of visual image processing. However, the fact that these priming effects are limited to smiling and not angry (i.e. threatening) images leads us to propose that the pre-attentive processes involved in generating the subliminal affective priming effect may be different from those that generate attentional biases in anxious individuals. PMID:22615873
Social anxiety modulates subliminal affective priming.
Paul, Elizabeth S; Pope, Stuart A J; Fennell, John G; Mendl, Michael T
2012-01-01
It is well established that there is anxiety-related variation between observers in the very earliest, pre-attentive stage of visual processing of images such as emotionally expressive faces, often leading to enhanced attention to threat in a variety of disorders and traits. Whether there is also variation in early-stage affective (i.e. valenced) responses resulting from such images, however, is not yet known. The present study used the subliminal affective priming paradigm to investigate whether people varying in trait social anxiety also differ in their affective responses to very briefly presented, emotionally expressive face images. Participants (n = 67) completed a subliminal affective priming task, in which briefly presented and smiling, neutral and angry faces were shown for 10 ms durations (below objective and subjective thresholds for visual discrimination), and immediately followed by a randomly selected Chinese character mask (2000 ms). Ratings of participants' liking for each Chinese character indicated the degree of valenced affective response made to the unseen emotive images. Participants' ratings of their liking for the Chinese characters were significantly influenced by the type of face image preceding them, with smiling faces generating more positive ratings than neutral and angry ones (F(2,128) = 3.107, p<0.05). Self-reported social anxiety was positively correlated with ratings of smiling relative to neutral-face primed characters (Pearson's r = .323, p<0.01). Individual variation in self-reported mood awareness was not associated with ratings. Trait social anxiety is associated with individual variation in affective responding, even in response to the earliest, pre-attentive stage of visual image processing. However, the fact that these priming effects are limited to smiling and not angry (i.e. threatening) images leads us to propose that the pre-attentive processes involved in generating the subliminal affective priming effect may be different from those that generate attentional biases in anxious individuals.
Optimal balance of the striatal medium spiny neuron network.
Ponzi, Adam; Wickens, Jeffery R
2013-04-01
Slowly varying activity in the striatum, the main Basal Ganglia input structure, is important for the learning and execution of movement sequences. Striatal medium spiny neurons (MSNs) form cell assemblies whose population firing rates vary coherently on slow behaviourally relevant timescales. It has been shown that such activity emerges in a model of a local MSN network but only at realistic connectivities of 10 ~ 20% and only when MSN generated inhibitory post-synaptic potentials (IPSPs) are realistically sized. Here we suggest a reason for this. We investigate how MSN network generated population activity interacts with temporally varying cortical driving activity, as would occur in a behavioural task. We find that at unrealistically high connectivity a stable winners-take-all type regime is found where network activity separates into fixed stimulus dependent regularly firing and quiescent components. In this regime only a small number of population firing rate components interact with cortical stimulus variations. Around 15% connectivity a transition to a more dynamically active regime occurs where all cells constantly switch between activity and quiescence. In this low connectivity regime, MSN population components wander randomly and here too are independent of variations in cortical driving. Only in the transition regime do weak changes in cortical driving interact with many population components so that sequential cell assemblies are reproducibly activated for many hundreds of milliseconds after stimulus onset and peri-stimulus time histograms display strong stimulus and temporal specificity. We show that, remarkably, this activity is maximized at striatally realistic connectivities and IPSP sizes. Thus, we suggest the local MSN network has optimal characteristics - it is neither too stable to respond in a dynamically complex temporally extended way to cortical variations, nor is it too unstable to respond in a consistent repeatable way. Rather, it is optimized to generate stimulus dependent activity patterns for long periods after variations in cortical excitation.
Optimal Balance of the Striatal Medium Spiny Neuron Network
Ponzi, Adam; Wickens, Jeffery R.
2013-01-01
Slowly varying activity in the striatum, the main Basal Ganglia input structure, is important for the learning and execution of movement sequences. Striatal medium spiny neurons (MSNs) form cell assemblies whose population firing rates vary coherently on slow behaviourally relevant timescales. It has been shown that such activity emerges in a model of a local MSN network but only at realistic connectivities of 10 ~ 20% and only when MSN generated inhibitory post-synaptic potentials (IPSPs) are realistically sized. Here we suggest a reason for this. We investigate how MSN network generated population activity interacts with temporally varying cortical driving activity, as would occur in a behavioural task. We find that at unrealistically high connectivity a stable winners-take-all type regime is found where network activity separates into fixed stimulus dependent regularly firing and quiescent components. In this regime only a small number of population firing rate components interact with cortical stimulus variations. Around 15% connectivity a transition to a more dynamically active regime occurs where all cells constantly switch between activity and quiescence. In this low connectivity regime, MSN population components wander randomly and here too are independent of variations in cortical driving. Only in the transition regime do weak changes in cortical driving interact with many population components so that sequential cell assemblies are reproducibly activated for many hundreds of milliseconds after stimulus onset and peri-stimulus time histograms display strong stimulus and temporal specificity. We show that, remarkably, this activity is maximized at striatally realistic connectivities and IPSP sizes. Thus, we suggest the local MSN network has optimal characteristics – it is neither too stable to respond in a dynamically complex temporally extended way to cortical variations, nor is it too unstable to respond in a consistent repeatable way. Rather, it is optimized to generate stimulus dependent activity patterns for long periods after variations in cortical excitation. PMID:23592954
A rule-based software test data generator
NASA Technical Reports Server (NTRS)
Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II
1991-01-01
Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests showing that even the primitive rule-based test data generation prototype is significantly better than random data generation are performed. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.
Scattering of Gaussian Beams by Disordered Particulate Media
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.; Dlugach, Janna M.
2016-01-01
A frequently observed characteristic of electromagnetic scattering by a disordered particulate medium is the absence of pronounced speckles in angular patterns of the scattered light. It is known that such diffuse speckle-free scattering patterns can be caused by averaging over randomly changing particle positions and/or over a finite spectral range. To get further insight into the possible physical causes of the absence of speckles, we use the numerically exact superposition T-matrix solver of the Maxwell equations and analyze the scattering of plane-wave and Gaussian beams by representative multi-sphere groups. We show that phase and amplitude variations across an incident Gaussian beam do not serve to extinguish the pronounced speckle pattern typical of plane-wave illumination of a fixed multi-particle group. Averaging over random particle positions and/or over a finite spectral range is still required to generate the classical diffuse speckle-free regime.
A non-linear dimension reduction methodology for generating data-driven stochastic input models
NASA Astrophysics Data System (ADS)
Ganapathysubramanian, Baskar; Zabaras, Nicholas
2008-06-01
Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space R^n. An isometric mapping F from M to a low-dimensional, compact, connected set A ⊂ R^d (d ≪ n) is constructed. Given only a finite set of samples of the data, the methodology uses arguments from graph theory and differential geometry to construct the isometric transformation F: M → A. Asymptotic convergence of the representation of M by A is shown. This mapping F serves as an accurate, low-dimensional, data-driven representation of the property variations. The reduced-order model of the material topology and thermal diffusivity variations is subsequently used as an input in the solution of stochastic partial differential equations that describe the evolution of dependent variables. A sparse grid collocation strategy (Smolyak algorithm) is utilized to solve these stochastic equations efficiently. We showcase the methodology by constructing low-dimensional input stochastic models to represent thermal diffusivity in two-phase microstructures. This model is used in analyzing the effect of topological variations of two-phase microstructures on the evolution of temperature in heat conduction processes.
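The flavor of the mapping F can be reproduced with an off-the-shelf graph-based isometric embedding; the sketch below uses scikit-learn's Isomap on synthetic data and is only an analogue of the paper's construction, not the authors' code.

```python
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(0)

# Stand-in "microstructure" samples: high-dimensional vectors that in fact
# lie near a low-dimensional manifold (a noisy 2-parameter surface)
t = rng.uniform(0, 2 * np.pi, (500, 2))
X = np.column_stack([np.cos(t[:, 0]), np.sin(t[:, 0]),
                     t[:, 1], 0.01 * rng.normal(size=500)])

# Graph-based isometric embedding M -> A, analogous in spirit to F
A = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(A.shape)   # (500, 2): low-dimensional coordinates of each sample
```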
Generalised filtering and stochastic DCM for fMRI.
Li, Baojuan; Daunizeau, Jean; Stephan, Klaas E; Penny, Will; Hu, Dewen; Friston, Karl
2011-09-15
This paper is about the fitting or inversion of dynamic causal models (DCMs) of fMRI time series. It tries to establish the validity of stochastic DCMs that accommodate random fluctuations in hidden neuronal and physiological states. We compare and contrast deterministic and stochastic DCMs, which do and do not ignore random fluctuations or noise on hidden states. We then compare stochastic DCMs, which do and do not ignore conditional dependence between hidden states and model parameters (generalised filtering and dynamic expectation maximisation, respectively). We first characterise state-noise by comparing the log evidence of models with different a priori assumptions about its amplitude, form and smoothness. Face validity of the inversion scheme is then established using data simulated with and without state-noise to ensure that DCM can identify the parameters and model that generated the data. Finally, we address construct validity using real data from an fMRI study of internet addiction. Our analyses suggest the following. (i) The inversion of stochastic causal models is feasible, given typical fMRI data. (ii) State-noise has nontrivial amplitude and smoothness. (iii) Stochastic DCM has face validity, in the sense that Bayesian model comparison can distinguish between data that have been generated with high and low levels of physiological noise and model inversion provides veridical estimates of effective connectivity. (iv) Relaxing conditional independence assumptions can have greater construct validity, in terms of revealing group differences not disclosed by variational schemes. Finally, we note that the ability to model endogenous or random fluctuations on hidden neuronal (and physiological) states provides a new and possibly more plausible perspective on how regionally specific signals in fMRI are generated. Copyright © 2011. Published by Elsevier Inc.
Evaluation of some random effects methodology applicable to bird ringing data
Burnham, K.P.; White, Gary C.
2002-01-01
Existing models for ring recovery and recapture data analysis treat temporal variations in annual survival probability (S) as fixed effects. Often there is no explainable structure to the temporal variation in S1, ..., Sk; random effects can then be a useful model: Si = E(S) + εi. Here, the temporal variation in survival probability is treated as random, with process variance E(εi²) = σ². This random effects model can now be fit in program MARK. Resultant inferences include point and interval estimation for the process variation, σ², and estimation of E(S) and var(Ê(S)), where the latter includes a component for σ² as well as the traditional sampling component var(Ŝ|S). Furthermore, the random effects model leads to shrinkage estimates, S̃i, as improved (in mean square error) estimators of Si compared to the MLEs, Ŝi, from the unrestricted time-effects model. Appropriate confidence intervals based on the S̃i are also provided. In addition, AIC has been generalized to random effects models. This paper presents results of a Monte Carlo evaluation of inference performance under the simple random effects model. Examined by simulation, under the simple one-group Cormack-Jolly-Seber (CJS) model, are issues such as bias of σ̂², confidence interval coverage on σ², coverage and mean square error comparisons for inference about Si based on shrinkage versus maximum likelihood estimators, and performance of AIC model selection over three models: Si ≡ S (no effects), Si = E(S) + εi (random effects), and S1, ..., Sk (fixed effects). For the cases simulated, the random effects methods performed well and were uniformly better than fixed effects MLE for the Si.
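A stripped-down, moment-based version of the shrinkage idea (not program MARK's actual estimator) can be sketched as follows, with made-up survival MLEs and sampling variances:

```python
import numpy as np

def shrinkage_estimates(s_hat, v):
    """Simplified moment-based random-effects shrinkage: split total
    variation in the MLEs into process variance sigma^2 and sampling
    variance v_i, then pull each estimate toward the mean by the factor
    sigma^2 / (sigma^2 + v_i)."""
    s_hat, v = np.asarray(s_hat), np.asarray(v)
    mu = s_hat.mean()
    sigma2 = max(s_hat.var(ddof=1) - v.mean(), 0.0)   # process variance
    w = sigma2 / (sigma2 + v)
    return mu + w * (s_hat - mu), sigma2

s_hat = np.array([0.62, 0.55, 0.71, 0.48, 0.66])   # annual survival MLEs
v = np.full(5, 0.003)                              # sampling variances
s_tilde, sigma2 = shrinkage_estimates(s_hat, v)
print(round(sigma2, 4), s_tilde.round(3))
```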
Reliability Coupled Sensitivity Based Design Approach for Gravity Retaining Walls
NASA Astrophysics Data System (ADS)
Guha Ray, A.; Baidya, D. K.
2012-09-01
Sensitivity analysis involving different random variables and different potential failure modes of a gravity retaining wall focuses on the fact that high sensitivity of a particular variable on a particular mode of failure does not necessarily imply a remarkable contribution to the overall failure probability. The present paper aims at identifying a probabilistic risk factor (Rf) for each random variable based on the combined effects of the failure probability (Pf) of each mode of failure of a gravity retaining wall and the sensitivity of each of the random variables on these failure modes. Pf is calculated by Monte Carlo simulation, and the sensitivity analysis of each random variable is carried out by F-test analysis. The structure, redesigned by modifying the original random variables with the risk factors, is safe against all the variations of the random variables. It is observed that Rf for the friction angle of the backfill soil (φ1) increases and that for the cohesion of the foundation soil (c2) decreases with an increase in the variation of φ1, while Rf for the unit weights (γ1 and γ2) of both soils and for the friction angle of the foundation soil (φ2) remains almost constant under variation of the soil properties. The results compared well with some of the existing deterministic and probabilistic methods and were found to be cost-effective. It is seen that if the variation of φ1 remains within 5%, a significant reduction in cross-sectional area can be achieved. But if the variation is more than 7-8%, the structure needs to be modified. Finally, design guidelines for different wall dimensions, based on the present approach, are proposed.
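The Monte Carlo estimate of Pf for, say, the sliding mode can be sketched in a few lines; the wall geometry, weight, and soil-property distributions below are hypothetical and simplified (Rankine active pressure only), not the paper's design case.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Hypothetical wall geometry and random soil properties (illustrative only)
H, B = 6.0, 3.0                           # wall height and base width (m)
W = 150.0                                 # wall weight per metre run (kN/m)
phi1 = rng.normal(30.0, 2.0, N)           # backfill friction angle (deg)
phi2 = rng.normal(32.0, 2.0, N)           # foundation friction angle (deg)
gamma1 = rng.normal(18.0, 0.5, N)         # backfill unit weight (kN/m^3)
c2 = rng.normal(10.0, 2.0, N)             # foundation cohesion (kPa)

ka = np.tan(np.radians(45.0 - phi1 / 2.0)) ** 2      # Rankine coefficient
p_active = 0.5 * ka * gamma1 * H**2                  # driving thrust (kN/m)
resistance = W * np.tan(np.radians(phi2)) + c2 * B   # sliding resistance

pf_sliding = np.mean(resistance < p_active)          # failure probability
print(f"P_f (sliding) ~ {pf_sliding:.4f}")
```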
NASA Technical Reports Server (NTRS)
Roble, R. G.; Hays, P. B.
1979-01-01
The paper presents a model of global atmospheric electricity used to examine the effect of upper atmospheric generators on the global electrical circuit. The model represents thunderstorms as dipole current generators randomly distributed in areas of known thunderstorm frequency; the electrical conductivity in the model increases with altitude, and electrical effects are coupled with a passive magnetosphere along geomagnetic field lines. The large horizontal-scale potential differences at ionospheric heights map downward into the lower atmosphere where the perturbations in the ground electric field are superimposed on the diurnal variation. Finally, changes in the upper atmospheric conductivity due to solar flares, polar cap absorptions, and Forbush decreases are shown to alter the downward mapping of the high-latitude potential pattern and the global distribution of fields and currents.
A fast ergodic algorithm for generating ensembles of equilateral random polygons
NASA Astrophysics Data System (ADS)
Varela, R.; Hinson, K.; Arsuaga, J.; Diao, Y.
2009-03-01
Knotted structures are commonly found in circular DNA and along the backbone of certain proteins. In order to properly estimate properties of these three-dimensional structures it is often necessary to generate large ensembles of simulated closed chains (i.e. polygons) of equal edge lengths (such polygons are called equilateral random polygons). However, finding efficient algorithms that properly sample the space of equilateral random polygons is a difficult problem. Currently there are no proven algorithms that generate equilateral random polygons according to their theoretical distribution. In this paper we propose a method that generates equilateral random polygons in a 'step-wise uniform' way. We prove that this method is ergodic in the sense that any given equilateral random polygon can be generated by this method and we show that the time needed to generate an equilateral random polygon of length n is linear in terms of n. These two properties make this algorithm a big improvement over the existing generating methods. Detailed numerical comparisons of our algorithm with other widely used algorithms are provided.
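For contrast with the paper's step-wise uniform method, the sketch below implements a standard crankshaft move, a classical ergodic move on equilateral polygons that preserves closure and edge lengths (it is not the authors' algorithm):

```python
import numpy as np

rng = np.random.default_rng(11)

def crankshaft(poly):
    """One crankshaft move on a closed equilateral polygon: rotate the
    vertices strictly between two randomly chosen ones about the chord
    joining them (Rodrigues' formula). Closure and edge lengths are
    preserved because both chord endpoints lie on the rotation axis."""
    n = len(poly)
    i, j = sorted(rng.choice(n, size=2, replace=False))
    if j - i < 2:
        return poly
    axis = poly[j] - poly[i]
    k = axis / np.linalg.norm(axis)
    theta = rng.uniform(0.0, 2.0 * np.pi)
    c, s = np.cos(theta), np.sin(theta)
    seg = poly[i + 1:j] - poly[i]
    rot = seg * c + np.cross(k, seg) * s + np.outer(seg @ k, k) * (1.0 - c)
    poly = poly.copy()
    poly[i + 1:j] = poly[i] + rot
    return poly

# Start from a planar regular polygon, rescaled to unit edge length
n = 20
t = 2.0 * np.pi * np.arange(n) / n
poly = np.stack([np.cos(t), np.sin(t), np.zeros(n)], axis=1)
poly /= np.linalg.norm(poly[1] - poly[0])
for _ in range(1000):
    poly = crankshaft(poly)
edges = np.linalg.norm(np.roll(poly, -1, axis=0) - poly, axis=1)
print(edges.min(), edges.max())          # both remain ~1.0
```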
Using Computer-Generated Random Numbers to Calculate the Lifetime of a Comet.
ERIC Educational Resources Information Center
Danesh, Iraj
1991-01-01
An educational technique to calculate the lifetime of a comet using software-generated random numbers is introduced to undergraduate physics and astronomy students. Discussed are the generation and eligibility of the required random numbers, background literature related to the problem, and the solution to the problem using random numbers.…
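One way such an exercise might be posed (a guess at the spirit of the problem, not the article's actual model) is to let each perihelion passage strip a random fraction of the comet's volatiles and count passages until they are exhausted:

```python
import numpy as np

rng = np.random.default_rng(9)

def comet_lifetime(loss_low=0.005, loss_high=0.02, threshold=0.01):
    """Hypothetical toy model: each perihelion passage removes a random
    fraction of the comet's volatile mass; the lifetime is the number of
    passages until only `threshold` of the initial mass remains."""
    mass, passages = 1.0, 0
    while mass > threshold:
        mass *= 1.0 - rng.uniform(loss_low, loss_high)
        passages += 1
    return passages

lifetimes = [comet_lifetime() for _ in range(1000)]
print(np.mean(lifetimes), np.std(lifetimes))
```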
Brandstätter, Anita; Peterson, Christine T; Irwin, Jodi A; Mpoke, Solomon; Koech, Davy K; Parson, Walther; Parsons, Thomas J
2004-10-01
Large forensic mtDNA databases that adhere to strict guidelines for generation and maintenance are not available for many populations outside of the United States and western Europe. We have established a high-quality mtDNA control region sequence database for urban Nairobi as both a reference database for forensic investigations and as a tool to examine the genetic variation of Kenyan sequences in the context of known African variation. The Nairobi sequences exhibited high variation and a low random match probability, indicating utility for forensic testing. Haplogroup identification and frequencies were compared with those reported from other published studies on African, or African-origin, populations from Mozambique, Sierra Leone, and the United States, and suggest significant differences in the mtDNA compositions of the various populations. The quality of the sequence data in our study was investigated and supported using phylogenetic measures. Our data demonstrate the diversity and distinctiveness of African populations, and underline the importance of establishing additional forensic mtDNA databases of indigenous African populations.
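The random match probability quoted in such studies is typically estimated as the sum of squared haplotype frequencies, which takes one line to compute; the toy database below uses letters as stand-ins for control-region haplotypes.

```python
from collections import Counter

def random_match_probability(haplotypes):
    """Point estimate of the random match probability: the chance that two
    individuals drawn at random share an mtDNA haplotype, computed as the
    sum of squared haplotype frequencies."""
    counts = Counter(haplotypes)
    n = len(haplotypes)
    return sum((c / n) ** 2 for c in counts.values())

# Toy database: letters stand in for control-region sequence haplotypes
db = list("AABCDEFFGHIJKLMNOPQRSTUVWXYZ")
print(f"RMP = {random_match_probability(db):.4f}")
```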
Dhakal, Rajendra; Kim, E S; Jo, Yong-Hwa; Kim, Sung-Soo; Kim, Nam-Young
2017-03-01
We present a concept for the characterization of a microfabricated resonator incorporating an air-bridge metal-insulator-semiconductor (MIS) capacitor to continuously monitor an individual's glucose levels based on frequency variation. The investigation revealed that the micro-resonator based on the MIS capacitor holds considerable promise for implementation and recognition as a glucose sensor for human serum. Changes in complex permittivity arising from the enhanced capacitor were used for the detection and determination of random glucose concentration levels, the variation in capacitance producing a corresponding variation of the resonance frequency. Moreover, the micro-resonator with the enhanced MIS capacitor achieves a resolution of 112.38 × 10⁻³ pF/mg/dl, a minimum detectable glucose level of 7.45 mg/dl, and a limit of quantification of 22.58 mg/dl. Additionally, this unique approach offers long-term reliability for mediator-free glucose sensing with a relative standard deviation of less than 0.5%. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
A Comparison of Three Random Number Generators for Aircraft Dynamic Modeling Applications
NASA Technical Reports Server (NTRS)
Grauer, Jared A.
2017-01-01
Three random number generators, which produce Gaussian white noise sequences, were compared to assess their suitability in aircraft dynamic modeling applications. The first generator considered was the MATLAB (registered) implementation of the Mersenne-Twister algorithm. The second generator was a website called Random.org, which processes atmospheric noise measured using radios to create the random numbers. The third generator was based on synthesis of the Fourier series, where the random number sequences are constructed from prescribed amplitude and phase spectra. A total of 200 sequences, each having 601 random numbers, for each generator were collected and analyzed in terms of the mean, variance, normality, autocorrelation, and power spectral density. These sequences were then applied to two problems in aircraft dynamic modeling, namely estimating stability and control derivatives from simulated onboard sensor data, and simulating flight in atmospheric turbulence. In general, each random number generator had good performance and is well-suited for aircraft dynamic modeling applications. Specific strengths and weaknesses of each generator are discussed. For Monte Carlo simulation, the Fourier synthesis method is recommended because it most accurately and consistently approximated Gaussian white noise and can be implemented with reasonable computational effort.
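The Fourier synthesis idea, constructing a sequence from a prescribed amplitude spectrum and uniformly random phases, can be sketched as follows; the flat unit-amplitude spectrum is an assumption for illustration, since the report's exact prescription is not reproduced here.

```python
import numpy as np

def fourier_synthesis_noise(n, rng=np.random.default_rng()):
    """Approximately Gaussian white noise by Fourier synthesis: flat
    amplitude spectrum, uniformly random phases, inverse real FFT,
    then normalisation to zero mean and unit variance."""
    m = n // 2 + 1                      # length of one-sided spectrum
    phases = rng.uniform(0.0, 2.0 * np.pi, m)
    spec = np.exp(1j * phases)          # unit amplitudes, random phases
    spec[0] = 0.0                       # no DC component -> zero mean
    x = np.fft.irfft(spec, n=n)
    return (x - x.mean()) / x.std()

x = fourier_synthesis_noise(601)        # same length as the study's records
print(x.mean().round(6), x.std().round(6))
```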
Realization of a Quantum Random Generator Certified with the Kochen-Specker Theorem
NASA Astrophysics Data System (ADS)
Kulikov, Anatoly; Jerger, Markus; Potočnik, Anton; Wallraff, Andreas; Fedorov, Arkady
2017-12-01
Random numbers are required for a variety of applications from secure communications to Monte Carlo simulation. Yet randomness is an asymptotic property, and no output string generated by a physical device can be strictly proven to be random. We report an experimental realization of a quantum random number generator (QRNG) with randomness certified by quantum contextuality and the Kochen-Specker theorem. The certification is not performed in a device-independent way but through a rigorous theoretical proof of each outcome being value indefinite even in the presence of experimental imperfections. The analysis of the generated data confirms the incomputable nature of our QRNG.
Realization of a Quantum Random Generator Certified with the Kochen-Specker Theorem.
Kulikov, Anatoly; Jerger, Markus; Potočnik, Anton; Wallraff, Andreas; Fedorov, Arkady
2017-12-15
Random numbers are required for a variety of applications from secure communications to Monte Carlo simulation. Yet randomness is an asymptotic property, and no output string generated by a physical device can be strictly proven to be random. We report an experimental realization of a quantum random number generator (QRNG) with randomness certified by quantum contextuality and the Kochen-Specker theorem. The certification is not performed in a device-independent way but through a rigorous theoretical proof of each outcome being value indefinite even in the presence of experimental imperfections. The analysis of the generated data confirms the incomputable nature of our QRNG.
Source-Independent Quantum Random Number Generation
NASA Astrophysics Data System (ADS)
Cao, Zhu; Zhou, Hongyi; Yuan, Xiao; Ma, Xiongfeng
2016-01-01
Quantum random number generators can provide genuine randomness by appealing to the fundamental principles of quantum mechanics. In general, a physical generator contains two parts—a randomness source and its readout. The source is essential to the quality of the resulting random numbers; hence, it needs to be carefully calibrated and modeled to achieve information-theoretically provable randomness. However, in practice, the source is a complicated physical system, such as a light source or an atomic ensemble, and any deviations in the real-life implementation from the theoretical model may affect the randomness of the output. To close this gap, we propose a source-independent scheme for quantum random number generation in which output randomness can be certified, even when the source is uncharacterized and untrusted. In our randomness analysis, we make no assumptions about the dimension of the source. For instance, multiphoton emissions are allowed in optical implementations. Our analysis takes into account the finite-key effect with the composable security definition. In the limit of large data size, the length of the input random seed is exponentially small compared to that of the output random bit. In addition, by modifying a quantum key distribution system, we experimentally demonstrate our scheme and achieve a randomness generation rate of over 5 × 10^3 bit/s.
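Post-processing for certified randomness of this kind is commonly realized with Toeplitz hashing. The sketch below is a generic illustration of that extractor, not the authors' implementation: n raw bits are compressed to m output bits by a binary Toeplitz matrix over GF(2), defined by an (n + m − 1)-bit seed.

```python
import numpy as np

def toeplitz_extract(raw_bits, m, seed_bits):
    """Randomness extraction via a binary Toeplitz matrix:
    T[i, j] = seed_bits[i - j + n - 1], output = T @ raw mod 2."""
    raw = np.asarray(raw_bits, dtype=np.int64)
    seed = np.asarray(seed_bits, dtype=np.int64)
    n = raw.size
    assert seed.size == n + m - 1
    idx = np.arange(m)[:, None] - np.arange(n)[None, :] + n - 1
    T = seed[idx]                      # m x n Toeplitz matrix from the seed
    return (T @ raw) % 2               # matrix-vector product over GF(2)

rng = np.random.default_rng(7)
raw = rng.integers(0, 2, 1024)          # raw, possibly imperfect bits
seed = rng.integers(0, 2, 1024 + 256 - 1)
out = toeplitz_extract(raw, 256, seed)  # 256 extracted bits
```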
Quantum Random Number Generation Using a Quanta Image Sensor
Amri, Emna; Felk, Yacine; Stucki, Damien; Ma, Jiaju; Fossum, Eric R.
2016-01-01
A new quantum random number generation method is proposed. The method is based on the randomness of the photon emission process and the single photon counting capability of the Quanta Image Sensor (QIS). It has the potential to generate high-quality random numbers with a remarkable data output rate. In this paper, the principle of photon statistics and the theory of entropy are discussed. Sample data were collected with a QIS jot device, and their randomness quality was analyzed. The randomness assessment method and results are discussed. PMID:27367698
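The core idea can be reduced to a toy model: under Poisson statistics a jot fires (detects at least one photoelectron) with probability 1 − exp(−λ), and a von Neumann step removes the resulting bias. The sketch below is our simplified illustration with an assumed mean photon number lam, not the paper's pipeline.

```python
import numpy as np

def jot_bits(n, lam=0.7, rng=None):
    """Binary jot readouts: 1 if at least one Poisson photoelectron arrived."""
    rng = np.random.default_rng() if rng is None else rng
    return (rng.poisson(lam, n) > 0).astype(np.uint8)

def von_neumann(bits):
    """Debias: map pairs 01 -> 0, 10 -> 1, and discard 00 and 11."""
    pairs = bits[: bits.size // 2 * 2].reshape(-1, 2)
    keep = pairs[:, 0] != pairs[:, 1]
    return pairs[keep, 0]

raw = jot_bits(100_000)
unbiased = von_neumann(raw)
print(raw.mean(), unbiased.mean())      # biased firing rate vs ~0.5
```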
Image encryption using random sequence generated from generalized information domain
NASA Astrophysics Data System (ADS)
Xia-Yan, Zhang; Guo-Ji, Zhang; Xuan, Li; Ya-Zhou, Ren; Jie-Hua, Wu
2016-05-01
A novel image encryption method based on the random sequence generated from the generalized information domain and permutation-diffusion architecture is proposed. The random sequence is generated by reconstruction from the generalized information file and discrete trajectory extraction from the data stream. The trajectory address sequence is used to generate a P-box to shuffle the plain image while random sequences are treated as keystreams. A new factor called drift factor is employed to accelerate and enhance the performance of the random sequence generator. An initial value is introduced to make the encryption method an approximately one-time pad. Experimental results show that the random sequences pass the NIST statistical test with a high ratio and extensive analysis demonstrates that the new encryption scheme has superior security.
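The permutation-diffusion architecture itself is easy to demonstrate. The toy sketch below is our construction, not the authors' generalized-information-domain generator: a keyed permutation (the P-box) shuffles pixel positions, and a keystream XOR provides diffusion. NumPy's seeded generator stands in for the paper's random sequence source.

```python
import numpy as np

def encrypt(image, key):
    """Toy permutation-diffusion cipher: keyed pixel shuffle + XOR keystream."""
    rng = np.random.default_rng(key)           # stand-in keyed sequence source
    perm = rng.permutation(image.size)          # P-box: position shuffle
    keystream = rng.integers(0, 256, image.size, dtype=np.uint8)
    return (image.ravel()[perm] ^ keystream).reshape(image.shape)

def decrypt(cipher, key):
    rng = np.random.default_rng(key)            # regenerate the same sequences
    perm = rng.permutation(cipher.size)
    keystream = rng.integers(0, 256, cipher.size, dtype=np.uint8)
    flat = cipher.ravel() ^ keystream            # undo the diffusion
    out = np.empty_like(flat)
    out[perm] = flat                             # undo the P-box shuffle
    return out.reshape(cipher.shape)

img = np.random.default_rng(1).integers(0, 256, (64, 64), dtype=np.uint8)
assert np.array_equal(decrypt(encrypt(img, key=2024), key=2024), img)
```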
A generator for unique quantum random numbers based on vacuum states
NASA Astrophysics Data System (ADS)
Gabriel, Christian; Wittmann, Christoffer; Sych, Denis; Dong, Ruifang; Mauerer, Wolfgang; Andersen, Ulrik L.; Marquardt, Christoph; Leuchs, Gerd
2010-10-01
Random numbers are a valuable component in diverse applications ranging from simulations and gambling to cryptography. The quest for true randomness in these applications has engendered a large variety of different proposals for producing random numbers based on the foundational unpredictability of quantum mechanics. However, most approaches do not consider that a potential adversary could have knowledge about the generated numbers, so the numbers are not verifiably random and unique. Here we present a simple experimental setup based on homodyne measurements that uses the purity of a continuous-variable quantum vacuum state to generate unique random numbers. We use the intrinsic randomness in measuring the quadratures of a mode in the lowest energy vacuum state, which cannot be correlated to any other state. The simplicity of our source, combined with its verifiably unique randomness, is an important attribute for achieving high-reliability, high-speed and low-cost quantum random number generators.
NASA Astrophysics Data System (ADS)
Barbagallo, Annamaria; Di Meglio, Guglielmo; Mauro, Paolo
2017-07-01
The aim of the paper is to study, in a Hilbert space setting, a general random oligopolistic market equilibrium problem in presence of both production and demand excesses and to characterize the random Cournot-Nash equilibrium principle by means of a stochastic variational inequality. Some existence results are presented.
Random ambience using high fidelity images
NASA Astrophysics Data System (ADS)
Abu, Nur Azman; Sahib, Shahrin
2011-06-01
Most secure communication nowadays mandates true random keys as input. These operations are usually designed and managed by the developers of the cryptosystem. Given the confidential nature of crypto development today, pseudorandom keys are typically designed and still preferred by cryptosystem developers. However, pseudorandom keys are predictable, periodic and repeatable, and hence carry minimal entropy. True random keys are believed to be obtainable only via hardware random number generators. Careful statistical analysis is still required to gain confidence that the process and apparatus generate numbers sufficiently random for cryptographic use. In this research, each moment in life is considered unique in itself: a random key is unique to the moment at which the user generates it for practical secure communication. The ambience captured in a high-fidelity digital image is tested for randomness according to the NIST Statistical Test Suite, and a recommendation for generating random cryptographic keys live, at a rate of 4 megabits per second, is reported.
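As a hedged sketch of the harvesting step, the fragment below extracts candidate bits from the least-significant bits of pixel values and applies the NIST-style monobit frequency test; the image array here is a simulated stand-in for a high-fidelity capture.

```python
import numpy as np
from math import erfc, sqrt

def lsb_bits(image):
    """Candidate random bits: least-significant bit of every pixel value."""
    return (np.asarray(image).ravel() & 1).astype(np.int64)

def monobit_p_value(bits):
    """Frequency (monobit) test p-value, as in NIST SP 800-22."""
    s = abs(int(np.sum(2 * bits - 1)))     # excess of ones over zeros
    return erfc(s / sqrt(2 * bits.size))

image = np.random.default_rng(1).integers(0, 256, (512, 512))
bits = lsb_bits(image)
print(monobit_p_value(bits))               # p > 0.01 suggests a pass
```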
Shibayama, Yusuke; Arimura, Hidetaka; Hirose, Taka-Aki; Nakamoto, Takahiro; Sasaki, Tomonari; Ohga, Saiji; Matsushita, Norimasa; Umezu, Yoshiyuki; Nakamura, Yasuhiko; Honda, Hiroshi
2017-05-01
The setup errors and organ motion errors pertaining to clinical target volume (CTV) have been considered as two major causes of uncertainties in the determination of the CTV-to-planning target volume (PTV) margins for prostate cancer radiation treatment planning. We based our study on the assumption that interfractional target shape variations are not negligible as another source of uncertainty for the determination of precise CTV-to-PTV margins. Thus, we investigated the interfractional shape variations of CTVs based on a point distribution model (PDM) for prostate cancer radiation therapy. To quantitate the shape variations of CTVs, the PDM was applied to the contours of four types of CTV regions (low-risk, intermediate-risk, high-risk CTVs, and prostate plus entire seminal vesicles), which were delineated by considering prostate cancer risk groups on planning computed tomography (CT) and cone beam CT (CBCT) images of 73 fractions of 10 patients. The standard deviations (SDs) of the interfractional random errors for shape variations were obtained from covariance matrices based on the PDMs, which were generated from vertices of triangulated CTV surfaces. The correspondences between CTV surface vertices were determined based on a thin-plate spline robust point matching algorithm. The systematic error for shape variations was defined as the average deviation between surfaces of an average CTV and planning CTVs, and the random error as the average deviation of CTV surface vertices for fractions from an average CTV surface. The means of the SDs of the systematic errors for the four types of CTVs ranged from 1.0 to 2.0 mm along the anterior direction, 1.2 to 2.6 mm along the posterior direction, 1.0 to 2.5 mm along the superior direction, 0.9 to 1.9 mm along the inferior direction, 0.9 to 2.6 mm along the right direction, and 1.0 to 3.0 mm along the left direction. Concerning the random errors, the means of the SDs ranged from 0.9 to 1.2 mm along the anterior direction, 1.0 to 1.4 mm along the posterior direction, 0.9 to 1.3 mm along the superior direction, 0.8 to 1.0 mm along the inferior direction, 0.8 to 0.9 mm along the right direction, and 0.8 to 1.0 mm along the left direction. Since the shape variations were not negligible for intermediate and high-risk CTVs, they should be taken into account for the determination of the CTV-to-PTV margins in radiation treatment planning of prostate cancer. © 2017 American Association of Physicists in Medicine.
Modeling stimulus variation in three common implicit attitude tasks.
Wolsiefer, Katie; Westfall, Jacob; Judd, Charles M
2017-08-01
We explored the consequences of ignoring the sampling variation due to stimuli in the domain of implicit attitudes. A large literature in psycholinguistics has examined the statistical treatment of random stimulus materials, but the recommendations from this literature have not been applied to the social psychological literature on implicit attitudes. This is partly because of inherent complications in applying crossed random-effect models to some of the most common implicit attitude tasks, and partly because no work to date has demonstrated that random stimulus variation is in fact consequential in implicit attitude measurement. We addressed this problem by laying out statistically appropriate and practically feasible crossed random-effect models for three of the most commonly used implicit attitude measures-the Implicit Association Test, affect misattribution procedure, and evaluative priming task-and then applying these models to large datasets (average N = 3,206) that assess participants' implicit attitudes toward race, politics, and self-esteem. We showed that the test statistics from the traditional analyses are substantially (about 60 %) inflated relative to the more-appropriate analyses that incorporate stimulus variation. Because all three tasks used the same stimulus words and faces, we could also meaningfully compare the relative contributions of stimulus variation across the tasks. In an appendix, we give syntax in R, SAS, and SPSS for fitting the recommended crossed random-effects models to data from all three tasks, as well as instructions on how to structure the data file.
Superparamagnetic perpendicular magnetic tunnel junctions for true random number generators
NASA Astrophysics Data System (ADS)
Parks, Bradley; Bapna, Mukund; Igbokwe, Julianne; Almasi, Hamid; Wang, Weigang; Majetich, Sara A.
2018-05-01
Superparamagnetic perpendicular magnetic tunnel junctions are fabricated and analyzed for use in random number generators. Time-resolved resistance measurements are used as streams of bits in statistical tests for randomness. Voltage control of the thermal stability enables tuning the average speed of random bit generation up to 70 kHz in a 60 nm diameter device. In its most efficient operating mode, the device generates random bits at an energy cost of 600 fJ/bit. A narrow range of magnetic field tunes the probability of a given state from 0 to 1, offering a means of probabilistic computing.
Brownian motion properties of optoelectronic random bit generators based on laser chaos.
Li, Pu; Yi, Xiaogang; Liu, Xianglian; Wang, Yuncai; Wang, Yongge
2016-07-11
The nondeterministic property of the optoelectronic random bit generator (RBG) based on laser chaos is experimentally analyzed from two aspects: the central limit theorem and the law of the iterated logarithm. The random bits are extracted from an optical feedback chaotic laser diode using a multi-bit extraction technique in the electrical domain. Our experimental results demonstrate that the generated random bits have no statistical distance from Brownian motion, in addition to passing the state-of-the-art industry-benchmark statistical test suite (NIST SP 800-22). Together, these results give mathematically grounded evidence that the ultrafast random bit generator based on laser chaos can be used as a nondeterministic random bit source.
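The law-of-iterated-logarithm aspect can be checked directly: map bits to ±1, form partial sums S_n, and confirm that S_n/√(2n ln ln n) settles within [−1, 1]. The snippet below is our generic illustration of that test, not the authors' procedure.

```python
import numpy as np

def lil_statistic(bits):
    """S_n / sqrt(2 n ln ln n) for a +/-1 mapped bit stream; the law of
    the iterated logarithm says its limsup tends to +1 (liminf to -1)."""
    steps = 2 * np.asarray(bits, dtype=np.int64) - 1
    s = np.cumsum(steps)
    n = np.arange(1, s.size + 1)
    mask = n > 2                            # ln ln n is defined for n >= 3
    return s[mask] / np.sqrt(2 * n[mask] * np.log(np.log(n[mask])))

bits = np.random.default_rng(3).integers(0, 2, 1_000_000)
stat = lil_statistic(bits)
print(stat.min(), stat.max())               # should lie roughly within [-1, 1]
```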
A study of the breast cancer dynamics in North Carolina.
Christakos, G; Lai, J J
1997-11-01
This work is concerned with the study of breast cancer incidence in the State of North Carolina. Methodologically, the current analysis illustrates the importance of spatiotemporal random field modelling and introduces a mode of reasoning that is based on a combination of inductive and deductive processes. The composite space/time analysis utilizes the variability characteristics of incidence and the mathematical features of the random field model to fit it to the data. The analysis is significantly general and can efficiently represent non-homogeneous and non-stationary characteristics of breast cancer variation. Incidence predictions are produced using data at the same time period as well as data from other time periods and disease registries. The random field provides a rigorous and systematic method for generating detailed maps, which offer a quantitative description of the incidence variation from place to place and from time to time, together with a measure of the accuracy of the incidence maps. Spatiotemporal mapping accounts for the geographical locations and the time instants of the incidence observations, which is not usually the case with most empirical Bayes methods. It is also more accurate than purely spatial statistics methods, and can offer valuable information about the breast cancer risk and dynamics in North Carolina. Field studies could be initialized in high-rate areas identified by the maps in an effort to uncover environmental or life-style factors that might be responsible for the high risk rates. Also, the incidence maps can help elucidate causal mechanisms, explain disease occurrences at a certain scale, and offer guidance in health management and administration.
Facilitation of learning induced by both random and gradual visuomotor task variation
Braun, Daniel A.; Wolpert, Daniel M.
2012-01-01
Motor task variation has been shown to be a key ingredient in skill transfer, retention, and structural learning. However, many studies only compare training of randomly varying tasks to either blocked or null training, and it is not clear how experiencing different nonrandom temporal orderings of tasks might affect the learning process. Here we study learning in human subjects who experience the same set of visuomotor rotations, evenly spaced between −60° and +60°, either in a random order or in an order in which the rotation angle changed gradually. We compared subsequent learning of three test blocks of +30°→−30°→+30° rotations. The groups that underwent either random or gradual training showed significant (P < 0.01) facilitation of learning in the test blocks compared with a control group who had not experienced any visuomotor rotations before. We also found that movement initiation times in the random group during the test blocks were significantly (P < 0.05) lower than for the gradual or the control group. When we fit a state-space model with fast and slow learning processes to our data, we found that the differences in performance in the test block were consistent with the gradual or random task variation changing the learning and retention rates of only the fast learning process. Such adaptation of learning rates may be a key feature of ongoing meta-learning processes. Our results therefore suggest that both gradual and random task variation can induce meta-learning and that random learning has an advantage in terms of shorter initiation times, suggesting less reliance on cognitive processes. PMID:22131385
Towards a high-speed quantum random number generator
NASA Astrophysics Data System (ADS)
Stucki, Damien; Burri, Samuel; Charbon, Edoardo; Chunnilall, Christopher; Meneghetti, Alessio; Regazzoni, Francesco
2013-10-01
Randomness is of fundamental importance in various fields, such as cryptography, numerical simulations, or the gaming industry. Quantum physics, which is fundamentally probabilistic, is the best option for a physical random number generator. In this article, we will present the work carried out in various projects in the context of the development of a commercial and certified high speed random number generator.
Generating variable and random schedules of reinforcement using Microsoft Excel macros.
Bancroft, Stacie L; Bourret, Jason C
2008-01-01
Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time. Generating schedule values for variable and random reinforcement schedules can be difficult. The present article describes the steps necessary to write macros in Microsoft Excel that will generate variable-ratio, variable-interval, variable-time, random-ratio, random-interval, and random-time reinforcement schedule values.
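The same schedule values can be produced outside Excel. The Python sketch below is our port of the idea, not the article's macros: variable-interval values average a target interval, while random-interval values arrange a constant reinforcement probability per unit time via exponential waits.

```python
import numpy as np

def variable_interval(mean_s, n, rng=None):
    """n interval values averaging mean_s, drawn uniformly between
    0 and twice the mean (one simple VI construction)."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.uniform(0.0, 2.0 * mean_s, n)

def random_interval(mean_s, n, rng=None):
    """n interval values with a constant reinforcement probability per
    unit time: exponentially distributed waits with mean mean_s."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.exponential(mean_s, n)

print(variable_interval(30, 10).round(1))   # VI 30-s schedule values
print(random_interval(30, 10).round(1))     # RI 30-s schedule values
```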
Random Variation in Student Performance by Class Size: Implications of NCLB in Rural Pennsylvania
ERIC Educational Resources Information Center
Goetz, Stephan J.
2005-01-01
Schools that fail to make "adequate yearly progress" under NCLB face sanctions and may lose students to other schools. In smaller schools, random yearly variation in innate student ability and behavior can cause changes in scores that are beyond the influence of teachers. This study examines changes in reading and math scores across…
Dynamic Loads Generation for Multi-Point Vibration Excitation Problems
NASA Technical Reports Server (NTRS)
Shen, Lawrence
2011-01-01
A random-force method has been developed to predict dynamic loads produced by rocket-engine random vibrations for new rocket-engine designs. The method develops random forces at multiple excitation points based on random vibration environments scaled from accelerometer data obtained during hot-fire tests of existing rocket engines. This random-force method applies random forces to the model and creates expected dynamic response in a manner that simulates the way the operating engine applies self-generated random vibration forces (random pressure acting on an area) with the resulting responses that we measure with accelerometers. This innovation includes the methodology (implementation sequence), the computer code, two methods to generate the random-force vibration spectra, and two methods to reduce some of the inherent conservatism in the dynamic loads. This methodology would be implemented to generate the random-force spectra at excitation nodes without requiring the use of artificial boundary conditions in a finite element model. More accurate random dynamic loads than those predicted by current industry methods can then be generated using the random force spectra. The scaling method used to develop the initial power spectral density (PSD) environments for deriving the random forces for the rocket engine case is based on the Barrett Criteria developed at Marshall Space Flight Center in 1963. This invention approach can be applied in the aerospace, automotive, and other industries to obtain reliable dynamic loads and responses from a finite element model for any structure subject to multipoint random vibration excitations.
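Random-force time histories consistent with a prescribed PSD are commonly synthesized in the frequency domain. The sketch below is a generic illustration with an assumed flat force PSD, not the NASA code: complex Gaussian coefficients are scaled so the output variance matches the integral of the PSD.

```python
import numpy as np

def force_from_psd(psd, fs, n, rng=None):
    """Time series whose one-sided PSD approximates `psd`
    (array of length n//2 + 1, units N^2/Hz) at sample rate fs."""
    rng = np.random.default_rng() if rng is None else rng
    df = fs / n
    amp = np.sqrt(psd * df) / 2.0            # per-bin amplitude: Var = sum(PSD*df)
    coeff = amp * (rng.standard_normal(amp.size)
                   + 1j * rng.standard_normal(amp.size))
    coeff[0] = 0.0                            # zero-mean force
    return np.fft.irfft(coeff, n) * n         # back to the time domain

fs, n = 2048.0, 8192
psd = np.full(n // 2 + 1, 1e-2)               # assumed flat PSD, 0.01 N^2/Hz
f = force_from_psd(psd, fs, n)
print(f.var(), psd.sum() * fs / n)            # variance ~ integral of the PSD
```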
Secure uniform random-number extraction via incoherent strategies
NASA Astrophysics Data System (ADS)
Hayashi, Masahito; Zhu, Huangjun
2018-01-01
To guarantee the security of uniform random numbers generated by a quantum random-number generator, we study secure extraction of uniform random numbers when the environment of a given quantum state is controlled by the third party, the eavesdropper. Here we restrict our operations to incoherent strategies that are composed of the measurement on the computational basis and incoherent operations (or incoherence-preserving operations). We show that the maximum secure extraction rate is equal to the relative entropy of coherence. By contrast, the coherence of formation gives the extraction rate when a certain constraint is imposed on the eavesdropper's operations. The condition under which the two extraction rates coincide is then determined. Furthermore, we find that the exponential decreasing rate of the leaked information is characterized by Rényi relative entropies of coherence. These results clarify the power of incoherent strategies in random-number generation, and can be applied to guarantee the quality of random numbers generated by a quantum random-number generator.
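For background, the quantity that appears here as the maximum secure extraction rate has a standard closed form in the resource theory of coherence (quoted as general context, not reproduced from this abstract):

```latex
C_{\mathrm{rel}}(\rho) = S\big(\Delta(\rho)\big) - S(\rho), \qquad
\Delta(\rho) = \sum_i \langle i|\rho|i\rangle\, |i\rangle\langle i| , \qquad
S(\rho) = -\operatorname{Tr}\rho\log\rho ,
```

where Δ dephases the state in the computational (measurement) basis and S is the von Neumann entropy.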
Certified randomness in quantum physics.
Acín, Antonio; Masanes, Lluis
2016-12-07
The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.
Lisovskiĭ, A A; Pavlinov, I Ia
2008-01-01
Any morphospace is partitioned by the forms of group variation, and its structure is described by a set of scalar (range, overlap) and vector (direction) characteristics. These are analyzed quantitatively for sex and age variation in a sample of 200 skulls of the pine marten described by 14 measurable traits. Standard dispersion and variance components analyses are employed, accompanied by several resampling methods (randomization and bootstrap); effects of changes in the analysis design on the results of these methods are also considered. The maximum likelihood algorithm of variance components analysis is shown to give adequate estimates of the portions of particular forms of group variation within the overall disparity. It is quite stable with respect to changes of the analysis design and therefore can be used in explorations of real data with variously unbalanced designs. A new algorithm for estimating the co-directionality of particular forms of group variation within the overall disparity is elaborated, which includes angle measures between eigenvectors of covariation matrices of the effects of group variations calculated by dispersion analysis. A null hypothesis of a random portion of a given group variation can be tested by means of randomization of the respective grouping variable. A null hypothesis of equality of both portions and directionalities of different forms of group variation can be tested by means of the bootstrap procedure.
Problems with the random number generator RANF implemented on the CDC cyber 205
NASA Astrophysics Data System (ADS)
Kalle, Claus; Wansleben, Stephan
1984-10-01
We show that using RANF may lead to wrong results when lattice models are simulated by Monte Carlo methods. We present a shift-register sequence random number generator which generates two random numbers per cycle on a two pipe CDC Cyber 205.
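Shift-register sequence generators of this family are simple to state. The sketch below implements the classic R(250, 103) recurrence x_n = x_{n−250} XOR x_{n−103} as a generic illustration; the paper's exact lags and word size are not given in the abstract.

```python
import numpy as np

class ShiftRegisterRNG:
    """Generalized feedback shift-register generator on 32-bit words:
    x[n] = x[n-250] XOR x[n-103] (illustrative R(250,103) sketch)."""
    P, Q = 250, 103

    def __init__(self, seed=12345):
        aux = np.random.default_rng(seed)        # fill the lag table
        self.state = [int(w) for w in aux.integers(0, 2**32, self.P)]
        self.i = 0

    def next(self):
        j = (self.i - self.Q) % self.P            # position of x[n-Q]
        x = self.state[self.i] ^ self.state[j]    # state[i] holds x[n-P]
        self.state[self.i] = x
        self.i = (self.i + 1) % self.P
        return x

gen = ShiftRegisterRNG()
print([gen.next() % 100 for _ in range(5)])
```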
Recommendations and illustrations for the evaluation of photonic random number generators
NASA Astrophysics Data System (ADS)
Hart, Joseph D.; Terashima, Yuta; Uchida, Atsushi; Baumgartner, Gerald B.; Murphy, Thomas E.; Roy, Rajarshi
2017-09-01
The never-ending quest to improve the security of digital information combined with recent improvements in hardware technology has caused the field of random number generation to undergo a fundamental shift from relying solely on pseudo-random algorithms to employing optical entropy sources. Despite these significant advances on the hardware side, commonly used statistical measures and evaluation practices remain ill-suited to understand or quantify the optical entropy that underlies physical random number generation. We review the state of the art in the evaluation of optical random number generation and recommend a new paradigm: quantifying entropy generation and understanding the physical limits of the optical sources of randomness. In order to do this, we advocate for the separation of the physical entropy source from deterministic post-processing in the evaluation of random number generators and for the explicit consideration of the impact of the measurement and digitization process on the rate of entropy production. We present the Cohen-Procaccia estimate of the entropy rate h(ε, τ) as one way to do this. In order to provide an illustration of our recommendations, we apply the Cohen-Procaccia estimate as well as the entropy estimates from the new NIST draft standards for physical random number generators to evaluate and compare three common optical entropy sources: single photon time-of-arrival detection, chaotic lasers, and amplified spontaneous emission.
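To fix ideas about the ε and τ dependence, a much-simplified pattern-entropy stand-in for the Cohen-Procaccia estimator can be sketched: coarse-grain the signal with resolution ε, subsample at interval τ, and difference consecutive block entropies. This is only a didactic approximation, not the estimator used in the paper.

```python
import numpy as np
from collections import Counter

def block_entropy(symbols, d):
    """Shannon entropy (bits) of length-d words in a symbol stream."""
    words = [tuple(symbols[i:i + d]) for i in range(len(symbols) - d + 1)]
    counts = np.array(list(Counter(words).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def entropy_rate(signal, eps, tau_samples, d=3):
    """h(eps, tau) ~ H_d - H_{d-1} in bits per retained sample; divide by
    tau in seconds to convert to bits per unit time."""
    coarse = np.floor(np.asarray(signal)[::tau_samples] / eps).astype(int)
    return block_entropy(list(coarse), d) - block_entropy(list(coarse), d - 1)

x = np.random.default_rng(0).standard_normal(200_000)
print(entropy_rate(x, eps=1.0, tau_samples=1))   # bits per sample
```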
Quantitative recurrence for free semigroup actions
NASA Astrophysics Data System (ADS)
Carvalho, Maria; Rodrigues, Fagner B.; Varandas, Paulo
2018-03-01
We consider finitely generated free semigroup actions on a compact metric space and obtain quantitative information on Poincaré recurrence, average first return time and hitting frequency for the random orbits induced by the semigroup action. Besides, we relate the recurrence to balls with the rates of expansion of the semigroup generators and the topological entropy of the semigroup action. Finally, we establish a partial variational principle and prove an ergodic optimization for this kind of dynamical action. MC has been financially supported by CMUP (UID/MAT/00144/2013), which is funded by FCT (Portugal) with national (MEC) and European structural funds (FEDER) under the partnership agreement PT2020. FR and PV were partially supported by BREUDS. PV has also benefited from a fellowship awarded by CNPq-Brazil and is grateful to the Faculty of Sciences of the University of Porto for the excellent research conditions.
Degree-constrained multicast routing for multimedia communications
NASA Astrophysics Data System (ADS)
Wang, Yanlin; Sun, Yugeng; Li, Guidan
2005-02-01
Multicast services are increasingly used by many multimedia applications. As one of the key techniques supporting such applications, rational and effective multicast routing algorithms are critical to network performance. When switch nodes in a network have different multicast capabilities, the multicast routing problem is modeled as the degree-constrained Steiner problem. We present two heuristic algorithms, named BMSTA and BSPTA, for the degree-constrained case in multimedia communications. Both algorithms generate degree-constrained multicast trees with bandwidth and end-to-end delay bounds. Simulations over random networks were carried out to compare the performance of the two proposed algorithms. Experimental results show that the proposed algorithms have advantages in traffic load balancing, which avoids link blocking and efficiently enhances network performance. BMSTA is better than BSPTA at finding unsaturated links and/or unsaturated nodes when generating multicast trees. The performance of BMSTA is affected by the variation of degree constraints.
Modelling small-area inequality in premature mortality using years of life lost rates
NASA Astrophysics Data System (ADS)
Congdon, Peter
2013-04-01
Analysis of premature mortality variations via standardized expected years of life lost (SEYLL) measures raises questions about suitable modelling for mortality data, especially when developing SEYLL profiles for areas with small populations. Existing fixed effects estimation methods take no account of correlations in mortality levels over ages, causes, socio-ethnic groups or areas. They also do not specify an underlying data generating process, or a likelihood model that can include trends or correlations, and are likely to produce unstable estimates for small-areas. An alternative strategy involves a fully specified data generation process, and a random effects model which "borrows strength" to produce stable SEYLL estimates, allowing for correlations between ages, areas and socio-ethnic groups. The resulting modelling strategy is applied to gender-specific differences in SEYLL rates in small-areas in NE London, and to cause-specific mortality for leading causes of premature mortality in these areas.
Generating Variable and Random Schedules of Reinforcement Using Microsoft Excel Macros
Bancroft, Stacie L; Bourret, Jason C
2008-01-01
Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time. Generating schedule values for variable and random reinforcement schedules can be difficult. The present article describes the steps necessary to write macros in Microsoft Excel that will generate variable-ratio, variable-interval, variable-time, random-ratio, random-interval, and random-time reinforcement schedule values. PMID:18595286
Programmable random interval generator
NASA Technical Reports Server (NTRS)
Lindsey, R. S., Jr.
1973-01-01
Random pulse generator can supply constant-amplitude randomly distributed pulses with average rate ranging from a few counts per second to more than one million counts per second. Generator requires no high-voltage power supply or any special thermal cooling apparatus. Device is uniquely versatile and provides wide dynamic range of operation.
Occurrence of CPPopt Values in Uncorrelated ICP and ABP Time Series.
Cabeleira, M; Czosnyka, M; Liu, X; Donnelly, J; Smielewski, P
2018-01-01
Optimal cerebral perfusion pressure (CPPopt) is a concept that uses the pressure reactivity (PRx)-CPP relationship over a given period to find a value of CPP at which PRx shows best autoregulation. It has been proposed that this relationship be modelled by a U-shaped curve, where the minimum is interpreted as being the CPP value that corresponds to the strongest autoregulation. Owing to the nature of the calculation and the signals involved in it, the occurrence of CPPopt curves generated by non-physiological variations of intracranial pressure (ICP) and arterial blood pressure (ABP), termed here "false positives", is possible. Such random occurrences would artificially increase the yield of CPPopt values and decrease the reliability of the methodology. In this work, we studied the probability of the random occurrence of false positives and compared the effects of the parameters used for CPPopt calculation on this probability. To simulate the occurrence of false positives, uncorrelated ICP and ABP time series were generated by destroying the relationship between the waves in real recordings. The CPPopt algorithm was then applied to these new series and the number of false positives was counted for different values of the algorithm's parameters. The percentage of CPPopt curves generated from uncorrelated data was demonstrated to be 11.5%. This value can be minimised by tuning some of the calculation parameters, such as increasing the calculation window and increasing the minimum PRx span accepted on the curve.
Experimentally generated randomness certified by the impossibility of superluminal signals.
Bierhorst, Peter; Knill, Emanuel; Glancy, Scott; Zhang, Yanbao; Mink, Alan; Jordan, Stephen; Rommal, Andrea; Liu, Yi-Kai; Christensen, Bradley; Nam, Sae Woo; Stevens, Martin J; Shalm, Lynden K
2018-04-01
From dice to modern electronic circuits, there have been many attempts to build better devices to generate random numbers. Randomness is fundamental to security and cryptographic systems and to safeguarding privacy. A key challenge with random-number generators is that it is hard to ensure that their outputs are unpredictable [1-3]. For a random-number generator based on a physical process, such as a noisy classical system or an elementary quantum measurement, a detailed model that describes the underlying physics is necessary to assert unpredictability. Imperfections in the model compromise the integrity of the device. However, it is possible to exploit the phenomenon of quantum non-locality with a loophole-free Bell test to build a random-number generator that can produce output that is unpredictable to any adversary that is limited only by general physical principles, such as special relativity [1-11]. With recent technological developments, it is now possible to carry out such a loophole-free Bell test [12-14,22]. Here we present certified randomness obtained from a photonic Bell experiment and extract 1,024 random bits that are uniformly distributed to within 10^-12. These random bits could not have been predicted according to any physical theory that prohibits faster-than-light (superluminal) signalling and that allows independent measurement choices. To certify and quantify the randomness, we describe a protocol that is optimized for devices that are characterized by a low per-trial violation of Bell inequalities. Future random-number generators based on loophole-free Bell tests may have a role in increasing the security and trust of our cryptographic systems and infrastructure.
ERIC Educational Resources Information Center
Bloom, Howard S.; Raudenbush, Stephen W.; Weiss, Michael J.; Porter, Kristin
2017-01-01
The present article considers a fundamental question in evaluation research: "By how much do program effects vary across sites?" The article first presents a theoretical model of cross-site impact variation and a related estimation model with a random treatment coefficient and fixed site-specific intercepts. This approach eliminates…
Lotka-Volterra system in a random environment.
Dimentberg, Mikhail F
2002-03-01
Classical Lotka-Volterra (LV) model for oscillatory behavior of population sizes of two interacting species (predator-prey or parasite-host pairs) is conservative. This may imply unrealistically high sensitivity of the system's behavior to environmental variations. Thus, a generalized LV model is considered with the equation for preys' reproduction containing the following additional terms: quadratic "damping" term that accounts for interspecies competition, and term with white-noise random variations of the preys' reproduction factor that simulates the environmental variations. An exact solution is obtained for the corresponding Fokker-Planck-Kolmogorov equation for stationary probability densities (PDF's) of the population sizes. It shows that both population sizes are independent gamma-distributed stationary random processes. Increasing level of the environmental variations does not lead to extinction of the populations. However it may lead to an intermittent behavior, whereby one or both population sizes experience very rare and violent short pulses or outbreaks while remaining on a very low level most of the time. This intermittency is described analytically by direct use of the solutions for the PDF's as well as by applying theory of excursions of random functions and by predicting PDF of peaks in the predators' population size.
Lotka-Volterra system in a random environment
NASA Astrophysics Data System (ADS)
Dimentberg, Mikhail F.
2002-03-01
Classical Lotka-Volterra (LV) model for oscillatory behavior of population sizes of two interacting species (predator-prey or parasite-host pairs) is conservative. This may imply unrealistically high sensitivity of the system's behavior to environmental variations. Thus, a generalized LV model is considered with the equation for preys' reproduction containing the following additional terms: quadratic ``damping'' term that accounts for interspecies competition, and term with white-noise random variations of the preys' reproduction factor that simulates the environmental variations. An exact solution is obtained for the corresponding Fokker-Planck-Kolmogorov equation for stationary probability densities (PDF's) of the population sizes. It shows that both population sizes are independent γ-distributed stationary random processes. Increasing level of the environmental variations does not lead to extinction of the populations. However it may lead to an intermittent behavior, whereby one or both population sizes experience very rare and violent short pulses or outbreaks while remaining on a very low level most of the time. This intermittency is described analytically by direct use of the solutions for the PDF's as well as by applying theory of excursions of random functions and by predicting PDF of peaks in the predators' population size.
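An Euler-Maruyama sketch of the generalized model reads as follows; all parameter values are arbitrary placeholders for illustration, not taken from the paper.

```python
import numpy as np

def simulate_lv(a=1.0, b=1.0, c=1.0, d=1.0, eps=0.05, sigma=0.3,
                u0=1.0, v0=1.0, dt=1e-3, steps=100_000, seed=0):
    """Euler-Maruyama integration of the generalized Lotka-Volterra model:
      prey:     du = u*(a - b*v - eps*u)*dt + sigma*u*dW
                     (white-noise reproduction factor, quadratic damping)
      predator: dv = v*(d*u - c)*dt
    """
    rng = np.random.default_rng(seed)
    u = np.empty(steps); v = np.empty(steps)
    u[0], v[0] = u0, v0
    for k in range(steps - 1):
        dW = rng.normal(0.0, np.sqrt(dt))
        u[k + 1] = max(u[k] + u[k] * (a - b * v[k] - eps * u[k]) * dt
                       + sigma * u[k] * dW, 1e-12)
        v[k + 1] = max(v[k] + v[k] * (d * u[k] - c) * dt, 1e-12)
    return u, v

u, v = simulate_lv()
print(u.mean(), v.mean())        # long-run average population sizes
```

Raising sigma in this toy model should push it toward the intermittent regime described above: trajectories spend most of the time near zero, punctuated by rare outbreaks.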
NASA Astrophysics Data System (ADS)
Ambekar Ramachandra Rao, Raghu; Mehta, Monal R.; Toussaint, Kimani C., Jr.
2010-02-01
We demonstrate the use of Fourier transform-second-harmonic generation (FT-SHG) imaging of collagen fibers as a means of performing quantitative analysis of obtained images of selected spatial regions in porcine trachea, ear, and cornea. Two quantitative markers, preferred orientation and maximum spatial frequency are proposed for differentiating structural information between various spatial regions of interest in the specimens. The ear shows consistent maximum spatial frequency and orientation as also observed in its real-space image. However, there are observable changes in the orientation and minimum feature size of fibers in the trachea indicating a more random organization. Finally, the analysis is applied to a 3D image stack of the cornea. It is shown that the standard deviation of the orientation is sensitive to the randomness in fiber orientation. Regions with variations in the maximum spatial frequency, but with relatively constant orientation, suggest that maximum spatial frequency is useful as an independent quantitative marker. We emphasize that FT-SHG is a simple, yet powerful, tool for extracting information from images that is not obvious in real space. This technique can be used as a quantitative biomarker to assess the structure of collagen fibers that may change due to damage from disease or physical injury.
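A minimal reading of the orientation analysis can be sketched as follows (our illustration, not the authors' code): compute the 2D power spectrum, histogram its energy by angle, and rotate the dominant spectral direction by 90° to obtain the fiber orientation.

```python
import numpy as np

def preferred_orientation(image, n_bins=180):
    """Dominant orientation (degrees) from the energy-weighted angular
    histogram of the 2D power spectrum; fibers run perpendicular to the
    dominant spectral direction."""
    f = np.fft.fftshift(np.fft.fft2(image))
    power = np.abs(f) ** 2
    ny, nx = image.shape
    power[ny // 2, nx // 2] = 0.0                  # drop the DC term
    y, x = np.indices((ny, nx))
    ang = np.degrees(np.arctan2(y - ny // 2, x - nx // 2)) % 180
    hist, edges = np.histogram(ang, bins=n_bins, range=(0, 180), weights=power)
    return (edges[np.argmax(hist)] + 90) % 180

stripes = np.sin(2 * np.pi * np.arange(256) / 16)[None, :] * np.ones((256, 1))
print(preferred_orientation(stripes))              # ~90 deg: vertical stripes
```

The spread of the same angular histogram serves as the randomness-sensitive marker, and the largest radius carrying significant spectral energy corresponds to the maximum spatial frequency.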
APPLICATION OF TRAVEL TIME RELIABILITY FOR PERFORMANCE ORIENTED OPERATIONAL PLANNING OF EXPRESSWAYS
NASA Astrophysics Data System (ADS)
Mehran, Babak; Nakamura, Hideki
Evaluation of impacts of congestion improvement schemes on travel time reliability is very significant for road authorities since travel time reliability represents operational performance of expressway segments. In this paper, a methodology is presented to estimate travel time reliability prior to implementation of congestion relief schemes based on travel time variation modeling as a function of demand, capacity, weather conditions and road accidents. For subject expressway segments, traffic conditions are modeled over a whole year considering demand and capacity as random variables. Patterns of demand and capacity are generated for each five-minute interval by applying the Monte Carlo simulation technique, and accidents are randomly generated based on a model that links accident rate to traffic conditions. A whole-year analysis is performed by comparing demand and available capacity for each scenario, and queue length is estimated through shockwave analysis for each time interval. Travel times are estimated from refined speed-flow relationships developed for intercity expressways, and the buffer time index is estimated consequently as a measure of travel time reliability. For validation, estimated reliability indices are compared with measured values from empirical data, and it is shown that the proposed method is suitable for operational evaluation and planning purposes.
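A stripped-down version of that Monte Carlo loop is sketched below. The speed-flow relationship here is a BPR-style placeholder rather than the refined relationships from the paper, and weather and accident effects are omitted; the buffer time index is BTI = (95th-percentile travel time − mean)/mean.

```python
import numpy as np

def buffer_time_index(n_intervals=105_120, t_free=10.0, seed=0):
    """Monte Carlo sketch over one year of 5-minute intervals with random
    demand and capacity; travel time via t = t_free*(1 + 0.15*(q/c)^4)."""
    rng = np.random.default_rng(seed)
    demand = rng.normal(1500, 300, n_intervals).clip(min=0)      # veh/h
    capacity = rng.normal(2000, 150, n_intervals).clip(min=1)    # veh/h
    tt = t_free * (1 + 0.15 * (demand / capacity) ** 4)          # minutes
    return (np.percentile(tt, 95) - tt.mean()) / tt.mean()

print(buffer_time_index())     # e.g. ~0.1 means a 10% buffer is needed
```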
Pseudo-random number generator for the Sigma 5 computer
NASA Technical Reports Server (NTRS)
Carroll, S. N.
1983-01-01
A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
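For a 32-bit word the canonical instance of this construction is the Park-Miller pair: modulus m = 2^31 − 1, the largest prime representable in a signed 32-bit word, with primitive root a = 16807. The sketch below illustrates the scheme; the Sigma 5's actual word length and selected root may differ.

```python
class LinearCongruential:
    """Multiplicative LCG x <- (a*x) mod m with prime modulus m and a
    primitive root a (Park-Miller parameters shown for illustration)."""
    M = 2**31 - 1            # largest prime in a signed 32-bit word
    A = 16807                # a primitive root modulo M

    def __init__(self, seed=1):
        self.x = seed % self.M or 1      # seed must lie in [1, M-1]

    def next_uniform(self):
        self.x = (self.A * self.x) % self.M
        return self.x / self.M           # uniform variate in (0, 1)

gen = LinearCongruential(seed=42)
print([round(gen.next_uniform(), 4) for _ in range(5)])
```

Because the modulus is prime and the multiplier is a primitive root, the generator attains its full period of m − 1; Marsaglia's lattice test then screens candidate roots for good multidimensional structure.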
Implementation of a quantum random number generator based on the optimal clustering of photocounts
NASA Astrophysics Data System (ADS)
Balygin, K. A.; Zaitsev, V. I.; Klimov, A. N.; Kulik, S. P.; Molotkov, S. N.
2017-10-01
To implement quantum random number generators, it is fundamentally important to have a mathematically provable and experimentally testable process of measurements of a system from which an initial random sequence is generated. This makes sure that the randomness indeed has a quantum nature. A quantum random number generator has been implemented with the use of the detection of quasi-single-photon radiation by a silicon photomultiplier (SiPM) matrix, which makes it possible to reliably reach the Poisson statistics of photocounts. The choice and use of the optimal clustering of photocounts for the initial sequence of photodetection events, together with a method of extraction of a random sequence of 0's and 1's that is polynomial in the length of the sequence, have made it possible to reach an output rate of 64 Mbit/s for the certified random sequence.
Living Organisms Author Their Read-Write Genomes in Evolution
2017-01-01
Evolutionary variations generating phenotypic adaptations and novel taxa resulted from complex cellular activities altering genome content and expression: (i) Symbiogenetic cell mergers producing the mitochondrion-bearing ancestor of eukaryotes and chloroplast-bearing ancestors of photosynthetic eukaryotes; (ii) interspecific hybridizations and genome doublings generating new species and adaptive radiations of higher plants and animals; and, (iii) interspecific horizontal DNA transfer encoding virtually all of the cellular functions between organisms and their viruses in all domains of life. Consequently, assuming that evolutionary processes occur in isolated genomes of individual species has become an unrealistic abstraction. Adaptive variations also involved natural genetic engineering of mobile DNA elements to rewire regulatory networks. In the most highly evolved organisms, biological complexity scales with “non-coding” DNA content more closely than with protein-coding capacity. Coincidentally, we have learned how so-called “non-coding” RNAs that are rich in repetitive mobile DNA sequences are key regulators of complex phenotypes. Both biotic and abiotic ecological challenges serve as triggers for episodes of elevated genome change. The intersections of cell activities, biosphere interactions, horizontal DNA transfers, and non-random Read-Write genome modifications by natural genetic engineering provide a rich molecular and biological foundation for understanding how ecological disruptions can stimulate productive, often abrupt, evolutionary transformations. PMID:29211049
Living Organisms Author Their Read-Write Genomes in Evolution.
Shapiro, James A
2017-12-06
Evolutionary variations generating phenotypic adaptations and novel taxa resulted from complex cellular activities altering genome content and expression: (i) Symbiogenetic cell mergers producing the mitochondrion-bearing ancestor of eukaryotes and chloroplast-bearing ancestors of photosynthetic eukaryotes; (ii) interspecific hybridizations and genome doublings generating new species and adaptive radiations of higher plants and animals; and, (iii) interspecific horizontal DNA transfer encoding virtually all of the cellular functions between organisms and their viruses in all domains of life. Consequently, assuming that evolutionary processes occur in isolated genomes of individual species has become an unrealistic abstraction. Adaptive variations also involved natural genetic engineering of mobile DNA elements to rewire regulatory networks. In the most highly evolved organisms, biological complexity scales with "non-coding" DNA content more closely than with protein-coding capacity. Coincidentally, we have learned how so-called "non-coding" RNAs that are rich in repetitive mobile DNA sequences are key regulators of complex phenotypes. Both biotic and abiotic ecological challenges serve as triggers for episodes of elevated genome change. The intersections of cell activities, biosphere interactions, horizontal DNA transfers, and non-random Read-Write genome modifications by natural genetic engineering provide a rich molecular and biological foundation for understanding how ecological disruptions can stimulate productive, often abrupt, evolutionary transformations.
Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Analysis
NASA Technical Reports Server (NTRS)
Hanson, J. M.; Beard, B. B.
2010-01-01
This Technical Publication (TP) is meant to address a number of topics related to the application of Monte Carlo simulation to launch vehicle design and requirements analysis. Although the focus is on a launch vehicle application, the methods may be applied to other complex systems as well. The TP is organized so that all the important topics are covered in the main text, and detailed derivations are in the appendices. The TP first introduces Monte Carlo simulation and the major topics to be discussed, including discussion of the input distributions for Monte Carlo runs, testing the simulation, how many runs are necessary for verification of requirements, what to do if results are desired for events that happen only rarely, and postprocessing, including analyzing any failed runs, examples of useful output products, and statistical information for generating desired results from the output data. Topics in the appendices include some tables for requirements verification, derivation of the number of runs required and generation of output probabilistic data with consumer risk included, derivation of launch vehicle models to include possible variations of assembled vehicles, minimization of a consumable to achieve a two-dimensional statistical result, recontact probability during staging, ensuring duplicated Monte Carlo random variations, and importance sampling.
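One appendix-style result is easy to state numerically: if a requirement must hold with probability P at confidence C and all n Monte Carlo runs succeed, n must satisfy P^n ≤ 1 − C. The helper below computes this zero-failure run count; it is the textbook formula, offered for illustration rather than reproduced from the TP's tables.

```python
import math

def runs_required(p_success, confidence):
    """Smallest n with p_success**n <= 1 - confidence: the number of
    all-successful Monte Carlo runs needed to verify the requirement."""
    return math.ceil(math.log(1.0 - confidence) / math.log(p_success))

# Verifying a 99.73% requirement at 90% confidence:
print(runs_required(0.9973, 0.90))    # -> 852 runs, all of which must pass
```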
Murchie, P; Chowdhury, A; Smith, S; Campbell, N C; Lee, A J; Linden, D; Burton, C D
2015-05-26
Publicly available data show variation in GPs' use of urgent suspected cancer (USC) referral pathways. We investigated whether this could be due to small numbers of cancer cases and random case-mix, rather than due to true variation in performance. We analysed individual GP practice USC referral detection rates (proportion of the practice's cancer cases that are detected via USC) and conversion rates (proportion of the practice's USC referrals that prove to be cancer) in routinely collected data from GP practices in all of England (over 4 years) and northeast Scotland (over 7 years). We explored the effect of pooling data. We then modelled the effects of adding random case-mix to practice variation. Correlations between practice detection rate and conversion rate became less positive when data were aggregated over several years. Adding random case-mix to between-practice variation indicated that the median proportion of poorly performing practices correctly identified after 25 cancer cases were examined was 20% (IQR 17 to 24) and after 100 cases was 44% (IQR 40 to 47). Much apparent variation in GPs' use of suspected cancer referral pathways can be attributed to random case-mix. The methods currently used to assess the quality of GP-suspected cancer referral performance, and to compare individual practices, are misleading. These should no longer be used, and more appropriate and robust methods should be developed.
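The random case-mix argument can be reproduced in a few lines: give every practice an identical true detection probability, draw each practice's detected count binomially from a small number of cancer cases, and look at the spread of apparent detection rates. All numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
true_rate = 0.5            # identical underlying performance everywhere
cases = 25                 # cancer cases per practice (deliberately small)
n_practices = 1000

detected = rng.binomial(cases, true_rate, n_practices)
rates = detected / cases
print(rates.min(), rates.max())        # wide spread from chance alone
print(np.percentile(rates, [5, 95]))   # e.g. roughly 0.32 to 0.68
```

Practices in the tails of this distribution look like outliers even though every simulated practice performs identically, which is the paper's point about misleading comparisons.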
Pramudya, Ragita C; Seo, Han-Seok
2018-03-01
Temperatures of most hot or cold meal items change over the period of consumption, possibly influencing sensory perception of those items. Unlike temporal variations in sensory attributes, product temperature-induced variations have not received much attention. Using a Check-All-That-Apply (CATA) method, this study aimed to characterize variations in sensory attributes over a wide range of temperatures at which hot or cold foods and beverages may be consumed. Cooked milled rice, typically consumed at temperatures between 70 and 30°C in many rice-eating countries, was used as a target sample in this study. Two brands of long-grain milled rice were cooked and randomly presented at 70, 60, 50, 40, and 30°C. Thirty-five CATA terms for cooked milled rice were generated. Eighty-eight untrained panelists were asked to quickly select all the CATA terms that they considered appropriate to characterize sensory attributes of cooked rice samples presented at each temperature. Proportions of selection by panelists for 13 attributes significantly differed among the five temperature conditions. "Product temperature-dependent sensory-attribute variations" differed between the two brands of milled rice grains. Such variations in sensory attributes, resulting from both product temperature and rice brand, were more pronounced among panelists who more frequently consumed rice. In conclusion, the CATA method can be useful for characterizing "product temperature-dependent sensory attribute variations" in cooked milled-rice samples. Further study is needed to examine whether the CATA method is also effective in capturing "product temperature-dependent sensory-attribute variations" in other hot or cold foods and beverages. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Berger, Noam; Mukherjee, Chiranjib; Okamura, Kazuki
2018-03-01
We prove a quenched large deviation principle (LDP) for a simple random walk on a supercritical percolation cluster (SRWPC) on Z^d (d ≥ 2). The models of interest include classical Bernoulli bond and site percolation as well as models that exhibit long range correlations, like the random cluster model, the random interlacement and the vacant set of random interlacements (for d ≥ 3) and the level sets of the Gaussian free field (d ≥ 3). Inspired by the methods developed by Kosygina et al. (Commun Pure Appl Math 59:1489-1521, 2006) for proving quenched LDP for elliptic diffusions with a random drift, and by Yilmaz (Commun Pure Appl Math 62(8):1033-1075, 2009) and Rosenbluth (Quenched large deviations for multidimensional random walks in a random environment: a variational formula. Ph.D. thesis, NYU, arXiv:0804.1444v1) for similar results regarding elliptic random walks in random environment, we take the point of view of the moving particle and prove a large deviation principle for the quenched distribution of the pair empirical measures of the environment Markov chain in the non-elliptic case of SRWPC. Via a contraction principle, this reduces easily to a quenched LDP for the distribution of the mean velocity of the random walk and both rate functions admit explicit variational formulas. The main difficulty in our set up lies in the inherent non-ellipticity as well as the lack of translation-invariance stemming from conditioning on the fact that the origin belongs to the infinite cluster. We develop a unifying approach for proving quenched large deviations for SRWPC based on exploiting coercivity properties of the relative entropies in the context of convex variational analysis, combined with input from ergodic theory and invoking geometric properties of the supercritical percolation cluster.
NASA Astrophysics Data System (ADS)
Berger, Noam; Mukherjee, Chiranjib; Okamura, Kazuki
2017-12-01
We prove a quenched large deviation principle (LDP) for a simple random walk on a supercritical percolation cluster (SRWPC) on Z^d (d ≥ 2). The models of interest include classical Bernoulli bond and site percolation as well as models that exhibit long range correlations, like the random cluster model, the random interlacement and the vacant set of random interlacements (for d ≥ 3) and the level sets of the Gaussian free field (d ≥ 3). Inspired by the methods developed by Kosygina et al. (Commun Pure Appl Math 59:1489-1521, 2006) for proving quenched LDP for elliptic diffusions with a random drift, and by Yilmaz (Commun Pure Appl Math 62(8):1033-1075, 2009) and Rosenbluth (Quenched large deviations for multidimensional random walks in a random environment: a variational formula. Ph.D. thesis, NYU, arXiv:0804.1444v1) for similar results regarding elliptic random walks in random environment, we take the point of view of the moving particle and prove a large deviation principle for the quenched distribution of the pair empirical measures of the environment Markov chain in the non-elliptic case of SRWPC. Via a contraction principle, this reduces easily to a quenched LDP for the distribution of the mean velocity of the random walk and both rate functions admit explicit variational formulas. The main difficulty in our set up lies in the inherent non-ellipticity as well as the lack of translation-invariance stemming from conditioning on the fact that the origin belongs to the infinite cluster. We develop a unifying approach for proving quenched large deviations for SRWPC based on exploiting coercivity properties of the relative entropies in the context of convex variational analysis, combined with input from ergodic theory and invoking geometric properties of the supercritical percolation cluster.
NASA Astrophysics Data System (ADS)
Das, Suman; Sadique Uz Zaman, J. K. M.; Ghosh, Ranjan
2016-06-01
In Advanced Encryption Standard (AES), the standard S-Box is conventionally generated by using a particular irreducible polynomial {11B} in GF(2^8) as the modulus and a particular additive constant polynomial {63} in GF(2), though it can be generated by many other polynomials. In this paper, it has been shown that it is possible to generate secured AES S-Boxes by using some other selected modulus and additive polynomials, and they can also be generated randomly, using a PRNG like BBS. A comparative study has been made on the randomness of the corresponding AES ciphertexts generated, using these S-Boxes, by the NIST Test Suite coded for this paper. It has been found that besides the standard one, other moduli and additive constants are also able to generate equally or better random ciphertexts; the same is true for random S-Boxes also. As these new types of S-Boxes are user-defined, hence unknown, they are able to prevent linear and differential cryptanalysis. Moreover, they act as additional key-inputs to AES, thus increasing the key-space.
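A hedged sketch of the generation procedure: compute multiplicative inverses in GF(2^8) under a chosen irreducible modulus, then apply the AES affine transform with a chosen additive constant. The alternative modulus 0x11D below is a known irreducible polynomial picked purely for illustration; it is not claimed to be one of those the paper evaluated.

```python
def gf_mul(a, b, modulus):
    """Multiply in GF(2^8), reducing by the given 9-bit modulus polynomial."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= modulus
        b >>= 1
    return r

def gf_inv(a, modulus):
    """Multiplicative inverse by exhaustive search (0 maps to 0, as in AES)."""
    if a == 0:
        return 0
    return next(b for b in range(1, 256) if gf_mul(a, b, modulus) == 1)

def make_sbox(modulus=0x11D, const=0x63):
    """S-Box: inversion in GF(2^8)/modulus followed by the AES affine
    transform with additive constant `const`."""
    sbox = []
    for x in range(256):
        y = gf_inv(x, modulus)
        z = 0
        for i in range(8):
            bit = ((y >> i) ^ (y >> ((i + 4) % 8)) ^ (y >> ((i + 5) % 8))
                   ^ (y >> ((i + 6) % 8)) ^ (y >> ((i + 7) % 8))
                   ^ (const >> i)) & 1
            z |= bit << i
        sbox.append(z)
    return sbox

alt = make_sbox()                  # S-Box from the alternative modulus
std = make_sbox(0x11B, 0x63)       # reproduces the standard AES S-Box
print(hex(std[0x53]))              # -> 0xed, matching the published table
```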
On grey levels in random CAPTCHA generation
NASA Astrophysics Data System (ADS)
Newton, Fraser; Kouritzin, Michael A.
2011-06-01
A CAPTCHA is an automatically generated test designed to distinguish between humans and computer programs; specifically, they are designed to be easy for humans but difficult for computer programs to pass in order to prevent the abuse of resources by automated bots. They are commonly seen guarding webmail registration forms, online auction sites, and preventing brute force attacks on passwords. In the following, we address the question: How does adding a grey level to random CAPTCHA generation affect the utility of the CAPTCHA? We treat the problem of generating the random CAPTCHA as one of random field simulation: An initial state of background noise is evolved over time using Gibbs sampling and an efficient algorithm for generating correlated random variables. This approach has already been found to yield highly-readable yet difficult-to-crack CAPTCHAs. We detail how the requisite parameters for introducing grey levels are estimated and how we generate the random CAPTCHA. The resulting CAPTCHA will be evaluated in terms of human readability as well as its resistance to automated attacks in the forms of character segmentation and optical character recognition.
Pawlowski, Marcin Piotr; Jara, Antonio; Ogorzalek, Maciej
2015-01-01
Entropy in computer security is associated with the unpredictability of a source of randomness. The random source with high entropy tends to achieve a uniform distribution of random values. Random number generators are one of the most important building blocks of cryptosystems. In constrained devices of the Internet of Things ecosystem, high entropy random number generators are hard to achieve due to hardware limitations. For the purpose of the random number generation in constrained devices, this work proposes a solution based on the least-significant bits concatenation entropy harvesting method. As a potential source of entropy, on-board integrated sensors (i.e., temperature, humidity and two different light sensors) have been analyzed. Additionally, the costs (i.e., time and memory consumption) of the presented approach have been measured. The results obtained from the proposed method with statistical fine tuning achieved a Shannon entropy of around 7.9 bits per byte of data for temperature and humidity sensors. The results showed that sensor-based random number generators are a valuable source of entropy with very small RAM and Flash memory requirements for constrained devices of the Internet of Things. PMID:26506357
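The least-significant-bits concatenation idea can be sketched as follows, under stated assumptions: read_sensor is a hypothetical stand-in for an on-board temperature, humidity or light sensor returning a raw integer sample, and the demo below substitutes a software source for it.

```python
import math
import random
from collections import Counter

def harvest_bytes(read_sensor, n_bytes, lsb_count=2):
    """Concatenate the noisiest (least significant) bits of sensor samples."""
    bits = []
    while len(bits) < n_bytes * 8:
        sample = read_sensor()
        for k in range(lsb_count):
            bits.append((sample >> k) & 1)
    data = bytearray()
    for i in range(0, n_bytes * 8, 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        data.append(byte)
    return bytes(data)

def shannon_entropy(data):
    """Shannon entropy in bits per byte (8.0 is the ideal)."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

demo = harvest_bytes(lambda: random.getrandbits(12), 256)
print(shannon_entropy(demo))   # close to 8.0 for a good source
```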
Truly random number generation: an example
NASA Astrophysics Data System (ADS)
Frauchiger, Daniela; Renner, Renato
2013-10-01
Randomness is crucial for a variety of applications, ranging from gambling to computer simulations, and from cryptography to statistics. However, many of the currently used methods for generating randomness do not meet the criteria that are necessary for these applications to work properly and safely. A common problem is that a sequence of numbers may look random but nevertheless not be truly random. In fact, the sequence may pass all standard statistical tests and yet be perfectly predictable. This renders it useless for many applications. For example, in cryptography, the predictability of a "randomly" chosen password is obviously undesirable. Here, we review a recently developed approach to generating true, and hence unpredictable, randomness.
Recent advances in scalable non-Gaussian geostatistics: The generalized sub-Gaussian model
NASA Astrophysics Data System (ADS)
Guadagnini, Alberto; Riva, Monica; Neuman, Shlomo P.
2018-07-01
Geostatistical analysis was introduced over half a century ago to allow quantifying seemingly random spatial variations in earth quantities such as rock mineral content or permeability. The traditional approach has been to view such quantities as multivariate Gaussian random functions characterized by one or a few well-defined spatial correlation scales. There is, however, mounting evidence that many spatially varying quantities exhibit non-Gaussian behavior over a multiplicity of scales. The purpose of this minireview is not to paint a broad picture of the subject and its treatment in the literature. Instead, we focus on very recent advances in the recognition and analysis of this ubiquitous phenomenon, which transcends hydrology and the Earth sciences, brought about largely by our own work. In particular, we use porosity data from a deep borehole to illustrate typical aspects of such scalable non-Gaussian behavior, describe a very recent theoretical model that (for the first time) captures all these behavioral aspects in a comprehensive manner, show how this allows generating random realizations of the quantity conditional on sampled values, point toward ways of incorporating scalable non-Gaussian behavior in hydrologic analysis, highlight the significance of doing so, and list open questions requiring further research.
Effective band structure of random III-V alloys
NASA Astrophysics Data System (ADS)
Popescu, Voicu; Zunger, Alex
2010-03-01
Random substitutional alloys have no long range order (LRO) or translational symmetry, so, rigorously speaking, they have no E(k) band structure or manifestations thereof. Yet many experiments on alloys are interpreted using the language of band theory, e.g. inferring Van Hove singularities, band dispersion and effective masses. Many standard alloy theories (VCA- or CPA-based) have the LRO imposed on the alloy Hamiltonian, assuming only on-site disorder, so they cannot be used to judge the extent of LRO that really exists. We adopt the opposite approach, using large (thousand atom) randomly generated supercells in which chemically identical alloy atoms are allowed to have different local environments (a polymorphous representation). This then drives site-dependent atomic relaxation as well as potential fluctuations. The eigenstates from such supercells are then mapped onto the Brillouin zone (BZ) of the primitive cell, producing effective band dispersion. Results for (In,Ga)X show band-like behaviour only near the centre and faces of the BZ but rapidly lose such characteristics away from Γ or for higher bands. We further analyse the effects of stoichiometry variation, internal relaxation, and short-range order on the alloy band structure.
Clinical Applications of Stochastic Dynamic Models of the Brain, Part I: A Primer.
Roberts, James A; Friston, Karl J; Breakspear, Michael
2017-04-01
Biological phenomena arise through interactions between an organism's intrinsic dynamics and stochastic forces: random fluctuations due to external inputs, thermal energy, or other exogenous influences. Dynamic processes in the brain derive from neurophysiology and anatomical connectivity; stochastic effects arise through sensory fluctuations, brainstem discharges, and random microscopic states such as thermal noise. The dynamic evolution of systems composed of both dynamic and random effects can be studied with stochastic dynamic models (SDMs). This article, Part I of a two-part series, offers a primer on SDMs and their application to large-scale neural systems in health and disease. The companion article, Part II, reviews the application of SDMs to brain disorders. SDMs generate a distribution of dynamic states, which (we argue) represent ideal candidates for modeling how the brain represents states of the world. When augmented with variational methods for model inversion, SDMs represent a powerful means of inferring neuronal dynamics from functional neuroimaging data in health and disease. Together with deeper theoretical considerations, this work suggests that SDMs will play a unique and influential role in computational psychiatry, unifying empirical observations with models of perception and behavior. Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
DNA-based random number generation in security circuitry.
Gearheart, Christy M; Arazi, Benjamin; Rouchka, Eric C
2010-06-01
DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. This research focuses on further developing DNA-based methodologies to mimic digital data manipulation. While exhibiting fundamental principles, this work was done in conjunction with the vision that DNA-based circuitry, when the technology matures, will form the basis for a tamper-proof security module, revolutionizing the meaning and concept of tamper-proofing and possibly preventing it altogether based on accurate scientific observations. A paramount part of such a solution would be self-generation of random numbers. A novel prototype schema employs solid phase synthesis of oligonucleotides for random construction of DNA sequences; temporary storage and retrieval is achieved through plasmid vectors. A discussion of how to evaluate sequence randomness is included, as well as how these techniques are applied to a simulation of the random number generation circuitry. Simulation results show generated sequences successfully pass three selected NIST random number generation tests specified for security applications.
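For a flavor of such an evaluation, the sketch below encodes bases as two bits (an assumed mapping, not necessarily the authors') and applies the NIST SP 800-22 frequency (monobit) test, one of the simplest tests in the suite referred to above.

```python
import math

# Assumed 2-bit encoding of bases; the paper's own scheme is not reproduced here.
ENC = {"A": "00", "C": "01", "G": "10", "T": "11"}

def monobit_p_value(bits):
    """NIST SP 800-22 frequency (monobit) test; pass if p >= 0.01."""
    n = len(bits)
    s = sum(1 if b == "1" else -1 for b in bits)   # +1 for ones, -1 for zeros
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2.0))

seq = "ACGTTGCAGCTAGCGATCGATCGGATCCGTACGTAGCTAGC"
bits = "".join(ENC[base] for base in seq)
print(monobit_p_value(bits))
```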
NASA Astrophysics Data System (ADS)
Matsumoto, Kouhei; Kasuya, Yuki; Yumoto, Mitsuki; Arai, Hideaki; Sato, Takashi; Sakamoto, Shuichi; Ohkawa, Masashi; Ohdaira, Yasuo
2018-02-01
Not so long ago, pseudo-random numbers generated by numerical formulae were considered adequate for encrypting important data files, because of the time needed to decode them. With today's ultra-high-speed processors, however, this is no longer true. So, in order to thwart ever-more advanced attempts to breach a system's protections, cryptologists have devised methods, considered virtually impossible to decode, that use a limitless number of physical random numbers. This research describes a method whereby a laser diode's frequency noise generates large quantities of physical random numbers. Using two types of photodetectors (APD and PIN-PD), we tested the abilities of two types of lasers (FP-LD and VCSEL) to generate random numbers. In all instances, an etalon served as the frequency discriminator, the pass rates were determined using the NIST FIPS 140-2 test at each bit, and the random number generation (RNG) speed was noted.
Generation of pseudo-random numbers
NASA Technical Reports Server (NTRS)
Howell, L. W.; Rheinfurth, M. H.
1982-01-01
Practical methods for generating acceptable random numbers from a variety of probability distributions which are frequently encountered in engineering applications are described. The speed, accuracy, and guarantee of statistical randomness of the various methods are discussed.
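Two of the classic recipes surveyed in such reports can be sketched in a few lines: inverse-transform sampling for the exponential distribution and the Box-Muller transform for the normal distribution. The guard 1.0 - random.random() keeps the uniform draw strictly positive so the logarithm is defined.

```python
import math
import random

def exponential(lam):
    """Inverse-transform sampling: F^{-1}(u) = -ln(1 - u) / lambda."""
    u = 1.0 - random.random()          # in (0, 1], so the log is defined
    return -math.log(u) / lam

def normal_pair(mu=0.0, sigma=1.0):
    """Box-Muller transform: two independent normals from two uniforms."""
    u1 = 1.0 - random.random()
    u2 = random.random()
    r = math.sqrt(-2.0 * math.log(u1))
    return (mu + sigma * r * math.cos(2 * math.pi * u2),
            mu + sigma * r * math.sin(2 * math.pi * u2))

print(exponential(0.5), normal_pair())
```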
Ultra-fast quantum randomness generation by accelerated phase diffusion in a pulsed laser diode.
Abellán, C; Amaya, W; Jofre, M; Curty, M; Acín, A; Capmany, J; Pruneri, V; Mitchell, M W
2014-01-27
We demonstrate a high bit-rate quantum random number generator by interferometric detection of phase diffusion in a gain-switched DFB laser diode. Gain switching at few-GHz frequencies produces a train of bright pulses with nearly equal amplitudes and random phases. An unbalanced Mach-Zehnder interferometer is used to interfere subsequent pulses and thereby generate strong random-amplitude pulses, which are detected and digitized to produce a high-rate random bit string. Using established models of semiconductor laser field dynamics, we predict a regime of high visibility interference and nearly complete vacuum-fluctuation-induced phase diffusion between pulses. These are confirmed by measurement of pulse power statistics at the output of the interferometer. Using a 5.825 GHz excitation rate and 14-bit digitization, we observe 43 Gbps quantum randomness generation.
Self-balanced real-time photonic scheme for ultrafast random number generation
NASA Astrophysics Data System (ADS)
Li, Pu; Guo, Ya; Guo, Yanqiang; Fan, Yuanlong; Guo, Xiaomin; Liu, Xianglian; Shore, K. Alan; Dubrova, Elena; Xu, Bingjie; Wang, Yuncai; Wang, Anbang
2018-06-01
We propose a real-time self-balanced photonic method for extracting ultrafast random numbers from broadband randomness sources. In place of electronic analog-to-digital converters (ADCs), balanced photo-detection technology is used to directly quantize optically sampled chaotic pulses into a continuous random number stream. Benefitting from ultrafast photo-detection, our method can efficiently eliminate the generation-rate bottleneck imposed by the electronic ADCs required in nearly all available fast physical random number generators. A proof-of-principle experiment demonstrates that, using our approach, 10 Gb/s real-time and statistically unbiased random numbers are successfully extracted from a bandwidth-enhanced chaotic source. The generation rate achieved experimentally here is limited by the bandwidth of the chaotic source. The method described has the potential to attain a real-time rate of 100 Gb/s.
Statistical considerations for grain-size analyses of tills
Jacobs, A.M.
1971-01-01
Relative percentages of sand, silt, and clay from samples of the same till unit are not identical because of different lithologies in the source areas, sorting in transport, random variation, and experimental error. Random variation and experimental error can be isolated from the other two as follows. For each particle-size class of each till unit, a standard population is determined by using a normally distributed, representative group of data. New measurements are compared with the standard population and, if they compare satisfactorily, the experimental error is not significant and random variation is within the expected range for the population. The outcome of the comparison depends on numerical criteria derived from a graphical method rather than on a more commonly used one-way analysis of variance with two treatments. If the number of samples and the standard deviation of the standard population are substituted in a t-test equation, a family of hyperbolas is generated, each of which corresponds to a specific number of subsamples taken from each new sample. The axes of the graphs of the hyperbolas are the standard deviation of new measurements (horizontal axis) and the difference between the means of the new measurements and the standard population (vertical axis). The area between the two branches of each hyperbola corresponds to a satisfactory comparison between the new measurements and the standard population. Measurements from a new sample can be tested by plotting their standard deviation vs. difference in means on axes containing a hyperbola corresponding to the specific number of subsamples used. If the point lies between the branches of the hyperbola, the measurements are considered reliable. But if the point lies outside this region, the measurements are repeated. Because the critical segment of the hyperbola is approximately a straight line parallel to the horizontal axis, the test is simplified to a comparison between the means of the standard population and the means of the subsample. The minimum number of subsamples required to prove significant variation between samples caused by different lithologies in the source areas and sorting in transport can be determined directly from the graphical method. The minimum number of subsamples required is the maximum number to be run for economy of effort. © 1971 Plenum Publishing Corporation.
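Under assumed numbers (a standard population of m = 50 measurements with standard deviation 4.0 and mean 30.0, n = 5 subsamples, alpha = 0.05), the acceptance region between the hyperbola's branches can be sketched as below. This is only an illustration of the idea, the boundary Delta = t_crit * sqrt(sp^2/m + s^2/n), which is a hyperbola in the (s, Delta) plane; it is not the paper's exact numerical criteria.

```python
import numpy as np
from scipy import stats

m, sp, mu = 50, 4.0, 30.0    # assumed standard-population size, sd and mean
n = 5                        # number of subsamples per new sample
alpha = 0.05
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)

def passes(new_measurements):
    """True if (s, |mean difference|) falls between the hyperbola's branches."""
    new = np.asarray(new_measurements)
    delta = abs(new.mean() - mu)
    bound = t_crit * np.sqrt(sp**2 / m + new.var(ddof=1) / n)
    return delta <= bound

print(passes([29.1, 30.7, 31.2, 28.8, 30.3]))
```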
Time Course of Visual Extrapolation Accuracy
1995-09-01
The pond and duckweed problem: Three experiments on the misperception of exponential growth. Acta Psychologica 43, 239-251. Wiener, E.L., 1962. ...no systematic velocity error in tracking, only random variation in tracker velocity. Both models predicted changes in hit and false alarm rates well, except in a condition where response asymmetries...
Detrended fluctuation analysis of short datasets: An application to fetal cardiac data
NASA Astrophysics Data System (ADS)
Govindan, R. B.; Wilson, J. D.; Preißl, H.; Eswaran, H.; Campbell, J. Q.; Lowery, C. L.
2007-02-01
Using detrended fluctuation analysis (DFA) we perform scaling analysis of short datasets of length 500-1500 data points. We quantify the long range correlation (exponent α) by computing the mean value of the local exponents α_L (in the asymptotic regime). The local exponents are obtained as the (numerical) derivative of the logarithm of the fluctuation function F(s) with respect to the logarithm of the scale factor s: α_L = d log_10 F(s) / d log_10 s. These local exponents display huge variations and complicate the correct quantification of the underlying correlations. We propose the use of the phase randomized surrogate (PRS), which preserves the long range correlations of the original data, to minimize the variations in the local exponents. Using numerically generated uncorrelated and long range correlated data, we show that performing DFA on several realizations of PRS and estimating α_L from the averaged fluctuation functions (of all realizations) can minimize the variations in α_L. The application of this approach to fetal cardiac data (RR intervals) is discussed, and we show that there is a statistically significant correlation between α and the gestation age.
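A compact numpy sketch of the two ingredients follows, under toy parameters: a phase-randomized surrogate that preserves the amplitude spectrum (hence the correlation structure) while randomizing phases, and a first-order DFA whose local exponents α_L are computed as the numerical derivative above, averaged over surrogate realizations.

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Keep |FFT| (correlations), randomize phases; mean and Nyquist stay real."""
    spec = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, size=spec.size)
    phases[0] = 0.0
    if x.size % 2 == 0:
        phases[-1] = 0.0
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=x.size)

def dfa_fluctuation(x, scales, order=1):
    """First-order DFA fluctuation function F(s)."""
    y = np.cumsum(x - x.mean())              # integrated profile
    F = []
    for s in scales:
        n = y.size // s
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        res = [np.mean((seg - np.polyval(np.polyfit(t, seg, order), t)) ** 2)
               for seg in segs]
        F.append(np.sqrt(np.mean(res)))
    return np.asarray(F)

rng = np.random.default_rng(1)
x = rng.standard_normal(1000)                # short uncorrelated test series
scales = np.unique(np.logspace(1, 2.3, 12).astype(int))
F = np.mean([dfa_fluctuation(phase_randomized_surrogate(x, rng), scales)
             for _ in range(20)], axis=0)    # average over PRS realizations
alpha_L = np.gradient(np.log10(F), np.log10(scales))
print(alpha_L)                               # near 0.5 for uncorrelated data
```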
Kallifatidis, Beatrice; Borovička, Jan; Stránská, Jana; Drábek, Jiří; Mills, Deetta K
2014-03-01
The capability of Fluorescent Random Amplified Microsatellites (F-RAMS) to profile hallucinogenic mushrooms to species and sub-species level was assessed. Fifteen samples of Amanita rubescens and 22 samples of other hallucinogenic and non-hallucinogenic mushrooms of the genera Amanita and Psilocybe were profiled using two fluorescently-labeled, 5'degenerate primers, 5'-6FAM-SpC3-DD (CCA)5 and 5'-6FAM-SpC3-DHB (CGA)5, which target different microsatellite repeat regions. Among the two primers, 5'-6FAM-SpC3-DHB (CGA)5 provided more reliable data for identification purposes, by grouping samples of the same species and clustering closely related species together in a dendrogram based on amplicon similarities. A high degree of intra-specific variation between the 15 A. rubescens samples was shown with both primers and the amplicons generated for all A. rubescens samples were organized into three classes of amplicons (discriminant, private, and marker) based on their individualizing potential. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Approximate Genealogies Under Genetic Hitchhiking
Pfaffelhuber, P.; Haubold, B.; Wakolbinger, A.
2006-01-01
The rapid fixation of an advantageous allele leads to a reduction in linked neutral variation around the target of selection. The genealogy at a neutral locus in such a selective sweep can be simulated by first generating a random path of the advantageous allele's frequency and then a structured coalescent in this background. Usually the frequency path is approximated by a logistic growth curve. We discuss an alternative method that approximates the genealogy by a random binary splitting tree, a so-called Yule tree that does not require first constructing a frequency path. Compared to the coalescent in a logistic background, this method gives a slightly better approximation for identity by descent during the selective phase and a much better approximation for the number of lineages that stem from the founder of the selective sweep. In applications such as the approximation of the distribution of Tajima's D, the two approximation methods perform equally well. For relevant parameter ranges, the Yule approximation is faster. PMID:17182733
Friberg, Urban; Lew, Timothy A; Byrne, Phillip G; Rice, William R
2005-07-01
In promiscuous species, sexual selection generates two opposing male traits: offense (acquiring new mates and supplanting stored sperm) and defense (enforcing fidelity on one's mates and preventing sperm displacement when this fails). Coevolution between these traits requires both additive genetic variation and associated natural selection. Previous work with Drosophila melanogaster found autosomal genetic variation for these traits among inbred lines from a mixture of populations, but only nonheritable genetic variation was found within a single outbred population. These results do not support ongoing antagonistic coevolution between offense and defense, nor between either of these male traits and female reproductive characters. Here we use a new method (hemiclonal analysis) to study genomewide genetic variation in a large outbred laboratory population of D. melanogaster. Hemiclonal analysis estimates the additive genetic variation among random, genomewide haplotypes taken from a large, outbred, locally adapted laboratory population and determines the direction of the selection gradient on this variation. In contrast to earlier studies, we found low but biologically significant heritable variation for defensive and offensive offspring production as well as all their components (P1, fidelity, P2, and remating). Genetic correlations between these traits were substantially different from those reported for inbred lines. A positive genetic correlation was found between defense and offense, demonstrating that some shared genes influence both traits. In addition to this common variation, evidence for unique genetic variation for each trait was also found, supporting an ongoing coevolutionary arms race between defense and offense. Reproductive conflict between males can strongly influence female fitness. Correspondingly, we found genetic variation in both defense and offense that affected female fitness. No evidence was found for intersexual conflict in the context of male defense, but we found substantial intersexual conflict in the context of male offensive sperm competitive ability. These results indicate that conflict between competing males also promotes an associated arms race between the sexes.
Levine, M W
1991-01-01
Simulated neural impulse trains were generated by a digital realization of the integrate-and-fire model. The variability in these impulse trains had as its origin a random noise of specified distribution. Three different distributions were used: the normal (Gaussian) distribution (no skew, normokurtic), a first-order gamma distribution (positive skew, leptokurtic), and a uniform distribution (no skew, platykurtic). Despite these differences in the distribution of the variability, the distributions of the intervals between impulses were nearly indistinguishable. These inter-impulse distributions were better fit with a hyperbolic gamma distribution than a hyperbolic normal distribution, although one might expect a better approximation for normally distributed inverse intervals. Consideration of why the inter-impulse distribution is independent of the distribution of the causative noise suggests two putative interval distributions that do not depend on the assumed noise distribution: the log normal distribution, which is predicated on the assumption that long intervals occur with the joint probability of small input values, and the random walk equation, which is the diffusion equation applied to a random walk model of the impulse generating process. Either of these equations provides a more satisfactory fit to the simulated impulse trains than the hyperbolic normal or hyperbolic gamma distributions. These equations also provide better fits to impulse trains derived from the maintained discharges of ganglion cells in the retinae of cats or goldfish. It is noted that both equations are free from the constraint that the coefficient of variation (CV) have a maximum of unity.(ABSTRACT TRUNCATED AT 250 WORDS)
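A minimal sketch of such a simulation follows (illustrative parameters, not the paper's): a noisy input is accumulated until a threshold is crossed, an impulse is emitted, the integrator resets, and the inter-impulse interval is recorded. Each noise source is scaled to zero mean and unit variance so that only its shape differs.

```python
import numpy as np

def simulate_intervals(noise, n_impulses, threshold=20.0, drift=1.0):
    """Digital integrate-and-fire: accumulate drift plus noise until threshold."""
    intervals, v, t = [], 0.0, 0
    while len(intervals) < n_impulses:
        v += drift + noise()
        t += 1
        if v >= threshold:
            intervals.append(t)
            v, t = 0.0, 0
    return np.asarray(intervals)

rng = np.random.default_rng(0)
sources = {
    "normal":  lambda: rng.normal(0.0, 1.0),                  # no skew
    "gamma":   lambda: rng.gamma(1.0, 1.0) - 1.0,             # positive skew, unit variance
    "uniform": lambda: rng.uniform(-np.sqrt(3), np.sqrt(3)),  # no skew, unit variance
}
for name, noise in sources.items():
    iv = simulate_intervals(noise, 2000)
    print(name, iv.mean(), iv.std() / iv.mean())   # mean interval and CV
```

The printed interval statistics come out nearly the same across the three noise shapes, which is the insensitivity the abstract describes.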
NASA Astrophysics Data System (ADS)
Sun, Fubao; Roderick, Michael L.; Lim, Wee Ho; Farquhar, Graham D.
2011-12-01
We assess hydroclimatic projections for the Murray-Darling Basin (MDB) using an ensemble of 39 Intergovernmental Panel on Climate Change AR4 climate model runs based on the A1B emissions scenario. The raw model output for precipitation, P, was adjusted using a quantile-based bias correction approach. We found that the projected change, ΔP, between two 30 year periods (2070-2099 less 1970-1999) was little affected by bias correction. The range for ΔP among models was large (~±150 mm yr^-1) with all-model run and all-model ensemble averages (4.9 and -8.1 mm yr^-1) near zero, against a background climatological P of ~500 mm yr^-1. We found that the time series of actually observed annual P over the MDB was indistinguishable from that generated by a purely random process. Importantly, nearly all the model runs showed similar behavior. We used these facts to develop a new approach to understanding variability in projections of ΔP. By plotting ΔP versus the variance of the time series, we could easily identify model runs with projections for ΔP that were beyond the bounds expected from purely random variations. For the MDB, we anticipate that a purely random process could lead to differences of ±57 mm yr^-1 (95% confidence) between successive 30 year periods. This is equivalent to ±11% of the climatological P and translates into variations in runoff of around ±29%. This sets a baseline for gauging modeled and/or observed changes.
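The quoted ±57 mm yr^-1 baseline can be reproduced to first order with a toy Monte Carlo, assuming i.i.d. annual totals; the interannual standard deviation used below (~113 mm yr^-1) is back-calculated from the 57 mm yr^-1 figure for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, n_years, n_trials = 113.0, 30, 50_000
p1 = rng.normal(500.0, sigma, size=(n_trials, n_years)).mean(axis=1)
p2 = rng.normal(500.0, sigma, size=(n_trials, n_years)).mean(axis=1)
delta = p2 - p1                     # difference between successive 30-year means

# analytic check: sd(delta) = sigma * sqrt(2 / n_years), so the 95% bound
# is 1.96 * sigma * sqrt(2/30) ≈ 57 mm/yr
print(np.percentile(np.abs(delta), 95))
```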
Comparison and correlation of Simple Sequence Repeats distribution in genomes of Brucella species
Kiran, Jangampalli Adi Pradeep; Chakravarthi, Veeraraghavulu Praveen; Kumar, Yellapu Nanda; Rekha, Somesula Swapna; Kruti, Srinivasan Shanthi; Bhaskar, Matcha
2011-01-01
Computational genomics is one of the important tools to understand the distribution of closely related genomes, including simple sequence repeats (SSRs), in an organism, which gives valuable information regarding genetic variation. The central objective of the present study was to screen the SSRs distributed in coding and non-coding regions among different human-pathogenic Brucella species, which are involved in a range of pathological disorders. Computational analysis of the SSRs in Brucella indicates few deviations from expected random models. Statistical analysis also reveals that tri-nucleotide SSRs are overrepresented and tetra-nucleotide SSRs underrepresented in Brucella genomes. From the data, it can be suggested that the overrepresented tri-nucleotide SSRs in genomic and coding regions might be responsible for the generation of functional variation in the expressed proteins, which in turn may lead to the different pathogenicity, virulence determinants, stress response genes, transcription regulators and host adaptation proteins of Brucella genomes. Abbreviations: SSRs - Simple Sequence Repeats, ORFs - Open Reading Frames. PMID:21738309
Determinants of urban sprawl in European cities.
Oueslati, Walid; Alvanides, Seraphim; Garrod, Guy
2015-07-01
This paper provides empirical evidence that helps to answer several key questions relating to the extent of urban sprawl in Europe. Building on the monocentric city model, this study uses existing data sources to derive a set of panel data for 282 European cities at three time points (1990, 2000 and 2006). Two indices of urban sprawl are calculated that, respectively, reflect changes in artificial area and the levels of urban fragmentation for each city. These are supplemented by a set of data on various economic and geographical variables that might explain the variation of the two indices. Using a Hausman-Taylor estimator and random regressors to control for the possible correlation between explanatory variables and unobservable city-level effects, we find that the fundamental conclusions of the standard monocentric model are valid in the European context for both indices. Although the variables generated by the monocentric model explain a large part of the variation of artificial area, their explanatory power for modelling the fragmentation index is relatively low.
A non-linear dimension reduction methodology for generating data-driven stochastic input models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ganapathysubramanian, Baskar; Zabaras, Nicholas
Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space R^n. An isometric mapping F is then constructed from M to a low-dimensional, compact, connected set A contained in R^d (d < n).
Jia, Rui; Tai, Fadao; An, Shucheng; Zhang, Xia; Broders, Hugh
2009-11-01
This study examined whether neonatal paternal deprivation (PD: the father was removed and pups were raised by the mother alone) or early deprivation (ED: pups were raised by both parents but were separated from both the dam and their peers for three hours a day from PND 0 to 13) has long-term effects on the anxiety and social behaviors of adult mandarin voles. Newborn mandarin voles of the F2 generation were randomly assigned to one of three groups: bi-parental care (PC: pups were raised by both parents), PD and ED. The parental care behaviors of the F1 generation were observed at the ages of 0, 13 and 21 days (PND 0, 13, 21) of the F2 generation in the PC and PD groups. Moreover, each mandarin vole of the F2 generation received an open field test and a social interaction test on PND 70 and PND 75, respectively. No significant differences in parental behavior were observed between mothers and fathers from PC families, showing the typical parental behavior of socially monogamous rodents. In addition, no significant differences in maternal behaviors were found between mothers from PC and PD families, indicating no maternal compensation towards pups for the absence of paternal care. In the open field test, mandarin voles from both PD and ED families displayed higher levels of anxiety and lower locomotor activity relative to offspring from PC families. In the social interaction test, both PD and ED mandarin voles also showed lower levels of social behavior and higher levels of anxiety. Thus, both PD and ED significantly increase anxiety and reduce social behavior in adult mandarin voles, suggesting that variation in parental investment may lead to variation in anxiety and social behaviors in rodents with different mating systems.
Application of random effects to the study of resource selection by animals.
Gillies, Cameron S; Hebblewhite, Mark; Nielsen, Scott E; Krawchuk, Meg A; Aldridge, Cameron L; Frair, Jacqueline L; Saher, D Joanne; Stevens, Cameron E; Jerde, Christopher L
2006-07-01
1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence. 2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and resource selection is constant given individual variation in resource availability. 3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, have not been well addressed. 4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects. 5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection. 6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions limiting their generality. This approach will allow researchers to appropriately estimate marginal (population) and conditional (individual) responses, and account for complex grouping, unbalanced sample designs and autocorrelation.
NASA Astrophysics Data System (ADS)
Liu, L.; Zhao, Z.; Wang, Y.; Huang, Q.
2013-12-01
The lithosphere-atmosphere-ionosphere (LAI) system forms an electromagnetic (EM) cavity that hosts the EM field excited by electric currents generated by lightning and other natural sources. There have also been numerous reports on variations of the EM field in the LAI system prior to some significant earthquakes. We simulated the EM field in the lithosphere-ionosphere waveguide with a whole-earth model in curvature coordinates, using a hybrid pseudo-spectral and finite-difference time-domain method. Treating seismogenesis as a fully coupled seismoelectric process, we simulate the seismic wave and the EM wave in this 2D model. In the model we have observed the excitation of the Schumann Resonance (SR) as the background EM field generated by randomly placed electric-current impulses within the lowest 10 kilometers of the atmosphere. The diurnal variation and the latitude dependence of the ion concentration in the ionosphere are included in the model. After the SR reaches a steady state, an electric impulse is introduced in the shallow lithosphere to mimic the seismogenic process (pre-, co- and post-seismic) and assess the possible precursory effects on SR strength and frequency. The modeling results can explain the observation that the SR responds much more sensitively to continental earthquakes than to oceanic events. The fundamental reason is the shielding effect of the conductive ocean, which prevents effective radiation of seismoelectric signals from oceanic earthquake events into the LAI waveguide.
Miller, Thomas Martin; de Wet, Wouter C.; Patton, Bruce W.
2015-10-28
In this study, a computational assessment of the variation in terrestrial neutron and photon background from extraterrestrial sources is presented. The motivation of this assessment is to evaluate the practicality of developing a tool or database to estimate background in real time (or near-real time) during an experimental measurement, or even to predict the background for future measurements. The extraterrestrial source focused on during this assessment is naturally occurring galactic cosmic rays (GCRs). The MCNP6 transport code was used to perform the computational assessment. However, the GCR source available in MCNP6 was not used. Rather, models developed and maintained by NASA were used to generate the GCR sources. The largest variation in both neutron and photon background spectra was found to be caused by changes in elevation on Earth's surface, which can be as large as an order of magnitude. All other perturbations produced background variations on the order of a factor of 3 or less. The most interesting finding was that ~80% and 50% of terrestrial background neutrons and photons, respectively, are generated by interactions, in Earth's surface and in other naturally occurring and man-made objects near a detector, of particles from extraterrestrial sources and their progeny created in Earth's atmosphere. In conclusion, this assessment shows that it will be difficult to estimate the terrestrial background from extraterrestrial sources without a good understanding of a detector's surroundings. Therefore, estimating or predicting background during a measurement environment like a mobile random search will be difficult.
Waxman, D
2012-06-01
A fundamental result of population genetics states that a new mutation, at an unlinked neutral locus in a randomly mating diploid population, has a mean time of fixation of ~4N_e generations, where N_e is the effective population size. This result is based on an assumption of fixed population size, which does not universally hold in natural populations. Here, we analyze such neutral fixations in populations of changing size within the framework of the diffusion approximation. General expressions are derived for the mean and variance of the fixation time in changing populations. Some explicit results are given for two cases: (i) the effective population size undergoes a sudden change, representing a sudden population expansion or a sudden bottleneck; (ii) the effective population changes linearly for a limited period of time and then remains constant. Additionally, a lower bound for the mean time of fixation is obtained for an effective population size that increases with time, and this is applied to exponentially growing populations. The results obtained in this work show, among other things, that for populations that increase in size, the mean time of fixation can be enhanced, sometimes substantially so, over 4N_{e,0} generations, where N_{e,0} is the effective population size at the time the mutation arises. Such an enhancement is associated with (i) an increased probability of neutral polymorphism in a population and (ii) an enhanced persistence of high-frequency neutral variation, which is the variation most likely to be observed.
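As a quick numerical check of the classical constant-size baseline (not of the paper's changing-size analysis), a Wright-Fisher simulation recovers the ~4N_e conditional fixation time; here N plays the role of N_e.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                      # diploid individuals, hence 2N gene copies

def fixation_time():
    """Generations until a new neutral mutation fixes (retrying after losses)."""
    while True:
        count, t = 1, 0
        while 0 < count < 2 * N:
            count = rng.binomial(2 * N, count / (2 * N))   # Wright-Fisher sampling
            t += 1
        if count == 2 * N:
            return t

times = [fixation_time() for _ in range(300)]
print(np.mean(times), "vs 4N =", 4 * N)
```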
Temporal changes in randomness of bird communities across Central Europe.
Renner, Swen C; Gossner, Martin M; Kahl, Tiemo; Kalko, Elisabeth K V; Weisser, Wolfgang W; Fischer, Markus; Allan, Eric
2014-01-01
Many studies have examined whether communities are structured by random or deterministic processes, and both are likely to play a role, but relatively few studies have attempted to quantify the degree of randomness in species composition. We quantified, for the first time, the degree of randomness in forest bird communities based on an analysis of spatial autocorrelation in three regions of Germany. The compositional dissimilarity between pairs of forest patches was regressed against the distance between them. We then calculated the y-intercept of the curve, i.e. the 'nugget', which represents the compositional dissimilarity at zero spatial distance. We therefore assume, following similar work on plant communities, that this represents the degree of randomness in species composition. We then analysed how the degree of randomness in community composition varied over time and with forest management intensity, which we expected to reduce the importance of random processes by increasing the strength of environmental drivers. We found that a high portion of the bird community composition could be explained by chance (overall mean of 0.63), implying that most of the variation in local bird community composition is driven by stochastic processes. Forest management intensity did not consistently affect the mean degree of randomness in community composition, perhaps because the bird communities were relatively insensitive to management intensity. We found a high temporal variation in the degree of randomness, which may indicate temporal variation in assembly processes and in the importance of key environmental drivers. We conclude that the degree of randomness in community composition should be considered in bird community studies, and the high values we find may indicate that bird community composition is relatively hard to predict at the regional scale.
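The nugget extraction can be illustrated with simulated stand-in data (all numbers below are invented, and the Bray-Curtis dissimilarity is assumed for illustration; the study's exact dissimilarity measure may differ): regress pairwise community dissimilarity on pairwise geographic distance and read the degree of randomness off the y-intercept.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sites, n_species = 30, 50
xy = rng.uniform(0, 100, size=(n_sites, 2))        # patch coordinates
abund = rng.poisson(2.0, size=(n_sites, n_species))  # species abundances

def bray_curtis(a, b):
    return np.abs(a - b).sum() / (a + b).sum()

d_geo, d_comm = [], []
for i in range(n_sites):
    for j in range(i + 1, n_sites):
        d_geo.append(np.linalg.norm(xy[i] - xy[j]))
        d_comm.append(bray_curtis(abund[i], abund[j]))

slope, nugget = np.polyfit(d_geo, d_comm, 1)       # linear distance-decay fit
print("dissimilarity at zero distance (nugget):", nugget)
```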
Yen, A M-F; Liou, H-H; Lin, H-L; Chen, T H-H
2006-01-01
The study aimed to develop a predictive model to deal with data fraught with heterogeneity that cannot be explained by sampling variation or measured covariates. The random-effect Poisson regression model was first proposed to deal with over-dispersion for data fraught with heterogeneity after making allowance for measured covariates. A Bayesian acyclic graphical model in conjunction with the Markov Chain Monte Carlo (MCMC) technique was then applied to estimate the parameters of both the relevant covariates and the random effect. A predictive distribution was then generated to compare the predicted with the observed values for the Bayesian model with and without the random effect. Data from repeated measurements of episodes among 44 patients with intractable epilepsy were used as an illustration. Applying Poisson regression to the epilepsy data without taking heterogeneity into account yielded a large value of heterogeneity (heterogeneity factor = 17.90, deviance = 1485, degrees of freedom (df) = 83). After taking the random effect into account, the heterogeneity factor was greatly reduced (heterogeneity factor = 0.52, deviance = 42.5, df = 81). The Pearson χ² values for the comparison between the expected and observed seizure frequencies at two and three months were 34.27 (p = 1.00) for the model with the random effect and 1799.90 (p < 0.0001) for the model without it. The Bayesian acyclic model using the MCMC method was demonstrated to have great potential for disease prediction when data show over-dispersion attributable either to correlated measurements or to subject-to-subject variability.
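To make the over-dispersion diagnostic concrete, here is a hedged numpy/statsmodels sketch (not the authors' Bayesian MCMC model): counts are simulated with a subject-level random effect, a plain Poisson regression is fitted, and the heterogeneity factor deviance/df exposes the unexplained heterogeneity.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_subj, n_obs = 44, 2
u = rng.normal(0.0, 0.8, n_subj)                   # unobserved random effects
y = rng.poisson(np.exp(1.0 + np.repeat(u, n_obs)))
X = np.ones((n_subj * n_obs, 1))                   # intercept-only model

fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
# heterogeneity factor >> 1 signals over-dispersion left unexplained
print("heterogeneity factor:", fit.deviance / fit.df_resid)
```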
Note: Fully integrated 3.2 Gbps quantum random number generator with real-time extraction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xiao-Guang; Nie, You-Qi; Liang, Hao
2016-07-15
We present a real-time and fully integrated quantum random number generator (QRNG) based on measuring laser phase fluctuations. The QRNG scheme based on laser phase fluctuations is noted for its capability of generating ultra-high-speed random numbers. However, the speed bottleneck of a practical QRNG lies in the limited speed of randomness extraction. To close the gap between the fast randomness generation and the slow post-processing, we propose a pipeline extraction algorithm based on Toeplitz matrix hashing and implement it in a high-speed field-programmable gate array. Further, all the QRNG components are integrated into a module, including a compact and actively stabilized interferometer, high-speed data acquisition, and real-time data post-processing and transmission. The final generation rate of the QRNG module with real-time extraction can reach 3.2 Gbps.
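As an illustration of the Toeplitz-hashing extraction step (though not of the authors' FPGA pipeline), the following numpy sketch builds an m × n binary Toeplitz matrix from n + m - 1 seed bits and multiplies raw bits by it modulo 2; all sizes are toy values.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 1024, 768                         # raw input bits -> extracted output bits
seed = rng.integers(0, 2, n + m - 1)     # seed bits defining the Toeplitz matrix

# T[i, j] depends only on i - j, the defining property of a Toeplitz matrix
i = np.arange(m)[:, None]
j = np.arange(n)[None, :]
T = seed[i - j + n - 1]

raw = rng.integers(0, 2, n)              # stand-in for digitized phase noise
extracted = T.dot(raw) % 2               # matrix-vector product over GF(2)
```

In hardware this multiplication is pipelined so extraction keeps pace with acquisition, which is the gap the abstract describes closing.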
Cure fraction model with random effects for regional variation in cancer survival.
Seppä, Karri; Hakulinen, Timo; Kim, Hyon-Jung; Läärä, Esa
2010-11-30
Assessing regional differences in the survival of cancer patients is important but difficult when separate regions are small or sparsely populated. In this paper, we apply a mixture cure fraction model with random effects to cause-specific survival data of female breast cancer patients collected by the population-based Finnish Cancer Registry. Two sets of random effects were used to capture the regional variation in the cure fraction and in the survival of the non-cured patients, respectively. This hierarchical model was implemented in a Bayesian framework using a Metropolis-within-Gibbs algorithm. To avoid poor mixing of the Markov chain, when the variance of either set of random effects was close to zero, posterior simulations were based on a parameter-expanded model with tailor-made proposal distributions in Metropolis steps. The random effects allowed the fitting of the cure fraction model to the sparse regional data and the estimation of the regional variation in 10-year cause-specific breast cancer survival with a parsimonious number of parameters. Before 1986, the capital of Finland clearly stood out from the rest, but since then all the 21 hospital districts have achieved approximately the same level of survival. Copyright © 2010 John Wiley & Sons, Ltd.
Poverty and Economic Decision-Making: Evidence from Changes in Financial Resources at Payday
Carvalho, Leandro S.; Meier, Stephan; Wang, Stephanie W.
2016-01-01
We study the effect of financial resources on decision-making. Low-income U.S. households are randomly assigned to receive an online survey before or after payday. The survey collects measures of cognitive function and administers risk and intertemporal choice tasks. The study design generates variation in cash, checking and savings balances, and expenditures. Before-payday participants behave as if they are more present-biased when making intertemporal choices about monetary rewards but not when making intertemporal choices about non-monetary real-effort tasks. Nor do we find before-after differences in risk-taking, the quality of decision-making, the performance in cognitive function tasks, or in heuristic judgments. PMID:28003681
Exploration Opportunity Search of Near-earth Objects Based on Analytical Gradients
NASA Astrophysics Data System (ADS)
Ren, Yuan; Cui, Ping-Yuan; Luan, En-Jie
2008-07-01
The problem of searching for exploration opportunities to near-earth minor objects is investigated. For rendezvous missions, the analytical gradients of the performance index with respect to the free parameters are derived using variational calculus and the theory of the state-transition matrix. After randomly generating some initial guesses in the search space, the performance index is optimized, guided by the analytical gradients, leading to the local minimum points representing the potential launch opportunities. This method not only keeps the global-search property of the traditional method, but also avoids the blindness of the latter, thereby greatly increasing the computing speed. Furthermore, with this method, the search precision can be controlled effectively.
Search of exploration opportunity for near earth objects based on analytical gradients
NASA Astrophysics Data System (ADS)
Ren, Y.; Cui, P. Y.; Luan, E. J.
2008-01-01
The problem of searching for exploration opportunities for near-Earth objects is investigated. For rendezvous missions, the analytical gradients of the performance index with respect to the free parameters are derived by combining the calculus of variations with the theory of the state-transition matrix. Then, some initial guesses are generated at random in the search space, and the performance index is optimized from these initial guesses under the guidance of the analytical gradients. This method not only keeps the global-search property of the traditional method, but also avoids the blindness of the traditional exploration opportunity search; hence, the computing speed can be increased greatly. Furthermore, by using this method, the search precision can be controlled effectively.
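The same multistart-plus-analytical-gradient pattern, stripped of the astrodynamics, looks like the sketch below: a toy multimodal objective stands in for the trajectory performance index, and scipy's BFGS plays the role of the gradient-guided local optimizer.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    """Toy multimodal performance index (not a trajectory model)."""
    return np.sum(x ** 2 + 2.0 * (1.0 - np.cos(3.0 * x)))

def grad_f(x):
    """Analytical gradient of f."""
    return 2.0 * x + 6.0 * np.sin(3.0 * x)

rng = np.random.default_rng(0)
minima = []
for _ in range(50):                       # random initial guesses in the search space
    x0 = rng.uniform(-4, 4, size=2)
    res = minimize(f, x0, jac=grad_f, method="BFGS")
    if res.success and not any(np.allclose(res.x, m, atol=1e-3) for m in minima):
        minima.append(res.x)

print(len(minima), "distinct local minima found")  # candidate 'opportunities'
```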
Curved tails in polymerization-based bacterial motility
NASA Astrophysics Data System (ADS)
Rutenberg, Andrew D.; Grant, Martin
2001-08-01
The curved actin ``comet-tail'' of the bacterium Listeria monocytogenes is a visually striking signature of actin polymerization-based motility. Similar actin tails are associated with Shigella flexneri, spotted-fever Rickettsiae, the Vaccinia virus, and vesicles and microspheres in related in vitro systems. We show that the torque required to produce the curvature in the tail can arise from randomly placed actin filaments pushing the bacterium or particle. We find that the curvature magnitude determines the number of actively pushing filaments, independent of viscosity and of the molecular details of force generation. The variation of the curvature with time can be used to infer the dynamics of actin filaments at the bacterial surface.
Experimental nonlocality-based randomness generation with nonprojective measurements
NASA Astrophysics Data System (ADS)
Gómez, S.; Mattar, A.; Gómez, E. S.; Cavalcanti, D.; Farías, O. Jiménez; Acín, A.; Lima, G.
2018-04-01
We report on an optical setup generating more than one bit of randomness from one entangled bit (i.e., a maximally entangled state of two qubits). The amount of randomness is certified through the observation of Bell nonlocal correlations. To attain this result we implemented a high-purity entanglement source and a nonprojective three-outcome measurement. Our implementation achieves a gain of 27% of randomness as compared with the standard methods using projective measurements. Additionally, we estimate the amount of randomness certified in a one-sided device-independent scenario, through the observation of Einstein-Podolsky-Rosen steering. Our results prove that nonprojective quantum measurements allow extending the limits for nonlocality-based certified randomness generation using current technology.
Lee, Jeffrey S; Cleaver, Gerald B
2017-10-01
In this note, the Cosmic Microwave Background (CMB) Radiation is shown to be capable of functioning as a Random Bit Generator and to constitute an effectively infinite supply of truly random one-time pad values of arbitrary length. It is further argued that the CMB power spectrum potentially conforms to the FIPS 140-2 standard. Additionally, its applicability to the generation of an (n × n) random key matrix for a Vernam cipher is established.
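The Vernam construction itself is a one-line XOR; in the sketch below, secrets.token_bytes merely stands in for key material that would be harvested from a physical source such as CMB measurements.

```python
import secrets

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))          # used once, then destroyed
cipher = bytes(m ^ k for m, k in zip(message, key))
plain = bytes(c ^ k for c, k in zip(cipher, key))
assert plain == message
```

With a truly random key as long as the message and never reused, the cipher is information-theoretically secure, which is why an effectively infinite pad supply matters.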
A new simple technique for improving the random properties of chaos-based cryptosystems
NASA Astrophysics Data System (ADS)
Garcia-Bosque, M.; Pérez-Resa, A.; Sánchez-Azqueta, C.; Celma, S.
2018-03-01
A new technique for improving the security of chaos-based stream ciphers has been proposed and tested experimentally. This technique improves the randomness properties of the generated keystream by preventing the system from falling into the short-period cycles caused by digitization. In order to test this technique, a stream cipher based on a skew tent map algorithm has been implemented on a Virtex 7 FPGA. The randomness of the keystream generated by this system has been compared to that of the keystream generated by the same system with the proposed randomness-enhancement technique. By subjecting both keystreams to the National Institute of Standards and Technology (NIST) tests, we have shown that our method can considerably improve the randomness of the generated keystreams. Incorporating our randomness-enhancement technique required only 41 extra slices, proving that, apart from being effective, this method is also efficient in terms of area and hardware resources.
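The flavor of such a cipher can be sketched as follows. Note that the perturbation step is a generic stand-in for the paper's randomness-enhancement technique (whose exact form is not reproduced here), included only to show where such a correction acts against short digitized cycles.

```python
def skew_tent(x, p):
    """Skew tent map on [0, 1] with breakpoint p."""
    return x / p if x < p else (1.0 - x) / (1.0 - p)

def keystream_bits(x0=0.141592, p=0.61, n=64, perturb=1e-12):
    x, bits = x0, []
    for i in range(n):
        x = skew_tent(x, p)
        if i % 16 == 15:              # periodically nudge the state (illustrative)
            x = min(max(x + perturb, 0.0), 1.0 - 1e-15)
        bits.append(1 if x >= 0.5 else 0)
    return bits

print(keystream_bits())
```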
Doing better by getting worse: posthypnotic amnesia improves random number generation.
Terhune, Devin Blair; Brugger, Peter
2011-01-01
Although forgetting is often regarded as a deficit that we need to control to optimize cognitive functioning, it can have beneficial effects in a number of contexts. We examined whether disrupting memory for previous numerical responses would attenuate repetition avoidance (the tendency to avoid repeating the same number) during random number generation and thereby improve the randomness of responses. Low suggestible and low dissociative and high dissociative highly suggestible individuals completed a random number generation task in a control condition, following a posthypnotic amnesia suggestion to forget previous numerical responses, and in a second control condition following the cancellation of the suggestion. High dissociative highly suggestible participants displayed a selective increase in repetitions during posthypnotic amnesia, with equivalent repetition frequency to a random system, whereas the other two groups exhibited repetition avoidance across conditions. Our results demonstrate that temporarily disrupting memory for previous numerical responses improves random number generation.
Method and apparatus for in-situ characterization of energy storage and energy conversion devices
Christophersen, Jon P [Idaho Falls, ID; Motloch, Chester G [Idaho Falls, ID; Morrison, John L [Butte, MT; Albrecht, Weston [Layton, UT
2010-03-09
Disclosed are methods and apparatuses for determining an impedance of an energy-output device using a random noise stimulus applied to the energy-output device. A random noise signal is generated and converted to a random noise stimulus as a current source correlated to the random noise signal. A bias-reduced response of the energy-output device to the random noise stimulus is generated by comparing a voltage at the energy-output device terminal to an average voltage signal. The random noise stimulus and bias-reduced response may be periodically sampled to generate a time-varying current stimulus and a time-varying voltage response, which may be correlated to generate an autocorrelated stimulus, an autocorrelated response, and a cross-correlated response. Finally, the autocorrelated stimulus, the autocorrelated response, and the cross-correlated response may be combined to determine at least one of impedance amplitude, impedance phase, and complex impedance.
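The correlation step lends itself to a compact numerical sketch. The following is a standard cross-spectral estimate of impedance from a random-noise current stimulus, not the patented apparatus; the toy series R-C device, sampling rate and segment length are assumptions:

```python
import numpy as np
from scipy.signal import csd, welch

fs = 1000.0                                   # sampling rate, Hz (assumed)
i_stim = np.random.randn(10_000)              # random noise current stimulus
R, C = 0.5, 2.0                               # toy series R-C device
v_resp = R * i_stim + np.cumsum(i_stim) / (C * fs)

f, S_ii = welch(i_stim, fs=fs, nperseg=1024)        # autospectrum of stimulus
_, S_vi = csd(i_stim, v_resp, fs=fs, nperseg=1024)  # cross-spectrum
Z = S_vi / S_ii                               # complex impedance estimate
print(np.abs(Z[1:4]), np.angle(Z[1:4]))       # impedance amplitude and phase
```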
ERIC Educational Resources Information Center
Boonsathorn, Wasita; Charoen, Danuvasin; Dryver, Arthur L.
2014-01-01
E-Learning brings access to a powerful but often overlooked teaching tool: random number generation. Using random number generation, a practically infinite number of quantitative problem-solution sets can be created. In addition, within the e-learning context, in the spirit of the mastery of learning, it is possible to assign online quantitative…
Random numbers certified by Bell's theorem.
Pironio, S; Acín, A; Massar, S; de la Giroday, A Boyer; Matsukevich, D N; Maunz, P; Olmschenk, S; Hayes, D; Luo, L; Manning, T A; Monroe, C
2010-04-15
Randomness is a fundamental feature of nature and a valuable resource for applications ranging from cryptography and gambling to numerical simulation of physical and biological systems. Random numbers, however, are difficult to characterize mathematically, and their generation must rely on an unpredictable physical process. Inaccuracies in the theoretical modelling of such processes or failures of the devices, possibly due to adversarial attacks, limit the reliability of random number generators in ways that are difficult to control and detect. Here, inspired by earlier work on non-locality-based and device-independent quantum information processing, we show that the non-local correlations of entangled quantum particles can be used to certify the presence of genuine randomness. It is thereby possible to design a cryptographically secure random number generator that does not require any assumption about the internal working of the device. Such a strong form of randomness generation is impossible classically and possible in quantum systems only if certified by a Bell inequality violation. We carry out a proof-of-concept demonstration of this proposal in a system of two entangled atoms separated by approximately one metre. The observed Bell inequality violation, featuring near perfect detection efficiency, guarantees that 42 new random numbers are generated with 99 per cent confidence. Our results lay the groundwork for future device-independent quantum information experiments and for addressing fundamental issues raised by the intrinsic randomness of quantum theory.
Evolutionary Perspective on Collective Decision Making
NASA Astrophysics Data System (ADS)
Farrell, Dene; Sayama, Hiroki; Dionne, Shelley D.; Yammarino, Francis J.; Wilson, David Sloan
Team decision making dynamics are investigated from a novel perspective by shifting agency from decision makers to representations of potential solutions. We provide a new way to navigate social dynamics of collective decision making by interpreting decision makers as constituents of an evolutionary environment of an ecology of evolving solutions. We demonstrate distinct patterns of evolution with respect to three forms of variation: (1) Results with random variations in utility functions of individuals indicate that groups demonstrating minimal internal variation produce higher true utility values of group solutions and display better convergence; (2) analysis of variations in behavioral patterns within a group shows that a proper balance between selective and creative evolutionary forces is crucial to producing adaptive solutions; and (3) biased variations of the utility functions diminish the range of variation for potential solution utility, leaving only the differential of convergence performance static. We generally find that group cohesion (low random variation within a group) and composition (appropriate variation of behavioral patterns within a group) are necessary for a successful navigation of the solution space, but performance in both cases is susceptible to group level biases.
Social Noise: Generating Random Numbers from Twitter Streams
NASA Astrophysics Data System (ADS)
Fernández, Norberto; Quintas, Fernando; Sánchez, Luis; Arias, Jesús
2015-12-01
Due to the multiple applications of random numbers in computer systems (cryptography, online gambling, computer simulation, etc.) it is important to have mechanisms to generate these numbers. True Random Number Generators (TRNGs) are commonly used for this purpose. TRNGs rely on non-deterministic sources to generate randomness. Physical processes (like noise in semiconductors, quantum phenomena, etc.) play this role in state-of-the-art TRNGs. In this paper, we depart from previous work and explore the possibility of defining social TRNGs using the stream of public messages of the microblogging service Twitter as a randomness source. Thus, we define two TRNGs based on Twitter stream information and evaluate them using the National Institute of Standards and Technology (NIST) statistical test suite. The results of the evaluation confirm the feasibility of the proposed approach.
Fast generation of sparse random kernel graphs
Hagberg, Aric; Lemons, Nathan; Du, Wen -Bo
2015-09-10
The development of kernel-based inhomogeneous random graphs has provided models that are flexible enough to capture many observed characteristics of real networks, and that are also mathematically tractable. We specify a class of inhomogeneous random graph models, called random kernel graphs, that produces sparse graphs with tunable graph properties, and we develop an efficient generation algorithm to sample random instances from this model. As real-world networks are usually large, it is essential that the run-time of generation algorithms scales better than quadratically in the number of vertices n. We show that for many practical kernels our algorithm runs in time at most O(n(log n)²). As an example, we show how to generate samples of power-law degree distribution graphs with tunable assortativity.
NASA Astrophysics Data System (ADS)
Bisadi, Zahra; Acerbi, Fabio; Fontana, Giorgio; Zorzi, Nicola; Piemonte, Claudio; Pucker, Georg; Pavesi, Lorenzo
2018-02-01
A small-sized photonic quantum random number generator, easy to implement in small electronic devices for secure data encryption and other applications, is in high demand nowadays. Here, we propose a compact configuration with a large-area silicon nanocrystal light-emitting device (LED) coupled to a silicon photomultiplier to generate random numbers. The random number generation methodology is based on photon arrival times and is robust against the non-idealities of the detector and of the source of quantum entropy. The raw data show a high quality of randomness and pass all the statistical tests of the National Institute of Standards and Technology (NIST) suite without any post-processing algorithm. The highest bit rate is 0.5 Mbps with an efficiency of 4 bits per detected photon.
Response Rates in Random-Digit-Dialed Telephone Surveys: Estimation vs. Measurement.
ERIC Educational Resources Information Center
Franz, Jennifer D.
The efficacy of the random digit dialing method in telephone surveys was examined. Random digit dialing (RDD) generates a pure random sample and provides the advantage of including unlisted phone numbers, as well as numbers which are too new to be listed. Its disadvantage is that it generates a major proportion of nonworking and business…
Revisiting sample size: are big trials the answer?
Lurati Buse, Giovanna A L; Botto, Fernando; Devereaux, P J
2012-07-18
The superiority of the evidence generated in randomized controlled trials over observational data is not conditional on randomization alone. Randomized controlled trials require proper design and implementation to provide a reliable effect estimate. Adequate random sequence generation, allocation implementation, analyses based on the intention-to-treat principle, and sufficient power are crucial to the quality of a randomized controlled trial. Power, or the probability that the trial will detect a difference when a real difference between treatments exists, strongly depends on sample size. The quality of orthopaedic randomized controlled trials is frequently threatened by a limited sample size. This paper reviews basic concepts and pitfalls in sample-size estimation and focuses on the importance of large trials in the generation of valid evidence.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stemkens, B; Glitzner, M; Kontaxis, C
Purpose: To assess the dose deposition in simulated single-fraction MR-Linac treatments of renal cell carcinoma, when inter-cycle respiratory motion variation is taken into account using online MRI. Methods: Three motion characterization methods, with increasing complexity, were compared to evaluate the effect of inter-cycle motion variation and drifts on the accumulated dose for an SBRT kidney MR-Linac treatment: 1) STATIC, in which static anatomy was assumed, 2) AVG-RESP, in which 4D-MRI phase-volumes were time-weighted, based on the respiratory phase and 3) PCA, in which 3D volumes were generated using a PCA-model, enabling the detection of inter-cycle variations and drifts. An experimental ITV-based kidney treatment was simulated in a 1.5T magnetic field on three volunteer datasets. For each volunteer a retrospectively sorted 4D-MRI (ten respiratory phases) and fast 2D cine-MR images (temporal resolution = 476 ms) were acquired to simulate MR-imaging during radiation. For each method, the high spatio-temporal resolution 3D volumes were non-rigidly registered to obtain deformation vector fields (DVFs). Using the DVFs, pseudo-CTs (generated from the 4D-MRI) were deformed and the dose was accumulated for the entire treatment. The accuracies of all methods were independently determined using an additional, orthogonal 2D-MRI slice. Results: Motion was most accurately estimated using the PCA method, which correctly estimated drifts and inter-cycle variations (RMSE = 3.2, 2.2, 1.1 mm on average for STATIC, AVG-RESP and PCA, compared to the 2D-MRI slice). Dose-volume parameters on the ITV showed moderate changes (D99 = 35.2, 32.5, 33.8 Gy for STATIC, AVG-RESP and PCA). AVG-RESP showed distinct hot/cold spots outside the ITV margin, which were more distributed for the PCA scenario, since inter-cycle variations were not modeled by the AVG-RESP method. Conclusion: Dose differences were observed when inter-cycle variations were taken into account. The increased inter-cycle randomness in motion as captured by the PCA model mitigates the local (erroneous) hotspots estimated by the AVG-RESP method.
Shteingart, Hanan; Loewenstein, Yonatan
2016-01-01
There is a long history of experiments in which participants are instructed to generate a long sequence of binary random numbers. The scope of this line of research has shifted over the years from identifying the basic psychological principles and/or the heuristics that lead to deviations from randomness, to one of predicting future choices. In this paper, we used generalized linear regression and the framework of Reinforcement Learning in order to address both points. In particular, we used logistic regression analysis in order to characterize the temporal sequence of participants' choices. Surprisingly, a population analysis indicated that the contribution of the most recent trial has only a weak effect on behavior, compared with trials further in the past, a result that seems irreconcilable with standard sequential effects that decay monotonically with the delay. However, when considering each participant separately, we found that the magnitudes of the sequential effects are a monotonically decreasing function of the delay, yet these individual sequential effects are largely averaged out in a population analysis because of heterogeneity. The substantial behavioral heterogeneity in this task is further demonstrated quantitatively by considering the predictive power of the model. We show that a heterogeneous model of sequential dependencies captures the structure available in random sequence generation. Finally, we show that the results of the logistic regression analysis can be interpreted in the framework of reinforcement learning, allowing us to compare the sequential effects in the random sequence generation task to those in an operant learning task. We show that in contrast to the random sequence generation task, sequential effects in operant learning are far more homogeneous across the population. These results suggest that in the random sequence generation task, different participants adopt different cognitive strategies to suppress sequential dependencies when generating the "random" sequences.
Solution-Processed Carbon Nanotube True Random Number Generator.
Gaviria Rojas, William A; McMorrow, Julian J; Geier, Michael L; Tang, Qianying; Kim, Chris H; Marks, Tobin J; Hersam, Mark C
2017-08-09
With the growing adoption of interconnected electronic devices in consumer and industrial applications, there is an increasing demand for robust security protocols when transmitting and receiving sensitive data. Toward this end, hardware true random number generators (TRNGs), commonly used to create encryption keys, offer significant advantages over software pseudorandom number generators. However, the vast network of devices and sensors envisioned for the "Internet of Things" will require small, low-cost, and mechanically flexible TRNGs with low computational complexity. These rigorous constraints position solution-processed semiconducting single-walled carbon nanotubes (SWCNTs) as leading candidates for next-generation security devices. Here, we demonstrate the first TRNG using static random access memory (SRAM) cells based on solution-processed SWCNTs that digitize thermal noise to generate random bits. This bit generation strategy can be readily implemented in hardware with minimal transistor and computational overhead, resulting in an output stream that passes standardized statistical tests for randomness. By using solution-processed semiconducting SWCNTs in a low-power, complementary architecture to achieve TRNG, we demonstrate a promising approach for improving the security of printable and flexible electronics.
Multistate Lempel-Ziv (MLZ) index interpretation as a measure of amplitude and complexity changes.
Sarlabous, Leonardo; Torres, Abel; Fiz, Jose A; Gea, Joaquim; Galdiz, Juan B; Jane, Raimon
2009-01-01
The Lempel-Ziv complexity (LZ) has been widely used to evaluate the randomness of finite sequences. In general, the LZ complexity has been used to determine the grade of complexity present in biomedical signals. The LZ complexity is not able to discern between signals with different amplitude variations and similar random components. On the other hand, amplitude parameters, such as the root mean square (RMS), are not able to discern between signals with similar power distributions and different random components. In this work, we present a novel method to quantify amplitude and complexity variations in biomedical signals by computing the LZ coefficient using more than two quantization states, with thresholds that are fixed and independent of the dynamic range or standard deviation of the analyzed signal: the Multistate Lempel-Ziv (MLZ) index. Our results indicate that the MLZ index with few quantization levels evaluates only the complexity changes of the signal, with a high number of levels only the amplitude variations, and with an intermediate number of levels it informs about both amplitude and complexity variations. The study performed on diaphragmatic mechanomyographic signals shows that the amplitude variations of this signal are more correlated with respiratory effort than the complexity variations. Furthermore, it has been observed that the MLZ index with a high number of levels is practically unaffected by the presence of impulsive, sinusoidal, constant and Gaussian noises, in contrast to the RMS amplitude parameter.
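A minimal sketch of the MLZ idea: quantize the signal into k states with fixed thresholds, then count Lempel-Ziv phrases in the resulting symbol string. The threshold range and the simple phrase-counting variant used here are illustrative assumptions:

```python
import numpy as np

def lz_phrases(seq):
    """Count distinct phrases in a simple Lempel-Ziv parsing of seq."""
    phrases, ind, inc = set(), 0, 1
    while ind + inc <= len(seq):
        phrase = seq[ind:ind + inc]
        if phrase in phrases:
            inc += 1              # extend the phrase until it is new
        else:
            phrases.add(phrase)
            ind += inc            # start parsing the next phrase
            inc = 1
    return len(phrases)

def mlz_index(x, k_states=8, lo=-1.0, hi=1.0):
    # Fixed thresholds, independent of the signal's dynamic range or SD.
    edges = np.linspace(lo, hi, k_states - 1)
    return lz_phrases(tuple(np.digitize(x, edges).tolist()))

x = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * np.random.randn(2000)
print(mlz_index(x, k_states=4), mlz_index(x, k_states=32))
```

With few states most samples fall into the same bins, so only temporal (complexity) structure changes the count; with many states, amplitude excursions move samples across bins and dominate the index.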
Biological monitoring of environmental quality: The use of developmental instability
Freeman, D.C.; Emlen, J.M.; Graham, J.H.; Hough, R. A.; Bannon, T.A.
1994-01-01
Distributed robustness is thought to influence the buffering of random phenotypic variation through the scale-free topology of gene regulatory, metabolic, and protein-protein interaction networks. If this hypothesis is true, then the phenotypic response to the perturbation of particular nodes in such a network should be proportional to the number of links those nodes make with neighboring nodes. This suggests a probability distribution approximating an inverse power-law of random phenotypic variation. Zero phenotypic variation, however, is impossible, because random molecular and cellular processes are essential to normal development. Consequently, a more realistic distribution should have a y-intercept close to zero in the lower tail, a mode greater than zero, and a long (fat) upper tail. The double Pareto-lognormal (DPLN) distribution is an ideal candidate distribution. It consists of a mixture of a lognormal body and upper and lower power-law tails.
NASA Astrophysics Data System (ADS)
Vatankhah, Saeed; Renaut, Rosemary A.; Ardestani, Vahid E.
2018-04-01
We present a fast algorithm for the total variation regularization of the 3-D gravity inverse problem. Through imposition of the total variation regularization, subsurface structures presenting with sharp discontinuities are preserved better than when using a conventional minimum-structure inversion. The associated problem formulation for the regularization is nonlinear but can be solved using an iteratively reweighted least-squares algorithm. For small-scale problems the regularized least-squares problem at each iteration can be solved using the generalized singular value decomposition. This is not feasible for large-scale, or even moderate-scale, problems. Instead we introduce the use of a randomized generalized singular value decomposition in order to reduce the dimensions of the problem and provide an effective and efficient solution technique. For further efficiency an alternating direction algorithm is used to implement the total variation weighting operator within the iteratively reweighted least-squares algorithm. Presented results for synthetic examples demonstrate that the novel randomized decomposition provides good accuracy for reduced computational and memory demands as compared to use of classical approaches.
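The dimensionality-reduction step can be illustrated with a basic randomized SVD (Halko-style range finding). This is a generic sketch of the idea, not the authors' randomized generalized SVD or their full inversion algorithm; the rank, oversampling and power-iteration counts are assumptions:

```python
import numpy as np

def randomized_svd(A, rank, n_oversample=10, n_iter=2, rng=None):
    rng = np.random.default_rng(rng)
    m, n = A.shape
    # A Gaussian sketch captures the dominant column space of A.
    Omega = rng.standard_normal((n, rank + n_oversample))
    Y = A @ Omega
    for _ in range(n_iter):              # power iterations sharpen the basis
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)
    B = Q.T @ A                          # small (rank + p) x n problem
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank]

A = np.random.randn(500, 200) @ np.random.randn(200, 200)
U, s, Vt = randomized_svd(A, rank=20)
print(s[:5])                             # leading singular values
```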
640-Gbit/s fast physical random number generation using a broadband chaotic semiconductor laser
NASA Astrophysics Data System (ADS)
Zhang, Limeng; Pan, Biwei; Chen, Guangcan; Guo, Lu; Lu, Dan; Zhao, Lingjuan; Wang, Wei
2017-04-01
An ultra-fast physical random number generator is demonstrated utilizing a broadband chaotic source based on a photonic integrated device, with a simple post data processing method. The compact chaotic source is implemented using a monolithic integrated dual-mode amplified feedback laser (AFL) with self-injection, where a robust chaotic signal with RF frequency coverage above 50 GHz and flatness of ±3.6 dB is generated. By retaining the 4 least significant bits (LSBs) of the 8-bit digitization of the chaotic waveform, random sequences with a bit rate up to 640 Gbit/s (160 GS/s × 4 bits) are realized. The generated random bits have passed each of the fifteen NIST statistical tests (NIST SP800-22), indicating their randomness for practical applications.
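The post-processing step is easy to sketch: digitize the waveform to 8 bits and keep the 4 LSBs of each sample. The Gaussian stand-in waveform below replaces the 160 GS/s chaotic laser signal and is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
waveform = rng.standard_normal(10_000)        # stand-in for chaotic intensity

# 8-bit digitization spanning the waveform's range
lo, hi = waveform.min(), waveform.max()
scaled = (waveform - lo) / (hi - lo) * 255
samples8 = np.clip(scaled, 0, 255).astype(np.uint8)

lsb4 = samples8 & 0x0F                        # keep 4 LSBs per sample
bits = np.unpackbits(lsb4[:, None], axis=1)[:, 4:]  # 4 bits per sample
bitstream = bits.ravel()
print(bitstream[:32], bitstream.mean())       # mean should be near 0.5
```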
Hints of correlation between broad-line and radio variations for 3C 120
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, H. T.; Bai, J. M.; Li, S. K.
2014-01-01
In this paper, we investigate the correlation between broad-line and radio variations for the broad-line radio galaxy 3C 120. By the z-transformed discrete correlation function method and the model-independent flux randomization/random subset selection (FR/RSS) Monte Carlo method, we find that broad Hβ line variations lead the 15 GHz variations. The FR/RSS method shows that the Hβ line variations lead the radio variations by τ_ob = 0.34 ± 0.01 yr. This time lag can be used to locate the position of the emitting region of radio outbursts in the jet, on the order of ∼5 lt-yr from the central engine. This distance is much larger than the size of the broad-line region. The large separation of the radio outburst emitting region from the broad-line region will observably influence the gamma-ray emission in 3C 120.
NASA Astrophysics Data System (ADS)
Apdilah, D.; Harahap, M. K.; Khairina, N.; Husein, A. M.; Harahap, M.
2018-04-01
The One Time Pad (OTP) algorithm always requires a key paired with the plaintext. If the key is shorter than the plaintext, the key is repeated until its length matches the length of the plaintext. In this research, we use a Linear Congruential Generator (LCG) and a Quadratic Congruential Generator (QCG) to generate random numbers. One Time Pad uses the random numbers as a key for the encryption and decryption process, with key generation starting from the first letter of the plaintext. We compare these two algorithms in terms of encryption speed, and the result is that the combination of OTP with LCG is faster than the combination of OTP with QCG.
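A minimal sketch of the two keystream constructions feeding an XOR one-time pad; the moduli, multipliers and byte-extraction rule are illustrative choices, not the paper's parameters:

```python
# OTP keystreams from congruential generators (illustrative parameters).
def lcg(seed, n, a=1103515245, c=12345, m=2**31):
    x, out = seed, []
    for _ in range(n):
        x = (a * x + c) % m               # linear congruential step
        out.append(x & 0xFF)              # one key byte per step
    return out

def qcg(seed, n, a=3, b=7, c=11, m=2**31):
    x, out = seed, []
    for _ in range(n):
        x = (a * x * x + b * x + c) % m   # quadratic congruential step
        out.append(x & 0xFF)
    return out

def otp(data: bytes, key):
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"ATTACK AT DAWN"
key = lcg(seed=42, n=len(msg))
ct = otp(msg, key)
print(ct.hex(), otp(ct, key))             # decryption recovers the plaintext
```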
NASA Astrophysics Data System (ADS)
Degtyar, V. G.; Kalashnikov, S. T.; Mokin, Yu. A.
2017-10-01
The paper considers problems of analyzing the aerodynamic properties (ADP) of reentry vehicles (RV) treated as blunted rotary bodies with small random surface distortions. The interrelated issues of mathematically simulating surface distortions, selecting tools for predicting the ADPs of shaped bodies, evaluating different types of ADP variations, and adapting them for dynamic problems are analyzed. The possibilities of deterministic and probabilistic approaches to the evaluation of ADP variations are considered, and the practical value of the probabilistic approach is demonstrated. Examples of extremal deterministic evaluations of ADP variations for a sphere and a sharp cone are given.
Early stage hot spot analysis through standard cell base random pattern generation
NASA Astrophysics Data System (ADS)
Jeon, Joong-Won; Song, Jaewan; Kim, Jeong-Lim; Park, Seongyul; Yang, Seung-Hune; Lee, Sooryong; Kang, Hokyu; Madkour, Kareem; ElManhawy, Wael; Lee, SeungJo; Kwan, Joe
2017-04-01
Due to the limited availability of DRC-clean patterns during process and RET recipe development, OPC recipes are not tested with high pattern coverage. Various kinds of patterns can help OPC engineers detect patterns sensitive to lithographic effects. Random pattern generation is needed to secure a robust OPC recipe. However, simple random patterns that do not reflect real product layout styles cannot cover patterning hotspots at the production level, so it is not effective to use them for OPC optimization; it is therefore important to generate random patterns similar to real product patterns. This paper presents a strategy for generating random patterns based on design architecture information and preventing hotspots in the early process development stage through a tool called Layout Schema Generator (LSG). Using LSG, we generate standard cells based on random patterns reflecting real design cell structure - fin pitch, gate pitch and cell height. The output standard cells from LSG are fed into an analysis methodology that assesses their hotspot severity by assigning a score according to their optical image parameters - NILS, MEEF and %PV band - so that potential hotspots can be identified by ranking. This flow is demonstrated on Samsung 7 nm technology, optimizing the OPC recipe early enough in process development to avoid using problematic patterns.
Chen, Bor-Sen; Lin, Ying-Po
2011-01-01
In the evolutionary process, the random transmission and mutation of genes provide biological diversities for natural selection. In order to preserve functional phenotypes between generations, gene networks need to evolve robustly under the influence of random perturbations. Therefore, the robustness of the phenotype, in the evolutionary process, exerts a selection force on gene networks to maintain network functions. However, gene networks need to adjust, by variations in genetic content, to generate phenotypes for new challenges in the network's evolution, i.e., evolvability. Hence, there should be some interplay between evolvability and network robustness in evolutionary gene networks. In this study, the interplay between the evolvability and network robustness of a gene network and a biochemical network is discussed from a nonlinear stochastic system point of view. It was found that if the genetic robustness plus the environmental robustness is less than the network robustness, the phenotype of the biological network is robust in evolution. The tradeoff between genetic robustness and environmental robustness in evolution is discussed from the stochastic stability robustness and sensitivity of the nonlinear stochastic biological network, which may be relevant to the statistical tradeoff between bias and variance, the so-called bias/variance dilemma. Further, the tradeoff could be considered as an antagonistic pleiotropic action of a gene network and discussed from the systems biology perspective. PMID:22084563
Sabry, A H; W Hasan, W Z; Ab Kadir, M Z A; Radzi, M A M; Shafie, S
2018-01-01
The power system always has several variations in its profile due to random load changes or environmental effects such as device switching effects when generating further transients. Thus, an accurate mathematical model is important because most system parameters vary with time. Curve modeling of power generation is a significant tool for evaluating system performance, monitoring and forecasting. Several numerical techniques compete to fit the curves of empirical data such as wind, solar, and demand power rates. This paper proposes a new modified methodology, presented as a parametric technique, to determine the system's modeling equations based on the Bode plot equations and the vector fitting (VF) algorithm by fitting the experimental data points. The modification is derived from the familiar VF algorithm as a robust numerical method. This development increases the application range of the VF algorithm for modeling, not only in the frequency domain but for all power curves. Four case studies are addressed and compared with several common methods. From the minimal RMSE, the results show clear improvements in data fitting over other methods. The most powerful feature of this method is the ability to model irregular or randomly shaped data and to be applied with any algorithm that estimates models from frequency-domain data to provide a state-space or transfer-function model.
Device-independent randomness generation from several Bell estimators
NASA Astrophysics Data System (ADS)
Nieto-Silleras, Olmo; Bamps, Cédric; Silman, Jonathan; Pironio, Stefano
2018-02-01
Device-independent randomness generation and quantum key distribution protocols rely on a fundamental relation between the non-locality of quantum theory and its random character. This relation is usually expressed in terms of a trade-off between the probability of guessing correctly the outcomes of measurements performed on quantum systems and the amount of violation of a given Bell inequality. However, a more accurate assessment of the randomness produced in Bell experiments can be obtained if the value of several Bell expressions is simultaneously taken into account, or if the full set of probabilities characterizing the behavior of the device is considered. We introduce protocols for device-independent randomness generation secure against classical side information, that rely on the estimation of an arbitrary number of Bell expressions or even directly on the experimental frequencies of measurement outcomes. Asymptotically, this results in an optimal generation of randomness from experimental data (as measured by the min-entropy), without having to assume beforehand that the devices violate a specific Bell inequality.
NASA Astrophysics Data System (ADS)
Li, Jiafu; Xiang, Shuiying; Wang, Haoning; Gong, Junkai; Wen, Aijun
2018-03-01
In this paper, a novel image encryption algorithm based on the synchronization of physical random bits generated in a cascade-coupled semiconductor ring laser (CCSRL) system is proposed, and its security analysis is performed. In both the transmitter and receiver parts, the CCSRL system is a master-slave configuration consisting of a master semiconductor ring laser (M-SRL) with cross-feedback and a solitary SRL (S-SRL). The proposed image encryption algorithm includes image preprocessing based on conventional chaotic maps, pixel confusion based on a control matrix extracted from the physical random bits, and pixel diffusion based on a random bit stream extracted from the physical random bits. Firstly, the preprocessing method is used to eliminate the correlation between adjacent pixels. Secondly, physical random bits with verified randomness are generated based on chaos in the CCSRL system, and are used to simultaneously generate the control matrix and the random bit stream. Finally, the control matrix and the random bit stream are used in the encryption algorithm to change the positions and the values of pixels, respectively. Simulation results and security analysis demonstrate that the proposed algorithm is effective and able to resist various typical attacks, and thus is an excellent candidate for secure image communication applications.
Effect of Random Thermal Spikes on Stirling Convertor Heater Head Reliability
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Korovaichuk, Igor; Halford, Gary R.
2004-01-01
Onboard radioisotope power systems being developed to support future NASA exploration missions require reliable design lifetimes of 14 yr and beyond. The structurally critical heater head of the high-efficiency developmental Stirling power convertor has undergone extensive computational analysis of operating temperatures (up to 650 C), stresses, and creep resistance of the thin-walled Inconel 718 bill of material. Additionally, assessments of the effects of uncertainties in the creep behavior of the thin-walled heater head, variation in the manufactured thickness, variation in control temperature, and variation in pressure on durability and reliability were performed. However, it is possible for the heater head to experience rare incidences of random temperature spikes (excursions) of short duration. These incidences could occur randomly, with random magnitude and duration, during the desired mission life, and could affect the creep strain rate and therefore the life. This paper accounts for these uncertainties and includes the effect of such rare, random incidences on the reliability. The sensitivities of the variables affecting the reliability are quantified, and guidelines to improve the reliability are outlined. Furthermore, the quantified reliability is being verified with test data from the accelerated benchmark tests being conducted at the NASA Glenn Research Center.
Probabilistic generation of random networks taking into account information on motifs occurrence.
Bois, Frederic Y; Gayraud, Ghislaine
2015-01-01
Because of the huge number of graphs possible even with a small number of nodes, inference on network structure is known to be a challenging problem. Generating large random directed graphs with prescribed probabilities of occurrences of some meaningful patterns (motifs) is also difficult. We show how to generate such random graphs according to a formal probabilistic representation, using fast Markov chain Monte Carlo methods to sample them. As an illustration, we generate realistic graphs with several hundred nodes mimicking a gene transcription interaction network in Escherichia coli.
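A toy version of the sampling idea: a Metropolis walker over directed graphs whose energy penalizes deviation from a prescribed count of transitive (feed-forward-like) triads. This illustrates the MCMC mechanics only; the motif choice, energy and all parameters are assumptions, not the authors' formal probabilistic representation:

```python
import numpy as np

def motif_count(A):
    # (A @ A)[i, j] = number of paths i -> k -> j; the elementwise product
    # with A counts triads having both that path and the edge i -> j.
    return int(((A @ A) * A).sum())

def sample_graph(n=30, target=40, beta=0.5, steps=20_000, rng=None):
    rng = np.random.default_rng(rng)
    A = (rng.random((n, n)) < 0.05).astype(int)
    np.fill_diagonal(A, 0)
    e = (motif_count(A) - target) ** 2
    for _ in range(steps):
        i, j = rng.integers(n, size=2)
        if i == j:
            continue
        A[i, j] ^= 1                      # propose toggling one edge
        e_new = (motif_count(A) - target) ** 2
        delta = e_new - e
        if delta <= 0 or rng.random() < np.exp(-beta * delta):
            e = e_new                     # accept the move
        else:
            A[i, j] ^= 1                  # reject: undo the toggle
    return A

A = sample_graph()
print(motif_count(A))                     # should be close to the target
```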
Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore
2011-12-01
Randomization is a key step in reducing selection bias during the treatment allocation phase of randomized clinical trials. The process of randomization follows specific steps, which include generation of the randomization list, allocation concealment, and implementation of randomization. In the dental and orthodontic literature, treatment allocation is frequently characterized as random; however, the randomization procedures followed are often not appropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. Randomization entails generation of the random allocation sequence, allocation concealment, and the actual methodology of implementing treatment allocation randomly and unpredictably. The most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons that make randomization an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
CONTEXTUAL INTERFERENCE AND INTROVERSION/EXTRAVERSION IN MOTOR LEARNING.
Meira, Cassio M; Fairbrother, Jeffrey T; Perez, Carlos R
2015-10-01
The Introversion/Extraversion dimension may interact with contextual interference, as random and blocked practice schedules imply distinct levels of variation. This study investigated the effect of different practice schedules on the acquisition of a motor skill in extraverts and introverts. Forty male undergraduate students (M = 24.3 yr., SD = 5.6) were classified as extraverts (n = 20) or introverts (n = 20) by the Eysenck Personality Questionnaire and allocated to one of two practice schedules with different levels of contextual interference: blocked (low contextual interference) or random (high contextual interference), with half of each personality group assigned to each schedule. The design had two phases: acquisition and transfer (5 min. and 24 hr.). The participants learned variations of a sequential timing key-pressing task. Each variation required the same sequence but different timing; three variations were used in acquisition, and one variation of intermediate length was used in transfer. Results for absolute error and overall timing error (root mean square error) indicated that the contextual interference effect was more pronounced for introverts. In addition, introverts who practiced according to the blocked schedule committed more errors during the 24-hr. transfer, suggesting that introverts did not appear to be challenged by a low contextual interference practice schedule.
A hybrid-type quantum random number generator
NASA Astrophysics Data System (ADS)
Hai-Qiang, Ma; Wu, Zhu; Ke-Jin, Wei; Rui-Xue, Li; Hong-Wei, Liu
2016-05-01
This paper proposes a well-performing hybrid-type truly quantum random number generator based on the time interval between two independent single-photon detection signals, which is practical and intuitive, and generates the initial random number sources from a combination of multiple existing random number sources. A time-to-amplitude converter and multichannel analyzer are used for qualitative analysis to demonstrate that each and every step is random. Furthermore, a carefully designed data acquisition system is used to obtain a high-quality random sequence. Our scheme is simple and proves that the random number bit rate can be dramatically increased to satisfy practical requirements. Project supported by the National Natural Science Foundation of China (Grant Nos. 61178010 and 11374042), the Fund of State Key Laboratory of Information Photonics and Optical Communications (Beijing University of Posts and Telecommunications), China, and the Fundamental Research Funds for the Central Universities of China (Grant No. bupt2014TS01).
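The core timing idea can be sketched by comparing two successive inter-detection intervals: for i.i.d. intervals, t1 > t2 and t1 < t2 are equally likely by symmetry, so each comparison yields an unbiased bit. Exponential interarrival times below stand in for a Poissonian single-photon detector; this is a common construction, not the paper's exact multi-source scheme:

```python
import numpy as np

rng = np.random.default_rng(7)
intervals = rng.exponential(scale=1.0, size=100_000)  # inter-detection times
t1, t2 = intervals[0::2], intervals[1::2]             # successive pairs
bits = (t1 > t2).astype(int)    # pairwise comparison is unbiased by symmetry
print(bits[:20], bits.mean())   # mean close to 0.5
```

A design advantage of the pairwise comparison is that it remains unbiased even when the detection rate drifts slowly, since both intervals in a pair see nearly the same rate.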
Lauria, Massimiliano; Piccinini, Sara; Pirona, Raul; Lund, Gertrud; Viotti, Angelo; Motto, Mario
2014-03-01
Pure epigenetic variation, or epigenetic variation that is independent of genetic context, may provide a mechanism for phenotypic variation in the absence of DNA mutations. To estimate the extent of pure epigenetic variation within and across generations and to identify the DNA regions targeted, a group of eight plants derived from a highly inbred line of maize (Zea mays) was analyzed by the methylation-sensitive amplified polymorphism (MSAP) technique. We found that cytosine methylation (mC) differences among individuals accounted for up to 7.4% of CCGG sites investigated by MSAP. Of the differentially methylated fragments (DMFs) identified in the S0 generation, ∼12% were meiotically inherited for at least six generations. We show that meiotically heritable mC variation was consistently generated for an average of 0.5% CCGG sites per generation and that it largely occurred somatically. We provide evidence that mC variation can be established and inherited in a parent-of-origin manner, given that the paternal lineage is more prone to both forward and reverse mC changes. The molecular characterization of selected DMFs revealed that the variation was largely determined by CG methylation changes that map within gene regions. The expression analysis of genes overlapping with DMFs did not reveal an obvious correlation between mC variation and transcription, reinforcing the idea that the primary function of gene-body methylation is not to control gene expression. Because this study focuses on epigenetic variation in field-grown plants, the data presented herein pertain to spontaneous epigenetic changes of the maize genome in a natural context.
Kaitaniemi, Pekka
2008-04-09
Allometric equations are widely used in many branches of biological science. The potential information content of the normalization constant b in allometric equations of the form Y = bX^a has, however, remained largely neglected. To demonstrate the potential for utilizing this information, I generated a large number of artificial datasets that resembled those that are frequently encountered in biological studies, i.e., relatively small samples including measurement error or uncontrolled variation. The value of X was allowed to vary randomly within the limits describing different data ranges, and a was set to a fixed theoretical value. The constant b was set to a range of values describing the effect of a continuous environmental variable. In addition, a normally distributed random error was added to the values of both X and Y. Two different approaches were then used to model the data. The traditional approach estimated both a and b using a regression model, whereas an alternative approach set the exponent a at its theoretical value and only estimated the value of b. Both approaches produced virtually the same model fit with less than 0.3% difference in the coefficient of determination. Only the alternative approach was able to precisely reproduce the effect of the environmental variable, which was largely lost among noise variation when using the traditional approach. The results show how the value of b can be used as a source of valuable biological information if an appropriate regression model is selected.
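A sketch of the comparison under assumed parameter values: simulate Y = bX^a with noise and an environmental effect on b, then fit either both parameters by log-log regression or only b with the exponent fixed:

```python
import numpy as np

rng = np.random.default_rng(0)
a_true = 0.75                                 # fixed theoretical exponent
env = rng.uniform(0, 1, size=30)              # environmental variable
b_true = 1.0 + 0.5 * env                      # b carries the environment signal
X = rng.uniform(1, 10, size=30) * np.exp(0.05 * rng.standard_normal(30))
Y = b_true * X**a_true * np.exp(0.05 * rng.standard_normal(30))

# (1) traditional approach: estimate both a and b by log-log regression
slope, intercept = np.polyfit(np.log(X), np.log(Y), 1)
b_free = np.exp(intercept)

# (2) alternative approach: fix a at its theoretical value, estimate b
b_fixed = Y / X**a_true                       # one b estimate per observation

print(np.corrcoef(env, b_fixed)[0, 1])        # recovers the environment effect
print(slope, b_free)                          # free exponent absorbs the noise
```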
Shahin, Arwa; Smulders, Marinus J. M.; van Tuyl, Jaap M.; Arens, Paul; Bakker, Freek T.
2014-01-01
Next Generation Sequencing (NGS) may enable estimating relationships among genotypes using the allelic variation of multiple nuclear genes simultaneously. We explored the potential and caveats of this strategy in four genetically distant Lilium cultivars, estimating their genetic divergence from transcriptome sequences using three approaches: POFAD (Phylogeny of Organisms from Allelic Data), which uses the allelic information of sequence data; RAxML (Randomized Accelerated Maximum Likelihood), tree building based on concatenated consensus sequences; and Consensus Network, which constructs a network summarizing conflicts among gene trees. Twenty-six gene contigs were chosen based on the presence of orthologous sequences in all cultivars, seven of which also had an orthologous sequence in Tulipa, used as the outgroup. The three approaches generated the same topology. Although the resolution offered by these approaches is high, in this case there was no extra benefit in using allelic information. We conclude that these 26 genes can be widely applied to construct a species tree for the genus Lilium. PMID:25368628
Fine-scale population structure and the era of next-generation sequencing.
Henn, Brenna M; Gravel, Simon; Moreno-Estrada, Andres; Acevedo-Acevedo, Suehelay; Bustamante, Carlos D
2010-10-15
Fine-scale population structure characterizes most continents and is especially pronounced in non-cosmopolitan populations. Roughly half of the world's population remains non-cosmopolitan and even populations within cities often assort along ethnic and linguistic categories. Barriers to random mating can be ecologically extreme, such as the Sahara Desert, or cultural, such as the Indian caste system. In either case, subpopulations accumulate genetic differences if the barrier is maintained over multiple generations. Genome-wide polymorphism data, initially with only a few hundred autosomal microsatellites, have clearly established differences in allele frequency not only among continental regions, but also within continents and within countries. We review recent evidence from the analysis of genome-wide polymorphism data for genetic boundaries delineating human population structure and the main demographic and genomic processes shaping variation, and discuss the implications of population structure for the distribution and discovery of disease-causing genetic variants, in the light of the imminent availability of sequencing data for a multitude of diverse human genomes.
de Blasio, Birgitte Freiesleben; Seierstad, Taral Guldahl; Aalen, Odd O
2011-01-01
Preferential attachment is a proportionate growth process in networks, where nodes receive new links in proportion to their current degree. Preferential attachment is a popular generative mechanism to explain the widespread observation of power-law-distributed networks. An alternative explanation for the phenomenon is a randomly grown network with large individual variation in growth rates among the nodes (frailty). We derive analytically the distribution of individual rates, which will reproduce the connectivity distribution that is obtained from a general preferential attachment process (Yule process), and the structural differences between the two types of graphs are examined by simulations. We present a statistical test to distinguish the two generative mechanisms from each other and we apply the test to both simulated data and two real data sets of scientific citation and sexual partner networks. The findings from the latter analyses argue for frailty effects as an important mechanism underlying the dynamics of complex networks. PMID:21572513
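The two generative mechanisms contrasted above can be sketched side by side; the network sizes, rates and the lognormal frailty distribution are illustrative assumptions:

```python
import numpy as np

def preferential_attachment(n, m=1, rng=None):
    rng = np.random.default_rng(rng)
    targets = [0, 1]                 # one list entry per edge endpoint, so
    deg = np.zeros(n, int)           # uniform sampling from it is
    deg[:2] = 1                      # degree-proportional attachment
    for v in range(2, n):
        for _ in range(m):
            u = targets[rng.integers(len(targets))]
            deg[u] += 1
            deg[v] += 1
            targets += [u, v]
    return deg

def frailty_growth(n, rng=None):
    rng = np.random.default_rng(rng)
    frailty = rng.lognormal(0, 1.5, size=n)   # large individual variation
    p = frailty / frailty.sum()
    # each endpoint chosen by a fixed node-specific rate, not by degree
    return np.bincount(rng.choice(n, size=2 * n, p=p), minlength=n)

print(sorted(preferential_attachment(2000))[-5:])  # both mechanisms yield
print(sorted(frailty_growth(2000))[-5:])           # heavy-tailed degrees
```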
NASA Astrophysics Data System (ADS)
Hiraga, R.; Omura, Y.
2017-12-01
Recent observations show that chorus waves include fine structures such as amplitude fluctuations (i.e. sub-packet structure), and it has not yet been verified in detail how energetic electrons are efficiently accelerated under these wave features. In this study, we first focus on the acceleration process of a single electron: how it experiences an efficient energy increase through interaction with sub-packet chorus waves propagating parallel to the Earth's magnetic field. In order to reproduce the chorus waves seen in the latest observations by the Van Allen Probes (Foster et al. 2017), the wave amplitude in our simulation is modeled such that when the amplitude nonlinearly grows to reach the optimum amplitude, it starts decreasing until crossing the threshold. Once it crosses the threshold, the wave dissipates and a new wave rises to repeat the nonlinear growth and damping in the same manner. The multiple occurrence of this growth-damping cycle forms a sawtooth-like amplitude variation called a sub-packet. This amplitude variation also affects the wave frequency behavior, which is derived from the chorus wave equations as a function of the wave amplitude (Omura et al. 2009). It is also reasonable to assume that when a wave packet diminishes and the next wave rises, it has a random phase independent of the previous wave. This randomness (discontinuity) in phase variation is included in the simulation. The dynamics of energetic electrons were tracked through interaction with such waves. As a result, some electrons underwent an efficient acceleration process defined as successive entrapping, in which an electron successfully continues to surf the trapping potential generated by consecutive wave packets. When successive entrapping occurs, an electron trapped and then de-trapped (escaping the trapping potential) by one wave packet falls into the trapping potential generated by the next wave sub-packet and is continuously accelerated. The occurrence of successive entrapping is influenced by factors such as the magnitude of the wave amplitude, the inhomogeneity of the Earth's dipole magnetic field, and the energy range of the electrons. In this way, we have examined in detail how, and under which conditions, electrons are efficiently accelerated in the formation process of the radiation belts.
On the design of henon and logistic map-based random number generator
NASA Astrophysics Data System (ADS)
Magfirawaty; Suryadi, M. T.; Ramli, Kalamullah
2017-10-01
The key sequence is one of the main elements in a cryptosystem. The True Random Number Generator (TRNG) method is one of the approaches to generating the key sequence. The randomness sources of TRNGs are divided into three main groups, i.e. electrical-noise based, jitter based and chaos based. The chaos-based group utilizes a non-linear dynamic system (continuous time or discrete time) as an entropy source. In this study, a new design of TRNG based on a discrete-time chaotic system is proposed, which is then simulated in LabVIEW. The principle of the design consists of combining 2D and 1D chaotic systems. A mathematical model is implemented for numerical simulations. We used a comparator process as a harvester method to obtain the series of random bits. Without any post-processing, the proposed design generated random bit sequences with a high entropy value that passed all NIST 800.22 statistical tests.
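A software sketch of the combined 2D + 1D design: a Henon map and a logistic map iterated in parallel, with one bit per step harvested by a comparator. The rescaling and the comparator rule are illustrative assumptions, not the LabVIEW implementation:

```python
def bits_from_maps(n, x=0.1, y=0.3, z=0.5):
    a, b, r = 1.4, 0.3, 3.99         # classic chaotic parameter choices
    out = []
    for _ in range(n):
        x, y = 1 - a * x * x + y, b * x   # Henon map step (2D system)
        z = r * z * (1 - z)               # logistic map step (1D system)
        henon_norm = (x + 1.5) / 3.0      # rescale x to roughly [0, 1]
        out.append(1 if henon_norm > z else 0)   # comparator harvester
    return out

print(''.join(map(str, bits_from_maps(64))))
```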
Bayesian statistics and Monte Carlo methods
NASA Astrophysics Data System (ADS)
Koch, K. R.
2018-03-01
The Bayesian approach allows an intuitive way to derive the methods of statistics. Probability is defined as a measure of the plausibility of statements or propositions. Three rules are sufficient to obtain the laws of probability. If the statements refer to the numerical values of variables, the so-called random variables, univariate and multivariate distributions follow. They lead to point estimation, by which unknown quantities, i.e. unknown parameters, are computed from measurements. The unknown parameters are random variables; in traditional statistics, which is not founded on Bayes' theorem, they are fixed quantities. Bayesian statistics therefore recommends itself for Monte Carlo methods, which generate random variates from given distributions. Monte Carlo methods, of course, can also be applied in traditional statistics. The unknown parameters are introduced as functions of the measurements, and the Monte Carlo methods give the covariance matrix and the expectation of these functions. A confidence region is derived in which the unknown parameters are situated with a given probability. Following a method of traditional statistics, hypotheses are tested by determining whether a value for an unknown parameter lies inside or outside the confidence region. The error propagation of a random vector by the Monte Carlo methods is presented as an application. If the random vector results from a nonlinearly transformed vector, its covariance matrix and its expectation follow from the Monte Carlo estimate. This saves a considerable amount of derivatives to be computed and avoids linearization errors. The Monte Carlo method is therefore efficient. If the functions of the measurements are given by a sum of two or more random vectors with different multivariate distributions, the resulting distribution is generally not known. The Monte Carlo methods are then needed to obtain the covariance matrix and the expectation of the sum.
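The error-propagation application admits a compact sketch: sample the input distribution, push the samples through the nonlinear transformation, and read off the expectation and covariance directly, with no derivatives or linearization. The transformation and moments below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
mu = np.array([1.0, 2.0])
Sigma = np.array([[0.04, 0.01],
                  [0.01, 0.09]])

def f(x):
    # nonlinear transformation of the random vector (illustrative)
    return np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])

samples = rng.multivariate_normal(mu, Sigma, size=100_000)
fx = np.apply_along_axis(f, 1, samples)
print("E[f(X)] ~", fx.mean(axis=0))            # Monte Carlo expectation
print("Cov[f(X)] ~\n", np.cov(fx, rowvar=False))  # Monte Carlo covariance
```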
Pseudo-random bit generator based on lag time series
NASA Astrophysics Data System (ADS)
García-Martínez, M.; Campos-Cantón, E.
2014-12-01
In this paper, we present a pseudo-random bit generator (PRBG) based on two lag time series of the logistic map using positive and negative values of the bifurcation parameter. In order to hide the map used to build the pseudo-random series, we introduce a delay in the generation of the time series. When these new series are mapped as x_n against x_{n+1}, they present a cloud of points unrelated to the logistic map. Finally, the pseudo-random sequences have been tested with the NIST suite, giving satisfactory results for use in stream ciphers.
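A minimal sketch of a lag-based PRBG in this spirit; the lag length, the specific negative parameter value and the XOR combination are illustrative assumptions, not the authors' construction:

```python
def lag_prbg(n_bits, lag=5, x0=0.37, y0=0.21):
    x, y = x0, y0
    xs, ys = [], []
    for _ in range(n_bits + lag):
        x = 3.9999 * x * (1 - x)     # positive parameter, orbit in (0, 1)
        y = -1.9999 * y * (1 - y)    # negative parameter, orbit in (-0.5, 1.5)
        xs.append(x)
        ys.append(y)
    bits = []
    for i in range(n_bits):
        b1 = 1 if xs[i + lag] > 0.5 else 0   # lagged series hides the map
        b2 = 1 if ys[i] > 0.5 else 0         # midpoint of (-0.5, 1.5)
        bits.append(b1 ^ b2)
    return bits

print(''.join(map(str, lag_prbg(64))))
```

Because of the lag, plotting the emitted values as x_n against x_{n+1} no longer traces the parabola of the logistic map, which is the hiding effect described above.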
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahn, Peter H.; Ahn, Andrew I.; Lee, C. Joe
2009-02-01
Purpose: With 54° of freedom from the skull to mandible to C7, ensuring adequate immobilization for head-and-neck radiotherapy (RT) is complex. We quantify variations in skull, mandible, and cervical spine movement between RT sessions. Methods and Materials: Twenty-three sequential head-and-neck RT patients underwent serial computed tomography. Patients underwent planned rescanning at 11, 22, and 33 fractions for a total of 93 scans. Coordinates of multiple bony elements of the skull, mandible, and cervical spine were used to calculate rotational and translational changes of bony anatomy compared with the original planning scan. Results: Mean translational and rotational variations on rescanning were negligible, but showed a wide range. Changes in scoliosis and lordosis of the cervical spine between fractions showed similar variability. There was no correlation between positional variation and fraction number and no strong correlation with weight loss or skin separation. Semi-independent rotational and translation movement of the skull in relation to the lower cervical spine was shown. Positioning variability measured by means of vector displacement was largest in the mandible and lower cervical spine. Conclusions: Although only small overall variations in position between head-and-neck RT sessions exist on average, there is significant random variation in patient positioning of the skull, mandible, and cervical spine elements. Such variation is accentuated in the mandible and lower cervical spine. These random semirigid variations in positioning of the skull and spine point to a need for improved immobilization and/or confirmation of patient positioning in RT of the head and neck.
Estimating Cross-Site Impact Variation in the Presence of Heteroscedasticity
ERIC Educational Resources Information Center
Bloom, Howard S.; Porter, Kristin E.; Weiss, Michael J.; Raudenbush, Stephen
2013-01-01
To date, evaluation research and policy analysis have focused mainly on average program impacts and paid little systematic attention to their variation. Recently, the growing number of multi-site randomized trials that are being planned and conducted make it increasingly feasible to study "cross-site" variation in impacts. Important…
Zheng, Guanglou; Fang, Gengfa; Shankaran, Rajan; Orgun, Mehmet A; Zhou, Jie; Qiao, Li; Saleem, Kashif
2017-05-01
Generating random binary sequences (BSes) is a fundamental requirement in cryptography. A BS is a sequence of N bits, each with a value of 0 or 1. For securing sensors within wireless body area networks (WBANs), electrocardiogram (ECG)-based BS generation methods have been widely investigated, in which interpulse intervals (IPIs) from each heartbeat cycle are processed to produce BSes. Using these IPI-based methods to generate a 128-bit BS in real time normally takes around half a minute. In order to improve the time efficiency of such methods, this paper presents an ECG multiple fiducial-points based binary sequence generation (MFBSG) algorithm. The technique of discrete wavelet transforms is employed to detect the arrival times of fiducial points such as the P, Q, R, S, and T peaks. Time intervals between them, including RR, RQ, RS, RP, and RT intervals, are then calculated from these arrival times and used as ECG features to generate random BSes with low latency. According to our analysis of real ECG data, these ECG feature values exhibit the property of randomness and, thus, can be utilized to generate random BSes. Compared with schemes that rely solely on IPIs to generate BSes, this MFBSG algorithm uses five feature values from one heartbeat cycle, and can be up to five times faster than the solely IPI-based methods, achieving the design goal of low latency. According to our analysis, the complexity of the algorithm is comparable to that of fast Fourier transforms. These randomly generated ECG BSes can be used as security keys for encryption or authentication in a WBAN system.
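A sketch of the feature-harvesting step using simulated fiducial-point times (a real implementation would detect the P, Q, R, S and T peaks with a discrete wavelet transform, as the paper does); the offsets, jitter magnitudes and LSB extraction rule are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(11)
n_beats = 64
r_times = np.cumsum(0.8 + 0.05 * rng.standard_normal(n_beats))  # R peaks, s
# offsets of P, Q, S, T relative to R (seconds, with physiological jitter)
offs = {"P": -0.16, "Q": -0.04, "S": 0.04, "T": 0.30}
peaks = {k: r_times + v + 0.004 * rng.standard_normal(n_beats)
         for k, v in offs.items()}

bits = []
for name in ("P", "Q", "S", "T"):
    iv_ms = np.abs(r_times - peaks[name]) * 1000.0  # RP, RQ, RS, RT intervals
    bits.extend((iv_ms.astype(int) & 1).tolist())   # LSB of each interval
rr_ms = np.diff(r_times) * 1000.0
bits.extend((rr_ms.astype(int) & 1).tolist())       # RR-interval (IPI) bits
print(len(bits), np.mean(bits))                     # ~5 bits per beat
```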
Random Item Generation Is Affected by Age
ERIC Educational Resources Information Center
Multani, Namita; Rudzicz, Frank; Wong, Wing Yiu Stephanie; Namasivayam, Aravind Kumar; van Lieshout, Pascal
2016-01-01
Purpose: Random item generation (RIG) involves central executive functioning. Measuring aspects of random sequences can therefore provide a simple method to complement other tools for cognitive assessment. We examine the extent to which RIG relates to specific measures of cognitive function, and whether those measures can be estimated using RIG…
NASA Astrophysics Data System (ADS)
Molotkov, S. N.
2017-03-01
Various methods for the clustering of photocounts constituting a sequence of random numbers are considered. It is shown that the clustering of photocounts resulting in the Fermi-Dirac distribution makes it possible to achieve the theoretical limit of the random number generation rate.
Measuring larval nematode contamination on cattle pastures: Comparing two herbage sampling methods.
Verschave, S H; Levecke, B; Duchateau, L; Vercruysse, J; Charlier, J
2015-06-15
Assessing levels of pasture larval contamination is frequently used to study the population dynamics of the free-living stages of parasitic nematodes of livestock. Direct quantification of infective larvae (L3) on herbage is the most widely applied method to measure pasture larval contamination (PLC). However, herbage collection remains labour intensive and there is a lack of studies addressing the variation induced by the sampling method and the required sample size. The aim of this study was (1) to compare two different sampling methods in terms of pasture larval count results and time required to sample, (2) to assess the amount of variation in larval counts at the level of sample plot, pasture and season, respectively and (3) to calculate the required sample size to assess PLC with a predefined precision using random plots across pasture. Eight young stock pastures of different commercial dairy herds were sampled in three consecutive seasons during the grazing season (spring, summer and autumn). On each pasture, herbage samples were collected through both a double-crossed W-transect with samples taken every 10 steps (method 1) and four randomly located plots of 0.16 m² with collection of all herbage within the plot (method 2). The average (± standard deviation, SD) pasture larval contamination using sampling methods 1 and 2 was 325 (± 479) and 305 (± 444) L3/kg dry herbage (DH), respectively. Large discrepancies in pasture larval counts of the same pasture and season were often seen between methods, but no significant difference (P = 0.38) in larval counts between methods was found. Less time was required to collect samples with method 2. This difference in collection time between methods was most pronounced for pastures with a surface area larger than 1 ha. The variation in pasture larval counts from samples generated by random plot sampling was mainly due to the repeated measurements on the same pasture in the same season (residual variance component = 6.2), rather than due to pasture (variance component = 0.55) or season (variance component = 0.15). Using the observed distribution of L3, the required sample size (i.e. number of plots per pasture) for sampling a pasture through random plots with a particular precision was simulated. A higher relative precision was acquired when estimating PLC on pastures with a high larval contamination and a low level of aggregation compared to pastures with a low larval contamination when the same sample size was applied. In the future, herbage sampling through random plots across pasture (method 2) seems a promising method to develop further, as no significant difference in counts between the methods was found and this method was less time consuming. Copyright © 2015 Elsevier B.V. All rights reserved.
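A sample-size simulation of the kind described can be sketched in a few lines of Python, under stated assumptions: plot counts are drawn from a negative binomial distribution (a standard model for aggregated larval counts), and the mean and aggregation parameter k below are illustrative, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(1)

def required_plots(mean_l3, k, target_cv=0.2, n_sim=2000, n_max=50):
    """Smallest number of plots whose mean count reaches a target relative
    precision (CV), with counts drawn from a negative binomial model of
    aggregated larvae. mean_l3 and k are illustrative assumptions."""
    p = k / (k + mean_l3)          # numpy NB(n=k, p): mean = k(1-p)/p
    for n in range(2, n_max + 1):
        counts = rng.negative_binomial(k, p, size=(n_sim, n))
        cv = counts.mean(axis=1).std() / mean_l3   # precision of plot mean
        if cv <= target_cv:
            return n
    return None

print(required_plots(mean_l3=305, k=0.5))   # small k = strong aggregation
```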
Freeman, Lindsay M; Pang, Lin; Fainman, Yeshaiahu
2018-05-09
The analysis of DNA has led to revolutionary advancements in the fields of medical diagnostics, genomics, prenatal screening, and forensic science, with the global DNA testing market expected to reach revenues of USD 10.04 billion per year by 2020. However, the current methods for DNA analysis remain dependent on fluorophores or conjugated proteins, leading to high costs associated with consumable materials and manual labor. Here, we demonstrate a potential label-free DNA composition detection method using surface-enhanced Raman spectroscopy (SERS) in which we identify the composition of cytosine and adenine within single strands of DNA. This approach depends on the fact that there is one phosphate backbone per nucleotide, which we use as a reference to compensate for systematic measurement variations. We utilize plasmonic nanomaterials with random Raman sampling to perform label-free detection of the nucleotide composition within DNA strands, generating a calibration curve from standard samples of DNA and demonstrating the capability of resolving the nucleotide composition. This work represents an innovative way to detect the nucleotide composition of DNA strands without the need for attached labels, offering a highly sensitive and reproducible method that factors in random sampling to minimize error.
López-Jaramillo, Patricio; Rincón, Melvin Y.; García, Ronald G.; Silva, Sandra Y.; Smith, Erin; Kampeerapappun, Piyaporn; García, Carlos; Smith, Daniel J.; López, Marcos; Vélez, Iván D.
2010-01-01
A topical nanofiber nitric oxide (NO) releasing patch (≈3.5 μmol NO/cm²/day for 20 days, NOP) was compared with intramuscular meglumine antimoniate (Glucantime, 20 mg/kg/day for 20 days) for the treatment of cutaneous leishmaniasis (CL) caused by Leishmania (V.) panamensis in Santander and Tolima, Colombia. A double-blind, randomized, placebo-controlled clinical trial was conducted to determine whether the NOP is as effective as Glucantime for the treatment of CL. Patients were randomly assigned to Glucantime and placebo patches or to NOP and a Glucantime placebo. The cure rates after a 3-month follow-up were 94.8% in the group that received Glucantime compared with 37.1% in the NOP group. Despite the lower efficacy of the NOP versus Glucantime, a significantly lower frequency of non-serious adverse events and a reduced variation in serum markers were observed in patients treated with NOP. Treatment of CL with NOP resulted in lower effectiveness compared with Glucantime; however, the low frequency of adverse events and the ease of topical administration justify the development of new generations of NOP systems for the treatment of CL. PMID:20595484
Genome Sequencing of Steroid Producing Bacteria Using Ion Torrent Technology and a Reference Genome.
Sola-Landa, Alberto; Rodríguez-García, Antonio; Barreiro, Carlos; Pérez-Redondo, Rosario
2017-01-01
Next-generation sequencing technology has enormously eased bacterial genome sequencing, and several tens of thousands of genomes have been sequenced during the last 10 years. Most genome projects are published as draft versions; however, for certain applications the complete genome sequence is required. In this chapter, we describe the strategy that allowed the complete genome sequencing of Mycobacterium neoaurum NRRL B-3805, an industrial strain exploited for steroid production, using Ion Torrent sequencing reads and the genome of a close strain as the reference. This protocol can be applied to analyze the genetic variations between closely related strains; for example, to elucidate the point mutations between a parental strain and a random mutagenesis-derived mutant.
NASA Astrophysics Data System (ADS)
Zarubin, V. S.; Sergeeva, E. S.
2017-11-01
This paper outlines simulation models that represent the quantitative interdependencies between the thermal conductivity and thermoelastic properties of composites, on the one hand, and their porous structure, matrix properties, and the volume fraction of their reinforcing inclusions, on the other. The reinforcing inclusions are taken to be randomly oriented anisotropic single-wall carbon nanotubes (SWNT). The key means for constructing the simulation models are the self-matching method and the dual variational formulation of the thermal conductivity/thermoelasticity problem for a non-homogeneous solid body. With these simulation models, it is possible to estimate the effect of porosity on the thermoelastic properties and thermal conductivity of such nanocomposites.
Turbulent mixing induced by Richtmyer-Meshkov instability
NASA Astrophysics Data System (ADS)
Krivets, V. V.; Ferguson, K. J.; Jacobs, J. W.
2017-01-01
Richtmyer-Meshkov instability is studied in shock tube experiments with an Atwood number of 0.7. The interface is formed in a vertical shock tube using opposed gas flows, and three-dimensional random initial interface perturbations are generated by the vertical oscillation of the gas column, producing Faraday waves. Planar laser Mie scattering is used for flow visualization and for measurements of the mixing process. Experimental image sequences are recorded at 6 kHz frequency and processed to obtain the time-dependent variation of the integral mixing layer width. Measurements of the mixing layer width are compared with Mikaelian's [1] model in order to extract the growth exponent θ, for which a fairly wide range of values is found, varying from θ ≈ 0.2 to 0.6.
The correlation structure of several popular pseudorandom number generators
NASA Technical Reports Server (NTRS)
Neuman, F.; Merrick, R.; Martin, C. F.
1973-01-01
One of the desirable properties of a pseudorandom number generator is that the sequence of numbers it generates should have very low autocorrelation for all shifts except zero shift and multiples of its cycle length. Due to the simple methods of constructing random numbers, this ideal is often not quite fulfilled. A simple method of examining any random generator for previously unsuspected regularities is discussed. Once they are discovered, it is often easy to derive the mathematical relationships that describe the regular behavior. As examples, it is shown that high correlation exists in mixed and multiplicative congruential random number generators and prime-moduli Lehmer generators for shifts that are a fraction of their cycle lengths.
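A minimal scan of this kind is easy to reproduce. The Python sketch below estimates the autocorrelation of a Lehmer generator at several shifts; the Park-Miller constants are illustrative, and detecting fractional-cycle regularities in a full-period generator would require much longer samples than shown here.

```python
import numpy as np

def lehmer(seed, a=16807, m=2**31 - 1, n=4096):
    """Classic prime-modulus Lehmer generator (Park-Miller constants)."""
    x, out = seed, np.empty(n)
    for i in range(n):
        x = (a * x) % m
        out[i] = x / m
    return out

u = lehmer(12345) - 0.5          # center the sequence
# A good generator should show correlations near zero at every lag > 0.
for lag in (1, 2, 64, 1024):
    r = np.corrcoef(u[:-lag], u[lag:])[0, 1]
    print(f"lag {lag:5d}: r = {r:+.4f}")
```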
Zahran, Sammy; Tavani, Daniele; Weiler, Stephan
2013-07-01
Casualties from natural disasters may depend on the day of the week they strike. With data from the Spatial Hazard Events and Losses Database for the United States (SHELDUS), daily variation in hurricane and tornado casualties from 5,043 tornado and 2,455 hurricane time/place events is analyzed. Hurricane forecasts provide at-risk populations with considerable lead time. Such lead time allows strategic behavior in choosing protective measures under hurricane threat; opportunity costs in terms of lost income are higher during weekdays than during weekends. On the other hand, the lead time provided by tornadoes is near zero; hence tornadoes generate no opportunity costs. Tornado casualties are related to risk information flows, which are higher during workdays than during leisure periods, and are related to sheltering-in-place opportunities, which are better in permanent buildings like businesses and schools. Consistent with theoretical expectations, random effects negative binomial regression results indicate that tornado events occurring on the workdays of Monday through Thursday are significantly less lethal than tornadoes that occur on weekends. In direct contrast, and also consistent with theory, the expected count of hurricane casualties increases significantly with weekday occurrences. The policy implications of observed daily variation in tornado and hurricane events are considered. © 2012 Society for Risk Analysis.
Mohamed, Somaia; Rosenheck, Robert A; Lin, Haiqun; Swartz, Marvin; McEvoy, Joseph; Stroup, Scott
2015-07-01
No large-scale randomized trial has compared the effect of different second-generation antipsychotic drugs and any first-generation drug on alcohol, drug and nicotine use in patients with schizophrenia. The Clinical Antipsychotic Trials of Intervention Effectiveness study randomly assigned 1432 patients formally diagnosed with schizophrenia to four second-generation antipsychotic drugs (olanzapine, risperidone, quetiapine, and ziprasidone) and one first-generation antipsychotic (perphenazine) and followed them for up to 18 months. Secondary outcome data documented cigarettes smoked in the past week and alcohol and drug use severity ratings. At baseline, 61% of patients smoked, 35% used alcohol, and 23% used illicit drugs. Although there were significant effects of time showing reduction in substance use over the 18 months (all p < 0.0001), this study found no evidence that any antipsychotic was robustly superior to any other in a secondary analysis of data on substance use outcomes from a large 18-month randomized schizophrenia trial.
High-Speed Device-Independent Quantum Random Number Generation without a Detection Loophole
NASA Astrophysics Data System (ADS)
Liu, Yang; Yuan, Xiao; Li, Ming-Han; Zhang, Weijun; Zhao, Qi; Zhong, Jiaqiang; Cao, Yuan; Li, Yu-Huai; Chen, Luo-Kan; Li, Hao; Peng, Tianyi; Chen, Yu-Ao; Peng, Cheng-Zhi; Shi, Sheng-Cai; Wang, Zhen; You, Lixing; Ma, Xiongfeng; Fan, Jingyun; Zhang, Qiang; Pan, Jian-Wei
2018-01-01
Quantum mechanics provides the means of generating genuine randomness that is impossible with deterministic classical processes. Remarkably, the unpredictability of randomness can be certified in a manner that is independent of implementation devices. Here, we present an experimental study of device-independent quantum random number generation based on a detection-loophole-free Bell test with entangled photons. In the randomness analysis, without the independent identical distribution assumption, we consider the worst case scenario that the adversary launches the most powerful attacks against the quantum adversary. After considering statistical fluctuations and applying an 80 Gb × 45.6 Mb Toeplitz matrix hashing, we achieve a final random bit rate of 114 bits/s, with a failure probability less than 10^{-5}. This marks a critical step towards realistic applications in cryptography and fundamental physics tests.
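For readers unfamiliar with Toeplitz-matrix hashing, the toy Python sketch below shows the construction at a small scale; the dimensions and random seed are illustrative stand-ins, far from the 80 Gb × 45.6 Mb matrix used in the experiment.

```python
import numpy as np

rng = np.random.default_rng(7)

def toeplitz_extract(raw_bits, n_out, seed_bits=None):
    """Toy Toeplitz-hashing extractor: a binary Toeplitz matrix T of shape
    (n_out, n_in) is fixed by a seed of n_in + n_out - 1 bits, with
    T[i, j] = seed[i - j + n_in - 1]; the output is T @ raw mod 2."""
    n_in = len(raw_bits)
    if seed_bits is None:
        seed_bits = rng.integers(0, 2, n_in + n_out - 1)
    idx = np.arange(n_out)[:, None] - np.arange(n_in)[None, :] + (n_in - 1)
    return (seed_bits[idx] @ raw_bits) % 2

raw = rng.integers(0, 2, 1024)           # stand-in for imperfect raw bits
key = toeplitz_extract(raw, n_out=512)   # 2:1 compression of the raw data
```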
NASA Astrophysics Data System (ADS)
Leetmaa, Mikael; Skorodumova, Natalia V.
2015-11-01
We here present a revised version, v1.1, of the KMCLib general framework for kinetic Monte-Carlo (KMC) simulations. The generation of random numbers in KMCLib now relies on the C++11 standard library implementation, and support has been added for the user to choose from a set of C++11 implemented random number generators. The Mersenne-twister, the 24 and 48 bit RANLUX and a 'minimal-standard' PRNG are supported. We have also included the possibility to use true random numbers via the C++11 std::random_device generator. This release also includes technical updates to support the use of an extended range of operating systems and compilers.
Minimalist design of a robust real-time quantum random number generator
NASA Astrophysics Data System (ADS)
Kravtsov, K. S.; Radchenko, I. V.; Kulik, S. P.; Molotkov, S. N.
2015-08-01
We present a simple and robust construction of a real-time quantum random number generator (QRNG). Our minimalist approach ensures stable operation of the device as well as its simple and straightforward hardware implementation as a stand-alone module. As a source of randomness the device uses measurements of time intervals between clicks of a single-photon detector. The obtained raw sequence is then filtered and processed by a deterministic randomness extractor, which is realized as a look-up table. This enables high-speed on-the-fly processing without the need for extensive computations. The overall performance of the device is around 1 random bit per detector click, resulting in a 1.2 Mbit/s generation rate in our implementation.
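The click-timing approach can be schematized in a few lines of Python. The exponential inter-click model and the two-entry von Neumann table below are stand-ins for the real detector stream and the paper's larger look-up-table extractor.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for the hardware: inter-click times of a single-photon
# detector are modeled as exponential.
dt = rng.exponential(scale=1.0, size=200_000)

# One raw bit per pair of intervals: compare successive inter-click times.
pairs = dt.reshape(-1, 2)
raw = (pairs[:, 0] < pairs[:, 1]).astype(np.uint8)

# Deterministic extractor realized as a look-up table. The classic
# von Neumann table over 2-bit words (01 -> 0, 10 -> 1, 00/11 discarded)
# stands in for the paper's larger LUT.
LUT = {(0, 1): 0, (1, 0): 1}
bits = [LUT[(a, b)] for a, b in zip(raw[::2], raw[1::2]) if (a, b) in LUT]
print(len(bits), float(np.mean(bits)))
```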
Abzhanov, Arhat
2017-12-01
In 1917, the publication of On Growth and Form by D'Arcy Wentworth Thompson challenged both mathematicians and naturalists to think about biological shapes and diversity as more than a confusion of chaotic forms generated at random, but rather as geometric shapes that could be described by principles of physics and mathematics. Thompson's work was based on the ideas of Galileo and Goethe on morphology and of Russell on functionalism, but he was first to postulate that physical forces and internal growth parameters regulate biological forms and could be revealed via geometric transformations in morphological space. Such precise mathematical structure suggested a unifying generative process, as reflected in the title of the book. To Thompson it was growth that could explain the generation of any particular biological form, and changes in ontogeny, rather than natural selection, could then explain the diversity of biological shapes. Whereas adaptationism, widely accepted in evolutionary biology, gives primacy to extrinsic factors in producing morphological variation, Thompson's 'laws of growth' provide intrinsic directives and constraints for the generation of individual shapes, helping to explain the 'profusion of forms, colours, and other modifications' observed in the living world. © 2017. Published by The Company of Biologists Ltd.
Spline methods for approximating quantile functions and generating random samples
NASA Technical Reports Server (NTRS)
Schiess, J. R.; Matthews, C. G.
1985-01-01
Two cubic spline formulations are presented for representing the quantile function (inverse cumulative distribution function) of a random sample of data. Both B-spline and rational spline approximations are compared with analytic representations of the quantile function. It is also shown how these representations can be used to generate random samples for use in simulation studies. Comparisons are made on samples generated from known distributions and a sample of experimental data. The spline representations are more accurate for multimodal and skewed samples and require much less time to generate samples than the analytic representation.
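A minimal Python analogue of the quantile-function approach, using an interpolating cubic spline in place of the paper's B-spline and rational-spline fits, looks like this:

```python
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(0)

# Fit a cubic-spline representation of the quantile function of a sample,
# then generate new variates by inverse-transform sampling.
sample = np.sort(rng.gamma(shape=2.0, size=500))          # skewed test data
p = (np.arange(1, sample.size + 1) - 0.5) / sample.size   # plotting positions
Q = CubicSpline(p, sample)                                # Q(p): quantile fn

# U(0,1) draws mapped through Q(p); draws beyond the fitted range of p
# extrapolate the spline tails.
new_draws = Q(rng.uniform(size=10_000))
```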
Münte, Thomas F; Joppich, Gregor; Däuper, Jan; Schrader, Christoph; Dengler, Reinhard; Heldmann, Marcus
2015-01-01
The generation of random sequences is considered to tax executive functions and has previously been reported to be impaired in Parkinson's disease (PD). The aim was to assess the neurophysiological markers of random number generation in PD. Event-related potentials (ERP) were recorded in 12 PD patients and 12 age-matched normal controls (NC) while either engaging in random number generation (RNG) by pressing the number keys on a computer keyboard in a random sequence or in ordered number generation (ONG), necessitating key presses in the canonical order. Key presses were paced by an external auditory stimulus at a rate of 1 tone every 1800 ms. As a secondary task, subjects had to monitor the tone sequence for a particular target tone, to which the number "0" key had to be pressed. This target tone occurred randomly and infrequently, thus creating a secondary oddball task. Behaviorally, PD patients showed an increased tendency to count in steps of one as well as a tendency towards repetition avoidance. Electrophysiologically, the amplitude of the P3 component of the ERP to the target tone of the secondary task was reduced during RNG in PD but not in NC. The behavioral findings indicate less random behavior in PD, while the ERP findings suggest that this impairment comes about because attentional resources are depleted in PD.
Anderson localization for radial tree-like random quantum graphs
NASA Astrophysics Data System (ADS)
Hislop, Peter D.; Post, Olaf
We prove that certain random models associated with radial, tree-like, rooted quantum graphs exhibit Anderson localization at all energies. The two main examples are the random length model (RLM) and the random Kirchhoff model (RKM). In the RLM, the lengths of each generation of edges form a family of independent, identically distributed random variables (iid). For the RKM, the iid random variables are associated with each generation of vertices and moderate the current flow through the vertex. We consider extensions to various families of decorated graphs and prove stability of localization with respect to decoration. In particular, we prove Anderson localization for the random necklace model.
The Reliability of Randomly Generated Math Curriculum-Based Measurements
ERIC Educational Resources Information Center
Strait, Gerald G.; Smith, Bradley H.; Pender, Carolyn; Malone, Patrick S.; Roberts, Jarod; Hall, John D.
2015-01-01
"Curriculum-Based Measurement" (CBM) is a direct method of academic assessment used to screen and evaluate students' skills and monitor their responses to academic instruction and intervention. Interventioncentral.org offers a math worksheet generator at no cost that creates randomly generated "math curriculum-based measures"…
Space Weather Effects on the Dynamics of Equatorial F Region Irregularities
NASA Astrophysics Data System (ADS)
Bhattacharyya, A.; Basu, S.; Groves, K.; Valladares, C.; Sheehan, R.
Space weather effects on transionospheric radio waves used for navigation and communication may be divided into two categories depending on the spatial scale size of the ionospheric perturbation produced by such effects. For large-scale (> 10 km) perturbations in the ionospheric plasma density, there are changes in the excess time delay for a radio wave signal, which propagates through the ionosphere, while small-scale (< 1 m) structures or irregularities in the ionosphere may give rise to amplitude and phase scintillations on UHF/L-band radio waves, resulting in loss of data, cycle slips and loss of phase lock for signals used in communication/navigation systems. In the equatorial region, where such effects may be severe, space weather effects on the dynamics of equatorial spread F (ESF) irregularities are studied from two different angles. The first one deals with the effect of magnetic activity on the generation of ESF irregularities by helping or hindering the growth of the Rayleigh-Taylor (R-T) instability in the post-sunset equatorial F region. For this purpose, spaced receiver observations of scintillations on a UHF signal transmitted from a geostationary satellite and recorded near the dip equator are used to establish the 'age' of the irregularities. This is necessary because the occurrence of scintillations, particularly in the post-midnight period, may also be due to irregularities which drift into the path of the radio wave signal after having been generated more than 3 hours before the actual observation of scintillations. In order to associate the generation of irregularities with major changes in space weather, a parameter that is a measure of random variations in irregularity drift speed is computed from spaced receiver scintillation data. A large value of this parameter is usually a signature of random variations in irregularity drift due to polarization electric fields associated with freshly generated irregularities. Once these electric fields decay, the irregularities drift with the background plasma. This allows a study of the other effect of space weather on the dynamics of equatorial F region irregularities, viz. magnetically disturbed ionospheric drifts in the equatorial region. The drifts estimated for magnetically quiet days with ESF, within a period of a month, display far less variability than the quiet-time variability for non-ESF days, thus making it possible to quantify perturbations in irregularity drift due to disturbance dynamo electric fields and/or prompt penetration of transient magnetospheric electric fields.
Lague, Michael R; Collard, Nicole J; Richmond, Brian G; Wood, Bernard A
2008-12-01
Mandibular corpora are well represented in the hominin fossil record, yet few studies have rigorously assessed the utility of mandibular corpus morphology for species recognition, particularly with respect to the linear dimensions that are most commonly available. In this study, we explored the extent to which commonly preserved mandibular corpus morphology can be used to: (i) discriminate among extant hominid taxa and (ii) support species designations among fossil specimens assigned to the genus Homo. In the first part of the study, discriminant analysis was used to test for significant differences in mandibular corpus shape at different taxonomic levels (genus, species and subspecies) among extant hominid taxa (i.e. Homo, Pan, Gorilla, Pongo). In the second part of the study, we examined shape variation among fossil mandibles assigned to Homo (including H. habilis sensu stricto, H. rudolfensis, early African H. erectus/H. ergaster, late African H. erectus, Asian H. erectus, H. heidelbergensis, H. neanderthalensis and H. sapiens). A novel randomization procedure designed for small samples (and using group 'distinctness values') was used to determine whether shape variation among the fossils is consistent with conventional taxonomy (or alternatively, whether a priori taxonomic groupings are completely random with respect to mandibular morphology). The randomization of 'distinctness values' was also used on the extant samples to assess the ability of the test to recognize known taxa. The discriminant analysis results demonstrated that, even for a relatively modest set of traditional mandibular corpus measurements, we can detect significant differences among extant hominids at the genus and species levels, and, in some cases, also at the subspecies level. Although the randomization of 'distinctness values' test is more conservative than discriminant analysis (based on comparisons with extant specimens), we were able to detect at least four distinct groups among the fossil specimens (i.e. H. sapiens, H. heidelbergensis, Asian H. erectus and a combined 'African Homo' group consisting of H. habilis sensu stricto, H. rudolfensis, early African H. erectus/H. ergaster and late African H. erectus). These four groups appear to be distinct at a level similar to, or greater than, that of modern hominid species. In addition, the mandibular corpora of H. neanderthalensis could be distinguished from those of 'African Homo', although not from those of H. sapiens, H. heidelbergensis, or the Asian H. erectus group. The results suggest that the features most commonly preserved on the hominin mandibular corpus have some taxonomic utility, although they are unlikely to be useful in generating a reliable alpha taxonomy for early African members of the genus Homo.
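The 'distinctness value' metric itself is specific to the paper above, but the label-randomization logic it relies on follows a standard pattern, sketched here in Python with a generic between-group statistic as a placeholder.

```python
import numpy as np

rng = np.random.default_rng(3)

def randomization_p(groups, stat, n_perm=9999):
    """Generic label-randomization test skeleton: how often does shuffling
    group labels produce a between-group statistic at least as extreme as
    the observed one? `stat` is a placeholder, not the authors' specific
    small-sample 'distinctness value'."""
    observed = stat(groups)
    pooled = np.concatenate(groups)
    cuts = np.cumsum([len(g) for g in groups])[:-1]
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if stat(np.split(pooled, cuts)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# Toy example: variance of group means as a crude 'distinctness' measure.
stat = lambda gs: np.var([g.mean() for g in gs])
groups = [rng.normal(loc, 1.0, size=12) for loc in (0.0, 0.4, 0.8)]
print(randomization_p(groups, stat))
```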
Lotka-Volterra systems in environments with randomly disordered temporal periodicity
NASA Astrophysics Data System (ADS)
Naess, Arvid; Dimentberg, Michael F.; Gaidai, Oleg
2008-08-01
A generalized Lotka-Volterra model for a pair of interacting populations of predators and prey is studied. The model accounts for the prey’s interspecies competition and therefore is asymptotically stable, whereas its oscillatory behavior is induced by temporal variations in environmental conditions simulated by those in the prey’s reproduction rate. Two models of the variations are considered, each of them combining randomness with “hidden” periodicity. The stationary joint probability density function (PDF) of the number of predators and prey is calculated numerically by the path integration (PI) method based on the use of characteristic functions and the fast Fourier transform. The numerical results match those for the asymptotic case of white-noise variations for which an analytical solution is available. Several examples are studied, with calculations of important characteristics of oscillations, for example the expected rate of up-crossings given the level of the predator number. The calculated PDFs may be of predominantly random (unimodal) or predominantly periodic nature (bimodal). Thus, the PI method has been demonstrated to be a powerful tool for studies of the dynamics of predator-prey pairs. The method captures the random oscillations as observed in nature, taking into account potential periodicity in the environmental conditions.
Lotka-Volterra systems in environments with randomly disordered temporal periodicity.
Naess, Arvid; Dimentberg, Michael F; Gaidai, Oleg
2008-08-01
A generalized Lotka-Volterra model for a pair of interacting populations of predators and prey is studied. The model accounts for the prey's interspecies competition and therefore is asymptotically stable, whereas its oscillatory behavior is induced by temporal variations in environmental conditions simulated by those in the prey's reproduction rate. Two models of the variations are considered, each of them combining randomness with "hidden" periodicity. The stationary joint probability density function (PDF) of the number of predators and prey is calculated numerically by the path integration (PI) method based on the use of characteristic functions and the fast Fourier transform. The numerical results match those for the asymptotic case of white-noise variations for which an analytical solution is available. Several examples are studied, with calculations of important characteristics of oscillations, for example the expected rate of up-crossings given the level of the predator number. The calculated PDFs may be of predominantly random (unimodal) or predominantly periodic nature (bimodal). Thus, the PI method has been demonstrated to be a powerful tool for studies of the dynamics of predator-prey pairs. The method captures the random oscillations as observed in nature, taking into account potential periodicity in the environmental conditions.
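A forward simulation of such a system is straightforward, even though the paper computes the stationary PDF by path integration rather than by Monte Carlo. The Euler-Maruyama sketch below uses illustrative coefficients; the reproduction rate combines a sinusoid with a randomly drifting phase, in the spirit of the "hidden periodicity" models considered.

```python
import numpy as np

rng = np.random.default_rng(8)

# Euler-Maruyama sketch of a generalized Lotka-Volterra pair whose prey
# reproduction rate mixes a "hidden" periodicity with a randomly drifting
# phase. All coefficients are illustrative assumptions, not the paper's.
dt, n = 0.001, 200_000
a0, eps, omega, sigma = 1.0, 0.5, 2.0, 0.3   # reproduction-rate model
b, c, d, e = 1.0, 0.2, 1.0, 1.0              # interaction and competition
x, y, phi = 1.0, 0.5, 0.0                    # prey, predators, phase
traj = np.empty((n, 2))
for i in range(n):
    a = a0 * (1.0 + eps * np.sin(omega * i * dt + phi))
    x += x * (a - b * y - c * x) * dt        # prey, with self-competition
    y += y * (e * x - d) * dt                # predators
    phi += sigma * np.sqrt(dt) * rng.normal()  # random phase disorder
    traj[i] = x, y
# A 2D histogram of traj approximates the stationary joint PDF that the
# paper obtains by path integration.
```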
Investigating the generation of Love waves in secondary microseisms using 3D numerical simulations
NASA Astrophysics Data System (ADS)
Wenk, Stefan; Hadziioannou, Celine; Pelties, Christian; Igel, Heiner
2014-05-01
Longuet-Higgins (1950) proposed that secondary microseismic noise can be attributed to oceanic disturbances by surface gravity wave interference causing non-linear, second-order pressure perturbations at the ocean bottom. As a first approximation, this source mechanism can be considered as a force acting normal to the ocean bottom. In an isotropic, layered, elastic Earth model with plain interfaces, vertical forces generate P-SV motions in the vertical plane of source and receiver. In turn, only Rayleigh waves are excited at the free surface. However, several authors report significant Love wave contributions in the secondary microseismic frequency band of real data measurements. The reason is still insufficiently analysed and several hypotheses are under debate: - The source mechanism has the strongest influence on the excitation of shear motions, whereas the source direction dominates the effect of Love wave generation in the case of point force sources. Darbyshire and Okeke (1969) proposed the topographic coupling effect of pressure loads acting on a sloping sea-floor to generate the shear tractions required for Love wave excitation. - Rayleigh waves can be converted into Love waves by scattering. Therefore, geometric scattering at topographic features or internal scattering by heterogeneous material distributions can cause Love wave generation. - Oceanic disturbances act on large regions of the ocean bottom, and extended sources have to be considered. In combination with topographic coupling and internal scattering, the extent of the source region and the timing of an extended source should affect Love wave excitation. We try to elucidate the contribution of different source mechanisms and scattering effects to Love-to-Rayleigh wave energy ratios by 3D numerical simulations. In particular, we estimate the amount of Love wave energy generated by point and extended sources acting on the free surface. Simulated point forces are modified in their incident angle, whereas extended sources are varied in their spatial extent, magnitude and timing. Further, the effect of variations in the correlation length and perturbation magnitude of a random free-surface topography, as well as of an internal random material distribution, is studied.
Quantum random number generator based on quantum nature of vacuum fluctuations
NASA Astrophysics Data System (ADS)
Ivanova, A. E.; Chivilikhin, S. A.; Gleim, A. V.
2017-11-01
A quantum random number generator (QRNG) allows obtaining true random bit sequences. In QRNGs based on the quantum nature of vacuum, an optical beam splitter with two inputs and two outputs is normally used. We compare the mathematical descriptions of a spatial beam splitter and a fiber Y-splitter in the quantum model for a QRNG based on homodyne detection. These descriptions are identical, which allows fiber Y-splitters to be used in practical QRNG schemes, simplifying the setup. We also derive relations between the input radiation and the resulting differential current in the homodyne detector. We experimentally demonstrate the possibility of true random bit generation using a QRNG based on homodyne detection with a Y-splitter.
A novel method for rotation invariant palm print image stitching
NASA Astrophysics Data System (ADS)
Rao, Shishir Paramathma; Panetta, Karen; Agaian, Sos S.
2017-05-01
Although not as popular as fingerprint biometrics, palm prints have garnered interest in the scientific community for the rich amount of distinctive information available on the palm. In this paper, a novel method for touchless palm print stitching to increase the effective palm area is presented. The method is not only rotation invariant but also robust to many distortions of touchless systems, such as illumination and pose variations. The proposed method can also handle partial palmprints, which have a high chance of occurrence at a crime scene, by stitching them together to produce a much larger, up-to-full-size palmprint for authentication purposes. Experimental results are shown for the IIT-D palmprint database, from which pseudo-partial palmprints were generated by cropping and random rotation. Furthermore, the quality of the stitching algorithm is assessed by extensive computer simulations and visual analysis of the stitched image. Experimental results also show that stitching significantly increases the palm area available for feature point detection and hence provides a way to increase the accuracy and reliability of detection.
NASA Astrophysics Data System (ADS)
Moreto, Jose; Liu, Xiaofeng
2017-11-01
The accuracy of the Rotating Parallel Ray omnidirectional integration for pressure reconstruction from the measured pressure gradient (Liu et al., AIAA paper 2016-1049) is evaluated against both the Circular Virtual Boundary omnidirectional integration (Liu and Katz, 2006 and 2013) and the conventional Poisson equation approach. A Dirichlet condition at one boundary point and Neumann conditions at all other boundary points are applied to the Poisson solver. A direct numerical simulation database of isotropic turbulence flow (JHTDB), with a homogeneously distributed random noise added to the entire field of DNS pressure gradient, is used to assess the performance of the methods. The random noise, generated by the Matlab function Rand, has a magnitude varying randomly within the range of ±40% of the maximum DNS pressure gradient. To account for the effect of the noise distribution pattern on the reconstructed pressure accuracy, a total of 1000 different noise distributions, achieved by using different random number seeds, are involved in the evaluation. Final results after averaging the 1000 realizations show that the error of the reconstructed pressure normalized by the DNS pressure variation range is 0.15 ± 0.07 for the Poisson equation approach, 0.028 ± 0.003 for the Circular Virtual Boundary method and 0.027 ± 0.003 for the Rotating Parallel Ray method, indicating the robustness of the Rotating Parallel Ray method in pressure reconstruction. Sponsor: The San Diego State University UGP program.
NASA Astrophysics Data System (ADS)
Yoon, Jun-Sik; Rim, Taiuk; Kim, Jungsik; Kim, Kihyun; Baek, Chang-Ki; Jeong, Yoon-Ha
2015-03-01
Random dopant fluctuation effects in gate-all-around inversion-mode silicon nanowire field-effect transistors (FETs) with different diameters and extension lengths are investigated. The nanowire FETs with smaller diameter and longer extension length reduce average values and variations of subthreshold swing and drain-induced barrier lowering, thus improving short-channel immunity. Relative variations of the drain currents increase as the diameter decreases because of decreased current drivability from narrower channel cross-sections. Absolute variations of the drain currents decrease markedly as the extension length increases, due to the decreasing number of arsenic dopants penetrating into the channel region. To understand the origins of drain-current variability, variations of source/drain series resistance and low-field mobility are investigated. These two parameters affect the variations of the drain currents concurrently. Nanowire FETs having extension lengths sufficient to prevent dopant penetration into the channel regions while maintaining relatively large cross-sections are suggested to achieve suitable short-channel immunity and small variations of the drain currents.
Hjelm, Barbara; Forsström, Björn; Löfblom, John; Rockberg, Johan; Uhlén, Mathias
2012-01-01
A problem for the generation of polyclonal antibodies is the potential difficulties for obtaining a renewable resource due to batch-to-batch variations when the same antigen is immunized into several separate animals. Here, we have investigated this issue by determining the epitopes of antibodies generated from parallel immunizations of rabbits with recombinant antigens corresponding to ten human protein targets. The epitopes were mapped by both a suspension bead array approach using overlapping synthetic 15-mer peptides and a bacterial display approach using expression of random fragments of the antigen on the surface of bacteria. Both methods determined antibody binding with the aid of fluorescent-based analysis. In addition, one polyclonal antibody was fractionated by peptide-specific affinity capture for in-depth comparison of epitopes. The results show that the same antigen immunized in several rabbits yields polyclonal antibodies with similar epitopes, but with larger differences in the relative amounts of antibodies to the different epitopes. In some cases, unique epitopes were observed for one of the immunizations. The results suggest that polyclonal antibodies generated by repeated immunizations do not display an identical epitope pattern, although many of the epitopes are similar. PMID:23284606
Why the leopard got its spots: relating pattern development to ecology in felids
Allen, William L.; Cuthill, Innes C.; Scott-Samuel, Nicholas E.; Baddeley, Roland
2011-01-01
A complete explanation of the diversity of animal colour patterns requires an understanding of both the developmental mechanisms generating them and their adaptive value. However, only two previous studies, which involved computer-generated evolving prey, have attempted to make this link. This study examines variation in the camouflage patterns displayed on the flanks of many felids. After controlling for the effects of shared ancestry using a fully resolved molecular phylogeny, this study shows how phenotypes from plausible felid coat pattern generation mechanisms relate to ecology. We found that likelihood of patterning and pattern attributes, such as complexity and irregularity, were related to felids' habitats, arboreality and nocturnality. Our analysis also indicates that disruptive selection is a likely explanation for the prevalence of melanistic forms in Felidae. Furthermore, we show that there is little phylogenetic signal in the visual appearance of felid patterning, indicating that camouflage adapts to ecology over relatively short time scales. Our method could be applied to any taxon with colour patterns that can reasonably be matched to reaction–diffusion and similar models, where the kinetics of the reaction between two or more initially randomly dispersed morphogens determines the outcome of pattern development. PMID:20961899
A revision of the subtract-with-borrow random number generators
NASA Astrophysics Data System (ADS)
Sibidanov, Alexei
2017-12-01
The most popular and widely used subtract-with-borrow generator, also known as RANLUX, is reimplemented as a linear congruential generator using large-integer arithmetic with a modulus size of 576 bits. Modern computers, as well as the specific structure of the modulus inferred from RANLUX, allow for the development of a fast modular multiplication - the core of the procedure - which was previously believed to be slow and too costly in terms of computing resources. Our tests show a significant gain in generation speed, comparable with other fast, high-quality random number generators. An additional feature is the fast skipping of generator states, leading to a seeding scheme which guarantees the uniqueness of random number sequences. Licensing provisions: GPLv3. Programming language: C++, C, Assembler.
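For orientation, the underlying subtract-with-borrow recurrence is tiny; the Python sketch below uses the standard RANLUX parameters (base 2^24, lags 10 and 24), for which the equivalent linear congruential modulus is m = 2^576 - 2^240 + 1, the 576-bit modulus mentioned above. The seeding shown is illustrative only.

```python
# Subtract-with-borrow recurrence underlying RANLUX:
#   x_n = x_{n-10} - x_{n-24} - c  (mod 2**24), with carry bit c.
# This is mathematically equivalent to one giant linear congruential
# generator modulo m = 2**(24*24) - 2**(24*10) + 1 = 2**576 - 2**240 + 1.
B, R, S = 1 << 24, 24, 10

def swb(seed_words, n):
    state = list(seed_words)            # 24 words of 24 bits each
    carry, out = 0, []
    for _ in range(n):
        t = state[-S] - state[-R] - carry
        carry = 1 if t < 0 else 0       # borrow bit
        state.append(t % B)
        state.pop(0)                    # slide the 24-word window
        out.append(state[-1])
    return out

print(swb(range(1, 25), 5))             # toy seed, first five outputs
```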
Graphene based widely-tunable and singly-polarized pulse generation with random fiber lasers
Yao, B. C.; Rao, Y. J.; Wang, Z. N.; Wu, Y.; Zhou, J. H.; Wu, H.; Fan, M. Q.; Cao, X. L.; Zhang, W. L.; Chen, Y. F.; Li, Y. R.; Churkin, D.; Turitsyn, S.; Wong, C. W.
2015-01-01
Pulse generation often requires a stabilized cavity and its corresponding mode structure for initial phase-locking. Contrastingly, modeless cavity-free random lasers provide new possibilities for high quantum efficiency lasing that could potentially be widely tunable spectrally and temporally. Pulse generation in random lasers, however, has remained elusive since the discovery of modeless gain lasing. Here we report coherent pulse generation with modeless random lasers based on the unique polarization selectivity and broadband saturable absorption of monolayer graphene. Simultaneous temporal compression of cavity-free pulses are observed with such a polarization modulation, along with a broadly-tunable pulsewidth across two orders of magnitude down to 900 ps, a broadly-tunable repetition rate across three orders of magnitude up to 3 MHz, and a singly-polarized pulse train at 41 dB extinction ratio, about an order of magnitude larger than conventional pulsed fiber lasers. Moreover, our graphene-based pulse formation also demonstrates robust pulse-to-pulse stability and wide-wavelength operation due to the cavity-less feature. Such a graphene-based architecture not only provides a tunable pulsed random laser for fiber-optic sensing, speckle-free imaging, and laser-material processing, but also a new way for the non-random CW fiber lasers to generate widely tunable and singly-polarized pulses. PMID:26687730
Graphene based widely-tunable and singly-polarized pulse generation with random fiber lasers.
Yao, B C; Rao, Y J; Wang, Z N; Wu, Y; Zhou, J H; Wu, H; Fan, M Q; Cao, X L; Zhang, W L; Chen, Y F; Li, Y R; Churkin, D; Turitsyn, S; Wong, C W
2015-12-21
Pulse generation often requires a stabilized cavity and its corresponding mode structure for initial phase-locking. Contrastingly, modeless cavity-free random lasers provide new possibilities for high quantum efficiency lasing that could potentially be widely tunable spectrally and temporally. Pulse generation in random lasers, however, has remained elusive since the discovery of modeless gain lasing. Here we report coherent pulse generation with modeless random lasers based on the unique polarization selectivity and broadband saturable absorption of monolayer graphene. Simultaneous temporal compression of cavity-free pulses are observed with such a polarization modulation, along with a broadly-tunable pulsewidth across two orders of magnitude down to 900 ps, a broadly-tunable repetition rate across three orders of magnitude up to 3 MHz, and a singly-polarized pulse train at 41 dB extinction ratio, about an order of magnitude larger than conventional pulsed fiber lasers. Moreover, our graphene-based pulse formation also demonstrates robust pulse-to-pulse stability and wide-wavelength operation due to the cavity-less feature. Such a graphene-based architecture not only provides a tunable pulsed random laser for fiber-optic sensing, speckle-free imaging, and laser-material processing, but also a new way for the non-random CW fiber lasers to generate widely tunable and singly-polarized pulses.
High-Speed Device-Independent Quantum Random Number Generation without a Detection Loophole.
Liu, Yang; Yuan, Xiao; Li, Ming-Han; Zhang, Weijun; Zhao, Qi; Zhong, Jiaqiang; Cao, Yuan; Li, Yu-Huai; Chen, Luo-Kan; Li, Hao; Peng, Tianyi; Chen, Yu-Ao; Peng, Cheng-Zhi; Shi, Sheng-Cai; Wang, Zhen; You, Lixing; Ma, Xiongfeng; Fan, Jingyun; Zhang, Qiang; Pan, Jian-Wei
2018-01-05
Quantum mechanics provides the means of generating genuine randomness that is impossible with deterministic classical processes. Remarkably, the unpredictability of randomness can be certified in a manner that is independent of implementation devices. Here, we present an experimental study of device-independent quantum random number generation based on a detection-loophole-free Bell test with entangled photons. In the randomness analysis, without the independent identical distribution assumption, we consider the worst case scenario that the adversary launches the most powerful attacks against the quantum adversary. After considering statistical fluctuations and applying an 80 Gb × 45.6 Mb Toeplitz matrix hashing, we achieve a final random bit rate of 114 bits/s, with a failure probability less than 10^{-5}. This marks a critical step towards realistic applications in cryptography and fundamental physics tests.
NASA Astrophysics Data System (ADS)
Shemer, L.; Sergeeva, A.
2009-12-01
The statistics of a random water wave field determine the probability of appearance of extremely high (freak) waves. This probability is strongly related to the spectral wave field characteristics. Laboratory investigation of the spatial variation of the random wave-field statistics for various initial conditions is thus of substantial practical importance. Unidirectional nonlinear random wave groups are investigated experimentally in the 300 m long Large Wave Channel (GWK) in Hannover, Germany, which is the biggest facility of its kind in Europe. Numerous realizations of a wave field with the prescribed frequency power spectrum, yet randomly-distributed initial phases of each harmonic, were generated by a computer-controlled piston-type wavemaker. Several initial spectral shapes with identical dominant wave length but different width were considered. For each spectral shape, the total duration of sampling in all realizations was long enough to yield a sufficient sample size for reliable statistics. Throughout all experiments, an effort was made to retain the characteristic wave height value and thus the degree of nonlinearity of the wave field. Spatial evolution of numerous statistical wave field parameters (skewness, kurtosis and probability distributions) is studied using about 25 wave gauges distributed along the tank. It is found that, depending on the initial spectral shape, the frequency spectrum of the wave field may undergo significant modification in the course of its evolution along the tank; the values of all statistical wave parameters are strongly related to the local spectral width. A sample of the measured wave height probability functions (scaled by the variance of surface elevation) is plotted in Fig. 1 for the initially narrow rectangular spectrum. The results in Fig. 1 resemble findings obtained in [1] for the initial Gaussian spectral shape. The probability of large waves notably surpasses that predicted by the Rayleigh distribution and is highest at a distance of about 100 m. Acknowledgement: This study is carried out in the framework of the EC-supported project "Transnational access to large-scale tests in the Large Wave Channel (GWK) of Forschungszentrum Küste" (Contract HYDRALAB III - No. 022441). [1] L. Shemer and A. Sergeeva, J. Geophys. Res. Oceans 114, C01015 (2009). Figure 1. Variation along the tank of the measured wave height distribution for rectangular initial spectral shape; the carrier wave period T0 = 1.5 s.
Bill, Anke; Rosethorne, Elizabeth M; Kent, Toby C; Fawcett, Lindsay; Burchell, Lynn; van Diepen, Michiel T; Marelli, Anthony; Batalov, Sergey; Miraglia, Loren; Orth, Anthony P; Renaud, Nicole A; Charlton, Steven J; Gosling, Martin; Gaither, L Alex; Groot-Kormelink, Paul J
2014-01-01
The human prostacyclin receptor (hIP receptor) is a seven-transmembrane G protein-coupled receptor (GPCR) that plays a critical role in vascular smooth muscle relaxation and platelet aggregation. hIP receptor dysfunction has been implicated in numerous cardiovascular abnormalities, including myocardial infarction, hypertension, thrombosis and atherosclerosis. Genomic sequencing has discovered several genetic variations in the PTGIR gene coding for the hIP receptor; however, its structure-function relationship has not been sufficiently explored. Here we set out to investigate the applicability of high-throughput random mutagenesis to study the structure-function relationship of the hIP receptor. While chemical mutagenesis was not suitable to generate a mutagenesis library with sufficient coverage, our data demonstrate error-prone PCR (epPCR) mediated mutagenesis as a valuable method for the unbiased screening of residues regulating hIP receptor function and expression. Here we describe the generation and functional characterization of an epPCR-derived mutagenesis library comprising >4000 mutants of the hIP receptor. We introduce next-generation sequencing as a useful tool to validate the quality of mutagenesis libraries by providing information about the coverage, mutation rate and mutational bias. We identified 18 mutants of the hIP receptor that were expressed at the cell surface but demonstrated impaired receptor function. A total of 38 non-synonymous mutations were identified within the coding region of the hIP receptor, mapping to 36 distinct residues, including several mutations previously reported to affect the signaling of the hIP receptor. Thus, our data demonstrate epPCR-mediated random mutagenesis to be a valuable and practical method to study the structure-function relationship of GPCRs.
Kent, Toby C.; Fawcett, Lindsay; Burchell, Lynn; van Diepen, Michiel T.; Marelli, Anthony; Batalov, Sergey; Miraglia, Loren; Orth, Anthony P.; Renaud, Nicole A.; Charlton, Steven J.; Gosling, Martin; Gaither, L. Alex; Groot-Kormelink, Paul J.
2014-01-01
The human prostacyclin receptor (hIP receptor) is a seven-transmembrane G protein-coupled receptor (GPCR) that plays a critical role in vascular smooth muscle relaxation and platelet aggregation. hIP receptor dysfunction has been implicated in numerous cardiovascular abnormalities, including myocardial infarction, hypertension, thrombosis and atherosclerosis. Genomic sequencing has discovered several genetic variations in the PTGIR gene coding for the hIP receptor; however, its structure-function relationship has not been sufficiently explored. Here we set out to investigate the applicability of high-throughput random mutagenesis to study the structure-function relationship of the hIP receptor. While chemical mutagenesis was not suitable to generate a mutagenesis library with sufficient coverage, our data demonstrate error-prone PCR (epPCR) mediated mutagenesis as a valuable method for the unbiased screening of residues regulating hIP receptor function and expression. Here we describe the generation and functional characterization of an epPCR-derived mutagenesis library comprising >4000 mutants of the hIP receptor. We introduce next-generation sequencing as a useful tool to validate the quality of mutagenesis libraries by providing information about the coverage, mutation rate and mutational bias. We identified 18 mutants of the hIP receptor that were expressed at the cell surface but demonstrated impaired receptor function. A total of 38 non-synonymous mutations were identified within the coding region of the hIP receptor, mapping to 36 distinct residues, including several mutations previously reported to affect the signaling of the hIP receptor. Thus, our data demonstrate epPCR-mediated random mutagenesis to be a valuable and practical method to study the structure-function relationship of GPCRs. PMID:24886841
Huynh, Que-Lam; Benet-Martínez, Verònica; Nguyen, Angela-MinhTu D
2018-06-14
Bicultural Identity Integration (BII) is an individual difference construct that captures variations in the experience of biculturalism. Using multiple samples in a series of steps, we refined BII measurement and then tested the construct in a diverse sample of bicultural individuals. Specifically, we wrote new BII items based on qualitative data (n = 108), examined the quality of the new measure using subject-matter experts (n = 23) and bicultural individuals (n = 5), and then collected validation data from bicultural college students (n = 1049). We used exploratory factor analyses to select items and explore BIIS-2 structure with a random subset of the larger sample (n = 600), confirmatory factor analyses to show that the factor structure fit the data well (n = 449), and multigroup confirmatory factor analyses to demonstrate measurement invariance in two ethnic and two generational groups. Results showed that the Bicultural Identity Integration Scale-Version 2 (BIIS-2) yielded reliable and stable scores. The data also revealed interesting and important patterns of associations with theoretically relevant constructs: personality, acculturation, and psychological well-being. Additionally, structural equation models confirmed that in general, personality and acculturation variables influence individuals' experiences with their dual cultural identities, which in turn influence adjustment, but there were interesting and important generational differences in how these variables were related. These findings lend support for the validity of BIIS-2 score interpretations; add to our understanding of the sociocultural, personality, and adjustment correlates of the bicultural experience; and have important implications for understanding the well-being of bicultural individuals. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
A qualitative assessment of a random process proposed as an atmospheric turbulence model
NASA Technical Reports Server (NTRS)
Sidwell, K.
1977-01-01
A random process is formed by the product of two Gaussian processes and the sum of that product with a third Gaussian process. The resulting total random process is interpreted as the sum of an amplitude modulated process and a slowly varying, random mean value. The properties of the process are examined, including an interpretation of the process in terms of the physical structure of atmospheric motions. The inclusion of the mean value variation gives an improved representation of the properties of atmospheric motions, since the resulting process can account for the differences in the statistical properties of atmospheric velocity components and their gradients. The application of the process to atmospheric turbulence problems, including the response of aircraft dynamic systems, is examined. The effects of the mean value variation upon aircraft loads are small in most cases, but can be important in the measurement and interpretation of atmospheric turbulence data.
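The structure of the proposed process is easy to reproduce numerically. The Python sketch below uses white Gaussian sequences for brevity (the model's component processes are correlated in time, so low-pass filtering them would be more faithful) and confirms the heavy-tailed, non-Gaussian character introduced by the product term.

```python
import numpy as np

rng = np.random.default_rng(5)

# The proposed process: product of two Gaussian processes plus a third,
# slowly varying Gaussian mean value.
n = 100_000
g1, g2 = rng.normal(size=n), rng.normal(size=n)
slow_mean = np.cumsum(rng.normal(scale=0.002, size=n))  # slow random mean
v = g1 * g2 + slow_mean

# The product term is strongly non-Gaussian: its excess kurtosis is ~6
# (vs. 0 for a Gaussian), one reason such a model better matches the
# statistics of measured atmospheric velocity components and gradients.
z = g1 * g2
print(((z - z.mean()) ** 4).mean() / z.var() ** 2 - 3)
```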
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Zhijie; Tartakovsky, Alexandre M.
This work presents a hierarchical model for solute transport in bounded layered porous media with random permeability. The model generalizes the Taylor-Aris dispersion theory to stochastic transport in random layered porous media with a known velocity covariance function. In the hierarchical model, we represent (random) concentration in terms of its cross-sectional average and a variation function. We derive a one-dimensional stochastic advection-dispersion-type equation for the average concentration and a stochastic Poisson equation for the variation function, as well as expressions for the effective velocity and dispersion coefficient. We observe that velocity fluctuations enhance dispersion in a non-monotonic fashion: the dispersion initially increases with correlation length λ, reaches a maximum, and decreases to zero at infinity. Maximum enhancement can be obtained at a correlation length of about 0.25 the size of the porous media perpendicular to flow.
Random bits, true and unbiased, from atmospheric turbulence
Marangon, Davide G.; Vallone, Giuseppe; Villoresi, Paolo
2014-01-01
Random numbers are a fundamental ingredient for secure communications and numerical simulation, as well as for games and, in general, for information science. Physical processes with intrinsic unpredictability may be exploited to generate genuine random numbers. Optical propagation through strong atmospheric turbulence is exploited here for this purpose, by observing a laser beam after a 143 km free-space path. In addition, we developed an algorithm to extract the randomness of the beam images at the receiver without post-processing. The numbers passed very selective randomness tests for qualification as genuine random numbers. The extracting algorithm can be easily generalized to random images generated by different physical processes. PMID:24976499
Scope of Various Random Number Generators in Ant System Approach for TSP
NASA Technical Reports Server (NTRS)
Sen, S. K.; Shaykhian, Gholam Ali
2007-01-01
Several quasi- and pseudo-random number generators are tested experimentally on a heuristic based on an ant system approach for the Traveling Salesman Problem. The experiment explores whether any particular generator is most desirable. Such an experiment on large samples has the potential to rank the performance of the generators for the foregoing heuristic, and thereby to address the controversial question of ranking the generators' performance in a probabilistic/statistical sense.
True randomness from an incoherent source
NASA Astrophysics Data System (ADS)
Qi, Bing
2017-11-01
Quantum random number generators (QRNGs) harness the intrinsic randomness in measurement processes: the measurement outputs are truly random, given the input state is a superposition of the eigenstates of the measurement operators. In the case of trusted devices, true randomness could be generated from a mixed state ρ so long as the system entangled with ρ is well protected. We propose a random number generation scheme based on measuring the quadrature fluctuations of a single mode thermal state using an optical homodyne detector. By mixing the output of a broadband amplified spontaneous emission (ASE) source with a single mode local oscillator (LO) at a beam splitter and performing differential photo-detection, we can selectively detect the quadrature fluctuation of a single mode output of the ASE source, thanks to the filtering function of the LO. Experimentally, a quadrature variance about three orders of magnitude larger than the vacuum noise has been observed, suggesting this scheme can tolerate much higher detector noise in comparison with QRNGs based on measuring the vacuum noise. The high quality of this entropy source is evidenced by the small correlation coefficients of the acquired data. A Toeplitz-hashing extractor is applied to generate unbiased random bits from the Gaussian distributed raw data, achieving an efficiency of 5.12 bits per sample. The output of the Toeplitz extractor successfully passes all the NIST statistical tests for random numbers.
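Toeplitz hashing of the kind applied here is easy to prototype. The Python sketch below is a generic Toeplitz-matrix extractor, with Gaussian noise standing in for the homodyne samples; the 8-bit digitization, block size, and the 0.64 output ratio (about 5.12 bits per 8-bit sample, echoing the figure above) are illustrative assumptions rather than the authors' implementation.

    import numpy as np

    def toeplitz_extract(raw_bits, m, seed_bits):
        # Multiply the raw bit vector (length n) by an m x n binary Toeplitz
        # matrix defined by m + n - 1 seed bits; arithmetic is mod 2.
        n = raw_bits.size
        assert seed_bits.size == m + n - 1
        idx = (np.arange(m)[:, None] - np.arange(n)[None, :]) + (n - 1)
        T = seed_bits[idx]            # constant along each diagonal i - j
        return (T @ raw_bits) % 2

    rng = np.random.default_rng(7)
    samples = rng.normal(size=128)    # stand-in for Gaussian raw data
    codes = ((samples - samples.min()) / np.ptp(samples) * 255).astype(np.uint8)
    raw = np.unpackbits(codes[:, None], axis=1).ravel()   # 8 bits per sample

    n = raw.size                      # in practice extraction runs blockwise
    m = int(0.64 * n)                 # keep ~5.12 of every 8 raw bits
    seed = rng.integers(0, 2, size=m + n - 1)
    out = toeplitz_extract(raw, m, seed)
    print(out.size, "extracted bits, ones fraction:", out.mean())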
Statistical process control in nursing research.
Polit, Denise F; Chaboyer, Wendy
2012-02-01
In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
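As a concrete illustration of the control-chart idea, here is a minimal Python sketch of an individuals (X) chart with 3-sigma limits estimated from the average moving range, a standard SPC construction; the simulated monthly indicator and the intervention shift at month 24 are invented for the example.

    import numpy as np

    def individuals_chart(x):
        # Center line and 3-sigma limits for an individuals (X) chart;
        # sigma is estimated as MR-bar / 1.128 (the d2 constant for n = 2).
        x = np.asarray(x, dtype=float)
        sigma_hat = np.abs(np.diff(x)).mean() / 1.128
        center = x.mean()
        return center, center - 3 * sigma_hat, center + 3 * sigma_hat

    rng = np.random.default_rng(0)
    # 24 baseline months of common-cause variation, then a genuine shift.
    series = np.concatenate([rng.normal(50, 2, 24), rng.normal(55, 2, 12)])
    center, lcl, ucl = individuals_chart(series[:24])  # limits from baseline only
    signals = np.where(series > ucl)[0]
    print(f"center={center:.1f}, limits=({lcl:.1f}, {ucl:.1f}), signal months: {signals}")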
Golden Ratio Versus Pi as Random Sequence Sources for Monte Carlo Integration
NASA Technical Reports Server (NTRS)
Sen, S. K.; Agarwal, Ravi P.; Shaykhian, Gholam Ali
2007-01-01
We discuss here the relative merits of these numbers as possible random sequence sources. The quality of these sequences is not judged directly based on the outcome of all known tests for the randomness of a sequence. Instead, it is determined implicitly by the accuracy of the Monte Carlo integration in a statistical sense. Since our main motive in using a random sequence is to solve real-world problems, it is more desirable to compare the quality of the sequences based on their performance on these problems in terms of the quality/accuracy of the output. We also compare these sources against sequences generated by a popular pseudo-random generator, viz., the Matlab rand function, and by the quasi-random Halton generator, both in terms of error and time complexity. Our study demonstrates that consecutive blocks of digits of each of these numbers produce a good random sequence source. It is observed that randomly chosen blocks of digits do not have any remarkable advantage over consecutive blocks for the accuracy of the Monte Carlo integration. Also, it reveals that pi is a better source of a random sequence than the golden ratio where the accuracy of the integration is concerned.
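The consecutive-blocks-of-digits construction is simple to reproduce. The sketch below is a rough Python analogue rather than the authors' code: it pulls decimal digits of pi with mpmath, maps consecutive 10-digit blocks to uniforms, and uses them in a Monte Carlo quadrature; the digit count, block size, and test integrand are our own choices.

    import numpy as np
    from mpmath import mp

    # Take 100,000 fractional decimal digits of pi and turn consecutive
    # 10-digit blocks into uniform variates on [0, 1).
    mp.dps = 100_020
    digits = str(mp.pi)[2:100_002]          # drop the leading "3."
    block = 10
    u = np.array([int(digits[i:i + block]) / 10 ** block
                  for i in range(0, len(digits), block)])

    # Monte Carlo estimate of the integral of 4/(1+x^2) over [0,1], i.e. pi.
    print("MC estimate:", np.mean(4.0 / (1.0 + u ** 2)))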
Analysis of entropy extraction efficiencies in random number generation systems
NASA Astrophysics Data System (ADS)
Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu
2016-05-01
Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.
Quantum random bit generation using energy fluctuations in stimulated Raman scattering.
Bustard, Philip J; England, Duncan G; Nunn, Josh; Moffatt, Doug; Spanner, Michael; Lausten, Rune; Sussman, Benjamin J
2013-12-02
Random number sequences are a critical resource in modern information processing systems, with applications in cryptography, numerical simulation, and data sampling. We introduce a quantum random number generator based on the measurement of pulse energy quantum fluctuations in Stokes light generated by spontaneously-initiated stimulated Raman scattering. Bright Stokes pulse energy fluctuations up to five times the mean energy are measured with fast photodiodes and converted to unbiased random binary strings. Since the pulse energy is a continuous variable, multiple bits can be extracted from a single measurement. Our approach can be generalized to a wide range of Raman active materials; here we demonstrate a prototype using the optical phonon line in bulk diamond.
Experimental study of a quantum random-number generator based on two independent lasers
NASA Astrophysics Data System (ADS)
Sun, Shi-Hai; Xu, Feihu
2017-12-01
A quantum random-number generator (QRNG) can produce true randomness by utilizing the inherent probabilistic nature of quantum mechanics. Recently, the spontaneous-emission quantum phase noise of the laser has been widely deployed for quantum random-number generation, due to its high rate, its low cost, and the feasibility of chip-scale integration. Here, we perform a comprehensive experimental study of a phase-noise-based QRNG with two independent lasers, each of which operates in either continuous-wave (CW) or pulsed mode. We implement the QRNG by operating the two lasers in three configurations, namely, CW + CW, CW + pulsed, and pulsed + pulsed, and demonstrate their trade-offs, strengths, and weaknesses.
Minimal-post-processing 320-Gbps true random bit generation using physical white chaos.
Wang, Anbang; Wang, Longsheng; Li, Pu; Wang, Yuncai
2017-02-20
Chaotic external-cavity semiconductor lasers (ECLs) are a promising entropy source for generation of high-speed physical random bits or digital keys. The rate and randomness are unfortunately limited by laser relaxation oscillation and external-cavity resonance, and are usually improved by complicated post-processing. Here, we propose using physical broadband white chaos generated by optical heterodyning of two ECLs as the entropy source to construct high-speed random bit generation (RBG) with minimal post-processing. The optical heterodyne chaos not only has a white spectrum without signatures of relaxation oscillation and external-cavity resonance but also has a symmetric amplitude distribution. Thus, after quantization with a multi-bit analog-digital converter (ADC), random bits can be obtained by extracting several least significant bits (LSBs) without any other processing. In experiments, a white chaos with a 3-dB bandwidth of 16.7 GHz is generated. Its entropy rate is estimated as 16 Gbps by single-bit quantization, which means a spectrum efficiency of 96%. With quantization using an 8-bit ADC, 320-Gbps physical RBG is achieved by directly extracting 4 LSBs at an 80-GHz sampling rate.
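The quantize-and-keep-LSBs step is the only digital processing involved, and it is easy to emulate. In the Python sketch below, Gaussian noise stands in for the sampled white chaos, and the 8-bit ADC with 4 retained LSBs mirrors the numbers quoted above; the full-scale mapping is an illustrative assumption.

    import numpy as np

    def lsb_random_bits(analog, adc_bits=8, keep=4):
        # Quantize a waveform with an adc_bits ADC and keep only the
        # 'keep' least significant bits of each sample as random output.
        a = np.asarray(analog, dtype=float)
        levels = 2 ** adc_bits - 1
        codes = ((a - a.min()) / np.ptp(a) * levels).round().astype(np.uint16)
        lsbs = codes & ((1 << keep) - 1)
        # Unpack the kept bits of every sample into a flat bitstream.
        return ((lsbs[:, None] >> np.arange(keep - 1, -1, -1)) & 1).ravel()

    rng = np.random.default_rng(3)
    waveform = rng.normal(size=100_000)     # stand-in for sampled white chaos
    bits = lsb_random_bits(waveform)
    print(bits.size, "bits, ones fraction:", bits.mean())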
NASA Astrophysics Data System (ADS)
Nan, Hanqing; Liang, Long; Chen, Guo; Liu, Liyu; Liu, Ruchuan; Jiao, Yang
2018-03-01
Three-dimensional (3D) collective cell migration in a collagen-based extracellular matrix (ECM) is among the most significant topics in developmental biology, cancer progression, tissue regeneration, and immune response. Recent studies have suggested that collagen-fiber mediated force transmission in cellularized ECM plays an important role in stress homeostasis and regulation of collective cellular behaviors. Motivated by the recent in vitro observation that oriented collagen can significantly enhance the penetration of migrating breast cancer cells into dense Matrigel which mimics the intravasation process in vivo [Han et al. Proc. Natl. Acad. Sci. USA 113, 11208 (2016), 10.1073/pnas.1610347113], we devise a procedure for generating realizations of highly heterogeneous 3D collagen networks with prescribed microstructural statistics via stochastic optimization. Specifically, a collagen network is represented via the graph (node-bond) model and the microstructural statistics considered include the cross-link (node) density, valence distribution, fiber (bond) length distribution, as well as fiber orientation distribution. An optimization problem is formulated in which the objective function is defined as the squared difference between a set of target microstructural statistics and the corresponding statistics for the simulated network. Simulated annealing is employed to solve the optimization problem by evolving an initial network via random perturbations to generate realizations of homogeneous networks with randomly oriented fibers, homogeneous networks with aligned fibers, heterogeneous networks with a continuous variation of fiber orientation along a prescribed direction, as well as a binary system containing a collagen region with aligned fibers and a dense Matrigel region with randomly oriented fibers. The generation and propagation of active forces in the simulated networks due to polarized contraction of an embedded ellipsoidal cell and a small group of cells are analyzed by considering a nonlinear fiber model incorporating strain hardening upon large stretching and buckling upon compression. Our analysis shows that oriented fibers can significantly enhance long-range force transmission in the network. Moreover, in the oriented-collagen-Matrigel system, the forces generated by a polarized cell in collagen can penetrate deeply into the Matrigel region. The stressed Matrigel fibers could provide contact guidance for the migrating cells, and thus enhance their penetration into Matrigel. This suggests a possible mechanism for the observed enhanced intravasation by oriented collagen.
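The stochastic optimization loop described here follows a familiar simulated-annealing template. The Python sketch below shows the skeleton on a toy problem, matching a target fiber-length histogram by randomly perturbing node positions; the energy definition, move set, toy ring connectivity, and cooling schedule are illustrative assumptions, not the authors' setup.

    import numpy as np

    rng = np.random.default_rng(12)
    nodes = rng.random((60, 2))                        # toy 2D network nodes
    bonds = [(i, (i + 1) % 60) for i in range(60)]     # toy ring connectivity
    target_hist, _ = np.histogram(rng.normal(0.3, 0.05, 1000),
                                  bins=10, range=(0, 1), density=True)

    def energy(pts):
        # Squared difference between target and current bond-length histograms.
        lengths = np.array([np.linalg.norm(pts[i] - pts[j]) for i, j in bonds])
        hist, _ = np.histogram(lengths, bins=10, range=(0, 1), density=True)
        return ((hist - target_hist) ** 2).sum()

    e, T = energy(nodes), 1.0
    for step in range(20_000):
        trial = nodes.copy()
        trial[rng.integers(60)] += 0.02 * rng.standard_normal(2)  # random perturbation
        e_trial = energy(trial)
        # Metropolis acceptance: always keep improvements, sometimes keep worse.
        if e_trial < e or rng.random() < np.exp((e - e_trial) / T):
            nodes, e = trial, e_trial
        T *= 0.9997                                    # geometric cooling schedule
    print("final energy:", e)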
A bioavailable strontium isoscape for Western Europe: A machine learning approach
von Holstein, Isabella C. C.; Laffoon, Jason E.; Willmes, Malte; Liu, Xiao-Ming; Davies, Gareth R.
2018-01-01
Strontium isotope ratios (87Sr/86Sr) are gaining considerable interest as a geolocation tool and are now widely applied in archaeology, ecology, and forensic research. However, their application for provenance requires the development of baseline models predicting surficial 87Sr/86Sr variations (“isoscapes”). A variety of empirically-based and process-based models have been proposed to build terrestrial 87Sr/86Sr isoscapes but, in their current forms, those models are not mature enough to be integrated with continuous-probability surface models used in geographic assignment. In this study, we aim to overcome those limitations and to predict 87Sr/86Sr variations across Western Europe by combining process-based models and a series of remote-sensing geospatial products into a regression framework. We find that random forest regression significantly outperforms other commonly used regression and interpolation methods, and efficiently predicts the multi-scale patterning of 87Sr/86Sr variations by accounting for geological, geomorphological and atmospheric controls. Random forest regression also provides an easily interpretable and flexible framework to integrate different types of environmental auxiliary variables required to model the multi-scale patterning of 87Sr/86Sr variability. The method is transferable to different scales and resolutions and can be applied to the large collection of geospatial data available at local and global levels. The isoscape generated in this study provides the most accurate 87Sr/86Sr predictions in bioavailable strontium for Western Europe (R2 = 0.58 and RMSE = 0.0023) to date, as well as a conservative estimate of spatial uncertainty by applying quantile regression forest. We anticipate that the method presented in this study combined with the growing numbers of bioavailable 87Sr/86Sr data and satellite geospatial products will extend the applicability of the 87Sr/86Sr geo-profiling tool in provenance applications. PMID:29847595
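For orientation, a random-forest isoscape pipeline of this general shape can be prototyped in a few lines with scikit-learn. Everything below is hypothetical: the covariate columns, the placeholder data, and the per-tree percentile band (plain scikit-learn has no quantile regression forest, so the spread of individual tree predictions is used here as a crude stand-in for the study's uncertainty estimate).

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    # X: environmental covariates per sampling site (hypothetical columns,
    # e.g. bedrock age class, elevation, precipitation, dust deposition);
    # y: measured bioavailable 87Sr/86Sr. Random placeholders stand in here.
    rng = np.random.default_rng(42)
    X = rng.random((500, 6))
    y = 0.705 + 0.01 * X[:, 0] + 0.003 * rng.standard_normal(500)

    rf = RandomForestRegressor(n_estimators=500, min_samples_leaf=5, random_state=0)
    print("CV R^2:", cross_val_score(rf, X, y, cv=5, scoring="r2").mean())

    rf.fit(X, y)
    grid = rng.random((1000, 6))                  # prediction-grid covariates
    pred = rf.predict(grid)                       # the "isoscape" surface
    # Crude uncertainty band from the spread of per-tree predictions.
    per_tree = np.stack([t.predict(grid) for t in rf.estimators_])
    lo, hi = np.percentile(per_tree, [5, 95], axis=0)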
De Vries, A
1984-01-01
Darwin's theory of evolution by natural selection, on the basis of inherited random individual variation and excessive offspring, remains controversial. Arguments derive from religion (creationism)--disagreement with the Darwinian reduction of teleology to physical causation; from science--nonselective mechanisms; and from logic--negation of the possibility of proof of any scientific theory, allowing only for validity thus far. Experimentation on Darwinism is beset with a practical obstacle--unattainable duration--and a logical one--nonpredictability inherent in the randomness of variation. Evolution and creationism need not be contradictory if viewed in their separate domains--rational versus miraculous.
Empirical Analysis and Refinement of Expert System Knowledge Bases
1988-08-31
refinement. Both a simulated case generation program and a random rule basher were developed to enhance rule refinement experimentation. ... the second fiscal year 88 objective was fully met. [Diagram labels: Rule Refinement System, Simulated Rule Basher, Case Generator, Stored Cases, Expert System Knowledge Base.] ... generated until the rule is satisfied. Cases may be randomly generated for a given rule or hypothesis. Rule Basher: Given that one has a correct ...
1989-08-01
Random variables for the conditional exponential distribution are generated using the inverse transform method: (1) generate U ~ U(0,1); (2) set s = -λ ln(U). ... Random variables from the conditional Weibull distribution are likewise generated using the inverse transform method, by inverting the conditional survival function S(s | x) = exp(-[((x + s - γ)/η)^β - ((x - γ)/η)^β]). ... using a standard normal transformation and the inverse transform method. (Appendix 3: Distributions supported by the model.) (1) Generate Y ~ ...
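Assuming the standard conditional (residual-life) forms reconstructed above, the inverse transform samplers look like the following Python sketch; the parameter values in the demo are arbitrary.

    import numpy as np

    def cond_exponential(x, lam, u):
        # Additional life s beyond age x for an exponential(lam) variable;
        # memorylessness makes the answer independent of x.
        return -np.log(1.0 - u) / lam

    def cond_weibull(x, beta, eta, gamma, u):
        # Invert S(s | x) = exp(-[((x+s-gamma)/eta)**beta - ((x-gamma)/eta)**beta]).
        base = ((x - gamma) / eta) ** beta
        return gamma - x + eta * (base - np.log(1.0 - u)) ** (1.0 / beta)

    rng = np.random.default_rng(5)
    u = rng.random(100_000)
    s = cond_weibull(x=3.0, beta=2.0, eta=5.0, gamma=0.0, u=u)
    print("mean additional life beyond x = 3:", s.mean())
    print("exponential check:", cond_exponential(3.0, 0.5, u).mean(), "(should be ~2)")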
Practical quantum random number generator based on measuring the shot noise of vacuum states
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen Yong; Zou Hongxin; Tian Liang
2010-06-15
The shot noise of vacuum states is a kind of quantum noise and is totally random. In this paper a nondeterministic random number generation scheme based on measuring the shot noise of vacuum states is presented and experimentally demonstrated. We use a homodyne detector to measure the shot noise of vacuum states. Considering that the frequency bandwidth of our detector is limited, we derive the optimal sampling rate so that sampling points have the least correlation with each other. We also choose a method to extract random numbers from sampling values, and prove that the influence of classical noise can be avoided with this method, so that the detector does not have to be shot-noise limited. The random numbers generated with this scheme have passed the ent and diehard tests.
Compact quantum random number generator based on superluminescent light-emitting diodes
NASA Astrophysics Data System (ADS)
Wei, Shihai; Yang, Jie; Fan, Fan; Huang, Wei; Li, Dashuang; Xu, Bingjie
2017-12-01
By measuring the amplified spontaneous emission (ASE) noise of superluminescent light-emitting diodes, we propose and realize a practical quantum random number generator (QRNG). In the QRNG, after detection and amplification of the ASE noise, the data acquisition and randomness extraction, which are integrated in a field programmable gate array (FPGA), are both implemented in real time, and the final random bit sequences are delivered to a host computer with a real-time generation rate of 1.2 Gbps. Further, to achieve compactness, all the components of the QRNG are integrated on three independent printed circuit boards with a compact design, and the QRNG is packed in a small enclosure sized 140 mm × 120 mm × 25 mm. The final random bit sequences can pass all the NIST-STS and DIEHARD tests.
Variations on a theme of Lander and Waterman
DOE Office of Scientific and Technical Information (OSTI.GOV)
Speed, T.
1997-12-01
The original Lander and Waterman mathematical analysis was for fingerprinting random clones. Since that time, a number of variants of their theory have appeared, including ones which apply to mapping by anchoring random clones, and to non-random or directed clone mapping. The same theory is now widely used to devise random sequencing strategies. In this talk I will review these developments, and go on to discuss the theory required for directed sequencing strategies.
Pseudo-random properties of a linear congruential generator investigated by b-adic diaphony
NASA Astrophysics Data System (ADS)
Stoev, Peter; Stoilova, Stanislava
2017-12-01
In this paper we continue the study of the diaphony defined in the b-adic number system, and we extend it in different directions. We investigate this diaphony as a tool for estimating the pseudorandom properties of some of the most widely used random number generators. This is done by evaluating the distribution of specially constructed two-dimensional nets built from the generated random numbers. The aim is to see how suitable the generated numbers are for calculations in numerical methods such as Monte Carlo integration.
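The b-adic diaphony itself is specialized, but the flavor of the figure of merit is easy to convey with the classical (Zinterhof) diaphony, which has a closed form in terms of the Bernoulli polynomial B2. The Python sketch below applies it to two-dimensional nets of overlapping output pairs (u_k, u_{k+1}) from a generator; the pairing construction and sample size are our assumptions.

    import numpy as np

    def classical_diaphony(points):
        # Classical (Zinterhof) diaphony of an s-dimensional point set in
        # [0,1)^s, via the closed form
        #   F_N^2 = -1 + (1/N^2) * sum_{m,n} prod_i (1 + 2*pi^2*B2({x_mi - x_ni})),
        # with B2(t) = t^2 - t + 1/6. Smaller values mean a more uniform net.
        p = np.asarray(points, dtype=float)
        N = p.shape[0]
        d = (p[:, None, :] - p[None, :, :]) % 1.0   # pairwise fractional diffs
        b2 = d * d - d + 1.0 / 6.0
        prod = np.prod(1.0 + 2.0 * np.pi ** 2 * b2, axis=-1)
        return np.sqrt(prod.sum() / N ** 2 - 1.0)

    # Two-dimensional net built from successive generator outputs: (u_k, u_{k+1}).
    rng = np.random.default_rng(11)
    u = rng.random(1001)
    net = np.column_stack([u[:-1], u[1:]])
    print("diaphony of overlapping-pairs net:", classical_diaphony(net))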
Generating constrained randomized sequences: item frequency matters.
French, Robert M; Perruchet, Pierre
2009-11-01
All experimental psychologists understand the importance of randomizing lists of items. However, randomization is generally constrained, and these constraints--in particular, not allowing immediately repeated items--are designed to eliminate particular biases, yet frequently engender others. We describe a simple Monte Carlo randomization technique that solves a number of these problems. However, in many experimental settings, we are concerned not only with the number and distribution of items but also with the number and distribution of transitions between items. The algorithm mentioned above provides no control over this. We therefore introduce a simple technique that uses transition tables for generating correctly randomized sequences. We present an analytic method of producing item-pair frequency tables and item-pair transitional probability tables when immediate repetitions are not allowed. We illustrate these difficulties, and how to overcome them, with reference to a classic article on word segmentation in infants. Finally, we provide free access to an Excel file that allows users to generate transition tables with up to 10 different item types, as well as to generate appropriately distributed randomized sequences of any length without immediately repeated elements. This file is freely available from http://leadserv.u-bourgogne.fr/IMG/xls/TransitionMatrix.xls.
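One simple way to see the frequency effects the authors describe is to generate a no-immediate-repeat sequence by renormalizing the target item distribution at every step, as in the Python sketch below. This is a simpler scheme than the article's transition-table method, and the target frequencies are invented for the demo.

    import numpy as np

    def constrained_sequence(freqs, length, rng):
        # Sample items according to target frequencies while forbidding
        # immediate repetitions, by zeroing out the previous item's
        # probability and renormalizing at each step.
        items = np.arange(len(freqs))
        p = np.asarray(freqs, dtype=float)
        p /= p.sum()
        seq = [rng.choice(items, p=p)]
        for _ in range(length - 1):
            q = p.copy()
            q[seq[-1]] = 0.0          # disallow an immediate repeat
            q /= q.sum()
            seq.append(rng.choice(items, p=q))
        return np.array(seq)

    rng = np.random.default_rng(2)
    seq = constrained_sequence([0.5, 0.25, 0.25], 10_000, rng)
    # Note the bias the constraint induces: realized frequencies drift from
    # the targets, the kind of effect the article quantifies with item-pair
    # frequency and transition tables.
    print(np.bincount(seq) / seq.size)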
A random spatial network model based on elementary postulates
Karlinger, Michael R.; Troutman, Brent M.
1989-01-01
A model for generating random spatial networks that is based on elementary postulates comparable to those of the random topology model is proposed. In contrast to the random topology model, this model ascribes a unique spatial specification to generated drainage networks, a distinguishing property of some network growth models. The simplicity of the postulates creates an opportunity for potential analytic investigations of the probabilistic structure of the drainage networks, while the spatial specification enables analyses of spatially dependent network properties. In the random topology model all drainage networks, conditioned on magnitude (number of first-order streams), are equally likely, whereas in this model all spanning trees of a grid, conditioned on area and drainage density, are equally likely. As a result, link lengths in the generated networks are not independent, as usually assumed in the random topology model. For a preliminary model evaluation, scale-dependent network characteristics, such as geometric diameter and link length properties, and topologic characteristics, such as bifurcation ratio, are computed for sets of drainage networks generated on square and rectangular grids. Statistics of the bifurcation and length ratios fall within the range of values reported for natural drainage networks, but geometric diameters tend to be relatively longer than those for natural networks.
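Since the model makes all spanning trees of a grid (conditioned on area) equally likely, one way to realize such networks is Wilson's loop-erased random walk algorithm, which samples exactly uniformly over the spanning trees of a graph. The Python sketch below is a generic implementation on a small grid, offered as an illustration rather than the authors' procedure.

    import numpy as np

    def wilson_spanning_tree(rows, cols, rng):
        # Sample a uniformly random spanning tree of a rows x cols grid via
        # Wilson's algorithm; returns a dict mapping each non-root cell to
        # its parent cell in the tree.
        cells = [(r, c) for r in range(rows) for c in range(cols)]
        in_tree = {cells[0]}               # arbitrary root
        parent = {}
        for start in cells[1:]:
            if start in in_tree:
                continue
            # Random walk until the tree is hit, recording only the latest
            # exit direction from each visited cell (implicit loop erasure).
            walk, v = {}, start
            while v not in in_tree:
                r, c = v
                nbrs = [(r + dr, c + dc)
                        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                        if 0 <= r + dr < rows and 0 <= c + dc < cols]
                nxt = nbrs[rng.integers(len(nbrs))]
                walk[v] = nxt
                v = nxt
            # Retrace the loop-erased path and graft it onto the tree.
            v = start
            while v not in in_tree:
                parent[v] = walk[v]
                in_tree.add(v)
                v = walk[v]
        return parent

    rng = np.random.default_rng(9)
    tree = wilson_spanning_tree(8, 8, rng)
    print(len(tree), "links in the spanning tree")   # rows*cols - 1 = 63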
DOE Office of Scientific and Technical Information (OSTI.GOV)
Longiaru, S.; Bhattacharyya, T.
1985-01-01
Inherent in Fry's (1979) all-object separation method of strain analysis are the subtle conditions that 1) the grains or phenocrysts being counted are of equal diameter and 2) the true centers of such grains lie within the plane of measurement. When such conditions are met, the technique yields accurate, easily interpreted voids within all-object separation (AOS) plots for both deformed and non-deformed populations. Natural grain or phenocryst populations generally do not conform to these limitations, and practical application of the technique from either a cut rock surface or thin section often yields diffuse patterns that are not easily interpreted. The authors examine the effect of grain size variation and grain/matrix ratio on AOS diagrams developed from computer-generated spherical grain populations constructed in both two and three dimensions. They employ a random number generator and a simple fitting algorithm to develop grain populations with known statistical parameters. Such control allows for the modeling of many types of natural grain size populations such as fluvial sandstones, porphyritic ash flow tuffs, augen gneisses, etc. They show that significant grain size variation in a two-dimensional population contributes substantial noise into the AOS diagram and that an additional level of noise is encountered when dealing with slices through populations modeled in three dimensions. Some of this noise can be eliminated by rigorous sampling of only subsets of the total grain population.
Method and apparatus for determining position using global positioning satellites
NASA Technical Reports Server (NTRS)
Ward, John (Inventor); Ward, William S. (Inventor)
1998-01-01
A global positioning satellite receiver having an antenna for receiving an L1 signal from a satellite. The L1 signal is processed by a preamplifier stage including a band pass filter and a low noise amplifier and output as a radio frequency (RF) signal. A mixer receives and de-spreads the RF signal in response to a pseudo-random noise code, i.e., Gold code, generated by an internal pseudo-random noise code generator. A microprocessor enters a code tracking loop, such that during the code tracking loop, it addresses the pseudo-random code generator to cause the pseudo-random code generator to sequentially output pseudo-random codes corresponding to satellite codes used to spread the L1 signal, until correlation occurs. When an output of the mixer is indicative of the occurrence of correlation between the RF signal and the generated pseudo-random codes, the microprocessor enters an operational state which slews the receiver code sequence to stay locked with the satellite code sequence. The output of the mixer is provided to a detector which, in turn, controls certain routines of the microprocessor. The microprocessor will output pseudo range information according to an interrupt routine in response to detection of correlation. The pseudo range information is to be telemetered to a ground station which determines the position of the global positioning satellite receiver.
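The de-spreading step relies on the sharp autocorrelation of Gold codes. As a rough illustration, the Python sketch below builds a GPS C/A-style code from two 10-stage LFSRs (G1 taps 3,10; G2 taps 2,3,6,8,9,10) and shows the correlation peak; the G2 shift of 100 chips is an arbitrary stand-in for a real satellite's phase selection.

    import numpy as np

    def lfsr(taps, length=1023, nstages=10):
        # Maximal-length sequence from a Fibonacci LFSR whose feedback is
        # the XOR of the listed (1-indexed) stages; output is the last stage.
        reg = [1] * nstages
        out = []
        for _ in range(length):
            out.append(reg[-1])
            fb = 0
            for t in taps:
                fb ^= reg[t - 1]
            reg = [fb] + reg[:-1]
        return np.array(out)

    g1 = lfsr([3, 10])
    g2 = lfsr([2, 3, 6, 8, 9, 10])
    prn = g1 ^ np.roll(g2, 100)          # illustrative Gold-code phase choice

    chips = 1 - 2 * prn                  # map {0,1} -> {+1,-1}
    corr = [int(np.dot(chips, np.roll(chips, k))) for k in range(4)]
    # corr[0] = 1023; misaligned lags stay small (-1, 63, or -65 for Gold
    # codes), which is why a correlation spike signals code lock in the
    # receiver's tracking loop.
    print(corr)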
Narrow-band generation in random distributed feedback fiber laser.
Sugavanam, Srikanth; Tarasov, Nikita; Shu, Xuewen; Churkin, Dmitry V
2013-07-15
Narrow-band emission with spectral width down to ~0.05 nm line-width is achieved in a random distributed feedback fiber laser employing narrow-band fiber Bragg grating or fiber Fabry-Perot interferometer filters. The observed line-width is ~10 times narrower than that of other random distributed feedback fiber lasers demonstrated to date. The random DFB laser with the Fabry-Perot interferometer filter provides simultaneously multi-wavelength and narrow-band (within each line) generation, with the possibility of further wavelength tuning.
[Research progress of molecular genetic analysis in Schistosoma variation].
Zheng, Su-Yue; Li, Fei
2014-02-01
The development of molecular biology techniques has made important contributions to research on the heritable variation of Schistosoma. In recent years, molecular genetic analysis of Schistosoma variation has mainly included restriction fragment length polymorphism (RFLP), random amplified polymorphic DNA (RAPD), microsatellite-anchored PCR (SSR-PCR), and PCR single-strand conformation polymorphism (PCR-SSCP). This article reviews recent progress in the molecular genetic analysis of Schistosoma variation.
Yang, Jinjian; Wu, Qijia; Xiao, Rong; Zhao, Jupeng; Chen, Jian; Jiao, Xiaoguo
2018-04-01
Variations in species morphology and life-history traits strongly correlate with geographic and climatic characteristics. Most studies on morphological variations in animals focus on ectotherms distributed on a large geographic scale across latitudinal and/or altitudinal gradients. However, morphological variations of spiders living in the same habitats across different seasons have not been reported. In this study, we used the wolf spider, Pardosa astrigera, as a model to determine seasonal differences in adult body size, melanism, fecundity, and egg diameter in both the overwintering and the first generations for 2010 and 2016. The results showed that in 2010, both females and males of the overwintering generation were significantly darker than those of the first generation. Moreover, the overwintering females were markedly larger and produced more and bigger eggs than first-generation females in both 2010 and 2016. Considering that overwintering P. astrigera experience low temperature and/or desiccation stress, these results suggest that the substantially darker and larger body of the overwintering generation is adaptive to adverse conditions.
Variations in the perceptions of peer and coach motivational climate.
Vazou, Spiridoula
2010-06-01
This study examined (a) variations in the perceptions of peer- and coach-generated motivational climate within and between teams and (b) individual- and group-level factors that can account for these variations. Participants were 483 athletes between 12 and 16 years old. The results showed that perceptions of both peer- and coach-generated climate varied as a function of group-level variables, namely team success, coach's gender (except for peer ego-involving climate), and team type (only for coach ego-involving climate). Perceptions of peer- and coach-generated climate also varied as a function of individual-level variables, namely athletes' task and ego orientations, gender, and age (only for coach task-involving and peer ego-involving climate). Moreover, within-team variations in perceptions of peer- and coach-generated climate as a function of task and ego orientation levels were identified. Identifying and controlling the factors that influence perceptions of peer- and coach-generated climate may be important in strengthening task-involving motivational cues.
Programmable quantum random number generator without postprocessing.
Nguyen, Lac; Rehain, Patrick; Sua, Yong Meng; Huang, Yu-Ping
2018-02-15
We demonstrate a viable source of unbiased quantum random numbers whose statistical properties can be arbitrarily programmed without the need for any postprocessing such as randomness distillation or distribution transformation. It is based on measuring the arrival time of single photons in shaped temporal modes that are tailored with an electro-optical modulator. We show that quantum random numbers can be created directly in customized probability distributions and pass all randomness tests of the NIST and Dieharder test suites without any randomness extraction. The min-entropies of such generated random numbers are measured close to the theoretical limits, indicating their near-ideal statistics and ultrahigh purity. Easy to implement and arbitrarily programmable, this technique can find versatile uses in a multitude of data analysis areas.
The role of ferroelectric domain structure in second harmonic generation in random quadratic media.
Roppo, Vito; Wang, W; Kalinowski, K; Kong, Y; Cojocaru, C; Trull, J; Vilaseca, R; Scalora, M; Krolikowski, W; Kivshar, Yu
2010-03-01
We study theoretically and numerically second harmonic generation in a nonlinear crystal with a random distribution of ferroelectric domains. We show that the specific features of the disordered domain structure greatly affect the emission pattern of the generated harmonics. This phenomenon can be used to characterize the degree of disorder in nonlinear photonic structures.
2016-01-01
Context: An increasing concern in the face of human expansion throughout natural habitats is whether animal populations can respond adaptively when confronted with challenges like environmental change and novelty. Behavioural flexibility is an important factor in estimating the adaptive potential of both individuals and populations, and predicting the degree to which they can cope with change. Study Design: This study on the three-spined stickleback (Gasterosteus aculeatus) is an empirical illustration of the degree of behavioural variation that can emerge between semi-natural systems within only a single generation. Wild-caught adult sticklebacks (P, N = 400) were randomly distributed in equal densities over 20 standardized semi-natural environments (ponds), and one year later offspring (F1, N = 652) were presented with repeated behavioural assays. Individuals were challenged to reach a food source through a novel transparent obstacle, during which exploration, activity, foraging, sociability and wall-biting behaviours were recorded through video observation. We found that coping responses of individuals from the first generation to this unfamiliar foraging challenge were related to even relatively small, naturally diversified variation in developmental environment. All measured behaviours were correlated with each other. Especially exploration, sociability and wall-biting were found to differ significantly between ponds. These differences could not be explained by stickleback density or the turbidity of the water. Findings: Our findings show that a) differences in early-life environment appear to affect stickleback feeding behaviour later in life; b) this is the case even when the environmental differences are only small, within natural parameters and diversified gradually; and c) effects are present despite semi-natural conditions that fluctuate during the year. Therefore, in behaviourally plastic animals like the stickleback, the adaptive response to human-induced habitat disturbance may occur rapidly (within one generation) and vary strongly based on the system’s (starting) conditions. This has important implications for the variability in animal behaviour, which may be much larger than expected from studying laboratory systems, as well as for the validity of predictions of population responses to change. PMID:26862908
Langenhof, M Rohaa; Apperloo, Rienk; Komdeur, Jan
2016-01-01
An increasing concern in the face of human expansion throughout natural habitats is whether animal populations can respond adaptively when confronted with challenges like environmental change and novelty. Behavioural flexibility is an important factor in estimating the adaptive potential of both individuals and populations, and predicting the degree to which they can cope with change. This study on the three-spined stickleback (Gasterosteus aculeatus) is an empirical illustration of the degree of behavioural variation that can emerge between semi-natural systems within only a single generation. Wild-caught adult sticklebacks (P, N = 400) were randomly distributed in equal densities over 20 standardized semi-natural environments (ponds), and one year later offspring (F1, N = 652) were presented with repeated behavioural assays. Individuals were challenged to reach a food source through a novel transparent obstacle, during which exploration, activity, foraging, sociability and wall-biting behaviours were recorded through video observation. We found that coping responses of individuals from the first generation to this unfamiliar foraging challenge were related to even relatively small, naturally diversified variation in developmental environment. All measured behaviours were correlated with each other. Especially exploration, sociability and wall-biting were found to differ significantly between ponds. These differences could not be explained by stickleback density or the turbidity of the water. Our findings show that a) differences in early-life environment appear to affect stickleback feeding behaviour later in life; b) this is the case even when the environmental differences are only small, within natural parameters and diversified gradually; and c) effects are present despite semi-natural conditions that fluctuate during the year. Therefore, in behaviourally plastic animals like the stickleback, the adaptive response to human-induced habitat disturbance may occur rapidly (within one generation) and vary strongly based on the system's (starting) conditions. This has important implications for the variability in animal behaviour, which may be much larger than expected from studying laboratory systems, as well as for the validity of predictions of population responses to change.
Wang, Chunhao; Yin, Fang-Fang; Kirkpatrick, John P; Chang, Zheng
2017-08-01
To investigate the feasibility of using undersampled k-space data and an iterative image reconstruction method with a total generalized variation penalty in quantitative pharmacokinetic analysis for clinical brain dynamic contrast-enhanced magnetic resonance imaging. Eight brain dynamic contrast-enhanced magnetic resonance imaging scans were retrospectively studied. Two k-space sparse sampling strategies were designed to achieve a simulated image acquisition acceleration factor of 4. They are (1) a golden-ratio-optimized 32-ray radial sampling profile and (2) a Cartesian-based random sampling profile with spatiotemporal-regularized sampling density constraints. The undersampled data were reconstructed to yield images using the investigated reconstruction technique. In quantitative pharmacokinetic analysis on a voxel-by-voxel basis, the rate constant Ktrans in the extended Tofts model and blood flow FB and blood volume VB from the 2-compartment exchange model were analyzed. Finally, the quantitative pharmacokinetic parameters calculated from the undersampled data were compared with the corresponding values calculated from the fully sampled data. To quantify the accuracy of each parameter calculated using the undersampled data, the error in volume mean, total relative error, and cross-correlation were calculated. The pharmacokinetic parameter maps generated from the undersampled data appeared comparable to the ones generated from the original fully sampled data. Within the region of interest, most derived error in volume mean values were about 5% or lower, and the average error in volume mean of all parameter maps generated through either sampling strategy was about 3.54%. The average total relative error value of all parameter maps in the region of interest was about 0.115, and the average cross-correlation of all parameter maps in the region of interest was about 0.962. All investigated pharmacokinetic parameters showed no significant differences between the results from the original data and the reduced sampling data. With sparsely sampled k-space data simulating acquisition accelerated by a factor of 4, the investigated total generalized variation-based iterative image reconstruction method can accurately estimate the studied dynamic contrast-enhanced magnetic resonance imaging pharmacokinetic parameters, supporting reliable clinical application.
Wallace, Douglas C
2013-07-19
Two major inconsistencies exist in the current neo-Darwinian evolutionary theory that random chromosomal mutations acted on by natural selection generate new species. First, natural selection does not require the evolution of ever increasing complexity, yet this is the hallmark of biology. Second, human chromosomal DNA sequence variation is predominantly either neutral or deleterious and is insufficient to provide the variation required for speciation or for predilection to common diseases. Complexity is explained by the continuous flow of energy through the biosphere that drives the accumulation of nucleic acids and information. Information then encodes complex forms. In animals, energy flow is primarily mediated by mitochondria whose maternally inherited mitochondrial DNA (mtDNA) codes for key genes for energy metabolism. In mammals, the mtDNA has a very high mutation rate, but the deleterious mutations are removed by an ovarian selection system. Hence, new mutations that subtly alter energy metabolism are continuously introduced into the species, permitting adaptation to regional differences in energy environments. Therefore, the most phenotypically significant gene variants arise in the mtDNA, are regional, and permit animals to occupy peripheral energy environments where rarer nuclear DNA (nDNA) variants can accumulate, leading to speciation. The neutralist-selectionist debate is then a consequence of mammals having two different evolutionary strategies: a fast mtDNA strategy for intra-specific radiation and a slow nDNA strategy for speciation. Furthermore, the missing genetic variation for common human diseases is primarily mtDNA variation plus regional nDNA variants, both of which have been missed by large, inter-population association studies.
A Micro-Computer Model for Army Air Defense Training.
1985-03-01
generator. The period is 32763 numbers generated before a repetitive sequence is encountered on the development system. Chi-squared tests for frequency ... positions in the test array. This was done with several different random number seeds. In each case 32763 random numbers were generated before a repetitive sequence was encountered.
Díaz-Martínez, Miriam; Nava-Cedillo, Alejandro; Guzmán-López, José Alfredo; Escobar-Guzmán, Rocío; Simpson, June
2012-04-01
Genetic variation in three forms of asexually propagated Agave tequilana Weber var. 'Azul' plants, namely offsets, bulbils, and in vitro cultured individuals, was studied by AFLP analysis. Low levels of variation were observed between mother plants and offsets, and a higher level between mother plants and bulbils. Families obtained from commercial plantations showed lower levels of variation in comparison to families grown as ornamentals. No variation was observed between the original explant and four generations of in vitro cultured plants. Epigenetic variation was also studied by analyzing changes in methylation patterns between mother plants and offspring in each form of asexual reproduction. Offsets and bulbils showed an overall decrease in methylation, whereas in vitro cultured plants showed patterns specific to each generation: Generations 1 and 4 showed overall demethylation, whereas Generations 2 and 3 showed increased methylation. Analysis of ESTs associated with transposable elements revealed higher proportions of ESTs from Ty1-copia-like, Gypsy and CACTA transposable elements in cDNA libraries obtained from pluripotent tissue, suggesting a possible correlation between methylation patterns, expression of transposable element associated genes, and somaclonal variation. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Experimentally Generated Random Numbers Certified by the Impossibility of Superluminal Signaling
NASA Astrophysics Data System (ADS)
Bierhorst, Peter; Shalm, Lynden K.; Mink, Alan; Jordan, Stephen; Liu, Yi-Kai; Rommal, Andrea; Glancy, Scott; Christensen, Bradley; Nam, Sae Woo; Knill, Emanuel
Random numbers are an important resource for applications such as numerical simulation and secure communication. However, it is difficult to certify whether a physical random number generator is truly unpredictable. Here, we exploit the phenomenon of quantum nonlocality in a loophole-free photonic Bell test experiment to obtain data containing randomness that cannot be predicted by any theory that does not also allow the sending of signals faster than the speed of light. To certify and quantify the randomness, we develop a new protocol that performs well in an experimental regime characterized by low violation of Bell inequalities. Applying an extractor function to our data, we obtain 256 new random bits, uniform to within 10^-3.
NASA Technical Reports Server (NTRS)
Lindsey, R. S., Jr. (Inventor)
1975-01-01
An exemplary embodiment of the present invention provides a source of random width and random spaced rectangular voltage pulses whose mean or average frequency of operation is controllable within prescribed limits of about 10 hertz to 1 megahertz. A pair of thin-film metal resistors are used to provide a differential white noise voltage pulse source. Pulse shaping and amplification circuitry provide relatively short duration pulses of constant amplitude which are applied to anti-bounce logic circuitry to prevent ringing effects. The pulse outputs from the anti-bounce circuits are then used to control two one-shot multivibrators whose output comprises the random length and random spaced rectangular pulses. Means are provided for monitoring, calibrating and evaluating the relative randomness of the generator.
Variation: Use It or Misuse It--Replication and Its Variants
ERIC Educational Resources Information Center
Drummond, Gordon B.; Vowler, Sarah L.
2012-01-01
In this article, the authors talk about variation and how variation between measurements may be reduced if sampling is not random. They also talk about replication and its variants. A replicate is a repeated measurement from the same experimental unit. An experimental unit is the smallest part of an experiment or a study that can be subject to a…
ERIC Educational Resources Information Center
Hemayattalab, Rasool; Movahedi, Ahmadreza
2010-01-01
The purpose of this study was to investigate the effect of five variations of imagery and physical practice on learning of Basketball free throws in adolescents with mental retardation (AWMR). Forty AWMR were randomly assigned to five groups and performed a variation of practice: physical practice, mental practice, physical practice followed by…
A Finite Mixture Method for Outlier Detection and Robustness in Meta-Analysis
ERIC Educational Resources Information Center
Beath, Ken J.
2014-01-01
When performing a meta-analysis unexplained variation above that predicted by within study variation is usually modeled by a random effect. However, in some cases, this is not sufficient to explain all the variation because of outlier or unusual studies. A previously described method is to define an outlier as a study requiring a higher random…
NASA Astrophysics Data System (ADS)
Jia, Chun-Xiao; Liu, Run-Ran; Rong, Zhihai
2017-03-01
Whether in societies or in economic cycles, the benefits of a group can be affected by various unpredictable factors. We study the effects of additive spatiotemporal random variations on the evolution of cooperation by introducing them into the enhancement level of the spatial public goods game. Players are located on the sites of a two-dimensional lattice and gain their payoffs from games with their neighbors by choosing cooperation or defection. We observe that a moderate intensity of variations can best favor cooperation at low enhancement levels, which resembles classical coherence resonance. For high enhancement levels, by contrast, we find that the random variations cannot increase the cooperation level, but hamper cooperation instead. This discrepancy is attributed to the different roles the additive variations play in the early and late stages of evolution. In the early stage of evolution, the additive variations increase the survival probability of the players with lower average payoffs. However, in the late stage of evolution, the additive variations can promote defectors to destroy the cooperative clusters that have been formed. Our results indicate that additive spatiotemporal noise may not be as universally beneficial for cooperation as it is in the spatial prisoner's dilemma game.
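A bare-bones version of such a simulation fits in a short script. The Python sketch below runs a spatial public goods game on a periodic lattice with an additive Gaussian perturbation on the enhancement level and Fermi imitation; the synchronous update against one random neighbor direction per sweep, and all parameter values, are simplifying assumptions rather than the paper's protocol.

    import numpy as np

    rng = np.random.default_rng(4)
    L, r0, sigma, K, steps = 50, 3.5, 1.0, 0.5, 200
    strat = rng.integers(0, 2, size=(L, L))        # 1 = cooperator, 0 = defector
    shifts = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]

    def payoff(s, noise):
        # Payoff of every site from the five overlapping public goods games
        # (groups centered on itself and its four neighbors), with an
        # additive spatiotemporal perturbation on the enhancement level.
        total = np.zeros_like(s, dtype=float)
        for dr, dc in shifts:                      # group centered at a neighbor
            center_noise = np.roll(noise, (dr, dc), axis=(0, 1))
            ncoop = sum(np.roll(s, (dr + er, dc + ec), axis=(0, 1))
                        for er, ec in shifts)
            pool = (r0 + center_noise) * ncoop / 5.0
            total += pool - s                      # cooperators pay 1 per pool
        return total

    for _ in range(steps):
        noise = sigma * rng.standard_normal((L, L))
        pay = payoff(strat, noise)
        # Fermi imitation of one random neighbor direction per sweep.
        dr, dc = shifts[1:][rng.integers(4)]
        neigh_pay = np.roll(pay, (dr, dc), axis=(0, 1))
        neigh_strat = np.roll(strat, (dr, dc), axis=(0, 1))
        adopt = rng.random((L, L)) < 1.0 / (1.0 + np.exp((pay - neigh_pay) / K))
        strat = np.where(adopt, neigh_strat, strat)
    print("cooperator fraction:", strat.mean())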
True Randomness from Big Data.
Papakonstantinou, Periklis A; Woodruff, David P; Yang, Guang
2016-09-26
Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
NASA Astrophysics Data System (ADS)
Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang
2016-09-01
Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang
2016-01-01
Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514
Source-Device-Independent Ultrafast Quantum Random Number Generation.
Marangon, Davide G; Vallone, Giuseppe; Villoresi, Paolo
2017-02-10
Secure random numbers are a fundamental element of many applications in science, statistics, cryptography and, more generally, in security protocols. We present a method that enables the generation of high-speed unpredictable random numbers from the quadratures of an electromagnetic field without any assumption on the input state. The method allows us to eliminate the numbers that can be predicted due to the presence of classical and quantum side information. In particular, we introduce a procedure to estimate a bound on the conditional min-entropy based on the entropic uncertainty principle for position and momentum observables of infinite dimensional quantum systems. By the above method, we experimentally demonstrate the generation of secure true random bits at a rate greater than 1.7 Gbit/s.
Annoni, J.; Pegna, A.
1997-01-01
OBJECTIVE—To test the hypothesis that, during random motor generation, the spatial contingencies inherent to the task would induce additional preferences in normal subjects, shifting their performances farther from randomness. By contrast, perceptual or executive dysfunction could alter these task related biases in patients with brain damage. METHODS—Two groups of patients, with right and left focal brain lesions, as well as 25 right handed subjects matched for age and handedness were asked to execute a random choice motor task—namely, to generate a random series of 180 button presses from a set of 10 keys placed vertically in front of them. RESULTS—In the control group, as in the left brain lesion group, motor generation was subject to deviations from theoretical expected randomness, similar to those when numbers are generated mentally, as immediate repetitions (successive presses on the same key) are avoided. However, the distribution of button presses was also contingent on the topographic disposition of the keys: the central keys were chosen more often than those placed at extreme positions. Small distances were favoured, particularly with the left hand. These patterns were influenced by implicit strategies and task related contingencies. By contrast, right brain lesion patients with frontal involvement tended to show a more square distribution of key presses—that is, the number of key presses tended to be more equally distributed. The strategies were also altered by brain lesions: the number of immediate repetitions was more frequent when the lesion involved the right frontal areas yielding a random generation nearer to expected theoretical randomness. The frequency of adjacent key presses was increased by right anterior and left posterior cortical as well as by right subcortical lesions, but decreased by left subcortical lesions. CONCLUSIONS—Depending on the side of the lesion and the degree of cortical-subcortical involvement, the deficits take on a different aspect, and direct repetitions and adjacent key presses have different patterns of alterations. Motor random generation is therefore a complex task which seems to necessitate the participation of numerous cerebral structures, among which those situated in the right frontal, left posterior, and subcortical regions have a predominant role. PMID:9408109
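The two key indices in this analysis, immediate repetitions and adjacent key presses, are easy to compute against their chance baselines. Below is a minimal Python sketch, with uniform random presses standing in for subject data; the baselines assume independent uniform choices over the keys.

    import numpy as np

    def generation_metrics(seq, n_keys=10):
        # Immediate-repetition and adjacent-press rates of a key-press
        # series, with their expected values under independent uniform
        # choice: P(repeat) = 1/n and P(adjacent) = 2(n-1)/n^2.
        seq = np.asarray(seq)
        diffs = np.abs(np.diff(seq))
        rep_rate = np.mean(diffs == 0)
        adj_rate = np.mean(diffs == 1)
        return rep_rate, 1.0 / n_keys, adj_rate, 2.0 * (n_keys - 1) / n_keys ** 2

    rng = np.random.default_rng(6)
    presses = rng.integers(0, 10, size=180)   # 180 presses on 10 keys, as in the task
    print(generation_metrics(presses))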
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.; Dlugach, Janna M.; Zakharova, Nadezhda T.
2016-01-01
The numerically exact superposition T-matrix method is used to model far-field electromagnetic scattering by two types of particulate object. Object 1 is a fixed configuration which consists of N identical spherical particles (with N = 200 or 400) quasi-randomly populating a spherical volume V having a median size parameter of 50. Object 2 is a true discrete random medium (DRM) comprising the same number N of particles randomly moving throughout V. The median particle size parameter is fixed at 4. We show that if Object 1 is illuminated by a quasi-monochromatic parallel beam then it generates a typical speckle pattern having no resemblance to the scattering pattern generated by Object 2. However, if Object 1 is illuminated by a parallel polychromatic beam with a 10% bandwidth then it generates a scattering pattern that is largely devoid of speckles and closely reproduces the quasi-monochromatic pattern generated by Object 2. This result serves to illustrate the capacity of the concept of electromagnetic scattering by a DRM to encompass fixed quasi-random particulate samples, provided that they are illuminated by polychromatic light.
Local Energies and Energy Fluctuations — Applied to the High Entropy Alloy CrFeCoNi
NASA Astrophysics Data System (ADS)
Fukushima, Tetsuya; Katayama-Yoshida, Hiroshi; Sato, Kazunori; Ogura, Masako; Zeller, Rudolf; Dederichs, Peter H.
2017-11-01
High entropy alloys show a variety of fascinating properties like high hardness, wear resistance, corrosion resistance, etc. They are random solid solutions of many components at rather high concentrations. We perform ab-initio calculations for the high entropy alloy CrFeCoNi, which has an equal concentration of 25% of each element. Using the KKRnano program package, which is based on an order-N screened Korringa-Kohn-Rostoker Green's function method, we consider a face-centered cubic (FCC) supercell with 1372 randomly distributed elements, and in addition also smaller supercells with 500 and 256 atoms. It is found from our calculations that the local moments of the Cr atoms show a large environmental variation, ranging from -1.70 μB to +1.01 μB with an average of about -0.51 μB. We present a new method to calculate "local energies" of all atoms. This is based on the partitioning of the whole space into Voronoi cells and allows us to calculate the energetic contribution of each atomic cell to the total energy of the supercell. The supercell calculations show very large variations of the local energies, analogous to the variations of the local moments. This shows that the random solid solution is not stable and has a tendency to form an L12 structure with the Cr atoms ordered at the corners of the cube and the elements Fe, Co, and Ni randomly distributed on the three other FCC sublattices. For this structure the variations of the local moments are much smaller.
N-state random switching based on quantum tunnelling
NASA Astrophysics Data System (ADS)
Bernardo Gavito, Ramón; Jiménez Urbanos, Fernando; Roberts, Jonathan; Sexton, James; Astbury, Benjamin; Shokeir, Hamzah; McGrath, Thomas; Noori, Yasir J.; Woodhead, Christopher S.; Missous, Mohamed; Roedig, Utz; Young, Robert J.
2017-08-01
In this work, we show how the hysteretic behaviour of resonant tunnelling diodes (RTDs) can be exploited for new functionalities. In particular, the RTDs exhibit a stochastic 2-state switching mechanism that could be useful for random number generation and cryptographic applications. This behaviour can be scaled to N-bit switching by connecting various RTDs in series. The InGaAs/AlAs RTDs used in our experiments display very sharp negative differential resistance (NDR) peaks at room temperature which show hysteresis cycles that, rather than having a fixed switching threshold, show a probability distribution about a central value. We propose to use this intrinsic uncertainty emerging from the quantum nature of the RTDs as a source of randomness. We show that a combination of two RTDs in series results in devices with three-state outputs and discuss the possibility of scaling to N-state devices by subsequent series connections of RTDs, which we demonstrate up to the 4-state case. We suggest that this intrinsic uncertainty in the conduction paths of resonant tunnelling diodes can serve as a source of randomness that can be integrated into current electronics to produce on-chip true random number generators. The N-shaped I-V characteristic of RTDs results in a two-level random voltage output when driven with current pulse trains. Electrical characterisation and randomness testing of the devices was conducted in order to determine the validity of the true randomness assumption. Based on the results obtained for the single RTD case, we suggest the possibility of using multi-well devices to generate N-state random switching devices for their use in random number generation or multi-valued logic devices.
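The switching behaviour described above can be caricatured in a few lines: if each hysteresis event has a threshold drawn from a distribution about a central value, comparing the drive amplitude to the sampled threshold yields one raw bit per pulse. The Gaussian threshold model and von Neumann debiasing below are our own illustrative assumptions, not the paper's device physics:

```python
import random

def rtd_raw_bits(n, centre=1.0, sigma=0.02, drive=1.0, seed=None):
    rng = random.Random(seed)
    # one bit per drive pulse: did this cycle's sampled switching
    # threshold fall below the drive amplitude?
    return [1 if drive > rng.gauss(centre, sigma) else 0 for _ in range(n)]

def von_neumann(bits):
    # pairwise debiasing removes the bias left when drive != centre
    return [a for a, b in zip(bits[::2], bits[1::2]) if a != b]

raw = rtd_raw_bits(10_000, drive=1.005, seed=1)
out = von_neumann(raw)
print(sum(raw) / len(raw), sum(out) / len(out))  # biased raw, ~0.5 after
```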
Philip, Femi; Stewart, Susan; Southard, Jeffrey A
2016-07-01
The relative safety of drug-eluting stents (DES) and bare-metal stents (BMS) in primary percutaneous coronary intervention (PPCI) in ST elevation myocardial infarction (STEMI) continues to be debated. The long-term clinical outcomes of second generation DES and BMS for primary percutaneous coronary intervention (PCI) were compared using network meta-analysis. Randomized controlled trials comparing stent types (first generation DES, second generation DES, or BMS) were considered for inclusion. A search strategy used Medline, Embase, Cochrane databases, and proceedings of international meetings. Information about study design, inclusion criteria, and sample characteristics was extracted. Network meta-analysis was used to pool direct (comparison of second generation DES to BMS) and indirect evidence (first generation DES with BMS and second generation DES) from the randomized trials. Twelve trials comparing all stent types and including 9,673 patients randomly assigned to treatment groups were analyzed. Second generation DES was associated with significantly lower incidence of definite or probable ST (OR 0.59, 95% CI 0.39-0.89), MI (OR 0.59, 95% CI 0.39-0.89), and TVR at 3 years (OR 0.50, 95% CI 0.31-0.81) compared with BMS. In addition, there was a significantly lower incidence of MACE with second generation DES versus BMS (OR 0.54, 95% CI 0.34-0.74) at 3 years. These results were driven by higher rates of TVR, MI, and stent thrombosis in the BMS group at 3 years. There was a non-significant reduction in overall and cardiac mortality [OR 0.83, 95% CI (0.60-1.14); OR 0.88, 95% CI (0.6-1.28)] with the use of second generation DES versus BMS at 3 years. Network meta-analysis of randomized trials of primary PCI demonstrated lower incidence of MACE, MI, TVR, and stent thrombosis with second generation DES compared with BMS. © 2016 Wiley Periodicals, Inc.
A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing
NASA Technical Reports Server (NTRS)
Takaki, Mitsuo; Cavalcanti, Diego; Gheyi, Rohit; Iyoda, Juliano; dAmorim, Marcelo; Prudencio, Ricardo
2009-01-01
The complexity of constraints is a major obstacle for constraint-based software verification. Automatic constraint solvers are fundamentally incomplete: input constraints often build on some undecidable theory or some theory the solver does not support. This paper proposes and evaluates several randomized solvers to address this issue. We compare the effectiveness of a symbolic solver (CVC3), a random solver, three hybrid solvers (i.e., mixes of random and symbolic), and two heuristic search solvers. We evaluate the solvers on two benchmarks: one consisting of manually generated constraints and another generated with a concolic execution of 8 subjects. In addition to fully decidable constraints, the benchmarks include constraints with non-linear integer arithmetic, integer modulo and division, bitwise arithmetic, and floating-point arithmetic. As expected, symbolic solving (in particular, CVC3) subsumes the other solvers for the concolic execution of subjects that only generate decidable constraints. For the remaining subjects the solvers are complementary.
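A pure random solver of the kind compared here is tiny: sample candidate assignments uniformly from a bounded domain and keep the first one that satisfies the constraint. The sketch below reflects our reading of that idea, with an interface of our own choosing:

```python
import random

def random_solver(constraint, n_vars, lo=-100, hi=100, budget=100_000, seed=0):
    rng = random.Random(seed)
    for _ in range(budget):
        candidate = [rng.randint(lo, hi) for _ in range(n_vars)]
        if constraint(*candidate):
            return candidate
    return None  # incomplete by design, like every randomized solver

# a non-linear constraint with integer division, outside many decidable theories
print(random_solver(lambda x, y: x * x - y // 3 == 7, n_vars=2))
```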
NASA Astrophysics Data System (ADS)
Brask, Jonatan Bohr; Martin, Anthony; Esposito, William; Houlmann, Raphael; Bowles, Joseph; Zbinden, Hugo; Brunner, Nicolas
2017-05-01
An approach to quantum random number generation based on unambiguous quantum state discrimination is developed. We consider a prepare-and-measure protocol, where two nonorthogonal quantum states can be prepared, and a measurement device aims at unambiguously discriminating between them. Because the states are nonorthogonal, this necessarily leads to a minimal rate of inconclusive events whose occurrence must be genuinely random and which provide the randomness source that we exploit. Our protocol is semi-device-independent in the sense that the output entropy can be lower bounded based on experimental data and a few general assumptions about the setup alone. It is also practically relevant, which we demonstrate by realizing a simple optical implementation achieving rates of 16.5 Mbits/s. Combining ease of implementation, a high rate, and real-time entropy estimation, our protocol represents a promising approach intermediate between fully device-independent protocols and commercial quantum random number generators.
Secure self-calibrating quantum random-bit generator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fiorentino, M.; Santori, C.; Spillane, S. M.
2007-03-15
Random-bit generators (RBGs) are key components of a variety of information processing applications ranging from simulations to cryptography. In particular, cryptographic systems require 'strong' RBGs that produce high-entropy bit sequences, but traditional software pseudo-RBGs have very low entropy content and therefore are relatively weak for cryptography. Hardware RBGs yield entropy from chaotic or quantum physical systems and therefore are expected to exhibit high entropy, but in current implementations their exact entropy content is unknown. Here we report a quantum random-bit generator (QRBG) that harvests entropy by measuring single-photon and entangled two-photon polarization states. We introduce and implement a quantum tomographic method to measure a lower bound on the 'min-entropy' of the system, and we employ this value to distill a truly random-bit sequence. This approach is secure: even if an attacker takes control of the source of optical states, a secure random sequence can be distilled.
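The min-entropy bound that drives the distillation step is a one-line formula: for an outcome distribution p it is H_min = -log2(max_i p_i), and roughly H_min truly random bits can be extracted per raw sample. A sketch of that bookkeeping (not the authors' tomographic estimator):

```python
import math

def min_entropy(probs):
    """Bits of extractable randomness per sample, H_min = -log2(max p)."""
    return -math.log2(max(probs))

print(min_entropy([0.6, 0.4]))   # a biased bit still yields ~0.74 bits
print(min_entropy([0.25] * 4))   # a fair 2-bit source yields 2.0 bits
```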
Yu, Haitao; Dhingra, Rishi R; Dick, Thomas E; Galán, Roberto F
2017-01-01
Neural activity generally displays irregular firing patterns even in circuits with apparently regular outputs, such as motor pattern generators, in which the output frequency fluctuates randomly around a mean value. This "circuit noise" is inherited from the random firing of single neurons, which emerges from stochastic ion channel gating (channel noise), spontaneous neurotransmitter release, and its diffusion and binding to synaptic receptors. Here we demonstrate how to expand conductance-based network models that are originally deterministic to include realistic, physiological noise, focusing on stochastic ion channel gating. We illustrate this procedure with a well-established conductance-based model of the respiratory pattern generator, which allows us to investigate how channel noise affects neural dynamics at the circuit level and, in particular, to understand the relationship between the respiratory pattern and its breath-to-breath variability. We show that as the channel number increases, the duration of inspiration and expiration varies, and so does the coefficient of variation of the breath-to-breath interval, which attains a minimum when the mean duration of expiration slightly exceeds that of inspiration. For small channel numbers, the variability of the expiratory phase dominates over that of the inspiratory phase, and vice versa for large channel numbers. Among the four different cell types in the respiratory pattern generator, pacemaker cells exhibit the highest sensitivity to channel noise. The model shows that suppressing input from the pons leads to longer inspiratory phases, a reduction in breathing frequency, and larger breath-to-breath variability, whereas enhanced input from the raphe nucleus increases breathing frequency without changing its pattern. A major source of noise in neuronal circuits is the "flickering" of ion currents passing through the neurons' membranes (channel noise), which cannot be suppressed experimentally. Computational simulations are therefore the best way to investigate the effects of this physiological noise by manipulating its level at will. We investigate the role of noise in the respiratory pattern generator and show that endogenous, breath-to-breath variability is tightly linked to the respiratory pattern. Copyright © 2017 the American Physiological Society.
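The core mechanism, channel-number-dependent noise from stochastic gating, can be sketched with a two-state channel whose open count is updated binomially each time step. The rates below are illustrative placeholders, not the respiratory-network model itself:

```python
import numpy as np

def open_fraction_trace(n_channels, alpha=5.0, beta=3.0, dt=1e-3,
                        steps=5000, seed=0):
    """Two-state channel: closed -> open at rate alpha, open -> closed at beta."""
    rng = np.random.default_rng(seed)
    p_open = 1.0 - np.exp(-alpha * dt)    # per-step opening probability
    p_close = 1.0 - np.exp(-beta * dt)    # per-step closing probability
    n_open = 0
    trace = np.empty(steps)
    for t in range(steps):
        opened = rng.binomial(n_channels - n_open, p_open)
        closed = rng.binomial(n_open, p_close)
        n_open += opened - closed
        trace[t] = n_open / n_channels
    return trace

# Fluctuations about the steady state alpha/(alpha+beta) shrink as the
# channel number grows, mirroring the variability trends reported above.
for n in (100, 10_000):
    tr = open_fraction_trace(n)[1000:]    # discard the initial transient
    print(n, "channels: mean", round(tr.mean(), 3),
          "CV", round(tr.std() / tr.mean(), 3))
```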
Molenaar, Heike; Boehm, Robert; Piepho, Hans-Peter
2017-01-01
Robust phenotypic data allow adequate statistical analysis and are crucial for any breeding purpose. Such data are obtained from experiments laid out to best control local variation. Additionally, experiments frequently involve two phases, each contributing environmental sources of variation. For example, in a former experiment we conducted to evaluate production related traits in Pelargonium zonale, there were two consecutive phases, each performed in a different greenhouse. Phase one involved the propagation of the breeding strains to obtain the stem cutting count, and phase two involved the assessment of root formation. The evaluation of the former study raised questions regarding options for improving the experimental layout: (i) Is there a disadvantage to using exactly the same design in both phases? (ii) Instead of generating a separate layout for each phase, can the design be optimized across both phases, such that the mean variance of a pair-wise treatment difference (MVD) can be decreased? To answer these questions, alternative approaches were explored to generate two-phase designs either in phase-wise order (Option 1) or across phases (Option 2). In Option 1 we considered the scenarios (i) using the same experimental design in both phases and (ii) randomizing each phase separately. In Option 2, we considered the scenarios (iii) generating a single design with eight replicates and splitting these among the two phases, (iv) separating the block structure across phases by dummy coding, and (v) design generation with optimal alignment of block units in the two phases. In both options, we considered the same or different block structures in each phase. The designs were evaluated by the MVD obtained by the intra-block analysis and the joint inter-block-intra-block analysis. The smallest MVD was most frequently obtained for designs generated across phases rather than for each phase separately, in particular when both phases of the design were separated with a single pseudo-level. The joint optimization ensured that treatment concurrences were equally balanced across pairs, one of the prerequisites for an efficient design. The proposed alternative approaches can be implemented with any model-based design package with facilities to formulate linear models for treatment and block structures.
40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.
Code of Federal Regulations, 2013 CFR
2013-07-01
Section 761.308, Protection of Environment, pursuant to § 761.79(b)(3): Sample selection by random number generation on any two-dimensional square grid. (a) Divide the surface area of the non-porous surface into rectangular or square areas having a...
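The sampling scheme the rule describes is easy to mechanise: tile the surface with square cells and pick sampling points by random number generation. A sketch under our own assumptions (the regulation, not this code, fixes the actual cell dimensions and sample counts):

```python
import random

def grid_sample_points(width_cm, height_cm, cell_cm, n_samples, seed=1):
    rng = random.Random(seed)
    cells = [(i, j)
             for i in range(int(width_cm // cell_cm))
             for j in range(int(height_cm // cell_cm))]
    chosen = rng.sample(cells, n_samples)          # cells picked at random
    return [((i + rng.random()) * cell_cm,         # one random point inside
             (j + rng.random()) * cell_cm)         # each chosen cell
            for i, j in chosen]

print(grid_sample_points(300, 200, 50, n_samples=3))
```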
Selection response and genetic parameters for residual feed intake in Yorkshire swine.
Cai, W; Casey, D S; Dekkers, J C M
2008-02-01
Residual feed intake (RFI) is a measure of feed efficiency defined as the difference between the observed feed intake and that predicted from the average requirements for growth and maintenance. The objective of this study was to evaluate the response in a selection experiment consisting of a line selected for low RFI and a random control line and to estimate the genetic parameters for RFI and related production and carcass traits. Beginning with random allocation of purebred Yorkshire littermates, in each generation, electronically measured ADFI, ADG, and ultrasound backfat (BF) were evaluated during an approximately 40- to 115-kg BW test period on approximately 90 boars from first parity and approximately 90 gilts from second parity sows of the low RFI line. After evaluation of first parity boars, approximately 12 boars and approximately 70 gilts from the low RFI line were selected to produce approximately 50 litters for the next generation. Approximately 30 control line litters were produced by random selection and mating. Selection was on EBV for RFI from an animal model analysis of ADFI, with on-test group and sex (fixed), pen within group and litter (random), and covariates for interactions of on- and off-test BW, on-test age, ADG, and BF with generations. The RFI explained 34% of phenotypic variation in ADFI. After 4 generations of selection, estimates of heritability for RFI, ADFI, ADG, feed efficiency (FE, which is the reciprocal of the feed conversion ratio and equals ADG/ADFI), and ultrasound-predicted BF, LM area (LMA), and intramuscular fat (IMF) were 0.29, 0.51, 0.42, 0.17, 0.68, 0.57, and 0.28, respectively; predicted responses based on average EBV in the low RFI line were -114, -202, and -39 g/d for RFI (= 0.9 phenotypic SD), ADFI (0.9 SD), and ADG (0.4 SD), respectively, and 1.56% for FE (0.5 SD), -0.37 mm for BF (0.1 SD), 0.35 cm(2) for LMA (0.1 SD), and -0.10% for IMF (0.3 SD). Direct phenotypic comparison of the low RFI and control lines based on 92 low RFI and 76 control gilts from the second parity of generation 4 showed that selection had significantly decreased RFI by 96 g/d (P = 0.002) and ADFI by 165 g/d (P < 0.0001). The low RFI line also had 33 g/d lower ADG (P = 0.022), 1.36% greater FE (P = 0.09), and 1.99 mm less BF (P = 0.013). There was not a significant difference in LMA and other carcass traits, including subjective marbling score, despite a large observed difference in ultrasound-predicted IMF (-1.05% with P < 0.0001). In conclusion, RFI is a heritable trait, and selection for low RFI has significantly decreased the feed required for a given rate of growth and backfat.
General technique for measurement of refractive index variations
NASA Technical Reports Server (NTRS)
1970-01-01
Time variation of amplitude and phase fluctuations, impressed on waves propagating through a medium which has a randomly varying refractive index, is used to examine isolated regions along the line of propagation. Technique employs only one antenna at each end of the line.
Roth, E. J.; Frazier, B. L.; Apuya, N. R.; Lark, K. G.
1989-01-01
Although soybean [Glycine max (L.) Merrill] grows as an inbreeding, generally homozygous, plant, the germplasm of the species contains large amounts of genetic variation. Analysis of soybean DNA has indicated that variation of RFLP (restriction fragment length polymorphism) markers within the species usually entails only two alleles at any one locus and that mixtures of such dimorphic loci account for virtually all of the restriction fragment variation seen in soybean (G. max), and in its ancestors, G. soja and G. gracilis. We report here that tissue cultures prepared from root tissue of individual soybean plants develop RFLP allelic differences at various loci. However, these newly generated alleles are almost always the same as ones previously found and characterized in other varieties of cultivated soybean (cultivars). This repeated generation of particular alleles suggests that much of the genetic variation seen in soybean could be the consequence of specific, relatively frequently employed, recombinational events. Such a mechanism would allow inbred cultivars to generate genetic variation (in the form of alternative alleles) in a controlled manner, perhaps in response to stress. PMID:2567263
Accelerating Pseudo-Random Number Generator for MCNP on GPU
NASA Astrophysics Data System (ADS)
Gong, Chunye; Liu, Jie; Chi, Lihua; Hu, Qingfeng; Deng, Li; Gong, Zhenghu
2010-09-01
Pseudo-random number generators (PRNG) are intensively used in many stochastic algorithms in particle simulations, artificial neural networks and other scientific computation. The PRNG in the Monte Carlo N-Particle Transport Code (MCNP) must have a long period and high quality, support flexible jumps, and be fast. In this paper, we implement such a PRNG for MCNP on NVIDIA's GTX200 Graphics Processor Units (GPU) using the CUDA programming model. Results show that speedups of 3.80 to 8.10 times are achieved compared with 4- to 6-core CPUs, and more than 679.18 million double precision random numbers can be generated per second on the GPU.
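The "flexible jump" requirement means the generator state can be advanced by k steps in O(log k) work, so each GPU thread can start at its own offset in the stream. A sketch for a generic linear congruential generator modulo 2^63; the multiplier below is illustrative, not MCNP's actual constant:

```python
M = 1 << 63                      # modulus 2**63
A = 2862933555777941757          # illustrative LCG multiplier (a % 4 == 1)
C = 1                            # odd increment -> full period mod 2**63

def lcg_next(x, a=A, c=C, m=M):
    return (a * x + c) % m

def lcg_jump(x, k, a=A, c=C, m=M):
    """Advance the state by k steps in O(log k) multiplications."""
    acc_a, acc_c = 1, 0          # identity map x -> 1*x + 0
    while k:
        if k & 1:                # compose the current power-of-two step
            acc_a, acc_c = (a * acc_a) % m, (a * acc_c + c) % m
        a, c = (a * a) % m, (a * c + c) % m   # square the step map
        k >>= 1
    return (acc_a * x + acc_c) % m

x = 1
for _ in range(1000):
    x = lcg_next(x)
print(x == lcg_jump(1, 1000))    # True: the jump matches sequential stepping
```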
Digital-Analog Hybrid Scheme and Its Application to Chaotic Random Number Generators
NASA Astrophysics Data System (ADS)
Yuan, Zeshi; Li, Hongtao; Miao, Yunchi; Hu, Wen; Zhu, Xiaohua
2017-12-01
Practical random number generation (RNG) circuits are typically achieved with analog devices or digital approaches. Digital-based techniques, which use field programmable gate arrays (FPGA), graphics processing units (GPU), etc., usually perform better than analog methods as they are programmable, efficient and robust. However, digital realizations suffer from the effect of finite precision. Accordingly, the generated random numbers (RNs) are actually periodic instead of being truly random. To tackle this limitation, in this paper we propose a novel digital-analog hybrid scheme that employs the digital unit as the main body, and minimal analog devices to generate physical RNs. Moreover, the possibility of realizing the proposed scheme with only one memory element is discussed. Without loss of generality, we use the capacitor and the memristor along with FPGA to construct the proposed hybrid system, and a chaotic true random number generator (TRNG) circuit is realized, producing physical RNs at a throughput of Gbit/s scale. These RNs successfully pass all the tests in the NIST SP800-22 package, confirming the significance of the scheme in practical applications. In addition, the use of this new scheme is not restricted to RNGs, and it also provides a strategy to solve the effect of finite precision in other digital systems.
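The finite-precision effect is easy to demonstrate with a toy model of our own (not the paper's circuit): a logistic map iterated in low-precision fixed point quickly revisits a state and locks into a short cycle, while a small "analog" perturbation, here random.random standing in for a physical noise source, keeps the orbit exploring new states:

```python
import random

def distinct_states(bits=16, x0=0.37, steps=20000, noise=False, seed=0):
    """Logistic map x -> 4x(1-x), quantized to `bits` of fixed point."""
    random.seed(seed)
    scale = 1 << bits
    x = int(x0 * scale)
    states = set()
    for _ in range(steps):
        states.add(x)
        xf = 4.0 * (x / scale) * (1.0 - x / scale)
        if noise:                        # +/- half an LSB of perturbation
            xf += (random.random() - 0.5) / scale
        x = min(max(int(xf * scale), 0), scale - 1)
    return len(states)

# The purely digital orbit collapses onto a short cycle; the perturbed one
# keeps generating fresh states, which is the role of the analog element.
print("digital only:", distinct_states())
print("with noise:  ", distinct_states(noise=True))
```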
Influence of Choice of Null Network on Small-World Parameters of Structural Correlation Networks
Hosseini, S. M. Hadi; Kesler, Shelli R.
2013-01-01
In recent years, coordinated variations in brain morphology (e.g., volume, thickness) have been employed as a measure of structural association between brain regions to infer large-scale structural correlation networks. Recent evidence suggests that brain networks constructed in this manner are inherently more clustered than random networks of the same size and degree. Thus, null networks constructed by randomizing topology are not a good choice for benchmarking small-world parameters of these networks. In the present report, we investigated the influence of choice of null networks on small-world parameters of gray matter correlation networks in healthy individuals and survivors of acute lymphoblastic leukemia. Three types of null networks were studied: 1) networks constructed by topology randomization (TOP), 2) networks matched to the distributional properties of the observed covariance matrix (HQS), and 3) networks generated from correlation of randomized input data (COR). The results revealed that the choice of null network not only influences the estimated small-world parameters, it also influences the results of between-group differences in small-world parameters. In addition, at higher network densities, the choice of null network influences the direction of group differences in network measures. Our data suggest that the choice of null network is quite crucial for interpretation of group differences in small-world parameters of structural correlation networks. We argue that none of the available null models is perfect for estimation of small-world parameters for correlation networks and the relative strengths and weaknesses of the selected model should be carefully considered with respect to obtained network measures. PMID:23840672
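Of the three null models compared above, the topology-randomized (TOP) null is the most common: it preserves every node's degree while scrambling the wiring. A minimal sketch with networkx (the HQS and COR nulls are not shown):

```python
import networkx as nx

def topology_null(G, swaps_per_edge=10, seed=42):
    """Degree-preserving rewiring of a copy of G (the TOP-style null)."""
    H = G.copy()
    nswap = swaps_per_edge * H.number_of_edges()
    nx.double_edge_swap(H, nswap=nswap, max_tries=100 * nswap, seed=seed)
    return H

G = nx.watts_strogatz_graph(100, 6, 0.1, seed=0)   # clustered "observed" net
H = topology_null(G)
# correlation-based networks are inherently clustered, so the rewired null
# typically shows much lower clustering than the observed network
print("observed clustering:", round(nx.average_clustering(G), 3))
print("null clustering:    ", round(nx.average_clustering(H), 3))
```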
Anonymous authenticated communications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beaver, Cheryl L; Schroeppel, Richard C; Snyder, Lillian A
2007-06-19
A method of performing electronic communications between members of a group wherein the communications are authenticated as being from a member of the group and have not been altered, comprising: generating a plurality of random numbers; distributing in a digital medium the plurality of random numbers to the members of the group; publishing a hash value of contents of the digital medium; distributing to the members of the group public-key-encrypted messages each containing a same token comprising a random number; and encrypting a message with a key generated from the token and the plurality of random numbers.
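A toy sketch of the ingredients the claim lists: a pool of shared random numbers whose hash is published, and a per-session token distributed under public-key encryption, combined into a symmetric authentication key. This is our simplification for illustration, not the patented protocol itself:

```python
import hashlib, hmac, secrets

# shared pool of random numbers, distributed to members on a digital medium
pool = [secrets.token_bytes(32) for _ in range(16)]
pool_digest = hashlib.sha256(b"".join(pool)).hexdigest()   # published hash

# per-session token, sent to each member inside a public-key-encrypted message
token = secrets.token_bytes(32)

# every member derives the same symmetric key from the token and the pool
key = hashlib.sha256(token + b"".join(pool)).digest()

def tag(message: bytes) -> bytes:
    # any member can compute this MAC; it proves group membership and message
    # integrity without identifying which member produced it
    return hmac.new(key, message, hashlib.sha256).digest()

print(pool_digest[:16], tag(b"meet at noon").hex()[:16])
```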
NullSeq: A Tool for Generating Random Coding Sequences with Desired Amino Acid and GC Contents.
Liu, Sophia S; Hockenberry, Adam J; Lancichinetti, Andrea; Jewett, Michael C; Amaral, Luís A N
2016-11-01
The existence of over- and under-represented sequence motifs in genomes provides evidence of selective evolutionary pressures on biological mechanisms such as transcription, translation, ligand-substrate binding, and host immunity. In order to accurately identify motifs and other genome-scale patterns of interest, it is essential to be able to generate accurate null models that are appropriate for the sequences under study. While many tools have been developed to create random nucleotide sequences, protein coding sequences are subject to a unique set of constraints that complicates the process of generating appropriate null models. There are currently no tools available that allow users to create random coding sequences with specified amino acid composition and GC content for the purpose of hypothesis testing. Using the principle of maximum entropy, we developed a method that generates unbiased random sequences with pre-specified amino acid and GC content, which we have implemented as a Python package. Our method is the simplest way to obtain maximally unbiased random sequences that are subject to GC usage and primary amino acid sequence constraints. Furthermore, this approach can easily be expanded to create unbiased random sequences that incorporate more complicated constraints such as individual nucleotide usage or even di-nucleotide frequencies. The ability to generate correctly specified null models will allow researchers to accurately identify sequence motifs which will lead to a better understanding of biological processes as well as more effective engineering of biological systems.
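The constraint the tool handles, a fixed amino acid sequence with tunable GC content, can be illustrated with exponentially weighted synonymous-codon choice. In the sketch below the codon table is truncated and the weight parameter beta is hand-tuned, rather than fitted by maximum entropy as in NullSeq:

```python
import math, random

CODONS = {          # truncated illustrative table; the real one covers 20 AAs
    "F": ["TTT", "TTC"],
    "L": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
    "K": ["AAA", "AAG"],
    "G": ["GGT", "GGC", "GGA", "GGG"],
}

def gc_count(codon):
    return sum(base in "GC" for base in codon)

def random_cds(protein, beta=0.0, seed=None):
    """Same amino acid sequence every time; GC content rises with beta."""
    rng = random.Random(seed)
    out = []
    for aa in protein:
        options = CODONS[aa]
        weights = [math.exp(beta * gc_count(c)) for c in options]
        out.append(rng.choices(options, weights=weights)[0])
    return "".join(out)

for beta in (0.0, 1.5):
    s = random_cds("FLKG" * 25, beta=beta, seed=0)
    print(beta, round(sum(b in "GC" for b in s) / len(s), 3))
```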
NASA Astrophysics Data System (ADS)
Tsao, Shih-Ming; Lai, Ji-Ching; Horng, Horng-Er; Liu, Tu-Chen; Hong, Chin-Yih
2017-04-01
Aptamers are oligonucleotides that can bind to specific target molecules. Most aptamers are generated using random libraries in the standard systematic evolution of ligands by exponential enrichment (SELEX). Each random library contains oligonucleotides with a randomized central region and two fixed primer regions at both ends. The fixed primer regions are necessary for amplifying target-bound sequences by PCR. However, these extra sequences may cause non-specific bindings, which potentially interfere with good binding for random sequences. The Magnetic-Assisted Rapid Aptamer Selection (MARAS) is a newly developed protocol for generating single-strand DNA aptamers. No repeat selection cycle is required in the protocol. This study proposes and demonstrates a method to isolate aptamers for C-reactive proteins (CRP) from a randomized ssDNA library containing no fixed sequences at the 5′ and 3′ termini using the MARAS platform. Furthermore, the isolated primer-free aptamer was sequenced and binding affinity for CRP was analyzed. The specificity of the obtained aptamer was validated using blind serum samples. The result was consistent with monoclonal antibody-based nephelometry analysis, which indicated that a primer-free aptamer has high specificity toward targets. MARAS is a feasible platform for efficiently generating primer-free aptamers for clinical diagnoses.
NASA Astrophysics Data System (ADS)
Paulsson, Johan; Ehrenberg, Måns
2000-06-01
Many intracellular components are present in low copy numbers per cell and subject to feedback control. We use chemical master equations to analyze a negative feedback system where species X and S regulate each other's synthesis with standard intracellular kinetics. For a given number of X-molecules, S-variation can be significant. We show that this signal noise does not necessarily increase X-variation as previously thought but, surprisingly, can be necessary to reduce it below a Poissonian limit. The principle resembles Stochastic Resonance in that signal noise improves signal detection.
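Master-equation models of this kind are typically explored with Gillespie's stochastic simulation algorithm. Below is a minimal event-sampled sketch of a mutual negative feedback loop; the hyperbolic repression rates and all parameter values are our own illustrative choices, not the paper's model:

```python
import random

def gillespie(t_end=500.0, k=50.0, K=20.0, d=1.0, seed=0):
    """X represses S synthesis, S represses X synthesis; linear decay."""
    rng = random.Random(seed)
    t, x, s = 0.0, 10, 10
    samples = []
    while t < t_end:
        rates = [k * K / (K + s),   # X synthesis, repressed by S
                 d * x,             # X degradation
                 k * K / (K + x),   # S synthesis, repressed by X
                 d * s]             # S degradation
        total = sum(rates)
        t += rng.expovariate(total)          # exponential waiting time
        r = rng.random() * total             # pick one reaction channel
        if r < rates[0]:
            x += 1
        elif r < rates[0] + rates[1]:
            x -= 1
        elif r < rates[0] + rates[1] + rates[2]:
            s += 1
        else:
            s -= 1
        samples.append(x)                    # event-sampled copy number of X
    return samples

xs = gillespie()
mean = sum(xs) / len(xs)
var = sum((v - mean) ** 2 for v in xs) / len(xs)
print("mean X:", round(mean, 1), " Fano factor (var/mean):", round(var / mean, 2))
```

A Fano factor below 1 in such a simulation would indicate sub-Poissonian X-variation of the kind the abstract discusses.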
Can Yeast (S. cerevisiae) Metabolic Volatiles Provide Polymorphic Signaling?
Arguello, J. Roman; Sellanes, Carolina; Lou, Yann Ru; Raguso, Robert A.
2013-01-01
Chemical signaling between organisms is a ubiquitous and evolutionarily dynamic process that helps to ensure mate recognition, location of nutrients, avoidance of toxins, and social cooperation. Evolutionary changes in chemical communication systems progress through natural variation within the organism generating the signal as well as the responding individuals. A promising yet poorly understood system with which to probe the importance of this variation exists between D. melanogaster and S. cerevisiae. D. melanogaster relies on yeast for nutrients, while also serving as a vector for yeast cell dispersal. Both are outstanding genetic and genomic models, with Drosophila also serving as a preeminent model for sensory neurobiology. To help develop these two genetic models as an ecological model, we have tested if - and to what extent - S. cerevisiae is capable of producing polymorphic signaling through variation in metabolic volatiles. We have carried out a chemical phenotyping experiment for 14 diverse accessions within a common garden random block design. Leveraging genomic sequences for 11 of the accessions, we ensured a genetically broad sample and tested for phylogenetic signal arising from the phenotypic dataset. Our results demonstrate that significant quantitative differences in volatile blends do exist among S. cerevisiae accessions. Of particular ecological relevance, the compounds driving the blend differences (acetoin, 2-phenyl ethanol and 3-methyl-1-butanol) are known ligands for D. melanogaster's chemosensory receptors, and are related to sensory behaviors. Though unable to correlate the genetic and volatile measurements, our data point to clear ways forward for behavioral assays aimed at understanding the implications of this variation. PMID:23990899
A Comparative Study of Random Patterns for Digital Image Correlation
NASA Astrophysics Data System (ADS)
Stoilov, G.; Kavardzhikov, V.; Pashkouleva, D.
2012-06-01
Digital Image Correlation (DIC) is a computer based image analysis technique utilizing random patterns, which finds applications in experimental mechanics of solids and structures. In this paper a comparative study of three simulated random patterns is presented. One of them is generated according to a new algorithm introduced by the authors. A criterion for quantitative evaluation of random patterns after the calculation of their autocorrelation functions is introduced. The patterns' deformations are simulated numerically and realized experimentally. The displacements are measured by using the DIC method. Tensile tests are performed after printing the generated random patterns on surfaces of standard iron sheet specimens. It is found that the newly designed random pattern retains relatively good quality until reaching 20% deformation.
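An autocorrelation-based quality check can be sketched in a few lines: compute the pattern's autocorrelation via FFT and measure the width of the central peak, which should be narrow and isotropic for a good DIC pattern. The specific measure below (radius at half maximum) is our stand-in, not the authors' exact criterion:

```python
import numpy as np

def autocorr(pattern):
    f = np.fft.fft2(pattern - pattern.mean())
    ac = np.fft.ifft2(f * np.conj(f)).real
    return np.fft.fftshift(ac / ac.flat[0])   # normalised, peak at centre

rng = np.random.default_rng(0)
pattern = (rng.random((256, 256)) < 0.5).astype(float)  # binary speckle
ac = autocorr(pattern)
cy, cx = np.array(ac.shape) // 2
profile = ac[cy, cx:]                         # radial cut from the peak
print("correlation radius (px):", int(np.argmax(profile < 0.5)))
```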
Effects of drain bias on the statistical variation of double-gate tunnel field-effect transistors
NASA Astrophysics Data System (ADS)
Choi, Woo Young
2017-04-01
The effects of drain bias on the statistical variation of double-gate (DG) tunnel field-effect transistors (TFETs) are discussed in comparison with DG metal-oxide-semiconductor FETs (MOSFETs). Statistical variation refers to the variation of threshold voltage (V th), subthreshold swing (SS), and drain-induced barrier thinning (DIBT). The unique statistical variation characteristics of DG TFETs and DG MOSFETs with the variation of drain bias are analyzed by using full three-dimensional technology computer-aided design (TCAD) simulation in terms of the three dominant variation sources: line-edge roughness (LER), random dopant fluctuation (RDF) and workfunction variation (WFV). It is observed that, unlike DG MOSFETs, DG TFETs suffer from less severe statistical variation as drain voltage increases.
Variation in Physician Practice Styles within and across Emergency Departments
Van Parys, Jessica
2016-01-01
Despite the significant responsibility that physicians have in healthcare delivery, we know surprisingly little about why physician practice styles vary within or across institutions. Estimating variation in physician practice styles is complicated by the fact that patients are rarely randomly assigned to physicians. This paper uses the quasi-random assignment of patients to physicians in emergency departments (EDs) to show how physicians vary in their treatment of patients with minor injuries. The results reveal a considerable degree of variation in practice styles within EDs; physicians at the 75th percentile of the spending distribution spend 20% more than physicians at the 25th percentile. Observable physician characteristics do not explain much of the variation across physicians, but there is a significant degree of sorting between physicians and EDs over time, with high-cost physicians sorting into high-cost EDs as they gain experience. The results may shed light on why some EDs remain persistently higher-cost than others. PMID:27517464
SNP selection and classification of genome-wide SNP data using stratified sampling random forests.
Wu, Qingyao; Ye, Yunming; Liu, Yang; Ng, Michael K
2012-09-01
For high dimensional genome-wide association (GWA) case-control data of complex disease, there is usually a large portion of single-nucleotide polymorphisms (SNPs) that are irrelevant to the disease. A simple random sampling method in random forest, using the default mtry parameter to choose the feature subspace, will select too many subspaces without informative SNPs. An exhaustive search for an optimal mtry is often required in order to include useful and relevant SNPs and get rid of the vast number of non-informative SNPs. However, it is too time-consuming and not favorable in GWA for high-dimensional data. The main aim of this paper is to propose a stratified sampling method for feature subspace selection to generate decision trees in a random forest for GWA high-dimensional data. Our idea is to design an equal-width discretization scheme for informativeness to divide SNPs into multiple groups. In feature subspace selection, we randomly select the same number of SNPs from each group and combine them to form a subspace to generate a decision tree. The advantage of this stratified sampling procedure is that it ensures each subspace contains enough useful SNPs, avoids the very high computational cost of an exhaustive search for an optimal mtry, and maintains the randomness of a random forest. We employ two genome-wide SNP data sets (Parkinson case-control data comprising 408 803 SNPs and Alzheimer case-control data comprising 380 157 SNPs) to demonstrate that the proposed stratified sampling method is effective and can generate better random forests with higher accuracy and lower error bound than those produced by Breiman's random forest generation method. For the Parkinson data, we also show some interesting genes identified by the method, which may be associated with neurological disorders for further biological investigations.
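The stratified subspace step itself is compact: discretize per-SNP informativeness into equal-width groups, then draw the same number of SNPs from each group for every tree. A sketch of our reading of that procedure (scoring function and group counts are placeholders):

```python
import random

def stratified_subspace(scores, n_groups=5, per_group=10, seed=None):
    """scores: per-SNP informativeness (e.g. a chi-square with the phenotype)."""
    rng = random.Random(seed)
    lo, hi = min(scores), max(scores)
    width = (hi - lo) / n_groups or 1.0        # equal-width discretization
    groups = [[] for _ in range(n_groups)]
    for snp, s in enumerate(scores):
        groups[min(int((s - lo) / width), n_groups - 1)].append(snp)
    subspace = []
    for members in groups:                     # equal draw from every stratum
        subspace += rng.sample(members, min(per_group, len(members)))
    return subspace                            # SNP indices for one tree

scores = [random.random() for _ in range(1000)]   # stand-in scores
print(len(stratified_subspace(scores, seed=0)), "SNPs chosen for this tree")
```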
FROG - Fingerprinting Genomic Variation Ontology
Bhardwaj, Anshu
2015-01-01
Genetic variations play a crucial role in differential phenotypic outcomes. Given the complexity in establishing this correlation and the enormous data available today, it is imperative to design machine-readable, efficient methods to store, label, search and analyze these data. A semantic approach, FROG: "FingeRprinting Ontology of Genomic variations," is implemented to label variation data, based on location, function and interactions. FROG has six levels to describe the variation annotation, namely, chromosome, DNA, RNA, protein, variations and interactions. Each level is a conceptual aggregation of logically connected attributes, each of which comprises various properties for the variant. For example, in the chromosome level, one of the attributes is location of variation, which has two properties, allosomes or autosomes. Another attribute is variation kind, which has four properties, namely, indel, deletion, insertion, substitution. Likewise, there are 48 attributes and 278 properties to capture the variation annotation across six levels. Each property is then assigned a bit score, which in turn leads to the generation of a binary fingerprint based on the combination of these properties (mostly taken from existing variation ontologies). FROG is a novel and unique method designed for the purpose of labeling the entire variation data generated to date for efficient storage, search and analysis. A web-based platform is designed as a test case for users to navigate sample datasets and generate fingerprints. The platform is available at http://ab-openlab.csir.res.in/frog. PMID:26244889
SINEs as driving forces in genome evolution.
Schmitz, J
2012-01-01
SINEs are short interspersed elements derived from cellular RNAs that repetitively retropose via RNA intermediates and integrate more or less randomly back into the genome. SINEs propagate almost entirely vertically within their host cells and, once established in the germline, are passed on from generation to generation. As non-autonomous elements, their reverse transcription (from RNA to cDNA) and genomic integration depends on the activity of the enzymatic machinery of autonomous retrotransposons, such as long interspersed elements (LINEs). SINEs are widely distributed in eukaryotes, but are especially effectively propagated in mammalian species. For example, more than a million Alu-SINE copies populate the human genome (approximately 13% of genomic space), and few master copies of them are still active. In the organisms where they occur, SINEs are a challenge to genomic integrity, but in the long term also can serve as beneficial building blocks for evolution, contributing to phenotypic heterogeneity and modifying gene regulatory networks. They substantially expand the genomic space and introduce structural variation to the genome. SINEs have the potential to mutate genes, to alter gene expression, and to generate new parts of genes. A balanced distribution and controlled activity of such properties is crucial to maintaining the organism's dynamic and thriving evolution. Copyright © 2012 S. Karger AG, Basel.
Atypical transistor-based chaotic oscillators: Design, realization, and diversity
NASA Astrophysics Data System (ADS)
Minati, Ludovico; Frasca, Mattia; Oświęcimka, Paweł; Faes, Luca; Drożdż, Stanisław
2017-07-01
In this paper, we show that novel autonomous chaotic oscillators based on one or two bipolar junction transistors and a limited number of passive components can be obtained via random search with suitable heuristics. Chaos is a pervasive occurrence in these circuits, particularly after manual adjustment of a variable resistor placed in series with the supply voltage source. Following this approach, 49 unique circuits generating chaotic signals when physically realized were designed, representing the largest collection of circuits of this kind to date. These circuits are atypical as they do not trivially map onto known topologies or variations thereof. They feature diverse spectra and predominantly anti-persistent monofractal dynamics. Notably, we recurrently found a circuit comprising one resistor, one transistor, two inductors, and one capacitor, which generates a range of attractors depending on the parameter values. We also found a circuit yielding an irregular quantized spike-train resembling some aspects of neural discharge and another one generating a double-scroll attractor, which represent the smallest known transistor-based embodiments of these behaviors. Through three representative examples, we additionally show that diffusive coupling of heterogeneous oscillators of this kind may give rise to complex entrainment, such as lag synchronization with directed information transfer and generalized synchronization. The replicability and reproducibility of the experimental findings are good.
Durner, George M.; Amstrup, Steven C.; Nielson, Ryan M.; McDonald, Trent; Huzurbazar, Snehalata
2004-01-01
Polar bears (Ursus maritimus) depend on ice-covered seas to satisfy life history requirements. Modern threats to polar bears include oil spills in the marine environment and changes in ice composition resulting from climate change. Managers need practical models that explain the distribution of bears in order to assess the impacts of these threats. We explored the use of discrete choice models to describe habitat selection by female polar bears in the Beaufort Sea. Using stepwise procedures we generated resource selection models of habitat use. Sea ice characteristics and ocean depths at known polar bear locations were compared to the same features at randomly selected locations. Models generated for each of four seasons confirmed complexities of habitat use by polar bears and their response to numerous factors. Bears preferred shallow water areas where different ice types intersected. Variation among seasons was reflected mainly in differential selection of total ice concentration, ice stages, floe sizes, and their interactions. Distance to the nearest ice interface was a significant term in models for three seasons. Water depth was selected as a significant term in all seasons, possibly reflecting higher productivity in shallow water areas. Preliminary tests indicate seasonal models can predict polar bear distribution based on prior sea ice data.
Lague, Michael R; Collard, Nicole J; Richmond, Brian G; Wood, Bernard A
2008-01-01
Mandibular corpora are well represented in the hominin fossil record, yet few studies have rigorously assessed the utility of mandibular corpus morphology for species recognition, particularly with respect to the linear dimensions that are most commonly available. In this study, we explored the extent to which commonly preserved mandibular corpus morphology can be used to: (i) discriminate among extant hominid taxa and (ii) support species designations among fossil specimens assigned to the genus Homo. In the first part of the study, discriminant analysis was used to test for significant differences in mandibular corpus shape at different taxonomic levels (genus, species and subspecies) among extant hominid taxa (i.e. Homo, Pan, Gorilla, Pongo). In the second part of the study, we examined shape variation among fossil mandibles assigned to Homo (including H. habilis sensu stricto, H. rudolfensis, early African H. erectus/H. ergaster, late African H. erectus, Asian H. erectus, H. heidelbergensis, H. neanderthalensis and H. sapiens). A novel randomization procedure designed for small samples (and using group ‘distinctness values’) was used to determine whether shape variation among the fossils is consistent with conventional taxonomy (or alternatively, whether a priori taxonomic groupings are completely random with respect to mandibular morphology). The randomization of ‘distinctness values’ was also used on the extant samples to assess the ability of the test to recognize known taxa. The discriminant analysis results demonstrated that, even for a relatively modest set of traditional mandibular corpus measurements, we can detect significant differences among extant hominids at the genus and species levels, and, in some cases, also at the subspecies level. Although the randomization of ‘distinctness values’ test is more conservative than discriminant analysis (based on comparisons with extant specimens), we were able to detect at least four distinct groups among the fossil specimens (i.e. H. sapiens, H. heidelbergensis, Asian H. erectus and a combined ‘African Homo’ group consisting of H. habilis sensu stricto, H. rudolfensis, early African H. erectus/H. ergaster and late African H. erectus). These four groups appear to be distinct at a level similar to, or greater than, that of modern hominid species. In addition, the mandibular corpora of H. neanderthalensis could be distinguished from those of ‘African Homo’, although not from those of H. sapiens, H. heidelbergensis, or the Asian H. erectus group. The results suggest that the features most commonly preserved on the hominin mandibular corpus have some taxonomic utility, although they are unlikely to be useful in generating a reliable alpha taxonomy for early African members of the genus Homo. PMID:19094183
Single-electron random-number generator (RNG) for highly secure ubiquitous computing applications
NASA Astrophysics Data System (ADS)
Uchida, Ken; Tanamoto, Tetsufumi; Fujita, Shinobu
2007-11-01
Since the security of all modern cryptographic techniques relies on unpredictable and irreproducible digital keys generated by random-number generators (RNGs), the realization of high-quality RNG is essential for secure communications. In this report, a new RNG, which utilizes single-electron phenomena, is proposed. A room-temperature operating silicon single-electron transistor (SET) having nearby an electron pocket is used as a high-quality, ultra-small RNG. In the proposed RNG, stochastic single-electron capture/emission processes to/from the electron pocket are detected with high sensitivity by the SET, and result in giant random telegraphic signals (GRTS) on the SET current. It is experimentally demonstrated that the single-electron RNG generates extremely high-quality random digital sequences at room temperature, in spite of its simple configuration. Because of its small-size and low-power properties, the single-electron RNG is promising as a key nanoelectronic device for future ubiquitous computing systems with highly secure mobile communication capabilities.
Dynamic responses of railroad car models to vertical and lateral rail inputs
NASA Technical Reports Server (NTRS)
Sewall, J. L.; Parrish, R. V.; Durling, B. J.
1971-01-01
Simplified dynamic models were applied in a study of vibration in a high-speed railroad car. The mathematical models used were a four-degree-of-freedom model for vertical responses to vertical rail inputs and a ten-degree-of-freedom model for lateral response to lateral or rolling (cross-level) inputs from the rails. Elastic properties of the passenger car body were represented by bending and torsion of a uniform beam. Rail-to-car (truck) suspensions were modeled as spring-mass-dashpot oscillators. Lateral spring nonlinearities approximating certain complicated truck mechanisms were introduced. The models were excited by displacement and, in some cases, velocity inputs from the rails by both deterministic (including sinusoidal) and random input functions. Results were obtained both in the frequency and time domains. Solutions in the time domain for the lateral model were obtained for a wide variety of transient and random inputs generated on-line by an analog computer. Variations in one of the damping properties of the lateral car suspension gave large fluctuations in response over a range of car speeds for a given input. This damping coefficient was significant in reducing lateral car responses that were higher for nonlinear springs for three different inputs.
Differential Susceptibility to Prevention: GABAergic, Dopaminergic, and Multilocus Effects
ERIC Educational Resources Information Center
Brody, Gene H.; Chen, Yi-fu; Beach, Steven R. H.
2013-01-01
Background: Randomized prevention trials provide a unique opportunity to test hypotheses about the interaction of genetic predispositions with contextual processes to create variations in phenotypes over time. Methods: Using two longitudinal, randomized prevention trials, molecular genetic and alcohol use outcome data were gathered from more than…
Grievink, Liat Shavit; Penny, David; Hendy, Mike D; Holland, Barbara R
2009-01-01
Correction to Shavit Grievink L, Penny D, Hendy MD, Holland BR: LineageSpecificSeqgen: generating sequence data with lineage-specific variation in the proportion of variable sites. BMC Evol Biol 2008, 8(1):317.
Srinivasa Rao, Mathukumalli; Swathi, Pettem; Rama Rao, Chitiprolu Anantha; Rao, K. V.; Raju, B. M. K.; Srinivas, Karlapudi; Manimanjari, Dammu; Maheswari, Mandapaka
2015-01-01
The present study features the estimation of the number of generations of tobacco caterpillar, Spodoptera litura Fab., on the peanut crop at six locations in India using MarkSim, which provides General Circulation Model (GCM) projections of daily maximum (T.max) and minimum (T.min) air temperatures from six models, viz., BCCR-BCM2.0, CNRM-CM3, CSIRO-Mk3.5, ECHams5, INCM-CM3.0 and MIROC3.2, along with an ensemble of the six, for three emission scenarios (A2, A1B and B1). These data were used to predict future pest scenarios following the growing degree days approach in four different climate periods, viz., Baseline (1975), Near future (NF, 2020), Distant future (DF, 2050) and Very distant future (VDF, 2080). It is predicted that more generations would occur during the three future climate periods, with significant variation among scenarios and models. Among the seven models, 1–2 additional generations were predicted during DF and VDF due to higher future temperatures in the CNRM-CM3, ECHams5 and CSIRO-Mk3.5 models. The temperature projections of these models indicated that the generation time would decrease by 18–22% over baseline. Analysis of variance (ANOVA) was used to partition the variation in the predicted number of generations and generation time of S. litura on peanut during the crop season. Geographical location explained 34% of the total variation in number of generations, followed by time period (26%), model (1.74%) and scenario (0.74%). The remaining 14% of the variation was explained by interactions. The increased number of generations and reduced generation time across the six peanut growing locations of India suggest that the incidence of S. litura may increase due to the projected increase in temperatures in future climate change periods. PMID:25671564
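The growing-degree-day bookkeeping behind such generation counts is simple to state: accumulate daily mean temperature above a base threshold and divide by the species' thermal constant. The base temperature and thermal constant below are placeholders, not the paper's fitted values for S. litura:

```python
def generations(tmax, tmin, t_base=10.0, thermal_constant=500.0):
    """tmax/tmin: daily series in deg C; returns completed generations."""
    gdd = sum(max(0.0, (hi + lo) / 2.0 - t_base)
              for hi, lo in zip(tmax, tmin))
    return gdd / thermal_constant

season_tmax = [32.0] * 120   # toy 120-day crop season
season_tmin = [22.0] * 120
print(round(generations(season_tmax, season_tmin), 2), "generations")
```

A warmer projected season raises the daily degree-day increments, which is why the higher-temperature GCMs above add one to two generations.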
Iteration of ultrasound aberration correction methods
NASA Astrophysics Data System (ADS)
Maasoey, Svein-Erik; Angelsen, Bjoern; Varslot, Trond
2004-05-01
Aberration in ultrasound medical imaging is usually modeled by time-delay and amplitude variations concentrated on the transmitting/receiving array. This filter process is here denoted a TDA filter. The TDA filter is an approximation to the physical aberration process, which occurs over an extended part of the human body wall. Estimation of the TDA filter, and performing correction on transmit and receive, has proven difficult. It has yet to be shown that this method works adequately for severe aberration. Estimation of the TDA filter can be iterated by retransmitting a corrected signal and re-estimating until a convergence criterion is fulfilled (adaptive imaging). Two methods for estimating time-delay and amplitude variations in receive signals from random scatterers have been developed. One method correlates each element signal with a reference signal. The other method uses eigenvalue decomposition of the receive cross-spectrum matrix, based upon a receive energy-maximizing criterion. Simulations of iterating aberration correction with a TDA filter have been investigated to study its convergence properties. Aberration was generated by a weak and a strong human-body wall model, both emulating the human abdominal wall. Results after iteration improve aberration correction substantially, and both estimation methods converge, even for the case of strong aberration.
Xu, Shixia; Ju, Jianfeng; Zhou, Xuming; Wang, Lian; Zhou, Kaiya; Yang, Guang
2012-01-01
To further extend our understanding of the mechanisms behind the current nearly extinct status of the baiji (Lipotes vexillifer), one of the most critically endangered species in the world, genetic diversity at the major histocompatibility complex (MHC) class II DRB locus was investigated in the baiji. Nine highly divergent DRB alleles were identified in 17 samples, with an average of 28.4 (13.2%) nucleotide differences and 16.7 (23.5%) amino acid differences between alleles. The unexpectedly high level of DRB allelic diversity in the baiji may partly be attributable to its evolutionary adaptations to the freshwater environment, which is regarded as having higher parasite diversity than the marine environment. In addition, balancing selection was found to be the main mechanism generating sequence diversity at the baiji DRB gene. Considerable sequence variation at the adaptive MHC genes despite significant loss of neutral genetic variation in the baiji genome might suggest that intense selection has overpowered random genetic drift as the main evolutionary force, which further suggests that the critically endangered or nearly extinct status of the baiji is not an outcome of genetic collapse. PMID:22272349
Strenge, Hans; Lesmana, Cokorda Bagus Jaya; Suryani, Luh Ketut
2009-08-01
Verbal random number generation is a procedurally simple task for assessing executive function and appears ideally suited for use in diverse settings in cross-cultural research. The objective of this study was to examine ethnic group differences between young adults in Bali (Indonesia) and Kiel (Germany): 50 bilingual healthy students, 30 Balinese and 20 Germans, attempted to generate a random sequence of the digits 1 to 9. In Balinese participants, randomization was done in Balinese (native language L1) and Indonesian (first foreign language L2); in German subjects, in German (L1) and English (L2). 10 of 30 Balinese (33%), but no Germans, were unable to inhibit habitual counting in more than half of the responses. The Balinese produced significantly more nonrandom responses than the Germans, with higher rates of counting and significantly less occurrence of the digits 2 and 3 in L1 compared with L2. Repetition and cycling behavior did not differ between the four languages. The findings highlight the importance of taking into account culture-bound psychosocial factors for Balinese individuals when administering and interpreting a random number generation test.
An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution
NASA Technical Reports Server (NTRS)
Campbell, C. W.
1983-01-01
An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired values of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact, and in practice its accuracy is limited only by the quality of the uniform distribution random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
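The underlying transformation is a standard result: mix two independent standard normals through the Cholesky factor of the 2x2 covariance matrix. A sketch of that transformation (the original FORTRAN routine is not reproduced here):

```python
import random

def bivariate_normal(mu1, mu2, s1, s2, rho, rng=random):
    """One (X1, X2) pair with the requested means, SDs and correlation."""
    z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
    return (mu1 + s1 * z1,
            mu2 + s2 * (rho * z1 + (1 - rho * rho) ** 0.5 * z2))

pairs = [bivariate_normal(0, 0, 1, 2, 0.8) for _ in range(100_000)]
n = len(pairs)
mx = sum(x for x, _ in pairs) / n
my = sum(y for _, y in pairs) / n
cov = sum((x - mx) * (y - my) for x, y in pairs) / n
sx = (sum((x - mx) ** 2 for x, _ in pairs) / n) ** 0.5
sy = (sum((y - my) ** 2 for _, y in pairs) / n) ** 0.5
print(round(cov / (sx * sy), 3))   # empirical correlation, ~0.8
```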
The use of crop rotation for mapping soil organic content in farmland
NASA Astrophysics Data System (ADS)
Yang, Lin; Song, Min; Zhu, A.-Xing; Qin, Chengzhi
2017-04-01
Most current digital soil mapping uses natural environmental covariates. However, human activities have significantly impacted the development of soil properties over the past half-century and have therefore become an important factor affecting soil spatial variability. Many studies have used field experiments to show how soil properties are impacted and changed by human activities; however, spatial variation data on human activities have rarely been used as environmental covariates in digital soil mapping. In this paper, we took crop rotation as an example of agricultural activity and explored its effectiveness in characterizing and mapping the spatial variability of soil. The cultivated area of Xuanzhou city and Langxi County in Anhui Province was chosen as the study area. Three main crop rotations, including double-rice, wheat-rice, and oilseed rape-cotton, were observed through field investigation in 2010. The spatial distribution of the three crop rotations in the study area was obtained by multi-phase remote sensing image interpretation using a supervised classification method. One-way analysis of variance (ANOVA) of topsoil organic content across the three crop rotation groups was performed. The importance of seven natural environmental covariates, crop rotation, land use and NDVI was assessed with the variable importance criterion of Random Forest. Different combinations of environmental covariates were selected according to the importance rankings for predicting SOC using Random Forest and the Soil Landscape Inference Model (SOLIM). Cross-validation was used to evaluate the mapping accuracies. The results showed significant differences in topsoil organic content among the three crop rotation groups. Crop rotation is more important than parent material, land use or NDVI according to the importance ranking calculated by Random Forest. In addition, crop rotation improved the mapping accuracy, especially for the flat cultivated area. This study demonstrates the usefulness of human activity data in digital soil mapping and indicates the necessity of including human activity factors in digital soil mapping studies.
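As a rough illustration of the variable-importance step described above, the following sketch ranks covariates with a random forest on synthetic data; the covariate names (including the crop-rotation class) are placeholders, not the study's actual layers.

```python
# Sketch: ranking environmental covariates (including a crop-rotation
# class) by random forest importance. Synthetic data; names illustrative.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.normal(size=n),            # elevation (placeholder)
    rng.normal(size=n),            # slope (placeholder)
    rng.normal(size=n),            # NDVI (placeholder)
    rng.integers(0, 3, size=n),    # crop rotation class (0, 1, 2)
])
# Simulated SOC depends strongly on rotation class, weakly on NDVI.
soc = 1.5 * X[:, 3] + 0.3 * X[:, 2] + rng.normal(scale=0.5, size=n)

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, soc)
for name, imp in zip(["elevation", "slope", "NDVI", "rotation"],
                     rf.feature_importances_):
    print(f"{name:10s} {imp:.3f}")   # rotation should rank highest here
```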
The evolutionary implications of epigenetic inheritance.
Jablonka, Eva
2017-10-06
The Modern Evolutionary Synthesis (MS) forged in the mid-twentieth century was built on a notion of heredity that excluded soft inheritance, the inheritance of the effects of developmental modifications. However, the discovery of molecular mechanisms that generate random and developmentally induced epigenetic variations is leading to a broadening of the notion of biological heredity that has consequences for ideas about evolution. After presenting some old challenges to the MS that were raised, among others, by Karl Popper, I discuss recent research on epigenetic inheritance, which provides experimental and theoretical support for these challenges. There is now good evidence that epigenetic inheritance is ubiquitous and is involved in adaptive evolution and macroevolution. I argue that the many evolutionary consequences of epigenetic inheritance open up new research areas and require the extension of the evolutionary synthesis beyond the current neo-Darwinian model.
Transposable element islands facilitate adaptation to novel environments in an invasive species
Schrader, Lukas; Kim, Jay W.; Ence, Daniel; Zimin, Aleksey; Klein, Antonia; Wyschetzki, Katharina; Weichselgartner, Tobias; Kemena, Carsten; Stökl, Johannes; Schultner, Eva; Wurm, Yannick; Smith, Christopher D.; Yandell, Mark; Heinze, Jürgen; Gadau, Jürgen; Oettler, Jan
2014-01-01
Adaptation requires genetic variation, but founder populations are generally genetically depleted. Here we sequence two populations of an inbred ant that diverge in phenotype to determine how variability is generated. Cardiocondyla obscurior has the smallest of the sequenced ant genomes and its structure suggests a fundamental role of transposable elements (TEs) in adaptive evolution. Accumulations of TEs (TE islands) comprising 7.18% of the genome evolve faster than other regions with regard to single-nucleotide variants, gene/exon duplications and deletions and gene homology. A non-random distribution of gene families, larvae/adult specific gene expression and signs of differential methylation in TE islands indicate intragenomic differences in regulation, evolutionary rates and coalescent effective population size. Our study reveals a tripartite interplay between TEs, life history and adaptation in an invasive species. PMID:25510865
Fidelity under isospectral perturbations: a random matrix study
NASA Astrophysics Data System (ADS)
Leyvraz, F.; García, A.; Kohler, H.; Seligman, T. H.
2013-07-01
The set of Hamiltonians generated by all unitary transformations from a single Hamiltonian is the largest set of isospectral Hamiltonians we can form. Taking advantage of the fact that the unitary group can be generated from Hermitian matrices, we can take the transformations generated by the Gaussian unitary ensemble with a small parameter as small perturbations. Similarly, the transformations generated from orthogonal matrices by Hermitian antisymmetric matrices form isospectral transformations among symmetric matrices. Based on this concept we can obtain the fidelity decay of a system that decays under a random isospectral perturbation with well-defined properties regarding time-reversal invariance. If the Hamiltonian itself is also chosen from a classical random matrix ensemble, then we obtain solutions in terms of form factors in the limit of large matrices.
Investigation of estimators of probability density functions
NASA Technical Reports Server (NTRS)
Speed, F. M.
1972-01-01
Four research projects are summarized which include: (1) the generation of random numbers on the IBM 360/44, (2) statistical tests used to check out random number generators, (3) Specht density estimators, and (4) use of estimators of probability density functions in analyzing large amounts of data.
Mining and Querying Multimedia Data
2011-09-29
able to capture more subtle spatial variations such as repetitiveness. Local feature descriptors such as SIFT [74] and SURF [12] have also been widely...empirically set to s = 90%, r = 50%, K = 20, where small variations lead to little perturbation of the output. The pseudo-code of the algorithm is...by constructing a three-layer graph based on clustering outputs, and executing a slight variation of random walk with restart algorithm. It provided
DNA based random key generation and management for OTP encryption.
Zhang, Yunpeng; Liu, Xin; Sun, Manhui
2017-09-01
One-time pad (OTP) is a principle of key generation applied to stream ciphering that offers total privacy. The OTP encryption scheme has been proved unbreakable in theory but is difficult to realize in practical applications. Because OTP encryption specifically requires absolute randomness of the key, its development has been tightly constrained. DNA cryptography is a new and promising technology in the field of information security. The storage capability of DNA chromosomes can be used to build one-time-pad structures, with pseudo-random number generation and indexing used to encrypt plaintext messages. In this paper, we present a feasible solution to the OTP symmetric key generation and transmission problem with DNA at the molecular level. Through recombinant DNA technology, using restriction enzymes known only to sender and receiver to combine the secure key, represented as a DNA sequence, with the T vector, we generate the DNA bio-hidden secure key and then place the recombinant plasmid in implanted bacteria for secure key transmission. The designed bio-experiments and simulation results show that the security of key transmission is further improved and the environmental requirements of key transmission are reduced. Analysis demonstrates that the proposed DNA-based random key generation and management solution offers high security and usability. Published by Elsevier B.V.
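The cryptographic core of any OTP scheme is a bitwise XOR with a never-reused random key. The sketch below illustrates this with a DNA-represented key, assuming the common two-bits-per-base mapping (A=00, C=01, G=10, T=11); the mapping is an illustrative choice, not necessarily the paper's encoding.

```python
# Sketch of the OTP principle with a DNA-represented key. The base-to-bit
# mapping (A=00, C=01, G=10, T=11) is an illustrative assumption.
BASE_BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def dna_key_to_bytes(seq):
    """Pack a DNA sequence into bytes, four bases per byte."""
    assert len(seq) % 4 == 0
    out = bytearray()
    for i in range(0, len(seq), 4):
        b = 0
        for base in seq[i:i + 4]:
            b = (b << 2) | BASE_BITS[base]
        out.append(b)
    return bytes(out)

def otp(data, key):
    """XOR data with key; encryption and decryption are the same op."""
    assert len(key) >= len(data), "OTP key must be at least as long as the data"
    return bytes(d ^ k for d, k in zip(data, key))

key = dna_key_to_bytes("ACGTTGCAGGATCCAATTGGCCAATTGGACGT")  # 32 bases -> 8 bytes
ct = otp(b"top secr", key)
assert otp(ct, key) == b"top secr"   # XOR with the same key decrypts
```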
Spectral turning bands for efficient Gaussian random fields generation on GPUs and accelerators
NASA Astrophysics Data System (ADS)
Hunger, L.; Cosenza, B.; Kimeswenger, S.; Fahringer, T.
2015-11-01
A random field (RF) is a set of correlated random variables associated with different spatial locations. RF generation algorithms are of crucial importance for many scientific areas, such as astrophysics, geostatistics, computer graphics, and many others. Current approaches commonly make use of the 3D fast Fourier transform (FFT), which does not scale well for RFs larger than the available memory; they are also limited to regular rectilinear meshes. We introduce random field generation with the turning band method (RAFT), an RF generation algorithm based on the turning band method that is optimized for massively parallel hardware such as GPUs and accelerators. Our algorithm replaces the 3D FFT with a lower-order, one-dimensional FFT followed by a projection step and is further optimized with loop unrolling and blocking. RAFT can easily generate RFs on non-regular (non-uniform) meshes and efficiently produce fields with mesh sizes larger than the available device memory by using a streaming, out-of-core approach. Our algorithm generates RFs with the correct statistical behavior and is tested on a variety of modern hardware, such as NVIDIA Tesla, AMD FirePro and Intel Phi. RAFT is faster than traditional methods on regular meshes and has been successfully applied to two real case scenarios: planetary nebulae and cosmological simulations.
A high-speed on-chip pseudo-random binary sequence generator for multi-tone phase calibration
NASA Astrophysics Data System (ADS)
Gommé, Liesbeth; Vandersteen, Gerd; Rolain, Yves
2011-07-01
An on-chip reference generator is conceived by adopting the technique of decimating a pseudo-random binary sequence (PRBS) signal into parallel sequences. This is of great benefit when high-speed generation of PRBS and PRBS-derived signals is the objective. The design is implemented in standard CMOS logic available in commercial libraries, which provides the logic functions for the generator. The design allows the user to select the periodicity of the PRBS and the PRBS-derived signals. Characterization of the on-chip generator establishes its performance and reveals promising specifications.
Scope of Various Random Number Generators in ant System Approach for TSP
NASA Technical Reports Server (NTRS)
Sen, S. K.; Shaykhian, Gholam Ali
2007-01-01
Several quasi- and pseudo-random number generators are tested within a heuristic based on an ant-system approach to the traveling salesman problem. The experiment explores whether any particular generator is most desirable; run on large samples, it has the potential to rank the performance of the generators for the foregoing heuristic. The main aim is to answer the controversial question of which generator is best in terms of quality of the result (accuracy) as well as the cost of producing it (time/computational complexity) in a probabilistic/statistical sense.
Kouritzin, Michael A; Newton, Fraser; Wu, Biao
2013-04-01
Herein, we propose generating CAPTCHAs through random field simulation and give a novel, effective and efficient algorithm to do so. Indeed, we demonstrate that sufficient information about word tests for easy human recognition is contained in the site marginal probabilities and the site-to-nearby-site covariances and that these quantities can be embedded directly into certain conditional probabilities, designed for effective simulation. The CAPTCHAs are then partial random realizations of the random CAPTCHA word. We start with an initial random field (e.g., randomly scattered letter pieces) and use Gibbs resampling to re-simulate portions of the field repeatedly using these conditional probabilities until the word becomes human-readable. The residual randomness from the initial random field together with the random implementation of the CAPTCHA word provide significant resistance to attack. This results in a CAPTCHA, which is unrecognizable to modern optical character recognition but is recognized about 95% of the time in a human readability study.
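A heavily simplified sketch of the Gibbs-resampling idea: each site is re-simulated from a conditional probability that couples a site marginal (here, a toy word template) with agreement among neighbouring sites. The template, coupling weights and sweep count are illustrative assumptions, not the authors' tuned design.

```python
# Toy Gibbs resampling of a binary field toward a word template.
# Template, weights and schedule are illustrative assumptions.
import math
import random

H, W = 8, 24
random.seed(1)
template = [[1 if (r in (2, 3, 4) and c % 5 < 3) else 0 for c in range(W)]
            for r in range(H)]                     # stand-in for a rendered word
field = [[random.randint(0, 1) for _ in range(W)] for _ in range(H)]  # noise

def conditional_p1(r, c, a=2.0, b=1.0):
    """P(site = 1 | template marginal, neighbour agreement)."""
    nbrs = [field[r2][c2] for r2, c2 in ((r-1, c), (r+1, c), (r, c-1), (r, c+1))
            if 0 <= r2 < H and 0 <= c2 < W]
    # Logit combines the site marginal (template) and neighbour majority;
    # this is the Gibbs conditional of an Ising-like joint model.
    logit = a * (2 * template[r][c] - 1) + b * (2 * sum(nbrs) - len(nbrs))
    return 1.0 / (1.0 + math.exp(-logit))

for sweep in range(30):                      # Gibbs sweeps over all sites
    for r in range(H):
        for c in range(W):
            field[r][c] = 1 if random.random() < conditional_p1(r, c) else 0

print("\n".join("".join("#" if v else "." for v in row) for row in field))
```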
A Statistical Method to Distinguish Functional Brain Networks
Fujita, André; Vidal, Maciel C.; Takahashi, Daniel Y.
2017-01-01
One major problem in neuroscience is the comparison of functional brain networks of different populations, e.g., distinguishing the networks of controls and patients. Traditional algorithms are based on search for isomorphism between networks, assuming that they are deterministic. However, biological networks present randomness that cannot be well modeled by those algorithms. For instance, functional brain networks of distinct subjects of the same population can be different due to individual characteristics. Moreover, networks of subjects from different populations can be generated through the same stochastic process. Thus, a better hypothesis is that networks are generated by random processes. In this case, subjects from the same group are samples from the same random process, whereas subjects from different groups are generated by distinct processes. Using this idea, we developed a statistical test called ANOGVA to test whether two or more populations of graphs are generated by the same random graph model. Our simulations' results demonstrate that we can precisely control the rate of false positives and that the test is powerful to discriminate random graphs generated by different models and parameters. The method also showed to be robust for unbalanced data. As an example, we applied ANOGVA to an fMRI dataset composed of controls and patients diagnosed with autism or Asperger. ANOGVA identified the cerebellar functional sub-network as statistically different between controls and autism (p < 0.001). PMID:28261045
Note: The design of thin gap chamber simulation signal source based on field programmable gate array
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Kun; Lu, Houbing; Wang, Xu; Li, Feng; Liang, Futian; Jin, Ge
2015-01-01
The Thin Gap Chamber (TGC) is an important part of the ATLAS detector at the LHC accelerator. Targeting the features of the TGC detector's output signal, we have designed a simulation signal source. The core of the design is based on a field programmable gate array that randomly outputs 256 channels of simulated signals. The signals are generated by a true random number generator whose source of randomness is the timing jitter in ring oscillators. The experimental results show that the random numbers are uniform in histogram and that the whole system has high reliability.
Housworth, E A; Martins, E P
2001-01-01
Statistical randomization tests in evolutionary biology often require a set of random, computer-generated trees. For example, earlier studies have shown how large numbers of computer-generated trees can be used to conduct phylogenetic comparative analyses even when the phylogeny is uncertain or unknown. These methods were limited, however, in that (in the absence of molecular sequence or other data) they allowed users to assume that no phylogenetic information was available or that all possible trees were known. Intermediate situations where only a taxonomy or other limited phylogenetic information (e.g., polytomies) are available are technically more difficult. The current study describes a procedure for generating random samples of phylogenies while incorporating limited phylogenetic information (e.g., four taxa belong together in a subclade). The procedure can be used to conduct comparative analyses when the phylogeny is only partially resolved or can be used in other randomization tests in which large numbers of possible phylogenies are needed.
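One simple way to sample random resolved topologies while honouring a subclade constraint is to build the constrained subtree first by random sequential joins and then treat it as a single tip when joining the remaining taxa. The sketch below illustrates the idea; random sequential joining is not guaranteed to sample topologies uniformly and is not necessarily the paper's procedure.

```python
# Sketch: random binary topologies with a constrained subclade.
# Random sequential joining is illustrative only.
import random

def random_join(tips, rng):
    """Randomly pair up lineages until one rooted tree remains."""
    nodes = list(tips)
    while len(nodes) > 1:
        a, b = rng.sample(nodes, 2)
        nodes.remove(a); nodes.remove(b)
        nodes.append((a, b))          # a new internal node as a nested tuple
    return nodes[0]

rng = random.Random(42)
constrained = random_join(["t1", "t2", "t3", "t4"], rng)   # stays monophyletic
tree = random_join([constrained, "t5", "t6", "t7"], rng)   # subclade = one tip
print(tree)
```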
Physical layer one-time-pad data encryption through synchronized semiconductor laser networks
NASA Astrophysics Data System (ADS)
Argyris, Apostolos; Pikasis, Evangelos; Syvridis, Dimitris
2016-02-01
Semiconductor lasers (SLs) have proven to be key devices for the generation of ultrafast true random bit streams. Their potential to emit chaotic signals with desirable statistics establishes them as a low-cost solution for various needs, from large-volume key generation to real-time encrypted communications. Usually, only undemanding post-processing is needed to convert the acquired analog time series into digital sequences that pass all established tests of randomness. A novel architecture that can generate and exploit these true random sequences is a fiber network whose nodes are semiconductor lasers coupled and synchronized to a central hub laser. In this work we show experimentally that laser nodes in such a star network topology can synchronize with each other through complex broadband signals that seed true random bit sequences (TRBS) generated at several Gb/s. The potential for each node to access, through the fiber-optic network, random bit streams generated in real time and synchronized with the rest of the nodes makes it possible to implement a one-time-pad encryption protocol that mixes the synchronized true random bit sequence with real data at Gb/s rates. Forward-error correction methods are used to reduce the errors in the TRBS and the final error rate at the data-decoding level. An appropriate selection of the sampling methodology and of the physical properties of the chaotic seed signal through which the network locks into synchronization allows error-free performance.
Johanson, Edward W.; Simms, Richard
1981-01-01
A scram signal generating circuit for nuclear reactor installations monitors a flow signal representing the flow rate of the liquid sodium coolant which is circulated through the reactor, and initiates reactor shutdown for a rapid variation in the flow signal, indicative of fuel motion. The scram signal generating circuit includes a long-term drift compensation circuit which processes the flow signal and generates an output signal representing the flow rate of the coolant. The output signal remains substantially unchanged for small variations in the flow signal, attributable to long term drift in the flow rate, but a rapid change in the flow signal, indicative of a fast flow variation, causes a corresponding change in the output signal. A comparator circuit compares the output signal with a reference signal, representing a given percentage of the steady state flow rate of the coolant, and generates a scram signal to initiate reactor shutdown when the output signal equals the reference signal.
Individual mineral supplement intake by ewes swath grazing or confinement fed pea-barley forage
USDA-ARS?s Scientific Manuscript database
Previous research has reported high variation in intake of self-fed protein and/or energy supplements by individual animals; however, little is known about variation in consumption of mineral supplements. Sixty mature range ewes (non-pregnant, non-lactating) were used in a completely randomized desig...
Montgomery, John H; Byerly, Matthew; Carmody, Thomas; Li, Baitao; Miller, Daniel R; Varghese, Femina; Holland, Rhiannon
2004-12-01
The effect of funding source on the outcome of randomized controlled trials has been investigated in several medical disciplines; however, psychiatry has been largely excluded from such analyses. In this article, randomized controlled trials of second generation antipsychotics in schizophrenia are reviewed and analyzed with respect to funding source (industry vs. non-industry funding). A literature search was conducted for randomized, double-blind trials in which at least one of the tested treatments was a second generation antipsychotic. In each study, design quality and study outcome were assessed quantitatively according to rating scales. Mean quality and outcome scores were compared between the industry-funded and non-industry-funded studies. An analysis of the primary author's affiliation with industry was similarly performed. Results of industry-funded studies significantly favored second generation over first generation antipsychotics when compared to non-industry-funded studies. Non-industry-funded studies showed a trend toward higher quality than industry-funded studies; however, the difference between the two was not significant. Also, within the industry-funded studies, outcomes of trials whose first authors were employed by industry sponsors demonstrated a trend favoring second generation over first generation antipsychotics to a greater degree than trials whose first authors were employed outside the industry (p=0.05). While the retrospective design of the study limits the strength of the findings, the data suggest that industry bias may occur in randomized controlled trials in schizophrenia. There appear to be several routes by which bias may enter clinical research, including trial design, control of data analysis and multiplicity/redundancy of trials.
An investigation of the uniform random number generator
NASA Technical Reports Server (NTRS)
Temple, E. C.
1982-01-01
Most random number generators in use today are of the congruential form X(i+1) = (A·X(i) + C) mod M, where A, C, and M are nonnegative integers. If C = 0, the generator is called the multiplicative type, and those for which C ≠ 0 are called mixed congruential generators. It is easy to see that congruential generators will repeat a sequence of numbers after at most M values have been generated. The number of values that a procedure generates before restarting the sequence is called the length or period of the generator. Generally, it is desirable to make the period as long as possible. A detailed discussion of congruential generators is given. Also, several promising procedures that differ from the multiplicative and mixed procedures are discussed.
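The congruential recurrence is only a few lines of code. In the sketch below, the multiplier and increment are the widely used Numerical Recipes constants for a modulus of 2^32, chosen purely for illustration.

```python
# Mixed congruential generator X(i+1) = (A*X(i) + C) mod M. The constants
# are the Numerical Recipes choices for M = 2**32, used for illustration.
A, C, M = 1664525, 1013904223, 2**32

def lcg(seed, n):
    """Yield n values of the congruential sequence, scaled to [0, 1)."""
    x = seed
    for _ in range(n):
        x = (A * x + C) % M
        yield x / M

print(list(lcg(seed=12345, n=5)))
# With C = 0 the generator is the multiplicative type; the period is
# then at most M and depends on the choice of A and the seed.
```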
Toward DNA-based Security Circuitry: First Step - Random Number Generation.
Bogard, Christy M; Arazi, Benjamin; Rouchka, Eric C
2008-08-10
DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. Our team investigates the implications of DNA-based circuit design for security applications. As an initial step we develop random number generation circuitry. A novel prototype schema employs solid-phase synthesis of oligonucleotides for random construction of DNA sequences. Temporary storage and retrieval are achieved through plasmid vectors.
Individual variation behind the evolution of cooperation.
Barta, Zoltán
2016-02-05
Life on Earth has two remarkable properties. The first is variation: even apart from the vast number of extant species, there are considerable differences between individuals within a single species. The second property is cooperation. It is surprising that until recently the interactions between these two properties have rarely been addressed from an evolutionary point of view. Here, I concentrate on how inter-individual differences influence the evolution of cooperation. First, I deal with cases where individuality is maintained by random processes like mutation or phenotypic noise. Second, I examine when differences in state cause differences in behaviour. Finally, I investigate the effects of individual role specialization. Variation can be important in several ways. Increased random variation can change the expectation about cooperativeness of future partners, altering behaviour in a current relationship. Differences in state may serve as a book-keeping mechanism that is necessary for the evolution of reciprocity. If the cost of cooperation can depend on state then strategic regulation of state makes it possible to coerce partners to cooperate. If conditions force individuals to specialize, cooperation becomes more valuable. My review of theoretical models suggests that variation plays an important role in the evolution of cooperation. © 2016 The Author(s).
Simulations Using Random-Generated DNA and RNA Sequences
ERIC Educational Resources Information Center
Bryce, C. F. A.
1977-01-01
Using a very simple computer program written in BASIC, a very large number of random-generated DNA or RNA sequences are obtained. Students use these sequences to predict complementary sequences and translational products, evaluate base compositions, determine frequencies of particular triplet codons, and suggest possible secondary structures.…
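The exercises described translate directly into a few lines of modern code; the sketch below mirrors them in Python rather than BASIC.

```python
# Random DNA sequence generation with base composition, reverse
# complement, and codon counts, mirroring the classroom exercises.
import random
from collections import Counter

COMPLEMENT = str.maketrans("ACGT", "TGCA")

def random_dna(n, rng=random):
    return "".join(rng.choice("ACGT") for _ in range(n))

seq = random_dna(300)
print("composition:", {b: seq.count(b) / len(seq) for b in "ACGT"})
print("complement :", seq.translate(COMPLEMENT)[::-1])   # reverse complement
codons = Counter(seq[i:i + 3] for i in range(0, len(seq) - 2, 3))
print("most common codons:", codons.most_common(3))
```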
ERIC Educational Resources Information Center
Griffiths, Martin
2011-01-01
One of the author's undergraduate students recently asked him whether it was possible to generate a random positive integer. After some thought, the author realised that there were plenty of interesting mathematical ideas inherent in her question. So much so in fact, that the author decided to organise a workshop, open both to undergraduates and…
Minority Game of price promotions in fast moving consumer goods markets
NASA Astrophysics Data System (ADS)
Groot, Robert D.; Musters, Pieter A. D.
2005-05-01
A variation of the Minority Game has been applied to study the timing of promotional actions at retailers in the fast moving consumer goods market. The underlying hypotheses for this work are that price promotions are more effective when fewer than average competitors do a promotion, and that a promotion strategy can be based on past sales data. The first assumption has been checked by analysing 1467 promotional actions for three products on the Dutch market (ketchup, mayonnaise and curry sauce) over a 120-week period, both on an aggregated level and on retailer chain level. The second assumption was tested by analysing past sales data with the Minority Game. This revealed that high or low competitor promotional pressure for actual ketchup, mayonnaise, curry sauce and barbecue sauce markets is to some extent predictable up to a forecast of some 10 weeks. Whereas a random guess would be right 50% of the time, a single-agent game can predict the market with a success rate of 56% for a 6-9 week forecast. This number is the same for all four mentioned fast moving consumer markets. For a multi-agent game a larger variability in the success rate is obtained, but predictability can be as high as 65%. Contrary to expectation, the actual market does the opposite of what game theory would predict. This points at a systematic oscillation in the market. Even though this result is not fully understood, merely observing that this trend is present in the data could lead to exploitable trading benefits. As a check, random history strings were generated from which the statistical variation in the game prediction was studied. This shows that the odds are 1:1,000,000 that the observed pattern in the market is based on coincidence.
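A minimal single-agent flavour of the game described: each strategy is a lookup table from the last m outcomes to a prediction, strategies are scored virtually, and the agent plays its best scorer. Parameters and the random stand-in for the sales series are illustrative.

```python
# Minimal single-agent Minority Game flavour. The random outcome series
# is a stand-in for real promotional-pressure data.
import itertools
import random

m, n_strategies, rounds = 3, 4, 500
random.seed(0)
histories = list(itertools.product((0, 1), repeat=m))
strategies = [{h: random.randint(0, 1) for h in histories}
              for _ in range(n_strategies)]
scores = [0] * n_strategies
history = tuple(random.randint(0, 1) for _ in range(m))
hits = 0

for _ in range(rounds):
    best = max(range(n_strategies), key=scores.__getitem__)
    prediction = strategies[best][history]    # predict before the outcome
    outcome = random.randint(0, 1)            # reveal this week's outcome
    hits += (prediction == outcome)
    for i, s in enumerate(strategies):        # virtual scoring of all strategies
        scores[i] += (s[history] == outcome)
    history = history[1:] + (outcome,)

print(f"hit rate: {hits / rounds:.2%}")       # ~50% on random data, by design
```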
Garamszegi, László Zsolt
2011-02-01
Plasmodium parasites, the causative agents of malaria, are generally considered harmful parasites, but many of them cause only mild symptoms. Little is known about the evolutionary history and phylogenetic constraints that generate this interspecific variation in virulence, owing to uncertainties about the phylogenetic associations of the parasites. Here, to account for such phylogenetic uncertainty, phylogenetic methods based on Bayesian statistics were used in combination with sequence data from five genes to estimate the ancestral state of virulence in primate Plasmodium parasites. When recent parasites were categorised according to the damage caused to the host, Bayesian estimates of ancestral states indicated that the acquisition of a harmful host-exploitation strategy is more likely to be a recent evolutionary event than the result of an ancient change in a character state altering virulence. On the contrary, there was more evidence for moderate host exploitation having a deep origin along the phylogenetic tree. Moreover, the evolution of host severity is determined by the phylogenetic relationships of parasites, as severity gains did not appear randomly on the evolutionary tree. Such phylogenetic constraints can be mediated by the acquisition of virulence genes. As the impact of a parasite on a host is the result of both the parasite's investment in reproduction and host sensitivity, virulence was also estimated by calculating peak parasitemia after eliminating host effects. A directional random-walk evolutionary model showed that the ancestral primate malarias reproduced at very low parasitemia in their hosts. Consequently, the extreme variation in the outcome of malaria infection in different host species can be better understood in light of the phylogeny of the parasites. Copyright © 2010 Australian Society for Parasitology Inc. Published by Elsevier Ltd. All rights reserved.
MOST detects corotating bright spots on the mid-O-type giant ξ Persei
NASA Astrophysics Data System (ADS)
Ramiaramanantsoa, Tahina; Moffat, Anthony F. J.; Chené, André-Nicolas; Richardson, Noel D.; Henrichs, Huib F.; Desforges, Sébastien; Antoci, Victoria; Rowe, Jason F.; Matthews, Jaymie M.; Kuschnig, Rainer; Weiss, Werner W.; Sasselov, Dimitar; Rucinski, Slavek M.; Guenther, David B.
2014-06-01
We have used the MOST (Microvariability and Oscillations of STars) microsatellite to obtain four weeks of contiguous high-precision broad-band visual photometry of the O7.5III(n)((f)) star ξ Persei in 2011 November. This star is well known from previous work to show prominent DACs (discrete absorption components) on time-scales of about 2 d from UV spectroscopy and non-radial pulsation with one (l = 3) p-mode oscillation with a period of 3.5 h from optical spectroscopy. Our MOST-orbit (101.4 min) binned photometry fails to reveal any periodic light variations above the 0.1 mmag 3σ noise level for periods of a few hours, while several prominent Fourier peaks emerge at the 1 mmag level in the two-day period range. These longer period variations are unlikely due to pulsations, including gravity modes. From our simulations based upon a simple spot model, we deduce that we are seeing the photometric modulation of several corotating bright spots on the stellar surface. In our model, the starting times (random) and lifetimes (up to several rotations) vary from one spot to another yet all spots rotate at the same period of 4.18 d, the best-estimated rotation period of the star. This is the first convincing reported case of corotating bright spots on an O star, with important implications for drivers of the DACs (resulting from corotating interaction regions) with possible bright-spot generation via a breakout at the surface of a global magnetic field generated by a subsurface convection zone.
Discovery and refinement of muscle weight QTLs in B6 × D2 advanced intercross mice.
Carbonetto, P; Cheng, R; Gyekis, J P; Parker, C C; Blizard, D A; Palmer, A A; Lionikas, A
2014-08-15
The genes underlying variation in skeletal muscle mass are poorly understood. Although many quantitative trait loci (QTLs) have been mapped in crosses of mouse strains, the limited resolution inherent in these conventional studies has made it difficult to reliably pinpoint the causal genetic variants. The accumulated recombination events in an advanced intercross line (AIL), in which mice from two inbred strains are mated at random for several generations, can improve mapping resolution. We demonstrate these advancements in mapping QTLs for hindlimb muscle weights in an AIL (n = 832) of the C57BL/6J (B6) and DBA/2J (D2) strains, generations F8-F13. We mapped muscle weight QTLs using the high-density MegaMUGA SNP panel. The QTLs highlight the shared genetic architecture of four hindlimb muscles and suggest that the genetic contributions to muscle variation are substantially different in males and females, at least in the B6D2 lineage. Out of the 15 muscle weight QTLs identified in the AIL, nine overlapped the genomic regions discovered in an earlier B6D2 F2 intercross. Mapping resolution, however, was substantially improved in our study to a median QTL interval of 12.5 Mb. Subsequent sequence analysis of the QTL regions revealed 20 genes with nonsense or potentially damaging missense mutations. Further refinement of the muscle weight QTLs using additional functional information, such as gene expression differences between alleles, will be important for discerning the causal genes. Copyright © 2014 the American Physiological Society.
Jacq, Olivier; Arnulf, Isabelle; Similowski, Thomas; Attali, Valérie
2017-12-20
Osteopathic manipulative treatment (OMT) of the sphenopalatine ganglion (SPG) is used empirically for the treatment of rhinitis and snoring and is thought to increase pharyngeal stability. This trial was designed to study the effects of this treatment on pharyngeal stability, evaluated by the critical closing pressure, in obstructive sleep apnoea syndrome. This single-centre, randomized, crossover, double-blind study compared active manipulation and sham manipulation of the SPG. Randomization was computer-generated. Patients each received one active manipulation and one sham manipulation at an interval of 21 days and were evaluated 30 min and 48 h after each session administered by a qualified osteopath. Neither the patients nor the investigator performing the evaluations were informed of the order of the two techniques (double-blind). The primary endpoint was the percentage of responding patients presenting increased pharyngeal stability, defined as a variation of critical closing pressure (Pcrit) of at least -4 cmH2O at 30 min. Secondary endpoints were the variation of Pcrit in absolute values, sleepiness and snoring. Other endpoints were lacrimation (Schirmer's test), induced pain, and sensations experienced during OMT. Ten patients were included and nine were analysed (median [quartiles]: 57 [50; 58] years, 7 men, apnoea-hypopnoea index 31.0 [25.5; 33.2]/h). Seven patients were analysed for the primary endpoint and nine for the secondary endpoints. Five patients responded after active manipulation versus none after sham manipulation (p = 0.0209). Active manipulation induced more intense pain (p = 0.0089), increased lacrimation (ns) and more tactile, nociceptive and gustatory sensations (13 versus 1) compared to sham manipulation. No significant difference was observed for the other endpoints. Osteopathic manipulative treatment of the SPG may improve pharyngeal stability in obstructive sleep apnoea syndrome. This trial validates the feasibility of the randomized, controlled, double-blind methodology for evaluating this osteopathic treatment. Studies with a larger sample size are needed to establish efficacy on the apnoea-hypopnoea index. The study was retrospectively registered in the clinicaltrial.gov registry under reference NCT01193738 on 1 September 2010 (first inclusion May 19, 2010).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zou Pu; Zeng Zhinan; Zheng Yinghui
2010-11-15
We propose a scheme for generating isolated attosecond pulses (IAPs) via high-order harmonic generation in gases using a chirped two-color laser field of multicycle duration. In contrast to previous techniques, where a stable carrier-envelope phase (CEP) of the driving laser pulses is a prerequisite for IAP generation, the proposed scheme is robust against large variations of the CEP. We show the generation of IAPs with an intensity fluctuation of less than 50% and an intensity contrast ratio higher than 5:1 when the CEP shift is as large as 1.35π.
NASA Astrophysics Data System (ADS)
Bousserez, Nicolas; Henze, Daven; Bowman, Kevin; Liu, Junjie; Jones, Dylan; Keller, Martin; Deng, Feng
2013-04-01
This work presents improved analysis error estimates for 4D-Var systems. From operational NWP models to top-down constraints on trace gas emissions, many of today's data assimilation and inversion systems in atmospheric science rely on variational approaches. This success is due to both the mathematical clarity of these formulations and the availability of computationally efficient minimization algorithms. However, unlike Kalman Filter-based algorithms, these methods do not provide an estimate of the analysis or forecast error covariance matrices, these error statistics being propagated only implicitly by the system. From both a practical (cycling assimilation) and scientific perspective, assessing uncertainties in the solution of the variational problem is critical. For large-scale linear systems, deterministic or randomization approaches can be considered based on the equivalence between the inverse Hessian of the cost function and the covariance matrix of analysis error. For perfectly quadratic systems, like incremental 4D-Var, Lanczos/Conjugate-Gradient algorithms have proven to be most efficient in generating low-rank approximations of the Hessian matrix during the minimization. For weakly non-linear systems though, the Limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS), a quasi-Newton descent algorithm, is usually considered the best method for the minimization. Suitable for large-scale optimization, this method allows one to generate an approximation to the inverse Hessian using the latest m vector/gradient pairs generated during the minimization, m depending upon the available core memory. At each iteration, an initial low-rank approximation to the inverse Hessian has to be provided, which is called preconditioning. The ability of the preconditioner to retain useful information from previous iterations largely determines the efficiency of the algorithm. Here we assess the performance of different preconditioners to estimate the inverse Hessian of a large-scale 4D-Var system. The impact of using the diagonal preconditioners proposed by Gilbert and Le Maréchal (1989) instead of the usual Oren-Spedicato scalar will be first presented. We will also introduce new hybrid methods that combine randomization estimates of the analysis error variance with L-BFGS diagonal updates to improve the inverse Hessian approximation. Results from these new algorithms will be evaluated against standard large ensemble Monte-Carlo simulations. The methods explored here are applied to the problem of inferring global atmospheric CO2 fluxes using remote sensing observations, and are intended to be integrated with the future NASA Carbon Monitoring System.
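The inverse-Hessian approximation referred to above is conventionally applied through the L-BFGS two-loop recursion over the stored (s, y) pairs, with a scalar initial matrix as the default preconditioner. A minimal sketch, not the authors' system:

```python
# L-BFGS two-loop recursion: apply the inverse-Hessian approximation
# built from stored (s, y) pairs to a vector v. The scalar initial
# inverse Hessian plays the role of the default preconditioner.
import numpy as np

def lbfgs_apply_inverse_hessian(v, s_list, y_list):
    q = v.astype(float).copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * (s @ q)
        alphas.append(alpha)
        q -= alpha * y
    s, y = s_list[-1], y_list[-1]
    r = ((s @ y) / (y @ y)) * q                 # scalar initial preconditioner
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * (y @ r)
        r += (alpha - beta) * s
    return r

# Demo on a quadratic, where y = B s, so the recursion approximates B^-1 v.
rng = np.random.default_rng(0)
B = np.diag([1.0, 4.0, 9.0])
s_list = [rng.normal(size=3) for _ in range(5)]
y_list = [B @ s for s in s_list]
v = np.ones(3)
print(lbfgs_apply_inverse_hessian(v, s_list, y_list))  # close to B^-1 v
print(np.linalg.solve(B, v))                           # exact: [1, 0.25, 0.111]
```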
NASA Astrophysics Data System (ADS)
Keiser, Gerd; Liu, Hao-Yu; Lu, Shao-Hsi; Devi Pukhrambam, Puspa
2012-07-01
Low-cost multimode glass and plastic optical fibers are attractive for high-capacity indoor telecom networks. Many existing buildings already have glass multimode fibers installed for local area network applications. Future indoor applications will use combinations of glass multimode fibers with plastic optical fibers that have low losses in the 850-1310 nm range. This article examines real-world link losses when randomly interconnecting glass and plastic fiber segments with factory-installed connectors. Potential interconnection issues include large variations in connector losses among randomly selected fiber segments, asymmetric link losses in bidirectional links, and variations in bandwidth among different types of fibers.
Variations in Carboxyhaemoglobin Levels in Smokers
Castleden, C. M.; Cole, P. V.
1974-01-01
Three experiments on smokers were performed to determine variations in blood levels of carboxyhaemoglobin (COHb) throughout the day and night, and whether a random measurement of COHb gives a true estimate of a smoker's mean COHb level. In the individual smoker the COHb level does not increase gradually during the day but is kept within relatively narrow limits. Moderately heavy smokers wake in the morning with a substantially raised COHb level because the half-life of COHb is significantly longer during sleep than during the day. Women excrete carbon monoxide faster than men. A random COHb estimation gives a good indication of an individual's mean COHb level. PMID:4441877
Sample size calculations for the design of cluster randomized trials: A summary of methodology.
Gao, Fei; Earnest, Arul; Matchar, David B; Campbell, Michael J; Machin, David
2015-05-01
Cluster randomized trial designs are growing in popularity in, for example, cardiovascular medicine research and other clinical areas, and parallel statistical developments concerned with the design and analysis of these trials have been stimulated. Nevertheless, reviews suggest that design issues associated with cluster randomized trials are often poorly appreciated, and there remain inadequacies in, for example, describing how the trial size is determined and how the associated results are presented. In this paper, our aim is to provide pragmatic guidance for researchers on methods of calculating sample sizes. We focus attention on designs with the primary purpose of comparing two interventions with respect to continuous, binary, ordered categorical, incidence rate and time-to-event outcome variables. Issues of aggregate and non-aggregate cluster trials, adjustment for variation in cluster size, and the effect size are detailed. The problem of establishing the anticipated magnitude of between- and within-cluster variation to enable planning values of the intra-cluster correlation coefficient and the coefficient of variation is also described. Illustrative examples of calculations of trial sizes for each endpoint type are included. Copyright © 2015 Elsevier Inc. All rights reserved.
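For a continuous outcome, the calculation described reduces to the individually randomized sample size inflated by a design effect; the sketch below uses the common variable-cluster-size form of the design effect, with illustrative planning values.

```python
# Sample size per arm for comparing two means under cluster randomization:
# the usual individually-randomized n inflated by a design effect that
# allows for variation in cluster size. Planning values are illustrative.
from math import ceil
from statistics import NormalDist

def n_per_arm(delta, sigma, mean_cluster_size, icc, cv=0.0,
              alpha=0.05, power=0.80):
    z = NormalDist().inv_cdf
    n_ind = 2 * (z(1 - alpha / 2) + z(power)) ** 2 * (sigma / delta) ** 2
    deff = 1 + ((cv**2 + 1) * mean_cluster_size - 1) * icc   # design effect
    return ceil(n_ind * deff)

# e.g. detect a 0.3 SD difference, 20 patients per cluster on average,
# ICC = 0.05, cluster sizes varying with CV = 0.4:
print(n_per_arm(delta=0.3, sigma=1.0, mean_cluster_size=20, icc=0.05, cv=0.4))
```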
GASPRNG: GPU accelerated scalable parallel random number generator library
NASA Astrophysics Data System (ADS)
Gao, Shuang; Peterson, Gregory D.
2013-04-01
Graphics processors represent a promising technology for accelerating computational science applications. Many computational science applications require fast and scalable random number generation with good statistical properties, so they use the Scalable Parallel Random Number Generators library (SPRNG). We present the GPU Accelerated SPRNG library (GASPRNG) to accelerate SPRNG in GPU-based high performance computing systems. GASPRNG includes code for a host CPU and CUDA code for execution on NVIDIA graphics processing units (GPUs) along with a programming interface to support various usage models for pseudorandom numbers and computational science applications executing on the CPU, GPU, or both. This paper describes the implementation approach used to produce high performance and also describes how to use the programming interface. The programming interface allows a user to be able to use GASPRNG the same way as SPRNG on traditional serial or parallel computers as well as to develop tightly coupled programs executing primarily on the GPU. We also describe how to install GASPRNG and use it. To help illustrate linking with GASPRNG, various demonstration codes are included for the different usage models. GASPRNG on a single GPU shows up to 280x speedup over SPRNG on a single CPU core and is able to scale for larger systems in the same manner as SPRNG. Because GASPRNG generates identical streams of pseudorandom numbers as SPRNG, users can be confident about the quality of GASPRNG for scalable computational science applications.
Catalogue identifier: AEOI_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOI_v1_0.html
Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
Licensing provisions: UTK license.
No. of lines in distributed program, including test data, etc.: 167900
No. of bytes in distributed program, including test data, etc.: 1422058
Distribution format: tar.gz
Programming language: C and CUDA.
Computer: Any PC or workstation with NVIDIA GPU (Tested on Fermi GTX480, Tesla C1060, Tesla M2070).
Operating system: Linux with CUDA version 4.0 or later. Should also run on MacOS, Windows, or UNIX.
Has the code been vectorized or parallelized?: Yes. Parallelized using MPI directives.
RAM: 512 MB ~ 732 MB (main memory on host CPU, depending on the data type of random numbers) / 512 MB (GPU global memory)
Classification: 4.13, 6.5.
Nature of problem: Many computational science applications are able to consume large numbers of random numbers. For example, Monte Carlo simulations are able to consume limitless random numbers for the computation as long as resources for the computing are supported. Moreover, parallel computational science applications require independent streams of random numbers to attain statistically significant results. The SPRNG library provides this capability, but at a significant computational cost. The GASPRNG library presented here accelerates the generators of independent streams of random numbers using graphical processing units (GPUs).
Solution method: Multiple copies of random number generators in GPUs allow a computational science application to consume large numbers of random numbers from independent, parallel streams. GASPRNG is a random number generators library to allow a computational science application to employ multiple copies of random number generators to boost performance. Users can interface GASPRNG with software code executing on microprocessors and/or GPUs.
Running time: The tests provided take a few minutes to run.
Sugavanam, S; Yan, Z; Kamynin, V; Kurkov, A S; Zhang, L; Churkin, D V
2014-02-10
Multiwavelength lasing in the random distributed feedback fiber laser is demonstrated by employing an all fiber Lyot filter. Stable multiwavelength generation is obtained, with each line exhibiting sub-nanometer line-widths. A flat power distribution over multiple lines is obtained, which indicates that the power between lines is redistributed in nonlinear mixing processes. The multiwavelength generation is observed both in first and second Stokes waves.
Subjective randomness as statistical inference.
Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B
2018-06-01
Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.
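The account can be made concrete as a log-likelihood ratio. In the sketch below, "regular" generation is modelled by a repetition-biased Markov chain, an illustrative stand-in for the restricted regularity models the authors develop.

```python
# Subjective randomness as statistical evidence: log P(x | random) -
# log P(x | regular). "Regular" here is a repetition-biased Markov chain.
import math

def log_p_random(x):
    return len(x) * math.log(0.5)               # every sequence equally likely

def log_p_regular(x, p_repeat=0.8):
    lp = math.log(0.5)                          # first symbol
    for prev, cur in zip(x, x[1:]):
        lp += math.log(p_repeat if cur == prev else 1.0 - p_repeat)
    return lp

def randomness(x):
    return log_p_random(x) - log_p_regular(x)

print(f"{randomness('HHHHHHHH'):+.2f}")  # negative: better explained as regular
print(f"{randomness('HTTHTHHT'):+.2f}")  # positive: better explained as random
```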
Sakai, Norihiro; Inoue, Takaya; Kunugiza, Yasuo; Tomita, Tetsuya; Mashimo, Takashi
2013-05-01
We conducted a prospective randomized controlled trial to test whether continuous femoral nerve block (CFNB) improves attainment of 120° knee flexion compared to continuous epidural analgesia (CEA). Sixty-six patients scheduled for unilateral total knee arthroplasty were randomized into two groups receiving an infusion of 0.15% ropivacaine via either CEA or CFNB until the third postoperative day. We studied the time required to attain 120° knee flexion, variations in thigh and calf circumferences around the treated knee, pain scores, rehabilitation milestones, the need for adjuvant analgesics, and side effects. CFNB patients attained 120° knee flexion earlier and had smaller variations in thigh and calf circumferences, less pain during rehabilitation, and less need for adjuvant analgesics. CFNB is a better pain management strategy that accelerates knee flexion rehabilitation. Copyright © 2013 Elsevier Inc. All rights reserved.
Anhøj, Jacob; Olesen, Anne Vingaard
2014-01-01
A run chart is a line graph of a measure plotted over time with the median as a horizontal line. The main purpose of the run chart is to identify process improvement or degradation, which may be detected by statistical tests for non-random patterns in the data sequence. We studied the sensitivity to shifts and linear drifts in simulated processes using the shift, crossings and trend rules for detecting non-random variation in run charts. The shift and crossings rules are effective in detecting shifts and drifts in process centre over time while keeping the false signal rate constant around 5% and independent of the number of data points in the chart. The trend rule is virtually useless for detection of linear drift over time, the purpose it was intended for.
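A minimal sketch of the shift and crossings rules, with thresholds following the commonly cited formulas (longest-run limit round(log2(n)) + 3; minimum crossings from the lower fifth percentile of a binomial); details of the authors' implementation may differ.

```python
# Shift and crossings rules for run charts. Thresholds: longest-run
# limit round(log2(n)) + 3, minimum crossings from the lower 5th
# percentile of Binomial(n - 1, 0.5).
import math
from statistics import median
from scipy.stats import binom

def run_chart_signals(y):
    med = median(y)
    sides = [1 if v > med else -1 for v in y if v != med]  # drop points on median
    n = len(sides)
    longest, run = 1, 1
    for a, b in zip(sides, sides[1:]):                     # longest same-side run
        run = run + 1 if a == b else 1
        longest = max(longest, run)
    crossings = sum(a != b for a, b in zip(sides, sides[1:]))
    shift = longest >= round(math.log2(n)) + 3
    too_few_crossings = crossings < binom.ppf(0.05, n - 1, 0.5)
    return shift, too_few_crossings

print(run_chart_signals([3, 4, 3, 5, 4, 6, 7, 8, 9, 9, 10, 11]))
# (False, True): too few median crossings signal non-random variation.
```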
Copy number variation of individual cattle genomes using next-generation sequencing
USDA-ARS?s Scientific Manuscript database
Copy number variations (CNVs) affect a wide range of phenotypic traits; however, CNVs in or near segmental duplication regions are often intractable. Using a read depth approach based on next-generation sequencing, we examined genome-wide copy number differences among five taurine (three Angus, one ...
Single-shot stand-off chemical identification of powders using random Raman lasing
Hokr, Brett H.; Bixler, Joel N.; Noojin, Gary D.; Thomas, Robert J.; Rockwell, Benjamin A.; Yakovlev, Vladislav V.; Scully, Marlan O.
2014-01-01
We consider the task of identifying explosives, hazardous chemicals, and biological materials from a safe distance. Much of the prior work on stand-off spectroscopy using light has been devoted to generating a backward-propagating beam of light that can be used to drive further spectroscopic processes. The discovery of random lasing and, more recently, random Raman lasing provides a mechanism for remotely generating copious amounts of chemically specific Raman-scattered light. The bright nature of random Raman lasing renders directionality unnecessary, allowing for the detection and identification of chemicals from large distances in real time. In this article, single-shot remote identification of chemicals at kilometer-scale distances is experimentally demonstrated using random Raman lasing. PMID:25114231
Optimized random phase only holograms.
Zea, Alejandro Velez; Barrera Ramirez, John Fredy; Torroba, Roberto
2018-02-15
We propose a simple and efficient technique capable of generating Fourier phase only holograms with a reconstruction quality similar to the results obtained with the Gerchberg-Saxton (G-S) algorithm. Our proposal is to use the traditional G-S algorithm to optimize a random phase pattern for the resolution, pixel size, and target size of the general optical system without any specific amplitude data. This produces an optimized random phase (ORAP), which is used for fast generation of phase only holograms of arbitrary amplitude targets. This ORAP needs to be generated only once for a given optical system, avoiding the need for costly iterative algorithms for each new target. We show numerical and experimental results confirming the validity of the proposal.
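For reference, the Gerchberg-Saxton loop used to optimize the phase pattern alternates between the hologram and image planes, imposing the target amplitude in one and keeping phase only in the other. A minimal sketch with an illustrative square target:

```python
# Sketch of Gerchberg-Saxton phase retrieval for a Fourier phase-only
# hologram. The target and iteration count are illustrative.
import numpy as np

def gerchberg_saxton(target_amplitude, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    phase = np.exp(1j * 2 * np.pi * rng.random(target_amplitude.shape))
    for _ in range(n_iter):
        image = np.fft.ifft2(phase)                    # hologram -> image plane
        image = target_amplitude * np.exp(1j * np.angle(image))  # impose target
        holo = np.fft.fft2(image)                      # image -> hologram plane
        phase = np.exp(1j * np.angle(holo))            # keep phase only
    return np.angle(phase)                             # the phase-only hologram

target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0                             # a square as the target
hologram_phase = gerchberg_saxton(target)
recon = np.abs(np.fft.ifft2(np.exp(1j * hologram_phase)))  # reconstruction
```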
Ko, Heasin; Choi, Byung-Seok; Choe, Joong-Seon; Kim, Kap-Joong; Kim, Jong-Hoi; Youn, Chun Ju
2017-08-21
Most polarization-based BB84 quantum key distribution (QKD) systems utilize multiple lasers to generate one of four polarization quantum states randomly. However, random bit generation with multiple lasers can potentially open critical side channels that significantly endanger the security of QKD systems. In this paper, we show unnoticed side channels of temporal disparity and intensity fluctuation, which possibly exist in the operation of multiple semiconductor laser diodes. Experimental results show that these side channels can enormously degrade the security performance of QKD systems. An important system issue for the improvement of the quantum bit error rate (QBER), related to the laser driving conditions, is further addressed with experimental results.
Long-term persistence of solar activity
NASA Technical Reports Server (NTRS)
Ruzmaikin, Alexander; Feynman, Joan; Robinson, Paul
1994-01-01
We examine the question of whether or not the non-periodic variations in solar activity are caused by a white-noise, random process. The Hurst exponent, which characterizes the persistence of a time series, is evaluated for the series of C-14 data for the time interval from about 6000 BC to 1950 AD. We find a constant Hurst exponent, suggesting that solar activity in the frequency range from 100 to 3000 years includes an important continuum component in addition to the well-known periodic variations. The value we calculate, H approximately 0.8, is significantly larger than the value of 0.5 that would correspond to variations produced by a white-noise process. This value is in good agreement with the results for the monthly sunspot data reported elsewhere, indicating that the physics that produces the continuum is a correlated random process and that it is the same type of process over a wide range of time interval lengths.
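The Hurst exponent referred to here can be estimated by classical rescaled-range (R/S) analysis. The sketch below is a generic implementation of that estimator, not the authors' code; window sizes and averaging choices are assumptions.

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent of series x via rescaled-range (R/S)
    analysis; H ~ 0.5 for white noise, H > 0.5 for persistent series."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())  # cumulative deviation
            r = dev.max() - dev.min()              # range
            s = chunk.std()                        # standard deviation
            if s > 0:
                rs.append(r / s)
        sizes.append(size)
        rs_vals.append(np.mean(rs))
        size *= 2
    # H is the slope of log(R/S) against log(window size)
    h, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return h
```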
RandomSpot: A web-based tool for systematic random sampling of virtual slides.
Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E
2015-01-01
This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.
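The core of systematic random sampling is a single random offset followed by an equidistant grid, which keeps the sample unbiased while covering the region evenly. A sketch of that placement logic follows; it is illustrative only, not RandomSpot's implementation, and the rectangular region and spacing parameters are assumptions.

```python
import random

def srs_points(x0, y0, width, height, dx, dy, seed=None):
    """Systematic random sampling: one random offset, then an
    equidistant grid of points across the region of interest."""
    rng = random.Random(seed)
    ox, oy = rng.uniform(0, dx), rng.uniform(0, dy)  # single random start
    pts = []
    y = y0 + oy
    while y < y0 + height:
        x = x0 + ox
        while x < x0 + width:
            pts.append((x, y))
            x += dx
        y += dy
    return pts
```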
Direct generation of all-optical random numbers from optical pulse amplitude chaos.
Li, Pu; Wang, Yun-Cai; Wang, An-Bang; Yang, Ling-Zhen; Zhang, Ming-Jiang; Zhang, Jian-Zhong
2012-02-13
We propose and theoretically demonstrate an all-optical method for directly generating all-optical random numbers from pulse amplitude chaos produced by a mode-locked fiber ring laser. Under an appropriate pump intensity, the mode-locked laser can experience a quasi-periodic route to chaos. Such chaos consists of a stream of pulses with a fixed repetition frequency but random intensities. In this method, we require neither a sampling procedure nor externally triggered clocks, but directly quantize the chaotic pulse stream into a random number sequence via an all-optical flip-flop. Moreover, our simulation results show that the pulse amplitude chaos has no periodicity and possesses a highly symmetric distribution of amplitude. Thus, in theory, the obtained random number sequence has high-quality randomness without post-processing, as verified by industry-standard statistical tests.
Noise-Induced Synchronization among Sub-RF CMOS Analog Oscillators for Skew-Free Clock Distribution
NASA Astrophysics Data System (ADS)
Utagawa, Akira; Asai, Tetsuya; Hirose, Tetsuya; Amemiya, Yoshihito
We present on-chip oscillator arrays synchronized by random noise, aiming at skew-free clock distribution on synchronous digital systems. Nakao et al. recently reported that independent neural oscillators can be synchronized by applying temporal random impulses to the oscillators [1], [2]. We regard neural oscillators as independent clock sources on LSIs; i.e., clock sources are distributed on LSIs and forced to synchronize through the use of random noise. We designed neuron-based clock generators operating in the sub-RF region (<1 GHz) by modifying the original neuron model into a new model suitable for CMOS implementation with 0.25-μm CMOS parameters. Through circuit simulations, we demonstrate that i) the clock generators are indeed synchronized by pseudo-random noise and ii) the clock generators exhibit phase-locked oscillations even in the presence of small device mismatches.
Regression Discontinuity for Causal Effect Estimation in Epidemiology.
Oldenburg, Catherine E; Moscoe, Ellen; Bärnighausen, Till
Regression discontinuity analyses can generate estimates of the causal effects of an exposure when a continuously measured variable is used to assign the exposure to individuals based on a threshold rule. Individuals just above the threshold are expected to be similar in their distribution of measured and unmeasured baseline covariates to individuals just below the threshold, resulting in exchangeability. At the threshold exchangeability is guaranteed if there is random variation in the continuous assignment variable, e.g., due to random measurement error. Under exchangeability, causal effects can be identified at the threshold. The regression discontinuity intention-to-treat (RD-ITT) effect on an outcome can be estimated as the difference in the outcome between individuals just above (or below) versus just below (or above) the threshold. This effect is analogous to the ITT effect in a randomized controlled trial. Instrumental variable methods can be used to estimate the effect of exposure itself utilizing the threshold as the instrument. We review the recent epidemiologic literature reporting regression discontinuity studies and find that while regression discontinuity designs are beginning to be utilized in a variety of applications in epidemiology, they are still relatively rare, and analytic and reporting practices vary. Regression discontinuity has the potential to greatly contribute to the evidence base in epidemiology, in particular on the real-life and long-term effects and side-effects of medical treatments that are provided based on threshold rules - such as treatments for low birth weight, hypertension or diabetes.
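The RD-ITT estimate described above is, in its simplest form, the difference between two local regression intercepts at the cutoff. The sketch below assumes a fixed, user-supplied bandwidth and omits inference; it is a generic illustration of the estimator, not a procedure recommended by the authors.

```python
import numpy as np

def rd_itt(assign_var, outcome, threshold, bandwidth):
    """Local-linear RD-ITT sketch: fit separate lines on each side of
    the threshold within a bandwidth and take the difference of their
    values at the cutoff (bandwidth choice and inference omitted)."""
    z = np.asarray(assign_var, float) - threshold
    y = np.asarray(outcome, float)
    keep = np.abs(z) <= bandwidth
    z, y = z[keep], y[keep]
    below = z < 0
    # the intercept of each local linear fit is its limit at the cutoff
    b_lo = np.polyfit(z[below], y[below], 1)[1]
    b_hi = np.polyfit(z[~below], y[~below], 1)[1]
    return b_hi - b_lo
```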
A two-stage model of fracture of rocks
Kuksenko, V.; Tomilin, N.; Damaskinskaya, E.; Lockner, D.
1996-01-01
In this paper we propose a two-stage model of rock fracture. In the first stage, cracks or local regions of failure are uncorrelated and occur randomly throughout the rock in response to loading of pre-existing flaws. As damage accumulates in the rock, there is a gradual increase in the probability that large clusters of closely spaced cracks or local failure sites will develop. Based on statistical arguments, a critical density of damage will occur where clusters of flaws become large enough to lead to larger-scale failure of the rock (stage two). While crack interaction and cooperative failure is expected to occur within clusters of closely spaced cracks, the initial development of clusters is predicted based on the random variation in pre-existing flaw populations. Thus the onset of the unstable second stage in the model can be computed from the generation of random, uncorrelated damage. The proposed model incorporates notions of the kinetic (and therefore time-dependent) nature of the strength of solids as well as the discrete hierarchic structure of rocks and the flaw populations that lead to damage accumulation. The advantage offered by this model is that its salient features are valid for fracture processes occurring over a wide range of scales, including earthquake processes. A notion of the rank of fracture (fracture size) is introduced, and criteria are presented for both fracture nucleation and the transition of the failure process from one scale to another.
Cascaded Raman lasing in a PM phosphosilicate fiber with random distributed feedback
NASA Astrophysics Data System (ADS)
Lobach, Ivan A.; Kablukov, Sergey I.; Babin, Sergey A.
2018-02-01
We report on the first demonstration of a linearly polarized cascaded Raman fiber laser based on a simple half-open cavity with a broadband composite reflector and random distributed feedback in a polarization-maintaining phosphosilicate fiber operating beyond the zero-dispersion wavelength (~1400 nm). With increasing pump power from a Yb-doped fiber laser at 1080 nm, the random laser subsequently generates 8 W at 1262 nm and 9 W at 1515 nm with a polarization extinction ratio of 27 dB. The generation linewidths amount to about 1 nm and 3 nm, respectively, and are almost independent of power, in correspondence with the theory of cascaded random lasing.
Random phase encoding for optical security
NASA Astrophysics Data System (ADS)
Wang, RuiKang K.; Watson, Ian A.; Chatwin, Christopher R.
1996-09-01
A new optical encoding method for security applications is proposed. The encoded image (encrypted into the security products) is merely a random phase image statistically and randomly generated by a random number generator using a computer, which contains no information from the reference pattern (stored for verification) or the frequency plane filter (a phase-only function for decoding). The phase function in the frequency plane is obtained using a modified phase retrieval algorithm. The proposed method uses two phase-only functions (images) at both the input and frequency planes of the optical processor leading to maximum optical efficiency. Computer simulation shows that the proposed method is robust for optical security applications.
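To make the two-phase-only-function architecture concrete, the sketch below simulates one verification pass through a 4f processor: a phase-only encoded image at the input plane and a phase-only filter at the frequency plane. In the paper the filter comes from the modified phase-retrieval algorithm; here both phase arrays are simply supplied by the caller, so this is a structural illustration only.

```python
import numpy as np

def verify(encoded_phase, filter_phase):
    """One pass through a 4f processor with phase-only functions at the
    input and frequency planes (filter construction not reproduced)."""
    field = np.exp(1j * encoded_phase)                          # input plane
    spectrum = np.fft.fft2(field) * np.exp(1j * filter_phase)   # frequency plane
    output = np.fft.ifft2(spectrum)
    # the output intensity would be compared against the stored reference
    return np.abs(output) ** 2
```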
Multi-peak structure of generation spectrum of random distributed feedback fiber Raman lasers.
Vatnik, I D; Zlobina, E A; Kablukov, S I; Babin, S A
2017-02-06
We study spectral features of the generation of a random distributed feedback fiber Raman laser arising from the two-peak shape of the Raman gain spectral profile realized in germanosilicate fibers. We demonstrate that the number of peaks can be calculated using a power balance model that considers different subcomponents within each Stokes component.
Random benzotrithiophene-based donor-acceptor copolymers for efficient organic photovoltaic devices.
Nielsen, Christian B; Ashraf, Raja Shahid; Schroeder, Bob C; D'Angelo, Pasquale; Watkins, Scott E; Song, Kigook; Anthopoulos, Thomas D; McCulloch, Iain
2012-06-14
A series of benzotrithiophene-containing random terpolymers for polymer solar cells is reported. Through variations of the two other components in the terpolymers, the absorption profile and the frontier energy levels are optimized and maximum power conversion efficiencies are nearly doubled (5.14%) relative to the parent alternating copolymer.
Physical Principle for Generation of Randomness
NASA Technical Reports Server (NTRS)
Zak, Michail
2009-01-01
A physical principle (more precisely, a principle that incorporates mathematical models used in physics) has been conceived as the basis of a method of generating randomness in Monte Carlo simulations. The principle eliminates the need for conventional random-number generators. The Monte Carlo simulation method is among the most powerful computational methods for solving high-dimensional problems in physics, chemistry, economics, and information processing. The Monte Carlo simulation method is especially effective for solving problems in which computational complexity increases exponentially with dimensionality. The main advantage of the Monte Carlo simulation method over other methods is that the demand on computational resources becomes independent of dimensionality. As augmented by the present principle, the Monte Carlo simulation method becomes an even more powerful computational method that is especially useful for solving problems associated with dynamics of fluids, planning, scheduling, and combinatorial optimization. The present principle is based on coupling of dynamical equations with the corresponding Liouville equation. The randomness is generated by non-Lipschitz instability of dynamics triggered and controlled by feedback from the Liouville equation. (In non-Lipschitz dynamics, the derivatives of solutions of the dynamical equations are not required to be bounded.)
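The flavor of non-Lipschitz instability can be seen in a toy one-dimensional example; this illustrates the general phenomenon, not Zak's Liouville-feedback construction. For dx/dt = x^(1/3) the rest state x = 0 is a singular point where the right-hand side is non-Lipschitz, so an arbitrarily small perturbation selects one of two diverging branches, effectively producing a random sign.

```python
import numpy as np

def non_lipschitz_sample(steps=1000, dt=1e-3, eps=1e-12, seed=None):
    """Toy illustration of non-Lipschitz instability: an infinitesimal
    perturbation of x = 0 is amplified into a macroscopic +1/-1 outcome."""
    rng = np.random.default_rng(seed)
    x = eps * rng.standard_normal()  # infinitesimal trigger
    for _ in range(steps):
        # sign/abs form keeps the cube root real for negative x
        x += dt * np.sign(x) * abs(x) ** (1.0 / 3.0)
    return np.sign(x)  # effectively a random bit
```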
Resonance energy transfer process in nanogap-based dual-color random lasing
NASA Astrophysics Data System (ADS)
Shi, Xiaoyu; Tong, Junhua; Liu, Dahe; Wang, Zhaona
2017-04-01
The resonance energy transfer (RET) process between Rhodamine 6G and oxazine in nanogap-based random systems is systematically studied by revealing the variations and fluctuations of RET coefficients with pump power density. Three working regions (stable fluorescence, dynamic laser, and stable laser) are thus demonstrated in the dual-color random systems. The stable RET coefficients in the fluorescence and lasing regions are generally different and greatly dependent on the donor concentration and the donor-acceptor ratio. These results may provide a way to reveal the regularities of energy distribution in random systems and to design tunable multi-color coherent random lasers for colorful imaging.
Subtyping cognitive profiles in Autism Spectrum Disorder using a Functional Random Forest algorithm.
Feczko, E; Balba, N M; Miranda-Dominguez, O; Cordova, M; Karalunas, S L; Irwin, L; Demeter, D V; Hill, A P; Langhorst, B H; Grieser Painter, J; Van Santen, J; Fombonne, E J; Nigg, J T; Fair, D A
2018-05-15
DSM-5 Autism Spectrum Disorder (ASD) comprises a set of neurodevelopmental disorders characterized by deficits in social communication and interaction and repetitive behaviors or restricted interests, and may both affect and be affected by multiple cognitive mechanisms. This study attempts to identify and characterize cognitive subtypes within the ASD population using our Functional Random Forest (FRF) machine learning classification model. This model trained a traditional random forest model on measures from seven tasks that reflect multiple levels of information processing. 47 ASD diagnosed and 58 typically developing (TD) children between the ages of 9 and 13 participated in this study. Our RF model was 72.7% accurate, with 80.7% specificity and 63.1% sensitivity. Using the random forest model, the FRF then measures the proximity of each subject to every other subject, generating a distance matrix between participants. This matrix is then used in a community detection algorithm to identify subgroups within the ASD and TD groups, and revealed 3 ASD and 4 TD putative subgroups with unique behavioral profiles. We then examined differences in functional brain systems between diagnostic groups and putative subgroups using resting-state functional connectivity magnetic resonance imaging (rsfcMRI). Chi-square tests revealed a significantly greater number of between group differences (p < .05) within the cingulo-opercular, visual, and default systems as well as differences in inter-system connections in the somato-motor, dorsal attention, and subcortical systems. Many of these differences were primarily driven by specific subgroups suggesting that our method could potentially parse the variation in brain mechanisms affected by ASD. Copyright © 2017. Published by Elsevier Inc.
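The proximity step of the FRF pipeline can be reproduced in outline with a standard random forest: two participants are "close" when many trees route them to the same leaf. The sketch below uses scikit-learn and is a generic reconstruction of that step, not the authors' code; the community-detection stage on the resulting distance matrix is left out.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def rf_distance(X, y, n_trees=500, seed=0):
    """proximity(i, j) = fraction of trees in which samples i and j land
    in the same leaf; the returned matrix is 1 - proximity. Note the
    broadcast comparison is O(n^2 * n_trees) in memory."""
    rf = RandomForestClassifier(n_estimators=n_trees, random_state=seed)
    rf.fit(X, y)
    leaves = rf.apply(X)  # (n_samples, n_trees) leaf indices
    prox = (leaves[:, None, :] == leaves[None, :, :]).mean(axis=2)
    return 1.0 - prox     # distance matrix for community detection
```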
Conditional Random Fields for Fast, Large-Scale Genome-Wide Association Studies
Huang, Jim C.; Meek, Christopher; Kadie, Carl; Heckerman, David
2011-01-01
Understanding the role of genetic variation in human diseases remains an important problem to be solved in genomics. An important component of such variation consists of variations at single sites in DNA, or single nucleotide polymorphisms (SNPs). Typically, the problem of associating particular SNPs to phenotypes has been confounded by hidden factors such as the presence of population structure, family structure or cryptic relatedness in the sample of individuals being analyzed. Such confounding factors lead to a large number of spurious associations and missed associations. Various statistical methods have been proposed to account for these confounding factors, such as linear mixed-effect models (LMMs) or methods that adjust data based on a principal components analysis (PCA), but these methods either suffer from low power or cease to be tractable for larger numbers of individuals in the sample. Here we present a statistical model for conducting genome-wide association studies (GWAS) that accounts for such confounding factors. Our method's runtime scales quadratically with the number of individuals being studied, with only a modest loss in statistical power compared to LMM-based and PCA-based methods when testing on synthetic data generated from a generalized LMM. Applying our method to both real and synthetic human genotype/phenotype data, we demonstrate the ability of our model to correct for confounding factors while requiring significantly less runtime relative to LMMs. We have implemented methods for fitting these models, which are available at http://www.microsoft.com/science. PMID:21765897
von Borries, Rainer; Johnson, Warren E.; Franklin, William L.; Marín, Juan C.
2014-01-01
Fifteen guanacos were introduced to Staats Island in the Falklands/Malvinas archipelago from Patagonia in the 1930s. Twenty-five years later, the population was culled from 300 to 10–20 individuals, but quickly rebounded to a population of almost 400 animals that today retain the genetic signature of the founding event and later bottleneck. The goals of this study were to (i) make a genetic assessment of this island population through comparisons with mainland populations and simulations, and (ii) assess the likely source-population of the introduced guanacos. Genetic variation was estimated from 513 bp of mitochondrial DNA sequence and 15 microsatellite loci among 154 guanacos collected from eight localities, including the adjacent mainland and the islands of Tierra del Fuego and Staats Island. Of the 23 haplotypes observed among our samples, the Staats Island population only contained three haplotypes, all of which were shared with the coastal Monte Leon population in southern Patagonia. Mitochondrial DNA and microsatellite variations on Staats Island were comparable to most mainland populations and greater than those observed on Tierra del Fuego. Patterns of genetic structure suggest that the Staats Island guanaco population was founded with animals from southern Patagonia (as opposed to northern Patagonia or Tierra del Fuego), but that effective reductions in population size lasted only a few generations and that surviving animals were a random sample of the pre-bottleneck genetic variation. PMID:24663026
Nonconvex model predictive control for commercial refrigeration
NASA Astrophysics Data System (ADS)
Gybel Hovgaard, Tobias; Boyd, Stephen; Larsen, Lars F. S.; Bagterp Jørgensen, John
2013-08-01
We consider the control of a commercial multi-zone refrigeration system, which consists of several cooling units that share a common compressor and is used to cool multiple areas or rooms. In each time period we choose the cooling capacity for each unit and a common evaporation temperature. The goal is to minimise the total energy cost, using real-time electricity prices, while obeying temperature constraints on the zones. We propose a variation on model predictive control to achieve this goal. When the right variables are used, the dynamics of the system are linear, and the constraints are convex. The cost function, however, is nonconvex due to the temperature dependence of thermodynamic efficiency. To handle this nonconvexity we propose a sequential convex optimisation method, which typically converges in 5 or so iterations. We employ a fast convex quadratic programming solver to carry out the iterations, which is more than fast enough to run in real time. We demonstrate our method on a realistic model, with a full-year simulation and 15-minute time periods, using historical electricity prices and weather data, as well as random variations in thermal load. These simulations show substantial cost savings, on the order of 30%, compared to a standard thermostat-based control system. Perhaps more importantly, we see that the method exhibits a sophisticated response to real-time variations in electricity prices. This demand response is critical to help balance real-time uncertainties in generation capacity associated with large penetration of intermittent renewable energy sources in a future smart grid.
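The sequential convex optimisation idea can be illustrated with a toy problem in which a nonconvex cost term (here sin(x), standing in for the temperature-dependent efficiency) is linearised around the previous iterate and the resulting convex problem is re-solved. The sketch uses cvxpy; the variable bounds and cost terms are invented for illustration and are not the paper's refrigeration model.

```python
import numpy as np
import cvxpy as cp

n = 5                 # toy decision vector (e.g. capacities per zone)
x_k = np.zeros(n)     # initial iterate
for _ in range(5):    # the paper reports convergence in about 5 iterations
    x = cp.Variable(n)
    convex_part = cp.sum_squares(x - 1.0)  # stand-in convex stage cost
    # linearise the nonconvex term sin(x) around the previous iterate x_k
    lin_nonconvex = float(np.sum(np.sin(x_k))) + np.cos(x_k) @ (x - x_k)
    prob = cp.Problem(cp.Minimize(convex_part + lin_nonconvex),
                      [x >= 0, x <= 2])     # stand-in bounds
    prob.solve()      # each iteration is a fast convex QP
    x_k = x.value
print(x_k)
```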
González, Benito A; Orozco-Terwengel, Pablo; von Borries, Rainer; Johnson, Warren E; Franklin, William L; Marín, Juan C
2014-01-01
Fifteen guanacos were introduced to Staats Island in the Falklands/Malvinas archipelago from Patagonia in the 1930s. Twenty five years later, the population was culled from 300 to 10-20 individuals, but quickly rebounded to a population of almost 400 animals that today retain the genetic signature of the founding event and later bottleneck. The goals of this study were to (i) make a genetic assessment of this island population through comparisons with mainland populations and simulations, and (ii) assess the likely source-population of the introduced guanacos. Genetic variation was estimated from 513 bp of mitochondrial DNA sequence and 15 microsatellite loci among 154 guanacos collected from eight localities, including the adjacent mainland and the islands of Tierra del Fuego and Staats Island. Of the 23 haplotypes observed among our samples, the Staats Island population only contained three haplotypes, all of which were shared with the coastal Monte Leon population in southern Patagonia. Mitochondrial DNA and microsatellite variations on Staats Island were comparable to most mainland populations and greater than those observed on Tierra del Fuego. Patterns of genetic structure suggest that the Staats Island guanaco population was founded with animals from southern Patagonia (as opposed to northern Patagonia or Tierra del Fuego), but that effective reductions in population size lasted only a few generations and that surviving animals were a random sample of the pre-bottleneck genetic variation.
ERIC Educational Resources Information Center
Raudenbush, Stephen W.; Bloom, Howard S.
2015-01-01
The present paper, which is intended for a diverse audience of evaluation researchers, applied social scientists, and research funders, provides a broad overview of the conceptual and statistical issues involved in using multisite randomized trials to learn "about" and "from" variation in program effects across…
Variation in Local-Scale Edge Effects: Mechanisms and landscape Context
Therese M. Donovan; Peter W. Jones; Elizabeth M. Annand; Frank R. Thompson III
1997-01-01
Ecological processes near habitat edges often differ from processes away from edges. Yet, the generality of "edge effects" has been hotly debated because results vary tremendously. To understand the factors responsible for this variation, we described nest predation and cowbird distribution patterns in forest edge and forest core habitats on 36 randomly...
First in the Class? Age and the Education Production Function
ERIC Educational Resources Information Center
Cascio, Elizabeth U.; Schanzenbach, Diane Whitmore
2016-01-01
We estimate the effects of relative age in kindergarten using data from an experiment where children of the same age were randomly assigned to different kindergarten classmates. We exploit the resulting experimental variation in relative age in conjunction with variation in expected kindergarten entry age based on birth date to account for…
A rapid random-sampling method was used to relate densities of juvenile winter flounder to multiple scales of habitat variation in Narragansett Bay and two nearby coastal lagoons in Rhode Island. We used a 1-m beam trawl with attached video camera, continuous GPS track overlay, ...
Crossing the Threshold: Bringing Biological Variation to the Foreground
ERIC Educational Resources Information Center
Batzli, Janet M.; Knight, Jennifer K.; Hartley, Laurel M.; Maskiewicz, April Cordero; Desy, Elizabeth A.
2016-01-01
Threshold concepts have been referred to as "jewels in the curriculum": concepts that are key to competency in a discipline but not taught explicitly. In biology, researchers have proposed the idea of threshold concepts that include such topics as variation, randomness, uncertainty, and scale. In this essay, we explore how the notion of…
Assessment of spatial variation of risks in small populations.
Riggan, W B; Manton, K G; Creason, J P; Woodbury, M A; Stallard, E
1991-01-01
Often environmental hazards are assessed by examining the spatial variation of disease-specific mortality or morbidity rates. These rates, when estimated for small local populations, can have a high degree of random variation or uncertainty associated with them. If those rate estimates are used to prioritize environmental clean-up actions or to allocate resources, then those decisions may be influenced by this high degree of uncertainty. Unfortunately, the effect of this uncertainty is not to add "random noise" into the decision-making process, but to systematically bias action toward the smallest populations where uncertainty is greatest and where extreme high and low rate deviations are most likely to be manifest by chance. We present a statistical procedure for adjusting rate estimates for differences in variability due to differentials in local area population sizes. Such adjustments produce rate estimates for areas that have better properties than the unadjusted rates for use in making statistically based decisions about the entire set of areas. Examples are provided for county variation in bladder, stomach, and lung cancer mortality rates for U.S. white males for the period 1970 to 1979. PMID:1820268
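The kind of adjustment described, pulling unstable small-area rates toward a global rate in proportion to their sampling uncertainty, can be sketched with a simple method-of-moments shrinkage estimator. This is a generic illustration of the idea, not the paper's specific procedure.

```python
import numpy as np

def shrink_rates(events, population):
    """Stabilise small-area rates: each crude rate is pulled toward the
    overall rate by a weight that grows as the local population shrinks
    (empirical-Bayes-style sketch; the paper's method differs in detail)."""
    events = np.asarray(events, float)
    population = np.asarray(population, float)
    crude = events / population
    overall = events.sum() / population.sum()
    var_within = overall / population  # Poisson sampling variance per area
    # method-of-moments between-area variance, floored at zero
    var_between = max(np.var(crude) - var_within.mean(), 0.0)
    w = var_between / (var_between + var_within)  # reliability weight
    return w * crude + (1 - w) * overall
```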
Usher, Kim; Park, Tanya; Foster, Kim; Buettner, Petra
2013-07-01
To test the effect of a nurse-led intervention on weight gain in people with serious mental illness who are prescribed and taking second-generation antipsychotic medication. Weight gain and obesity have reached epidemic proportions in the general population, with the prevalence of Metabolic Syndrome reaching 20-25% of the global population. People with serious mental illness are at even higher risk, particularly those taking second-generation antipsychotic medication. An experimental randomized controlled trial was undertaken. The control group received a 12-week healthy lifestyle booklet. In addition to the booklet, the intervention group received weekly nutrition and exercise education, exercise sessions, and nurse support. Participants (n = 101) were assessed at baseline and 12 weeks. Data were collected between March 2008 and December 2010. Seven outcome measures were used: body measurements included girth (cm), weight (kg), height (cm), and body mass index (kg/m(2)); questionnaires included the medication compliance questionnaire, the Drug Attitude Inventory, the Liverpool University Neuroleptic Side Effect Rating Scale, and the Medical Outcomes Study Short Form 36. Differences in primary outcome measures between baseline and 12-week follow-up were compared between intervention and control groups using standard bivariate statistical tests. The study was conducted between 2008 and 2010. The comparison of outcome measures between the control group (n = 50) and the intervention group (n = 51) revealed no statistically significant differences. The intervention group (n = 51) had a mean weight change of -0·74 kg at 12 weeks, while the control group (n = 50) had a mean weight change of -0·17 kg at 12 weeks; these results were not statistically significant. © 2012 Blackwell Publishing Ltd.
Identifying uniformly mutated segments within repeats.
Sahinalp, S Cenk; Eichler, Evan; Goldberg, Paul; Berenbrink, Petra; Friedetzky, Tom; Ergun, Funda
2004-12-01
Given a long string of characters from a constant-size alphabet, we present an algorithm to determine whether its characters have been generated by a single i.i.d. random source. More specifically, consider all possible n-coin models for generating a binary string S, where each bit of S is generated via an independent toss of one of the n coins in the model. The choice of which coin to toss is decided by a random walk on the set of coins, where the probability of a coin change is much lower than the probability of using the same coin repeatedly. We present a procedure to evaluate the likelihood of an n-coin model for a given S, subject to a uniform prior distribution over the parameters of the model (which represent mutation rates and probabilities of copying events). In the absence of detailed prior knowledge of these parameters, the algorithm can be used to determine whether the a posteriori probability for n=1 is higher than for any other n>1. Our algorithm runs in time O(l^4 log l), where l is the length of S, through a dynamic programming approach which exploits the assumed convexity of the a posteriori probability for n. Our test can be used in the analysis of long alignments between pairs of genomic sequences in a number of ways. For example, functional regions in genome sequences exhibit much lower mutation rates than non-functional regions. Because our test provides a means for determining variations in the mutation rate, it may be used to distinguish functional regions from non-functional ones. Another application is in determining whether two highly similar, and thus evolutionarily related, genome segments are the result of a single copy event or of a complex series of copy events. This is particularly an issue in evolutionary studies of genome regions rich in repeat segments (especially tandemly repeated segments).
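For fixed parameters, the likelihood of an n-coin model is exactly the forward-algorithm likelihood of a hidden Markov model over coins. The sketch below evaluates it for n = 2 with assumed heads-probabilities and switch rate; the paper instead integrates over these parameters under a uniform prior, which this sketch does not attempt.

```python
import numpy as np

def loglik_two_coins(bits, p_heads=(0.2, 0.8), p_switch=0.01):
    """Forward algorithm for a 2-coin model: each bit is emitted by the
    current coin, and the coin changes with small probability p_switch.
    `bits` is an iterable of 0/1 integers; parameters are illustrative."""
    p = np.array(p_heads)
    trans = np.array([[1 - p_switch, p_switch],
                      [p_switch, 1 - p_switch]])
    alpha = np.full(2, 0.5)  # uniform prior over which coin starts
    loglik = 0.0
    for b in bits:
        emit = p if b == 1 else 1 - p
        alpha = emit * (trans.T @ alpha)
        norm = alpha.sum()
        loglik += np.log(norm)  # accumulate log-likelihood with scaling
        alpha /= norm
    return loglik
```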
NASA Astrophysics Data System (ADS)
Liang, G.; Wilcox, K.; Rudgers, J.; Litvak, M. E.; Newsome, S. D.; Collins, S. L.; Pockman, W.; Luo, Y.
2017-12-01
Altered amounts and increased interannual variation of precipitation are likely to occur on regional to global scales in the late 21st century, yet understanding of the interactive effects of these changes on ecosystem processes is limited. Here, we modeled the responses of the carbon cycle in a desert grassland at the Sevilleta National Wildlife Refuge (SEV) to changes in precipitation amount and interannual variation using the Terrestrial Ecosystem Model. After model calibration, we generated 100 years of hourly weather data by repeated random sampling of observed hourly weather data at SEV from 2000 to 2012. We then modified this 100-year time series to create six climate scenarios: (1) ambient (AMB); (2) air temperature increased by 4.3 °C in summer and 3.3 °C in other seasons for each year (IT); (3) precipitation amount decreased by 20% for every event (DP); (4) combined IT and DP (DPT); (5) precipitation interannual variance increased by 100% without changing the mean (IV); (6) combined IT and IV (IVT). Our results showed that IV significantly increased the sensitivity of NPP to continuous extreme drought. In addition, the increased number of extreme drought years caused by IV exacerbated the negative influence of individual extreme drought events on soil organic carbon (SOC). The IV climate scenario showed the highest interannual variance of carbon fluxes and SOC, but increased temperature reduced this variance. DP and IV decreased NPP by 10.7% and 18.3% compared with AMB, respectively, and the negative impact of IV on NPP was more severe than that of DP. Our results indicate that increased interannual variation in precipitation could have impacts on terrestrial ecosystems that are more severe than those of the predicted decrease in average annual precipitation.
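The IV manipulation, doubling interannual variance while holding the mean fixed, amounts to rescaling annual anomalies by sqrt(2). A sketch of that transformation on annual totals is shown below; the study applied its scenarios to resampled hourly series, and the clipping at zero is our own simplification, which can perturb the mean slightly.

```python
import numpy as np

def inflate_interannual_variance(annual_precip, factor=2.0):
    """Rescale annual anomalies so interannual variance increases by
    `factor` while the long-term mean is (nearly) unchanged."""
    p = np.asarray(annual_precip, float)
    mean = p.mean()
    scaled = mean + np.sqrt(factor) * (p - mean)
    # precipitation cannot be negative; clipping may shift the mean a little
    return np.clip(scaled, 0.0, None)
```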
Autonomous Byte Stream Randomizer
NASA Technical Reports Server (NTRS)
Paloulian, George K.; Woo, Simon S.; Chow, Edward T.
2013-01-01
Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, data transmission has to be efficient, without any redundant or exuberant metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security, requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity, unreadable in its own right, but when combined with all N pieces it can be reconstructed back into one. Reconstruction requires possession of the key used for randomizing the bytes, leading to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. This software is a cornerstone capability, possessing the ability to generate the same cryptographically secure sequence on different machines and time intervals, thus allowing this software to be used more heavily in net-centric environments where data transfer bandwidth is limited.
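A minimal sketch of the seeded, reversible Fisher-Yates step is shown below. Python's random.Random stands in for the cryptographically secure generator the software actually requires, so treat this strictly as an illustration of the shuffle-and-reconstruct logic, not of the flight software itself.

```python
import random

def randomize(data: bytes, seed: int) -> bytes:
    """In-place Fisher-Yates shuffle of a byte stream driven by a seeded
    PRNG (random.Random is NOT cryptographically secure)."""
    buf = bytearray(data)
    rng = random.Random(seed)
    for i in range(len(buf) - 1, 0, -1):
        j = rng.randint(0, i)
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

def derandomize(data: bytes, seed: int) -> bytes:
    """Reverse the shuffle by replaying the same random sequence and
    undoing the swaps in reverse order."""
    rng = random.Random(seed)
    swaps = [(i, rng.randint(0, i)) for i in range(len(data) - 1, 0, -1)]
    buf = bytearray(data)
    for i, j in reversed(swaps):
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

assert derandomize(randomize(b"payload", 42), 42) == b"payload"
```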
Vincenzi, Simone; Mangel, Marc; Crivelli, Alain J; Munch, Stephan; Skaug, Hans J
2014-09-01
The differences in demographic and life-history processes between organisms living in the same population have important consequences for ecological and evolutionary dynamics. Modern statistical and computational methods allow the investigation of individual and shared (among homogeneous groups) determinants of the observed variation in growth. We use an Empirical Bayes approach to estimate individual and shared variation in somatic growth using a von Bertalanffy growth model with random effects. To illustrate the power and generality of the method, we consider two populations of marble trout Salmo marmoratus living in Slovenian streams, where individually tagged fish have been sampled for more than 15 years. We use year-of-birth cohort, population density during the first year of life, and individual random effects as potential predictors of the von Bertalanffy growth function's parameters k (rate of growth) and L∞ (asymptotic size). Our results showed that size ranks were largely maintained throughout marble trout lifetime in both populations. According to the Akaike Information Criterion (AIC), the best models showed different growth patterns for year-of-birth cohorts as well as the existence of substantial individual variation in growth trajectories after accounting for the cohort effect. For both populations, models including density during the first year of life showed that growth tended to decrease with increasing population density early in life. Model validation showed that predictions of individual growth trajectories using the random-effects model were more accurate than predictions based on mean size-at-age of fish.
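The model underlying this analysis is the von Bertalanffy growth function L(t) = L_inf * (1 - exp(-k * (t - t0))), with k and L_inf varying across individuals. The sketch below simulates such individual variation around shared (e.g. cohort-level) means; all numerical values are invented for illustration and are not the marble trout estimates.

```python
import numpy as np

def vb_length(age, k, L_inf, t0=0.0):
    """von Bertalanffy growth function: L(t) = L_inf * (1 - exp(-k (t - t0)))."""
    return L_inf * (1.0 - np.exp(-k * (age - t0)))

# Each fish draws its own k and L_inf around shared means, mimicking a
# random-effects structure (parameter values are illustrative only).
rng = np.random.default_rng(42)
n_fish, ages = 20, np.arange(0, 10)
k_i = rng.lognormal(np.log(0.35), 0.15, n_fish)       # growth rates
Linf_i = rng.lognormal(np.log(320.0), 0.10, n_fish)   # asymptotic sizes (mm)
trajectories = np.array([vb_length(ages, k, L) for k, L in zip(k_i, Linf_i)])
```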
The potential of shifting recombination hotspots to increase genetic gain in livestock breeding.
Gonen, Serap; Battagin, Mara; Johnston, Susan E; Gorjanc, Gregor; Hickey, John M
2017-07-04
This study uses simulation to explore and quantify the potential effect of shifting recombination hotspots on genetic gain in livestock breeding programs. We simulated three scenarios that differed in the locations of quantitative trait nucleotides (QTN) and recombination hotspots in the genome. In scenario 1, QTN were randomly distributed along the chromosomes and recombination was restricted to occur within specific genomic regions (i.e. recombination hotspots). In the other two scenarios, both QTN and recombination hotspots were located in specific regions, but differed in whether the QTN occurred outside of (scenario 2) or inside (scenario 3) recombination hotspots. We split each chromosome into 250, 500 or 1000 regions per chromosome of which 10% were recombination hotspots and/or contained QTN. The breeding program was run for 21 generations of selection, after which recombination hotspot regions were kept the same or were shifted to adjacent regions for a further 80 generations of selection. We evaluated the effect of shifting recombination hotspots on genetic gain, genetic variance and genic variance. Our results show that shifting recombination hotspots reduced the decline of genetic and genic variance by releasing standing allelic variation in the form of new allele combinations. This in turn resulted in larger increases in genetic gain. However, the benefit of shifting recombination hotspots for increased genetic gain was only observed when QTN were initially outside recombination hotspots. If QTN were initially inside recombination hotspots then shifting them decreased genetic gain. Shifting recombination hotspots to regions of the genome where recombination had not occurred for 21 generations of selection (i.e. recombination deserts) released more of the standing allelic variation available in each generation and thus increased genetic gain. However, whether and how much increase in genetic gain was achieved by shifting recombination hotspots depended on the distribution of QTN in the genome, the number of recombination hotspots and whether QTN were initially inside or outside recombination hotspots. Our findings show future scope for targeted modification of recombination hotspots e.g. through changes in zinc-finger motifs of the PRDM9 protein to increase genetic gain in production species.
Areas V1 and V2 show microsaccade-related 3-4-Hz covariation in gamma power and frequency.
Lowet, E; Roberts, M J; Bosman, C A; Fries, P; De Weerd, P
2016-05-01
Neuronal gamma-band synchronization (25-80 Hz) in visual cortex appears sustained and stable during prolonged visual stimulation when investigated with conventional averages across trials. However, recent studies in macaque visual cortex have used single-trial analyses to show that both the power and the frequency of gamma oscillations exhibit substantial moment-by-moment variation. This has raised the question of whether these apparently random variations might limit the functional role of gamma-band synchronization in neural processing. Here, we studied the moment-by-moment variation in gamma oscillation power and frequency, as well as inter-areal gamma synchronization, by simultaneously recording local field potentials in V1 and V2 of two macaque monkeys. We additionally analyzed electrocorticographic V1 data from a third monkey. Our analyses confirm that gamma-band synchronization is not stationary and sustained but undergoes moment-by-moment variations in power and frequency. However, those variations are neither random nor an obstacle to neural communication. Instead, the gamma power and frequency variations are highly structured, shared between areas, and shaped by a microsaccade-related 3-4-Hz theta rhythm. Our findings provide experimental support for the suggestion that cross-frequency coupling might structure and facilitate the information flow between brain regions. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
Educational Engagement and Early Family Formation: Differences by Ethnicity and Generation
ERIC Educational Resources Information Center
Glick, Jennifer E.; Ruf, Stacey D.; White, Michael J.; Goldscheider, Frances K.
2006-01-01
This paper examines how school engagement influences the timing of family formation for youth. We pay particular attention to variation across four racial/ethnic groups and by generation status, variation that reflects the diversification of U.S. society through immigration. Using data from the National Education Longitudinal Study (NELS), we…
Introductory Biology Students' Conceptual Models and Explanations of the Origin of Variation
ERIC Educational Resources Information Center
Bray Speth, Elena; Shaw, Neil; Momsen, Jennifer; Reinagel, Adam; Le, Paul; Taqieddin, Ranya; Long, Tammy
2014-01-01
Mutation is the key molecular mechanism generating phenotypic variation, which is the basis for evolution. In an introductory biology course, we used a model-based pedagogy that enabled students to integrate their understanding of genetics and evolution within multiple case studies. We used student-generated conceptual models to assess…