Sample records for completely random distribution

  1. fixedTimeEvents: An R package for the distribution of distances between discrete events in fixed time

    NASA Astrophysics Data System (ADS)

    Liland, Kristian Hovde; Snipen, Lars

    When a series of Bernoulli trials occur within a fixed time frame or limited space, it is often interesting to assess if the successful outcomes have occurred completely at random, or if they tend to group together. One example, in genetics, is detecting grouping of genes within a genome. Approximations of the distribution of successes are possible, but they become inaccurate for small sample sizes. In this article, we describe the exact distribution of time between random, non-overlapping successes in discrete time of fixed length. A complete description of the probability mass function, the cumulative distribution function, mean, variance and recurrence relation is included. We propose an associated test for the over-representation of short distances and illustrate the methodology through relevant examples. The theory is implemented in an R package including probability mass, cumulative distribution, quantile function, random number generator, simulation functions, and functions for testing.
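
    A minimal Python sketch of the central quantity here — the distances between successes in a Bernoulli series of fixed length — estimated by Monte Carlo rather than by the exact formulas the package implements; the series length, success probability, and trial count below are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(0)

      def distance_pmf(n, p, trials=20000):
          # Distances between consecutive successes in Bernoulli(p)
          # series of fixed length n, pooled over many series.
          gaps = []
          for _ in range(trials):
              hits = np.flatnonzero(rng.random(n) < p)
              gaps.extend(np.diff(hits))
          values, counts = np.unique(gaps, return_counts=True)
          return values, counts / counts.sum()

      values, pmf = distance_pmf(n=100, p=0.05)
      # An excess of short distances would suggest clustering of successes.
      print(values[:5], pmf[:5].round(4))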

  2. Human X-chromosome inactivation pattern distributions fit a model of genetically influenced choice better than models of completely random choice

    PubMed Central

    Renault, Nisa K E; Pritchett, Sonja M; Howell, Robin E; Greer, Wenda L; Sapienza, Carmen; Ørstavik, Karen Helene; Hamilton, David C

    2013-01-01

    In eutherian mammals, one X-chromosome in every XX somatic cell is transcriptionally silenced through the process of X-chromosome inactivation (XCI). Females are thus functional mosaics, where some cells express genes from the paternal X, and the others from the maternal X. The relative abundance of the two cell populations (X-inactivation pattern, XIP) can have significant medical implications for some females. In mice, the ‘choice' of which X to inactivate, maternal or paternal, in each cell of the early embryo is genetically influenced. In humans, the timing of XCI choice and whether choice occurs completely randomly or under a genetic influence is debated. Here, we explore these questions by analysing the distribution of XIPs in large populations of normal females. Models were generated to predict XIP distributions resulting from completely random or genetically influenced choice. Each model describes the discrete primary distribution at the onset of XCI, and the continuous secondary distribution accounting for changes to the XIP as a result of development and ageing. Statistical methods are used to compare models with empirical data from Danish and Utah populations. A rigorous data treatment strategy maximises information content and allows for unbiased use of unphased XIP data. The Anderson–Darling goodness-of-fit statistics and likelihood ratio tests indicate that a model of genetically influenced XCI choice better fits the empirical data than models of completely random choice. PMID:23652377

  3. Super-resolving random-Gaussian apodized photon sieve.

    PubMed

    Sabatyan, Arash; Roshaninejad, Parisa

    2012-09-10

    A novel apodized photon sieve is presented in which random dense Gaussian distribution is implemented to modulate the pinhole density in each zone. The random distribution in dense Gaussian distribution causes intrazone discontinuities. Also, the dense Gaussian distribution generates a substantial number of pinholes in order to form a large degree of overlap between the holes in a few innermost zones of the photon sieve; thereby, clear zones are formed. The role of the discontinuities on the focusing properties of the photon sieve is examined as well. Analysis shows that secondary maxima have evidently been suppressed, transmission has increased enormously, and the central maxima width is approximately unchanged in comparison to the dense Gaussian distribution. Theoretical results have been completely verified by experiment.

  4. Macronutrient distribution in 'Tifblue' rabbiteye blueberry

    USDA-ARS's Scientific Manuscript database

    This study was developed and initiated to determine the nutrient distribution within a ‘Tifblue’ rabbiteye blueberry. Rooted cuttings were potted into 3.8 liter containers and placed into a completely randomized design on a covered bench. Plants were divided evenly into 3 groups for low, high a...

  5. Distributive and Procedural Justice as Related to Satisfaction and Commitment.

    ERIC Educational Resources Information Center

    Tang, Thomas Li-Ping; Sarsfield-Baldwin, Linda J.

    Randomly selected employees from a Veterans Administration Medical Center (n=200) were asked to complete measures on distributive justice and procedural justice 4 weeks before their performance appraisal; and on job satisfaction, commitment, involvement, and self-reported performance feedback 4 weeks after their performance appraisals.…

  6. Formation of randomly distributed nano-tubes, -rods and -plates of n-type and p-type bismuth telluride via molecular legation

    NASA Astrophysics Data System (ADS)

    Ram, Jasa; Ghosal, Partha

    2015-08-01

    Randomly distributed nanotubes, nanorods and nanoplates of Bi0.5Sb1.5Te3 and Bi2Te2.7Se0.3 ternary compounds have been synthesized via a high-yield solvothermal process. Prior to solvothermal heating at 230 °C for crystallization, we ensured molecular legation in a room-temperature reaction by complete reduction of the precursor materials dissolved in ethylene glycol, and confirmed it by replicating the Raman spectra of the amorphous and crystalline materials. These nanomaterials have also been characterized using XRD, FE-SEM, EDS and TEM. A possible formation mechanism is also discussed. This single process will enable the development of thermoelectric modules, and the random distribution of diverse morphologies will be beneficial in retaining nano-crystallite sizes.

  7. Takeover times for a simple model of network infection.

    PubMed

    Ottino-Löffler, Bertrand; Scott, Jacob G; Strogatz, Steven H

    2017-07-01

    We study a stochastic model of infection spreading on a network. At each time step a node is chosen at random, along with one of its neighbors. If the node is infected and the neighbor is susceptible, the neighbor becomes infected. How many time steps T does it take to completely infect a network of N nodes, starting from a single infected node? An analogy to the classic "coupon collector" problem of probability theory reveals that the takeover time T is dominated by extremal behavior, either when there are only a few infected nodes near the start of the process or a few susceptible nodes near the end. We show that for N≫1, the takeover time T is distributed as a Gumbel distribution for the star graph, as the convolution of two Gumbel distributions for a complete graph and an Erdős-Rényi random graph, as a normal for a one-dimensional ring and a two-dimensional lattice, and as a family of intermediate skewed distributions for d-dimensional lattices with d≥3 (these distributions approach the convolution of two Gumbel distributions as d approaches infinity). Connections to evolutionary dynamics, cancer, incubation periods of infectious diseases, first-passage percolation, and other spreading phenomena in biology and physics are discussed.
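
    The complete-graph case is easy to probe numerically. The hedged Python sketch below simulates the takeover time T and checks the coupon-collector-style scaling T ≈ 2N ln N; the graph size and number of runs are arbitrary choices.

      import numpy as np

      rng = np.random.default_rng(1)

      def takeover_time(N):
          # One run of the model on the complete graph: pick a random
          # node and a random distinct neighbor; a susceptible neighbor
          # of an infected node becomes infected.
          infected = np.zeros(N, dtype=bool)
          infected[0] = True
          k, t = 1, 0
          while k < N:
              t += 1
              i = rng.integers(N)
              j = rng.integers(N - 1)
              if j >= i:
                  j += 1          # ensure j != i
              if infected[i] and not infected[j]:
                  infected[j] = True
                  k += 1
          return t

      N = 200
      T = np.array([takeover_time(N) for _ in range(200)])
      print(T.mean() / (N * np.log(N)))   # should be near 2 for large N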

  8. Analytical connection between thresholds and immunization strategies of SIS model in random networks

    NASA Astrophysics Data System (ADS)

    Zhou, Ming-Yang; Xiong, Wen-Man; Liao, Hao; Wang, Tong; Wei, Zong-Wen; Fu, Zhong-Qian

    2018-05-01

    Devising effective strategies for hindering the propagation of viruses and protecting the population against epidemics is critical for public security and health. Despite a number of studies based on the susceptible-infected-susceptible (SIS) model devoted to this topic, we still lack a general framework to compare different immunization strategies in completely random networks. Here, we address this problem by suggesting a novel method based on heterogeneous mean-field theory for the SIS model. Our method builds the relationship between the thresholds and different immunization strategies in completely random networks. Besides, we provide an analytical argument that the targeted large-degree strategy achieves the best performance in random networks with arbitrary degree distribution. Moreover, the experimental results demonstrate the effectiveness of the proposed method in both artificial and real-world networks.
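
    Under heterogeneous mean-field theory the SIS threshold of a random network is <k>/<k^2>, so the effect of an immunization strategy can be sketched directly. The Python fragment below compares random and targeted (large-degree) immunization on an assumed power-law degree sequence; the exponent, the immunized fraction, and the 1/(1-g) rescaling for random immunization are standard ingredients stated here as assumptions, not the paper's exact construction.

      import numpy as np

      rng = np.random.default_rng(2)
      degrees = rng.zipf(2.5, size=100000)     # assumed heavy-tailed degrees
      degrees = degrees[degrees < 1000]        # truncate extreme values

      def hmf_threshold(k):
          # Heterogeneous mean-field SIS threshold <k>/<k^2>.
          return k.mean() / (k.astype(float) ** 2).mean()

      g = 0.05                                 # immunized fraction
      base = hmf_threshold(degrees)
      # Random immunization rescales the effective spreading rate by (1 - g).
      random_imm = base / (1.0 - g)
      # Targeted immunization: remove the g largest-degree nodes (crude:
      # neighbor degrees are not updated in this sketch).
      kept = np.sort(degrees)[: int((1.0 - g) * degrees.size)]
      targeted_imm = hmf_threshold(kept)
      print(base, random_imm, targeted_imm)    # targeted raises the threshold most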

  9. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    NASA Astrophysics Data System (ADS)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure from LHS with simulation by LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to those of three other simulation methods. The results show that for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
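
    The core of the LULHS idea can be sketched in a few lines of Python: stratified standard normals from Latin Hypercube Sampling are correlated through a factor of the covariance matrix. A Cholesky factor stands in for the LU step, and the exponential covariance model and grid are assumptions made for illustration.

      import numpy as np
      from scipy.stats import norm, qmc

      # Grid and an assumed exponential covariance model.
      x = np.linspace(0.0, 10.0, 50)
      C = np.exp(-np.abs(x[:, None] - x[None, :]) / 2.0)
      L = np.linalg.cholesky(C)            # factor playing the "LU" role

      # Latin Hypercube Sampling of standard normals, one row per realization.
      n_real = 100
      sampler = qmc.LatinHypercube(d=x.size, seed=3)
      z = norm.ppf(sampler.random(n_real))  # stratified N(0, 1) samples

      fields = z @ L.T                      # each row: one correlated field
      print(fields.shape, np.corrcoef(fields[:, 0], fields[:, 1])[0, 1])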

  10. Research on Some Bus Transport Networks with Random Overlapping Clique Structure

    NASA Astrophysics Data System (ADS)

    Yang, Xu-Hua; Wang, Bo; Wang, Wan-Liang; Sun, You-Xian

    2008-11-01

    On the basis of investigating the statistical data of the bus transport networks of three big cities in China, we propose that each bus route is a clique (maximal complete subgraph) and that a bus transport network (BTN) consists of many cliques, which intensively connect and overlap with each other. We study the network properties, which include the degree distribution, the multiple edges' overlapping time distribution, the distribution of the overlap size between any two overlapping cliques, and the distribution of the number of cliques that a node belongs to. Naturally, the cliques also constitute a network, with the overlapping nodes being their multiple links. We also study its network properties such as degree distribution, clustering, average path length, and so on. We propose that a BTN has the properties of random clique increment and random clique overlap; at the same time, a BTN is a small-world network that is highly clique-clustered and highly clique-overlapped. Finally, we introduce a BTN evolution model, whose simulation results agree well with the statistical laws that emerge in real BTNs.

  11. A random walk rule for phase I clinical trials.

    PubMed

    Durham, S D; Flournoy, N; Rosenberger, W F

    1997-06-01

    We describe a family of random walk rules for the sequential allocation of dose levels to patients in a dose-response study, or phase I clinical trial. Patients are sequentially assigned the next higher, same, or next lower dose level according to some probability distribution, which may be determined by ethical considerations as well as the patient's response. It is shown that one can choose these probabilities in order to center dose level assignments unimodally around any target quantile of interest. Estimation of the quantile is discussed; the maximum likelihood estimator and its variance are derived under a two-parameter logistic distribution, and the maximum likelihood estimator is compared with other nonparametric estimators. Random walk rules have clear advantages: they are simple to implement, and finite and asymptotic distribution theory is completely worked out. For a specific random walk rule, we compute finite and asymptotic properties and give examples of its use in planning studies. Having the finite distribution theory available and tractable obviates the need for elaborate simulation studies to analyze the properties of the design. The small sample properties of our rule, as determined by exact theory, compare favorably to those of the continual reassessment method, determined by simulation.
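
    One member of this family is the biased-coin up-and-down rule, which de-escalates after a toxicity and escalates after a non-toxicity with probability Γ/(1-Γ), centering dose allocations near the target quantile Γ. The Python sketch below simulates such a rule under an assumed logistic dose-toxicity curve; the dose grid, curve parameters, and Γ are illustrative.

      import numpy as np

      rng = np.random.default_rng(4)

      doses = np.linspace(0.1, 1.0, 10)      # hypothetical dose grid
      def p_tox(d):                          # assumed dose-toxicity curve
          return 1.0 / (1.0 + np.exp(-(8.0 * d - 4.0)))

      gamma = 0.3                            # target quantile: P(toxicity) = 0.3
      bias = gamma / (1.0 - gamma)           # escalation prob. after no toxicity

      level, visits = 0, np.zeros(doses.size, dtype=int)
      for _ in range(2000):
          visits[level] += 1
          if rng.random() < p_tox(doses[level]):
              level = max(level - 1, 0)      # step down after toxicity
          elif rng.random() < bias:
              level = min(level + 1, doses.size - 1)
          # otherwise stay at the same level

      # Allocations pile up unimodally around the dose with P(tox) = gamma.
      print(doses[np.argmax(visits)], visits / visits.sum())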

  12. Coherent wave transmission in quasi-one-dimensional systems with Lévy disorder

    NASA Astrophysics Data System (ADS)

    Amanatidis, Ilias; Kleftogiannis, Ioannis; Falceto, Fernando; Gopar, Víctor A.

    2017-12-01

    We study the random fluctuations of the transmission in disordered quasi-one-dimensional systems such as disordered waveguides and/or quantum wires whose random configurations of disorder are characterized by density distributions with a long tail known as Lévy distributions. The presence of Lévy disorder leads to large fluctuations of the transmission and anomalous localization, relative to the standard exponential localization (Anderson localization). We calculate the complete distribution of the transmission fluctuations for different numbers of transmission channels in the presence and absence of time-reversal symmetry. Significant differences in the transmission statistics between disordered systems with Anderson and anomalous localization are revealed. The theoretical predictions are independently confirmed by tight-binding numerical simulations.

  13. Node degree distribution in spanning trees

    NASA Astrophysics Data System (ADS)

    Pozrikidis, C.

    2016-03-01

    A method is presented for computing the number of spanning trees involving one link or a specified group of links, and excluding another link or a specified group of links, in a network described by a simple graph in terms of derivatives of the spanning-tree generating function defined with respect to the eigenvalues of the Kirchhoff (weighted Laplacian) matrix. The method is applied to deduce the node degree distribution in a complete or randomized set of spanning trees of an arbitrary network. An important feature of the proposed method is that the explicit construction of spanning trees is not required. It is shown that the node degree distribution in the spanning trees of the complete network is described by the binomial distribution. Numerical results are presented for the node degree distribution in square, triangular, and honeycomb lattices.
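
    The counting machinery behind the method is Kirchhoff's matrix-tree theorem, which the short Python check below applies to the complete graph K_6; the comment records the binomial degree result quoted in the abstract. The graph and its size are arbitrary.

      import numpy as np

      def n_spanning_trees(A):
          # Matrix-tree theorem: any cofactor of the Laplacian L = D - A
          # equals the number of spanning trees of the graph.
          L = np.diag(A.sum(axis=1)) - A
          return round(np.linalg.det(L[1:, 1:]))

      n = 6
      K = np.ones((n, n)) - np.eye(n)            # complete graph K_6
      print(n_spanning_trees(K), n ** (n - 2))   # Cayley's formula: 6^4 = 1296

      # In a uniform random spanning tree of K_n (uniform Prufer sequence),
      # deg(v) = 1 + Binomial(n - 2, 1/n): the binomial node degree
      # distribution reported in the abstract.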

  14. Continuous-Time Classical and Quantum Random Walk on Direct Product of Cayley Graphs

    NASA Astrophysics Data System (ADS)

    Salimi, S.; Jafarizadeh, M. A.

    2009-06-01

    In this paper we define the direct product of graphs and give a recipe for obtaining the probability of observing the particle on vertices in continuous-time classical and quantum random walks. In the recipe, the probability of observing the particle on the direct product of graphs is obtained by multiplying the probabilities on the corresponding subgraphs, which makes the method useful for determining the probability of walks on complicated graphs. Using this method, we calculate the probability of continuous-time classical and quantum random walks on many finite direct products of Cayley graphs (complete cycle, complete Kn, charter and n-cube). We also show that for the classical state the stationary uniform distribution is reached as t → ∞, but that this is not always the case for the quantum state.

  15. Monte Carlo based toy model for fission process

    NASA Astrophysics Data System (ADS)

    Kurniadi, R.; Waris, A.; Viridi, S.

    2014-09-01

    There are many models and calculation techniques for obtaining a visible image of the fission yield process. In particular, fission yield can be calculated using two approaches, namely a macroscopic approach and a microscopic approach. This work proposes another calculation approach in which the nucleus is treated as a toy model; hence, the process does not represent the real fission process in nature completely. The toy model is formed by a Gaussian distribution of random numbers that randomizes distances, such as the distance between a particle and the central point. The scission process is started by smashing the compound nucleus central point into two parts, a left central point and a right central point. These three points have different Gaussian distribution parameters, such as mean (μCN, μL, μR) and standard deviation (σCN, σL, σR). By overlaying the three distributions, the number of particles (NL, NR) trapped by the central points can be obtained. This process is iterated until (NL, NR) become constant numbers. The smashing process is repeated by changing σL and σR randomly.

  16. 3D vector distribution of the electro-magnetic fields on a random gold film

    NASA Astrophysics Data System (ADS)

    Canneson, Damien; Berini, Bruno; Buil, Stéphanie; Hermier, Jean-Pierre; Quélin, Xavier

    2018-05-01

    The 3D vector distribution of the electro-magnetic fields at the very close vicinity of the surface of a random gold film is studied. Such films are well known for their properties of light confinement and large fluctuations of local density of optical states. Using Finite-Difference Time-Domain simulations, we show that it is possible to determine the local orientation of the electro-magnetic fields. This allows us to obtain a complete characterization of the fields. Large fluctuations of their amplitude are observed as previously shown. Here, we demonstrate large variations of their direction depending both on the position on the random gold film, and on the distance to it. Such characterization could be useful for a better understanding of applications like the coupling of point-like dipoles to such films.

  17. Variances and uncertainties of the sample laboratory-to-laboratory variance (S_L^2) and standard deviation (S_L) associated with an interlaboratory study.

    PubMed

    McClure, Foster D; Lee, Jung K

    2012-01-01

    The validation process for an analytical method usually employs an interlaboratory study conducted as a balanced completely randomized model involving a specified number of randomly chosen laboratories, each analyzing a specified number of randomly allocated replicates. For such studies, formulas to obtain approximate unbiased estimates of the variance and uncertainty of the sample laboratory-to-laboratory (lab-to-lab) standard deviation (S_L) have been developed, primarily to account for the uncertainty of S_L when there is a need to develop an uncertainty budget that includes the uncertainty of S_L. For the sake of completeness on this topic, formulas to estimate the variance and uncertainty of the sample lab-to-lab variance (S_L^2) were also developed. In some cases, it was necessary to derive the formulas based on an approximate distribution for S_L^2.

  18. Multiple Imputation of a Randomly Censored Covariate Improves Logistic Regression Analysis.

    PubMed

    Atem, Folefac D; Qian, Jing; Maye, Jacqueline E; Johnson, Keith A; Betensky, Rebecca A

    2016-01-01

    Randomly censored covariates arise frequently in epidemiologic studies. The most commonly used methods, including complete case and single imputation or substitution, suffer from inefficiency and bias. They make strong parametric assumptions or they consider limit of detection censoring only. We employ multiple imputation, in conjunction with semi-parametric modeling of the censored covariate, to overcome these shortcomings and to facilitate robust estimation. We develop a multiple imputation approach for randomly censored covariates within the framework of a logistic regression model. We use the non-parametric estimate of the covariate distribution or the semiparametric Cox model estimate in the presence of additional covariates in the model. We evaluate this procedure in simulations, and compare its operating characteristics to those from the complete case analysis and a survival regression approach. We apply the procedures to an Alzheimer's study of the association between amyloid positivity and maternal age of onset of dementia. Multiple imputation achieves lower standard errors and higher power than the complete case approach under heavy and moderate censoring and is comparable under light censoring. The survival regression approach achieves the highest power among all procedures, but does not produce interpretable estimates of association. Multiple imputation offers a favorable alternative to complete case analysis and ad hoc substitution methods in the presence of randomly censored covariates within the framework of logistic regression.

  19. Phase diagram for the Kuramoto model with van Hemmen interactions.

    PubMed

    Kloumann, Isabel M; Lizarraga, Ian M; Strogatz, Steven H

    2014-01-01

    We consider a Kuramoto model of coupled oscillators that includes quenched random interactions of the type used by van Hemmen in his model of spin glasses. The phase diagram is obtained analytically for the case of zero noise and a Lorentzian distribution of the oscillators' natural frequencies. Depending on the size of the attractive and random coupling terms, the system displays four states: complete incoherence, partial synchronization, partial antiphase synchronization, and a mix of antiphase and ordinary synchronization.
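
    A hedged numerical sketch of the model in Python: Kuramoto oscillators with Lorentzian natural frequencies plus van Hemmen couplings J_ij = J0/N + (J/N)(ξ_i η_j + ξ_j η_i), integrated with a plain Euler step. All parameter values are arbitrary, and only the ordinary order parameter is reported.

      import numpy as np

      rng = np.random.default_rng(5)
      N, J0, J, dt, steps = 500, 1.0, 0.5, 0.05, 4000

      omega = rng.standard_cauchy(N) * 0.05   # Lorentzian natural frequencies
      xi = rng.choice([-1.0, 1.0], N)         # quenched van Hemmen disorder
      eta = rng.choice([-1.0, 1.0], N)
      theta = rng.uniform(0.0, 2.0 * np.pi, N)

      for _ in range(steps):
          z = np.exp(1j * theta)
          r = z.mean()                        # ordinary order parameter
          r_eta = (eta * z).mean()
          r_xi = (xi * z).mean()
          # Mean field from J_ij = J0/N + (J/N)(xi_i eta_j + xi_j eta_i).
          field = J0 * r + J * (xi * r_eta + eta * r_xi)
          theta += dt * (omega + np.imag(field * np.exp(-1j * theta)))

      print(abs(np.exp(1j * theta).mean()))   # > 0: partial synchronization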

  20. Some limit theorems for ratios of order statistics from uniform random variables.

    PubMed

    Xu, Shou-Fang; Miao, Yu

    2017-01-01

    In this paper, we study the ratios of order statistics based on samples drawn from uniform distribution and establish some limit properties such as the almost sure central limit theorem, the large deviation principle, the Marcinkiewicz-Zygmund law of large numbers and complete convergence.

  21. Exact Solution of the Markov Propagator for the Voter Model on the Complete Graph

    DTIC Science & Technology

    2014-07-01

    distribution of the random walk. This process can also be applied to other models, incomplete graphs, or to multiple dimensions. An advantage of this...since any multiple of an eigenvector remains an eigenvector. Without any loss, let b_k = 1. Now we can ascertain the explicit solution for b_j when k < j...this bound is valid for all initial probability distributions. However, without detailed information about the eigenvectors, we cannot extract more

  22. A pilot randomized controlled trial of D-cycloserine and distributed practice as adjuvants to constraint-induced movement therapy after stroke.

    PubMed

    Nadeau, Stephen E; Davis, Sandra E; Wu, Samuel S; Dai, Yunfeng; Richards, Lorie G

    2014-01-01

    Background. Phase III trials of rehabilitation of paresis after stroke have proven the effectiveness of intensive and extended task practice, but they have also shown that many patients do not qualify, because of severity of impairment, and that many of those who are treated are left with clinically significant deficits. Objective. To test the value of 2 potential adjuvants to normal learning processes engaged in constraint-induced movement therapy (CIMT): greater distribution of treatment over time and the coadministration of d-cycloserine, a competitive agonist at the glycine site of the N-methyl-D-aspartate glutamate receptor. Methods. A prospective randomized single-blind parallel-group trial of more versus less condensed therapy (2 vs 10 weeks) and d-cycloserine (50 mg) each treatment day versus placebo (in a 2 × 2 design), as potential adjuvants to 60 hours of CIMT. Results. Twenty-four participants entered the study, and 22 completed it and were assessed at the completion of treatment and 3 months later. Neither greater distribution of treatment nor treatment with d-cycloserine significantly augmented retention of gains achieved with CIMT. Conclusions. Greater distribution of practice and treatment with d-cycloserine do not appear to augment retention of gains achieved with CIMT. However, concentration of CIMT over 2 weeks ("massed practice") appears to confer no advantage either.

  23. Bridges in complex networks

    NASA Astrophysics Data System (ADS)

    Wu, Ang-Kun; Tian, Liang; Liu, Yang-Yu

    2018-01-01

    A bridge in a graph is an edge whose removal disconnects the graph and increases the number of connected components. We calculate the fraction of bridges in a wide range of real-world networks and their randomized counterparts. We find that real networks typically have more bridges than their completely randomized counterparts, but they have a fraction of bridges that is very similar to their degree-preserving randomizations. We define an edge centrality measure, called bridgeness, to quantify the importance of a bridge in damaging a network. We find that certain real networks have a very large average and variance of bridgeness compared to their degree-preserving randomizations and other real networks. Finally, we offer an analytical framework to calculate the bridge fraction and the average and variance of bridgeness for uncorrelated random networks with arbitrary degree distributions.
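
    networkx makes the key comparison easy to reproduce: count bridges in a graph and in a degree-preserving randomization of it. In this Python sketch an Erdős-Rényi graph stands in for a real network, and the swap count is a rule-of-thumb choice.

      import networkx as nx

      G = nx.erdos_renyi_graph(1000, 3.0 / 999, seed=6)   # mean degree ~ 3
      G = G.subgraph(max(nx.connected_components(G), key=len)).copy()
      frac = sum(1 for _ in nx.bridges(G)) / G.number_of_edges()

      R = G.copy()                       # degree-preserving randomization
      nx.double_edge_swap(R, nswap=10 * R.number_of_edges(),
                          max_tries=10 ** 6, seed=6)
      frac_r = sum(1 for _ in nx.bridges(R)) / R.number_of_edges()
      print(frac, frac_r)                # similar fractions, as reported above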

  24. Observation of arrival times of EAS with energies ≥ 6 × 10^14 eV

    NASA Technical Reports Server (NTRS)

    Sun, L.

    1985-01-01

    The Earth's atmosphere is continually being bombarded by primary cosmic ray particles, which are generally believed to be high-energy nuclei. The fact that the majority of cosmic ray primaries are charged particles and that space is permeated with random magnetic fields means that the particles do not travel in straight lines. The arrival time distribution of EAS may also carry some information about the primary particles. In fact, if the particles arrive at the Earth in a completely random process, the distribution of arrival-time separations between pairs of successive particles should fit an exponential law. The work reported here was carried out at Sydney University from May 1982 to January 1983. All the data are used to plot the arrival-time distribution of the events, that is, the distribution of time separations between consecutive events with a 1 minute bin size. During this period more than 2300 showers were recorded. The results are discussed and compared with those of some other experiments.
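
    The stated null hypothesis — completely random arrivals imply exponentially distributed time separations — is straightforward to test. The Python sketch below does so on synthetic gaps; the event count is taken loosely from the abstract, the mean spacing is assumed, and the KS test with a fitted scale is only approximate (a Lilliefors-type correction would be needed for exact p values).

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      # Synthetic stand-in: ~2300 gaps from a completely random (Poisson)
      # arrival process with an assumed mean spacing of 170 minutes.
      gaps = rng.exponential(scale=170.0, size=2300)

      res = stats.kstest(gaps, "expon", args=(0.0, gaps.mean()))
      print(res.statistic, res.pvalue)   # large p: consistent with exponential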

  25. Evaluation of distributed practice schedules on retention of a newly acquired surgical skill: a randomized trial.

    PubMed

    Mitchell, Erica L; Lee, Dae Y; Sevdalis, Nick; Partsafas, Aaron W; Landry, Gregory J; Liem, Timothy K; Moneta, Gregory L

    2011-01-01

    Distributed practice influences new skill acquisition. The aim of this study was to prospectively investigate the impact of practice distribution (weekly vs monthly) on complex motor skill (end-side vascular anastomosis) acquisition and 4-month retention. Twenty-four surgical interns were randomly assigned to weekly training for 4 weeks or monthly training for 4 months, with equal total training times. Performance was assessed before training, immediately after training, after the completion of distributed training, and 4 months later. There was no statistical difference in surgical skill acquisition and retention between the weekly and monthly scheduled groups, as measured by procedural checklist scores, global rating scores of operative performance, final product analysis, and overall performance or assessment of operative "competence." Distributed practice results in improvement and retention of a newly acquired surgical skill independent of weekly or monthly practice schedules. Flexibility in a surgical skills laboratory curriculum is possible without adversely affecting training.

  26. Online Distributed Learning Over Networks in RKH Spaces Using Random Fourier Features

    NASA Astrophysics Data System (ADS)

    Bouboulis, Pantelis; Chouvardas, Symeon; Theodoridis, Sergios

    2018-04-01

    We present a novel diffusion scheme for online kernel-based learning over networks. So far, a major drawback of any online learning algorithm, operating in a reproducing kernel Hilbert space (RKHS), is the need for updating a growing number of parameters as time iterations evolve. Besides complexity, this leads to an increased need for communication resources in a distributed setting. In contrast, the proposed method approximates the solution as a fixed-size vector (of larger dimension than the input space) using Random Fourier Features. This paves the way to use standard linear combine-then-adapt techniques. To the best of our knowledge, this is the first time that a complete protocol for distributed online learning in RKHS is presented. Conditions for asymptotic convergence and boundedness of the networkwise regret are also provided. The simulated tests illustrate the performance of the proposed scheme.

  27. Missing Data and Multiple Imputation: An Unbiased Approach

    NASA Technical Reports Server (NTRS)

    Foy, M.; VanBaalen, M.; Wear, M.; Mendez, C.; Mason, S.; Meyers, V.; Alexander, D.; Law, J.

    2014-01-01

    The default method of dealing with missing data in statistical analyses is to only use the complete observations (complete case analysis), which can lead to unexpected bias when data do not meet the assumption of missing completely at random (MCAR). For the assumption of MCAR to be met, missingness cannot be related to either the observed or unobserved variables. A less stringent assumption, missing at random (MAR), requires that missingness not be associated with the value of the missing variable itself, but can be associated with the other observed variables. When data are truly MAR as opposed to MCAR, the default complete case analysis method can lead to biased results. There are statistical options available to adjust for data that are MAR, including multiple imputation (MI) which is consistent and efficient at estimating effects. Multiple imputation uses informing variables to determine statistical distributions for each piece of missing data. Then multiple datasets are created by randomly drawing on the distributions for each piece of missing data. Since MI is efficient, only a limited number, usually less than 20, of imputed datasets are required to get stable estimates. Each imputed dataset is analyzed using standard statistical techniques, and then results are combined to get overall estimates of effect. A simulation study will be demonstrated to show the results of using the default complete case analysis, and MI in a linear regression of MCAR and MAR simulated data. Further, MI was successfully applied to the association study of CO2 levels and headaches when initial analysis showed there may be an underlying association between missing CO2 levels and reported headaches. Through MI, we were able to show that there is a strong association between average CO2 levels and the risk of headaches. Each unit increase in CO2 (mmHg) resulted in a doubling in the odds of reported headaches.
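
    The bias mechanism described here is easy to demonstrate. The Python sketch below makes y missing at random (dependent on an observed x), shows the resulting complete-case bias in the mean of y, and removes it with a deliberately simplified multiple imputation; proper MI would also draw the imputation-model parameters, and all data here are synthetic.

      import numpy as np

      rng = np.random.default_rng(8)
      n = 2000
      x = rng.normal(size=n)                    # fully observed covariate
      y = 1.0 + 2.0 * x + rng.normal(size=n)    # outcome, partly missing

      # MAR: missingness in y depends only on the observed x.
      miss = rng.random(n) < 1.0 / (1.0 + np.exp(-2.0 * x))
      y_obs = np.where(miss, np.nan, y)
      print("true mean", y.mean(), "complete-case mean", np.nanmean(y_obs))

      ok = ~miss
      b1, b0 = np.polyfit(x[ok], y[ok], 1)      # imputation model y ~ x
      sigma = np.std(y[ok] - (b0 + b1 * x[ok]))
      ests = [np.where(miss,
                       b0 + b1 * x + rng.normal(0.0, sigma, n),
                       y).mean()
              for _ in range(20)]               # 20 imputed datasets
      print("MI mean", np.mean(ests))           # pooled point estimate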

  28. Chromosome Model reveals Dynamic Redistribution of DNA Damage into Nuclear Sub-domains

    NASA Technical Reports Server (NTRS)

    Costes, Sylvain V.; Ponomarev, Artem; Chen, James L.; Cucinotta, Francis A.; Barcellos-Hoff, Helen

    2007-01-01

    Several proteins involved in the response to DNA double strand breaks (DSB) form microscopically visible nuclear domains, or foci, after exposure to ionizing radiation. Radiation-induced foci (RIF) are believed to be located where DNA damage is induced. To test this assumption, we analyzed the spatial distribution of 53BP1, phosphorylated ATM and gammaH2AX RIF in cells irradiated with high linear energy transfer (LET) radiation. Since energy is randomly deposited along high-LET particle paths, RIF along these paths should also be randomly distributed. The probability to induce DSB can be derived from DNA fragment data measured experimentally by pulsed-field gel electrophoresis. We used this probability in Monte Carlo simulations to predict DSB locations in synthetic nuclei geometrically described by a complete set of human chromosomes, taking into account microscope optics from real experiments. As expected, simulations produced DNA-weighted random (Poisson) distributions. In contrast, the distributions of RIF obtained as early as 5 min after exposure to high LET (1 GeV/amu Fe) were non-random. This deviation from the expected DNA-weighted random pattern can be further characterized by relative DNA image measurements. This novel imaging approach shows that RIF were located preferentially at the interface between high and low DNA density regions, and were more frequent in regions with lower density DNA than predicted. This deviation from random behavior was more pronounced within the first 5 min following irradiation for phosphorylated ATM RIF, while gammaH2AX and 53BP1 RIF showed very pronounced deviation up to 30 min after exposure. These data suggest the existence of repair centers in mammalian epithelial cells. These centers would be nuclear sub-domains where DNA lesions would be collected for more efficient repair.

  29. Natural Time and Nowcasting Earthquakes: Are Large Global Earthquakes Temporally Clustered?

    NASA Astrophysics Data System (ADS)

    Luginbuhl, Molly; Rundle, John B.; Turcotte, Donald L.

    2018-02-01

    The objective of this paper is to analyze the temporal clustering of large global earthquakes with respect to natural time, or interevent count, as opposed to regular clock time. To do this, we use two techniques: (1) nowcasting, a new method of statistically classifying seismicity and seismic risk, and (2) time series analysis of interevent counts. We chose the sequences of M_λ ≥ 7.0 and M_λ ≥ 8.0 earthquakes from the global centroid moment tensor (CMT) catalog from 2004 to 2016 for analysis. A significant number of these earthquakes will be aftershocks of the largest events, but no satisfactory method of declustering the aftershocks in clock time is available. A major advantage of using natural time is that it eliminates the need for declustering aftershocks. The event count we utilize is the number of small earthquakes that occur between large earthquakes. The small earthquake magnitude is chosen to be as small as possible, such that the catalog is still complete based on the Gutenberg-Richter statistics. For the CMT catalog, starting in 2004, we found the completeness magnitude to be M_σ ≥ 5.1. For the nowcasting method, the cumulative probability distribution of these interevent counts is obtained. We quantify the distribution using the exponent, β, of the best-fitting Weibull distribution; β = 1 for a random (exponential) distribution. We considered 197 earthquakes with M_λ ≥ 7.0 and found β = 0.83 ± 0.08. We considered 15 earthquakes with M_λ ≥ 8.0, but this number was considered too small to generate a meaningful distribution. For comparison, we generated synthetic catalogs of earthquakes that occur randomly with the Gutenberg-Richter frequency-magnitude statistics. We considered a synthetic catalog of 1.97 × 10^5 M_λ ≥ 7.0 earthquakes and found β = 0.99 ± 0.01. The random catalog converted to natural time was also random. We then generated 1.5 × 10^4 synthetic catalogs with 197 M_λ ≥ 7.0 earthquakes in each catalog and found the statistical range of β values. The observed value of β = 0.83 for the CMT catalog corresponds to a p value of 0.004, leading us to conclude that the interevent natural times in the CMT catalog are not random. For the time series analysis, we calculated the autocorrelation function for the sequence of natural time intervals between large global earthquakes and again compared with data from 1.5 × 10^4 synthetic catalogs of random data. In this case, the spread of autocorrelation values was much larger, so we concluded that this approach is insensitive to deviations from random behavior.
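
    The nowcasting step — fitting a Weibull shape β to interevent counts, with β = 1 marking a completely random catalog — can be sketched in Python on a synthetic exponential catalog. Only the catalog size of 197 follows the abstract; the count scale is an arbitrary assumption.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(9)
      # Interevent counts of small events between 197 large events; for a
      # completely random catalog the counts are exponential, i.e. beta ~ 1.
      counts = rng.exponential(scale=120.0, size=197)

      beta, loc, scale = stats.weibull_min.fit(counts, floc=0.0)
      print("Weibull beta:", round(beta, 2))   # the CMT data gave beta = 0.83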

  30. Challenges to recruitment and retention of African Americans in the gene-environment trial of response to dietary interventions (GET READI) for heart health

    PubMed Central

    Kennedy, Betty M.; Harsha, David W.; Bookman, Ebony B.; Hill, Yolanda R.; Rankinen, Tuomo; Rodarte, Ruben Q.; Murla, Connie D.

    2011-01-01

    In this paper, challenges to recruiting African Americans specifically for a dietary feeding trial are examined, and learning experiences gained and suggestions to overcome these challenges in future trials are discussed. A total of 333 individuals were randomized in the trial and 234 (167 sibling pairs and 67 parents/siblings) completed the dietary intervention and the required DNA blood sampling for genetic analysis. The trial used multiple strategies for recruitment. Letters and flyers distributed by hand at various churches resulted in the largest number (n = 153, 46%) of African Americans in the trial. Word of mouth accounted for the second largest number (n = 120, 36%) and included prior study participants. These two recruitment sources represented 82% (n = 273) of the total number of individuals randomized in GET READI. The remaining 18% (n = 60) came from a combination of sources including printed messages on check stubs, newspaper articles, radio and TV appearances, screening events and presentations. Though challenging, the recruitment efforts for GET READI produced a significant number of African American participants despite the inability to complete the trial as planned because of low recruitment yields. Nevertheless, the recruitment process produced substantial numbers of participants who successfully completed all study requirements. PMID:21865154

  31. On Testability of Missing Data Mechanisms in Incomplete Data Sets

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2011-01-01

    This article is concerned with the question of whether the missing data mechanism routinely referred to as missing completely at random (MCAR) is statistically examinable via a test for lack of distributional differences between groups with observed and missing data, and related consequences. A discussion is initially provided, from a formal logic…

  32. Closed-form solution for the Wigner phase-space distribution function for diffuse reflection and small-angle scattering in a random medium.

    PubMed

    Yura, H T; Thrane, L; Andersen, P E

    2000-12-01

    Within the paraxial approximation, a closed-form solution for the Wigner phase-space distribution function is derived for diffuse reflection and small-angle scattering in a random medium. This solution is based on the extended Huygens-Fresnel principle for the optical field, which is widely used in studies of wave propagation through random media. The results are general in that they apply to both an arbitrary small-angle volume scattering function, and arbitrary (real) ABCD optical systems. Furthermore, they are valid in both the single- and multiple-scattering regimes. Some general features of the Wigner phase-space distribution function are discussed, and analytic results are obtained for various types of scattering functions in the asymptotic limit s ≫ 1, where s is the optical depth. In particular, explicit results are presented for optical coherence tomography (OCT) systems. On this basis, a novel way of creating OCT images based on measurements of the momentum width of the Wigner phase-space distribution is suggested, and the advantage over conventional OCT images is discussed. Because all previously published studies regarding the Wigner function are carried out in the transmission geometry, it is important to note that the extended Huygens-Fresnel principle and the ABCD matrix formalism may be used successfully to describe this geometry (within the paraxial approximation). Therefore, for completeness, we present in an appendix the general closed-form solution for the Wigner phase-space distribution function in ABCD paraxial optical systems for direct propagation through random media, and in a second appendix absorption effects are included.

  33. Clustering, randomness, and regularity in cloud fields. 4. Stratocumulus cloud fields

    NASA Astrophysics Data System (ADS)

    Lee, J.; Chou, J.; Weger, R. C.; Welch, R. M.

    1994-07-01

    To complete the analysis of the spatial distribution of boundary layer cloudiness, the present study focuses on nine stratocumulus Landsat scenes. The results indicate many similarities between stratocumulus and cumulus spatial distributions. Most notably, at full spatial resolution all scenes exhibit a decidedly clustered distribution. The strength of the clustering signal decreases with increasing cloud size; the clusters themselves consist of a few clouds (less than 10), occupy a small percentage of the cloud field area (less than 5%), contain between 20% and 60% of the cloud field population, and are randomly located within the scene. In contrast, stratocumulus in almost every respect are more strongly clustered than are cumulus cloud fields. For instance, stratocumulus clusters contain more clouds per cluster, occupy a larger percentage of the total area, and have a larger percentage of clouds participating in clusters than the corresponding cumulus examples. To investigate clustering at intermediate spatial scales, the local dimensionality statistic is introduced. Results obtained from this statistic provide the first direct evidence for regularity among large (>900 m in diameter) clouds in stratocumulus and cumulus cloud fields, in support of the inhibition hypothesis of Ramirez and Bras (1990). Also, the size compensated point-to-cloud cumulative distribution function statistic is found to be necessary to obtain a consistent description of stratocumulus cloud distributions. A hypothesis regarding the underlying physical mechanisms responsible for cloud clustering is presented. It is suggested that cloud clusters often arise from 4 to 10 triggering events localized within regions less than 2 km in diameter and randomly distributed within the cloud field. As the size of the cloud surpasses the scale of the triggering region, the clustering signal weakens and the larger cloud locations become more random.

  34. Learning stochastic reward distributions in a speeded pointing task.

    PubMed

    Seydell, Anna; McCann, Brian C; Trommershäuser, Julia; Knill, David C

    2008-04-23

    Recent studies have shown that humans effectively take into account task variance caused by intrinsic motor noise when planning fast hand movements. However, previous evidence suggests that humans have greater difficulty accounting for arbitrary forms of stochasticity in their environment, both in economic decision making and sensorimotor tasks. We hypothesized that humans can learn to optimize movement strategies when environmental randomness can be experienced and thus implicitly learned over several trials, especially if it mimics the kinds of randomness for which subjects might have generative models. We tested the hypothesis using a task in which subjects had to rapidly point at a target region partly covered by three stochastic penalty regions introduced as "defenders." At movement completion, each defender jumped to a new position drawn randomly from fixed probability distributions. Subjects earned points when they hit the target, unblocked by a defender, and lost points otherwise. Results indicate that after approximately 600 trials, subjects approached optimal behavior. We further tested whether subjects simply learned a set of stimulus-contingent motor plans or the statistics of defenders' movements by training subjects with one penalty distribution and then testing them on a new penalty distribution. Subjects immediately changed their strategy to achieve the same average reward as subjects who had trained with the second penalty distribution. These results indicate that subjects learned the parameters of the defenders' jump distributions and used this knowledge to optimally plan their hand movements under conditions involving stochastic rewards and penalties.

  35. Volcanoes Distribution in Linear Segmentation of Mariana Arc

    NASA Astrophysics Data System (ADS)

    Andikagumi, H.; Macpherson, C.; McCaffrey, K. J. W.

    2016-12-01

    A new method has been developed to better describe the distribution pattern of volcanoes within the Mariana Arc. A previous study assumed that the distribution of volcanoes in the Mariana Arc is described by a small circle, reflecting the melting processes in a curved subduction zone. The small circle fit to the dataset used in that study, comprising 12 mainly subaerial volcanoes from the Smithsonian Institution Global Volcanism Program, was reassessed by us to have a root-mean-square misfit of 2.5 km. The same method applied to a more complete dataset from Baker et al. (2008), consisting of 37 subaerial and submarine volcanoes, resulted in an 8.4 km misfit. However, using the Hough Transform method on the larger dataset, lower misfits of great circle segments were achieved (3.1 and 3.0 km) for two possible segment combinations. The results indicate that the distribution of volcanoes in the Mariana Arc is better described by a great circle pattern, rather than a small circle. Variogram and cross-variogram analysis of volcano spacing and volume shows that there is spatial correlation between volcanoes between 420 and 500 km, which corresponds to the maximum segmentation lengths from the Hough Transform (320 km). Further analysis of volcano spacing using the coefficient of variation (Cv) shows a tendency toward non-random distribution, as the Cv values are closer to zero than to one. These distributions are inferred to be associated with the development of normal faults at the back arc, as their Cv values also tend toward zero. To test whether volcano spacing is random, Cv values were simulated using a Monte Carlo method with random input. Only the southernmost segment allowed us to reject the null hypothesis that volcanoes are randomly spaced at the 95% confidence level, with an estimated probability of 0.007. This result shows that regularity in volcano spacing arises only infrequently by chance, so the lithospheric-scale controlling factor should be analysed with a different approach (not with a random number generator). The Sunda Arc, which has been found to have en echelon segmentation and a larger number of volcanoes, will be studied further to understand the particular influence of the upper plate on volcano distribution.
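
    The Monte Carlo test on the coefficient of variation can be sketched in a few lines of Python: compare an observed Cv of spacings against the Cv distribution for the same number of volcanoes placed uniformly at random on a segment. The counts, segment length, and observed Cv below are placeholders, not the paper's values.

      import numpy as np

      rng = np.random.default_rng(10)

      def cv(spacings):
          return spacings.std() / spacings.mean()

      # Null model: n volcanoes placed uniformly at random along a segment.
      n, L = 12, 320.0          # hypothetical volcano count and length (km)
      cv_null = np.array([cv(np.diff(np.sort(rng.uniform(0.0, L, n))))
                          for _ in range(15000)])

      cv_obs = 0.35             # hypothetical observed coefficient of variation
      p = (cv_null <= cv_obs).mean()   # Cv near zero indicates regular spacing
      print(p)                  # small p rejects completely random spacing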

  36. Long-Tailed Distributions in Biological Systems: Revisit to Lognormals

    NASA Astrophysics Data System (ADS)

    Kobayashi, N.; Kohyama, K.; Moriyama, O.; Sasaki, Y.; Matsushita, M.; Matsushita, S.

    2007-07-01

    Long-tailed distributions in biological systems have been studied. First, we found that lognormal distributions show excellent fit with various data for the duration distribution of disability in aged people, irrespective of their severity and gender. The robust lognormal distribution of disability implies that the incidence of diseases can be completed by many independent subprocesses in succession. Next, we studied food fragmentation by human mastication. A lognormal distribution well fits to the entire region for masticated food fragments for a small number of chewing strokes. Furthermore, the tail of the fragment-size distribution changes from the lognormal distribution to a power-law one as the chewing stroke number increases. The good data fitting by the lognormal and power-law distribution implies that two functions of mastication, a sequential fragmentation with cascade and randomness and a lower threshold for fragment size, may affect the size distribution of masticated food fragments.

  37. Quantifying randomness in real networks

    NASA Astrophysics Data System (ADS)

    Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-10-01

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs.
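
    The lowest orders of the dk-series are easy to mimic with networkx: a 0k-random graph fixes only the numbers of nodes and edges, while repeated double edge swaps give a 1k (degree-preserving) randomization. The higher-order 2k/2.5k constructions need the authors' released software, so the Python sketch below covers only the entry levels of the series, with a small-world graph as a stand-in "real" network.

      import networkx as nx

      G = nx.watts_strogatz_graph(1000, 10, 0.05, seed=11)   # stand-in network
      n, m = G.number_of_nodes(), G.number_of_edges()

      ER = nx.gnm_random_graph(n, m, seed=11)   # 0k: nodes and edges only
      DP = G.copy()                             # 1k: degree sequence preserved
      nx.double_edge_swap(DP, nswap=10 * m, max_tries=10 ** 6, seed=11)

      # 0k and 1k randomizations destroy the clustering; reproducing it
      # requires the higher orders of the dk-series.
      for name, H in (("original", G), ("0k random", ER), ("1k random", DP)):
          print(name, round(nx.average_clustering(H), 3))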

  38. Spatial distribution of impact craters on Deimos

    NASA Astrophysics Data System (ADS)

    Hirata, Naoyuki

    2017-05-01

    Deimos, one of the Martian moons, has numerous impact craters. However, it is unclear whether crater saturation has been reached on this satellite. To address this issue, we apply a statistical test known as nearest-neighbor analysis to analyze the crater distribution of Deimos. When a planetary surface such as the Moon is saturated with impact craters, the spatial distribution of craters is generally changed from random to more ordered. We measured impact craters on Deimos from Viking and HiRISE images and found (1) that the power law of the size-frequency distribution of the craters is approximately -1.7, which is significantly shallower than those of potential impactors, and (2) that the spatial distribution of craters over 30 m in diameter cannot be statistically distinguished from completely random distribution, which indicates that the surface of Deimos is inconsistent with a surface saturated with impact craters. Although a crater size-frequency distribution curve with a slope of -2 is generally interpreted as indicating saturation equilibrium, it is here proposed that two competing mechanisms, seismic shaking and ejecta emplacement, have played a major role in erasing craters on Deimos and are therefore responsible for the shallow slope of this curve. The observed crater density may have reached steady state owing to the obliterations induced by the two competing mechanisms. Such an occurrence indicates that the surface is saturated with impact craters despite the random distribution of craters on Deimos. Therefore, this work proposes that the age determined by the current craters on Deimos reflects neither the age of Deimos itself nor that of the formation of the large concavity centered at its south pole because craters should be removed by later impacts. However, a few of the largest craters on Deimos may be indicative of the age of the south pole event.

  39. The coalescent process in models with selection and recombination.

    PubMed

    Hudson, R R; Kaplan, N L

    1988-11-01

    The statistical properties of the process describing the genealogical history of a random sample of genes at a selectively neutral locus which is linked to a locus at which natural selection operates are investigated. It is found that the equations describing this process are simple modifications of the equations describing the process assuming that the two loci are completely linked. Thus, the statistical properties of the genealogical process for a random sample at a neutral locus linked to a locus with selection follow from the results obtained for the selected locus. Sequence data from the alcohol dehydrogenase (Adh) region of Drosophila melanogaster are examined and compared to predictions based on the theory. It is found that the spatial distribution of nucleotide differences between Fast and Slow alleles of Adh is very similar to the spatial distribution predicted if balancing selection operates to maintain the allozyme variation at the Adh locus. The spatial distribution of nucleotide differences between different Slow alleles of Adh do not match the predictions of this simple model very well.

  40. Distribution of diameters for Erdős-Rényi random graphs.

    PubMed

    Hartmann, A K; Mézard, M

    2018-03-01

    We study the distribution of diameters d of Erdős-Rényi random graphs with average connectivity c. The diameter d is the maximum among all the shortest distances between pairs of nodes in a graph and an important quantity for all dynamic processes taking place on graphs. Here we study the distribution P(d) numerically for various values of c, in the nonpercolating and percolating regimes. Using large-deviation techniques, we are able to reach small probabilities like 10^{-100} which allow us to obtain the distribution over basically the full range of the support, for graphs up to N=1000 nodes. For values c<1, our results are in good agreement with analytical results, proving the reliability of our numerical approach. For c>1 the distribution is more complex and no complete analytical results are available. For this parameter range, P(d) exhibits an inflection point, which we found to be related to a structural change of the graphs. For all values of c, we determined the finite-size rate function Φ(d/N) and were able to extrapolate numerically to N→∞, indicating that the large-deviation principle holds.
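
    Simple sampling reproduces the body of P(d), though not the 10^-100 tails, which require the large-deviation techniques described above. A hedged Python sketch with arbitrary sizes:

      import networkx as nx
      from collections import Counter

      c, N, runs = 2.0, 200, 400
      diams = []
      for s in range(runs):
          G = nx.gnp_random_graph(N, c / (N - 1), seed=s)
          # For c > 1, take the diameter of the largest component.
          comp = G.subgraph(max(nx.connected_components(G), key=len))
          diams.append(nx.diameter(comp))

      hist = Counter(diams)
      for d in sorted(hist):
          print(d, hist[d] / runs)   # simple-sampling estimate of P(d)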

  3. Distribution of diameters for Erdős-Rényi random graphs

    NASA Astrophysics Data System (ADS)

    Hartmann, A. K.; Mézard, M.

    2018-03-01

We study the distribution of diameters d of Erdős-Rényi random graphs with average connectivity c. The diameter d is the maximum among all the shortest distances between pairs of nodes in a graph and an important quantity for all dynamic processes taking place on graphs. Here we study the distribution P(d) numerically for various values of c, in the nonpercolating and percolating regimes. Using large-deviation techniques, we are able to reach small probabilities like 10^{-100} which allow us to obtain the distribution over basically the full range of the support, for graphs up to N = 1000 nodes. For values c < 1, our results are in good agreement with analytical results, proving the reliability of our numerical approach. For c > 1 the distribution is more complex and no complete analytical results are available. For this parameter range, P(d) exhibits an inflection point, which we found to be related to a structural change of the graphs. For all values of c, we determined the finite-size rate function Φ(d/N) and were able to extrapolate numerically to N → ∞, indicating that the large-deviation principle holds.

  4. Redshift data and statistical inference

    NASA Technical Reports Server (NTRS)

    Newman, William I.; Haynes, Martha P.; Terzian, Yervant

    1994-01-01

Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals or bins than is advisable by statistical theory, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.

  5. Properties of a new small-world network with spatially biased random shortcuts

    NASA Astrophysics Data System (ADS)

    Matsuzawa, Ryo; Tanimoto, Jun; Fukuda, Eriko

    2017-11-01

This paper introduces a small-world (SW) network with a power-law distance distribution for its shortcuts, which differs from conventional models that use completely random shortcuts. By incorporating spatial constraints, we analyze the divergence of the proposed model from conventional models in terms of fundamental network properties such as clustering coefficient, average path length, and degree distribution. We find that when the spatial constraint more strongly prohibits long shortcuts, the clustering coefficient improves and the average path length increases. We also analyze the spatial prisoner's dilemma (SPD) games played on our new SW network in order to understand its dynamical characteristics. Depending on the basis graph, i.e., whether it is a one-dimensional ring or a two-dimensional lattice, and the parameter controlling the prohibition of long-distance shortcuts, the emergent results can differ vastly.
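
    A minimal sketch of a distance-biased shortcut construction in this spirit, for the one-dimensional ring case. Parameter names such as alpha and n_shortcuts are illustrative, not the authors'; the exact generative rule in the paper may differ.

      import numpy as np
      import networkx as nx

      def biased_small_world(n, k, n_shortcuts, alpha, seed=0):
          rng = np.random.default_rng(seed)
          g = nx.watts_strogatz_graph(n, k, p=0.0)  # plain ring lattice
          d = np.arange(1, n // 2 + 1)              # possible ring distances
          w = d.astype(float) ** (-alpha)           # P(d) ~ d^(-alpha)
          w /= w.sum()
          for _ in range(n_shortcuts):
              i = rng.integers(n)
              dist = rng.choice(d, p=w)             # distance-biased shortcut
              g.add_edge(i, (i + dist) % n)         # dist >= 1, so no self-loop
          return g

      g = biased_small_world(n=500, k=4, n_shortcuts=100, alpha=2.0)
      print(nx.average_clustering(g), nx.average_shortest_path_length(g))

    Raising alpha suppresses long shortcuts, which is the regime where the abstract reports higher clustering and longer average path length.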

  6. Optimal sampling design for estimating spatial distribution and abundance of a freshwater mussel population

    USGS Publications Warehouse

    Pooler, P.S.; Smith, D.R.

    2005-01-01

We compared the ability of simple random sampling (SRS) and a variety of systematic sampling (SYS) designs to estimate abundance, quantify spatial clustering, and predict spatial distribution of freshwater mussels. Sampling simulations were conducted using data obtained from a census of freshwater mussels in a 40 × 33 m section of the Cacapon River near Capon Bridge, West Virginia, and from a simulated spatially random population generated to have the same abundance as the real population. Sampling units that were 0.25 m² gave more accurate and precise abundance estimates and generally better spatial predictions than 1-m² sampling units. Systematic sampling with ≥2 random starts was more efficient than SRS. Estimates of abundance based on SYS were more accurate when the distance between sampling units across the stream was less than or equal to the distance between sampling units along the stream. Three measures for quantifying spatial clustering were examined: Hopkins Statistic, the Clumping Index, and Morisita's Index. Morisita's Index was the most reliable, and the Hopkins Statistic was prone to false rejection of complete spatial randomness. SYS designs with units spaced equally across and up stream provided the most accurate predictions when estimating the spatial distribution by kriging. Our research indicates that SYS designs with sampling units equally spaced both across and along the stream would be appropriate for sampling freshwater mussels even if no information about the true underlying spatial distribution of the population were available to guide the design choice. © 2005 by The North American Benthological Society.
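
    Morisita's Index, the measure the study found most reliable, is simple to compute from quadrat counts. A minimal sketch with hypothetical counts:

      import numpy as np

      def morisita_index(counts):
          """Morisita's index of dispersion over quadrat counts.

          ~1 for a random (Poisson) pattern, > 1 for a clumped pattern,
          < 1 for a uniform pattern."""
          x = np.asarray(counts, dtype=float)
          n, N = len(x), x.sum()
          return n * (x * (x - 1)).sum() / (N * (N - 1))

      print(morisita_index([3, 0, 1, 9, 0, 2, 7, 0]))  # > 1 suggests clumping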

  7. Image-Based Modeling Reveals Dynamic Redistribution of DNA Damage into Nuclear Sub-Domains

    PubMed Central

    Costes, Sylvain V; Ponomarev, Artem; Chen, James L; Nguyen, David; Cucinotta, Francis A; Barcellos-Hoff, Mary Helen

    2007-01-01

Several proteins involved in the response to DNA double strand breaks (DSB) form microscopically visible nuclear domains, or foci, after exposure to ionizing radiation. Radiation-induced foci (RIF) are believed to be located where DNA damage occurs. To test this assumption, we analyzed the spatial distribution of 53BP1, phosphorylated ATM, and γH2AX RIF in cells irradiated with high and low linear energy transfer (LET) radiation. Since energy is randomly deposited along high-LET particle paths, RIF along these paths should also be randomly distributed. The probability of inducing DSBs can be derived from DNA fragment data measured experimentally by pulsed-field gel electrophoresis. We used this probability in Monte Carlo simulations to predict DSB locations in synthetic nuclei geometrically described by a complete set of human chromosomes, taking into account microscope optics from real experiments. As expected, simulations produced DNA-weighted random (Poisson) distributions. In contrast, the distributions of RIF obtained as early as 5 min after exposure to high LET (1 GeV/amu Fe) were non-random. This deviation from the expected DNA-weighted random pattern can be further characterized by “relative DNA image measurements.” This novel imaging approach shows that RIF were located preferentially at the interface between high and low DNA density regions, and were more frequent than predicted in regions with lower DNA density. The same preferential nuclear location was also measured for RIF induced by 1 Gy of low-LET radiation. This deviation from random behavior was evident only 5 min after irradiation for phosphorylated ATM RIF, while γH2AX and 53BP1 RIF showed pronounced deviations up to 30 min after exposure. These data suggest that DNA damage–induced foci are restricted to certain regions of the nucleus of human epithelial cells. It is possible that DNA lesions are collected in these nuclear sub-domains for more efficient repair. PMID:17676951

  8. The conditional power of randomization tests for single-case effect sizes in designs with randomized treatment order: A Monte Carlo simulation study.

    PubMed

    Michiels, Bart; Heyvaert, Mieke; Onghena, Patrick

    2018-04-01

    The conditional power (CP) of the randomization test (RT) was investigated in a simulation study in which three different single-case effect size (ES) measures were used as the test statistics: the mean difference (MD), the percentage of nonoverlapping data (PND), and the nonoverlap of all pairs (NAP). Furthermore, we studied the effect of the experimental design on the RT's CP for three different single-case designs with rapid treatment alternation: the completely randomized design (CRD), the randomized block design (RBD), and the restricted randomized alternation design (RRAD). As a third goal, we evaluated the CP of the RT for three types of simulated data: data generated from a standard normal distribution, data generated from a uniform distribution, and data generated from a first-order autoregressive Gaussian process. The results showed that the MD and NAP perform very similarly in terms of CP, whereas the PND performs substantially worse. Furthermore, the RRAD yielded marginally higher power in the RT, followed by the CRD and then the RBD. Finally, the power of the RT was almost unaffected by the type of the simulated data. On the basis of the results of the simulation study, we recommend at least 20 measurement occasions for single-case designs with a randomized treatment order that are to be evaluated with an RT using a 5% significance level. Furthermore, we do not recommend use of the PND, because of its low power in the RT.
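
    A minimal sketch of a randomization test under a completely randomized design with the mean difference as test statistic, on synthetic data. The permutation scheme below fixes the group sizes, a common simplification; the exact randomization schemes compared in the study may differ in detail.

      import numpy as np

      rng = np.random.default_rng(1)
      scores = rng.normal(0, 1, 20)
      scores[10:] += 1.0                       # treatment effect on 10 occasions
      labels = np.array([0] * 10 + [1] * 10)   # 0 = baseline, 1 = treatment

      observed = scores[labels == 1].mean() - scores[labels == 0].mean()
      n_perm, count = 10_000, 0
      for _ in range(n_perm):
          perm = rng.permutation(labels)       # re-randomize treatment order
          md = scores[perm == 1].mean() - scores[perm == 0].mean()
          count += md >= observed
      print("one-sided p =", (count + 1) / (n_perm + 1))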

  9. Evaluation of two school-based HIV prevention interventions in the border city of Tijuana, Mexico.

    PubMed

    Martinez-Donate, Ana P; Hovell, Melbourne F; Zellner, Jennifer; Sipan, Carol L; Blumberg, Elaine J; Carrizosa, Claudia

    2004-08-01

    This research project examined the individual and combined effectiveness of an HIV prevention workshop and a free condom distribution program in four high schools in Tijuana, Mexico. Adolescents (N = 320) completed baseline measures on sexual practices and theoretical correlates and participated in a two-part study. In Study 1, students were randomly assigned to an HIV prevention workshop or a control condition, with a 3-month follow-up assessment. Results indicate three significant workshop benefits regarding HIV transmission by altering sexual initiation, access to condoms, and traditional beliefs regarding condoms. In Study 2, we set up a condom distribution program at two of the participating schools, and students completed a 6-month follow-up assessment. Results indicate that exposure to the workshop followed by access to the condom distribution program yielded two beneficial results for reducing HIV transmission: moderating sexual initiation and increasing condom acquisition. Access to the condom distribution program alone had no effects on behavioral and psychosocial correlates of HIV transmission. We discuss implications of these results.

  10. A Bayesian, generalized frailty model for comet assays.

    PubMed

    Ghebretinsae, Aklilu Habteab; Faes, Christel; Molenberghs, Geert; De Boeck, Marlies; Geys, Helena

    2013-05-01

This paper proposes a flexible modeling approach for so-called comet assay data regularly encountered in preclinical research. While such data consist of non-Gaussian outcomes in a multilevel hierarchical structure, traditional analyses typically completely or partly ignore this hierarchical nature by summarizing measurements within a cluster. Non-Gaussian outcomes are often modeled using exponential family models. This is true not only for binary and count data, but also, for example, for time-to-event outcomes. Two important reasons for extending this family are (1) the possible occurrence of overdispersion, meaning that the variability in the data may not be adequately described by the models, which often exhibit a prescribed mean-variance link, and (2) the accommodation of a hierarchical structure in the data, owing to clustering in the data. The first issue is dealt with through so-called overdispersion models. Clustering is often accommodated through the inclusion of random subject-specific effects. Though not always, one conventionally assumes such random effects to be normally distributed. In the case of time-to-event data, one encounters, for example, the gamma frailty model (Duchateau and Janssen, 2007). While both of these issues may occur simultaneously, models combining both are uncommon. Molenberghs et al. (2010) proposed a broad class of generalized linear models accommodating overdispersion and clustering through two separate sets of random effects. Here, we use this method to model data from a comet assay with a three-level hierarchical structure. Although a conjugate gamma random effect is used for the overdispersion random effect, both gamma and normal random effects are considered for the hierarchical random effect. Apart from model formulation, we place emphasis on Bayesian estimation. Our proposed method has the upper hand over the traditional analysis in that it (1) uses the appropriate distribution stipulated in the literature; (2) deals with the complete hierarchical nature; and (3) uses all information instead of summary measures. The fit of the model to the comet assay data is compared against the background of more conventional model fits. Results indicate the toxicity of 1,2-dimethylhydrazine dihydrochloride at different dose levels (low, medium, and high).

  11. Fast Algorithms for Estimating Mixture Parameters

    DTIC Science & Technology

    1989-08-30

The investigation is a two-year project, with the first year sponsored by the Army Research Office and the second year by the National Science Foundation. Numerical testing of the accelerated fixed-point method was completed. The work on relaxation methods will be done under the sponsorship of the National Science Foundation during the coming year. Keywords: fast algorithms; mixture distributions; random variables.

  12. First Results on Angular Distributions of Thermal Dileptons in Nuclear Collisions

    NASA Astrophysics Data System (ADS)

    Arnaldi, R.; Banicz, K.; Castor, J.; Chaurand, B.; Cicalò, C.; Colla, A.; Cortese, P.; Damjanovic, S.; David, A.; de Falco, A.; Devaux, A.; Ducroux, L.; En'Yo, H.; Fargeix, J.; Ferretti, A.; Floris, M.; Förster, A.; Force, P.; Guettet, N.; Guichard, A.; Gulkanian, H.; Heuser, J. M.; Keil, M.; Kluberg, L.; Lourenço, C.; Lozano, J.; Manso, F.; Martins, P.; Masoni, A.; Neves, A.; Ohnishi, H.; Oppedisano, C.; Parracho, P.; Pillot, P.; Poghosyan, T.; Puddu, G.; Radermacher, E.; Ramalhete, P.; Rosinsky, P.; Scomparin, E.; Seixas, J.; Serci, S.; Shahoyan, R.; Sonderegger, P.; Specht, H. J.; Tieulent, R.; Usai, G.; Veenhof, R.; Wöhri, H. K.

    2009-06-01

    The NA60 experiment at the CERN Super Proton Synchrotron has studied dimuon production in 158AGeV In-In collisions. The strong excess of pairs above the known sources found in the complete mass region 0.2

  13. Completely device-independent quantum key distribution

    NASA Astrophysics Data System (ADS)

    Aguilar, Edgar A.; Ramanathan, Ravishankar; Kofler, Johannes; Pawłowski, Marcin

    2016-08-01

    Quantum key distribution (QKD) is a provably secure way for two distant parties to establish a common secret key, which then can be used in a classical cryptographic scheme. Using quantum entanglement, one can reduce the necessary assumptions that the parties have to make about their devices, giving rise to device-independent QKD (DIQKD). However, in all existing protocols to date the parties need to have an initial (at least partially) random seed as a resource. In this work, we show that this requirement can be dropped. Using recent advances in the fields of randomness amplification and randomness expansion, we demonstrate that it is sufficient for the message the parties want to communicate to be (partially) unknown to the adversaries—an assumption without which any type of cryptography would be pointless to begin with. One party can use her secret message to locally generate a secret sequence of bits, which can then be openly used by herself and the other party in a DIQKD protocol. Hence our work reduces the requirements needed to perform secure DIQKD and establish safe communication.

  14. Distributional assumptions in food and feed commodities- development of fit-for-purpose sampling protocols.

    PubMed

    Paoletti, Claudia; Esbensen, Kim H

    2015-01-01

Material heterogeneity influences the effectiveness of sampling procedures. Most sampling guidelines used for assessment of food and/or feed commodities are based on classical statistical distribution requirements (the normal, binomial, and Poisson distributions) and almost universally rely on the assumption of randomness. However, this is unrealistic. The scientific food and feed community recognizes a strong preponderance of non-random distributions within commodity lots, which should be a more realistic prerequisite for the definition of effective sampling protocols. Nevertheless, these heterogeneity issues are overlooked, as the prime focus is often placed only on financial, time, equipment, and personnel constraints instead of mandating acquisition of documented representative samples under realistic heterogeneity conditions. This study shows how the principles promulgated in the Theory of Sampling (TOS) and practically tested over 60 years provide an effective framework for dealing with the complete set of adverse aspects of both compositional and distributional heterogeneity (material sampling errors), as well as with the errors incurred by the sampling process itself. The results of an empirical European Union study on genetically modified soybean heterogeneity (Kernel Lot Distribution Assessment) are summarized, as they have a strong bearing on the issue of proper sampling protocol development. TOS principles apply universally in the food and feed realm and must therefore be considered the only basis for development of valid sampling protocols free from distributional constraints.

  15. [Spatial distribution pattern of Chilo suppressalis analyzed by classical method and geostatistics].

    PubMed

    Yuan, Zheming; Fu, Wei; Li, Fangyi

    2004-04-01

Two original samples of Chilo suppressalis and their grid, random, and sequence subsamples were analyzed by the classical method and by geostatistics to characterize the spatial distribution pattern of C. suppressalis. The limitations of spatial distribution analysis with the classical method, especially its sensitivity to the original position of the grid, were summarized rather completely. By contrast, geostatistics characterized well the spatial distribution pattern, aggregation intensity, and spatial heterogeneity of C. suppressalis. According to geostatistics, the population followed a Poisson distribution at low density. At higher density, the distribution was aggregative, with an aggregation intensity of 0.1056 and a dependence range of 193 cm. Spatial heterogeneity was also found in the higher-density population: its spatial correlation was stronger along the line direction than along the row direction, with dependence ranges of 115 and 264 cm in the line and row directions, respectively.

  16. Modeling pattern in collections of parameters

    USGS Publications Warehouse

    Link, W.A.

    1999-01-01

    Wildlife management is increasingly guided by analyses of large and complex datasets. The description of such datasets often requires a large number of parameters, among which certain patterns might be discernible. For example, one may consider a long-term study producing estimates of annual survival rates; of interest is the question whether these rates have declined through time. Several statistical methods exist for examining pattern in collections of parameters. Here, I argue for the superiority of 'random effects models' in which parameters are regarded as random variables, with distributions governed by 'hyperparameters' describing the patterns of interest. Unfortunately, implementation of random effects models is sometimes difficult. Ultrastructural models, in which the postulated pattern is built into the parameter structure of the original data analysis, are approximations to random effects models. However, this approximation is not completely satisfactory: failure to account for natural variation among parameters can lead to overstatement of the evidence for pattern among parameters. I describe quasi-likelihood methods that can be used to improve the approximation of random effects models by ultrastructural models.

  17. Ising Critical Behavior of Inhomogeneous Curie-Weiss Models and Annealed Random Graphs

    NASA Astrophysics Data System (ADS)

    Dommers, Sander; Giardinà, Cristian; Giberti, Claudio; van der Hofstad, Remco; Prioriello, Maria Luisa

    2016-11-01

We study the critical behavior for inhomogeneous versions of the Curie-Weiss model, where the coupling constant J_ij(β) for the edge ij on the complete graph is given by J_ij(β) = β w_i w_j / Σ_{k∈[N]} w_k. We call the product form of these couplings the rank-1 inhomogeneous Curie-Weiss model. This model also arises (with inverse temperature β replaced by sinh(β)) from the annealed Ising model on the generalized random graph. We assume that the vertex weights (w_i)_{i∈[N]} are regular, in the sense that their empirical distribution converges and the second moment converges as well. We identify the critical temperatures and exponents for these models, as well as a non-classical limit theorem for the total spin at the critical point. These depend sensitively on the number of finite moments of the weight distribution. When the fourth moment of the weight distribution converges, then the critical behavior is the same as on the (homogeneous) Curie-Weiss model, so that the inhomogeneity is weak. When the fourth moment of the weights converges to infinity, and the weights satisfy an asymptotic power law with exponent τ ∈ (3,5), then the critical exponents depend sensitively on τ. In addition, at criticality, the total spin S_N satisfies that S_N / N^{(τ-2)/(τ-1)} converges in law to some limiting random variable whose distribution we explicitly characterize.

  18. Random matrix theory analysis of cross-correlations in the US stock market: Evidence from Pearson’s correlation coefficient and detrended cross-correlation coefficient

    NASA Astrophysics Data System (ADS)

    Wang, Gang-Jin; Xie, Chi; Chen, Shou; Yang, Jiao-Jiao; Yang, Ming-Yan

    2013-09-01

In this study, we first build two empirical cross-correlation matrices in the US stock market by two different methods, namely the Pearson's correlation coefficient and the detrended cross-correlation coefficient (DCCA coefficient). Then, combining the two matrices with the method of random matrix theory (RMT), we mainly investigate the statistical properties of cross-correlations in the US stock market. We choose the daily closing prices of 462 constituent stocks of the S&P 500 index as the research objects and select the sample data from January 3, 2005 to August 31, 2012. In the empirical analysis, we examine the statistical properties of cross-correlation coefficients, the distribution of eigenvalues, the distribution of eigenvector components, and the inverse participation ratio. From the two methods, we find some new results on the cross-correlations in the US stock market that differ from the conclusions reached by previous studies. The empirical cross-correlation matrices constructed by the DCCA coefficient show several interesting properties at different time scales in the US stock market, which are useful to risk management and optimal portfolio selection, especially to the diversity of the asset portfolio. Finding the theoretical eigenvalue distribution of a completely random matrix R for the DCCA coefficient remains an interesting and meaningful open problem, because it does not obey the Marčenko-Pastur distribution.
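
    The Marčenko-Pastur benchmark used in such RMT analyses is easy to reproduce for the Pearson case. A sketch with illustrative dimensions (no closed form exists for the DCCA-coefficient matrix, as the abstract notes):

      import numpy as np

      N, T = 462, 1928                 # stocks x trading days (illustrative)
      q = N / T
      returns = np.random.default_rng(2).normal(size=(T, N))  # i.i.d. noise
      corr = np.corrcoef(returns, rowvar=False)                # N x N matrix
      eig = np.linalg.eigvalsh(corr)
      # Marcenko-Pastur support for a pure-noise correlation matrix
      lo, hi = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2
      print(f"MP bounds: [{lo:.3f}, {hi:.3f}]")
      print("eigenvalues outside bounds:", np.sum((eig < lo) | (eig > hi)))

    Empirical market eigenvalues far above the upper bound are the signal of genuine cross-correlations; for pure noise, essentially all eigenvalues fall inside the bounds (up to finite-size effects).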

  19. Rates of profit as correlated sums of random variables

    NASA Astrophysics Data System (ADS)

    Greenblatt, R. E.

    2013-10-01

    Profit realization is the dominant feature of market-based economic systems, determining their dynamics to a large extent. Rather than attaining an equilibrium, profit rates vary widely across firms, and the variation persists over time. Differing definitions of profit result in differing empirical distributions. To study the statistical properties of profit rates, I used data from a publicly available database for the US Economy for 2009-2010 (Risk Management Association). For each of three profit rate measures, the sample space consists of 771 points. Each point represents aggregate data from a small number of US manufacturing firms of similar size and type (NAICS code of principal product). When comparing the empirical distributions of profit rates, significant ‘heavy tails’ were observed, corresponding principally to a number of firms with larger profit rates than would be expected from simple models. An apparently novel correlated sum of random variables statistical model was used to model the data. In the case of operating and net profit rates, a number of firms show negative profits (losses), ruling out simple gamma or lognormal distributions as complete models for these data.

  20. Complete chloroplast genome sequences of Solanum commersonii and its application to chloroplast genotype in somatic hybrids with Solanum tuberosum.

    PubMed

    Cho, Kwang-Soo; Cheon, Kyeong-Sik; Hong, Su-Young; Cho, Ji-Hong; Im, Ju-Seong; Mekapogu, Manjulatha; Yu, Yei-Soo; Park, Tae-Ho

    2016-10-01

The chloroplast genomes of Solanum commersonii and Solanum tuberosum were completely sequenced, and Indel markers were successfully applied to distinguish chlorotypes, demonstrating that the chloroplast genome was randomly distributed during protoplast fusion. Somatic hybridization has been widely employed for the introgression of resistance to several diseases from wild Solanum species to overcome sexual barriers in potato breeding. Solanum commersonii is a major resource used as a parent line in somatic hybridization to improve bacterial wilt resistance in interspecies transfer to cultivated potato (S. tuberosum). Here, we sequenced the complete chloroplast genomes of Lz3.2 (S. commersonii) and S. tuberosum (PT56), which were used to develop fusion products, then compared them with those of five members of the Solanaceae family (S. tuberosum, Capsicum annuum, S. lycopersicum, S. bulbocastanum, and S. nigrum) and Coffea arabica as an out-group. We then developed Indel markers for application in chloroplast genotyping. The complete chloroplast genome of Lz3.2 is composed of 155,525 bp, which is larger than the PT56 genome with 155,296 bp. Gene content, order, and orientation of the S. commersonii chloroplast genome were highly conserved with those of other Solanaceae species, and the phylogenetic tree revealed that S. commersonii is located within the same node as S. tuberosum. However, sequence alignment revealed nine Indels between S. commersonii and S. tuberosum in their chloroplast genomes, allowing two Indel markers to be developed. The markers could distinguish the two species and were successfully applied to chloroplast genotyping (chlorotyping) in somatic hybrids and their progenies. The results obtained in this study confirm the random distribution of the chloroplast genome during protoplast fusion and its maternal inheritance, and can be applied to select proper plastid genotypes in potato breeding programs.

  1. Fast, Distributed Algorithms in Deep Networks

    DTIC Science & Technology

    2016-05-11

Training is complete when (2) converges or, stated alternatively, when the difference between t and φL can no longer be ... The state-of-the-art approaches simply rely on random initialization. We propose an alternative. [Figure: (a) features in 1-dimensional space; (b) features ...]

  2. Distributed Matrix Completion: Application to Cooperative Positioning in Noisy Environments

    DTIC Science & Technology

    2013-12-11

    positioning, and a gossip version of low-rank approximation were developed. A convex relaxation for positioning in the presence of noise was shown to...of a large data matrix through gossip algorithms. A new algorithm is proposed that amounts to iteratively multiplying a vector by independent random...sparsification of the original matrix and averaging the resulting normalized vectors. This can be viewed as a generalization of gossip algorithms for

  3. Complete Numerical Solution of the Diffusion Equation of Random Genetic Drift

    PubMed Central

    Zhao, Lei; Yue, Xingye; Waxman, David

    2013-01-01

    A numerical method is presented to solve the diffusion equation for the random genetic drift that occurs at a single unlinked locus with two alleles. The method was designed to conserve probability, and the resulting numerical solution represents a probability distribution whose total probability is unity. We describe solutions of the diffusion equation whose total probability is unity as complete. Thus the numerical method introduced in this work produces complete solutions, and such solutions have the property that whenever fixation and loss can occur, they are automatically included within the solution. This feature demonstrates that the diffusion approximation can describe not only internal allele frequencies, but also the boundary frequencies zero and one. The numerical approach presented here constitutes a single inclusive framework from which to perform calculations for random genetic drift. It has a straightforward implementation, allowing it to be applied to a wide variety of problems, including those with time-dependent parameters, such as changing population sizes. As tests and illustrations of the numerical method, it is used to determine: (i) the probability density and time-dependent probability of fixation for a neutral locus in a population of constant size; (ii) the probability of fixation in the presence of selection; and (iii) the probability of fixation in the presence of selection and demographic change, the latter in the form of a changing population size. PMID:23749318
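
    As a cross-check of point (i) by direct simulation rather than the paper's PDE solver: under neutral Wright-Fisher drift, the fixation probability of an allele equals its initial frequency. A minimal sketch with illustrative parameters:

      import numpy as np

      def wright_fisher_fixation(N, p0, generations, reps, seed=3):
          """Fraction of replicate populations in which the allele fixes."""
          rng = np.random.default_rng(seed)
          fixed = 0
          for _ in range(reps):
              p = p0
              for _ in range(generations):
                  p = rng.binomial(2 * N, p) / (2 * N)  # binomial resampling
                  if p in (0.0, 1.0):                   # loss or fixation
                      break
              fixed += p == 1.0
          return fixed / reps

      print(wright_fisher_fixation(N=100, p0=0.3, generations=5000, reps=2000))
      # close to 0.3, the diffusion-theory prediction for a neutral allele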

  4. First Results on Angular Distributions of Thermal Dileptons in Nuclear Collisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnaldi, R.; Colla, A.; Cortese, P.

    The NA60 experiment at the CERN Super Proton Synchrotron has studied dimuon production in 158A GeV In-In collisions. The strong excess of pairs above the known sources found in the complete mass region 0.2

  5. Effective Network Size Predicted From Simulations of Pathogen Outbreaks Through Social Networks Provides a Novel Measure of Structure-Standardized Group Size.

    PubMed

    McCabe, Collin M; Nunn, Charles L

    2018-01-01

    The transmission of infectious disease through a population is often modeled assuming that interactions occur randomly in groups, with all individuals potentially interacting with all other individuals at an equal rate. However, it is well known that pairs of individuals vary in their degree of contact. Here, we propose a measure to account for such heterogeneity: effective network size (ENS), which refers to the size of a maximally complete network (i.e., unstructured, where all individuals interact with all others equally) that corresponds to the outbreak characteristics of a given heterogeneous, structured network. We simulated susceptible-infected (SI) and susceptible-infected-recovered (SIR) models on maximally complete networks to produce idealized outbreak duration distributions for a disease on a network of a given size. We also simulated the transmission of these same diseases on random structured networks and then used the resulting outbreak duration distributions to predict the ENS for the group or population. We provide the methods to reproduce these analyses in a public R package, "enss." Outbreak durations of simulations on randomly structured networks were more variable than those on complete networks, but tended to have similar mean durations of disease spread. We then applied our novel metric to empirical primate networks taken from the literature and compared the information represented by our ENSs to that by other established social network metrics. In AICc model comparison frameworks, group size and mean distance proved to be the metrics most consistently associated with ENS for SI simulations, while group size, centralization, and modularity were most consistently associated with ENS for SIR simulations. In all cases, ENS was shown to be associated with at least two other independent metrics, supporting its use as a novel metric. Overall, our study provides a proof of concept for simulation-based approaches toward constructing metrics of ENS, while also revealing the conditions under which this approach is most promising.
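
    A hedged sketch of the simulation idea behind ENS: run a discrete-time SI process and record the outbreak duration; matching a structured graph's duration distribution to that of a complete graph of some size n would give the graph's ENS. The function and parameters below are ours, written in Python for consistency with the other sketches here, and are not the API of the authors' "enss" R package.

      import random
      import networkx as nx

      def si_outbreak_duration(g, beta=0.3, seed=None):
          """Time steps until the SI process fills the seed's component."""
          rnd = random.Random(seed)
          seed_node = rnd.choice(list(g))
          infected = {seed_node}
          target = len(nx.node_connected_component(g, seed_node))
          t = 0
          while len(infected) < target:
              new = {v for u in infected for v in g[u]
                     if v not in infected and rnd.random() < beta}
              infected |= new
              t += 1
          return t

      complete = nx.complete_graph(30)
      structured = nx.connected_watts_strogatz_graph(30, 4, 0.1, seed=4)
      for name, g in [("complete", complete), ("structured", structured)]:
          runs = [si_outbreak_duration(g, seed=s) for s in range(500)]
          print(name, sum(runs) / len(runs))  # structured graphs run longer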

  6. A complete sample of double-lobed radio quasars for VLBI tests of source models - Definition and statistics

    NASA Technical Reports Server (NTRS)

    Hough, D. H.; Readhead, A. C. S.

    1989-01-01

    A complete, flux-density-limited sample of double-lobed radio quasars is defined, with nuclei bright enough to be mapped with the Mark III VLBI system. It is shown that the statistics of linear size, nuclear strength, and curvature are consistent with the assumption of random source orientations and simple relativistic beaming in the nuclei. However, these statistics are also consistent with the effects of interaction between the beams and the surrounding medium. The distribution of jet velocities in the nuclei, as measured with VLBI, will provide a powerful test of physical theories of extragalactic radio sources.

  7. Random walk, diffusion and mixing in simulations of scalar transport in fluid flows

    NASA Astrophysics Data System (ADS)

    Klimenko, A. Y.

    2008-12-01

    Physical similarity and mathematical equivalence of continuous diffusion and particle random walk form one of the cornerstones of modern physics and the theory of stochastic processes. In many applied models used in simulation of turbulent transport and turbulent combustion, mixing between particles is used to reflect the influence of the continuous diffusion terms in the transport equations. We show that the continuous scalar transport and diffusion can be accurately specified by means of mixing between randomly walking Lagrangian particles with scalar properties and assess errors associated with this scheme. This gives an alternative formulation for the stochastic process which is selected to represent the continuous diffusion. This paper focuses on statistical errors and deals with relatively simple cases, where one-particle distributions are sufficient for a complete description of the problem.
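
    The random-walk/diffusion correspondence the paper builds on can be checked in a few lines. The sketch below verifies that an ensemble of Gaussian steps reproduces the diffusion variance 2Dt with D = σ²/(2Δt); the mixing-between-particles scheme that is the paper's actual subject is not modelled here.

      import numpy as np

      rng = np.random.default_rng(5)
      n_particles, n_steps, dt, sigma = 100_000, 200, 0.01, 0.1
      x = np.zeros(n_particles)
      for _ in range(n_steps):
          x += rng.normal(0.0, sigma, n_particles)   # one random-walk step

      D = sigma**2 / (2 * dt)      # diffusivity implied by the step statistics
      t = n_steps * dt
      print("sample variance:   ", x.var())
      print("diffusion theory 2Dt:", 2 * D * t)      # should agree closely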

  8. Linear Regression with a Randomly Censored Covariate: Application to an Alzheimer's Study.

    PubMed

    Atem, Folefac D; Qian, Jing; Maye, Jacqueline E; Johnson, Keith A; Betensky, Rebecca A

    2017-01-01

    The association between maternal age of onset of dementia and amyloid deposition (measured by in vivo positron emission tomography (PET) imaging) in cognitively normal older offspring is of interest. In a regression model for amyloid, special methods are required due to the random right censoring of the covariate of maternal age of onset of dementia. Prior literature has proposed methods to address the problem of censoring due to assay limit of detection, but not random censoring. We propose imputation methods and a survival regression method that do not require parametric assumptions about the distribution of the censored covariate. Existing imputation methods address missing covariates, but not right censored covariates. In simulation studies, we compare these methods to the simple, but inefficient complete case analysis, and to thresholding approaches. We apply the methods to the Alzheimer's study.

  9. Time distributions of solar energetic particle events: Are SEPEs really random?

    NASA Astrophysics Data System (ADS)

    Jiggens, P. T. A.; Gabriel, S. B.

    2009-10-01

    Solar energetic particle events (SEPEs) can exhibit flux increases of several orders of magnitude over background levels and have always been considered to be random in nature in statistical models with no dependence of any one event on the occurrence of previous events. We examine whether this assumption of randomness in time is correct. Engineering modeling of SEPEs is important to enable reliable and efficient design of both Earth-orbiting and interplanetary spacecraft and future manned missions to Mars and the Moon. All existing engineering models assume that the frequency of SEPEs follows a Poisson process. We present analysis of the event waiting times using alternative distributions described by Lévy and time-dependent Poisson processes and compared these with the usual Poisson distribution. The results show significant deviation from a Poisson process and indicate that the underlying physical processes might be more closely related to a Lévy-type process, suggesting that there is some inherent “memory” in the system. Inherent Poisson assumptions of stationarity and event independence are investigated, and it appears that they do not hold and can be dependent upon the event definition used. SEPEs appear to have some memory indicating that events are not completely random with activity levels varying even during solar active periods and are characterized by clusters of events. This could have significant ramifications for engineering models of the SEP environment, and it is recommended that current statistical engineering models of the SEP environment should be modified to incorporate long-term event dependency and short-term system memory.
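
    A minimal sketch of the kind of waiting-time check described here: under a stationary Poisson process the waiting times are exponential, so a Kolmogorov-Smirnov test against a fitted exponential flags clustering. The waiting times below are synthetic, standing in for a real SEPE catalogue, and fitting the scale from the data makes the p-value only approximate.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      # clustered toy data: short intra-cluster gaps mixed with long quiet spells
      waits = np.concatenate([rng.exponential(1.0, 300),
                              rng.exponential(25.0, 60)])
      res = stats.kstest(waits, "expon", args=(0, waits.mean()))
      print(res.statistic, res.pvalue)  # small p => not a simple Poisson process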

  10. The Complete Redistribution Approximation in Optically Thick Line-Driven Winds

    NASA Astrophysics Data System (ADS)

    Gayley, K. G.; Onifer, A. J.

    2001-05-01

Wolf-Rayet winds are thought to exhibit large momentum fluxes, which has in part been explained by ionization stratification in the wind. However, it is the cause of the high mass loss, not the high momentum flux, that remains largely a mystery, because standard models fail to achieve sufficient acceleration near the surface where the mass-loss rate is set. We consider a radiative transfer approximation, called the complete redistribution approximation, that allows the dynamics of optically thick Wolf-Rayet winds to be modeled without detailed treatment of the radiation field. In this approximation, it is assumed that thermalization processes cause the photon frequencies to be completely randomized over the course of propagation through the wind, which allows the radiation field to be treated statistically rather than in detail. Thus the approach is similar to the statistical treatment of the line list used in the celebrated CAK approach. The results differ from the effectively gray treatment in that the radiation field is influenced by the line distribution, and the role of gaps in the line distribution is enhanced. The ramifications for the driving of large mass-loss rates are explored.

  11. Decoherence-induced conductivity in the one-dimensional Anderson model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stegmann, Thomas; Wolf, Dietrich E.; Ujsághy, Orsolya

We study the effect of decoherence on the electron transport in the one-dimensional Anderson model by means of a statistical model [1, 2, 3, 4, 5]. In this model decoherence bonds are randomly distributed within the system, at which the electron phase is randomized completely. Afterwards, the transport quantity of interest (e.g. resistance or conductance) is ensemble averaged over the decoherence configurations. Averaging the resistance of the sample, the calculation can be performed analytically. In the thermodynamic limit, we find a decoherence-driven transition from the quantum-coherent localized regime to the Ohmic regime at a critical decoherence density, which is determined by the second-order generalized Lyapunov exponent (GLE) [4].

  12. Conditional Tests for Localizing Trait Genes

    PubMed Central

    Di, Yanming; Thompson, Elizabeth A.

    2009-01-01

    Background/Aims With pedigree data, genetic linkage can be detected using inheritance vector tests, which explore the discrepancy between the posterior distribution of the inheritance vectors given observed trait values and the prior distribution of the inheritance vectors. In this paper, we propose conditional inheritance vector tests for linkage localization. These conditional tests can also be used to detect additional linkage signals in the presence of previously detected causal genes. Methods For linkage localization, we propose to perform inheritance vector tests conditioning on the inheritance vectors at two positions bounding a test region. We can detect additional linkage signals by conducting a further conditional test in a region with no previously detected genes. We use randomized p values to extend the marginal and conditional tests when the inheritance vectors cannot be completely determined from genetic marker data. Results We conduct simulation studies to compare and contrast the marginal and the conditional tests and to demonstrate that randomized p values can capture both the significance and the uncertainty in the test results. Conclusions The simulation results demonstrate that the proposed conditional tests provide useful localization information, and with informative marker data, the uncertainty in randomized marginal and conditional test results is small. PMID:19439976

  13. Effects of sensorimotor foot training on the symmetry of weight distribution on the lower extremities of patients in the chronic phase after stroke

    PubMed Central

    Goliwas, Magdalena; Kocur, Piotr; Furmaniuk, Lech; Majchrzycki, Marian; Wiernicka, Marzena; Lewandowski, Jacek

    2015-01-01

[Purpose] To assess the effects of sensorimotor foot stimulation on the symmetry of weight distribution on the feet of patients in the chronic post-stroke phase. [Subjects and Methods] This study was a prospective, single blind, randomized controlled trial. In the study we examined patients with chronic stroke (post-stroke duration > 1 year). They were randomly allocated to the study group (n=8) or to the control group (n=12). Both groups completed a standard six-week rehabilitation programme. In the study group, the standard rehabilitation programme was supplemented with sensorimotor foot stimulation training. Each patient underwent two assessments of symmetry of weight distribution on the lower extremities with and without visual control, on a treadmill, with stabilometry measurements, and under static conditions. [Results] Only the study group demonstrated a significant increase in the weight placed on the leg directly affected by stroke, and a reduction in asymmetry of weight-bearing on the lower extremities. [Conclusion] Sensorimotor stimulation of the feet enhanced weight bearing on the foot on the side of the body directly affected by stroke and decreased the asymmetry of weight distribution on the lower extremities of patients in the chronic post-stroke phase. PMID:26504326

  14. Effects of sensorimotor foot training on the symmetry of weight distribution on the lower extremities of patients in the chronic phase after stroke.

    PubMed

    Goliwas, Magdalena; Kocur, Piotr; Furmaniuk, Lech; Majchrzycki, Marian; Wiernicka, Marzena; Lewandowski, Jacek

    2015-09-01

[Purpose] To assess the effects of sensorimotor foot stimulation on the symmetry of weight distribution on the feet of patients in the chronic post-stroke phase. [Subjects and Methods] This study was a prospective, single blind, randomized controlled trial. In the study we examined patients with chronic stroke (post-stroke duration > 1 year). They were randomly allocated to the study group (n=8) or to the control group (n=12). Both groups completed a standard six-week rehabilitation programme. In the study group, the standard rehabilitation programme was supplemented with sensorimotor foot stimulation training. Each patient underwent two assessments of symmetry of weight distribution on the lower extremities with and without visual control, on a treadmill, with stabilometry measurements, and under static conditions. [Results] Only the study group demonstrated a significant increase in the weight placed on the leg directly affected by stroke, and a reduction in asymmetry of weight-bearing on the lower extremities. [Conclusion] Sensorimotor stimulation of the feet enhanced weight bearing on the foot on the side of the body directly affected by stroke and decreased the asymmetry of weight distribution on the lower extremities of patients in the chronic post-stroke phase.

  15. Random sampling of elementary flux modes in large-scale metabolic networks.

    PubMed

    Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel

    2012-09-15

    The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. dmachado@deb.uminho.pt Supplementary data are available at Bioinformatics online.
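
    The unbiased filtering step can be illustrated in a few lines of Python (a toy sketch, not the emsampler code): at each iteration, a uniform random subset of the newly generated candidate modes is retained, so every candidate has the same probability of selection.

      import random

      def filter_candidates(candidates, keep_fraction, rnd=random.Random(7)):
          """Keep a uniform random subset of newly combined candidate modes."""
          k = max(1, int(keep_fraction * len(candidates)))
          # random.sample gives each candidate the same selection probability,
          # which is what keeps the final sample unbiased
          return rnd.sample(candidates, k)

      new_modes = [f"mode_{i}" for i in range(1000)]   # placeholder candidates
      print(len(filter_candidates(new_modes, keep_fraction=0.05)))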

  16. Visitor and community survey results for Prime Hook National Wildlife Refuge: Completion report

    USGS Publications Warehouse

    Sexton, Natalie R.; Stewart, Susan C.; Koontz, Lynne; Ponds, Phadrea; Walters, Katherine D.

    2007-01-01

Data for this study were collected using a survey administered to visitors to Prime Hook NWR and individuals living in the communities surrounding the Refuge. Surveys were randomly distributed to both consumptive and nonconsumptive use visitors over a one-year period (September 2004 to September 2005) to account for seasonal variation in Refuge use. Three hundred thirty-two visitor surveys were returned, for a response rate of 80 percent with a confidence interval of ±5.4. Surveys were also distributed to a stratified random sample of community members in adjacent and surrounding areas (Slaughter Beach, Broadkill Beach, Prime Hook Beach, Milton, Lewes, Milford, and surrounding communities). Four hundred ninety-one surveys from the overall community sample were returned, for a response rate of 39 percent with a ±4.4 confidence interval. Community member results were weighted by U.S. Census Bureau data to correct for age and gender bias, and for community proportionality.

  17. Generalized Entanglement Entropies of Quantum Designs.

    PubMed

    Liu, Zi-Wen; Lloyd, Seth; Zhu, Elton Yechao; Zhu, Huangjun

    2018-03-30

    The entanglement properties of random quantum states or dynamics are important to the study of a broad spectrum of disciplines of physics, ranging from quantum information to high energy and many-body physics. This Letter investigates the interplay between the degrees of entanglement and randomness in pure states and unitary channels. We reveal strong connections between designs (distributions of states or unitaries that match certain moments of the uniform Haar measure) and generalized entropies (entropic functions that depend on certain powers of the density operator), by showing that Rényi entanglement entropies averaged over designs of the same order are almost maximal. This strengthens the celebrated Page's theorem. Moreover, we find that designs of an order that is logarithmic in the dimension maximize all Rényi entanglement entropies and so are completely random in terms of the entanglement spectrum. Our results relate the behaviors of Rényi entanglement entropies to the complexity of scrambling and quantum chaos in terms of the degree of randomness, and suggest a generalization of the fast scrambling conjecture.
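
    A quick numerical illustration of the "almost maximal" claim for Haar-random states, which match designs of all orders (the dimensions are illustrative): the average Rényi-2 entanglement entropy sits within a constant of the maximum log2(dA).

      import numpy as np

      rng = np.random.default_rng(8)
      dA, dB, reps = 8, 8, 200
      s2 = []
      for _ in range(reps):
          # a Haar-random pure state via a normalized complex Gaussian vector
          psi = rng.normal(size=dA * dB) + 1j * rng.normal(size=dA * dB)
          psi /= np.linalg.norm(psi)
          M = psi.reshape(dA, dB)
          rhoA = M @ M.conj().T            # reduced density matrix on A
          purity = np.trace(rhoA @ rhoA).real
          s2.append(-np.log2(purity))      # Renyi-2 entanglement entropy
      print("mean S2:", np.mean(s2), " maximum:", np.log2(dA))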

  18. Generalized Entanglement Entropies of Quantum Designs

    NASA Astrophysics Data System (ADS)

    Liu, Zi-Wen; Lloyd, Seth; Zhu, Elton Yechao; Zhu, Huangjun

    2018-03-01

    The entanglement properties of random quantum states or dynamics are important to the study of a broad spectrum of disciplines of physics, ranging from quantum information to high energy and many-body physics. This Letter investigates the interplay between the degrees of entanglement and randomness in pure states and unitary channels. We reveal strong connections between designs (distributions of states or unitaries that match certain moments of the uniform Haar measure) and generalized entropies (entropic functions that depend on certain powers of the density operator), by showing that Rényi entanglement entropies averaged over designs of the same order are almost maximal. This strengthens the celebrated Page's theorem. Moreover, we find that designs of an order that is logarithmic in the dimension maximize all Rényi entanglement entropies and so are completely random in terms of the entanglement spectrum. Our results relate the behaviors of Rényi entanglement entropies to the complexity of scrambling and quantum chaos in terms of the degree of randomness, and suggest a generalization of the fast scrambling conjecture.

  19. Statistics of primordial density perturbations from discrete seed masses

    NASA Technical Reports Server (NTRS)

    Scherrer, Robert J.; Bertschinger, Edmund

    1991-01-01

    The statistics of density perturbations for general distributions of seed masses with arbitrary matter accretion is examined. Formal expressions for the power spectrum, the N-point correlation functions, and the density distribution function are derived. These results are applied to the case of uncorrelated seed masses, and power spectra are derived for accretion of both hot and cold dark matter plus baryons. The reduced moments (cumulants) of the density distribution are computed and used to obtain a series expansion for the density distribution function. Analytic results are obtained for the density distribution function in the case of a distribution of seed masses with a spherical top-hat accretion pattern. More generally, the formalism makes it possible to give a complete characterization of the statistical properties of any random field generated from a discrete linear superposition of kernels. In particular, the results can be applied to density fields derived by smoothing a discrete set of points with a window function.

  20. Probabilistic Analysis of Algorithms for NP-Complete Problems

    DTIC Science & Technology

    1989-09-29

AD-A217 880. Approved for public release; distribution unlimited. ... efficiently solves P in bounded probability under D. (b) A finds a solution to an instance of P chosen randomly according to D in time bounded by a ...

  1. Randomly displaced phase distribution design and its advantage in page-data recording of Fourier transform holograms.

    PubMed

    Emoto, Akira; Fukuda, Takashi

    2013-02-20

    For Fourier transform holography, an effective random phase distribution with randomly displaced phase segments is proposed for obtaining a smooth finite optical intensity distribution in the Fourier transform plane. Since unitary phase segments are randomly distributed in-plane, the blanks give various spatial frequency components to an image, and thus smooth the spectrum. Moreover, by randomly changing the phase segment size, spike generation from the unitary phase segment size in the spectrum can be reduced significantly. As a result, a smooth spectrum including sidebands can be formed at a relatively narrow extent. The proposed phase distribution sustains the primary functions of a random phase mask for holographic-data recording and reconstruction. Therefore, this distribution is expected to find applications in high-density holographic memory systems, replacing conventional random phase mask patterns.
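
    Why a random phase mask smooths the Fourier spectrum can be seen in a short numpy sketch: the mask spreads the page's energy across spatial frequencies and suppresses the sharp zero-order peak. The binary page is synthetic, and the segment-size randomization that is this paper's contribution is not modelled.

      import numpy as np

      rng = np.random.default_rng(9)
      page = rng.integers(0, 2, size=(128, 128)).astype(float)   # binary data page
      mask = np.exp(1j * rng.uniform(0, 2 * np.pi, page.shape))  # random phases

      spec_plain = np.abs(np.fft.fftshift(np.fft.fft2(page))) ** 2
      spec_masked = np.abs(np.fft.fftshift(np.fft.fft2(page * mask))) ** 2
      # the peak-to-mean ratio collapses once the mask randomizes the phases
      print("peak/mean without mask:", spec_plain.max() / spec_plain.mean())
      print("peak/mean with mask:   ", spec_masked.max() / spec_masked.mean())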

  2. The two-point correlation function for groups of galaxies in the Center for Astrophysics redshift survey

    NASA Technical Reports Server (NTRS)

    Ramella, Massimo; Geller, Margaret J.; Huchra, John P.

    1990-01-01

    The large-scale distribution of groups of galaxies selected from complete slices of the CfA redshift survey extension is examined. The survey is used to reexamine the contribution of group members to the galaxy correlation function. The relationship between the correlation function for groups and those calculated for rich clusters is discussed, and the results for groups are examined as an extension of the relation between correlation function amplitude and richness. The group correlation function indicates that groups and individual galaxies are equivalent tracers of the large-scale matter distribution. The distribution of group centers is equivalent to random sampling of the galaxy distribution. The amplitude of the correlation function for groups is consistent with an extrapolation of the amplitude-richness relation for clusters. The amplitude scaled by the mean intersystem separation is also consistent with results for richer clusters.

  3. Valid statistical inference methods for a case-control study with missing data.

    PubMed

    Tian, Guo-Liang; Zhang, Chi; Jiang, Xuejun

    2018-04-01

    The main objective of this paper is to derive the valid sampling distribution of the observed counts in a case-control study with missing data under the assumption of missing at random by employing the conditional sampling method and the mechanism augmentation method. The proposed sampling distribution, called the case-control sampling distribution, can be used to calculate the standard errors of the maximum likelihood estimates of parameters via the Fisher information matrix and to generate independent samples for constructing small-sample bootstrap confidence intervals. Theoretical comparisons of the new case-control sampling distribution with two existing sampling distributions exhibit a large difference. Simulations are conducted to investigate the influence of the three different sampling distributions on statistical inferences. One finding is that the conclusion by the Wald test for testing independency under the two existing sampling distributions could be completely different (even contradictory) from the Wald test for testing the equality of the success probabilities in control/case groups under the proposed distribution. A real cervical cancer data set is used to illustrate the proposed statistical methods.

  4. Methods and analysis of realizing randomized grouping.

    PubMed

    Hu, Liang-Ping; Bao, Xiao-Lei; Wang, Qi

    2011-07-01

    Randomization is one of the four basic principles of research design. The meaning of randomization includes two aspects: one is to randomly select samples from the population, which is known as random sampling; the other is to randomly group all the samples, which is called randomized grouping. Randomized grouping can be subdivided into three categories: completely, stratified and dynamically randomized grouping. This article mainly introduces the steps of complete randomization, the definition of dynamic randomization and the realization of random sampling and grouping by SAS software.
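
    A hedged Python analogue of the procedures the article implements in SAS (the function names are ours): complete randomization shuffles all subjects into groups at once, and stratified randomization repeats this within each stratum.

      import random

      def complete_randomization(subjects, n_groups, seed=10):
          """Assign all subjects to n_groups purely at random."""
          rnd = random.Random(seed)
          shuffled = subjects[:]
          rnd.shuffle(shuffled)
          return {g: shuffled[g::n_groups] for g in range(n_groups)}

      def stratified_randomization(strata, n_groups, seed=11):
          """strata: dict mapping stratum label -> list of subjects."""
          out = {g: [] for g in range(n_groups)}
          for i, members in enumerate(strata.values()):
              groups = complete_randomization(members, n_groups, seed + i)
              for g, ids in groups.items():
                  out[g].extend(ids)
          return out

      print(complete_randomization(list(range(12)), n_groups=3))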

  5. Effect of random phase mask on input plane in photorefractive authentic memory with two-wave encryption method

    NASA Astrophysics Data System (ADS)

    Mita, Akifumi; Okamoto, Atsushi; Funakoshi, Hisatoshi

    2004-06-01

    We have proposed an all-optical authentic memory based on the two-wave encryption method. In the recording process, the image data are encrypted into white noise by random phase masks applied to the input beam carrying the image data and to the reference beam. Only a reading beam with the phase-conjugate distribution of the reference beam can decrypt the encrypted data. If the encrypted data are read out with an incorrect phase distribution, the output data are transformed into white noise. Moreover, during readout, reconstructions of the encrypted data interfere destructively, resulting in zero intensity. Our memory therefore has the merit that unlawful access can be detected easily by measuring the output beam intensity. In our encryption method, the random phase mask on the input plane plays important roles in transforming the input image into white noise and in preventing the white noise from being decrypted back to the input image by blind deconvolution. Without this mask, if unauthorized users observe the output beam with a CCD during readout with a plane wave, they obtain exactly the same intensity distribution as the Fourier transform of the input image, and the encrypted image could then be decrypted easily by blind deconvolution. With the mask, even if unauthorized users observe the output beam in the same way, the encrypted image cannot be decrypted, because the observed intensity distribution is dispersed randomly by the mask. The mask thus increases the robustness of the system. In this report, we compare the correlation coefficients between the output image and the input image, which quantify how close the output is to white noise, with and without the mask. We show that the robustness of the encryption method is increased, as the correlation coefficient improves from 0.3 to 0.1 when the mask is used.
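
    A minimal sketch of the correlation measure used in the comparison: the Pearson correlation coefficient between the input image and the output image, where a value near zero indicates an output close to white noise. The images below are synthetic stand-ins, not data from the experiment.

```python
# Pearson correlation between two images as a white-noise measure.
import numpy as np

def image_correlation(a, b):
    """Correlation coefficient between two equally sized images."""
    a, b = np.asarray(a, float).ravel(), np.asarray(b, float).ravel()
    return np.corrcoef(a, b)[0, 1]

rng = np.random.default_rng(1)
input_img = rng.random((64, 64))
noise_out = rng.random((64, 64))               # ideal encrypted output
leaky_out = 0.3 * input_img + 0.7 * noise_out  # output leaking some structure

print(f"white-noise output: {image_correlation(input_img, noise_out):.2f}")
print(f"leaky output:       {image_correlation(input_img, leaky_out):.2f}")
```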

  6. Predicting structures in the Zone of Avoidance

    NASA Astrophysics Data System (ADS)

    Sorce, Jenny G.; Colless, Matthew; Kraan-Korteweg, Renée C.; Gottlöber, Stefan

    2017-11-01

    The Zone of Avoidance (ZOA), whose apparent emptiness is an artefact of our Galaxy's dust, has challenged observers as well as theorists for many years. Multiple attempts have been made on the observational side to map this region in order to better understand the local flows. On the theoretical side, however, this region is often simply populated statistically with structures, and no real attempt has been made to confront theoretical and observed matter distributions. This paper takes a step forward using constrained realizations (CRs) of the local Universe, shown to be perfect substitutes for local Universe-like simulations in smoothed high-density peak studies. Far from generating completely `random' structures in the ZOA, the reconstruction technique arranges matter according to the surrounding environment of this region. More precisely, the mean distributions of structures in a series of constrained and random realizations (RRs) differ: while densities annihilate each other when averaging over 200 RRs, structures persist when summing 200 CRs. The probability distribution function for ZOA grid cells to be highly overdense is a Gaussian with a 15 per cent mean in the random case, while that of the constrained case exhibits large tails. This implies that the areas with the largest probabilities most likely host a structure. Comparisons between these predictions and observations, such as those of the Puppis 3 cluster, show a remarkable agreement and allow us to assert the presence of the Vela supercluster, recently highlighted by observations, at about 180 h⁻¹ Mpc, right behind the thickest dust layers of our Galaxy.

  7. The fracture load and failure types of veneered anterior zirconia crowns: an analysis of normal and Weibull distribution of complete and censored data.

    PubMed

    Stawarczyk, Bogna; Ozcan, Mutlu; Hämmerle, Christoph H F; Roos, Malgorzata

    2012-05-01

    The aim of this study was to compare the fracture load of veneered anterior zirconia crowns using normal and Weibull distributions of complete and censored data. Standardized zirconia frameworks for maxillary canines were milled using a CAD/CAM system and randomly divided into 3 groups (N=90, n=30 per group). They were veneered with three veneering ceramics, namely GC Initial ZR, Vita VM9 and IPS e.max Ceram, using the layering technique. The crowns were cemented with glass ionomer cement on metal abutments. The specimens were then loaded to fracture (1 mm/min) in a universal testing machine. The data were analyzed using the classical method (normal data distribution (μ, σ); Levene test and one-way ANOVA) and according to Weibull statistics (s, m). In addition, fracture load results were analyzed depending on complete and censored failure types (only chipping vs. total fracture together with chipping). When computed with complete data, significantly higher mean fracture loads (N) were observed for GC Initial ZR (μ=978, σ=157; s=1043, m=7.2) and VITA VM9 (μ=1074, σ=179; s=1139, m=7.8) than for IPS e.max Ceram (μ=798, σ=174; s=859, m=5.8) (p<0.05) by classical and Weibull statistics, respectively. When the data were censored for total fracture only, IPS e.max Ceram presented the lowest fracture load for chipping with both the classical distribution (μ=790, σ=160) and Weibull statistics (s=836, m=6.5). When total fracture together with chipping (classical distribution) was considered as failure, IPS e.max Ceram did not show a significantly different fracture load for total fracture (μ=1054, σ=110) compared to the other groups (GC Initial ZR: μ=1039, σ=152; VITA VM9: μ=1170, σ=166). According to the Weibull-distributed data, VITA VM9 showed a significantly higher fracture load (s=1228, m=9.4) than the other groups. Both the classical distribution and Weibull statistics for complete data yielded similar outcomes. Censored data analysis of all-ceramic systems based on failure types is essential and brings additional information regarding the susceptibility to chipping or total fracture. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
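
    For readers unfamiliar with the two kinds of summary, the sketch below computes both the classical normal parameters (μ, σ) and a two-parameter Weibull fit (scale s, modulus m) on simulated fracture loads; the sample is generated, not the study's data.

```python
# Classical vs. Weibull summaries of simulated fracture loads.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(2)
loads = weibull_min.rvs(7.2, loc=0, scale=1043, size=30, random_state=rng)

mu, sigma = loads.mean(), loads.std(ddof=1)  # classical summaries
m, _, s = weibull_min.fit(loads, floc=0)     # Weibull modulus and scale
print(f"normal:  mu = {mu:.0f} N, sigma = {sigma:.0f} N")
print(f"weibull: s = {s:.0f} N, m = {m:.1f}")
```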

  8. Complete convergence of randomly weighted END sequences and its application.

    PubMed

    Li, Penghua; Li, Xiaoqin; Wu, Kehan

    2017-01-01

    We investigate the complete convergence of partial sums of randomly weighted extended negatively dependent (END) random variables. Some results of complete moment convergence, complete convergence and the strong law of large numbers for this dependent structure are obtained. As an application, we study the convergence of the state observers of linear-time-invariant systems. Our results extend the corresponding earlier ones.

  9. Fossil preservation and the stratigraphic ranges of taxa

    NASA Technical Reports Server (NTRS)

    Foote, M.; Raup, D. M.

    1996-01-01

    The incompleteness of the fossil record hinders the inference of evolutionary rates and patterns. Here, we derive relationships among true taxonomic durations, preservation probability, and observed taxonomic ranges. We use these relationships to estimate original distributions of taxonomic durations, preservation probability, and completeness (proportion of taxa preserved), given only the observed ranges. No data on occurrences within the ranges of taxa are required. When preservation is random and the original distribution of durations is exponential, the inference of durations, preservability, and completeness is exact. However, reasonable approximations are possible given non-exponential duration distributions and temporal and taxonomic variation in preservability. Thus, the approaches we describe have great potential in studies of taphonomy, evolutionary rates and patterns, and genealogy. Analyses of Upper Cambrian-Lower Ordovician trilobite species, Paleozoic crinoid genera, Jurassic bivalve species, and Cenozoic mammal species yield the following results: (1) The preservation probability inferred from stratigraphic ranges alone agrees with that inferred from the analysis of stratigraphic gaps when data on the latter are available. (2) Whereas median durations based on simple tabulations of observed ranges are biased by stratigraphic resolution, our estimates of median duration, extinction rate, and completeness are not biased. (3) The shorter geologic ranges of mammalian species relative to those of bivalves cannot be attributed to a difference in preservation potential. However, we cannot rule out the contribution of taxonomic practice to this difference. (4) In the groups studied, completeness (proportion of species [trilobites, bivalves, mammals] or genera [crinoids] preserved) ranges from 60% to 90%. The higher estimates of completeness at smaller geographic scales support previous suggestions that the incompleteness of the fossil record reflects loss of fossiliferous rock more than failure of species to enter the fossil record in the first place.

  10. Modeling missing data in knowledge space theory.

    PubMed

    de Chiusole, Debora; Stefanutti, Luca; Anselmi, Pasquale; Robusto, Egidio

    2015-12-01

    Missing data are a well-known issue in statistical inference, because some responses may be missing, even when data are collected carefully. The problem that arises in these cases is how to deal with missing data. In this article, the missingness is analyzed in knowledge space theory, and in particular when the basic local independence model (BLIM) is applied to the data. Two extensions of the BLIM to missing data are proposed: The former, called ignorable missing BLIM (IMBLIM), assumes that missing data are missing completely at random; the latter, called missing BLIM (MissBLIM), introduces specific dependencies of the missing data on the knowledge states, thus assuming that the missing data are missing not at random. The IMBLIM and the MissBLIM modeled the missingness in a satisfactory way, in both a simulation study and an empirical application, depending on the process that generates the missingness: If the missing data-generating process is of type missing completely at random, then either IMBLIM or MissBLIM provide adequate fit to the data. However, if the pattern of missingness is functionally dependent upon unobservable features of the data (e.g., missing answers are more likely to be wrong), then only a correctly specified model of the missingness distribution provides an adequate fit to the data. (c) 2015 APA, all rights reserved.

  11. Quantum cryptography with entangled photons

    PubMed

    Jennewein; Simon; Weihs; Weinfurter; Zeilinger

    2000-05-15

    By realizing a quantum cryptography system based on polarization entangled photon pairs we establish highly secure keys, because a single photon source is approximated and the inherent randomness of quantum measurements is exploited. We implement a novel key distribution scheme using Wigner's inequality to test the security of the quantum channel, and, alternatively, realize a variant of the BB84 protocol. Our system has two completely independent users separated by 360 m, and generates raw keys at rates of 400-800 bits/s with bit error rates around 3%.
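
    For background on the key-distribution step, here is a minimal classical simulation of BB84 basis sifting; this is a simplified sketch, not the entangled-photon implementation reported above.

```python
# Minimal BB84 basis-sifting sketch (classical simulation).
import secrets

n = 32
alice_bits = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0: rectilinear, 1: diagonal
bob_bases = [secrets.randbelow(2) for _ in range(n)]

# Bob's measurement agrees with Alice's bit only when the bases match;
# otherwise his outcome is random and the position is discarded.
bob_bits = [b if ab == bb else secrets.randbelow(2)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

sifted_key = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)
              if ab == bb]
print(f"kept {len(sifted_key)} of {n} bits:", sifted_key)
```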

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahault, Benoit Alexandre; Saxena, Avadh Behari; Nisoli, Cristiano

    We introduce a minimal agent-based model to qualitatively conceptualize the allocation of limited wealth among more abundant opportunities. We study the interplay of power, satisfaction and frustration in the problem of wealth distribution, concentration, and inequality. This framework allows us to compare subjective measures of frustration and satisfaction to collective measures of fairness in wealth distribution, such as the Lorenz curve and the Gini index. We find that a completely libertarian, law-of-the-jungle setting, where every agent can acquire wealth from, or lose wealth to, anybody else, invariably leads to a complete polarization of the distribution of wealth vs. opportunity, only minimally ameliorated by disorder in a non-optimized society. The picture is, however, dramatically modified when hard constraints are imposed on agents and they are forced to share wealth with neighbors on a network. We discuss the cases of random networks and scale-free networks. We then propose an out-of-equilibrium dynamics of the networks, based on a competition between power and frustration in the decision-making of agents, that leads to network evolution. We show that the ratio of power to frustration controls different dynamical regimes separated by kinetic transitions and characterized by drastically different values of the indices of equality.
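
    A minimal sketch of an unconstrained, law-of-the-jungle exchange dynamic in the spirit of the description above, with the Gini index as the collective measure of inequality; the update rule and parameters are illustrative assumptions, not the authors' exact model.

```python
# Pairwise wealth exchange with fair coin flips drifts towards polarization.
import numpy as np

rng = np.random.default_rng(3)
wealth = np.ones(500)                        # equal initial wealth

for _ in range(200_000):
    i, j = rng.integers(0, wealth.size, 2)
    if i == j:
        continue
    stake = 0.1 * min(wealth[i], wealth[j])  # loser can always afford the stake
    if rng.random() < 0.5:
        wealth[i] += stake; wealth[j] -= stake
    else:
        wealth[i] -= stake; wealth[j] += stake

# Gini index as a collective measure of inequality (0 equal, 1 polarized).
w = np.sort(wealth)
n = w.size
gini = (2 * np.arange(1, n + 1) - n - 1) @ w / (n * w.sum())
print(f"Gini index after exchange: {gini:.2f}")
```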

  13. Characterizing 3D grain size distributions from 2D sections in mylonites using a modified version of the Saltykov method

    NASA Astrophysics Data System (ADS)

    Lopez-Sanchez, Marco; Llana-Fúnez, Sergio

    2016-04-01

    The understanding of creep behaviour in rocks requires knowledge of the 3D grain size distributions (GSD) that result from dynamic recrystallization processes during deformation. The methods for estimating the 3D grain size distribution directly -serial sectioning, synchrotron or X-ray-based tomography- are expensive, time-consuming and, in most cases and at best, challenging. This means that in practice grain size distributions are mostly derived from 2D sections. Although there are a number of methods in the literature to derive the actual 3D grain size distribution from 2D sections, the most popular in highly deformed rocks is the so-called Saltykov method. It has, however, two major drawbacks: the method assumes no interaction between grains, which is not true in the case of recrystallized mylonites, and it uses histograms to describe distributions, which limits the quantification of the GSD. The first aim of this contribution is to test whether the interaction between grains in mylonites, i.e. random grain packing, significantly affects the GSDs estimated by the Saltykov method. We test this using the random resampling technique on a large data set (n = 12298). The full data set is built from several parallel thin sections that cut a completely dynamically recrystallized quartz aggregate in a rock sample from a Variscan shear zone in NW Spain. The results prove that the Saltykov method is reliable as long as the number of grains is large (n > 1000). Assuming that a lognormal distribution is an optimal approximation for the GSD in a completely dynamically recrystallized rock, we introduce an additional step to the Saltykov method, which allows estimating a continuous probability distribution function of the 3D grain size population. The additional step takes the midpoints of the classes obtained by the Saltykov method and fits a lognormal distribution with a trust region using a non-linear least squares algorithm. The new protocol is named the two-step method. The conclusion of this work is that both the Saltykov and the two-step methods are accurate and simple enough to be useful in practice in rocks, alloys or ceramics with near-equant grains and expected lognormal distributions. The Saltykov method is particularly suitable for estimating the volumes of particular grain fractions, while the two-step method is suited to quantifying the full GSD (mean and standard deviation in log grain size). The two-step method is implemented in a free, open-source and easy-to-handle script (see http://marcoalopez.github.io/GrainSizeTools/).
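
    A minimal sketch of the second step of the two-step method under stated assumptions: a lognormal probability density fitted to Saltykov class midpoints by bounded non-linear least squares (SciPy's bounded curve_fit uses a trust-region algorithm). The midpoint and frequency values below are invented for illustration; the authors' own implementation is the GrainSizeTools script linked above.

```python
# Fit a lognormal PDF to class midpoints from a Saltykov-style unfolding.
import numpy as np
from scipy.optimize import curve_fit

def lognorm_pdf(x, mu, sigma):
    return np.exp(-(np.log(x) - mu) ** 2 / (2 * sigma ** 2)) / (
        x * sigma * np.sqrt(2 * np.pi))

midpoints = np.array([5., 15., 25., 35., 45., 55., 65., 75.])  # grain size (um)
freqs = np.array([0.010, 0.031, 0.028, 0.016, 0.008, 0.004, 0.002, 0.001])

# Bounded non-linear least squares (trust-region reflective in SciPy).
(mu, sigma), _ = curve_fit(lognorm_pdf, midpoints, freqs,
                           p0=(3.0, 0.5), bounds=([0, 0.01], [10, 5]))
print(f"log grain size: mean = {mu:.2f}, std = {sigma:.2f}")
```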

  14. An Alternative Method for Computing Mean and Covariance Matrix of Some Multivariate Distributions

    ERIC Educational Resources Information Center

    Radhakrishnan, R.; Choudhury, Askar

    2009-01-01

    Computing the mean and covariance matrix of some multivariate distributions, in particular, multivariate normal distribution and Wishart distribution are considered in this article. It involves a matrix transformation of the normal random vector into a random vector whose components are independent normal random variables, and then integrating…

  15. The reliability of a VISION COACH task as a measure of psychomotor skills.

    PubMed

    Xi, Yubin; Rosopa, Patrick J; Mossey, Mary; Crisler, Matthew C; Drouin, Nathalie; Kopera, Kevin; Brooks, Johnell O

    2014-10-01

    The VISION COACH™ interactive light board is designed to test and enhance participants' psychomotor skills. The primary goal of this study was to examine the test-retest reliability of the Full Field 120 VISION COACH task. One hundred eleven male and 131 female adult participants completed six trials in which they responded to 120 randomly distributed lights displayed on the VISION COACH interactive light board. The mean time required for a participant to complete a trial was 101 seconds. Intraclass correlation coefficients, ranging from 0.962 to 0.987, suggest the VISION COACH Full Field 120 task is a reliable task. Cohen's d values for adjacent pairs of trials suggest learning effects did not negatively affect reliability after the third trial.

  16. Bayesian bivariate meta-analysis of correlated effects: Impact of the prior distributions on the between-study correlation, borrowing of strength, and joint inferences

    PubMed Central

    Bujkiewicz, Sylwia; Riley, Richard D

    2016-01-01

    Multivariate random-effects meta-analysis allows the joint synthesis of correlated results from multiple studies, for example, for multiple outcomes or multiple treatment groups. In a Bayesian univariate meta-analysis of one endpoint, the importance of specifying a sensible prior distribution for the between-study variance is well understood. However, in multivariate meta-analysis, there is little guidance about the choice of prior distributions for the variances or, crucially, the between-study correlation, ρB; for the latter, researchers often use a Uniform(−1,1) distribution assuming it is vague. In this paper, an extensive simulation study and a real illustrative example are used to examine the impact of various (realistically) vague prior distributions for ρB and the between-study variances within a Bayesian bivariate random-effects meta-analysis of two correlated treatment effects. A range of diverse scenarios are considered, including complete and missing data, to examine the impact of the prior distributions on posterior results (for treatment effect and between-study correlation), amount of borrowing of strength, and joint predictive distributions of treatment effectiveness in new studies. Two key recommendations are identified to improve the robustness of multivariate meta-analysis results. First, the routine use of a Uniform(−1,1) prior distribution for ρB should be avoided, if possible, as it is not necessarily vague. Instead, researchers should identify a sensible prior distribution, for example, by restricting values to be positive or negative as indicated by prior knowledge. Second, it remains critical to use sensible (e.g. empirically based) prior distributions for the between-study variances, as an inappropriate choice can adversely impact the posterior distribution for ρB, which may then adversely affect inferences such as joint predictive probabilities. These recommendations are especially important with a small number of studies and missing data. PMID:26988929

  17. Probabilistic inference in discrete spaces can be implemented into networks of LIF neurons.

    PubMed

    Probst, Dimitri; Petrovici, Mihai A; Bytschok, Ilja; Bill, Johannes; Pecevski, Dejan; Schemmel, Johannes; Meier, Karlheinz

    2015-01-01

    The means by which cortical neural networks are able to efficiently solve inference problems remains an open question in computational neuroscience. Recently, abstract models of Bayesian computation in neural circuits have been proposed, but they lack a mechanistic interpretation at the single-cell level. In this article, we describe a complete theoretical framework for building networks of leaky integrate-and-fire neurons that can sample from arbitrary probability distributions over binary random variables. We test our framework for a model inference task based on a psychophysical phenomenon (the Knill-Kersten optical illusion) and further assess its performance when applied to randomly generated distributions. As the local computations performed by the network strongly depend on the interaction between neurons, we compare several types of couplings mediated by either single synapses or interneuron chains. Due to its robustness to substrate imperfections such as parameter noise and background noise correlations, our model is particularly interesting for implementation on novel, neuro-inspired computing architectures, which can thereby serve as a fast, low-power substrate for solving real-world inference problems.
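
    The abstract computation that such networks implement can be illustrated without any neuron model: Gibbs sampling from a Boltzmann distribution over binary random variables. The sketch below is that reference dynamics with arbitrary couplings and biases; it is not the LIF implementation itself.

```python
# Gibbs sampling from a small Boltzmann distribution over binary variables.
import numpy as np

rng = np.random.default_rng(4)
W = np.array([[0.0, 1.2, -0.8],
              [1.2, 0.0, 0.5],
              [-0.8, 0.5, 0.0]])   # symmetric couplings, zero diagonal
b = np.array([-0.2, 0.1, 0.3])     # biases
z = rng.integers(0, 2, size=3)     # initial binary state

samples = []
for _ in range(20_000):
    k = rng.integers(0, 3)         # pick one unit at random
    u = W[k] @ z + b[k]            # local field from the other units
    z[k] = rng.random() < 1 / (1 + np.exp(-u))  # logistic update
    samples.append(z.copy())

print("P(z_k = 1) estimates:", np.mean(samples[5000:], axis=0))
```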

  18. Generalized Nonlinear Yule Models

    NASA Astrophysics Data System (ADS)

    Lansky, Petr; Polito, Federico; Sacerdote, Laura

    2016-11-01

    With the aim of considering models related to random graphs growth exhibiting persistent memory, we propose a fractional nonlinear modification of the classical Yule model often studied in the context of macroevolution. Here the model is analyzed and interpreted in the framework of the development of networks such as the World Wide Web. Nonlinearity is introduced by replacing the linear birth process governing the growth of the in-links of each specific webpage with a fractional nonlinear birth process with completely general birth rates. Among the main results we derive the explicit distribution of the number of in-links of a webpage chosen uniformly at random recognizing the contribution to the asymptotics and the finite time correction. The mean value of the latter distribution is also calculated explicitly in the most general case. Furthermore, in order to show the usefulness of our results, we particularize them in the case of specific birth rates giving rise to a saturating behaviour, a property that is often observed in nature. The further specialization to the non-fractional case allows us to extend the Yule model accounting for a nonlinear growth.

  19. Stimulus novelty, task relevance and the visual evoked potential in man

    NASA Technical Reports Server (NTRS)

    Courchesne, E.; Hillyard, S. A.; Galambos, R.

    1975-01-01

    The effect of task relevance on P3 (waveform of human evoked potential) waves and the methodologies used to deal with them are outlined. Visual evoked potentials (VEPs) were recorded from normal adult subjects performing in a visual discrimination task. Subjects counted the number of presentations of the numeral 4 which was interposed rarely and randomly within a sequence of tachistoscopically flashed background stimuli. Intrusive, task-irrelevant (not counted) stimuli were also interspersed rarely and randomly in the sequence of 2s; these stimuli were of two types: simples, which were easily recognizable, and novels, which were completely unrecognizable. It was found that the simples and the counted 4s evoked posteriorly distributed P3 waves while the irrelevant novels evoked large, frontally distributed P3 waves. These large, frontal P3 waves to novels were also found to be preceded by large N2 waves. These findings indicate that the P3 wave is not a unitary phenomenon but should be considered in terms of a family of waves, differing in their brain generators and in their psychological correlates.

  1. Messaging to Increase Public Support for Naloxone Distribution Policies in the United States: Results from a Randomized Survey Experiment.

    PubMed

    Bachhuber, Marcus A; McGinty, Emma E; Kennedy-Hendricks, Alene; Niederdeppe, Jeff; Barry, Colleen L

    2015-01-01

    Barriers to public support for naloxone distribution include lack of knowledge, concerns about potential unintended consequences, and lack of sympathy for people at risk of overdose. A randomized survey experiment was conducted with a nationally-representative web-based survey research panel (GfK KnowledgePanel). Participants were randomly assigned to read different messages alone or in combination: 1) factual information about naloxone; 2) pre-emptive refutation of potential concerns about naloxone distribution; and 3) a sympathetic narrative about a mother whose daughter died of an opioid overdose. Participants were then asked if they support or oppose policies related to naloxone distribution. For each policy item, logistic regression models were used to test the effect of each message exposure compared with the no-exposure control group. The final sample consisted of 1,598 participants (completion rate: 72.6%). Factual information and the sympathetic narrative alone each led to higher support for training first responders to use naloxone, providing naloxone to friends and family members of people using opioids, and passing laws to protect people who administer naloxone. Participants receiving the combination of the sympathetic narrative and factual information, compared to factual information alone, were more likely to support all policies: providing naloxone to friends and family members (OR: 2.0 [95% CI: 1.4 to 2.9]), training first responders to use naloxone (OR: 2.0 [95% CI: 1.2 to 3.4]), passing laws to protect people if they administer naloxone (OR: 1.5 [95% CI: 1.04 to 2.2]), and passing laws to protect people if they call for medical help for an overdose (OR: 1.7 [95% CI: 1.2 to 2.5]). All messages increased public support, but combining factual information and the sympathetic narrative was most effective. Public support for naloxone distribution can be improved through education and sympathetic portrayals of the population who stands to benefit from these policies.

  2. Epidemiological characteristics of cases of death from tuberculosis and vulnerable territories

    PubMed Central

    Yamamura, Mellina; Santos-Neto, Marcelino; dos Santos, Rebeca Augusto Neman; Garcia, Maria Concebida da Cunha; Nogueira, Jordana de Almeida; Arcêncio, Ricardo Alexandre

    2015-01-01

    Objective: to characterize the differences in the clinical and epidemiological profile of cases of death that had tuberculosis as an immediate or associated cause, and to analyze the spatial distribution of the cases of death from tuberculosis within the territories of Ribeirão Preto, Brazil. Method: an ecological study, in which the population consisted of 114 cases of death from tuberculosis. Bivariate analysis was carried out, together with point density analysis based on the kernel density estimate. Results: of the cases of death from tuberculosis, 50 had it as the immediate cause and 64 as an associated cause. Age (p=.008) and the sector responsible for the death certificate (p=.003) were the variables that presented statistically significant associations with the cause of death. The spatial distribution of both events was non-random, forming clusters in areas of the municipality. Conclusion: the difference in the profiles of the cases of death from tuberculosis, as a basic cause and as an associated cause, was governed by age and by the sector responsible for the completion of the death certificate. The non-randomness of the spatial distribution of the cases suggests areas that are vulnerable to these events. Knowing these areas can contribute to the choice of disease control strategies. PMID:26487142
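
    A minimal sketch of kernel-based point density analysis on synthetic event coordinates (not the study's data), locating the densest area of a point pattern.

```python
# 2D kernel density estimate of event locations.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)
# Synthetic case locations: a diffuse background plus a denser cluster.
xy = np.vstack([rng.normal(0, 5, (80, 2)), rng.normal((6, 6), 1, (34, 2))]).T

kde = gaussian_kde(xy)                       # Gaussian kernel estimate
grid = np.mgrid[-10:12:0.5, -10:12:0.5].reshape(2, -1)
density = kde(grid)
peak = grid[:, density.argmax()]
print(f"highest-density location ~ ({peak[0]:.1f}, {peak[1]:.1f})")
```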

  3. A random effects meta-analysis model with Box-Cox transformation.

    PubMed

    Yamaguchi, Yusuke; Maruo, Kazushi; Partlett, Christopher; Riley, Richard D

    2017-07-19

    In a random effects meta-analysis model, true treatment effects for each study are routinely assumed to follow a normal distribution. However, normality is a restrictive assumption and the misspecification of the random effects distribution may result in a misleading estimate of the overall mean for the treatment effect, an inappropriate quantification of heterogeneity across studies and a wrongly symmetric prediction interval. We focus on problems caused by an inappropriate normality assumption for the random effects distribution, and propose a novel random effects meta-analysis model in which a Box-Cox transformation is applied to the observed treatment effect estimates. The proposed model aims to normalise the overall distribution of observed treatment effect estimates, which is the sum of the within-study sampling distributions and the random effects distribution. When sampling distributions are approximately normal, non-normality in the overall distribution will be mainly due to the random effects distribution, especially when the between-study variation is large relative to the within-study variation. The Box-Cox transformation addresses this flexibly according to the observed departure from normality. We use a Bayesian approach for estimating parameters in the proposed model, and suggest summarising the meta-analysis results by an overall median, an interquartile range and a prediction interval. The model can be applied to any kind of variable once the treatment effect estimate is defined from it. A simulation study suggested that when the overall distribution of treatment effect estimates is skewed, the overall mean and conventional I² from the normal random effects model could be inappropriate summaries, and the proposed model helped reduce this issue. We illustrated the proposed model using two examples, which revealed some important differences in summary results, heterogeneity measures and prediction intervals from the normal random effects model. The random effects meta-analysis with the Box-Cox transformation may be an important tool for examining the robustness of traditional meta-analysis results against skewness in the observed treatment effect estimates. Further critical evaluation of the method is needed.
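
    A minimal sketch of the transformation at the heart of the proposal, using SciPy's Box-Cox with maximum-likelihood λ on synthetic skewed effect estimates; the paper's full model is Bayesian and also accounts for within-study sampling variances.

```python
# Box-Cox transformation of skewed "treatment effect estimates".
import numpy as np
from scipy.stats import boxcox, skew

rng = np.random.default_rng(6)
effects = rng.lognormal(mean=0.5, sigma=0.6, size=25)  # skewed positive data

transformed, lam = boxcox(effects)                     # MLE for lambda
print(f"lambda = {lam:.2f}")
print(f"skewness before = {skew(effects):.2f}, after = {skew(transformed):.2f}")
```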

  4. Simulation of flight maneuver-load distributions by utilizing stationary, non-Gaussian random load histories

    NASA Technical Reports Server (NTRS)

    Leybold, H. A.

    1971-01-01

    Random numbers were generated with the aid of a digital computer and transformed such that the probability density function of a discrete random load history composed of these random numbers had one of the following non-Gaussian distributions: Poisson, binomial, log-normal, Weibull, and exponential. The resulting random load histories were analyzed to determine their peak statistics and were compared with cumulative peak maneuver-load distributions for fighter and transport aircraft in flight.
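
    A minimal sketch of the generation step described above, under illustrative parameters: uniform draws pushed through inverse cumulative distribution functions to produce discrete load histories with non-Gaussian distributions, followed by a simple peak count.

```python
# Inverse-CDF transformation of uniform draws into non-Gaussian histories.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
u = rng.random(10_000)                       # uniform(0, 1) draws

histories = {
    "exponential": stats.expon.ppf(u, scale=1.0),
    "weibull":     stats.weibull_min.ppf(u, 1.5, scale=1.0),
    "lognormal":   stats.lognorm.ppf(u, 0.5, scale=1.0),
}

for name, h in histories.items():
    # Count peaks: samples larger than both neighbours in the sequence.
    peaks = h[1:-1][(h[1:-1] > h[:-2]) & (h[1:-1] > h[2:])]
    print(f"{name:11s} mean = {h.mean():.2f}, peaks = {peaks.size}")
```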

  5. Flame Spread and Group-Combustion Excitation in Randomly Distributed Droplet Clouds with Low-Volatility Fuel near the Excitation Limit: a Percolation Approach Based on Flame-Spread Characteristics in Microgravity

    NASA Astrophysics Data System (ADS)

    Mikami, Masato; Saputro, Herman; Seo, Takehiko; Oyagi, Hiroshi

    2018-03-01

    Stable operation of liquid-fueled combustors requires the group combustion of fuel spray. Our study employs a percolation approach to describe unsteady group-combustion excitation based on findings obtained from microgravity experiments on the flame spread of fuel droplets. We focus on droplet clouds distributed randomly in three-dimensional square lattices with a low-volatility fuel, such as n-decane in room-temperature air, where the pre-vaporization effect is negligible. We also focus on the flame spread in dilute droplet clouds near the group-combustion-excitation limit, where the droplet interactive effect is assumed negligible. The results show that the occurrence probability of group combustion sharply decreases with the increase in mean droplet spacing around a specific value, which is termed the critical mean droplet spacing. If the lattice size is at smallest about ten times as large as the flame-spread limit distance, the flame-spread characteristics are similar to those over an infinitely large cluster. The number density of unburned droplets remaining after completion of burning attained maximum around the critical mean droplet spacing. Therefore, the critical mean droplet spacing is a good index for stable combustion and unburned hydrocarbon. In the critical condition, the flame spreads through complicated paths, and thus the characteristic time scale of flame spread over droplet clouds has a very large value. The overall flame-spread rate of randomly distributed droplet clouds is almost the same as the flame-spread rate of a linear droplet array except over the flame-spread limit.

  6. Income distribution patterns from a complete social security database

    NASA Astrophysics Data System (ADS)

    Derzsy, N.; Néda, Z.; Santos, M. A.

    2012-11-01

    We analyze the income distribution of employees for 9 consecutive years (2001-2009) using a complete social security database for an economically important district of Romania. The database contains detailed information on more than half a million taxpayers, including their monthly salaries from all employers where they worked. Besides studying the characteristic distribution functions in the high and low/medium income limits, the database allows us a detailed dynamical study by following the time-evolution of the taxpayers' income. To our knowledge, this is the first extensive study of this kind (a previous Japanese taxpayers survey was limited to two years). In the high income limit we prove once again the validity of Pareto's law, obtaining a perfect scaling on four orders of magnitude in the rank for all the studied years. The obtained Pareto exponents are quite stable with values around α≈2.5, in spite of the fact that during this period the economy developed rapidly and a financial-economic crisis also hit Romania in 2007-2008. For the low and medium income category we confirmed the exponential-type income distribution. Following the income of employees in time, we have found that the top of the income distribution is a highly dynamical region with strong fluctuations in the rank. In this region, the observed dynamics is consistent with a multiplicative random growth hypothesis. Contrary to previous results obtained for Japanese employees, we find that the logarithmic growth-rate is not independent of the income.

  7. Source Localization in Wireless Sensor Networks with Randomly Distributed Elements under Multipath Propagation Conditions

    DTIC Science & Technology

    2009-03-01

    Engineer's thesis by Georgios Tsivgoulis, March 2009 (111 pages). Title: Source Localization in Wireless Sensor Networks with Randomly Distributed Elements under Multipath Propagation Conditions. The approach makes use of the non-line-of-sight information. Subject terms: wireless sensor network, direction of arrival (DOA), random.

  8. The Analysis of Completely Randomized Factorial Experiments When Observations Are Lost at Random.

    ERIC Educational Resources Information Center

    Hummel, Thomas J.

    An investigation was conducted of the characteristics of two estimation procedures and corresponding test statistics used in the analysis of completely randomized factorial experiments when observations are lost at random. For one estimator, contrast coefficients for cell means did not involve the cell frequencies. For the other, contrast…

  9. No Additional Benefits of Block- Over Evenly-Distributed High-Intensity Interval Training within a Polarized Microcycle

    PubMed Central

    McGawley, Kerry; Juudas, Elisabeth; Kazior, Zuzanna; Ström, Kristoffer; Blomstrand, Eva; Hansson, Ola; Holmberg, Hans-Christer

    2017-01-01

    Introduction: The current study aimed to investigate the responses to block- versus evenly-distributed high-intensity interval training (HIT) within a polarized microcycle. Methods: Twenty well-trained junior cross-country skiers (10 males, age 17.6 ± 1.5 and 10 females, age 17.3 ± 1.5) completed two, 3-week periods of training (EVEN and BLOCK) in a randomized, crossover-design study. In EVEN, 3 HIT sessions (5 × 4-min of diagonal-stride roller-skiing) were completed at a maximal sustainable intensity each week while low-intensity training (LIT) was distributed evenly around the HIT. In BLOCK, the same 9 HIT sessions were completed in the second week while only LIT was completed in the first and third weeks. Heart rate (HR), session ratings of perceived exertion (sRPE), and perceived recovery (pREC) were recorded for all HIT and LIT sessions, while distance covered was recorded for each HIT interval. The recovery-stress questionnaire for athletes (RESTQ-Sport) was completed weekly. Before and after EVEN and BLOCK, resting saliva and muscle samples were collected and an incremental test and 600-m time-trial (TT) were completed. Results: Pre- to post-testing revealed no significant differences between EVEN and BLOCK for changes in resting salivary cortisol, testosterone, or IgA, or for changes in muscle capillary density, fiber area, fiber composition, enzyme activity (CS, HAD, and PFK) or the protein content of VEGF or PGC-1α. Neither were any differences observed in the changes in skiing economy, V̇O2max or 600-m time-trial performance between interventions. These findings were coupled with no significant differences between EVEN and BLOCK for distance covered during HIT, summated HR zone scores, total sRPE training load, overall pREC or overall recovery-stress state. However, 600-m TT performance improved from pre- to post-training, irrespective of intervention (P = 0.003), and a number of hormonal and muscle biopsy markers were also significantly altered post-training (P < 0.05). Discussion: The current study shows that well-trained junior cross-country skiers are able to complete 9 HIT sessions within 1 week without compromising total work done and without experiencing greater stress or reduced recovery over a 3-week polarized microcycle. However, the findings do not support block-distributed HIT as a superior method to a more even distribution of HIT in terms of enhancing physiological or performance adaptations. PMID:28659826

  11. Personalized contact strategies and predictors of time to survey completion: analysis of two sequential randomized trials.

    PubMed

    Dinglas, Victor D; Huang, Minxuan; Sepulveda, Kristin A; Pinedo, Mariela; Hopkins, Ramona O; Colantuoni, Elizabeth; Needham, Dale M

    2015-01-09

    Effective strategies for contacting and recruiting study participants are critical in conducting clinical research. In this study, we conducted two sequential randomized controlled trials of mail- and telephone-based strategies for contacting and recruiting participants, and evaluated participant-related variables' association with time to survey completion and survey completion rates. Subjects eligible for this study were survivors of acute lung injury who had been previously enrolled in a 12-month observational follow-up study evaluating their physical, cognitive and mental health outcomes, with their last study visit completed at a median of 34 months previously. Eligible subjects were contacted to complete a new research survey as part of two randomized trials, initially using a randomized mail-based contact strategy, followed by a randomized telephone-based contact strategy for non-responders to the mail strategy. Both strategies focused on using either a personalized versus a generic approach. In addition, 18 potentially relevant subject-related variables (e.g., demographics, last known physical and mental health status) were evaluated for association with time to survey completion. Of 308 eligible subjects, 67% completed the survey with a median (IQR) of 3 (2, 5) contact attempts required. There was no significant difference in the time to survey completion for either randomized trial of mail- or phone-based contact strategy. Among all subject-related variables, age ≤40 years and minority race were independently associated with a longer time to survey completion. We found that age ≤40 years and minority race were associated with a longer time to survey completion, but personalized versus generic approaches to mail- and telephone-based contact strategies had no significant effect. Repeating both mail and telephone contact attempts was important for increasing survey completion rate. NCT00719446.

  12. Tectonic resurfacing of Venus

    NASA Technical Reports Server (NTRS)

    Malin, Michael C.; Grimm, Robert E.; Herrick, Robert R.

    1993-01-01

    Impact crater distributions and morphologies have traditionally played an important role in unraveling the geologic histories of terrestrial objects, and Venus has proved no exception. The key observations are: mean crater retention age about 500 Ma; apparently random spatial distribution; modest proportion (17 percent) of modified craters; and preferential association of modified craters with areas of low crater density. The simplest interpretation of these data alone is that Venus experienced global resurfacing (assumed to be largely volcanic) prior to 500 Ma, after which time resurfacing rates decreased dramatically. This scenario does not totally exclude present geological activity: some resurfacing and crater obliteration is occurring on part of the planet, but at rates much smaller than on Earth. An alternative endmember model holds that resurfacing is also spatially randomly distributed. Resurfacing of about 1 sq km/yr eliminates craters such that a typical portion of the surface has an age of 500 Ma, but actual ages range from zero to about 1000 Ma. Monte Carlo simulation indicates that the typical resurfacing 'patch' cannot exceed about 500 km in diameter without producing a crater distribution more heterogeneous than observed. Volcanic or tectonic processes within these patches must be locally intense to be able to obliterate craters completely and leave few modified. In this abstract, we describe how global geologic mapping may be used to test resurfacing hypotheses. We present preliminary evidence that the dominant mode of resurfacing on Venus is tectonism, not volcanism, and that this process must be ongoing today. Lastly, we outline a conceptual model in which to understand the relationship between global tectonics and crater distribution and preservation.

  13. On the Wigner law in dilute random matrices

    NASA Astrophysics Data System (ADS)

    Khorunzhy, A.; Rodgers, G. J.

    1998-12-01

    We consider ensembles of N × N symmetric matrices whose entries are weakly dependent random variables. We show that random dilution can change the limiting eigenvalue distribution of such matrices. We prove that under general and natural conditions the normalised eigenvalue counting function coincides with the semicircle (Wigner) distribution in the limit N → ∞. This can be explained by the observation that dilution (or more generally, random modulation) eliminates the weak dependence (or correlations) between random matrix entries. It also supports our earlier conjecture that the Wigner distribution is stable to random dilution and modulation.
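
    A quick numerical illustration of the statement, as a generic demonstration rather than the authors' proof technique: the empirical eigenvalue density of a strongly diluted symmetric Gaussian matrix, normalised by √(pN), still tracks the Wigner semicircle.

```python
# Eigenvalue density of a diluted symmetric random matrix vs. the semicircle.
import numpy as np

rng = np.random.default_rng(8)
N, p = 2000, 0.1                           # matrix size, dilution level

A = rng.normal(size=(N, N))
A = (A + A.T) / np.sqrt(2)                 # symmetric Gaussian entries
mask = rng.random((N, N)) < p
mask = np.triu(mask) | np.triu(mask, 1).T  # symmetric dilution pattern
D = A * mask / np.sqrt(p * N)              # normalised diluted matrix

eigs = np.linalg.eigvalsh(D)
hist, edges = np.histogram(eigs, bins=40, range=(-2.2, 2.2), density=True)
mid = (edges[:-1] + edges[1:]) / 2
semicircle = np.sqrt(np.clip(4 - mid**2, 0, None)) / (2 * np.pi)
print(f"max deviation from semicircle: {np.max(np.abs(hist - semicircle)):.3f}")
```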

  14. A probabilistic Hu-Washizu variational principle

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.

  15. Proactive schema based link lifetime estimation and connectivity ratio.

    PubMed

    Bachir, Bouamoud; Ali, Ouacha; Ahmed, Habbani; Mohamed, Elkoutbi

    2014-01-01

    The radio link between a pair of wireless nodes is affected by a set of random factors such as transmission range, node mobility, and environment conditions. The properties of such radio links are continually sampled as a node's status alternates between being reachable and being unreachable; on completion of each such episode, the statistical distribution of the link lifetime is updated. This aspect is emphasized in mobile ad hoc networks, especially when they are deployed in fields that require intelligent processing of data, such as the aerospace domain.

  16. Evaluating the Use of Random Distribution Theory to Introduce Statistical Inference Concepts to Business Students

    ERIC Educational Resources Information Center

    Larwin, Karen H.; Larwin, David A.

    2011-01-01

    Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…
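
    The record is truncated in the source. As a self-contained illustration of the approach it describes, here is a minimal bootstrapping simulation: resampling one small sample to approximate the sampling distribution of the mean and a percentile confidence interval.

```python
# Bootstrap approximation of the sampling distribution of the mean.
import numpy as np

rng = np.random.default_rng(9)
sample = np.array([12, 15, 9, 22, 17, 14, 20, 11, 16, 13], dtype=float)

boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(10_000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"sample mean = {sample.mean():.1f}, "
      f"95% bootstrap CI = ({lo:.1f}, {hi:.1f})")
```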

  17. Models of multidimensional discrete distribution of probabilities of random variables in information systems

    NASA Astrophysics Data System (ADS)

    Gromov, Yu Yu; Minin, Yu V.; Ivanova, O. G.; Morozova, O. N.

    2018-03-01

    Multidimensional discrete probability distributions of independent random variables are derived. Their one-dimensional marginals are widely used in probability theory. Generating functions of these multidimensional distributions are also obtained.

  18. Scanning Electron Microscopy and Energy-Dispersive X-Ray Microanalysis of Set CEM Cement after Application of Different Bleaching Agents.

    PubMed

    Samiei, Mohammad; Janani, Maryam; Vahdati, Amin; Alemzadeh, Yalda; Bahari, Mahmoud

    2017-01-01

    The present study evaluated the element distribution in completely set calcium-enriched mixture (CEM) cement after application of 35% carbamide peroxide, 40% hydrogen peroxide and sodium perborate as commercial bleaching agents, using an energy-dispersive x-ray microanalysis (EDX) system. The surface structure was also observed using the scanning electron microscope (SEM). Twenty completely set CEM cement samples, measuring 4×4 mm², were prepared in the present in vitro study and randomly divided into 4 groups based on the preparation technique as follows: the control group; the 35% carbamide peroxide group, in contact for 30-60 min, 4 times; the 40% hydrogen peroxide group, with a contact time of 15-20 min, 3 times; and the sodium perborate group, where the powder and liquid were mixed and placed on the CEM cement surface 4 times. Data were analyzed at a significance level of 0.05 using one-way ANOVA and Tukey's post hoc tests. EDX showed similar element distributions of oxygen, sodium, calcium and carbon in CEM cement with the use of carbamide peroxide and hydrogen peroxide; however, the distribution of silicon was different (P<0.05). In addition, these bleaching agents resulted in significantly higher levels of oxygen and carbon (P<0.05) and a lower level of calcium (P<0.05) compared to the control group. SEM of the control group showed a plate-like and globular structure. Sodium perborate was similar to the control group due to its weak oxidizing properties. Globular structures and numerous woodpecker holes were observed on the even surface in the carbamide peroxide group. The mean elemental distribution of completely set CEM cement was different when exposed to sodium perborate, carbamide peroxide and hydrogen peroxide.

  19. Computer simulation of random variables and vectors with arbitrary probability distribution laws

    NASA Technical Reports Server (NTRS)

    Bogdan, V. M.

    1981-01-01

    Assume that there is given an arbitrary n-dimensional probability distribution F. A recursive construction is found for a sequence of functions x₁ = f₁(U₁, …, Uₙ), …, xₙ = fₙ(U₁, …, Uₙ) such that if U₁, …, Uₙ are independent random variables having uniform distribution over the open interval (0,1), then the joint distribution of the variables x₁, …, xₙ coincides with the distribution F. Since uniform independent random variables can be well simulated by means of a computer, this result allows one to simulate arbitrary n-dimensional random variables if their joint probability distribution is known.
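
    A minimal sketch of this construction for a concrete bivariate case, assuming the target F is a standard bivariate normal with correlation ρ: x₁ is produced by the marginal quantile function, and x₂ by the quantile function of the conditional distribution given x₁.

```python
# Conditional-quantile construction of a correlated bivariate normal
# from two independent uniform random variables.
import numpy as np
from scipy.stats import norm, pearsonr

rho = 0.7
rng = np.random.default_rng(10)
u1, u2 = rng.random(50_000), rng.random(50_000)

x1 = norm.ppf(u1)                       # f1: marginal quantile of X1
x2 = norm.ppf(u2, loc=rho * x1,         # f2: quantile of X2 given X1 = x1
              scale=np.sqrt(1 - rho**2))

r, _ = pearsonr(x1, x2)
print(f"target rho = {rho}, sample correlation = {r:.3f}")
```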

  20. Randomized clinical trials in orthodontics are rarely registered a priori and often published late or not at all.

    PubMed

    Papageorgiou, Spyridon N; Antonoglou, Georgios N; Sándor, George K; Eliades, Theodore

    2017-01-01

    The a priori registration of randomized clinical trials is crucial to the transparency and credibility of their findings. The aim of this study was to assess the frequency with which registered and completed randomized trials in orthodontics are published. We searched ClinicalTrials.gov and ISRCTN for registered randomized clinical trials in orthodontics that had been completed up to January 2017 and judged the publication status and date of registered trials using a systematic protocol. Statistical analysis included descriptive statistics, chi-square or Fisher exact tests, and Kaplan-Meier survival estimates. Of the 266 orthodontic trials registered up to January 2017, 80 trials had been completed and were included in the present study. Among these 80 included trials, the majority (76%) were registered retrospectively, while only 33 (41%) were published at the time. The median time from completion to publication was 20.1 months (interquartile range: 9.1 to 31.6 months), while survival analysis indicated that less than 10% of the trials were published after 5 years from their completion. Finally, 22 (28%) of completed trials remain unpublished even after 5 years from their completion. Publication rates of registered randomized trials in orthodontics remained low, even 5 years after their completion date.

  2. Listing triangles in expected linear time on a class of power law graphs.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nordman, Daniel J.; Wilson, Alyson G.; Phillips, Cynthia Ann

    Enumerating triangles (3-cycles) in graphs is a kernel operation for social network analysis. For example, many community detection methods depend upon finding common neighbors of two related entities. We consider Cohen's simple and elegant solution for listing triangles: give each node a 'bucket.' Place each edge into the bucket of its endpoint of lowest degree, breaking ties consistently. Each node then checks each pair of edges in its bucket, testing for the adjacency that would complete that triangle. Cohen presents an informal argument that his algorithm should run well on real graphs. We formalize this argument by providing an analysis for the expected running time on a class of random graphs, including power law graphs. We consider a rigorously defined method for generating a random simple graph, the erased configuration model (ECM). In the ECM each node draws a degree independently from a marginal degree distribution, endpoints pair randomly, and we erase self loops and multiedges. If the marginal degree distribution has a finite second moment, it follows immediately that Cohen's algorithm runs in expected linear time. Furthermore, it can still run in expected linear time even when the degree distribution has such a heavy tail that the second moment is not finite. We prove that Cohen's algorithm runs in expected linear time when the marginal degree distribution has finite 4/3 moment and no vertex has degree larger than √n. In fact we give the precise asymptotic value of the expected number of edge pairs per bucket. A finite 4/3 moment is required; if it is unbounded, then so is the number of pairs. The marginal degree distribution of a power law graph has bounded 4/3 moment when its exponent α is more than 7/3. Thus for this class of power law graphs, with degree at most √n, Cohen's algorithm runs in expected linear time. This is precisely the value of α for which the clustering coefficient tends to zero asymptotically, and it is in the range that is relevant for the degree distribution of the World-Wide Web.
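
    A minimal sketch of Cohen's bucketing scheme on a toy undirected graph; the adjacency data are hypothetical, and the tie-breaking rule (degree, then node id) is one consistent choice.

```python
# Bucketed triangle listing: each edge goes to its lower-degree endpoint,
# and each node tests pairs of edges in its bucket for the closing edge.
from itertools import combinations

def list_triangles(adj):
    """adj: dict node -> set of neighbours (undirected, no self loops)."""
    def key(v):
        return (len(adj[v]), v)        # degree, with node id as tiebreak
    bucket = {v: [] for v in adj}
    for u in adj:
        for v in adj[u]:
            if key(u) < key(v):        # place each edge exactly once
                bucket[u].append(v)
    triangles = set()
    for u, edges in bucket.items():
        for v, w in combinations(edges, 2):
            if w in adj[v]:            # adjacency completes the triangle
                triangles.add(frozenset((u, v, w)))
    return triangles

adj = {1: {2, 3, 4}, 2: {1, 3}, 3: {1, 2, 4}, 4: {1, 3}}
print(list_triangles(adj))  # {frozenset({1, 2, 3}), frozenset({1, 3, 4})}
```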

  3. Predictors of study completion and withdrawal in a randomized clinical trial of a pediatric diabetes adherence intervention.

    PubMed

    Driscoll, Kimberly A; Killian, Michael; Johnson, Suzanne Bennett; Silverstein, Janet H; Deeb, Larry C

    2009-05-01

    Loss of participants in randomized clinical trials threatens the validity of study findings. The purpose of this study was to determine pre-randomization predictors of study completion status throughout the course of a randomized clinical trial involving young children with type 1 diabetes and their primary caregivers. An intervention to improve adherence to the diabetes treatment regimen was delivered as part of the child's regular 3-month diabetes clinic visit. The study protocol involved 7 clinic visits across 18 months for the Immediate Treatment group and 9 clinic visits across 24 months for the Delayed Treatment group. Among those who completed the study and regardless of treatment group, participants were categorized into two groups: On-Time Completers (n=41) and Late Completers (n=39). Demographic, disease, and psychosocial characteristics of children and their primary caregivers measured prior to study randomization were tested for their association with the participants' completion status (i.e., On-Time Completers, Late Completers, or Withdrawals). Of the 108 participants, 28 (25.9%) withdrew and 80 (74.1%) completed the study. On-Time Completers (i.e., study completed within 4 months of expected date) were more likely to have private insurance and primary caregivers with some college education. Late Completers (i.e., study completion took longer than 4 months) were more likely to be boys and to have primary caregivers who reported mild to moderate levels of depression. Children who subsequently withdrew from the study reported poorer diabetes-related quality of life and poorer school-related quality of life at study inception and were more likely to have primary caregivers who did not work outside the home. Pre-randomization screening of participants on both demographic and psychological variables may help identify those at greatest risk for study withdrawal or poor study protocol adherence, permitting the investigators to develop retention strategies aimed at this high-risk group.

  4. Time-fractional characterization of brine reaction and precipitation in porous media

    NASA Astrophysics Data System (ADS)

    Xu, Jianping; Jiang, Guancheng

    2018-04-01

    Brine reaction and precipitation in porous media sometimes occur in the presence of a strong fluid flow field, which mobilizes the precipitated salts and distorts their spatial distribution. It is interesting to investigate how the distribution responds to such mobilization. We view these precipitates as random walkers in the complex inner space of the porous media, where they make stochastic jumps among locations and possibly wait between successive transitions. In consideration of related experimental results, the waiting time of the precipitates at a particular position is allowed to range widely from short sojourn to permanent residence. Through the model of a continuous-time random walk, a class of time-fractional equations for the precipitate's concentration profile is derived, including that in the Riemann-Liouville formalism and the Prabhakar formalism. The solutions to these equations show the general pattern of the precipitate's spatiotemporal evolution: a coupling of mass accumulation and mass transport. The degree to which the mass is mobilized turns out to be monotonically correlated with the fractional exponent α. Moreover, for completeness of the model, we further discuss how the interaction among the precipitates influences the precipitation process. In doing so, a time-fractional non-linear Fokker-Planck equation with source term is introduced and solved. It is shown that the interaction among the precipitates slightly perturbs their spatial distribution. This distribution is largely dominated by the brine reaction itself and the interaction between the precipitates and the porous media.
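
    As a rough numerical illustration of the continuous-time random walk picture (not the authors' fractional-equation machinery), the sketch below simulates walkers whose waiting times follow a heavy-tailed Pareto law; the exponent alpha plays the role of a tail parameter, and all values are arbitrary.

```python
import math
import random

def ctrw_positions(n_walkers=5000, t_max=100.0, alpha=0.7, seed=1):
    """Continuous-time random walk: unit Gaussian jumps separated by
    Pareto waiting times w = u**(-1/alpha) >= 1 (heavier trapping for
    smaller alpha, mimicking sojourns from short waits to near-permanent
    residence)."""
    rng = random.Random(seed)
    positions = []
    for _ in range(n_walkers):
        t, x = 0.0, 0.0
        while True:
            t += (1.0 - rng.random()) ** (-1.0 / alpha)  # waiting time
            if t > t_max:
                break
            x += rng.gauss(0.0, 1.0)                     # jump after waiting
        positions.append(x)
    return positions

for alpha in (0.5, 0.9):
    xs = ctrw_positions(alpha=alpha)
    rms = math.sqrt(sum(x * x for x in xs) / len(xs))
    print(f"alpha={alpha}: RMS displacement {rms:.2f}")
```

    Smaller alpha means longer trapping between jumps and hence less mobilized mass, qualitatively matching the monotonic dependence on the fractional exponent noted above.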

  5. Evaluation of Gas Phase Dispersion in Flotation under Predetermined Hydrodynamic Conditions

    NASA Astrophysics Data System (ADS)

    Młynarczykowska, Anna; Oleksik, Konrad; Tupek-Murowany, Klaudia

    2018-03-01

    Results of various investigations show a relationship between flotation parameters and gas distribution in a flotation cell. The size of gas bubbles is a random variable with a specific distribution, and the analysis of this distribution is useful for a mathematical description of the flotation process. The flotation process depends on many variable factors, mainly events such as the collision of a single particle with a gas bubble, the adhesion of a particle to the surface of a bubble, and detachment. These events are characterized by randomness, so it is only possible to speak of the probability of occurrence of one of them, which directly affects the speed of the process and thus the flotation rate constant. The probability of bubble-particle collision in a flotation chamber with mechanical pulp agitation depends on the surface tension of the solution, air consumption, degree of pulp aeration, energy dissipation, and average feed particle size. Appropriate identification and description of the parameters of gas bubble dispersion help to complete the analysis of the flotation process under specific physicochemical and hydrodynamic conditions for any raw material. The article presents the results of measurements and analysis of gas phase dispersion, via the size distribution of air bubbles, in a flotation chamber under fixed hydrodynamic conditions. The tests were carried out in the Laboratory of Instrumental Methods in the Department of Environmental Engineering and Mineral Processing, Faculty of Mining and Geoengineering, AGH University of Science and Technology in Krakow.

  6. Moderation analysis with missing data in the predictors.

    PubMed

    Zhang, Qian; Wang, Lijuan

    2017-12-01

    The most widely used statistical model for conducting moderation analysis is the moderated multiple regression (MMR) model. In MMR modeling, missing data can pose a challenge, mainly because the interaction term is a product of two or more variables and thus is a nonlinear function of the involved variables. In this study, we consider a simple MMR model, where the effect of the focal predictor X on the outcome Y is moderated by a moderator U. The primary interest is to find ways of estimating and testing the moderation effect in the presence of missing data in X. We mainly focus on cases in which X is missing completely at random (MCAR) or missing at random (MAR). Three methods are compared: (a) normal-distribution-based maximum likelihood estimation (NML); (b) normal-distribution-based multiple imputation (NMI); and (c) Bayesian estimation (BE). Via simulations, we found that NML and NMI can lead to biased estimates of moderation effects under the MAR missingness mechanism. The BE method outperformed NMI and NML for MMR modeling with missing data in the focal predictor, missingness depending on the moderator and/or auxiliary variables, and a correctly specified distribution for the focal predictor. In addition, more robust BE methods are needed to address mis-specification of the focal predictor's distribution. An empirical example was used to illustrate the applications of the methods with a simple sensitivity analysis. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
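
    To make the two missingness mechanisms concrete, here is a small sketch (hypothetical coefficients, not taken from the study) that generates data from a simple MMR model Y = b0 + b1 X + b2 U + b3 XU + e and then deletes X either completely at random (MCAR) or with a probability that depends on the observed moderator U (MAR).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
U = rng.normal(size=n)                      # moderator, fully observed
X = rng.normal(size=n)                      # focal predictor
Y = 0.5 + 0.3 * X + 0.2 * U + 0.4 * X * U + rng.normal(size=n)

# MCAR: every X value has the same 30% chance of being missing.
mcar_mask = rng.random(n) < 0.30

# MAR: missingness in X depends on the observed moderator U
# (higher U -> more likely missing), but not on X itself.
p_missing = 1.0 / (1.0 + np.exp(-(U - 0.5)))
mar_mask = rng.random(n) < p_missing

X_mcar = np.where(mcar_mask, np.nan, X)
X_mar = np.where(mar_mask, np.nan, X)
print(f"MCAR missing: {np.isnan(X_mcar).mean():.0%}, "
      f"MAR missing: {np.isnan(X_mar).mean():.0%}")
```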

  7. Stochastic space interval as a link between quantum randomness and macroscopic randomness?

    NASA Astrophysics Data System (ADS)

    Haug, Espen Gaarder; Hoff, Harald

    2018-03-01

    For many stochastic phenomena, we observe statistical distributions that have fat tails and high peaks compared to the Gaussian distribution. In this paper, we explain how observable statistical distributions in the macroscopic world could be related to randomness in the subatomic world. We show that fat-tailed (leptokurtic) phenomena in our everyday macroscopic world are ultimately rooted in Gaussian, or very nearly Gaussian, subatomic particle randomness, but they are not, in a strict sense, Gaussian distributions. By running a truly random experiment over a three-and-a-half-year period, we observed a type of random behavior in trillions of photons. Combining our results with simple logic, we find that fat-tailed and high-peaked statistical distributions are exactly what we would expect to observe if the subatomic world is quantized and not continuously divisible. We extend our analysis to the fact that one typically observes fat tails and high peaks relative to the Gaussian distribution in stock and commodity prices and in many aspects of the natural world; these are all observable and documentable macro phenomena that strongly suggest that the ultimate building blocks of nature are discrete (e.g., they appear in quanta).

  8. Distributed Detection with Collisions in a Random, Single-Hop Wireless Sensor Network

    DTIC Science & Technology

    2013-05-26

    Approved for public release; distribution is unlimited. Authors: Gene T. Whipps (U.S. Army Research Laboratory, Adelphi, MD 20783; The Ohio State University), Emre Ertin and Randolph L. Moses (The Ohio State University). The report considers the problem of distributed detection with collisions in a random, single-hop wireless sensor network.

  9. Open-Label, Randomized, Parallel-Group Controlled Clinical Trial of Massage for Treatment of Depression in HIV-Infected Subjects

    PubMed Central

    Gertsik, Lev; Favreau, Joya T.; Smith, Shawnee I.; Mirocha, James M.; Rao, Uma; Daar, Eric S.

    2013-01-01

    Abstract Objectives The study objectives were to determine whether massage therapy reduces symptoms of depression in subjects with human immunodeficiency virus (HIV) disease. Design Subjects were randomized, non-blinded, into one of three parallel groups for eight weeks: Swedish massage or one of two control conditions, touch or no intervention. Settings/location The study was conducted at the Department of Psychiatry and Behavioral Neurosciences at Cedars-Sinai Medical Center in Los Angeles, California, which provided primary clinical care in an institutional setting. Subjects Study inclusion required being at least 16 years of age, HIV-seropositive, with a diagnosis of major depressive disorder. Subjects had to be on a stable neuropsychiatric, analgesic, and antiretroviral regimen for >30 days with no plans to modify therapy for the duration of the study. Approximately 40% of the subjects were taking antidepressants at enrollment. All subjects were medically stable. Fifty-four (54) subjects were randomized, 50 completed at least 1 week (intent-to-treat; ITT), and 37 completed the study (completers). Interventions Swedish massage and touch subjects visited the massage therapist for 1 hour twice per week. For the touch group, a massage therapist placed both hands on the subject with slight pressure, but no massage, in a uniform distribution in the same pattern used for the massage subjects. Outcome measures The primary outcome measure was the Hamilton Rating Scale for Depression score; the secondary outcome measure was the Beck Depression Inventory. Results For both the ITT and completers analyses, massage significantly reduced the severity of depression beginning at week 4 (p≤0.04) and continuing at weeks 6 (p≤0.03) and 8 (p≤0.005) compared to no intervention and/or touch. Conclusions The results indicate that massage therapy can reduce symptoms of depression in subjects with HIV disease. The durability of the response, optimal “dose” of massage, and mechanisms by which massage exerts its antidepressant effects remain to be determined. PMID:23098696

  10. A State-of-the-Science Overview of Randomized Controlled Trials Evaluating Acute Management of Moderate-to-Severe Traumatic Brain Injury

    PubMed Central

    Synnot, Anneliese; Maas, Andrew I.; Menon, David K.; Cooper, D. James; Rosenfeld, Jeffrey V.; Gruen, Russell L.

    2016-01-01

    Abstract Moderate-to-severe traumatic brain injury (TBI) remains a major global challenge, with rising incidence, unchanging mortality and lifelong impairments. State-of-the-science reviews are important for research planning and clinical decision support. This review aimed to identify randomized controlled trials (RCTs) evaluating interventions for acute management of moderate/severe TBI, synthesize key RCT characteristics and findings, and determine their implications on clinical practice and future research. RCTs were identified through comprehensive database and other searches. Key characteristics, outcomes, risk of bias, and analysis approach were extracted. Data were narratively synthesized, with a focus on robust (multi-center, low risk of bias, n > 100) RCTs, and three-dimensional graphical figures also were used to explore relationships between RCT characteristics and findings. A total of 207 RCTs were identified. The 191 completed RCTs enrolled 35,340 participants (median, 66). Most (72%) were single center and enrolled less than 100 participants (69%). There were 26 robust RCTs across 18 different interventions. For 74% of 392 comparisons across all included RCTs, there was no significant difference between groups. Positive findings were broadly distributed with respect to RCT characteristics. Less than one-third of RCTs demonstrated low risk of bias for random sequence generation or allocation concealment, less than one-quarter used covariate adjustment, and only 7% employed an ordinal analysis approach. Considerable investment of resources in producing 191 completed RCTs for acute TBI management has resulted in very little translatable evidence. This may result from broad distribution of research effort, small samples, preponderance of single-center RCTs, and methodological shortcomings. More sophisticated RCT design, large multi-center RCTs in priority areas, increased focus on pre-clinical research, and alternatives to RCTs, such as comparative effectiveness research and precision medicine, are needed to fully realize the potential of acute TBI research to benefit patients. PMID:26711675

  11. Investigation of a protein complex network

    NASA Astrophysics Data System (ADS)

    Mashaghi, A. R.; Ramezanpour, A.; Karimipour, V.

    2004-09-01

    The budding yeast Saccharomyces cerevisiae is the first eukaryote whose genome was completely sequenced. It is also the first eukaryotic cell whose proteome (the set of all proteins) and interactome (the network of all mutual interactions between proteins) have been analyzed. In this paper we study the structure of the yeast protein complex network, in which weighted edges between complexes represent the number of shared proteins. We find that the network of protein complexes is a small-world network with scale-free behavior for many of its distributions. However, we find no strong correlations between the weights and degrees of neighboring complexes. To reveal non-random features of the network we also compare it with a null model in which the complexes randomly select their proteins. Finally, we propose a simple evolutionary model based on duplication and divergence of proteins.

  12. Analysis of overdispersed count data by mixtures of Poisson variables and Poisson processes.

    PubMed

    Hougaard, P; Lee, M L; Whitmore, G A

    1997-12-01

    Count data often show overdispersion compared to the Poisson distribution. Overdispersion is typically modeled by a random effect for the mean, based on the gamma distribution, leading to the negative binomial distribution for the count. This paper considers a larger family of mixture distributions, including the inverse Gaussian mixture distribution, and demonstrates that it gives a significantly better fit for a data set on the frequency of epileptic seizures. The same approach can be used to generate counting processes from Poisson processes, where the rate or the time is random. A random rate corresponds to variation between patients, whereas a random time corresponds to variation within patients.
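
    The gamma-mixture construction can be checked numerically in a few lines. The sketch below (illustrative parameters only) draws Poisson counts with gamma-distributed rates and compares the sample variance with the negative binomial prediction mu + mu^2/k.

```python
import numpy as np

rng = np.random.default_rng(42)
k, theta = 2.0, 3.0                       # gamma shape and scale
rates = rng.gamma(k, theta, size=200_000) # patient-specific random rates
counts = rng.poisson(rates)               # gamma-mixed Poisson draws

mu = k * theta                            # mean of the mixture
nb_var = mu + mu**2 / k                   # negative binomial variance
print(f"sample mean {counts.mean():.2f} vs theory {mu:.2f}")
print(f"sample var  {counts.var():.2f} vs theory {nb_var:.2f} (> mean: overdispersion)")
```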

  13. Bias analysis applied to Agricultural Health Study publications to estimate non-random sources of uncertainty.

    PubMed

    Lash, Timothy L

    2007-11-26

    The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myeloma equaled 2.6 with 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a qualitative description of study limitations. The latter approach is likely to lead to overconfidence regarding the potential for causal associations, whereas the former safeguards against such overinterpretations. Furthermore, such analyses, once programmed, allow rapid implementation of alternative assignments of probability distributions to the bias parameters, and so elevate the plane of discussion regarding study bias from characterizing studies as "valid" or "invalid" to a critical and quantitative discussion of sources of uncertainty.
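
    The iterative draw-and-adjust scheme is generic enough to sketch. The fragment below is a minimal, hypothetical probabilistic bias analysis for a hazard ratio; the bias-parameter distribution and all numbers are placeholders chosen for illustration, not the study's assigned distributions.

```python
import numpy as np

rng = np.random.default_rng(7)
log_hr = np.log(2.6)     # conventional point estimate (example value)
se_log_hr = 0.66         # roughly implied by a 0.7-9.4 confidence interval

adjusted = []
for _ in range(50_000):
    # Draw a bias parameter: here a multiplicative bias on the hazard
    # ratio, log-normal around 1.3 (a loudly hypothetical assignment).
    bias = rng.lognormal(mean=np.log(1.3), sigma=0.2)
    # Bias-adjust the estimate, then add conventional random error.
    adjusted.append(log_hr - np.log(bias) + rng.normal(0.0, se_log_hr))

lo, med, hi = np.exp(np.percentile(adjusted, [2.5, 50, 97.5]))
print(f"median HR {med:.1f}, 95% simulation interval {lo:.1f}-{hi:.1f}")
```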

  14. PLATYPUS: A code for reaction dynamics of weakly-bound nuclei at near-barrier energies within a classical dynamical model

    NASA Astrophysics Data System (ADS)

    Diaz-Torres, Alexis

    2011-04-01

    A self-contained Fortran-90 program based on a three-dimensional classical dynamical reaction model with stochastic breakup is presented, which is a useful tool for quantifying complete and incomplete fusion, and breakup in reactions induced by weakly-bound two-body projectiles near the Coulomb barrier. The code calculates (i) integrated complete and incomplete fusion cross sections and their angular momentum distribution, (ii) the excitation energy distribution of the primary incomplete-fusion products, (iii) the asymptotic angular distribution of the incomplete-fusion products and the surviving breakup fragments, and (iv) breakup observables, such as angle, kinetic energy and relative energy distributions.

    Program summary
    Program title: PLATYPUS
    Catalogue identifier: AEIG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIG_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 332 342
    No. of bytes in distributed program, including test data, etc.: 344 124
    Distribution format: tar.gz
    Programming language: Fortran-90
    Computer: Any Unix/Linux workstation or PC with a Fortran-90 compiler
    Operating system: Linux or Unix
    RAM: 10 MB
    Classification: 16.9, 17.7, 17.8, 17.11
    Nature of problem: The program calculates a wide range of observables in reactions induced by weakly-bound two-body nuclei near the Coulomb barrier. These include integrated complete and incomplete fusion cross sections and their spin distribution, as well as breakup observables (e.g. the angle, kinetic energy, and relative energy distributions of the fragments).
    Solution method: All the observables are calculated using a three-dimensional classical dynamical model combined with the Monte Carlo sampling of probability-density distributions. See Refs. [1,2] for further details.
    Restrictions: The program is suited for a weakly-bound two-body projectile colliding with a stable target. The initial orientation of the segment joining the two breakup fragments is considered to be isotropic.
    Additional comments: Several source routines from Numerical Recipes, and the Mersenne Twister random number generator package, are included to enable independent compilation.
    Running time: About 75 minutes for the input provided, using a PC with a 1.5 GHz processor.

  15. Log-normal distribution from a process that is not multiplicative but is additive.

    PubMed

    Mouri, Hideaki

    2013-10-01

    The central limit theorem ensures that a sum of random variables tends to a Gaussian distribution as their total number tends to infinity. However, for a class of positive random variables, we find that the sum tends faster to a log-normal distribution. Although the sum tends eventually to a Gaussian distribution, the distribution of the sum is always close to a log-normal distribution rather than to any Gaussian distribution if the summands are numerous enough. This is in contrast to the current consensus that any log-normal distribution is due to a product of random variables, i.e., a multiplicative process, or equivalently to nonlinearity of the system. In fact, the log-normal distribution is also observable for a sum, i.e., an additive process that is typical of linear systems. We show conditions for such a sum, an analytical example, and an application to random scalar fields such as those of turbulence.
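
    A quick simulation makes the claim tangible. The sketch below (an illustration, not the paper's analysis) sums positive, right-skewed summands and tracks the skewness of the sum, which would be zero for a Gaussian; even for many summands the sum remains noticeably skewed.

```python
import numpy as np

rng = np.random.default_rng(3)

for n_summands in (10, 100, 1000):
    sums = np.zeros(50_000)
    for _ in range(n_summands):      # accumulate positive, skewed summands
        sums += rng.lognormal(0.0, 1.5, size=sums.size)
    s = sums - sums.mean()
    skew = (s**3).mean() / (s**2).mean() ** 1.5  # zero for a Gaussian
    print(f"n={n_summands:4d}: skewness of the sum {skew:5.2f}")
```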

  16. Missing Value Imputation Approach for Mass Spectrometry-based Metabolomics Data.

    PubMed

    Wei, Runmin; Wang, Jingye; Su, Mingming; Jia, Erik; Chen, Shaoqiu; Chen, Tianlu; Ni, Yan

    2018-01-12

    Missing values are widespread in mass spectrometry (MS) based metabolomics data. Various methods have been applied for handling missing values, but the choice can significantly affect downstream data analyses. Typically, there are three types of missing values: missing not at random (MNAR), missing at random (MAR), and missing completely at random (MCAR). Our study comprehensively compared eight imputation methods (zero, half minimum (HM), mean, median, random forest (RF), singular value decomposition (SVD), k-nearest neighbors (kNN), and quantile regression imputation of left-censored data (QRILC)) for different types of missing values using four metabolomics datasets. Normalized root mean squared error (NRMSE) and NRMSE-based sum of ranks (SOR) were applied to evaluate imputation accuracy. Principal component analysis (PCA)/partial least squares (PLS)-Procrustes analysis was used to evaluate the overall sample distribution. Student's t-test followed by correlation analysis was conducted to evaluate the effects on univariate statistics. Our findings demonstrate that RF performed best for MCAR/MAR and that QRILC was favored for left-censored MNAR. Finally, we propose a comprehensive strategy and have developed a publicly accessible web tool for missing value imputation in metabolomics (https://metabolomics.cc.hawaii.edu/software/MetImp/).
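
    The NRMSE criterion used for the comparison is easy to reproduce on toy data. The sketch below masks known values under MCAR, imputes them with two of the compared methods (half-minimum and column mean), and scores each; it is a schematic illustration, not the study's pipeline.

```python
import numpy as np

rng = np.random.default_rng(5)
truth = rng.lognormal(2.0, 0.5, size=(100, 20))  # toy feature matrix

mask = rng.random(truth.shape) < 0.20            # hide 20% of entries (MCAR)
observed = np.where(mask, np.nan, truth)

def nrmse(imputed):
    err = imputed[mask] - truth[mask]
    return np.sqrt(np.mean(err**2)) / truth[mask].std()

# Half-minimum: replace missing entries with half the feature minimum,
# a common stand-in for left-censored (MNAR) values.
col_min = np.nanmin(observed, axis=0)
hm_imputed = np.where(mask, col_min / 2.0, observed)

# Column-mean imputation.
col_mean = np.nanmean(observed, axis=0)
mean_imputed = np.where(mask, col_mean, observed)

print(f"NRMSE half-minimum: {nrmse(hm_imputed):.2f}, mean: {nrmse(mean_imputed):.2f}")
```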

  17. Probability distribution of the entanglement across a cut at an infinite-randomness fixed point

    NASA Astrophysics Data System (ADS)

    Devakul, Trithep; Majumdar, Satya N.; Huse, David A.

    2017-03-01

    We calculate the probability distribution of entanglement entropy S across a cut of a finite one-dimensional spin chain of length L at an infinite-randomness fixed point using Fisher's strong randomness renormalization group (RG). Using the random transverse-field Ising model as an example, the distribution is shown to take the form p(S|L) ~ L^(-ψ(k)), where k ≡ S/ln(L/L0), the large deviation function ψ(k) is found explicitly, and L0 is a nonuniversal microscopic length. We discuss the implications of such a distribution on numerical techniques that rely on entanglement, such as matrix-product-state-based techniques. Our results are verified with numerical RG simulations, as well as the actual entanglement entropy distribution for the random transverse-field Ising model, which we calculate for large L via a mapping to Majorana fermions.

  18. A scaling law for random walks on networks

    PubMed Central

    Perkins, Theodore J.; Foxall, Eric; Glass, Leon; Edwards, Roderick

    2014-01-01

    The dynamics of many natural and artificial systems are well described as random walks on a network: the stochastic behaviour of molecules, traffic patterns on the internet, fluctuations in stock prices and so on. The vast literature on random walks provides many tools for computing properties such as steady-state probabilities or expected hitting times. Previously, however, there has been no general theory describing the distribution of possible paths followed by a random walk. Here, we show that for any random walk on a finite network, there are precisely three mutually exclusive possibilities for the form of the path distribution: finite, stretched exponential and power law. The form of the distribution depends only on the structure of the network, while the stepping probabilities control the parameters of the distribution. We use our theory to explain path distributions in domains such as sports, music, nonlinear dynamics and stochastic chemical kinetics. PMID:25311870

  19. Partial transpose of random quantum states: Exact formulas and meanders

    NASA Astrophysics Data System (ADS)

    Fukuda, Motohisa; Śniady, Piotr

    2013-04-01

    We investigate the asymptotic behavior of the empirical eigenvalues distribution of the partial transpose of a random quantum state. The limiting distribution was previously investigated via Wishart random matrices indirectly (by approximating the matrix of trace 1 by the Wishart matrix of random trace) and shown to be the semicircular distribution or the free difference of two free Poisson distributions, depending on how dimensions of the concerned spaces grow. Our use of Wishart matrices gives exact combinatorial formulas for the moments of the partial transpose of the random state. We find three natural asymptotic regimes in terms of geodesics on the permutation groups. Two of them correspond to the above two cases; the third one turns out to be a new matrix model for the meander polynomials. Moreover, we prove the convergence to the semicircular distribution together with its extreme eigenvalues under weaker assumptions, and show large deviation bound for the latter.

  20. A scaling law for random walks on networks

    NASA Astrophysics Data System (ADS)

    Perkins, Theodore J.; Foxall, Eric; Glass, Leon; Edwards, Roderick

    2014-10-01

    The dynamics of many natural and artificial systems are well described as random walks on a network: the stochastic behaviour of molecules, traffic patterns on the internet, fluctuations in stock prices and so on. The vast literature on random walks provides many tools for computing properties such as steady-state probabilities or expected hitting times. Previously, however, there has been no general theory describing the distribution of possible paths followed by a random walk. Here, we show that for any random walk on a finite network, there are precisely three mutually exclusive possibilities for the form of the path distribution: finite, stretched exponential and power law. The form of the distribution depends only on the structure of the network, while the stepping probabilities control the parameters of the distribution. We use our theory to explain path distributions in domains such as sports, music, nonlinear dynamics and stochastic chemical kinetics.

  1. A scaling law for random walks on networks.

    PubMed

    Perkins, Theodore J; Foxall, Eric; Glass, Leon; Edwards, Roderick

    2014-10-14

    The dynamics of many natural and artificial systems are well described as random walks on a network: the stochastic behaviour of molecules, traffic patterns on the internet, fluctuations in stock prices and so on. The vast literature on random walks provides many tools for computing properties such as steady-state probabilities or expected hitting times. Previously, however, there has been no general theory describing the distribution of possible paths followed by a random walk. Here, we show that for any random walk on a finite network, there are precisely three mutually exclusive possibilities for the form of the path distribution: finite, stretched exponential and power law. The form of the distribution depends only on the structure of the network, while the stepping probabilities control the parameters of the distribution. We use our theory to explain path distributions in domains such as sports, music, nonlinear dynamics and stochastic chemical kinetics.
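
    The object being classified here, the distribution over entire walk paths, can be computed directly on a tiny example. The sketch below (not the authors' code) enumerates the paths a uniform random walk can take from a start node to an absorbing node on a four-node graph, together with their probabilities.

```python
# Uniform random walk on a small graph; node 3 is absorbing, so a "path"
# is the complete route taken from the start until absorption.
neighbors = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: []}

def path_probs(start=0, max_len=12):
    """Enumerate every walk path from `start` that gets absorbed within
    max_len steps, together with its probability."""
    probs = {}
    stack = [((start,), 1.0)]
    while stack:
        path, p = stack.pop()
        node = path[-1]
        if not neighbors[node]:                 # absorbed: record the path
            probs[path] = p
        elif len(path) < max_len:
            step = 1.0 / len(neighbors[node])
            for nxt in neighbors[node]:
                stack.append((path + (nxt,), p * step))
    return probs

probs = path_probs()
for path, p in sorted(probs.items(), key=lambda kv: -kv[1])[:4]:
    print(path, f"{p:.4f}")
print("probability mass within 12 steps:", f"{sum(probs.values()):.4f}")
```

    The theory cited above classifies the tail of exactly this kind of path distribution as finite, stretched exponential, or power law, depending only on the network structure.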

  2. Turbulence hierarchy in a random fibre laser

    PubMed Central

    González, Iván R. Roa; Lima, Bismarck C.; Pincheira, Pablo I. R.; Brum, Arthur A.; Macêdo, Antônio M. S.; Vasconcelos, Giovani L.; de S. Menezes, Leonardo; Raposo, Ernesto P.; Gomes, Anderson S. L.; Kashyap, Raman

    2017-01-01

    Turbulence is a challenging feature common to a wide range of complex phenomena. Random fibre lasers are a special class of lasers in which the feedback arises from multiple scattering in a one-dimensional disordered cavity-less medium. Here we report on statistical signatures of turbulence in the distribution of intensity fluctuations in a continuous-wave-pumped erbium-based random fibre laser, with random Bragg grating scatterers. The distribution of intensity fluctuations in an extensive data set exhibits three qualitatively distinct behaviours: a Gaussian regime below threshold, a mixture of two distributions with exponentially decaying tails near the threshold and a mixture of distributions with stretched-exponential tails above threshold. All distributions are well described by a hierarchical stochastic model that incorporates Kolmogorov’s theory of turbulence, which includes energy cascade and the intermittence phenomenon. Our findings have implications for explaining the remarkably challenging turbulent behaviour in photonics, using a random fibre laser as the experimental platform. PMID:28561064

  3. OxMaR: open source free software for online minimization and randomization for clinical trials.

    PubMed

    O'Callaghan, Christopher A

    2014-01-01

    Minimization is a valuable method for allocating participants between the control and experimental arms of clinical studies. The use of minimization reduces differences that might arise by chance between the study arms in the distribution of patient characteristics such as gender, ethnicity and age. However, unlike randomization, minimization requires real time assessment of each new participant with respect to the preceding distribution of relevant participant characteristics within the different arms of the study. For multi-site studies, this necessitates centralized computational analysis that is shared between all study locations. Unfortunately, there is no suitable freely available open source or free software that can be used for this purpose. OxMaR was developed to enable researchers in any location to use minimization for patient allocation and to access the minimization algorithm using any device that can connect to the internet such as a desktop computer, tablet or mobile phone. The software is complete in itself and requires no special packages or libraries to be installed. It is simple to set up and run over the internet using online facilities which are very low cost or even free to the user. Importantly, it provides real time information on allocation to the study lead or administrator and generates real time distributed backups with each allocation. OxMaR can readily be modified and customised and can also be used for standard randomization. It has been extensively tested and has been used successfully in a low budget multi-centre study. Hitherto, the logistical difficulties involved in minimization have precluded its use in many small studies and this software should allow more widespread use of minimization which should lead to studies with better matched control and experimental arms. OxMaR should be particularly valuable in low resource settings.
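
    Minimization itself is a small algorithm. Below is a generic Pocock-Simon-style sketch, not OxMaR's actual implementation: each new participant goes to the arm that minimizes the resulting imbalance, summed over the participant's factor levels, with ties broken at random.

```python
import random

def minimize_allocation(participant, arm_counts, factors, rng=random):
    """Pocock-Simon-style minimization: place `participant` (a dict of
    factor -> level) in the arm that minimizes the imbalance, summed over
    factors, of the counts for the participant's levels."""
    scores = {}
    for arm in arm_counts:
        score = 0
        for factor in factors:
            level = participant[factor]
            # Level counts in every arm if this participant joined `arm`.
            counts = [arm_counts[a][factor].get(level, 0) + (a == arm)
                      for a in arm_counts]
            score += max(counts) - min(counts)   # range as the imbalance
        scores[arm] = score
    best = min(scores.values())
    choice = rng.choice([a for a, s in scores.items() if s == best])
    for factor in factors:                       # update running totals
        level = participant[factor]
        counts = arm_counts[choice][factor]
        counts[level] = counts.get(level, 0) + 1
    return choice

arms = {"control": {"sex": {}, "age": {}}, "treatment": {"sex": {}, "age": {}}}
for p in [{"sex": "F", "age": "<50"}, {"sex": "F", "age": ">=50"},
          {"sex": "M", "age": "<50"}]:
    print(minimize_allocation(p, arms, ["sex", "age"]))
```

    In a real deployment the running totals would live on the central server the abstract describes, so every site sees the same counts when a new participant is allocated.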

  4. Distribution of a Glycosylphosphatidylinositol-anchored Protein at the Apical Surface of MDCK Cells Examined at a Resolution of <100 Å Using Imaging Fluorescence Resonance Energy Transfer

    PubMed Central

    Kenworthy, A.K.; Edidin, M.

    1998-01-01

    Membrane microdomains (“lipid rafts”) enriched in glycosylphosphatidylinositol (GPI)-anchored proteins, glycosphingolipids, and cholesterol have been implicated in events ranging from membrane trafficking to signal transduction. Although there is biochemical evidence for such membrane microdomains, they have not been visualized by light or electron microscopy. To probe for microdomains enriched in GPI- anchored proteins in intact cell membranes, we used a novel form of digital microscopy, imaging fluorescence resonance energy transfer (FRET), which extends the resolution of fluorescence microscopy to the molecular level (<100 Å). We detected significant energy transfer between donor- and acceptor-labeled antibodies against the GPI-anchored protein 5′ nucleotidase (5′ NT) at the apical membrane of MDCK cells. The efficiency of energy transfer correlated strongly with the surface density of the acceptor-labeled antibody. The FRET data conformed to theoretical predictions for two-dimensional FRET between randomly distributed molecules and were inconsistent with a model in which 5′ NT is constitutively clustered. Though we cannot completely exclude the possibility that some 5′ NT is in clusters, the data imply that most 5′ NT molecules are randomly distributed across the apical surface of MDCK cells. These findings constrain current models for lipid rafts and the membrane organization of GPI-anchored proteins. PMID:9660864

  5. Modeling a space-based quantum link that includes an adaptive optics system

    NASA Astrophysics Data System (ADS)

    Duchane, Alexander W.; Hodson, Douglas D.; Mailloux, Logan O.

    2017-10-01

    Quantum Key Distribution uses optical pulses to generate shared random bit strings between two locations. If a high percentage of the optical pulses are comprised of single photons, then the statistical nature of light and information theory can be used to generate secure shared random bit strings which can then be converted to keys for encryption systems. When these keys are incorporated along with symmetric encryption techniques such as a one-time pad, then this method of key generation and encryption is resistant to future advances in quantum computing which will significantly degrade the effectiveness of current asymmetric key sharing techniques. This research first reviews the transition of Quantum Key Distribution free-space experiments from the laboratory environment to field experiments, and finally, ongoing space experiments. Next, a propagation model for an optical pulse from low-earth orbit to ground and the effects of turbulence on the transmitted optical pulse is described. An Adaptive Optics system is modeled to correct for the aberrations caused by the atmosphere. The long-term point spread function of the completed low-earth orbit to ground optical system is explored in the results section. Finally, the impact of this optical system and its point spread function on an overall quantum key distribution system as well as the future work necessary to show this impact is described.

  6. Classifying with confidence from incomplete information.

    DOE PAGES

    Parrish, Nathan; Anderson, Hyrum S.; Gupta, Maya R.; ...

    2013-12-01

    For this paper, we consider the problem of classifying a test sample given incomplete information. This problem arises naturally when data about a test sample are collected over time, or when costs must be incurred to compute the classification features. For example, in a distributed sensor network only a fraction of the sensors may have reported measurements at a certain time, and additional time, power, and bandwidth are needed to collect the complete data to classify. A practical goal is to assign a class label as soon as enough data is available to make a good decision. We formalize this goal through the notion of reliability—the probability that a label assigned given incomplete data would be the same as the label assigned given the complete data, and we propose a method to classify incomplete data only if some reliability threshold is met. Our approach models the complete data as a random variable whose distribution is dependent on the current incomplete data and the (complete) training data. The method differs from standard imputation strategies in that our focus is on determining the reliability of the classification decision, rather than just the class label. We show that the method provides useful reliability estimates of the correctness of the imputed class labels on a set of experiments on time-series data sets, where the goal is to classify the time-series as early as possible while still guaranteeing that the reliability threshold is met.

  7. Explicit equilibria in a kinetic model of gambling

    NASA Astrophysics Data System (ADS)

    Bassetti, F.; Toscani, G.

    2010-06-01

    We introduce and discuss a nonlinear kinetic equation of Boltzmann type which describes the evolution of wealth in a pure gambling process, where the entire sum of the wealths of two agents is put up for gambling and randomly shared between the agents. For this equation the analytical form of the steady states is found for various realizations of the random fraction of the sum which is shared between the agents. Among others, the exponential distribution appears as the steady state in the case of a uniformly distributed random fraction, while a Gamma distribution appears for a random fraction which is Beta distributed. The case in which the gambling game is only conservative-in-the-mean is shown to lead to an explicit heavy-tailed distribution.
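
    The pure gambling process is easy to simulate. The sketch below (illustrative parameters) repeatedly pools the wealth of two randomly chosen agents and hands a uniform random fraction of the pool to one of them, then checks the empirical steady state against the exponential prediction for this case.

```python
import random

def gamble(n_agents=5_000, n_rounds=1_000_000, seed=11):
    """Pure gambling: two random agents pool their wealth and a uniform
    random fraction of the pool goes to the first of them."""
    rng = random.Random(seed)
    wealth = [1.0] * n_agents       # equal initial wealth, mean 1
    for _ in range(n_rounds):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        total = wealth[i] + wealth[j]
        share = rng.random()        # uniformly distributed random fraction
        wealth[i], wealth[j] = share * total, (1.0 - share) * total
    return wealth

w = gamble()
# For an exponential steady state with mean 1, P(W > 1) = exp(-1) ~ 0.368.
frac = sum(1 for x in w if x > 1.0) / len(w)
print(f"fraction with wealth > 1: {frac:.3f} (exponential predicts 0.368)")
```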

  8. Narrow-band generation in random distributed feedback fiber laser.

    PubMed

    Sugavanam, Srikanth; Tarasov, Nikita; Shu, Xuewen; Churkin, Dmitry V

    2013-07-15

    Narrow-band emission with a spectral width down to ~0.05 nm is achieved in a random distributed feedback fiber laser employing narrow-band fiber Bragg grating or fiber Fabry-Perot interferometer filters. The observed line-width is ~10 times narrower than that of other random distributed feedback fiber lasers demonstrated to date. The random DFB laser with the Fabry-Perot interferometer filter provides simultaneously multi-wavelength and narrow-band (within each line) generation, with the possibility of further wavelength tuning.

  9. Cognitive load in distributed and massed practice in virtual reality mastoidectomy simulation.

    PubMed

    Andersen, Steven Arild Wuyts; Mikkelsen, Peter Trier; Konge, Lars; Cayé-Thomasen, Per; Sørensen, Mads Sølvsten

    2016-02-01

    Cognitive load theory states that working memory is limited. This has implications for learning and suggests that reducing cognitive load (CL) could promote learning and skills acquisition. This study aims to explore the effect of repeated practice and simulator-integrated tutoring on CL in virtual reality (VR) mastoidectomy simulation. Prospective trial. Forty novice medical students performed 12 repeated virtual mastoidectomy procedures in the Visible Ear Simulator: 21 completed distributed practice with practice blocks spaced in time and 19 participants completed massed practice (all practices performed in 1 day). Participants were randomized for tutoring with the simulator-integrated tutor function. Cognitive load was estimated by measuring reaction time in a secondary task. Data were analyzed using linear mixed models for repeated measurements. The mean reaction time increased by 37% during the procedure compared with baseline, demonstrating that the procedure placed substantial cognitive demands. Repeated practice significantly lowered CL in the distributed practice group but not in massed practice group. In addition, CL was found to be further increased by 10.3% in the later and more complex stages of the procedure. The simulator-integrated tutor function did not have an impact on CL. Distributed practice decreased CL in repeated VR mastoidectomy training more consistently than was seen in massed practice. This suggests a possible effect of skills and memory consolidation occurring over time. To optimize technical skills learning, training should be organized as time-distributed practice rather than as a massed block of practice, which is common in skills-training courses. N/A. © 2015 The American Laryngological, Rhinological and Otological Society, Inc.

  10. Recruiting participants with peripheral arterial disease for clinical trials: experience from the Study to Improve Leg Circulation (SILC).

    PubMed

    McDermott, Mary M; Domanchuk, Kathryn; Dyer, Alan; Ades, Philip; Kibbe, Melina; Criqui, Michael H

    2009-03-01

    To describe the success of diverse recruitment methods in a randomized controlled clinical trial of exercise in persons with peripheral arterial disease (PAD). An analysis of recruitment sources was conducted for the 746 men and women completing a baseline visit for the Study to Improve Leg Circulation (SILC), a randomized controlled trial of exercise for patients with PAD. For each recruitment source, we determined the number of randomized participants, the rate of randomization among those completing a baseline visit, and the cost per randomized participant. Of the 746 individuals who completed a baseline visit, 156 were eligible and randomized. The most frequent sources of randomized participants were newspaper advertising (n = 67), mailed recruitment letters to patients with PAD identified at the study medical center (n = 25), and radio advertising (n = 18). Costs per randomized participant were $2750 for television advertising, $2167 for Life Line Screening, $2369 for newspaper advertising, $3931 for mailed postcards to older community dwelling men and women, and $5691 for radio advertising. Among those completing a baseline visit, randomization rates ranged from 10% for those identified from radio advertising to 32% for those identified from the Chicago Veterans Administration and 33% for those identified from posted flyers. Most participants in a randomized controlled trial of exercise were recruited from newspaper advertising and mailed recruitment letters to patients with known PAD. The highest randomization rates after a baseline visit occurred among participants identified from posted flyers and mailed recruitment letters to PAD patients.

  11. Contact Time in Random Walk and Random Waypoint: Dichotomy in Tail Distribution

    NASA Astrophysics Data System (ADS)

    Zhao, Chen; Sichitiu, Mihail L.

    Contact time (or link duration) is a fundamental factor that affects performance in Mobile Ad Hoc Networks. Previous theoretical analyses of the contact time distribution for random walk (RW) models assume that contact events can be modeled as either consecutive random walks or direct traversals, two extreme cases of random walk that lead to two different conclusions. In this paper we conduct a comprehensive study of this topic in the hope of bridging the gap between the two extremes. The conclusions from the two extreme cases yield a power-law or exponential tail in the contact time distribution, respectively. However, we show that the actual distribution varies between the two extremes: a power-law-sub-exponential dichotomy, whose transition point depends on the average flight duration. Through simulation results we show that this conclusion also applies to random waypoint.

  12. Dynamics of Quantum Adiabatic Evolution Algorithm for Number Partitioning

    NASA Technical Reports Server (NTRS)

    Smelyanskiy, V. N.; Toussaint, U. V.; Timucin, D. A.

    2002-01-01

    We have developed a general technique to study the dynamics of the quantum adiabatic evolution algorithm applied to random combinatorial optimization problems in the asymptotic limit of large problem size n. We use as an example the NP-complete Number Partitioning problem and map the algorithm dynamics to that of an auxiliary quantum spin glass system with a slowly varying Hamiltonian. We use a Green function method to obtain the adiabatic eigenstates and the minimum excitation gap, g_min = O(n 2^(-n/2)), corresponding to the exponential complexity of the algorithm for Number Partitioning. The key element of the analysis is the conditional energy distribution computed for the set of all spin configurations generated from a given (ancestor) configuration by simultaneous flipping of a fixed number of spins. For the problem in question this distribution is shown to depend on the ancestor spin configuration only via a certain parameter related to the energy of the configuration. As the result, the algorithm dynamics can be described in terms of one-dimensional quantum diffusion in the energy space. This effect provides a general limitation of quantum adiabatic computation in random optimization problems. Analytical results are in agreement with the numerical simulation of the algorithm.

  13. Detection and inpainting of facial wrinkles using texture orientation fields and Markov random field modeling.

    PubMed

    Batool, Nazre; Chellappa, Rama

    2014-09-01

    Facial retouching is widely used in the media and entertainment industry. Professional software usually requires a minimum level of user expertise to achieve desirable results. In this paper, we present an algorithm to detect facial wrinkles/imperfections. We believe that any such algorithm would be amenable to facial retouching applications. The detection of wrinkles/imperfections allows these skin features to be processed differently than the surrounding skin without much user interaction. For detection, Gabor filter responses along with a texture orientation field are used as image features. A bimodal Gaussian mixture model (GMM) represents the distributions of Gabor features of normal skin versus skin imperfections. A Markov random field model is then used to incorporate the spatial relationships among neighboring pixels for their GMM distributions and texture orientations. An expectation-maximization algorithm then classifies skin versus skin wrinkles/imperfections. Once detected automatically, wrinkles/imperfections are removed completely instead of being blended or blurred. We propose an exemplar-based constrained texture synthesis algorithm to inpaint the irregularly shaped gaps left by the removal of detected wrinkles/imperfections. We present results on images downloaded from the Internet to show the efficacy of our algorithms.
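
    The bimodal-GMM step can be illustrated in isolation. The toy sketch below uses 1-D synthetic features in place of Gabor responses and fits a two-component Gaussian mixture by EM; it shows only the classification idea, not the paper's MRF-coupled pipeline.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Toy 1-D stand-in for a filter response: normal skin vs. imperfections.
skin = rng.normal(0.2, 0.05, size=900)
wrinkle = rng.normal(0.6, 0.10, size=100)
features = np.concatenate([skin, wrinkle]).reshape(-1, 1)

# Fit a bimodal GMM by expectation-maximization, then hard-assign samples.
gmm = GaussianMixture(n_components=2, random_state=0).fit(features)
labels = gmm.predict(features)
print("component means:", gmm.means_.ravel().round(2))
print("samples in the wrinkle-like component:", (labels == labels[-1]).sum())
```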

  14. Dynamics of Quantum Adiabatic Evolution Algorithm for Number Partitioning

    NASA Technical Reports Server (NTRS)

    Smelyanskiy, Vadius; vonToussaint, Udo V.; Timucin, Dogan A.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We have developed a general technique to study the dynamics of the quantum adiabatic evolution algorithm applied to random combinatorial optimization problems in the asymptotic limit of large problem size n. We use as an example the NP-complete Number Partitioning problem and map the algorithm dynamics to that of an auxiliary quantum spin glass system with a slowly varying Hamiltonian. We use a Green function method to obtain the adiabatic eigenstates and the minimum excitation gap, g_min = O(n 2^(-n/2)), corresponding to the exponential complexity of the algorithm for Number Partitioning. The key element of the analysis is the conditional energy distribution computed for the set of all spin configurations generated from a given (ancestor) configuration by simultaneous flipping of a fixed number of spins. For the problem in question this distribution is shown to depend on the ancestor spin configuration only via a certain parameter related to the energy of the configuration. As the result, the algorithm dynamics can be described in terms of one-dimensional quantum diffusion in the energy space. This effect provides a general limitation of quantum adiabatic computation in random optimization problems. Analytical results are in agreement with the numerical simulation of the algorithm.

  15. Mobile application as a prenatal education and engagement tool: A randomized controlled pilot.

    PubMed

    Ledford, Christy J W; Canzona, Mollie Rose; Cafferty, Lauren A; Hodge, Joshua A

    2016-04-01

    Research has shown that mobile applications provide a powerful alternative to traditional paper diaries; however, few data exist comparing apps to paper as a patient education and engagement tool in the clinical setting. This study was designed to compare the effectiveness of a mobile app versus a spiral-notebook guide throughout prenatal care. This randomized (n=173) controlled pilot was conducted at an East Coast community hospital. Chi-square and repeated-measures analysis of variance were used to test intervention effects in the sample of 127 pregnant mothers who completed their prenatal care in the healthcare system. Patients who received the mobile application used the tool to record information about pregnancy more frequently (p=.04) and developed greater patient activation (p=.02) than patients who received notebooks. No difference was detected in interpersonal clinical communication. A mobile application successfully activated a patient population in which self-management is a critical factor. This study shows that mobile apps can prompt greater use and result in more activated patients. Findings may be translated to other patient populations who receive recurring care for chronic disease. Published by Elsevier Ireland Ltd.

  16. IS THE SUICIDE RATE A RANDOM WALK?

    PubMed

    Yang, Bijou; Lester, David; Lyke, Jennifer; Olsen, Robert

    2015-06-01

    The yearly suicide rates for the period 1933-2010 and the daily suicide numbers for 1990 and 1991 were examined for whether the distribution of difference scores (from year to year and from day to day) fitted a normal distribution, a characteristic of stochastic processes that follow a random walk. If the suicide rate were a random walk, then any disturbance to the suicide rate would have a permanent effect and national suicide prevention efforts would likely fail. The distribution of difference scores from day to day (but not the difference scores from year to year) fitted a normal distribution and, therefore, were consistent with a random walk.
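
    The underlying test logic is that a random walk has independent increments, so its difference scores should be approximately normally distributed. A minimal version of that check on synthetic daily counts (not the paper's data) might look like this.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2024)
# Synthetic daily counts: a stationary series, not a random walk.
series = rng.poisson(85, size=365).astype(float)

diffs = np.diff(series)            # day-to-day difference scores
stat, p = stats.shapiro(diffs)     # normality test of the differences
verdict = "consistent with" if p > 0.05 else "departs from"
print(f"Shapiro-Wilk p = {p:.3f}: {verdict} normally distributed differences")
```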

  17. Assessing historical rate changes in global tsunami occurrence

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2011-01-01

    The global catalogue of tsunami events is examined to determine if transient variations in tsunami rates are consistent with a Poisson process commonly assumed for tsunami hazard assessments. The primary data analyzed are tsunamis with maximum sizes >1m. The record of these tsunamis appears to be complete since approximately 1890. A secondary data set of tsunamis >0.1m is also analyzed that appears to be complete since approximately 1960. Various kernel density estimates used to determine the rate distribution with time indicate a prominent rate change in global tsunamis during the mid-1990s. Less prominent rate changes occur in the early- and mid-20th century. To determine whether these rate fluctuations are anomalous, the distribution of annual event numbers for the tsunami catalogue is compared to Poisson and negative binomial distributions, the latter of which includes the effects of temporal clustering. Compared to a Poisson distribution, the negative binomial distribution model provides a consistent fit to tsunami event numbers for the >1m data set, but the Poisson null hypothesis cannot be falsified for the shorter duration >0.1m data set. Temporal clustering of tsunami sources is also indicated by the distribution of interevent times for both data sets. Tsunami event clusters consist only of two to four events, in contrast to protracted sequences of earthquakes that make up foreshock-main shock-aftershock sequences. From past studies of seismicity, it is likely that there is a physical triggering mechanism responsible for events within the tsunami source 'mini-clusters'. In conclusion, prominent transient rate increases in the occurrence of global tsunamis appear to be caused by temporal grouping of geographically distinct mini-clusters, in addition to the random preferential location of global M >7 earthquakes along offshore fault zones.
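
    The Poisson-versus-negative-binomial question comes down to the dispersion of annual event counts. The sketch below, on synthetic counts rather than the tsunami catalogue, computes the variance-to-mean ratio that signals the temporal clustering a negative binomial model accommodates.

```python
import numpy as np

rng = np.random.default_rng(9)
# Synthetic annual event counts with mild clustering: a gamma-mixed
# Poisson (i.e., negative binomial) count for each year of record.
years = 120
counts = rng.poisson(rng.gamma(shape=4.0, scale=1.0, size=years))

mean, var = counts.mean(), counts.var(ddof=1)
print(f"mean {mean:.2f}, variance {var:.2f}")
# A Poisson process predicts variance ~= mean; a dispersion index above 1
# suggests temporal clustering and favors the negative binomial.
print(f"dispersion index {var / mean:.2f}")
```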

  18. Estimation of the Nonlinear Random Coefficient Model when Some Random Effects Are Separable

    ERIC Educational Resources Information Center

    du Toit, Stephen H. C.; Cudeck, Robert

    2009-01-01

    A method is presented for marginal maximum likelihood estimation of the nonlinear random coefficient model when the response function has some linear parameters. This is done by writing the marginal distribution of the repeated measures as a conditional distribution of the response given the nonlinear random effects. The resulting distribution…

  19. Use of the Wii Gaming System for Balance Rehabilitation: Establishing Parameters for Healthy Individuals.

    PubMed

    Burns, Melissa K; Andeway, Kathleen; Eppenstein, Paula; Ruroede, Kathleen

    2014-06-01

    This study was designed to establish balance parameters for the Nintendo® (Redmond, WA) "Wii Fit™" Balance Board system with three common games in a sample of healthy adults, and to evaluate the reproducibility of the balance measurements across age groups. This was a prospective cohort study analyzed with multivariate analysis of variance. Seventy-five participants who satisfied all inclusion criteria and completed informed consent were enrolled. Participants were grouped into age ranges: 21-35 years (n=24), 36-50 years (n=24), and 51-65 years (n=27). Each participant completed the following games three consecutive times, in a randomized order, during one session: "Balance Bubble" (BB) for distance and duration, "Tight Rope" (TR) for distance and duration, and "Center of Balance" (COB) on the left and right sides. COB distributed weight was fairly symmetrical across all subjects and trials; therefore, no influence on or interaction with other "Wii Fit" measurements was assumed. Homogeneity of variance statistics indicated that the assumption of normality of the dependent variables (rates) was tenable. The multivariate analysis of variance included the dependent variables BB and TR rates (distance divided by duration to complete) with age group and trials as the independent variables. The BB rate was statistically significant (F=4.725, P<0.005), but the TR rate was not. The youngest group's BB rate was significantly larger than those of the other two groups. "Wii Fit" can discriminate among age groups across trials. The results show promise for a viable tool to measure balance and distance across time (speed) and center-of-balance distribution.

  20. Effect of texture randomization on the slip and interfacial robustness in turbulent flows over superhydrophobic surfaces

    NASA Astrophysics Data System (ADS)

    Seo, Jongmin; Mani, Ali

    2018-04-01

    Superhydrophobic surfaces demonstrate promising potential for skin-friction reduction in naval and hydrodynamic applications. Recent developments of superhydrophobic surfaces aiming for scalable applications use random distributions of roughness, produced for example by spray coating and etching processes. However, most previous analyses of the interaction between flows and superhydrophobic surfaces studied periodic geometries that are economically feasible only in laboratory-scale experiments. In order to assess the drag-reduction effectiveness as well as the interfacial robustness of superhydrophobic surfaces with randomly distributed textures, we conduct direct numerical simulations (DNS) of turbulent flows over randomly patterned interfaces considering a range of texture widths w+ ≈ 4-26 and solid fractions ϕs = 11%-25%. Slip and no-slip boundary conditions are implemented in a pattern, modeling the presence of gas-liquid interfaces and solid elements. Our results indicate that the slip of randomly distributed textures under turbulent flows is about 30% less than that of surfaces with aligned features of the same size. In the small-texture-size limit w+ ≈ 4, the slip length of the randomly distributed textures in turbulent flows is well described by a previously introduced Stokes-flow solution for randomly distributed shear-free holes. By comparing DNS results for patterned slip and no-slip boundaries against the corresponding homogenized slip-length boundary conditions, we show that turbulent flows over randomly distributed posts can be represented by an isotropic slip length in the streamwise and spanwise directions. The average pressure fluctuation on a gas pocket is similar to that of aligned features with the same texture size and gas fraction, but the maximum interface deformation at the leading edge of the roughness element is about twice as large when the textures are randomly distributed. The presented analyses provide insights into the implications of texture randomness for the drag-reduction performance and robustness of superhydrophobic surfaces.

  1. Diagnostics of Robust Growth Curve Modeling Using Student's "t" Distribution

    ERIC Educational Resources Information Center

    Tong, Xin; Zhang, Zhiyong

    2012-01-01

    Growth curve models with different types of distributions of random effects and of intraindividual measurement errors for robust analysis are compared. After demonstrating the influence of distribution specification on parameter estimation, 3 methods for diagnosing the distributions for both random effects and intraindividual measurement errors…

  2. Slowdowns in diversification rates from real phylogenies may not be real.

    PubMed

    Cusimano, Natalie; Renner, Susanne S

    2010-07-01

    Studies of diversification patterns often find a slowing in lineage accumulation toward the present. This seemingly pervasive pattern of rate downturns has been taken as evidence for adaptive radiations, density-dependent regulation, and metacommunity species interactions. The significance of rate downturns is evaluated with statistical tests (the gamma statistic and Monte Carlo constant rates (MCCR) test; birth-death likelihood models and Akaike Information Criterion [AIC] scores) that rely on null distributions, which assume that the included species are a random sample of the entire clade. Sampling in real phylogenies, however, often is nonrandom because systematists try to include early-diverging species or representatives of previous intrataxon classifications. We studied the effects of biased sampling, structured sampling, and random sampling by experimentally pruning simulated trees (60 and 150 species) as well as a completely sampled empirical tree (58 species) and then applying the gamma statistic/MCCR test and birth-death likelihood models/AIC scores to assess rate changes. For trees with random species sampling, the true model (i.e., the one fitting the complete phylogenies) could be inferred in most cases. Oversampling deep nodes, however, strongly biases inferences toward downturns, with simulations of structured and biased sampling suggesting that this occurs when sampling percentages drop below 80%. The magnitude of the effect and the sensitivity of diversification rate models is such that a useful rule of thumb may be not to infer rate downturns from real trees unless they have >80% species sampling.
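
    The gamma statistic referenced above can be computed directly from a tree's internode intervals. A minimal sketch, assuming the standard Pybus-Harvey definition (g_k is the interval during which exactly k lineages exist); under a constant-rate pure-birth null the statistic is approximately standard normal, and strongly negative values indicate a slowdown toward the present:

        import numpy as np

        def gamma_statistic(g):
            """Pybus-Harvey gamma from internode intervals: g[k-2] is the
            time during which exactly k lineages exist, k = 2 .. n."""
            g = np.asarray(g, dtype=float)
            n = len(g) + 1                       # number of tips
            k = np.arange(2, n + 1)
            T = np.sum(k * g)
            partial = np.cumsum(k * g)[:-1]      # sums for i = 2 .. n - 1
            num = partial.mean() - T / 2.0
            den = T * np.sqrt(1.0 / (12.0 * (n - 2)))
            return num / den

        # Approximately N(0, 1) under a constant-rate pure-birth null;
        # strongly negative values suggest a slowdown toward the present.
        print(gamma_statistic([0.5, 0.3, 0.4, 0.2, 0.6, 0.1]))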

  3. Network Analysis Tools: from biological networks to clusters and pathways.

    PubMed

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Vanderstocken, Gilles; van Helden, Jacques

    2008-01-01

    Network Analysis Tools (NeAT) is a suite of computer tools that integrate various algorithms for the analysis of biological networks: comparison between graphs, between clusters, or between graphs and clusters; network randomization; analysis of degree distribution; network-based clustering and path finding. The tools are interconnected to enable a stepwise analysis of the network through a complete analytical workflow. In this protocol, we present a typical case of utilization, where the tasks above are combined to decipher a protein-protein interaction network retrieved from the STRING database. The results returned by NeAT are typically subnetworks, networks enriched with additional information (i.e., clusters or paths) or tables displaying statistics. Typical networks comprising several thousands of nodes and arcs can be analyzed within a few minutes. The complete protocol can be read and executed in approximately 1 h.

  4. Design for cyclic loading endurance of composites

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Murthy, Pappu L. N.; Chamis, Christos C.; Liaw, Leslie D. G.

    1993-01-01

    The application of the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures) to aircraft wing-type structures is described. The code performs a complete probabilistic analysis for composites, taking into account the uncertainties in geometry, boundary conditions, material properties, laminate lay-ups, and loads. Results of the analysis are presented in terms of the cumulative distribution function (CDF) and probability density function (PDF) of the fatigue life of a wing-type composite structure under different hygrothermal environments and subjected to random pressure loading. The sensitivity of the fatigue life to a number of critical structural/material variables is also computed from the analysis.

  5. Stochastic scheduling on a repairable manufacturing system

    NASA Astrophysics Data System (ADS)

    Li, Wei; Cao, Jinhua

    1995-08-01

    In this paper, we consider some stochastic scheduling problems with a set of stochastic jobs on a manufacturing system with a single machine that is subject to multiple breakdowns and repairs. When the machine fails while processing a job, the job's processing must restart some time later, once the machine is repaired. For this typical manufacturing system, we find the optimal policies that minimize the following objective functions: (1) the weighted sum of the completion times; (2) the weighted number of late jobs with constant due dates; (3) the weighted number of late jobs with exponentially distributed random due dates. These results generalize some previous ones.

  6. Micromechanical analysis of composites with fibers distributed randomly over the transverse cross-section

    NASA Astrophysics Data System (ADS)

    Weng, Jingmeng; Wen, Weidong; Cui, Haitao; Chen, Bo

    2018-06-01

    A new method to generate random distributions of fibers in the transverse cross-section of fiber-reinforced composites with high fiber volume fractions is presented in this paper. Based on microscopy observations of the transverse cross-sections of unidirectional composite laminates, a hexagonal arrangement is set as the initial configuration, each fiber is assigned an arbitrary initial velocity in an arbitrary direction, and the micro-scale representative volume element (RVE) is established by simulating perfectly elastic collisions. Combined with the proposed periodic boundary conditions, which are suitable for multi-axial loading, the effective elastic properties of composite materials can be predicted. The predicted properties show reasonable agreement with experimental results. Comparing the stress fields of the RVE with randomly distributed fibers and the RVE with periodically distributed fibers shows that the predicted elastic modulus of the former is greater than that of the latter.
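
    The authors' generator perturbs a hexagonal packing with elastic collisions to reach high volume fractions. As a simpler stand-in for producing a non-overlapping random transverse section, the sketch below uses plain random sequential placement, a different technique that jams well below the fiber fractions the collision method can reach (all dimensions are hypothetical):

        import numpy as np

        rng = np.random.default_rng(0)

        def random_fibers(L, r, vf_target, max_tries=200_000):
            """Place non-overlapping fibers of radius r in an L x L cell
            by random sequential addition (no periodic images, for
            brevity)."""
            n_target = int(vf_target * L * L / (np.pi * r * r))
            centers = []
            tries = 0
            while len(centers) < n_target and tries < max_tries:
                p = rng.uniform(r, L - r, size=2)
                if all(np.hypot(*(p - c)) >= 2 * r for c in centers):
                    centers.append(p)
                tries += 1
            return np.array(centers)

        # Jamming limits sequential addition to moderate fractions, which
        # is why collision-based rearrangement is needed for higher ones.
        pts = random_fibers(L=100.0, r=3.5, vf_target=0.45)
        print(len(pts), "fibers placed")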

  7. Sampling solution traces for the problem of sorting permutations by signed reversals

    PubMed Central

    2012-01-01

    Background Traditional algorithms to solve the problem of sorting by signed reversals output just one optimal solution while the space of all optimal solutions can be huge. A so-called trace represents a group of solutions which share the same set of reversals that must be applied to sort the original permutation following a partial ordering. By using traces, we therefore can represent the set of optimal solutions in a more compact way. Algorithms for enumerating the complete set of traces of solutions were developed. However, due to their exponential complexity, their practical use is limited to small permutations. A partial enumeration of traces is a sampling of the complete set of traces and can be an alternative for the study of distinct evolutionary scenarios of big permutations. Ideally, the sampling should be done uniformly from the space of all optimal solutions. This is however conjectured to be ♯P-complete. Results We propose and evaluate three algorithms for producing a sampling of the complete set of traces that instead can be shown in practice to preserve some of the characteristics of the space of all solutions. The first algorithm (RA) performs the construction of traces through a random selection of reversals on the list of optimal 1-sequences. The second algorithm (DFALT) consists in a slight modification of an algorithm that performs the complete enumeration of traces. Finally, the third algorithm (SWA) is based on a sliding window strategy to improve the enumeration of traces. All proposed algorithms were able to enumerate traces for permutations with up to 200 elements. Conclusions We analysed the distribution of the enumerated traces with respect to their height and average reversal length. Various works indicate that the reversal length can be an important aspect in genome rearrangements. The algorithms RA and SWA show a tendency to lose traces with high average reversal length. Such traces are however rare, and qualitatively our results show that, for testable-sized permutations, the algorithms DFALT and SWA produce distributions which approximate the reversal length distributions observed with a complete enumeration of the set of traces. PMID:22704580

  8. Characterizing ISI and sub-threshold membrane potential distributions: Ensemble of IF neurons with random squared-noise intensity.

    PubMed

    Kumar, Sanjeev; Karmeshu

    2018-04-01

    A theoretical investigation is presented that characterizes the emerging sub-threshold membrane potential and inter-spike interval (ISI) distributions of an ensemble of IF neurons that group together and fire together. The squared-noise intensity σ² of the ensemble of neurons is treated as a random variable to account for the electrophysiological variations across a population of nearly identical neurons. Employing a superstatistical framework, both the ISI distribution and the sub-threshold membrane potential distribution of the neuronal ensemble are obtained in terms of the generalized K-distribution. The resulting distributions exhibit asymptotic behavior akin to the stretched exponential family. Extensive simulations of the underlying SDE with random σ² are carried out. The results are found to be in excellent agreement with the analytical results. The analysis has been extended to cover the case corresponding to independent random fluctuations in drift in addition to random squared-noise intensity. The novelty of the proposed analytical investigation for the ensemble of IF neurons is that it yields closed-form expressions of the probability distributions in terms of the generalized K-distribution. The findings of the proposed model are validated against a record of the spiking activity of thousands of neurons. The squared-noise intensity σ² of identified neurons from the data is found to follow a gamma distribution. The proposed generalized K-distribution is found to be in excellent agreement with the empirically obtained ISI distribution of the neuronal ensemble. Copyright © 2018 Elsevier B.V. All rights reserved.
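
    The superstatistical construction can be reproduced numerically. A minimal sketch, assuming a perfect integrate-and-fire neuron with drift, whose ISI for a fixed σ² is inverse Gaussian; mixing over a gamma-distributed σ², as the data analysis above reports, yields a heavy-tailed ensemble ISI distribution (all parameter values are hypothetical):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        theta, drift = 1.0, 1.0        # threshold and mean input (arbitrary)
        a, b = 3.0, 0.1                # hypothetical gamma parameters

        # Superstatistics: each trial draws its own squared-noise intensity.
        sigma2 = rng.gamma(shape=a, scale=b, size=100_000)

        # For a drifted Brownian IF neuron the ISI is inverse Gaussian with
        # mean theta/drift and shape theta^2/sigma^2.
        lam = theta**2 / sigma2
        isi = stats.invgauss.rvs(mu=(theta / drift) / lam, scale=lam,
                                 random_state=rng)

        # The gamma mixture makes the tail heavier than any single
        # inverse-Gaussian component (cf. the generalized K-distribution).
        print(np.mean(isi), np.percentile(isi, 99))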

  9. Bayesian Sensitivity Analysis of Statistical Models with Missing Data

    PubMed Central

    ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG

    2013-01-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718

  10. Optimal partitioning of random programs across two processors

    NASA Technical Reports Server (NTRS)

    Nicol, D. M.

    1986-01-01

    The optimal partitioning of random distributed programs is discussed. It is concluded that the optimal partitioning of a homogeneous random program over a homogeneous distributed system either assigns all modules to a single processor, or distributes the modules as evenly as possible among all processors. The analysis rests heavily on the approximation which equates the expected maximum of a set of independent random variables with the set's maximum expectation. The results are strengthened by providing an approximation-free proof of this result for two processors under general conditions on the module execution time distribution. It is also shown that use of this approximation causes two of the previous central results to be false.

  11. Work distributions for random sudden quantum quenches

    NASA Astrophysics Data System (ADS)

    Łobejko, Marcin; Łuczka, Jerzy; Talkner, Peter

    2017-05-01

    The statistics of work performed on a system by a sudden random quench is investigated. Considering systems with finite dimensional Hilbert spaces we model a sudden random quench by randomly choosing elements from a Gaussian unitary ensemble (GUE) consisting of Hermitian matrices with identically, Gaussian distributed matrix elements. A probability density function (pdf) of work in terms of initial and final energy distributions is derived and evaluated for a two-level system. Explicit results are obtained for quenches with a sharply given initial Hamiltonian, while the work pdfs for quenches between Hamiltonians from two independent GUEs can only be determined in explicit form in the limits of zero and infinite temperature. The same work distribution as for a sudden random quench is obtained for an adiabatic, i.e., infinitely slow, protocol connecting the same initial and final Hamiltonians.
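
    The work statistics for a sudden quench into a GUE Hamiltonian can be sampled directly: draw a final Hamiltonian from the GUE, pick an initial level thermally, and pick a final level with the squared-overlap probability. A minimal two-level sketch, assuming a fixed initial Hamiltonian diagonal in the computational basis (temperature and spectrum are illustrative choices):

        import numpy as np

        rng = np.random.default_rng(2)

        def gue(n):
            """Draw an n x n GUE matrix (Hermitian, Gaussian entries)."""
            a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
            return (a + a.conj().T) / 2

        beta, n, samples = 1.0, 2, 20_000
        E0 = np.array([-0.5, 0.5])                 # fixed initial spectrum
        p0 = np.exp(-beta * E0)
        p0 /= p0.sum()                             # thermal occupation

        works = np.empty(samples)
        for s in range(samples):
            E1, V = np.linalg.eigh(gue(n))         # final spectrum, eigenbasis
            P = np.abs(V) ** 2                     # P[i, m] = |<m_final|i>|^2
            i = rng.choice(n, p=p0)                # thermal initial level
            m = rng.choice(n, p=P[i] / P[i].sum())
            works[s] = E1[m] - E0[i]

        print(works.mean(), works.std())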

  12. Quantifying evenly distributed states in exclusion and nonexclusion processes

    NASA Astrophysics Data System (ADS)

    Binder, Benjamin J.; Landman, Kerry A.

    2011-04-01

    Spatial-point data sets, generated from a wide range of physical systems and mathematical models, can be analyzed by counting the number of objects in equally sized bins. We find that the bin counts are related to the Pólya distribution. New measures are developed which indicate whether or not a spatial data set, generated from an exclusion process, is at its most evenly distributed state, the complete spatial randomness (CSR) state. To this end, we define an index in terms of the variance between the bin counts. Limiting values of the index are determined when objects have access to the entire domain and when there are subregions of the domain that are inaccessible to objects. Using three case studies (Lagrangian fluid particles in chaotic laminar flows, cellular automata agents in discrete models, and biological cells within colonies), we calculate the indexes and verify that our theoretical CSR limit accurately predicts the state of the system. These measures should prove useful in many biological applications.
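
    The idea of indexing evenness by the variance between bin counts is easy to demonstrate. The sketch below uses the classic variance-to-mean dispersion index of equal-bin counts, a related standard measure rather than the exact index defined in the paper: it is near 1 for complete spatial randomness, below 1 for more evenly distributed states, and above 1 for clustering.

        import numpy as np

        rng = np.random.default_rng(3)

        def dispersion_index(points, bins=10):
            """Variance-to-mean ratio of equal-bin counts for points in
            the unit square: ~1 for CSR, <1 for evenly spread states,
            >1 for clustered ones."""
            H, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                     bins=bins, range=[[0, 1], [0, 1]])
            c = H.ravel()
            return c.var() / c.mean()

        csr = rng.uniform(0, 1, size=(500, 2))     # completely random points
        grid = (np.indices((25, 20)).reshape(2, -1).T + 0.5) / [25, 20]
        print(dispersion_index(csr), dispersion_index(grid))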

  13. Quantum Key Distribution

    NASA Astrophysics Data System (ADS)

    Hughes, Richard

    2004-05-01

    Quantum key distribution (QKD) uses single-photon communications to generate the shared, secret random number sequences that are used to encrypt and decrypt secret communications. The unconditional security of QKD is based on the interplay between fundamental principles of quantum physics and information theory. An adversary can neither successfully tap the transmissions nor evade detection (eavesdropping raises the key error rate above a threshold value). QKD could be particularly attractive for free-space optical communications, both ground-based and for satellites. I will describe a QKD experiment performed over multi-kilometer line-of-sight paths, which serves as a model for a satellite-to-ground key distribution system. The system uses single-photon polarization states, without active polarization switching, and for the first time implements the complete BB84 QKD protocol including reconciliation, privacy amplification, and the all-important authentication stage. It is capable of continuous operation throughout the day and night, achieving the self-sustaining production of error-free, shared, secret bits. I will also report on the results of satellite-to-ground QKD modeling.

  14. Using the Quantile Mapping to improve a weather generator

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Themessl, M.; Gobiet, A.

    2012-04-01

    We developed a weather generator (WG) using statistical and stochastic methods, among them quantile mapping (QM), Monte Carlo simulation, auto-regression, and empirical orthogonal functions (EOF). One of the important steps in the WG is QM, through which all variables, regardless of their original distributions, are transformed into normally distributed variables. The WG can therefore operate on normally distributed variables, which greatly facilitates the treatment of random numbers in the WG. Monte Carlo simulation and auto-regression are used to generate the realizations; EOFs are employed to preserve spatial relationships and the relationships between different meteorological variables. We have established a complete model named WGQM (weather generator and quantile mapping), which can be applied flexibly to generate daily or hourly time series. For example, with 30 years of daily (hourly) data and 100 years of monthly (daily) data as input, 100 years of daily (hourly) data can be produced reasonably well. Evaluation experiments with WGQM have been carried out in the area of Austria, and the evaluation results will be presented.
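
    The quantile-mapping step described above transforms an arbitrarily distributed variable to a normal one through its empirical quantiles, and back again. A minimal sketch (the variable and its distribution are illustrative, not the WGQM implementation):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)

        def to_normal(x):
            """Empirical quantile mapping of a sample onto N(0, 1)."""
            u = stats.rankdata(x) / (len(x) + 1.0)  # plotting positions
            return stats.norm.ppf(u)

        def from_normal(z, x_ref):
            """Map N(0, 1) values back through the quantiles of x_ref."""
            return np.quantile(x_ref, stats.norm.cdf(z))

        precip = rng.gamma(shape=0.8, scale=5.0, size=3000)  # skewed input
        z = to_normal(precip)                                # ~N(0, 1)
        print(stats.skew(precip), stats.skew(z))
        back = from_normal(z, precip)                        # round trip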

  15. The Skillings-Mack test (Friedman test when there are missing data).

    PubMed

    Chatfield, Mark; Mander, Adrian

    2009-04-01

    The Skillings-Mack statistic (Skillings and Mack, 1981, Technometrics 23: 171-177) is a general Friedman-type statistic that can be used in almost any block design with an arbitrary missing-data structure. The missing data can be either missing by design, for example, in an incomplete block design, or missing completely at random. The Skillings-Mack test is equivalent to the Friedman test when there are no missing data in a balanced complete block design, and it is equivalent to the test suggested in Durbin (1951, British Journal of Psychology, Statistical Section 4: 85-90) for a balanced incomplete block design. The Friedman test was implemented in Stata by Goldstein (1991, Stata Technical Bulletin 3: 26-27) and further developed in Goldstein (2005, Stata Journal 5: 285). This article introduces the skilmack command, which performs the Skillings-Mack test. The skilmack command is also useful when there are many ties or equal ranks (N.B. the Friedman statistic compared with the χ² distribution will give a conservative result), as well as for small samples; appropriate results can be obtained by simulating the distribution of the test statistic under the null hypothesis.

  16. Disruption of Calcium Homeostasis during Exercise as a Mediator of Bone Metabolism

    DTIC Science & Technology

    2014-10-01

    2013. We met the goal of having 14 women and 14 men complete EXP1. Progress on EXP1:

        Group | Enrolled | Screen failure | Withdrew before randomization | Randomized | Withdrew after randomization | Completed
        Women |       18 |              2 |                             1 |         15 |                            1 |        14
        Men   |       22 |              2 |                             3 |         17 |                            3 |        14
        Total |       40 |              4 |                             4 |         32 |                            4 |        28

    Reasons for withdrawals: Screening failures... Prepare annual progress report in Q4: this was accomplished. • Data from EXP1: As planned, 14 women and 14 men completed EXP1. The characteristics of

  17. On the generation of log-Lévy distributions and extreme randomness

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2011-10-01

    The log-normal distribution is prevalent across the sciences, as it emerges from the combination of multiplicative processes and the central limit theorem (CLT). The CLT, beyond yielding the normal distribution, also yields the class of Lévy distributions. The log-Lévy distributions are the Lévy counterparts of the log-normal distribution, they appear in the context of ultraslow diffusion processes, and they are categorized by Mandelbrot as belonging to the class of extreme randomness. In this paper, we present a natural stochastic growth model from which both the log-normal distribution and the log-Lévy distributions emerge universally—the former in the case of deterministic underlying setting, and the latter in the case of stochastic underlying setting. In particular, we establish a stochastic growth model which universally generates Mandelbrot’s extreme randomness.

  18. Randomness versus specifics for word-frequency distributions

    NASA Astrophysics Data System (ADS)

    Yan, Xiaoyong; Minnhagen, Petter

    2016-02-01

    The text-length dependence of real word-frequency distributions can be connected to the general properties of a random book. It is pointed out that this finding has strong implications when deciding between two conceptually different views on word-frequency distributions, the specific 'Zipf's view' and the non-specific 'randomness view', as is discussed. It is also noted that the text-length transformation of a random book has an exact scaling property precisely for the power-law index γ = 1, as opposed to the Zipf exponent γ = 2, and the implication of this exact scaling property is discussed. However, a real text has γ > 1, and as a consequence γ increases when a real text is shortened. The connections to the predictions from RGF (Random Group Formation) and to the infinite-length limit of a meta-book are also discussed. The difference between 'curve-fitting' and 'predicting' word-frequency distributions is stressed. It is pointed out that the question of randomness versus specifics for the distribution of outcomes in sufficiently complex systems has much wider relevance than just the word-frequency example analyzed in the present work.

  19. Randomness determines practical security of BB84 quantum key distribution.

    PubMed

    Li, Hong-Wei; Yin, Zhen-Qiang; Wang, Shuang; Qian, Yong-Jun; Chen, Wei; Guo, Guang-Can; Han, Zheng-Fu

    2015-11-10

    Unconditional security of the BB84 quantum key distribution protocol has been proved by exploiting the fundamental laws of quantum mechanics, but the practical quantum key distribution system may be hacked by exploiting imperfect state preparation and measurement. Until now, different attacking schemes have been proposed that utilize imperfect devices, but a general security analysis model against all of the practical attacking schemes has not been proposed. Here, we demonstrate that the general practical attacking schemes can be divided into the Trojan horse attack, the strong randomness attack, and the weak randomness attack. We prove the security of the BB84 protocol under these randomness attacking models, and the results can be applied to guarantee the security of the practical quantum key distribution system.

  2. The effects of noise due to random undetected tilts and paleosecular variation on regional paleomagnetic directions

    USGS Publications Warehouse

    Calderone, G.J.; Butler, R.F.

    1991-01-01

    Random tilting of a single paleomagnetic vector produces a distribution of vectors which is not rotationally symmetric about the original vector and therefore not Fisherian. Monte Carlo simulations were performed on two types of vector distributions: 1) distributions of vectors formed by perturbing a single original vector with a Fisher distribution of bedding poles (each defining a tilt correction) and 2) standard Fisher distributions. These simulations demonstrate that inclinations of vectors drawn from both distributions are biased toward shallow inclinations. The Fisher mean direction of the distribution of vectors formed by perturbing a single vector with random undetected tilts is biased toward shallow inclinations, but this bias is insignificant for angular dispersions of bedding poles less than 20°. -from Authors

  3. Block randomization versus complete randomization of human perception stimuli: is there a difference?

    NASA Astrophysics Data System (ADS)

    Moyer, Steve; Uhl, Elizabeth R.

    2015-05-01

    For more than 50 years, the U.S. Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) has been studying and modeling the human visual discrimination process as it pertains to military imaging systems. In order to develop sensor performance models, human observers are trained to expert levels in the identification of military vehicles. From 1998 until 2006, the experimental stimuli were block randomized, meaning that stimuli with similar difficulty levels (for example, in terms of distance from target, blur, or noise) were presented together in blocks of approximately 24 images, but the order of images within each block was random. Starting in 2006, complete randomization came into vogue, meaning that difficulty could change from image to image; it was thought that this would provide a more statistically robust result. In this study we investigated the impact of the two types of randomization on performance in two groups of observers matched for skill to create equivalent groups. It is hypothesized that Soldiers in the completely randomized condition will have to shift their decision criterion more frequently than Soldiers in the block randomized condition, and that this shifting will impede performance, so that Soldiers in the block randomized group perform better.

  4. Dynamical Localization for Unitary Anderson Models

    NASA Astrophysics Data System (ADS)

    Hamza, Eman; Joye, Alain; Stolz, Günter

    2009-11-01

    This paper establishes dynamical localization properties of certain families of unitary random operators on the d-dimensional lattice in various regimes. These operators are generalizations of one-dimensional physical models of quantum transport and draw their name from the analogy with the discrete Anderson model of solid state physics. They consist in a product of a deterministic unitary operator and a random unitary operator. The deterministic operator has a band structure, is absolutely continuous and plays the role of the discrete Laplacian. The random operator is diagonal with elements given by i.i.d. random phases distributed according to some absolutely continuous measure and plays the role of the random potential. In dimension one, these operators belong to the family of CMV-matrices in the theory of orthogonal polynomials on the unit circle. We implement the method of Aizenman-Molchanov to prove exponential decay of the fractional moments of the Green function for the unitary Anderson model in the following three regimes: In any dimension, throughout the spectrum at large disorder and near the band edges at arbitrary disorder and, in dimension one, throughout the spectrum at arbitrary disorder. We also prove that exponential decay of fractional moments of the Green function implies dynamical localization, which in turn implies spectral localization. These results complete the analogy with the self-adjoint case where dynamical localization is known to be true in the same three regimes.

  5. Randomized response estimates for the 12-month prevalence of cognitive-enhancing drug use in university students.

    PubMed

    Dietz, Pavel; Striegel, Heiko; Franke, Andreas G; Lieb, Klaus; Simon, Perikles; Ulrich, Rolf

    2013-01-01

    Objective: To estimate the 12-month prevalence of cognitive-enhancing drug use. Design: Paper-and-pencil questionnaire that used the randomized response technique. Setting: University in Mainz, Germany. Participants: A total of 2569 university students who completed the questionnaire. Methods: An anonymous, specialized questionnaire that used the randomized response technique was distributed to students at the beginning of classes and was collected afterward. From the responses, we calculated the prevalence of students taking drugs only to improve their cognitive performance and not to treat underlying mental disorders such as attention-deficit-hyperactivity disorder, depression, and sleep disorders. Results: The estimated 12-month prevalence of using cognitive-enhancing drugs was 20%. Prevalence varied by sex (male 23.7%, female 17.0%), field of study (highest in students studying sports-related fields, 25.4%), and semester (first semester 24.3%, beyond first semester 16.7%). Conclusions: To our knowledge, this is the first time that the randomized response technique has been used to survey students about cognitive-enhancing drug use. Using the randomized response technique, our questionnaire provided data that showed a high 12-month prevalence of cognitive-enhancing drug use in German university students. Our study suggests that other direct survey techniques have underestimated the use of these drugs. Drug prevention programs need to be established at universities to address this issue. © 2013 Pharmacotherapy Publications, Inc.
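
    The randomized response technique protects respondents by letting a chance device decide which question is answered, so an individual answer reveals nothing while the prevalence remains estimable. The abstract does not state which design was used; the sketch below assumes Warner's original design, with hypothetical response counts:

        import math

        def warner_estimate(n_yes, n, p):
            """Warner (1965) randomized-response estimator: respondents
            answer the sensitive statement with probability p and its
            negation with probability 1 - p (p != 0.5), so that
            P(yes) = p * pi + (1 - p) * (1 - pi)."""
            lam = n_yes / n
            pi_hat = (lam - (1 - p)) / (2 * p - 1)
            se = math.sqrt(lam * (1 - lam) / n) / abs(2 * p - 1)
            return pi_hat, se

        # Hypothetical counts: 2569 respondents, chance device with p = 0.7.
        pi_hat, se = warner_estimate(n_yes=950, n=2569, p=0.7)
        print(f"prevalence = {pi_hat:.3f} +/- {1.96 * se:.3f}")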

  6. Patterns of particle distribution in multiparticle systems by random walks with memory enhancement and decay

    NASA Astrophysics Data System (ADS)

    Tan, Zhi-Jie; Zou, Xian-Wu; Huang, Sheng-You; Zhang, Wei; Jin, Zhun-Zhi

    2002-07-01

    We investigate the pattern of particle distribution and its evolution with time in multiparticle systems using the model of random walks with memory enhancement and decay. This model describes some biological intelligent walks. With decrease in the memory decay exponent α, the distribution of particles changes from a random dispersive pattern to a locally dense one, and then returns to the random one. Correspondingly, the fractal dimension Df,p characterizing the distribution of particle positions increases from a low value to a maximum and then decreases to the low one again. This is determined by the degree of overlap of regions consisting of sites with remanent information. The second moment of the density ρ(2) was introduced to investigate the inhomogeneity of the particle distribution. The dependence of ρ(2) on α is similar to that of Df,p on α. ρ(2) increases with time as a power law in the process of adjusting the particle distribution, and then ρ(2) tends to a stable equilibrium value.

  7. The invariant statistical rule of aerosol scattering pulse signal modulated by random noise

    NASA Astrophysics Data System (ADS)

    Yan, Zhen-gang; Bian, Bao-Min; Yang, Juan; Peng, Gang; Li, Zhen-hua

    2010-11-01

    A model of random background noise acting on particle signals is established to study the impact of the background noise of the photoelectric sensor in a laser airborne particle counter on the statistical character of the aerosol scattering pulse signals. The results show that the noise broadens the statistical distribution of the particle measurements. Further numerical research shows that the output signal amplitude retains the same distribution when airborne particles with a lognormal distribution are modulated by random noise that is also lognormally distributed; that is, the distribution obeys a law of statistical invariance. Based on this model, the background noise of the photoelectric sensor and the counting distributions of the random aerosol scattering pulse signals were acquired and analyzed using a high-speed data acquisition card (PCI-9812). The experimental results and the simulation results are found to be in good agreement.

  8. Influence of microgravity on root-cap regeneration and the structure of columella cells in Zea mays

    NASA Technical Reports Server (NTRS)

    Moore, R.; McClelen, C. E.; Fondren, W. M.; Wang, C. L.

    1987-01-01

    We launched imbibed seeds and seedlings of Zea mays into outer space aboard the space shuttle Columbia to determine the influence of microgravity on 1) root-cap regeneration, and 2) the distribution of amyloplasts and endoplasmic reticulum (ER) in the putative statocytes (i.e., columella cells) of roots. Decapped roots grown on Earth completely regenerated their caps within 4.8 days after decapping, while those grown in microgravity did not regenerate caps. In Earth-grown seedlings, the ER was localized primarily along the periphery of columella cells, and amyloplasts sedimented in response to gravity to the lower sides of the cells. Seeds germinated on Earth and subsequently launched into outer space had a distribution of ER in columella cells similar to that of Earth-grown controls, but amyloplasts were distributed throughout the cells. Seeds germinated in outer space were characterized by the presence of spherical and ellipsoidal masses of ER and randomly distributed amyloplasts in their columella cells. These results indicate that 1) gravity is necessary for regeneration of the root cap, 2) columella cells can maintain their characteristic distribution of ER in microgravity only if they are exposed previously to gravity, and 3) gravity is necessary to distribute the ER in columella cells of this cultivar of Z. mays.

  9. Random phenotypic variation of yeast (Saccharomyces cerevisiae) single-gene knockouts fits a double pareto-lognormal distribution.

    PubMed

    Graham, John H; Robb, Daniel T; Poe, Amy R

    2012-01-01

    Distributed robustness is thought to influence the buffering of random phenotypic variation through the scale-free topology of gene regulatory, metabolic, and protein-protein interaction networks. If this hypothesis is true, then the phenotypic response to the perturbation of particular nodes in such a network should be proportional to the number of links those nodes make with neighboring nodes. This suggests a probability distribution approximating an inverse power-law of random phenotypic variation. Zero phenotypic variation, however, is impossible, because random molecular and cellular processes are essential to normal development. Consequently, a more realistic distribution should have a y-intercept close to zero in the lower tail, a mode greater than zero, and a long (fat) upper tail. The double Pareto-lognormal (DPLN) distribution is an ideal candidate distribution. It consists of a mixture of a lognormal body and upper and lower power-law tails. If our assumptions are true, the DPLN distribution should provide a better fit to random phenotypic variation in a large series of single-gene knockout lines than other skewed or symmetrical distributions. We fit a large published data set of single-gene knockout lines in Saccharomyces cerevisiae to seven different probability distributions: DPLN, right Pareto-lognormal (RPLN), left Pareto-lognormal (LPLN), normal, lognormal, exponential, and Pareto. The best model was judged by the Akaike Information Criterion (AIC). Phenotypic variation among gene knockouts in S. cerevisiae fits a double Pareto-lognormal (DPLN) distribution better than any of the alternative distributions, including the right Pareto-lognormal and lognormal distributions. A DPLN distribution is consistent with the hypothesis that developmental stability is mediated, in part, by distributed robustness, the resilience of gene regulatory, metabolic, and protein-protein interaction networks. Alternatively, multiplicative cell growth, and the mixing of lognormal distributions having different variances, may generate a DPLN distribution.
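
    The model-selection procedure described above can be reproduced for the distributions available in scipy; the DPLN itself needs custom likelihood code, so the sketch below compares only the simpler candidates by AIC on stand-in data:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        data = rng.lognormal(mean=0.0, sigma=0.6, size=2000)  # stand-in data

        candidates = {
            "lognormal":   stats.lognorm,
            "normal":      stats.norm,
            "exponential": stats.expon,
            "pareto":      stats.pareto,
        }

        for name, dist in candidates.items():
            params = dist.fit(data)                 # maximum-likelihood fit
            ll = dist.logpdf(data, *params).sum()
            aic = 2 * len(params) - 2 * ll          # lower is better
            print(f"{name:12s} AIC = {aic:.1f}")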

  10. Design and use of a quantitative scale for measuring presyncope.

    PubMed

    Sheldon, Robert S; Amuah, Joseph E; Connolly, Stuart J; Rose, Sarah; Morillo, Carlos A; Talajic, Mario; Kus, Teresa; Fouad-Tarazi, Fetnat; Klingenheben, Thomas; Krahn, Andrew D; Sheldon, Aaron; Koshman, Mary-Lou; Ritchie, Debbie

    2009-08-01

    Vasovagal syncope is common and distressing. One important symptom is presyncope, but there are no clinimetric measures of this. We developed the Calgary Presyncope Form (CPF) and used it to test whether metoprolol reduces presyncope in a randomized trial. The CPF captures the frequency, duration, and severity of presyncope. We administered it to participants in the Prevention of Syncope Trial (POST), a randomized clinical trial that tested the hypothesis that metoprolol reduces syncope and presyncope in adult patients with vasovagal syncope. The CPF was completed by 44 patients on metoprolol and 39 patients on placebo, of a total of 208 subjects. Completion of the CPF for each of the three dimensions was 84-87% in the 83 respondents. Results were centrally distributed in the duration and severity dimensions, but not in frequency. Patients had a median of 1.2 presyncopal spells per day, with a median moderate severity, lasting a median of 10 minutes. The 3 scales were statistically independent of each other. These results were independent of subject age, and results in all 3 dimensions were stable over the observation period. There was no significant difference between patients on metoprolol and placebo in any dimension. The 3-dimensional CPF is simple, easy to use, and stable over time; it measures 3 independent variables and documents that metoprolol does not reduce presyncope.

  11. Random-Walk Type Model with Fat Tails for Financial Markets

    NASA Astrophysics Data System (ADS)

    Matuttis, Hans-Georg

    Starting from the random-walk model, practices of financial markets are incorporated into the random walk so that fat-tailed distributions like those in the high-frequency data of the SP500 index are reproduced, even though the individual mechanisms are modeled with normally distributed data. The incorporation of local correlation narrows the distribution for "frequent" events, whereas global correlations due to technical analysis lead to fat tails. Delay of market transactions in the trading process shifts the fat-tail probabilities downwards. Such an inclusion of reactions to market fluctuations leads to mini-trends which are distributed with unit variance.

  12. Distribution of shortest cycle lengths in random networks

    NASA Astrophysics Data System (ADS)

    Bonneau, Haggai; Hassid, Aviv; Biham, Ofer; Kühn, Reimer; Katzav, Eytan

    2017-12-01

    We present analytical results for the distribution of shortest cycle lengths (DSCL) in random networks. The approach is based on the relation between the DSCL and the distribution of shortest path lengths (DSPL). We apply this approach to configuration model networks, for which analytical results for the DSPL were obtained before. We first calculate the fraction of nodes in the network which reside on at least one cycle. Conditioning on being on a cycle, we provide the DSCL over ensembles of configuration model networks with degree distributions which follow a Poisson distribution (Erdős-Rényi network), degenerate distribution (random regular graph), and a power-law distribution (scale-free network). The mean and variance of the DSCL are calculated. The analytical results are found to be in very good agreement with the results of computer simulations.
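
    The shortest cycle through a given node can be computed by removing each incident edge in turn and searching for an alternative shortest path between its endpoints, which is one simple way to build an empirical DSCL for comparison with the analytical results. A minimal sketch on an Erdős-Rényi graph (parameters are illustrative):

        import networkx as nx

        def shortest_cycle_through(G, v):
            """Length of the shortest cycle through v, or None: remove
            each incident edge in turn and look for an alternative
            shortest path between its endpoints."""
            best = None
            for u in list(G.neighbors(v)):
                G.remove_edge(v, u)
                try:
                    d = nx.shortest_path_length(G, v, u)
                    best = d + 1 if best is None else min(best, d + 1)
                except nx.NetworkXNoPath:
                    pass
                G.add_edge(v, u)
            return best

        G = nx.erdos_renyi_graph(1000, 4.0 / 1000, seed=42)  # mean degree ~4
        lengths = [c for v in G
                   if (c := shortest_cycle_through(G, v)) is not None]
        print(min(lengths), sum(lengths) / len(lengths))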

  13. Paretian Poisson Processes

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2008-05-01

    Many random populations can be modeled as a countable set of points scattered randomly on the positive half-line. The points may represent magnitudes of earthquakes and tornados, masses of stars, market values of public companies, etc. In this article we explore a specific class of such random populations, which we coin 'Paretian Poisson processes'. This class is elemental in statistical physics—connecting together, in a deep and fundamental way, diverse issues including: the Poisson distribution of the Law of Small Numbers; Paretian tail statistics; the Fréchet distribution of Extreme Value Theory; the one-sided Lévy distribution of the Central Limit Theorem; scale-invariance, renormalization and fractality; resilience to random perturbations.

  14. Numbering questionnaires had no impact on the response rate and only a slight influence on the response content of a patient safety culture survey: a randomized trial.

    PubMed

    Kundig, François; Staines, Anthony; Kinge, Thompson; Perneger, Thomas V

    2011-11-01

    In self-completed surveys, anonymous questionnaires are sometimes numbered so as to avoid sending reminders to initial nonrespondents. This number may be perceived as a threat to confidentiality by some respondents, which may reduce the response rate or cause social desirability bias. In this study, we evaluated whether using nonnumbered vs. numbered questionnaires influenced the response rate and the response content. During a patient safety culture survey, we randomized participants into two groups: one received an anonymous nonnumbered questionnaire and the other a numbered questionnaire. We compared the survey response rates and the distributions of the responses for the 42 questionnaire items across the two groups. Response rates were similar in the two groups (nonnumbered, 75.2%; numbered, 72.8%; difference, 2.4%; P=0.28). Five of the 42 items had statistically significant differences in distributions, but these differences were small. Unexpectedly, in all five instances, the patient safety culture ratings were more favorable in the nonnumbered group. Numbering of mailed questionnaires had no impact on the response rate. Numbering significantly influenced the response content of several items, but these differences were small and ran against the hypothesis of social desirability bias. Copyright © 2011 Elsevier Inc. All rights reserved.

  15. A simple two-stage model predicts response time distributions.

    PubMed

    Carpenter, R H S; Reddi, B A J; Anderson, A J

    2009-08-15

    The neural mechanisms underlying reaction times have previously been modelled in two distinct ways. When stimuli are hard to detect, response time tends to follow a random-walk model that integrates noisy sensory signals. But studies investigating the influence of higher-level factors such as prior probability and response urgency typically use highly detectable targets, and response times then usually correspond to a linear rise-to-threshold mechanism. Here we show that a model incorporating both types of element in series - a detector integrating noisy afferent signals, followed by a linear rise-to-threshold stage performing the decision - successfully predicts not only mean response times but, much more stringently, the observed distribution of these times and the rate of decision errors over a wide range of stimulus detectability. By reconciling what previously may have seemed to be conflicting theories, we are now closer to having a complete description of reaction time and the decision processes that underlie it.
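
    The two-stage model lends itself to direct simulation: a noisy accumulator models detection, and a LATER-style linear rise with a normally distributed rate models the decision; the response time is the sum of the two stage durations. A minimal sketch with hypothetical parameters (not the authors' fitted values):

        import numpy as np

        rng = np.random.default_rng(6)

        def simulate_rt(n, drift=0.1, noise=1.0, bound=15.0,
                        r_mu=2.0, r_sd=0.5, theta=100.0, dt=1.0):
            """Two-stage RT: noisy integration to a detection bound,
            then a linear rise to threshold with a Gaussian rate."""
            rts = np.empty(n)
            for i in range(n):
                # Stage 1: accumulate noisy evidence until detection.
                x, t = 0.0, 0.0
                while x < bound:
                    x += drift * dt + noise * np.sqrt(dt) * rng.normal()
                    t += dt
                # Stage 2: LATER-style decision; rate varies trial to trial.
                rate = max(rng.normal(r_mu, r_sd), 1e-6)
                rts[i] = t + theta / rate
            return rts

        rts = simulate_rt(1000)
        print(np.median(rts), np.percentile(rts, [5, 95]))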

  16. An Examination of Adaptive Reversion in Saccharomyces Cerevisiae

    PubMed Central

    Steele, D. F.; Jinks-Robertson, S.

    1992-01-01

    Reversion to Lys(+) prototrophy in a haploid yeast strain containing a defined lys2 frameshift mutation has been examined. When cells were plated on synthetic complete medium lacking only lysine, the numbers of Lys(+) revertant colonies accumulated in a time-dependent manner in the absence of any detectable increase in cell number. An examination of the distribution of the numbers of early appearing Lys(+) colonies from independent cultures suggests that the mutations to prototrophy occurred randomly during nonselective growth. In contrast, an examination of the distribution of late appearing Lys(+) colonies indicates that the underlying reversion events occurred after selective plating. No accumulation of Lys(+) revertants occurred when cells were starved for tryptophan, leucine or both lysine and tryptophan prior to plating selectively for Lys(+) revertants. These results indicate that mutations accumulate more frequently when they confer a selective advantage, and are thus consistent with the occurrence of adaptive mutations in yeast. PMID:1398066

  17. Phylogeny, paleontology, and primates: do incomplete fossils bias the tree of life?

    PubMed

    Pattinson, David J; Thompson, Richard S; Piotrowski, Aleks K; Asher, Robert J

    2015-03-01

    Paleontological systematics relies heavily on morphological data that have undergone decay and fossilization. Here, we apply a heuristic means to assess how a fossil's incompleteness detracts from inferring its phylogenetic relationships. We compiled a phylogenetic matrix for primates and simulated the extinction of living species by deleting an extant taxon's molecular data and keeping only those morphological characters present in actual fossils. The choice of characters present in a given living taxon (the subject) was defined by those present in a given fossil (the template). By measuring congruence between a well-corroborated phylogeny to those incorporating artificial fossils, and by comparing real vs. random character distributions and states, we tested the information content of paleontological datasets and determined if extinction of a living species leads to bias in phylogeny reconstruction. We found a positive correlation between fossil completeness and topological congruence. Real fossil templates sampled for 36 or more of the 360 available morphological characters (including dental) performed significantly better than similarly complete templates with random states. Templates dominated by only one partition performed worse than templates with randomly sampled characters across partitions. The template based on the Eocene primate Darwinius masillae performs better than most other templates with a similar number of sampled characters, likely due to preservation of data across multiple partitions. Our results support the interpretation that Darwinius is strepsirhine, not haplorhine, and suggest that paleontological datasets are reliable in primate phylogeny reconstruction. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. Survival of extreme opinions

    NASA Astrophysics Data System (ADS)

    Hsu, Jiann-wien; Huang, Ding-wei

    2009-12-01

    We study the survival of extreme opinions in various processes of consensus formation. All the opinions are treated equally and subjected to the same rules of change. We investigate three typical models for reaching a consensus: (A) personal influence, (B) influence from surroundings, and (C) influence on surroundings. Starting with uniformly distributed random opinions, our calculated results show that extreme opinions can survive in both models (A) and (B), but not in model (C). We conclude that both personal influence and passive adaptation to the environment are insufficient to eradicate all extreme opinions; only active persuasion to change the surroundings eliminates extreme opinions completely.

  19. Hybrid computer technique yields random signal probability distributions

    NASA Technical Reports Server (NTRS)

    Cameron, W. D.

    1965-01-01

    Hybrid computer determines the probability distributions of instantaneous and peak amplitudes of random signals. This combined digital and analog computer system reduces the errors and delays of manual data analysis.

  20. Virtual house calls for Parkinson disease (Connect.Parkinson): study protocol for a randomized, controlled trial.

    PubMed

    Achey, Meredith A; Beck, Christopher A; Beran, Denise B; Boyd, Cynthia M; Schmidt, Peter N; Willis, Allison W; Riggare, Sara S; Simone, Richard B; Biglan, Kevin M; Dorsey, E Ray

    2014-11-27

    Interest in improving care for the growing number of individuals with chronic conditions is rising. However, access to care is limited by distance, disability, and distribution of doctors. Small-scale studies in Parkinson disease, a prototypical chronic condition, have suggested that delivering care using video house calls is feasible, offers similar clinical outcomes to in-person care, and reduces travel burden. We are conducting a randomized comparative effectiveness study (Connect.Parkinson) comparing usual care in the community to usual care augmented by virtual house calls with a Parkinson disease specialist. Recruitment is completed centrally using online advertisements and emails and by contacting physicians, support groups, and allied health professionals. Efforts target areas with a high proportion of individuals not receiving care from neurologists. Approximately 200 individuals with Parkinson disease and their care partners will be enrolled at 20 centers throughout the United States and followed for one year. Participants receive educational materials, then are randomized in a 1:1 ratio to continue their usual care (control arm) or usual care and specialty care delivered virtually (intervention arm). Care partners are surveyed about their time and travel burden and their perceived caregiver burden. Participants are evaluated via electronic survey forms and videoconferencing with a blinded independent rater at baseline and at 12 months. All study activities are completed remotely. The primary outcomes are: (1) feasibility, as measured by the proportion of visits completed, and (2) quality of life, as measured by the 39-item Parkinson's Disease Questionnaire. Secondary outcomes include measures of clinical benefit, quality of care, time and travel burden, and caregiver burden. Connect.Parkinson will evaluate the feasibility and effectiveness of using technology to deliver care into the homes of individuals with Parkinson disease. The trial may serve as a model for increasing access and delivering patient-centered care at home for individuals with chronic conditions. This trial was registered on clinicaltrials.gov on January 8, 2014 [NCT02038959].

  1. Surface plasmon enhanced cell microscopy with blocked random spatial activation

    NASA Astrophysics Data System (ADS)

    Son, Taehwang; Oh, Youngjin; Lee, Wonju; Yang, Heejin; Kim, Donghyun

    2016-03-01

    We present surface plasmon enhanced fluorescence microscopy with random spatial sampling using a patterned block of silver nanoislands. Rigorous coupled wave analysis was performed to confirm near-field localization on the nanoislands. Random nanoislands were fabricated in silver by temperature annealing. By analyzing the random near-field distribution, the average size of the localized fields was found to be on the order of 135 nm. The randomly localized near-fields were used to spatially sample F-actin of J774 cells (a mouse macrophage cell line). An image deconvolution algorithm based on linear imaging theory was established for stochastic estimation of the fluorescent molecular distribution. The alignment between the near-field distribution and the raw image was performed using the patterned block. The achieved resolution depends on factors including the size of the localized fields and is estimated to be 100-150 nm.

  2. Random walks with random velocities.

    PubMed

    Zaburdaev, Vasily; Schmiedeberg, Michael; Stark, Holger

    2008-07-01

    We consider a random walk model that takes into account the velocity distribution of random walkers. Random motion with alternating velocities is inherent to various physical and biological systems. Moreover, the velocity distribution is often the first characteristic that is experimentally accessible. Here, we derive transport equations describing the dispersal process in the model and solve them analytically. The asymptotic properties of solutions are presented in the form of a phase diagram that shows all possible scaling regimes, including superdiffusive, ballistic, and superballistic motion. The theoretical results of this work are in excellent agreement with accompanying numerical simulations.

  3. Modeling species-abundance relationships in multi-species collections

    USGS Publications Warehouse

    Peng, S.; Yin, Z.; Ren, H.; Guo, Q.

    2003-01-01

    Species-abundance relationship is one of the most fundamental aspects of community ecology. Since Motomura first developed the geometric series model to describe the feature of community structure, ecologists have developed many other models to fit the species-abundance data in communities. These models can be classified into empirical and theoretical ones, including (1) statistical models, i.e., negative binomial distribution (and its extension), log-series distribution (and its extension), geometric distribution, lognormal distribution, Poisson-lognormal distribution, (2) niche models, i.e., geometric series, broken stick, overlapping niche, particulate niche, random assortment, dominance pre-emption, dominance decay, random fraction, weighted random fraction, composite niche, Zipf or Zipf-Mandelbrot model, and (3) dynamic models describing community dynamics and restrictive function of environment on community. These models have different characteristics and fit species-abundance data in various communities or collections. Among them, log-series distribution, lognormal distribution, geometric series, and broken stick model have been most widely used.

  4. Does prism width from the shell prismatic layer have a random distribution?

    NASA Astrophysics Data System (ADS)

    Vancolen, Séverine; Verrecchia, Eric

    2008-10-01

    A study of the distribution of the prism width inside the prismatic layer of Unio tumidus (Philipsson 1788, Diss Hist-Nat, Berling, Lundæ) from Lake Neuchâtel, Switzerland, has been conducted in order to determine whether or not this distribution is random. Measurements of 954 to 1,343 prism widths (depending on shell sample) have been made using a scanning electron microscope in backscattered electron mode. A white noise test has been applied to the distribution of prism sizes (i.e. width). It shows that there is no temporal cycle that could potentially influence their formation and growth. These results suggest that prism widths are randomly distributed, and related neither to external rings nor to environmental constraints.
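
    A white-noise check of this kind can be run with a portmanteau test such as Ljung-Box, which asks whether the autocorrelations of the series are jointly zero (shown below on simulated widths; the authors' exact test statistic may differ).

```python
# Illustrative sketch only: Ljung-Box white-noise test on a stand-in
# sequence of prism widths (simulated here, not the paper's data).
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(1)
widths = rng.normal(loc=30.0, scale=4.0, size=1000)  # stand-in width series

result = acorr_ljungbox(widths, lags=[20], return_df=True)
print(result)  # a large p-value gives no evidence against white noise
```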

  5. Spatial Distribution of Phase Singularities in Optical Random Vector Waves.

    PubMed

    De Angelis, L; Alpeggiani, F; Di Falco, A; Kuipers, L

    2016-08-26

    Phase singularities are dislocations widely studied in optical fields as well as in other areas of physics. With experiment and theory we show that the vectorial nature of light affects the spatial distribution of phase singularities in random light fields. While in scalar random waves phase singularities exhibit spatial distributions reminiscent of particles in isotropic liquids, in vector fields their distribution for the different vector components becomes anisotropic due to the direct relation between propagation and field direction. By incorporating this relation in the theory for scalar fields by Berry and Dennis [Proc. R. Soc. A 456, 2059 (2000)], we quantitatively describe our experiments.

  6. Random distributed feedback fiber laser at 2.1  μm.

    PubMed

    Jin, Xiaoxi; Lou, Zhaokai; Zhang, Hanwei; Xu, Jiangming; Zhou, Pu; Liu, Zejin

    2016-11-01

    We demonstrate a random distributed feedback fiber laser at 2.1 μm. A high-power pulsed Tm-doped fiber laser operating at 1.94 μm with a temporal duty ratio of 30% was employed as a pump laser to increase the equivalent incident pump power. A piece of 150 m of highly GeO2-doped silica fiber, which provides strong Raman gain and random distributed feedback, was used as the gain medium. The maximum output power reached 0.5 W with an optical efficiency of 9%, which could be further improved with more pump power and an optimized fiber length. To the best of our knowledge, this is the first demonstration of a random distributed feedback fiber laser in the 2 μm band based on Raman gain.

  7. Saddlepoint approximation to the distribution of the total distance of the continuous time random walk

    NASA Astrophysics Data System (ADS)

    Gatto, Riccardo

    2017-12-01

    This article considers the random walk over Rp, with p ≥ 2, where a given particle starts at the origin and moves stepwise with uniformly distributed step directions and step lengths following a common distribution. Step directions and step lengths are independent. The case where the number of steps of the particle is fixed and the more general case where it follows an independent continuous time inhomogeneous counting process are considered. Saddlepoint approximations to the distribution of the distance from the position of the particle to the origin are provided. Despite the p-dimensional nature of the random walk, the computations of the saddlepoint approximations are one-dimensional and thus simple. Explicit formulae are derived with dimension p = 3: for uniformly and exponentially distributed step lengths, for fixed and for Poisson distributed number of steps. In these situations, the high accuracy of the saddlepoint approximations is illustrated by numerical comparisons with Monte Carlo simulation. Contribution to the "Topical Issue: Continuous Time Random Walk Still Trendy: Fifty-year History, Current State and Outlook", edited by Ryszard Kutner and Jaume Masoliver.
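
    A Monte Carlo stand-in for the quantities the saddlepoint formulas approximate (my parameters; the article's approximations themselves are not reproduced here): distance to the origin in R³ with uniform step directions and exponential step lengths, for a fixed and for a Poisson-distributed number of steps.

```python
# Simulation sketch of the p = 3 random walk described above.
import numpy as np

rng = np.random.default_rng(2)

def distances(n_steps, n_samples):
    """Distance to origin after n_steps: uniform directions, Exp(1) lengths."""
    g = rng.normal(size=(n_samples, n_steps, 3))
    dirs = g / np.linalg.norm(g, axis=2, keepdims=True)   # uniform on the sphere
    steps = dirs * rng.exponential(1.0, size=(n_samples, n_steps, 1))
    return np.linalg.norm(steps.sum(axis=1), axis=1)

print("fixed n=10:  mean distance %.3f" % distances(10, 100_000).mean())

# Poisson-distributed number of steps (homogeneous special case), mean 10
counts = rng.poisson(10, size=20_000)
d = np.array([distances(k, 1)[0] if k else 0.0 for k in counts])
print("Poisson(10): mean distance %.3f" % d.mean())
```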

  8. Exploratory study and application of the angular wavelet analysis for assessing the spatial distribution of breakdown spots in Pt/HfO2/Pt structures

    NASA Astrophysics Data System (ADS)

    Muñoz-Gorriz, J.; Monaghan, S.; Cherkaoui, K.; Suñé, J.; Hurley, P. K.; Miranda, E.

    2017-12-01

    The angular wavelet analysis is applied for assessing the spatial distribution of breakdown spots in Pt/HfO2/Pt capacitors with areas ranging from 10⁴ to 10⁵ μm². The breakdown spot lateral sizes are in the range from 1 to 3 μm, and they appear distributed on the top metal electrode as a point pattern. The spots are generated by ramped and constant voltage stresses and are the consequence of microexplosions caused by the formation of shorts spanning the dielectric film. This kind of pattern was analyzed in the past using conventional spatial analysis tools such as intensity plots, distance histograms, the pair correlation function, and nearest neighbours. Here, we show that the wavelet analysis offers an alternative and complementary method for testing whether or not the failure site distribution departs from a complete spatial randomness process in the angular domain. The effect of using different wavelet functions, such as the Haar, Sine, French top hat, Mexican hat, and Morlet, as well as the roles played by the process intensity, the location of the voltage probe, and the aspect ratio of the device, are all discussed.
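
    One way to picture the angular test (an illustrative sketch under my own assumptions; the paper examines several mother wavelets and scales): convolve the spot angles with a Mexican-hat wavelet and compare the largest coefficient against Monte Carlo envelopes generated under complete spatial randomness (CSR).

```python
# Sketch: angular wavelet coefficients of spot angles vs. CSR envelopes.
import numpy as np

rng = np.random.default_rng(3)

def mexican_hat(u):
    return (1.0 - u**2) * np.exp(-0.5 * u**2)

def angular_coeffs(angles, scale=0.3, n_phi=180):
    phis = np.linspace(-np.pi, np.pi, n_phi, endpoint=False)
    # wrap angular differences into [-pi, pi) before scaling
    d = (angles[None, :] - phis[:, None] + np.pi) % (2 * np.pi) - np.pi
    return phis, mexican_hat(d / scale).sum(axis=1)

angles = rng.uniform(-np.pi, np.pi, 200)      # stand-in for measured spot angles
phis, w = angular_coeffs(angles)

# CSR envelope: max |W| over repeated uniform resamplings of the same size
sims = [np.abs(angular_coeffs(rng.uniform(-np.pi, np.pi, 200))[1]).max()
        for _ in range(200)]
print("observed max|W| = %.1f, 95%% CSR envelope = %.1f"
      % (np.abs(w).max(), np.quantile(sims, 0.95)))
```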

  9. A Pearson Random Walk with Steps of Uniform Orientation and Dirichlet Distributed Lengths

    NASA Astrophysics Data System (ADS)

    Le Caër, Gérard

    2010-08-01

    A constrained diffusive random walk of n steps in ℝ^d and a random flight in ℝ^d, which are equivalent, were investigated independently in recent papers (J. Stat. Phys. 127:813, 2007; J. Theor. Probab. 20:769, 2007; and J. Stat. Phys. 131:1039, 2008). The n steps of the walk are independent and identically distributed random vectors of exponential length and uniform orientation. Conditioned on the sum of their lengths being equal to a given value l, closed-form expressions for the distribution of the endpoint of the walk were obtained for any n for d=1,2,4. Uniform distributions of the endpoint inside a ball of radius l were evidenced for a walk of three steps in 2D and of two steps in 4D. The previous walk is generalized by considering step lengths which have independent and identical gamma distributions with a shape parameter q>0. Given the total walk length being equal to 1, the step lengths have a Dirichlet distribution whose parameters are all equal to q. The walk and the flight above correspond to q=1. Simple analytical expressions are obtained for any d≥2 and n≥2 for the endpoint distributions of two families of walks whose q are integers or half-integers which depend solely on d. These endpoint distributions have a simple geometrical interpretation. For a two-step planar walk with q=1, it means that the distribution of the endpoint on a disc of radius 1 is identical to the distribution of the projection on the disc of a point M uniformly distributed over the surface of the 3D unit sphere. Five additional walks, with a uniform distribution of the endpoint inside a ball, are found from known finite integrals of products of powers and Bessel functions of the first kind. They include four different walks in ℝ^3, two of two steps and two of three steps, and one walk of two steps in ℝ^4. Pearson-Liouville random walks, obtained by distributing the total lengths of the previous Pearson-Dirichlet walks according to some specified probability law, are finally discussed. Examples of unconstrained random walks, whose step lengths are gamma distributed, are considered in particular.
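
    The geometric interpretation quoted above is easy to verify numerically; here is a Monte Carlo sketch (my construction) of the two-step planar walk with q = 1: total length 1, Dirichlet(1,1) step lengths (a uniform split), and independent uniform directions, compared against the projection of uniform points on the 3-D unit sphere.

```python
# Numerical check of the sphere-projection property for the 2-step planar walk.
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# Walk endpoint: lengths (U, 1-U) with U ~ Uniform(0,1), uniform directions
u = rng.uniform(0, 1, n)
th1, th2 = rng.uniform(0, 2 * np.pi, (2, n))
x = u * np.cos(th1) + (1 - u) * np.cos(th2)
y = u * np.sin(th1) + (1 - u) * np.sin(th2)
r_walk = np.hypot(x, y)

# Projection of uniform points on the 3-D unit sphere onto the disc
g = rng.normal(size=(n, 3))
g /= np.linalg.norm(g, axis=1, keepdims=True)
r_proj = np.hypot(g[:, 0], g[:, 1])

for q in (0.25, 0.5, 0.75, 0.9):
    print("quantile %.2f:  walk %.4f   projection %.4f"
          % (q, np.quantile(r_walk, q), np.quantile(r_proj, q)))
```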

  10. Canine retraction and anchorage loss: self-ligating versus conventional brackets in a randomized split-mouth study.

    PubMed

    da Costa Monini, André; Júnior, Luiz Gonzaga Gandini; Martins, Renato Parsekian; Vianna, Alexandre Protásio

    2014-09-01

    To evaluate the velocity of canine retraction, anchorage loss and changes in canine and first molar inclinations using self-ligating and conventional brackets. Twenty-five adults with Class I malocclusion and a treatment plan involving extractions of four first premolars were selected for this randomized split-mouth controlled trial. Conventional and self-ligating brackets were randomly assigned to each patient's maxillary canines. Retraction was accomplished using 100-g nickel-titanium closed coil springs, which were reactivated every 4 weeks. Oblique radiographs were taken before and after canine retraction was completed, and the cephalograms were superimposed on stable structures of the maxilla. Cephalometric points were digitized twice by a blinded operator for error control, and the following measurements were collected: canine cusp and apex horizontal changes, molar cusp and apex horizontal changes, and angulation changes in canines and molars. The blinded data, which were normally distributed, were analyzed through paired t-tests for group differences. No differences were found between the two groups for any of the variables tested. Both brackets showed the same velocity of canine retraction and loss of anteroposterior anchorage of the molars. No differences were found between brackets regarding the inclination of canines and first molars.

  11. Fractal planetary rings: Energy inequalities and random field model

    NASA Astrophysics Data System (ADS)

    Malyarenko, Anatoliy; Ostoja-Starzewski, Martin

    2017-12-01

    This study is motivated by a recent observation, based on photographs from the Cassini mission, that Saturn’s rings have a fractal structure in the radial direction. Accordingly, two questions are considered: (1) What Newtonian mechanics argument in support of such a fractal structure of planetary rings is possible? (2) What kinematic model of such fractal rings can be formulated? Both challenges are based on taking planetary rings’ spatial structure as being statistically stationary in time and statistically isotropic in space, but statistically nonstationary in space. An answer to the first challenge is given through an energy analysis of circular rings having a self-generated, noninteger-dimensional mass distribution [V. E. Tarasov, Int. J. Mod Phys. B 19, 4103 (2005)]. The second issue is approached by taking the random field of angular velocity vector of a rotating particle of the ring as a random section of a special vector bundle. Using the theory of group representations, we prove that such a field is completely determined by a sequence of continuous positive-definite matrix-valued functions defined on the Cartesian square F2 of the radial cross-section F of the rings, where F is a fat fractal.

  12. Effects of initiating moderate wine intake on abdominal adipose tissue in adults with type 2 diabetes: a 2-year randomized controlled trial.

    PubMed

    Golan, Rachel; Shelef, Ilan; Shemesh, Elad; Henkin, Yaakov; Schwarzfuchs, Dan; Gepner, Yftach; Harman-Boehm, Ilana; Witkow, Shula; Friger, Michael; Chassidim, Yoash; Liberty, Idit F; Sarusi, Benjamin; Serfaty, Dana; Bril, Nitzan; Rein, Michal; Cohen, Noa; Ben-Avraham, Sivan; Ceglarek, Uta; Stumvoll, Michael; Blüher, Matthias; Thiery, Joachim; Stampfer, Meir J; Rudich, Assaf; Shai, Iris

    2017-02-01

    To generate evidence-based conclusions about the effect of wine consumption on weight gain and abdominal fat accumulation and distribution in patients with type 2 diabetes. In the 2-year randomized controlled CASCADE (CArdiovaSCulAr Diabetes & Ethanol) trial, patients following a Mediterranean diet were randomly assigned to drink 150 ml of mineral water, white wine or red wine with dinner for 2 years. Visceral adiposity and abdominal fat distribution were measured in a subgroup of sixty-five participants, using abdominal MRI. Ben-Gurion University of the Negev, Soroka-Medical Center and the Nuclear Research Center Negev, Israel. Alcohol-abstaining adults with well-controlled type 2 diabetes. Forty-eight participants (red wine, n 27; mineral water, n 21) who completed a second MRI measurement were included in the 2-year analysis. Similar weight losses (sd) were observed: red wine 1·3 (3·9) kg; water 1·0 (4·2) kg (P=0·8 between groups). Changes (95 % CI) in abdominal adipose-tissue distribution were similar: red wine, visceral adipose tissue (VAT) -3·0 (-8·0, 2·0) %, deep subcutaneous adipose tissue (DSAT) +5·2 (-1·1, 11·6) %, superficial subcutaneous adipose tissue (SSAT) -1·9 (-5·0, 1·2) %; water, VAT -3·2 (-8·9, 2·5) %, DSAT +2·9 (-2·8, 8·6) %, SSAT -0·15 (-3·3, 2·9) %. No changes in antidiabetic medication and no substantial changes in energy intake (+126 (sd 2889) kJ/d (+30·2 (sd 690) kcal/d), P=0·8) were recorded. A 2-year decrease in glycated Hb (β=0·28, P=0·05) was associated with a decrease in VAT. Moderate wine consumption, as part of a Mediterranean diet, in persons with controlled diabetes did not promote weight gain or abdominal adiposity.

  13. Azimuthal Dependence of the Ground Motion Variability from Scenario Modeling of the 2014 Mw6.0 South Napa, California, Earthquake Using an Advanced Kinematic Source Model

    NASA Astrophysics Data System (ADS)

    Gallovič, F.

    2017-09-01

    Strong ground motion simulations require a physically plausible earthquake source model. Here, I present the application of such a kinematic model introduced originally by Ruiz et al. (Geophys J Int 186:226-244, 2011). The model is constructed to inherently provide synthetics with the desired omega-squared spectral decay in the full frequency range. The source is composed of randomly distributed overlapping subsources with a fractal number-size distribution. The position of the subsources can be constrained by prior knowledge of major asperities (stemming, e.g., from slip inversions), or can be completely random. From the earthquake physics point of view, the model includes a positive correlation between slip and rise time, as found in dynamic source simulations. Rupture velocity and rise time follow the local S-wave velocity profile, so that the rupture slows down and rise times increase close to the surface, avoiding unrealistically strong ground motions. Rupture velocity can also have random variations, which result in an irregular rupture front while satisfying the causality principle. This advanced kinematic broadband source model is freely available and can be easily incorporated into any numerical wave propagation code, as the source is described by spatially distributed slip rate functions, not requiring any stochastic Green's functions. The source model has been previously validated against the observed data from the very shallow unilateral 2014 Mw6.0 South Napa, California, earthquake; the model reproduces the observed data well, including the near-fault directivity (Seism Res Lett 87:2-14, 2016). The performance of the source model is demonstrated here through scenario simulations for the same event. In particular, synthetics are compared with existing ground motion prediction equations (GMPEs), emphasizing the azimuthal dependence of the between-event ground motion variability. I propose a simple model reproducing the azimuthal variations of the between-event ground motion variability, providing an insight into possible refinement of GMPEs' functional forms.
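
    A toy illustration of the composite-subsource construction (my reading of the summary above, with placeholder parameters and an assumed fractal dimension of 2; not the released code): subsource radii halve from level to level while their count grows so that the number-size statistics follow a power law, and positions are drawn uniformly over the fault plane.

```python
# Sketch: cover a fault with overlapping circular subsources, N(>R) ~ R^-2.
import numpy as np

rng = np.random.default_rng(5)
L, W = 30.0, 12.0                  # fault length/width in km (hypothetical)

radii = []
for level in range(1, 6):          # radius halves at each level
    r = (W / 2) / 2 ** (level - 1)
    n = 4 ** (level - 1)           # 4x more subsources per halving -> D = 2
    radii += [r] * n

centers = rng.uniform([0.0, 0.0], [L, W], size=(len(radii), 2))
print("%d subsources; largest radius %.1f km" % (len(radii), max(radii)))
```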

  14. Continuous Time Random Walks with memory and financial distributions

    NASA Astrophysics Data System (ADS)

    Montero, Miquel; Masoliver, Jaume

    2017-11-01

    We study financial distributions from the perspective of Continuous Time Random Walks with memory. We review some of our previous developments and apply them to financial problems. We also present some new models with memory that can be useful in characterizing tendency effects which are inherent in most markets. We also briefly study the effect on return distributions of fractional behaviors in the distribution of pausing times between successive transactions.
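
    For orientation, a bare-bones CTRW simulation (illustrative choices, not the authors' models): i.i.d. pauses between transactions and i.i.d. return jumps. The memory extensions discussed above would correlate successive jumps or pauses rather than drawing them independently.

```python
# Minimal memoryless CTRW for a log-price path.
import numpy as np

rng = np.random.default_rng(6)
n = 100_000

waits = rng.pareto(1.5, n) + 1.0     # heavy-tailed pauses between transactions
jumps = rng.normal(0.0, 1.0, n)      # Gaussian log-return jumps

t = np.cumsum(waits)                 # transaction times
x = np.cumsum(jumps)                 # log-price at those times
print("mean pause %.2f; final log-price after %d transactions: %.1f"
      % (waits.mean(), n, x[-1]))
```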

  15. Multivariate test power approximations for balanced linear mixed models in studies with missing data.

    PubMed

    Ringham, Brandy M; Kreidler, Sarah M; Muller, Keith E; Glueck, Deborah H

    2016-07-30

    Multilevel and longitudinal studies are frequently subject to missing data. For example, biomarker studies for oral cancer may involve multiple assays for each participant. Assays may fail, resulting in missing data values that can be assumed to be missing completely at random. Catellier and Muller proposed a data analytic technique to account for data missing at random in multilevel and longitudinal studies. They suggested modifying the degrees of freedom for both the Hotelling-Lawley trace F statistic and its null case reference distribution. We propose parallel adjustments to approximate power for this multivariate test in studies with missing data. The power approximations use a modified non-central F statistic, which is a function of (i) the expected number of complete cases, (ii) the expected number of non-missing pairs of responses, or (iii) the trimmed sample size, which is the planned sample size reduced by the anticipated proportion of missing data. The accuracy of the method is assessed by comparing the theoretical results to the Monte Carlo simulated power for the Catellier and Muller multivariate test. Over all experimental conditions, the closest approximation to the empirical power of the Catellier and Muller multivariate test is obtained by adjusting power calculations with the expected number of complete cases. The utility of the method is demonstrated with a multivariate power analysis for a hypothetical oral cancer biomarkers study. We describe how to implement the method using standard, commercially available software products and give example code. Copyright © 2015 John Wiley & Sons, Ltd.
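
    A sketch of how such a power approximation can be computed (my construction from the summary above, not Ringham et al.'s exact formulas): evaluate a noncentral F with the denominator degrees of freedom tied to the expected number of complete cases.

```python
# Hedged power-approximation sketch using scipy's noncentral F distribution.
from scipy import stats

def approx_power(ncp, df1, n_expected_complete, n_outcomes, alpha=0.05):
    """Power of an F test with df2 adjusted for anticipated missing data."""
    df2 = n_expected_complete - n_outcomes   # crude df adjustment (assumption)
    fcrit = stats.f.ppf(1 - alpha, df1, df2)
    return stats.ncf.sf(fcrit, df1, df2, ncp)

# planned n = 60 with 20% anticipated missingness -> 48 expected complete cases
print("approx power: %.3f"
      % approx_power(ncp=12.0, df1=3, n_expected_complete=48, n_outcomes=2))
```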

  16. Use of monetary and nonmonetary incentives to increase response rates among African Americans in the Wisconsin Pregnancy Risk Assessment Monitoring System.

    PubMed

    Dykema, Jennifer; Stevenson, John; Kniss, Chad; Kvale, Katherine; González, Kim; Cautley, Eleanor

    2012-05-01

    From 2009 to 2010, an experiment was conducted to increase response rates among African American mothers in the Wisconsin Pregnancy Risk Assessment Monitoring System (PRAMS). Sample members were randomly assigned to groups that received a prepaid, cash incentive of $5 (n = 219); a coupon for diapers valued at $6 (n = 210); or no incentive (n = 209). Incentives were included with the questionnaire, which was mailed to respondents. We examined the effects of the incentives on several outcomes, including response rates, cost effectiveness, survey response distributions, and item nonresponse. Response rates were significantly higher for the cash group than for the coupon (42.5 vs. 32.4%, P < .05) or no incentive group (42.5 vs. 30.1%, P < .01); the coupon and no incentive groups performed similarly. While absolute costs were the highest for the cash group, the cost per completed survey was the lowest. The incentives had limited effects on response distributions for specific survey questions. Although respondents completing the survey by mail in the cash and coupon groups exhibited a trend toward being less likely to have missing data, the effect was not significant. Compared to a coupon or no incentive, a small cash incentive significantly improved response rates and was cost effective among African American respondents in Wisconsin PRAMS. Incentives had only limited effects, however, on survey response distributions, and no significant effects on item nonresponse.

  17. Solar wind driving and substorm triggering

    NASA Astrophysics Data System (ADS)

    Newell, Patrick T.; Liou, Kan

    2011-03-01

    We compare solar wind driving and its changes for three data sets: (1) 4861 identifications of substorm onsets from satellite global imagers (Polar UVI and IMAGE FUV); (2) a similar number of otherwise random times chosen with a similar solar wind distribution (slightly elevated driving); (3) completely random times. Multiple measures of solar wind driving were used, including interplanetary magnetic field (IMF) Bz, the Kan-Lee electric field, the Borovsky function, and dΦ_MP/dt (all of which estimate dayside merging). Superposed epoch analysis verifies that the mean Bz has a northward turning (or at least averages less southward) starting 20 min before onset. We argue that the delay between IMF impact on the magnetopause and tail effects appearing in the ionosphere is about that long. The northward turning is not the effect of a few extreme events. The median field shows the same result, as do all other measures of solar wind driving. We compare the rate of northward turning to that observed after random times with slightly elevated driving. The subsequent reversion to mean is essentially the same between random elevations and substorms. To further verify this, we consider in detail the distribution of changes from the statistical peak (20 min prior to onset) to onset. For Bz, the mean change after onset is +0.14 nT (i.e., IMF becomes more northward), but the standard deviation is σ = 2.8 nT. Thus large changes in either direction are common. For E_KL, the change is -15 nT km/s ± 830 nT km/s. Thus either a hypothesis predicting northward turnings or one predicting southward turnings would find abundant yet random confirming examples. Indeed, applying the Lyons et al. (1997) trigger criteria (excluding only the prior requirement of 22/30 min Bz < 0, which is often not valid for actual substorms) to these three sets of data shows that "northward turning triggers" occur in 23% of the random data, 24% of the actual substorms, and after 27% of the random elevations. These results strongly support the idea of Morley and Freeman (2007), that substorms require initial elevated solar wind driving, but that there is no evidence for external triggering. Finally dynamic pressure, p, and velocity, v, show no meaningful variation around onset (although p averages 10% above an 11-year mean).

  18. The Miniaturization of the AFIT Random Noise Radar

    DTIC Science & Technology

    2013-03-01

    Fragment recovered from the report's Introduction: "Recent advances in technology and signal processing techniques have opened the door to using an ultra-wide band random [noise radar] ..." Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio. Distribution Statement A: approved for public release; distribution unlimited.

  19. Smoothing the redshift distributions of random samples for the baryon acoustic oscillations: applications to the SDSS-III BOSS DR12 and QPM mock samples

    NASA Astrophysics Data System (ADS)

    Wang, Shao-Jiang; Guo, Qi; Cai, Rong-Gen

    2017-12-01

    We investigate the impact of different redshift distributions of random samples on the baryon acoustic oscillations (BAO) measurements of D_V(z) r_d^fid/r_d from the two-point correlation functions of galaxies in the Data Release 12 of the Baryon Oscillation Spectroscopic Survey (BOSS). Big surveys, such as BOSS, usually assign redshifts to the random samples by randomly drawing values from the measured redshift distributions of the data. If the cosmic variance cannot be ignored, this necessarily imprints fiducial fluctuation signals onto the random samples and weakens the BAO signal. We propose a smooth function of redshift distribution that fits the data well to populate the random galaxy samples. The resulting cosmological parameters match the input parameters of the mock catalogue very well. The significance of the BAO signals is improved by 0.33σ for a low-redshift sample and by 0.03σ for a constant-stellar-mass sample, though the absolute values do not change significantly. Given the precision of current measurements of cosmological parameters, such improvements will be valuable for future measurements of galaxy clustering.
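
    The core idea can be sketched as follows (a simplified stand-in for the paper's pipeline, using synthetic redshifts): fit a smooth curve to the measured n(z), then draw the random catalogue from the smooth fit by inverse-CDF sampling rather than resampling the noisy histogram itself.

```python
# Sketch: populate a random catalogue from a smoothed n(z).
import numpy as np

rng = np.random.default_rng(7)
z_gal = rng.normal(0.5, 0.1, 50_000)                 # stand-in survey redshifts

# histogram of the data and a smooth (here polynomial) fit to it
counts, edges = np.histogram(z_gal, bins=60)
centers = 0.5 * (edges[:-1] + edges[1:])
coeffs = np.polyfit(centers, counts, deg=5)
smooth = np.clip(np.polyval(coeffs, centers), 0, None)

# inverse-CDF sampling from the smooth n(z)
cdf = np.cumsum(smooth)
cdf = cdf / cdf[-1]
z_rand = np.interp(rng.uniform(0, 1, 500_000), cdf, centers)
print("random catalogue: %d redshifts, mean z = %.3f" % (z_rand.size, z_rand.mean()))
```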

  20. Scaling characteristics of one-dimensional fractional diffusion processes in the presence of power-law distributed random noise

    NASA Astrophysics Data System (ADS)

    Nezhadhaghighi, Mohsen Ghasemi

    2017-08-01

    Here, we present results of numerical simulations and the scaling characteristics of one-dimensional random fluctuations with heavy-tailed probability distribution functions. Assuming that the distribution function of the random fluctuations obeys Lévy statistics with a power-law scaling exponent, we investigate the fractional diffusion equation in the presence of μ-stable Lévy noise. We study the scaling properties of the global width and two-point correlation functions and then compare the analytical and numerical results for the growth exponent β and the roughness exponent α. We also investigate the fractional Fokker-Planck equation for heavy-tailed random fluctuations. We show that the fractional diffusion processes in the presence of μ-stable Lévy noise display special scaling properties in the probability distribution function (PDF). Finally, we numerically study the scaling properties of the heavy-tailed random fluctuations by using the diffusion entropy analysis. This method is based on the evaluation of the Shannon entropy of the PDF generated by the random fluctuations, rather than on the measurement of the global width of the process. We apply the diffusion entropy analysis to extract the growth exponent β and to confirm the validity of our numerical analysis.
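
    As a hedged illustration of the kind of noise the study builds on, the snippet below draws stable (Lévy) samples and checks the power-law tail; the parameters are mine, not the paper's.

```python
# Generate Lévy-stable noise and verify P(|x| > s) ~ s^(-alpha).
import numpy as np
from scipy.stats import levy_stable

alpha = 1.5                                   # stability index in (0, 2]; illustrative
noise = levy_stable.rvs(alpha, beta=0.0, size=100_000, random_state=8)

# tail * s^alpha should be roughly constant if the tail decays like s^-alpha
for s in (10.0, 20.0, 40.0):
    tail = np.mean(np.abs(noise) > s)
    print("s=%4.0f  tail=%.5f  tail*s^alpha=%.3f" % (s, tail, tail * s ** alpha))
```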

  1. Distribution of Schmidt-like eigenvalues for Gaussian ensembles of the random matrix theory

    NASA Astrophysics Data System (ADS)

    Pato, Mauricio P.; Oshanin, Gleb

    2013-03-01

    We study the probability distribution function P_n^{(β)}(w) of the Schmidt-like random variable w = x_1^2/((1/n)∑_{j=1}^n x_j^2), where x_j (j = 1, 2, …, n) are unordered eigenvalues of a given n × n β-Gaussian random matrix, β being the Dyson symmetry index. This variable, by definition, can be considered as a measure of how any individual (randomly chosen) eigenvalue deviates from the arithmetic mean value of all eigenvalues of a given random matrix, and its distribution is calculated with respect to the ensemble of such β-Gaussian random matrices. We show that in the asymptotic limit n → ∞ and for arbitrary β the distribution P_n^{(β)}(w) converges to the Marčenko-Pastur form, i.e. is defined as P_n^{(β)}(w) ∼ √((4 − w)/w) for w ∈ [0, 4] and equals zero outside of the support, despite the fact that formally w is defined on the interval [0, n]. Furthermore, for Gaussian unitary ensembles (β = 2) we present exact explicit expressions for P_n^{(β=2)}(w) which are valid for arbitrary n and analyse their behaviour.

  2. Statistical self-similarity of width function maxima with implications to floods

    USGS Publications Warehouse

    Veitzer, S.A.; Gupta, V.K.

    2001-01-01

    Recently a new theory of random self-similar river networks, called the RSN model, was introduced to explain empirical observations regarding the scaling properties of distributions of various topologic and geometric variables in natural basins. The RSN model predicts that such variables exhibit statistical simple scaling, when indexed by Horton-Strahler order. The average side tributary structure of RSN networks also exhibits Tokunaga-type self-similarity which is widely observed in nature. We examine the scaling structure of distributions of the maximum of the width function for RSNs for nested, complete Strahler basins by performing ensemble simulations. The maximum of the width function exhibits distributional simple scaling, when indexed by Horton-Strahler order, for both RSNs and natural river networks extracted from digital elevation models (DEMs). We also test a power-law relationship between Horton ratios for the maximum of the width function and drainage areas. These results represent first steps in formulating a comprehensive physical statistical theory of floods at multiple space-time scales for RSNs as discrete hierarchical branching structures. © 2001 Published by Elsevier Science Ltd.

  3. Smart darting diffusion Monte Carlo: Applications to lithium ion-Stockmayer clusters

    NASA Astrophysics Data System (ADS)

    Christensen, H. M.; Jake, L. C.; Curotto, E.

    2016-05-01

    In a recent investigation [K. Roberts et al., J. Chem. Phys. 136, 074104 (2012)], we have shown that, for a sufficiently complex potential, the Diffusion Monte Carlo (DMC) random walk can become quasiergodic, and we have introduced smart darting-like moves to improve the sampling. In this article, we systematically characterize the bias that smart darting moves introduce in the estimate of the ground state energy of a bosonic system. We then test a simple approach to eliminate completely such bias from the results. The approach is applied for the determination of the ground state of lithium ion-n-dipoles clusters in the n = 8-20 range. For these, the smart darting diffusion Monte Carlo simulations find the same ground state energy and mixed-distribution as the traditional approach for n < 14. In larger systems we find that while the ground state energies agree quantitatively with or without smart darting moves, the mixed-distributions can be significantly different. Some evidence is offered to conclude that introducing smart darting-like moves in traditional DMC simulations may produce a more reliable ground state mixed-distribution.

  4. A Multicenter, Rater-Blinded, Randomized Controlled Study of Auditory Processing-Focused Cognitive Remediation Combined With Open-Label Lurasidone in Patients With Schizophrenia and Schizoaffective Disorder.

    PubMed

    Kantrowitz, Joshua T; Sharif, Zafar; Medalia, Alice; Keefe, Richard S E; Harvey, Philip; Bruder, Gerard; Barch, Deanna M; Choo, Tse; Lee, Seonjoo; Lieberman, Jeffrey A

    2016-06-01

    Small-scale studies of auditory processing cognitive remediation programs have demonstrated efficacy in schizophrenia. We describe a multicenter, rater-blinded, randomized, controlled study of auditory-focused cognitive remediation, conducted from June 24, 2010, to June 14, 2013, and approved by the local institutional review board at all sites. Prior to randomization, participants with schizophrenia (DSM-IV-TR) were stabilized on a standardized antipsychotic regimen (lurasidone [40-160 mg/d]), followed by randomization to adjunctive cognitive remediation: auditory focused (Brain Fitness) versus control (nonspecific video games), administered 1-2 times weekly for 30 sessions. Coprimary outcome measures included MATRICS Consensus Cognitive Battery (MCCB) and the University of California, San Diego, Performance-Based Skills Assessment-Brief scale. 120 participants were randomized and completed at least 1 auditory-focused cognitive remediation (n = 56) or video game control session (n = 64). 74 participants completed ≥ 25 sessions and postrandomization assessments. At study completion, the change from prestabilization was statistically significant for MCCB composite score (d = 0.42, P < .0001) across groups. Participants randomized to auditory-focused cognitive remediation had a trend-level higher mean MCCB composite score compared to participants randomized to control cognitive remediation (P = .08). After controlling for scores at the time of randomization, there were no significant between-treatment group differences at study completion. Auditory processing cognitive remediation combined with lurasidone did not lead to differential improvement over nonspecific video games. Across-group improvement from prestabilization baseline to study completion was observed, but since all participants were receiving lurasidone open label, it is difficult to interpret the source of these effects. Future studies comparing both pharmacologic and behavioral cognitive enhancers should consider a 2 × 2 design, using a control for both the medication and the cognitive remediation. ClinicalTrials.gov identifier: NCT01173874. © Copyright 2016 Physicians Postgraduate Press, Inc.

  5. Deficiencies in the reporting of VD and t1/2 in the FDA approved chemotherapy drug inserts

    PubMed Central

    D’Souza, Malcolm J.; Alabed, Ghada J.

    2011-01-01

    Since its release in 2006, the US Food and Drug Administration (FDA) final improved format for prescription drug labeling has revamped the comprehensiveness of drug inserts, including those for chemotherapy drugs. The chemotherapy drug “packets”, retrieved via the FDA website and other accredited drug information reporting agencies such as the Physicians' Desk Reference (PDR), are practically the only available unbiased summary of information. One objective is to impartially evaluate the reporting of useful pharmacokinetic parameters, in particular, Volume of Distribution (VD) and elimination half-life (t1/2), in randomly selected FDA approved chemotherapy drug inserts. The web-accessible portable document format (PDF) files for 30 randomly selected chemotherapy drugs were subjected to a detailed search, and the two parameters of interest were tabulated. Knowledge of these two parameters is essential for directing patient care as well as for clinical research; since the completeness of the core FDA recommendations was found deficient, a detailed explanation of the impact of such deficiencies is provided. PMID:21643531

  6. A Multiscale Approach Indicates a Severe Reduction in Atlantic Forest Wetlands and Highlights that São Paulo Marsh Antwren Is on the Brink of Extinction

    PubMed Central

    Del-Rio, Glaucia; Rêgo, Marco Antonio; Silveira, Luís Fábio

    2015-01-01

    Over the last 200 years the wetlands of the Upper Tietê and Upper Paraíba do Sul basins, in the southeastern Atlantic Forest, Brazil, have been almost-completely transformed by urbanization, agriculture and mining. Endemic to these river basins, the São Paulo Marsh Antwren (Formicivora paludicola) survived these impacts, but remained unknown to science until its discovery in 2005. Its population status was cause for immediate concern. In order to understand the factors imperiling the species, and provide guidelines for its conservation, we investigated both the species’ distribution and the distribution of areas of suitable habitat using a multiscale approach encompassing species distribution modeling, fieldwork surveys and occupancy models. Of the six species distribution modeling methods used (Generalized Linear Models, Generalized Additive Models, Multivariate Adaptive Regression Splines, Classification Tree Analysis, Artificial Neural Networks and Random Forest), Random Forest showed the best fit and was utilized to guide field validation. After surveying 59 sites, our results indicated that Formicivora paludicola occurred in only 13 sites, having narrow habitat specificity, and restricted habitat availability. Additionally, historic maps, distribution models and satellite imagery showed that human occupation has resulted in a loss of more than 346 km² of suitable habitat for this species since the early twentieth century, so that it now only occupies a severely fragmented area (area of occupancy) of 1.42 km², and it should be considered Critically Endangered according to IUCN criteria. Furthermore, averaged occupancy models showed that marshes with lower cattail (Typha dominguensis) densities have higher probabilities of being occupied. Thus, these areas should be prioritized in future conservation efforts to protect the species, and to restore a portion of Atlantic Forest wetlands, in times of unprecedented regional water supply problems. PMID:25798608

  7. Programmable quantum random number generator without postprocessing.

    PubMed

    Nguyen, Lac; Rehain, Patrick; Sua, Yong Meng; Huang, Yu-Ping

    2018-02-15

    We demonstrate a viable source of unbiased quantum random numbers whose statistical properties can be arbitrarily programmed without the need for any postprocessing such as randomness distillation or distribution transformation. It is based on measuring the arrival time of single photons in shaped temporal modes that are tailored with an electro-optical modulator. We show that quantum random numbers can be created directly in customized probability distributions and pass all randomness tests of the NIST and Dieharder test suites without any randomness extraction. The min-entropies of such generated random numbers are measured close to the theoretical limits, indicating their near-ideal statistics and ultrahigh purity. Easy to implement and arbitrarily programmable, this technique can find versatile uses in a multitude of data analysis areas.

  8. The Shark Random Swim - (Lévy Flight with Memory)

    NASA Astrophysics Data System (ADS)

    Businger, Silvia

    2018-05-01

    The Elephant Random Walk (ERW), first introduced by Schütz and Trimper (Phys Rev E 70:045101, 2004), is a one-dimensional simple random walk on Z having a memory about the whole past. We study the Shark Random Swim, a random walk with memory about the whole past, whose steps are α-stable distributed with α ∈ (0,2]. Our aim in this work is to study the impact of the heavy tailed step distributions on the asymptotic behavior of the random walk. We shall see that, as for the ERW, the asymptotic behavior of the Shark Random Swim depends on its memory parameter p, and that a phase transition can be observed at the critical value p=1/α.

  9. Isolation and Connectivity in Random Geometric Graphs with Self-similar Intensity Measures

    NASA Astrophysics Data System (ADS)

    Dettmann, Carl P.

    2018-05-01

    Random geometric graphs consist of randomly distributed nodes (points), with pairs of nodes within a given mutual distance linked. In the usual model the distribution of nodes is uniform on a square, and in the limit of infinitely many nodes and shrinking linking range, the number of isolated nodes is Poisson distributed, and the probability of no isolated nodes is equal to the probability the whole graph is connected. Here we examine these properties for several self-similar node distributions, including smooth and fractal, uniform and nonuniform, and finitely ramified or otherwise. We show that nonuniformity can break the Poisson distribution property, but it strengthens the link between isolation and connectivity. It also stretches out the connectivity transition. Finite ramification is another mechanism for lack of connectivity. The same considerations apply to fractal distributions as smooth, with some technical differences in evaluation of the integrals and analytical arguments.
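
    A quick numerical check of the uniform baseline described above (my parameters): count isolated nodes in a random geometric graph on the unit square over many realizations; for a Poisson-distributed count, the mean and variance should roughly agree.

```python
# Isolated-node statistics of a random geometric graph (uniform case).
import numpy as np

rng = np.random.default_rng(9)
n, r = 1000, 0.04                      # number of nodes and linking range

def isolated_count():
    pts = rng.uniform(size=(n, 2))
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)       # ignore self-distances
    return int((d2.min(axis=1) > r * r).sum())

counts = np.array([isolated_count() for _ in range(100)])
print("mean isolated nodes %.2f, variance %.2f (Poisson: roughly equal)"
      % (counts.mean(), counts.var()))
```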

  10. Optimal random search for a single hidden target.

    PubMed

    Snider, Joseph

    2011-01-01

    A single target is hidden at a location chosen from a predetermined probability distribution. Then, a searcher must find a second probability distribution from which random search points are sampled such that the target is found in the minimum number of trials. Here it will be shown that if the searcher must get very close to the target to find it, then the best search distribution is proportional to the square root of the target distribution regardless of dimension. For a Gaussian target distribution, the optimum search distribution is approximately a Gaussian with a standard deviation that varies inversely with how close the searcher must be to the target to find it. For a network where the searcher randomly samples nodes and looks for the fixed target along edges, the optimum is either to sample a node with probability proportional to the square root of the out-degree plus 1 or not to do so at all.
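
    A discrete worked version of the square-root rule (my construction, not the paper's continuum derivation): if the target sits at location i with probability p_i and each independent trial searches a location drawn from q, the expected number of trials is Σ_i p_i/q_i, which is minimized by q_i ∝ √p_i.

```python
# Expected trials under q = p versus the square-root rule q ~ sqrt(p).
import numpy as np

p = np.array([0.5, 0.25, 0.125, 0.125])      # hypothetical target distribution

def expected_trials(q):
    return float((p / q).sum())

q_sqrt = np.sqrt(p) / np.sqrt(p).sum()       # square-root search distribution
print("search with q = p      :", expected_trials(p))       # 4.0
print("search with q ~ sqrt(p):", expected_trials(q_sqrt))  # ~3.66, strictly better
```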

  11. [Comparison of different methods in dealing with HIV viral load data with diversified missing value mechanism on HIV positive MSM].

    PubMed

    Jiang, Z; Dou, Z; Song, W L; Xu, J; Wu, Z Y

    2017-11-10

    Objective: To compare the results of different methods of handling HIV viral load (VL) data with different missing value mechanisms. Methods: We used SPSS 17.0 to simulate complete and missing data, under different missing value mechanisms, from HIV viral load data collected from MSM in 16 cities in China in 2013. Maximum likelihood estimation using the expectation-maximization (EM) algorithm, a regression method, mean imputation, deletion, and Markov Chain Monte Carlo (MCMC) were used to fill in the missing data, and the results of the different methods were compared in terms of distribution characteristics, accuracy, and precision. Results: The HIV VL data could not be transformed into a normal distribution. All methods performed well for data missing completely at random (MCAR). For the other types of missing data, the regression and MCMC methods preserved the main characteristics of the original data. The means of the imputed databases from all methods were close to the original one. EM, the regression method, mean imputation, and deletion underestimated VL, while MCMC overestimated it. Conclusion: MCMC can be used as the main imputation method for missing HIV viral load data. The imputed data can serve as a reference for estimating the mean HIV VL in the investigated population.

  12. A strategy for rapid production and screening of yeast artificial chromosome libraries.

    PubMed

    Strauss, W M; Jaenisch, E; Jaenisch, R

    1992-01-01

    We describe methods for rapid production and screening of yeast artificial chromosome (YAC) libraries. Utilizing complete restriction digests of mouse genomic DNA for ligations in agarose, a 32,000-clone library was produced and screened in seven weeks. Screening was accomplished by subdividing primary transformation plates into pools of approximately 100 clones which were transferred into a master glycerol stock. These master stocks were used to inoculate liquid cultures to produce culture "pools," and ten pools of 100 clones were then combined to yield superpools of 1,000 clones. Both pool and superpool DNA was screened by polymerase chain reaction (PCR) and positive pools representing 100 clones were then plated on selective medium and screened by in situ hybridization. Screening by the two tiered PCR assay and by in situ hybridization was completed in 4-5 days. Utilizing this methodology we have isolated a 150 kb clone spanning the alpha 1(I) collagen (Col1a1) gene as well as 40 kb clones from the Hox-2 locus. To characterize the representation of the YAC library, the size distribution of genomic Sal I fragments was compared to that of clones picked at random from the library. The results demonstrate significant biasing of the cloned fragment distribution, resulting in a loss of representation for larger fragments.

  13. Dynamic Uncertain Causality Graph for Knowledge Representation and Probabilistic Reasoning: Directed Cyclic Graph and Joint Probability Distribution.

    PubMed

    Zhang, Qin

    2015-07-01

    Probabilistic graphical models (PGMs) such as Bayesian network (BN) have been widely applied in uncertain causality representation and probabilistic reasoning. Dynamic uncertain causality graph (DUCG) is a newly presented model of PGMs, which can be applied to fault diagnosis of large and complex industrial systems, disease diagnosis, and so on. The basic methodology of DUCG has been previously presented, in which only the directed acyclic graph (DAG) was addressed. However, the mathematical meaning of DUCG was not discussed. In this paper, the DUCG with directed cyclic graphs (DCGs) is addressed. In contrast, BN does not allow DCGs, as otherwise the conditional independence will not be satisfied. The inference algorithm for the DUCG with DCGs is presented, which not only extends the capabilities of DUCG from DAGs to DCGs but also enables users to decompose a large and complex DUCG into a set of small, simple sub-DUCGs, so that a large and complex knowledge base can be easily constructed, understood, and maintained. The basic mathematical definition of a complete DUCG with or without DCGs is proved to be a joint probability distribution (JPD) over a set of random variables. The incomplete DUCG as a part of a complete DUCG may represent a part of JPD. Examples are provided to illustrate the methodology.

  14. [Formula: see text]-convergence, complete convergence, and weak laws of large numbers for asymptotically negatively associated random vectors with values in [Formula: see text].

    PubMed

    Ko, Mi-Hwa

    2018-01-01

    In this paper, based on the Rosenthal-type inequality for asymptotically negatively associated random vectors with values in [Formula: see text], we establish results on [Formula: see text]-convergence and complete convergence of the maximums of partial sums. We also obtain weak laws of large numbers for coordinatewise asymptotically negatively associated random vectors with values in [Formula: see text].

  15. Tuning Monotonic Basin Hopping: Improving the Efficiency of Stochastic Search as Applied to Low-Thrust Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    Englander, Jacob A.; Englander, Arnold C.

    2014-01-01

    Trajectory optimization methods using monotonic basin hopping (MBH) have become well developed during the past decade [1, 2, 3, 4, 5, 6]. An essential component of MBH is a controlled random search through the multi-dimensional space of possible solutions. Historically, the randomness has been generated by drawing random variables (RVs) from a uniform probability distribution. Here, we investigate generating the randomness by drawing the RVs from Cauchy and Pareto distributions, chosen because of their characteristic long tails. We demonstrate that using Cauchy distributions (as first suggested by J. Englander [3, 6]) significantly improves MBH performance, and that Pareto distributions provide even greater improvements. Improved performance is defined in terms of efficiency and robustness. Efficiency is finding better solutions in less time. Robustness is efficiency that is undiminished by (a) the boundary conditions and internal constraints of the optimization problem being solved, and (b) variations in the parameters of the probability distribution. Robustness is important for achieving performance improvements that are not problem specific. In this work we show that the performance improvements are the result of how these long-tailed distributions enable MBH to search the solution space faster and more thoroughly. In developing this explanation, we use the concepts of sub-diffusive, normally-diffusive, and super-diffusive random walks (RWs) originally developed in the field of statistical physics.
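
    A minimal MBH loop with Cauchy-distributed hops (an illustrative sketch on a stand-in multimodal objective, not the flight-trajectory software): perturb the incumbent with long-tailed noise, locally optimize, and accept only improvements.

```python
# Monotonic basin hopping with heavy-tailed (Cauchy) perturbations.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(10)

def objective(x):                      # hypothetical multimodal test function
    return np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10)

best = minimize(objective, rng.uniform(-5, 5, 4)).x
for _ in range(300):
    trial = best + rng.standard_cauchy(best.size)   # long-tailed hop
    local = minimize(objective, trial)              # local descent
    if local.fun < objective(best):                 # monotonic acceptance
        best = local.x
print("best value %.4f at %s" % (objective(best), np.round(best, 3)))
```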

  16. Contextuality in canonical systems of random variables

    NASA Astrophysics Data System (ADS)

    Dzhafarov, Ehtibar N.; Cervantes, Víctor H.; Kujala, Janne V.

    2017-10-01

    Random variables representing measurements, broadly understood to include any responses to any inputs, form a system in which each of them is uniquely identified by its content (that which it measures) and its context (the conditions under which it is recorded). Two random variables are jointly distributed if and only if they share a context. In a canonical representation of a system, all random variables are binary, and every content-sharing pair of random variables has a unique maximal coupling (the joint distribution imposed on them so that they coincide with maximal possible probability). The system is contextual if these maximal couplings are incompatible with the joint distributions of the context-sharing random variables. We propose to represent any system of measurements in a canonical form and to consider the system contextual if and only if its canonical representation is contextual. As an illustration, we establish a criterion for contextuality of the canonical system consisting of all dichotomizations of a single pair of content-sharing categorical random variables. This article is part of the themed issue `Second quantum revolution: foundational questions'.

  17. The triglyceride composition of 17 seed fats rich in octanoic, decanoic, or lauric acid.

    PubMed

    Litchfield, C; Miller, E; Harlow, R D; Reiser, R

    1967-07-01

    Seed fats of eight species of Lauraceae (laurel family), six species of Cuphea (Lythraceae family), and three species of Ulmaceae (elm family) were extracted, and the triglycerides were isolated by preparative thin-layer chromatography. GLC of the triglycerides on a silicone column resolved 10 to 18 peaks with a 22 to 58 carbon number range for each fat. These carbon number distributions yielded considerable information about the triglyceride compositions of the fats. The most interesting finding was with Laurus nobilis seed fat, which contained 58.4% lauric acid and 29.2-29.8% trilaurin. A maximum of 19.9% trilaurin would be predicted by a 1,2,3-random, a 1,3-random-2-random, or a 1-random-2-random-3-random distribution of the lauric acid (3). This indicates a specificity for the biosynthesis of a simple triglyceride by Laurus nobilis seed enzymes. Cuphea lanceolata seed fat also contained more simple triglyceride (tridecanoin) than would be predicted by the fatty acid distribution theories.
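
    The 19.9% ceiling follows directly from the random-distribution assumption: the chance that all three glycerol positions carry lauric acid is the cube of its overall fraction,

$$P(\text{trilaurin}) = 0.584^3 \approx 0.199 = 19.9\%,$$

    so the observed 29.2-29.8% trilaurin exceeds what any of the random models can produce.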

  18. Probability distribution for the Gaussian curvature of the zero level surface of a random function

    NASA Astrophysics Data System (ADS)

    Hannay, J. H.

    2018-04-01

    A rather natural construction for a smooth random surface in space is the level surface of value zero, or ‘nodal’ surface f(x,y,z) = 0, of a (real) random function f; the interface between positive and negative regions of the function. A physically significant local attribute at a point of a curved surface is its Gaussian curvature (the product of its principal curvatures) because, when integrated over the surface, it gives the Euler characteristic. Here the probability distribution for the Gaussian curvature at a random point on the nodal surface f = 0 is calculated for a statistically homogeneous (‘stationary’) and isotropic zero mean Gaussian random function f. Capitalizing on the isotropy, a ‘fixer’ device for axes supplies the probability distribution directly as a multiple integral. Its evaluation yields an explicit algebraic function with a simple average. Indeed, this average Gaussian curvature has long been known. For a non-zero level surface instead of the nodal one, the probability distribution is not fully tractable, but is supplied as an integral expression.

  1. Gossip and Distributed Kalman Filtering: Weak Consensus Under Weak Detectability

    NASA Astrophysics Data System (ADS)

    Kar, Soummya; Moura, José M. F.

    2011-04-01

    The paper presents the gossip interactive Kalman filter (GIKF) for distributed Kalman filtering for networked systems and sensor networks, where inter-sensor communication and observations occur at the same time-scale. The communication among sensors is random; each sensor occasionally exchanges its filtering state information with a neighbor depending on the availability of the appropriate network link. We show that under a weak distributed detectability condition: 1. the GIKF error process remains stochastically bounded, irrespective of the instability properties of the random process dynamics; and 2. the network achieves weak consensus, i.e., the conditional estimation error covariance at a (uniformly) randomly selected sensor converges in distribution to a unique invariant measure on the space of positive semi-definite matrices (independent of the initial state). To prove these results, we interpret the filtered states (estimates and error covariances) at each node in the GIKF as stochastic particles with local interactions. We analyze the asymptotic properties of the error process by studying as a random dynamical system the associated switched (random) Riccati equation, the switching being dictated by a non-stationary Markov chain on the network graph.

  2. Efficient sampling of complex network with modified random walk strategies

    NASA Astrophysics Data System (ADS)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies: the choosing seed node (CSN) random walk and the no-retracing (NR) random walk. Different from classical random walk sampling, the CSN and NR strategies focus on the influences of the seed node choice and of path overlap, respectively. The three random walk samplings are applied to the Erdős-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks. Then, the major properties of the sampled subnets, such as sampling efficiency, degree distributions, average degree and average clustering coefficient, are studied. Similar conclusions can be reached with all three random walk strategies. Firstly, networks with small scales and simple structures are conducive to the sampling. Secondly, the average degree and the average clustering coefficient of the sampled subnet tend to the corresponding values of the original networks within limited steps. And thirdly, all the degree distributions of the subnets are slightly biased to the high degree side. However, the NR strategy performs better for the average clustering coefficient of the subnet. In the real weighted USAir network, some obvious characteristics, such as the larger clustering coefficient and the fluctuations of the degree distribution, are reproduced well by these random walk strategies.
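
    A sketch of the no-retracing idea as described above (details assumed, not the authors' implementation): the walker never immediately re-traverses the edge it just used, which reduces path overlap.

```python
# No-retracing random walk on an adjacency-list graph.
import random

def nr_random_walk(adj, start, steps, seed=0):
    """adj: dict node -> list of neighbours. Returns the visited node list."""
    rng = random.Random(seed)
    walk, prev, node = [start], None, start
    for _ in range(steps):
        # exclude the previous node; fall back to it only at a dead end
        choices = [v for v in adj[node] if v != prev] or adj[node]
        prev, node = node, rng.choice(choices)
        walk.append(node)
    return walk

# toy graph: a 4-cycle with one chord
adj = {0: [1, 3], 1: [0, 2, 3], 2: [1, 3], 3: [0, 1, 2]}
print(nr_random_walk(adj, start=0, steps=10))
```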

  3. Human mammary epithelial cells exhibit a bimodal correlated random walk pattern.

    PubMed

    Potdar, Alka A; Jeon, Junhwan; Weaver, Alissa M; Quaranta, Vito; Cummings, Peter T

    2010-03-10

    Organisms, at scales ranging from unicellular to mammals, have been known to exhibit foraging behavior described by random walks whose segments conform to Lévy or exponential distributions. For the first time, we present evidence that single cells (mammary epithelial cells) that exist in multi-cellular organisms (humans) follow a bimodal correlated random walk (BCRW). Cellular tracks of MCF-10A pBabe, neuN and neuT random migration on 2-D plastic substrates, analyzed using bimodal analysis, were found to reveal the BCRW pattern. We find two types of exponentially distributed correlated flights (corresponding to what we refer to as the directional and re-orientation phases), each having its own correlation between move step-lengths within flights. The exponential distribution of flight lengths was confirmed using different analysis methods (logarithmic binning with normalization, survival frequency plots and maximum likelihood estimation). Because of the non-uniform turn-angle distribution of move step-lengths within a flight and the presence of two different types of flights, we propose that the epithelial random walk is a BCRW comprising two alternating modes with varying degrees of correlation, rather than a simple persistent random walk. A BCRW model, rather than a simple persistent random walk, correctly matches the super-diffusivity in the cell migration paths, as indicated by simulations based on the BCRW model.
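
    A toy generator for a walk of this kind, with exponentially distributed move lengths and mode-specific heading persistence; the two parameter triples below are illustrative assumptions, not the fitted cell values:

      import numpy as np

      rng = np.random.default_rng(2)

      def bcrw(n_steps):
          pos = np.zeros((n_steps + 1, 2))
          theta, mode = 0.0, 0
          # (mean step length, turn-angle std, switch probability) per mode
          params = [(5.0, 0.2, 0.05),   # directional phase: long, straight moves
                    (1.0, 1.5, 0.30)]   # re-orientation phase: short, wiggly moves
          for k in range(n_steps):
              step, ang_sd, p_switch = params[mode]
              theta += rng.normal(0.0, ang_sd)          # correlated heading
              r = rng.exponential(step)                 # exponential move length
              pos[k + 1] = pos[k] + r * np.array([np.cos(theta), np.sin(theta)])
              if rng.random() < p_switch:               # alternate between modes
                  mode = 1 - mode
          return pos

      track = bcrw(5000)
      disp2 = np.sum((track - track[0]) ** 2, axis=1)
      print("MSD growth exponent ~",
            np.polyfit(np.log(np.arange(1, 5001)), np.log(disp2[1:]), 1)[0])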

  4. Probabilistic SSME blades structural response under random pulse loading

    NASA Technical Reports Server (NTRS)

    Shiao, Michael; Rubinstein, Robert; Nagpal, Vinod K.

    1987-01-01

    The purpose is to develop models of random impacts on a Space Shuttle Main Engine (SSME) turbopump blade and to predict the probabilistic structural response of the blade to these impacts. The random loading is caused by the impact of debris. The probabilistic structural response is characterized by distribution functions for stress and displacements as functions of the loading parameters which determine the random pulse model. These parameters include pulse arrival, amplitude, and location. The analysis can be extended to predict level crossing rates. This requires knowledge of the joint distribution of the response and its derivative. The model of random impacts chosen allows the pulse arrivals, pulse amplitudes, and pulse locations to be random. Specifically, the pulse arrivals are assumed to be governed by a Poisson process, which is characterized by a mean arrival rate. The pulse intensity is modelled as a normally distributed random variable with a zero mean chosen independently at each arrival. The standard deviation of the distribution is a measure of pulse intensity. Several different models were used for the pulse locations. For example, three points near the blade tip were chosen at which pulses were allowed to arrive with equal probability. Again, the locations were chosen independently at each arrival. The structural response was analyzed both by direct Monte Carlo simulation and by a semi-analytical method.
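
    The pulse model lends itself to a compact shot-noise simulation. The sketch below estimates the distribution of the peak response by direct Monte Carlo; the decay time, influence gains, and rates are assumed stand-ins, not the blade's actual structural response:

      import numpy as np

      rng = np.random.default_rng(0)
      rate, sigma = 5.0, 2.0            # mean arrival rate [1/s], amplitude std
      T, n_paths = 10.0, 2000           # horizon and Monte Carlo sample size
      tau = 0.3                         # assumed decay time of the impulse response
      gain = np.array([1.0, 0.8, 0.6])  # assumed influence coefficient per location

      def peak_response():
          n = rng.poisson(rate * T)                    # number of impacts
          t = rng.uniform(0, T, n)                     # Poisson arrival times
          amp = rng.normal(0, sigma, n)                # zero-mean Gaussian amplitudes
          loc = rng.integers(0, 3, n)                  # three equally likely locations
          grid = np.linspace(0, T, 500)
          # superpose exponentially decaying impulse responses
          resp = sum(gain[l] * a * np.exp(-(grid - ti) / tau) * (grid >= ti)
                     for ti, a, l in zip(t, amp, loc))
          return np.max(np.abs(resp)) if n else 0.0

      peaks = np.array([peak_response() for _ in range(n_paths)])
      print("P(peak response > 3 sigma) ~", np.mean(peaks > 3 * sigma))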

  5. New Quantum Key Distribution Scheme Based on Random Hybrid Quantum Channel with EPR Pairs and GHZ States

    NASA Astrophysics Data System (ADS)

    Yan, Xing-Yu; Gong, Li-Hua; Chen, Hua-Ying; Zhou, Nan-Run

    2018-05-01

    A theoretical quantum key distribution scheme based on a random hybrid quantum channel with EPR pairs and GHZ states is devised. In this scheme, EPR pairs and tripartite GHZ states are exploited to set up the random hybrid quantum channel. Only one photon in each entangled state needs to travel back and forth in the channel. The security of the quantum key distribution scheme is guaranteed by more than one round of eavesdropping-check procedures. The scheme is of high capacity, since one particle can carry more than two bits of information via quantum dense coding.

  6. Atom probe study of vanadium interphase precipitates and randomly distributed vanadium precipitates in ferrite.

    PubMed

    Nöhrer, M; Zamberger, S; Primig, S; Leitner, H

    2013-01-01

    Atom probe tomography and transmission electron microscopy were used to examine the precipitation reaction in the austenite and ferrite phases in vanadium micro-alloyed steel after a thermo-mechanical process. It was observed that precipitates could be found only in the ferrite phase, where two different types were detected. Thus, the aim was to reveal the difference between these two types. The first type consisted of randomly distributed precipitates formed from V-supersaturated ferrite, the second of V interphase precipitates. Not only the arrangement of the particles differed, but also their chemical composition. The randomly distributed precipitates consisted of V, C and N; in contrast, the interphase precipitates showed a composition of V, C and Mn. Furthermore, the randomly distributed precipitates had a maximum size of 20 nm and the interphase precipitates a maximum size of 15 nm. It was assumed that these differences are caused by the site in which the precipitates were formed. The randomly distributed precipitates were formed in a matrix consisting mainly of 0.05 at% C, 0.68 at% Si, 0.03 at% N, 0.145 at% V and 1.51 at% Mn. The interphase precipitates were formed in a region with a much higher C, Mn and V content. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Distributed Synchronization in Networks of Agent Systems With Nonlinearities and Random Switchings.

    PubMed

    Tang, Yang; Gao, Huijun; Zou, Wei; Kurths, Jürgen

    2013-02-01

    In this paper, the distributed synchronization problem of networks of agent systems with controllers and nonlinearities subject to Bernoulli switchings is investigated. Controllers and adaptive updating laws injected in each vertex of networks depend on the state information of its neighborhood. Three sets of Bernoulli stochastic variables are introduced to describe the occurrence probabilities of distributed adaptive controllers, updating laws and nonlinearities, respectively. By the Lyapunov functions method, we show that the distributed synchronization of networks composed of agent systems with multiple randomly occurring nonlinearities, multiple randomly occurring controllers, and multiple randomly occurring updating laws can be achieved in mean square under certain criteria. The conditions derived in this paper can be solved by semi-definite programming. Moreover, by mathematical analysis, we find that the coupling strength, the probabilities of the Bernoulli stochastic variables, and the form of nonlinearities have great impacts on the convergence speed and the terminal control strength. The synchronization criteria and the observed phenomena are demonstrated by several numerical simulation examples. In addition, the advantage of distributed adaptive controllers over conventional adaptive controllers is illustrated.

  8. Comparative randomized pilot study of azithromycin and doxycycline efficacy and tolerability in the treatment of prostate infection caused by Ureaplasma urealyticum.

    PubMed

    Skerk, Visnja; Mareković, Ivana; Markovinović, Leo; Begovac, Josip; Skerk, Vedrana; Barsić, Neven; Majdak-Gluhinić, Vida

    2006-01-01

    A total of 1,442 patients with symptoms of chronic prostatitis were examined over a 4-year period at the Outpatient Department for Urogenital Infections, University Hospital for Infectious Diseases Dr. Fran Mihaljević, Zagreb, Croatia. The inclusion criteria for chronic prostatitis caused by Ureaplasma urealyticum were the presence of clinical symptoms, presence of U. urealyticum in expressed prostatic secretion (EPS) or voided urine collected immediately after prostatic massage (VB(3)), absence of U. urealyticum in urethral swabs and absence of other possible pathogens of chronic prostatitis in EPS or VB(3). A total of 63 patients with prostate infection caused by U. urealyticum were available for this pilot study. The patients were randomized according to a computer randomization list to receive a total dose of 4.5 g of azithromycin given as a 3-day therapy of 1 x 500 mg weekly for 3 weeks or doxycyline 100 mg b.i.d. for 21 days. Patients' sexual partners were treated at the same time. Clinical efficacy and tolerability of the administered drug as well as possible adverse events were evaluated during, at the end and 4-6 weeks after completion of therapy. Bacteriological efficacy was evaluated 4-6 weeks after completion of therapy. Treatment groups did not differ regarding age, distribution of urethral, prostatic, sexual and other symptoms, or digitorectal prostatic examination. Five patients treated with doxycycline had nausea. In the group of patients with prostate infection caused by U. urealyticum, the eradication rate was not significantly different with regard to the administered azithromycin (25/32) or doxycycline (23/31). Clinical cure did not significantly differ with regard to the administered azithromycin (22/32) or doxycycline (21/31). Copyright 2006 S. Karger AG, Basel.

  9. Nonequilibrium Precondensation of Classical Waves in Two Dimensions Propagating through Atomic Vapors

    NASA Astrophysics Data System (ADS)

    Šantić, Neven; Fusaro, Adrien; Salem, Sabeur; Garnier, Josselin; Picozzi, Antonio; Kaiser, Robin

    2018-02-01

    The nonlinear Schrödinger equation, used to describe the dynamics of quantum fluids, is known to be valid not only for massive particles but also for the propagation of light in a nonlinear medium, predicting condensation of classical waves. Here we report on the initial evolution of random waves with Gaussian statistics, using atomic vapors as an efficient two-dimensional nonlinear medium. Experimental and theoretical analysis of near-field images reveals a phenomenon of nonequilibrium precondensation, characterized by a fast relaxation towards a precondensate fraction of up to 75%. Such precondensation is in contrast to complete thermalization to the Rayleigh-Jeans equilibrium distribution, which would require prohibitively long interaction lengths.

  10. Utilization of a patient-centered asthma passport tool in a subspecialty clinic

    PubMed Central

    Greenberg, Jonathan; Prushinskaya, Olga; Harris, Joshua D.; Guidetti-Myers, Gillian; Steiding, Jacqueline; Sawicki, Gregory S.; Gaffin, Jonathan M.

    2018-01-01

    Introduction Although effective tools for asthma self-assessment (Asthma Control Test, ACT) and self-management (Asthma Action Plan, AAP) are available, they are underutilized in outpatient specialty clinics. We evaluated the impact of a patient-centered checklist, the Asthma Passport, on improving ACT and AAP utilization in clinic. Methods This was a randomized, interventional quality-improvement project in which the Asthma Passport was distributed to 120 pediatric asthma patients over the duration of 16 weeks. The passport’s checklist consisted of tasks to be completed by the patient/family, including completion of the ACT and AAP. We compared rates of completion of the ACT and AAP for those who received the passport versus the control group, and assessed patient/caregiver and provider satisfaction. Results Based on electronic medical record data from 222 participants, the ACT completion rate was not significantly different between the passport and control groups; however, the AAP completion rate was significantly greater than control (30.0% vs. 17.7%, p = 0.04). When per-protocol analysis was limited to groups who completed and returned their passports, ACT and AAP completion rates were significantly greater than control (73.8% vs. 44.1% (p = 0.002) and 35.7% vs. 17.7% (p = 0.04), respectively). Nearly all participants reported high satisfaction with care, and surveyed providers viewed the passport favorably. Conclusions A patient-centered checklist significantly improved the completion rate of the AAP. For patients who completed and returned the asthma passport, the ACT completion rate was also improved. Participants and providers reported high satisfaction with the checklist, suggesting that it can effectively promote asthma self-management and self-assessment without burdening clinicians or clinic workflow. PMID:28548904

  11. Utilization of a patient-centered asthma passport tool in a subspecialty clinic.

    PubMed

    Greenberg, Jonathan; Prushinskaya, Olga; Harris, Joshua D; Guidetti-Myers, Gillian; Steiding, Jacqueline; Sawicki, Gregory S; Gaffin, Jonathan M

    2018-02-01

    Although effective tools for asthma self-assessment (Asthma Control Test, ACT) and self-management (Asthma Action Plan, AAP) are available, they are underutilized in outpatient specialty clinics. We evaluated the impact of a patient-centered checklist, the Asthma Passport, on improving ACT and AAP utilization in clinic. This was a randomized, interventional quality-improvement project in which the Asthma Passport was distributed to 120 pediatric asthma patients over the duration of 16 weeks. The passport's checklist consisted of tasks to be completed by the patient/family, including completion of the ACT and AAP. We compared rates of completion of the ACT and AAP for those who received the passport versus the control group, and assessed patient/caregiver and provider satisfaction. Based on electronic medical record data from 222 participants, the ACT completion rate was not significantly different between the passport and control groups; however, the AAP completion rate was significantly greater than control (30.0% vs. 17.7%, p = 0.04). When per-protocol analysis was limited to groups who completed and returned their passports, ACT and AAP completion rates were significantly greater than control (73.8% vs. 44.1% (p = 0.002) and 35.7% vs. 17.7% (p = 0.04), respectively). Nearly all participants reported high satisfaction with care, and surveyed providers viewed the passport favorably. A patient-centered checklist significantly improved the completion rate of the AAP. For patients who completed and returned the asthma passport, the ACT completion rate was also improved. Participants and providers reported high satisfaction with the checklist, suggesting that it can effectively promote asthma self-management and self-assessment without burdening clinicians or clinic workflow.

  12. Theoretical size distribution of fossil taxa: analysis of a null model.

    PubMed

    Reed, William J; Hughes, Barry D

    2007-03-22

    This article deals with the theoretical size distribution (of number of sub-taxa) of a fossil taxon arising from a simple null model of macroevolution. New species arise through speciations occurring independently and at random at a fixed probability rate, while extinctions either occur independently and at random (background extinctions) or cataclysmically. In addition new genera are assumed to arise through speciations of a very radical nature, again assumed to occur independently and at random at a fixed probability rate. The size distributions of the pioneering genus (following a cataclysm) and of derived genera are determined. Also the distribution of the number of genera is considered along with a comparison of the probability of a monospecific genus with that of a monogeneric family.

  13. Two Models of Time Constrained Target Travel between Two Endpoints Constructed by the Application of Brownian Motion and a Random Tour.

    DTIC Science & Technology

    1983-03-01

    the Naval Postgraduate School. As advisor, Prof. Gaver suggested and derived the Brownian bridge, as well as nudged the work in the right direction when needed … the random tour process, by deriving the mean square radial distance E[R²] for a random tour with arbitrary course change distribution … for an isotropic course change distribution the standard random tour model results, as expected. The notion of an arbitrary course change distribution is important because the …

  14. Anonymous authenticated communications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaver, Cheryl L; Schroeppel, Richard C; Snyder, Lillian A

    2007-06-19

    A method of performing electronic communications between members of a group wherein the communications are authenticated as being from a member of the group and have not been altered, comprising: generating a plurality of random numbers; distributing in a digital medium the plurality of random numbers to the members of the group; publishing a hash value of contents of the digital medium; distributing to the members of the group public-key-encrypted messages each containing a same token comprising a random number; and encrypting a message with a key generated from the token and the plurality of random numbers.
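
    The claimed steps map naturally onto standard library primitives. A hedged sketch of the flow (the SHA-256 key derivation and HMAC authentication are illustrative assumptions, and the public-key encryption used to distribute the token is omitted):

      import hashlib, hmac, secrets

      group_randoms = [secrets.token_bytes(32) for _ in range(16)]  # plurality of randoms
      medium = b"".join(group_randoms)               # the distributed digital medium
      published_hash = hashlib.sha256(medium).hexdigest()   # published hash of contents
      token = secrets.token_bytes(16)  # same token sent (encrypted) to all members
      # message key generated from the token and the distributed random numbers
      key = hashlib.sha256(token + medium).digest()
      message = b"meeting moved to 0900"
      tag = hmac.new(key, message, hashlib.sha256).hexdigest()
      print(published_hash[:16], tag[:16])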

  15. Return probabilities and hitting times of random walks on sparse Erdös-Rényi graphs.

    PubMed

    Martin, O C; Sulc, P

    2010-03-01

    We consider random walks on random graphs, focusing on return probabilities and hitting times for sparse Erdös-Rényi graphs. Using the tree approach, which is expected to be exact in the large graph limit, we show how to solve for the distribution of these quantities and we find that these distributions exhibit a form of self-similarity.
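
    A simulation counterpart to the analytic treatment (graph parameters and the censoring horizon are assumptions):

      import networkx as nx
      import numpy as np

      rng = np.random.default_rng(3)
      G = nx.erdos_renyi_graph(n=2000, p=3 / 2000, seed=3)   # sparse: mean degree ~3
      G = G.subgraph(max(nx.connected_components(G), key=len))
      nodes = list(G.nodes)

      def first_return_time(G, start, t_max=500):
          node = start
          for t in range(1, t_max + 1):
              node = rng.choice(list(G.neighbors(node)))
              if node == start:
                  return t
          return None                                        # censored at t_max

      times = [first_return_time(G, rng.choice(nodes)) for _ in range(2000)]
      returned = [t for t in times if t is not None]
      print("return probability by t=500:", len(returned) / len(times))
      print("median return time:", np.median(returned))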

  16. Wireless cellular networks with Pareto-distributed call holding times

    NASA Astrophysics Data System (ADS)

    Rodriguez-Dagnino, Ramon M.; Takagi, Hideaki

    2001-07-01

    Nowadays, there is growing interest in providing Internet access to mobile users. For instance, NTT DoCoMo in Japan deploys an important mobile phone network that offers the Internet service named 'i-mode' to more than 17 million subscribers. Internet traffic measurements show that the session duration, or Call Holding Time (CHT), has probability distributions with heavy tails, which tells us that they depart significantly from the traffic statistics of traditional voice services. In this environment, it is particularly important for a network designer to know the number of handovers during a call in order to dimension virtual circuits for a wireless cell appropriately. The handover traffic has a direct impact on the Quality of Service (QoS); e.g., service disruption due to handover failure may significantly degrade the specified QoS of time-constrained services. In this paper, we first study the random behavior of the number of handovers during a call, where we assume that the CHT are Pareto distributed (a heavy-tail distribution) and the Cell Residence Times (CRT) are exponentially distributed. Our approach is based on renewal theory arguments. We present closed-form formulae for the probability mass function (pmf) of the number of handovers during a Pareto distributed CHT, and obtain the probability of call completion as well as handover rates. Most of the formulae are expressed in terms of the Whittaker function. We compare the Pareto case with cases of k-Erlang and hyperexponential distributions for the CHT.
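
    The renewal setup can be checked by simulation. In the sketch below (shape, scale, and residence rate are assumed values), each call draws a Pareto holding time and accumulates exponential cell-residence periods until the call ends:

      import numpy as np

      rng = np.random.default_rng(4)
      alpha, x_m = 1.5, 1.0        # Pareto shape/scale for the CHT (assumed)
      mu = 1.0                     # rate of the exponential CRT (assumed)

      def handovers(n_calls):
          cht = x_m * (1 + rng.pareto(alpha, n_calls))   # Pareto holding times
          counts = np.zeros(n_calls, dtype=int)
          for i, T in enumerate(cht):
              t = rng.exponential(1 / mu)                # first boundary crossing
              while t < T:
                  counts[i] += 1
                  t += rng.exponential(1 / mu)
          return counts

      n = handovers(20000)
      pmf = np.bincount(n) / len(n)
      print("P(no handover) =", pmf[0], "; mean handovers =", n.mean())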

  17. Laser absorption of carbon fiber reinforced polymer with randomly distributed carbon fibers

    NASA Astrophysics Data System (ADS)

    Hu, Jun; Xu, Hebing; Li, Chao

    2018-03-01

    Laser processing of carbon fiber reinforced polymer (CFRP) is a non-traditional machining method which has many prospective applications. The laser absorption characteristics of CFRP are analyzed in this paper. A ray tracing model describing the interaction of the laser spot with CFRP is established. The material model contains randomly distributed carbon fibers which are generated using an improved carbon fiber placement method. It was found that CFRP has good laser absorption due to multiple reflections of the light rays in the material’s microstructure. The randomly distributed carbon fibers make the absorptivity of the light rays change randomly in the laser spot. Meanwhile, the average absorptivity fluctuation is obvious during movement of the laser. The experimental measurements agree well with the values predicted by the ray tracing model.

  18. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants.

    PubMed

    Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore

    2014-04-01

    Genetic modification of plants may result in unintended effects causing potentially adverse effects on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effects on non-target organisms are compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these, for example, the power to detect environmental change of a given magnitude, before the start of an experiment. Such prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model employs completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype-by-environment interaction by adding random variety effects, and includes repeated measures following a constant, linear or quadratic pattern in time, possibly with some form of autocorrelation. The model also allows adding a set of reference varieties to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail, and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided.
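
    In the spirit of the described framework, a minimal count-data generator for a randomized complete block design with excess zeros; the distribution choices and effect sizes are assumptions, and the negative binomial is realized as a gamma-Poisson mixture:

      import numpy as np

      rng = np.random.default_rng(5)
      n_blocks = 8
      varieties = ["GM", "comparator", "ref1", "ref2"]
      block_effect = rng.normal(0, 0.3, n_blocks)          # random block effects
      variety_effect = {"GM": 0.1, "comparator": 0.0, "ref1": -0.2, "ref2": 0.25}
      size, p_zero = 2.0, 0.15                 # NB dispersion and excess-zero fraction

      rows = []
      for b in range(n_blocks):
          for v in rng.permutation(varieties):             # randomization within block
              mu = np.exp(2.0 + block_effect[b] + variety_effect[v])
              lam = rng.gamma(size, mu / size)             # NB as gamma-Poisson
              y = 0 if rng.random() < p_zero else rng.poisson(lam)
              rows.append((b, str(v), y))
      print(rows[:4])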

  19. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants

    PubMed Central

    Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore

    2014-01-01

    Genetic modification of plants may result in unintended effects causing potentially adverse effects on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effects on non-target organisms are compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these, for example, the power to detect environmental change of a given magnitude, before the start of an experiment. Such prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model employs completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype-by-environment interaction by adding random variety effects, and includes repeated measures following a constant, linear or quadratic pattern in time, possibly with some form of autocorrelation. The model also allows adding a set of reference varieties to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail, and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided. PMID:24834325

  20. The Statistical Drake Equation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2010-12-01

    We provide the statistical generalization of the Drake equation. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of a large number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov Form of the CLT, or the Lindeberg Form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the LOGNORMAL distribution. Then, as a consequence, the mean value of this lognormal distribution is the ordinary N in the Drake equation. The standard deviation, mode, and all the moments of this lognormal N are also found. The seven factors in the ordinary Drake equation now become seven positive random variables. The probability distribution of each random variable may be ARBITRARY. The CLT in the so-called Lyapunov or Lindeberg forms (neither of which assumes the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) DISTANCE between any two neighboring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, in our approach, this distance becomes a new random variable. We derive the relevant probability density function, apparently previously unknown and dubbed "Maccone distribution" by Paul Davies. DATA ENRICHMENT PRINCIPLE. It should be noticed that ANY positive number of random variables in the Statistical Drake Equation is compatible with the CLT. So, our generalization allows for many more factors to be added in the future as long as more refined scientific knowledge about each factor becomes known. This capability to make room for more future factors in the statistical Drake equation we call the "Data Enrichment Principle", and we regard it as the key to more profound future results in the fields of Astrobiology and SETI. Finally, a practical example is given of how our statistical Drake equation works numerically. We work out in detail the case where each of the seven random variables is uniformly distributed around its own mean value and has a given standard deviation. For instance, the number of stars in the Galaxy is assumed to be uniformly distributed around (say) 350 billion with a standard deviation of (say) 1 billion. Then, the resulting lognormal distribution of N is computed numerically by virtue of a MathCad file that the author has written. This shows that the mean value of the lognormal random variable N is actually of the same order as the classical N given by the ordinary Drake equation, as one might expect from a good statistical generalization.
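
    The lognormal claim is easy to verify numerically. A sketch with seven uniformly distributed factors (the means and spreads below are illustrative placeholders, not the paper's inputs); near-zero skewness of log N is consistent with lognormality, and the sample mean matches the plain product of means for independent factors:

      import numpy as np

      rng = np.random.default_rng(6)
      means = np.array([3.5e11, 0.5, 2.0, 0.33, 0.01, 0.01, 1e4])  # illustrative
      half_widths = 0.5 * means                                    # assumed spreads
      samples = rng.uniform(means - half_widths, means + half_widths,
                            size=(200000, 7))
      N = samples.prod(axis=1)           # product of the seven random factors
      logN = np.log(N)
      print("mean N:", N.mean(), " product of means:", means.prod())
      print("skewness of log N:",
            ((logN - logN.mean()) ** 3).mean() / logN.std() ** 3)  # ~0 if normal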

  1. NHash: Randomized N-Gram Hashing for Distributed Generation of Validatable Unique Study Identifiers in Multicenter Research.

    PubMed

    Zhang, Guo-Qiang; Tao, Shiqiang; Xing, Guangming; Mozes, Jeno; Zonjy, Bilal; Lhatoo, Samden D; Cui, Licong

    2015-11-10

    A unique study identifier serves as a key for linking research data about a study subject without revealing protected health information in the identifier. While sufficient for single-site and limited-scale studies, the use of common unique study identifiers has several drawbacks for large multicenter studies, where thousands of research participants may be recruited from multiple sites. An important property of study identifiers is error tolerance (or validatability), in that inadvertent editing mistakes during their transmission and use will most likely result in invalid study identifiers. This paper introduces a novel method called "Randomized N-gram Hashing (NHash)" for generating unique study identifiers in a distributed and validatable fashion in multicenter research. NHash has a unique set of properties: (1) it is a pseudonym serving the purpose of linking research data about a study participant for research purposes; (2) it can be generated automatically in a completely distributed fashion with virtually no risk of identifier collision; (3) it incorporates a set of cryptographic hash functions based on N-grams, with a combination of additional encryption techniques such as a shift cipher; (4) it is validatable (error tolerant) in the sense that inadvertent edit errors will mostly result in invalid identifiers. NHash consists of 2 phases. First, an intermediate string using randomized N-gram hashing is generated. This string consists of a collection of N-gram hashes f1, f2, ..., fk. The input for each function fi has 3 components: a random number r, an integer n, and input data m. The result, fi(r, n, m), is an n-gram of m with a starting position s, which is computed as (r mod |m|), where |m| represents the length of m. The output for Step 1 is the concatenation of the sequence f1(r1, n1, m1), f2(r2, n2, m2), ..., fk(rk, nk, mk). In the second phase, the intermediate string generated in Phase 1 is encrypted using techniques such as a shift cipher. The result of the encryption, concatenated with the random number r, is the final NHash study identifier. We performed experiments using a large synthesized dataset comparing NHash with random strings, and demonstrated negligible probability of collision. We implemented NHash for the Center for SUDEP Research (CSR), a National Institute for Neurological Disorders and Stroke-funded Center Without Walls for Collaborative Research in the Epilepsies. This multicenter collaboration involves 14 institutions across the United States and Europe, bringing together extensive and diverse expertise to understand sudden unexpected death in epilepsy patients (SUDEP). The CSR Data Repository has successfully used NHash to link deidentified multimodal clinical data collected in participating CSR institutions, meeting all desired objectives of NHash.
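
    Following the two phases as described, a hedged re-implementation; the wrap-around n-gram, the single shared r and n across fields, and the printable-ASCII shift cipher are assumptions where the description leaves room, and the field values are hypothetical:

      def ngram(r, n, m):
          # fi(r, n, m): n-gram of m starting at s = r mod |m| (wrap assumed)
          s = r % len(m)
          return (m + m)[s:s + n]

      def shift_cipher(text, k):
          # simple shift over the printable ASCII range (an assumed choice)
          return "".join(chr((ord(c) - 32 + k) % 95 + 32) for c in text)

      def nhash(fields, r, n=3, shift=7):
          # Phase 1: concatenate the n-gram hashes f1(r, n, m1) ... fk(r, n, mk)
          intermediate = "".join(ngram(r, n, m) for m in fields)
          # Phase 2: encrypt, then append the random number r
          return shift_cipher(intermediate, shift) + format(r, "04d")

      fields = ["site14", "epilepsy", "cohort2015"]     # hypothetical inputs
      print(nhash(fields, r=4217))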

  2. Image-based modeling of radiation-induced foci

    NASA Astrophysics Data System (ADS)

    Costes, Sylvain; Cucinotta, Francis A.; Ponomarev, Artem; Barcellos-Hoff, Mary Helen; Chen, James; Chou, William; Gascard, Philippe

    Several proteins involved in the response to DNA double strand breaks (DSB) form microscopically visible nuclear domains, or foci, after exposure to ionizing radiation. Radiation-induced foci (RIF) are believed to be located where DNA damage occurs. To test this assumption, we used Monte Carlo simulations to predict the spatial distribution of DSB in human nuclei exposed to high- or low-LET radiation. We then compared these predictions to the distribution patterns of three DNA damage sensing proteins, i.e. 53BP1, phosphorylated ATM and γH2AX, in human mammary epithelial cells. The probability to induce DSB can be derived from DNA fragment data measured experimentally by pulsed-field gel electrophoresis. We first used this probability in Monte Carlo simulations to predict DSB locations in synthetic nuclei geometrically described by a complete set of human chromosomes, taking into account microscope optics from real experiments. Simulations showed very good agreement for high-LET, predicting 0.7 foci/µm along the path of a 1 GeV/amu Fe particle against measurements of 0.69 to 0.82 foci/µm for various RIF 5 min following exposure (LET 150 keV/µm). On the other hand, discrepancies were shown in foci frequency for low-LET, with measurements 20… One drawback of using a theoretical model for the nucleus is that it assumes a simplistic and static pattern for DNA densities. However, the DNA damage pattern is highly correlated to the DNA density pattern (i.e. the more DNA, the more likely a break). Therefore, we generalized our Monte Carlo approach to real microscope images, assuming the pixel intensity of DAPI in the nucleus was directly proportional to the amount of DNA in that pixel. With such an approach we could predict the DNA damage pattern in real images on a per-nucleus basis. Since energy is randomly deposited along high-LET particle paths, RIF along these paths should also be randomly distributed. As expected, simulations produced DNA-weighted random (Poisson) distributions. In contrast, the distributions of RIF obtained as early as 5 min after exposure to high LET (1 GeV/amu Fe) were non-random. This deviation from the expected DNA-weighted random pattern was further characterized by "relative DNA image measurements". This novel imaging approach showed that RIF were located preferentially at the interface between high and low DNA density regions, and were more frequent than predicted in regions with lower DNA density. The same preferential nuclear location was also measured for RIF induced by 1 Gy of low-LET radiation. This deviation from random behavior was evident only 5 min after irradiation for phosphorylated ATM RIF, while γH2AX and 53BP1 RIF showed pronounced deviations up to 30 min after exposure. These data suggest that, within a few minutes following exposure to radiation, RIF cluster into open regions of the nucleus (i.e. euchromatin). It is possible that DNA lesions are collected in these nuclear sub-domains for more efficient repair. If so, this would imply that DSB are actively transported within the nucleus, a phenomenon that has not yet been considered in modeling DNA misrepair following exposure to radiation. These results are thus critical for more accurate risk models of radiation, and we are actively working on further characterizing RIF movement in human nuclei using live cell imaging.

  3. Random walk to a nonergodic equilibrium concept

    NASA Astrophysics Data System (ADS)

    Bel, G.; Barkai, E.

    2006-01-01

    Random walk models, such as the trap model, continuous time random walks, and comb models, exhibit weak ergodicity breaking, when the average waiting time is infinite. The open question is, what statistical mechanical theory replaces the canonical Boltzmann-Gibbs theory for such systems? In this paper a nonergodic equilibrium concept is investigated, for a continuous time random walk model in a potential field. In particular we show that in the nonergodic phase the distribution of the occupation time of the particle in a finite region of space approaches U- or W-shaped distributions related to the arcsine law. We show that when conditions of detailed balance are applied, these distributions depend on the partition function of the problem, thus establishing a relation between the nonergodic dynamics and canonical statistical mechanics. In the ergodic phase the distribution function of the occupation times approaches a δ function centered on the value predicted based on standard Boltzmann-Gibbs statistics. The relation of our work to single-molecule experiments is briefly discussed.
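
    The flavor of the result can be seen in a two-state toy model (not the paper's potential-field CTRW): with Pareto sojourn times of infinite mean (alpha < 1), the fraction of time spent in one state stays broadly, U-shapedly distributed across realizations rather than concentrating at a single ergodic value:

      import numpy as np

      rng = np.random.default_rng(7)
      alpha, t_total = 0.5, 1e6        # assumed tail index and horizon

      def occupation_fraction():
          t, state, t_a = 0.0, 0, 0.0
          while t < t_total:
              sojourn = min(rng.pareto(alpha) + 1.0, t_total - t)  # heavy tail
              if state == 0:
                  t_a += sojourn
              t += sojourn
              state = 1 - state        # alternate between the two states
          return t_a / t_total

      fracs = [occupation_fraction() for _ in range(2000)]
      hist, _ = np.histogram(fracs, bins=10, range=(0, 1))
      print(hist)   # U-shaped counts: mass piles up near 0 and 1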

  4. Anisotropy Induced Switching Field Distribution in High-Density Patterned Media

    NASA Astrophysics Data System (ADS)

    Talapatra, A.; Mohanty, J.

    We present here a micromagnetic study of the variation of the switching field distribution (SFD) in high-density patterned media as a function of the magnetic anisotropy of the system. We consider the manifold effect of magnetic anisotropy in terms of its magnitude, tilt of the anisotropy axis, and random arrangements of magnetic islands with random anisotropy values. Our calculation shows that a reduction in anisotropy causes a linear decrease in coercivity, because the anisotropy energy tries to align the spins along a preferred crystallographic direction. Tilt of the anisotropy axis results in decreased squareness of the hysteresis loop and hence facilitates switching. Finally, experimental challenges such as the lithographic distribution of magnetic islands, their orientation, and the creation of defects demand that the anisotropy distribution be random, with random repetitions. We explain that the range of anisotropy values and the number of bits with different anisotropy play a key role in the SFD, whereas the position of the bits and their repetitions do not contribute considerably.

  5. The topology of large-scale structure. I - Topology and the random phase hypothesis. [galactic formation models

    NASA Technical Reports Server (NTRS)

    Weinberg, David H.; Gott, J. Richard, III; Melott, Adrian L.

    1987-01-01

    Many models for the formation of galaxies and large-scale structure assume a spectrum of random phase (Gaussian), small-amplitude density fluctuations as initial conditions. In such scenarios, the topology of the galaxy distribution on large scales relates directly to the topology of the initial density fluctuations. Here a quantitative measure of topology - the genus of contours in a smoothed density distribution - is described and applied to numerical simulations of galaxy clustering, to a variety of three-dimensional toy models, and to a volume-limited sample of the CfA redshift survey. For random phase distributions the genus of density contours exhibits a universal dependence on threshold density. The clustering simulations show that a smoothing length of 2-3 times the mass correlation length is sufficient to recover the topology of the initial fluctuations from the evolved galaxy distribution. Cold dark matter and white noise models retain a random phase topology at shorter smoothing lengths, but massive neutrino models develop a cellular topology.

  6. Quantitative Phylogenomics of Within-Species Mitogenome Variation: Monte Carlo and Non-Parametric Analysis of Phylogeographic Structure among Discrete Transatlantic Breeding Areas of Harp Seals (Pagophilus groenlandicus)

    PubMed Central

    Carr, Steven M.; Duggan, Ana T.; Stenson, Garry B.; Marshall, H. Dawn

    2015-01-01

    Phylogenomic analysis of highly-resolved intraspecific phylogenies obtained from complete mitochondrial DNA genomes has had great success in clarifying relationships within and among human populations, but has found limited application in other wild species. Analytical challenges include assessment of random versus non-random phylogeographic distributions, and quantification of differences in tree topologies among populations. Harp Seals (Pagophilus groenlandicus Erxleben, 1777) have a biogeographic distribution based on four discrete trans-Atlantic breeding and whelping populations located on “fast ice” attached to land in the White Sea, Greenland Sea, the Labrador ice Front, and Southern Gulf of St Lawrence. This East to West distribution provides a set of a priori phylogeographic hypotheses. Outstanding biogeographic questions include the degree of genetic distinctiveness among these populations, in particular between the Greenland Sea and White Sea grounds. We obtained complete coding-region DNA sequences (15,825 bp) for 53 seals. Each seal has a unique mtDNA genome sequence; the sequences differ by 6-107 substitutions. Six major clades / groups are detectable by parsimony, neighbor-joining, and Bayesian methods, all of which are found in breeding populations on either side of the Atlantic. The species coalescent is at 180 KYA; the most recent clade, which accounts for 66% of the diversity, reflects an expansion during the mid-Wisconsinan glaciation 40-60 KYA. FST is significant only between the White Sea and Greenland Sea or Ice Front populations. Hierarchical AMOVA of 2-, 3-, or 4-island models identifies small but significant ΦSC among populations within groups, but not among groups. A novel Monte-Carlo simulation indicates that the observed distribution of individuals within breeding populations over the phylogenetic tree requires significantly fewer dispersal events than random expectation, consistent with island or a priori East to West 2- or 3-stepping-stone biogeographic models, but not a simple 1-step trans-Atlantic model. Plots of the cumulative pairwise sequence difference curves among seals in each of the four populations provide continuous proxies for phylogenetic diversification within each. Non-parametric Kolmogorov-Smirnov (K-S) tests of maximum pairwise differences between these curves indicate that the Greenland Sea population has a markedly younger phylogenetic structure than either the White Sea population or the two Northwest Atlantic populations, which are of intermediate age and homogeneous structure. The Monte Carlo and K-S assessments provide sensitive quantitative tests of within-species mitogenomic phylogeography. This is the first study to indicate that the White Sea and Greenland Sea populations have different population genetic histories. The analysis supports the hypothesis that Harp Seals comprise three genetically distinguishable breeding populations, in the White Sea, Greenland Sea, and Northwest Atlantic. Implications for an ice-dependent species during ongoing climate change are discussed. PMID:26301872

  7. Quantitative Phylogenomics of Within-Species Mitogenome Variation: Monte Carlo and Non-Parametric Analysis of Phylogeographic Structure among Discrete Transatlantic Breeding Areas of Harp Seals (Pagophilus groenlandicus).

    PubMed

    Carr, Steven M; Duggan, Ana T; Stenson, Garry B; Marshall, H Dawn

    2015-01-01

    Phylogenomic analysis of highly-resolved intraspecific phylogenies obtained from complete mitochondrial DNA genomes has had great success in clarifying relationships within and among human populations, but has found limited application in other wild species. Analytical challenges include assessment of random versus non-random phylogeographic distributions, and quantification of differences in tree topologies among populations. Harp Seals (Pagophilus groenlandicus Erxleben, 1777) have a biogeographic distribution based on four discrete trans-Atlantic breeding and whelping populations located on "fast ice" attached to land in the White Sea, Greenland Sea, the Labrador ice Front, and Southern Gulf of St Lawrence. This East to West distribution provides a set of a priori phylogeographic hypotheses. Outstanding biogeographic questions include the degree of genetic distinctiveness among these populations, in particular between the Greenland Sea and White Sea grounds. We obtained complete coding-region DNA sequences (15,825 bp) for 53 seals. Each seal has a unique mtDNA genome sequence; the sequences differ by 6-107 substitutions. Six major clades / groups are detectable by parsimony, neighbor-joining, and Bayesian methods, all of which are found in breeding populations on either side of the Atlantic. The species coalescent is at 180 KYA; the most recent clade, which accounts for 66% of the diversity, reflects an expansion during the mid-Wisconsinan glaciation 40-60 KYA. FST is significant only between the White Sea and Greenland Sea or Ice Front populations. Hierarchical AMOVA of 2-, 3-, or 4-island models identifies small but significant ΦSC among populations within groups, but not among groups. A novel Monte-Carlo simulation indicates that the observed distribution of individuals within breeding populations over the phylogenetic tree requires significantly fewer dispersal events than random expectation, consistent with island or a priori East to West 2- or 3-stepping-stone biogeographic models, but not a simple 1-step trans-Atlantic model. Plots of the cumulative pairwise sequence difference curves among seals in each of the four populations provide continuous proxies for phylogenetic diversification within each. Non-parametric Kolmogorov-Smirnov (K-S) tests of maximum pairwise differences between these curves indicate that the Greenland Sea population has a markedly younger phylogenetic structure than either the White Sea population or the two Northwest Atlantic populations, which are of intermediate age and homogeneous structure. The Monte Carlo and K-S assessments provide sensitive quantitative tests of within-species mitogenomic phylogeography. This is the first study to indicate that the White Sea and Greenland Sea populations have different population genetic histories. The analysis supports the hypothesis that Harp Seals comprise three genetically distinguishable breeding populations, in the White Sea, Greenland Sea, and Northwest Atlantic. Implications for an ice-dependent species during ongoing climate change are discussed.

  8. Decisions with Uncertain Consequences—A Total Ordering on Loss-Distributions

    PubMed Central

    König, Sandra; Schauer, Stefan

    2016-01-01

    Decisions are often based on imprecise, uncertain or vague information. Likewise, the consequences of an action are often equally unpredictable, thus putting the decision maker into a twofold jeopardy. Assuming that the effects of an action can be modeled by a random variable, then the decision problem boils down to comparing different effects (random variables) by comparing their distribution functions. Although the full space of probability distributions cannot be ordered, a properly restricted subset of distributions can be totally ordered in a practically meaningful way. We call these loss-distributions, since they provide a substitute for the concept of loss-functions in decision theory. This article introduces the theory behind the necessary restrictions and the hereby constructible total ordering on random loss variables, which enables decisions under uncertainty of consequences. Using data obtained from simulations, we demonstrate the practical applicability of our approach. PMID:28030572

  9. Biological monitoring of environmental quality: The use of developmental instability

    USGS Publications Warehouse

    Freeman, D.C.; Emlen, J.M.; Graham, J.H.; Hough, R. A.; Bannon, T.A.

    1994-01-01

    Distributed robustness is thought to influence the buffering of random phenotypic variation through the scale-free topology of gene regulatory, metabolic, and protein-protein interaction networks. If this hypothesis is true, then the phenotypic response to the perturbation of particular nodes in such a network should be proportional to the number of links those nodes make with neighboring nodes. This suggests a probability distribution approximating an inverse power-law of random phenotypic variation. Zero phenotypic variation, however, is impossible, because random molecular and cellular processes are essential to normal development. Consequently, a more realistic distribution should have a y-intercept close to zero in the lower tail, a mode greater than zero, and a long (fat) upper tail. The double Pareto-lognormal (DPLN) distribution is an ideal candidate distribution. It consists of a mixture of a lognormal body and upper and lower power-law tails.
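
    Sampling from the DPLN candidate is straightforward via Reed's representation of its logarithm as a normal plus the difference of two exponentials; the parameter values below are illustrative assumptions, and a rough Hill estimate recovers the upper tail index:

      import numpy as np

      rng = np.random.default_rng(8)
      a, b = 2.5, 1.5                 # upper / lower power-law tail exponents
      mu, sigma = 0.0, 0.5            # lognormal body
      n = 200000
      logx = (rng.normal(mu, sigma, n)
              + rng.exponential(1 / a, n)    # yields upper tail ~ x**(-a)
              - rng.exponential(1 / b, n))   # yields the lower power-law tail
      x = np.sort(np.exp(logx))
      hill = 1.0 / np.mean(np.log(x[-2000:] / x[-2001]))   # rough Hill estimate
      print("upper-tail index estimate:", hill, "(target:", a, ")")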

  10. Two Universality Properties Associated with the Monkey Model of Zipf's Law

    NASA Astrophysics Data System (ADS)

    Perline, Richard; Perline, Ron

    2016-03-01

    The distribution of word probabilities in the monkey model of Zipf's law is associated with two universality properties: (1) the power-law exponent converges strongly to -1 as the alphabet size increases and the letter probabilities are specified as the spacings from a random division of the unit interval for any distribution with a bounded density function on [0, 1]; and (2) on a logarithmic scale, the version of the model with a finite word-length cutoff and unequal letter probabilities is approximately normally distributed in the part of the distribution away from the tails. The first property is proved using a remarkably general limit theorem of Shao and Hahn for the logarithm of sample spacings, and the second property follows from Anscombe's central limit theorem for a random number of i.i.d. random variables. The finite word-length model leads to a hybrid Zipf-lognormal mixture distribution closely related to work in other areas.
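
    The first property can be explored with a direct monkey-typing simulation; the alphabet size, space probability, and fitted rank window below are arbitrary choices:

      import numpy as np
      from collections import Counter

      rng = np.random.default_rng(9)
      n_letters, p_space = 26, 0.2
      cuts = np.sort(rng.uniform(0, 1, n_letters - 1))
      letter_p = np.diff(np.concatenate(([0.0], cuts, [1.0])))   # random spacings

      def word():
          # type letters until the space key (probability p_space) is hit
          out = []
          while rng.random() >= p_space:
              out.append(chr(97 + rng.choice(n_letters, p=letter_p)))
          return "".join(out)

      freq = Counter(word() for _ in range(200000))
      counts = np.array(sorted(freq.values(), reverse=True), dtype=float)
      ranks = np.arange(1, len(counts) + 1)
      hi = min(1000, len(counts))
      slope = np.polyfit(np.log(ranks[9:hi]), np.log(counts[9:hi]), 1)[0]
      print("rank-frequency slope:", slope)   # expected near -1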

  11. The Effect of Food Stamp Nutrition Education on the Food Insecurity of Low-Income Women Participants

    ERIC Educational Resources Information Center

    Eicher-Miller, Heather A.; Mason, April C.; Abbott, Angela R.; McCabe, George P.; Boushey, Carol J.

    2009-01-01

    Objective: To determine the effect of Food Stamp Nutrition Education (FSNE) in Indiana on participants' food insecurity and food insufficiency. Design: A single-blind randomized design. A randomized experimental group completed 5 FSNE lessons as an intervention between a pre- and posttest, whereas a control group completed a pre- and posttest…

  12. On the genealogy of branching random walks and of directed polymers

    NASA Astrophysics Data System (ADS)

    Derrida, Bernard; Mottishaw, Peter

    2016-08-01

    It is well known that the mean-field theory of directed polymers in a random medium exhibits replica symmetry breaking with a distribution of overlaps which consists of two delta functions. Here we show that the leading finite-size correction to this distribution of overlaps has a universal character which can be computed explicitly. Our results can also be interpreted as genealogical properties of branching Brownian motion or of branching random walks.

  13. All optical mode controllable Er-doped random fiber laser with distributed Bragg gratings.

    PubMed

    Zhang, W L; Ma, R; Tang, C H; Rao, Y J; Zeng, X P; Yang, Z J; Wang, Z N; Gong, Y; Wang, Y S

    2015-07-01

    An all-optical method to control the lasing modes of Er-doped random fiber lasers (RFLs) is proposed and demonstrated. In the RFL, an Er-doped fiber (EDF) recorded with randomly separated fiber Bragg gratings (FBGs) is used as the gain medium and the randomly distributed reflectors, as well as the controllable element. By combining random feedback from the FBG array and Fresnel feedback from a cleaved fiber end, multi-mode coherent random lasing is obtained with a threshold of 14 mW and a power efficiency of 14.4%. Moreover, a laterally injected control light is used to induce local gain perturbation, providing additional gain for certain random resonance modes. As a result, active mode selection of the RFL is realized by changing the locations of the laser cavity exposed to the control light.

  14. An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution

    NASA Technical Reports Server (NTRS)

    Campbell, C. W.

    1983-01-01

    An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired values of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact, and in practice its accuracy is limited only by the quality of the uniform-distribution random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm, and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
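
    The construction alluded to is the standard one, sketched here in Python rather than the original FORTRAN (the exact routine is not reproduced in the abstract): correlate two independent standard normals, then rescale and shift.

      import numpy as np

      rng = np.random.default_rng(10)

      def bivariate_normal(mu1, mu2, s1, s2, rho, n):
          # exact transformation: X and Y are jointly normal with correlation rho
          z1 = rng.standard_normal(n)
          z2 = rng.standard_normal(n)
          x = mu1 + s1 * z1
          y = mu2 + s2 * (rho * z1 + np.sqrt(1 - rho ** 2) * z2)
          return x, y

      x, y = bivariate_normal(1.0, -2.0, 3.0, 0.5, rho=0.7, n=100000)
      print("sample correlation:", np.corrcoef(x, y)[0, 1])   # ~0.7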

  15. Transcription, intercellular variability and correlated random walk.

    PubMed

    Müller, Johannes; Kuttler, Christina; Hense, Burkhard A; Zeiser, Stefan; Liebscher, Volkmar

    2008-11-01

    We develop a simple model for the random distribution of a gene product. It is assumed that the only source of variance is due to switching transcription on and off by a random process. Under the condition that the transition rates between on and off are constant we find that the amount of mRNA follows a scaled Beta distribution. Additionally, a simple positive feedback loop is considered. The simplicity of the model allows for an explicit solution also in this setting. These findings in turn allow, e.g., for easy parameter scans. We find that bistable behavior translates into bimodal distributions. These theoretical findings are in line with experimental results.
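
    A sketch of this model class under assumed rates: transcription switches on and off at constant rates, with synthesis s while on and first-order decay d; the scaled stationary amount should then follow a Beta(k_on/d, k_off/d) law, whose mean the simulation checks:

      import numpy as np

      rng = np.random.default_rng(11)
      k_on, k_off, s, d = 0.8, 1.2, 1.0, 1.0   # assumed rate constants
      dt = 0.01
      m, on, samples = 0.0, False, []
      for step in range(1_000_000):
          if rng.random() < (k_on if not on else k_off) * dt:
              on = not on                      # random on/off switching
          m += (s * on - d * m) * dt           # deterministic kinetics between flips
          if step > 100_000 and step % 200 == 0:
              samples.append(m * d / s)        # scale the amount to [0, 1]
      print("simulated mean:", np.mean(samples),
            "Beta mean:", k_on / (k_on + k_off))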

  16. Recruitment strategy effectiveness for a cryotherapy intervention for a venous leg ulcer prevention study.

    PubMed

    Kelechi, Teresa J; Watts, Ashlee; Wiseman, Jan

    2010-01-01

    To describe the strategies and costs associated with recruiting African American and white adults into a randomized controlled pilot trial. "Cryotherapy for Venous Disorders: A Pilot Study" is a randomized controlled trial designed to determine the effects of a cool gel wrap and leg elevation intervention versus a leg elevation alone intervention on skin temperature, skin microcirculation, quality of life, and pain in adults with stages 4 and 5 chronic venous disorders. We sought to recruit 60 participants (21 African Americans, 37 whites, and 2 Hispanic or Latino) to complete the study. These enrollment targets reflect the demographic distribution of the community in which the study was conducted (33% African American, 66% white, and 2% Latino). Proactive and reactive recruitment strategies were implemented to recruit subjects. Seventy-three individuals (9 African American men, 29 African American women, 11 white men, 22 white women, 1 Asian woman, and 1 Hispanic woman) were screened, and of those, 67 were randomized (9 African American men, 25 African American women, 9 white men, 22 white women, 1 Asian woman, and 1 Hispanic women). Fifty-eight completed the study, yielding an overall 11% attrition rate. An additional 8 subjects canceled or did not show up for a first appointment. Reactive recruitment strategies were most successful for recruiting men, women, African American, and white participants. The 3 most successful reactive strategies were referrals from providers/clinics (34%), flyers posted in the hospital elevators (22%), and targeted mailings from a business (16%). Of the healthcare provider referrals (19), wound care nurses referred 12 completed participants. The amount budgeted for advertisement was $5,000 (2% of the total grant award). The amount spent on recruitment including labor was $5,978, which averaged $103 per participant who completed the study (N = 58). Reactive strategies per participant completer proved more cost-efficient than proactive strategies ($83 vs $215). However, the time spent by the principal investigator (approximately 100 hours or 2.5 hours per week x 40 weeks) on recruitment, particularly maintaining frequent face-to-face contact with providers, increased success in the area of healthcare provider referrals. A variety of recruitment strategies are needed to ensure a diverse participant response to clinical research studies. As nurses become more involved in research activities, and particularly in recruitment, it is important to understand the most effective types of strategies and costs associated with these activities.

  17. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions.

    PubMed

    Yura, Harold T; Hanson, Steen G

    2012-04-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative examples with relevance for optics are given.
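
    The two-step recipe is compact in code: FFT-color white Gaussian noise to a target spectrum, then push the Gaussian marginal through an inverse CDF. The spectrum and the exponential target below are assumed examples; note that the memoryless second step slightly distorts the spectrum, which is in line with calling the method an engineering approach.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(12)
      N = 256
      kx = np.fft.fftfreq(N)[:, None]
      ky = np.fft.fftfreq(N)[None, :]
      k = np.hypot(kx, ky)
      psd = 1.0 / (1.0 + (k / 0.05) ** 2)            # assumed target spectrum

      white = rng.standard_normal((N, N))
      colored = np.fft.ifft2(np.fft.fft2(white) * np.sqrt(psd)).real
      colored /= colored.std()                       # colored Gaussian field

      u = stats.norm.cdf(colored)                    # Gaussian -> uniform
      field = stats.expon.ppf(u, scale=2.0)          # uniform -> exponential
      print(field.mean(), field.std())               # both ~2 for the exponential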

  18. Dynamical implications of sample shape for avalanches in 2-dimensional random-field Ising model with saw-tooth domain wall

    NASA Astrophysics Data System (ADS)

    Tadić, Bosiljka

    2018-03-01

    We study the dynamics of a built-in domain wall (DW) in 2-dimensional disordered ferromagnets with different sample shapes using the random-field Ising model on a square lattice rotated by 45 degrees. The saw-tooth DW of length Lx is created along one side and swept through the sample by slow ramping of the external field until the complete magnetisation reversal and the wall annihilation at the open top boundary at a distance Ly. By fixing the number of spins N = Lx × Ly = 10^6 and the random-field distribution at a value above the critical disorder, we vary the ratio of the DW length to the annihilation distance in the range Lx/Ly ∈ [1/16, 16]. Periodic boundary conditions are applied in the y-direction, so that these ratios comprise different samples, i.e., surfaces of cylinders with changing perimeter Lx and height Ly. We analyse the avalanches of DW slips between consecutive field updates, and the multifractal structure of the magnetisation fluctuation time series. Our main finding is that the domain-wall lengths materialised in different sample shapes affect the dynamics at all scales. Moreover, the domain-wall motion at the beginning of the hysteresis loop (HLB) probes the disorder effects, resulting in fluctuations that are significantly different from the large avalanches in the central part of the loop (HLC), where strong fields dominate. Specifically, the fluctuations in the HLB exhibit a wide multifractal spectrum, which shifts towards higher values of the exponents when the DW length is reduced. The avalanche distributions in this segment of the loop obey a power-law decay with an exponential cutoff, with exponents firmly in the mean-field universality class for long DWs. In contrast, the avalanches in the HLC obey a Tsallis density distribution with power-law tails, indicating new categories of scale-invariant behaviour for different ratios Lx/Ly. The large fluctuations in the HLC, on the other hand, have a rather narrow spectrum which is less sensitive to the length of the wall. These findings shed light on the dynamical criticality of the random-field Ising model at its lower critical dimension; they can be relevant to applications involving the dynamics of injected domain walls in two-dimensional nanowires and ferromagnetic films.

  19. Theoretical size distribution of fossil taxa: analysis of a null model

    PubMed Central

    Reed, William J; Hughes, Barry D

    2007-01-01

    Background This article deals with the theoretical size distribution (of number of sub-taxa) of a fossil taxon arising from a simple null model of macroevolution. Model New species arise through speciations occurring independently and at random at a fixed probability rate, while extinctions either occur independently and at random (background extinctions) or cataclysmically. In addition new genera are assumed to arise through speciations of a very radical nature, again assumed to occur independently and at random at a fixed probability rate. Conclusion The size distributions of the pioneering genus (following a cataclysm) and of derived genera are determined. Also the distribution of the number of genera is considered along with a comparison of the probability of a monospecific genus with that of a monogeneric family. PMID:17376249
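
    The null model lends itself to a short Gillespie-style simulation; the rates below are arbitrary (subcritical) assumptions, since the paper treats the distributions analytically, and the sketch only tracks the extant species count of a single genus founded by one species.

```python
# Gillespie-style sketch of the null model with assumed (subcritical) rates;
# the paper treats this analytically.  We track only the extant species
# count of a single genus founded by one species.
import numpy as np

rng = np.random.default_rng(14)

def genus_size(lam=0.05, mu=0.06, nu=0.01, t_max=200.0, rng=rng):
    """lam: speciation, mu: background extinction, nu: radical speciation
    (founds a new genus, so it leaves this genus's species count unchanged)."""
    n, t = 1, 0.0
    while n > 0:
        t += rng.exponential(1.0 / (n * (lam + mu + nu)))
        if t > t_max:
            break
        r = rng.random() * (lam + mu + nu)
        if r < lam:
            n += 1                          # ordinary speciation in the genus
        elif r < lam + mu:
            n -= 1                          # background extinction
    return n

sizes = np.array([genus_size() for _ in range(10_000)])
print(np.bincount(sizes)[:6])   # empirical size distribution of the genus
```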

  20. Reliability Overhaul Model

    DTIC Science & Technology

    1989-08-01

    Random variates for the conditional exponential distribution are generated using the inverse transform method: (1) generate U ~ U(0,1); (2) set s = -θ ln(1 - U). Random variates from the conditional Weibull distribution are likewise generated by the inverse transform method, from the conditional survival function exp{-[((x+s-γ)/η)^β - ((x-γ)/η)^β]}. Normal variates are generated using a standard normal transformation together with the inverse transform method. (Appendix 3 lists the distributions supported by the model.)
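
    Because the scanned excerpt is heavily garbled, the following Python sketch is a reconstruction under stated assumptions (a three-parameter Weibull with shape β, scale η, and location γ; illustrative parameter values), not code from the report: inverse-transform sampling of the remaining life s given survival to age x, with the conditional exponential as the β = 1 special case.

```python
# Hedged reconstruction (not code from the report): inverse-transform
# sampling of the remaining life s given survival to age x, for a
# three-parameter Weibull (shape beta, scale eta, location gamma).
import numpy as np

rng = np.random.default_rng(1)

def conditional_weibull(x, beta, eta, gamma, size, rng=rng):
    """Conditional survival: S(x+s)/S(x)
       = exp(-[((x+s-gamma)/eta)**beta - ((x-gamma)/eta)**beta]).
    Equating this to 1-U and solving for s gives the sampler below."""
    u = rng.uniform(size=size)
    t = gamma + eta * (((x - gamma) / eta) ** beta - np.log(1 - u)) ** (1 / beta)
    return t - x                      # remaining life beyond age x

# The conditional exponential is the memoryless beta = 1 special case:
s = conditional_weibull(x=100.0, beta=1.0, eta=50.0, gamma=0.0, size=5)
```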

  1. Empirical scaling of the length of the longest increasing subsequences of random walks

    NASA Astrophysics Data System (ADS)

    Mendonça, J. Ricardo G.

    2017-02-01

    We provide Monte Carlo estimates of the scaling of the length L_n of the longest increasing subsequences of n-step random walks for several different distributions of step lengths, short and heavy-tailed. Our simulations indicate that, barring possible logarithmic corrections, L_n ∼ n^θ with the leading scaling exponent 0.60 ≲ θ ≲ 0.69 for the heavy-tailed distributions of step lengths examined, with values increasing as the distribution becomes more heavy-tailed, and θ ≃ 0.57 for distributions of finite variance, irrespective of the particular distribution. The results are consistent with existing rigorous bounds for θ, although in a somewhat surprising manner. For random walks with step lengths of finite variance, we conjecture that the correct asymptotic behavior of L_n is given by √n ln n, and we also propose the form of the subleading asymptotics. The distribution of L_n was found to follow a simple scaling form with scaling functions that vary with θ. Accordingly, when the step lengths are of finite variance, the scaling functions seem to be universal. The nature of this scaling remains unclear, since we lack a working model, microscopic or hydrodynamic, for the behavior of the length of the longest increasing subsequences of random walks.
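
    The estimate rests on computing L_n for many sampled walks; a sketch of that inner loop follows, using the standard patience-sorting algorithm, with Gaussian and Cauchy steps as assumed stand-ins for the finite-variance and heavy-tailed regimes.

```python
# Sketch of the inner Monte Carlo loop: patience-sorting computation of the
# longest strictly increasing subsequence length of sampled random walks.
import numpy as np
from bisect import bisect_left

def lis_length(seq):
    """O(n log n) length of the longest increasing subsequence."""
    tails = []                  # tails[k] = best tail of an LIS of length k+1
    for v in seq:
        i = bisect_left(tails, v)
        if i == len(tails):
            tails.append(v)
        else:
            tails[i] = v
    return len(tails)

rng = np.random.default_rng(2)
n, trials = 10_000, 50
for name, step in [("gaussian", rng.standard_normal),   # finite variance
                   ("cauchy", rng.standard_cauchy)]:     # heavy-tailed
    walks = np.cumsum(step((trials, n)), axis=1)
    print(name, np.mean([lis_length(w) for w in walks]))  # compare with n**theta
```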

  2. Survival distributions impact the power of randomized placebo-phase design and parallel groups randomized clinical trials.

    PubMed

    Abrahamyan, Lusine; Li, Chuan Silvia; Beyene, Joseph; Willan, Andrew R; Feldman, Brian M

    2011-03-01

    The study evaluated the power of the randomized placebo-phase design (RPPD), a new design of randomized clinical trials (RCTs), compared with the traditional parallel-groups design, assuming various response-time distributions. In the RPPD, at some point all subjects receive the experimental therapy, and the exposure to placebo lasts only a short fixed period of time. For the study, an object-oriented simulation program was written in R. The power of the simulated trials was evaluated using six scenarios, where the treatment response times followed the exponential, Weibull, or lognormal distribution. The median response time was assumed to be 355 days for the placebo and 42 days for the experimental drug. Based on the simulation results, the sample size requirements to achieve the same level of power differed across response-time distributions. The scenario where the response times followed the exponential distribution had the highest sample size requirement. In most scenarios, the parallel-groups RCT had higher power than the RPPD. The sample size requirement varies depending on the underlying hazard distribution. The RPPD requires more subjects to achieve a power similar to that of the parallel-groups design. Copyright © 2011 Elsevier Inc. All rights reserved.
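
    As an illustration of how the three response-time families can be matched to the stated medians, here is a hedged sketch; the Weibull shape and lognormal sigma are arbitrary choices, not values from the study.

```python
# Hedged sketch of matching the three families to a given median; the
# Weibull shape and lognormal sigma are arbitrary choices, not study values.
import numpy as np

rng = np.random.default_rng(3)

def response_times(median, dist, size, rng=rng):
    if dist == "exponential":                     # scale fixed by the median
        return rng.exponential(scale=median / np.log(2), size=size)
    if dist == "weibull":
        k = 1.5                                   # assumed shape parameter
        return median / np.log(2) ** (1 / k) * rng.weibull(k, size=size)
    if dist == "lognormal":                       # median of lognormal = e**mu
        return rng.lognormal(mean=np.log(median), sigma=1.0, size=size)
    raise ValueError(dist)

placebo = response_times(355, "weibull", 10_000)  # medians from the abstract
drug = response_times(42, "weibull", 10_000)
print(np.median(placebo), np.median(drug))        # ~355 and ~42 days
```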

  3. An application of randomization for detecting evidence of thermoregulation in timber rattlesnakes (Crotalus horridus) from northwest Arkansas.

    PubMed

    Wills, C A; Beaupre, S J

    2000-01-01

    Most reptiles maintain their body temperatures within normal functional ranges through behavioral thermoregulation. Under some circumstances, thermoregulation may be a time-consuming activity, and thermoregulatory needs may impose significant constraints on the activities of ectotherms. A necessary (but not sufficient) condition for demonstrating thermoregulation is a difference between observed body temperature distributions and available operative temperature distributions. We examined operative and body temperature distributions of the timber rattlesnake (Crotalus horridus) for evidence of thermoregulation. Specifically, we compared the distribution of available operative temperatures in the environment to snake body temperatures during August and September. Operative temperatures were measured using 48 physical models that were randomly deployed in the environment and connected to a Campbell CR-21X data logger. Body temperatures (n=1,803) were recorded from 12 radiotagged snakes using temperature-sensitive telemetry. Separate randomization tests were conducted for each hour of day within each month. Actual body temperature distributions differed significantly from operative temperature distributions at most time points considered. Thus, C. horridus exhibits a necessary (but not sufficient) condition for demonstrating thermoregulation. However, unlike some desert ectotherms, we found no compelling evidence for thermal constraints on surface activity. Randomization may prove to be a powerful technique for drawing inferences about thermoregulation without reliance on studies of laboratory thermal preference.
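
    A hedged sketch of one such hourly randomization test on synthetic temperatures follows; the two-sample difference in means is only an assumed choice of statistic, and the paper's tests may differ in detail.

```python
# Hedged sketch of an hourly randomization test on synthetic temperatures;
# the two-sample difference in means is an assumed choice of statistic.
import numpy as np

rng = np.random.default_rng(4)

def randomization_test(body, operative, n_perm=10_000, rng=rng):
    """Two-sided permutation p-value for a difference in mean temperature."""
    pooled = np.concatenate([body, operative])
    observed = body.mean() - operative.mean()
    n = len(body)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if abs(pooled[:n].mean() - pooled[n:].mean()) >= abs(observed):
            hits += 1
    return hits / n_perm

body = rng.normal(30.0, 1.5, size=60)       # telemetered body temperatures
operative = rng.normal(27.0, 4.0, size=48)  # 48 physical models, as in the study
print(randomization_test(body, operative))  # small p: distributions differ
```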

  4. Using a cluster randomized controlled trial to determine the effects of intervention of battery and hardwired smoke alarms in New South Wales, Australia: Home fire safety checks pilot program.

    PubMed

    Tannous, W Kathy; Whybro, Mark; Lewis, Chris; Ollerenshaw, Michael; Watson, Graeme; Broomhall, Susan; Agho, Kingsley E

    2016-02-01

    In 2014, Fire & Rescue New South Wales piloted the delivery of its home fire safety checks (HFSC) program, aimed at engaging and educating targeted top "at risk" groups to prevent and prepare for fire. This pilot study aimed to assess the effectiveness of smoke alarms using a cluster randomized controlled trial. Survey questionnaires were distributed to the households that had participated in the HFSC program (intervention group). A separate survey questionnaire was distributed to a control group identified as having similar characteristics to the intervention group in the same suburb. To adjust for potential clustering effects, generalized estimating equations with a log link were used. Multivariable analyses revealed that battery and hardwired smoke alarm usage increased by 9% and 3%, respectively, in the intervention group compared with the control group. Females were more likely to install battery smoke alarms than males. Respondents who possessed a certificate or diploma (AOR=1.31, 95% CI 1.00-1.70, P=0.047) and those who were educated up to years 8-12 (AOR=1.32, 95% CI 1.06-1.64, P=0.012) were significantly more likely to install battery smoke alarms than those who had completed bachelor degrees. Conversely, holders of a certificate or diploma and people educated up to years 8-12 were 31% (AOR=0.69, 95% CI 0.52-0.93, P=0.014) and 24% (AOR=0.76, 95% CI 0.60-0.95, P=0.015) significantly less likely to install a hardwired smoke alarm than those who had completed bachelor degrees. This pilot study provided evidence of the benefit of the HFSC in New South Wales. Fire safety intervention programs, like HFSC, need to be targeted at male adults with lower levels of schooling even when they are aware of their risks. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.

  5. [Randomised study of the relationship between the use of CPRmeter® device and the quality of chest compressions in a simulated cardiopulmonary resuscitation].

    PubMed

    Calvo-Buey, J A; Calvo-Marcos, D; Marcos-Camina, R M

    2016-01-01

    To determine whether the use of the CPRmeter(®) during resuscitation manoeuvres is related to a higher quality of external cardiac massage, as recommended by the International Liaison Committee on Resuscitation (ILCOR); to compare the quality obtained without the use of this device; and to establish whether there are differences related to anthropometric, demographic, professional and/or occupational factors. Experimental, open trial performed with life-support simulators on a stratified random sample of 88 health workers randomly distributed between group A (without the device's feedback) and group B (with it). The homogeneity of the confounding variables was compared, as well as compression depth, compression rate, the proportion of complete release, and the distribution of the massage-quality variable (according to ILCOR criteria) between the groups. Qualitative variables were analysed with the chi-square test, quantitative variables with the Student t-test or Mann-Whitney U-test, and the association between the massage-quality variable and use of the device with the odds ratio. Group A: mean depth 42.1 mm (standard deviation 10.1), mean rate 121.3/min (21.6), percentage of complete release 71.2% (36.9). Group B: 51.2 mm (5.9), 111.9/min (6.4), 92.9% (10.1), respectively. The odds ratio for quality massage with respect to use of the device was 5.170 (95% CI; 2.060-12.977). The use of the CPRmeter(®) device in simulated resuscitations is related to a higher quality of cardiac massage, improving adherence to the ILCOR recommendations regardless of the characteristics of the participants. Participants were 83.8% more likely to achieve a quality massage using the device than without it. Copyright © 2015 Elsevier España, S.L.U. y SEEIUC. All rights reserved.

  6. Evaluating the validity of multiple imputation for missing physiological data in the national trauma data bank.

    PubMed

    Moore, Lynne; Hanley, James A; Lavoie, André; Turgeon, Alexis

    2009-05-01

    The National Trauma Data Bank (NTDB) is plagued by the problem of missing physiological data. The Glasgow Coma Scale score, Respiratory Rate and Systolic Blood Pressure are an essential part of risk adjustment strategies for trauma system evaluation and clinical research. Missing data on these variables may compromise the feasibility and the validity of trauma group comparisons. To evaluate the validity of Multiple Imputation (MI) for completing missing physiological data in the National Trauma Data Bank (NTDB), by assessing the impact of MI on 1) frequency distributions, 2) associations with mortality, and 3) risk adjustment. Analyses were based on 170,956 NTDB observations with complete physiological data (observed data set). Missing physiological data were artificially imposed on this data set and then imputed using MI (MI data set). To assess the impact of MI on risk adjustment, 100 pairs of hospitals were randomly selected with replacement and compared using adjusted Odds Ratios (OR) of mortality. OR generated by the observed data set were then compared to those generated by the MI data set. Frequency distributions and associations with mortality were preserved following MI. The median absolute difference between adjusted OR of mortality generated by the observed data set and by the MI data set was 3.6% (inter-quartile range: 2.4%-6.1%). This study suggests that, provided it is implemented with care, MI of missing physiological data in the NTDB leads to valid frequency distributions, preserves associations with mortality, and does not compromise risk adjustment in inter-hospital comparisons of mortality.

  7. Sensitivity of goodness-of-fit statistics to rainfall data rounding off

    NASA Astrophysics Data System (ADS)

    Deidda, Roberto; Puliga, Michelangelo

    An analysis based on the L-moments theory suggests adopting the generalized Pareto distribution to interpret daily rainfall depths recorded by the rain-gauge network of the Hydrological Survey of the Sardinia Region. Nevertheless, a significant problem, not yet completely resolved, arises in the estimation of a left-censoring threshold able to assure a good fit of rainfall data to the generalized Pareto distribution. In order to detect an optimal threshold, keeping the largest possible number of data, we chose to apply a "failure-to-reject" method based on goodness-of-fit tests, as proposed by Choulakian and Stephens [Choulakian, V., Stephens, M.A., 2001. Goodness-of-fit tests for the generalized Pareto distribution. Technometrics 43, 478-484]. Unfortunately, the application of the test, using percentage points provided by Choulakian and Stephens (2001), did not succeed in detecting a useful threshold value in most analyzed time series. A deeper analysis revealed that these failures are mainly due to the presence of large quantities of rounded-off values among the sample data, affecting the distribution of the goodness-of-fit statistics and leading to significant departures from the percentage points expected for continuous random variables. A procedure based on Monte Carlo simulations is thus proposed to overcome these problems.

  8. The Poisson model limits in NBA basketball: Complexity in team sports

    NASA Astrophysics Data System (ADS)

    Martín-González, Juan Manuel; de Saá Guerra, Yves; García-Manso, Juan Manuel; Arriaza, Enrique; Valverde-Estévez, Teresa

    2016-12-01

    Team sports are frequently studied by researchers. There is a presumption that scoring in basketball is a random process that can be described using the Poisson model. Basketball is a collaboration-opposition sport, where the non-linear local interactions among players are reflected in the evolution of the score that ultimately determines the winner. In the NBA, the outcomes of close games are often decided in the last minute, where fouls play a main role. We examined 6130 NBA games in order to analyze the time intervals between baskets and the scoring dynamics. Most numbers of baskets (n) over a time interval (ΔT) follow a Poisson distribution, but some (e.g., ΔT = 10 s, n > 3) behave as a power law. The Poisson distribution covers most baskets in any game, in most game situations, but in close games in the last minute the numbers of events are distributed following a power law. The number of events can be fitted by a mixture of the two distributions. In close games, both teams try to maintain their advantage solely in order to reach the last minute: a completely different game. For this reason, we propose to use the Poisson model as a reference. The complex dynamics emerge at the limits of this model.

  9. Correlation between discrete probability and reaction front propagation rate in heterogeneous mixtures

    NASA Astrophysics Data System (ADS)

    Naine, Tarun Bharath; Gundawar, Manoj Kumar

    2017-09-01

    We demonstrate a very powerful correlation between the discrete probability of the distances of neighboring cells and the thermal wave propagation rate, for a system of cells spread on a one-dimensional chain. A gamma distribution is employed to model the distances of neighboring cells. Because an analytical solution is lacking and the differences in ignition times of adjacent reaction cells follow non-Markovian statistics, the thermal wave propagation rate for a one-dimensional system with randomly distributed cells is invariably obtained by numerical simulation. However, such simulations, which are based on Monte Carlo methods, require many repeated calculations over different realizations of the distribution of adjacent cells. For several one-dimensional systems, differing in the value of the shape parameter of the gamma distribution, we show that the average reaction front propagation rates obtained from a discrete probability between two limits agree excellently with those obtained numerically. With the upper limit at 1.3, the lower limit depends on the non-dimensional ignition temperature. Additionally, this approach facilitates the prediction of the burning limits of heterogeneous thermal mixtures. The proposed method completely eliminates the need for laborious, time-intensive numerical calculations: the thermal wave propagation rates can now be calculated based only on the macroscopic quantity of discrete probability.

  10. Last Passage Percolation and Traveling Fronts

    NASA Astrophysics Data System (ADS)

    Comets, Francis; Quastel, Jeremy; Ramírez, Alejandro F.

    2013-08-01

    We consider a system of N particles with a stochastic dynamics introduced by Brunet and Derrida (Phys. Rev. E 70:016106, 2004). The particles can be interpreted as last passage times in directed percolation on {1,…, N} of mean-field type. The particles remain grouped and move like a traveling front, subject to discretization and driven by a random noise. As N increases, we obtain estimates for the speed of the front and its profile, for different laws of the driving noise. As shown in Brunet and Derrida (Phys. Rev. E 70:016106, 2004), the model with Gumbel distributed jumps has a simple structure. We establish that the scaling limit is a Lévy process in this case. We study other jump distributions. We prove a result showing that the limit for large N is stable under small perturbations of the Gumbel. In the opposite case of bounded jumps, a completely different behavior is found, where finite-size corrections are extremely small.

  11. An analysis of the attitudes of dental patients attending general dental practice in Galway.

    PubMed

    Hayes, Martina; Burke, Francis; McKenna, Gerald; Madden, Jamie; Cronin, Michael

    2013-01-01

    To describe the patterns of dental attendance and attitudes towards tooth loss of general dental practice patients in Galway. 1. To determine the pattern of adult dental attendance in general practices in Galway; and, 2. To examine the oral health attitudes of these patients. Questionnaires were distributed to 311 consecutive adult patients in the waiting rooms of ten general dental practices in Galway, which were randomly selected from the telephone directory. A total of 254 of the 311 questionnaires distributed were fully completed, returned and included in the results, giving a response rate of 81.7%. A total of 59% of dentate participants attended their dentist for annual or biannual examinations compared to 23% of edentate patients. Some 10.5% of medical card holders and 0.5% of non-medical card holders were edentulous. The data from the survey indicated that medical card holders in Galway were more likely to be edentulous than nonmedical card holders. Edentate patients were less likely to be regular dental attenders than dentate patients.

  12. Impact craters and Venus resurfacing history

    NASA Technical Reports Server (NTRS)

    Phillips, Roger J.; Raubertas, Richard F.; Arvidson, Raymond E.; Sarkar, Ila C.; Herrick, Robert R.; Izenberg, Noam; Grimm, Robert E.

    1992-01-01

    The history of resurfacing by tectonism and volcanism on Venus is reconstructed by means of an analysis of Venusian impact crater size-frequency distributions, locations, and preservation states. An atmospheric transit model for meteoroids demonstrates that for craters larger than about 30 km, the size-frequency distribution is close to the atmosphere-free case. An age of cessation of rapid resurfacing of about 500 Ma is obtained. It is inferred that a range of surface ages is recorded by the impact crater population; e.g., the Aphrodite zone is relatively young. An end-member model is developed to quantify resurfacing scenarios. It is argued that Venus has been resurfacing at an average rate of about 1 sq km/yr. Numerical simulations of resurfacing showed that there are two solution branches that satisfy the completely spatially random location constraint for Venusian craters: a is less than 0.0003 (4 deg diameter circle) and a is greater than 0.1 (74 deg diameter circle).

  13. On the degree distribution of horizontal visibility graphs associated with Markov processes and dynamical systems: diagrammatic and variational approaches

    NASA Astrophysics Data System (ADS)

    Lacasa, Lucas

    2014-09-01

    Dynamical processes can be transformed into graphs through a family of mappings called visibility algorithms, enabling the possibility of (i) performing empirical time series analysis and signal processing and (ii) characterizing classes of dynamical systems and stochastic processes using the tools of graph theory. Recent works show that the degree distribution of these graphs encapsulates much information on the signals' variability, and therefore constitutes a fundamental feature for statistical learning purposes. However, exact solutions for the degree distributions are only known in a few cases, such as for uncorrelated random processes. Here we analytically explore these distributions in a list of situations. We present a diagrammatic formalism which computes, for all degrees, the corresponding probability as a series expansion in a coupling constant which is the number of hidden variables. We offer a constructive solution for general Markovian stochastic processes and deterministic maps. As test cases we focus on Ornstein-Uhlenbeck processes, fully chaotic and quasiperiodic maps. Whereas only for certain degree probabilities can all diagrams be summed exactly, in the general case we show that the perturbation theory converges. In a second part, we make use of a variational technique to predict the complete degree distribution for special classes of Markovian dynamics with fast-decaying correlations. In every case we compare the theory with numerical experiments.
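
    For concreteness, a sketch of the horizontal visibility construction follows, checking the known exact i.i.d. result P(k) = (1/3)(2/3)^(k-2) quoted for uncorrelated processes; the stack-based O(n) routine is a standard construction, not code from the paper.

```python
# Sketch of the horizontal visibility graph (HVG) construction, checking the
# exact i.i.d. result for uncorrelated series: P(k) = (1/3)(2/3)**(k-2).
import numpy as np

def hvg_degrees(x):
    """Stack-based O(n) HVG: i < j are linked iff every value strictly
    between them lies below min(x[i], x[j])."""
    deg = np.zeros(len(x), dtype=int)
    stack = []
    for j, xj in enumerate(x):
        while stack and x[stack[-1]] < xj:
            i = stack.pop()                # i sees j, and xj now blocks i
            deg[i] += 1
            deg[j] += 1
        if stack:                          # nearest value >= xj also sees j
            deg[stack[-1]] += 1
            deg[j] += 1
        stack.append(j)
    return deg

rng = np.random.default_rng(5)
deg = hvg_degrees(rng.random(100_000))
for k in range(2, 8):
    print(k, round(np.mean(deg == k), 4), round((1/3) * (2/3) ** (k - 2), 4))
```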

  14. Free-Space Quantum Key Distribution using Polarization Entangled Photons

    NASA Astrophysics Data System (ADS)

    Kurtsiefer, Christian

    2007-06-01

    We report on a complete experimental implementation of a quantum key distribution protocol through a free-space link using polarization-entangled photon pairs from a compact parametric down-conversion source [1]. Based on a BB84-equivalent protocol, we generated a secret key without interruption for over 10 hours over a free-space optical link distance of 1.5 km, with a rate of up to 950 bits per second after error correction and privacy amplification. Our system is based on two time-stamp units and relies on no specific hardware channel for coincidence identification besides an IP link. For that, an initial clock synchronization with an accuracy of better than 2 ns is achieved, based on a conventional NTP protocol and a tiered cross-correlation of time tags on both sides. Time tags are used to servo a local clock, allowing a streamed measurement on correctly identified photon pairs. Contrary to the majority of quantum key distribution systems, this approach does not require a trusted large-bandwidth random number generator, but integrates that into the physical key generation process. We discuss our current progress in implementing key distribution via an atmospheric link under daylight conditions, and possible attack scenarios on a physical timing-information side channel of an entanglement-based key distribution system. [1] I. Marcikic, A. Lamas-Linares, C. Kurtsiefer, Appl. Phys. Lett. 89, 101122 (2006).

  15. On the minimum of independent geometrically distributed random variables

    NASA Technical Reports Server (NTRS)

    Ciardo, Gianfranco; Leemis, Lawrence M.; Nicol, David

    1994-01-01

    The expectations E(X_(1)), E(Z_(1)), and E(Y_(1)) of the minimum of n independent geometric, modified geometric, or exponential random variables with matching expectations differ. We show how this is accounted for by stochastic variability and how E(X_(1))/E(Y_(1)) equals the expected number of ties at the minimum for the geometric random variables. We then introduce the 'shifted geometric distribution' and show that there is a unique value of the shift for which the individual shifted geometric and exponential random variables match expectations both individually and in their minimums.
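
    The stated identity is easy to probe numerically; in the sketch below, the number of variables n and the success probability p are arbitrary assumed values.

```python
# Numerical probe of the stated identity, with n and p as assumed values:
# E(min of geometrics) / E(min of exponentials) should equal the expected
# number of ties at the geometric minimum, when individual means match.
import numpy as np

rng = np.random.default_rng(6)
n, p, trials = 5, 0.2, 200_000
geo = rng.geometric(p, size=(trials, n))               # mean 1/p each
expo = rng.exponential(scale=1 / p, size=(trials, n))  # matched mean 1/p

ratio = geo.min(axis=1).mean() / expo.min(axis=1).mean()
ties = (geo == geo.min(axis=1, keepdims=True)).sum(axis=1).mean()
print(ratio, ties)    # both equal n*p / (1 - (1-p)**n) in theory
```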

  16. Open quantum random walk in terms of quantum Bernoulli noise

    NASA Astrophysics Data System (ADS)

    Wang, Caishi; Wang, Ce; Ren, Suling; Tang, Yuling

    2018-03-01

    In this paper, we introduce an open quantum random walk, which we call the QBN-based open walk, by means of quantum Bernoulli noise, and study its properties from a random walk point of view. We prove that, with the localized ground state as its initial state, the QBN-based open walk has the same limit probability distribution as the classical random walk. We also show that the probability distributions of the QBN-based open walk include those of the unitary quantum walk recently introduced by Wang and Ye (Quantum Inf Process 15:1897-1908, 2016) as a special case.

  17. Experimental demonstration of an active phase randomization and monitor module for quantum key distribution

    NASA Astrophysics Data System (ADS)

    Sun, Shi-Hai; Liang, Lin-Mei

    2012-08-01

    Phase randomization is a very important assumption in BB84 quantum key distribution (QKD) systems with a weak coherent source; otherwise, an eavesdropper may gain information about the final key. In this Letter, a stable and monitored active phase randomization scheme for one-way and two-way QKD systems is proposed and demonstrated in experiments. Furthermore, our scheme gives Alice an easy way to monitor the degree of randomization in experiments. We therefore expect our scheme to become a standard part of future QKD systems due to its security significance and feasibility.

  18. Neutron monitor generated data distributions in quantum variational Monte Carlo

    NASA Astrophysics Data System (ADS)

    Kussainov, A. S.; Pya, N.

    2016-08-01

    We have assessed the potential of neutron monitor hardware as a random number generator for normal and uniform distributions. Data tables from acquisition channels with no extreme changes in the signal level were chosen as the retrospective model. The stochastic component was extracted by fitting the raw data with splines and then subtracting the fit. Scaling the extracted data to zero mean and unit variance is sufficient to obtain a stable standard normal random variate. The distributions under consideration pass all available normality tests. Inverse transform sampling is suggested as a source of uniform random numbers. The variational Monte Carlo method for the quantum harmonic oscillator was used to test the quality of our random numbers. If the data delivery rate is important and the conventional one-minute-resolution neutron count is insufficient, one could settle for an efficient seed generator to feed into a faster algorithmic random number generator, or create a buffer.
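
    A sketch of the extraction pipeline on a synthetic stand-in series follows; the smoothing factor and the fake signal are assumptions, not the authors' settings. Mapping the standardized residual through the normal CDF is one way to obtain the uniform variates the abstract refers to.

```python
# Sketch on a synthetic stand-in series: spline detrend, standardize the
# residual, and map through the normal CDF to obtain uniform variates.
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.stats import norm

rng = np.random.default_rng(7)
t = np.arange(1440.0)                                # one day of 1-min counts
raw = 100 + 5 * np.sin(2 * np.pi * t / 1440) + rng.normal(0, 3, t.size)

trend = UnivariateSpline(t, raw, s=t.size * 9.0)(t)  # assumed smoothing factor
z = raw - trend
z = (z - z.mean()) / z.std()                         # standard normal variate
u = norm.cdf(z)                                      # uniform variates
```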

  19. Reducing financial avalanches by random investments

    NASA Astrophysics Data System (ADS)

    Biondo, Alessio Emanuele; Pluchino, Alessandro; Rapisarda, Andrea; Helbing, Dirk

    2013-12-01

    Building on similarities between earthquakes and extreme financial events, we use a self-organized criticality-generating model to study herding and avalanche dynamics in financial markets. We consider a community of interacting investors, distributed in a small-world network, who bet on the bullish (increasing) or bearish (decreasing) behavior of the market which has been specified according to the S&P 500 historical time series. Remarkably, we find that the size of herding-related avalanches in the community can be strongly reduced by the presence of a relatively small percentage of traders, randomly distributed inside the network, who adopt a random investment strategy. Our findings suggest a promising strategy to limit the size of financial bubbles and crashes. We also obtain that the resulting wealth distribution of all traders corresponds to the well-known Pareto power law, while that of random traders is exponential. In other words, for technical traders, the risk of losses is much greater than the probability of gains compared to those of random traders.

  1. Accounting for Sampling Error in Genetic Eigenvalues Using Random Matrix Theory.

    PubMed

    Sztepanacz, Jacqueline L; Blows, Mark W

    2017-07-01

    The distribution of genetic variance in multivariate phenotypes is characterized by the empirical spectral distribution of the eigenvalues of the genetic covariance matrix. Empirical estimates of genetic eigenvalues from random-effects linear models are known to be overdispersed by sampling error, where large eigenvalues are biased upward and small eigenvalues are biased downward. The overdispersion of the leading eigenvalues of sample covariance matrices has been demonstrated to conform to the Tracy-Widom (TW) distribution. Here we show that genetic eigenvalues estimated using restricted maximum likelihood (REML) in a multivariate random-effects model with an unconstrained genetic covariance structure will also conform to the TW distribution after empirical scaling and centering. However, where estimation procedures using either REML or MCMC impose boundary constraints, the resulting genetic eigenvalues tend not to be TW distributed. We show how using confidence intervals from sampling distributions of genetic eigenvalues without reference to the TW distribution is insufficient protection against mistaking sampling error for genetic variance, particularly when eigenvalues are small. By scaling such sampling distributions to the appropriate TW distribution, the critical value of the TW statistic can be used to determine whether the magnitude of a genetic eigenvalue exceeds the sampling error for each eigenvalue in the spectral distribution of a given genetic covariance matrix. Copyright © 2017 by the Genetics Society of America.
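
    The overdispersion itself is easy to see in a toy simulation; the sketch below uses plain sample covariances of an identity-covariance population as a stand-in for REML genetic estimates, so it only illustrates the bias, not the TW scaling.

```python
# Toy illustration of eigenvalue overdispersion by sampling error: the true
# covariance is the identity (every population eigenvalue equals 1), yet the
# leading sample eigenvalue is biased upward and the smallest downward.
import numpy as np

rng = np.random.default_rng(15)
p, n, reps = 10, 50, 2000                   # traits, individuals, replicates
evs = np.array([np.linalg.eigvalsh(np.cov(rng.standard_normal((p, n))))
                for _ in range(reps)])
print(evs[:, -1].mean(), evs[:, 0].mean())  # both would be 1 without error
```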

  2. Evaluation of the path integral for flow through random porous media

    NASA Astrophysics Data System (ADS)

    Westbroek, Marise J. E.; Coche, Gil-Arnaud; King, Peter R.; Vvedensky, Dimitri D.

    2018-04-01

    We present a path integral formulation of Darcy's equation in one dimension with random permeability described by a correlated multivariate lognormal distribution. This path integral is evaluated with the Markov chain Monte Carlo method to obtain pressure distributions, which are shown to agree with the solutions of the corresponding stochastic differential equation for Dirichlet and Neumann boundary conditions. The extension of our approach to flow through random media in two and three dimensions is discussed.
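
    A minimal 1-D sketch under stated assumptions follows: an exponential covariance for the log-permeability (sampled by Cholesky factorization) and a direct finite-difference solve of (K p')' = 0, rather than the paper's path-integral MCMC evaluation, to produce one pressure realization for comparison.

```python
# Minimal 1-D sketch: correlated lognormal permeability (assumed exponential
# covariance, Cholesky sampling) and a finite-difference Darcy solve with
# Dirichlet ends; parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(8)
n, corr_len, sigma = 200, 0.1, 0.5                # assumed parameters
x = np.linspace(0.0, 1.0, n)
C = sigma ** 2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
logK = np.linalg.cholesky(C + 1e-10 * np.eye(n)) @ rng.standard_normal(n)
K = np.exp(logK)                                  # correlated lognormal field

Kf = 2 * K[:-1] * K[1:] / (K[:-1] + K[1:])        # harmonic mean on faces
A = np.zeros((n, n))
A[0, 0] = A[-1, -1] = 1.0                         # p(0) = 1, p(1) = 0
for i in range(1, n - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = Kf[i - 1], -(Kf[i - 1] + Kf[i]), Kf[i]
b = np.zeros(n)
b[0] = 1.0
p = np.linalg.solve(A, b)                         # one pressure realization
```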

  3. Testing a pollen-parent fecundity distribution model on seed-parent fecundity distributions in bee-pollinated forage legume polycrosses

    USDA-ARS?s Scientific Manuscript database

    Random mating (i.e., panmixis) is a fundamental assumption in quantitative genetics. In outcrossing bee-pollinated perennial forage legume polycrosses, mating is assumed by default to follow theoretical random mating. This assumption informs breeders of expected inbreeding estimates based on polycro...

  4. 29 CFR 1926.1413 - Wire rope-inspection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... Apparent deficiencies in this category are: (A) Visible broken wires, as follows: (1) In running wire ropes: Six randomly distributed broken wires in one rope lay or three broken wires in one strand in one rope... around the rope. (2) In rotation resistant ropes: Two randomly distributed broken wires in six rope...

  5. 29 CFR 1926.1413 - Wire rope-inspection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... Apparent deficiencies in this category are: (A) Visible broken wires, as follows: (1) In running wire ropes: Six randomly distributed broken wires in one rope lay or three broken wires in one strand in one rope... around the rope. (2) In rotation resistant ropes: Two randomly distributed broken wires in six rope...

  6. 29 CFR 1926.1413 - Wire rope-inspection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... Apparent deficiencies in this category are: (A) Visible broken wires, as follows: (1) In running wire ropes: Six randomly distributed broken wires in one rope lay or three broken wires in one strand in one rope... around the rope. (2) In rotation resistant ropes: Two randomly distributed broken wires in six rope...

  7. Does Mass Azithromycin Distribution Impact Child Growth and Nutrition in Niger? A Cluster-Randomized Trial

    PubMed Central

    Amza, Abdou; Yu, Sun N.; Kadri, Boubacar; Nassirou, Baido; Stoller, Nicole E.; Zhou, Zhaoxia; West, Sheila K.; Bailey, Robin L.; Gaynor, Bruce D.; Keenan, Jeremy D.; Porco, Travis C.; Lietman, Thomas M.

    2014-01-01

    Background Antibiotic use in animals leads to improved growth regardless of whether there is clinical evidence of infectious disease. Antibiotics used for trachoma control may have the unintended benefit of improving child growth. Methodology In this sub-study of a larger randomized controlled trial, we assess anthropometry of pre-school children in a community-randomized trial of mass oral azithromycin distributions for trachoma in Niger. We measured height, weight, and mid-upper arm circumference (MUAC) in 12 communities randomized to receive annual mass azithromycin treatment of everyone versus 12 communities randomized to receive biannual mass azithromycin treatments for children, 3 years after the initial mass treatment. We collected measurements in 1,034 children aged 6-60 months. Principal Findings We found no difference in the prevalence of wasting among children in the 12 annually treated communities that received three mass azithromycin distributions compared to the 12 biannually treated communities that received six mass azithromycin distributions (odds ratio = 0.88, 95% confidence interval = 0.53 to 1.49). Conclusions/Significance We were unable to demonstrate a statistically significant difference in stunting, underweight, and low MUAC of pre-school children between communities randomized to annual or biannual mass azithromycin treatment. The role of antibiotics in child growth and nutrition remains unclear, but larger studies and longitudinal trials may help determine any association. PMID:25210836

  8. A Randomized Clinical Trial of Methadone Maintenance for Prisoners: Prediction of Treatment Entry and Completion in Prison

    ERIC Educational Resources Information Center

    Gordon, Michael S.; Kinlock, Timothy W.; Couvillion, Kathryn A.; Schwartz, Robert P.; O'Grady, Kevin

    2012-01-01

    The present report is an intent-to-treat analysis involving secondary data drawn from the first randomized clinical trial of prison-initiated methadone in the United States. This study examined predictors of treatment entry and completion in prison. A sample of 211 adult male prerelease inmates with preincarceration heroin dependence were randomly…

  9. Examining the Missing Completely at Random Mechanism in Incomplete Data Sets: A Multiple Testing Approach

    ERIC Educational Resources Information Center

    Raykov, Tenko; Lichtenberg, Peter A.; Paulson, Daniel

    2012-01-01

    A multiple testing procedure for examining implications of the missing completely at random (MCAR) mechanism in incomplete data sets is discussed. The approach uses the false discovery rate concept and is concerned with testing group differences on a set of variables. The method can be used for ascertaining violations of MCAR and disproving this…
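
    The record above is truncated, so the following is only a generic illustration of the idea it names: test, variable by variable, whether cases with and without missingness differ, and control the false discovery rate across those tests. The data-generating mechanism below is an assumed MAR scenario that deliberately violates MCAR.

```python
# Generic illustration (not the authors' exact procedure): per-variable group
# comparisons with false-discovery-rate control; assumed MAR data.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(13)
n = 500
X = rng.normal(size=(n, 5))                        # fully observed variables
missing = rng.random(n) < stats.norm.cdf(X[:, 0])  # missingness tied to X[:,0]

pvals = [stats.ttest_ind(X[missing, j], X[~missing, j]).pvalue
         for j in range(X.shape[1])]
reject, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(reject)        # variable 0 flagged: evidence against MCAR
```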

  10. A Permutation-Randomization Approach to Test the Spatial Distribution of Plant Diseases.

    PubMed

    Lione, G; Gonthier, P

    2016-01-01

    The analysis of the spatial distribution of plant diseases requires the availability of trustworthy geostatistical methods. The mean distance tests (MDT) are here proposed as a series of permutation and randomization tests to assess the spatial distribution of plant diseases when the variable of phytopathological interest is categorical. A user-friendly software to perform the tests is provided. Estimates of power and type I error, obtained with Monte Carlo simulations, showed the reliability of the MDT (power > 0.80; type I error < 0.05). A biological validation on the spatial distribution of spores of two fungal pathogens causing root rot on conifers was successfully performed by verifying the consistency between the MDT responses and previously published data. An application of the MDT was carried out to analyze the relation between the plantation density and the distribution of the infection of Gnomoniopsis castanea, an emerging fungal pathogen causing nut rot on sweet chestnut. Trees carrying nuts infected by the pathogen were randomly distributed in areas with different plantation densities, suggesting that the distribution of G. castanea was not related to the plantation density. The MDT could be used to analyze the spatial distribution of plant diseases both in agricultural and natural ecosystems.
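
    A sketch in the spirit of the MDT follows; the statistic (mean pairwise distance within the infected label) and the synthetic tree map are assumptions, and the published tests differ in detail.

```python
# Permutation test in the spirit of the MDT: is the mean pairwise distance
# between infected trees smaller than random labelling would produce?
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(9)

def mean_distance_test(xy, infected, n_perm=5_000, rng=rng):
    """One-sided permutation p-value; small p means the label clusters."""
    D = squareform(pdist(xy))
    k = int(infected.sum())
    iu = np.triu_indices(k, 1)
    def stat(mask):
        return D[np.ix_(mask, mask)][iu].mean()
    observed = stat(infected)
    perms = [stat(rng.permutation(infected)) for _ in range(n_perm)]
    return np.mean(np.array(perms) <= observed)

xy = rng.random((120, 2))              # illustrative map of 120 trees
infected = xy[:, 0] < 0.25             # infection clustered to the west
print(mean_distance_test(xy, infected))
```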

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    ALAM,TODD M.

    Monte Carlo simulations of phosphate tetrahedron connectivity distributions in alkali and alkaline earth phosphate glasses are reported. By utilizing a discrete bond model, the distribution of next-nearest-neighbor connectivities between phosphate polyhedra for random, alternating, and clustering bonding scenarios was evaluated as a function of the relative bond energy difference. The simulated distributions are compared to experimentally observed connectivities reported for solid-state two-dimensional exchange and double-quantum NMR experiments on phosphate glasses. These Monte Carlo simulations demonstrate that the polyhedron connectivity is best described by a random distribution in lithium phosphate and calcium phosphate glasses.

  12. Generating and using truly random quantum states in Mathematica

    NASA Astrophysics Data System (ADS)

    Miszczak, Jarosław Adam

    2012-01-01

    The problem of generating random quantum states is of great interest from the quantum information theory point of view. In this paper we present a package for the Mathematica computing system harnessing a specific piece of hardware, namely the Quantis quantum random number generator (QRNG), for investigating statistical properties of quantum states. The described package implements a number of functions for generating random states, which use the Quantis QRNG as a source of randomness. It also provides procedures which can be used in simulations not related directly to quantum information processing.
    Program summary:
    Program title: TRQS
    Catalogue identifier: AEKA_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKA_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 7924
    No. of bytes in distributed program, including test data, etc.: 88 651
    Distribution format: tar.gz
    Programming language: Mathematica, C
    Computer: requires a Quantis quantum random number generator (QRNG, http://www.idquantique.com/true-random-number-generator/products-overview.html) and a recent version of Mathematica
    Operating system: any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit)
    RAM: case dependent
    Classification: 4.15
    Nature of problem: generation of random density matrices.
    Solution method: use of a physical quantum random number generator.
    Running time: generating 100 random numbers takes about 1 second; generating 1000 random density matrices takes more than a minute.

  13. The distribution of the intervals between neural impulses in the maintained discharges of retinal ganglion cells.

    PubMed

    Levine, M W

    1991-01-01

    Simulated neural impulse trains were generated by a digital realization of the integrate-and-fire model. The variability in these impulse trains had as its origin a random noise of specified distribution. Three different distributions were used: the normal (Gaussian) distribution (no skew, normokurtic), a first-order gamma distribution (positive skew, leptokurtic), and a uniform distribution (no skew, platykurtic). Despite these differences in the distribution of the variability, the distributions of the intervals between impulses were nearly indistinguishable. These inter-impulse distributions were better fit with a hyperbolic gamma distribution than a hyperbolic normal distribution, although one might expect a better approximation for normally distributed inverse intervals. Consideration of why the inter-impulse distribution is independent of the distribution of the causative noise suggests two putative interval distributions that do not depend on the assumed noise distribution: the log normal distribution, which is predicated on the assumption that long intervals occur with the joint probability of small input values, and the random walk equation, which is the diffusion equation applied to a random walk model of the impulse generating process. Either of these equations provides a more satisfactory fit to the simulated impulse trains than the hyperbolic normal or hyperbolic gamma distributions. These equations also provide better fits to impulse trains derived from the maintained discharges of ganglion cells in the retinae of cats or goldfish. It is noted that both equations are free from the constraint that the coefficient of variation (CV) have a maximum of unity.(ABSTRACT TRUNCATED AT 250 WORDS)
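
    A sketch of the simulation described follows: a digital integrate-and-fire generator accumulating noisy input to a threshold, firing, and resetting, driven by normal, first-order gamma, or uniform noise. The threshold and noise parameters are illustrative assumptions.

```python
# Digital integrate-and-fire sketch: accumulate noisy input to a threshold,
# emit an impulse, reset, and record inter-impulse intervals for each of the
# three noise distributions named in the abstract.
import numpy as np

rng = np.random.default_rng(10)

def intervals(noise, threshold=50.0, steps=200_000):
    v, last, out = 0.0, 0, []
    for t in range(steps):
        v += noise()                       # noisy per-step input
        if v >= threshold:
            out.append(t - last)           # record inter-impulse interval
            last, v = t, 0.0               # fire and reset
    return np.array(out)

mu = 1.0
for name, noise in [("normal", lambda: rng.normal(mu, 1.0)),
                    ("gamma", lambda: rng.gamma(1.0, mu)),          # leptokurtic
                    ("uniform", lambda: rng.uniform(0.0, 2 * mu))]:  # platykurtic
    iv = intervals(noise)
    print(name, iv.mean(), iv.std() / iv.mean())  # similar interval statistics
```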

  14. Multiwavelength generation in a random distributed feedback fiber laser using an all fiber Lyot filter.

    PubMed

    Sugavanam, S; Yan, Z; Kamynin, V; Kurkov, A S; Zhang, L; Churkin, D V

    2014-02-10

    Multiwavelength lasing in the random distributed feedback fiber laser is demonstrated by employing an all fiber Lyot filter. Stable multiwavelength generation is obtained, with each line exhibiting sub-nanometer line-widths. A flat power distribution over multiple lines is obtained, which indicates that the power between lines is redistributed in nonlinear mixing processes. The multiwavelength generation is observed both in first and second Stokes waves.

  15. Fragmentation under the Scaling Symmetry and Turbulent Cascade with Intermittency

    NASA Technical Reports Server (NTRS)

    Gorokhovski, M.

    2003-01-01

    Fragmentation plays an important role in a variety of physical, chemical, and geological processes. Examples include atomization in sprays, crushing of rocks, explosion and impact of solids, polymer degradation, etc. Although each individual action of fragmentation is a complex process, the number of these elementary actions is large. It is natural to abstract a simple 'effective' scenario of fragmentation and to represent its essential features. One such model is fragmentation under the scaling symmetry: each breakup action reduces the typical length of fragments, r → αr, by an independent random multiplier α (0 < α < 1), which is governed by the fragmentation intensity spectrum q(α), with ∫₀¹ q(α) dα = 1. This scenario was proposed by Kolmogorov (1941) in his consideration of the breakup of a solid carbon particle. Describing the breakup as a random discrete process, Kolmogorov showed that at late times such a process leads to the log-normal distribution. In Gorokhovski & Saveliev, fragmentation under the scaling symmetry has been revisited as a continuous evolution process, and new features have been established. The objective of this paper is twofold. First, the paper synthesizes and completes the theoretical part of Gorokhovski & Saveliev. Second, the paper shows a new application of the fragmentation theory under scale invariance, concerning the turbulent cascade with intermittency. We formulate a model describing the evolution of the velocity increment distribution along the progressively decreasing length scale. The model shows that as the turbulent length scale gets smaller, the velocity increment distribution develops a growing central peak and stretched tails. The intermittency in turbulence is manifested in the same way: large fluctuations of velocity provoke the highest strain in narrow (dissipative) regions of the flow.
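
    The discrete scenario is easy to reproduce numerically; in this sketch the intensity spectrum q(α) is an assumed uniform density on (0.2, 1), and the near-zero skewness and excess kurtosis of log r illustrate Kolmogorov's log-normal limit.

```python
# Sketch of the discrete scenario: repeated multiplication by independent
# random alpha makes log r a sum of i.i.d. terms, hence r tends to a
# log-normal.  The intensity spectrum q(alpha) is an assumed uniform density.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n_particles, n_steps = 100_000, 50
alpha = rng.uniform(0.2, 1.0, size=(n_particles, n_steps))  # assumed q(alpha)
r = np.prod(alpha, axis=1)               # fragment sizes after n_steps, r0 = 1

logr = np.log(r)
print(stats.skew(logr), stats.kurtosis(logr))  # both near 0: log-normality
```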

  16. Dynamics of transit times and StorAge Selection functions in four forested catchments from stable isotope data

    NASA Astrophysics Data System (ADS)

    Rodriguez, Nicolas B.; McGuire, Kevin J.; Klaus, Julian

    2017-04-01

    Transit time distributions, residence time distributions, and StorAge Selection functions are fundamental integrated descriptors of water storage, mixing, and release in catchments. In this contribution, we determined these time-variant functions in four neighboring forested catchments in the H.J. Andrews Experimental Forest, Oregon, USA, by employing a two-year time series of 18O in precipitation and discharge. Previous studies in these catchments assumed stationary, exponentially distributed transit times and complete mixing/random sampling to explore the influence of various catchment properties on the mean transit time. Here we relaxed such assumptions to relate transit time dynamics and the variability of StorAge Selection functions to catchment characteristics, catchment storage, and the seasonality of meteorological forcing. Conceptual models of the catchments, consisting of two reservoirs combined in series-parallel, were calibrated to discharge and stable isotope tracer data. We assumed randomly sampled/fully mixed conditions for each reservoir, which resulted in an incompletely mixed system overall. Based on the results, we solved the Master Equation, which describes the dynamics of water ages in storage and in catchment outflows. Consistently across all catchments, we found that transit times were generally shorter during wet periods, indicating the contribution of shallow storage (soil, saprolite) to discharge. During extended dry periods, transit times increased significantly, indicating the contribution of deeper storage (bedrock) to discharge. Our work indicated that the strong seasonality of precipitation impacts transit times by leading to a dynamic selection of stored water ages, whereas catchment size was not a control on transit times. In general, this work showed the usefulness of time-variant transit times in conceptual models and confirmed the catchment age-mixing behaviors emerging from other similar studies.

  17. Evaluation of Scat Deposition Transects versus Radio Telemetry for Developing a Species Distribution Model for a Rare Desert Carnivore, the Kit Fox.

    PubMed

    Dempsey, Steven J; Gese, Eric M; Kluever, Bryan M; Lonsinger, Robert C; Waits, Lisette P

    2015-01-01

    Development and evaluation of noninvasive methods for monitoring species distribution and abundance is a growing area of ecological research. While noninvasive methods have the advantage of reducing the risks associated with capture, comparisons to methods using more traditional invasive sampling are lacking. Historically, kit foxes (Vulpes macrotis) occupied the desert and semi-arid regions of southwestern North America. Once the most abundant carnivore in the Great Basin Desert of Utah, the species is now considered rare. In recent decades, attempts have been made to model the environmental variables influencing kit fox distribution. Using noninvasive scat deposition surveys for determination of kit fox presence, we modeled resource selection functions to predict kit fox distribution using three popular techniques (Maxent, fixed-effects, and mixed-effects generalized linear models) and compared these with similar models developed from invasive sampling (telemetry locations from radio-collared foxes). Resource selection functions were developed using a combination of landscape variables including elevation, slope, aspect, vegetation height, and soil type. All models were tested against subsequent scat collections as a method of model validation. We demonstrate the importance of comparing multiple model types for development of resource selection functions used to predict a species distribution, and of evaluating the importance of environmental variables to species distribution. All models we examined showed a large effect of elevation on kit fox presence, followed by slope and vegetation height. However, the invasive sampling method (i.e., radio-telemetry) appeared to be better at determining resource selection, and therefore may be more robust in predicting kit fox distribution. In contrast, the distribution maps created from the noninvasive sampling (i.e., scat transects) were significantly different from those of the invasive method; thus scat transects may be appropriate when used in an occupancy framework to predict species distribution. We concluded that while scat deposition transects may be useful for monitoring kit fox abundance and possibly occupancy, they do not appear to be appropriate for determining resource selection. In our study area, scat transects were biased toward roadways, while data collected using radio-telemetry were dictated by movements of the kit foxes themselves. We recommend that future studies applying noninvasive scat sampling consider a more robust random sampling design across the landscape (e.g., random transects or more complete road coverage) that would provide a more accurate and unbiased depiction of resource selection useful for predicting kit fox distribution.

  18. Classes of Split-Plot Response Surface Designs for Equivalent Estimation

    NASA Technical Reports Server (NTRS)

    Parker, Peter A.; Kowalski, Scott M.; Vining, G. Geoffrey

    2006-01-01

    When planning an experimental investigation, we are frequently faced with factors that are difficult or time-consuming to manipulate, thereby making complete randomization impractical. A split-plot structure differentiates between the experimental units associated with these hard-to-change factors and others that are relatively easy to change, and provides an efficient strategy that integrates the restrictions imposed by the experimental apparatus. Several industrial and scientific examples are presented to illustrate design considerations encountered in the restricted-randomization context. In this paper, we propose classes of split-plot response surface designs that provide an intuitive and natural extension from the completely randomized context. For these designs, the ordinary least squares estimates of the model are equivalent to the generalized least squares estimates. This property provides best linear unbiased estimators and simplifies model estimation. The design conditions that allow for equivalent estimation are presented, enabling design construction strategies to transform completely randomized Box-Behnken, equiradial, and small composite designs into a split-plot structure.

  19. The treatment of medial tibial stress syndrome in athletes; a randomized clinical trial

    PubMed Central

    2012-01-01

    Background The only three randomized trials on the treatment of medial tibial stress syndrome (MTSS) were all performed in military populations. The treatment options investigated in this study have not previously been examined in athletes. This study investigated whether the functional outcomes of three common treatment options for MTSS in athletes in a non-military setting were the same. Methods The study design was randomized and multi-centered. Physical therapists and sports physicians referred athletes with MTSS to the hospital for inclusion. 81 athletes were assessed for eligibility, of which 74 were included and randomized to three treatment groups. Group one performed a graded running program; group two performed a graded running program with additional stretching and strengthening exercises for the calves; group three performed a graded running program with an additional sports compression stocking. The primary outcome measure was time to complete a running program (able to run 18 minutes at high intensity), and the secondary outcome was general satisfaction with treatment. Results 74 athletes were randomized and included, of whom 14 did not complete the study due to a lack of progress (18.9%). The data were analyzed on an intention-to-treat basis. Time to complete a running program and general satisfaction with the treatment were not significantly different between the three treatment groups. Conclusion This was the first randomized trial on the treatment of MTSS in athletes in a non-military setting. No differences were found between the groups in the time to complete a running program. Trial registration CCMO; NL23471.098.08 PMID:22464032

  20. Directed Random Markets: Connectivity Determines Money

    NASA Astrophysics Data System (ADS)

    Martínez-Martínez, Ismael; López-Ruiz, Ricardo

    2013-12-01

    The Boltzmann-Gibbs (BG) distribution arises as the statistical equilibrium probability distribution of money among the agents of a closed economic system where random and undirected exchanges are allowed. When considering a model with uniform savings in the exchanges, the final distribution is close to the gamma family. In this paper, we implement these exchange rules on networks and we find that the stationary probability distributions are robust and are not affected by the topology of the underlying network. We introduce a new family of interactions: random but directed ones. In this case, the topology is found to be determinant, and the mean money per economic agent is related to the degree of the node representing the agent in the network. This relation between the mean money per economic agent and its degree is shown to be linear.
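
    A sketch of the baseline rule behind the BG result: undirected random pairwise exchanges in a closed system, with no savings and no network structure (the directed, networked case studied in the paper is not modeled here).

    ```python
    # Undirected random exchanges drive money toward an exponential (BG) law.
    import numpy as np

    rng = np.random.default_rng(0)
    N, steps = 1000, 200_000
    money = np.full(N, 100.0)                 # closed system: total money conserved

    for _ in range(steps):
        i, j = rng.choice(N, size=2, replace=False)
        eps = rng.random()                    # random split of the pooled money
        pool = money[i] + money[j]
        money[i], money[j] = eps * pool, (1 - eps) * pool

    # For an exponential distribution the standard deviation equals the mean:
    print(money.mean(), money.std())          # both close to 100
    ```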

  1. Smart darting diffusion Monte Carlo: Applications to lithium ion-Stockmayer clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, H. M.; Jake, L. C.; Curotto, E., E-mail: curotto@arcadia.edu

    2016-05-07

    In a recent investigation [K. Roberts et al., J. Chem. Phys. 136, 074104 (2012)], we have shown that, for a sufficiently complex potential, the Diffusion Monte Carlo (DMC) random walk can become quasiergodic, and we have introduced smart darting-like moves to improve the sampling. In this article, we systematically characterize the bias that smart darting moves introduce in the estimate of the ground state energy of a bosonic system. We then test a simple approach to eliminate completely such bias from the results. The approach is applied for the determination of the ground state of lithium ion-n–dipoles clusters in the n = 8–20 range. For these, the smart darting diffusion Monte Carlo simulations find the same ground state energy and mixed-distribution as the traditional approach for n < 14. In larger systems we find that while the ground state energies agree quantitatively with or without smart darting moves, the mixed-distributions can be significantly different. Some evidence is offered to conclude that introducing smart darting-like moves in traditional DMC simulations may produce a more reliable ground state mixed-distribution.

  2. Waiting time distributions in financial markets

    NASA Astrophysics Data System (ADS)

    Sabatelli, L.; Keating, S.; Dudley, J.; Richmond, P.

    2002-05-01

    We study waiting time distributions for data representing two completely different financial markets that have dramatically different characteristics. The first are data for the Irish market during the 19th century over the period 1850 to 1854. A total of 10 stocks out of a database of 60 are examined. The second database is for Japanese yen currency fluctuations during the latter part of the 20th century (1989-1992). The Irish stock activity was recorded on a daily basis and activity was characterised by waiting times that varied from one day to a few months. The Japanese yen data were recorded every minute over 24-hour periods and the waiting times varied from a minute to an hour or so. For both data sets, the waiting time distributions exhibit power law tails. The results for the Irish daily data can be easily interpreted using the model of a continuous time random walk first proposed by Montroll and applied recently to some financial data by Mainardi, Scalas and colleagues. The yen data show a quite different behaviour. For large waiting times, the Irish data exhibit a cut-off; the yen data exhibit two humps that could arise as a result of major trading centres around the world.

  3. Do the rich get richer? An empirical analysis of the Bitcoin transaction network.

    PubMed

    Kondor, Dániel; Pósfai, Márton; Csabai, István; Vattay, Gábor

    2014-01-01

    The possibility to analyze everyday monetary transactions is limited by the scarcity of available data, as this kind of information is usually considered highly sensitive. Present econophysics models are usually employed on presumed random networks of interacting agents, and only some macroscopic properties (e.g. the resulting wealth distribution) are compared to real-world data. In this paper, we analyze Bitcoin, which is a novel digital currency system, where the complete list of transactions is publicly available. Using this dataset, we reconstruct the network of transactions and extract the time and amount of each payment. We analyze the structure of the transaction network by measuring network characteristics over time, such as the degree distribution, degree correlations and clustering. We find that linear preferential attachment drives the growth of the network. We also study the dynamics taking place on the transaction network, i.e. the flow of money. We measure temporal patterns and the wealth accumulation. Investigating the microscopic statistics of money movement, we find that sublinear preferential attachment governs the evolution of the wealth distribution. We report a scaling law between the degree and wealth associated with individual nodes.

  4. Do the Rich Get Richer? An Empirical Analysis of the Bitcoin Transaction Network

    PubMed Central

    Kondor, Dániel; Pósfai, Márton; Csabai, István; Vattay, Gábor

    2014-01-01

    The possibility to analyze everyday monetary transactions is limited by the scarcity of available data, as this kind of information is usually considered highly sensitive. Present econophysics models are usually employed on presumed random networks of interacting agents, and only some macroscopic properties (e.g. the resulting wealth distribution) are compared to real-world data. In this paper, we analyze Bitcoin, which is a novel digital currency system, where the complete list of transactions is publicly available. Using this dataset, we reconstruct the network of transactions and extract the time and amount of each payment. We analyze the structure of the transaction network by measuring network characteristics over time, such as the degree distribution, degree correlations and clustering. We find that linear preferential attachment drives the growth of the network. We also study the dynamics taking place on the transaction network, i.e. the flow of money. We measure temporal patterns and the wealth accumulation. Investigating the microscopic statistics of money movement, we find that sublinear preferential attachment governs the evolution of the wealth distribution. We report a scaling law between the degree and wealth associated with individual nodes. PMID:24505257

  5. Effects of ignition location models on the burn patterns of simulated wildfires

    USGS Publications Warehouse

    Bar-Massada, A.; Syphard, A.D.; Hawbaker, T.J.; Stewart, S.I.; Radeloff, V.C.

    2011-01-01

    Fire simulation studies that use models such as FARSITE often assume that ignition locations are distributed randomly, because spatially explicit information about actual ignition locations is difficult to obtain. However, many studies show that the spatial distribution of ignition locations, whether human-caused or natural, is non-random. Thus, predictions from fire simulations based on random ignitions may be unrealistic. However, the extent to which the assumption of ignition location affects the predictions of fire simulation models has never been systematically explored. Our goal was to assess the difference in fire simulations that are based on random versus non-random ignition location patterns. We conducted four sets of 6000 FARSITE simulations for the Santa Monica Mountains in California to quantify the influence of random and non-random ignition locations and normal and extreme weather conditions on fire size distributions and spatial patterns of burn probability. Under extreme weather conditions, fires were significantly larger for non-random ignitions compared to random ignitions (mean area of 344.5 ha and 230.1 ha, respectively), but burn probability maps were highly correlated (r = 0.83). Under normal weather, random ignitions produced significantly larger fires than non-random ignitions (17.5 ha and 13.3 ha, respectively), and the spatial correlations between burn probability maps were not high (r = 0.54), though the difference in the average burn probability was small. The results of the study suggest that the location of ignitions used in fire simulation models may substantially influence the spatial predictions of fire spread patterns. However, the spatial bias introduced by using a random ignition location model may be minimized if the fire simulations are conducted under extreme weather conditions when fire spread is greatest. © 2010 Elsevier Ltd.

  6. Models for the hotspot distribution

    NASA Technical Reports Server (NTRS)

    Jurdy, Donna M.; Stefanick, Michael

    1990-01-01

    Published hotspot catalogs all show a hemispheric concentration beyond what can be expected by chance. Cumulative distributions about the center of concentration are described by a power law with a fractal dimension closer to 1 than 2. Random sets of the corresponding sizes do not show this effect. A simple shift of the random sets away from a point would produce distributions similar to those of hotspot sets. The possible relation of the hotspots to the locations of ridges and subduction zones is tested using large sets of randomly-generated points to estimate areas within given distances of the plate boundaries. The probability of finding the observed number of hotspots within 10 deg of the ridges is about what is expected.

  7. Physically Unclonable Cryptographic Primitives by Chemical Vapor Deposition of Layered MoS2.

    PubMed

    Alharbi, Abdullah; Armstrong, Darren; Alharbi, Somayah; Shahrjerdi, Davood

    2017-12-26

    Physically unclonable cryptographic primitives are promising for securing the rapidly growing number of electronic devices. Here, we introduce physically unclonable primitives from layered molybdenum disulfide (MoS2) by leveraging the natural randomness of their island growth during chemical vapor deposition (CVD). We synthesize a MoS2 monolayer film covered with speckles of multilayer islands, where the growth process is engineered for an optimal speckle density. Using the Clark-Evans test, we confirm that the distribution of islands on the film exhibits complete spatial randomness, indicating that the growth of multilayer speckles is a spatial Poisson process. Such a property is highly desirable for constructing unpredictable cryptographic primitives. The security primitive is an array of 2048 pixels fabricated from this film. The complex structure of the pixels makes the physical duplication of the array impossible (i.e., physically unclonable). A unique optical response is generated by applying an optical stimulus to the structure. The basis for this unique response is the dependence of the photoemission on the number of MoS2 layers, which by design is random throughout the film. Using a threshold value for the photoemission, we convert the optical response into binary cryptographic keys. We show that the proper selection of this threshold is crucial for maximizing combination randomness and that the optimal value of the threshold is linked directly to the growth process. This study reveals an opportunity for generating robust and versatile security primitives from layered transition metal dichalcogenides.
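
    A sketch of the Clark-Evans statistic on a synthetic pattern (edge corrections omitted): the ratio of the observed mean nearest-neighbour distance to its expectation under complete spatial randomness, 0.5/sqrt(density), is close to 1 for a Poisson pattern. Island coordinates extracted from micrographs would replace the uniform random points.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(0)
    pts = rng.random((2000, 2))               # island centres in a unit square
    d, _ = cKDTree(pts).query(pts, k=2)       # k=2: nearest neighbour besides self
    r_obs = d[:, 1].mean()
    density = len(pts) / 1.0                  # points per unit area
    r_csr = 0.5 / np.sqrt(density)            # CSR expectation
    print("Clark-Evans R =", r_obs / r_csr)   # ~1 indicates complete spatial randomness
    ```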

  8. Dissecting random and systematic differences between noisy composite data sets.

    PubMed

    Diederichs, Kay

    2017-04-01

    Composite data sets measured on different objects are usually affected by random errors, but may also be influenced by systematic (genuine) differences in the objects themselves, or the experimental conditions. If the individual measurements forming each data set are quantitative and approximately normally distributed, a correlation coefficient is often used to compare data sets. However, the relations between data sets are not obvious from the matrix of pairwise correlations, since the numerical value of the correlation coefficient is lowered by both random and systematic differences between the data sets. This work presents a multidimensional scaling analysis of the pairwise correlation coefficients which places data sets into a unit sphere within low-dimensional space, at a position given by their CC* values [as defined by Karplus & Diederichs (2012), Science, 336, 1030-1033] in the radial direction and by their systematic differences in one or more angular directions. This dimensionality reduction can be used not only for classification purposes, but also to derive data-set relations on a continuous scale. Projecting the arrangement of data sets onto the subspace spanned by systematic differences (the surface of a unit sphere) allows, irrespective of the random-error levels, the identification of clusters of closely related data sets. The method gains power with increasing numbers of data sets. It is illustrated with an example from low signal-to-noise ratio image processing, and an application in macromolecular crystallography is shown, but the approach is completely general and thus should be widely applicable.
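
    A simplified sketch of the idea: classical multidimensional scaling applied to distances derived from pairwise correlations, d_ij = sqrt(2(1 - CC_ij)), separates data sets by their systematic differences and error levels. The paper's specific radial placement via CC* is not reproduced here.

    ```python
    import numpy as np

    def classical_mds(D, dim=2):
        n = D.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n     # centering matrix
        B = -0.5 * J @ (D ** 2) @ J             # double-centred squared distances
        vals, vecs = np.linalg.eigh(B)
        idx = np.argsort(vals)[::-1][:dim]
        return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

    rng = np.random.default_rng(0)
    truth = rng.standard_normal(200)            # common underlying signal
    data = [truth + s * rng.standard_normal(200) for s in (0.5, 0.5, 2.0, 2.0)]
    D = np.sqrt(np.maximum(2 * (1 - np.corrcoef(data)), 0))
    print(classical_mds(D))                     # the two low-noise sets cluster
    ```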

  9. Nutrient intakes of US infants, toddlers, and preschoolers meet or exceed dietary reference intakes.

    PubMed

    Butte, Nancy F; Fox, Mary Kay; Briefel, Ronette R; Siega-Riz, Anna Maria; Dwyer, Johanna T; Deming, Denise M; Reidy, Kathleen C

    2010-12-01

    To assess the usual nutrient intakes of 3,273 US infants, toddlers, and preschoolers, aged 0 to 47 months, surveyed in the Feeding Infants and Toddlers Study (FITS) 2008; and to compare data on the usual nutrient intakes for the two waves of FITS conducted in 2002 and 2008. The FITS 2008 is a cross-sectional survey of a national random sample of US children from birth through age 47 months. Usual nutrient intakes derived from foods, beverages, and supplements were ascertained using a telephone-administered, multiple-pass 24-hour dietary recall. Infants aged birth to 5 months (n=382) and 6 to 11 months (n=505), toddlers aged 12 to 23 months (n=925), and preschoolers aged 24 to 47 months (n=1,461) were surveyed. All primary caregivers completed one 24-hour dietary recall and a random subsample (n=701) completed a second 24-hour dietary recall. The personal computer version of the Software for Intake Distribution Estimation was used to estimate the 10th, 25th, 50th, 75th, and 90th percentiles, as well as the proportions below and above cutoff values defined by the Dietary Reference Intakes or the 2005 Dietary Guidelines for Americans. Usual nutrient intakes met or exceeded energy and protein requirements with minimal risk of vitamin and mineral deficiencies. The usual intakes of antioxidants, B vitamins, bone-related nutrients, and other micronutrients were adequate relative to the Adequate Intakes or Estimated Average Requirements, except for iron and zinc in a small subset of older infants, and vitamin E and potassium in toddlers and preschoolers. Intakes of synthetic folate, preformed vitamin A, zinc, and sodium exceeded Tolerable Upper Intake Level in a significant proportion of toddlers and preschoolers. Macronutrient distributions were within acceptable macronutrient distribution ranges, except for dietary fat, in some toddlers and preschoolers. Dietary fiber was low in the vast majority of toddlers and preschoolers, and saturated fat intakes exceeded recommendations for the majority of preschoolers. The prevalence of inadequate intakes, excessive intake, and intakes outside the acceptable macronutrient distribution range was similar in FITS 2002 and FITS 2008. In FITS 2008, usual nutrient intakes were adequate for the majority of US infants, toddlers, and preschoolers, except for a small but important number of infants at risk for inadequate iron and zinc intakes. Diet quality should be improved in the transition from infancy to early childhood, particularly with respect to healthier fats and fiber in the diets of toddlers and preschoolers. Copyright © 2010 American Dietetic Association. Published by Elsevier Inc. All rights reserved.

  10. Bayesian approach to non-Gaussian field statistics for diffusive broadband terahertz pulses.

    PubMed

    Pearce, Jeremy; Jian, Zhongping; Mittleman, Daniel M

    2005-11-01

    We develop a closed-form expression for the probability distribution function for the field components of a diffusive broadband wave propagating through a random medium. We consider each spectral component to provide an individual observation of a random variable, the configurationally averaged spectral intensity. Since the intensity determines the variance of the field distribution at each frequency, this random variable serves as the Bayesian prior that determines the form of the non-Gaussian field statistics. This model agrees well with experimental results.
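
    A toy numerical illustration of the compound construction: a field that is Gaussian conditional on a random intensity acquires heavier-than-Gaussian tails once the intensity is averaged over. The exponential intensity prior below is an assumption for illustration, not the paper's derived distribution.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    intensity = rng.exponential(1.0, n)          # random variance per component
    field = rng.normal(0.0, np.sqrt(intensity))  # Gaussian given the intensity

    # Positive excess kurtosis reveals the non-Gaussian tails of the mixture:
    x = field - field.mean()
    print(np.mean(x**4) / np.mean(x**2)**2 - 3.0)  # ~3 (Laplace), vs 0 for Gaussian
    ```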

  11. Genetic algorithms as global random search methods

    NASA Technical Reports Server (NTRS)

    Peck, Charles C.; Dhawan, Atam P.

    1995-01-01

    Genetic algorithm behavior is described in terms of the construction and evolution of the sampling distributions over the space of candidate solutions. This novel perspective is motivated by analysis indicating that the schema theory is inadequate for completely and properly explaining genetic algorithm behavior. Based on the proposed theory, it is argued that the similarities of candidate solutions should be exploited directly, rather than encoding candidate solutions and then exploiting their similarities. Proportional selection is characterized as a global search operator, and recombination is characterized as the search process that exploits similarities. Sequential algorithms and many deletion methods are also analyzed. It is shown that by properly constraining the search breadth of recombination operators, convergence of genetic algorithms to a global optimum can be ensured.
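
    A minimal sketch of proportional selection, the operator characterized above as a global search operator; the population and fitness function are toy placeholders.

    ```python
    import numpy as np

    def proportional_selection(population, fitness, rng):
        p = fitness / fitness.sum()          # selection probability proportional to fitness
        idx = rng.choice(len(population), size=len(population), p=p)
        return population[idx]

    rng = np.random.default_rng(0)
    pop = rng.random((6, 4))                 # 6 candidate solutions
    fit = pop.sum(axis=1)                    # toy fitness function
    print(proportional_selection(pop, fit, rng))
    ```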

  12. Genetic algorithms as global random search methods

    NASA Technical Reports Server (NTRS)

    Peck, Charles C.; Dhawan, Atam P.

    1995-01-01

    Genetic algorithm behavior is described in terms of the construction and evolution of the sampling distributions over the space of candidate solutions. This novel perspective is motivated by analysis indicating that the schema theory is inadequate for completely and properly explaining genetic algorithm behavior. Based on the proposed theory, it is argued that the similarities of candidate solutions should be exploited directly, rather than encoding candidate solutions and then exploiting their similarities. Proportional selection is characterized as a global search operator, and recombination is characterized as the search process that exploits similarities. Sequential algorithms and many deletion methods are also analyzed. It is shown that by properly constraining the search breadth of recombination operators, convergence of genetic algorithms to a global optimum can be ensured.

  13. XRootd, disk-based, caching proxy for optimization of data access, data placement and data replication

    NASA Astrophysics Data System (ADS)

    Bauerdick, L. A. T.; Bloom, K.; Bockelman, B.; Bradley, D. C.; Dasu, S.; Dost, J. M.; Sfiligoi, I.; Tadel, A.; Tadel, M.; Wuerthwein, F.; Yagil, A.; Cms Collaboration

    2014-06-01

    Following the success of the XRootd-based US CMS data federation, the AAA project investigated extensions of the federation architecture by developing two sample implementations of an XRootd, disk-based, caching proxy. The first one simply starts fetching a whole file as soon as a file open request is received, and is suitable when completely random file access is expected or it is already known that the whole file will be read. The second implementation supports on-demand downloading of partial files. Extensions to the Hadoop Distributed File System have been developed to allow for an immediate fallback to network access when local HDFS storage fails to provide the requested block. Both cache implementations are in pre-production testing at UCSD.

  14. Effect of packing method on the randomness of disc packings

    NASA Astrophysics Data System (ADS)

    Zhang, Z. P.; Yu, A. B.; Oakeshott, R. B. S.

    1996-06-01

    The randomness of disc packings, generated by random sequential adsorption (RSA), random packing under gravity (RPG) and Mason packing (MP), which gives a packing density close to that of the RSA packing, has been analysed based on the Delaunay tessellation, and is evaluated at two levels: the randomness at the individual subunit level, which relates to the construction of a triangle from a given edge-length distribution, and the randomness at the network level, which relates to the connections between triangles from a given triangle frequency distribution. The Delaunay tessellation itself is also analysed and its almost perfect randomness at the two levels is demonstrated, which verifies the proposed approach and provides a random reference system for the present analysis. It is found that (i) the construction of a triangle subunit is not random for the RSA, MP and RPG packings, with the degree of randomness decreasing from the RSA to the MP and then to the RPG packing; (ii) the connection of triangular subunits in the network is almost perfectly random for the RSA packing, acceptable for the MP packing and not good for the RPG packing. The packing method is an important factor governing the randomness of disc packings.
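
    A sketch of the basic ingredients of such an analysis: the Delaunay tessellation of a point set and the resulting edge-length distribution. Uniform random points stand in here for disc centres produced by a packing algorithm.

    ```python
    import numpy as np
    from scipy.spatial import Delaunay

    rng = np.random.default_rng(0)
    pts = rng.random((1000, 2))
    tri = Delaunay(pts)

    edges = set()
    for simplex in tri.simplices:            # collect unique triangle edges
        for a, b in ((0, 1), (1, 2), (0, 2)):
            edges.add(tuple(sorted((simplex[a], simplex[b]))))

    lengths = np.array([np.linalg.norm(pts[i] - pts[j]) for i, j in edges])
    print(lengths.mean(), lengths.std())     # summary of the edge-length distribution
    ```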

  15. Nature of alpha and beta particles in glycogen using molecular size distributions.

    PubMed

    Sullivan, Mitchell A; Vilaplana, Francisco; Cave, Richard A; Stapleton, David; Gray-Weale, Angus A; Gilbert, Robert G

    2010-04-12

    Glycogen is a randomly hyperbranched glucose polymer. Complex branched polymers have two structural levels: individual branches and the way these branches are linked. Liver glycogen has a third level: supramolecular clusters of beta particles which form larger clusters of alpha particles. Size distributions of native glycogen were characterized using size exclusion chromatography (SEC) to find the number and weight distributions and the size dependences of the number- and weight-average masses. These were fitted to two distinct randomly joined reference structures, constructed by random attachment of individual branches and as random aggregates of beta particles. The z-average size of the alpha particles in dimethylsulfoxide does not change significantly with high concentrations of LiBr, a solvent system that would disrupt hydrogen bonding. These data reveal that the beta particles are covalently bonded to form alpha particles through a hitherto unsuspected enzyme process, operative in the liver on particles above a certain size range.

  16. Noise-Induced Synchronization among Sub-RF CMOS Analog Oscillators for Skew-Free Clock Distribution

    NASA Astrophysics Data System (ADS)

    Utagawa, Akira; Asai, Tetsuya; Hirose, Tetsuya; Amemiya, Yoshihito

    We present on-chip oscillator arrays synchronized by random noise, aiming at skew-free clock distribution on synchronous digital systems. Nakao et al. recently reported that independent neural oscillators can be synchronized by applying temporal random impulses to the oscillators [1], [2]. We regard neural oscillators as independent clock sources on LSIs; i.e., clock sources are distributed on LSIs, and they are forced to synchronize through the use of random noise. We designed neuron-based clock generators operating in the sub-RF region (<1 GHz) by modifying the original neuron model to a new model that is suitable for CMOS implementation with 0.25-μm CMOS parameters. Through circuit simulations, we demonstrate that i) the clock generators are indeed synchronized by pseudo-random noise and ii) the clock generators exhibit phase-locked oscillations even when they have small device mismatches.
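
    A toy phase-oscillator demonstration of the mechanism: two uncoupled oscillators with identical frequency receive the same random impulses and their phases lock. The sinusoidal phase-response curve and the impulse rate are assumptions for illustration, not the paper's circuit model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    omega, k, dt = 2 * np.pi, 0.3, 1e-3
    theta = np.array([0.0, 2.5])                  # different initial phases

    for _ in range(200_000):
        theta += omega * dt                       # free-running oscillation
        if rng.random() < 50 * dt:                # common Poisson impulses, rate 50/s
            theta += -k * np.sin(theta)           # assumed phase-response curve

    diff = (theta[0] - theta[1] + np.pi) % (2 * np.pi) - np.pi
    print("residual phase difference:", diff)     # ~0: noise-induced phase locking
    ```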

  17. Generation of pseudo-random numbers

    NASA Technical Reports Server (NTRS)

    Howell, L. W.; Rheinfurth, M. H.

    1982-01-01

    Practical methods for generating acceptable random numbers from a variety of probability distributions which are frequently encountered in engineering applications are described. The speed, accuracy, and guarantee of statistical randomness of the various methods are discussed.
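
    One such practical method is inverse-transform sampling, sketched here for the exponential distribution: pass a uniform variate through the inverse cumulative distribution function.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    lam = 2.0
    u = rng.random(100_000)          # uniform variates on [0, 1)
    x = -np.log1p(-u) / lam          # inverse CDF of the exponential distribution
    print(x.mean())                  # ~1/lam = 0.5
    ```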

  18. Emergence of an optimal search strategy from a simple random walk

    PubMed Central

    Sakiyama, Tomoko; Gunji, Yukio-Pegio

    2013-01-01

    In reports addressing animal foraging strategies, it has been stated that Lévy-like algorithms represent an optimal search strategy in an unknown environment, because of their super-diffusion properties and power-law-distributed step lengths. Here, starting with a simple random walk algorithm, which offers the agent a randomly determined direction at each time step with a fixed move length, we investigated how flexible exploration is achieved if an agent alters its randomly determined next step forward and the rule that controls its random movement based on its own directional moving experiences. We showed that our algorithm led to an effective food-searching performance compared with a simple random walk algorithm and exhibited super-diffusion properties, despite the uniform step lengths. Moreover, our algorithm exhibited a power-law distribution independent of uniform step lengths. PMID:23804445

  19. Emergence of an optimal search strategy from a simple random walk.

    PubMed

    Sakiyama, Tomoko; Gunji, Yukio-Pegio

    2013-09-06

    In reports addressing animal foraging strategies, it has been stated that Lévy-like algorithms represent an optimal search strategy in an unknown environment, because of their super-diffusion properties and power-law-distributed step lengths. Here, starting with a simple random walk algorithm, which offers the agent a randomly determined direction at each time step with a fixed move length, we investigated how flexible exploration is achieved if an agent alters its randomly determined next step forward and the rule that controls its random movement based on its own directional moving experiences. We showed that our algorithm led to an effective food-searching performance compared with a simple random walk algorithm and exhibited super-diffusion properties, despite the uniform step lengths. Moreover, our algorithm exhibited a power-law distribution independent of uniform step lengths.

  20. Multi-peak structure of generation spectrum of random distributed feedback fiber Raman lasers.

    PubMed

    Vatnik, I D; Zlobina, E A; Kablukov, S I; Babin, S A

    2017-02-06

    We study spectral features of the generation of a random distributed feedback fiber Raman laser arising from the two-peak shape of the Raman gain spectral profile realized in germanosilicate fibers. We demonstrate that the number of peaks can be calculated using a power balance model that considers different subcomponents within each Stokes component.

  1. Robustness of optimal random searches in fragmented environments

    NASA Astrophysics Data System (ADS)

    Wosniack, M. E.; Santos, M. C.; Raposo, E. P.; Viswanathan, G. M.; da Luz, M. G. E.

    2015-05-01

    The random search problem is a challenging and interdisciplinary topic of research in statistical physics. Realistic searches usually take place in nonuniform heterogeneous distributions of targets, e.g., patchy environments and fragmented habitats in ecological systems. Here we present a comprehensive numerical study of search efficiency in arbitrarily fragmented landscapes with unlimited visits to targets that can only be found within patches. We assume a random walker selecting uniformly distributed turning angles and step lengths from an inverse power-law tailed distribution with exponent μ. Our main finding is that for a large class of fragmented environments the optimal strategy corresponds approximately to the same value μopt ≈ 2. Moreover, this exponent is indistinguishable from the well-known exact optimal value μopt = 2 for the low-density limit of homogeneously distributed revisitable targets. Surprisingly, the best search strategies do not depend (or depend only weakly) on the specific details of the fragmentation. Finally, we discuss the mechanisms behind this observed robustness and comment on the relevance of our results to both the random search theory in general, as well as specifically to the foraging problem in the biological context.
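
    A sketch of how the walker's steps can be drawn: inverse-transform sampling of a power-law tailed step-length distribution p(l) ~ l^(-mu) for l >= l0, paired with uniform turning angles.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    mu, l0 = 2.0, 1.0                       # mu ~ 2 is the reported optimum
    u = rng.random(10)
    steps = l0 * u ** (-1.0 / (mu - 1.0))   # inverse CDF of the Pareto tail
    angles = rng.uniform(0, 2 * np.pi, 10)  # uniformly distributed turning angles
    print(np.column_stack([steps, angles]))
    ```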

  2. Random deflections of a string on an elastic foundation.

    NASA Technical Reports Server (NTRS)

    Sanders, J. L., Jr.

    1972-01-01

    The paper is concerned with the problem of a taut string on a random elastic foundation subjected to random loads. The boundary value problem is transformed into an initial value problem by the method of invariant imbedding. Fokker-Planck equations for the random initial value problem are formulated and solved in some special cases. The analysis leads to a complete characterization of the random deflection function.

  3. Metabolic effects of soy supplementation in postmenopausal Caucasian and African American women: a randomized, placebo-controlled trial.

    PubMed

    Christie, Daniel R; Grant, Jan; Darnell, Betty E; Chapman, Victoria R; Gastaldelli, Amalia; Sites, Cynthia K

    2010-08-01

    We sought to determine the effect of daily soy supplementation on abdominal fat, glucose metabolism, and circulating inflammatory markers and adipokines in obese, postmenopausal Caucasian and African American women. In a double-blinded controlled trial, 39 postmenopausal women were randomized to soy supplementation or to a casein placebo without isoflavones. In all, 33 completed the study and were analyzed. At baseline and at 3 months, glucose disposal and insulin secretion were measured using hyperglycemic clamps, body composition and body fat distribution were measured by computed tomographic scan and dual energy x-ray absorptiometry, and serum levels of C-reactive protein, interleukin-6, tumor necrosis factor-alpha, leptin, and adiponectin were measured by immunoassay. Soy supplementation reduced total and subcutaneous abdominal fat and interleukin-6. No difference between groups was noted for glucose metabolism, C-reactive protein, tumor necrosis factor-alpha, leptin, or adiponectin. Soy supplementation reduced abdominal fat in obese postmenopausal women. Caucasians primarily lost subcutaneous and total abdominal fat, and African Americans primarily lost total body fat. Copyright (c) 2010 Mosby, Inc. All rights reserved.

  4. Monte Carlo simulation of reflection spectra of random multilayer media strongly scattering and absorbing light

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meglinskii, I V

    2001-12-31

    The reflection spectra of a multilayer random medium - the human skin - strongly scattering and absorbing light are numerically simulated. The propagation of light in the medium and the absorption spectra are simulated by the stochastic Monte Carlo method, which combines schemes for calculations of real photon trajectories and the statistical weight method. The model takes into account the inhomogeneous spatial distribution of blood vessels, water, and melanin, the degree of blood oxygenation, and the hematocrit index. The attenuation of the incident radiation caused by reflection and refraction at Fresnel boundaries of layers inside the medium is also considered. The simulated reflection spectra are compared with the experimental reflection spectra of the human skin. It is shown that a set of parameters that was used to describe the optical properties of skin layers and their possible variations, despite being far from complete, is nevertheless sufficient for the simulation of the reflection spectra of the human skin and their quantitative analysis. (laser applications and other topics in quantum electronics)
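
    A heavily simplified sketch of the statistical-weight idea in photon Monte Carlo: packets random-walk with exponential free paths and, instead of being absorbed outright, lose a fraction of their weight at each interaction. A real skin model adds layers, anisotropic phase functions, chromophore distributions and Fresnel boundaries; the coefficients below are placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    mu_s, mu_a = 10.0, 1.0                        # scattering/absorption, 1/mm
    mu_t = mu_s + mu_a
    albedo = mu_s / mu_t

    reflected, n_photons = 0.0, 5000
    for _ in range(n_photons):
        z, uz, w = 0.0, 1.0, 1.0                  # depth, direction cosine, weight
        while w > 1e-4:
            z += uz * rng.exponential(1.0 / mu_t) # free path to next interaction
            if z < 0.0:                           # escaped back through the surface
                reflected += w
                break
            w *= albedo                           # deposit (1 - albedo) of the weight
            uz = rng.uniform(-1.0, 1.0)           # isotropic scattering (simplified)
    print("diffuse reflectance ~", reflected / n_photons)
    ```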

  5. Application of Multiple Imputation for Missing Values in Three-Way Three-Mode Multi-Environment Trial Data

    PubMed Central

    Tian, Ting; McLachlan, Geoffrey J.; Dieters, Mark J.; Basford, Kaye E.

    2015-01-01

    It is a common occurrence in plant breeding programs to observe missing values in three-way three-mode multi-environment trial (MET) data. We proposed modifications of models for estimating missing observations for these data arrays, and developed a novel approach in terms of hierarchical clustering. Multiple imputation (MI) was used in four ways: multiple agglomerative hierarchical clustering, a normal distribution model, a normal regression model, and predictive mean matching. The latter three models used both Bayesian analysis and non-Bayesian analysis, while the first approach used a clustering procedure with randomly selected attributes and assigned real values from the nearest neighbour to the one with missing observations. Different proportions of data entries in six complete datasets were randomly selected to be missing, and the MI methods were compared based on the efficiency and accuracy of estimating those values. The results indicated that the models using Bayesian analysis had slightly higher accuracy of estimation performance than those using non-Bayesian analysis, but they were more time-consuming. However, the novel approach of multiple agglomerative hierarchical clustering demonstrated the overall best performance. PMID:26689369

  6. Properties of plane discrete Poisson-Voronoi tessellations on triangular tiling formed by the Kolmogorov-Johnson-Mehl-Avrami growth of triangular islands

    NASA Astrophysics Data System (ADS)

    Korobov, A.

    2011-08-01

    Discrete uniform Poisson-Voronoi tessellations of two-dimensional triangular tilings resulting from the Kolmogorov-Johnson-Mehl-Avrami (KJMA) growth of triangular islands have been studied. This shape of tiles and islands, rarely considered in the field of random tessellations, is prompted by the birth-growth process of Ir(210) faceting. The growth mode determines a triangular metric different from the Euclidean metric. Kinetic characteristics of tessellations appear to be metric sensitive, in contrast to area distributions. The latter have been studied for the variant of nuclei growth to the first impingement in addition to the conventional case of complete growth. Kiang conjecture works in both cases. The averaged number of neighbors is six for all studied densities of random tessellations, but neighbors appear to be mainly different in triangular and Euclidean metrics. Also, the applicability of the obtained results for simulating birth-growth processes when the 2D nucleation and impingements are combined with the 3D growth in the particular case of similar shape and the same orientation of growing nuclei is briefly discussed.

  7. Properties of plane discrete Poisson-Voronoi tessellations on triangular tiling formed by the Kolmogorov-Johnson-Mehl-Avrami growth of triangular islands.

    PubMed

    Korobov, A

    2011-08-01

    Discrete uniform Poisson-Voronoi tessellations of two-dimensional triangular tilings resulting from the Kolmogorov-Johnson-Mehl-Avrami (KJMA) growth of triangular islands have been studied. This shape of tiles and islands, rarely considered in the field of random tessellations, is prompted by the birth-growth process of Ir(210) faceting. The growth mode determines a triangular metric different from the Euclidean metric. Kinetic characteristics of tessellations appear to be metric sensitive, in contrast to area distributions. The latter have been studied for the variant of nuclei growth to the first impingement in addition to the conventional case of complete growth. Kiang conjecture works in both cases. The averaged number of neighbors is six for all studied densities of random tessellations, but neighbors appear to be mainly different in triangular and Euclidean metrics. Also, the applicability of the obtained results for simulating birth-growth processes when the 2D nucleation and impingements are combined with the 3D growth in the particular case of similar shape and the same orientation of growing nuclei is briefly discussed.
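
    A numerical sketch of a Kiang-type check in the Euclidean metric: areas of the bounded Voronoi cells of a uniform random pattern are well fitted by a gamma distribution (the point count and unit-square domain are illustrative choices).

    ```python
    import numpy as np
    from scipy.spatial import Voronoi
    from scipy.stats import gamma

    rng = np.random.default_rng(0)
    vor = Voronoi(rng.random((3000, 2)))

    def polygon_area(xy):                     # shoelace formula
        x, y = xy[:, 0], xy[:, 1]
        return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

    areas = []
    for region_idx in vor.point_region:
        region = vor.regions[region_idx]
        if not region or -1 in region:        # skip unbounded border cells
            continue
        areas.append(polygon_area(vor.vertices[region]))

    shape, loc, scale = gamma.fit(areas, floc=0)
    print("fitted gamma shape:", shape)       # ~3.6 for 2D Poisson-Voronoi areas
    ```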

  8. Application of Multiple Imputation for Missing Values in Three-Way Three-Mode Multi-Environment Trial Data.

    PubMed

    Tian, Ting; McLachlan, Geoffrey J; Dieters, Mark J; Basford, Kaye E

    2015-01-01

    It is a common occurrence in plant breeding programs to observe missing values in three-way three-mode multi-environment trial (MET) data. We proposed modifications of models for estimating missing observations for these data arrays, and developed a novel approach in terms of hierarchical clustering. Multiple imputation (MI) was used in four ways: multiple agglomerative hierarchical clustering, a normal distribution model, a normal regression model, and predictive mean matching. The latter three models used both Bayesian analysis and non-Bayesian analysis, while the first approach used a clustering procedure with randomly selected attributes and assigned real values from the nearest neighbour to the one with missing observations. Different proportions of data entries in six complete datasets were randomly selected to be missing, and the MI methods were compared based on the efficiency and accuracy of estimating those values. The results indicated that the models using Bayesian analysis had slightly higher accuracy of estimation performance than those using non-Bayesian analysis, but they were more time-consuming. However, the novel approach of multiple agglomerative hierarchical clustering demonstrated the overall best performance.

  9. A New Stratified Sampling Procedure which Decreases Error Estimation of Varroa Mite Number on Sticky Boards.

    PubMed

    Kretzschmar, A; Durand, E; Maisonnasse, A; Vallon, J; Le Conte, Y

    2015-06-01

    A new procedure of stratified sampling is proposed in order to establish an accurate estimation of Varroa destructor populations on sticky bottom boards of the hive. It is based on spatial sampling theory, which recommends regular grid stratification in the case of a spatially structured process. Because the distribution of varroa mites on sticky boards is observed to be spatially structured, we designed a sampling scheme based on a regular grid with circles centered on each grid element. This new procedure is then compared with a former method using partially random sampling. Relative error improvements are exposed on the basis of a large sample of simulated sticky boards (n=20,000) which provides a complete range of spatial structures, from a random structure to a highly frame-driven structure. The improvement of varroa mite number estimation is then measured by the percentage of counts with an error greater than a given level. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. A Randomized Intervention Study to Evaluate Whether Electronic Messaging Can Increase Human Papillomavirus Vaccine Completion and Knowledge among College Students

    ERIC Educational Resources Information Center

    Richman, Alice R.; Maddy, LaDonna; Torres, Essie; Goldberg, Ellen J.

    2016-01-01

    Objective: To evaluate an intervention aimed at increasing human papillomavirus (HPV) vaccine completion of the 3-dose series and knowledge. Participants: Two hundred sixty-four male and female US college students 18-26 years old who were receiving HPV vaccine dose 1. Methods: Students were randomly assigned to the intervention or control group.…

  11. Cascaded Raman lasing in a PM phosphosilicate fiber with random distributed feedback

    NASA Astrophysics Data System (ADS)

    Lobach, Ivan A.; Kablukov, Sergey I.; Babin, Sergey A.

    2018-02-01

    We report on the first demonstration of a linearly polarized cascaded Raman fiber laser based on a simple half-open cavity with a broadband composite reflector and random distributed feedback in a polarization-maintaining phosphosilicate fiber operating beyond the zero-dispersion wavelength (~1400 nm). With increasing pump power from a Yb-doped fiber laser at 1080 nm, the random laser successively generates 8 W at 1262 nm and 9 W at 1515 nm with a polarization extinction ratio of 27 dB. The generation linewidths amount to about 1 nm and 3 nm, respectively, and are almost independent of power, in correspondence with the theory of cascaded random lasing.

  12. Effects of vibration and shock on the performance of gas-bearing space-power Brayton cycle turbomachinery. Part 3: Sinusoidal and random vibration data reduction and evaluation, and random vibration probability analysis

    NASA Technical Reports Server (NTRS)

    Tessarzik, J. M.; Chiang, T.; Badgley, R. H.

    1973-01-01

    The random vibration response of a gas bearing rotor support system has been experimentally and analytically investigated in the amplitude and frequency domains. The NASA Brayton Rotating Unit (BRU), a 36,000 rpm, 10 KWe turbogenerator, had previously been subjected in the laboratory to external random vibrations, and the response data recorded on magnetic tape. This data has now been experimentally analyzed for amplitude distribution and frequency content. The results of the power spectral density analysis indicate strong vibration responses for the major rotor-bearing system components at frequencies which correspond closely to their resonant frequencies obtained under periodic vibration testing. The results of amplitude analysis indicate an increasing shift towards non-Gaussian distributions as the input level of external vibrations is raised. Analysis of the axial random vibration response of the BRU was performed by using a linear three-mass model. Power spectral densities and the root-mean-square value of thrust bearing surface contact were calculated for specified input random excitation.

  13. Generating constrained randomized sequences: item frequency matters.

    PubMed

    French, Robert M; Perruchet, Pierre

    2009-11-01

    All experimental psychologists understand the importance of randomizing lists of items. However, randomization is generally constrained, and these constraints (in particular, not allowing immediately repeated items), which are designed to eliminate particular biases, frequently engender others. We describe a simple Monte Carlo randomization technique that solves a number of these problems. However, in many experimental settings, we are concerned not only with the number and distribution of items but also with the number and distribution of transitions between items. The algorithm mentioned above provides no control over this. We therefore introduce a simple technique that uses transition tables for generating correctly randomized sequences. We present an analytic method of producing item-pair frequency tables and item-pair transitional probability tables when immediate repetitions are not allowed. We illustrate these difficulties, and how to overcome them, with reference to a classic article on word segmentation in infants. Finally, we provide free access to an Excel file that allows users to generate transition tables with up to 10 different item types, as well as to generate appropriately distributed randomized sequences of any length without immediately repeated elements. This file is freely available from http://leadserv.u-bourgogne.fr/IMG/xls/TransitionMatrix.xls.
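
    A sketch of the simplest Monte Carlo variant mentioned above: shuffle until no item immediately repeats. This controls item order only; the transition-table technique is needed when transition frequencies must also be controlled.

    ```python
    import random

    def shuffle_no_repeats(items, rng=random, max_tries=10_000):
        seq = list(items)
        for _ in range(max_tries):
            rng.shuffle(seq)
            if all(a != b for a, b in zip(seq, seq[1:])):
                return seq
        raise RuntimeError("no valid sequence found")

    print(shuffle_no_repeats(["A"] * 3 + ["B"] * 3 + ["C"] * 3))
    ```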

  14. A practical method to test the validity of the standard Gumbel distribution in logit-based multinomial choice models of travel behavior

    DOE PAGES

    Ye, Xin; Garikapati, Venu M.; You, Daehyun; ...

    2017-11-08

    Most multinomial choice models (e.g., the multinomial logit model) adopted in practice assume an extreme-value Gumbel distribution for the random components (error terms) of utility functions. This distributional assumption offers a closed-form likelihood expression when the utility maximization principle is applied to model choice behaviors. As a result, model coefficients can be easily estimated using the standard maximum likelihood estimation method. However, maximum likelihood estimators are consistent and efficient only if distributional assumptions on the random error terms are valid. It is therefore critical to test the validity of underlying distributional assumptions on the error terms that form the basis of parameter estimation and policy evaluation. In this paper, a practical yet statistically rigorous method is proposed to test the validity of the distributional assumption on the random components of utility functions in both the multinomial logit (MNL) model and multiple discrete-continuous extreme value (MDCEV) model. Based on a semi-nonparametric approach, a closed-form likelihood function that nests the MNL or MDCEV model being tested is derived. The proposed method allows traditional likelihood ratio tests to be used to test violations of the standard Gumbel distribution assumption. Simulation experiments are conducted to demonstrate that the proposed test yields acceptable Type-I and Type-II error probabilities at commonly available sample sizes. The test is then applied to three real-world discrete and discrete-continuous choice models. For all three models, the proposed test rejects the validity of the standard Gumbel distribution in most utility functions, calling for the development of robust choice models that overcome adverse effects of violations of distributional assumptions on the error terms in random utility functions.
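
    A simulation sketch of the link between the Gumbel assumption and the closed-form logit likelihood: with i.i.d. standard Gumbel errors added to systematic utilities, simulated choice frequencies reproduce the softmax formula (the utility values below are arbitrary illustrations).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    V = np.array([1.0, 0.5, -0.2])                 # systematic utilities
    n = 200_000
    eps = rng.gumbel(size=(n, 3))                  # standard Gumbel errors
    choices = np.argmax(V + eps, axis=1)           # utility maximization

    simulated = np.bincount(choices, minlength=3) / n
    softmax = np.exp(V) / np.exp(V).sum()          # closed-form logit probabilities
    print(simulated, softmax)                      # the two agree closely
    ```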

  15. A practical method to test the validity of the standard Gumbel distribution in logit-based multinomial choice models of travel behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Xin; Garikapati, Venu M.; You, Daehyun

    Most multinomial choice models (e.g., the multinomial logit model) adopted in practice assume an extreme-value Gumbel distribution for the random components (error terms) of utility functions. This distributional assumption offers a closed-form likelihood expression when the utility maximization principle is applied to model choice behaviors. As a result, model coefficients can be easily estimated using the standard maximum likelihood estimation method. However, maximum likelihood estimators are consistent and efficient only if distributional assumptions on the random error terms are valid. It is therefore critical to test the validity of underlying distributional assumptions on the error terms that form the basis of parameter estimation and policy evaluation. In this paper, a practical yet statistically rigorous method is proposed to test the validity of the distributional assumption on the random components of utility functions in both the multinomial logit (MNL) model and multiple discrete-continuous extreme value (MDCEV) model. Based on a semi-nonparametric approach, a closed-form likelihood function that nests the MNL or MDCEV model being tested is derived. The proposed method allows traditional likelihood ratio tests to be used to test violations of the standard Gumbel distribution assumption. Simulation experiments are conducted to demonstrate that the proposed test yields acceptable Type-I and Type-II error probabilities at commonly available sample sizes. The test is then applied to three real-world discrete and discrete-continuous choice models. For all three models, the proposed test rejects the validity of the standard Gumbel distribution in most utility functions, calling for the development of robust choice models that overcome adverse effects of violations of distributional assumptions on the error terms in random utility functions.

  16. Comparison of five modelling techniques to predict the spatial distribution and abundance of seabirds

    USGS Publications Warehouse

    O'Connell, Allan F.; Gardner, Beth; Oppel, Steffen; Meirinho, Ana; Ramírez, Iván; Miller, Peter I.; Louzao, Maite

    2012-01-01

    Knowledge about the spatial distribution of seabirds at sea is important for conservation. During marine conservation planning, logistical constraints preclude seabird surveys covering the complete area of interest and spatial distribution of seabirds is frequently inferred from predictive statistical models. Increasingly complex models are available to relate the distribution and abundance of pelagic seabirds to environmental variables, but a comparison of their usefulness for delineating protected areas for seabirds is lacking. Here we compare the performance of five modelling techniques (generalised linear models, generalised additive models, Random Forest, boosted regression trees, and maximum entropy) to predict the distribution of Balearic Shearwaters (Puffinus mauretanicus) along the coast of the western Iberian Peninsula. We used ship transect data from 2004 to 2009 and 13 environmental variables to predict occurrence and density, and evaluated predictive performance of all models using spatially segregated test data. Predicted distribution varied among the different models, although predictive performance varied little. An ensemble prediction that combined results from all five techniques was robust and confirmed the existence of marine important bird areas for Balearic Shearwaters in Portugal and Spain. Our predictions suggested additional areas that would be of high priority for conservation and could be proposed as protected areas. Abundance data were extremely difficult to predict, and none of the five modelling techniques provided a reliable prediction of spatial patterns. We advocate the use of ensemble modelling that combines the output of several methods to predict the spatial distribution of seabirds, and use these predictions to target separate surveys assessing the abundance of seabirds in areas of regular use.
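
    A sketch of the ensemble idea on synthetic data: fit several model types and average their predicted occurrence probabilities. A GLM and two tree ensembles stand in for the five techniques compared in the paper, and the 13 synthetic features echo the number of environmental variables used.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

    X, y = make_classification(n_samples=500, n_features=13, random_state=0)
    models = [LogisticRegression(max_iter=1000),
              RandomForestClassifier(random_state=0),
              GradientBoostingClassifier(random_state=0)]

    preds = [m.fit(X, y).predict_proba(X)[:, 1] for m in models]
    ensemble = np.mean(preds, axis=0)     # consensus occurrence probability
    print(ensemble[:5])
    ```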

  17. Imbalance p values for baseline covariates in randomized controlled trials: a last resort for the use of p values? A pro and contra debate.

    PubMed

    Stang, Andreas; Baethge, Christopher

    2018-01-01

    Results of randomized controlled trials (RCTs) are usually accompanied by a table that compares covariates between the study groups at baseline. Sometimes, the investigators report p values for imbalanced covariates. The aim of this debate is to illustrate the pros and cons of the use of these p values in RCTs. Low p values can be a sign of biased or fraudulent randomization and can be used as a warning sign. They can be considered a screening tool with low positive-predictive value. Low p values should prompt us to ask for the reasons and for potential consequences, especially in combination with hints of methodological problems. A fair randomization produces the expectation that the distribution of p values follows a flat distribution. It does not produce an expectation related to a single p value. The distribution of p values in RCTs can be influenced by the correlation among covariates, differential misclassification or differential mismeasurement of baseline covariates. Given only a small number of reported p values in the reports of RCTs, judging whether the realized p value distribution is indeed a flat distribution becomes difficult. If p values ≤0.005 or ≥0.995 were used as a sign of alarm and five p values were reported per RCT, the false-positive rate would be about 5.0% even when randomization was done correctly. Use of a low p value as a warning sign that randomization is potentially biased can be considered a vague heuristic. The authors of this debate are obviously more or less enthusiastic about this heuristic and differ in the consequences they propose.
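
    The quoted rate follows from a one-line computation, assuming five independent p values per trial, each falling outside (0.005, 0.995) with probability 0.01 under correct randomization:

    ```python
    print(1 - 0.99 ** 5)   # ~0.049: about a 5% chance of at least one false alarm
    ```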

  18. Vaginal distribution and retention of a multiparticulate drug delivery system, assessed by gamma scintigraphy and magnetic resonance imaging.

    PubMed

    Mehta, Samata; Verstraelen, Hans; Peremans, Kathelijne; Villeirs, Geert; Vermeire, Simon; De Vos, Filip; Mehuys, Els; Remon, Jean Paul; Vervaet, Chris

    2012-04-15

    For any new vaginal dosage form, the distribution and retention in the vagina have to be assessed by in vivo evaluation. We evaluated the vaginal distribution and retention of starch-based pellets in sheep as a live animal model by gamma scintigraphy (using Indium-111 DTPA as radiolabel) and in women via magnetic resonance imaging (MRI, using a gadolinium chelate as contrast agent). A conventional cream formulation was used as reference in both studies. Cream and pellets were administered to sheep (n=6) in a two-period, two-treatment study and to healthy female volunteers (n=6) via a randomized crossover trial. Pellets (filled into a hard gelatin capsule) and cetomacrogol cream, both labeled with Indium-111 DTPA (for gamma scintigraphy) or with gadolinium chelate (for MRI), were evaluated for their intravaginal distribution and retention over a 24 h period. Spreading in the vagina was assessed based on the part of the vagina covered with formulation (expressed in relation to the total vaginal length). Vaginal retention of the formulation was quantified based on the radioactivity remaining in the vaginal area (sheep study), or qualitatively evaluated (women study). Both trials indicated a rapid distribution of the cream within the vagina, as complete coverage of the vaginal mucosa was seen 1 h after dose administration. Clearance of the cream was rapid: about 10% activity remained in the vaginal area of the sheep 12 h post-administration, while after 8 h only a thin layer of cream was detected on the vaginal mucosa of women. After disintegration of the hard gelatin capsule, the pellet formulation gradually distributed over the entire vaginal mucosa. Residence time of the pellets in the vagina was longer than that of the semi-solid formulation: after 24 h, 23 ± 7% radioactivity was detected in the vaginal area of the sheep, while in women the pellet formulation was still detected throughout the vagina. A multi-particulate system containing starch-based pellets was identified as a promising novel vaginal drug delivery system, resulting in complete coverage of the vaginal mucosa and a long retention time. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. Distribution law of the Dirac eigenmodes in QCD

    NASA Astrophysics Data System (ADS)

    Catillo, Marco; Glozman, Leonid Ya.

    2018-04-01

    The near-zero modes of the Dirac operator are connected to spontaneous breaking of chiral symmetry in QCD (SBCS) via the Banks-Casher relation. At the same time, the distribution of the near-zero modes is well described by Random Matrix Theory (RMT) with the Gaussian Unitary Ensemble (GUE). It has thus become standard lore that randomness, as observed through distributions of the near-zero modes of the Dirac operator, is a consequence of SBCS. The higher-lying modes of the Dirac operator are not affected by SBCS and are sensitive to confinement physics and the related SU(2)CS and SU(2NF) symmetries. We study the distribution of the near-zero and higher-lying eigenmodes of the overlap Dirac operator within NF = 2 dynamical simulations. We find that the distributions of both the near-zero and higher-lying modes are perfectly described by the GUE of RMT. This means that randomness, while consistent with SBCS, is not a consequence of SBCS and is linked to the confining chromo-electric field.
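
    A numerical sketch of the RMT comparison: nearest-neighbour spacings of GUE eigenvalues near the band centre, crudely unfolded to unit mean spacing, reproduce the second moment of the Wigner surmise for the unitary class, 3*pi/8 ≈ 1.18.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, spacings = 200, []
    for _ in range(300):
        A = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
        H = (A + A.conj().T) / 2                   # GUE matrix
        ev = np.linalg.eigvalsh(H)
        s = np.diff(ev[N // 2 - 10: N // 2 + 10])  # spacings near the band centre
        spacings.extend(s / s.mean())              # crude local unfolding

    s = np.asarray(spacings)
    print("mean s^2:", (s ** 2).mean())            # Wigner surmise (GUE): ~1.18
    ```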

  20. Non-random distribution and co-localization of purine/pyrimidine-encoded information and transcriptional regulatory domains.

    PubMed

    Povinelli, C M

    1992-01-01

    In order to detect sequence-based information predictive of the location of eukaryotic transcriptional regulatory domains, the frequencies and distributions of the 36 possible purine/pyrimidine reverse complement hexamer pairs were determined for test sets of real and random sequences. The distribution of one of the hexamer pairs (RRYYRR/YYRRYY, referred to as M1) was further examined in a larger set of sequences (> 32 genes, 230 kb). Predominant clusters of M1 and the locations of eukaryotic transcriptional regulatory domains were found to be associated and non-randomly distributed along the DNA, consistent with a periodicity of approximately 1.2 kb. In the context of higher-order chromatin, this would align promoters, enhancers and the predominant clusters of M1 longitudinally along one face of a 30 nm fiber. Using only information about the distribution of the M1 motif, 50-70% of a sequence could be eliminated as being unlikely to contain transcriptional regulatory domains, with an 87% recovery of the regulatory domains present.
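
    A sketch of the motif scan: encode a DNA sequence as purines (R) and pyrimidines (Y), then locate possibly overlapping occurrences of RRYYRR and YYRRYY; the toy sequence is illustrative only.

    ```python
    import re

    def purine_pyrimidine(seq):
        return "".join("R" if base in "AG" else "Y" for base in seq.upper())

    dna = "AACCAAGTTGGTTCAACCAAT"                  # toy sequence
    code = purine_pyrimidine(dna)
    for motif in ("RRYYRR", "YYRRYY"):
        hits = [m.start() for m in re.finditer(f"(?={motif})", code)]
        print(motif, hits)
    ```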

  1. Two approximations of the present value distribution of a disability annuity

    NASA Astrophysics Data System (ADS)

    Spreeuw, Jaap

    2006-02-01

    The distribution function of the present value of a cash flow can be approximated by means of a distribution function of a random variable, which is also the present value of a sequence of payments, but with a simpler structure. The corresponding random variable has the same expectation as the random variable corresponding to the original distribution function and is a stochastic upper bound of convex order. A sharper upper bound can be obtained if more information about the risk is available. In this paper, it will be shown that such an approach can be adopted for disability annuities (also known as income protection policies) in a three state model under Markov assumptions. Benefits are payable during any spell of disability whilst premiums are only due whenever the insured is healthy. The quality of the two approximations is investigated by comparing the distributions obtained with the one derived from the algorithm presented in the paper by Hesselager and Norberg [Insurance Math. Econom. 18 (1996) 35-42].
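
    The structure described here lends itself to a quick Monte Carlo check: simulate a three-state (healthy/disabled/dead) Markov chain, accumulate discounted benefits while disabled minus premiums while healthy, and inspect the resulting present value distribution. The transition matrix, discount rate, and payment levels below are invented for illustration; the paper's analytical convex-order bounds are not implemented here.

        import numpy as np

        rng = np.random.default_rng(1)
        # Hypothetical yearly transition probabilities: healthy, disabled, dead.
        P = np.array([[0.90, 0.08, 0.02],
                      [0.30, 0.60, 0.10],
                      [0.00, 0.00, 1.00]])
        v = 1 / 1.04                       # yearly discount factor
        benefit, premium = 1.0, 0.25
        horizon, n_paths = 40, 20_000

        pv = np.zeros(n_paths)
        for k in range(n_paths):
            state, disc = 0, 1.0
            for _ in range(horizon):
                if state == 2:
                    break                  # dead: cash flow stops
                pv[k] += disc * (benefit if state == 1 else -premium)
                state = rng.choice(3, p=P[state])
                disc *= v

        print(pv.mean(), np.quantile(pv, [0.5, 0.95]))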

  2. Stationary Random Metrics on Hierarchical Graphs Via (min,+)-type Recursive Distributional Equations

    NASA Astrophysics Data System (ADS)

    Khristoforov, Mikhail; Kleptsyn, Victor; Triestino, Michele

    2016-07-01

    This paper is inspired by the problem of understanding in a mathematical sense the Liouville quantum gravity on surfaces. Here we show how to define a stationary random metric on self-similar spaces which are the limit of nice finite graphs: these are the so-called hierarchical graphs. They possess a well-defined level structure and any level is built using a simple recursion. Stopping the construction at any finite level, we have a discrete random metric space when we set the edges to have random length (using a multiplicative cascade with fixed law m). We introduce a tool, the cut-off process, by means of which one finds that, after renormalizing the sequence of metrics by an exponential factor, they converge in law to a non-trivial metric on the limit space. Such a limit law is stationary, in the sense that, gluing together a certain number of copies of the random limit space according to the combinatorics of the brick graph, the obtained random metric has the same law when rescaled by a random factor of law m. In other words, the stationary random metric is the solution of a distributional equation. When the measure m has continuous positive density on R+, the stationary law is unique up to rescaling and any other distribution tends to a rescaled stationary law under the iterations of the hierarchical transformation. We also investigate topological and geometric properties of the random space when m is log-normal, detecting a phase transition influenced by the branching random walk associated to the multiplicative cascade.

  3. Kinetic market models with single commodity having price fluctuations

    NASA Astrophysics Data System (ADS)

    Chatterjee, A.; Chakrabarti, B. K.

    2006-12-01

    We study here numerically the behavior of an ideal gas like model of markets having only one non-consumable commodity. We investigate the behavior of the steady-state distributions of money, commodity and total wealth, as the dynamics of trading or exchange of money and commodity proceeds, with local (in time) fluctuations in the price of the commodity. These distributions are studied in markets with agents having uniform and random saving factors. The self-organizing features in the money distribution are similar to the cases without any commodity (or with consumable commodities), while the commodity distribution shows an exponential decay. The wealth distribution shows interesting behavior: a gamma-like distribution for uniform saving propensity, and the same power-law tail as the money distribution for a market with agents having random saving propensities.
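
    A compact simulation sketch of the kinetic exchange dynamics discussed above, using the standard update rule for agents with quenched random saving propensities (the money sector only; the commodity and price fluctuations of the paper are omitted). Sizes and step counts are illustrative.

        import numpy as np

        rng = np.random.default_rng(2)
        N, steps = 1000, 500_000
        money = np.ones(N)                  # everyone starts with one unit
        lam = rng.uniform(0, 1, N)          # quenched random saving propensities

        for _ in range(steps):
            i, j = rng.integers(N), rng.integers(N)
            if i == j:
                continue
            eps = rng.uniform()
            pool = (1 - lam[i]) * money[i] + (1 - lam[j]) * money[j]
            money[i] = lam[i] * money[i] + eps * pool
            money[j] = lam[j] * money[j] + (1 - eps) * pool
        # With random lam the upper tail of `money` develops a power law.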

  4. Analysis of random drop for gateway congestion control. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Hashem, Emam Salaheddin

    1989-01-01

    Lately, the growing demand on the Internet has prompted the need for more effective congestion control policies. Currently, no gateway policy is used to relieve and signal congestion, which leads to unfair service to individual users and a degradation of overall network performance. Network simulation was used to illustrate the character of Internet congestion and its causes. A newly proposed gateway congestion control policy, called Random Drop, was considered as a promising solution to this pressing problem. Random Drop relieves resource congestion upon buffer overflow by choosing a random packet from the service queue to be dropped. The random choice should result in a drop distribution proportional to the bandwidth distribution among all contending TCP connections, thus applying the necessary fairness. Nonetheless, the simulation experiments demonstrate several shortcomings of this policy. Because Random Drop is a congestion control policy, which is not applied until congestion has already occurred, it usually results in a high drop rate that hurts too many connections, including well-behaved ones. Even though the number of packets dropped differs from one connection to another depending on the buffer utilization upon overflow, the TCP recovery overhead is high enough to neutralize these differences, causing unfair congestion penalties. Besides, the drop distribution itself is an inaccurate representation of the average bandwidth distribution, missing much important information about the bandwidth utilization between buffer overflow events. A modification of Random Drop to do congestion avoidance by applying the policy early was also proposed. Early Random Drop has the advantage of avoiding the high drop rate of buffer overflow. The early application of the policy removes the pressure of congestion relief and allows more accurate signaling of congestion. To be used effectively, algorithms for the dynamic adjustment of the parameters of Early Random Drop to suit the current network load must still be developed.
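
    A toy sketch contrasting the two policies described above: classic Random Drop evicts a random queued packet at overflow, while Early Random Drop discards an arriving packet with some probability once the queue passes a threshold. The capacity, threshold, and drop probability are illustrative placeholders, not values from the thesis.

        import random
        from collections import deque

        class EarlyRandomDropQueue:
            """Toy gateway queue: Early Random Drop before overflow,
            classic Random Drop (evict a random queued packet) at overflow."""

            def __init__(self, capacity=64, threshold=48, drop_prob=0.1):
                self.q = deque()
                self.capacity, self.threshold, self.p = capacity, threshold, drop_prob

            def enqueue(self, pkt):
                if len(self.q) >= self.capacity:
                    del self.q[random.randrange(len(self.q))]   # Random Drop
                elif len(self.q) >= self.threshold and random.random() < self.p:
                    return False        # Early Random Drop of the arrival
                self.q.append(pkt)
                return True

    As the abstract points out, the threshold and drop probability of the early policy would have to be adjusted dynamically to the network load to be effective.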

  5. Online distribution channel increases article usage on Mendeley: a randomized controlled trial.

    PubMed

    Kudlow, Paul; Cockerill, Matthew; Toccalino, Danielle; Dziadyk, Devin Bissky; Rutledge, Alan; Shachak, Aviv; McIntyre, Roger S; Ravindran, Arun; Eysenbach, Gunther

    2017-01-01

    Prior research shows that article reader counts (i.e. saves) on the online reference manager Mendeley correlate with future citations. There are currently no evidence-based distribution strategies that have been shown to increase article saves on Mendeley. We conducted a 4-week randomized controlled trial to examine how promotion of article links in a novel online cross-publisher distribution channel (TrendMD) affects article saves on Mendeley. Four hundred articles published in the Journal of Medical Internet Research were randomized to either the TrendMD arm (n = 200) or the control arm (n = 200) of the study. Our primary outcome compares the 4-week mean Mendeley saves of articles randomized to TrendMD versus control. Articles randomized to TrendMD showed a 77% increase in article saves on Mendeley relative to control. The difference in mean Mendeley saves for TrendMD articles versus control was 2.7, 95% CI (2.63, 2.77), and statistically significant (p < 0.01). There was a positive correlation between pageviews driven by TrendMD and article saves on Mendeley (Spearman's rho = 0.60). This is the first randomized controlled trial to show how an online cross-publisher distribution channel (TrendMD) enhances article saves on Mendeley. While replication and further study are needed, these data suggest that cross-publisher article recommendations via TrendMD may enhance citations of scholarly articles.

  6. Universal energy distribution for interfaces in a random-field environment

    NASA Astrophysics Data System (ADS)

    Fedorenko, Andrei A.; Stepanow, Semjon

    2003-11-01

    We study the energy distribution function ρ(E) for interfaces in a random-field environment at zero temperature by summing the leading terms in the perturbation expansion of ρ(E) in powers of the disorder strength, and by taking into account the nonperturbative effects of the disorder using the functional renormalization group. We have found that the average and the variance of the energy for a one-dimensional interface of length L behave as ⟨E⟩_R ∝ L ln L and ΔE_R ∝ L, while the distribution function of the energy tends for large L to the Gumbel distribution of extreme value statistics.

  7. The Complete Chloroplast Genome of Wild Rice (Oryza minuta) and Its Comparison to Related Species.

    PubMed

    Asaf, Sajjad; Waqas, Muhammad; Khan, Abdul L; Khan, Muhammad A; Kang, Sang-Mo; Imran, Qari M; Shahzad, Raheem; Bilal, Saqib; Yun, Byung-Wook; Lee, In-Jung

    2017-01-01

    Oryza minuta, a tetraploid wild relative of cultivated rice (family Poaceae), possesses a BBCC genome and contains genes that confer resistance to bacterial blight (BB) and to white-backed (WBPH) and brown (BPH) plant hoppers. Based on the importance of this wild species, this study aimed to understand the phylogenetic relationships of O. minuta with other Oryza species through an in-depth analysis of the composition and diversity of the chloroplast (cp) genome. The analysis revealed a cp genome size of 135,094 bp with a typical quadripartite structure, consisting of a pair of inverted repeats separated by small and large single-copy regions, 139 representative genes, and 419 randomly distributed microsatellites. The genomic organization, gene order, GC content and codon usage are similar to those of typical angiosperm cp genomes. Approximately 30 forward, 28 tandem and 20 palindromic repeats were detected in the O. minuta cp genome. Comparison of the complete O. minuta cp genome with eleven other Oryza species showed a high degree of sequence similarity and relatively high divergence of intergenic spacers. Phylogenetic analyses based on the complete genome sequence, 65 shared genes and the matK gene showed the same topologies, with O. minuta forming a single clade with its parent O. punctata. Thus, the complete O. minuta cp genome provides interesting insights and valuable information that can be used to identify related species and reconstruct its phylogeny.

  8. Weighted Scaling in Non-growth Random Networks

    NASA Astrophysics Data System (ADS)

    Chen, Guang; Yang, Xu-Hua; Xu, Xin-Li

    2012-09-01

    We propose a weighted model to explain the self-organizing formation of the scale-free phenomenon in non-growth random networks. In this model, we use multiple-edges to represent the connections between vertices and define the weight of a multiple-edge as the total weight of all single-edges within it and the strength of a vertex as the sum of weights of the multiple-edges attached to it. The network evolves according to a vertex strength preferential selection mechanism. During the evolution process, the network always holds its total number of vertices and its total number of single-edges constant. We show analytically and numerically that a network will form steady scale-free distributions under our model. The results show that a weighted non-growth random network can evolve into a scale-free state. Interestingly, the network also acquires an exponential edge-weight distribution; that is, a scale-free distribution and an exponential distribution coexist.

  9. General Exact Solution to the Problem of the Probability Density for Sums of Random Variables

    NASA Astrophysics Data System (ADS)

    Tribelsky, Michael I.

    2002-07-01

    The exact explicit expression for the probability density pN(x) for a sum of N random, arbitrarily correlated summands is obtained. The expression is valid for any number N and any distribution of the random summands. Most attention is paid to the application of the developed approach to the case of independent and identically distributed summands. The obtained results reproduce all known exact solutions valid for the so-called stable distributions of the summands. It is also shown that if the distribution is not stable, the profile of pN(x) may be divided into three parts, namely a core (small x), a tail (large x), and a crossover from the core to the tail (moderate x). The quantitative description of all three parts as well as that of the entire profile is obtained. A number of particular examples are considered in detail.

  10. General exact solution to the problem of the probability density for sums of random variables.

    PubMed

    Tribelsky, Michael I

    2002-08-12

    The exact explicit expression for the probability density p(N)(x) for a sum of N random, arbitrarily correlated summands is obtained. The expression is valid for any number N and any distribution of the random summands. Most attention is paid to the application of the developed approach to the case of independent and identically distributed summands. The obtained results reproduce all known exact solutions valid for the so-called stable distributions of the summands. It is also shown that if the distribution is not stable, the profile of p(N)(x) may be divided into three parts, namely a core (small x), a tail (large x), and a crossover from the core to the tail (moderate x). The quantitative description of all three parts as well as that of the entire profile is obtained. A number of particular examples are considered in detail.
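
    A numerical sketch of the iid case discussed in both versions of this record: for independent identically distributed summands the density of the sum is the N-fold convolution p^{*N}, computed here via the characteristic function with an FFT. The exponential summand is an arbitrary illustrative choice.

        import numpy as np

        # Density of one summand on a uniform grid (illustrative: exponential, mean 1).
        x = np.linspace(0, 400, 2**14)
        dx = x[1] - x[0]
        p1 = np.exp(-x)

        def sum_density(p, N, dx):
            """Density of a sum of N iid summands via the characteristic
            function: p_N = p^{*N}. The grid must be wide enough that the
            mass of the sum does not wrap around."""
            phi = np.fft.rfft(p) * dx
            pN = np.fft.irfft(phi**N, n=len(p)) / dx
            return np.clip(pN, 0.0, None)

        p50 = sum_density(p1, 50, dx)   # near-Gaussian core, non-Gaussian tail
        print(p50.sum() * dx)           # ~ 1.0 (normalization check)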

  11. Random bit generation at tunable rates using a chaotic semiconductor laser under distributed feedback.

    PubMed

    Li, Xiao-Zhou; Li, Song-Sui; Zhuang, Jun-Ping; Chan, Sze-Chun

    2015-09-01

    A semiconductor laser with distributed feedback from a fiber Bragg grating (FBG) is investigated for random bit generation (RBG). The feedback perturbs the laser to emit chaotically with the intensity being sampled periodically. The samples are then converted into random bits by a simple postprocessing of self-differencing and selecting bits. Unlike a conventional mirror that provides localized feedback, the FBG provides distributed feedback which effectively suppresses the information of the round-trip feedback delay time. Randomness is ensured even when the sampling period is commensurate with the feedback delay between the laser and the grating. Consequently, in RBG, the FBG feedback enables continuous tuning of the output bit rate, reduces the minimum sampling period, and increases the number of bits selected per sample. RBG is experimentally investigated at a sampling period continuously tunable from over 16 ns down to 50 ps, while the feedback delay is fixed at 7.7 ns. By selecting 5 least-significant bits per sample, output bit rates from 0.3 to 100 Gbps are achieved with randomness examined by the National Institute of Standards and Technology test suite.
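
    The postprocessing described above (self-differencing consecutive samples and keeping a few least significant bits) is simple enough to sketch. The 8-bit sample width and the surrogate random samples below are assumptions for illustration, standing in for digitized chaotic laser intensity.

        import numpy as np

        def random_bits(samples, n_lsb=5):
            """Self-differencing + LSB selection: subtract successive 8-bit
            samples (with wrap-around) and keep the n_lsb least significant bits."""
            s = np.asarray(samples, dtype=np.uint8)
            diff = (s[1:].astype(np.int16) - s[:-1]) & 0xFF
            return ((diff[:, None] >> np.arange(n_lsb)) & 1).ravel()

        rng = np.random.default_rng(3)                  # surrogate samples
        bits = random_bits(rng.integers(0, 256, 10_000))
        print(bits.mean())                              # should be close to 0.5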

  12. Blocking for Sequential Political Experiments

    PubMed Central

    Moore, Sally A.

    2013-01-01

    In typical political experiments, researchers randomize a set of households, precincts, or individuals to treatments all at once, and characteristics of all units are known at the time of randomization. However, in many other experiments, subjects “trickle in” to be randomized to treatment conditions, usually via complete randomization. To take advantage of the rich background data that researchers often have (but underutilize) in these experiments, we develop methods that use continuous covariates to assign treatments sequentially. We build on biased coin and minimization procedures for discrete covariates and demonstrate that our methods outperform complete randomization, producing better covariate balance in simulated data. We then describe how we selected and deployed a sequential blocking method in a clinical trial and demonstrate the advantages of our having done so. Further, we show how that method would have performed in two larger sequential political trials. Finally, we compare causal effect estimates from differences in means, augmented inverse propensity weighted estimators, and randomization test inversion. PMID:24143061
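
    A toy sketch in the spirit of the biased-coin minimization procedures mentioned above (not the authors' exact algorithm): each arriving unit is assigned, with high probability, to the arm that keeps the arm-wise covariate means closest. The bias probability and imbalance measure are illustrative choices.

        import numpy as np

        def assign(new_x, X, arms, p_bias=0.8, rng=np.random.default_rng(4)):
            """Biased-coin minimization on continuous covariates: with
            probability p_bias pick the arm that minimizes the covariate
            mean imbalance after adding the new unit."""
            arms = np.asarray(arms)
            if len(arms) == 0 or len(np.unique(arms)) < 2:
                return int(rng.integers(2))   # not enough history: randomize
            Xt = np.vstack([np.asarray(X, float), new_x])
            imbalance = []
            for a in (0, 1):
                trial = np.append(arms, a)
                gap = Xt[trial == 0].mean(axis=0) - Xt[trial == 1].mean(axis=0)
                imbalance.append(np.abs(gap).sum())
            better = int(np.argmin(imbalance))
            return better if rng.random() < p_bias else 1 - better

        rng = np.random.default_rng(5)
        X, arms = [], []
        for _ in range(100):                  # subjects "trickle in"
            x = rng.normal(size=3)            # three continuous covariates
            a = assign(x, X, arms)
            X.append(x)
            arms.append(a)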

  13. Psyllium supplementation in adolescents improves fat distribution & lipid profile: a randomized, participant-blinded, placebo-controlled, crossover trial.

    PubMed

    de Bock, Martin; Derraik, José G B; Brennan, Christine M; Biggs, Janene B; Smith, Greg C; Cameron-Smith, David; Wall, Clare R; Cutfield, Wayne S

    2012-01-01

    We aimed to assess the effects of psyllium supplementation on insulin sensitivity and other parameters of the metabolic syndrome in an at-risk adolescent population. This study encompassed a participant-blinded, randomized, placebo-controlled, crossover trial. Subjects were 47 healthy adolescent males aged 15-16 years, recruited from secondary schools in lower socio-economic areas with high rates of obesity. Participants received 6 g/day of psyllium or placebo for 6 weeks, with a two-week washout before crossing over. Fasting lipid profiles, ambulatory blood pressure, auxological data, body composition, activity levels, and three-day food records were collected at baseline and after each 6-week intervention. Insulin sensitivity was measured by the Matsuda method using glucose and insulin values from an oral glucose tolerance test. 45 subjects completed the study, and compliance was very high: 87% of participants took >80% of prescribed capsules. At baseline, 44% of subjects were overweight or obese. 28% had decreased insulin sensitivity, but none had impaired glucose tolerance. Fibre supplementation led to a 4% reduction in the android fat to gynoid fat ratio (p = 0.019), as well as a 0.12 mmol/l (6%) reduction in LDL cholesterol (p = 0.042). No associated adverse events were recorded. Dietary supplementation with 6 g/day of psyllium over 6 weeks improves fat distribution and lipid profile (parameters of the metabolic syndrome) in an at-risk population of adolescent males. Australian New Zealand Clinical Trials Registry ACTRN12609000888268.

  14. Random growth lattice filling model of percolation: a crossover from continuous to discontinuous transition

    NASA Astrophysics Data System (ADS)

    Roy, Bappaditya; Santra, S. B.

    2018-05-01

    A random growth lattice filling model of percolation with a touch and stop growth rule is developed and studied numerically on a two dimensional square lattice. Nucleation centers are continuously added one at a time to the empty lattice sites and clusters are grown from these nucleation centers with a growth probability g. For a given g, the system passes through a critical point during the growth process where the transition from a disconnected to a connected phase occurs. The model is found to exhibit second order continuous percolation transitions, as in ordinary percolation, for g ≤ 0.5, whereas for g ≥ 0.8 it exhibits weak first order discontinuous percolation transitions. The continuous transitions are characterized by estimating the values of the critical exponents associated with the order parameter fluctuation and the fractal dimension of the spanning cluster over the whole range of g. The discontinuous transitions, however, are characterized by a compact spanning cluster, lattice size independent fluctuation of the order parameter per lattice site, departure from power law scaling in the cluster size distribution and a weak bimodal distribution of the order parameter. The nature of the transitions is further confirmed by studying the Binder cumulant. Instead of a sharp tricritical point, a tricritical region is found to occur for 0.5 < g < 0.8, within which the values of the critical exponents change continuously until the crossover from continuous to discontinuous transition is completed.

  15. Comparative analysis of ferroelectric domain statistics via nonlinear diffraction in random nonlinear materials.

    PubMed

    Wang, B; Switowski, K; Cojocaru, C; Roppo, V; Sheng, Y; Scalora, M; Kisielewski, J; Pawlak, D; Vilaseca, R; Akhouayri, H; Krolikowski, W; Trull, J

    2018-01-22

    We present an indirect, non-destructive optical method for domain statistics characterization in disordered nonlinear crystals having a homogeneous refractive index and a spatially random distribution of ferroelectric domains. This method relies on the analysis of the wavelength-dependent spatial distribution of the second harmonic in the plane perpendicular to the optical axis, in combination with numerical simulations. We apply this technique to the characterization of two different media, Calcium Barium Niobate and Strontium Barium Niobate, with drastically different statistical distributions of ferroelectric domains.

  16. Deviations from Rayleigh statistics in ultrasonic speckle.

    PubMed

    Tuthill, T A; Sperry, R H; Parker, K J

    1988-04-01

    The statistics of speckle patterns in ultrasound images have potential for tissue characterization. In "fully developed speckle" from many random scatterers, the amplitude is widely recognized as possessing a Rayleigh distribution. This study examines how scattering populations and signal processing can produce non-Rayleigh distributions. The first order speckle statistics are shown to depend on random scatterer density and the amplitude and spacing of added periodic scatterers. Envelope detection, amplifier compression, and signal bandwidth are also shown to cause distinct changes in the signal distribution.
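
    A quick numerical illustration of the two regimes discussed above: summing many unit scatterers with uniformly random phases yields a Rayleigh-distributed envelope, while adding a strong coherent (periodic-scatterer-like) component pushes the amplitude statistics away from Rayleigh, toward a Rician form. Scatterer counts and amplitudes are arbitrary.

        import numpy as np

        rng = np.random.default_rng(6)
        n_real, n_scat = 50_000, 200

        # Fully developed speckle: many unit scatterers with random phases.
        phases = rng.uniform(0, 2 * np.pi, (n_real, n_scat))
        field = np.exp(1j * phases).sum(axis=1)
        amp = np.abs(field)                   # Rayleigh-distributed amplitude

        # Strong coherent component: statistics deviate from Rayleigh.
        amp_coh = np.abs(field + 3 * np.sqrt(n_scat))
        print(amp.std() / amp.mean(),         # ~ 0.52 for Rayleigh
              amp_coh.std() / amp_coh.mean()) # noticeably smaller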

  17. Effects of zinc supplementation on subscales of anorexia in children: A randomized controlled trial.

    PubMed

    Khademian, Majid; Farhangpajouh, Neda; Shahsanaee, Armindokht; Bahreynian, Maryam; Mirshamsi, Mehran; Kelishadi, Roya

    2014-01-01

    This study aims to assess the effects of zinc supplementation on improving appetite and its subscales in children. The study was conducted in 2013 in Isfahan, Iran, and had two phases. In the first phase, after validation of the Child Eating Behaviour Questionnaire (CEBQ), the questionnaire was completed for 300 randomly selected preschool children. The second phase was conducted as a randomized controlled trial. Eighty of these children were randomly selected and randomly assigned to two groups of equal number receiving zinc (10 mg/day) or placebo for 12 weeks. Overall, 77 children completed the trial (39 in the case group and 38 in the control group). The results showed that zinc supplementation can improve calorie intake in children by affecting some CEBQ subscales such as Emotional Overeating and Food Responsiveness. Zinc supplementation had a positive impact in promoting calorie intake and some subscales of anorexia.

  18. The Use of Compressive Sensing to Reconstruct Radiation Characteristics of Wide-Band Antennas from Sparse Measurements

    DTIC Science & Technology

    2015-06-01

    of uniform- versus nonuniform-pattern reconstruction, of transform function used, and of minimum randomly distributed measurements needed to ... the radiation-frequency pattern's reconstruction using uniform and nonuniform randomly distributed samples even though the pattern error manifests ...
    [Figure residue; recoverable caption: Fig. 3, "The nonuniform compressive-sensing reconstruction of the radiation ..."]

  19. Reinforcing Sampling Distributions through a Randomization-Based Activity for Introducing ANOVA

    ERIC Educational Resources Information Center

    Taylor, Laura; Doehler, Kirsten

    2015-01-01

    This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course…

  20. Averaging in SU(2) open quantum random walk

    NASA Astrophysics Data System (ADS)

    Ampadu, Clement

    2014-03-01

    We study the average position and the symmetry of the distribution in the SU(2) open quantum random walk (OQRW). We show that the average position in the central limit theorem (CLT) is non-uniform compared with the average position in the non-CLT. The symmetry of distribution is shown to be even in the CLT.

  1. A comparison of methods for estimating the random effects distribution of a linear mixed model.

    PubMed

    Ghidey, Wendimagegn; Lesaffre, Emmanuel; Verbeke, Geert

    2010-12-01

    This article reviews various recently suggested approaches to estimating the random effects distribution in a linear mixed model, i.e. (1) the smoothing-by-roughening approach of Shen and Louis, (2) the semi-nonparametric approach of Zhang and Davidian, (3) the heterogeneity model of Verbeke and Lesaffre, and (4) the flexible approach of Ghidey et al. These four approaches are compared via an extensive simulation study. We conclude that for the considered cases, the approach of Ghidey et al. often has the smallest integrated mean squared error for estimating the random effects distribution. An analysis of a longitudinal dental data set illustrates the performance of the methods in a practical example.

  2. Instantaneous phase estimation to measure weak velocity variations: application to noise correlation on seismic data at the exploration scale

    NASA Astrophysics Data System (ADS)

    Corciulo, M.; Roux, P.; Campillo, M.; Dubucq, D.

    2010-12-01

    Passive imaging from noise cross-correlation is a well-established analysis at continental and regional scales, whereas its use at the local scale for seismic exploration purposes is still uncertain. The development of passive imaging by cross-correlation analysis is based on the extraction of the Green's function from seismic noise data. In a field that is completely random in time and space, cross-correlation permits retrieval of the complete Green's function whatever the complexity of the medium. At the exploration scale and at frequencies above 2 Hz, the noise sources are not ideally distributed around the stations, which strongly affects the extraction of the direct arrivals from the noise cross-correlation process. To overcome this problem, the coda waves extracted from noise correlation can be useful. Coda waves describe long, scattered paths sampling the medium in different ways, such that they become sensitive to weak velocity variations without depending on the noise source distribution. Indeed, scatterers in the medium behave as a set of secondary noise sources which randomize the spatial distribution of noise sources contributing to the coda waves in the correlation process. We developed a new technique to measure weak velocity changes based on the computation of the local phase variations (instantaneous phase variation, or IPV) of the cross-correlated signals. This newly developed technique builds on the doublet and stretching techniques classically used to monitor weak velocity variations from coda waves. We apply IPV to data acquired in Northern America (Canada) on a 1-km-side square seismic network of 397 stations. Data used to study temporal variations are cross-correlated signals computed on 10 minutes of ambient noise in the frequency band 2-5 Hz. As the data set was acquired over five days, about 660 files are processed to perform a complete temporal analysis for each station pair. The IPV permits estimation of the phase shift over the whole signal length without any assumption on the medium velocity. The instantaneous phase is computed using the Hilbert transform of the signal. For each station pair, we measure the phase difference between successive correlation functions calculated for 10 minutes of ambient noise. We then fit the instantaneous phase shift using a first-order polynomial function. The measure of the velocity variation corresponds to the slope of this fit. Compared to other techniques, the advantage of IPV is that it is a very fast procedure which efficiently provides the measure of velocity variation on large data sets. Both experimental results and numerical tests on synthetic signals will be presented to assess the reliability of the IPV technique, with comparison to the doublet and stretching methods.
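
    The IPV computation described here (Hilbert-transform phase, first-order polynomial fit, slope as the velocity-variation measure) can be sketched compactly. The synthetic wavelet, its frequency, and the imposed 0.5% velocity change are illustrative; for a dominant frequency f0, a phase-shift slope s corresponds to dv/v of about -s/(2*pi*f0).

        import numpy as np
        from scipy.signal import hilbert

        def ipv_slope(c_ref, c_cur, dt):
            """Instantaneous-phase variation between two correlation
            functions: unwrap the analytic-signal phases, difference them,
            and fit a first-order polynomial; return the slope."""
            ph_ref = np.unwrap(np.angle(hilbert(c_ref)))
            ph_cur = np.unwrap(np.angle(hilbert(c_cur)))
            t = np.arange(len(c_ref)) * dt
            slope, _ = np.polyfit(t, ph_cur - ph_ref, 1)
            return slope

        # Synthetic test: a 3 Hz wavelet stretched by a 0.5% velocity increase.
        dt = 0.01
        t = np.arange(0, 10, dt)
        c_ref = np.sin(2 * np.pi * 3 * t) * np.exp(-0.2 * t)
        c_cur = np.sin(2 * np.pi * 3 * t * (1 - 0.005)) * np.exp(-0.2 * t)
        print(ipv_slope(c_ref, c_cur, dt))   # ~ -2*pi*3*0.005 = -0.094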

  3. The effect of completeness of revascularization on event-free survival at one year in the ARTS trial.

    PubMed

    van den Brand, Marcel J B M; Rensing, Benno J W M; Morel, Marie-angèle M; Foley, David P; de Valk, Vincent; Breeman, Arno; Suryapranata, Harry; Haalebos, Maximiliaan M P; Wijns, William; Wellens, Francis; Balcon, Rafael; Magee, Patrick; Ribeiro, Expedito; Buffolo, Enio; Unger, Felix; Serruys, Patrick W

    2002-02-20

    We sought to assess the relationship between completeness of revascularization and adverse events at one year in the ARTS (Arterial Revascularization Therapies Study) trial. It is uncertain to what extent the degree of completeness of revascularization, using up-to-date techniques, influences medium-term outcome. After consensus between surgeon and cardiologist regarding the potential for equivalence in the completeness of revascularization, 1,205 patients with multivessel disease were randomly assigned to either bypass surgery or stent implantation. All baseline and procedural angiograms and surgical case-record forms were centrally assessed for completeness of revascularization. Of 1,205 patients randomized, 1,172 underwent the assigned treatment. Complete data for review were available in 1,143 patients (97.5%). Complete revascularization was achieved in 84.1% of the surgically treated patients and 70.5% of the angioplasty patients (p < 0.001). After one year, the stented angioplasty patients with incomplete revascularization showed a significantly lower event-free survival (i.e., freedom from death, myocardial infarction, cerebrovascular accident and repeat revascularization) than stented patients with complete revascularization (69.4% vs. 76.6%; p < 0.05). This difference was due to a higher incidence of subsequent bypass procedures (10.0% vs. 2.0%; p < 0.05). Conversely, at one year, bypass surgery patients with incomplete revascularization showed only a marginally lower event-free survival rate than those with complete revascularization (87.8% vs. 89.9%). Complete revascularization was more frequently accomplished by bypass surgery than by stent implantation. One year after bypass, there was no significant difference in event-free survival between surgically treated patients with complete revascularization and those with incomplete revascularization, but patients randomized to stenting with incomplete revascularization had a greater need for subsequent bypass surgery.

  4. Record statistics of financial time series and geometric random walks

    NASA Astrophysics Data System (ADS)

    Sabir, Behlool; Santhanam, M. S.

    2014-09-01

    The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of selected stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5 ≤ α ≤ 1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of the geometric random walk series are in good agreement with those obtained from empirical stock data.
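
    A minimal simulation sketch of the record-age statistic studied here: generate geometric random walks (exponentiated cumulative Gaussian steps) and collect the ages for which each standing record survives before being broken. Series length, count, and step volatility are illustrative.

        import numpy as np

        rng = np.random.default_rng(7)
        T, n_series = 2000, 500
        ages = []
        for _ in range(n_series):
            x = np.exp(np.cumsum(rng.normal(0, 0.01, T)))  # geometric random walk
            rec, age = x[0], 0
            for v in x[1:]:
                age += 1
                if v > rec:
                    ages.append(age)   # age at which the standing record broke
                    rec, age = v, 0
        ages = np.array(ages)
        # A log-log histogram of `ages` should display a power-law decay.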

  5. A novel complete-case analysis to determine statistical significance between treatments in an intention-to-treat population of randomized clinical trials involving missing data.

    PubMed

    Liu, Wei; Ding, Jinhui

    2018-04-01

    The application of the intention-to-treat (ITT) principle to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because drug responses in the human body involve complex biological networks, so data may be missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.

  6. Generalised Central Limit Theorems for Growth Rate Distribution of Complex Systems

    NASA Astrophysics Data System (ADS)

    Takayasu, Misako; Watanabe, Hayafumi; Takayasu, Hideki

    2014-04-01

    We introduce a solvable model of randomly growing systems consisting of many independent subunits. Scaling relations and growth rate distributions in the limit of infinite subunits are analysed theoretically. Various types of scaling properties and distributions reported for growth rates of complex systems in a variety of fields can be derived from this basic physical model. Statistical data of growth rates for about 1 million business firms are analysed as a real-world example of randomly growing systems. Not only are the scaling relations consistent with the theoretical solution, but the entire functional form of the growth rate distribution is fitted with a theoretical distribution that has a power-law tail.

  7. Recruitment strategies in two Reproductive Medicine Network infertility trials

    PubMed Central

    Usadi, Rebecca S.; Diamond, Michael P.; Legro, Richard S.; Schlaff, William D.; Hansen, Karl R.; Casson, Peter; Christman, Gregory; Bates, G. Wright; Baker, Valerie; Seungdamrong, Aimee; Rosen, Mitchell P.; Lucidi, Scott; Thomas, Tracey; Huang, Hao; Santoro, Nanette; Eisenberg, Esther; Zhang, Heping; Alvero, Ruben

    2016-01-01

    Background Recruitment of individuals into clinical trials is a critical step in completing studies. Reports examining the effectiveness of different recruitment strategies, and specifically in infertile couples, are limited. Methods We investigated recruitment methods used in two NIH sponsored trials, Pregnancy in Polycystic Ovary Syndrome (PPCOS II) and Assessment of Multiple Intrauterine Gestations from Ovarian Stimulation (AMIGOS), and examined which strategies yielded the greatest number of participants completing the trials. Results 3683 couples were eligible for screening. 1650 participants were randomized and 1339 completed the trials. 750 women were randomized in PPCOS II; 212 of the participants who completed the trial were referred by physicians. Participants recruited from radio ads (84/750) and the internet (81/750) resulted in similar rates of trial completion in PPCOS II. 900 participants were randomized in AMIGOS. 440 participants who completed the trial were referred to the study by physicians. The next most successful method in AMIGOS was use of the internet, achieving 78 completed participants. Radio ads proved the most successful strategy in both trials for participants who earned <$50,000 annually. Radio ads were most successful in enrolling white patients in PPCOS II and black patients in AMIGOS. Seven ancillary Clinical Research Scientist Training (CREST) sites enrolled 324 of the participants who completed the trials. Conclusions Physician referral was the most successful recruitment strategy. Radio ads and the internet were the next most successful strategies, particularly for women of limited income. Ancillary clinical sites were important for overall recruitment. PMID:26386293

  8. Recruitment strategies in two reproductive medicine network infertility trials.

    PubMed

    Usadi, Rebecca S; Diamond, Michael P; Legro, Richard S; Schlaff, William D; Hansen, Karl R; Casson, Peter; Christman, Gregory; Wright Bates, G; Baker, Valerie; Seungdamrong, Aimee; Rosen, Mitchell P; Lucidi, Scott; Thomas, Tracey; Huang, Hao; Santoro, Nanette; Eisenberg, Esther; Zhang, Heping; Alvero, Ruben

    2015-11-01

    Recruitment of individuals into clinical trials is a critical step in completing studies. Reports examining the effectiveness of different recruitment strategies, and specifically in infertile couples, are limited. We investigated recruitment methods used in two NIH sponsored trials, Pregnancy in Polycystic Ovary Syndrome (PPCOS II) and Assessment of Multiple Intrauterine Gestations from Ovarian Stimulation (AMIGOS), and examined which strategies yielded the greatest number of participants completing the trials. 3683 couples were eligible for screening. 1650 participants were randomized and 1339 completed the trials. 750 women were randomized in PPCOS II; 212 of the participants who completed the trial were referred by physicians. Participants recruited from radio ads (84/750) and the internet (81/750) resulted in similar rates of trial completion in PPCOS II. 900 participants were randomized in AMIGOS. 440 participants who completed the trial were referred to the study by physicians. The next most successful method in AMIGOS was the use of the internet, achieving 78 completed participants. Radio ads proved the most successful strategy in both trials for participants who earned <$50,000 annually. Radio ads were most successful in enrolling white patients in PPCOS II and black patients in AMIGOS. Seven ancillary Clinical Research Scientist Training (CREST) sites enrolled 324 of the participants who completed the trials. Physician referral was the most successful recruitment strategy. Radio ads and the internet were the next most successful strategies, particularly for women of limited income. Ancillary clinical sites were important for overall recruitment. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Time series analysis of collective motions in proteins

    NASA Astrophysics Data System (ADS)

    Alakent, Burak; Doruker, Pemra; Çamurdan, Mehmet C.

    2004-01-01

    The dynamics of α-amylase inhibitor tendamistat around its native state is investigated using time series analysis of the principal components of the Cα atomic displacements obtained from molecular dynamics trajectories. Collective motion along a principal component is modeled as a homogeneous nonstationary process, which is the result of the damped oscillations in local minima superimposed on a random walk. The motion in local minima is described by a stationary autoregressive moving average model, consisting of the frequency, damping factor, moving average parameters and random shock terms. Frequencies for the first 50 principal components are found to be in the 3-25 cm-1 range, which are well correlated with the principal component indices and also with atomistic normal mode analysis results. Damping factors, though their correlation is less pronounced, decrease as principal component indices increase, indicating that low frequency motions are less affected by friction. The existence of a positive moving average parameter indicates that the stochastic force term is likely to disturb the mode in opposite directions for two successive sampling times, showing the mode's tendency to stay close to a minimum. All four parameters affect the mean square fluctuations of a principal mode within a single minimum. The inter-minima transitions are described by a random walk model, which is driven by a random shock term considerably smaller than that for the intra-minimum motion. The principal modes are classified into three subspaces based on their dynamics: essential, semiconstrained, and constrained, at least in partial consistency with previous studies. The Gaussian-type distributions of the intermediate modes, called "semiconstrained" modes, are explained by asserting that this random walk behavior is not completely free but takes place between energy barriers.
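
    The frequency and damping factor of a damped mode can be read off a fitted autoregressive model; a minimal sketch for the AR(2) part follows (the moving-average terms of the full ARMA model are omitted). The synthetic damped cosine is a stand-in for a principal-component series; all numbers are illustrative.

        import numpy as np

        def ar2_frequency_damping(x, dt):
            """Least-squares AR(2) fit; the complex roots of z^2 - a1*z - a2
            give the mode's oscillation frequency (argument of the root) and
            damping (from the modulus of the root)."""
            Xmat = np.column_stack([x[1:-1], x[:-2]])
            a1, a2 = np.linalg.lstsq(Xmat, x[2:], rcond=None)[0]
            z = np.roots([1.0, -a1, -a2])
            z = z[np.argmax(np.abs(z.imag))]
            freq = abs(np.angle(z)) / (2 * np.pi * dt)   # oscillation frequency
            decay = -np.log(abs(z)) / dt                 # envelope decay rate
            return freq, decay

        rng = np.random.default_rng(8)
        t = np.arange(0, 50, 0.05)
        x = (np.exp(-0.1 * t) * np.cos(2 * np.pi * 0.5 * t)
             + 0.01 * rng.normal(size=t.size))
        print(ar2_frequency_damping(x, 0.05))    # approximately (0.5, 0.1)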

  10. Alternative approaches for studying humanitarian interventions: propensity score methods to evaluate reintegration packages impact on depression, PTSD, and function impairment among child soldiers in Nepal.

    PubMed

    Kohrt, B A; Burkey, M; Stuart, E A; Koirala, S

    2015-01-01

    Ethical, logistical, and funding constraints preclude conducting randomized controlled trials (RCTs) in some humanitarian crises. A lack of RCTs and other intervention research has contributed to a limited evidence base for mental health and psychosocial support (MHPS) programs after disasters, war, and disease outbreaks. Propensity score methods (PSMs) are an alternative analysis technique with potential application for evaluating MHPS programs in humanitarian emergencies. PSMs were used to evaluate the impacts of education reintegration packages (ERPs) and other (vocational or economic) reintegration packages (ORPs) vs. no reintegration programs on the mental health of child soldiers. Propensity scores were used to determine the weighting of child soldiers in each of the three treatment arms. Multiple linear regression was used to estimate adjusted changes in symptom score severity on culturally validated measures of depression, post-traumatic stress disorder (PTSD), and functional impairment from baseline to 1-year follow-up. Among 258 Nepali child soldiers participating in reintegration programs, 54.7% completed ERP and 22.9% completed ORP. There was a non-significant reduction in depression by 0.59 (95% CI -1.97 to 0.70) for ERP and by 0.60 (95% CI -2.16 to 0.96) for ORP compared with no treatment. There were non-significant increases in PTSD (1.15, 95% CI -1.55 to 3.86) and functional impairment (0.91, 95% CI -0.31 to 2.14) associated with ERP, and similar findings for ORP (PTSD: 0.66, 95% CI -2.24 to 3.57; functional impairment: 1.05, 95% CI -0.71 to 2.80). In a humanitarian crisis in which a non-randomized intervention assignment protocol was employed, the statistical technique of PSMs addressed differences in covariate distribution between child soldiers who received different reintegration packages. Our analysis did not demonstrate significant changes in psychosocial outcomes for ERPs and ORPs. We suggest the use of PSMs for evaluating interventions in humanitarian crises when randomized designs are not feasible.
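
    A generic sketch of propensity-score weighting for a three-arm, non-randomized design like the one described (not the authors' exact estimator, which used multiple linear regression on weighted samples): fit a multinomial propensity model, weight each subject by the inverse probability of the arm actually received, and compare weighted outcome means. The covariates, arms, and outcome below are synthetic.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def ipw_means(X, arm, y):
            """Three-arm inverse-propensity weighting: multinomial logistic
            propensity model, then weighted outcome means per arm, with
            weights 1 / Pr(arm actually received | X)."""
            ps = LogisticRegression(max_iter=1000).fit(X, arm).predict_proba(X)
            w = 1.0 / ps[np.arange(len(arm)), arm]
            return [np.average(y[arm == a], weights=w[arm == a])
                    for a in np.unique(arm)]

        rng = np.random.default_rng(9)
        X = rng.normal(size=(300, 2))         # e.g. age, baseline severity
        arm = rng.integers(0, 3, 300)         # none / ERP / ORP (synthetic)
        y = X @ np.array([1.0, -0.5]) + 0.3 * arm + rng.normal(size=300)
        print(ipw_means(X, arm, y))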

  11. Spatial Analysis of “Crazy Quilts”, a Class of Potentially Random Aesthetic Artefacts

    PubMed Central

    Westphal-Fitch, Gesche; Fitch, W. Tecumseh

    2013-01-01

    Human artefacts in general are highly structured and often display ordering principles such as translational, reflectional or rotational symmetry. In contrast, human artefacts that are intended to appear random and non-symmetrical are very rare. Furthermore, many studies show that humans find it extremely difficult to recognize or reproduce truly random patterns or sequences. Here, we attempt to model two-dimensional decorative spatial patterns produced by humans that show no obvious order. “Crazy quilts” represent a historically important style of quilt making that became popular in the 1870s, and lasted about 50 years. Crazy quilts are unusual because unlike most human artefacts, they are specifically intended to appear haphazard and unstructured. We evaluate the degree to which this intention was achieved by using statistical techniques of spatial point pattern analysis to compare crazy quilts with regular quilts from the same region and era and to evaluate the fit of various random distributions to these two quilt classes. We found that the two quilt categories exhibit fundamentally different spatial characteristics: The patch areas of crazy quilts derive from a continuous random distribution, while area distributions of regular quilts consist of Gaussian mixtures. These Gaussian mixtures derive from regular pattern motifs that are repeated and we suggest that such a mixture is a distinctive signature of human-made visual patterns. In contrast, the distribution found in crazy quilts is shared with many other naturally occurring spatial patterns. Centroids of patches in the two quilt classes are spaced differently and in general, crazy quilts but not regular quilts are well-fitted by a random Strauss process. These results indicate that, within the constraints of the quilt format, Victorian quilters indeed achieved their goal of generating random structures. PMID:24066095

  12. Spatial analysis of "crazy quilts", a class of potentially random aesthetic artefacts.

    PubMed

    Westphal-Fitch, Gesche; Fitch, W Tecumseh

    2013-01-01

    Human artefacts in general are highly structured and often display ordering principles such as translational, reflectional or rotational symmetry. In contrast, human artefacts that are intended to appear random and non-symmetrical are very rare. Furthermore, many studies show that humans find it extremely difficult to recognize or reproduce truly random patterns or sequences. Here, we attempt to model two-dimensional decorative spatial patterns produced by humans that show no obvious order. "Crazy quilts" represent a historically important style of quilt making that became popular in the 1870s, and lasted about 50 years. Crazy quilts are unusual because unlike most human artefacts, they are specifically intended to appear haphazard and unstructured. We evaluate the degree to which this intention was achieved by using statistical techniques of spatial point pattern analysis to compare crazy quilts with regular quilts from the same region and era and to evaluate the fit of various random distributions to these two quilt classes. We found that the two quilt categories exhibit fundamentally different spatial characteristics: The patch areas of crazy quilts derive from a continuous random distribution, while area distributions of regular quilts consist of Gaussian mixtures. These Gaussian mixtures derive from regular pattern motifs that are repeated and we suggest that such a mixture is a distinctive signature of human-made visual patterns. In contrast, the distribution found in crazy quilts is shared with many other naturally occurring spatial patterns. Centroids of patches in the two quilt classes are spaced differently and in general, crazy quilts but not regular quilts are well-fitted by a random Strauss process. These results indicate that, within the constraints of the quilt format, Victorian quilters indeed achieved their goal of generating random structures.

  13. Alcohol assessment among college students using wireless mobile technology.

    PubMed

    Bernhardt, Jay M; Usdan, Stuart; Mays, Darren; Martin, Ryan; Cremeens, Jennifer; Arriola, Kimberly Jacob

    2009-09-01

    This study used a two-group randomized design to assess the validity of measuring self-reported alcohol consumption among college students using the Handheld Assisted Network Diary (HAND), a daily diary assessment administered using wireless mobile devices. A convenience sample of college students was recruited at a large, public university in the southeastern United States and randomized into two groups. A randomly assigned group of 86 students completed the daily HAND assessment during the 30-day study and a Timeline Followback (TLFB) at 30-day follow-up. A randomly assigned group of 82 students completed the paper-and-pencil Daily Social Diary (DSD) over the same study period. Data from the daily HAND assessment were compared with the TLFB completed at follow-up by participants who completed the HAND using 95% limits of agreement analysis. Furthermore, individual growth models were used to examine differences between the HAND and DSD by comparing the total drinks, drinking days, and drinks per drinking day captured by the two assessments over the study period. Results suggest that the HAND captured similar levels of alcohol use compared with the TLFB completed at follow-up by the same participants. In addition, comparisons of the two study groups suggest that, controlling for baseline alcohol use and demographics, the HAND assessment captured similar levels of total drinks, drinking days, and drinks per drinking day as the paper-and-pencil DSD. The study findings support the validity of wireless mobile devices as a daily assessment of alcohol use among college students.
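
    The comparison of the HAND diary with the follow-up TLFB described above rests on 95% limits-of-agreement analysis; a minimal sketch of that computation (Bland-Altman style, mean difference plus or minus 1.96 standard deviations) follows, with invented drink counts.

        import numpy as np

        def limits_of_agreement(a, b):
            """95% limits of agreement between two measures of the same
            quantity: mean difference +/- 1.96 * SD of the differences."""
            d = np.asarray(a, float) - np.asarray(b, float)
            m, sd = d.mean(), d.std(ddof=1)
            return m, (m - 1.96 * sd, m + 1.96 * sd)

        hand = [4, 0, 2, 6, 3, 1, 5]   # invented drinks/day from the device diary
        tlfb = [5, 0, 2, 5, 4, 1, 4]   # invented TLFB recall for the same days
        print(limits_of_agreement(hand, tlfb))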

  14. A random matrix approach to credit risk.

    PubMed

    Münnix, Michael C; Schäfer, Rudi; Guhr, Thomas

    2014-01-01

    We estimate generic statistical properties of a structural credit risk model by considering an ensemble of correlation matrices. This ensemble is set up by Random Matrix Theory. We demonstrate analytically that the presence of correlations severely limits the effect of diversification in a credit portfolio if the correlations are not identically zero. The existence of correlations alters the tails of the loss distribution considerably, even if their average is zero. Under the assumption of randomly fluctuating correlations, a lower bound for the estimation of the loss distribution is provided.

  15. A Random Matrix Approach to Credit Risk

    PubMed Central

    Guhr, Thomas

    2014-01-01

    We estimate generic statistical properties of a structural credit risk model by considering an ensemble of correlation matrices. This ensemble is set up by Random Matrix Theory. We demonstrate analytically that the presence of correlations severely limits the effect of diversification in a credit portfolio if the correlations are not identically zero. The existence of correlations alters the tails of the loss distribution considerably, even if their average is zero. Under the assumption of randomly fluctuating correlations, a lower bound for the estimation of the loss distribution is provided. PMID:24853864

  16. The influence of statistical properties of Fourier coefficients on random Gaussian surfaces.

    PubMed

    de Castro, C P; Luković, M; Andrade, R F S; Herrmann, H J

    2017-05-16

    Many examples of natural systems can be described by random Gaussian surfaces. Much can be learned by analyzing the Fourier expansion of the surfaces, from which it is possible to determine the corresponding Hurst exponent and consequently establish the presence of scale invariance. We show that this symmetry is not affected by the distribution of the modulus of the Fourier coefficients. Furthermore, we investigate the role of the Fourier phases of random surfaces. In particular, we show how the surface is affected by a non-uniform distribution of phases.
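
    A small sketch of the spectral synthesis behind such analyses: build a Gaussian self-affine surface by assigning a power-law Fourier modulus for a chosen Hurst exponent H and uniformly random phases. Grid size and H are illustrative; a production implementation would also enforce Hermitian symmetry rather than taking the real part.

        import numpy as np

        def gaussian_surface(n=256, H=0.7, rng=np.random.default_rng(10)):
            """Spectral synthesis: Fourier modulus |F(k)| ~ k^-(H+1), i.e.
            power spectrum ~ k^-(2H+2), with uniformly random phases."""
            k = np.fft.fftfreq(n)
            kx, ky = np.meshgrid(k, k)
            kr = np.hypot(kx, ky)
            kr[0, 0] = np.inf                 # suppress the k = 0 mode
            amp = kr ** (-(H + 1.0))
            phase = rng.uniform(0, 2 * np.pi, (n, n))
            surf = np.fft.ifft2(amp * np.exp(1j * phase)).real
            return (surf - surf.mean()) / surf.std()

        z = gaussian_surface()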

  17. NHash: Randomized N-Gram Hashing for Distributed Generation of Validatable Unique Study Identifiers in Multicenter Research

    PubMed Central

    Zhang, Guo-Qiang; Tao, Shiqiang; Xing, Guangming; Mozes, Jeno; Zonjy, Bilal; Lhatoo, Samden D

    2015-01-01

    Background A unique study identifier serves as a key for linking research data about a study subject without revealing protected health information in the identifier. While sufficient for single-site and limited-scale studies, the use of common unique study identifiers has several drawbacks for large multicenter studies, where thousands of research participants may be recruited from multiple sites. An important property of study identifiers is error tolerance (or validatable), in that inadvertent editing mistakes during their transmission and use will most likely result in invalid study identifiers. Objective This paper introduces a novel method called "Randomized N-gram Hashing (NHash)," for generating unique study identifiers in a distributed and validatable fashion, in multicenter research. NHash has a unique set of properties: (1) it is a pseudonym serving the purpose of linking research data about a study participant for research purposes; (2) it can be generated automatically in a completely distributed fashion with virtually no risk for identifier collision; (3) it incorporates a set of cryptographic hash functions based on N-grams, with a combination of additional encryption techniques such as a shift cipher; (d) it is validatable (error tolerant) in the sense that inadvertent edit errors will mostly result in invalid identifiers. Methods NHash consists of 2 phases. First, an intermediate string using randomized N-gram hashing is generated. This string consists of a collection of N-gram hashes f 1, f 2, ..., f k. The input for each function f i has 3 components: a random number r, an integer n, and input data m. The result, f i(r, n, m), is an n-gram of m with a starting position s, which is computed as (r mod |m|), where |m| represents the length of m. The output for Step 1 is the concatenation of the sequence f 1(r 1, n 1, m 1), f 2(r 2, n 2, m 2), ..., f k(r k, n k, m k). In the second phase, the intermediate string generated in Phase 1 is encrypted using techniques such as shift cipher. The result of the encryption, concatenated with the random number r, is the final NHash study identifier. Results We performed experiments using a large synthesized dataset comparing NHash with random strings, and demonstrated neglegible probability for collision. We implemented NHash for the Center for SUDEP Research (CSR), a National Institute for Neurological Disorders and Stroke-funded Center Without Walls for Collaborative Research in the Epilepsies. This multicenter collaboration involves 14 institutions across the United States and Europe, bringing together extensive and diverse expertise to understand sudden unexpected death in epilepsy patients (SUDEP). Conclusions The CSR Data Repository has successfully used NHash to link deidentified multimodal clinical data collected in participating CSR institutions, meeting all desired objectives of NHash. PMID:26554419
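
    A simplified sketch of the two-phase construction described in the Methods (using one shared random number for brevity, whereas the paper draws a random number per function, and omitting its additional cryptographic hash layer). Field values are assumed to be drawn from an uppercase-plus-digits alphabet; the example fields are hypothetical.

        import random
        import string

        ALPHABET = string.ascii_uppercase + string.digits

        def ngram(r, n, m):
            """f(r, n, m): the n-gram of m starting at position r mod |m|
            (wrapping around so the n-gram is always complete)."""
            s = r % len(m)
            return (m + m)[s:s + n]

        def shift_cipher(text, key):
            return "".join(ALPHABET[(ALPHABET.index(c) + key) % len(ALPHABET)]
                           for c in text)

        def nhash(fields, n=3, key=7, rng=random.Random(42)):
            """Phase 1: concatenate randomized n-grams of the input fields.
            Phase 2: shift-cipher the result and append the random number."""
            r = rng.randrange(10_000)
            intermediate = "".join(ngram(r, n, m) for m in fields)
            return shift_cipher(intermediate, key) + format(r, "04d")

        print(nhash(["SITE03", "COHORT1", "EEG2015"]))   # hypothetical fields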

  18. Hundred-watt-level high power random distributed feedback Raman fiber laser at 1150 nm and its application in mid-infrared laser generation.

    PubMed

    Zhang, Hanwei; Zhou, Pu; Wang, Xiong; Du, Xueyuan; Xiao, Hu; Xu, Xiaojun

    2015-06-29

    Two kinds of hundred-watt-level random distributed feedback Raman fiber lasers have been demonstrated. The optical efficiency reaches as high as 84.8%; to our knowledge, the reported power and efficiency of the random laser are the highest so far. We have also demonstrated that the developed random laser can be used to pump a Ho-doped fiber laser for mid-infrared laser generation, achieving a 23 W laser output at 2050 nm. The presented laser provides high power output efficiently and conveniently and opens a new direction for high power laser sources at designed wavelengths.

  19. Statistical Orbit Determination using the Particle Filter for Incorporating Non-Gaussian Uncertainties

    NASA Technical Reports Server (NTRS)

    Mashiku, Alinda; Garrison, James L.; Carpenter, J. Russell

    2012-01-01

    The tracking of space objects requires frequent and accurate monitoring for collision avoidance. As even collision events with very low probability are important, accurate prediction of collisions requires the representation of the full probability density function (PDF) of the random orbit state. By representing the full PDF of the orbit state for orbit maintenance and collision avoidance, we can take advantage of the statistical information present in the heavy-tailed distributions, more accurately representing the orbit states with low probability. The classical methods of orbit determination (i.e. the Kalman Filter and its derivatives) provide state estimates based on only the second moments of the state and measurement errors that are captured by assuming a Gaussian distribution. Although the measurement errors can be accurately assumed to have a Gaussian distribution, errors with a non-Gaussian distribution could arise during propagation between observations. Moreover, unmodeled dynamics in the orbit model could introduce non-Gaussian errors into the process noise. A Particle Filter (PF) is proposed as a nonlinear filtering technique that is capable of propagating and estimating a more complete representation of the state distribution as an accurate approximation of a full PDF. The PF uses Monte Carlo runs to generate particles that approximate the full PDF representation. The PF is applied in the estimation and propagation of a highly eccentric orbit and the results are compared to the Extended Kalman Filter and Splitting Gaussian Mixture algorithms to demonstrate its proficiency.
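
    A minimal bootstrap particle filter sketch for intuition, using a scalar random-walk state in place of the paper's orbital dynamics; the particle cloud carries the full (possibly non-Gaussian) posterior, unlike a Kalman filter's mean and covariance. All noise levels and sizes are illustrative.

        import numpy as np

        def bootstrap_pf(y, n_p=2000, q=0.1, r=0.5, rng=np.random.default_rng(11)):
            """Bootstrap particle filter for a scalar random-walk state
            observed in Gaussian noise."""
            x = rng.normal(0.0, 1.0, n_p)                # initial particle cloud
            means = []
            for obs in y:
                x = x + rng.normal(0.0, q, n_p)          # propagate (process noise)
                w = np.exp(-0.5 * ((obs - x) / r) ** 2)  # likelihood weights
                w /= w.sum()
                x = rng.choice(x, size=n_p, p=w)         # bootstrap resampling
                means.append(x.mean())
            return np.array(means)

        rng = np.random.default_rng(12)
        truth = np.cumsum(rng.normal(0, 0.1, 100))       # hidden state
        y = truth + rng.normal(0, 0.5, 100)              # noisy observations
        est = bootstrap_pf(y)                            # est tracks truth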

  20. Declines in moose population density at Isle Royale National Park, MI, USA and accompanying changes in landscape patterns

    USGS Publications Warehouse

    De Jager, N. R.; Pastor, J.

    2009-01-01

    Ungulate herbivores create patterns of forage availability, plant species composition, and soil fertility as they range across large landscapes and consume large quantities of plant material. Over time, herbivore populations fluctuate, producing great potential for spatio-temporal landscape dynamics. In this study, we extend the spatial and temporal scope of a long-term investigation of the relationship of landscape patterns to moose foraging behavior at Isle Royale National Park, MI. We examined how patterns of browse availability and consumption, plant basal area, and soil fertility changed during a recent decline in the moose population. We used geostatistics to examine changes in the nature of spatial patterns in two valleys over 18 years and across short-range and long-range distance scales. Landscape patterns of available and consumed browse changed from either repeated patches or randomly distributed patches in 1988-1992 to random point distributions by 2007 after a recent record high peak followed by a rapid decline in the moose population. Patterns of available and consumed browse became decoupled during the moose population low, in contrast to the coupled patterns during the earlier high moose population. Distributions of plant basal area and soil nitrogen availability also switched from repeated patches to randomly distributed patches in one valley and to random point distributions in the other valley. Rapid declines in moose population density may release vegetation and soil fertility from browsing pressure and in turn create random landscape patterns. © Springer Science+Business Media B.V. 2009.

  1. A comparison of numerical solutions of partial differential equations with probabilistic and possibilistic parameters for the quantification of uncertainty in subsurface solute transport.

    PubMed

    Zhang, Kejiang; Achari, Gopal; Li, Hua

    2009-11-03

    Traditionally, uncertainty in parameters is represented as probabilistic distributions and incorporated into groundwater flow and contaminant transport models. With the advent of newer uncertainty theories, it is now understood that stochastic methods cannot properly represent non-random uncertainties. In the groundwater flow and contaminant transport equations, uncertainty in some parameters may be random, whereas that of others may be non-random. The objective of this paper is to develop a fuzzy-stochastic partial differential equation (FSPDE) model to simulate conditions where both random and non-random uncertainties are involved in groundwater flow and solute transport. Three potential solution techniques, namely (a) transforming a probability distribution to a possibility distribution (Method I), so that the FSPDE becomes a fuzzy partial differential equation (FPDE); (b) transforming a possibility distribution to a probability distribution (Method II), so that the FSPDE becomes a stochastic partial differential equation (SPDE); and (c) combining Monte Carlo methods with FPDE solution techniques (Method III), are proposed and compared. The effects of these three methods on the predictive results are investigated using two case studies. The results show that the predictions obtained from Method II are a special case of those obtained from Method I. When an exact probabilistic result is needed, Method II is suggested. As the loss or gain of information during a probability-possibility (or vice versa) transformation cannot be quantified, its influence on the predictive results is not known. Thus, Method III should probably be preferred for risk assessments.

  2. Thermodynamic method for generating random stress distributions on an earthquake fault

    USGS Publications Warehouse

    Barall, Michael; Harris, Ruth A.

    2012-01-01

    This report presents a new method for generating random stress distributions on an earthquake fault, suitable for use as initial conditions in a dynamic rupture simulation. The method employs concepts from thermodynamics and statistical mechanics. A pattern of fault slip is considered to be analogous to a micro-state of a thermodynamic system. The energy of the micro-state is taken to be the elastic energy stored in the surrounding medium. Then, the Boltzmann distribution gives the probability of a given pattern of fault slip and stress. We show how to decompose the system into independent degrees of freedom, which makes it computationally feasible to select a random state. However, due to the equipartition theorem, straightforward application of the Boltzmann distribution leads to a divergence which predicts infinite stress. To avoid equipartition, we show that the finite strength of the fault acts to restrict the possible states of the system. By analyzing a set of earthquake scaling relations, we derive a new formula for the expected power spectral density of the stress distribution, which allows us to construct a computer algorithm free of infinities. We then present a new technique for controlling the extent of the rupture by generating a random stress distribution thousands of times larger than the fault surface, and selecting a portion which, by chance, has a positive stress perturbation of the desired size. Finally, we present a new two-stage nucleation method that combines a small zone of forced rupture with a larger zone of reduced fracture energy.
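    The report derives its own spectral formula, but the underlying trick of drawing a random field with a prescribed power spectral density can be sketched by shaping white Gaussian noise in the Fourier domain. The power-law exponent below is an illustrative assumption, not the formula derived in the report:

```python
import numpy as np

def random_field_with_psd(n, psd_exponent=-2.0, seed=1):
    """Draw a 1-D random profile whose expected PSD falls off as k**psd_exponent."""
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n)                           # nonnegative wavenumbers
    amplitude = np.zeros_like(k)
    amplitude[1:] = k[1:] ** (psd_exponent / 2.0)    # sqrt of target PSD; k = 0 term dropped
    noise = rng.normal(size=k.size) + 1j * rng.normal(size=k.size)
    return np.fft.irfft(amplitude * noise, n)        # zero-mean random stress profile

stress = random_field_with_psd(4096)
print(f"std of synthetic stress profile: {stress.std():.4f}")
```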

  3. Latin Hypercube Sampling (LHS) UNIX Library/Standalone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2004-05-13

    The LHS UNIX Library/Standalone software provides the capability to draw random samples from over 30 distribution types. It performs the sampling by a stratified sampling method called Latin Hypercube Sampling (LHS). Multiple distributions can be sampled simultaneously, with user-specified correlations amongst the input distributions; LHS UNIX Library/Standalone thus provides a way to generate multi-variate samples. The LHS samples can be generated either as a callable library (e.g., from within the DAKOTA software framework) or as a standalone capability. LHS is a constrained Monte Carlo sampling scheme. In LHS, the range of each variable is divided into non-overlapping intervals on the basis of equal probability. A sample is selected at random with respect to the probability density in each interval. If multiple variables are sampled simultaneously, then the values obtained for each are paired in a random manner with the n values of the other variables. In some cases, the pairing is restricted to obtain specified correlations amongst the input variables. Many simulation codes have input parameters that are uncertain and can be specified by a distribution. To perform uncertainty analysis and sensitivity analysis, random values are drawn from the input parameter distributions, and the simulation is run with these values to obtain output values. If this is done repeatedly, with many input samples drawn, one can build up a distribution of the output as well as examine correlations between input and output variables.
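    A minimal sketch of the stratified-sampling idea just described (equal-probability intervals, one random draw per interval, random pairing across variables); real LHS libraries add the correlation control that this toy version omits:

```python
import numpy as np
from scipy.stats import norm

def latin_hypercube(n_samples, n_vars, seed=0):
    """Return an (n_samples, n_vars) array of LHS samples on [0, 1)."""
    rng = np.random.default_rng(seed)
    # One uniform draw inside each of n_samples equal-probability strata...
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_vars))) / n_samples
    # ...then pair the strata across variables in a random order.
    for j in range(n_vars):
        rng.shuffle(u[:, j])
    return u

samples = latin_hypercube(10, 3)
# Map the uniform samples through any inverse CDF, e.g. a standard normal:
print(norm.ppf(samples)[:3])
```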

  4. A prospective randomized clinical trial to evaluate methods of postoperative care of hypospadias.

    PubMed

    McLorie, G; Joyner, B; Herz, D; McCallum, J; Bagli, D; Merguerian, P; Khoury, A

    2001-05-01

    Hypospadias repair is a common operation performed by pediatric urologists. Perhaps the greatest variable and source of controversy in postoperative care is the surgical dressing. We hypothesized that using no dressing would achieve surgical results comparable to those traditionally achieved with a postoperative dressing, while also simplifying postoperative parent-delivered home care. Accordingly, we designed a prospective randomized clinical trial to compare surgical outcome and postoperative care after hypospadias repair in boys with no dressing and those who received 1 of the 2 most common types of dressing. In a 12-month period 120 boys with an average age of 2.2 years underwent primary 1-stage hypospadias repair at a single center with 4 participating surgeons. Repair was performed in 60 boys with proximal and 60 with distal hypospadias on an outpatient basis. Ethics and Institutional Review Board approval, and informed consent, were obtained. Boys were then prospectively randomized to receive no dressing, an adhesive biomembrane dressing or a compressive wrap dressing. Comprehensive instructions on postoperative care were distributed to all families and a questionnaire was distributed to the parents at the initial followup. Surgical outcome was evaluated and questionnaire responses were analyzed. Fisher's exact test was done to test the significance of differences in surgical outcomes and questionnaire responses. A total of 117 boys completed the prospective randomized trial. Surgical staff withdrew 3 cases from randomized selection to place a dressing for postoperative hemostasis. We obtained 101 questionnaires for response analysis. The type or absence of the dressing did not correlate with the need for repeat procedures, urethrocutaneous fistula, or meatal stenosis or regression. Analysis revealed less narcotic use in the no dressing group and fewer telephone calls to the urology nurse, on-call resident and/or fellow. These findings were statistically significant. In addition, there were more unscheduled visits to the urology clinic, emergency room or primary physician office by boys with a dressing than by those without one. Furthermore, 29% of the parents were not psychologically prepared to remove the dressing and 12% were so reluctant that the dressing was removed at the urology outpatient clinic. The surgical outcome and rate of adverse events or complications were not compromised without a postoperative dressing. The absence of a dressing simplified postoperative ambulatory parent-delivered home care. We recommend that dressings be omitted from routine use after hypospadias repair.

  5. Scalable and fault tolerant orthogonalization based on randomized distributed data aggregation

    PubMed Central

    Gansterer, Wilfried N.; Niederbrucker, Gerhard; Straková, Hana; Schulze Grotthoff, Stefan

    2013-01-01

    The construction of distributed algorithms for matrix computations built on top of distributed data aggregation algorithms with randomized communication schedules is investigated. For this purpose, a new aggregation algorithm for summing or averaging distributed values, the push-flow algorithm, is developed, which achieves superior resilience properties with respect to failures compared to existing aggregation methods. It is illustrated that on a hypercube topology it asymptotically requires the same number of iterations as the optimal all-to-all reduction operation and that it scales well with the number of nodes. Orthogonalization is studied as a prototypical matrix computation task. A new fault tolerant distributed orthogonalization method rdmGS, which can produce accurate results even in the presence of node failures, is built on top of distributed data aggregation algorithms. PMID:24748902
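    The push-flow algorithm itself is the paper's contribution, but the flavor of the randomized gossip-based aggregation it builds on can be illustrated with the classical push-sum scheme: each node keeps a (sum, weight) pair, repeatedly shares half of it with a randomly chosen partner, and every local ratio converges to the global average. A minimal sketch assuming a complete communication graph:

```python
import random

def push_sum_average(values, rounds=200, seed=42):
    """Randomized gossip averaging: node i keeps (s_i, w_i) and estimates s_i / w_i."""
    rng = random.Random(seed)
    n = len(values)
    s = list(values)          # running sums
    w = [1.0] * n             # running weights
    for _ in range(rounds):
        i = rng.randrange(n)
        j = rng.randrange(n)  # random communication partner
        half_s, half_w = s[i] / 2, w[i] / 2
        s[i], w[i] = half_s, half_w   # keep half...
        s[j] += half_s                # ...push the other half to the partner
        w[j] += half_w
    return [si / wi for si, wi in zip(s, w)]

print(push_sum_average([1.0, 2.0, 3.0, 10.0])[:2])  # every estimate -> 4.0, the mean
```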

  6. Smoking cessation and reduction in schizophrenia (SCARIS) with e-cigarette: study protocol for a randomized control trial.

    PubMed

    Caponnetto, Pasquale; Polosa, Riccardo; Auditore, Roberta; Minutolo, Giuseppe; Signorelli, Maria; Maglia, Marilena; Alamo, Angela; Palermo, Filippo; Aguglia, Eugenio

    2014-03-22

    It is well established in studies across several countries that tobacco smoking is more prevalent among schizophrenic patients than in the general population. Electronic cigarettes are becoming increasingly popular with smokers worldwide. To date there are no large randomized trials of electronic cigarettes in schizophrenic smokers. A well-designed trial is needed to compare the efficacy and safety of these products in this special population. We have designed a randomized controlled trial investigating the efficacy and safety of electronic cigarettes. The trial will take the form of a prospective 12-month randomized clinical study to evaluate smoking reduction, smoking abstinence and adverse events in schizophrenic smokers not intending to quit. We will also monitor quality of life and neurocognitive functioning, and measure participants' perception and satisfaction of the product. A ≥50% reduction in the number of cigarettes/day from baseline will be calculated at each study visit ("reducers"). Abstinence from smoking will be calculated at each study visit ("quitters"). Smokers who leave the study protocol before its completion and undergo the Early Termination Visit, or who do not satisfy the criteria for "reducers" or "quitters", will be defined as "non-responders". The differences in continuous variables between the three groups will be evaluated with the Kruskal-Wallis test, followed by the Dunn multiple comparison test. The differences between the three groups for normally distributed data will be evaluated with a one-way ANOVA, followed by the Newman-Keuls multiple comparison test. The normality of the distribution will be evaluated with the Kolmogorov-Smirnov test. Any correlations between the variables under evaluation will be assessed by Spearman r correlation. Qualitative data will be compared using the chi-square test. The main strengths of the SCARIS study are the following: it is the first large RCT in schizophrenic patients, involving both inpatients and outpatients, using a three-arm study design with long-term follow-up (52 weeks). The goal is to propose an effective intervention to reduce the risk of tobacco smoking, as a complementary tool to treat tobacco addiction in schizophrenia. ClinicalTrials.gov, NCT01979796.

  7. Detection limits for nanoparticles in solution with classical turbidity spectra

    NASA Astrophysics Data System (ADS)

    Le Blevennec, G.

    2013-09-01

    Detection of nanoparticles in solution is required to manage safety and environmental problems. The spectral transmission turbidity method has been known for a long time. It is derived from Mie theory and can be applied to any number of spheres, randomly distributed and separated by distances large compared to the wavelength. Here, we describe a method for determining the size, distribution and concentration of nanoparticles in solution using UV-Vis transmission measurements. The method combines Mie and Beer-Lambert computation integrated in a best-fit approximation. In a first step, the approach is validated on a silver nanoparticle solution. Results are verified with Transmission Electron Microscopy measurements for size distribution and Inductively Coupled Plasma Mass Spectrometry for concentration. In view of the good agreement obtained, a second step of the work focuses on how to choose the concentration that yields the most accurate size distribution. These optimal conditions are determined by simple computation. As we are dealing with nanoparticles, one of the key points is to know what size limits are reachable with this kind of approach based on classical electromagnetism. Taking into account the accuracy limit of the transmission spectrometer, we determine, for several types of materials (metals, dielectrics, semiconductors), the particle size limit detectable by such a turbidity method. These surprising results are situated at the quantum physics frontier.
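    The forward model inside the best-fit loop is compact: Beer-Lambert attenuation with a Mie extinction cross-section. A sketch for a monodisperse suspension, where q_ext is a hypothetical stand-in for a real Mie calculation:

```python
import numpy as np

def transmission(wavelength_nm, diameter_nm, number_density_m3, path_m, q_ext):
    """Beer-Lambert: T = exp(-N * C_ext * L), with C_ext = Q_ext * pi * (d/2)**2."""
    radius_m = 0.5 * diameter_nm * 1e-9
    c_ext = q_ext(wavelength_nm, diameter_nm) * np.pi * radius_m**2
    return np.exp(-number_density_m3 * c_ext * path_m)

# Toy stand-in for a Mie extinction efficiency (a real fit would use Mie theory here):
toy_q_ext = lambda wl, d: 2.0 * (d / wl) ** 0.5

wl = np.linspace(300, 800, 6)
print(transmission(wl, diameter_nm=50, number_density_m3=1e16, path_m=0.01, q_ext=toy_q_ext))
```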

  8. Modeling Achievement Trajectories when Attrition Is Informative

    ERIC Educational Resources Information Center

    Feldman, Betsy J.; Rabe-Hesketh, Sophia

    2012-01-01

    In longitudinal education studies, assuming that dropout and missing data occur completely at random is often unrealistic. When the probability of dropout depends on covariates and observed responses (called "missing at random" [MAR]), or on values of responses that are missing (called "informative" or "not missing at random" [NMAR]),…

  9. Quantum tunneling recombination in a system of randomly distributed trapped electrons and positive ions.

    PubMed

    Pagonis, Vasilis; Kulp, Christopher; Chaney, Charity-Grace; Tachiya, M

    2017-09-13

    During the past 10 years, quantum tunneling has been established as one of the dominant mechanisms for recombination in random distributions of electrons and positive ions, and in many dosimetric materials. Specifically quantum tunneling has been shown to be closely associated with two important effects in luminescence materials, namely long term afterglow luminescence and anomalous fading. Two of the common assumptions of quantum tunneling models based on random distributions of electrons and positive ions are: (a) An electron tunnels from a donor to the nearest acceptor, and (b) the concentration of electrons is much lower than that of positive ions at all times during the tunneling process. This paper presents theoretical studies for arbitrary relative concentrations of electrons and positive ions in the solid. Two new differential equations are derived which describe the loss of charge in the solid by tunneling, and they are solved analytically. The analytical solution compares well with the results of Monte Carlo simulations carried out in a random distribution of electrons and positive ions. Possible experimental implications of the model are discussed for tunneling phenomena in long term afterglow signals, and also for anomalous fading studies in feldspars and apatite samples.

  10. Multilinear Computing and Multilinear Algebraic Geometry

    DTIC Science & Technology

    2016-08-10

    Subject terms: tensors, multilinearity, algebraic geometry, numerical computations, computational tractability. Distribution A: approved for public release.

  11. Reduction and return of infectious trachoma in severely affected communities in Ethiopia.

    PubMed

    Lakew, Takele; House, Jenafir; Hong, Kevin C; Yi, Elizabeth; Alemayehu, Wondu; Melese, Muluken; Zhou, Zhaoxia; Ray, Kathryn; Chin, Stephanie; Romero, Emmanuel; Keenan, Jeremy; Whitcher, John P; Gaynor, Bruce D; Lietman, Thomas M

    2009-01-01

    Antibiotics are a major tool in the WHO's trachoma control program. Even a single mass distribution reduces the prevalence of the ocular chlamydia that causes trachoma. Unfortunately, infection returns after a single treatment, at least in severely affected areas. Here, we test whether additional scheduled treatments further reduce infection, and whether infection returns after distributions are discontinued. Sixteen communities in Ethiopia were randomly selected. Ocular chlamydial infection in 1- to 5-year-old children was monitored over four biannual azithromycin distributions and for 24 months after the last treatment. The average prevalence of infection in 1- to 5-year-old children was reduced from 63.5% pre-treatment to 11.5% six months after the first distribution (P<0.0001). It further decreased to 2.6% six months after the fourth and final treatment (P = 0.0004). In the next 18 months, infection returned to 25.2%, a significant increase from six months after the last treatment (P = 0.008), but still far lower than baseline (P<0.0001). Although the prevalence of infection in any particular village fluctuated, the mean prevalence of the 16 villages steadily decreased with each treatment and steadily returned after treatments were discontinued. In some of the most severely affected communities ever studied, we demonstrate that repeated mass oral azithromycin distributions progressively reduce ocular chlamydial infection in a community, as long as these distributions are given frequently enough and at a high enough coverage. However, infection returns into the communities after the last treatment. Sustainable changes or complete local elimination of infection will be necessary. ClinicalTrials.gov NCT00221364.

  12. The Statistical Fermi Paradox

    NASA Astrophysics Data System (ADS)

    Maccone, C.

    This paper provides the statistical generalization of the Fermi paradox. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable planets for man (1964). The statistical generalization of the original and by now too simplistic Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for "Statistical Equation for Habitables". The proof in this paper is based on the Central Limit Theorem (CLT) of Statistics, stating that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (both of which do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH, the (average) distance between any two nearby habitable planets in the Galaxy may be shown to be inversely proportional to the cubic root of NHab. This distance is denoted by the new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in 2008. 4. A practical example is then given of how the SEH works numerically. Each of the ten random variables is uniformly distributed around its own mean value as given by Dole (1964) and a standard deviation of 10% is assumed. The conclusion is that the average number of habitable planets in the Galaxy should be around 100 million ±200 million, and the average distance between any two nearby habitable planets should be about 88 light years ±40 light years. 5. The SEH results are matched against the results of the Statistical Drake Equation from reference 4. As expected, the number of currently communicating ET civilizations in the Galaxy turns out to be much smaller than the number of habitable planets (about 10,000 against 100 million, i.e. one ET civilization out of 10,000 habitable planets). The average distance between any two nearby habitable planets is much smaller than the average distance between any two neighbouring ET civilizations: 88 light years vs. 2000 light years, respectively. This means an ET average distance about 20 times larger than the average distance between any pair of adjacent habitable planets. 6. Finally, a statistical model of the Fermi Paradox is derived by applying the above results to the coral expansion model of Galactic colonization. The symbolic manipulator "Macsyma" is used to solve these difficult equations. A new random variable Tcol, representing the time needed to colonize a new planet, is introduced, which follows the lognormal distribution. Then the new quotient random variable Tcol/D is studied and its probability density function is derived by Macsyma. Finally, a linear transformation of random variables yields the overall time TGalaxy needed to colonize the whole Galaxy. We believe that our mathematical work in deriving this STATISTICAL Fermi Paradox is highly innovative and fruitful for the future.
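    The lognormal claim in point 1 is easy to check numerically: the log of a product of independent positive random variables is a sum of independent terms, which the CLT drives toward a Gaussian. A quick sketch with ten uniform factors (illustrative means, not Dole's values):

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_factors = 100_000, 10

# Ten independent positive factors, each uniform around an arbitrary mean:
factors = rng.uniform(0.5, 1.5, size=(n_trials, n_factors))
product = factors.prod(axis=1)

logs = np.log(product)
print(f"log-product mean {logs.mean():.3f}, std {logs.std():.3f}")
# Skewness of the logs should be near 0 if the product is close to lognormal:
skew = ((logs - logs.mean()) ** 3).mean() / logs.std() ** 3
print(f"skewness of log(product): {skew:.3f}")
```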

  13. Convex hulls of random walks in higher dimensions: A large-deviation study

    NASA Astrophysics Data System (ADS)

    Schawe, Hendrik; Hartmann, Alexander K.; Majumdar, Satya N.

    2017-12-01

    The distributions of the hypervolume V and surface ∂V of convex hulls of (multiple) random walks in higher dimensions are determined numerically, including probabilities far smaller than P = 10^-1000, to estimate large-deviation properties. For arbitrary dimensions and large walk lengths T, we suggest a scaling behavior of the distribution with the length of the walk T similar to the two-dimensional case, as well as a behavior of the distributions in the tails. We underpin both with numerical data in d = 3 and d = 4 dimensions. Further, we confirm the analytically known means of those distributions and calculate their variances for large T.
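    The typical-event part of such a study (though not the large-deviation tail, which requires Markov-chain importance sampling) can be reproduced directly with standard tools; a sketch for Gaussian-step walks in d = 3:

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)

def hull_volume(T, d=3):
    """Hypervolume of the convex hull of a T-step Gaussian random walk in d dimensions."""
    walk = np.cumsum(rng.normal(size=(T, d)), axis=0)
    return ConvexHull(walk).volume

volumes = [hull_volume(1000) for _ in range(200)]
print(np.mean(volumes), np.std(volumes))
```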

  14. A randomized, double-blind, placebo controlled, parallel group, efficacy study of alpha BRAIN® administered orally.

    PubMed

    Solomon, Todd M; Leech, Jarrett; deBros, Guy B; Murphy, Cynthia A; Budson, Andrew E; Vassey, Elizabeth A; Solomon, Paul R

    2016-03-01

    Alpha BRAIN® is a nootropic supplement that purports to enhance cognitive functioning in healthy adults. The goal of this study was to investigate the efficacy of this self-described cognitive-enhancing nootropic on cognitive functioning in a group of healthy adults by utilizing a randomized, double-blind, placebo-controlled design. A total of 63 treatment-naïve individuals between 18 and 35 years of age completed the randomized, double-blind, placebo-controlled trial. All participants completed a 2-week placebo run-in before receiving the active product, Alpha BRAIN® or new placebo, for 6 weeks. Participants undertook a battery of neuropsychological tests at randomization and at study completion. Primary outcome measures included a battery of neuropsychological tests and measures of sleep. Compared with placebo, the Alpha BRAIN® group significantly improved on tasks of delayed verbal recall and executive functioning. Results also indicated a significant time-by-group interaction in delayed verbal recall for the Alpha BRAIN® group. The use of Alpha BRAIN® for 6 weeks significantly improved recent verbal memory when compared with controls, in a group of healthy adults. While the outcome of the study is encouraging, this is the first randomized controlled trial of Alpha BRAIN®, and the results merit further study. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Determination of the influence of dispersion pattern of pesticide-resistant individuals on the reliability of resistance estimates using different sampling plans.

    PubMed

    Shah, R; Worner, S P; Chapman, R B

    2012-10-01

    Pesticide resistance monitoring includes resistance detection and subsequent documentation/measurement. Resistance detection would require at least one (≥1) resistant individual to be present in a sample to initiate management strategies. Resistance documentation, on the other hand, would attempt to estimate the entire population (≥90%) of resistant individuals. A computer simulation model was used to compare the efficiency of simple random and systematic sampling plans to detect resistant individuals and to document their frequencies when the resistant individuals were randomly or patchily distributed. A patchy dispersion pattern of resistant individuals influenced the sampling efficiency of systematic sampling plans, while the efficiency of random sampling was independent of such patchiness. When resistant individuals were randomly distributed, sample sizes required to detect at least one resistant individual (resistance detection) with a probability of 0.95 were 300 (1%) and 50 (10% and 20%); whereas, when resistant individuals were patchily distributed, using systematic sampling, sample sizes required for such detection were 6000 (1%), 600 (10%) and 300 (20%). Sample sizes of 900 and 400 would be required to detect ≥90% of resistant individuals (resistance documentation) with a probability of 0.95 when resistant individuals were randomly dispersed and present at a frequency of 10% and 20%, respectively; whereas, when resistant individuals were patchily distributed, using systematic sampling, a sample size of 3000 and 1500, respectively, was necessary. Small sample sizes either underestimated or overestimated the resistance frequency. A simple random sampling plan is, therefore, recommended for insecticide resistance detection and subsequent documentation.
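    For randomly dispersed resistant individuals, the detection sample size follows from the binomial formula: a sample of size n contains at least one resistant individual at frequency p with probability 1 - (1 - p)^n. A minimal sketch (this reproduces the ~300 figure quoted above for a 1% frequency; published sampling plans round such numbers, often conservatively):

```python
import math

def detection_sample_size(freq, prob=0.95):
    """Smallest n such that P(sample contains >= 1 resistant individual) >= prob,
    assuming simple random sampling: 1 - (1 - freq)**n >= prob."""
    return math.ceil(math.log(1.0 - prob) / math.log(1.0 - freq))

for p in (0.01, 0.10, 0.20):
    print(f"resistance frequency {p:.0%}: n = {detection_sample_size(p)}")
# frequency 1% gives n = 299, i.e. the ~300 quoted above for random dispersion
```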

  16. A random wave model for the Aharonov-Bohm effect

    NASA Astrophysics Data System (ADS)

    Houston, Alexander J. H.; Gradhand, Martin; Dennis, Mark R.

    2017-05-01

    We study an ensemble of random waves subject to the Aharonov-Bohm effect. The introduction of a point with a magnetic flux of arbitrary strength into a random wave ensemble gives a family of wavefunctions whose distribution of vortices (complex zeros) is responsible for the topological phase associated with the Aharonov-Bohm effect. Analytical expressions are found for the vortex number and topological charge densities as functions of distance from the flux point. Comparison is made with the distribution of vortices in the isotropic random wave model. The results indicate that as the flux approaches half-integer values, a vortex with the same sign as the fractional part of the flux is attracted to the flux point, merging with it in the limit of half-integer flux. We construct a statistical model of the neighbourhood of the flux point to study how this vortex-flux merger occurs in more detail. Other features of the Aharonov-Bohm vortex distribution are also explored.

  17. Analysis of the expected density of internal equilibria in random evolutionary multi-player multi-strategy games.

    PubMed

    Duong, Manh Hong; Han, The Anh

    2016-12-01

    In this paper, we study the distribution and behaviour of internal equilibria in a d-player n-strategy random evolutionary game where the game payoff matrix is generated from normal distributions. The study of this paper reveals and exploits interesting connections between evolutionary game theory and random polynomial theory. The main contributions of the paper are some qualitative and quantitative results on the expected density, [Formula: see text], and the expected number, E(n, d), of (stable) internal equilibria. Firstly, we show that in multi-player two-strategy games, they behave asymptotically as [Formula: see text] as d is sufficiently large. Secondly, we prove that they are monotone functions of d. We also make a conjecture for games with more than two strategies. Thirdly, we provide numerical simulations for our analytical results and to support the conjecture. As consequences of our analysis, some qualitative and quantitative results on the distribution of zeros of a random Bernstein polynomial are also obtained.

  18. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a famous quantum computation model. It is found that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems, with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further, we discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is eligible for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information.

  1. Random walks exhibiting anomalous diffusion: elephants, urns and the limits of normality

    NASA Astrophysics Data System (ADS)

    Kearney, Michael J.; Martin, Richard J.

    2018-01-01

    A random walk model is presented which exhibits a transition from standard to anomalous diffusion as a parameter is varied. The model is a variant of the elephant random walk and differs in its treatment of the initial state, which in the present work consists of a given number N of fixed steps. This also links the elephant random walk to other types of history-dependent random walk. As well as being amenable to direct analysis, the model is shown to be asymptotically equivalent to a non-linear urn process. This provides fresh insights into the limiting form of the distribution of the walker's position at large times. Although the distribution is intrinsically non-Gaussian in the anomalous diffusion regime, it gradually reverts to normal form when N is large, under quite general conditions.
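    A minimal simulation of the variant described here, under the illustrative assumptions that the N fixed initial steps are all +1 and that each subsequent step repeats a uniformly chosen earlier step with probability p (taking its opposite otherwise, as in the standard elephant random walk):

```python
import random

def elephant_walk(T, p=0.75, N=10, seed=0):
    """Elephant random walk with N fixed initial steps (all +1, an assumption)."""
    rng = random.Random(seed)
    steps = [1] * N                   # fixed initial history
    for _ in range(T - N):
        past = rng.choice(steps)      # recall a uniformly random earlier step
        steps.append(past if rng.random() < p else -past)
    return sum(steps)                 # walker's final position

# Superdiffusive regime for p > 3/4; sample a few endpoint positions:
print([elephant_walk(10_000, p=0.9, N=10, seed=s) for s in range(5)])
```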

  2. Analysis of Internet Suicide Pacts Reported by the Media in Mainland China.

    PubMed

    Jiang, Fang-Fan; Xu, Hui-Lan; Liao, Hui-Ying; Zhang, Ting

    2017-01-01

    In mainland China, frequent Internet suicide pacts in recent years have raised strong concerns across several social sectors, and the influence of social networks on suicide is constantly growing. The aim was to identify the epidemiological characteristics of media-reported Internet suicide pacts in mainland China. Our study comprised 62 Internet suicide pacts involving 159 victims in mainland China before June 1, 2015. Kendall's randomness test, a trend test, and a circular distribution test were applied to identify rising or concentrated trends in the time of occurrence of Internet suicide pacts. The overall male-to-female ratio was 2.3:1. Suicide victims were mainly people in their 20s to 30s (84.1%). In all, 87.1% of suicide victims completed suicide in sealed hotels or rental housing, and charcoal-burning suicide accounted for 80.6% of cases. People who complete suicide as part of an Internet suicide pact are more likely to be males, aged 20-30 years. Charcoal-burning suicide in sealed hotels or rental housing was the most common method of dying.

  3. Informatic and genomic analysis of melanocyte cDNA libraries as a resource for the study of melanocyte development and function.

    PubMed

    Baxter, Laura L; Hsu, Benjamin J; Umayam, Lowell; Wolfsberg, Tyra G; Larson, Denise M; Frith, Martin C; Kawai, Jun; Hayashizaki, Yoshihide; Carninci, Piero; Pavan, William J

    2007-06-01

    As part of the RIKEN mouse encyclopedia project, two cDNA libraries were prepared from melanocyte-derived cell lines, using techniques of full-length clone selection and subtraction/normalization to enrich for rare transcripts. End sequencing showed that these libraries display over 83% complete coding sequence at the 5' end and 96-97% complete coding sequence at the 3' end. Evaluation of the libraries, derived from B16F10Y tumor cells and melan-c cells, revealed that they contain clones for a majority of the genes previously demonstrated to function in melanocyte biology. Analysis of genomic locations for transcripts revealed that the distribution of melanocyte genes is non-random throughout the genome. Three genomic regions identified that showed significant clustering of melanocyte-expressed genes contain one or more genes previously shown to regulate melanocyte development or function. A catalog of genes expressed in these libraries is presented, providing a valuable resource of cDNA clones and sequence information that can be used for identification of new genes important for melanocyte development, function, and disease.

  4. French version of the Copenhagen neck functional disability scale.

    PubMed

    Forestier, Romain; Françon, Alain; Arroman, Frédérique Saint; Bertolino, Christiane

    2007-03-01

    We conducted a study to validate the French version of the Copenhagen Neck Functional Disability Scale (CNFDS). We used the CNFDS on data generated by a previous randomized controlled trial comparing pulsed electromagnetic field therapy (PEMFT), spa therapy, and standard therapy in patients with neck pain. Patients were recruited locally and examined by a physician who was unaware of the treatment group and independent from the trial. Treatment efficacy was evaluated based on a visual analog scale (VAS) for pain, the short-form-36 quality-of-life instrument (SF36), payments by public healthcare insurance, and overall assessments by the patients and physicians. Efficacy was evaluated at baseline, at treatment completion, and after 3 and 6 months. In addition, the patients completed the CNFDS at these time points. CNFDS scores were normally distributed. CNFDS scores and their variations correlated well with the other efficacy criteria. CNFDS scores were less sensitive to change than the VAS pain scores and more sensitive to change than the other efficacy criteria. The CNFDS holds promise as a tool for evaluating neck pain. Score reproducibility needs to be studied. The CNFDS can be added to the other instruments that have been translated in recent years to serve as tools for clinical research. However, the ease of completion of the CNFDS is consistent with use in clinical practice.

  5. Meta-analysis with missing study-level sample variance data.

    PubMed

    Chowdhry, Amit K; Dworkin, Robert H; McDermott, Michael P

    2016-07-30

    We consider a study-level meta-analysis with a normally distributed outcome variable and possibly unequal study-level variances, where the object of inference is the difference in means between a treatment and control group. A common complication in such an analysis is missing sample variances for some studies. A frequently used approach is to impute the weighted (by sample size) mean of the observed variances (mean imputation). Another approach is to include only those studies with variances reported (complete case analysis). Both mean imputation and complete case analysis are only valid under the missing-completely-at-random assumption, and even then the inverse variance weights produced are not necessarily optimal. We propose a multiple imputation method employing gamma meta-regression to impute the missing sample variances. Our method takes advantage of study-level covariates that may be used to provide information about the missing data. Through simulation studies, we show that multiple imputation, when the imputation model is correctly specified, is superior to competing methods in terms of confidence interval coverage probability and type I error probability when testing a specified group difference. Finally, we describe a similar approach to handling missing variances in cross-over studies. Copyright © 2016 John Wiley & Sons, Ltd.
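    For concreteness, the two simple strategies being criticized look like this; the data frame and column names are hypothetical, and the listed variance is treated as the variance of each study's effect estimate for brevity (the proposed gamma meta-regression imputation is beyond a short sketch):

```python
import numpy as np
import pandas as pd

# Hypothetical study-level data: effect = mean difference, n = sample size,
# variance = variance of the effect estimate (NaN = not reported).
df = pd.DataFrame({
    "effect":   [0.30, 0.10, 0.55, 0.20, 0.40],
    "n":        [40, 120, 60, 80, 100],
    "variance": [0.8, 0.5, np.nan, 0.6, np.nan],
})

def pooled_effect(d):
    w = 1.0 / d["variance"]                      # inverse-variance weights
    return float((w * d["effect"]).sum() / w.sum())

# Complete case analysis: drop studies with missing variances.
print("complete case:", pooled_effect(df.dropna(subset=["variance"])))

# Mean imputation: fill missing variances with the sample-size-weighted mean.
observed = df["variance"].notna()
vbar = np.average(df.loc[observed, "variance"], weights=df.loc[observed, "n"])
imputed = df.assign(variance=df["variance"].fillna(vbar))
print("mean imputation:", pooled_effect(imputed))
```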

  6. Impact and extinction signatures in complete Cretaceous-Tertiary (K-T) boundary sections

    NASA Technical Reports Server (NTRS)

    Smit, J.; Groot, H.; Dejonge, R.; Smit, P.

    1988-01-01

    The Zumaya, Caravaca and Agost sections in Spain, the El Kef section in Tunisia and the Negev (Nahal Avdat) sections in Israel are among the most continuous, expanded and complete K-T boundary sections. The distribution patterns of the planktic faunas were quantitatively analyzed in closely spaced samples across the K-T boundary in these sections, in conjunction with the geochemistry, stable isotopes, mineralogy and magnetostratigraphy. Three hundred foraminiferal specimens were randomly selected and determined. Reliable estimates for the foraminiferal productivity changes across the K-T boundary, and for the 1 to 2 Ma interval preceding it, were made from the numbers of individuals per gram of sediment corrected for the sedimentation rates (calculated from magnetic reversals and lithology). No gradual or stepwise extinction is seen below the K-T boundary, nor any productivity decrease. Stable isotope analyses show a warming just after deposition of the ejecta layer, not cooling as predicted by nuclear winter scenarios, although the duration of such cooling may be too short to be observed even in these complete sections. Low REE values and cpx spherules with quench textures identical to those in diagenetically altered spherules strongly indicate an oceanic site for (one of) the impactor(s).

  7. Exact results for the Floquet coin toss for driven integrable models

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Utso; Maity, Somnath; Banik, Uddipan; Dutta, Amit

    2018-05-01

    We study an integrable Hamiltonian reducible to free fermions, which is subjected to an imperfect periodic driving with the amplitude of driving (or kicking), randomly chosen from a binary distribution like a coin-toss problem. The randomness present in the driving protocol destabilizes the periodic steady state reached in the limit of perfectly periodic driving, leading to a monotonic rise of the stroboscopic residual energy with the number of periods (N ) for such Hamiltonians. We establish that a minimal deviation from the perfectly periodic driving in the present case using such protocols would always result in a bounded heating up of the system with N to an asymptotic finite value. Exploiting the completely uncorrelated nature of the randomness and the knowledge of the stroboscopic Floquet operator in the perfectly periodic situation, we provide an exact analytical formalism to derive the disorder averaged expectation value of the residual energy through a disorder operator. This formalism not only leads to an immense numerical simplification, but also enables us to derive an exact analytical form for the residual energy in the asymptotic limit which is universal, i.e., independent of the bias of coin-toss and the protocol chosen. Furthermore, this formalism clearly establishes the nature of the monotonic growth of the residual energy at intermediate N while clearly revealing the possible nonuniversal behavior of the same.

  8. Clustering, randomness and regularity in cloud fields. I - Theoretical considerations. II - Cumulus cloud fields

    NASA Technical Reports Server (NTRS)

    Weger, R. C.; Lee, J.; Zhu, Tianri; Welch, R. M.

    1992-01-01

    The ongoing controversy regarding regularity versus clustering in cloud fields is examined by means of analysis and simulation studies based upon nearest-neighbor cumulative distribution statistics. It is shown that the Poisson representation of random point processes is superior to pseudorandom-number-generated models, which bias the observed nearest-neighbor statistics towards regularity. The interpretation of these nearest-neighbor statistics is discussed for many cases of superpositions of clustering, randomness, and regularity. A detailed analysis is carried out of cumulus cloud field spatial distributions based upon Landsat, AVHRR, and Skylab data, showing that, when both large and small clouds are included in the cloud field distributions, the cloud field always has a strong clustering signal.
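    The Poisson benchmark used in such analyses is easy to reproduce: for a two-dimensional Poisson process of intensity λ, the nearest-neighbor distance CDF is G(r) = 1 - exp(-λπr²). A sketch comparing simulation against that curve (edge effects on the unit square are ignored):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
lam = 200                                   # intensity: expected points per unit area
n = rng.poisson(lam)                        # Poisson point count on the unit square
pts = rng.random((n, 2))

# Distance from each point to its nearest neighbor (k=2: the first hit is the point itself).
d, _ = cKDTree(pts).query(pts, k=2)
nn = np.sort(d[:, 1])

empirical = np.arange(1, n + 1) / n
theoretical = 1.0 - np.exp(-lam * np.pi * nn**2)
print("max |G_emp - G_theory| =", np.abs(empirical - theoretical).max())
```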

  9. Landscape-scale spatial abundance distributions discriminate core from random components of boreal lake bacterioplankton.

    PubMed

    Niño-García, Juan Pablo; Ruiz-González, Clara; Del Giorgio, Paul A

    2016-12-01

    Aquatic bacterial communities harbour thousands of coexisting taxa. To meet the challenge of discriminating between a 'core' and a sporadically occurring 'random' component of these communities, we explored the spatial abundance distribution of individual bacterioplankton taxa across 198 boreal lakes and their associated fluvial networks (188 rivers). We found that all taxa could be grouped into four distinct categories based on model statistical distributions (normal like, bimodal, logistic and lognormal). The distribution patterns across lakes and their associated river networks showed that lake communities are composed of a core of taxa whose distribution appears to be linked to in-lake environmental sorting (normal-like and bimodal categories), and a large fraction of mostly rare bacteria (94% of all taxa) whose presence appears to be largely random and linked to downstream transport in aquatic networks (logistic and lognormal categories). These rare taxa are thus likely to reflect species sorting at upstream locations, providing a perspective of the conditions prevailing in entire aquatic networks rather than only in lakes. © 2016 John Wiley & Sons Ltd/CNRS.

  10. Underestimating extreme events in power-law behavior due to machine-dependent cutoffs

    NASA Astrophysics Data System (ADS)

    Radicchi, Filippo

    2014-11-01

    Power-law distributions are typical macroscopic features occurring in almost all complex systems observable in nature. As a result, researchers in quantitative analyses must often generate random synthetic variates obeying power-law distributions. The task is usually performed through standard methods that map uniform random variates into the desired probability space. Whereas all these algorithms are theoretically solid, in this paper we show that they are subject to severe machine-dependent limitations. As a result, two dramatic consequences arise: (i) the sampling in the tail of the distribution is not random but deterministic; (ii) the moments of the sample distribution, which are theoretically expected to diverge as functions of the sample sizes, converge instead to finite values. We provide quantitative indications for the range of distribution parameters that can be safely handled by standard libraries used in computational analyses. Whereas our findings indicate possible reinterpretations of numerical results obtained through flawed sampling methodologies, they also pave the way for the search for a concrete solution to this central issue shared by all quantitative sciences dealing with complexity.
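    The standard mapping in question is inverse-CDF sampling, x = x_min (1 - u)^(-1/(α-1)) for uniform u in [0, 1). Because a 53-bit uniform cannot make 1 - u smaller than 2^-53, the largest obtainable variate is capped at a machine-dependent value; a minimal demonstration (α and x_min are illustrative):

```python
import numpy as np

def power_law_sample(size, alpha=2.5, xmin=1.0, seed=0):
    """Inverse-CDF sampling for p(x) ~ x**(-alpha), x >= xmin."""
    u = np.random.default_rng(seed).random(size)
    return xmin * (1.0 - u) ** (-1.0 / (alpha - 1.0))

# Smallest representable 1 - u from a 53-bit uniform is 2**-53, which caps the samples:
hard_cap = 1.0 * (2.0 ** -53) ** (-1.0 / (2.5 - 1.0))
print(f"machine-imposed cutoff: {hard_cap:.3e}")          # ~4.4e10 for alpha = 2.5
print(f"largest of 10^7 samples: {power_law_sample(10_000_000).max():.3e}")
```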

  11. A randomized controlled trial comparing EMDR and CBT for obsessive-compulsive disorder.

    PubMed

    Marsden, Zoe; Lovell, Karina; Blore, David; Ali, Shehzad; Delgadillo, Jaime

    2018-01-01

    This study aimed to evaluate eye movement desensitization and reprocessing (EMDR) as a treatment for obsessive-compulsive disorder (OCD), by comparison to cognitive behavioural therapy (CBT) based on exposure and response prevention. This was a pragmatic, feasibility randomized controlled trial in which 55 participants with OCD were randomized to EMDR (n = 29) or CBT (n = 26). The Yale-Brown obsessive-compulsive scale was completed at baseline, after treatment and at 6 months follow-up. Treatment completion and response rates were compared using chi-square tests. Effect size was examined using Cohen's d and multilevel modelling. Overall, 61.8% completed treatment and 30.2% attained reliable and clinically significant improvement in OCD symptoms, with no significant differences between groups (p > .05). There were no significant differences between groups in Yale-Brown obsessive-compulsive scale severity post-treatment (d = -0.24, p = .38) or at 6 months follow-up (d = -0.03, p = .90). EMDR and CBT had comparable completion rates and clinical outcomes. Copyright © 2017 John Wiley & Sons, Ltd.

  12. Improving Project Management with Simulation and Completion Distribution Functions

    NASA Technical Reports Server (NTRS)

    Cates, Grant R.

    2004-01-01

    Despite the critical importance of project completion timeliness, management practices in place today remain inadequate for addressing the persistent problem of project completion tardiness. A major culprit in late projects is uncertainty, which most, if not all, projects are inherently subject to. This uncertainty resides in the estimates for activity durations, the occurrence of unplanned and unforeseen events, and the availability of critical resources. In response to this problem, this research developed a comprehensive simulation based methodology for conducting quantitative project completion time risk analysis. It is called the Project Assessment by Simulation Technique (PAST). This new tool enables project stakeholders to visualize uncertainty or risk, i.e. the likelihood of their project completing late and the magnitude of the lateness, by providing them with a completion time distribution function of their projects. Discrete event simulation is used within PAST to determine the completion distribution function for the project of interest. The simulation is populated with both deterministic and stochastic elements. The deterministic inputs include planned project activities, precedence requirements, and resource requirements. The stochastic inputs include activity duration growth distributions, probabilities for events that can impact the project, and other dynamic constraints that may be placed upon project activities and milestones. These stochastic inputs are based upon past data from similar projects. The time for an entity to complete the simulation network, subject to both the deterministic and stochastic factors, represents the time to complete the project. Repeating the simulation hundreds or thousands of times allows one to create the project completion distribution function. The Project Assessment by Simulation Technique was demonstrated to be effective for the on-going NASA project to assemble the International Space Station. Approximately $500 million per month is being spent on this project, which is scheduled to complete by 2010. NASA project stakeholders participated in determining and managing completion distribution functions produced from PAST. The first result was that project stakeholders improved project completion risk awareness. Secondly, using PAST, mitigation options were analyzed to improve project completion performance and reduce total project cost.
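    Not the PAST tool itself, but its core mechanism, Monte Carlo over a precedence network with stochastic activity durations accumulated into a completion-time distribution, fits in a few lines. The four-activity network and triangular duration distributions below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical activity network: name -> (predecessors, (min, mode, max) duration in days).
# Dict insertion order is assumed to be a valid topological order.
ACTIVITIES = {
    "design": ([],                (20, 30, 55)),
    "build":  (["design"],        (40, 60, 120)),
    "test":   (["design"],        (15, 20, 45)),
    "launch": (["build", "test"], (5, 7, 14)),
}

def simulate_completion(n_runs=20_000):
    finishes = np.zeros(n_runs)
    for i in range(n_runs):
        done = {}
        for name, (preds, (lo, mode, hi)) in ACTIVITIES.items():
            start = max((done[p] for p in preds), default=0.0)
            done[name] = start + rng.triangular(lo, mode, hi)
        finishes[i] = max(done.values())
    return finishes

completion = simulate_completion()
print("P(project done within 150 days) =", (completion <= 150).mean())
print("90th-percentile completion time =", np.quantile(completion, 0.9))
```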

  13. The living Drake equation of the Tau Zero Foundation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2011-03-01

    The living Drake equation is our statistical generalization of the Drake equation such that it can take into account any number of factors. This new result opens up the possibility to enrich the equation by inserting more new factors as long as the scientific learning increases. The adjective "Living" refers just to this continuous enrichment of the Drake equation and is the goal of a new research project that the Tau Zero Foundation has entrusted to this author as the discoverer of the statistical Drake equation described hereafter. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be arbitrarily distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov form of the CLT, or the Lindeberg form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the lognormal distribution. Then, the mean value, standard deviation, mode, median and all the moments of this lognormal N can be derived from the means and standard deviations of the seven input random variables. In fact, the seven factors in the ordinary Drake equation now become seven independent positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) distance between any two neighbouring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, this distance now becomes a new random variable. We derive the relevant probability density function, apparently previously unknown (dubbed "Maccone distribution" by Paul Davies). Data Enrichment Principle. It should be noticed that any positive number of random variables in the statistical Drake equation is compatible with the CLT. So, our generalization allows for many more factors to be added in the future as long as more refined scientific knowledge about each factor will be known to the scientists. This capability to make room for more future factors in the statistical Drake equation we call the "Data Enrichment Principle", and regard as the key to more profound, future results in Astrobiology and SETI.

  14. Randomized Trial of Complete Versus Lesion-Only Revascularization in Patients Undergoing Primary Percutaneous Coronary Intervention for STEMI and Multivessel Disease

    PubMed Central

    Gershlick, Anthony H.; Khan, Jamal Nasir; Kelly, Damian J.; Greenwood, John P.; Sasikaran, Thiagarajah; Curzen, Nick; Blackman, Daniel J.; Dalby, Miles; Fairbrother, Kathryn L.; Banya, Winston; Wang, Duolao; Flather, Marcus; Hetherington, Simon L.; Kelion, Andrew D.; Talwar, Suneel; Gunning, Mark; Hall, Roger; Swanton, Howard; McCann, Gerry P.

    2015-01-01

    Background The optimal management of patients found to have multivessel disease while undergoing primary percutaneous coronary intervention (P-PCI) for ST-segment elevation myocardial infarction is uncertain. Objectives CvLPRIT (Complete versus Lesion-only Primary PCI trial) is a U.K. open-label randomized study comparing complete revascularization at index admission with treatment of the infarct-related artery (IRA) only. Methods After they provided verbal assent and underwent coronary angiography, 296 patients in 7 U.K. centers were randomized through an interactive voice-response program to either in-hospital complete revascularization (n = 150) or IRA-only revascularization (n = 146). Complete revascularization was performed either at the time of P-PCI or before hospital discharge. Randomization was stratified by infarct location (anterior/nonanterior) and symptom onset (≤3 h or >3 h). The primary endpoint was a composite of all-cause death, recurrent myocardial infarction (MI), heart failure, and ischemia-driven revascularization within 12 months. Results Patient groups were well matched for baseline clinical characteristics. The primary endpoint occurred in 10.0% of the complete revascularization group versus 21.2% in the IRA-only revascularization group (hazard ratio: 0.45; 95% confidence interval: 0.24 to 0.84; p = 0.009). A trend toward benefit was seen early after complete revascularization (p = 0.055 at 30 days). Although there was no significant reduction in death or MI, a nonsignificant reduction in all primary endpoint components was seen. There was no reduction in ischemic burden on myocardial perfusion scintigraphy or in the safety endpoints of major bleeding, contrast-induced nephropathy, or stroke between the groups. Conclusions In patients presenting for P-PCI with multivessel disease, index admission complete revascularization significantly lowered the rate of the composite primary endpoint at 12 months compared with treating only the IRA. In such patients, inpatient total revascularization may be considered, but larger clinical trials are required to confirm this result and specifically address whether this strategy is associated with improved survival. (Complete Versus Lesion-only Primary PCI Pilot Study [CvLPRIT]; ISRCTN70913605) PMID:25766941

  15. Light scattering from laser induced pit ensembles on high power laser optics

    DOE PAGES

    Feigenbaum, Eyal; Elhadj, Selim; Matthews, Manyalibo J.

    2015-01-01

    Far-field light scattering characteristics from randomly arranged shallow Gaussian-like shaped laser-induced pits, found on optics exposed to high energy laser pulses, are studied. Closed-form expressions for the far-field intensity distribution and scattered power are derived for individual pits and validated using numerical calculations of both Fourier optics and FDTD solutions to Maxwell's equations. It is found that the scattered power is proportional to the square of the pit width and approximately also to the square of the pit depth, with the proportionality factor scaling with pit depth. As a result, the power scattered from shallow pitted optics is expected to be substantially lower than assuming complete scattering from the total visible footprint of the pits.

  16. Development of land based radar polarimeter processor system

    NASA Technical Reports Server (NTRS)

    Kronke, C. W.; Blanchard, A. J.

    1983-01-01

    The processing subsystem of a land based radar polarimeter was designed and constructed. This subsystem is labeled the remote data acquisition and distribution system (RDADS). The radar polarimeter, an experimental remote sensor, incorporates the RDADS to control all operations of the sensor. The RDADS uses industry-standard components including an 8-bit microprocessor based single board computer, analog input/output boards, a dynamic random access memory board, and power supplies. A high-speed digital electronics board was specially designed and constructed to control range-gating for the radar. A complete system of software programs was developed to operate the RDADS. The software uses a powerful real-time, multi-tasking executive package as an operating system. The hardware and software used in the RDADS are detailed. Future system improvements are recommended.

  17. Opinion dynamics in a group-based society

    NASA Astrophysics Data System (ADS)

    Gargiulo, F.; Huet, S.

    2010-09-01

    Many models have been proposed to analyze the evolution of opinion structure due to the interaction of individuals in their social environment. Such models analyze the spreading of ideas both in fully mixed populations and on social networks, where each person has a finite set of interlocutors. In this paper we analyze the reciprocal feedback between the opinions of individuals and the structure of their interpersonal relationships at the level of community structures. For this purpose we define a group-based random network and study how this structure co-evolves with opinion dynamics processes. We observe that the adaptive network structure affects the opinion dynamics process, aiding consensus formation. The results also show interesting behaviors with regard to the size distribution of the groups and their correlation with opinion structure.

  18. The Neurona at Home project: Simulating a large-scale cellular automata brain in a distributed computing environment

    NASA Astrophysics Data System (ADS)

    Acedo, L.; Villanueva-Oller, J.; Moraño, J. A.; Villanueva, R.-J.

    2013-01-01

    The Berkeley Open Infrastructure for Network Computing (BOINC) has become the standard open source solution for grid computing on the Internet. Volunteers use their computers to complete a small part of the task assigned by a dedicated server. We have developed a BOINC project called Neurona@Home whose objective is to simulate a cellular automata random network with at least one million neurons. We consider a cellular automata version of the integrate-and-fire model in which excitatory and inhibitory nodes can activate or deactivate neighbor nodes according to a set of probabilistic rules. Our aim is to determine the phase diagram and behaviour of the model and to compare it with the electroencephalographic signals measured in real brains.
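
    A minimal sketch of the kind of probabilistic cellular automaton described, at a much smaller scale; the network size, excitatory fraction and firing rule below are made up for illustration, since the project's actual rules and parameters are not given in the abstract:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, k = 1000, 8                  # neurons and random neighbours per neuron
    p_exc = 0.8                     # assumed fraction of excitatory nodes

    neighbours = rng.integers(0, n, size=(n, k))    # random directed network
    sign = np.where(rng.random(n) < p_exc, 1, -1)   # excitatory / inhibitory
    state = (rng.random(n) < 0.05).astype(int)      # sparse initial activity

    for t in range(100):
        # Input drive: signed activity summed over each node's neighbours.
        drive = (state[neighbours] * sign[neighbours]).sum(axis=1)
        # Probabilistic activation rule (placeholder gain of 0.1).
        p_fire = np.clip(0.1 * drive, 0.0, 1.0)
        state = (rng.random(n) < p_fire).astype(int)
        if t % 20 == 0:
            print(t, state.mean())   # mean network activity over time
    ```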

  19. Effects of language of assessment on the measurement of acculturation: measurement equivalence and cultural frame switching.

    PubMed

    Schwartz, Seth J; Benet-Martínez, Verónica; Knight, George P; Unger, Jennifer B; Zamboanga, Byron L; Des Rosiers, Sabrina E; Stephens, Dionne P; Huang, Shi; Szapocznik, José

    2014-03-01

    The present study used a randomized design, with fully bilingual Hispanic participants from the Miami area, to investigate 2 sets of research questions. First, we sought to ascertain the extent to which measures of acculturation (Hispanic and U.S. practices, values, and identifications) satisfied criteria for linguistic measurement equivalence. Second, we sought to examine whether cultural frame switching would emerge--that is, whether latent acculturation mean scores for U.S. acculturation would be higher among participants randomized to complete measures in English and whether latent acculturation mean scores for Hispanic acculturation would be higher among participants randomized to complete measures in Spanish. A sample of 722 Hispanic students from a Hispanic-serving university participated in the study. Participants were first asked to complete translation tasks to verify that they were fully bilingual. Based on ratings from 2 independent coders, 574 participants (79.5% of the sample) qualified as fully bilingual and were randomized to complete the acculturation measures in either English or Spanish. Theoretically relevant criterion measures--self-esteem, depressive symptoms, and personal identity--were also administered in the randomized language. Measurement equivalence analyses indicated that all of the acculturation measures--Hispanic and U.S. practices, values, and identifications--met criteria for configural, weak/metric, strong/scalar, and convergent validity equivalence. These findings indicate that data generated using acculturation measures can, at least under some conditions, be combined or compared across languages of administration. Few latent mean differences emerged. These results are discussed in terms of the measurement of acculturation in linguistically diverse populations. © 2014 APA

  20. Effects of Language of Assessment on the Measurement of Acculturation: Measurement Equivalence and Cultural Frame Switching

    PubMed Central

    Schwartz, Seth J.; Benet-Martínez, Verónica; Knight, George P.; Unger, Jennifer B.; Zamboanga, Byron L.; Des Rosiers, Sabrina E.; Stephens, Dionne; Huang, Shi; Szapocznik, José

    2014-01-01

    The present study used a randomized design, with fully bilingual Hispanic participants from the Miami area, to investigate two sets of research questions. First, we sought to ascertain the extent to which measures of acculturation (heritage and U.S. practices, values, and identifications) satisfied criteria for linguistic measurement equivalence. Second, we sought to examine whether cultural frame switching would emerge – that is, whether latent acculturation mean scores for U.S. acculturation would be higher among participants randomized to complete measures in English, and whether latent acculturation mean scores for Hispanic acculturation would be higher among participants randomized to complete measures in Spanish. A sample of 722 Hispanic students from a Hispanic-serving university participated in the study. Participants were first asked to complete translation tasks to verify that they were fully bilingual. Based on ratings from two independent coders, 574 participants (79.5% of the sample) qualified as fully bilingual and were randomized to complete the acculturation measures in either English or Spanish. Theoretically relevant criterion measures – self-esteem, depressive symptoms, and personal identity – were also administered in the randomized language. Measurement equivalence analyses indicated that all of the acculturation measures – Hispanic and U.S. practices, values, and identifications – met criteria for configural, weak/metric, strong/scalar, and convergent validity equivalence. These findings indicate that data generated using acculturation measures can, at least under some conditions, be combined or compared across languages of administration. Few latent mean differences emerged. These results are discussed in terms of the measurement of acculturation in linguistically diverse populations. PMID:24188146

  1. The influence of the directional energy distribution on the nonlinear dispersion relation in a random gravity wave field

    NASA Technical Reports Server (NTRS)

    Huang, N. E.; Tung, C.-C.

    1977-01-01

    The influence of the directional distribution of wave energy on the dispersion relation is calculated numerically using various directional wave spectrum models. The results indicate that the dispersion relation varies both as a function of the directional energy distribution and the direction of propagation of the wave component under consideration. Furthermore, both the mean deviation and the random scatter from the linear approximation increase as the energy spreading decreases. Limited observational data are compared with the theoretical results. The agreement is favorable.

  2. Use of 3D printed models in medical education: A randomized control trial comparing 3D prints versus cadaveric materials for learning external cardiac anatomy.

    PubMed

    Lim, Kah Heng Alexander; Loo, Zhou Yaw; Goldie, Stephen J; Adams, Justin W; McMenamin, Paul G

    2016-05-06

    Three-dimensional (3D) printing is an emerging technology capable of readily producing accurate anatomical models; however, evidence for the use of 3D prints in medical education remains limited. A study was performed to assess their effectiveness against cadaveric materials for learning external cardiac anatomy. A double-blind randomized controlled trial was undertaken on undergraduate medical students without prior formal cardiac anatomy teaching. Following a pre-test examining baseline external cardiac anatomy knowledge, participants were randomly assigned to three groups who underwent self-directed learning sessions using either cadaveric materials, 3D prints, or a combination of cadaveric materials/3D prints (combined materials). Participants were then subjected to a post-test written by a third party. Fifty-two participants completed the trial; 18 using cadaveric materials, 16 using 3D models, and 18 using combined materials. Age and time since completion of high school were equally distributed between groups. Pre-test scores were not significantly different (P = 0.231); however, post-test scores were significantly higher for the 3D prints group compared to the cadaveric materials or combined materials groups (mean of 60.83% vs. 44.81% and 44.62%, P = 0.010, adjusted P = 0.012). A significant improvement in test scores was detected for the 3D prints group (P = 0.003) but not for the other two groups. The findings of this pilot study suggest that use of 3D prints does not disadvantage students relative to cadaveric materials; maximally, the results suggest that 3D prints may confer certain benefits to anatomy learning, and support their use and ongoing evaluation as supplements to cadaver-based curriculums. Anat Sci Educ 9: 213-221. © 2015 American Association of Anatomists.

  3. Financial Incentives to Increase Advance Care Planning Among Medicaid Beneficiaries: Lessons Learned From Two Pragmatic Randomized Trials.

    PubMed

    Barnato, Amber E; Moore, Robert; Moore, Charity G; Kohatsu, Neal D; Sudore, Rebecca L

    2017-07-01

    Medicaid populations have low rates of advance care planning (ACP). Potential policy interventions include financial incentives. To test the effectiveness of patient plus provider financial incentive compared with provider financial incentive alone for increasing ACP discussions among Medicaid patients. Between April 2014 and July 2015, we conducted two sequential assessor-blinded pragmatic randomized trials in a health plan that pays primary care providers (PCPs) $100 to discuss ACP: 1) a parallel cluster trial (provider-delivered patient incentive) and 2) an individual-level trial (mail-delivered patient incentive). Control and intervention arms included encouragement to complete ACP, instructions for using an online ACP tool, and (in the intervention arm) $50 for completing the online ACP tool and a small probability of $1000 (i.e., lottery) for discussing ACP with their PCP. The primary outcome was provider-reported ACP discussion within three months. In the provider-delivered patient incentive study, 38 PCPs were randomized to the intervention (n = 18) or control (n = 20) and given 10 patient packets each to distribute. Using an intention-to-treat analysis, there were 27 of 180 ACP discussions (15%) in the intervention group and 5 of 200 (2.5%) in the control group (P = .0391). In the mail-delivered patient incentive study, there were 5 of 187 ACP discussions (2.7%) in the intervention group and 5 of 189 (2.6%) in the control group (P = .99). ACP rates were low despite an existing provider financial incentive. Adding a provider-delivered patient financial incentive, but not a mail-delivered patient incentive, modestly increased ACP discussions. PCP encouragement combined with a patient incentive may be more powerful than either encouragement or incentive alone. Copyright © 2017 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.

  4. A multi-center randomized controlled trial to compare a self-ligating bracket with a conventional bracket in a UK population: Part 1: Treatment efficiency.

    PubMed

    O'Dywer, Lian; Littlewood, Simon J; Rahman, Shahla; Spencer, R James; Barber, Sophy K; Russell, Joanne S

    2016-01-01

    To use a two-arm parallel trial to compare treatment efficiency between a self-ligating and a conventional preadjusted edgewise appliance system. A prospective multi-center randomized controlled clinical trial was conducted in three hospital orthodontic departments. Subjects were randomly allocated to receive treatment with either a self-ligating (3M SmartClip) or conventional (3M Victory) preadjusted edgewise appliance bracket system using a computer-generated random sequence concealed in opaque envelopes, with stratification for operator and center. Two operators followed a standardized protocol regarding bracket bonding procedure and archwire sequence. Efficiency of each ligation system was assessed by comparing the duration of treatment (months), total number of appointments (scheduled and emergency visits), and number of bracket bond failures. One hundred thirty-eight subjects (mean age 14 years 11 months) were enrolled in the study, of which 135 subjects (97.8%) completed treatment. The mean treatment time and number of visits were 25.12 months and 19.97 visits in the SmartClip group and 25.80 months and 20.37 visits in the Victory group. The overall bond failure rate was 6.6% for the SmartClip and 7.2% for Victory, with a similar debond distribution between the two appliances. No significant differences were found between the bracket systems in any of the outcome measures. No serious harm was observed from either bracket system. There was no clinically significant difference in treatment efficiency between treatment with a self-ligating bracket system and a conventional ligation system.

  5. Random-growth urban model with geographical fitness

    NASA Astrophysics Data System (ADS)

    Kii, Masanobu; Akimoto, Keigo; Doi, Kenji

    2012-12-01

    This paper formulates a random-growth urban model with a notion of geographical fitness. Using techniques of complex-network theory, we study our system as a type of preferential-attachment model with fitness, and we analyze its macro behavior to clarify the properties of the city-size distributions it predicts. First, restricting the geographical fitness to take positive values and using a continuum approach, we show that the city-size distributions predicted by our model asymptotically approach Pareto distributions with coefficients greater than unity. Then, allowing the geographical fitness to take negative values, we perform local coefficient analysis to show that the predicted city-size distributions can deviate from Pareto distributions, as is often observed in actual city-size distributions. As a result, the model we propose can generate a generic class of city-size distributions, including but not limited to Pareto distributions. For applications to city-population projections, our simple model requires randomness only when new cities are created, not during their subsequent growth. This property leads to smooth trajectories of city population growth, in contrast to other models using Gibrat’s law. In addition, a discrete form of our dynamical equations can be used to estimate past city populations based on present-day data; this fact allows quantitative assessment of the performance of our model. Further study is needed to determine appropriate formulas for the geographical fitness.
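
    For intuition, here is a stochastic toy version of fitness-weighted preferential attachment with positive geographical fitness; the creation probability and uniform fitness law are illustrative assumptions, and note that the paper's continuum formulation makes growth after creation deterministic, whereas this sketch keeps it random:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    p_new, steps = 0.02, 20_000   # assumed city-creation probability, horizon

    sizes = np.array([1.0])
    fitness = np.array([rng.random()])   # positive geographical fitness
    for _ in range(steps):
        if rng.random() < p_new:
            # A new city is founded with a random fitness.
            sizes = np.append(sizes, 1.0)
            fitness = np.append(fitness, rng.random())
        else:
            # Fitness-weighted preferential attachment of one new resident.
            w = sizes * fitness
            sizes[rng.choice(sizes.size, p=w / w.sum())] += 1.0

    ranked = np.sort(sizes)[::-1]
    print(ranked[:10])   # a heavy, Pareto-like tail is expected
    ```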

  6. Cancer treatment adherence among low-income women with breast or gynecologic cancer: a randomized controlled trial of patient navigation.

    PubMed

    Ell, Kathleen; Vourlekis, Betsy; Xie, Bin; Nedjat-Haiem, Frances R; Lee, Pey-Jiuan; Muderspach, Laila; Russell, Christy; Palinkas, Lawrence A

    2009-10-01

    The authors implemented a controlled, randomized trial that compared 2 interventions: the provision of written resource navigation information (enhanced usual care [EUC]) versus written information plus patient navigation (TPN) aimed at improving adjuvant treatment adherence and follow-up among 487 low-income, predominantly Hispanic women with breast cancer or gynecologic cancer. Women were randomized to receive either TPN or EUC; and chemotherapy, radiation therapy, hormone therapy, and follow-up were assessed over 12 months. Patients with breast cancer were analyzed separately from patients with gynecologic cancer. Overall adherence rates ranged from 87% to 94%, and there were no significant differences between the TPN group and the EUC group. Among women with breast cancer, 90% of the EUC group and 88% of the TPN group completed chemotherapy (14% of the EUC group and 26% of the TPN group delayed the completion of chemotherapy), 2% of the EUC group and 4% of the TPN group failed to complete chemotherapy, and 8% of the EUC group and 7% of the TPN group refused chemotherapy. Radiation treatment adherence was similar between the groups: 90% of patients completed radiation (40% of the EUC group and 42% of the TPN group delayed the completion of radiation); in both groups, 2% failed to complete radiation, and 8% refused radiation. Among gynecologic patients, 87% of the EUC group and 94% of the TPN group completed chemotherapy (41% of the EUC group and 31% of the TPN group completed it with delays), 7% of the EUC group and 6% of the TPN group failed to complete chemotherapy, 6% of the EUC group refused chemotherapy, 87% of the EUC group and 84% of the TPN group completed radiation (51% of the EUC group and 42% of the TPN group with delays), 5% of the EUC group and 8% of the TPN group failed to complete radiation, and 8% of the EUC group and 5% of the TPN group refused radiation. Treatment adherence across randomized groups was notably higher than reported in previous studies, suggesting that active telephone patient navigation or written resource informational materials may facilitate adherence among low-income, predominantly Hispanic women. Adherence may also have been facilitated by federal-state breast and cervical cancer treatment funding. © 2009 American Cancer Society.

  7. Random elements on lattices: Review and statistical applications

    NASA Astrophysics Data System (ADS)

    Potocký, Rastislav; Villarroel, Claudia Navarro; Sepúlveda, Maritza; Luna, Guillermo; Stehlík, Milan

    2017-07-01

    We discuss important contributions to random elements on lattices, covering both algebraic and probabilistic properties. Several applications and concepts are discussed, e.g. positive dependence, random walks and distributions on lattices, super-lattices, and learning. An application to Chilean ecology is given.

  8. On the extreme value statistics of normal random matrices and 2D Coulomb gases: Universality and finite N corrections

    NASA Astrophysics Data System (ADS)

    Ebrahimi, R.; Zohren, S.

    2018-03-01

    In this paper we extend the orthogonal polynomials approach for extreme value calculations of Hermitian random matrices, developed by Nadal and Majumdar (J. Stat. Mech. P04001, arXiv:1102.0738), to normal random matrices and 2D Coulomb gases in general. Firstly, we show that this approach provides an alternative derivation of results in the literature. More precisely, we show convergence of the rescaled eigenvalue with largest modulus of a normal Gaussian ensemble to a Gumbel distribution, as well as universality for an arbitrary radially symmetric potential. Secondly, it is shown that this approach can be generalised to obtain convergence of the eigenvalue with smallest modulus and its universality for ring distributions. Most interestingly, the techniques presented here are used to compute all slowly varying finite N corrections of the above distributions, which are important for practical applications, given the slow convergence. Another interesting aspect of this work is that we can use standard techniques from Hermitian random matrices to obtain the extreme value statistics of non-Hermitian random matrices, resembling the large N expansion used in the context of the double scaling limit of Hermitian matrix models in string theory.
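
    A hedged numerical sketch of the quantity studied: sampling the largest eigenvalue modulus of complex Ginibre-type normal random matrices, scaled so the spectral bulk fills the unit disk. The matrix size and trial count are arbitrary, and the paper's exact Gumbel rescaling constants are not reproduced here:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N, trials = 100, 500
    max_mod = np.empty(trials)

    for t in range(trials):
        # Complex Ginibre matrix, normalized so eigenvalues fill the unit disk.
        G = (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N)))
        G /= np.sqrt(2 * N)
        max_mod[t] = np.abs(np.linalg.eigvals(G)).max()

    # The bulk edge sits at radius 1; the largest modulus fluctuates just
    # above it and, suitably rescaled, is predicted to be Gumbel-distributed.
    print(max_mod.mean(), max_mod.std())
    ```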

  9. The model of drugs distribution dynamics in biological tissue

    NASA Astrophysics Data System (ADS)

    Ginevskij, D. A.; Izhevskij, P. V.; Sheino, I. N.

    2017-09-01

    The dose distribution in Neutron Capture Therapy follows the distribution of 10B in the tissue. Modern pharmacokinetic models describe the processes occurring in conditioned "chambers" (blood-organ-tumor) but fail to describe the spatial distribution of the drug in the tumor and in normal tissue. A mathematical model of the spatial distribution dynamics of drugs in tissue, depending on the concentration of the drug in the blood, was developed. The modeling method represents the biological structure as a randomly inhomogeneous medium in which the 10B distribution takes place. The parameters of the model that cannot be determined rigorously in experiment are treated as quantities governed by the laws of independent random processes. Estimates of the 10B distribution in the tumor and healthy tissue, inside and outside the cells, are obtained.

  10. Linear velocity fields in non-Gaussian models for large-scale structure

    NASA Technical Reports Server (NTRS)

    Scherrer, Robert J.

    1992-01-01

    Linear velocity fields are examined in two types of physically motivated non-Gaussian models for large-scale structure: seed models, in which the density field is a convolution of a density profile with a distribution of points, and local non-Gaussian fields, derived from a local nonlinear transformation on a Gaussian field. The distribution of a single component of the velocity is derived for seed models with randomly distributed seeds, and these results are applied to the seeded hot dark matter model and the global texture model with cold dark matter. An expression for the distribution of a single component of the velocity in arbitrary local non-Gaussian models is given, and these results are applied to such fields with chi-squared and lognormal distributions. It is shown that all seed models with randomly distributed seeds and all local non-Gaussian models have single-component velocity distributions with positive kurtosis.

  11. Satisfaction in complete denture wearers with and without adhesives: A randomized, crossover, double-blind clinical trial

    PubMed Central

    Torres-Sánchez, Carlos; Montoya-Salazar, Vanessa; Gutierrez-Pérez, Jose-Luis; Jimenez-Castellanos, Emilio

    2018-01-01

    Background The purpose of this study was to compare patient satisfaction regarding retention, stability and accumulation of particles, using a randomized, double-blind, crossover method, in wearers of complete dentures with and without adhesive. Material and Methods Seventeen edentulous individuals were randomized and received new upper and lower complete dentures. After a period of adaptation, they participated in masticatory tests and clinical revisions, after using the prostheses with and without two denture adhesives: adhesive A (Fittydent, Fittydent International GmbH) and adhesive B (Corega, GlaxoSmithKline) at 0, 7 and 14 days. Satisfaction was measured immediately after each test through a survey using a VAS scale (0-10), and data were analyzed with McNemar's test with Bonferroni correction. Results The results showed significant differences (p<.01) between the study groups with adhesives A and B and the group without adhesive, but no significant differences were found between the two adhesives for any of the variables studied. Conclusions Complete denture adhesives significantly improved patient satisfaction because better retention, stability and less accumulation of particles of the food substitute between the denture and the mucosa are obtained compared with non-use of complete denture adhesives. Key words:Complete dentures, patient satisfaction, denture adhesives, clinical trials. PMID:29946414

  12. Direct Breakthrough Curve Prediction From Statistics of Heterogeneous Conductivity Fields

    NASA Astrophysics Data System (ADS)

    Hansen, Scott K.; Haslauer, Claus P.; Cirpka, Olaf A.; Vesselinov, Velimir V.

    2018-01-01

    This paper presents a methodology to predict the shape of solute breakthrough curves in heterogeneous aquifers at early times and/or under high degrees of heterogeneity, both cases in which the classical macrodispersion theory may not be applicable. The methodology relies on the observation that breakthrough curves in heterogeneous media are generally well described by lognormal distributions, and mean breakthrough times can be predicted analytically. The log-variance of solute arrival is thus sufficient to completely specify the breakthrough curves, and this is calibrated as a function of aquifer heterogeneity and dimensionless distance from a source plane by means of Monte Carlo analysis and statistical regression. Using the ensemble of simulated groundwater flow and solute transport realizations employed to calibrate the predictive regression, reliability estimates for the prediction are also developed. Additional theoretical contributions include heuristics for the time until an effective macrodispersion coefficient becomes applicable, and also an expression for its magnitude that applies in highly heterogeneous systems. It is seen that the results here represent a way to derive continuous time random walk transition distributions from physical considerations rather than from empirical field calibration.
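
    A small sketch of the core prediction step: with the mean arrival time known analytically and the log-variance taken from the calibrated regression, the lognormal breakthrough curve is fully specified. Both numerical values below are placeholders, not results from the paper:

    ```python
    import numpy as np

    t = np.linspace(1e-3, 10, 500)   # dimensionless time axis
    mean_t = 2.0   # analytically predicted mean arrival time (assumed value)
    s2 = 0.5       # log-variance of solute arrival from regression (assumed)

    # Lognormal with log-variance s2 and mean mean_t = exp(mu + s2/2):
    mu = np.log(mean_t) - s2 / 2.0
    pdf = np.exp(-(np.log(t) - mu) ** 2 / (2 * s2)) / (t * np.sqrt(2 * np.pi * s2))

    # pdf is the predicted breakthrough curve (solute flux versus time).
    print("modal arrival time:", t[np.argmax(pdf)])
    ```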

  13. Delay and cost performance analysis of the Diffie-Hellman key exchange protocol in opportunistic mobile networks

    NASA Astrophysics Data System (ADS)

    Soelistijanto, B.; Muliadi, V.

    2018-03-01

    Diffie-Hellman (DH) provides an efficient key exchange system by reducing the number of cryptographic keys distributed in the network. In this method, a node broadcasts a single public key to all nodes in the network, and in turn each peer uses this key to establish a shared secret key which then can be utilized to encrypt and decrypt traffic between the peer and the given node. In this paper, we evaluate the key transfer delay and cost performance of DH in opportunistic mobile networks, a specific scenario of MANETs where complete end-to-end paths rarely exist between sources and destinations; consequently, the end-to-end delays in these networks are much greater than typical MANETs. Simulation results, driven by a random node movement model and real human mobility traces, showed that DH outperforms a typical key distribution scheme based on the RSA algorithm in terms of key transfer delay, measured by average key convergence time; however, DH performs as well as the benchmark in terms of key transfer cost, evaluated by total key (copies) forwards.
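
    A toy sketch of the DH exchange itself, showing how a single broadcast public key lets each peer derive the same shared secret. The prime and base below are chosen for illustration only (real deployments use standardized large groups or elliptic curves):

    ```python
    import secrets

    # Toy parameters: p is the Mersenne prime 2**127 - 1; g is a small base
    # used purely for demonstration, not a vetted group generator.
    p = 2**127 - 1
    g = 3

    a = secrets.randbelow(p - 2) + 1   # node A's private exponent
    b = secrets.randbelow(p - 2) + 1   # node B's private exponent

    A = pow(g, a, p)   # the single public key node A broadcasts
    B = pow(g, b, p)   # node B's public key

    # Each peer combines the other's broadcast value with its own secret;
    # (g**b)**a == (g**a)**b mod p, so both sides get the same shared key.
    assert pow(B, a, p) == pow(A, b, p)
    print("shared secret established")
    ```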

  14. Graphene materials having randomly distributed two-dimensional structural defects

    DOEpatents

    Kung, Harold H; Zhao, Xin; Hayner, Cary M; Kung, Mayfair C

    2013-10-08

    Graphene-based storage materials for high-power battery applications are provided. The storage materials are composed of vertical stacks of graphene sheets and have reduced resistance for Li ion transport. This reduced resistance is achieved by incorporating a random distribution of structural defects into the stacked graphene sheets, whereby the structural defects facilitate the diffusion of Li ions into the interior of the storage materials.

  15. Graphene materials having randomly distributed two-dimensional structural defects

    DOEpatents

    Kung, Harold H.; Zhao, Xin; Hayner, Cary M.; Kung, Mayfair C.

    2016-05-31

    Graphene-based storage materials for high-power battery applications are provided. The storage materials are composed of vertical stacks of graphene sheets and have reduced resistance for Li ion transport. This reduced resistance is achieved by incorporating a random distribution of structural defects into the stacked graphene sheets, whereby the structural defects facilitate the diffusion of Li ions into the interior of the storage materials.

  16. Gravitational lensing by eigenvalue distributions of random matrix models

    NASA Astrophysics Data System (ADS)

    Martínez Alonso, Luis; Medina, Elena

    2018-05-01

    We propose to use eigenvalue densities of unitary random matrix ensembles as mass distributions in gravitational lensing. The corresponding lens equations reduce to algebraic equations in the complex plane which can be treated analytically. We prove that these models can be applied to describe lensing by systems of edge-on galaxies. We illustrate our analysis with the Gaussian and the quartic unitary matrix ensembles.

  17. The Mean Distance to the nth Neighbour in a Uniform Distribution of Random Points: An Application of Probability Theory

    ERIC Educational Resources Information Center

    Bhattacharyya, Pratip; Chakrabarti, Bikas K.

    2008-01-01

    We study different ways of determining the mean distance r_n between a reference point and its nth neighbour among random points distributed with uniform density in a D-dimensional Euclidean space. First, we present a heuristic method; though this method provides only a crude mathematical result, it shows a simple way of estimating…
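
    A Monte Carlo check of the quantity studied, estimating the mean distance to the nth neighbour for uniform points in 2D and comparing against the known Poisson-process result E[r_n] = Gamma(n + 1/2) / (Gamma(n) * sqrt(pi * density)); the sample sizes are arbitrary:

    ```python
    import numpy as np
    from math import gamma, pi, sqrt

    rng = np.random.default_rng(4)
    D, density, n_pts, trials = 2, 1.0, 2000, 200
    side = (n_pts / density) ** (1 / D)

    nearest = np.zeros(5)
    for _ in range(trials):
        pts = rng.uniform(0, side, size=(n_pts, D))
        ref = np.full(D, side / 2)                 # reference point at centre
        r = np.sort(np.linalg.norm(pts - ref, axis=1))
        nearest += r[:5]                           # 5 nearest neighbours
    mean_rn = nearest / trials

    theory = [gamma(n + 0.5) / (gamma(n) * sqrt(pi * density))
              for n in range(1, 6)]
    print(mean_rn)   # empirical E[r_n], n = 1..5
    print(theory)    # 2D Poisson prediction
    ```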

  18. Timing the Random and Anomalous Arrival of Particles in a Geiger Counter with GPS Devices

    ERIC Educational Resources Information Center

    Blanco, F.; La Rocca, P.; Riggi, F.; Riggi, S.

    2008-01-01

    The properties of the arrival time distribution of particles in a detector have been studied by the use of a small Geiger counter, with a GPS device to tag the event time. The experiment is intended to check the basic properties of the random arrival time distribution between successive events and to simulate the investigations carried out by…
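
    A quick simulation of the baseline property being tested: for completely random (Poisson) arrivals, the waiting times between successive events are exponential, so the empirical survival fraction P(gap > x) should match exp(-rate * x). The rate and sample size are arbitrary:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    rate = 2.0                        # mean event rate (events per second)
    n_events = 100_000

    gaps = rng.exponential(1.0 / rate, n_events)   # inter-arrival times
    times = np.cumsum(gaps)                        # simulated tagged event times

    # Check the exponential law for the gaps at a few thresholds.
    for x in (0.5, 1.0, 2.0):
        print(x, (gaps > x).mean(), np.exp(-rate * x))
    ```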

  19. Spline methods for approximating quantile functions and generating random samples

    NASA Technical Reports Server (NTRS)

    Schiess, J. R.; Matthews, C. G.

    1985-01-01

    Two cubic spline formulations are presented for representing the quantile function (inverse cumulative distribution function) of a random sample of data. Both B-spline and rational spline approximations are compared with analytic representations of the quantile function. It is also shown how these representations can be used to generate random samples for use in simulation studies. Comparisons are made on samples generated from known distributions and a sample of experimental data. The spline representations are more accurate for multimodal and skewed samples and require much less time to generate samples than the analytic representation.
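
    A minimal sketch of the idea using SciPy's generic cubic spline on an empirical quantile function (not the paper's specific B-spline or rational-spline formulations), followed by inverse-transform sampling through the fitted spline:

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    rng = np.random.default_rng(6)
    sample = rng.gamma(2.0, 1.0, 500)   # a skewed "experimental" data set

    # Fit a cubic spline to the empirical quantile function Q(u).
    u = (np.arange(1, sample.size + 1) - 0.5) / sample.size
    Q = CubicSpline(u, np.sort(sample))

    # Generate new samples: feed uniform variates through the quantile spline.
    new_draws = Q(rng.random(10_000))
    print(sample.mean(), new_draws.mean())   # should be close
    ```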

  20. Computation of convex bounds for present value functions with random payments

    NASA Astrophysics Data System (ADS)

    Ahcan, Ales; Darkiewicz, Grzegorz; Goovaerts, Marc; Hoedemakers, Tom

    2006-02-01

    In this contribution we study the distribution of the present value function of a series of random payments in a stochastic financial environment. Such distributions occur naturally in a wide range of applications within fields of insurance and finance. We obtain accurate approximations by developing upper and lower bounds in the convex-order sense for present value functions. Technically speaking, our methodology is an extension of the results of Dhaene et al. [Insur. Math. Econom. 31(1) (2002) 3-33, Insur. Math. Econom. 31(2) (2002) 133-161] to the case of scalar products of mutually independent random vectors.

  1. The concordance index C and the Mann-Whitney parameter Pr(X>Y) with randomly censored data.

    PubMed

    Koziol, James A; Jia, Zhenyu

    2009-06-01

    Harrell's c-index or concordance C has been widely used as a measure of separation of two survival distributions. In the absence of censored data, the c-index estimates the Mann-Whitney parameter Pr(X>Y), which has been repeatedly utilized in various statistical contexts. In the presence of randomly censored data, the c-index no longer estimates Pr(X>Y); rather, a parameter that involves the underlying censoring distributions. This is in contrast to Efron's maximum likelihood estimator of the Mann-Whitney parameter, which is recommended in the setting of random censorship.
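
    A short sketch of the uncensored case, where the c-index coincides with the Mann-Whitney parameter Pr(X > Y), estimated from the fraction of concordant pairs (ties counted as one half); the data below are synthetic:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    x = rng.exponential(2.0, 400)   # survival times, group X
    y = rng.exponential(1.0, 400)   # survival times, group Y

    # Pairwise comparisons over all (x_i, y_j) pairs.
    gt = (x[:, None] > y[None, :]).mean()
    ties = (x[:, None] == y[None, :]).mean()
    print(gt + 0.5 * ties)          # estimate of Pr(X > Y), i.e. the c-index
    ```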

  2. The causes of spatial patterning of mounds of a fungus-cultivating termite: results from nearest-neighbour analysis and ecological studies.

    PubMed

    Korb, Judith; Linsenmair, Karl Eduard

    2001-05-01

    Little is known about processes regulating population dynamics in termites. We investigated the distribution of mound-colonies of the fungus-cultivating termite Macrotermes bellicosus (Smeathman) in two habitats in the Comoé National Park (Côte d'Ivoire) with nearest-neighbour analysis differentiating between different age classes. These results were compared with ecological data on processes influencing population dynamics. High mound densities were recorded in shrub savannah while only a few mounds were found in gallery forest. Mounds were distributed randomly in both habitats when all mounds were considered together, and when inhabited and uninhabited mounds were treated separately. However, distinctive non-random patterns were revealed in the savannah when we distinguished between different age classes. Small, young colonies were aggregated when they coexisted with larger, older colonies, which were more regularly distributed. This indicates that the distribution of older colonies is influenced by intraspecific competition whereas that of younger colonies is influenced by opposing factors that lead to aggregation. This is in accordance with ecological data. Food is a limiting resource for large colonies, while patchily distributed appropriate microclimatic conditions seem to be more important for young colonies. Colonies that had formerly coexisted (i.e. living colonies and recently dead colonies) showed aggregated, random and regular distribution patterns, suggesting several causes of mortality. Colonies that had never had contact with each other were randomly distributed and no specific regulation mechanism was implicated. These results show that different age classes seem to be regulated by different processes and that separation between age classes is necessary to reveal indicative spatial patterns in nearest-neighbour analysis.
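
    For intuition, a sketch of a standard nearest-neighbour summary (the Clark-Evans ratio) on a synthetic completely random pattern; this is a generic textbook method, not necessarily the exact analysis the authors performed:

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    n, side = 200, 1000.0                 # mounds in a square plot (metres)
    pts = rng.uniform(0, side, (n, 2))    # completely random pattern

    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)           # ignore self-distances
    nn = d.min(axis=1)                    # nearest-neighbour distances

    density = n / side**2
    # Expected mean NN distance under complete spatial randomness: 1/(2*sqrt(rho)).
    R = nn.mean() / (0.5 / np.sqrt(density))
    print(R)   # R ~ 1 random, R < 1 aggregated, R > 1 regular
    ```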

  3. ATLAS, an integrated structural analysis and design system. Volume 4: Random access file catalog

    NASA Technical Reports Server (NTRS)

    Gray, F. P., Jr. (Editor)

    1979-01-01

    A complete catalog is presented for the random access files used by the ATLAS integrated structural analysis and design system. ATLAS consists of several technical computation modules which output data matrices to corresponding random access files. A description of the matrices written on these files is contained herein.

  4. A randomized intervention of reminder letter for human papillomavirus vaccine series completion.

    PubMed

    Chao, Chun; Preciado, Melissa; Slezak, Jeff; Xu, Lanfang

    2015-01-01

    Completion rate for the three-dose series of the human papillomavirus (HPV) vaccine has generally been low. This study evaluated the effectiveness of a reminder letter intervention on HPV vaccine three-dose series completion. Female members of Kaiser Permanente Southern California Health Plan who received at least one dose, but not more than two doses, of the HPV vaccine by February 13, 2013, and who were between ages 9 and 26 years at the time of first HPV vaccination were included. Eighty percent of these females were randomized to receive the reminder letter, and 20% were randomized to receive standard of care (control). The reminder letters were mailed quarterly to those who had not completed the series. The proportion of series completion at the end of the 12-month evaluation period was compared using the chi-square test. A total of 9,760 females were included in the intervention group and 2,445 in the control group. HPV vaccine series completion was 56.4% in the intervention group and 46.6% in the control group (p < .001). The effect of the intervention appeared to be stronger in girls aged 9-17 years compared with young women aged 18-26 years at the first dose, and in blacks compared with whites. Reminder letters scheduled quarterly were effective in enhancing HPV vaccine series completion among those who initiated the vaccine. However, a large gap in series completion remained despite the intervention. Future studies should address other barriers to series completion, including those at the provider and health care system level. Copyright © 2015 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  5. Offdiagonal complexity: A computationally quick complexity measure for graphs and networks

    NASA Astrophysics Data System (ADS)

    Claussen, Jens Christian

    2007-02-01

    A vast variety of biological, social, and economic networks shows topologies drastically differing from random graphs; yet the quantitative characterization remains unsatisfactory from a conceptual point of view. Motivated by the discussion of small scale-free networks, a biased link distribution entropy is defined, which takes an extremum for a power-law distribution. This approach is extended to the node-node link cross-distribution, whose nondiagonal elements characterize the graph structure beyond link distribution, cluster coefficient and average path length. From here a simple (and computationally cheap) complexity measure can be defined. This offdiagonal complexity (OdC) is proposed as a novel measure to characterize the complexity of an undirected graph, or network. While OdC is zero both for regular lattices and for fully connected networks, it takes a moderately low value for a random graph and shows high values for apparently complex structures such as scale-free networks and hierarchical trees. The OdC approach is applied to the Helicobacter pylori protein interaction network and randomly rewired surrogates.
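
    A heavily hedged sketch of one plausible reading of the OdC construction, as the entropy of the normalized offdiagonal sums of the degree-degree link matrix; details such as binning and normalization may differ from the paper's actual definition (networkx is used only for graph generation):

    ```python
    import numpy as np
    import networkx as nx

    def offdiagonal_complexity(G):
        """One reading of OdC: entropy of the normalized diagonal sums of the
        node-node (degree-degree) link correlation matrix."""
        deg = dict(G.degree())
        kmax = max(deg.values())
        c = np.zeros((kmax + 1, kmax + 1))
        for i, j in G.edges():
            k, l = sorted((deg[i], deg[j]))
            c[k, l] += 1                    # count links between degrees k, l
        # Sum along each offdiagonal l - k = m, then take the entropy.
        a = np.array([np.trace(c, offset=m) for m in range(kmax + 1)])
        a = a[a > 0] / a.sum()
        return -(a * np.log(a)).sum()

    print(offdiagonal_complexity(nx.erdos_renyi_graph(200, 0.05, seed=1)))
    print(offdiagonal_complexity(nx.barabasi_albert_graph(200, 3, seed=1)))
    ```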

  6. Analysis of in vitro evolution reveals the underlying distribution of catalytic activity among random sequences.

    PubMed

    Pressman, Abe; Moretti, Janina E; Campbell, Gregory W; Müller, Ulrich F; Chen, Irene A

    2017-08-21

    The emergence of catalytic RNA is believed to have been a key event during the origin of life. Understanding how catalytic activity is distributed across random sequences is fundamental to estimating the probability that catalytic sequences would emerge. Here, we analyze the in vitro evolution of triphosphorylating ribozymes and translate their fitnesses into absolute estimates of catalytic activity for hundreds of ribozyme families. The analysis efficiently identified highly active ribozymes and estimated catalytic activity with good accuracy. The evolutionary dynamics follow Fisher's Fundamental Theorem of Natural Selection and a corollary, permitting retrospective inference of the distribution of fitness and activity in the random sequence pool for the first time. The frequency distribution of rate constants appears to be log-normal, with a surprisingly steep dropoff at higher activity, consistent with a mechanism for the emergence of activity as the product of many independent contributions. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  7. Random versus maximum entropy models of neural population activity

    NASA Astrophysics Data System (ADS)

    Ferrari, Ulisse; Obuchi, Tomoyuki; Mora, Thierry

    2017-04-01

    The principle of maximum entropy provides a useful method for inferring statistical mechanics models from observations in correlated systems, and is widely used in a variety of fields where accurate data are available. While the assumptions underlying maximum entropy are intuitive and appealing, its adequacy for describing complex empirical data has been little studied in comparison to alternative approaches. Here, data from the collective spiking activity of retinal neurons is reanalyzed. The accuracy of the maximum entropy distribution constrained by mean firing rates and pairwise correlations is compared to a random ensemble of distributions constrained by the same observables. For most of the tested networks, maximum entropy approximates the true distribution better than the typical or mean distribution from that ensemble. This advantage improves with population size, with groups as small as eight being almost always better described by maximum entropy. Failure of maximum entropy to outperform random models is found to be associated with strong correlations in the population.

  8. Using resource use logs to reduce the amount of missing data in economic evaluations alongside trials.

    PubMed

    Marques, Elsa; Johnson, Emma C; Gooberman-Hill, Rachael; Blom, Ashley W; Noble, Sian

    2013-01-01

    Economic evaluations alongside randomized controlled trials that collect data using patient-completed questionnaires are prone to missing data. Our objective was to determine whether giving patients a resource use log (RUL) at baseline would improve the odds of completing questions in a follow-up resource use questionnaire (RUQ) and to identify patients' views on RUL's usefulness and acceptability. The RUL study was a randomized controlled trial and qualitative study nested within a larger randomized controlled trial (the Arthroplasty Pain Experience Study trial). Eighty-five patients were randomized at baseline to receive or not receive an RUL. At 3-month follow-up, all participants received a postal RUQ. We created dummy variables for 13 resource use categories indicating whether complete information had been given for each category. We compared the completion rates between arms by using descriptive statistics and logistic regression. We explored patients' experience of using the RUL by interviewing a different subsample of Arthroplasty Pain Experience Study patients (n = 24) at 2- to 4-week follow-up. At 3 months, 74 of the 85 (87% in each arm) patients returned the RUQ. Patients in the RUL arm were 3.5 times more likely to complete the National Health Service community-based services category (P = 0.08). The RUL was positively received by patients and was generally seen as a useful memory aid. The RUL is a useful and acceptable tool in reducing the amount of missing data for some types of resource use. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  9. Evaluation of TOPLATS on three Mediterranean catchments

    NASA Astrophysics Data System (ADS)

    Loizu, Javier; Álvarez-Mozos, Jesús; Casalí, Javier; Goñi, Mikel

    2016-08-01

    Physically based hydrological models are complex tools that provide a complete description of the different processes occurring in a catchment. The TOPMODEL-based Land-Atmosphere Transfer Scheme (TOPLATS) simulates water and energy balances at different time steps, in both lumped and distributed modes. In order to gain insight into the behavior of TOPLATS and its applicability under different conditions, a detailed evaluation needs to be carried out. This study aimed to develop a complete evaluation of TOPLATS including: (1) a detailed review of previous research works using this model; (2) a sensitivity analysis (SA) of the model with two contrasting methods (Morris and Sobol) of different complexity; (3) a 4-step calibration strategy based on a multi-start Powell optimization algorithm; and (4) an analysis of the influence of simulation time step (hourly vs. daily). The model was applied to three catchments of varying size (La Tejeria, Cidacos and Arga), located in Navarre (Northern Spain) and characterized by different levels of Mediterranean climate influence. The Morris and Sobol methods showed very similar results, identifying the Brooks-Corey pore size distribution index (B), bubbling pressure (ψc) and hydraulic conductivity decay (f) as the three most influential parameters in TOPLATS overall. After calibration and validation, adequate streamflow simulations were obtained in the two wettest catchments, but the driest (Cidacos) gave poor results in validation due to the large climatic variability between the calibration and validation periods. To overcome this issue, an alternative random and discontinuous method of calibration/validation period selection was implemented, improving model results.

  10. A Random Variable Transformation Process.

    ERIC Educational Resources Information Center

    Scheuermann, Larry

    1989-01-01

    Provides a short BASIC program, RANVAR, which generates random variates for various theoretical probability distributions. The seven variates include: uniform, exponential, normal, binomial, Poisson, Pascal, and triangular. (MVL)
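
    In the same spirit as RANVAR (but sketched in Python rather than BASIC), a few of the listed variates can be generated from uniforms via inverse transforms and the Box-Muller method:

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    u = rng.random(100_000)              # uniform(0, 1) building block

    # Exponential via inverse transform: F^-1(u) = -ln(1 - u) / lam.
    lam = 0.5
    expo = -np.log1p(-u) / lam

    # Triangular(a, c, b) via its piecewise inverse CDF.
    a, c, b = 0.0, 2.0, 5.0
    fc = (c - a) / (b - a)
    tri = np.where(u < fc,
                   a + np.sqrt(u * (b - a) * (c - a)),
                   b - np.sqrt((1 - u) * (b - a) * (b - c)))

    # Normal via Box-Muller, from two independent uniforms.
    u2 = rng.random(100_000)
    norm = np.sqrt(-2 * np.log1p(-u)) * np.cos(2 * np.pi * u2)

    print(expo.mean(), tri.mean(), norm.std())   # ~2.0, ~2.33, ~1.0
    ```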

  11. Overlap Properties of Clouds Generated by a Cloud Resolving Model

    NASA Technical Reports Server (NTRS)

    Oreopoulos, L.; Khairoutdinov, M.

    2002-01-01

    In order for General Circulation Models (GCMs), among our most important tools for predicting future climate, to correctly describe the propagation of solar and thermal radiation through the cloudy atmosphere, a realistic description of the vertical distribution of cloud amount is needed. Actually, one needs not only the cloud amounts at different levels of the atmosphere, but also how these cloud amounts are related, in other words, how they overlap. Currently GCMs make idealized assumptions about cloud overlap, for example that contiguous cloud layers overlap maximally and non-contiguous cloud layers overlap randomly. Since there are difficulties in obtaining the vertical profile of cloud amount from observations, the realism of the overlap assumptions made in GCMs has not yet been rigorously investigated. Recently, however, cloud observations from a relatively new type of ground radar have been used to examine the vertical distribution of cloudiness. These observations suggest that the GCM overlap assumptions are dubious. Our study uses cloud fields from sophisticated models dedicated to simulating cloud formation, maintenance, and dissipation, called Cloud Resolving Models (CRMs). These models are generally considered capable of producing realistic three-dimensional representations of cloudiness. Using numerous cloud fields produced by such a CRM, we show that the degree of overlap between cloud layers is a function of their separation distance and is in general described by a combination of the maximum and random overlap assumptions, with random overlap dominating as separation distances increase. We show that it is possible to parameterize this behavior in a way that can eventually be incorporated in GCMs. Our results bear a significant resemblance to the results from the radar observations despite the completely different nature of the datasets. This consistency is encouraging and will promote the development of new radiative transfer codes that estimate the radiative effects of multi-layer cloud fields more accurately.
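
    A toy illustration of the blended overlap behavior described: the combined cloud cover of two layers interpolates between the maximum and random overlap limits, with a weight that decays as layer separation grows. The exponential decay and the 2 km decorrelation length below are assumed placeholders, not the paper's fitted parameterization:

    ```python
    import numpy as np

    def combined_cover(c1, c2, alpha):
        """Blend of maximum and random overlap for two cloud layers.
        alpha = 1 -> maximum overlap; alpha = 0 -> random overlap."""
        c_max = max(c1, c2)
        c_rand = 1 - (1 - c1) * (1 - c2)
        return alpha * c_max + (1 - alpha) * c_rand

    # Overlap weight decaying with layer separation (assumed 2 km scale).
    for dz in (0.5, 2.0, 8.0):          # separation in km
        alpha = np.exp(-dz / 2.0)
        print(dz, combined_cover(0.4, 0.3, alpha))
    ```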

  12. Advanced analysis of forest fire clustering

    NASA Astrophysics Data System (ADS)

    Kanevski, Mikhail; Pereira, Mario; Golay, Jean

    2017-04-01

    Analysis of point pattern clustering is an important topic in spatial statistics and for many applications: biodiversity, epidemiology, natural hazards, geomarketing, etc. There are several fundamental approaches used to quantify spatial data clustering using topological, statistical and fractal measures. In the present research, the recently introduced multi-point Morisita index (mMI) is applied to study the spatial clustering of forest fires in Portugal. The data set consists of more than 30,000 fire events covering the time period from 1975 to 2013. The distribution of forest fires is very complex and highly variable in space. The mMI is a multi-point extension of the classical two-point Morisita index. In essence, the mMI is estimated by covering the region under study with a grid and computing how many times more likely it is that m points selected at random will be from the same grid cell than would be the case for a completely random Poisson process. By changing the number of grid cells (the size of the grid cells), the mMI characterizes the scaling properties of spatial clustering. From the mMI, the intrinsic dimension (fractal dimension) of the point distribution can be estimated as well. In this study, the mMI of forest fires is compared with the mMI of random patterns (RPs) generated within the validity domain defined as the forest area of Portugal. It turns out that the forest fires are highly clustered inside the validity domain in comparison with the RPs. Moreover, they demonstrate different scaling properties at different spatial scales. The results obtained from the mMI analysis are also compared with those of fractal measures of clustering: the box-counting and sandbox-counting approaches. REFERENCES Golay J., Kanevski M., Vega Orozco C., Leuenberger M., 2014: The multipoint Morisita index for the analysis of spatial patterns. Physica A, 406, 191-202. Golay J., Kanevski M., 2015: A new estimator of intrinsic dimension based on the multipoint Morisita index. Pattern Recognition, 48, 4070-4081.
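
    A sketch of a quadrat-based Morisita index consistent with the description above, with values near 1 indicating complete spatial randomness and larger values indicating clustering at that cell size. The multi-point normalization follows our reading of the cited papers and may differ from their formulation in detail:

    ```python
    import numpy as np

    def morisita(points, n_cells, m=2):
        """Multi-point Morisita index on the unit square split into
        n_cells x n_cells quadrats; m = 2 recovers the classical index."""
        N = len(points)
        idx = np.clip((points * n_cells).astype(int), 0, n_cells - 1)
        counts = np.zeros((n_cells, n_cells))
        np.add.at(counts, (idx[:, 0], idx[:, 1]), 1)
        # Falling factorials n(n-1)...(n-m+1) per cell and for N overall.
        ff = np.ones_like(counts)
        for k in range(m):
            ff *= counts - k
        den = np.prod([N - k for k in range(m)], dtype=float)
        return (n_cells ** 2) ** (m - 1) * ff.sum() / den

    rng = np.random.default_rng(10)
    uniform_pts = rng.random((2000, 2))
    centres = rng.random((40, 2))
    clustered = (centres[:, None, :]
                 + 0.01 * rng.standard_normal((40, 50, 2))).reshape(-1, 2) % 1.0
    print(morisita(uniform_pts, 10))   # ~ 1 for complete randomness
    print(morisita(clustered, 10))     # >> 1 for a clustered pattern
    ```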

  13. The impact of resource quality on the evolution of virulence in spatially heterogeneous environments.

    PubMed

    Su, Min; Boots, Mike

    2017-03-07

    Understanding the drivers of parasite evolution, and in particular of disease virulence, remains a major focus of evolutionary theory. Here, we examine the role of resource quality, and in particular of spatial environmental heterogeneity in the distribution of these resources, on the evolution of virulence. There may be direct effects of resources on host susceptibility and pathogenicity alongside effects on reproduction that indirectly impact host-parasite population dynamics. We therefore assume that high resource quality may lead to increased host reproduction and/or increased disease resistance. In completely mixed populations there is no effect of resource quality on the outcome of disease evolution. However, when interactions are local, higher resource quality generally selects for higher virulence/transmission, for both linear and saturating transmission-virulence trade-off assumptions. The exception is castrators (i.e., parasites whose infected hosts do not reproduce), where higher virulence is selected for at both low and high resource quality under mixed local and global infection. Heterogeneity in the distribution of environmental resources only has an effect on the outcome in castrators, where random distributions generally select for higher virulence. Overall, our results further underline the importance of considering spatial structure in order to understand evolutionary processes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. System for Measuring Conditional Amplitude, Phase, or Time Distributions of Pulsating Phenomena

    PubMed Central

    Van Brunt, Richard J.; Cernyar, Eric W.

    1992-01-01

    A detailed description is given of an electronic stochastic analyzer for use with direct “real-time” measurements of the conditional distributions needed for a complete stochastic characterization of pulsating phenomena that can be represented as random point processes. The measurement system described here is designed to reveal and quantify effects of pulse-to-pulse or phase-to-phase memory propagation. The unraveling of memory effects is required so that the physical basis for observed statistical properties of pulsating phenomena can be understood. The individual unique circuit components that comprise the system and the combinations of these components for various measurements, are thoroughly documented. The system has been applied to the measurement of pulsating partial discharges generated by applying alternating or constant voltage to a discharge gap. Examples are shown of data obtained for conditional and unconditional amplitude, time interval, and phase-of-occurrence distributions of partial-discharge pulses. The results unequivocally show the existence of significant memory effects as indicated, for example, by the observations that the most probable amplitudes and phases-of-occurrence of discharge pulses depend on the amplitudes and/or phases of the preceding pulses. Sources of error and fundamental limitations of the present measurement approach are analyzed. Possible extensions of the method are also discussed. PMID:28053450

  15. Assessment of readability, understandability, and completeness of pediatric hospital medicine discharge instructions.

    PubMed

    Unaka, Ndidi I; Statile, Angela; Haney, Julianne; Beck, Andrew F; Brady, Patrick W; Jerardi, Karen E

    2017-02-01

    The average American adult reads at an 8th-grade level. Discharge instructions written above this level might increase the risk of adverse outcomes for children as they transition from hospital to home. We conducted a cross-sectional study at a large urban academic children's hospital to describe readability levels, understandability scores, and completeness of written instructions given to families at hospital discharge. Two hundred charts for patients discharged from the hospital medicine service were randomly selected for review. Written discharge instructions were extracted and scored for readability (Fry Readability Scale [FRS]), understandability (Patient Education Materials Assessment Tool [PEMAT]), and completeness (5 criteria determined by consensus). Descriptive statistics enumerated the distribution of readability, understandability, and completeness of written discharge instructions. Of the patients included in the study, 51% were publicly insured. Median age was 3.1 years, and median length of stay was 2.0 days. The median readability score corresponded to a 10th-grade reading level (interquartile range, 8-12; range, 1-13). Median PEMAT score was 73% (interquartile range, 64%-82%; range, 45%-100%); 36% of instructions scored below 70%, correlating with suboptimal understandability. The diagnosis was described in only 33% of the instructions. Although explicit warning signs were listed in most instructions, 38% of the instructions did not include information on the person to contact if warning signs developed. Overall, the readability, understandability, and completeness of discharge instructions were subpar. Efforts to improve the content of discharge instructions may promote safe and effective transitions home. Journal of Hospital Medicine 2017;12:98-101. © 2017 Society of Hospital Medicine.

  16. Effects of zinc supplementation on subscales of anorexia in children: A randomized controlled trial

    PubMed Central

    Khademian, Majid; Farhangpajouh, Neda; Shahsanaee, Armindokht; Bahreynian, Maryam; Mirshamsi, Mehran; Kelishadi, Roya

    2014-01-01

    Objectives: This study aims to assess the effects of zinc supplementation on improving appetite and its subscales in children. Methods: This study was conducted in 2013 in Isfahan, Iran. It had two phases. In the first phase, after validation of the Child Eating Behaviour Questionnaire (CEBQ), it was completed for 300 preschool children, who were randomly selected. The second phase was conducted as a randomized controlled trial. Eighty of these children were randomly selected and randomly assigned to two groups of equal number receiving zinc (10 mg/day) or placebo for 12 weeks. Results: Overall, 77 children completed the trial (39 in the zinc group and 38 in the control group). The results showed that zinc supplementation can improve calorie intake in children by affecting some CEBQ subscales, such as Emotional Over-Eating and Food Responsiveness. Conclusion: Zinc supplementation had a positive impact on calorie intake and some subscales of anorexia. PMID:25674110

  17. Measuring Symmetry, Asymmetry and Randomness in Neural Network Connectivity

    PubMed Central

    Esposito, Umberto; Giugliano, Michele; van Rossum, Mark; Vasilaki, Eleni

    2014-01-01

    Cognitive functions are stored in the connectome, the wiring diagram of the brain, which exhibits non-random features, so-called motifs. In this work, we focus on bidirectional, symmetric motifs, i.e. two neurons that project to each other via connections of equal strength, and unidirectional, non-symmetric motifs, i.e. pairs of neurons in which only one neuron projects to the other. We hypothesise that such motifs have been shaped via activity-dependent synaptic plasticity processes. As a consequence, learning moves the distribution of the synaptic connections away from randomness. Our aim is to provide a global, macroscopic, single-parameter characterisation of the statistical occurrence of bidirectional and unidirectional motifs. To this end we define a symmetry measure that does not require any a priori thresholding of the weights or knowledge of their maximal value. We calculate its mean and variance for random uniform or Gaussian distributions, which allows us to introduce a confidence measure of how significantly symmetric or asymmetric a specific configuration is, i.e. how likely it is that the configuration is the result of chance. We demonstrate the discriminatory power of our symmetry measure by inspecting the eigenvalues of different types of connectivity matrices. We show that a Gaussian weight distribution biases the connectivity motifs to more symmetric configurations than a uniform distribution, and that introducing random synaptic pruning, mimicking developmental regulation in synaptogenesis, biases the connectivity motifs to more asymmetric configurations, regardless of the distribution. We expect that our work will benefit the computational modelling community by providing a systematic way to characterise symmetry and asymmetry in network structures. Further, our symmetry measure will be of use to electrophysiologists who investigate the symmetry of network connectivity. PMID:25006663
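    A threshold-free statistic in the same spirit (how correlated are the reciprocal weights w_ij and w_ji?) can be sketched as follows; this is an illustration of the idea only, not the authors' exact estimator or its analytic mean and variance.

      import numpy as np

      def symmetry_index(w):
          # Correlation between w[i, j] and w[j, i] over all pairs i < j:
          # +1 for purely bidirectional motifs, ~0 for independent weights.
          iu = np.triu_indices_from(w, k=1)
          return np.corrcoef(w[iu], w.T[iu])[0, 1]

      rng = np.random.default_rng(0)
      n = 200
      indep = rng.random((n, n))                   # independent weights -> index near 0
      sym = (indep + indep.T) / 2                  # enforced symmetric motifs -> index 1
      pruned = indep * (rng.random((n, n)) < 0.2)  # random synaptic pruning
      print(symmetry_index(indep), symmetry_index(sym), symmetry_index(pruned))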

  19. Record statistics for biased random walks, with an application to financial data

    NASA Astrophysics Data System (ADS)

    Wergen, Gregor; Bogner, Miro; Krug, Joachim

    2011-05-01

    We consider the occurrence of record-breaking events in random walks with asymmetric jump distributions. The statistics of records in symmetric random walks was previously analyzed by Majumdar and Ziff [Phys. Rev. Lett. 101, 050601 (2008)] and is well understood. Unlike the case of symmetric jump distributions, in the asymmetric case the statistics of records depends on the choice of the jump distribution. We compute the record rate P_n(c), defined as the probability for the nth value to be larger than all previous values, for a Gaussian jump distribution with standard deviation σ that is shifted by a constant drift c. For small drift, in the sense of c/σ ≪ n^(-1/2), the correction to P_n(c) grows proportionally to arctan(√n) and saturates at the value c/(√2 σ). For large n the record rate approaches a constant, which is approximately given by 1 - (σ/(√(2π)c)) exp(-c²/(2σ²)) for c/σ ≫ 1. These asymptotic results carry over to other continuous jump distributions with finite variance. As an application, we compare our analytical results to the record statistics of 366 daily stock prices from the Standard & Poor's 500 index. The biased random walk accounts quantitatively for the increase in the number of upper records due to the overall trend in the stock prices, and after detrending the number of upper records is in good agreement with the symmetric random walk. However, the number of lower records in the detrended data is significantly reduced by a mechanism that remains to be identified.
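    The record rate lends itself to a quick Monte Carlo cross-check; the drift, step count and ensemble size below are arbitrary illustrative choices.

      import numpy as np

      rng = np.random.default_rng(2)
      walks, steps, sigma, c = 20000, 200, 1.0, 0.1
      # Biased Gaussian random walk: increments drawn from N(c, sigma^2).
      x = np.cumsum(rng.normal(c, sigma, (walks, steps)), axis=1)
      # A record occurs at step n if x_n exceeds all previous values (x_0 = 0).
      path = np.concatenate([np.zeros((walks, 1)), x], axis=1)
      running_max = np.maximum.accumulate(path, axis=1)
      records = x > running_max[:, :-1]
      p_n = records.mean(axis=0)          # empirical record rate P_n(c)
      print(p_n[[9, 49, 199]])            # decays toward a drift-dependent constant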

  20. Selective intra-dinucleotide interactions and periodicities of bases separated by K sites: a new vision and tool for phylogeny analyses.

    PubMed

    Valenzuela, Carlos Y

    2017-02-13

    Direct tests of the random or non-random distribution of nucleotides on genomes have been devised to test the hypothesis of neutral, nearly-neutral or selective evolution. These tests are based on the direct base distribution and are independent of the functional (coding or non-coding) or structural (repeated or unique sequences) properties of the DNA. The first approach described the longitudinal distribution of bases in tandem repeats under the Bose-Einstein statistics. A huge deviation from randomness was found. A second approach was the study of the base distribution within dinucleotides whose bases were separated by 0, 1, 2… K nucleotides. Again, an enormous difference from the random distribution was found, with significance levels beyond the range of standard tables and programs. These test values were periodic and included the 16 dinucleotides. For example, a high "positive" (more observed than expected dinucleotides) value, found in dinucleotides whose bases were separated by (3K + 2) sites, was preceded by two smaller "negative" (fewer observed than expected dinucleotides) values, whose bases were separated by (3K) or (3K + 1) sites. We examined mtDNAs, prokaryote genomes and some eukaryote chromosomes and found that the significant non-random interactions and periodicities were present up to 1000 or more sites of base separation, and in human chromosome 21 up to separations of more than 10 million sites. Each dinucleotide has its own significance value for its distance from neutrality; this yields 16 hierarchical significances. A three-dimensional table with the number of sites of separation between the bases and the 16 significances (the third dimension is the dinucleotide, individual or taxon involved) directly gives an evolutionary state of the analyzed genome that can be used to obtain phylogenies. An example is provided.
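    The test quantity described, observed versus expected counts of base pairs separated by k sites under a random placement model, can be sketched as follows; the z-score normalization is a simplification of the author's significance computation.

      import numpy as np

      def separation_zscores(seq, k):
          # Pairs of bases at positions i and i + k + 1 (k intervening sites);
          # expectation from the marginal base composition (random model).
          s = np.frombuffer(seq.encode(), dtype=np.uint8)
          b1, b2 = s[:-k - 1], s[k + 1:]
          n = b1.size
          out = {}
          for x in b"ACGT":
              for y in b"ACGT":
                  obs = np.sum((b1 == x) & (b2 == y))
                  exp = n * np.mean(s == x) * np.mean(s == y)
                  out[chr(x) + chr(y)] = (obs - exp) / np.sqrt(exp)
          return out

      rng = np.random.default_rng(3)
      seq = "".join(rng.choice(list("ACGT"), 100000))
      print(separation_zscores(seq, 2)["AA"])   # near 0 for a random sequence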

  1. Semi-quantum Dialogue Based on Single Photons

    NASA Astrophysics Data System (ADS)

    Ye, Tian-Yu; Ye, Chong-Qiang

    2018-02-01

    In this paper, we propose two semi-quantum dialogue (SQD) protocols using single photons as the quantum carriers, where one requires the classical party to possess measurement capability and the other does not. Security against active attacks from an outside Eve in the first SQD protocol is guaranteed by the complete robustness of present semi-quantum key distribution (SQKD) protocols, the classical one-time pad encryption, the classical party's randomization operation, and the decoy photon technique. The information leakage problem of the first SQD protocol is overcome by the classical party's classical basis measurements on the single photons carrying messages, which make him share their initial states with the quantum party. Security against active attacks from Eve in the second SQD protocol is guaranteed by the classical party's randomization operation, the complete robustness of present SQKD protocols, and the classical one-time pad encryption. The information leakage problem of the second SQD protocol is overcome by the quantum party's classical basis measurements on each two adjacent single photons carrying messages, which make her share their initial states with the classical party. Compared with traditional information-leakage-resistant QD protocols, the advantage of the proposed SQD protocols lies in that they only require one party to have quantum capabilities. Compared with the existing SQD protocol, the advantage of the proposed SQD protocols lies in that they employ only single photons, rather than two-photon entangled states, as the quantum carriers. The proposed SQD protocols can be implemented with present quantum technologies.

  2. Modal Identification of Tsing MA Bridge by Using Improved Eigensystem Realization Algorithm

    NASA Astrophysics Data System (ADS)

    QIN, Q.; LI, H. B.; QIAN, L. Z.; LAU, C.-K.

    2001-10-01

    This paper presents the results of research on modal identification of the Tsing Ma bridge from ambient testing data by using an improved eigensystem realization algorithm. The testing was carried out before the bridge was open to traffic and after the completion of surfacing. Without traffic load, ambient excitations were much less intensive, and the bridge responses to such ambient excitation were also less intensive. Consequently, the bridge responses were significantly influenced by the random movement of heavy construction vehicles on the deck. To cut off noise in the testing data and make the ambient signals more stationary, a Chebyshev digital filter was used instead of a digital filter with a Hanning window. Random decrement (RD) functions were built to convert the ambient responses to free vibrations. An improved eigensystem realization algorithm was employed to improve the accuracy and efficiency of modal identification. It uses cross-correlation functions of RD functions to form the Hankel matrix instead of the RD functions themselves, and uses eigenvalue decomposition instead of singular value decomposition. The response accelerations were acquired group by group because of the limited number of high-quality accelerometers and data-logger channels available. The modes were identified group by group and then assembled, by using response accelerations acquired at reference points, to form modes of the complete bridge. Seventy-nine modes of the Tsing Ma bridge were identified, including five complex modes attributed to unevenly distributed damping in the bridge. The modes identified in the time domain were then compared with those identified in the frequency domain and with finite element analytical results.
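    For reference, a classical single-channel eigensystem realization algorithm can be sketched in a few lines; the improved algorithm in the paper differs (cross-correlations of RD functions in the Hankel matrix, eigenvalue instead of singular value decomposition), so this is only the textbook baseline.

      import numpy as np

      def era_poles(y, dt, order, rows=40):
          # Hankel matrices built from a free-decay (or random-decrement) signature.
          cols = y.size - rows - 1
          H0 = np.array([y[i:i + cols] for i in range(rows)])
          H1 = np.array([y[i + 1:i + 1 + cols] for i in range(rows)])
          U, s, Vt = np.linalg.svd(H0, full_matrices=False)
          U, s, Vt = U[:, :order], s[:order], Vt[:order]
          S = np.diag(1 / np.sqrt(s))
          A = S @ U.T @ H1 @ Vt.T @ S                 # realized state matrix
          lam = np.log(np.linalg.eigvals(A)) / dt     # continuous-time poles
          return np.abs(lam.imag) / (2 * np.pi), -lam.real / np.abs(lam)

      # Synthetic two-mode free decay as a sanity check (1.0 Hz and 2.5 Hz).
      dt = 0.01
      t = np.arange(0, 20, dt)
      y = (np.exp(-0.02 * 2 * np.pi * 1.0 * t) * np.cos(2 * np.pi * 1.0 * t)
           + 0.5 * np.exp(-0.01 * 2 * np.pi * 2.5 * t) * np.cos(2 * np.pi * 2.5 * t))
      freqs, damping = era_poles(y, dt, order=4)
      print(freqs, damping)   # frequencies in Hz and modal damping ratios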

  3. Generalized index for spatial data sets as a measure of complete spatial randomness

    NASA Astrophysics Data System (ADS)

    Hackett-Jones, Emily J.; Davies, Kale J.; Binder, Benjamin J.; Landman, Kerry A.

    2012-06-01

    Spatial data sets generated from a wide range of physical systems can be analyzed by counting the number of objects in a set of bins. Previous work has been limited to equal-sized bins, which are inappropriate for some domains (e.g., circular). We consider a nonequal-size bin configuration whereby overlapping or nonoverlapping bins cover the domain. A generalized index, defined in terms of a variance between bin counts, is developed to indicate whether or not a spatial data set, generated from exclusion or nonexclusion processes, is at the complete spatial randomness (CSR) state. Limiting values of the index are determined. Using examples, we investigate trends in the generalized index as a function of density and compare the results with those using equal-sized bins. The smallest bin size must be much larger than the mean size of the objects. We can determine whether a spatial data set is at the CSR state or not by comparing the values of the generalized index for different bin configurations: the values will be approximately the same if the data set is at the CSR state, while they will differ if it is not. In general, the generalized index is lower than the limiting value of the index, since objects do not have access to the entire region due to blocking by other objects. These methods are applied to two applications: (i) spatial data sets generated from a cellular automata model of cell aggregation in the enteric nervous system and (ii) a known plant data distribution.
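    The equal-bin special case underlying such an index, the variance-to-mean ratio of bin counts, is easy to demonstrate; the generalized nonequal-bin construction of the paper is not reproduced here.

      import numpy as np

      def dispersion_index(points, n_bins):
          # Variance-to-mean ratio of bin counts: ~1 at complete spatial
          # randomness (Poisson), >1 for clustered, <1 for regular patterns.
          counts, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                        bins=n_bins, range=[[0, 1], [0, 1]])
          return counts.var() / counts.mean()

      rng = np.random.default_rng(4)
      csr = rng.random((2000, 2))
      centers = rng.random((40, 2))
      clustered = centers[rng.integers(0, 40, 2000)] + 0.02 * rng.standard_normal((2000, 2))
      print(dispersion_index(csr, 10), dispersion_index(clustered, 10))

    Comparing the index across several bin configurations, as the abstract suggests, distinguishes a data set at the CSR state (values roughly constant) from one that is not.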

  4. Ice Water Classification Using Statistical Distribution Based Conditional Random Fields in RADARSAT-2 Dual Polarization Imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Li, F.; Zhang, S.; Hao, W.; Zhu, T.; Yuan, L.; Xiao, F.

    2017-09-01

    In this paper, a Statistical Distribution based Conditional Random Fields (STA-CRF) algorithm is exploited for improving marginal ice-water classification. Pixel-level ice concentration is presented for the comparison of the CRF-based methods. Furthermore, in order to find an effective statistical distribution model to integrate into STA-CRF, five statistical distribution models are investigated. The STA-CRF methods are tested on 2 scenes around Prydz Bay and Adélie Depression, which contain a variety of ice types during the melt season. Experimental results indicate that the proposed method can resolve the sea ice edge well in the Marginal Ice Zone (MIZ) and shows a robust distinction between ice and water.

  5. A polymer, random walk model for the size-distribution of large DNA fragments after high linear energy transfer radiation

    NASA Technical Reports Server (NTRS)

    Ponomarev, A. L.; Brenner, D.; Hlatky, L. R.; Sachs, R. K.

    2000-01-01

    DNA double-strand breaks (DSBs) produced by densely ionizing radiation are not located randomly in the genome: recent data indicate DSB clustering along chromosomes. Stochastic DSB clustering at large scales, from > 100 Mbp down to < 0.01 Mbp, is modeled using computer simulations and analytic equations. A random-walk, coarse-grained polymer model for chromatin is combined with a simple track structure model in Monte Carlo software called DNAbreak and is applied to data on alpha-particle irradiation of V-79 cells. The chromatin model neglects molecular details but systematically incorporates an increase in average spatial separation between two DNA loci as the number of base-pairs between the loci increases. Fragment-size distributions obtained using DNAbreak match data on large fragments about as well as distributions previously obtained with a less mechanistic approach. Dose-response relations, linear at small doses of high linear energy transfer (LET) radiation, are obtained. They are found to be non-linear when the dose becomes so large that there is a significant probability of overlapping or close juxtaposition, along one chromosome, for different DSB clusters from different tracks. The non-linearity is more evident for large fragments than for small. The DNAbreak results furnish an example of the RLC (randomly located clusters) analytic formalism, which generalizes the broken-stick fragment-size distribution of the random-breakage model that is often applied to low-LET data.
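    The random-breakage baseline that the RLC formalism generalizes is simple to simulate; the genome length and break rate below are arbitrary illustrative numbers.

      import numpy as np

      rng = np.random.default_rng(5)
      genome, mean_breaks, cells = 100.0, 8, 2000    # length (Mbp), mean DSBs, trials
      sizes = []
      for _ in range(cells):
          n = rng.poisson(mean_breaks)               # random breakage: DSB count ...
          cuts = np.sort(rng.uniform(0, genome, n))  # ... and positions are random
          sizes.extend(np.diff(np.concatenate([[0.0], cuts, [genome]])))
      sizes = np.array(sizes)
      # Pooled mean fragment size should approach genome / (mean_breaks + 1);
      # clustered DSBs (as along high-LET tracks) would skew this distribution.
      print(sizes.mean(), genome / (mean_breaks + 1))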

  6. On the distribution of a product of N Gaussian random variables

    NASA Astrophysics Data System (ADS)

    Stojanac, Željka; Suess, Daniel; Kliesch, Martin

    2017-08-01

    The product of Gaussian random variables appears naturally in many applications in probability theory and statistics. It has been known that the distribution of a product of N such variables can be expressed in terms of a Meijer G-function. Here, we compute a similar representation for the corresponding cumulative distribution function (CDF) and provide a power-log series expansion of the CDF based on the theory of the more general Fox H-functions. Numerical computations show that for small values of the argument the CDF of products of Gaussians is well approximated by the lowest orders of this expansion. Analogous results are also shown for the absolute value as well as the square of such products of N Gaussian random variables. For the latter two settings, we also compute the moment generating functions in terms of Meijer G-functions.
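    Although the exact CDF involves Meijer G and Fox H functions, its qualitative behavior is easy to probe by Monte Carlo; the sketch below is only a numerical cross-check, not the power-log series expansion derived in the paper.

      import numpy as np

      rng = np.random.default_rng(6)
      N, samples = 3, 10**6
      prod = rng.standard_normal((samples, N)).prod(axis=1)   # product of N Gaussians

      def empirical_cdf(x, pts):
          return np.searchsorted(np.sort(x), pts) / x.size

      pts = np.array([-1.0, -0.1, 0.0, 0.1, 1.0])
      print(empirical_cdf(prod, pts))   # 0.5 at x = 0 by symmetry; mass
                                        # concentrates near 0 as N grows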

  7. Polynomial chaos representation of databases on manifolds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soize, C., E-mail: christian.soize@univ-paris-est.fr; Ghanem, R., E-mail: ghanem@usc.edu

    2017-04-15

    Characterizing the polynomial chaos expansion (PCE) of a vector-valued random variable with probability distribution concentrated on a manifold is a relevant problem in data-driven settings. The probability distribution of such random vectors is multimodal in general, leading to potentially very slow convergence of the PCE. In this paper, we build on a recent development for estimating and sampling from probabilities concentrated on a diffusion manifold. The proposed methodology constructs a PCE of the random vector together with an associated generator that samples from the target probability distribution which is estimated from data concentrated in the neighborhood of the manifold. The method is robust and remains efficient for high dimension and large datasets. The resulting polynomial chaos construction on manifolds permits the adaptation of many uncertainty quantification and statistical tools to emerging questions motivated by data-driven queries.

  8. A multiple scattering theory for EM wave propagation in a dense random medium

    NASA Technical Reports Server (NTRS)

    Karam, M. A.; Fung, A. K.; Wong, K. W.

    1985-01-01

    For a dense medium of randomly distributed scatterers, an integral formulation for the total coherent field has been developed. This formulation accounts for the multiple scattering of electromagnetic waves, including both the two- and three-particle terms. It is shown that under the Markovian assumption the total coherent field and the effective field have the same effective wave number. As an illustration of this theory, the effective wave number and the extinction coefficient are derived in terms of the polarizability tensor and the pair distribution function for randomly distributed small spherical scatterers. It is found that the contribution of the three-particle term increases with the particle size, the volume fraction, the frequency and the permittivity of the particle. This increase is more significant with frequency and particle size than with the other parameters.

  9. Parameter estimation of multivariate multiple regression model using bayesian with non-informative Jeffreys’ prior distribution

    NASA Astrophysics Data System (ADS)

    Saputro, D. R. S.; Amalia, F.; Widyaningsih, P.; Affan, R. C.

    2018-05-01

    The Bayesian method can be used to estimate the parameters of a multivariate multiple regression model. It involves two distributions, the prior and the posterior; the posterior distribution is influenced by the selection of the prior distribution. Jeffreys' prior distribution is a kind of non-informative prior distribution, used when information about the parameters is not available. The non-informative Jeffreys' prior distribution is combined with the sample information, resulting in the posterior distribution, which is then used to estimate the parameters. The purpose of this research is to estimate the parameters of a multivariate regression model using the Bayesian method with the non-informative Jeffreys' prior distribution. Based on the results and discussion, the parameter estimates of β and Σ are obtained from the expected values of the corresponding marginal posterior distribution functions. The marginal posterior distributions for β and Σ are multivariate normal and inverse Wishart, respectively. However, the calculation of these expected values involves integrals of functions whose values are difficult to determine. Therefore, an approach is needed that generates random samples according to the posterior distribution characteristics of each parameter, using the Markov chain Monte Carlo (MCMC) Gibbs sampling algorithm.
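    A minimal sketch of the sampler described, assuming the standard conjugate conditionals that follow from the Jeffreys prior (matrix normal for the coefficients, inverse Wishart for the covariance); the dimensions and data are synthetic.

      import numpy as np
      from scipy.stats import invwishart

      def gibbs_mvreg(X, Y, n_iter=2000, seed=0):
          # Model: Y = X B + E, rows of E ~ N(0, Sigma),
          # Jeffreys prior p(B, Sigma) proportional to |Sigma|^(-(p+1)/2).
          rng = np.random.default_rng(seed)
          n, k = X.shape
          p = Y.shape[1]
          XtX_inv = np.linalg.inv(X.T @ X)
          B_hat = XtX_inv @ X.T @ Y               # OLS / conditional posterior mean
          A = np.linalg.cholesky(XtX_inv)
          Sigma = np.eye(p)
          draws = []
          for _ in range(n_iter):
              # B | Sigma, Y: matrix normal with cov(vec B) = Sigma kron (X'X)^-1.
              L = np.linalg.cholesky(Sigma)
              B = B_hat + A @ rng.standard_normal((k, p)) @ L.T
              # Sigma | B, Y: inverse Wishart, scale = residual sum-of-products.
              R = Y - X @ B
              Sigma = invwishart.rvs(df=n, scale=R.T @ R, random_state=rng)
              draws.append(B)
          return np.array(draws)

      rng = np.random.default_rng(1)
      X = np.column_stack([np.ones(100), rng.standard_normal((100, 2))])
      B_true = np.array([[1.0, -1.0], [2.0, 0.5], [0.0, 1.5]])
      Y = X @ B_true + rng.standard_normal((100, 2))
      print(gibbs_mvreg(X, Y)[500:].mean(axis=0))   # posterior mean, close to B_true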

  10. A compound scattering pdf for the ultrasonic echo envelope and its relationship to K and Nakagami distributions.

    PubMed

    Shankar, P Mohana

    2003-03-01

    A compound probability density function (pdf) is presented to describe the envelope of the backscattered echo from tissue. This pdf allows local and global variation in scattering cross sections in tissue. The ultrasonic backscattering cross sections are assumed to be gamma distributed. The gamma distribution also is used to model the randomness in the average cross sections. This gamma-gamma model results in the compound scattering pdf for the envelope. The relationship of this compound pdf to the Rayleigh, K, and Nakagami distributions is explored through an analysis of the signal-to-noise ratio of the envelopes and random number simulations. The three parameter compound pdf appears to be flexible enough to represent envelope statistics giving rise to Rayleigh, K, and Nakagami distributions.
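    One way to simulate the gamma-gamma compounding scheme and inspect the envelope signal-to-noise ratio is sketched below; the shape parameters and the unit-mean parameterization are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(7)
      samples = 10**6
      a, b = 2.0, 3.0                      # global and local gamma shape parameters
      # Local mean cross section w is gamma distributed (global variation);
      # given w, the cross section x is gamma with mean w (local variation).
      w = rng.gamma(a, 1.0 / a, samples)
      x = rng.gamma(b, w / b)
      env = np.sqrt(x)                     # envelope taken as root intensity
      print(env.mean() / env.std())        # compare with Rayleigh envelope SNR (~1.91)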

  11. A distribution model for the aerial application of granular agricultural particles

    NASA Technical Reports Server (NTRS)

    Fernandes, S. T.; Ormsbee, A. I.

    1978-01-01

    A model is developed to predict the shape of the distribution of granular agricultural particles applied by aircraft. The particle is assumed to have a random size and shape and the model includes the effect of air resistance, distributor geometry and aircraft wake. General requirements for the maintenance of similarity of the distribution for scale model tests are derived and are addressed to the problem of a nongeneral drag law. It is shown that if the mean and variance of the particle diameter and density are scaled according to the scaling laws governing the system, the shape of the distribution will be preserved. Distributions are calculated numerically and show the effect of a random initial lateral position, particle size and drag coefficient. A listing of the computer code is included.

  12. Bayesian statistics and Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Koch, K. R.

    2018-03-01

    The Bayesian approach allows an intuitive way to derive the methods of statistics. Probability is defined as a measure of the plausibility of statements or propositions. Three rules are sufficient to obtain the laws of probability. If the statements refer to the numerical values of variables, the so-called random variables, univariate and multivariate distributions follow. They lead to point estimation, by which unknown quantities, i.e. unknown parameters, are computed from measurements. The unknown parameters are random variables; in traditional statistics, which is not founded on Bayes' theorem, they are fixed quantities. Bayesian statistics therefore recommends itself for Monte Carlo methods, which generate random variates from given distributions. Monte Carlo methods, of course, can also be applied in traditional statistics. The unknown parameters are introduced as functions of the measurements, and the Monte Carlo methods give the covariance matrix and the expectation of these functions. A confidence region is derived in which the unknown parameters are situated with a given probability. Following a method of traditional statistics, hypotheses are tested by determining whether a value for an unknown parameter lies inside or outside the confidence region. The error propagation of a random vector by the Monte Carlo methods is presented as an application. If the random vector results from a nonlinearly transformed vector, its covariance matrix and its expectation follow from the Monte Carlo estimate. This saves a considerable amount of derivatives to be computed, and errors of the linearization are avoided. The Monte Carlo method is therefore efficient. If the functions of the measurements are given by a sum of two or more random vectors with different multivariate distributions, the resulting distribution is generally not known. The Monte Carlo methods are then needed to obtain the covariance matrix and the expectation of the sum.
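    The error-propagation application reads directly as code: sample the input vector, push the samples through the nonlinear transformation, and take empirical moments, with no linearization or derivatives; the transformation below is an arbitrary example.

      import numpy as np

      rng = np.random.default_rng(8)
      mu = np.array([1.0, 2.0])
      cov = np.array([[0.04, 0.01],
                      [0.01, 0.09]])
      x = rng.multivariate_normal(mu, cov, 10**6)
      y = np.column_stack([x[:, 0] * x[:, 1],            # f1(x) = x1 * x2
                           np.hypot(x[:, 0], x[:, 1])])  # f2(x) = |x|
      print("E[y]   =", y.mean(axis=0))
      print("Cov[y] =", np.cov(y, rowvar=False))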

  13. Quasirandom geometric networks from low-discrepancy sequences

    NASA Astrophysics Data System (ADS)

    Estrada, Ernesto

    2017-08-01

    We define quasirandom geometric networks using low-discrepancy sequences, such as Halton, Sobol, and Niederreiter. The networks are built in d dimensions by considering the d-tuples of digits generated by these sequences as the coordinates of the vertices of the networks in a d-dimensional unit hypercube. Then, two vertices are connected by an edge if they are at a distance smaller than a connection radius. We investigate computationally 11 network-theoretic properties of two-dimensional quasirandom networks and compare them with analogous random geometric networks. We also study their degree distribution and their spectral density distributions. We conclude from this intensive computational study that, in terms of the uniformity of the distribution of the vertices in the unit square, the quasirandom networks look more random than the random geometric networks. We include an analysis of potential strategies for generating higher-dimensional quasirandom networks, where it is known that some of the low-discrepancy sequences are highly correlated. In this respect, we conclude that up to dimension 20, the use of scrambling, skipping and leaping strategies generates quasirandom networks with the desired properties of uniformity. Finally, we consider a diffusive process taking place on the nodes and edges of the quasirandom and random geometric graphs. We show that the diffusion time is shorter in the quasirandom graphs as a consequence of their larger structural homogeneity. In the random geometric graphs, the diffusion produces clusters of concentration that make the process slower. Such clusters are a direct consequence of the heterogeneous and irregular distribution of the nodes in the unit square on which the generation of random geometric graphs is based.
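    A two-dimensional quasirandom geometric network can be generated in a few lines with a Halton sequence; the vertex count and connection radius are illustrative, and the degree spread is one simple proxy for the structural homogeneity discussed above.

      import numpy as np
      from scipy.stats import qmc

      def degree_stats(points, radius):
          # Connect two vertices when their Euclidean distance is below radius.
          d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
          adj = (d < radius) & ~np.eye(len(points), dtype=bool)
          deg = adj.sum(axis=1)
          return deg.mean(), deg.std()

      n, r = 500, 0.08
      halton = qmc.Halton(d=2, seed=9).random(n)          # quasirandom coordinates
      uniform = np.random.default_rng(9).random((n, 2))   # random geometric graph
      for name, pts in [("quasirandom", halton), ("random", uniform)]:
          print(name, "mean degree %.1f, std %.1f" % degree_stats(pts, r))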

  14. Multinuclear NMR studies of relaxor ferroelectrics

    NASA Astrophysics Data System (ADS)

    Zhou, Donghua

    Multinuclear NMR of 93Nb, 45Sc, and 207Pb has been carried out to study the structure, disorder, and dynamics of a series of important solid solutions: the perovskite relaxor ferroelectric materials (1-x)Pb(Mg1/3Nb2/3)O3 - xPb(Sc1/2Nb1/2)O3 (PMN-PSN). 93Nb NMR investigations of the local structure and cation order/disorder are presented as a function of PSN concentration, x. The superb fidelity and accuracy of 3QMAS allow us to make clear and consistent assignments of spectral intensities to the 28 possible nearest B-site neighbor (nBn) configurations, (NMg, NSc, NNb), where each number ranges from 0 to 6 and their sum is 6. For most of the 28 possible nBn configurations, isotropic chemical shifts and quadrupole product constants have been extracted from the data. The seven configurations with only the larger cations, Mg2+ and Sc3+ (and no Nb5+), are assigned to the seven observed narrow peaks, whose deconvoluted intensities facilitate quantitative evaluation of, and differentiation between, different models of B-site (chemical) disorder. The "completely random" model is ruled out, and the "random site" model is shown to be in qualitative agreement with the NMR experiments. To obtain quantitative agreement with the observed NMR intensities, the random site model is slightly modified by including unlike-pair interaction energies. To date, 45Sc studies have not been as fruitful as 93Nb NMR because the resolution is lower in the 45Sc spectra. The lower resolution of the 45Sc spectra is due to a smaller span of isotropic chemical shifts (40 ppm for 45Sc vs. 82 ppm for 93Nb) and to the lack of a fortuitous mechanism that simplifies the 93Nb spectra; for 93Nb, the overlap of the isotropic chemical shifts of the 6-Sc and 6-Nb configurations results in the alignment of all 28 configurations along only seven quadrupole distribution axes. Finally, we present variable-temperature 207Pb static, MAS, and 2D-PASS NMR studies. Strong linear correlations between isotropic and anisotropic chemical shifts show that Pb-O bonds vary from more ionic to more covalent environments. Distributions of Pb-O bond lengths are also quantitatively described. Such distributions are used to examine two competing models of Pb displacements: the shell model and the unique direction model. Only the latter model is able to reproduce the observed Pb-O distance distribution.
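    The "completely random" occupancy model that the NMR intensities rule out has a simple closed form: each of the six nearest B-site neighbors is independently Mg, Sc, or Nb with the stoichiometric probabilities of (1-x)PMN-xPSN, so the 28 nBn configurations follow a multinomial law. A minimal sketch of that null model (the independence of sites is exactly the assumption under test):

      from math import comb

      def completely_random_nbn(x):
          # B-site fractions from (1-x) Pb(Mg1/3Nb2/3)O3 - x Pb(Sc1/2Nb1/2)O3.
          p_mg, p_sc = (1 - x) / 3, x / 2
          p_nb = 1 - p_mg - p_sc
          probs = {}
          for n_mg in range(7):
              for n_sc in range(7 - n_mg):
                  n_nb = 6 - n_mg - n_sc
                  mult = comb(6, n_mg) * comb(6 - n_mg, n_sc)
                  probs[(n_mg, n_sc, n_nb)] = (mult * p_mg ** n_mg
                                               * p_sc ** n_sc * p_nb ** n_nb)
          return probs

      probs = completely_random_nbn(x=0.5)
      print(sum(probs.values()))    # 1.0 over all 28 configurations
      print(probs[(0, 3, 3)])       # probability of a mixed Sc/Nb configuration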

  15. The patient general satisfaction of mandibular single-implant overdentures and conventional complete dentures: Study protocol for a randomized crossover trial.

    PubMed

    Kanazawa, Manabu; Tanoue, Mariko; Miyayasu, Anna; Takeshita, Shin; Sato, Daisuke; Asami, Mari; Lam, Thuy Vo; Thu, Khaing Myat; Oda, Ken; Komagamine, Yuriko; Minakuchi, Shunsuke; Feine, Jocelyne

    2018-05-01

    Mandibular overdentures retained by a single implant placed in the midline of the edentulous mandible have been reported to be more comfortable and to function better than complete dentures. Although single-implant overdentures are still more costly than conventional complete dentures, few studies have investigated whether mandibular single-implant overdentures are superior to complete dentures in terms of patient general satisfaction. The aim of this study is to assess patient general satisfaction with mandibular single-implant overdentures and complete dentures. This study is a randomized crossover trial comparing mandibular single-implant overdentures and complete dentures in edentulous individuals. Participant recruitment is ongoing at the time of this submission. Twenty-two participants will be recruited. New mandibular complete dentures will be fabricated. A single implant will be placed in the midline of the edentulous mandible. The mucosal surface of the complete denture around the implant will be relieved for 3 months. The participants will then be randomly allocated into 2 groups according to the order of the interventions; group 1 will receive single-implant overdentures first and will wear them for 2 months, followed by complete dentures for 2 months. Group 2 will receive the same treatments in the reverse order. After experiencing the 2 interventions, the participants will choose one of the mandibular prostheses, and yearly follow-up visits are planned for 5 years. The primary outcome of this trial is patient ratings of general satisfaction on 100 mm visual analog scales. Assessments of the prostheses and oral health-related quality of life will also be recorded as patient-reported outcomes. The secondary outcomes are cost and time for treatment. Masticatory efficiency and cognitive capacity will also be recorded. Furthermore, qualitative research will be performed to investigate the factors associated with the success of these mandibular denture types. Clinical outcomes, such as implant survival rate, marginal bone loss, and prosthodontic complications, will also be recorded. The results of this randomized crossover trial will clarify whether mandibular single-implant overdentures provide edentulous individuals with better patient general satisfaction than conventional complete dentures. This clinical trial was registered at the University Hospital Medical Information Network (UMIN) Center (UMIN000017883).

  16. Comparison of Long-term Differences in Dysphagia: Cervical Arthroplasty and Anterior Cervical Fusion.

    PubMed

    Smucker, Joseph D; Bassuener, Scott R; Sasso, Rick C; Riew, K Daniel

    2017-10-01

    Retrospective cohort study. This study investigates the incidence of long-term dysphagia in cervical disc arthroplasty and anterior cervical discectomy and fusion (ACDF) patients. No long-term comparison of dysphagia between cervical arthroplasty and fusion patients has been published, and widely variable short-term postsurgical dysphagia rates have been reported. The cohorts for this study are patients with single-level cervical degenerative disc disease previously enrolled in a randomized clinical trial comparing cervical arthroplasty and ACDF. Subjective modified Bazaz Dysphagia Severity questionnaires were distributed to each patient at a minimum of 5 years postoperative for the long-term assessment. Dysphagia severity data were pooled to compare the rate of patients with dysphagia (grade >1) to that of asymptomatic patients (grade = 1). In the arthroplasty cohort, 15 of 22 (68%) patients completed long-term swallowing questionnaires, with no reports of dysphagia. Eighteen of 25 (72%) ACDF patients completed questionnaires, with 5 of 18 (28%) reporting dysphagia. This is a statistically significant difference (P=0.042) favoring lower rates of long-term dysphagia after cervical arthroplasty at an average interval of 7 years postoperative (range, 5.5-8.5 y). No significant difference between rates of self-reported short-term dysphagia was noted, with 12% (3/25) and 9% (2/22) in the ACDF and arthroplasty groups, respectively (P=0.56). All short-term dysphagia cases in the arthroplasty cohort reported complete resolution of symptoms within 12 months postoperative. In the ACDF cohort, persistent symptoms at 7 years were noted in all responding patients. Three ACDF patients reported new late-onset dysphagia, which was not noted in the arthroplasty cohort. To date, these findings represent the longest reported follow-up interval comparing rates of dysphagia between randomized cohorts of cervical arthroplasty and fusion patients. Our study suggests that cervical arthroplasty is less likely than ACDF to cause sustained long-term or late-presenting dysphagia.

  17. Evaluating effectiveness of down-sampling for stratified designs and unbalanced prevalence in Random Forest models of tree species distributions in Nevada

    Treesearch

    Elizabeth A. Freeman; Gretchen G. Moisen; Tracy S. Frescino

    2012-01-01

    Random Forests is frequently used to model species distributions over large geographic areas. Complications arise when data used to train the models have been collected in stratified designs that involve different sampling intensity per stratum. The modeling process is further complicated if some of the target species are relatively rare on the landscape leading to an...

  18. Ages of Records in Random Walks

    NASA Astrophysics Data System (ADS)

    Szabó, Réka; Vető, Bálint

    2016-12-01

    We consider random walks with continuous and symmetric step distributions. We prove universal asymptotics for the average proportion of the age of the kth longest lasting record for k = 1, 2, … and for the probability that the record of the kth longest age is broken at step n. Due to the relation to the Chinese restaurant process, the ranked sequence of proportions of ages converges to the Poisson-Dirichlet distribution.

  19. Slow diffusion by Markov random flights

    NASA Astrophysics Data System (ADS)

    Kolesnik, Alexander D.

    2018-06-01

    We present a conception of slow diffusion processes in the Euclidean spaces R^m, m ≥ 1, based on the theory of random flights with small constant speed that are driven by a homogeneous Poisson process of small rate. The slow diffusion condition that leads, on long time intervals, to the stationary distributions is given. The stationary distributions of slow diffusion processes in some low-dimensional Euclidean spaces are presented.

  20. Propagation in Striated Media

    DTIC Science & Technology

    1976-05-01

    …random walk photon scattering, geometric optics refraction at a thin phase screen, plane wave scattering from a thin screen in the Fraunhofer limit and… significant cases. In the geometric optics regime the density of allowable multipath rays is Gaussianly distributed and the power… Contents excerpt: 3.1 Random Walk Approach to Scattering; 3.2 Phase Screen Approximation to Strong Scattering; 3.3 Ray Optics and Stationary Phase Analysis.
