Sample records for statistical mechanical random

  1. Quantum mechanics as classical statistical mechanics with an ontic extension and an epistemic restriction.

    PubMed

    Budiyono, Agung; Rohrlich, Daniel

    2017-11-03

    Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.

  2. A procedure for combining acoustically induced and mechanically induced loads (first passage failure design criterion)

    NASA Technical Reports Server (NTRS)

    Crowe, D. R.; Henricks, W.

    1983-01-01

    The combined load statistics are developed by taking the acoustically induced load to be a random population, assumed to be stationary. Each element of this ensemble of acoustically induced loads is assumed to have the same power spectral density (PSD), obtained previously from a random response analysis employing the given acoustic field in the STS cargo bay as a stationary random excitation. The mechanically induced load is treated as either (1) a known deterministic transient, or (2) a nonstationary random variable of known first and second statistical moments which vary with time. A method is then shown for determining the probability that the combined load would, at any time, have a value equal to or less than a certain level. Having obtained a statistical representation of how the acoustic and mechanical loads are expected to combine, an analytical approximation for defining design levels for these loads is presented using the First Passage failure criterion.

  3. A first-order statistical smoothing approximation for the coherent wave field in random porous media

    NASA Astrophysics Data System (ADS)

    Müller, Tobias M.; Gurevich, Boris

    2005-04-01

    An important dissipation mechanism for waves in randomly inhomogeneous poroelastic media is the effect of wave-induced fluid flow. In the framework of Biot's theory of poroelasticity, this mechanism can be understood as scattering from fast into slow compressional waves. To describe this conversion scattering effect in poroelastic random media, the dynamic characteristics of the coherent wavefield are analyzed using the theory of statistical wave propagation. In particular, the method of statistical smoothing is applied to Biot's equations of poroelasticity. Within the accuracy of first-order statistical smoothing, an effective wave number of the coherent field, which accounts for the effect of wave-induced flow, is derived. This wave number is complex and involves an integral over the correlation function of the medium's fluctuations. It is shown that the known one-dimensional (1-D) result can be obtained as a special case of the present 3-D theory. The expression for the effective wave number allows us to derive a model for elastic attenuation and dispersion due to wave-induced fluid flow. These wavefield attributes are analyzed in a companion paper.

  4. Random walk to a nonergodic equilibrium concept

    NASA Astrophysics Data System (ADS)

    Bel, G.; Barkai, E.

    2006-01-01

    Random walk models, such as the trap model, continuous time random walks, and comb models, exhibit weak ergodicity breaking, when the average waiting time is infinite. The open question is, what statistical mechanical theory replaces the canonical Boltzmann-Gibbs theory for such systems? In this paper a nonergodic equilibrium concept is investigated, for a continuous time random walk model in a potential field. In particular we show that in the nonergodic phase the distribution of the occupation time of the particle in a finite region of space approaches U- or W-shaped distributions related to the arcsine law. We show that when conditions of detailed balance are applied, these distributions depend on the partition function of the problem, thus establishing a relation between the nonergodic dynamics and canonical statistical mechanics. In the ergodic phase the distribution function of the occupation times approaches a δ function centered on the value predicted based on standard Boltzmann-Gibbs statistics. The relation of our work to single-molecule experiments is briefly discussed.
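
    As a purely illustrative aside (not the authors' continuous time random walk model in a potential), the short Python sketch below simulates occupation times of a simple symmetric random walk and compares their distribution with the arcsine law referenced in the abstract; all parameters are arbitrary choices for the demonstration.

    ```python
    import numpy as np

    # Illustrative only: occupation-time statistics of a simple symmetric random walk.
    # For an unbiased walk, the fraction of time spent on the positive half-line is
    # asymptotically arcsine-distributed (U-shaped density), which is the kind of
    # occupation-time statistics the abstract refers to. All parameters are arbitrary.
    rng = np.random.default_rng(0)

    n_walks, n_steps = 5000, 1000
    steps = rng.choice([-1, 1], size=(n_walks, n_steps))
    paths = np.cumsum(steps, axis=1)

    # Fraction of time each walk spends in the region x > 0.
    occupation = (paths > 0).mean(axis=1)

    # Compare the empirical CDF with the arcsine CDF F(t) = (2/pi) * arcsin(sqrt(t)).
    for t in np.linspace(0.1, 0.9, 5):
        empirical = (occupation <= t).mean()
        arcsine = 2.0 / np.pi * np.arcsin(np.sqrt(t))
        print(f"t = {t:.1f}   empirical CDF = {empirical:.3f}   arcsine CDF = {arcsine:.3f}")
    ```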

  5. Quantum probabilistic logic programming

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite-valued ones, to express probability distributions and statistical correlations, a powerful feature for capturing relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first-order logic to reason with situations involving uncertainty.

  6. Random matrices and condensation into multiple states

    NASA Astrophysics Data System (ADS)

    Sadeghi, Sina; Engel, Andreas

    2018-03-01

    In the present work, we employ methods from statistical mechanics of disordered systems to investigate static properties of condensation into multiple states in a general framework. We aim at showing how typical properties of random interaction matrices play a vital role in manifesting the statistics of condensate states. In particular, an analytical expression for the fraction of condensate states in the thermodynamic limit is provided that confirms the result of the mean number of coexisting species in a random tournament game. We also study the interplay between the condensation problem and zero-sum games with correlated random payoff matrices.

  7. Statistical Characterization of the Mechanical Parameters of Intact Rock Under Triaxial Compression: An Experimental Proof of the Jinping Marble

    NASA Astrophysics Data System (ADS)

    Jiang, Quan; Zhong, Shan; Cui, Jie; Feng, Xia-Ting; Song, Leibo

    2016-12-01

    We investigated the statistical characteristics and probability distribution of the mechanical parameters of natural rock using triaxial compression tests. Twenty cores of Jinping marble were tested at each of the different levels of confining stress (i.e., 5, 10, 20, 30, and 40 MPa). From these full stress-strain data, we summarized the numerical characteristics and determined the probability distribution form of several important mechanical parameters, including deformational parameters, characteristic strength, characteristic strains, and failure angle. The statistical proofs relating to the mechanical parameters of rock presented new information about the marble's probabilistic distribution characteristics. The normal and log-normal distributions were appropriate for describing random strengths of rock; the coefficients of variation of the peak strengths had no relationship to the confining stress; the only acceptable random distribution for both Young's elastic modulus and Poisson's ratio was the log-normal function; and the cohesive strength had a different probability distribution pattern than the frictional angle. The triaxial tests and statistical analysis also provided experimental evidence for deciding the minimum reliable number of experimental samples and for picking appropriate parameter distributions to use in reliability calculations for rock engineering.
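
    The abstract reports that normal and log-normal distributions described the random rock strengths; the sketch below, using synthetic numbers rather than the Jinping data, shows how such a comparison is typically carried out with maximum-likelihood fits and Kolmogorov-Smirnov statistics in scipy.

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic peak-strength sample (MPa) standing in for one confining-stress level;
    # the Jinping data themselves are not reproduced here.
    rng = np.random.default_rng(1)
    strengths = rng.lognormal(mean=np.log(120.0), sigma=0.08, size=20)

    # Maximum-likelihood fits of the two candidate distributions.
    mu, sigma = stats.norm.fit(strengths)
    shape, loc, scale = stats.lognorm.fit(strengths, floc=0.0)

    # Kolmogorov-Smirnov goodness-of-fit statistics (smaller statistic = better fit).
    ks_norm = stats.kstest(strengths, "norm", args=(mu, sigma))
    ks_lognorm = stats.kstest(strengths, "lognorm", args=(shape, loc, scale))

    print(f"normal     KS = {ks_norm.statistic:.3f}  p = {ks_norm.pvalue:.3f}")
    print(f"log-normal KS = {ks_lognorm.statistic:.3f}  p = {ks_lognorm.pvalue:.3f}")
    print(f"coefficient of variation = {strengths.std(ddof=1) / strengths.mean():.3f}")
    ```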

  8. Higher Order Cumulant Studies of Ocean Surface Random Fields from Satellite Altimeter Data

    NASA Technical Reports Server (NTRS)

    Cheng, B.

    1996-01-01

    Higher order statistics, especially 2nd order statistics, have been used to study ocean processes for many years and occupy an appreciable part of the research literature on physical oceanography. They in turn form part of a much larger field of study in statistical fluid mechanics.

  9. The Shock and Vibration Digest. Volume 13. Number 7

    DTIC Science & Technology

    1981-07-01

    Richards, ISVR, University of Southampton, Presidential Address "A Structural Dynamicist Looks at Statistical Energy Analysis," Professor B.L. ... excitation and for random and sine sweep mechanical excitation. Test data were used to assess prediction methods, in particular a statistical energy analysis method

  10. Generalized self-adjustment method for statistical mechanics of composite materials

    NASA Astrophysics Data System (ADS)

    Pan'kov, A. A.

    1997-03-01

    A new method is developed for the statistical mechanics of composite materials — the generalized self-adjustment method — which makes it possible to reduce the problem of predicting effective elastic properties of composites with random structures to the solution of two simpler "averaged" problems of an inclusion with transitional layers in a medium with the desired effective elastic properties. The inhomogeneous elastic properties and dimensions of the transitional layers take into account both the "approximate" order of mutual positioning and the variation in the dimensions and elastic properties of inclusions through appropriate special averaged indicator functions of the random structure of the composite. A numerical calculation of averaged indicator functions and effective elastic characteristics is performed by the generalized self-adjustment method for a unidirectional fiberglass on the basis of various models of actual random structures in the plane of isotropy.

  11. A novel complete-case analysis to determine statistical significance between treatments in an intention-to-treat population of randomized clinical trials involving missing data.

    PubMed

    Liu, Wei; Ding, Jinhui

    2018-04-01

    The application of the principle of the intention-to-treat (ITT) to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because there are complicated reactions in the human body to drugs due to the presence of complex biological networks, leading to data missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.

  12. Discrete-element modeling of nacre-like materials: Effects of random microstructures on strain localization and mechanical performance

    NASA Astrophysics Data System (ADS)

    Abid, Najmul; Mirkhalaf, Mohammad; Barthelat, Francois

    2018-03-01

    Natural materials such as nacre, collagen, and spider silk are composed of staggered stiff and strong inclusions in a softer matrix. This type of hybrid microstructure results in remarkable combinations of stiffness, strength, and toughness and it now inspires novel classes of high-performance composites. However, the analytical and numerical approaches used to predict and optimize the mechanics of staggered composites often neglect statistical variations and inhomogeneities, which may have significant impacts on modulus, strength, and toughness. Here we present an analysis of localization using small representative volume elements (RVEs) and large scale statistical volume elements (SVEs) based on the discrete element method (DEM). DEM is an efficient numerical method which enabled the evaluation of more than 10,000 microstructures in this study, each including about 5,000 inclusions. The models explore the combined effects of statistics, inclusion arrangement, and interface properties. We find that statistical variations have a negative effect on all properties, in particular on the ductility and energy absorption because randomness precipitates the localization of deformations. However, the results also show that the negative effects of random microstructures can be offset by interfaces with large strain at failure accompanied by strain hardening. More specifically, this quantitative study reveals an optimal range of interface properties where the interfaces are the most effective at delaying localization. These findings show how carefully designed interfaces in bioinspired staggered composites can offset the negative effects of microstructural randomness, which is inherent to most current fabrication methods.

  13. Randomized controlled trial comparing nasal intermittent positive pressure ventilation and nasal continuous positive airway pressure in premature infants after tracheal extubation.

    PubMed

    Komatsu, Daniela Franco Rizzo; Diniz, Edna Maria de Albuquerque; Ferraro, Alexandre Archanjo; Ceccon, Maria Esther Jurvest Rivero; Vaz, Flávio Adolfo Costa

    2016-09-01

    To analyze the frequency of extubation failure in premature infants using conventional mechanical ventilation (MV) after extubation in groups subjected to nasal intermittent positive pressure ventilation (nIPPV) and continuous positive airway pressure (nCPAP). Seventy-two premature infants with respiratory failure were studied, with a gestational age (GA) ≤ 36 weeks and birth weight (BW) > 750 g, who required tracheal intubation and mechanical ventilation. The study was controlled and randomized; randomization was performed at the time of extubation using sealed envelopes. Extubation failure was defined as the need for re-intubation and mechanical ventilation during the first 72 hours after extubation. Among the 36 premature infants randomized to nIPPV, six (16.6%) presented extubation failure in comparison to 11 (30.5%) of the 36 premature infants randomized to nCPAP. There was no statistical difference between the two study groups regarding BW, GA, classification of the premature infant, and MV time. The main cause of extubation failure was the occurrence of apnea. Gastrointestinal and neurological complications did not occur in the premature infants participating in the study. We found that, although the extubation failure rate in the premature infants randomized to nIPPV was numerically smaller than in those randomized to nCPAP, there was no statistically significant difference between the two modes of ventilatory support after extubation.

  14. Statistical mechanics of simple models of protein folding and design.

    PubMed Central

    Pande, V S; Grosberg, A Y; Tanaka, T

    1997-01-01

    It is now believed that the primary equilibrium aspects of simple models of protein folding are understood theoretically. However, current theories often resort to rather heavy mathematics to overcome some technical difficulties inherent in the problem or start from a phenomenological model. To this end, we take a new approach in this pedagogical review of the statistical mechanics of protein folding. The benefit of our approach is a drastic mathematical simplification of the theory, without resort to any new approximations or phenomenological prescriptions. Indeed, the results we obtain agree precisely with previous calculations. Because of this simplification, we are able to present here a thorough and self-contained treatment of the problem. Topics discussed include the statistical mechanics of the random energy model (REM), tests of the validity of REM as a model for heteropolymer freezing, freezing transition of random sequences, phase diagram of designed ("minimally frustrated") sequences, and the degree to which errors in the interactions employed in simulations of either folding or design can still lead to correct folding behavior. PMID:9414231
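
    A minimal numerical sketch of the random energy model (REM) freezing discussed above, under the standard convention that the 2^N configuration energies are i.i.d. Gaussian with variance N/2; it is a toy demonstration, not the review's analytical treatment.

    ```python
    import numpy as np

    # Toy random energy model: 2**N configurations with i.i.d. Gaussian energies of
    # variance N/2 (standard convention, J = 1). The participation ratio
    # Y2 = sum_i p_i**2 of the Boltzmann weights stays near zero in the high-T phase
    # and becomes O(1) below the freezing temperature T_c = 1 / (2 * sqrt(ln 2)).
    rng = np.random.default_rng(2)

    N = 18                                       # 2**18 = 262144 configurations
    energies = rng.normal(0.0, np.sqrt(N / 2.0), size=2**N)

    def participation_ratio(T):
        log_w = -energies / T                    # log Boltzmann weights
        log_w -= log_w.max()                     # log-sum-exp stabilization
        p = np.exp(log_w)
        p /= p.sum()
        return float(np.sum(p**2))

    T_c = 1.0 / (2.0 * np.sqrt(np.log(2.0)))
    for T in (2.0, 1.0, T_c, 0.4, 0.2):
        print(f"T = {T:.3f}   Y2 = {participation_ratio(T):.3f}")
    ```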

  15. Statistical mechanical analysis of linear programming relaxation for combinatorial optimization problems

    NASA Astrophysics Data System (ADS)

    Takabe, Satoshi; Hukushima, Koji

    2016-05-01

    Typical behavior of the linear programming (LP) problem is studied as a relaxation of the minimum vertex cover (min-VC), a type of integer programming (IP) problem. A lattice-gas model on the Erdös-Rényi random graphs of α-uniform hyperedges is proposed to express both the LP and IP problems of the min-VC in the common statistical mechanical model with a one-parameter family. Statistical mechanical analyses reveal for α=2 that the LP optimal solution is typically equal to that given by the IP below the critical average degree c=e in the thermodynamic limit. The critical threshold for good accuracy of the relaxation extends the mathematical result c=1 and coincides with the replica symmetry-breaking threshold of the IP. The LP relaxation for the minimum hitting sets with α≥3, minimum vertex covers on α-uniform random graphs, is also studied. Analytic and numerical results strongly suggest that the LP relaxation fails to estimate optimal values above the critical average degree c=e/(α-1) where the replica symmetry is broken.

  16. Statistical mechanical analysis of linear programming relaxation for combinatorial optimization problems.

    PubMed

    Takabe, Satoshi; Hukushima, Koji

    2016-05-01

    Typical behavior of the linear programming (LP) problem is studied as a relaxation of the minimum vertex cover (min-VC), a type of integer programming (IP) problem. A lattice-gas model on the Erdös-Rényi random graphs of α-uniform hyperedges is proposed to express both the LP and IP problems of the min-VC in the common statistical mechanical model with a one-parameter family. Statistical mechanical analyses reveal for α=2 that the LP optimal solution is typically equal to that given by the IP below the critical average degree c=e in the thermodynamic limit. The critical threshold for good accuracy of the relaxation extends the mathematical result c=1 and coincides with the replica symmetry-breaking threshold of the IP. The LP relaxation for the minimum hitting sets with α≥3, minimum vertex covers on α-uniform random graphs, is also studied. Analytic and numerical results strongly suggest that the LP relaxation fails to estimate optimal values above the critical average degree c=e/(α-1) where the replica symmetry is broken.
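
    The two records above describe the same study. As a toy counterpart (not the replica analysis itself), the sketch below solves the LP relaxation of minimum vertex cover on an Erdös-Rényi graph with scipy and reports how many variables end up at the half-integer value, a rough proxy for the breakdown of the relaxation discussed in the abstract; graph size and average degree are arbitrary.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # LP relaxation of minimum vertex cover on an Erdos-Renyi random graph:
    #   minimize sum_i x_i  s.t.  x_i + x_j >= 1 for every edge (i, j),  0 <= x_i <= 1.
    # At low average degree the optimal x is typically (half-)integral and close to
    # the integer optimum; half-integer variables proliferate as the degree grows.
    rng = np.random.default_rng(3)

    n, avg_deg = 200, 2.0                        # vertices and target average degree
    p = avg_deg / (n - 1)
    edges = [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < p]

    # One inequality row per edge, written as -x_i - x_j <= -1 for linprog.
    A = np.zeros((len(edges), n))
    for row, (i, j) in enumerate(edges):
        A[row, i] = A[row, j] = -1.0

    res = linprog(c=np.ones(n), A_ub=A, b_ub=-np.ones(len(edges)),
                  bounds=[(0.0, 1.0)] * n, method="highs")
    frac_half = np.mean(np.isclose(res.x, 0.5, atol=1e-6))
    print(f"LP optimum = {res.fun:.2f}, fraction of half-integer variables = {frac_half:.2f}")
    ```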

  17. Hidden Statistics of Schroedinger Equation

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2011-01-01

    Work was carried out in determination of the mathematical origin of randomness in quantum mechanics and creating a hidden statistics of the Schrödinger equation; i.e., to expose the transitional stochastic process as a "bridge" to the quantum world. The governing equations of hidden statistics would preserve such properties of quantum physics as superposition, entanglement, and direct-product decomposability while allowing one to measure its state variables using classical methods.

  18. Markov Random Fields, Stochastic Quantization and Image Analysis

    DTIC Science & Technology

    1990-01-01

    Markov random fields based on the lattice Z2 have been extensively used in image analysis in a Bayesian framework as a priori models for the ... of Image Analysis can be given some fundamental justification, then there is a remarkable connection between Probabilistic Image Analysis, Statistical Mechanics and Lattice-based Euclidean Quantum Field Theory.

  19. Quantifying networks complexity from information geometry viewpoint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felice, Domenico, E-mail: domenico.felice@unicam.it; Mancini, Stefano; INFN-Sezione di Perugia, Via A. Pascoli, I-06123 Perugia

    We consider a Gaussian statistical model whose parameter space is given by the variances of random variables. Underlying this model we identify networks by interpreting random variables as sitting on vertices and their correlations as weighted edges among vertices. We then associate to the parameter space a statistical manifold endowed with a Riemannian metric structure (that of Fisher-Rao). Going on, in analogy with the microcanonical definition of entropy in Statistical Mechanics, we introduce an entropic measure of networks complexity. We prove that it is invariant under networks isomorphism. Above all, considering networks as simplicial complexes, we evaluate this entropy on simplexes and find that it monotonically increases with their dimension.

  20. Statistical analysis of loopy belief propagation in random fields

    NASA Astrophysics Data System (ADS)

    Yasuda, Muneki; Kataoka, Shun; Tanaka, Kazuyuki

    2015-10-01

    Loopy belief propagation (LBP), which is equivalent to the Bethe approximation in statistical mechanics, is a message-passing-type inference method that is widely used to analyze systems based on Markov random fields (MRFs). In this paper, we propose a message-passing-type method to analytically evaluate the quenched average of LBP in random fields by using the replica cluster variation method. The proposed analytical method is applicable to general pairwise MRFs with random fields whose distributions differ from each other and can give the quenched averages of the Bethe free energies over random fields, which are consistent with numerical results. The order of its computational cost is equivalent to that of standard LBP. In the latter part of this paper, we describe the application of the proposed method to Bayesian image restoration, in which we observed that our theoretical results are in good agreement with the numerical results for natural images.
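
    For readers unfamiliar with loopy belief propagation, the following self-contained sketch runs plain sum-product LBP on a small binary pairwise MRF with random fields and couplings and checks the resulting beliefs against exact marginals; it illustrates standard LBP only, not the replica cluster variation analysis proposed in the paper, and the grid size, coupling strengths and iteration count are arbitrary.

    ```python
    import itertools
    import numpy as np

    # Plain sum-product loopy belief propagation on a 3x3 binary pairwise MRF with
    # random fields and couplings; beliefs are checked against exact marginals
    # obtained by brute-force enumeration (2**9 configurations).
    rng = np.random.default_rng(4)

    rows, cols = 3, 3
    nodes = [(r, c) for r in range(rows) for c in range(cols)]
    edges = [((r, c), (r, c + 1)) for r in range(rows) for c in range(cols - 1)] + \
            [((r, c), (r + 1, c)) for r in range(rows - 1) for c in range(cols)]

    h = {v: rng.normal(0.0, 0.5) for v in nodes}     # random local fields
    J = {e: rng.normal(0.0, 0.5) for e in edges}     # random pairwise couplings
    spin = np.array([-1.0, 1.0])

    def unary(v):                                    # psi_v(x) = exp(h_v * x)
        return np.exp(h[v] * spin)

    def pairwise(e):                                 # psi_e(x, y) = exp(J_e * x * y)
        return np.exp(J[e] * np.outer(spin, spin))

    # m[(u, v)] is the message from node u to node v, a length-2 vector over x_v.
    m = {(u, v): np.ones(2) for (a, b) in edges for (u, v) in ((a, b), (b, a))}
    for _ in range(200):                             # synchronous message updates
        new = {}
        for (u, v) in m:
            incoming = unary(u).copy()
            for (w, t) in m:                         # messages into u, except from v
                if t == u and w != v:
                    incoming = incoming * m[(w, u)]
            psi = pairwise((u, v)) if (u, v) in J else pairwise((v, u)).T
            msg = psi.T @ incoming                   # marginalize over x_u
            new[(u, v)] = msg / msg.sum()
        m = new

    def belief(v):                                   # normalized field times incoming messages
        b = unary(v).copy()
        for (w, t) in m:
            if t == v:
                b = b * m[(w, v)]
        return b / b.sum()

    def exact_marginal(v):                           # brute-force marginal of node v
        probs = np.zeros(2)
        for cfg in itertools.product([0, 1], repeat=len(nodes)):
            x = dict(zip(nodes, cfg))
            logp = sum(h[u] * spin[x[u]] for u in nodes) + \
                   sum(J[(a, b)] * spin[x[a]] * spin[x[b]] for (a, b) in edges)
            probs[x[v]] += np.exp(logp)
        return probs / probs.sum()

    centre = (1, 1)
    print("LBP belief    :", np.round(belief(centre), 3))
    print("exact marginal:", np.round(exact_marginal(centre), 3))
    ```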

  1. Increase in the Random Dopant Induced Threshold Fluctuations and Lowering in Sub 100 nm MOSFETs Due to Quantum Effects: A 3-D Density-Gradient Simulation Study

    NASA Technical Reports Server (NTRS)

    Asenov, Asen; Slavcheva, G.; Brown, A. R.; Davies, J. H.; Saini, S.

    2000-01-01

    In this paper we present a detailed simulation study of the influence of quantum mechanical effects in the inversion layer on random dopant induced threshold voltage fluctuations and lowering in sub 100 nm MOSFETs. The simulations have been performed using a 3-D implementation of the density gradient (DG) formalism incorporated in our established 3-D atomistic simulation approach. This results in a self-consistent 3-D quantum mechanical picture, which implies not only the vertical inversion layer quantisation but also the lateral confinement effects related to current filamentation in the 'valleys' of the random potential fluctuations. We have shown that the net result of including quantum mechanical effects, while considering statistical dopant fluctuations, is an increase in both threshold voltage fluctuations and lowering. At the same time, the random dopant induced threshold voltage lowering partially compensates for the quantum mechanical threshold voltage shift in aggressively scaled MOSFETs with ultrathin gate oxides.

  2. Jamming II: Edwards’ statistical mechanics of random packings of hard spheres

    NASA Astrophysics Data System (ADS)

    Wang, Ping; Song, Chaoming; Jin, Yuliang; Makse, Hernán A.

    2011-02-01

    The problem of finding the most efficient way to pack spheres has an illustrious history, dating back to the crystalline arrays conjectured by Kepler and the random geometries explored by Bernal in the 1960s. This problem finds applications spanning from the mathematician’s pencil, the processing of granular materials, the jamming and glass transitions, all the way to fruit packing in every grocery. There are presently numerous experiments showing that the loosest way to pack spheres gives a density of ∼55% (named random loose packing, RLP) while filling all the loose voids results in a maximum density of ∼63%-64% (named random close packing, RCP). While those values seem robustly true, to this date there is no well-accepted physical explanation or theoretical prediction for them. Here we develop a common framework for understanding the random packings of monodisperse hard spheres whose limits can be interpreted as the experimentally observed RLP and RCP. The reason for these limits arises from a statistical picture of jammed states in which the RCP can be interpreted as the ground state of the ensemble of jammed matter with zero compactivity, while the RLP arises in the infinite compactivity limit. We combine an extended statistical mechanics approach ‘à la Edwards’ (where the role traditionally played by the energy and temperature in thermal systems is substituted by the volume and compactivity) with a constraint on mechanical stability imposed by the isostatic condition. We show how such approaches can bring results that can be compared to experiments and allow for an exploitation of the statistical mechanics framework. The key result is the use of a relation between the local Voronoi volumes of the constituent grains (denoted the volume function) and the number of neighbors in contact that permits us to simply combine the two approaches to develop a theory of volume fluctuations in jammed matter. Ultimately, our results lead to a phase diagram that provides a unifying view of the disordered hard sphere packing problem and further sheds light on a diverse spectrum of data, including the RLP state. Theoretical results are well reproduced by numerical simulations that confirm the essential role played by friction in determining both the RLP and RCP limits. The RLP values depend on friction, explaining why varied experimental results can be obtained.

  3. Quantum Mechanical Enhancement of the Random Dopant Induced Threshold Voltage Fluctuations and Lowering in Sub 0.1 Micron MOSFETs

    NASA Technical Reports Server (NTRS)

    Asenov, Asen; Slavcheva, G.; Brown, A. R.; Davies, J. H.; Saini, Subhash

    1999-01-01

    A detailed study of the influence of quantum effects in the inversion layer on the random dopant induced threshold voltage fluctuations and lowering in sub 0.1 micron MOSFETs has been performed. This has been achieved using a full 3D implementation of the density gradient (DG) formalism incorporated in our previously published 3D 'atomistic' simulation approach. This results in a consistent, fully 3D, quantum mechanical picture which implies not only the vertical inversion layer quantisation but also the lateral confinement effects manifested by current filamentation in the 'valleys' of the random potential fluctuations. We have shown that the net result of including quantum mechanical effects, while considering statistical fluctuations, is an increase in both threshold voltage fluctuations and lowering.

  4. Random bursts determine dynamics of active filaments.

    PubMed

    Weber, Christoph A; Suzuki, Ryo; Schaller, Volker; Aranson, Igor S; Bausch, Andreas R; Frey, Erwin

    2015-08-25

    Constituents of living or synthetic active matter have access to a local energy supply that serves to keep the system out of thermal equilibrium. The statistical properties of such fluctuating active systems differ from those of their equilibrium counterparts. Using the actin filament gliding assay as a model, we studied how nonthermal distributions emerge in active matter. We found that the basic mechanism involves the interplay between local and random injection of energy, acting as an analog of a thermal heat bath, and nonequilibrium energy dissipation processes associated with sudden jump-like changes in the system's dynamic variables. We show here how such a mechanism leads to a nonthermal distribution of filament curvatures with a non-Gaussian shape. The experimental curvature statistics and filament relaxation dynamics are reproduced quantitatively by stochastic computer simulations and a simple kinetic model.

  5. Random bursts determine dynamics of active filaments

    PubMed Central

    Weber, Christoph A.; Suzuki, Ryo; Schaller, Volker; Aranson, Igor S.; Bausch, Andreas R.; Frey, Erwin

    2015-01-01

    Constituents of living or synthetic active matter have access to a local energy supply that serves to keep the system out of thermal equilibrium. The statistical properties of such fluctuating active systems differ from those of their equilibrium counterparts. Using the actin filament gliding assay as a model, we studied how nonthermal distributions emerge in active matter. We found that the basic mechanism involves the interplay between local and random injection of energy, acting as an analog of a thermal heat bath, and nonequilibrium energy dissipation processes associated with sudden jump-like changes in the system’s dynamic variables. We show here how such a mechanism leads to a nonthermal distribution of filament curvatures with a non-Gaussian shape. The experimental curvature statistics and filament relaxation dynamics are reproduced quantitatively by stochastic computer simulations and a simple kinetic model. PMID:26261319

  6. On Testability of Missing Data Mechanisms in Incomplete Data Sets

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2011-01-01

    This article is concerned with the question of whether the missing data mechanism routinely referred to as missing completely at random (MCAR) is statistically examinable via a test for lack of distributional differences between groups with observed and missing data, and related consequences. A discussion is initially provided, from a formal logic…

  7. Random mechanics: Nonlinear vibrations, turbulences, seisms, swells, fatigue

    NASA Astrophysics Data System (ADS)

    Kree, P.; Soize, C.

    The random modeling of physical phenomena, together with probabilistic methods for the numerical calculation of random mechanical forces, are analytically explored. Attention is given to theoretical examinations such as probabilistic concepts, linear filtering techniques, and trajectory statistics. Applications of the methods to structures experiencing atmospheric turbulence, the quantification of turbulence, and the dynamic responses of the structures are considered. A probabilistic approach is taken to study the effects of earthquakes on structures and to the forces exerted by ocean waves on marine structures. Theoretical analyses by means of vector spaces and stochastic modeling are reviewed, as are Markovian formulations of Gaussian processes and the definition of stochastic differential equations. Finally, random vibrations with a variable number of links and linear oscillators undergoing the square of Gaussian processes are investigated.

  8. Quantifying economic fluctuations by adapting methods of statistical physics

    NASA Astrophysics Data System (ADS)

    Plerou, Vasiliki

    2001-09-01

    The first focus of this thesis is the investigation of cross-correlations between the price fluctuations of different stocks using the conceptual framework of random matrix theory (RMT), developed in physics to describe the statistical properties of energy-level spectra of complex nuclei. RMT makes predictions for the statistical properties of matrices that are universal, i.e., do not depend on the interactions between the elements comprising the system. In physical systems, deviations from the predictions of RMT provide clues regarding the mechanisms controlling the dynamics of a given system, so this framework is of potential value if applied to economic systems. This thesis compares the statistics of the cross-correlation matrix C, whose elements C_ij are the correlation coefficients of price fluctuations of stocks i and j, against the "null hypothesis" of a random matrix having the same symmetry properties. It is shown that comparison of the eigenvalue statistics of C with RMT results can be used to distinguish random and non-random parts of C. The non-random part of C, which deviates from RMT results, provides information regarding genuine cross-correlations between stocks. The interpretations and potential practical utility of these deviations are also investigated. The second focus is the characterization of the dynamics of stock price fluctuations. The statistical properties of the changes G_Δt in price over a time interval Δt are quantified, and the statistical relation between G_Δt and the trading activity, measured by the number of transactions N_Δt in the interval Δt, is investigated. The statistical properties of the volatility, i.e., the time-dependent standard deviation of price fluctuations, are related to two microscopic quantities: N_Δt and the variance W²_Δt of the price changes for all transactions in the interval Δt. In addition, the statistical relationship between G_Δt and the number of shares Q_Δt traded in Δt is investigated.
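
    The RMT "null hypothesis" comparison described above can be illustrated with synthetic data (not actual stock returns): for uncorrelated return series, the eigenvalues of the empirical correlation matrix should fall inside the Marchenko-Pastur band, so eigenvalues outside it would signal genuine cross-correlations. A minimal numpy sketch, with arbitrary dimensions:

    ```python
    import numpy as np

    # Synthetic illustration of the RMT null hypothesis: for N uncorrelated return
    # series of length T, the eigenvalues of the empirical correlation matrix fall
    # (for large N, T) inside the Marchenko-Pastur band with Q = T / N:
    #   lambda_pm = 1 + 1/Q +/- 2*sqrt(1/Q).
    # Eigenvalues of real stock-return correlations found outside this band form the
    # "non-random part" carrying genuine cross-correlations.
    rng = np.random.default_rng(5)

    N, T = 200, 1000                              # number of "stocks" and time points
    Q = T / N
    returns = rng.normal(size=(N, T))             # i.i.d. surrogate returns
    returns = (returns - returns.mean(axis=1, keepdims=True)) / returns.std(axis=1, keepdims=True)

    C = returns @ returns.T / T                   # empirical correlation matrix
    eigvals = np.linalg.eigvalsh(C)

    lam_minus = 1 + 1 / Q - 2 * np.sqrt(1 / Q)
    lam_plus = 1 + 1 / Q + 2 * np.sqrt(1 / Q)
    outside = int(np.sum((eigvals < lam_minus) | (eigvals > lam_plus)))
    print(f"Marchenko-Pastur band: [{lam_minus:.3f}, {lam_plus:.3f}]")
    print(f"largest eigenvalue = {eigvals.max():.3f}, eigenvalues outside band = {outside}")
    ```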

  9. Analysis of Longitudinal Outcome Data with Missing Values in Total Knee Arthroplasty.

    PubMed

    Kang, Yeon Gwi; Lee, Jang Taek; Kang, Jong Yeal; Kim, Ga Hye; Kim, Tae Kyun

    2016-01-01

    We sought to determine the influence of missing data on the statistical results, and to determine which statistical method is most appropriate for the analysis of longitudinal outcome data of TKA with missing values among repeated measures ANOVA, generalized estimating equation (GEE) and mixed effects model repeated measures (MMRM). Data sets with missing values were generated with different proportion of missing data, sample size and missing-data generation mechanism. Each data set was analyzed with three statistical methods. The influence of missing data was greater with higher proportion of missing data and smaller sample size. MMRM tended to show least changes in the statistics. When missing values were generated by 'missing not at random' mechanism, no statistical methods could fully avoid deviations in the results.

  10. Measuring Circulation Desk Activities Using a Random Alarm Mechanism.

    ERIC Educational Resources Information Center

    Mosborg, Stella Frank

    1980-01-01

    Reports a job analysis methodology to gather meaningful data related to circulation desk activity. The technique is designed to give librarians statistical data on actual time expenditures for complex and varying activities. (Author/RAA)

  11. Non-equilibrium statistical mechanics theory for the large scales of geophysical flows

    NASA Astrophysics Data System (ADS)

    Eric, S.; Bouchet, F.

    2010-12-01

    The aim of any theory of turbulence is to understand the statistical properties of the velocity field. As a huge number of degrees of freedom is involved, statistical mechanics is a natural approach. The self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. We discuss classical and recent works on this subject, from the statistical mechanics basis of the theory up to applications to Jupiter’s troposphere and ocean vortices and jets. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations, mean field approach) and thermodynamic concepts (ensemble inequivalence, negative heat capacity) are briefly explained and used to predict statistical equilibria for turbulent flows. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations will be discussed. We also present recent results for non-equilibrium situations, for which forces and dissipation are in a statistical balance. As an example, the concept of phase transition allows us to describe drastic changes of the whole system when a few external parameters are changed. References: F. Bouchet and E. Simonnet, Random Changes of Flow Topology in Two-Dimensional and Geophysical Turbulence, Physical Review Letters 102 (2009), no. 9, 094504; F. Bouchet and J. Sommeria, Emergence of intense jets and Jupiter's Great Red Spot as maximum-entropy structures, Journal of Fluid Mechanics 464 (2002), 165-207; A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO; F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, submitted to Physics Reports. Figure caption: non-equilibrium phase transitions for the 2D Navier-Stokes equations with stochastic forces (time series and probability density functions (PDFs) of the modulus of the largest-scale Fourier component, showing bistability between dipole and unidirectional flows); this bistability is predicted by statistical mechanics.

  12. Frequency-dependent scaling from mesoscale to macroscale in viscoelastic random composites

    PubMed Central

    Zhang, Jun

    2016-01-01

    This paper investigates the scaling from a statistical volume element (SVE; i.e. mesoscale level) to representative volume element (RVE; i.e. macroscale level) of spatially random linear viscoelastic materials, focusing on the quasi-static properties in the frequency domain. Requiring the material statistics to be spatially homogeneous and ergodic, the mesoscale bounds on the RVE response are developed from the Hill–Mandel homogenization condition adapted to viscoelastic materials. The bounds are obtained from two stochastic initial-boundary value problems set up, respectively, under uniform kinematic and traction boundary conditions. The frequency and scale dependencies of mesoscale bounds are obtained through computational mechanics for composites with planar random chessboard microstructures. In general, the frequency-dependent scaling to RVE can be described through a complex-valued scaling function, which generalizes the concept originally developed for linear elastic random composites. This scaling function is shown to apply for all different phase combinations on random chessboards and, essentially, is only a function of the microstructure and mesoscale. PMID:27274689

  13. Statistical mechanics of a single particle in a multiscale random potential: Parisi landscapes in finite-dimensional Euclidean spaces

    NASA Astrophysics Data System (ADS)

    Fyodorov, Yan V.; Bouchaud, Jean-Philippe

    2008-08-01

    We construct an N-dimensional Gaussian landscape with multiscale, translation invariant, logarithmic correlations and investigate the statistical mechanics of a single particle in this environment. In the limit of high dimension N → ∞ the free energy of the system and overlap function are calculated exactly using the replica trick and Parisi's hierarchical ansatz. In the thermodynamic limit, we recover the most general version of Derrida's generalized random energy model (GREM). The low-temperature behaviour depends essentially on the spectrum of length scales involved in the construction of the landscape. If the latter consists of K discrete values, the system is characterized by a K-step replica symmetry breaking solution. We argue that our construction is in fact valid in any finite spatial dimension N ≥ 1. We discuss the implications of our results for the singularity spectrum describing multifractality of the associated Boltzmann-Gibbs measure. Finally we discuss several generalizations and open problems, such as the dynamics in such a landscape and the construction of a generalized multifractal random walk.

  14. Theory-based Bayesian Models of Inductive Inference

    DTIC Science & Technology

    2010-07-19

    Subjective randomness and natural scene statistics. Psychonomic Bulletin & Review. http://cocosci.berkeley.edu/tom/papers/randscenes.pdf ... (in press). Exemplar models as a mechanism for performing Bayesian inference. Psychonomic Bulletin & Review. http://cocosci.berkeley.edu/tom

  15. Unifying Complexity and Information

    NASA Astrophysics Data System (ADS)

    Ke, Da-Guan

    2013-04-01

    Complex systems, arising in many contexts in the computer, life, social, and physical sciences, have not shared a generally-accepted complexity measure playing a role as fundamental as that of the Shannon entropy H in statistical mechanics. Superficially-conflicting criteria of complexity measurement, i.e. complexity-randomness (C-R) relations, have given rise to a special measure intrinsically adaptable to more than one criterion. However, deep causes of the conflict and the adaptability are not yet clear. Here I trace the root of each representative or adaptable measure to its particular universal data-generating or -regenerating model (UDGM or UDRM). A representative measure for deterministic dynamical systems is found as a counterpart of H for random processes, clearly redefining the boundary of different criteria. And a specific UDRM achieving the intrinsic adaptability enables a general information measure that ultimately solves all major disputes. This work encourages a single framework covering deterministic systems, statistical mechanics and real-world living organisms.

  16. On real statistics of relaxation in gases

    NASA Astrophysics Data System (ADS)

    Kuzovlev, Yu. E.

    2016-02-01

    By example of a particle interacting with ideal gas, it is shown that the statistics of collisions in statistical mechanics at any value of the gas rarefaction parameter qualitatively differ from that conjugated with Boltzmann's hypothetical molecular chaos and kinetic equation. In reality, the probability of collisions of the particle is itself random. Because of that, the relaxation of particle velocity acquires a power-law asymptotic behavior. An estimate of its exponent is suggested on the basis of simple kinematic reasons.

  17. Three-Dimensional Color Code Thresholds via Statistical-Mechanical Mapping

    NASA Astrophysics Data System (ADS)

    Kubica, Aleksander; Beverland, Michael E.; Brandão, Fernando; Preskill, John; Svore, Krysta M.

    2018-05-01

    Three-dimensional (3D) color codes have advantages for fault-tolerant quantum computing, such as protected quantum gates with relatively low overhead and robustness against imperfect measurement of error syndromes. Here we investigate the storage threshold error rates for bit-flip and phase-flip noise in the 3D color code (3DCC) on the body-centered cubic lattice, assuming perfect syndrome measurements. In particular, by exploiting a connection between error correction and statistical mechanics, we estimate the threshold for 1D stringlike and 2D sheetlike logical operators to be p_{3DCC}^{(1)}≃1.9% and p_{3DCC}^{(2)}≃27.6%. We obtain these results by using parallel tempering Monte Carlo simulations to study the disorder-temperature phase diagrams of two new 3D statistical-mechanical models: the four- and six-body random coupling Ising models.

  18. Exploiting Data Missingness in Bayesian Network Modeling

    NASA Astrophysics Data System (ADS)

    Rodrigues de Morais, Sérgio; Aussem, Alex

    This paper proposes a framework built on the use of Bayesian networks (BN) for representing statistical dependencies between the existing random variables and additional dummy Boolean variables, which represent the presence/absence of the respective random variable value. We show how augmenting the BN with these additional variables helps pinpoint the mechanism through which missing data contributes to the classification task. The missing data mechanism is thus explicitly taken into account to predict the class variable using the data at hand. Extensive experiments on synthetic and real-world incomplete data sets reveal that the missingness information improves classification accuracy.

  19. [Controlled study of oral administration of antibiotics in the preparation of digestive surgery (author's transl)].

    PubMed

    Mendes da Costa, P; Klastersky, J; Gérard, A

    1977-01-01

    Between November 30, 1971 and March 15, 1976, 46 patients underwent surgery on the colon or rectum. They were randomized into 2 groups, one receiving a mechanical preparation together with lincomycin, neomycin, polymyxin, kanamycin, bacitracin and nystatin, the other a mechanical preparation alone. Analysis of the results reveals no statistically significant difference in the frequency of infections, either local (11/24 with antibiotics vs 13/22 without; chi2 = 0.25) or general (16/24 and 9/22; chi2 = 0.92). Nor was the postoperative use of antibiotics for local or general infection different in the 2 groups. No influence of age or preoperative radiotherapy could be shown. This randomized trial suggests that there is little advantage in associating antibiotics with mechanical preparation before colorectal surgery. The authors contemplate a new randomized trial in high-risk patients suffering from cancer.

  20. Anomalous Diffusion of Single Particles in Cytoplasm

    PubMed Central

    Regner, Benjamin M.; Vučinić, Dejan; Domnisoru, Cristina; Bartol, Thomas M.; Hetzer, Martin W.; Tartakovsky, Daniel M.; Sejnowski, Terrence J.

    2013-01-01

    The crowded intracellular environment poses a formidable challenge to experimental and theoretical analyses of intracellular transport mechanisms. Our measurements of single-particle trajectories in cytoplasm and their random-walk interpretations elucidate two of these mechanisms: molecular diffusion in crowded environments and cytoskeletal transport along microtubules. We employed acousto-optic deflector microscopy to map out the three-dimensional trajectories of microspheres migrating in the cytosolic fraction of a cellular extract. Classical Brownian motion (BM), continuous time random walk, and fractional BM were alternatively used to represent these trajectories. The comparison of the experimental and numerical data demonstrates that cytoskeletal transport along microtubules and diffusion in the cytosolic fraction exhibit anomalous (non-Fickian) behavior and possess statistically distinct signatures. Among the three random-walk models used, continuous time random walk provides the best representation of diffusion, whereas microtubular transport is accurately modeled with fractional BM. PMID:23601312

  1. eLearning course may shorten the duration of mechanical restraint among psychiatric inpatients: a cluster-randomized trial.

    PubMed

    Kontio, Raija; Pitkänen, Anneli; Joffe, Grigori; Katajisto, Jouko; Välimäki, Maritta

    2014-10-01

    The management of psychiatric inpatients exhibiting severely disturbed and aggressive behaviour is an important educational topic. Well structured, IT-based educational programmes (eLearning) often ensure quality and may make training more affordable and accessible. The aim of this study was to explore the impact of an eLearning course for personnel on the rates and duration of seclusion and mechanical restraint among psychiatric inpatients. In a cluster-randomized intervention trial, the nursing personnel on 10 wards were randomly assigned to eLearning (intervention) or training-as-usual (control) groups. The eLearning course comprised six modules with specific topics (legal and ethical issues, behaviour-related factors, therapeutic relationship and self-awareness, teamwork and integrating knowledge with practice) and specific learning methods. The rates (incidents per 1000 occupied bed days) and durations of the coercion incidents were examined before and after the course. A total of 1283 coercion incidents (1143 seclusions [89%] and 140 incidents involving the use of mechanical restraints [11%]) were recorded on the study wards during the data collection period. On the intervention wards, there were no statistically significant changes in the rates of seclusion and mechanical restraint. However, the duration of incidents involving mechanical restraints shortened from 36.0 to 4.0 h (median) (P < 0.001). No statistically significant changes occurred on the control wards. After our eLearning course, the duration of incidents involving the use of mechanical restraints decreased. However, more studies are needed to ensure that the content of the course focuses on the most important factors associated with the seclusion-related elements. The eLearning course deserves further development and further studies. The duration of coercion incidents merits attention in future research.

  2. Three-Dimensional Color Code Thresholds via Statistical-Mechanical Mapping.

    PubMed

    Kubica, Aleksander; Beverland, Michael E; Brandão, Fernando; Preskill, John; Svore, Krysta M

    2018-05-04

    Three-dimensional (3D) color codes have advantages for fault-tolerant quantum computing, such as protected quantum gates with relatively low overhead and robustness against imperfect measurement of error syndromes. Here we investigate the storage threshold error rates for bit-flip and phase-flip noise in the 3D color code (3DCC) on the body-centered cubic lattice, assuming perfect syndrome measurements. In particular, by exploiting a connection between error correction and statistical mechanics, we estimate the threshold for 1D stringlike and 2D sheetlike logical operators to be p_{3DCC}^{(1)}≃1.9% and p_{3DCC}^{(2)}≃27.6%. We obtain these results by using parallel tempering Monte Carlo simulations to study the disorder-temperature phase diagrams of two new 3D statistical-mechanical models: the four- and six-body random coupling Ising models.

  3. Quantitative Skills, Critical Thinking, and Writing Mechanics in Blended versus Face-to-Face Versions of a Research Methods and Statistics Course

    ERIC Educational Resources Information Center

    Goode, Christopher T.; Lamoreaux, Marika; Atchison, Kristin J.; Jeffress, Elizabeth C.; Lynch, Heather L.; Sheehan, Elizabeth

    2018-01-01

    Hybrid or blended learning (BL) has been shown to be equivalent to or better than face-to-face (FTF) instruction in a broad variety of contexts. We randomly assigned students to either 50/50 BL or 100% FTF versions of a research methods and statistics in psychology course. Students who took the BL version of the course scored significantly lower…

  4. Bayesian Sensitivity Analysis of Statistical Models with Missing Data

    PubMed Central

    ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG

    2013-01-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718

  5. Statistical Modeling of Robotic Random Walks on Different Terrain

    NASA Astrophysics Data System (ADS)

    Naylor, Austin; Kinnaman, Laura

    Issues of public safety, especially with crowd dynamics and pedestrian movement, have been modeled by physicists using methods from statistical mechanics over the last few years. Complex decision making of humans moving on different terrains can be modeled using random walks (RW) and correlated random walks (CRW). The effect of different terrains, such as a constantly increasing slope, on RW and CRW was explored. LEGO robots were programmed to make RW and CRW with uniform step sizes. Level ground tests demonstrated that the robots had the expected step size distribution and correlation angles (for CRW). The mean square displacement was calculated for each RW and CRW on different terrains and matched expected trends. The step size distribution was determined to change based on the terrain; theoretical predictions for the step size distribution were made for various simple terrains.
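
    A minimal simulation in the spirit of the level-ground tests described above (synthetic walkers, not the LEGO-robot data): it generates random walks and correlated random walks with uniform step sizes and compares their mean square displacement; the turning-angle spread for the CRW is an arbitrary illustrative value.

    ```python
    import numpy as np

    # Synthetic 2-D walkers with uniform step sizes: a random walk (RW, heading drawn
    # uniformly at every step) and a correlated random walk (CRW, small Gaussian
    # turning angles, so the heading persists). The CRW turning-angle spread below is
    # an arbitrary illustrative value, not a fitted robot parameter.
    rng = np.random.default_rng(6)

    n_walks, n_steps, step = 500, 200, 1.0
    turn_std = 0.3                                 # CRW turning-angle std (radians)

    def squared_displacement(correlated):
        if correlated:
            turns = rng.normal(0.0, turn_std, size=(n_walks, n_steps))
            headings = np.cumsum(turns, axis=1)    # persistent heading
        else:
            headings = rng.uniform(0.0, 2.0 * np.pi, size=(n_walks, n_steps))
        x = np.cumsum(step * np.cos(headings), axis=1)
        y = np.cumsum(step * np.sin(headings), axis=1)
        return x**2 + y**2

    for label, correlated in (("RW ", False), ("CRW", True)):
        msd = squared_displacement(correlated).mean(axis=0)   # mean square displacement
        print(f"{label} MSD after 10 / 50 / 200 steps: "
              f"{msd[9]:.0f} / {msd[49]:.0f} / {msd[-1]:.0f}")
    ```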

  6. The impact of loss to follow-up on hypothesis tests of the treatment effect for several statistical methods in substance abuse clinical trials.

    PubMed

    Hedden, Sarra L; Woolson, Robert F; Carter, Rickey E; Palesch, Yuko; Upadhyaya, Himanshu P; Malcolm, Robert J

    2009-07-01

    "Loss to follow-up" can be substantial in substance abuse clinical trials. When extensive losses to follow-up occur, one must cautiously analyze and interpret the findings of a research study. Aims of this project were to introduce the types of missing data mechanisms and describe several methods for analyzing data with loss to follow-up. Furthermore, a simulation study compared Type I error and power of several methods when missing data amount and mechanism varies. Methods compared were the following: Last observation carried forward (LOCF), multiple imputation (MI), modified stratified summary statistics (SSS), and mixed effects models. Results demonstrated nominal Type I error for all methods; power was high for all methods except LOCF. Mixed effect model, modified SSS, and MI are generally recommended for use; however, many methods require that the data are missing at random or missing completely at random (i.e., "ignorable"). If the missing data are presumed to be nonignorable, a sensitivity analysis is recommended.

  7. Quantum correlations and dynamics from classical random fields valued in complex Hilbert spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khrennikov, Andrei

    2010-08-15

    One of the crucial differences between mathematical models of classical and quantum mechanics (QM) is the use of the tensor product of the state spaces of subsystems as the state space of the corresponding composite system. (To describe an ensemble of classical composite systems, one uses random variables taking values in the Cartesian product of the state spaces of subsystems.) We show that, nevertheless, it is possible to establish a natural correspondence between the classical and the quantum probabilistic descriptions of composite systems. Quantum averages for composite systems (including entangled) can be represented as averages with respect to classical random fields. It is essentially what Albert Einstein dreamed of. QM is represented as classical statistical mechanics with infinite-dimensional phase space. While the mathematical construction is completely rigorous, its physical interpretation is a complicated problem. We present the basic physical interpretation of prequantum classical statistical field theory in Sec. II. However, this is only the first step toward real physical theory.

  8. Statistics of contractive cracking patterns. [frozen soil-water rheology]

    NASA Technical Reports Server (NTRS)

    Noever, David A.

    1991-01-01

    The statistics of convective soil patterns are analyzed using statistical crystallography. An underlying hierarchy of order is found to span four orders of magnitude in characteristic pattern length. Strict mathematical requirements determine the two-dimensional (2D) topology, such that random partitioning of space yields a predictable statistical geometry for polygons. For all lengths, Aboav's and Lewis's laws are verified; this result is consistent both with the need to fill 2D space and most significantly with energy carried not by the patterns' interior, but by the boundaries. Together, this suggests a common mechanism of formation for both micro- and macro-freezing patterns.
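    For readers who want to experiment with this kind of statistical crystallography, the sketch below builds a random (Poisson-Voronoi) partition of the plane and checks Lewis's law, i.e. that the mean cell area grows roughly linearly with the number of sides. This is a generic illustration with assumed parameters, not a reconstruction of the soil-pattern data.

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(4)
vor = Voronoi(rng.random((4000, 2)))        # random points on the unit square

def polygon_area(v):
    """Shoelace formula for a polygon given by its vertices."""
    x, y = v[:, 0], v[:, 1]
    return 0.5 * np.abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

sides, areas = [], []
for region_idx in vor.point_region:
    region = vor.regions[region_idx]
    if len(region) == 0 or -1 in region:
        continue                            # skip unbounded cells
    verts = vor.vertices[region]
    if verts.min() < 0.05 or verts.max() > 0.95:
        continue                            # crude filter against boundary effects
    sides.append(len(region))
    areas.append(polygon_area(verts))

sides, areas = np.array(sides), np.array(areas)
# Lewis's law: the mean area of n-sided cells is approximately linear in n.
for n in range(4, 9):
    if np.any(sides == n):
        print(n, round(areas[sides == n].mean(), 5))
```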

  9. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.

    1988-01-01

    The fusion of probability finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show the PFEM is a very powerful tool in determining the second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties and crack length, orientation, and location.
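    The second-moment reliability machinery referred to here is analytical, but the underlying question (probability of fracture given randomness in load, toughness, and crack length) is easy to illustrate with a plain Monte Carlo sketch. The distributions, the geometry factor, and the limit state below are assumptions chosen only for illustration, not the PFEM formulation itself.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Illustrative random inputs (all distributions and parameters are assumptions).
stress = rng.normal(150.0, 20.0, n)          # applied stress, MPa
a = rng.lognormal(np.log(0.02), 0.3, n)      # crack length, m
K_Ic = rng.normal(60.0, 5.0, n)              # fracture toughness, MPa*sqrt(m)
Y = 1.12                                     # assumed geometry factor (edge crack)

# Mode-I stress intensity factor and limit state g = K_Ic - K (failure when g < 0).
K = Y * stress * np.sqrt(np.pi * a)
g = K_Ic - K

pf = np.mean(g < 0.0)
beta = np.mean(g) / np.std(g)   # crude mean-value second-moment reliability index
print(f"P(fracture) ~ {pf:.4f}, reliability index beta ~ {beta:.2f}")
```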

  10. How multiplicity determines entropy and the derivation of the maximum entropy principle for complex systems.

    PubMed

    Hanel, Rudolf; Thurner, Stefan; Gell-Mann, Murray

    2014-05-13

    The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there has been an ongoing controversy over whether the notion of the maximum entropy principle can be extended in a meaningful way to nonextensive, nonergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for nonergodic and complex statistical systems if their relative entropy can be factored into a generalized multiplicity and a constraint term. The problem of finding such a factorization reduces to finding an appropriate representation of relative entropy in a linear basis. In a particular example we show that path-dependent random processes with memory naturally require specific generalized entropies. The example is to our knowledge the first exact derivation of a generalized entropy from the microscopic properties of a path-dependent random process.
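    As a concrete reminder of what the ordinary (Boltzmann-Gibbs-Shannon) MEP does before any generalization, the sketch below maximizes Shannon entropy over a small discrete state space subject to normalization and a fixed mean, by solving for the Lagrange multiplier of a Gibbs distribution. The state energies and target mean are arbitrary illustrative values.

```python
import numpy as np
from scipy.optimize import brentq

E = np.linspace(0.0, 5.0, 6)   # illustrative "energies" of six states
target_mean = 1.5              # constraint: <E> must equal 1.5

def mean_energy(beta):
    w = np.exp(-beta * E)
    p = w / w.sum()
    return p @ E

# The entropy-maximizing distribution under a mean constraint is Gibbsian,
# p_i proportional to exp(-beta * E_i); solve for beta matching the constraint.
beta = brentq(lambda b: mean_energy(b) - target_mean, -10.0, 10.0)
p = np.exp(-beta * E)
p /= p.sum()
entropy = -np.sum(p * np.log(p))
print(f"beta = {beta:.3f}, p = {np.round(p, 4)}, S = {entropy:.3f}")
```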

  11. Statistical mechanics of budget-constrained auctions

    NASA Astrophysics Data System (ADS)

    Altarelli, F.; Braunstein, A.; Realpe-Gomez, J.; Zecchina, R.

    2009-07-01

    Finding the optimal assignment in budget-constrained auctions is a combinatorial optimization problem with many important applications, a notable example being in the sale of advertisement space by search engines (in this context the problem is often referred to as the off-line AdWords problem). On the basis of the cavity method of statistical mechanics, we introduce a message-passing algorithm that is capable of solving efficiently random instances of the problem extracted from a natural distribution, and we derive from its properties the phase diagram of the problem. As the control parameter (average value of the budgets) is varied, we find two phase transitions delimiting a region in which long-range correlations arise.

  12. High-Speed Device-Independent Quantum Random Number Generation without a Detection Loophole

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Yuan, Xiao; Li, Ming-Han; Zhang, Weijun; Zhao, Qi; Zhong, Jiaqiang; Cao, Yuan; Li, Yu-Huai; Chen, Luo-Kan; Li, Hao; Peng, Tianyi; Chen, Yu-Ao; Peng, Cheng-Zhi; Shi, Sheng-Cai; Wang, Zhen; You, Lixing; Ma, Xiongfeng; Fan, Jingyun; Zhang, Qiang; Pan, Jian-Wei

    2018-01-01

    Quantum mechanics provides the means of generating genuine randomness that is impossible with deterministic classical processes. Remarkably, the unpredictability of randomness can be certified in a manner that is independent of implementation devices. Here, we present an experimental study of device-independent quantum random number generation based on a detection-loophole-free Bell test with entangled photons. In the randomness analysis, without the independent identical distribution assumption, we consider the worst case scenario that the adversary launches the most powerful attacks against the quantum adversary. After considering statistical fluctuations and applying an 80 Gb × 45.6 Mb Toeplitz matrix hashing, we achieve a final random bit rate of 114 bits/s, with a failure probability less than 10^{-5}. This marks a critical step towards realistic applications in cryptography and fundamental physics tests.
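    Toeplitz-matrix hashing is a standard two-universal randomness extractor: a seeded binary Toeplitz matrix maps a long weakly random bit string to a shorter, nearly uniform one via multiplication over GF(2). Below is a toy-scale numpy/scipy sketch; the sizes and inputs are assumptions, and a real implementation at the experiment's 80 Gb × 45.6 Mb scale would use FFT-based multiplication rather than a dense matrix.

```python
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(7)
n_in, n_out = 1024, 256                     # toy input/output lengths (assumed)

raw = rng.integers(0, 2, size=n_in, dtype=np.uint8)               # weakly random input
seed = rng.integers(0, 2, size=n_in + n_out - 1, dtype=np.uint8)  # public seed bits

# A Toeplitz matrix is fixed by its first column and first row.
first_col = seed[:n_out]
first_row = np.concatenate(([first_col[0]], seed[n_out:]))
T = toeplitz(first_col, first_row)          # shape (n_out, n_in)

# Extraction is a matrix-vector product over GF(2).
out = (T.astype(np.int64) @ raw.astype(np.int64)) % 2
print(out[:32])
```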

  13. Statistical mechanics of monatomic liquids

    NASA Astrophysics Data System (ADS)

    Wallace, Duane C.

    1997-10-01

    Two key experimental properties of elemental liquids, together with an analysis of the condensed-system potential-energy surface, lead us logically to the dynamical theory of monatomic liquids. Experimentally, the ion motional specific heat is approximately 3Nk for N ions, implying the normal modes of motion are approximately 3N independent harmonic oscillators. This implies the potential surface contains nearly harmonic valleys. The equilibrium configuration at the bottom of each valley is a "structure." Structures are crystalline or amorphous, and amorphous structures can have a remnant of local crystal symmetry, or can be random. The random structures are by far the most numerous, and hence dominate the statistical mechanics of the liquid state, and their macroscopic properties are uniform over the structure class, for large-N systems. The Hamiltonian for any structural valley is the static structure potential, a sum of harmonic normal modes, and an anharmonic correction. Again from experiment, the constant-density entropy of melting contains a universal disordering contribution of NkΔ, suggesting the random structural valleys are of universal number w^N, where ln w = Δ. Our experimental estimate for Δ is 0.80. In quasiharmonic approximation, the liquid theory for entropy agrees with experiment, for all currently analyzable experimental data at elevated temperatures, to within 1-2% of the total entropy. Further testable predictions of the theory are mentioned.

  14. Thrombectomy for ischemic stroke: meta-analyses of recurrent strokes, vasospasms, and subarachnoid hemorrhages.

    PubMed

    Emprechtinger, Robert; Piso, Brigitte; Ringleb, Peter A

    2017-03-01

    Mechanical thrombectomy with stent retrievers is an effective treatment for patients with ischemic stroke. Recent meta-analyses report that the treatment is safe. However, the endpoints recurrent stroke, vasospasms, and subarachnoid hemorrhage have not been evaluated sufficiently. Hence, we extracted data on these outcomes from the five recent thrombectomy trials (MR CLEAN, ESCAPE, REVASCAT, SWIFT PRIME, and EXTEND IA, published in 2015). Subsequently, we conducted meta-analyses for each outcome. We report the results of the fixed effects as well as the random effects model. Three studies reported data on recurrent strokes. While the results did not reach statistical significance in the random effects model (despite a threefold elevated risk), the fixed effects model revealed a significantly higher rate of recurrent strokes after thrombectomy. Four studies reported data on subarachnoid hemorrhage. The higher pooled rates in the intervention groups were statistically significant in both the fixed and the random effects models. One study reported on vasospasms. We recorded 14 events in the intervention group and none in the control group. The efficacy of mechanical thrombectomy is not questioned, yet our results indicate an increased risk of recurrent strokes, subarachnoid hemorrhage, and vasospasms post-treatment. Therefore, we strongly recommend thorough surveillance of these adverse events in future clinical trials and routine registries.
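    For readers unfamiliar with the fixed- versus random-effects distinction invoked here, the sketch below pools hypothetical per-study log risk ratios with inverse-variance weights and a DerSimonian-Laird estimate of between-study variance. The study values are made up; they are not the data from the trials above.

```python
import numpy as np

# Illustrative per-study log risk ratios and their variances (assumed values).
log_rr = np.array([1.10, 0.85, 1.60])
var = np.array([0.55, 0.40, 0.90])

def pooled(log_rr, var):
    """Inverse-variance weighted pooled estimate and its standard error."""
    w = 1.0 / var
    est = np.sum(w * log_rr) / np.sum(w)
    return est, np.sqrt(1.0 / np.sum(w))

# Fixed-effect model.
fe_est, fe_se = pooled(log_rr, var)

# DerSimonian-Laird between-study variance tau^2, then the random-effects model.
w = 1.0 / var
q = np.sum(w * (log_rr - fe_est) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)
re_est, re_se = pooled(log_rr, var + tau2)

for name, est, se in [("fixed", fe_est, fe_se), ("random", re_est, re_se)]:
    lo, hi = est - 1.96 * se, est + 1.96 * se
    print(f"{name:6s} RR = {np.exp(est):.2f}, 95% CI ({np.exp(lo):.2f}, {np.exp(hi):.2f})")
```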

  15. Estimation of trends

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The application of statistical methods to recorded ozone measurements is discussed. A long-term depletion of ozone of the magnitude predicted by the NAS would be harmful to most forms of life. Empirical prewhitening filters, whose derivation is independent of the underlying physical mechanisms, were analyzed. Statistical analysis serves as a check and balance on such assessments. Time series filtering separates variations into systematic and random parts, errors are uncorrelated, and significant phase-lag dependencies are identified. The use of time series modeling to enhance the capability of detecting trends is discussed.

  16. Statistical Mechanics of Combinatorial Auctions

    NASA Astrophysics Data System (ADS)

    Galla, Tobias; Leone, Michele; Marsili, Matteo; Sellitto, Mauro; Weigt, Martin; Zecchina, Riccardo

    2006-09-01

    Combinatorial auctions are formulated as frustrated lattice gases on sparse random graphs, allowing the determination of the optimal revenue by methods of statistical physics. Transitions between computationally easy and hard regimes are found and interpreted in terms of the geometric structure of the space of solutions. We introduce an iterative algorithm to solve intermediate and large instances, and discuss competing states of optimal revenue and maximal number of satisfied bidders. The algorithm can be generalized to the hard phase and to more sophisticated auction protocols.

  17. Design of a factorial experiment with randomization restrictions to assess medical device performance on vascular tissue.

    PubMed

    Diestelkamp, Wiebke S; Krane, Carissa M; Pinnell, Margaret F

    2011-05-20

    Energy-based surgical scalpels are designed to efficiently transect and seal blood vessels using thermal energy to promote protein denaturation and coagulation. Assessment and design improvement of ultrasonic scalpel performance relies on both in vivo and ex vivo testing. The objective of this work was to design and implement a robust, experimental test matrix with randomization restrictions and predictive statistical power, which allowed for identification of those experimental variables that may affect the quality of the seal obtained ex vivo. The design of the experiment included three factors: temperature (two levels); the type of solution used to perfuse the artery during transection (three types); and artery type (two types) resulting in a total of twelve possible treatment combinations. Burst pressures of porcine carotid and renal arteries sealed ex vivo were assigned as the response variable. The experimental test matrix was designed and carried out as a split-plot experiment in order to assess the contributions of several variables and their interactions while accounting for randomization restrictions present in the experimental setup. The statistical software package SAS was utilized and PROC MIXED was used to account for the randomization restrictions in the split-plot design. The combination of temperature, solution, and vessel type had a statistically significant impact on seal quality. The design and implementation of a split-plot experimental test-matrix provided a mechanism for addressing the existing technical randomization restrictions of ex vivo ultrasonic scalpel performance testing, while preserving the ability to examine the potential effects of independent factors or variables. This method for generating the experimental design and the statistical analyses of the resulting data are adaptable to a wide variety of experimental problems involving large-scale tissue-based studies of medical or experimental device efficacy and performance.

  18. Record statistics for biased random walks, with an application to financial data

    NASA Astrophysics Data System (ADS)

    Wergen, Gregor; Bogner, Miro; Krug, Joachim

    2011-05-01

    We consider the occurrence of record-breaking events in random walks with asymmetric jump distributions. The statistics of records in symmetric random walks was previously analyzed by Majumdar and Ziff [Phys. Rev. Lett. 101, 050601 (2008)] and is well understood. Unlike the case of symmetric jump distributions, in the asymmetric case the statistics of records depends on the choice of the jump distribution. We compute the record rate Pn(c), defined as the probability for the nth value to be larger than all previous values, for a Gaussian jump distribution with standard deviation σ that is shifted by a constant drift c. For small drift, in the sense of c/σ ≪ n^(-1/2), the correction to Pn(c) grows proportional to arctan(n) and saturates at the value c/(2σ). For large n the record rate approaches a constant, which is approximately given by 1 - (σ/2πc)exp(-c^2/2σ^2) for c/σ ≫ 1. These asymptotic results carry over to other continuous jump distributions with finite variance. As an application, we compare our analytical results to the record statistics of 366 daily stock prices from the Standard & Poor's 500 index. The biased random walk accounts quantitatively for the increase in the number of upper records due to the overall trend in the stock prices, and after detrending the number of upper records is in good agreement with the symmetric random walk. However, the number of lower records in the detrended data is significantly reduced by a mechanism that remains to be identified.
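    The record rate defined in the abstract is straightforward to estimate by direct simulation, which is a useful sanity check on the asymptotics. The sketch below draws an ensemble of Gaussian walks with drift and counts how often the nth entry exceeds all previous entries; the drift, σ, and ensemble size are arbitrary illustrative choices.

```python
import numpy as np

def record_rate(n_steps, drift, sigma=1.0, n_walks=20000, rng=None):
    """Monte Carlo estimate of P_n(c): probability that entry n is a new record."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.cumsum(rng.normal(drift, sigma, size=(n_walks, n_steps)), axis=1)
    running_max = np.maximum.accumulate(x, axis=1)
    is_record = np.empty(x.shape, dtype=bool)
    is_record[:, 0] = True                      # the first entry counts as a record
    is_record[:, 1:] = x[:, 1:] > running_max[:, :-1]
    return is_record.mean(axis=0)

rng = np.random.default_rng(3)
c, sigma = 0.5, 1.0
p_n = record_rate(200, c, sigma, rng=rng)
# For c/sigma of order one, the rate levels off at a constant below 1 at large n.
print(np.round(p_n[[9, 49, 199]], 3))
```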

  19. Social Noise: Generating Random Numbers from Twitter Streams

    NASA Astrophysics Data System (ADS)

    Fernández, Norberto; Quintas, Fernando; Sánchez, Luis; Arias, Jesús

    2015-12-01

    Due to the multiple applications of random numbers in computer systems (cryptography, online gambling, computer simulation, etc.) it is important to have mechanisms to generate these numbers. True Random Number Generators (TRNGs) are commonly used for this purpose. TRNGs rely on non-deterministic sources to generate randomness. Physical processes (like noise in semiconductors, quantum phenomena, etc.) play this role in state-of-the-art TRNGs. In this paper, we depart from previous work and explore the possibility of defining social TRNGs using the stream of public messages of the microblogging service Twitter as a randomness source. Thus, we define two TRNGs based on Twitter stream information and evaluate them using the National Institute of Standards and Technology (NIST) statistical test suite. The results of the evaluation confirm the feasibility of the proposed approach.

  20. Fractal planetary rings: Energy inequalities and random field model

    NASA Astrophysics Data System (ADS)

    Malyarenko, Anatoliy; Ostoja-Starzewski, Martin

    2017-12-01

    This study is motivated by a recent observation, based on photographs from the Cassini mission, that Saturn’s rings have a fractal structure in radial direction. Accordingly, two questions are considered: (1) What Newtonian mechanics argument in support of such a fractal structure of planetary rings is possible? (2) What kinematics model of such fractal rings can be formulated? Both challenges are based on taking planetary rings’ spatial structure as being statistically stationary in time and statistically isotropic in space, but statistically nonstationary in space. An answer to the first challenge is given through an energy analysis of circular rings having a self-generated, noninteger-dimensional mass distribution [V. E. Tarasov, Int. J. Mod Phys. B 19, 4103 (2005)]. The second issue is approached by taking the random field of angular velocity vector of a rotating particle of the ring as a random section of a special vector bundle. Using the theory of group representations, we prove that such a field is completely determined by a sequence of continuous positive-definite matrix-valued functions defined on the Cartesian square F2 of the radial cross-section F of the rings, where F is a fat fractal.

  1. Confounding in statistical mediation analysis: What it is and how to address it.

    PubMed

    Valente, Matthew J; Pelham, William E; Smyth, Heather; MacKinnon, David P

    2017-11-01

    Psychology researchers are often interested in mechanisms underlying how randomized interventions affect outcomes such as substance use and mental health. Mediation analysis is a common statistical method for investigating psychological mechanisms that has benefited from exciting new methodological improvements over the last 2 decades. One of the most important new developments is methodology for estimating causal mediated effects using the potential outcomes framework for causal inference. Potential outcomes-based methods developed in epidemiology and statistics have important implications for understanding psychological mechanisms. We aim to provide a concise introduction to and illustration of these new methods and emphasize the importance of confounder adjustment. First, we review the traditional regression approach for estimating mediated effects. Second, we describe the potential outcomes framework. Third, we define what a confounder is and how the presence of a confounder can provide misleading evidence regarding mechanisms of interventions. Fourth, we describe experimental designs that can help rule out confounder bias. Fifth, we describe new statistical approaches to adjust for measured confounders of the mediator-outcome relation and sensitivity analyses to probe effects of unmeasured confounders on the mediated effect. All approaches are illustrated with application to a real counseling intervention dataset. Counseling psychologists interested in understanding the causal mechanisms of their interventions can benefit from incorporating the most up-to-date techniques into their mediation analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
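    As a bare-bones companion to the traditional regression approach reviewed first in the abstract, here is a product-of-coefficients sketch on simulated data with a measured mediator-outcome confounder. The data-generating model and effect sizes are assumptions; the potential-outcomes estimators and sensitivity analyses the article emphasizes require more than this.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 500

# Simulated randomized intervention X, mediator M, outcome Y, and a measured
# confounder C of the mediator-outcome relation (all effect sizes are assumed).
C = rng.normal(size=n)
X = rng.integers(0, 2, size=n).astype(float)
M = 0.5 * X + 0.6 * C + rng.normal(size=n)
Y = 0.4 * M + 0.2 * X + 0.7 * C + rng.normal(size=n)

def ols(y, design):
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return coef

ones = np.ones(n)
# a path: mediator regressed on the intervention (adjusting for the confounder).
a = ols(M, np.column_stack([ones, X, C]))[1]
# b path and direct effect: outcome on mediator, intervention, and confounder.
bcoef = ols(Y, np.column_stack([ones, M, X, C]))
b, direct = bcoef[1], bcoef[2]
print(f"mediated effect a*b ~ {a * b:.3f}, direct effect ~ {direct:.3f}")
```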

  2. Statistical mechanics of a cat's cradle

    NASA Astrophysics Data System (ADS)

    Shen, Tongye; Wolynes, Peter G.

    2006-11-01

    It is believed that, much like a cat's cradle, the cytoskeleton can be thought of as a network of strings under tension. We show that both regular and random bond-disordered networks having bonds that buckle upon compression exhibit a variety of phase transitions as a function of temperature and extension. The results of self-consistent phonon calculations for the regular networks agree very well with computer simulations at finite temperature. The analytic theory also yields a rigidity onset (mechanical percolation) and the fraction of extended bonds for random networks. There is very good agreement with the simulations by Delaney et al (2005 Europhys. Lett. 72 990). The mean field theory reveals a nontranslationally invariant phase with self-generated heterogeneity of tautness, representing 'antiferroelasticity'.

  3. Exploring the Connection Between Sampling Problems in Bayesian Inference and Statistical Mechanics

    NASA Technical Reports Server (NTRS)

    Pohorille, Andrew

    2006-01-01

    The Bayesian and statistical mechanical communities often share the same objective in their work - estimating and integrating probability distribution functions (pdfs) describing stochastic systems, models or processes. Frequently, these pdfs are complex functions of random variables exhibiting multiple, well separated local minima. Conventional strategies for sampling such pdfs are inefficient, sometimes leading to an apparent non-ergodic behavior. Several recently developed techniques for handling this problem have been successfully applied in statistical mechanics. In the multicanonical and Wang-Landau Monte Carlo (MC) methods, the correct pdfs are recovered from uniform sampling of the parameter space by iteratively establishing proper weighting factors connecting these distributions. Trivial generalizations allow for sampling from any chosen pdf. The closely related transition matrix method relies on estimating transition probabilities between different states. All these methods proved to generate estimates of pdfs with high statistical accuracy. In another MC technique, parallel tempering, several random walks, each corresponding to a different value of a parameter (e.g. "temperature"), are generated and occasionally exchanged using the Metropolis criterion. This method can be considered as a statistically correct version of simulated annealing. An alternative approach is to represent the set of independent variables as a Hamiltonian system. Considerable progress has been made in understanding how to ensure that the system obeys the equipartition theorem or, equivalently, that coupling between the variables is correctly described. Then a host of techniques developed for dynamical systems can be used. Among them, probably the most powerful is the Adaptive Biasing Force method, in which thermodynamic integration and biased sampling are combined to yield very efficient estimates of pdfs. The third class of methods deals with transitions between states described by rate constants. These problems are isomorphic with chemical kinetics problems. Recently, several efficient techniques for this purpose have been developed based on the approach originally proposed by Gillespie. Although the utility of the techniques mentioned above for Bayesian problems has not been determined, further research along these lines is warranted.
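    To make the parallel-tempering idea mentioned above concrete, here is a minimal sketch: several Metropolis chains run at different temperatures on a bimodal one-dimensional target, with occasional Metropolis-accepted swaps between adjacent temperatures so the cold chain can cross between modes. The target, temperature ladder, and tuning constants are illustrative assumptions, not any specific implementation from the abstract.

```python
import numpy as np

rng = np.random.default_rng(5)

def log_p(x):
    # Bimodal target: two well-separated Gaussian modes at +/-4.
    return np.logaddexp(-0.5 * (x - 4.0) ** 2, -0.5 * (x + 4.0) ** 2)

temps = np.array([1.0, 3.0, 9.0])          # assumed temperature ladder
x = np.zeros(len(temps))                   # one chain per temperature
samples = []

for step in range(20000):
    # Metropolis update within each chain, targeting p(x)^(1/T).
    prop = x + rng.normal(0.0, 1.0, size=len(temps))
    accept = np.log(rng.random(len(temps))) < (log_p(prop) - log_p(x)) / temps
    x = np.where(accept, prop, x)
    # Occasionally propose swapping states of adjacent temperatures.
    if step % 10 == 0:
        i = rng.integers(0, len(temps) - 1)
        delta = (1.0 / temps[i] - 1.0 / temps[i + 1]) * (log_p(x[i + 1]) - log_p(x[i]))
        if np.log(rng.random()) < delta:
            x[i], x[i + 1] = x[i + 1], x[i]
    samples.append(x[0])

samples = np.array(samples[2000:])
print("fraction of cold-chain samples in the right mode:", np.mean(samples > 0))
```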

  4. A novel conductivity mechanism of highly disordered carbon systems based on an investigation of graph zeta function

    NASA Astrophysics Data System (ADS)

    Matsutani, Shigeki; Sato, Iwao

    2017-09-01

    In a previous report (Matsutani and Suzuki, 2000 [21]), by proposing a mechanism in which electric conductivity arises from activational hopping conduction governed by the Wigner surmise of level statistics, the temperature dependence of the electronic conductivity of a highly disordered carbon system was evaluated, including an apparent metal-insulator transition. Since the system consists of small pieces of graphite, the appearance of the level statistics was attributed to quantum-chaotic behavior in each graphite granule. In this article, we revise that assumption and show another origin of the Wigner surmise, one more natural for the carbon system, based on a recent investigation of the graph zeta function in graph theory. Our method can be applied to the statistical treatment of the electronic properties of randomized molecular systems in general.

  5. A randomized clinical trial comparing an extended-use hygroscopic condenser humidifier with heated-water humidification in mechanically ventilated patients.

    PubMed

    Kollef, M H; Shapiro, S D; Boyd, V; Silver, P; Von Harz, B; Trovillion, E; Prentice, D

    1998-03-01

    To determine the safety and cost-effectiveness of mechanical ventilation with an extended-use hygroscopic condenser humidifier (Duration; Nellcor Puritan-Bennett; Eden Prairie, Minn) compared with mechanical ventilation with heated-water humidification. Prospective randomized clinical trial. Medical and surgical ICUs of Barnes-Jewish Hospital, St. Louis, a university-affiliated teaching hospital. Three hundred ten consecutive qualified patients undergoing mechanical ventilation. Patients requiring mechanical ventilation were randomly assigned to receive humidification with either an extended-use hygroscopic condenser humidifier (for up to the first 7 days of mechanical ventilation) or heated-water humidification. Occurrence of ventilator-associated pneumonia, endotracheal tube occlusion, duration of mechanical ventilation, lengths of intensive care and hospitalization, acquired multiorgan dysfunction, and hospital mortality. One hundred sixty-three patients were randomly assigned to receive humidification with an extended-use hygroscopic condenser humidifier, and 147 patients were randomly assigned to receive heated-water humidification. The two groups were similar at the time of randomization with regard to demographic characteristics, ICU admission diagnoses, and severity of illness. Risk factors for the development of ventilator-associated pneumonia were also similar during the study period for both treatment groups. Ventilator-associated pneumonia was seen in 15 (9.2%) patients receiving humidification with an extended-use hygroscopic condenser humidifier and in 15 (10.2%) patients receiving heated-water humidification (relative risk, 0.90; 95% confidence interval=0.46 to 1.78; p=0.766). No statistically significant differences for hospital mortality, duration of mechanical ventilation, lengths of stay in the hospital ICU, or acquired organ system derangements were found between the two treatment groups. No episode of endotracheal tube occlusion occurred during the study period in either treatment group. The total cost of providing humidification was $2,605 for patients receiving a hygroscopic condenser humidifier compared with $5,625 for patients receiving heated-water humidification. Our findings suggest that the initial application of an extended-use hygroscopic condenser humidifier is a safe and more cost-effective method of providing humidification to patients requiring mechanical ventilation compared with heated-water humidification.

  6. High-Speed Device-Independent Quantum Random Number Generation without a Detection Loophole.

    PubMed

    Liu, Yang; Yuan, Xiao; Li, Ming-Han; Zhang, Weijun; Zhao, Qi; Zhong, Jiaqiang; Cao, Yuan; Li, Yu-Huai; Chen, Luo-Kan; Li, Hao; Peng, Tianyi; Chen, Yu-Ao; Peng, Cheng-Zhi; Shi, Sheng-Cai; Wang, Zhen; You, Lixing; Ma, Xiongfeng; Fan, Jingyun; Zhang, Qiang; Pan, Jian-Wei

    2018-01-05

    Quantum mechanics provides the means of generating genuine randomness that is impossible with deterministic classical processes. Remarkably, the unpredictability of randomness can be certified in a manner that is independent of implementation devices. Here, we present an experimental study of device-independent quantum random number generation based on a detection-loophole-free Bell test with entangled photons. In the randomness analysis, without the independent identical distribution assumption, we consider the worst case scenario that the adversary launches the most powerful attacks against the quantum adversary. After considering statistical fluctuations and applying an 80  Gb×45.6  Mb Toeplitz matrix hashing, we achieve a final random bit rate of 114  bits/s, with a failure probability less than 10^{-5}. This marks a critical step towards realistic applications in cryptography and fundamental physics tests.

  7. Multifractality and freezing phenomena in random energy landscapes: An introduction

    NASA Astrophysics Data System (ADS)

    Fyodorov, Yan V.

    2010-10-01

    We start our lectures by introducing and discussing the general notion of the multifractality spectrum for random measures on lattices, and how it can be probed using moments of that measure. Then we show that the Boltzmann-Gibbs probability distributions generated by logarithmically correlated random potentials provide a simple yet non-trivial example of disorder-induced multifractal measures. The typical values of the multifractality exponents can be extracted by calculating the free energy of the associated statistical mechanics problem. To carry out such a calculation we introduce and discuss in some detail two analytically tractable models for logarithmically correlated potentials. The first model uses a special definition of distances between points in space and is based on the idea of multiplicative cascades, which originated in the theory of turbulent motion. It is essentially equivalent to the statistical mechanics of directed polymers on disordered trees studied long ago by Derrida and Spohn (1988) in Ref. [12]. In this way we introduce the notion of the freezing transition, which is identified with an abrupt change in the multifractality spectrum. The second model, which allows for explicit analytical evaluation of the free energy, is the infinite-dimensional version of the problem, which can be solved by employing the replica trick. In particular, the latter version allows one to identify the freezing phenomenon with a mechanism of replica symmetry breaking (RSB) and to elucidate its physical meaning. The corresponding one-step RSB solution turns out to be marginally stable everywhere in the low-temperature phase. We finish with a short discussion of recent developments and extensions of models with logarithmic correlations, in particular in the context of extreme value statistics. The first appendix summarizes the standard elementary information about Gaussian integrals and related subjects, and introduces the notion of the Gaussian free field characterized by logarithmic correlations. Three other appendices provide a detailed exposition of a few technical details underlying the replica analysis of the model discussed in the lectures.

  8. Error threshold for color codes and random three-body Ising models.

    PubMed

    Katzgraber, Helmut G; Bombin, H; Martin-Delgado, M A

    2009-08-28

    We study the error threshold of color codes, a class of topological quantum codes that allow a direct implementation of quantum Clifford gates suitable for entanglement distillation, teleportation, and fault-tolerant quantum computation. We map the error-correction process onto a statistical mechanical random three-body Ising model and study its phase diagram via Monte Carlo simulations. The obtained error threshold of p(c) = 0.109(2) is very close to that of Kitaev's toric code, showing that enhanced computational capabilities do not necessarily imply lower resistance to noise.

  9. JOURNAL SCOPE GUIDELINES: Paper classification scheme

    NASA Astrophysics Data System (ADS)

    2005-06-01

    This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org).
    1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory.
    2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models.
    3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods.
    4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics.
    5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials.
    6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas.

  10. The glassy random laser: replica symmetry breaking in the intensity fluctuations of emission spectra

    PubMed Central

    Antenucci, Fabrizio; Crisanti, Andrea; Leuzzi, Luca

    2015-01-01

    The behavior of a newly introduced overlap parameter, measuring the correlation between intensity fluctuations of waves in random media, is analyzed in different physical regimes, with varying amounts of disorder and non-linearity. This order parameter allows one to identify the laser transition in random media and describes its possible glassy nature in terms of emission spectra data, the only data so far accessible in random laser measurements. The theoretical analysis is performed in terms of the complex spherical spin-glass model, a statistical mechanical model describing the onset and the behavior of random lasers in open cavities. Replica Symmetry Breaking theory allows one to discern different kinds of randomness in the high pumping regime, including the most complex and intriguing glassy randomness. The outcome of the theoretical study is, eventually, compared to recent intensity fluctuation overlap measurements, demonstrating the validity of the theory and providing a straightforward interpretation of qualitatively different spectral behaviors in different random lasers. PMID:26616194

  11. Effect of Acetazolamide vs Placebo on Duration of Invasive Mechanical Ventilation Among Patients With Chronic Obstructive Pulmonary Disease: A Randomized Clinical Trial.

    PubMed

    Faisy, Christophe; Meziani, Ferhat; Planquette, Benjamin; Clavel, Marc; Gacouin, Arnaud; Bornstain, Caroline; Schneider, Francis; Duguet, Alexandre; Gibot, Sébastien; Lerolle, Nicolas; Ricard, Jean-Damien; Sanchez, Olivier; Djibre, Michel; Ricome, Jean-Louis; Rabbat, Antoine; Heming, Nicholas; Urien, Saïk; Esvan, Maxime; Katsahian, Sandrine

    2016-02-02

    Acetazolamide has been used for decades as a respiratory stimulant for patients with chronic obstructive pulmonary disease (COPD) and metabolic alkalosis, but no large randomized placebo-controlled trial is available to confirm this approach. To determine whether acetazolamide reduces mechanical ventilation duration in critically ill patients with COPD and metabolic alkalosis. The DIABOLO study, a randomized, double-blind, multicenter trial, was conducted from October 2011 through July 2014 in 15 intensive care units (ICUs) in France. A total of 382 patients with COPD who were expected to receive mechanical ventilation for more than 24 hours were randomized to the acetazolamide or placebo group and 380 were included in an intention-to-treat analysis. Acetazolamide (500-1000 mg, twice daily) vs placebo administered intravenously in cases of pure or mixed metabolic alkalosis, initiated within 48 hours of ICU admission and continued during the ICU stay for a maximum of 28 days. The primary outcome was the duration of invasive mechanical ventilation via endotracheal intubation or tracheotomy. Secondary outcomes included changes in arterial blood gas and respiratory parameters, weaning duration, adverse events, use of noninvasive ventilation after extubation, successful weaning, the duration of ICU stay, and in-ICU mortality. Among 382 randomized patients, 380 (mean age, 69 years; 272 men [71.6%]; 379 [99.7%] with endotracheal intubation) completed the study. For the acetazolamide group (n = 187), compared with the placebo group (n = 193), no significant between-group differences were found for median duration of mechanical ventilation (-16.0 hours; 95% CI, -36.5 to 4.0 hours; P = .17), duration of weaning off mechanical ventilation (-0.9 hours; 95% CI, -4.3 to 1.3 hours; P = .36), daily changes of minute ventilation (-0.0 L/min; 95% CI, -0.2 to 0.2 L/min; P = .72), or partial carbon-dioxide pressure in arterial blood (-0.3 mm Hg; 95% CI, -0.8 to 0.2 mm Hg; P = .25), although daily changes of serum bicarbonate (between-group difference, -0.8 mEq/L; 95% CI, -1.2 to -0.5 mEq/L; P < .001) and number of days with metabolic alkalosis (between-group difference, -1; 95% CI, -2 to -1 days; P < .001) decreased significantly more in the acetazolamide group. Other secondary outcomes also did not differ significantly between groups. Among patients with COPD receiving invasive mechanical ventilation, the use of acetazolamide, compared with placebo, did not result in a statistically significant reduction in the duration of invasive mechanical ventilation. However, the magnitude of the difference was clinically important, and it is possible that the study was underpowered to establish statistical significance. clinicaltrials.gov Identifier: NCT01627639.

  12. User's Guide to Galoper: A Program for Simulating the Shapes of Crystal Size Distributions from Growth Mechanisms - and Associated Programs

    USGS Publications Warehouse

    Eberl, Dennis D.; Drits, V.A.; Srodon, J.

    2000-01-01

    GALOPER is a computer program that simulates the shapes of crystal size distributions (CSDs) from crystal growth mechanisms. This manual describes how to use the program. The theory behind the program's operation has been described previously (Eberl, Drits, and Srodon, 1998). CSDs that can be simulated using GALOPER include those that result from growth mechanisms operating in an open system, such as constant-rate nucleation and growth, nucleation with a decaying nucleation rate and growth, surface-controlled growth, supply-controlled growth, and constant-rate and random growth; and those that result from mechanisms operating in a closed system, such as Ostwald ripening, random ripening, and crystal coalescence. In addition, CSDs for two types of weathering reactions can be simulated. The operation of associated programs also is described, including two statistical programs used for comparing calculated with measured CSDs, a program used for calculating lognormal CSDs, and a program for arranging measured crystal sizes into size groupings (bins).

  13. Statistical analysis and handling of missing data in cluster randomized trials: a systematic review.

    PubMed

    Fiero, Mallorie H; Huang, Shuang; Oren, Eyal; Bell, Melanie L

    2016-02-09

    Cluster randomized trials (CRTs) randomize participants in groups, rather than as individuals and are key tools used to assess interventions in health research where treatment contamination is likely or if individual randomization is not feasible. Two potential major pitfalls exist regarding CRTs, namely handling missing data and not accounting for clustering in the primary analysis. The aim of this review was to evaluate approaches for handling missing data and statistical analysis with respect to the primary outcome in CRTs. We systematically searched for CRTs published between August 2013 and July 2014 using PubMed, Web of Science, and PsycINFO. For each trial, two independent reviewers assessed the extent of the missing data and method(s) used for handling missing data in the primary and sensitivity analyses. We evaluated the primary analysis and determined whether it was at the cluster or individual level. Of the 86 included CRTs, 80 (93%) trials reported some missing outcome data. Of those reporting missing data, the median percent of individuals with a missing outcome was 19% (range 0.5 to 90%). The most common way to handle missing data in the primary analysis was complete case analysis (44, 55%), whereas 18 (22%) used mixed models, six (8%) used single imputation, four (5%) used unweighted generalized estimating equations, and two (2%) used multiple imputation. Fourteen (16%) trials reported a sensitivity analysis for missing data, but most assumed the same missing data mechanism as in the primary analysis. Overall, 67 (78%) trials accounted for clustering in the primary analysis. High rates of missing outcome data are present in the majority of CRTs, yet handling missing data in practice remains suboptimal. Researchers and applied statisticians should carry out appropriate missing data methods, which are valid under plausible assumptions in order to increase statistical power in trials and reduce the possibility of bias. Sensitivity analysis should be performed, with weakened assumptions regarding the missing data mechanism to explore the robustness of results reported in the primary analysis.

  14. Qualitatively Assessing Randomness in SVD Results

    NASA Astrophysics Data System (ADS)

    Lamb, K. W.; Miller, W. P.; Kalra, A.; Anderson, S.; Rodriguez, A.

    2012-12-01

    Singular Value Decomposition (SVD) is a powerful tool for identifying regions of significant co-variability between two spatially distributed datasets. SVD has been widely used in atmospheric research to define relationships between sea surface temperatures, geopotential height, wind, precipitation and streamflow data for myriad regions across the globe. A typical application for SVD is to identify leading climate drivers (as observed in the wind or pressure data) for a particular hydrologic response variable such as precipitation, streamflow, or soil moisture. One can also investigate the lagged relationship between a climate variable and the hydrologic response variable using SVD. When performing these studies it is important to limit the spatial bounds of the climate variable to reduce the chance of random co-variance relationships being identified. On the other hand, a climate region that is too small may ignore climate signals which have more than a statistical relationship to a hydrologic response variable. The proposed research seeks to identify a qualitative method of identifying random co-variability relationships between two data sets. The research identifies the heterogeneous correlation maps from several past results and compares these results with correlation maps produced using purely random and quasi-random climate data. The comparison identifies a methodology to determine if a particular region on a correlation map may be explained by a physical mechanism or is simply statistical chance.
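    A minimal version of the SVD workflow described here, on synthetic fields, looks like the sketch below: center the two space-time data matrices, take the SVD of their cross-covariance, and correlate one field with the other's leading expansion coefficients to obtain a heterogeneous correlation map. The field sizes, the single shared mode, and the noise levels are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n_time, n_clim, n_hydro = 60, 40, 25

# Synthetic fields sharing one coupled mode plus noise (all values illustrative).
signal = rng.normal(size=n_time)
clim = np.outer(signal, rng.normal(size=n_clim)) + rng.normal(size=(n_time, n_clim))
hydro = np.outer(signal, rng.normal(size=n_hydro)) + rng.normal(size=(n_time, n_hydro))

# Remove the time mean, form the cross-covariance matrix, and take its SVD.
clim -= clim.mean(axis=0)
hydro -= hydro.mean(axis=0)
cov = clim.T @ hydro / (n_time - 1)
u, s, vt = np.linalg.svd(cov, full_matrices=False)

# Squared covariance fraction of the leading mode and its expansion coefficients.
scf = s[0] ** 2 / np.sum(s ** 2)
a = clim @ u[:, 0]            # climate expansion coefficients
# Heterogeneous correlation map: correlate each hydrologic point with `a`.
het = np.array([np.corrcoef(a, hydro[:, j])[0, 1] for j in range(n_hydro)])
print(f"leading-mode SCF = {scf:.2f}, max |heterogeneous correlation| = {np.abs(het).max():.2f}")
```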

  15. Modeling and Simulation of Linear and Nonlinear MEMS Scale Electromagnetic Energy Harvesters for Random Vibration Environments

    PubMed Central

    Sassani, Farrokh

    2014-01-01

    The simulation results for electromagnetic energy harvesters (EMEHs) under broad band stationary Gaussian random excitations indicate the importance of both a high transformation factor and a high mechanical quality factor to achieve favourable mean power, mean square load voltage, and output spectral density. The optimum load is different for random vibrations and for sinusoidal vibration. Reducing the total damping ratio under band-limited random excitation yields a higher mean square load voltage. Reduced bandwidth resulting from decreased mechanical damping can be compensated by increasing the electrical damping (transformation factor) leading to a higher mean square load voltage and power. Nonlinear EMEHs with a Duffing spring and with linear plus cubic damping are modeled using the method of statistical linearization. These nonlinear EMEHs exhibit approximately linear behaviour under low levels of broadband stationary Gaussian random vibration; however, at higher levels of such excitation the central (resonant) frequency of the spectral density of the output voltage shifts due to the increased nonlinear stiffness and the bandwidth broadens slightly. Nonlinear EMEHs exhibit lower maximum output voltage and central frequency of the spectral density with nonlinear damping compared to linear damping. Stronger nonlinear damping yields broader bandwidths at stable resonant frequency. PMID:24605063

  16. Random electric field instabilities of relaxor ferroelectrics

    NASA Astrophysics Data System (ADS)

    Arce-Gamboa, José R.; Guzmán-Verri, Gian G.

    2017-06-01

    Relaxor ferroelectrics are complex oxide materials that offer a rather unique setting in which to study the effects of compositional disorder on phase transitions. Here, we study the effects of quenched cubic random electric fields on the lattice instabilities that lead to a ferroelectric transition and show that, within a microscopic model and a statistical mechanical solution, even weak compositional disorder can prohibit the development of long-range order and that a random field state with anisotropic and power-law correlations of polarization emerges from the combined effect of their characteristic dipole forces and their inherent charge disorder. We compare and reproduce several key experimental observations in the well-studied relaxor PbMg1/3Nb2/3O3-PbTiO3.

  17. Uniform quantized electron gas

    NASA Astrophysics Data System (ADS)

    Høye, Johan S.; Lomba, Enrique

    2016-10-01

    In this work we study the correlation energy of the quantized electron gas of uniform density at temperature T  =  0. To do so we utilize methods from classical statistical mechanics. The basis for this is the Feynman path integral for the partition function of quantized systems. With this representation the quantum mechanical problem can be interpreted as, and is equivalent to, a classical polymer problem in four dimensions where the fourth dimension is imaginary time. Thus methods, results, and properties obtained in the statistical mechanics of classical fluids can be utilized. From this viewpoint we recover the well known RPA (random phase approximation). Then to improve it we modify the RPA by requiring the corresponding correlation function to be such that electrons with equal spins can not be on the same position. Numerical evaluations are compared with well known results of a standard parameterization of Monte Carlo correlation energies.

  18. Evaluation of high-resolution sea ice models on the basis of statistical and scaling properties of Arctic sea ice drift and deformation

    NASA Astrophysics Data System (ADS)

    Girard, L.; Weiss, J.; Molines, J. M.; Barnier, B.; Bouillon, S.

    2009-08-01

    Sea ice drift and deformation from models are evaluated on the basis of statistical and scaling properties. These properties are derived from two observation data sets: the RADARSAT Geophysical Processor System (RGPS) and buoy trajectories from the International Arctic Buoy Program (IABP). Two simulations obtained with the Louvain-la-Neuve Ice Model (LIM) coupled to a high-resolution ocean model and a simulation obtained with the Los Alamos Sea Ice Model (CICE) were analyzed. Model ice drift compares well with observations in terms of large-scale velocity field and distributions of velocity fluctuations, although a significant bias on the mean ice speed is noted. On the other hand, the statistical properties of ice deformation are not well simulated by the models: (1) The distributions of strain rates are incorrect: RGPS distributions of strain rates are power law tailed, i.e., exhibit "wild randomness," whereas model distributions remain in the Gaussian attraction basin, i.e., exhibit "mild randomness." (2) The models are unable to reproduce the spatial and temporal correlations of the deformation fields: In the observations, ice deformation follows spatial and temporal scaling laws that express the heterogeneity and the intermittency of deformation. These relations do not appear in simulated ice deformation. Mean deformation in models is almost scale independent. The statistical properties of ice deformation are a signature of the ice mechanical behavior. The present work therefore suggests that the mechanical framework currently used by models is inappropriate. A different modeling framework based on elastic interactions could improve the representation of the statistical and scaling properties of ice deformation.

  19. Statistical mechanics of the vertex-cover problem

    NASA Astrophysics Data System (ADS)

    Hartmann, Alexander K.; Weigt, Martin

    2003-10-01

    We review recent progress in the study of the vertex-cover problem (VC). The VC belongs to the class of NP-complete graph theoretical problems, which plays a central role in theoretical computer science. On ensembles of random graphs, VC exhibits a coverable-uncoverable phase transition. Very close to this transition, depending on the solution algorithm, easy-hard transitions in the typical running time of the algorithms occur. We explain a statistical mechanics approach, which works by mapping the VC to a hard-core lattice gas, and then applying techniques such as the replica trick or the cavity approach. Using these methods, the phase diagram of the VC could be obtained exactly for connectivities c < e, where the VC is replica symmetric. Recently, this result could be confirmed using traditional mathematical techniques. For c > e, the solution of the VC exhibits full replica symmetry breaking. The statistical mechanics approach can also be used to study analytically the typical running time of simple complete and incomplete algorithms for the VC. Finally, we describe recent results for the VC when studied on other ensembles of finite- and infinite-dimensional graphs.
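    The cavity and replica analyses described in the abstract are analytical, but the object of study is easy to set up numerically. The sketch below draws an Erdős-Rényi random graph with mean connectivity c and builds a vertex cover with the classic maximal-matching 2-approximation; the graph size, c, and the heuristic are assumptions for illustration, not the paper's method.

```python
import numpy as np

def random_graph(n, c, rng):
    """Edge list of an Erdős-Rényi graph G(n, p) with mean connectivity c (p = c/n)."""
    adj = np.triu(rng.random((n, n)) < c / n, k=1)
    return list(zip(*np.nonzero(adj)))

def greedy_cover(edges):
    """2-approximate vertex cover: take both endpoints of a maximal matching."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

rng = np.random.default_rng(9)
n, c = 2000, 2.0            # c < e, the replica-symmetric regime discussed above
edges = random_graph(n, c, rng)
cover = greedy_cover(edges)

# Verify every edge is covered and report the cover fraction x = |cover| / n.
assert all(u in cover or v in cover for u, v in edges)
print(f"edges: {len(edges)}, cover fraction: {len(cover) / n:.3f}")
```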

  20. Probabilistic models for reactive behaviour in heterogeneous condensed phase media

    NASA Astrophysics Data System (ADS)

    Baer, M. R.; Gartling, D. K.; DesJardin, P. E.

    2012-02-01

    This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.

  1. Evaluating the Use of Random Distribution Theory to Introduce Statistical Inference Concepts to Business Students

    ERIC Educational Resources Information Center

    Larwin, Karen H.; Larwin, David A.

    2011-01-01

    Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…

  2. Mechanics of single cells: rheology, time dependence, and fluctuations.

    PubMed

    Massiera, Gladys; Van Citters, Kathleen M; Biancaniello, Paul L; Crocker, John C

    2007-11-15

    The results of mechanical measurements on single cultured epithelial cells using both magnetic twisting cytometry (MTC) and laser tracking microrheology (LTM) are described. Our unique approach uses laser deflection for high-performance tracking of cell-adhered magnetic beads either in response to an oscillatory magnetic torque (MTC) or due to random Brownian or ATP-dependent forces (LTM). This approach is well suited for accurately determining the rheology of single cells, the study of temporal and cell-to-cell variations in the MTC signal amplitude, and assessing the statistical character of the tracers' random motion in detail. The temporal variation of the MTC rocking amplitude is surprisingly large and manifests as a frequency-independent multiplicative factor having a 1/f spectrum in living cells, which disappears upon ATP depletion. In the epithelial cells we study, random bead position fluctuations are Gaussian to the limits of detection both in the Brownian and ATP-dependent cases, unlike earlier studies on other cell types.

  3. Methodological reporting of randomized trials in five leading Chinese nursing journals.

    PubMed

    Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu

    2014-01-01

    Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting complying with CONSORT and explore associated trial level variables in the Chinese nursing care field. In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34 ± 0.97 (Mean ± SD). No RCT reported descriptions and changes in "trial design," changes in "outcomes" and "implementation," or descriptions of the similarity of interventions for "blinding." Poor reporting was found in detailing the "settings of participants" (13.1%), "type of randomization sequence generation" (1.8%), calculation methods of "sample size" (0.4%), explanation of any interim analyses and stopping guidelines for "sample size" (0.3%), "allocation concealment mechanism" (0.3%), additional analyses in "statistical methods" (2.1%), and targeted subjects and methods of "blinding" (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of "participants," "interventions," and definitions of the "outcomes" and "statistical methods." The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score. The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods.

  4. Design of a factorial experiment with randomization restrictions to assess medical device performance on vascular tissue

    PubMed Central

    2011-01-01

    Background Energy-based surgical scalpels are designed to efficiently transect and seal blood vessels using thermal energy to promote protein denaturation and coagulation. Assessment and design improvement of ultrasonic scalpel performance relies on both in vivo and ex vivo testing. The objective of this work was to design and implement a robust, experimental test matrix with randomization restrictions and predictive statistical power, which allowed for identification of those experimental variables that may affect the quality of the seal obtained ex vivo. Methods The design of the experiment included three factors: temperature (two levels); the type of solution used to perfuse the artery during transection (three types); and artery type (two types) resulting in a total of twelve possible treatment combinations. Burst pressures of porcine carotid and renal arteries sealed ex vivo were assigned as the response variable. Results The experimental test matrix was designed and carried out as a split-plot experiment in order to assess the contributions of several variables and their interactions while accounting for randomization restrictions present in the experimental setup. The statistical software package SAS was utilized and PROC MIXED was used to account for the randomization restrictions in the split-plot design. The combination of temperature, solution, and vessel type had a statistically significant impact on seal quality. Conclusions The design and implementation of a split-plot experimental test-matrix provided a mechanism for addressing the existing technical randomization restrictions of ex vivo ultrasonic scalpel performance testing, while preserving the ability to examine the potential effects of independent factors or variables. This method for generating the experimental design and the statistical analyses of the resulting data are adaptable to a wide variety of experimental problems involving large-scale tissue-based studies of medical or experimental device efficacy and performance. PMID:21599963

  5. Enhanced backscattering through a deep random phase screen

    NASA Astrophysics Data System (ADS)

    Jakeman, E.

    1988-10-01

    The statistical properties of radiation scattered by a system consisting of a plane mirror placed in the Fresnel region behind a smoothly varying deep random-phase screen with off-axis beam illumination are studied. It is found that two mechanisms cause enhanced scattering around the backward direction, according to the mirror position with respect to the focusing plane of the screen. In all of the plane mirror geometries considered, the scattered field remains a complex Gaussian process with a spatial coherence function identical to that expected for a single screen, and a speckle size smaller than the width of backscatter enhancement.

  6. At least some errors are randomly generated (Freud was wrong)

    NASA Technical Reports Server (NTRS)

    Sellen, A. J.; Senders, J. W.

    1986-01-01

    An experiment was carried out to expose something about human error generating mechanisms. In the context of the experiment, an error was made when a subject pressed the wrong key on a computer keyboard or pressed no key at all in the time allotted. These might be considered, respectively, errors of substitution and errors of omission. Each of seven subjects saw a sequence of three digital numbers, made an easily learned binary judgement about each, and was to press the appropriate one of two keys. Each session consisted of 1,000 presentations of randomly permuted, fixed numbers broken into 10 blocks of 100. One of two keys should have been pressed within one second of the onset of each stimulus. These data were subjected to statistical analyses in order to probe the nature of the error generating mechanisms. Goodness of fit tests for a Poisson distribution for the number of errors per 50 trial interval and for an exponential distribution of the length of the intervals between errors were carried out. There is evidence for an endogenous mechanism that may best be described as a random error generator. Furthermore, an item analysis of the number of errors produced per stimulus suggests the existence of a second mechanism operating on task driven factors producing exogenous errors. Some errors, at least, are the result of constant probability generating mechanisms with error rate idiosyncratically determined for each subject.
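
    The two goodness-of-fit checks described above (a Poisson fit to error counts per block and an exponential fit to inter-error intervals) can be sketched in a few lines of Python; the data arrays below are placeholders, not the experiment's records.

      # Hedged sketch of the two goodness-of-fit checks described in the abstract;
      # `errors_per_block` and `inter_error_gaps` are placeholder data.
      import numpy as np
      from scipy import stats

      errors_per_block = np.array([2, 0, 1, 3, 1, 0, 2, 1, 4, 1])       # errors per 50-trial block
      inter_error_gaps = np.array([12.0, 40.0, 7.5, 55.0, 21.0, 33.0])  # trials between errors

      # Poisson goodness of fit: observed counts of 0, 1, 2, ... errors per block
      # versus expected Poisson frequencies at the sample mean rate.
      lam = errors_per_block.mean()
      kmax = errors_per_block.max()
      observed = np.bincount(errors_per_block, minlength=kmax + 1).astype(float)
      expected = stats.poisson.pmf(np.arange(kmax + 1), lam)
      expected = np.append(expected[:-1], 1 - expected[:-1].sum())      # lump the upper tail
      expected *= observed.sum()
      chi2, p_poisson = stats.chisquare(observed, expected, ddof=1)     # ddof=1: lambda estimated

      # Exponential inter-error intervals: Kolmogorov-Smirnov test against an
      # exponential whose scale is the sample mean.
      ks, p_exp = stats.kstest(inter_error_gaps, "expon", args=(0, inter_error_gaps.mean()))

      print(f"Poisson fit: chi2={chi2:.2f}, p={p_poisson:.3f}")
      print(f"Exponential fit: KS={ks:.2f}, p={p_exp:.3f}")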

  7. Statistical optics

    NASA Astrophysics Data System (ADS)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.

  8. On-Orbit System Identification

    NASA Technical Reports Server (NTRS)

    Mettler, E.; Milman, M. H.; Bayard, D.; Eldred, D. B.

    1987-01-01

    Information derived from accelerometer readings benefits important engineering and control functions. Report discusses methodology for detection, identification, and analysis of motions within space station. Techniques of vibration and rotation analyses, control theory, statistics, filter theory, and transform methods integrated to form system for generating models and model parameters that characterize total motion of complicated space station, with respect to both control-induced and random mechanical disturbances.

  9. Statistical mechanics model for the emergence of consensus

    NASA Astrophysics Data System (ADS)

    Raffaelli, Giacomo; Marsili, Matteo

    2005-07-01

    The statistical properties of pairwise majority voting over S alternatives are analyzed in an infinite random population. We first compute the probability that the majority is transitive (i.e., that if it prefers A to B to C, then it prefers A to C) and then study the case of an interacting population. This is described by a constrained multicomponent random field Ising model whose ferromagnetic phase describes the emergence of a strong transitive majority. We derive the phase diagram, which is characterized by a tricritical point, and show that, contrary to intuition, it may be more likely for an interacting population to reach consensus on a number S of alternatives when S increases. This effect is due to the constraint imposed by transitivity on voting behavior. Indeed, if agents are allowed to express nontransitive votes, the agents’ interaction may considerably decrease the probability of a transitive majority.
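
    The first quantity mentioned above, the probability that the pairwise majority is transitive, can be estimated by Monte Carlo in the non-interacting case; the sketch below assumes voters with independent, uniformly random preference orders, and the voter and trial counts are illustrative choices rather than the paper's.

      # Hedged sketch: Monte Carlo estimate of the probability of a transitive
      # pairwise majority for independent, uniformly random voters (the
      # non-interacting case); parameters are illustrative.
      import itertools
      import random

      def majority_is_transitive(n_voters=101, S=3):
          rankings = [random.sample(range(S), S) for _ in range(n_voters)]
          pos = [{a: r.index(a) for a in range(S)} for r in rankings]
          # wins[a][b] is True if a strict majority prefers a to b
          wins = [[sum(p[a] < p[b] for p in pos) > n_voters / 2 for b in range(S)]
                  for a in range(S)]
          # the majority relation is transitive iff some ordering of the
          # alternatives is consistent with every pairwise majority
          return any(all(wins[order[i]][order[j]]
                         for i in range(S) for j in range(i + 1, S))
                     for order in itertools.permutations(range(S)))

      trials = 5000
      p = sum(majority_is_transitive() for _ in range(trials)) / trials
      print(f"P(transitive majority) ~ {p:.3f}")   # about 0.91 for S=3 with many voters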

  10. GAFFE: a gaze-attentive fixation finding engine.

    PubMed

    Rajashekar, U; van der Linde, I; Bovik, A C; Cormack, L K

    2008-04-01

    The ability to automatically detect visually interesting regions in images has many practical applications, especially in the design of active machine vision and automatic visual surveillance systems. Analysis of the statistics of image features at observers' gaze can provide insights into the mechanisms of fixation selection in humans. Using a foveated analysis framework, we studied the statistics of four low-level local image features: luminance, contrast, and bandpass outputs of both luminance and contrast, and discovered that image patches around human fixations had, on average, higher values of each of these features than image patches selected at random. Contrast-bandpass showed the greatest difference between human and random fixations, followed by luminance-bandpass, RMS contrast, and luminance. Using these measurements, we present a new algorithm that selects image regions as likely candidates for fixation. These regions are shown to correlate well with fixations recorded from human observers.
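
    The patch comparison described above can be sketched as follows: mean luminance and RMS contrast are computed for patches centred on fixations and for patches at random locations. The image, fixation coordinates and patch size are placeholders, and the bandpass features of the original study are omitted.

      # Hedged sketch: luminance and RMS contrast of fixated vs. random image patches.
      # The image and fixation list are placeholders; bandpass features are omitted.
      import numpy as np

      rng = np.random.default_rng(0)
      image = rng.random((512, 512))                    # placeholder grayscale image
      fixations = [(100, 200), (256, 256), (400, 80)]   # placeholder (row, col) gaze points
      half = 32                                         # patch half-width in pixels

      def patch_stats(img, points, half):
          lum, rms = [], []
          for r, c in points:
              p = img[r - half:r + half, c - half:c + half]
              lum.append(p.mean())
              rms.append(p.std() / (p.mean() + 1e-12))  # RMS contrast
          return np.mean(lum), np.mean(rms)

      random_points = [tuple(rng.integers(half, 512 - half, size=2)) for _ in fixations]
      print("fixation patches (mean luminance, mean RMS contrast):", patch_stats(image, fixations, half))
      print("random patches   (mean luminance, mean RMS contrast):", patch_stats(image, random_points, half))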

  11. Multiscale volatility duration characteristics on financial multi-continuum percolation dynamics

    NASA Astrophysics Data System (ADS)

    Wang, Min; Wang, Jun

    A random stock price model based on the multi-continuum percolation system is developed to investigate the nonlinear dynamics of stock price volatility duration, in an attempt to explain various statistical facts found in financial data and to gain a deeper understanding of mechanisms in the financial market. The continuum percolation system, usually referred to as a random coverage process or a Boolean model, is a member of a class of statistical physics systems. In this paper, multi-continuum percolation (with different values of the radius) is employed to model and reproduce the dispersal of information among investors. To test the validity of the proposed model, nonlinear analyses of the return volatility duration series are performed by multifractal detrending moving average analysis and Zipf analysis. The comparison of empirical results indicates similar nonlinear behaviors for the proposed model and the actual Chinese stock market.

  12. Addressing the statistical mechanics of planet orbits in the solar system

    NASA Astrophysics Data System (ADS)

    Mogavero, Federico

    2017-10-01

    The chaotic nature of planet dynamics in the solar system suggests the relevance of a statistical approach to planetary orbits. In such a statistical description, the time-dependent position and velocity of the planets are replaced by the probability density function (PDF) of their orbital elements. It is natural to set up this kind of approach in the framework of statistical mechanics. In the present paper, I focus on the collisionless excitation of eccentricities and inclinations via gravitational interactions in a planetary system. The future planet trajectories in the solar system constitute the prototype of this kind of dynamics. I thus address the statistical mechanics of the solar system planet orbits and try to reproduce the PDFs numerically constructed by Laskar (2008, Icarus, 196, 1). I show that the microcanonical ensemble of the Laplace-Lagrange theory accurately reproduces the statistics of the giant planet orbits. To model the inner planets I then investigate the ansatz of equiprobability in the phase space constrained by the secular integrals of motion. The eccentricity and inclination PDFs of Earth and Venus are reproduced with no free parameters. Within the limitations of a stationary model, the predictions also show a reasonable agreement with Mars PDFs and that of Mercury inclination. The eccentricity of Mercury demands in contrast a deeper analysis. I finally revisit the random walk approach of Laskar to the time dependence of the inner planet PDFs. Such a statistical theory could be combined with direct numerical simulations of planet trajectories in the context of planet formation, which is likely to be a chaotic process.

  13. Spectral statistics of random geometric graphs

    NASA Astrophysics Data System (ADS)

    Dettmann, C. P.; Georgiou, O.; Knight, G.

    2017-04-01

    We use random matrix theory to study the spectrum of random geometric graphs, a fundamental model of spatial networks. Considering ensembles of random geometric graphs, we look at short-range correlations in the level spacings of the spectrum via the nearest-neighbour and next-nearest-neighbour spacing distributions, and at long-range correlations via the spectral rigidity Δ3 statistic. These correlations in the level spacings give information about localisation of eigenvectors, the level of community structure and the level of randomness within the networks. We find a parameter-dependent transition between Poisson and Gaussian orthogonal ensemble statistics. That is, the spectral statistics of spatial random geometric graphs fit the universality of random matrix theory found in other models such as Erdős-Rényi, Barabási-Albert and Watts-Strogatz random graphs.
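
    A minimal numerical illustration of the level-spacing analysis: build one random geometric graph, compute its adjacency spectrum, and compare the normalized nearest-neighbour spacings with the Poisson and GOE (Wigner surmise) reference densities. The normalization below is a crude stand-in for proper spectral unfolding, and the graph parameters are illustrative.

      # Hedged sketch: nearest-neighbour level spacings of a random geometric graph's
      # adjacency spectrum, with a crude unit-mean normalization in place of the
      # careful spectral unfolding used in such studies. Parameters are illustrative.
      import numpy as np
      import networkx as nx

      G = nx.random_geometric_graph(n=500, radius=0.1, seed=1)
      eigvals = np.sort(np.linalg.eigvalsh(nx.to_numpy_array(G)))

      spacings = np.diff(eigvals)
      spacings = spacings / spacings.mean()        # crude "unfolding" to unit mean spacing

      # Reference densities: Poisson p(s) = exp(-s) and the GOE Wigner surmise
      # p(s) = (pi/2) s exp(-pi s^2 / 4); level repulsion shows up as a deficit
      # of very small spacings relative to Poisson.
      bins = np.linspace(0, 3, 31)
      centers = 0.5 * (bins[:-1] + bins[1:])
      hist, _ = np.histogram(spacings, bins=bins, density=True)
      poisson = np.exp(-centers)
      wigner = (np.pi / 2) * centers * np.exp(-np.pi * centers**2 / 4)

      for c, h, p, w in zip(centers[:6], hist[:6], poisson[:6], wigner[:6]):
          print(f"s={c:4.2f}  empirical={h:5.3f}  Poisson={p:5.3f}  GOE={w:5.3f}")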

  14. Generation Mechanism of Nonlinear Rayleigh Surface Waves for Randomly Distributed Surface Micro-Cracks.

    PubMed

    Ding, Xiangyan; Li, Feilong; Zhao, Youxuan; Xu, Yongmei; Hu, Ning; Cao, Peng; Deng, Mingxi

    2018-04-23

    This paper investigates the propagation of Rayleigh surface waves in structures with randomly distributed surface micro-cracks using numerical simulations. The results revealed a significant ultrasonic nonlinear effect caused by the surface micro-cracks, which is mainly represented by a second harmonic with even more distinct third/quadruple harmonics. Based on statistical analysis from the numerous results of random micro-crack models, it is clearly found that the acoustic nonlinear parameter increases linearly with micro-crack density, the proportion of surface cracks, the size of micro-crack zone, and the excitation frequency. This study theoretically reveals that nonlinear Rayleigh surface waves are feasible for use in quantitatively identifying the physical characteristics of surface micro-cracks in structures.
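
    Acoustic nonlinearity of the kind described above is commonly quantified by a relative parameter proportional to A2/A1², the second-harmonic amplitude over the squared fundamental amplitude in the received spectrum. The sketch below estimates it from a synthetic waveform; the signal, sampling rate and excitation frequency are placeholders for a simulated or measured Rayleigh-wave record.

      # Hedged sketch: relative acoustic nonlinearity parameter beta' ~ A2 / A1**2
      # estimated from the FFT of a received waveform. The synthetic signal below
      # merely stands in for a simulated or measured record.
      import numpy as np

      fs = 50e6                        # sampling rate (Hz), illustrative
      f0 = 1e6                         # excitation frequency (Hz), illustrative
      t = np.arange(0, 200e-6, 1 / fs)
      signal = np.sin(2 * np.pi * f0 * t) + 0.02 * np.sin(2 * np.pi * 2 * f0 * t)

      spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
      freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

      A1 = spectrum[np.argmin(np.abs(freqs - f0))]        # fundamental amplitude
      A2 = spectrum[np.argmin(np.abs(freqs - 2 * f0))]    # second-harmonic amplitude
      print(f"A1={A1:.3e}, A2={A2:.3e}, relative beta={A2 / A1**2:.3e}")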

  15. Generation Mechanism of Nonlinear Rayleigh Surface Waves for Randomly Distributed Surface Micro-Cracks

    PubMed Central

    Ding, Xiangyan; Li, Feilong; Xu, Yongmei; Cao, Peng; Deng, Mingxi

    2018-01-01

    This paper investigates the propagation of Rayleigh surface waves in structures with randomly distributed surface micro-cracks using numerical simulations. The results revealed a significant ultrasonic nonlinear effect caused by the surface micro-cracks, which is mainly represented by a second harmonic with even more distinct third/quadruple harmonics. Based on statistical analysis from the numerous results of random micro-crack models, it is clearly found that the acoustic nonlinear parameter increases linearly with micro-crack density, the proportion of surface cracks, the size of micro-crack zone, and the excitation frequency. This study theoretically reveals that nonlinear Rayleigh surface waves are feasible for use in quantitatively identifying the physical characteristics of surface micro-cracks in structures. PMID:29690580

  16. Is using multiple imputation better than complete case analysis for estimating a prevalence (risk) difference in randomized controlled trials when binary outcome observations are missing?

    PubMed

    Mukaka, Mavuto; White, Sarah A; Terlouw, Dianne J; Mwapasa, Victor; Kalilani-Phiri, Linda; Faragher, E Brian

    2016-07-22

    Missing outcomes can seriously impair the ability to make correct inferences from randomized controlled trials (RCTs). Complete case (CC) analysis is commonly used, but it reduces sample size and is perceived to lead to reduced statistical efficiency of estimates while increasing the potential for bias. As multiple imputation (MI) methods preserve sample size, they are generally viewed as the preferred analytical approach. We examined this assumption, comparing the performance of CC and MI methods to determine risk difference (RD) estimates in the presence of missing binary outcomes. We conducted simulation studies of 5000 simulated data sets with 50 imputations of RCTs with one primary follow-up endpoint at different underlying levels of RD (3-25 %) and missing outcomes (5-30 %). For missing at random (MAR) or missing completely at random (MCAR) outcomes, CC method estimates generally remained unbiased and achieved precision similar to or better than MI methods, and high statistical coverage. Missing not at random (MNAR) scenarios yielded invalid inferences with both methods. Effect size estimate bias was reduced in MI methods by always including group membership even if this was unrelated to missingness. Surprisingly, under MAR and MCAR conditions in the assessed scenarios, MI offered no statistical advantage over CC methods. While MI must inherently accompany CC methods for intention-to-treat analyses, these findings endorse CC methods for per protocol risk difference analyses in these conditions. These findings provide an argument for the use of the CC approach to always complement MI analyses, with the usual caveat that the validity of the mechanism for missingness be thoroughly discussed. More importantly, researchers should strive to collect as much data as possible.
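
    One slice of the simulation logic can be sketched directly: generate binary outcomes in two arms, impose MCAR missingness, and check that the complete-case risk-difference estimate stays essentially unbiased. The parameters are illustrative and the multiple-imputation comparison arm is omitted for brevity.

      # Hedged sketch of the complete-case (CC) side of such a simulation under MCAR;
      # parameters are illustrative and the MI arm is omitted for brevity.
      import numpy as np

      rng = np.random.default_rng(42)
      n_per_arm, p_control, true_rd, p_missing = 300, 0.20, 0.10, 0.20
      n_sims = 2000

      estimates = []
      for _ in range(n_sims):
          control = rng.binomial(1, p_control, n_per_arm).astype(float)
          treat = rng.binomial(1, p_control + true_rd, n_per_arm).astype(float)
          # MCAR: each outcome goes missing independently of everything else
          control[rng.random(n_per_arm) < p_missing] = np.nan
          treat[rng.random(n_per_arm) < p_missing] = np.nan
          estimates.append(np.nanmean(treat) - np.nanmean(control))   # CC risk difference

      print(f"true RD = {true_rd:.3f}, mean CC estimate = {np.mean(estimates):.3f}, "
            f"empirical SE = {np.std(estimates):.3f}")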

  17. A Statistical Approach to Establishing Subsystem Environmental Test Specifications

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1974-01-01

    Results are presented of a research task to evaluate structural responses at various subsystem mounting locations during spacecraft level test exposures to the environments of mechanical shock, acoustic noise, and random vibration. This statistical evaluation is presented in the form of recommended subsystem test specifications for these three environments as normalized to a reference set of spacecraft test levels and are thus suitable for extrapolation to a set of different spacecraft test levels. The recommendations are dependent upon a subsystem's mounting location in a spacecraft, and information is presented on how to determine this mounting zone for a given subsystem.

  18. Linguistic Analysis of the Human Heartbeat Using Frequency and Rank Order Statistics

    NASA Astrophysics Data System (ADS)

    Yang, Albert C.-C.; Hseu, Shu-Shya; Yien, Huey-Wen; Goldberger, Ary L.; Peng, C.-K.

    2003-03-01

    Complex physiologic signals may carry unique dynamical signatures that are related to their underlying mechanisms. We present a method based on rank order statistics of symbolic sequences to investigate the profile of different types of physiologic dynamics. We apply this method to heart rate fluctuations, the output of a central physiologic control system. The method robustly discriminates patterns generated from healthy and pathologic states, as well as aging. Furthermore, we observe increased randomness in the heartbeat time series with physiologic aging and pathologic states and also uncover nonrandom patterns in the ventricular response to atrial fibrillation.
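
    The symbolic, rank-order idea can be sketched as follows: map successive beat-to-beat changes to binary symbols, count overlapping fixed-length words, and rank them by frequency. The RR-interval series below is synthetic, and the word length is an arbitrary choice for illustration.

      # Hedged sketch of a rank-order analysis of a symbolized heartbeat series;
      # the RR-interval series is synthetic and the word length is arbitrary.
      from collections import Counter
      import numpy as np

      rng = np.random.default_rng(3)
      rr = 800 + np.cumsum(rng.normal(0, 5, size=5000))   # placeholder RR intervals (ms)

      symbols = (np.diff(rr) > 0).astype(int)             # 1 = interval increased, 0 = otherwise
      word_len = 8
      words = ["".join(map(str, symbols[i:i + word_len]))
               for i in range(len(symbols) - word_len + 1)]

      ranked = Counter(words).most_common()
      total = len(words)
      for rank, (word, count) in enumerate(ranked[:5], start=1):
          print(f"rank {rank}: word {word}  frequency {count / total:.4f}")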

  19. Balancing strength and toughness of calcium-silicate-hydrate via random nanovoids and particle inclusions: Atomistic modeling and statistical analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Ning; Shahsavari, Rouzbeh

    2016-11-01

    As the most widely used manufactured material on Earth, concrete poses serious societal and environmental concerns which call for innovative strategies to develop greener concrete with improved strength and toughness, properties that are typically mutually exclusive in man-made materials. Herein, we focus on calcium silicate hydrate (C-S-H), the major binding phase of all Portland cement concretes, and study how engineering its nanovoids and portlandite particle inclusions can impart a balance of strength, toughness and stiffness. By performing an extensive set of more than 600 molecular dynamics simulations coupled with statistical analysis tools, our results provide new evidence of ductile fracture mechanisms in C-S-H - reminiscent of crystalline alloys and ductile metals - decoding the interplay between crack growth, nanovoid/particle inclusions, and stoichiometry, which dictates the crystalline versus amorphous nature of the underlying matrix. We found that the introduction of voids and portlandite particles can significantly increase toughness and ductility, especially in C-S-H with more amorphous matrices, mainly owing to competing mechanisms of crack deflection, void coalescence, internal necking, accommodation, and geometry alteration of individual voids/particles, which together regulate toughness versus strength. Furthermore, utilizing a comprehensive global sensitivity analysis on random configuration-property relations, we show that the mean diameter of voids/particles is the most critical statistical parameter influencing the mechanical properties of C-S-H, irrespective of stoichiometry or the crystalline or amorphous nature of the matrix. This study provides new fundamental insights, design guidelines, and de novo strategies to turn the brittle C-S-H into a ductile material, impacting modern engineering of strong and tough concrete infrastructures and potentially other complex brittle materials.

  20. Characteristics of level-spacing statistics in chaotic graphene billiards.

    PubMed

    Huang, Liang; Lai, Ying-Cheng; Grebogi, Celso

    2011-03-01

    A fundamental result in nonrelativistic quantum nonlinear dynamics is that the spectral statistics of quantum systems that possess no geometric symmetry, but whose classical dynamics are chaotic, are described by those of the Gaussian orthogonal ensemble (GOE) or the Gaussian unitary ensemble (GUE), in the presence or absence of time-reversal symmetry, respectively. For massless spin-half particles such as neutrinos in relativistic quantum mechanics in a chaotic billiard, the seminal work of Berry and Mondragon established the GUE nature of the level-spacing statistics, due to the combination of the chirality of Dirac particles and the confinement, which breaks the time-reversal symmetry. A question is whether the GOE or the GUE statistics can be observed in experimentally accessible, relativistic quantum systems. We demonstrate, using graphene confinements in which the quasiparticle motions are governed by the Dirac equation in the low-energy regime, that the level-spacing statistics are persistently those of GOE random matrices. We present extensive numerical evidence obtained from the tight-binding approach and a physical explanation for the GOE statistics. We also find that the presence of a weak magnetic field switches the statistics to those of GUE. For a strong magnetic field, Landau levels become influential, causing the level-spacing distribution to deviate markedly from the random-matrix predictions. Issues addressed also include the effects of a number of realistic factors on level-spacing statistics such as next nearest-neighbor interactions, different lattice orientations, enhanced hopping energy for atoms on the boundary, and staggered potential due to graphene-substrate interactions.

  1. Elastin: a representative ideal protein elastomer.

    PubMed Central

    Urry, D W; Hugel, T; Seitz, M; Gaub, H E; Sheiba, L; Dea, J; Xu, J; Parker, T

    2002-01-01

    During the last half century, identification of an ideal (predominantly entropic) protein elastomer was generally thought to require that the ideal protein elastomer be a random chain network. Here, we report two new sets of data and review previous data. The first set of new data utilizes atomic force microscopy to report single-chain force-extension curves for (GVGVP)(251) and (GVGIP)(260), and provides evidence for single-chain ideal elasticity. The second class of new data provides a direct contrast between low-frequency sound absorption (0.1-10 kHz) exhibited by random-chain network elastomers and by elastin protein-based polymers. Earlier composition, dielectric relaxation (1-1000 MHz), thermoelasticity, molecular mechanics and dynamics calculations and thermodynamic and statistical mechanical analyses are presented, that combine with the new data to contrast with random-chain network rubbers and to detail the presence of regular non-random structural elements of the elastin-based systems that lose entropic elastomeric force upon thermal denaturation. The data and analyses affirm an earlier contrary argument that components of elastin, the elastic protein of the mammalian elastic fibre, and purified elastin fibre itself contain dynamic, non-random, regularly repeating structures that exhibit dominantly entropic elasticity by means of a damping of internal chain dynamics on extension. PMID:11911774

  2. Record statistics of a strongly correlated time series: random walks and Lévy flights

    NASA Astrophysics Data System (ADS)

    Godrèche, Claude; Majumdar, Satya N.; Schehr, Grégory

    2017-08-01

    We review recent advances on the record statistics of strongly correlated time series, whose entries denote the positions of a random walk or a Lévy flight on a line. After a brief survey of the theory of records for independent and identically distributed random variables, we focus on random walks. During the last few years, it was indeed realized that random walks are a very useful ‘laboratory’ to test the effects of correlations on the record statistics. We start with the simple one-dimensional random walk with symmetric jumps (both continuous and discrete) and discuss in detail the statistics of the number of records, as well as of the ages of the records, i.e. the lapses of time between two successive record breaking events. Then we review the results that were obtained for a wide variety of random walk models, including random walks with a linear drift, continuous time random walks, constrained random walks (like the random walk bridge) and the case of multiple independent random walkers. Finally, we discuss further observables related to records, like the record increments, as well as some questions raised by physical applications of record statistics, like the effects of measurement error and noise.
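
    One of the basic results reviewed above, the growth of the mean record number of a symmetric random walk as sqrt(4N/π), is easy to check numerically; the walk length, step distribution and ensemble size below are illustrative choices.

      # Hedged sketch: count upper records of symmetric random walks and compare the
      # mean record number with the large-N form sqrt(4N/pi); parameters illustrative.
      import numpy as np

      rng = np.random.default_rng(7)
      n_steps, n_walks = 5000, 1000

      positions = np.cumsum(rng.normal(size=(n_walks, n_steps)), axis=1)
      running_max = np.maximum.accumulate(positions, axis=1)
      # a record occurs whenever the walk strictly exceeds its previous maximum;
      # the first position is counted as a record
      records = 1 + np.sum(positions[:, 1:] > running_max[:, :-1], axis=1)

      print("simulated mean record number :", records.mean())
      print("asymptotic sqrt(4N/pi)       :", np.sqrt(4 * n_steps / np.pi))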

  3. A comparative, prospective, and randomized study of two conservative treatment protocols for first-episode lateral ankle ligament injuries.

    PubMed

    Prado, Marcelo Pires; Mendes, Alberto Abussamra Moreira; Amodio, Daniel Tasseto; Camanho, Gilberto Luis; Smyth, Niall A; Fernandes, Tulio Diniz

    2014-03-01

    The objective of this study was to investigate functional results, the amount of time that patients missed from regular working activities, and the incidence of residual mechanical ankle instability following conservative treatment of a first episode of severe lateral ankle ligament sprain (with articular instability). This prospective and randomized study included 186 patients with severe lateral ankle ligament injuries, who were randomly assigned into 2 conservative treatment groups. In group A, participants were treated with a walking boot with weight-bearing allowed, pain management, ice, and elevation with restricted joint mobilization for 3 weeks. In group B, patients were treated with a functional brace for 3 weeks. After this period, patients from both groups were placed in a short, functional brace for an additional 3 weeks, during which they also started a rehabilitation program. No statistically significant difference was found in pain intensity score between the 2 groups; however, functional evaluations based on the AOFAS ankle and hindfoot score system showed a statistically significant improvement in the group treated with the functional brace. In addition, the average recovery period necessary for patients of group B to resume their duties was shorter than that for patients in group A. No significant difference was detected in residual mechanical ankle instability between the 2 groups. Patients with severe lateral ankle ligament lesions treated with a functional brace were shown to exhibit somewhat better results than those treated with a walking boot, and both methods presented a very low incidence of residual chronic instability. We found adequate conservative treatment was sufficient to reestablish ankle stability and that functional treatment had a marginally better clinical short-term outcome with a shorter average recovery period. Level I, prospective randomized study.

  4. Errors in causal inference: an organizational schema for systematic error and random error.

    PubMed

    Suzuki, Etsuji; Tsuda, Toshihide; Mitsuhashi, Toshiharu; Mansournia, Mohammad Ali; Yamamoto, Eiji

    2016-11-01

    To provide an organizational schema for systematic error and random error in estimating causal measures, aimed at clarifying the concept of errors from the perspective of causal inference. We propose to divide systematic error into structural error and analytic error. With regard to random error, our schema shows its four major sources: nondeterministic counterfactuals, sampling variability, a mechanism that generates exposure events and measurement variability. Structural error is defined from the perspective of counterfactual reasoning and divided into nonexchangeability bias (which comprises confounding bias and selection bias) and measurement bias. Directed acyclic graphs are useful to illustrate this kind of error. Nonexchangeability bias implies a lack of "exchangeability" between the selected exposed and unexposed groups. A lack of exchangeability is not a primary concern of measurement bias, justifying its separation from confounding bias and selection bias. Many forms of analytic errors result from the small-sample properties of the estimator used and vanish asymptotically. Analytic error also results from wrong (misspecified) statistical models and inappropriate statistical methods. Our organizational schema is helpful for understanding the relationship between systematic error and random error from a previously less investigated aspect, enabling us to better understand the relationship between accuracy, validity, and precision. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Two statistical mechanics aspects of complex networks

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Biely, Christoly

    2006-12-01

    By adopting an ensemble interpretation of non-growing rewiring networks, network theory can be reduced to a counting problem of possible network states and an identification of their associated probabilities. We present two scenarios of how different rewiring schemes can be used to control the state probabilities of the system. In particular, we review how, by generalizing the linking rules of random graphs in combination with superstatistics and quantum mechanical concepts, one can establish an exact relation between the degree distribution of any given network and the nodes’ linking probability distributions. In a second approach, we control state probabilities by a network Hamiltonian, whose characteristics are motivated by biological and socio-economical statistical systems. We demonstrate that a thermodynamics of networks becomes a fully consistent concept, allowing one, for example, to study ‘phase transitions’ and to compute entropies through thermodynamic relations.

  6. Path statistics, memory, and coarse-graining of continuous-time random walks on networks

    PubMed Central

    Kion-Crosby, Willow; Morozov, Alexandre V.

    2015-01-01

    Continuous-time random walks (CTRWs) on discrete state spaces, ranging from regular lattices to complex networks, are ubiquitous across physics, chemistry, and biology. Models with coarse-grained states (for example, those employed in studies of molecular kinetics) or spatial disorder can give rise to memory and non-exponential distributions of waiting times and first-passage statistics. However, existing methods for analyzing CTRWs on complex energy landscapes do not address these effects. Here we use statistical mechanics of the nonequilibrium path ensemble to characterize first-passage CTRWs on networks with arbitrary connectivity, energy landscape, and waiting time distributions. Our approach can be applied to calculating higher moments (beyond the mean) of path length, time, and action, as well as statistics of any conservative or non-conservative force along a path. For homogeneous networks, we derive exact relations between length and time moments, quantifying the validity of approximating a continuous-time process with its discrete-time projection. For more general models, we obtain recursion relations, reminiscent of transfer matrix and exact enumeration techniques, to efficiently calculate path statistics numerically. We have implemented our algorithm in PathMAN (Path Matrix Algorithm for Networks), a Python script that users can apply to their model of choice. We demonstrate the algorithm on a few representative examples which underscore the importance of non-exponential distributions, memory, and coarse-graining in CTRWs. PMID:26646868

  7. Central Limit Theorem for Exponentially Quasi-local Statistics of Spin Models on Cayley Graphs

    NASA Astrophysics Data System (ADS)

    Reddy, Tulasi Ram; Vadlamani, Sreekar; Yogeshwaran, D.

    2018-04-01

    Central limit theorems for linear statistics of lattice random fields (including spin models) are usually proven under suitable mixing conditions or quasi-associativity. Many interesting examples of spin models do not satisfy mixing conditions, and on the other hand, it does not seem easy to show central limit theorem for local statistics via quasi-associativity. In this work, we prove general central limit theorems for local statistics and exponentially quasi-local statistics of spin models on discrete Cayley graphs with polynomial growth. Further, we supplement these results by proving similar central limit theorems for random fields on discrete Cayley graphs taking values in a countable space, but under the stronger assumptions of α -mixing (for local statistics) and exponential α -mixing (for exponentially quasi-local statistics). All our central limit theorems assume a suitable variance lower bound like many others in the literature. We illustrate our general central limit theorem with specific examples of lattice spin models and statistics arising in computational topology, statistical physics and random networks. Examples of clustering spin models include quasi-associated spin models with fast decaying covariances like the off-critical Ising model, level sets of Gaussian random fields with fast decaying covariances like the massive Gaussian free field and determinantal point processes with fast decaying kernels. Examples of local statistics include intrinsic volumes, face counts, component counts of random cubical complexes while exponentially quasi-local statistics include nearest neighbour distances in spin models and Betti numbers of sub-critical random cubical complexes.

  8. Record statistics of financial time series and geometric random walks

    NASA Astrophysics Data System (ADS)

    Sabir, Behlool; Santhanam, M. S.

    2014-09-01

    The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of select stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5 ≤ α ≤ 1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of the geometric random walk series are in good agreement with those obtained from empirical stock data.

  9. Dietary Weight Loss and Exercise Effects on Serum Biomarkers of Angiogenesis in Overweight Postmenopausal Women: A Randomized Controlled Trial.

    PubMed

    Duggan, Catherine; Tapsoba, Jean de Dieu; Wang, Ching-Yun; McTiernan, Anne

    2016-07-15

    Obese and sedentary persons have an increased risk for cancer, but underlying mechanisms are poorly understood. Angiogenesis is common to adipose tissue formation and remodeling, and to tumor vascularization. A total of 439 overweight/obese, healthy, postmenopausal women [body mass index (BMI) > 25 kg/m(2)] ages 50-75 years, recruited between 2005 and 2008 were randomized to a 4-arm 12-month randomized controlled trial, comparing a caloric restriction diet arm (goal: 10% weight loss, N = 118), aerobic exercise arm (225 minutes/week of moderate-to-vigorous activity, N = 117), a combined diet + exercise arm (N = 117), or control (N = 87) on circulating levels of angiogenic biomarkers. VEGF, plasminogen activator inhibitor-1 (PAI-1), and pigment epithelium-derived factor (PEDF) were measured by immunoassay at baseline and 12 months. Changes were compared using generalized estimating equations, adjusting for baseline BMI, age, and race/ethnicity. Participants randomized to the diet + exercise arms had statistically significantly greater reductions in PAI-1 at 12 months compared with controls (-19.3% vs. +3.48%, respectively, P < 0.0001). Participants randomized to the diet and diet + exercise arms had statistically significantly greater reductions in PEDF (-9.20%, -9.90%, respectively, both P < 0.0001) and VEGF (-8.25%, P = 0.0005; -9.98%, P < 0.0001, respectively) compared with controls. There were no differences in any of the analytes in participants randomized to the exercise arm compared with controls. Increasing weight loss was statistically significantly associated with linear trends of greater reductions in PAI-1, PEDF, and VEGF. Weight loss is significantly associated with reduced circulating VEGF, PEDF, and PAI-1, and could provide incentive for reducing weight as a cancer prevention method in overweight and obese individuals. Cancer Res; 76(14); 4226-35. ©2016 AACR. ©2016 American Association for Cancer Research.

  10. Differential privacy-based evaporative cooling feature selection and classification with relief-F and random forests.

    PubMed

    Le, Trang T; Simmons, W Kyle; Misaki, Masaya; Bodurka, Jerzy; White, Bill C; Savitz, Jonathan; McKinney, Brett A

    2017-09-15

    Classification of individuals into disease or clinical categories from high-dimensional biological data with low prediction error is an important challenge of statistical learning in bioinformatics. Feature selection can improve classification accuracy but must be incorporated carefully into cross-validation to avoid overfitting. Recently, feature selection methods based on differential privacy, such as differentially private random forests and reusable holdout sets, have been proposed. However, for domains such as bioinformatics, where the number of features is much larger than the number of observations p≫n , these differential privacy methods are susceptible to overfitting. We introduce private Evaporative Cooling, a stochastic privacy-preserving machine learning algorithm that uses Relief-F for feature selection and random forest for privacy preserving classification that also prevents overfitting. We relate the privacy-preserving threshold mechanism to a thermodynamic Maxwell-Boltzmann distribution, where the temperature represents the privacy threshold. We use the thermal statistical physics concept of Evaporative Cooling of atomic gases to perform backward stepwise privacy-preserving feature selection. On simulated data with main effects and statistical interactions, we compare accuracies on holdout and validation sets for three privacy-preserving methods: the reusable holdout, reusable holdout with random forest, and private Evaporative Cooling, which uses Relief-F feature selection and random forest classification. In simulations where interactions exist between attributes, private Evaporative Cooling provides higher classification accuracy without overfitting based on an independent validation set. In simulations without interactions, thresholdout with random forest and private Evaporative Cooling give comparable accuracies. We also apply these privacy methods to human brain resting-state fMRI data from a study of major depressive disorder. Code available at http://insilico.utulsa.edu/software/privateEC . brett-mckinney@utulsa.edu. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  11. Postural stability effects of random vibration at the feet of construction workers in simulated elevation.

    PubMed

    Simeonov, P; Hsiao, H; Powers, J; Ammons, D; Kau, T; Amendola, A

    2011-07-01

    The risk of falls from height on a construction site increases under conditions which degrade workers' postural control. At elevation, workers depend heavily on sensory information from their feet to maintain balance. The study tested two hypotheses: "sensory enhancement"--sub-sensory (undetectable) random mechanical vibrations at the plantar surface of the feet can improve worker's balance at elevation; and "sensory suppression"--supra-sensory (detectable) random mechanical vibrations can have a degrading effect on balance in the same experimental settings. Six young (age 20-35) and six aging (age 45-60) construction workers were tested while standing in standard and semi-tandem postures on instrumented gel insoles. The insoles applied sub- or supra-sensory levels of random mechanical vibrations to the feet. The tests were conducted in a surround-screen virtual reality system, which simulated a narrow plank at elevation on a construction site. Upper body kinematics was assessed with a motion-measurement system. Postural stability effects were evaluated by conventional and statistical mechanics sway measures, as well as trunk angular displacement parameters. Analysis of variance did not confirm the "sensory enhancement" hypothesis, but provided evidence for the "sensory suppression" hypothesis. The supra-sensory vibration had a destabilizing effect, which was considerably stronger in the semi-tandem posture and affected most of the sway variables. Sensory suppression associated with elevated vibration levels on a construction site may increase the danger of losing balance. Construction workers at elevation, e.g., on a beam or narrow plank might be at increased risk of fall if they can detect vibrations under their feet. To reduce the possibility of losing balance, mechanical vibration to supporting structures used as walking/working surfaces should be minimized when performing construction tasks at elevation. Published by Elsevier Ltd.

  12. Impact of oral care with versus without toothbrushing on the prevention of ventilator-associated pneumonia: a systematic review and meta-analysis of randomized controlled trials

    PubMed Central

    2012-01-01

    Introduction Ventilator-associated pneumonia (VAP) remains a common hazardous complication in mechanically ventilated patients and is associated with increased morbidity and mortality. We undertook a systematic review and meta-analysis of randomized controlled trials to assess the effect of toothbrushing as a component of oral care on the prevention of VAP in adult critically ill patients. Methods A systematic literature search of PubMed and Embase (up to April 2012) was conducted. Eligible studies were randomized controlled trials of mechanically ventilated adult patients receiving oral care with toothbrushing. Relative risks (RRs), weighted mean differences (WMDs), and 95% confidence intervals (CIs) were calculated and heterogeneity was assessed with the I2 test. Results Four studies with a total of 828 patients met the inclusion criteria. Toothbrushing did not significantly reduce the incidence of VAP (RR, 0.77; 95% CI, 0.50 to 1.21) and intensive care unit mortality (RR, 0.88; 95% CI, 0.70 to 1.10). Toothbrushing was not associated with a statistically significant reduction in duration of mechanical ventilation (WMD, -0.88 days; 95% CI, -2.58 to 0.82), length of intensive care unit stay (WMD, -1.48 days; 95% CI, -3.40 to 0.45), antibiotic-free day (WMD, -0.52 days; 95% CI, -2.82 to 1.79), or mechanical ventilation-free day (WMD, -0.43 days; 95% CI, -1.23 to 0.36). Conclusions Oral care with toothbrushing versus without toothbrushing does not significantly reduce the incidence of VAP and alter other important clinical outcomes in mechanically ventilated patients. However, the results should be interpreted cautiously since relevant evidence is still limited, although accumulating. Further large-scale, well-designed randomized controlled trials are urgently needed. PMID:23062250
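
    The pooling behind such a meta-analysis can be sketched with inverse-variance weights on log relative risks together with the I² heterogeneity statistic; the 2×2 counts below are hypothetical placeholders, not the trials in the review, and a fixed-effect model is used for simplicity.

      # Hedged sketch: fixed-effect inverse-variance pooling of log relative risks
      # plus the I^2 heterogeneity statistic, on hypothetical 2x2 counts.
      import numpy as np

      # columns: events_treat, n_treat, events_control, n_control (placeholder trials)
      trials = np.array([
          [10, 100, 14, 100],
          [ 8,  90, 11,  92],
          [20, 210, 22, 205],
          [ 5,  60,  9,  58],
      ], dtype=float)

      e_t, n_t, e_c, n_c = trials.T
      log_rr = np.log((e_t / n_t) / (e_c / n_c))
      var = 1 / e_t - 1 / n_t + 1 / e_c - 1 / n_c       # variance of log RR
      w = 1 / var

      pooled = np.sum(w * log_rr) / np.sum(w)
      se = np.sqrt(1 / np.sum(w))
      Q = np.sum(w * (log_rr - pooled) ** 2)
      I2 = max(0.0, (Q - (len(trials) - 1)) / Q) * 100 if Q > 0 else 0.0

      lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
      print(f"pooled RR = {np.exp(pooled):.2f} "
            f"(95% CI {np.exp(lo):.2f} to {np.exp(hi):.2f}), I^2 = {I2:.0f}%")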

  13. Fluid-driven fracture propagation in heterogeneous media: Probability distributions of fracture trajectories

    NASA Astrophysics Data System (ADS)

    Santillán, David; Mosquera, Juan-Carlos; Cueto-Felgueroso, Luis

    2017-11-01

    Hydraulic fracture trajectories in rocks and other materials are highly affected by spatial heterogeneity in their mechanical properties. Understanding the complexity and structure of fluid-driven fractures and their deviation from the predictions of homogenized theories is a practical problem in engineering and geoscience. We conduct a Monte Carlo simulation study to characterize the influence of heterogeneous mechanical properties on the trajectories of hydraulic fractures propagating in elastic media. We generate a large number of random fields of mechanical properties and simulate pressure-driven fracture propagation using a phase-field model. We model the mechanical response of the material as that of an elastic isotropic material with heterogeneous Young modulus and Griffith energy release rate, assuming that fractures propagate in the toughness-dominated regime. Our study shows that the variance and the spatial covariance of the mechanical properties are controlling factors in the tortuousness of the fracture paths. We characterize the deviation of fracture paths from the homogenous case statistically, and conclude that the maximum deviation grows linearly with the distance from the injection point. Additionally, fracture path deviations seem to be normally distributed, suggesting that fracture propagation in the toughness-dominated regime may be described as a random walk.

  14. Fluid-driven fracture propagation in heterogeneous media: Probability distributions of fracture trajectories.

    PubMed

    Santillán, David; Mosquera, Juan-Carlos; Cueto-Felgueroso, Luis

    2017-11-01

    Hydraulic fracture trajectories in rocks and other materials are highly affected by spatial heterogeneity in their mechanical properties. Understanding the complexity and structure of fluid-driven fractures and their deviation from the predictions of homogenized theories is a practical problem in engineering and geoscience. We conduct a Monte Carlo simulation study to characterize the influence of heterogeneous mechanical properties on the trajectories of hydraulic fractures propagating in elastic media. We generate a large number of random fields of mechanical properties and simulate pressure-driven fracture propagation using a phase-field model. We model the mechanical response of the material as that of an elastic isotropic material with heterogeneous Young modulus and Griffith energy release rate, assuming that fractures propagate in the toughness-dominated regime. Our study shows that the variance and the spatial covariance of the mechanical properties are controlling factors in the tortuousness of the fracture paths. We characterize the deviation of fracture paths from the homogenous case statistically, and conclude that the maximum deviation grows linearly with the distance from the injection point. Additionally, fracture path deviations seem to be normally distributed, suggesting that fracture propagation in the toughness-dominated regime may be described as a random walk.

  15. Occupation times and ergodicity breaking in biased continuous time random walks

    NASA Astrophysics Data System (ADS)

    Bel, Golan; Barkai, Eli

    2005-12-01

    Continuous time random walk (CTRW) models are widely used to model diffusion in condensed matter. There are two classes of such models, distinguished by the convergence or divergence of the mean waiting time. Systems with finite average sojourn time are ergodic and thus Boltzmann-Gibbs statistics can be applied. We investigate the statistical properties of CTRW models with infinite average sojourn time; in particular, the occupation time probability density function is obtained. It is shown that in the non-ergodic phase the distribution of the occupation time of the particle on a given lattice point exhibits bimodal U or trimodal W shape, related to the arcsine law. The key points are as follows. (a) In a CTRW with finite or infinite mean waiting time, the distribution of the number of visits on a lattice point is determined by the probability that a member of an ensemble of particles in equilibrium occupies the lattice point. (b) The asymmetry parameter of the probability distribution function of occupation times is related to the Boltzmann probability and to the partition function. (c) The ensemble average is given by Boltzmann-Gibbs statistics for either finite or infinite mean sojourn time, when detailed balance conditions hold. (d) A non-ergodic generalization of the Boltzmann-Gibbs statistical mechanics for systems with infinite mean sojourn time is found.
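
    The U-shaped occupation-time statistics described above can be illustrated with a two-state continuous-time random walk whose sojourn times are heavy-tailed with infinite mean; the waiting-time exponent, observation time and ensemble size below are illustrative choices, not the paper's.

      # Hedged sketch: occupation-time fraction of one of two states for a CTRW with
      # heavy-tailed, infinite-mean sojourn times (Pareto, exponent alpha < 1).
      # A U-shaped histogram of the fraction illustrates the arcsine-like behaviour.
      import numpy as np

      rng = np.random.default_rng(11)
      alpha, t_total, n_particles = 0.5, 1e4, 2000
      fractions = []

      for _ in range(n_particles):
          t, t_in_state0, state = 0.0, 0.0, int(rng.integers(2))
          while t < t_total:
              wait = rng.pareto(alpha) + 1.0          # heavy-tailed sojourn time
              stay = min(wait, t_total - t)           # clip at the observation window
              if state == 0:
                  t_in_state0 += stay
              t += wait
              state = 1 - state                       # hop to the other state
          fractions.append(t_in_state0 / t_total)

      hist, _ = np.histogram(fractions, bins=10, range=(0, 1), density=True)
      print("occupation-fraction density over 10 bins (expect peaks near 0 and 1):")
      print(np.round(hist, 2))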

  16. Stochastic modelling, Bayesian inference, and new in vivo measurements elucidate the debated mtDNA bottleneck mechanism

    PubMed Central

    Johnston, Iain G; Burgstaller, Joerg P; Havlicek, Vitezslav; Kolbe, Thomas; Rülicke, Thomas; Brem, Gottfried; Poulton, Jo; Jones, Nick S

    2015-01-01

    Dangerous damage to mitochondrial DNA (mtDNA) can be ameliorated during mammalian development through a highly debated mechanism called the mtDNA bottleneck. Uncertainty surrounding this process limits our ability to address inherited mtDNA diseases. We produce a new, physically motivated, generalisable theoretical model for mtDNA populations during development, allowing the first statistical comparison of proposed bottleneck mechanisms. Using approximate Bayesian computation and mouse data, we find most statistical support for a combination of binomial partitioning of mtDNAs at cell divisions and random mtDNA turnover, meaning that the debated exact magnitude of mtDNA copy number depletion is flexible. New experimental measurements from a wild-derived mtDNA pairing in mice confirm the theoretical predictions of this model. We analytically solve a mathematical description of this mechanism, computing probabilities of mtDNA disease onset, efficacy of clinical sampling strategies, and effects of potential dynamic interventions, thus developing a quantitative and experimentally-supported stochastic theory of the bottleneck. DOI: http://dx.doi.org/10.7554/eLife.07464.001 PMID:26035426

  17. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    ERIC Educational Resources Information Center

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…

  18. Random electric field instabilities of relaxor ferroelectrics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arce-Gamboa, Jose R.; Guzman-Verri, Gian G.

    Relaxor ferroelectrics are complex oxide materials which offer a rather unique setting in which to study the effects of compositional disorder on phase transitions. Here, we study the effects of quenched cubic random electric fields on the lattice instabilities that lead to a ferroelectric transition and show that, within a microscopic model and a statistical mechanical solution, even weak compositional disorder can prohibit the development of long-range order and that a random field state with anisotropic and power-law correlations of polarization emerges from the combined effect of their characteristic dipole forces and their inherent charge disorder. As a result, we compare and reproduce several key experimental observations in the well-studied relaxor PbMg1/3Nb2/3O3–PbTiO3.

  19. Random electric field instabilities of relaxor ferroelectrics

    DOE PAGES

    Arce-Gamboa, Jose R.; Guzman-Verri, Gian G.

    2017-06-13

    Relaxor ferroelectrics are complex oxide materials which offer a rather unique setting in which to study the effects of compositional disorder on phase transitions. Here, we study the effects of quenched cubic random electric fields on the lattice instabilities that lead to a ferroelectric transition and show that, within a microscopic model and a statistical mechanical solution, even weak compositional disorder can prohibit the development of long-range order and that a random field state with anisotropic and power-law correlations of polarization emerges from the combined effect of their characteristic dipole forces and their inherent charge disorder. As a result, we compare and reproduce several key experimental observations in the well-studied relaxor PbMg1/3Nb2/3O3–PbTiO3.

  20. Statistics-related and reliability-physics-related failure processes in electronics devices and products

    NASA Astrophysics Data System (ADS)

    Suhir, E.

    2014-05-01

    The well known and widely used experimental reliability "passport" of a mass manufactured electronic or photonic product — the bathtub curve — reflects the combined contribution of the statistics-related and reliability-physics (physics-of-failure)-related processes. As time progresses, the first process results in a decreasing failure rate, while the second process, associated with material aging and degradation, leads to an increased failure rate. An attempt has been made in this analysis to assess the level of the reliability-physics-related aging process from the available bathtub curve (diagram). It is assumed that the products of interest underwent burn-in testing and therefore the obtained bathtub curve does not contain the infant mortality portion. It has also been assumed that the two random processes in question are statistically independent, and that the failure rate of the physical process can be obtained by deducting the theoretically assessed statistical failure rate from the bathtub curve ordinates. In the numerical example carried out, the Rayleigh distribution was used for the statistical failure rate, for the sake of a relatively simple illustration. The developed methodology can be used in reliability physics evaluations, when there is a need to better understand the roles of the statistics-related and reliability-physics-related irreversible random processes in reliability evaluations. Future work should include investigations of how powerful and flexible methods and approaches of statistical mechanics can be effectively employed, in addition to reliability physics techniques, to model the operational reliability of electronic and photonic products.
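
    The decomposition described above can be sketched directly: subtract a theoretically assumed statistics-related failure rate from the observed bathtub-curve ordinates to estimate the physics-of-failure contribution. The bathtub data below are synthetic placeholders, and the Rayleigh-law hazard t/σ² is just one convenient choice for the statistical part, echoing the abstract's numerical example.

      # Hedged sketch of the decomposition described in the abstract; the bathtub
      # data are synthetic and the Rayleigh-law hazard is an assumed form.
      import numpy as np

      t = np.linspace(0.1, 10.0, 50)                    # operating time, arbitrary units
      bathtub = 0.02 + 0.0015 * t**2                    # placeholder observed failure rate

      sigma = 25.0                                      # assumed Rayleigh scale parameter
      lambda_stat = t / sigma**2                        # Rayleigh-law (statistical) hazard rate
      lambda_physics = np.clip(bathtub - lambda_stat, 0.0, None)

      for i in range(0, len(t), 10):
          print(f"t={t[i]:5.2f}  total={bathtub[i]:.4f}  "
                f"statistical={lambda_stat[i]:.4f}  physics={lambda_physics[i]:.4f}")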

  1. Retention of Statistical Concepts in a Preliminary Randomization-Based Introductory Statistics Curriculum

    ERIC Educational Resources Information Center

    Tintle, Nathan; Topliff, Kylie; VanderStoep, Jill; Holmes, Vicki-Lynn; Swanson, Todd

    2012-01-01

    Previous research suggests that a randomization-based introductory statistics course may improve student learning compared to the consensus curriculum. However, it is unclear whether these gains are retained by students post-course. We compared the conceptual understanding of a cohort of students who took a randomization-based curriculum (n = 76)…

  2. Statistical Physics on the Eve of the 21st Century: in Honour of J B McGuire on the Occasion of His 65th Birthday

    NASA Astrophysics Data System (ADS)

    Batchelor, Murray T.; Wille, Luc T.

    The Table of Contents for the book is as follows: * Preface * Modelling the Immune System - An Example of the Simulation of Complex Biological Systems * Brief Overview of Quantum Computation * Quantal Information in Statistical Physics * Modeling Economic Randomness: Statistical Mechanics of Market Phenomena * Essentially Singular Solutions of Feigenbaum-Type Functional Equations * Spatiotemporal Chaotic Dynamics in Coupled Map Lattices * Approach to Equilibrium of Chaotic Systems * From Level to Level in Brain and Behavior * Linear and Entropic Transformations of the Hydrophobic Free Energy Sequence Help Characterize a Novel Brain Polyprotein: CART's Protein * Dynamical Systems Response to Pulsed High-Frequency Fields * Bose-Einstein Condensates in the Light of Nonlinear Physics * Markov Superposition Expansion for the Entropy and Correlation Functions in Two and Three Dimensions * Calculation of Wave Center Deflection and Multifractal Analysis of Directed Waves Through the Study of su(1,1) Ferromagnets * Spectral Properties and Phases in Hierarchical Master Equations * Universality of the Distribution Functions of Random Matrix Theory * The Universal Chiral Partition Function for Exclusion Statistics * Continuous Space-Time Symmetries in a Lattice Field Theory * Quelques Cas Limites du Problème à N Corps Unidimensionnel * Integrable Models of Correlated Electrons * On the Riemann Surface of the Three-State Chiral Potts Model * Two Exactly Soluble Lattice Models in Three Dimensions * Competition of Ferromagnetic and Antiferromagnetic Order in the Spin-1/2 XXZ Chain at Finite Temperature * Extended Vertex Operator Algebras and Monomial Bases * Parity and Charge Conjugation Symmetries and S Matrix of the XXZ Chain * An Exactly Solvable Constrained XXZ Chain * Integrable Mixed Vertex Models From the Braid-Monoid Algebra * From Yang-Baxter Equations to Dynamical Zeta Functions for Birational Transformations * Hexagonal Lattice Directed Site Animals * Direction in the Star-Triangle Relations * A Self-Avoiding Walk Through Exactly Solved Lattice Models in Statistical Mechanics

  3. Alcohol-assisted debridement in PRK with intraoperative mitomycin C.

    PubMed

    Nassiri, Nader; Sheibani, Kourosh; Safi, Sare; Haghnegahdar, Maryam; Nassiri, Saman; Panahi, Nekoo; Mehravaran, Shiva; Nassiri, Nariman

    2014-09-01

    To compare corneal stromal and endothelial cells after photorefractive keratectomy with intraoperative mitomycin C in alcohol-assisted versus mechanical epithelial debridement using confocal microscopy. This prospective randomized comparative study was performed on 88 eyes (44 patients) with myopia up to -6.00 diopters. The right eye of each patient was randomly assigned to either mechanical or alcohol-assisted groups, and the left eye was assigned to the alternate group. Confocal microscopy was performed preoperatively and at 3 months postoperatively. The main outcome measures were epithelial thickness; number of keratocytes in the anterior, mid-, and posterior stroma; and characteristics of the central corneal endothelial cells in terms of density, mean cell area, and polymegathism and hexagonality. Three months after surgery, no statistically significant difference was noted between the study groups in terms of epithelial thickness. We also found no statistically significant difference in central corneal endothelial cells regarding cell density, mean cell area, hexagonality, or polymegathism. Compared with baseline values, the density of mid- and posterior stromal keratocytes showed no significant change in either group, whereas it decreased significantly in the anterior stroma in both groups 3 months after surgery. We found that the adverse effects of photorefractive keratectomy with mitomycin C on central corneal endothelial cells were comparable between the mechanical and alcohol-assisted epithelial debridement groups and the significant decrease in postoperative keratocyte density in anterior stroma was comparable between the two groups. The choice of their application could be left to the discretion of the ophthalmologist.

  4. Adhesive loose packings of small dry particles.

    PubMed

    Liu, Wenwei; Li, Shuiqing; Baule, Adrian; Makse, Hernán A

    2015-08-28

    We explore adhesive loose packings of small dry spherical particles of micrometer size using 3D discrete-element simulations with adhesive contact mechanics and statistical ensemble theory. A dimensionless adhesion parameter (Ad) successfully combines the effects of particle velocities, sizes and the work of adhesion, identifying a universal regime of adhesive packings for Ad > 1. The structural properties of the packings in this regime are well described by an ensemble approach based on a coarse-grained volume function that includes the correlation between bulk and contact spheres. Our theoretical and numerical results predict: (i) an equation of state for adhesive loose packings that appears as a continuation from the frictionless random close packing (RCP) point in the jamming phase diagram and (ii) the existence of an asymptotic adhesive loose packing point at a coordination number Z = 2 and a packing fraction ϕ = 1/2³. Our results highlight that adhesion leads to a universal packing regime at packing fractions much smaller than the random loose packing (RLP), which can be described within a statistical mechanical framework. We present a general phase diagram of jammed matter comprising frictionless, frictional, adhesive as well as non-spherical particles, providing a classification of packings in terms of their continuation from the spherical frictionless RCP.

  5. Why Are People Bad at Detecting Randomness? A Statistical Argument

    ERIC Educational Resources Information Center

    Williams, Joseph J.; Griffiths, Thomas L.

    2013-01-01

    Errors in detecting randomness are often explained in terms of biases and misconceptions. We propose and provide evidence for an account that characterizes the contribution of the inherent statistical difficulty of the task. Our account is based on a Bayesian statistical analysis, focusing on the fact that a random process is a special case of…

  6. Statistical properties of solar flares and coronal mass ejections through the solar cycle

    NASA Astrophysics Data System (ADS)

    Telloni, Daniele; Carbone, Vincenzo; Lepreti, Fabio; Antonucci, Ester

    2016-03-01

    Waiting Time Distributions (WTDs) of solar flares are investigated throughout the solar cycle. The same approach applied to Coronal Mass Ejections (CMEs) in a previous work is considered here for flare occurrence. Our analysis reveals that flares and CMEs share some common statistical properties, which turn out to depend on the level of solar activity. Both flares and CMEs seem to occur independently during minimum solar activity phases, whilst their WTDs significantly deviate from a Poisson function at solar maximum, thus suggesting that these events are correlated. The characteristics of the WTDs are constrained by the physical processes generating the eruptions associated with flares and CMEs. A scenario may be drawn in which different mechanisms are at work during different phases of the solar cycle. Stochastic processes, most likely related to random magnetic reconnections of the field lines, seem to play a key role during solar minimum periods. On the other hand, persistent processes, like sympathetic eruptions associated with the variability of the photospheric magnetism, are suggested to dominate during periods of high solar activity. Moreover, despite the similar statistical properties shown by flares and CMEs, as mentioned above, their WTDs appear different in some aspects. During solar minimum periods, the randomness of flare occurrence seems to be more evident than for CMEs. The persistent mechanisms generating interdependent events during periods of maximum solar activity appear to play a more important role for CMEs than for flares, mitigating the competing random processes, which instead seem strong enough to weaken the correlations among flare occurrences during solar minimum periods. However, it cannot be excluded that the physical processes underlying the temporal correlation between solar events are different for flares and CMEs, or that, more likely, more sophisticated effects are at work at the same time, leading to an even more complex picture. This work represents a first step for further investigations.

  7. Complex patterns of abnormal heartbeats

    NASA Technical Reports Server (NTRS)

    Schulte-Frohlinde, Verena; Ashkenazy, Yosef; Goldberger, Ary L.; Ivanov, Plamen Ch; Costa, Madalena; Morley-Davies, Adrian; Stanley, H. Eugene; Glass, Leon

    2002-01-01

    Individuals having frequent abnormal heartbeats interspersed with normal heartbeats may be at an increased risk of sudden cardiac death. However, mechanistic understanding of such cardiac arrhythmias is limited. We present a visual and qualitative method to display statistical properties of abnormal heartbeats. We introduce dynamical "heartprints" which reveal characteristic patterns in long clinical records encompassing approximately 10^5 heartbeats and may provide information about underlying mechanisms. We test if these dynamics can be reproduced by model simulations in which abnormal heartbeats are generated (i) randomly, (ii) at a fixed time interval following a preceding normal heartbeat, or (iii) by an independent oscillator that may or may not interact with the normal heartbeat. We compare the results of these three models and test their limitations to comprehensively simulate the statistical features of selected clinical records. This work introduces methods that can be used to test mathematical models of arrhythmogenesis and to develop a new understanding of underlying electrophysiologic mechanisms of cardiac arrhythmia.

  8. Permissive or Trophic Enteral Nutrition and Full Enteral Nutrition Had Similar Effects on Clinical Outcomes in Intensive Care: A Systematic Review of Randomized Clinical Trials.

    PubMed

    Silva, Camila F A; de Vasconcelos, Simone G; da Silva, Thales A; Silva, Flávia M

    2018-01-26

    The aim of this study was to systematically review the effect of permissive underfeeding/trophic feeding on the clinical outcomes of critically ill patients. A systematic review of randomized clinical trials to evaluate the mortality, length of stay, and mechanical ventilation duration in patients randomized to either hypocaloric or full-energy enteral nutrition was performed. Data sources included PubMed and Scopus and the reference lists of the articles retrieved. Two independent reviewers participated in all phases of this systematic review as proposed by the Cochrane Handbook, and the review was reported according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. A total of 7 randomized clinical trials that included a total of 1,717 patients were reviewed. Intensive care unit length of stay and mechanical ventilation duration were not statistically different between the intervention and control groups in all randomized clinical trials, and mortality rate was also not different between the groups. In conclusion, hypocaloric enteral nutrition had no significantly different effects on morbidity and mortality in critically ill patients when compared with full-energy nutrition. It is still necessary to determine the safety of this intervention in this group of patients, the optimal amount of energy provided, and the duration of this therapy. © 2018 American Society for Parenteral and Enteral Nutrition.

  9. Random matrices and the New York City subway system

    NASA Astrophysics Data System (ADS)

    Jagannath, Aukosh; Trogdon, Thomas

    2017-09-01

    We analyze subway arrival times in the New York City subway system. We find regimes where the gaps between trains are well modeled by (unitarily invariant) random matrix statistics and Poisson statistics. The departure from random matrix statistics is captured by the value of the Coulomb potential along the subway route. This departure becomes more pronounced as trains make more stops.
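
    A minimal illustration of the kind of spacing-statistics comparison described above (not the authors' code): normalized gaps between arrivals are compared against the Poisson (exponential) law and the GUE Wigner surmise. The arrival times are synthetic placeholders; real data would come from a transit feed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Placeholder arrival times (seconds) at one station; pure Poisson arrivals.
arrivals = np.cumsum(rng.exponential(scale=300.0, size=2000))

gaps = np.diff(arrivals)
s = gaps / gaps.mean()                      # normalize so the mean spacing is 1

# Reference spacing densities for normalized spacings.
poisson_pdf = lambda x: np.exp(-x)                                          # uncorrelated arrivals
wigner_pdf = lambda x: (32 / np.pi**2) * x**2 * np.exp(-4 * x**2 / np.pi)   # GUE Wigner surmise

hist, edges = np.histogram(s, bins=30, range=(0, 4), density=True)
mid = 0.5 * (edges[:-1] + edges[1:])
dist_poisson = np.trapz(np.abs(hist - poisson_pdf(mid)), mid)
dist_wigner = np.trapz(np.abs(hist - wigner_pdf(mid)), mid)
print(f"L1 distance to Poisson law: {dist_poisson:.3f}, to GUE surmise: {dist_wigner:.3f}")
```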

  10. Clustering, randomness, and regularity in cloud fields. 4. Stratocumulus cloud fields

    NASA Astrophysics Data System (ADS)

    Lee, J.; Chou, J.; Weger, R. C.; Welch, R. M.

    1994-07-01

    To complete the analysis of the spatial distribution of boundary layer cloudiness, the present study focuses on nine stratocumulus Landsat scenes. The results indicate many similarities between stratocumulus and cumulus spatial distributions. Most notably, at full spatial resolution all scenes exhibit a decidedly clustered distribution. The strength of the clustering signal decreases with increasing cloud size; the clusters themselves consist of a few clouds (less than 10), occupy a small percentage of the cloud field area (less than 5%), contain between 20% and 60% of the cloud field population, and are randomly located within the scene. In contrast, stratocumulus in almost every respect are more strongly clustered than are cumulus cloud fields. For instance, stratocumulus clusters contain more clouds per cluster, occupy a larger percentage of the total area, and have a larger percentage of clouds participating in clusters than the corresponding cumulus examples. To investigate clustering at intermediate spatial scales, the local dimensionality statistic is introduced. Results obtained from this statistic provide the first direct evidence for regularity among large (>900 m in diameter) clouds in stratocumulus and cumulus cloud fields, in support of the inhibition hypothesis of Ramirez and Bras (1990). Also, the size compensated point-to-cloud cumulative distribution function statistic is found to be necessary to obtain a consistent description of stratocumulus cloud distributions. A hypothesis regarding the underlying physical mechanisms responsible for cloud clustering is presented. It is suggested that cloud clusters often arise from 4 to 10 triggering events localized within regions less than 2 km in diameter and randomly distributed within the cloud field. As the size of the cloud surpasses the scale of the triggering region, the clustering signal weakens and the larger cloud locations become more random.

  11. Clustering, randomness, and regularity in cloud fields. 4: Stratocumulus cloud fields

    NASA Technical Reports Server (NTRS)

    Lee, J.; Chou, J.; Weger, R. C.; Welch, R. M.

    1994-01-01

    To complete the analysis of the spatial distribution of boundary layer cloudiness, the present study focuses on nine stratocumulus Landsat scenes. The results indicate many similarities between stratocumulus and cumulus spatial distributions. Most notably, at full spatial resolution all scenes exhibit a decidedly clustered distribution. The strength of the clustering signal decreases with increasing cloud size; the clusters themselves consist of a few clouds (less than 10), occupy a small percentage of the cloud field area (less than 5%), contain between 20% and 60% of the cloud field population, and are randomly located within the scene. In contrast, stratocumulus in almost every respect are more strongly clustered than are cumulus cloud fields. For instance, stratocumulus clusters contain more clouds per cluster, occupy a larger percentage of the total area, and have a larger percentage of clouds participating in clusters than the corresponding cumulus examples. To investigate clustering at intermediate spatial scales, the local dimensionality statistic is introduced. Results obtained from this statistic provide the first direct evidence for regularity among large (more than 900 m in diameter) clouds in stratocumulus and cumulus cloud fields, in support of the inhibition hypothesis of Ramirez and Bras (1990). Also, the size compensated point-to-cloud cumulative distribution function statistic is found to be necessary to obtain a consistent description of stratocumulus cloud distributions. A hypothesis regarding the underlying physical mechanisms responsible for cloud clustering is presented. It is suggested that cloud clusters often arise from 4 to 10 triggering events localized within regions less than 2 km in diameter and randomly distributed within the cloud field. As the size of the cloud surpasses the scale of the triggering region, the clustering signal weakens and the larger cloud locations become more random.

  12. Quantum Glass of Interacting Bosons with Off-Diagonal Disorder

    NASA Astrophysics Data System (ADS)

    Piekarska, A. M.; Kopeć, T. K.

    2018-04-01

    We study disordered interacting bosons described by the Bose-Hubbard model with Gaussian-distributed random tunneling amplitudes. It is shown that the off-diagonal disorder induces a spin-glass-like ground state, characterized by randomly frozen quantum-mechanical U(1) phases of bosons. To access criticality, we employ the "n -replica trick," as in the spin-glass theory, and the Trotter-Suzuki method for decomposition of the statistical density operator, along with numerical calculations. The interplay between disorder, quantum, and thermal fluctuations leads to phase diagrams exhibiting a glassy state of bosons, which are studied as a function of model parameters. The considered system may be relevant for quantum simulators of optical-lattice bosons, where the randomness can be introduced in a controlled way. The latter is supported by a proposition of experimental realization of the system in question.

  13. Random Walk Analysis of the Effect of Mechanical Degradation on All-Solid-State Battery Power

    DOE PAGES

    Bucci, Giovanna; Swamy, Tushar; Chiang, Yet-Ming; ...

    2017-09-06

    Mechanical and electrochemical phenomena are coupled in defining the battery reliability, particularly for solid-state batteries. Micro-cracks act as barriers to Li-ion diffusion in the electrolyte, increasing the average electrode’s tortuosity. In our previous work, we showed that solid electrolytes are likely to suffer from mechanical degradation if their fracture energy is lower than 4 J m^-2 [G. Bucci, T. Swamy, Y.-M. Chiang, and W. C. Carter, J. Mater. Chem. A (2017)]. Here we study the effect of electrolyte micro-cracking on the effective conductivity of composite electrodes. Via random walk analyses, we predict the average diffusivity of lithium in a solid-state electrode to decrease linearly with the extent of mechanical degradation. Furthermore, the statistical distribution of first passage times indicates that the microstructure becomes more and more heterogeneous as damage progresses. In addition to power and capacity loss, a non-uniform increase of the electrode tortuosity can lead to heterogeneous lithiation and further stress localization. Finally, the understanding of these phenomena at the mesoscale is essential to the implementation of safe high-energy solid-state batteries.
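
    The random-walk picture can be sketched as follows: walkers on a lattice in which a random fraction of sites is blocked (a crude stand-in for micro-cracked electrolyte) yield an effective diffusivity that falls as the blocked fraction grows. This is an illustrative toy, not the authors' model; the lattice size, step count and blocking rule are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def effective_diffusivity(blocked_fraction, L=50, walkers=200, steps=2000):
    """Mean-square-displacement estimate of D for random walkers on a 2-D lattice
    in which a random fraction of sites is blocked (toy micro-crack model)."""
    blocked = rng.random((L, L)) < blocked_fraction
    pos = np.full((walkers, 2), L // 2)          # unwrapped positions
    start = pos.copy()
    moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
    for _ in range(steps):
        trial = pos + moves[rng.integers(0, 4, size=walkers)]
        ok = ~blocked[trial[:, 0] % L, trial[:, 1] % L]   # reject moves onto blocked sites
        pos[ok] = trial[ok]
    msd = np.mean(np.sum((pos - start) ** 2, axis=1))
    return msd / (4 * steps)                      # D ~ MSD / (4 t) in two dimensions

for f in (0.0, 0.1, 0.2, 0.3):
    print(f"blocked fraction {f:.1f}: D ~ {effective_diffusivity(f):.3f}")
```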

  14. Random Walk Analysis of the Effect of Mechanical Degradation on All-Solid-State Battery Power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bucci, Giovanna; Swamy, Tushar; Chiang, Yet-Ming

    Mechanical and electrochemical phenomena are coupled in defining the battery reliability, particularly for solid-state batteries. Micro-cracks act as barriers to Li-ion diffusion in the electrolyte, increasing the average electrode’s tortuosity. In our previous work, we showed that solid electrolytes are likely to suffer from mechanical degradation if their fracture energy is lower than 4 J m^-2 [G. Bucci, T. Swamy, Y.-M. Chiang, and W. C. Carter, J. Mater. Chem. A (2017)]. Here we study the effect of electrolyte micro-cracking on the effective conductivity of composite electrodes. Via random walk analyses, we predict the average diffusivity of lithium in a solid-state electrode to decrease linearly with the extent of mechanical degradation. Furthermore, the statistical distribution of first passage times indicates that the microstructure becomes more and more heterogeneous as damage progresses. In addition to power and capacity loss, a non-uniform increase of the electrode tortuosity can lead to heterogeneous lithiation and further stress localization. Finally, the understanding of these phenomena at the mesoscale is essential to the implementation of safe high-energy solid-state batteries.

  15. Wave turbulence

    NASA Astrophysics Data System (ADS)

    Nazarenko, Sergey

    2015-07-01

    Wave turbulence is the statistical mechanics of random waves with a broadband spectrum interacting via non-linearity. To understand its difference from non-random well-tuned coherent waves, one could compare the sound of thunder to a piece of classical music. Wave turbulence is surprisingly common and important in a great variety of physical settings, starting with the most familiar ocean waves to waves at quantum scales or to much longer waves in astrophysics. We will provide a basic overview of the wave turbulence ideas, approaches and main results emphasising the physics of the phenomena and using qualitative descriptions avoiding, whenever possible, involved mathematical derivations. In particular, dimensional analysis will be used for obtaining the key scaling solutions in wave turbulence - Kolmogorov-Zakharov (KZ) spectra.

  16. Statistical characteristics of trajectories of diamagnetic unicellular organisms in a magnetic field.

    PubMed

    Gorobets, Yu I; Gorobets, O Yu

    2015-01-01

    A statistical model is proposed in this paper for describing the orientation of trajectories of unicellular diamagnetic organisms in a magnetic field. A statistical parameter, the effective energy, is calculated on the basis of this model. The resulting effective energy is a statistical characteristic of the trajectories of diamagnetic microorganisms in a magnetic field connected with their metabolism. The model is applicable when the energy of the thermal motion of the bacteria is negligible in comparison with their energy in the magnetic field and the bacteria exhibit significant "active random movement", i.e. randomizing motion of a non-thermal nature, for example, self-propulsion by means of flagella. The energy of this randomizing active self-motion is characterized by a new statistical parameter for biological objects, which replaces the energy of randomizing thermal motion in the calculation of the statistical distribution. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Treatment of peri-implantitis: clinical outcome of chloramine as an adjunctive to non-surgical therapy, a randomized clinical trial.

    PubMed

    Roos-Jansåker, Ann-Marie; Almhöjd, Ulrica S; Jansson, Henrik

    2017-01-01

    To evaluate the clinical effects of a chloramine as an adjunct to non-surgical treatment of peri-implantitis. Eighteen individuals diagnosed with peri-implantitis (clinical signs of inflammation and progressive bone loss) on at least two implants were included. The clinical variables plaque accumulation (Pl), probing depth (PD), clinical attachment level (CAL) and bleeding on probing (BoP) were recorded at baseline and at the 3-month follow-up. The primary clinical efficacy variable was the change in the number of sites with BoP. The implants were randomized into two different treatment groups: test and control. Both implants received supra- and submucosal debridement by ultrasonic instrumentation supplemented with hand instruments. The implants assigned to the test group first received local applications of a chloramine gel (Perisolv™; RLS Global AB, Gothenburg, Sweden) followed by mechanical instrumentation. Oral hygiene was checked at 6 weeks. After 3 months, implants in both groups showed a statistically significant reduction (P < 0.001) in the number of BoP-positive sites compared with baseline. The proportion of BoP-positive sites in the test group changed from 0.97 (SD ± 0.12) to 0.38 (SD ± 0.46), and in the control group from 0.97 (SD ± 0.12) to 0.31 (SD ± 0.42). Between-group comparisons revealed no statistically significant differences at baseline and after 3 months, for BoP or any of the other variables. In the present randomized clinical trial of peri-implantitis therapy, non-surgical mechanical debridement with adjunctive use of a chloramine is as effective in reducing mucosal inflammation as conventional non-surgical mechanical debridement up to 3 months. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  18. Do the methods used to analyse missing data really matter? An examination of data from an observational study of Intermediate Care patients.

    PubMed

    Kaambwa, Billingsley; Bryan, Stirling; Billingham, Lucinda

    2012-06-27

    Missing data is a common statistical problem in healthcare datasets from populations of older people. Some argue that arbitrarily assuming the mechanism responsible for the missingness and therefore the method for dealing with this missingness is not the best option-but is this always true? This paper explores what happens when extra information that suggests that a particular mechanism is responsible for missing data is disregarded and methods for dealing with the missing data are chosen arbitrarily. Regression models based on 2,533 intermediate care (IC) patients from the largest evaluation of IC done and published in the UK to date were used to explain variation in costs, EQ-5D and Barthel index. Three methods for dealing with missingness were utilised, each assuming a different mechanism as being responsible for the missing data: complete case analysis (assuming missing completely at random-MCAR), multiple imputation (assuming missing at random-MAR) and Heckman selection model (assuming missing not at random-MNAR). Differences in results were gauged by examining the signs of coefficients as well as the sizes of both coefficients and associated standard errors. Extra information strongly suggested that missing cost data were MCAR. The results show that MCAR and MAR-based methods yielded similar results with sizes of most coefficients and standard errors differing by less than 3.4% while those based on MNAR-methods were statistically different (up to 730% bigger). Significant variables in all regression models also had the same direction of influence on costs. All three mechanisms of missingness were shown to be potential causes of the missing EQ-5D and Barthel data. The method chosen to deal with missing data did not seem to have any significant effect on the results for these data as they led to broadly similar conclusions with sizes of coefficients and standard errors differing by less than 54% and 322%, respectively. Arbitrary selection of methods to deal with missing data should be avoided. Using extra information gathered during the data collection exercise about the cause of missingness to guide this selection would be more appropriate.

  19. Single-electron thermal noise

    NASA Astrophysics Data System (ADS)

    Nishiguchi, Katsuhiko; Ono, Yukinori; Fujiwara, Akira

    2014-07-01

    We report the observation of thermal noise in the motion of single electrons in an ultimately small dynamic random access memory (DRAM). The nanometer-scale transistors that compose the DRAM resolve the thermal noise in single-electron motion. A complete set of fundamental tests conducted on this single-electron thermal noise shows that the noise perfectly follows all the aspects predicted by statistical mechanics, which include the occupation probability, the law of equipartition, a detailed balance, and the law of kT/C. In addition, the counting statistics on the directional motion (i.e., the current) of the single-electron thermal noise indicate that the individual electron motion follows the Poisson process, as it does in shot noise.

  20. Single-electron thermal noise.

    PubMed

    Nishiguchi, Katsuhiko; Ono, Yukinori; Fujiwara, Akira

    2014-07-11

    We report the observation of thermal noise in the motion of single electrons in an ultimately small dynamic random access memory (DRAM). The nanometer-scale transistors that compose the DRAM resolve the thermal noise in single-electron motion. A complete set of fundamental tests conducted on this single-electron thermal noise shows that the noise perfectly follows all the aspects predicted by statistical mechanics, which include the occupation probability, the law of equipartition, a detailed balance, and the law of kT/C. In addition, the counting statistics on the directional motion (i.e., the current) of the single-electron thermal noise indicate that the individual electron motion follows the Poisson process, as it does in shot noise.

  1. The Statistical Power of the Cluster Randomized Block Design with Matched Pairs--A Simulation Study

    ERIC Educational Resources Information Center

    Dong, Nianbo; Lipsey, Mark

    2010-01-01

    This study uses simulation techniques to examine the statistical power of the group-randomized design and the matched-pair (MP) randomized block design under various parameter combinations. Both nearest neighbor matching and random matching are used for the MP design. The power of each design for any parameter combination was calculated from…

  2. Randomization Procedures Applied to Analysis of Ballistic Data

    DTIC Science & Technology

    1991-06-01

    Technical Report BRL-TR-3245 (AD-A238 389), Randomization Procedures Applied to Analysis of Ballistic Data, by Malcolm S. Taylor and Barry A. Bodt, June 1991. Subject terms: data analysis; computationally intensive statistics; randomization tests; permutation tests; nonparametric statistics. Excerpt: "Any reasonable statistical procedure would fail to support the notion of improvement of dynamic over standard indexing based on this data."

  3. A method for determining the weak statistical stationarity of a random process

    NASA Technical Reports Server (NTRS)

    Sadeh, W. Z.; Koper, C. A., Jr.

    1978-01-01

    A method for determining the weak statistical stationarity of a random process is presented. The core of this testing procedure consists of generating an equivalent ensemble which approximates a true ensemble. Formation of an equivalent ensemble is accomplished through segmenting a sufficiently long time history of a random process into equal, finite, and statistically independent sample records. The weak statistical stationarity is ascertained based on the time invariance of the equivalent-ensemble averages. Comparison of these averages with their corresponding time averages over a single sample record leads to a heuristic estimate of the ergodicity of a random process. Specific variance tests are introduced for evaluating the statistical independence of the sample records, the time invariance of the equivalent-ensemble autocorrelations, and the ergodicity. Examination and substantiation of these procedures were conducted utilizing turbulent velocity signals.
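
    A rough sketch of the equivalent-ensemble idea described above, under the assumption that a single long record can be split into statistically independent segments; the surrogate signal and segment count below are arbitrary choices, not those of the original report.

```python
import numpy as np

rng = np.random.default_rng(2)

# Surrogate "measured" signal: band-limited noise (stationary by construction).
x = rng.standard_normal(200_000)
x = np.convolve(x, np.ones(20) / 20, mode="same")     # crude low-pass filter

def equivalent_ensemble(signal, n_records):
    """Split one long record into equal, non-overlapping segments treated as
    independent realizations (the 'equivalent ensemble')."""
    m = len(signal) // n_records
    return signal[: n_records * m].reshape(n_records, m)

ens = equivalent_ensemble(x, n_records=100)

# Ensemble mean and mean square at each instant within a record; for a weakly
# stationary process these should not drift with time inside the record.
ens_mean = ens.mean(axis=0)
ens_msq = (ens ** 2).mean(axis=0)

print("spread of ensemble means      :", ens_mean.std())
print("spread of ensemble mean squares:", ens_msq.std())
```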

  4. Solution-Processed Carbon Nanotube True Random Number Generator.

    PubMed

    Gaviria Rojas, William A; McMorrow, Julian J; Geier, Michael L; Tang, Qianying; Kim, Chris H; Marks, Tobin J; Hersam, Mark C

    2017-08-09

    With the growing adoption of interconnected electronic devices in consumer and industrial applications, there is an increasing demand for robust security protocols when transmitting and receiving sensitive data. Toward this end, hardware true random number generators (TRNGs), commonly used to create encryption keys, offer significant advantages over software pseudorandom number generators. However, the vast network of devices and sensors envisioned for the "Internet of Things" will require small, low-cost, and mechanically flexible TRNGs with low computational complexity. These rigorous constraints position solution-processed semiconducting single-walled carbon nanotubes (SWCNTs) as leading candidates for next-generation security devices. Here, we demonstrate the first TRNG using static random access memory (SRAM) cells based on solution-processed SWCNTs that digitize thermal noise to generate random bits. This bit generation strategy can be readily implemented in hardware with minimal transistor and computational overhead, resulting in an output stream that passes standardized statistical tests for randomness. By using solution-processed semiconducting SWCNTs in a low-power, complementary architecture to achieve TRNG, we demonstrate a promising approach for improving the security of printable and flexible electronics.

  5. Probabilistic Finite Elements (PFEM) structural dynamics and fracture mechanics

    NASA Technical Reports Server (NTRS)

    Liu, Wing-Kam; Belytschko, Ted; Mani, A.; Besterfield, G.

    1989-01-01

    The purpose of this work is to develop computationally efficient methodologies for assessing the effects of randomness in loads, material properties, and other aspects of a problem by a finite element analysis. The resulting group of methods is called probabilistic finite elements (PFEM). The overall objective of this work is to develop methodologies whereby the lifetime of a component can be predicted, accounting for the variability in the material and geometry of the component, the loads, and other aspects of the environment; and the range of response expected in a particular scenario can be presented to the analyst in addition to the response itself. Emphasis has been placed on methods which are not statistical in character; that is, they do not involve Monte Carlo simulations. The reason for this choice of direction is that Monte Carlo simulations of complex nonlinear response require a tremendous amount of computation. The focus of efforts so far has been on nonlinear structural dynamics. However, in the continuation of this project, emphasis will be shifted to probabilistic fracture mechanics so that the effect of randomness in crack geometry and material properties can be studied interactively with the effect of random load and environment.

  6. Application of the quantum spin glass theory to image restoration.

    PubMed

    Inoue, J I

    2001-04-01

    Quantum fluctuation is introduced into the Markov random-field model for image restoration in the context of a Bayesian approach. We investigate the dependence of the quantum fluctuation on the quality of a black and white image restoration by making use of statistical mechanics. We find that the maximum posterior marginal (MPM) estimate based on the quantum fluctuation gives a fine restoration in comparison with the maximum a posteriori estimate or the thermal fluctuation based MPM estimate.

  7. Sequential Monte Carlo tracking of the marginal artery by multiple cue fusion and random forest regression.

    PubMed

    Cherry, Kevin M; Peplinski, Brandon; Kim, Lauren; Wang, Shijun; Lu, Le; Zhang, Weidong; Liu, Jianfei; Wei, Zhuoshi; Summers, Ronald M

    2015-01-01

    Given the potential importance of marginal artery localization in automated registration in computed tomography colonography (CTC), we have devised a semi-automated method of marginal vessel detection employing sequential Monte Carlo tracking (also known as particle filtering tracking) by multiple cue fusion based on intensity, vesselness, organ detection, and minimum spanning tree information for poorly enhanced vessel segments. We then employed a random forest algorithm for intelligent cue fusion and decision making which achieved high sensitivity and robustness. After applying a vessel pruning procedure to the tracking results, we achieved statistically significantly improved precision compared to a baseline Hessian detection method (2.7% versus 75.2%, p<0.001). This method also showed statistically significantly improved recall rate compared to a 2-cue baseline method using fewer vessel cues (30.7% versus 67.7%, p<0.001). These results demonstrate that marginal artery localization on CTC is feasible by combining a discriminative classifier (i.e., random forest) with a sequential Monte Carlo tracking mechanism. In so doing, we present the effective application of an anatomical probability map to vessel pruning as well as a supplementary spatial coordinate system for colonic segmentation and registration when this task has been confounded by colon lumen collapse. Published by Elsevier B.V.
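
    A generic bootstrap particle filter with a toy two-cue likelihood, to illustrate the sequential Monte Carlo tracking idea in the abstract; the motion model, the Gaussian "cues" and all parameters are invented for illustration and bear no relation to the authors' CT-based implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

def fused_likelihood(particles, intensity_cue, vesselness_cue):
    """Toy stand-in for multi-cue fusion: each cue scores a particle position
    and the scores are multiplied (independence assumption)."""
    return intensity_cue(particles) * vesselness_cue(particles)

def smc_track(n_steps=50, n_particles=500, step_sigma=1.0):
    # Hypothetical 2-D centerline to follow; real inputs would be CT intensities.
    truth = np.cumsum(rng.normal(0, 1, size=(n_steps, 2)), axis=0)
    particles = np.zeros((n_particles, 2))
    track = []
    for t in range(n_steps):
        # Propagate: diffuse particles along the vessel (motion model).
        particles += rng.normal(0, step_sigma, size=particles.shape)
        # Weight: Gaussian "cues" centred on a noisy observation of the centerline.
        obs = truth[t] + rng.normal(0, 0.5, size=2)
        intensity = lambda p: np.exp(-np.sum((p - obs) ** 2, axis=1) / 2.0)
        vesselness = lambda p: np.exp(-np.sum((p - obs) ** 2, axis=1) / 8.0)
        w = fused_likelihood(particles, intensity, vesselness)
        w /= w.sum()
        # Resample (multinomial) and record the posterior mean.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
        track.append(particles.mean(axis=0))
    return np.array(track), truth

track, truth = smc_track()
print("mean tracking error:", np.mean(np.linalg.norm(track - truth, axis=1)))
```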

  8. An analytic technique for statistically modeling random atomic clock errors in estimation

    NASA Technical Reports Server (NTRS)

    Fell, P. J.

    1981-01-01

    Minimum variance estimation requires that the statistics of random observation errors be modeled properly. If measurements are derived through the use of atomic frequency standards, then one source of error affecting the observable is random fluctuation in frequency. This is the case, for example, with range and integrated Doppler measurements from satellites of the Global Positioning System and baseline determination for geodynamic applications. An analytic method is presented which approximates the statistics of this random process. The procedure starts with a model of the Allan variance for a particular oscillator and develops the statistics of range and integrated Doppler measurements. A series of five first-order Markov processes is used to approximate the power spectral density obtained from the Allan variance.
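
    A small sketch of the building block mentioned above: a sum of first-order (exponentially correlated) Gauss-Markov processes as a surrogate clock-error time series. The correlation times and amplitudes below are hypothetical; in the cited approach they would be derived from the oscillator's Allan variance.

```python
import numpy as np

def first_order_markov(n, dt, tau, sigma, rng):
    """Discrete first-order Gauss-Markov (exponentially correlated) process,
    a common building block for approximating oscillator noise spectra."""
    phi = np.exp(-dt / tau)                    # autocorrelation over one step
    q = sigma * np.sqrt(1.0 - phi ** 2)        # driving-noise amplitude
    x = np.zeros(n)
    for k in range(1, n):
        x[k] = phi * x[k - 1] + q * rng.standard_normal()
    return x

rng = np.random.default_rng(4)
dt = 1.0
# Hypothetical correlation times and amplitudes for five Markov components.
taus = [1e1, 1e2, 1e3, 1e4, 1e5]
sigmas = [1.0, 0.7, 0.5, 0.3, 0.2]
clock_error = sum(first_order_markov(100_000, dt, t, s, rng) for t, s in zip(taus, sigmas))
print("simulated clock-error std:", clock_error.std())
```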

  9. Random close packing of polydisperse jammed emulsions

    NASA Astrophysics Data System (ADS)

    Brujic, Jasna

    2010-03-01

    Packing problems are everywhere, ranging from oil extraction through porous rocks to grain storage in silos and the compaction of pharmaceutical powders into tablets. At a given density, particulate systems pack into a mechanically stable and amorphous jammed state. Theoretical frameworks have proposed a connection between this jammed state and the glass transition, a thermodynamics of jamming, as well as geometric modeling of random packings. Nevertheless, a simple underlying mechanism for the random assembly of athermal particles, analogous to crystalline ordering, remains unknown. Here we use 3D measurements of polydisperse packings of emulsion droplets to build a simple statistical model in which the complexity of the global packing is distilled into a local stochastic process. From the perspective of a single particle the packing problem is reduced to the random formation of nearest neighbors, followed by a choice of contacts among them. The two key parameters in the model, the available space around a particle and the ratio of contacts to neighbors, are directly obtained from experiments. Remarkably, we demonstrate that this ``granocentric'' view captures the properties of the polydisperse emulsion packing, ranging from the microscopic distributions of nearest neighbors and contacts to local density fluctuations and all the way to the global packing density. Further applications to monodisperse and bidisperse systems quantitatively agree with previously measured trends in global density. This model therefore reveals a general principle of organization for random packing and lays the foundations for a theory of jammed matter.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forrester, Peter J., E-mail: p.forrester@ms.unimelb.edu.au; Thompson, Colin J.

    The Golden-Thompson inequality, Tr(e^(A+B)) ⩽ Tr(e^A e^B) for A, B Hermitian matrices, appeared in independent works by Golden and Thompson published in 1965. Both of these were motivated by considerations in statistical mechanics. In recent years the Golden-Thompson inequality has found applications to random matrix theory. In this article, we detail some historical aspects relating to Thompson's work, giving in particular a hitherto unpublished proof due to Dyson, and correspondence with Pólya. We show too how the 2 × 2 case relates to hyperbolic geometry, and how the original inequality holds true with the trace operation replaced by any unitarily invariant norm. In relation to the random matrix applications, we review its use in the derivation of concentration-type lemmas for sums of random matrices due to Ahlswede-Winter, and Oliveira, generalizing various classical results.
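
    The inequality is easy to check numerically; the following sketch draws random Hermitian matrices and verifies Tr(e^(A+B)) ⩽ Tr(e^A e^B) case by case (a spot check, of course, not a proof).

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(5)

def random_hermitian(n):
    m = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (m + m.conj().T) / 2

# Numerically check Tr(e^(A+B)) <= Tr(e^A e^B) on random Hermitian pairs.
for _ in range(5):
    A, B = random_hermitian(4), random_hermitian(4)
    lhs = np.trace(expm(A + B)).real
    rhs = np.trace(expm(A) @ expm(B)).real
    print(f"Tr e^(A+B) = {lhs:10.4f}  <=  Tr(e^A e^B) = {rhs:10.4f}  : {lhs <= rhs + 1e-9}")
```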

  11. Using Temporal Correlations and Full Distributions to Separate Intrinsic and Extrinsic Fluctuations in Biological Systems

    NASA Astrophysics Data System (ADS)

    Hilfinger, Andreas; Chen, Mark; Paulsson, Johan

    2012-12-01

    Studies of stochastic biological dynamics typically compare observed fluctuations to theoretically predicted variances, sometimes after separating the intrinsic randomness of the system from the enslaving influence of changing environments. But variances have been shown to discriminate surprisingly poorly between alternative mechanisms, while for other system properties no approaches exist that rigorously disentangle environmental influences from intrinsic effects. Here, we apply the theory of generalized random walks in random environments to derive exact rules for decomposing time series and higher statistics, rather than just variances. We show for which properties and for which classes of systems intrinsic fluctuations can be analyzed without accounting for extrinsic stochasticity and vice versa. We derive two independent experimental methods to measure the separate noise contributions and show how to use the additional information in temporal correlations to detect multiplicative effects in dynamical systems.

  12. Typical performance of approximation algorithms for NP-hard problems

    NASA Astrophysics Data System (ADS)

    Takabe, Satoshi; Hukushima, Koji

    2016-11-01

    Typical performance of approximation algorithms is studied for randomized minimum vertex cover problems. A wide class of random graph ensembles characterized by an arbitrary degree distribution is discussed with the presentation of a theoretical framework. Herein, three approximation algorithms are examined: linear-programming relaxation, loopy-belief propagation, and the leaf-removal algorithm. The former two algorithms are analyzed using a statistical-mechanical technique, whereas the average-case analysis of the last one is conducted using the generating function method. These algorithms have a threshold in the typical performance with increasing average degree of the random graph, below which they find true optimal solutions with high probability. Our study reveals that there exist only three cases, determined by the order of the typical performance thresholds. In addition, we provide some conditions for classification of the graph ensembles and demonstrate explicitly some examples for the difference in thresholds.
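
    Of the three algorithms, leaf removal is the simplest to sketch: repeatedly take a degree-1 vertex, put its unique neighbour into the cover, and delete both. The sketch below is a straightforward illustration on a random sparse graph, not the analysis framework of the paper; graph size and edge count are arbitrary.

```python
import random
from collections import defaultdict

def leaf_removal_cover(edges):
    """Leaf-removal heuristic for minimum vertex cover: while a degree-1 vertex
    exists, put its unique neighbour into the cover and delete both vertices.
    Vertices left with degree >= 2 form an unresolved 'core'."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    cover = set()
    leaves = [v for v in adj if len(adj[v]) == 1]
    while leaves:
        leaf = leaves.pop()
        if len(adj[leaf]) != 1:            # stale entry: degree changed meanwhile
            continue
        (nbr,) = adj[leaf]
        cover.add(nbr)
        for w in list(adj[nbr]):           # delete nbr together with all its edges
            adj[w].discard(nbr)
            if len(adj[w]) == 1:
                leaves.append(w)
        adj.pop(nbr)
        adj.pop(leaf, None)
    return cover, {v for v in adj if adj[v]}   # cover found so far, leftover core

# Sparse random graph as a test instance.
random.seed(6)
n, m = 200, 220
edges = {(random.randrange(n), random.randrange(n)) for _ in range(m)}
edges = [(u, v) for u, v in edges if u != v]
cover, core = leaf_removal_cover(edges)
print(f"cover size from leaf removal: {len(cover)}, unresolved core vertices: {len(core)}")
```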

  13. Statistics of Delta v magnitude for a trajectory correction maneuver containing deterministic and random components

    NASA Technical Reports Server (NTRS)

    Bollman, W. E.; Chadwick, C.

    1982-01-01

    A number of interplanetary missions now being planned involve placing deterministic maneuvers along the flight path to alter the trajectory. Lee and Boain (1973) examined the statistics of trajectory correction maneuver (TCM) magnitude with no deterministic ('bias') component. The Delta v vector magnitude statistics were generated for several values of random Delta v standard deviations using expansions in terms of infinite hypergeometric series. The present investigation uses a different technique (Monte Carlo simulation) to generate Delta v magnitude statistics for a wider selection of random Delta v standard deviations and also extends the analysis to the case of nonzero deterministic Delta v's. These Delta v magnitude statistics are plotted parametrically. The plots are useful in assisting the analyst in quickly answering questions about the statistics of Delta v magnitude for single TCMs consisting of both a deterministic and a random component. The plots provide quick insight into the nature of the Delta v magnitude distribution for the TCM.
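
    The Monte Carlo technique referred to above is simple to reproduce in outline: draw the random execution error, add the deterministic bias vector, and tabulate the magnitude statistics. The bias, error level and reported percentile below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(7)

def delta_v_magnitude_stats(bias, sigma, n=200_000):
    """Monte Carlo statistics of |Delta v| when Delta v is a fixed deterministic
    vector plus an isotropic Gaussian error with per-axis standard deviation sigma."""
    dv = np.asarray(bias) + sigma * rng.standard_normal((n, 3))
    mag = np.linalg.norm(dv, axis=1)
    return mag.mean(), np.percentile(mag, 99)

# Hypothetical maneuver: 5 m/s deterministic component, 1 m/s per-axis random error.
mean_mag, p99 = delta_v_magnitude_stats(bias=[5.0, 0.0, 0.0], sigma=1.0)
print(f"mean |dv| = {mean_mag:.2f} m/s, 99th percentile = {p99:.2f} m/s")
```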

  14. Spatial distribution of impact craters on Deimos

    NASA Astrophysics Data System (ADS)

    Hirata, Naoyuki

    2017-05-01

    Deimos, one of the Martian moons, has numerous impact craters. However, it is unclear whether crater saturation has been reached on this satellite. To address this issue, we apply a statistical test known as nearest-neighbor analysis to analyze the crater distribution of Deimos. When a planetary surface such as the Moon is saturated with impact craters, the spatial distribution of craters is generally changed from random to more ordered. We measured impact craters on Deimos from Viking and HiRISE images and found (1) that the power law of the size-frequency distribution of the craters is approximately -1.7, which is significantly shallower than those of potential impactors, and (2) that the spatial distribution of craters over 30 m in diameter cannot be statistically distinguished from completely random distribution, which indicates that the surface of Deimos is inconsistent with a surface saturated with impact craters. Although a crater size-frequency distribution curve with a slope of -2 is generally interpreted as indicating saturation equilibrium, it is here proposed that two competing mechanisms, seismic shaking and ejecta emplacement, have played a major role in erasing craters on Deimos and are therefore responsible for the shallow slope of this curve. The observed crater density may have reached steady state owing to the obliterations induced by the two competing mechanisms. Such an occurrence indicates that the surface is saturated with impact craters despite the random distribution of craters on Deimos. Therefore, this work proposes that the age determined by the current craters on Deimos reflects neither the age of Deimos itself nor that of the formation of the large concavity centered at its south pole because craters should be removed by later impacts. However, a few of the largest craters on Deimos may be indicative of the age of the south pole event.
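
    A toy version of a nearest-neighbour test for complete spatial randomness, here the classical Clark-Evans ratio on a flat patch, which is only a rough analogue of an analysis on a curved, edge-free satellite surface.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(8)

def clark_evans_R(points, area):
    """Clark-Evans ratio: mean nearest-neighbour distance over its expectation
    for complete spatial randomness.  R ~ 1 random, R < 1 clustered, R > 1 ordered."""
    tree = cKDTree(points)
    d, _ = tree.query(points, k=2)        # k=2: nearest neighbour other than self
    mean_nn = d[:, 1].mean()
    density = len(points) / area
    return mean_nn / (0.5 / np.sqrt(density))

# Toy crater catalogue on a flat 10 km x 10 km patch (real analyses would also
# account for edge effects and crater sizes).
craters = rng.uniform(0, 10_000, size=(300, 2))
print("Clark-Evans R for random points:", round(clark_evans_R(craters, 10_000 ** 2), 3))
```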

  15. Is Mutation Random or Targeted?: No Evidence for Hypermutability in Snail Toxin Genes.

    PubMed

    Roy, Scott W

    2016-10-01

    Ever since Luria and Delbruck, the notion that mutation is random with respect to fitness has been foundational to modern biology. However, various studies have claimed striking exceptions to this rule. One influential case involves toxin-encoding genes in snails of the genus Conus, termed conotoxins, a large gene family that undergoes rapid diversification of their protein-coding sequences by positive selection. Previous reconstructions of the sequence evolution of conotoxin genes claimed striking patterns: (1) elevated synonymous change, interpreted as being due to targeted "hypermutation" in this region; (2) elevated transversion-to-transition ratios, interpreted as reflective of the particular mechanism of hypermutation; and (3) much lower rates of synonymous change in the codons encoding several highly conserved cysteine residues, interpreted as strong position-specific codon bias. This work has spawned a variety of studies on the potential mechanisms of hypermutation and on causes for cysteine codon bias, and has inspired hypermutation hypotheses for various other fast-evolving genes. Here, I show that all three findings are likely to be artifacts of statistical reconstruction. First, by simulating nonsynonymous change I show that high rates of dN can lead to overestimation of dS. Second, I show that there is no evidence for any of these three patterns in comparisons of closely related conotoxin sequences, suggesting that the reported findings are due to breakdown of statistical methods at high levels of sequence divergence. The current findings suggest that mutation and codon bias in conotoxin genes may not be atypical, and that random mutation and selection can explain the evolution of even these exceptional loci. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. Statistical metrology—measurement and modeling of variation for advanced process development and design rule generation

    NASA Astrophysics Data System (ADS)

    Boning, Duane S.; Chung, James E.

    1998-11-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments (including the design of test structures and test masks to gather electrical or optical data), techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.

  17. Addressing issues associated with evaluating prediction models for survival endpoints based on the concordance statistic.

    PubMed

    Wang, Ming; Long, Qi

    2016-09-01

    Prediction models for disease risk and prognosis play an important role in biomedical research, and evaluating their predictive accuracy in the presence of censored data is of substantial interest. The standard concordance (c) statistic has been extended to provide a summary measure of predictive accuracy for survival models. Motivated by a prostate cancer study, we address several issues associated with evaluating survival prediction models based on c-statistic with a focus on estimators using the technique of inverse probability of censoring weighting (IPCW). Compared to the existing work, we provide complete results on the asymptotic properties of the IPCW estimators under the assumption of coarsening at random (CAR), and propose a sensitivity analysis under the mechanism of noncoarsening at random (NCAR). In addition, we extend the IPCW approach as well as the sensitivity analysis to high-dimensional settings. The predictive accuracy of prediction models for cancer recurrence after prostatectomy is assessed by applying the proposed approaches. We find that the estimated predictive accuracy for the models in consideration is sensitive to NCAR assumption, and thus identify the best predictive model. Finally, we further evaluate the performance of the proposed methods in both settings of low-dimensional and high-dimensional data under CAR and NCAR through simulations. © 2016, The International Biometric Society.

  18. Dependence of prevalence of contiguous pathways in proteins on structural complexity.

    PubMed

    Thayer, Kelly M; Galganov, Jesse C; Stein, Avram J

    2017-01-01

    Allostery is a regulatory mechanism in proteins where an effector molecule binds distal from an active site to modulate its activity. Allosteric signaling may occur via a continuous path of residues linking the active and allosteric sites, which has been suggested by large conformational changes evident in crystal structures. An alternate possibility is that the signal occurs in the realm of ensemble dynamics via an energy landscape change. While the latter was first proposed on theoretical grounds, increasing evidence suggests that such a control mechanism is plausible. A major difficulty for testing the two methods is the ability to definitively determine that a residue is directly involved in allosteric signal transduction. Statistical Coupling Analysis (SCA) is a method that has been successful at predicting pathways, and experimental tests involving mutagenesis or domain substitution provide the best available evidence of signaling pathways. However, ascertaining energetic pathways which need not be contiguous is far more difficult. To date, simple estimates of the statistical significance of a pathway in a protein remain to be established. The focus of this work is to estimate such benchmarks for the statistical significance of contiguous pathways for the null model of selecting residues at random. We found that when 20% of residues in proteins are randomly selected, contiguous pathways at the 6 Å cutoff level were found with success rates of 51% in PDZ, 30% in p53, and 3% in MutS. The results suggest that the significance of pathways may have system specific factors involved. Furthermore, the possible existence of false positives for contiguous pathways implies that signaling could be occurring via alternate routes including those consistent with the energetic landscape model.
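
    The null model described above can be sketched directly: select 20% of residues at random, build a contact graph with a 6 Å cutoff, and ask whether a contiguous path links two chosen sites. The "protein" below is a random compact coil rather than a real structure, so the success rate it reports is purely illustrative.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(9)

def has_contiguous_path(coords, selected, site_a, site_b, cutoff=6.0):
    """BFS over the contact graph (pairwise distance <= cutoff) restricted to the
    selected residues plus the two end sites; True if a contiguous residue
    pathway links site_a to site_b."""
    nodes = set(selected) | {site_a, site_b}
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    frontier, seen = deque([site_a]), {site_a}
    while frontier:
        u = frontier.popleft()
        if u == site_b:
            return True
        for v in nodes - seen:
            if d[u, v] <= cutoff:
                seen.add(v)
                frontier.append(v)
    return False

# Toy "protein": a compact random coil of 150 residues (not a real structure).
n = 150
coords = np.cumsum(rng.normal(0, 2.2, size=(n, 3)), axis=0)
hits = sum(
    has_contiguous_path(coords, rng.choice(n, size=n // 5, replace=False), 0, n - 1)
    for _ in range(200)
)
print(f"contiguous paths found in {hits}/200 random 20% selections")
```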

  19. Exploratory plasma proteomic analysis in a randomized crossover trial of aspirin among healthy men and women.

    PubMed

    Wang, Xiaoliang; Shojaie, Ali; Zhang, Yuzheng; Shelley, David; Lampe, Paul D; Levy, Lisa; Peters, Ulrike; Potter, John D; White, Emily; Lampe, Johanna W

    2017-01-01

    Long-term use of aspirin is associated with lower risk of colorectal cancer and other cancers; however, the mechanism of chemopreventive effect of aspirin is not fully understood. Animal studies suggest that COX-2, NFκB signaling and Wnt/β-catenin pathways may play a role, but no clinical trials have systematically evaluated the biological response to aspirin in healthy humans. Using a high-density antibody array, we assessed the difference in plasma protein levels after 60 days of regular dose aspirin (325 mg/day) compared to placebo in a randomized double-blinded crossover trial of 44 healthy non-smoking men and women, aged 21-45 years. The plasma proteome was analyzed on an antibody microarray with ~3,300 full-length antibodies, printed in triplicate. Moderated paired t-tests were performed on individual antibodies, and gene-set analyses were performed based on KEGG and GO pathways. Among the 3,000 antibodies analyzed, statistically significant differences in plasma protein levels were observed for nine antibodies after adjusting for false discoveries (FDR adjusted p-value<0.1). The most significant protein was succinate dehydrogenase subunit C (SDHC), a key enzyme complex of the mitochondrial tricarboxylic acid (TCA) cycle. The other statistically significant proteins (NR2F1, MSI1, MYH1, FOXO1, KHDRBS3, NFKBIE, LYZ and IKZF1) are involved in multiple pathways, including DNA base-pair repair, inflammation and oncogenic pathways. None of the 258 KEGG and 1,139 GO pathways was found to be statistically significant after FDR adjustment. This study suggests several chemopreventive mechanisms of aspirin in humans, which have previously been reported to play a role in anti- or pro-carcinogenesis in cell systems; however, larger, confirmatory studies are needed.
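
    The statistical pipeline (paired tests per protein followed by false-discovery-rate adjustment) can be sketched as below; note that it uses ordinary rather than moderated paired t-tests, and the simulated data and effect sizes are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)

# Simulated crossover data: 44 subjects x 3000 proteins, aspirin vs placebo,
# with a handful of proteins given a true treatment shift.
n_subj, n_prot = 44, 3000
placebo = rng.normal(0, 1, size=(n_subj, n_prot))
aspirin = placebo + rng.normal(0, 0.5, size=(n_subj, n_prot))
aspirin[:, :9] += 0.4                      # true signal in the first nine proteins

t, p = stats.ttest_rel(aspirin, placebo, axis=0)   # ordinary paired t-tests

def benjamini_hochberg(pvals):
    """Benjamini-Hochberg adjusted p-values (false-discovery-rate control)."""
    m = len(pvals)
    order = np.argsort(pvals)
    scaled = pvals[order] * m / np.arange(1, m + 1)
    adj = np.empty(m)
    adj[order] = np.minimum.accumulate(scaled[::-1])[::-1]
    return np.clip(adj, 0, 1)

q = benjamini_hochberg(p)
print("proteins with FDR-adjusted p < 0.1:", int((q < 0.1).sum()))
```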

  20. Quantifying fluctuations in economic systems by adapting methods of statistical physics

    NASA Astrophysics Data System (ADS)

    Stanley, H. E.; Gopikrishnan, P.; Plerou, V.; Amaral, L. A. N.

    2000-12-01

    The emerging subfield of econophysics explores the degree to which certain concepts and methods from statistical physics can be appropriately modified and adapted to provide new insights into questions that have been the focus of interest in the economics community. Here we give a brief overview of two examples of research topics that are receiving recent attention. A first topic is the characterization of the dynamics of stock price fluctuations. For example, we investigate the relation between trading activity - measured by the number of transactions N_Δt - and the price change G_Δt for a given stock, over a time interval [t, t + Δt]. We relate the time-dependent standard deviation of price fluctuations - volatility - to two microscopic quantities: the number of transactions N_Δt in Δt and the variance W²_Δt of the price changes for all transactions in Δt. Our work indicates that while the pronounced tails in the distribution of price fluctuations arise from W_Δt, the long-range correlations found in |G_Δt| are largely due to N_Δt. We also investigate the relation between price fluctuations and the number of shares Q_Δt traded in Δt. We find that the distribution of Q_Δt is consistent with a stable Lévy distribution, suggesting a Lévy scaling relationship between Q_Δt and N_Δt, which would provide one explanation for volume-volatility co-movement. A second topic concerns cross-correlations between the price fluctuations of different stocks. We adapt a conceptual framework, random matrix theory (RMT), first used in physics to interpret statistical properties of nuclear energy spectra. RMT makes predictions for the statistical properties of matrices that are universal, that is, do not depend on the interactions between the elements comprising the system. In physics systems, deviations from the predictions of RMT provide clues regarding the mechanisms controlling the dynamics of a given system, so this framework can be of potential value if applied to economic systems. We discuss a systematic comparison between the statistics of the cross-correlation matrix C - whose elements C_ij are the correlation coefficients between the returns of stock i and j - and that of a random matrix having the same symmetry properties. Our work suggests that RMT can be used to distinguish random and non-random parts of C; the non-random part of C, which deviates from RMT results, provides information regarding genuine cross-correlations between stocks.
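
    The RMT comparison sketched above amounts to checking the eigenvalues of an empirical correlation matrix against the Marchenko-Pastur bulk expected for purely random data; a minimal version, with surrogate returns standing in for real price data, follows.

```python
import numpy as np

rng = np.random.default_rng(11)

# Surrogate returns for N stocks over T days (pure noise; real data would
# replace this matrix).
N, T = 100, 500
returns = rng.standard_normal((T, N))
returns = (returns - returns.mean(0)) / returns.std(0)

C = returns.T @ returns / T                   # empirical correlation matrix
eigvals = np.linalg.eigvalsh(C)

# Marchenko-Pastur bounds for a purely random correlation matrix with Q = T/N.
Q = T / N
lam_min = (1 - np.sqrt(1 / Q)) ** 2
lam_max = (1 + np.sqrt(1 / Q)) ** 2
outliers = eigvals[(eigvals < lam_min) | (eigvals > lam_max)]
print(f"RMT bulk: [{lam_min:.2f}, {lam_max:.2f}], eigenvalues outside bulk: {len(outliers)}")
```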

  1. The non-equilibrium allele frequency spectrum in a Poisson random field framework.

    PubMed

    Kaj, Ingemar; Mugal, Carina F

    2016-10-01

    In population genetic studies, the allele frequency spectrum (AFS) efficiently summarizes genome-wide polymorphism data and shapes a variety of allele frequency-based summary statistics. While existing theory typically features equilibrium conditions, emerging methodology requires an analytical understanding of the build-up of the allele frequencies over time. In this work, we use the framework of Poisson random fields to derive new representations of the non-equilibrium AFS for the case of a Wright-Fisher population model with selection. In our approach, the AFS is a scaling-limit of the expectation of a Poisson stochastic integral and the representation of the non-equilibrium AFS arises in terms of a fixation time probability distribution. The known duality between the Wright-Fisher diffusion process and a birth and death process generalizing Kingman's coalescent yields an additional representation. The results carry over to the setting of a random sample drawn from the population and provide the non-equilibrium behavior of sample statistics. Our findings are consistent with and extend a previous approach where the non-equilibrium AFS solves a partial differential forward equation with a non-traditional boundary condition. Moreover, we provide a bridge to previous coalescent-based work, and hence tie several frameworks together. Since frequency-based summary statistics are widely used in population genetics, for example, to identify candidate loci of adaptive evolution, to infer the demographic history of a population, or to improve our understanding of the underlying mechanics of speciation events, the presented results are potentially useful for a broad range of topics. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Applications of the first digit law to measure correlations.

    PubMed

    Gramm, R; Yost, J; Su, Q; Grobe, R

    2017-04-01

    The quasiempirical Benford law predicts that the distribution of the first significant digit of random numbers obtained from mixed probability distributions is surprisingly meaningful and reveals some universal behavior. We generalize this finding to examine the joint first-digit probability of a pair of two random numbers and show that undetectable correlations by means of the usual covariance-based measure can be identified in the statistics of the corresponding first digits. We illustrate this new measure by analyzing the correlations and anticorrelations of the positions of two interacting particles in their quantum mechanical ground state. This suggests that by using this measure, the presence or absence of correlations can be determined even if only the first digit of noisy experimental data can be measured accurately.
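
    A possible realization of such a first-digit dependence measure (one of several reasonable choices; the total-variation distance used here is an assumption, not necessarily the statistic of the paper).

```python
import numpy as np

rng = np.random.default_rng(12)

def first_digit(x):
    """Leading significant digit of |x| (x assumed non-zero)."""
    x = np.abs(x)
    return (x / 10.0 ** np.floor(np.log10(x))).astype(int)

def digit_dependence(x, y):
    """Total-variation distance between the joint first-digit distribution of
    (x, y) and the product of its marginals; zero means independent digits."""
    dx, dy = first_digit(x), first_digit(y)
    joint = np.zeros((9, 9))
    for a, b in zip(dx, dy):
        joint[a - 1, b - 1] += 1
    joint /= joint.sum()
    prod = joint.sum(1, keepdims=True) * joint.sum(0, keepdims=True)
    return 0.5 * np.abs(joint - prod).sum()

n = 50_000
u = rng.lognormal(0, 2, n)
v_indep = rng.lognormal(0, 2, n)
v_corr = u * rng.lognormal(0, 0.1, n)          # strongly coupled to u
print("independent pair :", round(digit_dependence(u, v_indep), 4))
print("correlated pair  :", round(digit_dependence(u, v_corr), 4))
```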

  3. Generalized energy detector for weak random signals via vibrational resonance

    NASA Astrophysics Data System (ADS)

    Ren, Yuhao; Pan, Yan; Duan, Fabing

    2018-03-01

    In this paper, the generalized energy (GE) detector is investigated for detecting weak random signals via vibrational resonance (VR). By artificially injecting the high-frequency sinusoidal interferences into an array of GE statistics formed for the detector, we show that the normalized asymptotic efficacy can be maximized when the interference intensity takes an appropriate non-zero value. It is demonstrated that the normalized asymptotic efficacy of the dead-zone-limiter detector, aided by the VR mechanism, outperforms that of the GE detector without the help of high-frequency interferences. Moreover, the maximum normalized asymptotic efficacy of dead-zone-limiter detectors can approach a quarter of the second-order Fisher information for a wide range of non-Gaussian noise types.

  4. Giant mesoscopic fluctuations of the elastic cotunneling thermopower of a single-electron transistor

    NASA Astrophysics Data System (ADS)

    Vasenko, A. S.; Basko, D. M.; Hekking, F. W. J.

    2015-02-01

    We study the thermoelectric transport of a small metallic island weakly coupled to two electrodes by tunnel junctions. In the Coulomb blockade regime, in the case when the ground state of the system corresponds to an even number of electrons on the island, the main mechanism of electron transport at the lowest temperatures is elastic cotunneling. In this regime, the transport coefficients strongly depend on the realization of the random impurity potential or the shape of the island. Using random-matrix theory, we calculate the thermopower and the thermoelectric kinetic coefficient and study the statistics of their mesoscopic fluctuations in the elastic cotunneling regime. The fluctuations of the thermopower turn out to be much larger than the average value.

  5. Robust PRNG based on homogeneously distributed chaotic dynamics

    NASA Astrophysics Data System (ADS)

    Garasym, Oleg; Lozi, René; Taralova, Ina

    2016-02-01

    This paper is devoted to the design of a new chaotic Pseudo-Random Number Generator (CPRNG). Exploring several topologies of networks of coupled 1-D chaotic maps, we focus first on two-dimensional networks. Two topologically coupled maps are studied: TTL rc non-alternate, and TTL SC alternate. The primary idea of the novel maps is an original coupling of the tent and logistic maps that achieves excellent random properties and a homogeneous (uniform) density in the phase plane, thus guaranteeing maximum security when used for chaos-based cryptography. To this aim, two new nonlinear CPRNGs, MTTL 2 sc and NTTL 2, are proposed. The maps successfully passed numerous statistical, graphical and numerical tests, thanks to the proposed ring coupling and injection mechanisms.
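
    A minimal sketch, assuming a generic ring-like coupling of one tent map and one logistic map (not the exact TTL rc / TTL SC topologies or the MTTL 2 sc / NTTL 2 generators of the paper), showing the kind of two-dimensional coupled-map iteration involved and how the flatness of the output density can be checked with a histogram.

    ```python
    import numpy as np

    def tent(x):
        return 2 * x if x < 0.5 else 2 * (1 - x)

    def logistic(x):
        return 4.0 * x * (1.0 - x)

    def coupled_cprng(n, x0=0.123456, y0=0.654321, eps=0.05):
        """Illustrative coupling of a tent map and a logistic map.

        This is a generic coupling for demonstration only; the TTL maps in the
        cited paper use a specific ring coupling and injection mechanism.
        """
        x, y = x0, y0
        out = np.empty(n)
        for i in range(n):
            x_new = (1 - eps) * tent(x) + eps * logistic(y)
            y_new = (1 - eps) * logistic(y) + eps * tent(x)
            x, y = x_new % 1.0, y_new % 1.0
            out[i] = x
        return out

    samples = coupled_cprng(100_000)
    hist, _ = np.histogram(samples, bins=10, range=(0, 1), density=True)
    print("output density per decile (a flat profile near 1.0 would be uniform):")
    print(np.round(hist, 2))
    ```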

  6. Coverage-maximization in networks under resource constraints.

    PubMed

    Nandi, Subrata; Brusch, Lutz; Deutsch, Andreas; Ganguly, Niloy

    2010-06-01

    Efficient coverage algorithms are essential for information search or dispersal in all kinds of networks. We define an extended coverage problem which accounts for constrained resources of consumed bandwidth B and time T. Our solution to the network challenge is here studied for regular grids only. Using methods from statistical mechanics, we develop a coverage algorithm with proliferating message packets and temporally modulated proliferation rate. The algorithm performs as efficiently as a single random walker but O(B^((d-2)/d)) times faster, resulting in significant service speed-up on a regular grid of dimension d. The algorithm is numerically compared to a class of generalized proliferating random walk strategies and on regular grids shown to perform best in terms of the product metric of speed and efficiency.

  7. Universal statistics of vortex tangles in three-dimensional random waves

    NASA Astrophysics Data System (ADS)

    Taylor, Alexander J.

    2018-02-01

    The tangled nodal lines (wave vortices) in random, three-dimensional wavefields are studied as an exemplar of a fractal loop soup. Their statistics are a three-dimensional counterpart to the characteristic random behaviour of nodal domains in quantum chaos, but in three dimensions the filaments can wind around one another to give distinctly different large scale behaviours. By tracing numerically the structure of the vortices, their conformations are shown to follow recent analytical predictions for random vortex tangles with periodic boundaries, where the local disorder of the model ‘averages out’ to produce large scale power law scaling relations whose universality classes do not depend on the local physics. These results explain previous numerical measurements in terms of an explicit effect of the periodic boundaries, where the statistics of the vortices are strongly affected by the large scale connectedness of the system even at arbitrarily high energies. The statistics are investigated primarily for static (monochromatic) wavefields, but the analytical results are further shown to directly describe the reconnection statistics of vortices evolving in certain dynamic systems, or occurring during random perturbations of the static configuration.

  8. Mendelian randomization analysis associates increased serum urate, due to genetic variation in uric acid transporters, with improved renal function.

    PubMed

    Hughes, Kim; Flynn, Tanya; de Zoysa, Janak; Dalbeth, Nicola; Merriman, Tony R

    2014-02-01

    Increased serum urate predicts chronic kidney disease independent of other risk factors. The use of xanthine oxidase inhibitors coincides with improved renal function. Whether this is due to reduced serum urate or reduced production of oxidants by xanthine oxidase or another physiological mechanism remains unresolved. Here we applied Mendelian randomization, a statistical genetics approach allowing disentangling of cause and effect in the presence of potential confounding, to determine whether lowering of serum urate by genetic modulation of renal excretion benefits renal function using data from 7979 patients of the Atherosclerosis Risk in Communities and Framingham Heart studies. Mendelian randomization by the two-stage least squares method was done with serum urate as the exposure, a uric acid transporter genetic risk score as instrumental variable, and estimated glomerular filtration rate and serum creatinine as the outcomes. Increased genetic risk score was associated with significantly improved renal function in men but not in women. Analysis of individual genetic variants showed the effect size associated with serum urate did not correlate with that associated with renal function in the Mendelian randomization model. This is consistent with the possibility that the physiological action of these genetic variants in raising serum urate correlates directly with improved renal function. Further studies are required to understand the mechanism of the potential renal function protection mediated by xanthine oxidase inhibitors.
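
    A minimal two-stage least squares sketch on simulated data, illustrating the Mendelian randomization design described above (a genetic risk score as instrument, serum urate as exposure, renal function as outcome). All variable names, sample sizes and effect sizes below are illustrative assumptions, not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 5000

    # Simulated data (illustrative): genetic risk score G affects urate U;
    # a confounder influences both urate and renal function (eGFR).
    G = rng.binomial(10, 0.3, size=n).astype(float)       # risk-score-like instrument
    confounder = rng.normal(size=n)
    U = 0.5 * G + 1.0 * confounder + rng.normal(size=n)    # exposure: serum urate
    true_causal_effect = -0.3
    eGFR = true_causal_effect * U - 1.0 * confounder + rng.normal(size=n)  # outcome

    def ols(y, X):
        """Intercept and slope from least squares of y on a single regressor X."""
        design = np.column_stack([np.ones_like(y), X])
        return np.linalg.lstsq(design, y, rcond=None)[0]

    # Naive regression of outcome on exposure is biased by the confounder.
    naive = ols(eGFR, U)[1]

    # Two-stage least squares: (1) regress exposure on the instrument,
    # (2) regress outcome on the fitted exposure.
    stage1 = ols(U, G)
    U_hat = stage1[0] + stage1[1] * G
    tsls = ols(eGFR, U_hat)[1]

    print(f"true causal effect: {true_causal_effect}")
    print(f"naive OLS estimate: {naive:.3f}  (confounded)")
    print(f"2SLS (MR) estimate: {tsls:.3f}")
    ```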

  9. Elasticity and Fluctuations of Incompatible Nanoribbons

    NASA Astrophysics Data System (ADS)

    Grossman, Doron; Sharon, Eran; Diamant, Haim

    Geometrically incompatible ribbons are ubiquitous in nature, from growing biological tissues to self-assemblies of peptides and lipids. These exhibit unusual characteristics such as shape bifurcations and abnormal mechanical properties. For nano- and micro-ribbons, thermal fluctuations convert these properties into nontrivial statistics. We derive a reduced quasi-one-dimensional theory, which describes a wide range of incompatible elastic ribbons and can be integrated into a statistical mechanics formalism. Using it, we compute equilibrium configurations and statistical properties of two types of incompatible ribbons with experimental significance: ribbons with positive spontaneous curvature and ribbons with negative spontaneous curvature. The former, above a critical width, has a continuous family of degenerate configurations, which in turn causes the ribbons to behave as random coils. The latter, however, exhibits a twisted-to-helical transition at a critical width and behaves as an abnormal coil. Its persistence length is non-monotonic in the ribbon width and vanishes at a critical width, with principal modes of deformation different from those of compatible ribbons. Measurements of twisted ribbons made of chiral peptides confirm some predictions of the model. European Research Council SoftGrowth project and The Harvey M. Kruger Family Center of Nanoscience and Nanotechnology.

  10. Local dependence in random graph models: characterization, properties and statistical inference

    PubMed Central

    Schweinberger, Michael; Handcock, Mark S.

    2015-01-01

    Summary Dependent phenomena, such as relational, spatial and temporal phenomena, tend to be characterized by local dependence in the sense that units which are close in a well-defined sense are dependent. In contrast with spatial and temporal phenomena, though, relational phenomena tend to lack a natural neighbourhood structure in the sense that it is unknown which units are close and thus dependent. Owing to the challenge of characterizing local dependence and constructing random graph models with local dependence, many conventional exponential family random graph models induce strong dependence and are not amenable to statistical inference. We take first steps to characterize local dependence in random graph models, inspired by the notion of finite neighbourhoods in spatial statistics and M-dependence in time series, and we show that local dependence endows random graph models with desirable properties which make them amenable to statistical inference. We show that random graph models with local dependence satisfy a natural domain consistency condition which every model should satisfy, but conventional exponential family random graph models do not satisfy. In addition, we establish a central limit theorem for random graph models with local dependence, which suggests that random graph models with local dependence are amenable to statistical inference. We discuss how random graph models with local dependence can be constructed by exploiting either observed or unobserved neighbourhood structure. In the absence of observed neighbourhood structure, we take a Bayesian view and express the uncertainty about the neighbourhood structure by specifying a prior on a set of suitable neighbourhood structures. We present simulation results and applications to two real world networks with ‘ground truth’. PMID:26560142

  11. Subjective randomness as statistical inference.

    PubMed

    Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B

    2018-06-01

    Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.
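
    A minimal sketch of the statistical-inference view of randomness for binary sequences: a sequence's randomness is scored as the log-likelihood ratio of a fair-coin generator against a simple "regular" alternative, here a repetition-biased Markov chain. That alternative is an illustrative stand-in, far simpler than the models developed in the paper.

    ```python
    import numpy as np

    def log_prob_random(seq):
        """Log probability under a fair-coin (maximally random) generator."""
        return len(seq) * np.log(0.5)

    def log_prob_regular(seq, p_repeat=0.8):
        """Log probability under a simple 'regular' generator that tends to
        repeat the previous symbol (an illustrative stand-in for regularity)."""
        logp = np.log(0.5)  # first symbol is equally likely to be H or T
        for prev, cur in zip(seq[:-1], seq[1:]):
            logp += np.log(p_repeat if cur == prev else 1 - p_repeat)
        return logp

    def randomness_score(seq):
        """Evidence (in nats) that the sequence came from the random generator."""
        return log_prob_random(seq) - log_prob_regular(seq)

    for s in ["HHHHHHHH", "HTHTHTHT", "HTTHHHTH"]:
        print(s, f"{randomness_score(list(s)):+.2f}")
    ```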

  12. A statistical approach to selecting and confirming validation targets in -omics experiments

    PubMed Central

    2012-01-01

    Background Genomic technologies are, by their very nature, designed for hypothesis generation. In some cases, the hypotheses that are generated require that genome scientists confirm findings about specific genes or proteins. But one major advantage of high-throughput technology is that global genetic, genomic, transcriptomic, and proteomic behaviors can be observed. Manual confirmation of every statistically significant genomic result is prohibitively expensive. This has led researchers in genomics to adopt the strategy of confirming only a handful of the most statistically significant results, a small subset chosen for biological interest, or a small random subset. But there is no standard approach for selecting and quantitatively evaluating validation targets. Results Here we present a new statistical method and approach for statistically validating lists of significant results based on confirming only a small random sample. We apply our statistical method to show that the usual practice of confirming only the most statistically significant results does not statistically validate result lists. We analyze an extensively validated RNA-sequencing experiment to show that confirming a random subset can statistically validate entire lists of significant results. Finally, we analyze multiple publicly available microarray experiments to show that statistically validating random samples can both (i) provide evidence to confirm long gene lists and (ii) save thousands of dollars and hundreds of hours of labor over manual validation of each significant result. Conclusions For high-throughput -omics studies, statistical validation is a cost-effective and statistically valid approach to confirming lists of significant results. PMID:22738145
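
    A minimal sketch of the core idea, with made-up numbers: confirm a small random subset of a list of significant results and use an exact one-sided binomial (Clopper-Pearson-type) bound on the confirmed fraction to make a confidence statement about the whole list. The cited paper's method differs in its details.

    ```python
    from math import comb

    def exact_lower_bound(n_sampled, n_confirmed, alpha=0.05):
        """One-sided lower confidence bound on the true proportion of confirmable
        results in the full list, given n_confirmed of n_sampled random picks."""
        lo, hi = 0.0, 1.0
        for _ in range(60):  # bisection on the binomial upper tail
            p = (lo + hi) / 2
            tail = sum(comb(n_sampled, k) * p**k * (1 - p)**(n_sampled - k)
                       for k in range(n_confirmed, n_sampled + 1))
            if tail < alpha:
                lo = p
            else:
                hi = p
        return hi

    # Illustrative numbers: 28 of 30 randomly sampled significant results confirmed.
    bound = exact_lower_bound(n_sampled=30, n_confirmed=28)
    print(f"with 95% confidence, at least {bound:.1%} of the list is confirmable")
    ```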

  13. Modeling stock price dynamics by continuum percolation system and relevant complex systems analysis

    NASA Astrophysics Data System (ADS)

    Xiao, Di; Wang, Jun

    2012-10-01

    In this work, a continuum percolation system is developed to model a random stock price process. Recent empirical research has demonstrated various statistical features of stock price changes; a financial model aiming at understanding price fluctuations therefore needs to define a mechanism for price formation that reproduces and explains this set of empirical facts. The continuum percolation model is usually referred to as a random coverage process or a Boolean model; here, the local interaction or influence among traders is constructed by continuum percolation, and a percolation cluster is used to define the cluster of traders sharing the same opinion about the market. We investigate and analyze the statistical behaviors of the normalized returns of the price model by several analysis methods, including power-law tail distribution analysis, chaotic behavior analysis and Zipf analysis. Moreover, we consider the daily returns of the Shanghai Stock Exchange Composite Index from January 1997 to July 2011, and comparisons of return behaviors between the actual data and the simulation data are presented.

  14. Influence of Intradialytic Aerobic Training in Cerebral Blood Flow and Cognitive Function in Patients with Chronic Kidney Disease: A Pilot Randomized Controlled Trial.

    PubMed

    Stringuetta Belik, Fernanda; Oliveira E Silva, Viviana Rugolo; Braga, Gabriel Pereira; Bazan, Rodrigo; Perez Vogt, Barbara; Costa Teixeira Caramori, Jacqueline; Barretti, Pasqual; de Souza Gonçalves, Renato; Fortes Villas Bôas, Paulo José; Hueb, João Carlos; Martin, Luis Cuadrado; da Silva Franco, Roberto Jorge

    2018-06-07

    Changes in cerebral blood flow may play an important role in cognitive impairment among hemodialysis (HD) patients. Physical activity has a promising role in delaying cognitive impairment in general population, but there are only a few studies in HD to confirm this finding. We aimed to evaluate the effects of intradialytic aerobic training on cerebral blood flow and cognitive impairment in HD. This is a pilot, controlled, randomized trial. Fifteen patients underwent intradialytic aerobic training 3 times a week for 4 months. The control group was comprised of another 15 patients. Trained patients had a statistically significant improvement of cognitive impairment and basilar maximum blood flow velocity. The proportion of arteries with increased flow velocity was statistically significant between groups. Intradialytic aerobic training improves cognitive impairment and cerebral blood flow of patients in HD, suggesting a possible mechanism improving cognitive impairment by physical training in HD. These data still need to be confirmed by major trials. © 2018 S. Karger AG, Basel.

  15. Variable Selection in the Presence of Missing Data: Imputation-based Methods.

    PubMed

    Zhao, Yize; Long, Qi

    2017-01-01

    Variable selection plays an essential role in regression analysis as it identifies important variables that are associated with outcomes and is known to improve the predictive accuracy of the resulting models. Variable selection methods have been widely investigated for fully observed data. However, in the presence of missing data, methods for variable selection need to be carefully designed to account for missing data mechanisms and the statistical techniques used for handling missing data. Since imputation is arguably the most popular method for handling missing data due to its ease of use, statistical methods for variable selection that are combined with imputation are of particular interest. These methods, valid when used under the assumptions of missing at random (MAR) and missing completely at random (MCAR), largely fall into three general strategies. The first strategy applies existing variable selection methods to each imputed dataset and then combines the variable selection results across all imputed datasets. The second strategy applies existing variable selection methods to stacked imputed datasets. The third strategy combines resampling techniques such as the bootstrap with imputation. Despite recent advances, this area remains under-developed and offers fertile ground for further research.

  16. Random Fields

    NASA Astrophysics Data System (ADS)

    Vanmarcke, Erik

    1983-03-01

    Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.

  17. Memory matters: influence from a cognitive map on animal space use.

    PubMed

    Gautestad, Arild O

    2011-10-21

    A vertebrate individual's cognitive map provides a capacity for site fidelity and long-distance returns to favorable patches. Fractal-geometrical analysis of individual space use based on collection of telemetry fixes makes it possible to verify the influence of a cognitive map on the spatial scatter of habitat use and also to what extent space use has been of a scale-specific versus a scale-free kind. This approach rests on a statistical mechanical level of system abstraction, where micro-scale details of behavioral interactions are coarse-grained to macro-scale observables like the fractal dimension of space use. In this manner, the magnitude of the fractal dimension becomes a proxy variable for distinguishing between main classes of habitat exploration and site fidelity, like memory-less (Markovian) Brownian motion and Levy walk and memory-enhanced space use like Multi-scaled Random Walk (MRW). In this paper previous analyses are extended by exploring MRW simulations under three scenarios: (1) central place foraging, (2) behavioral adaptation to resource depletion (avoidance of latest visited locations) and (3) transition from MRW towards Levy walk by narrowing memory capacity to a trailing time window. A generalized statistical-mechanical theory with the power to model cognitive map influence on individual space use will be important for statistical analyses of animal habitat preferences and the mechanics behind site fidelity and home ranges. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Statistical patterns of visual search for hidden objects

    PubMed Central

    Credidio, Heitor F.; Teixeira, Elisângela N.; Reis, Saulo D. S.; Moreira, André A.; Andrade Jr, José S.

    2012-01-01

    The movement of the eyes has been the subject of intensive research as a way to elucidate inner mechanisms of cognitive processes. A cognitive task that is rather frequent in our daily life is the visual search for hidden objects. Here we investigate through eye-tracking experiments the statistical properties associated with the search of target images embedded in a landscape of distractors. Specifically, our results show that the twofold process of eye movement, composed of sequences of fixations (small steps) intercalated by saccades (longer jumps), displays characteristic statistical signatures. While the saccadic jumps follow a log-normal distribution of distances, which is typical of multiplicative processes, the lengths of the smaller steps in the fixation trajectories are consistent with a power-law distribution. Moreover, the present analysis reveals a clear transition between a directional serial search to an isotropic random movement as the difficulty level of the searching task is increased. PMID:23226829

  19. Effect of Oxide Interface Roughness on the Threshold Voltage Fluctuations in Decanano MOSFETs with Ultrathin Gate Oxides

    NASA Technical Reports Server (NTRS)

    Asenov, Asen; Kaya, S.

    2000-01-01

    In this paper we use the Density Gradient (DG) simulation approach to study, in 3-D, the effect of local oxide thickness fluctuations on the threshold voltage of decanano MOSFETs on a statistical scale. The random 2-D surfaces used to represent the interface are constructed using the standard assumptions for the auto-correlation function of the interface. The importance of the Quantum Mechanical effects when studying oxide thickness fluctuations are illustrated in several simulation examples.

  20. Directed polymers versus directed percolation

    NASA Astrophysics Data System (ADS)

    Halpin-Healy, Timothy

    1998-10-01

    Universality plays a central role within the rubric of modern statistical mechanics, wherein an insightful continuum formulation rises above irrelevant microscopic details, capturing essential scaling behaviors. Nevertheless, occasions do arise where the lattice or another discrete aspect can constitute a formidable legacy. Directed polymers in random media, along with its close sibling, directed percolation, provide an intriguing case in point. Indeed, the deep blood relation between these two models may have sabotaged past efforts to fully characterize the Kardar-Parisi-Zhang universality class, to which the directed polymer belongs.

  1. Randomized central limit theorems: A unified theory.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic-scaling all ensemble components by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic-scaling the ensemble components by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)-in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes-and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  2. Randomized central limit theorems: A unified theory

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles’ aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles’ extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic—scaling all ensemble components by a common deterministic scale. However, there are “random environment” settings in which the underlying scaling schemes are stochastic—scaling the ensemble components by different random scales. Examples of such settings include Holtsmark’s law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)—in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes—and present “randomized counterparts” to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  3. Theory of Financial Risk and Derivative Pricing

    NASA Astrophysics Data System (ADS)

    Bouchaud, Jean-Philippe; Potters, Marc

    2009-01-01

    Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.

  4. Theory of Financial Risk and Derivative Pricing - 2nd Edition

    NASA Astrophysics Data System (ADS)

    Bouchaud, Jean-Philippe; Potters, Marc

    2003-12-01

    Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.

  5. Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization

    ERIC Educational Resources Information Center

    Lock, Robin H.; Lock, Patti Frazer

    2008-01-01

    Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation-based procedures in an integrated curriculum…
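
    A minimal classroom-style sketch of the two procedures mentioned, on made-up data: a bootstrap confidence interval for a difference in group means and a randomization (permutation) test of the same difference.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Made-up measurements for two groups (illustrative data).
    group_a = np.array([4.1, 5.0, 4.8, 5.6, 4.3, 5.2, 4.9, 5.4])
    group_b = np.array([3.6, 4.2, 3.9, 4.5, 4.0, 3.8, 4.4, 4.1])
    observed = group_a.mean() - group_b.mean()

    # Bootstrap: resample each group with replacement and recompute the difference.
    boot = np.array([
        rng.choice(group_a, size=group_a.size, replace=True).mean()
        - rng.choice(group_b, size=group_b.size, replace=True).mean()
        for _ in range(10_000)
    ])
    ci = np.percentile(boot, [2.5, 97.5])

    # Randomization test: shuffle the group labels and recompute the difference.
    pooled = np.concatenate([group_a, group_b])

    def permuted_difference():
        shuffled = rng.permutation(pooled)
        return shuffled[:group_a.size].mean() - shuffled[group_a.size:].mean()

    perm = np.array([permuted_difference() for _ in range(10_000)])
    p_value = np.mean(np.abs(perm) >= abs(observed))

    print(f"observed difference: {observed:.2f}")
    print(f"95% bootstrap CI: [{ci[0]:.2f}, {ci[1]:.2f}]")
    print(f"two-sided randomization p-value: {p_value:.4f}")
    ```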

  6. The level crossing rates and associated statistical properties of a random frequency response function

    NASA Astrophysics Data System (ADS)

    Langley, Robin S.

    2018-03-01

    This work is concerned with the statistical properties of the frequency response function of the energy of a random system. Earlier studies have considered the statistical distribution of the function at a single frequency, or alternatively the statistics of a band-average of the function. In contrast the present analysis considers the statistical fluctuations over a frequency band, and results are obtained for the mean rate at which the function crosses a specified level (or equivalently, the average number of times the level is crossed within the band). Results are also obtained for the probability of crossing a specified level at least once, the mean rate of occurrence of peaks, and the mean trough-to-peak height. The analysis is based on the assumption that the natural frequencies and mode shapes of the system have statistical properties that are governed by the Gaussian Orthogonal Ensemble (GOE), and the validity of this assumption is demonstrated by comparison with numerical simulations for a random plate. The work has application to the assessment of the performance of dynamic systems that are sensitive to random imperfections.

  7. Coherent random lasing controlled by Brownian motion of the active scatterer

    NASA Astrophysics Data System (ADS)

    Liang, Shuofeng; Yin, Leicheng; Zhang, ZhenZhen; Xia, Jiangying; Xie, Kang; Zou, Gang; Hu, Zhijia; Zhang, Qijin

    2018-05-01

    The stability of the scattering loop is fundamental for coherent random lasing in a dynamic scattering system. In this work, fluorescence of DPP (N, N-di [3-(isobutyl polyhedral oligomeric silsesquioxanes) propyl] perylene diimide) is scattered to produce random lasing (RL), and we realize the transition from incoherent RL to coherent RL by controlling the Brownian motion of the scatterers (dimer aggregates of DPP) and the stability of the scattering loop. To produce coherent random lasers, the loop needs to maintain a stable state within the loop-stable time, which can be determined through controlled Brownian motion of the scatterers in the scattering system. The results show that the loop-stable time lies between 5.83 × 10^-5 s and 1.61 × 10^-4 s, based on the transition from coherent to incoherent random lasing. The time range can be tuned by finely controlling the viscosity of the solution. This work not only develops a method to predict the loop-stable time, but also advances the study of the relation between Brownian motion and random lasers, which opens the road to a variety of novel interdisciplinary investigations involving modern statistical mechanics and disordered photonics.

  8. A random walk model for evaluating clinical trials involving serial observations.

    PubMed

    Hopper, J L; Young, G P

    1988-05-01

    For clinical trials where the variable of interest is ordered and categorical (for example, disease severity, symptom scale), and where measurements are taken at intervals, it might be possible to achieve a greater discrimination between the efficacy of treatments by modelling each patient's progress as a stochastic process. The random walk is a simple, easily interpreted model that can be fitted by maximum likelihood using a maximization routine with inference based on standard likelihood theory. In general, the model can allow for randomly censored data and incorporate measured prognostic factors, and inference is conditional on the (possibly non-random) allocation of patients. Tests of fit and of model assumptions are proposed, and applications to two therapeutic trials of gastroenterological disorders are presented. The model gave measures of the rate of, and variability in, improvement for patients under different treatments. A small simulation study suggested that the model is more powerful than considering the difference between initial and final scores, even when applied to data generated by a mechanism other than the random walk model assumed in the analysis. It thus provides a useful additional statistical method for evaluating clinical trials.

  9. A random-sum Wilcoxon statistic and its application to analysis of ROC and LROC data.

    PubMed

    Tang, Liansheng Larry; Balakrishnan, N

    2011-01-01

    The Wilcoxon-Mann-Whitney statistic is commonly used for a distribution-free comparison of two groups. One requirement for its use is that the sample sizes of the two groups are fixed. This is violated in some of the applications such as medical imaging studies and diagnostic marker studies; in the former, the violation occurs since the number of correctly localized abnormal images is random, while in the latter the violation is due to some subjects not having observable measurements. For this reason, we propose here a random-sum Wilcoxon statistic for comparing two groups in the presence of ties, and derive its variance as well as its asymptotic distribution for large sample sizes. The proposed statistic includes the regular Wilcoxon rank-sum statistic. Finally, we apply the proposed statistic for summarizing location response operating characteristic data from a liver computed tomography study, and also for summarizing diagnostic accuracy of biomarker data.
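
    A minimal sketch of the fixed-sample ingredient underlying the proposed statistic: the Wilcoxon rank-sum statistic computed with mid-ranks for ties, applied to a comparison in which the size of one group is itself random, as in the random-sum setting described above. The Poisson group size and the simulated data are illustrative assumptions.

    ```python
    import numpy as np

    def wilcoxon_rank_sum(x, y):
        """Rank-sum statistic for group x, using mid-ranks to handle ties."""
        combined = np.concatenate([x, y])
        order = combined.argsort(kind="mergesort")
        ranks = np.empty_like(combined)
        ranks[order] = np.arange(1, combined.size + 1, dtype=float)
        # Replace ranks of tied values by their average (mid-ranks).
        for value in np.unique(combined):
            tie = combined == value
            ranks[tie] = ranks[tie].mean()
        return ranks[: x.size].sum()

    rng = np.random.default_rng(4)

    # Fixed-size reference group, and a group whose size is itself random
    # (e.g. the number of correctly localized images), here Poisson-distributed.
    y = np.round(rng.normal(0.0, 1.0, size=40), 1)
    n_random = rng.poisson(30)
    x = np.round(rng.normal(0.5, 1.0, size=n_random), 1)

    W = wilcoxon_rank_sum(x, y)
    print(f"random group size: {n_random}, rank-sum statistic W = {W:.1f}")
    ```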

  10. Robustness of optimal random searches in fragmented environments

    NASA Astrophysics Data System (ADS)

    Wosniack, M. E.; Santos, M. C.; Raposo, E. P.; Viswanathan, G. M.; da Luz, M. G. E.

    2015-05-01

    The random search problem is a challenging and interdisciplinary topic of research in statistical physics. Realistic searches usually take place in nonuniform heterogeneous distributions of targets, e.g., patchy environments and fragmented habitats in ecological systems. Here we present a comprehensive numerical study of search efficiency in arbitrarily fragmented landscapes with unlimited visits to targets that can only be found within patches. We assume a random walker selecting uniformly distributed turning angles and step lengths from an inverse power-law tailed distribution with exponent μ. Our main finding is that for a large class of fragmented environments the optimal strategy corresponds approximately to the same value μ_opt ≈ 2. Moreover, this exponent is indistinguishable from the well-known exact optimal value μ_opt = 2 for the low-density limit of homogeneously distributed revisitable targets. Surprisingly, the best search strategies do not depend (or depend only weakly) on the specific details of the fragmentation. Finally, we discuss the mechanisms behind this observed robustness and comment on the relevance of our results to both the random search theory in general, as well as specifically to the foraging problem in the biological context.
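
    A minimal sketch of the search-strategy ingredient described above: step lengths drawn by inverse-transform sampling from an inverse power-law tailed distribution with exponent μ, combined with uniformly distributed turning angles. The fragmented target landscape and truncation details of the study are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def power_law_steps(n, mu, l_min=1.0):
        """Step lengths with density p(l) ~ l^(-mu) for l >= l_min (mu > 1),
        generated by inverse-transform sampling."""
        u = rng.random(n)
        return l_min * (1.0 - u) ** (-1.0 / (mu - 1.0))

    def walk(n_steps, mu):
        """2-D random walk with power-law step lengths and uniform turning angles."""
        lengths = power_law_steps(n_steps, mu)
        angles = rng.uniform(0.0, 2.0 * np.pi, size=n_steps)
        xy = np.cumsum(np.column_stack([lengths * np.cos(angles),
                                        lengths * np.sin(angles)]), axis=0)
        return xy

    for mu in (1.5, 2.0, 3.0):
        end = walk(10_000, mu)[-1]
        print(f"mu = {mu}: displacement after 10000 steps = {np.hypot(*end):.1f}")
    ```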

  11. Listening to the Noise: Random Fluctuations Reveal Gene Network Parameters

    NASA Astrophysics Data System (ADS)

    Munsky, Brian; Trinh, Brooke; Khammash, Mustafa

    2010-03-01

    The cellular environment is abuzz with noise originating from the inherent random motion of reacting molecules in the living cell. In this noisy environment, clonal cell populations exhibit cell-to-cell variability that can manifest significant prototypical differences. Noise induced stochastic fluctuations in cellular constituents can be measured and their statistics quantified using flow cytometry, single molecule fluorescence in situ hybridization, time lapse fluorescence microscopy and other single cell and single molecule measurement techniques. We show that these random fluctuations carry within them valuable information about the underlying genetic network. Far from being a nuisance, the ever-present cellular noise acts as a rich source of excitation that, when processed through a gene network, carries its distinctive fingerprint that encodes a wealth of information about that network. We demonstrate that in some cases the analysis of these random fluctuations enables the full identification of network parameters, including those that may otherwise be difficult to measure. We use theoretical investigations to establish experimental guidelines for the identification of gene regulatory networks, and we apply these guideline to experimentally identify predictive models for different regulatory mechanisms in bacteria and yeast.

  12. Two-Year Trends of Taxane-Induced Neuropathy in Women Enrolled in a Randomized Trial of Acetyl-L-Carnitine (SWOG S0715).

    PubMed

    Hershman, Dawn L; Unger, Joseph M; Crew, Katherine D; Till, Cathee; Greenlee, Heather; Minasian, Lori M; Moinpour, Carol M; Lew, Danika L; Fehrenbacher, Louis; Wade, James L; Wong, Siu-Fun; Fisch, Michael J; Lynn Henry, N; Albain, Kathy S

    2018-06-01

    Chemotherapy-induced peripheral neuropathy (CIPN) is a common and disabling side effect of taxanes. Acetyl-L-carnitine (ALC) was unexpectedly found to increase CIPN in a randomized trial. We investigated the long-term patterns of CIPN among patients in this trial. S0715 was a randomized, double-blind, multicenter trial comparing ALC (1000 mg three times a day) with placebo for 24 weeks in women undergoing adjuvant taxane-based chemotherapy for breast cancer. CIPN was measured by the 11-item neurotoxicity (NTX) component of the FACT-Taxane scale at weeks 12, 24, 36, 52, and 104. We examined NTX scores over two years using linear mixed models for longitudinal data. Individual time points were examined using linear regression. Regression analyses included stratification factors and the baseline score as covariates. All statistical tests were two-sided. Four-hundred nine subjects were eligible for evaluation. Patients receiving ALC had a statistically significantly (P = .01) greater reduction in NTX scores (worse CIPN) of -1.39 points (95% confidence interval [CI] = -2.48 to -0.30) than the placebo group. These differences were particularly evident at weeks 24 (-1.68, 95% CI = -3.02 to -0.33), 36 (-1.37, 95% CI = -2.69 to -0.04), and 52 (-1.83, 95% CI = -3.35 to -0.32). At 104 weeks, 39.5% on the ALC arm and 34.4% on the placebo arm reported a five-point (10%) decrease from baseline. For both treatment groups, 104-week NTX scores were statistically significantly different compared with baseline (P < .001). For both groups, NTX scores were reduced from baseline and remained persistently low. Twenty-four weeks of ALC therapy resulted in statistically significantly worse CIPN over two years. Understanding the mechanism of this persistent effect may inform prevention and treatment strategies. Until then, the potential efficacy and harms of commonly used supplements should be rigorously studied.

  13. Randomized prospective crossover study of biphasic intermittent positive airway pressure ventilation (BIPAP) versus pressure support ventilation (PSV) in surgical intensive care patients.

    PubMed

    Elrazek, E Abd

    2004-10-01

    The aim of this prospective, randomized and crossover study was to assess the role of a relatively new mode of mechanical ventilation, biphasic intermittent positive airway pressure (BIPAP), in comparison to another well-established one, pressure-support ventilation (PSV), in surgical intensive care patients. Twenty-four generally stable patients, breathing on their own after short-term (< 24 hours) postoperative controlled mechanical ventilation (CMV), were randomized to start on either PSV or BIPAP, and indirect calorimetry measurements were performed after a 1-hour adaptation period at two time points: immediately after the investigated ventilatory mode was started and 1 hour later. Statistics included a two-tailed paired t-test to compare the two sets of data; p < 0.05 was considered significant. Oxygen consumption (VO2), energy expenditure (EE), carbon dioxide production (VCO2), and respiratory quotient (RQ) did not differ significantly between the two groups. There were also no significant differences regarding respiratory rate (RR), minute volume (MV) and arterial blood gas analysis (ABGs). Both modes of ventilation were well tolerated by all patients. PSV and BIPAP can be used for weaning patients comfortably in surgical intensive care after short-term postoperative ventilation. BIPAP may have the advantage of being smoother than PSV where no patient effort is required.

  14. Bootstrapping on Undirected Binary Networks Via Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Fushing, Hsieh; Chen, Chen; Liu, Shan-Yu; Koehl, Patrice

    2014-09-01

    We propose a new method inspired from statistical mechanics for extracting geometric information from undirected binary networks and generating random networks that conform to this geometry. In this method an undirected binary network is perceived as a thermodynamic system with a collection of permuted adjacency matrices as its states. The task of extracting information from the network is then reformulated as a discrete combinatorial optimization problem of searching for its ground state. To solve this problem, we apply multiple ensembles of temperature regulated Markov chains to establish an ultrametric geometry on the network. This geometry is equipped with a tree hierarchy that captures the multiscale community structure of the network. We translate this geometry into a Parisi adjacency matrix, which has a relative low energy level and is in the vicinity of the ground state. The Parisi adjacency matrix is then further optimized by making block permutations subject to the ultrametric geometry. The optimal matrix corresponds to the macrostate of the original network. An ensemble of random networks is then generated such that each of these networks conforms to this macrostate; the corresponding algorithm also provides an estimate of the size of this ensemble. By repeating this procedure at different scales of the ultrametric geometry of the network, it is possible to compute its evolution entropy, i.e. to estimate the evolution of its complexity as we move from a coarse to a fine description of its geometric structure. We demonstrate the performance of this method on simulated as well as real data networks.

  15. Randomized, double-blind, placebo-controlled trial of saw palmetto in men with lower urinary tract symptoms.

    PubMed

    Gerber, G S; Kuznetsov, D; Johnson, B C; Burstein, J D

    2001-12-01

    To assess the effects of saw palmetto on urinary symptoms, sexual function, and urinary flow rate in men with lower urinary tract symptoms using a double-blind, randomized, placebo-controlled trial. The eligible patients were 45 years of age or older and had an International Prostate Symptom Score of 8 or greater. After a 1-month placebo run-in period, 85 men were randomized to receive saw palmetto or placebo for 6 months. Patients were evaluated using the International Prostate Symptom Score, a sexual function questionnaire, and by measurement of the urinary flow rate. The mean symptom score decreased from 16.7 to 12.3 in the saw palmetto group compared with 15.8 to 13.6 in the placebo group (P = 0.038). The quality-of-life score improved to a greater degree in the saw palmetto group, but this difference was not statistically significant. No change occurred in the sexual function questionnaire results in either group. The peak flow rate increased by 1.0 mL/s and 1.4 mL/s in the saw palmetto and placebo groups, respectively (P = 0.73). Saw palmetto led to a statistically significant improvement in urinary symptoms in men with lower urinary tract symptoms compared with placebo. Saw palmetto had no measurable effect on the urinary flow rates. The mechanism by which saw palmetto improves urinary symptoms remains unknown.

  16. The invariant statistical rule of aerosol scattering pulse signal modulated by random noise

    NASA Astrophysics Data System (ADS)

    Yan, Zhen-gang; Bian, Bao-Min; Yang, Juan; Peng, Gang; Li, Zhen-hua

    2010-11-01

    A model of random background noise acting on particle signals is established to study how the background noise of the photoelectric sensor in a laser airborne particle counter affects the statistical character of the aerosol scattering pulse signals. The results show that the noise broadens the statistical distribution of the particle measurements. Further numerical study shows that the output signal amplitude retains the same type of distribution when airborne-particle signals with a lognormal distribution are modulated by random noise that is also lognormally distributed; that is, the statistics obey a law of invariance. Based on this model, the background noise of the photoelectric sensor and the counting distributions of the random aerosol scattering pulse signals are obtained and analyzed using a high-speed data acquisition card (PCI-9812). The experimental and simulation results are found to be in good agreement.
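
    A minimal numerical illustration of the stated invariance: multiplying a lognormally distributed pulse amplitude by an independent lognormal noise factor yields an amplitude that is again lognormal, because the log-amplitudes, being sums of independent normals, remain normal. The parameters below are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n = 500_000

    # Illustrative parameters: lognormal pulse amplitudes modulated by lognormal noise.
    signal = rng.lognormal(mean=1.0, sigma=0.4, size=n)
    noise = rng.lognormal(mean=0.0, sigma=0.2, size=n)
    output = signal * noise

    log_out = np.log(output)
    # If the invariance holds, log(output) is normal with the summed parameters.
    print(f"mean of log(output): {log_out.mean():.3f}  (expected 1.000)")
    print(f"std  of log(output): {log_out.std():.3f}  (expected {np.hypot(0.4, 0.2):.3f})")
    # Skewness and excess kurtosis of log(output) should be near 0 for a normal law.
    z = (log_out - log_out.mean()) / log_out.std()
    print(f"skewness: {np.mean(z**3):+.3f}, excess kurtosis: {np.mean(z**4) - 3:+.3f}")
    ```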

  17. The relationship between environmental exposure to cadmium and lead and blood selenium concentration in randomly selected population of children inhabiting industrial regions of Silesian Voivodship (Poland).

    PubMed

    Gać, P; Pawlas, N; Poręba, R; Poręba, M; Pawlas, K

    2014-06-01

    This study aimed at determining the relationship between environmental exposure to lead (Pb) and cadmium (Cd) and blood selenium (Se) concentration in a randomly selected population of children inhabiting the industrial regions of Silesian Voivodship, Poland. The study was conducted on a group of 349 consecutively and randomly selected children aged below 15 years and inhabiting the industrial regions in Upper Silesia. The examined variables included whole blood Cd concentration (Cd-B), whole blood Pb concentration (Pb-B) and whole blood Se concentration (Se-B). The concentration of Cd-B, Pb-B and Se-B in the studied group of children amounted to 0.26 ± 0.14, 37.62 ± 25.30 and 78.31 ± 12.82 μg/L, respectively. In the entire examined group a statistically significant negative linear relationship was noted between Pb-B and Se-B (r = -0.12, p < 0.05). Also, a statistically insignificant negative correlation was detected between Cd-B and Se-B (r = -0.02, p > 0.05) and a statistically insignificant positive correlation between Pb-B and Cd-B (r = 0.08, p > 0.05). A multivariate backward stepwise regression analysis demonstrated that in the studied group of children higher Pb-B and more advanced age represented independent risk factors for a decreased Se-B. Environmental exposure to Pb may represent an independent risk factor for Se deficit in the blood of the studied population of children. In children, lowered Se-B may constitute one of the mechanisms by which Pb unfavourably affects the human body. © The Author(s) 2014.

  18. Truly random number generation: an example

    NASA Astrophysics Data System (ADS)

    Frauchiger, Daniela; Renner, Renato

    2013-10-01

    Randomness is crucial for a variety of applications, ranging from gambling to computer simulations, and from cryptography to statistics. However, many of the currently used methods for generating randomness do not meet the criteria that are necessary for these applications to work properly and safely. A common problem is that a sequence of numbers may look random but nevertheless not be truly random. In fact, the sequence may pass all standard statistical tests and yet be perfectly predictable. This renders it useless for many applications. For example, in cryptography, the predictability of a "randomly" chosen password is obviously undesirable. Here, we review a recently developed approach to generating true, and hence unpredictable, randomness.

  19. Making statistical inferences about software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1988-01-01

    Failure times of software undergoing random debugging can be modelled as order statistics of independent but nonidentically distributed exponential random variables. Using this model, inferences can be made about current reliability and, if debugging continues, future reliability. This model also shows the difficulty inherent in statistical verification of very highly reliable software such as that used by digital avionics in commercial aircraft.

  20. Assessing Fun Items' Effectiveness in Increasing Learning of College Introductory Statistics Students: Results of a Randomized Experiment

    ERIC Educational Resources Information Center

    Lesser, Lawrence M.; Pearl, Dennis K.; Weber, John J., III

    2016-01-01

    There has been a recent emergence of scholarship on the use of fun in the college statistics classroom, with at least 20 modalities identified. While there have been randomized experiments that suggest that fun can enhance student achievement or attitudes in statistics, these studies have generally been limited to one particular fun modality or…

  1. Statistical Properties of Cell Topology and Geometry in a Tissue-Growth Model

    NASA Astrophysics Data System (ADS)

    Sahlin, Patrik; Hamant, Olivier; Jönsson, Henrik

    Statistical properties of cell topologies in two-dimensional tissues have recently been suggested to be a consequence of cell divisions. Different rules for the positioning of new walls in plants have been proposed, where e.g. Errera's rule states that new walls are added along the shortest possible path dividing the mother cell's volume into two equal parts. Here, we show that for an isotropically growing tissue Errera's rule results in the correct distributions of the number of cell neighbors as well as of cellular geometries, in contrast to a random division rule. Further, we show that wall mechanics constrains the isotropic growth such that the resulting cell shape distributions more closely agree with experimental data extracted from the shoot apex of Arabidopsis thaliana.

  2. Statistical auditing and randomness test of lotto k/N-type games

    NASA Astrophysics Data System (ADS)

    Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Rapallo, F.; Scalas, E.

    2008-11-01

    One of the most popular lottery games worldwide is the so-called “lotto k/N”. It considers N numbers 1,2,…,N from which k are drawn randomly, without replacement. A player selects k or more numbers and the first prize is shared amongst those players whose selected numbers match all of the k randomly drawn. Exact rules may vary in different countries. In this paper, mean values and covariances for the random variables representing the numbers drawn from this kind of game are presented, with the aim of using them to audit statistically the consistency of a given sample of historical results with theoretical values coming from a hypergeometric statistical model. The method can be adapted to test pseudorandom number generators.
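
    A minimal sketch of the kind of audit described above: for a lotto k/N game, the theoretical moments of numbers drawn without replacement from {1, ..., N} are compared with a simulated draw history standing in for real historical results. The game parameters below are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    N, k = 56, 6          # illustrative lotto k/N parameters
    n_draws = 2000        # simulated draw history standing in for real records

    # Theoretical moments for numbers drawn without replacement from {1, ..., N}.
    mean_theory = (N + 1) / 2
    var_theory = (N**2 - 1) / 12
    cov_theory = -(N + 1) / 12   # covariance between two distinct drawn numbers
    sum_mean_theory = k * mean_theory
    sum_var_theory = k * var_theory + k * (k - 1) * cov_theory

    # Simulated history of draws (each row: k numbers drawn without replacement).
    draws = np.array([rng.choice(np.arange(1, N + 1), size=k, replace=False)
                      for _ in range(n_draws)])
    sums = draws.sum(axis=1)

    print(f"mean of draw sums:     sample {sums.mean():8.2f}   theory {sum_mean_theory:8.2f}")
    print(f"variance of draw sums: sample {sums.var(ddof=1):8.2f}   theory {sum_var_theory:8.2f}")
    ```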

  3. Quantum Entanglement in Random Physical States

    NASA Astrophysics Data System (ADS)

    Hamma, Alioscia; Santra, Siddhartha; Zanardi, Paolo

    2012-07-01

    Most states in the Hilbert space are maximally entangled. This fact has proven useful to investigate—among other things—the foundations of statistical mechanics. Unfortunately, most states in the Hilbert space of a quantum many-body system are not physically accessible. We define physical ensembles of states acting on random factorized states by a circuit of length k of random and independent unitaries with local support. We study the typicality of entanglement by means of the purity of the reduced state. We find that for a time k=O(1), the typical purity obeys the area law. Thus, the upper bounds for area law are actually saturated, on average, with a variance that goes to zero for large systems. Similarly, we prove that by means of local evolution a subsystem of linear dimensions L is typically entangled with a volume law when the time scales with the size of the subsystem. Moreover, we show that for large values of k the reduced state becomes very close to the completely mixed state.

  4. Variational approach to probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1991-01-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.

  5. Variational approach to probabilistic finite elements

    NASA Astrophysics Data System (ADS)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1991-08-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.

  6. Variational approach to probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1987-01-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties, and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.

  7. Weighted networks as randomly reinforced urn processes

    NASA Astrophysics Data System (ADS)

    Caldarelli, Guido; Chessa, Alessandro; Crimaldi, Irene; Pammolli, Fabio

    2013-02-01

    We analyze weighted networks as randomly reinforced urn processes, in which the edge-total weights are determined by a reinforcement mechanism. We develop a statistical test and a procedure based on it to study the evolution of networks over time, detecting the “dominance” of some edges with respect to the others and then assessing whether a given instance of the network has reached its steady state. Distance from the steady state can be considered as a measure of the relevance of the observed properties of the network. Our results are quite general, in the sense that they are not based on a particular probability distribution or functional form of the random weights. Moreover, the proposed tool can also be applied to dense networks, which have received little attention from the network community so far, since they are often problematic. We apply our procedure in the context of the International Trade Network, determining a core of “dominant edges.”
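
    The urn mechanism referred to above is easy to simulate. The sketch below is a minimal illustration, not the paper's specification: it assumes a single edge modeled as a two-colour randomly reinforced urn with uniformly distributed random reinforcements, and tracks how the proportion of one colour (the edge's relative weight) evolves over time.

```python
import random

def reinforced_urn(steps=10_000, w0=(1.0, 1.0), seed=0):
    """Simulate a two-colour randomly reinforced urn.

    At each step a colour is drawn with probability proportional to its
    current total weight, and the drawn colour is reinforced by a random
    amount. Returns the trajectory of the proportion of colour 0.
    """
    rng = random.Random(seed)
    weights = list(w0)
    history = []
    for _ in range(steps):
        total = weights[0] + weights[1]
        drawn = 0 if rng.random() < weights[0] / total else 1
        weights[drawn] += rng.uniform(0.0, 1.0)  # random reinforcement
        history.append(weights[0] / sum(weights))
    return history

if __name__ == "__main__":
    trajectory = reinforced_urn()
    print("final proportion of colour 0:", round(trajectory[-1], 3))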

  8. Random versus maximum entropy models of neural population activity

    NASA Astrophysics Data System (ADS)

    Ferrari, Ulisse; Obuchi, Tomoyuki; Mora, Thierry

    2017-04-01

    The principle of maximum entropy provides a useful method for inferring statistical mechanics models from observations in correlated systems, and is widely used in a variety of fields where accurate data are available. While the assumptions underlying maximum entropy are intuitive and appealing, its adequacy for describing complex empirical data has been little studied in comparison to alternative approaches. Here, data from the collective spiking activity of retinal neurons is reanalyzed. The accuracy of the maximum entropy distribution constrained by mean firing rates and pairwise correlations is compared to a random ensemble of distributions constrained by the same observables. For most of the tested networks, maximum entropy approximates the true distribution better than the typical or mean distribution from that ensemble. This advantage improves with population size, with groups as small as eight being almost always better described by maximum entropy. Failure of maximum entropy to outperform random models is found to be associated with strong correlations in the population.

  9. Introductory Statistics Students' Conceptual Understanding of Study Design and Conclusions

    NASA Astrophysics Data System (ADS)

    Fry, Elizabeth Brondos

    Recommended learning goals for students in introductory statistics courses include the ability to recognize and explain the key role of randomness in designing studies and in drawing conclusions from those studies involving generalizations to a population or causal claims (GAISE College Report ASA Revision Committee, 2016). The purpose of this study was to explore introductory statistics students' understanding of the distinct roles that random sampling and random assignment play in study design and the conclusions that can be made from each. A study design unit lasting two and a half weeks was designed and implemented in four sections of an undergraduate introductory statistics course based on modeling and simulation. The research question that this study attempted to answer is: How does introductory statistics students' conceptual understanding of study design and conclusions (in particular, unbiased estimation and establishing causation) change after participating in a learning intervention designed to promote conceptual change in these areas? In order to answer this research question, a forced-choice assessment called the Inferences from Design Assessment (IDEA) was developed as a pretest and posttest, along with two open-ended assignments, a group quiz and a lab assignment. Quantitative analysis of IDEA results and qualitative analysis of the group quiz and lab assignment revealed that overall, students' mastery of study design concepts significantly increased after the unit, and the great majority of students successfully made the appropriate connections between random sampling and generalization, and between random assignment and causal claims. However, a small, but noticeable portion of students continued to demonstrate misunderstandings, such as confusion between random sampling and random assignment.

  10. Gift from statistical learning: Visual statistical learning enhances memory for sequence elements and impairs memory for items that disrupt regularities.

    PubMed

    Otsuka, Sachio; Saiki, Jun

    2016-02-01

    Prior studies have shown that visual statistical learning (VSL) enhances familiarity (a type of memory) of sequences. How do statistical regularities influence the processing of each triplet element and of inserted distractors that disrupt the regularity? Given the increased attention to triplets induced by VSL and the inhibition of unattended triplets, we predicted that VSL would promote memory for each triplet constituent and degrade memory for inserted stimuli. Across the first two experiments, we found that objects from structured sequences were more likely to be remembered than objects from random sequences, and that letters (Experiment 1) or objects (Experiment 2) inserted into structured sequences were less likely to be remembered than those inserted into random sequences. In the subsequent two experiments, we examined an alternative account for our results, whereby the difference in memory for inserted items between structured and random conditions is due to individuation of items within random sequences. Our findings replicated even when control letters (Experiment 3A) or objects (Experiment 3B) were presented before or after, rather than inserted into, random sequences. Our findings suggest that statistical learning enhances memory for each item in a regular set and impairs memory for items that disrupt the regularity. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Viewing health expenditures, payment and coping mechanisms with an equity lens in Nigeria

    PubMed Central

    2013-01-01

    Background This paper examines socio-economic and geographic differences in payment and payment coping mechanisms for health services in southeast Nigeria. It shows the extent to which the poor and rural dwellers disproportionately bear the burden of health care costs and offers policy recommendations for improvements. Methods Questionnaires were used to collect data from 3071 randomly selected households in six communities in southeast Nigeria using a four-week recall. The sample was divided into quintiles (Q1-Q5) using a socio-economic status (SES) index as well as into geographic groups (rural, peri-urban and urban). Tabulations and logistic regression were used to determine the relationships between payment and payment coping mechanisms and key independent variables. Q1/Q5 and rural/urban ratios were the measures of equity. Results Most of the respondents used out-of-pocket spending (OOPS) and their own money to pay for healthcare. There were statistically significant geographic differences in the use of own money to pay for health services, indicating more use among rural dwellers. Logistic regression showed statistically significant geographic differences in the use of both OOPS and own money when controlling for the effects of potential confounders. Conclusions This study shows statistically significant geographic differences in the use of OOPS and own money to pay for health services. Though the SES differences were not statistically significant, they showed high equity ratios indicating more use among poor and rural dwellers. The high expenditure incurred on drugs alone highlights the need to expedite pro-poor interventions like exemptions and waivers aimed at improving access to health care for the vulnerable poor and rural dwellers. PMID:23497246

  12. On Probability Domains IV

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  13. RANDOMNESS of Numbers DEFINITION(QUERY:WHAT? V HOW?) ONLY Via MAXWELL-BOLTZMANN CLASSICAL-Statistics(MBCS) Hot-Plasma VS. Digits-Clumping Log-Law NON-Randomness Inversion ONLY BOSE-EINSTEIN QUANTUM-Statistics(BEQS) .

    NASA Astrophysics Data System (ADS)

    Siegel, Z.; Siegel, Edward Carl-Ludwig

    2011-03-01

    RANDOMNESS of Numbers cognitive-semantics DEFINITION VIA Cognition QUERY: WHAT???, NOT HOW?) VS. computer-``science" mindLESS number-crunching (Harrel-Sipser-...) algorithmics Goldreich "PSEUDO-randomness"[Not.AMS(02)] mea-culpa is ONLY via MAXWELL-BOLTZMANN CLASSICAL-STATISTICS(NOT FDQS!!!) "hot-plasma" REPULSION VERSUS Newcomb(1881)-Weyl(1914;1916)-Benford(1938) "NeWBe" logarithmic-law digit-CLUMPING/ CLUSTERING NON-Randomness simple Siegel[AMS Joint.Mtg.(02)-Abs. # 973-60-124] algebraic-inversion to THE QUANTUM and ONLY BEQS preferentially SEQUENTIALLY lower-DIGITS CLUMPING/CLUSTERING with d = 0 BEC, is ONLY VIA Siegel-Baez FUZZYICS=CATEGORYICS (SON OF TRIZ)/"Category-Semantics"(C-S), latter intersection/union of Lawvere(1964)-Siegel(1964)] category-theory (matrix: MORPHISMS V FUNCTORS) "+" cognitive-semantics'' (matrix: ANTONYMS V SYNONYMS) yields Siegel-Baez FUZZYICS=CATEGORYICS/C-S tabular list-format matrix truth-table analytics: MBCS RANDOMNESS TRUTH/EMET!!!

  14. Peculiarities of the statistics of spectrally selected fluorescence radiation in laser-pumped dye-doped random media

    NASA Astrophysics Data System (ADS)

    Yuvchenko, S. A.; Ushakova, E. V.; Pavlova, M. V.; Alonova, M. V.; Zimnyakov, D. A.

    2018-04-01

    We consider the practical realization of a new optical probing method for random media, defined as reference-free path-length interferometry with intensity-moments analysis. A peculiarity in the statistics of the spectrally selected fluorescence radiation in a laser-pumped dye-doped random medium is discussed. Previously established correlations between the second- and third-order moments of the intensity fluctuations in the random interference patterns, the coherence function of the probe radiation, and the path-difference probability density for the interfering partial waves in the medium are confirmed. The correlations were verified using statistical analysis of the spectrally selected fluorescence radiation emitted by a laser-pumped dye-doped random medium. An aqueous solution of Rhodamine 6G was applied as the doping fluorescent agent for the ensembles of densely packed silica grains, which were pumped by the 532 nm radiation of a solid-state laser. The spectrum of the mean path length for the random medium was reconstructed.

  15. Programmable quantum random number generator without postprocessing.

    PubMed

    Nguyen, Lac; Rehain, Patrick; Sua, Yong Meng; Huang, Yu-Ping

    2018-02-15

    We demonstrate a viable source of unbiased quantum random numbers whose statistical properties can be arbitrarily programmed without the need for any postprocessing such as randomness distillation or distribution transformation. It is based on measuring the arrival time of single photons in shaped temporal modes that are tailored with an electro-optical modulator. We show that quantum random numbers can be created directly in customized probability distributions and pass all randomness tests of the NIST and Dieharder test suites without any randomness extraction. The min-entropies of such generated random numbers are measured close to the theoretical limits, indicating their near-ideal statistics and ultrahigh purity. Easy to implement and arbitrarily programmable, this technique can find versatile uses in a multitude of data analysis areas.

  16. Statistical properties of several models of fractional random point processes

    NASA Astrophysics Data System (ADS)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
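
    The "reduced variance" criterion mentioned here is the Fano factor of the counting statistics: the variance of the number of events in a window divided by its mean, equal to 1 for Poisson counting and below 1 for nonclassical, sub-Poissonian processes. The sketch below illustrates the criterion on two toy processes; the dead-time mechanism used to produce sub-Poissonian counts is an illustrative assumption, not one of the fractional models analyzed in the paper.

```python
import numpy as np

def fano_factor(counts):
    """Reduced variance of counting statistics: Var(N) / E(N)."""
    counts = np.asarray(counts, dtype=float)
    return counts.var(ddof=1) / counts.mean()

rng = np.random.default_rng(0)
rate, T, n_windows = 50.0, 1.0, 5000

# Reference case: Poisson counting in windows of duration T.
poisson_counts = rng.poisson(rate * T, size=n_windows)

# Toy sub-Poissonian case: a dead time after every event regularizes the stream.
def dead_time_counts(rate, T, dead, n, rng):
    out = []
    for _ in range(n):
        t, k = 0.0, 0
        while True:
            t += rng.exponential(1.0 / rate) + dead  # enforced pause after each event
            if t > T:
                break
            k += 1
        out.append(k)
    return np.array(out)

sub_counts = dead_time_counts(rate, T, dead=0.01, n=n_windows, rng=rng)

print("Fano factor, Poisson    :", round(fano_factor(poisson_counts), 3))  # about 1
print("Fano factor, dead time  :", round(fano_factor(sub_counts), 3))      # below 1
```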

  17. The Evolution of Random Number Generation in MUVES

    DTIC Science & Technology

    2017-01-01

    This report documents random number generation in MUVES, including the mathematical basis and statistical justification for algorithms used in the code. The working code provided produces results identical to the current … questionable numerical and statistical properties. The development of the modern system is traced through software change requests, resulting in a random number …

  18. Measuring effectiveness of drugs in observational databanks: promises and perils

    PubMed Central

    Krishnan, Eswar; Fries, James F

    2004-01-01

    Observational databanks have inherent strengths and shortcomings. As in randomized controlled trials, poor design of these databanks can either exaggerate or reduce estimates of drug effectiveness and can limit generalizability. This commentary highlights selected aspects of study design, data collection and statistical analysis that can help overcome many of these inadequacies. An international metaRegister and a formal mechanism for standardizing and sharing drug data could help improve the utility of databanks. Medical journals have a vital role in enforcing a quality checklist that improves reporting. PMID:15059263

  19. Controlling species richness in spin-glass model ecosystems

    NASA Astrophysics Data System (ADS)

    Poderoso, Fábio C.; Fontanari, José F.

    2006-11-01

    Within the framework of the random replicator model of ecosystems, we use equilibrium statistical mechanics tools to study the effect of manipulating the ecosystem so as to guarantee that a fixed fraction of the surviving species at equilibrium display a predefined set of characters (e.g., characters of economic value). Provided that the intraspecies competition is not too weak, we find that the consequence of such intervention on the ecosystem composition is a significant increase on the number of species that become extinct, and so the impoverishment of the ecosystem.

  20. Keyword extraction by nonextensivity measure.

    PubMed

    Mehri, Ali; Darooneh, Amir H

    2011-05-01

    The presence of a long-range correlation in the spatial distribution of a relevant word type, in spite of random occurrences of an irrelevant word type, is an important feature of human-written texts. We classify the correlation between the occurrences of words by nonextensive statistical mechanics for the word-ranking process. In particular, we look at the nonextensivity parameter as an alternative metric to measure the spatial correlation in the text, from which the words may be ranked in terms of this measure. Finally, we compare different methods for keyword extraction. © 2011 American Physical Society

  1. Efficacy of respiratory muscle training in weaning of mechanical ventilation in patients with mechanical ventilation for 48 hours or more: A Randomized Controlled Clinical Trial.

    PubMed

    Sandoval Moreno, L M; Casas Quiroga, I C; Wilches Luna, E C; García, A F

    2018-02-02

    To evaluate the efficacy of respiratory muscle training in weaning from mechanical ventilation and in respiratory muscle strength in patients on mechanical ventilation for 48 hours or more. Randomized controlled trial of parallel groups, double-blind. Setting: intensive care unit of a level IV clinic in the city of Cali. 126 patients on mechanical ventilation for 48 hours or more. The experimental group received a daily respiratory muscle training program with a threshold device, adjusted to 50% of maximal inspiratory pressure, in addition to standard care; the conventional group received standard respiratory physiotherapy care. Main variable of interest: weaning from mechanical ventilation. Other variables evaluated: respiratory muscle strength, requirement of non-invasive mechanical ventilation and frequency of reintubation. Intention-to-treat analysis was performed for all evaluated variables, with analysis stratified by sepsis status. There were no statistically significant differences between the groups in the median weaning time from mechanical ventilation or in the probability of extubation (HR: 0.82; 95% CI: 0.55-1.20; P=.29). Maximal inspiratory pressure increased in the experimental group by 9.43 (17.48) cmH2O on average and in the conventional group by 5.92 (11.90) cmH2O (P=.48). The difference between the mean changes in maximal inspiratory pressure was 0.46 (P=.83; 95% CI: -3.85 to -4.78). Respiratory muscle training did not demonstrate efficacy in reducing the weaning period from mechanical ventilation or in increasing respiratory muscle strength in the study population. Registered study at ClinicalTrials.gov (NCT02469064). Copyright © 2017 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.

  2. Brownian motion properties of optoelectronic random bit generators based on laser chaos.

    PubMed

    Li, Pu; Yi, Xiaogang; Liu, Xianglian; Wang, Yuncai; Wang, Yongge

    2016-07-11

    The nondeterministic properties of an optoelectronic random bit generator (RBG) based on laser chaos are experimentally analyzed from two aspects: the central limit theorem and the law of the iterated logarithm. The random bits are extracted from an optical feedback chaotic laser diode using a multi-bit extraction technique in the electrical domain. Our experimental results demonstrate that the generated random bits show no statistical distance from Brownian motion and, in addition, pass the state-of-the-art industry-benchmark statistical test suite (NIST SP800-22). Together, these results give mathematically grounded evidence that the ultrafast random bit generator based on laser chaos can be used as a nondeterministic random bit source.
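
    Both checks mentioned in this record can be reproduced on any bit stream: map the bits to ±1 steps, accumulate them into a random walk, and then compare the endpoint against the central limit theorem and the whole path against the sqrt(2 n ln ln n) envelope of the law of the iterated logarithm. The sketch below runs the comparison on pseudo-random bits purely for illustration; the paper's bits come from laser chaos.

```python
import numpy as np

rng = np.random.default_rng(42)
bits = rng.integers(0, 2, size=1_000_000)   # stand-in for measured random bits
steps = 2 * bits - 1                        # map {0, 1} -> {-1, +1}
walk = np.cumsum(steps)                     # Brownian-motion-like partial sums
n = np.arange(1, len(walk) + 1)

# Central limit theorem: S_n / sqrt(n) should be of order 1 (approximately N(0, 1)).
z = walk[-1] / np.sqrt(len(walk))
print(f"normalized endpoint S_n / sqrt(n) = {z:.3f}")

# Law of the iterated logarithm: |S_n| should stay below sqrt(2 n ln ln n) eventually.
mask = n >= 10                              # ln ln n requires n > e
envelope = np.sqrt(2.0 * n[mask] * np.log(np.log(n[mask])))
violations = np.mean(np.abs(walk[mask]) > envelope)
print(f"fraction of steps outside the LIL envelope: {violations:.4f}")
```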

  3. The Acute Effects of Upper Extremity Stretching on Throwing Velocity in Baseball Throwers

    PubMed Central

    Melton, Jason; Delobel, Ashley; Puentedura, Emilio J.

    2013-01-01

    Purpose. To examine the effects of static and proprioceptive neuromuscular facilitation (PNF) stretching of the shoulder internal rotators on throwing velocity. Subjects. 27 male throwers (mean age = 25.1 years old, SD = 2.4) with adequate knowledge of demonstrable throwing mechanics. Study Design. Randomized crossover trial with repeated measures. Methods. Subjects warmed up, threw 10 pitches at their maximum velocity, were randomly assigned to 1 of 3 stretching protocols (static, PNF, or no stretch), and then repeated their 10 pitches. Velocities were recorded after each pitch and average and peak velocities were recorded after each session. Results. Data were analyzed using a 3 × 2 repeated measures ANOVA. No significant interaction between stretching and throwing velocity was observed. Main effects for time were not statistically significant. Main effects for the stretching groups were statistically significant. Discussion. Results suggest that stretching of the shoulder internal rotators did not significantly affect throwing velocity immediately after stretching. This may be due to the complexity of the throwing task. Conclusions. Stretching may be included in a thrower's warm-up without any effects on throwing velocity. Further research should be performed using a population with more throwing experience and skill. PMID:26464880

  4. Helmholtz and Gibbs ensembles, thermodynamic limit and bistability in polymer lattice models

    NASA Astrophysics Data System (ADS)

    Giordano, Stefano

    2017-12-01

    Representing polymers by random walks on a lattice is a fruitful approach largely exploited to study configurational statistics of polymer chains and to develop efficient Monte Carlo algorithms. Nevertheless, the stretching and the folding/unfolding of polymer chains within the Gibbs (isotensional) and the Helmholtz (isometric) ensembles of statistical mechanics have not yet been thoroughly analysed by means of the lattice methodology. This topic, motivated by the recent introduction of several single-molecule force spectroscopy techniques, is investigated in the present paper. In particular, we analyse the force-extension curves under the Gibbs and Helmholtz conditions and we give a proof of the equivalence of the ensembles in the thermodynamic limit for polymers represented by a standard random walk on a lattice. Then, we generalize these concepts for lattice polymers that can undergo conformational transitions or, equivalently, for chains composed of bistable or two-state elements (that can be either folded or unfolded). In this case, the isotensional condition leads to a plateau-like force-extension response, whereas the isometric condition causes a sawtooth-like force-extension curve, as observed in numerous experiments. The equivalence of the ensembles is finally proved also for lattice polymer systems exhibiting conformational transitions.

  5. Model for interevent times with long tails and multifractality in human communications: An application to financial trading

    NASA Astrophysics Data System (ADS)

    Perelló, Josep; Masoliver, Jaume; Kasprzak, Andrzej; Kutner, Ryszard

    2008-09-01

    Social, technological, and economic time series are divided by events which are usually assumed to be random, albeit with some hierarchical structure. It is well known that the interevent statistics observed in these contexts differs from the Poissonian profile by being long-tailed distributed with resting and active periods interwoven. Understanding mechanisms generating consistent statistics has therefore become a central issue. The approach we present is taken from the continuous-time random-walk formalism and represents an analytical alternative to models of nontrivial priority that have been recently proposed. Our analysis also goes one step further by looking at the multifractal structure of the interevent times of human decisions. We here analyze the intertransaction time intervals of several financial markets. We observe that empirical data describe a subtle multifractal behavior. Our model explains this structure by taking the pausing-time density in the form of a superstatistics where the integral kernel quantifies the heterogeneous nature of the executed tasks. A stretched exponential kernel provides a multifractal profile valid for a certain limited range. A suggested heuristic analytical profile is capable of covering a broader region.

  6. The Virtual Quake Earthquake Simulator: Earthquake Probability Statistics for the El Mayor-Cucapah Region and Evidence of Predictability in Simulated Earthquake Sequences

    NASA Astrophysics Data System (ADS)

    Schultz, K.; Yoder, M. R.; Heien, E. M.; Rundle, J. B.; Turcotte, D. L.; Parker, J. W.; Donnellan, A.

    2015-12-01

    We introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. In order to quantify this, we develop an alert based forecasting metric similar to those presented in Keilis-Borok (2002); Molchan (1997), and show that it exhibits significant information gain compared to random forecasts. We also discuss the long standing question of activation vs quiescent type earthquake triggering. We show that VQ exhibits both behaviors separately for independent fault sections; some fault sections exhibit activation type triggering, while others are better characterized by quiescent type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California USA and northern Baja California Norte, Mexico.

  7. Dynamical and statistical behavior of discrete combustion waves: a theoretical and numerical study.

    PubMed

    Bharath, Naine Tarun; Rashkovskiy, Sergey A; Tewari, Surya P; Gundawar, Manoj Kumar

    2013-04-01

    We present a detailed theoretical and numerical study of combustion waves in a discrete one-dimensional disordered system. The distances between neighboring reaction cells were modeled with a gamma distribution. The results show that the random structure of the microheterogeneous system plays a crucial role in the dynamical and statistical behavior of the system. This is a consequence of the nonlinear interaction of the random structure of the system with the thermal wave. An analysis of the experimental data on the combustion of a gasless system (Ti + xSi) and a wide range of thermite systems was performed in view of the developed model. We have shown that the burning rate of the powder system sensitively depends on its internal structure. The present model allows for reproducing theoretically the experimental data for a wide range of pyrotechnic mixtures. We show that Arrhenius' macrokinetics in the combustion of disperse systems can take place even in the absence of Arrhenius' microkinetics; it can have a purely thermal nature and be related to their heterogeneity and to the existence of a threshold temperature. It is also observed that the combustion of disperse systems always occurs in the microheterogeneous mode according to the relay-race mechanism.

  8. Dynamical and statistical behavior of discrete combustion waves: A theoretical and numerical study

    NASA Astrophysics Data System (ADS)

    Bharath, Naine Tarun; Rashkovskiy, Sergey A.; Tewari, Surya P.; Gundawar, Manoj Kumar

    2013-04-01

    We present a detailed theoretical and numerical study of combustion waves in a discrete one-dimensional disordered system. The distances between neighboring reaction cells were modeled with a gamma distribution. The results show that the random structure of the microheterogeneous system plays a crucial role in the dynamical and statistical behavior of the system. This is a consequence of the nonlinear interaction of the random structure of the system with the thermal wave. An analysis of the experimental data on the combustion of a gasless system (Ti + xSi) and a wide range of thermite systems was performed in view of the developed model. We have shown that the burning rate of the powder system sensitively depends on its internal structure. The present model allows for reproducing theoretically the experimental data for a wide range of pyrotechnic mixtures. We show that Arrhenius’ macrokinetics in the combustion of disperse systems can take place even in the absence of Arrhenius’ microkinetics; it can have a purely thermal nature and be related to their heterogeneity and to the existence of a threshold temperature. It is also observed that the combustion of disperse systems always occurs in the microheterogeneous mode according to the relay-race mechanism.

  9. Bayesian statistics and Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Koch, K. R.

    2018-03-01

    The Bayesian approach allows an intuitive way to derive the methods of statistics. Probability is defined as a measure of the plausibility of statements or propositions. Three rules are sufficient to obtain the laws of probability. If the statements refer to the numerical values of variables, the so-called random variables, univariate and multivariate distributions follow. They lead to the point estimation by which unknown quantities, i.e. unknown parameters, are computed from measurements. The unknown parameters are random variables, whereas they are fixed quantities in traditional statistics, which is not founded on Bayes' theorem. Bayesian statistics therefore recommends itself for Monte Carlo methods, which generate random variates from given distributions. Monte Carlo methods, of course, can also be applied in traditional statistics. The unknown parameters are introduced as functions of the measurements, and the Monte Carlo methods give the covariance matrix and the expectation of these functions. A confidence region is derived where the unknown parameters are situated with a given probability. Following a method of traditional statistics, hypotheses are tested by determining whether a value for an unknown parameter lies inside or outside the confidence region. The error propagation of a random vector by the Monte Carlo methods is presented as an application. If the random vector results from a nonlinearly transformed vector, its covariance matrix and its expectation follow from the Monte Carlo estimate. This saves the computation of a considerable number of derivatives, and errors of the linearization are avoided. The Monte Carlo method is therefore efficient. If the functions of the measurements are given by a sum of two or more random vectors with different multivariate distributions, the resulting distribution is generally not known. The Monte Carlo methods are then needed to obtain the covariance matrix and the expectation of the sum.
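
    The error-propagation application described above is straightforward to sketch: draw samples of the input random vector, push every sample through the nonlinear transformation, and take the sample mean and sample covariance of the outputs instead of computing derivatives for a linearization. The function below is only an illustration; the polar-coordinate transform and the input covariance are assumptions chosen for the example, not taken from the record.

```python
import numpy as np

def monte_carlo_propagation(func, mean, cov, n_samples=100_000, seed=0):
    """Propagate a Gaussian random vector through a nonlinear map by sampling.

    Returns Monte Carlo estimates of the expectation and covariance matrix of
    func(x); no derivatives and no linearization are required.
    """
    rng = np.random.default_rng(seed)
    x = rng.multivariate_normal(mean, cov, size=n_samples)
    y = np.apply_along_axis(func, 1, x)
    return y.mean(axis=0), np.cov(y, rowvar=False)

# Example nonlinear transform: Cartesian coordinates -> (range, bearing).
def to_polar(v):
    return np.array([np.hypot(v[0], v[1]), np.arctan2(v[1], v[0])])

mean = np.array([10.0, 5.0])
cov = np.array([[0.25, 0.05],
                [0.05, 0.10]])

mu_y, cov_y = monte_carlo_propagation(to_polar, mean, cov)
print("E[f(x)]  :", np.round(mu_y, 4))
print("Cov[f(x)]:")
print(np.round(cov_y, 5))
```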

  10. The effect of expiratory rib cage compression before endotracheal suctioning on the vital signs in patients under mechanical ventilation.

    PubMed

    Bousarri, Mitra Payami; Shirvani, Yadolah; Agha-Hassan-Kashani, Saeed; Nasab, Nouredin Mousavi

    2014-05-01

    In patients undergoing mechanical ventilation, mucus production and secretion are high as a result of the endotracheal tube. Because endotracheal suction in these patients is essential, chest physiotherapy techniques such as expiratory rib cage compression before endotracheal suctioning can be used as a means to facilitate mobilizing and removing airway secretions and improving alveolar ventilation. As one of the complications of mechanical ventilation and endotracheal suctioning is a decrease in cardiac output, this study was carried out to determine the effect of expiratory rib cage compression before endotracheal suctioning on the vital signs of patients under mechanical ventilation. This study was a randomized clinical trial with a crossover design. The study subjects included 50 mechanically ventilated patients, hospitalized in intensive care wards of Valiasr and Mousavi hospitals in Zanjan, Iran. Subjects were selected by consecutive sampling and randomly allocated to groups 1 and 2. The patients received endotracheal suctioning with or without rib cage compression, with a minimum 3 h interval between the two interventions. Expiratory rib cage compression was performed for 5 min before endotracheal suctioning. Vital signs were measured 5 min before and 15 and 25 min after endotracheal suctioning. Data were recorded on a data recording sheet and analyzed using paired t-tests. There were statistically significant differences between the mean vital signs measured 5 min before and those measured 15 and 25 min after endotracheal suctioning with rib cage compression (P < 0.01). There was no significant difference between the mean diastolic pressure measured 25 min after suctioning and the baseline value at this stage. In the reverse condition, however, there was a significant difference between the mean pulse and respiratory rates 15 min after endotracheal suctioning and the baseline values (P < 0.002); this effect continued up to 25 min after endotracheal suctioning for respiratory rate only (P = 0.016). Moreover, there were statistically significant differences between the two methods in the mean vital signs measured 5 min before and 15 min after endotracheal suctioning (P ≤ .0001). Findings showed that expiratory rib cage compression before endotracheal suctioning improves the vital signs toward the normal range in patients under mechanical ventilation. More studies are suggested on performing expiratory rib cage compression before endotracheal suctioning in patients undergoing mechanical ventilation.

  11. Non-Hookean statistical mechanics of clamped graphene ribbons

    NASA Astrophysics Data System (ADS)

    Bowick, Mark J.; Košmrlj, Andrej; Nelson, David R.; Sknepnek, Rastko

    2017-03-01

    Thermally fluctuating sheets and ribbons provide an intriguing forum in which to investigate strong violations of Hooke's Law: Large distance elastic parameters are in fact not constant but instead depend on the macroscopic dimensions. Inspired by recent experiments on free-standing graphene cantilevers, we combine the statistical mechanics of thin elastic plates and large-scale numerical simulations to investigate the thermal renormalization of the bending rigidity of graphene ribbons clamped at one end. For ribbons of dimensions W ×L (with L ≥W ), the macroscopic bending rigidity κR determined from cantilever deformations is independent of the width when W <ℓth , where ℓth is a thermal length scale, as expected. When W >ℓth , however, this thermally renormalized bending rigidity begins to systematically increase, in agreement with the scaling theory, although in our simulations we were not quite able to reach the system sizes necessary to determine the fully developed power law dependence on W . When the ribbon length L >ℓp , where ℓp is the W -dependent thermally renormalized ribbon persistence length, we observe a scaling collapse and the beginnings of large scale random walk behavior.

  12. Statistical summaries of fatigue data for design purposes

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.

    1983-01-01

    Two methods are discussed for constructing a design curve on the safe side of fatigue data. Both the tolerance interval and equivalent prediction interval (EPI) concepts provide such a curve while accounting for both the distribution of the estimators in small samples and the data scatter. The EPI is also useful as a mechanism for providing necessary statistics on S-N data for a full reliability analysis which includes uncertainty in all fatigue design factors. Examples of statistical analyses of the general strain life relationship are presented. The tolerance limit and EPI techniques for defining a design curve are demonstrated. Examples using WASPALOY B and RQC-100 data demonstrate that a reliability model could be constructed by considering the fatigue strength and fatigue ductility coefficients as two independent random variables. A technique given for establishing the fatigue strength for high cycle lives relies on an extrapolation technique and also accounts for "runners." A reliability model or design value can be specified.
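
    For normally distributed scatter, a one-sided lower tolerance bound of the kind used for such design curves can be computed exactly from the noncentral t distribution: the bound is x̄ − k·s, where k = t'_conf(n−1, z_p·sqrt(n)) / sqrt(n) guarantees, with confidence conf, that at least a proportion p of the population lies above the bound. The sketch below applies this to synthetic log-life data; the sample values and the 95/95 choice are assumptions for illustration, not data from the report.

```python
import numpy as np
from scipy import stats

def lower_tolerance_bound(x, p=0.95, conf=0.95):
    """One-sided lower tolerance bound for normally distributed data.

    With confidence `conf`, at least a proportion `p` of the population
    lies above the returned bound (exact noncentral-t tolerance factor).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    z_p = stats.norm.ppf(p)
    k = stats.nct.ppf(conf, df=n - 1, nc=z_p * np.sqrt(n)) / np.sqrt(n)
    return x.mean() - k * x.std(ddof=1)

# Synthetic log10(cycles-to-failure) data at a single strain level (illustrative only).
rng = np.random.default_rng(1)
log_life = rng.normal(loc=5.2, scale=0.15, size=12)

bound = lower_tolerance_bound(log_life, p=0.95, conf=0.95)
print(f"95/95 design value: 10^{bound:.3f} cycles")
```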

  13. Scanning electron microscopic evaluation of the influence of manual and mechanical glide path on the surface of nickel-titanium rotary instruments in moderately curved root canals: An in-vivo study

    PubMed Central

    Patel, Dishant; Bashetty, Kusum; Srirekha, A.; Archana, S.; Savitha, B.; Vijay, R.

    2016-01-01

    Aim: The aim of this study was to evaluate the influence of manual versus mechanical glide path (GP) preparation on the surface changes of two different nickel-titanium rotary instruments used during root canal therapy in moderately curved root canals. Materials and Methods: Sixty systemically healthy subjects were selected for the study and divided randomly into four groups: Group 1: manual GP followed by RaCe rotary instruments, Group 2: manual GP followed by HyFlex rotary instruments, Group 3: mechanical GP followed by RaCe rotary instruments, Group 4: mechanical GP followed by HyFlex rotary instruments. After access opening, the GP was prepared and rotary instruments were used according to the manufacturer's instructions. All instruments were evaluated for defects under a scanning electron microscope before use and after a single use. Scores for the files were assigned at the apical and middle thirds. Statistical Analysis Used: Chi-squared test was used. Results: The results showed no statistically significant difference between any of the groups. Irrespective of the GP and rotary files used, more defects were present in the apical third than in the middle third of the rotary instrument. Conclusion: Within the limitations of this study, it can be concluded that manual or mechanical GP preparation had no effect on the surface defects of the subsequently used rotary file system. PMID:27994317

  14. A System-Level Pathway-Phenotype Association Analysis Using Synthetic Feature Random Forest

    PubMed Central

    Pan, Qinxin; Hu, Ting; Malley, James D.; Andrew, Angeline S.; Karagas, Margaret R.; Moore, Jason H.

    2015-01-01

    As the cost of genome-wide genotyping decreases, the number of genome-wide association studies (GWAS) has increased considerably. However, the transition from GWAS findings to the underlying biology of various phenotypes remains challenging. As a result, due to its system-level interpretability, pathway analysis has become a popular tool for gaining insights on the underlying biology from high-throughput genetic association data. In pathway analyses, gene sets representing particular biological processes are tested for significant associations with a given phenotype. Most existing pathway analysis approaches rely on single-marker statistics and assume that pathways are independent of each other. As biological systems are driven by complex biomolecular interactions, embracing the complex relationships between single-nucleotide polymorphisms (SNPs) and pathways needs to be addressed. To incorporate the complexity of gene-gene interactions and pathway-pathway relationships, we propose a system-level pathway analysis approach, synthetic feature random forest (SF-RF), which is designed to detect pathway-phenotype associations without making assumptions about the relationships among SNPs or pathways. In our approach, the genotypes of SNPs in a particular pathway are aggregated into a synthetic feature representing that pathway via Random Forest (RF). Multiple synthetic features are analyzed using RF simultaneously and the significance of a synthetic feature indicates the significance of the corresponding pathway. We further complement SF-RF with pathway-based Statistical Epistasis Network (SEN) analysis that evaluates interactions among pathways. By investigating the pathway SEN, we hope to gain additional insights into the genetic mechanisms contributing to the pathway-phenotype association. We apply SF-RF to a population-based genetic study of bladder cancer and further investigate the mechanisms that help explain the pathway-phenotype associations using SEN. The bladder cancer associated pathways we found are both consistent with existing biological knowledge and reveal novel and plausible hypotheses for future biological validations. PMID:24535726
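
    A minimal sketch of the synthetic-feature idea is given below, under simplifying assumptions that are not taken from the paper: genotypes are simulated, there are only two toy "pathways", out-of-bag class probabilities from a per-pathway random forest serve as the synthetic features, and the feature importance of the second-stage random forest stands in for the significance assessment (the actual SF-RF significance testing and the SEN interaction analysis are more elaborate).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_subjects = 400

# Simulated SNP genotypes (0/1/2 minor-allele counts) grouped into two pathways.
pathways = {
    "pathway_A": rng.integers(0, 3, size=(n_subjects, 15)),
    "pathway_B": rng.integers(0, 3, size=(n_subjects, 15)),
}
# Toy phenotype driven by an interaction between two SNPs inside pathway_A.
y = (pathways["pathway_A"][:, 0] * pathways["pathway_A"][:, 1] > 1).astype(int)
y = np.where(rng.random(n_subjects) < 0.1, 1 - y, y)  # add label noise

# Step 1: compress each pathway's SNPs into one synthetic feature via a random
# forest, using out-of-bag predicted probabilities to limit overfitting.
synthetic = {}
for name, snps in pathways.items():
    rf = RandomForestClassifier(n_estimators=300, oob_score=True, random_state=0)
    rf.fit(snps, y)
    synthetic[name] = rf.oob_decision_function_[:, 1]

X_synth = np.column_stack([synthetic[name] for name in pathways])

# Step 2: analyze all synthetic features together with a second random forest;
# the importance of a synthetic feature proxies the pathway-phenotype association.
rf_final = RandomForestClassifier(n_estimators=500, random_state=0)
rf_final.fit(X_synth, y)
for name, importance in zip(pathways, rf_final.feature_importances_):
    print(f"{name}: importance = {importance:.3f}")
```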

  15. Entanglement Entropy of Eigenstates of Quantum Chaotic Hamiltonians.

    PubMed

    Vidmar, Lev; Rigol, Marcos

    2017-12-01

    In quantum statistical mechanics, it is of fundamental interest to understand how close the bipartite entanglement entropy of eigenstates of quantum chaotic Hamiltonians is to maximal. For random pure states in the Hilbert space, the average entanglement entropy is known to be nearly maximal, with a deviation that is, at most, a constant. Here we prove that, in a system that is away from half filling and divided in two equal halves, an upper bound for the average entanglement entropy of random pure states with a fixed particle number and normally distributed real coefficients exhibits a deviation from the maximal value that grows with the square root of the volume of the system. Exact numerical results for highly excited eigenstates of a particle number conserving quantum chaotic model indicate that the bound is saturated with increasing system size.

  16. Transition in the decay rates of stationary distributions of Lévy motion in an energy landscape.

    PubMed

    Kaleta, Kamil; Lőrinczi, József

    2016-02-01

    The time evolution of random variables with Lévy statistics has the ability to develop jumps, displaying very different behaviors from continuously fluctuating cases. Such patterns appear in an ever broadening range of examples including random lasers, non-Gaussian kinetics, or foraging strategies. The penalizing or reinforcing effect of the environment, however, has been little explored so far. We report a new phenomenon which manifests as a qualitative transition in the spatial decay behavior of the stationary measure of a jump process under an external potential, occurring on a combined change in the characteristics of the process and the lowest eigenvalue resulting from the effect of the potential. This also provides insight into the fundamental question of what is the mechanism of the spatial decay of a ground state.

  17. Quantum Adiabatic Optimization and Combinatorial Landscapes

    NASA Technical Reports Server (NTRS)

    Smelyanskiy, V. N.; Knysh, S.; Morris, R. D.

    2003-01-01

    In this paper we analyze the performance of the Quantum Adiabatic Evolution (QAE) algorithm on a variant of the Satisfiability problem for an ensemble of random graphs parametrized by the ratio of clauses to variables, gamma = M / N. We introduce a set of macroscopic parameters (landscapes) and put forward an ansatz of universality for random bit flips. We then formulate the problem of finding the smallest eigenvalue and the excitation gap as a statistical mechanics problem. We use the so-called annealing approximation with a refinement that a finite set of macroscopic variables (versus only energy) is used, and are able to show the existence of a dynamic threshold gamma = gamma_d, beyond which QAE should take an exponentially long time to find a solution. We compare the results for extended and simplified sets of landscapes and provide numerical evidence in support of our universality ansatz.

  18. The mean field theory in EM procedures for blind Markov random field image restoration.

    PubMed

    Zhang, J

    1993-01-01

    A Markov random field (MRF) model-based EM (expectation-maximization) procedure for simultaneously estimating the degradation model and restoring the image is described. The MRF is a coupled one which provides continuity (inside regions of smooth gray tones) and discontinuity (at region boundaries) constraints for the restoration problem which is, in general, ill posed. The computational difficulty associated with the EM procedure for MRFs is resolved by using the mean field theory from statistical mechanics. An orthonormal blur decomposition is used to reduce the chances of undesirable locally optimal estimates. Experimental results on synthetic and real-world images show that this approach provides good blur estimates and restored images. The restored images are comparable to those obtained by a Wiener filter in mean-square error, but are most visually pleasing.
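
    The mean-field device used to make the EM step tractable can be illustrated on a much simpler problem than blind restoration: binary-image denoising with an Ising-type MRF prior and Gaussian noise, where the intractable coupled expectations are replaced by iteratively updated per-pixel means. The sketch below shows that generic mean-field update only; the coupling constant, the noise variance and the toy image are assumptions, and the paper's coupled line-process MRF and blur estimation are not reproduced here.

```python
import numpy as np

def mean_field_denoise(y, coupling=1.5, noise_var=0.6, n_iter=30):
    """Mean-field approximation for a binary (+/-1) MRF observed in Gaussian noise.

    Each pixel's posterior is summarized by its mean m; the coupled expectations
    of the MRF are replaced by the current means of the 4-neighbours.
    """
    m = np.tanh(y / noise_var)                 # initialize from the data term
    for _ in range(n_iter):
        nb = np.zeros_like(m)                  # sum of neighbour means (zero-padded)
        nb[1:, :] += m[:-1, :]
        nb[:-1, :] += m[1:, :]
        nb[:, 1:] += m[:, :-1]
        nb[:, :-1] += m[:, 1:]
        m = np.tanh(coupling * nb + y / noise_var)
    return np.sign(m)

# Toy example: a noisy square on a -1 background.
rng = np.random.default_rng(0)
truth = -np.ones((64, 64))
truth[16:48, 16:48] = 1.0
noisy = truth + rng.normal(0.0, np.sqrt(0.6), size=truth.shape)

restored = mean_field_denoise(noisy)
print("pixel agreement with ground truth:", float(np.mean(restored == truth)))
```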

  19. Universal self-similarity of propagating populations

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2010-07-01

    This paper explores the universal self-similarity of propagating populations. The following general propagation model is considered: particles are randomly emitted from the origin of a d-dimensional Euclidean space and propagate randomly and independently of each other in space; all particles share a statistically common—yet arbitrary—motion pattern; each particle has its own random propagation parameters—emission epoch, motion frequency, and motion amplitude. The universally self-similar statistics of the particles’ displacements and first passage times (FPTs) are analyzed: statistics which are invariant with respect to the details of the displacement and FPT measurements and with respect to the particles’ underlying motion pattern. Analysis concludes that the universally self-similar statistics are governed by Poisson processes with power-law intensities and by the Fréchet and Weibull extreme-value laws.

  20. Universal self-similarity of propagating populations.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2010-07-01

    This paper explores the universal self-similarity of propagating populations. The following general propagation model is considered: particles are randomly emitted from the origin of a d-dimensional Euclidean space and propagate randomly and independently of each other in space; all particles share a statistically common--yet arbitrary--motion pattern; each particle has its own random propagation parameters--emission epoch, motion frequency, and motion amplitude. The universally self-similar statistics of the particles' displacements and first passage times (FPTs) are analyzed: statistics which are invariant with respect to the details of the displacement and FPT measurements and with respect to the particles' underlying motion pattern. Analysis concludes that the universally self-similar statistics are governed by Poisson processes with power-law intensities and by the Fréchet and Weibull extreme-value laws.

  1. An On-Demand Optical Quantum Random Number Generator with In-Future Action and Ultra-Fast Response

    PubMed Central

    Stipčević, Mario; Ursin, Rupert

    2015-01-01

    Random numbers are essential for our modern information-based society, e.g. in cryptography. Unlike frequently used pseudo-random generators, physical random number generators do not depend on complex algorithms but rather on a physical process to provide true randomness. Quantum random number generators (QRNGs) rely on a process which, even in principle, can be described only by a probabilistic theory. Here we present a conceptually simple implementation, which offers 100% efficiency of producing a random bit upon request and simultaneously exhibits an ultra-low latency. A careful technical and statistical analysis demonstrates its robustness against imperfections of the actually implemented technology and enables one to quickly estimate the randomness of very long sequences. Generated random numbers pass standard statistical tests without any post-processing. The setup described, as well as the theory presented here, demonstrates the maturity and overall understanding of the technology. PMID:26057576

  2. Sleep-Related Safety Behaviors and Dysfunctional Beliefs Mediate the Efficacy of Online CBT for Insomnia: A Randomized Controlled Trial.

    PubMed

    Lancee, Jaap; Eisma, Maarten C; van Straten, Annemieke; Kamphuis, Jan H

    2015-01-01

    Several trials have demonstrated the efficacy of online cognitive behavioral therapy (CBT) for insomnia. However, few studies have examined putative mechanisms of change based on the cognitive model of insomnia. Identification of modifiable mechanisms by which the treatment works may guide efforts to further improve the efficacy of insomnia treatment. The current study therefore has two aims: (1) to replicate the finding that online CBT is effective for insomnia and (2) to test putative mechanism of change (i.e., safety behaviors and dysfunctional beliefs). Accordingly, we conducted a randomized controlled trial in which individuals with insomnia were randomized to either online CBT for insomnia (n = 36) or a waiting-list control group (n = 27). Baseline and posttest assessments included questionnaires assessing insomnia severity, safety behaviors, dysfunctional beliefs, anxiety and depression, and a sleep diary. Three- and six-month assessments were administered to the CBT group only. Results show moderate to large statistically significant effects of the online treatment compared to the waiting list on insomnia severity, sleep measures, sleep safety behaviors, and dysfunctional beliefs. Furthermore, dysfunctional beliefs and safety behaviors mediated the effects of treatment on insomnia severity and sleep efficiency. Together, these findings corroborate the efficacy of online CBT for insomnia, and suggest that these effects were produced by changing maladaptive beliefs, as well as safety behaviors. Treatment protocols for insomnia may specifically be enhanced by more focused attention on the comprehensive fading of sleep safety behaviors, for instance through behavioral experiments.

  3. How to deal with morning bad breath: A randomized, crossover clinical trial

    PubMed Central

    Oliveira-Neto, Jeronimo M.; Sato, Sandra; Pedrazzi, Vinícius

    2013-01-01

    Context: The absence of a protocol for the treatment of halitosis has led us to compare mouthrinses with mechanical oral hygiene procedures for treating morning breath by employing a hand-held sulfide monitor. Aims: To compare the efficacy of five modalities of treatment for controlling morning halitosis in subjects with no dental or periodontal disease. Settings and Design: This is a five-period, randomized, crossover clinical trial. Materials and Methods: Twenty volunteers were randomly assigned to the trial. Testing involved the use of a conventional tongue scraper, a tongue scraper joined to the back of a toothbrush's head, two mouthrinses (0.05% cetylpyridinium chloride and 0.12% chlorhexidine digluconate) and a soft-bristled toothbrush and fluoride toothpaste for practicing oral hygiene. Statistical Analysis Used: Data analysis was performed using SPSS version 17 for Windows and NCSS 2007 software (P < 0.05). The products and the periods were compared with each other using the Friedman's test. When significant differences (P < 0.05) were determined, the products and periods were compared in pairs by using the Wilcoxon's test and by adjusting the original significance level (0.05) for multiple comparisons by using the Bonferroni's method. Results: The toothbrush's tongue scraper was able to significantly reduce bad breath for up to 2 h. Chlorhexidine reduced bad breath only at the end of the second hour, an effect that lasted for 3 h. Conclusions: Mechanical tongue cleaning was able to immediately reduce bad breath for a short period, whereas chlorhexidine and mechanical oral hygiene reduced bad breath for longer periods, achieving the best results against morning breath. PMID:24554886

  4. Random-phase metasurfaces at optical wavelengths

    NASA Astrophysics Data System (ADS)

    Pors, Anders; Ding, Fei; Chen, Yiting; Radko, Ilya P.; Bozhevolnyi, Sergey I.

    2016-06-01

    Random-phase metasurfaces, in which the constituents scatter light with random phases, have the property that an incident plane wave will diffusely scatter, thereby leading to a complex far-field response that is most suitably described by statistical means. In this work, we present and exemplify the statistical description of the far-field response, particularly highlighting how the response for polarised and unpolarised light might be alike or different depending on the correlation of scattering phases for two orthogonal polarisations. By utilizing gap plasmon-based metasurfaces, consisting of an optically thick gold film overlaid by a subwavelength thin glass spacer and an array of gold nanobricks, we design and realize random-phase metasurfaces at a wavelength of 800 nm. Optical characterisation of the fabricated samples convincingly demonstrates the diffuse scattering of reflected light, with statistics obeying the theoretical predictions. We foresee the use of random-phase metasurfaces for camouflage applications and as high-quality reference structures in dark-field microscopy, while the control of the statistics for polarised and unpolarised light might find usage in security applications. Finally, by incorporating a certain correlation between scattering by neighbouring metasurface constituents, new types of functionalities can be realised, such as a Lambertian reflector.
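
    The statistical description of the far field can be imitated numerically: assign each constituent an independent, uniformly distributed scattering phase, take a Fourier transform of the aperture field to approximate the far field, and inspect the intensity statistics. For many independent phases the normalized intensity follows the negative-exponential law of fully developed speckle with unit contrast. The array size and the simple scalar model below are illustrative assumptions, not the gap plasmon-based design of the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Square array of identical scatterers with independent, uniform random phases.
n = 256
phases = rng.uniform(0.0, 2.0 * np.pi, size=(n, n))
aperture = np.exp(1j * phases)

# Far field in the Fraunhofer approximation ~ Fourier transform of the aperture field.
far_field = np.fft.fftshift(np.fft.fft2(aperture))
intensity = np.abs(far_field) ** 2
intensity /= intensity.mean()

# Fully developed speckle: normalized intensity ~ exponential, contrast close to 1.
contrast = intensity.std() / intensity.mean()
print(f"speckle contrast = {contrast:.3f} (expected close to 1)")
print(f"P(I > 3<I>) = {np.mean(intensity > 3.0):.3f} vs exp(-3) = {np.exp(-3.0):.3f}")
```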

  5. Sensitivity to imputation models and assumptions in receiver operating characteristic analysis with incomplete data

    PubMed Central

    Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M.

    2015-01-01

    Modern statistical methods using incomplete data have been increasingly applied in a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used in evaluating diagnostic tests or biomarkers in medical research, has also seen increasing development and application. While missing-data methods have been applied in ROC analysis, the impact of model mis-specification and/or assumptions (e.g. missing at random) underlying the missing data has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. In particular, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Depending on the coherency of the imputation model with the underlying data generation mechanism, our results show that MI generally leads to well-calibrated inferences under ignorable missingness mechanisms. PMID:26379316

  6. Mode-selective control of thermal Brownian vibration of a micro-resonator (Generation of a thermal non-equilibrium state by mechanical feedback control)

    NASA Astrophysics Data System (ADS)

    Kawamura, Y.; Kanegae, R.

    2017-09-01

    Recently, there have been various attempts to dampen the vibration amplitude of the Brownian motion of a microresonator below the thermal vibration amplitude, with the goal of reaching the quantum ground vibration level. To further develop the approach of reaching the quantum ground state, it is essential to clarify whether or not coupling exists between the different vibration modes of the resonator. In this paper, the mode-selective control of thermal Brownian vibration is shown. The first and the second vibration modes of a micro-cantilever driven by random Brownian motion are cooled selectively and independently below the thermal vibration amplitude determined by statistical thermodynamic theory, using a mechanical feedback control method. This experimental result shows that a thermal non-equilibrium condition was generated by mechanical feedback control.

  7. Virial expansion for almost diagonal random matrices

    NASA Astrophysics Data System (ADS)

    Yevtushenko, Oleg; Kravtsov, Vladimir E.

    2003-08-01

    Energy level statistics of Hermitian random matrices Ĥ with Gaussian independent random entries H_{i≥j} is studied for a generic ensemble of almost diagonal random matrices with ⟨|H_{ii}|²⟩ ~ 1 and ⟨|H_{i…

  8. Histological analysis of effects of 24% EDTA gel for nonsurgical treatment of periodontal tissues.

    PubMed

    de Vasconcellos, Luana Marotta Reis; Ricardo, Lucilene Hernandes; Balducci, Ivan; de Vasconcellos, Luis Gustavo Oliveira; Carvalho, Yasmin Rodarte

    2006-12-01

    The aim of this study was to investigate, by means of histological and histomorphometric analysis, the effects of 24% ethylenediaminetetraacetic acid (EDTA) gel on periodontal tissue when used in combination with conventional periodontal treatment. Periodontitis was induced in the 2nd upper left permanent molars of 45 male Wistar rats by means of ligature. After 5 weeks, the ligature was removed and debridement was performed. The animals were then randomly divided into 3 groups; group 1: mechanical treatment, group 2: mechanical treatment and EDTA gel application for 2 min, and group 3: mechanical treatment and placebo gel application for 2 min. After the treatment, rinsing was done with 0.9% saline solution for 1 min in all cases, followed by root notching in the deepest part of the pocket. After 4, 10, and 28 days the animals were sacrificed. The averages obtained were evaluated by means of two-way analysis of variance (ANOVA) and Tukey tests (P < 0.05). The results showed that, with respect to the type of treatment employed, there were no statistically significant differences in the vitality of the periodontal tissue. It was concluded that 24% EDTA gel did not interfere with periodontal tissue repair when used in combination with conventional periodontal treatment.

  9. A full quantum analysis of the Stern-Gerlach experiment using the evolution operator method: analyzing current issues in teaching quantum mechanics

    NASA Astrophysics Data System (ADS)

    Benítez Rodríguez, E.; Arévalo Aguilar, L. M.; Piceno Martínez, E.

    2017-03-01

    To the quantum mechanics specialists community it is a well-known fact that the famous original Stern-Gerlach experiment (SGE) produces entanglement between the external degrees of freedom (position) and the internal degree of freedom (spin) of silver atoms. Despite this fact, almost all textbooks on quantum mechanics explain this experiment using a semiclassical approach, where the external degrees of freedom are considered classical variables, the internal degree is treated as a quantum variable, and Newton's second law is used to describe the dynamics. In the literature there are some works that analyze this experiment in its full quantum mechanical form. However, astonishingly, to the best of our knowledge the original experiment, where the initial states of the spin degree of freedom are randomly oriented coming from the oven, has not been analyzed yet in the available textbooks using the Schrödinger equation (there is only one paper that treats this case: Hsu et al (2011 Phys. Rev. A 83 012109)). Therefore, in this contribution we use the time-evolution operator to give a full quantum mechanical analysis of the SGE when the initial state of the internal degree of freedom is completely random, i.e. when it is a statistical mixture. Additionally, as the SGE and the development of quantum mechanics are heavily intermingled, we analyze some features and drawbacks in the current teaching of quantum mechanics. We focus on textbooks that use the SGE as a starting point, based on the fact that most physicists do not use results from physics education research, and comment on traditional pedagogical attitudes in the physics community.

  10. The statistical mechanics of relativistic orbits around a massive black hole

    NASA Astrophysics Data System (ADS)

    Bar-Or, Ben; Alexander, Tal

    2014-12-01

    Stars around a massive black hole (MBH) move on nearly fixed Keplerian orbits, in a centrally-dominated potential. The random fluctuations of the discrete stellar background cause small potential perturbations, which accelerate the evolution of orbital angular momentum by resonant relaxation. This drives many phenomena near MBHs, such as extreme mass-ratio gravitational wave inspirals, the warping of accretion disks, and the formation of exotic stellar populations. We present here a formal statistical mechanics framework to analyze such systems, where the background potential is described as a correlated Gaussian noise. We derive the leading order, phase-averaged 3D stochastic Hamiltonian equations of motion, for evolving the orbital elements of a test star, and obtain the effective Fokker-Planck equation for a general correlated Gaussian noise, for evolving the stellar distribution function. We show that the evolution of angular momentum depends critically on the temporal smoothness of the background potential fluctuations. Smooth noise has a maximal variability frequency ν_max. We show that in the presence of such noise, the evolution of the normalized angular momentum j = √(1 − e²) of a relativistic test star, undergoing Schwarzschild (in-plane) general relativistic precession with frequency ν_GR/j², is exponentially suppressed for j < j_b, where ν_GR/j_b² ~ ν_max, due to the adiabatic invariance of the precession against the slowly varying random background torques. This results in an effective Schwarzschild precession-induced barrier in angular momentum. When j_b is large enough, this barrier can have significant dynamical implications for processes near the MBH.

  11. Statistical mechanics of a time-homogeneous system of money and antimoney

    NASA Astrophysics Data System (ADS)

    Schmitt, Matthias; Schacker, Andreas; Braun, Dieter

    2014-03-01

    Financial crises appear throughout human history. While there are many schools of thought on what the actual causes of such crises are, it has been suggested that the creation of credit money might be a source of financial instability. We discuss how the credit mechanism in a system of fractional reserve banking leads to non-local transfers of purchasing power that also affect non-involved agents. To overcome this issue, we impose the local symmetry of time homogeneity on the monetary system. A bi-currency system of non-bank assets (money) and bank assets (antimoney) is considered. A payment is either made by passing on money or by receiving antimoney. As a result, a free floating exchange rate between non-bank assets and bank assets is established. Credit creation is replaced by the simultaneous transfer of money and antimoney at a negotiated exchange rate. This is in contrast to traditional discussions of full reserve banking, which stalls creditary lending. With money and antimoney, the problem of credit crunches is mitigated while a full time symmetry of the monetary system is maintained. As a test environment for such a monetary system, we discuss an economy of random transfers. Random transfers are a strong criterion to probe the stability of monetary systems. The analysis using statistical physics provides analytical solutions and confirms that a money-antimoney system could be functional. Equally important to the probing of the stability of such a monetary system is the question of how to implement the credit default dynamics. This issue remains open.
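
    The random-transfer test environment mentioned above can be illustrated with a deliberately stripped-down simulation. The sketch below uses a single conserved currency and fixed-size random pairwise transfers (a standard toy model of this kind), not the full money-antimoney system with a negotiated exchange rate; agent numbers and amounts are made up for illustration.

```python
# Toy sketch of an economy of random transfers: a single conserved quantity
# ("money") is passed between randomly chosen agents in fixed units. This is a
# simplified illustration, not the paper's money-antimoney model.
import numpy as np

rng = np.random.default_rng(3)
n_agents, n_transfers, dm = 1000, 500_000, 1.0
money = np.full(n_agents, 10.0)               # equal initial endowment

for _ in range(n_transfers):
    i, j = rng.integers(0, n_agents, size=2)
    if i == j or money[i] < dm:               # payer must be able to pay
        continue
    money[i] -= dm
    money[j] += dm

# Such random transfers drive the wealth distribution toward a roughly
# exponential (Boltzmann-Gibbs-like) form with mean equal to the endowment.
mean_m = money.mean()
frac_below = (money < mean_m).mean()
print(f"mean money = {mean_m:.1f}, fraction below the mean = {frac_below:.2f} (~0.63 for an exponential)")
```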

  12. The effect of regional sea level atmospheric pressure on sea level variations at globally distributed tide gauge stations with long records

    NASA Astrophysics Data System (ADS)

    Iz, H. Bâki

    2018-05-01

    This study provides additional information about the impact of atmospheric pressure on sea level variations. The observed regularity in sea level atmospheric pressure depends mainly on latitude and was verified to be dominantly random closer to the equator. It was demonstrated that almost all the annual and semiannual sea level variations at 27 globally distributed tide gauge stations can be attributed to the regional/local atmospheric forcing as an inverted barometric effect. Statistically significant non-linearities were detected in the regional atmospheric pressure series, which in turn impacted other sea level variations as compounders in tandem with the lunar nodal forcing, generating lunar sub-harmonics with multidecadal periods. It was shown that the random component of regional atmospheric pressure tends to cluster at monthly intervals. The clusters are likely to be caused by the intra-annual seasonal atmospheric temperature changes, which may also act as random beats in generating sub-harmonics observed in sea level changes as another mechanism. This study also affirmed that there are no statistically significant secular trends in the progression of regional atmospheric pressures; hence there was no contribution to the sea level trends during the 20th century by the atmospheric pressure. Meanwhile, the estimated nonuniform scale factors of the inverted barometer effects suggest that the sea level atmospheric pressure will bias the sea level trends inferred from satellite altimetry measurements if their impact is accounted for as corrections without proper scaling.

  13. Turbulent mass inhomogeneities induced by a point-source

    NASA Astrophysics Data System (ADS)

    Thalabard, Simon

    2018-03-01

    We describe how turbulence distributes tracers away from a localized source of injection, and analyze how the spatial inhomogeneities of the concentration field depend on the amount of randomness in the injection mechanism. For that purpose, we contrast the mass correlations induced by purely random injections with those induced by continuous injections in the environment. Using the Kraichnan model of turbulent advection, whereby the underlying velocity field is assumed to be shortly correlated in time, we explicitly identify scaling regions for the statistics of the mass contained within a shell of radius r and located at a distance ρ away from the source. The two key parameters are found to be (i) the ratio s² between the absolute and the relative timescales of dispersion and (ii) the ratio Λ between the size of the cloud and its distance away from the source. When the injection is random, only the former is relevant, as previously shown by Celani et al (2007 J. Fluid Mech. 583 189–98) in the case of an incompressible fluid. It is argued that the space partition in terms of s² and Λ is a robust feature of the injection mechanism itself, which should remain relevant beyond the Kraichnan model. This is for instance the case in a generalized version of the model, where the absolute dispersion is prescribed to be ballistic rather than diffusive.

  14. Antipyretic effect of ibuprofen in Gabonese children with uncomplicated falciparum malaria: a randomized, double-blind, placebo-controlled trial

    PubMed Central

    Matsiégui, Pierre-Blaise; Missinou, Michel A; Necek, Magdalena; Mavoungou, Elie; Issifou, Saadou; Lell, Bertrand; Kremsner, Peter G

    2008-01-01

    Background Antipyretic drugs are widely used in children with fever, though there is controversy about the benefit of reducing fever in children with malaria. In order to assess the effect of ibuprofen on fever compared to placebo in children with uncomplicated Plasmodium falciparum malaria in Gabon, a randomized, double-blind, placebo-controlled trial was designed. Methods Fifty children between two and seven years of age with uncomplicated malaria were included in the study. For the treatment of fever, all patients received mechanical treatment (continuous fanning and a cooling blanket) when the temperature rose above 37.5°C. In addition to the mechanical treatment, patients were randomly assigned to receive ibuprofen (7 mg/kg body weight, every eight hours) or placebo. Results The fever clearance time using a fever threshold of 37.5°C was similar in children receiving ibuprofen compared to those receiving placebo. The difference was also not statistically significant using a fever threshold of 37.8°C or 38.0°C. However, the fever time and the area under the fever curve were significantly smaller in the ibuprofen group compared to the placebo group. Conclusion Ibuprofen is effective in reducing the time with fever. The effect on fever clearance is less obvious and depends on the definition of the fever threshold. Trial registration The trial registration number is: NCT00167713 PMID:18503714

  15. Focal muscle vibration as a possible intervention to prevent falls in elderly women: a pragmatic randomized controlled trial.

    PubMed

    Celletti, Claudia; Fattorini, Luigi; Camerota, Filippo; Ricciardi, Diego; La Torre, Giuseppe; Landi, Francesco; Filippi, Guido Maria

    2015-12-01

    Different and new approaches have been proposed to prevent the risk of falling in elderly people, particularly women. This study investigates the possibility that a new protocol based on focal mechanical muscle vibration may reduce the risk of falling in elderly women. A pragmatic randomized controlled triple-blind trial with a 6-month follow-up after intervention randomized 350 women (mean age 73.4 ± 3.11 years), members of local senior citizen centers in Rome, into two groups: vibrated group (VG) and control group (CG). For VG participants a mechanical vibration (lasting 10 min) was focally applied on voluntarily contracted quadriceps muscles, three times a day during three consecutive days. CG subjects received a sham (placebo) vibratory stimulation. Subjects were tested immediately before (T0) and 30 (T1) and 180 (T2) days after the intervention with the Performance-Oriented Mobility Assessment (POMA) test. All subjects were asked not to change their lifestyle during the study. While CG did not show any statistically significant change of POMA at T1 and T2, VG revealed significant differences. At T2, ≈47% of the subjects who completed the study obtained the full score on the POMA test and ≈59% reached the full POMA score. The new protocol seems to be promising in reducing the risk of falling in elderly subjects.

  16. Noise, chaos, and (ɛ, τ)-entropy per unit time

    NASA Astrophysics Data System (ADS)

    Gaspard, Pierre; Wang, Xiao-Jing

    1993-12-01

    The degree of dynamical randomness of different time processes is characterized in terms of the (ε, τ)-entropy per unit time. The (ε, τ)-entropy is the amount of information generated per unit time, at different scales τ of time and ε of the observables. This quantity generalizes the Kolmogorov-Sinai entropy per unit time from deterministic chaotic processes, to stochastic processes such as fluctuations in mesoscopic physico-chemical phenomena or strong turbulence in macroscopic spacetime dynamics. The random processes that are characterized include chaotic systems, Bernoulli and Markov chains, Poisson and birth-and-death processes, Ornstein-Uhlenbeck and Yaglom noises, fractional Brownian motions, different regimes of hydrodynamical turbulence, and the Lorentz-Boltzmann process of nonequilibrium statistical mechanics. We also extend the (ε, τ)-entropy to spacetime processes like cellular automata, Conway's game of life, lattice gas automata, coupled maps, spacetime chaos in partial differential equations, as well as the ideal, the Lorentz, and the hard sphere gases. Through these examples it is demonstrated that the (ε, τ)-entropy provides a unified quantitative measure of dynamical randomness to both chaos and noises, and a method to detect transitions between dynamical states of different degrees of randomness as a parameter of the system is varied.

  17. A statistic-thermodynamic model for the DOM degradation in the estuary

    NASA Astrophysics Data System (ADS)

    Zheng, Quanan; Chen, Qin; Zhao, Haihong; Shi, Jiuxin; Cao, Yong; Wang, Dan

    2008-03-01

    This study aims to clarify the role that dissolved salts play in the degradation process of terrestrial dissolved organic matter (DOM) at the scale of molecular movement. Molecular thermal movement is perpetual. In a multi-molecular system, this random motion also causes collisions between the molecules. Seawater is a multi-molecular system consisting of water, salt, and terrestrial DOM molecules. This study attributes the DOM degradation in the estuary to the inelastic collision of DOM molecules with charged salt ions. From statistic-thermodynamic theories of molecular collision, the DOM degradation model and the DOM distribution model are derived. The models are validated by field observations and satellite data. Thus, we conclude that the inelastic collision between terrestrial DOM molecules and dissolved salt ions in seawater is a decisive dynamic mechanism for the rapid loss of terrestrial DOM.

  18. Estimation of macroscopic elastic characteristics for hierarchical anisotropic solids based on probabilistic approach

    NASA Astrophysics Data System (ADS)

    Smolina, Irina Yu.

    2015-10-01

    Mechanical properties of a cable are of great importance in the design and strength calculation of flexible cables. The problem of determining the elastic properties and rigidity characteristics of a cable modeled as an anisotropic helical elastic rod is considered. These characteristics are calculated indirectly by means of parameters obtained from statistical processing of experimental data. These parameters are considered as random quantities. Taking into account the probabilistic nature of these parameters, formulas for estimating the macroscopic elastic moduli of a cable are obtained. Expressions for calculating the macroscopic flexural rigidity, shear rigidity, and torsional rigidity from the macroscopic elastic characteristics obtained before are presented. Statistical estimates of the rigidity characteristics of some cable grades are given. A comparison with the characteristics obtained on the basis of a deterministic approach is provided.

  19. Broken Ergodicity in Ideal, Homogeneous, Incompressible Turbulence

    NASA Technical Reports Server (NTRS)

    Morin, Lee; Shebalin, John; Fu, Terry; Nguyen, Phu; Shum, Victor

    2010-01-01

    We discuss the statistical mechanics of numerical models of ideal homogeneous, incompressible turbulence and their relevance for dissipative fluids and magnetofluids. These numerical models are based on Fourier series and the relevant statistical theory predicts that Fourier coefficients of fluid velocity and magnetic fields (if present) are zero-mean random variables. However, numerical simulations clearly show that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation. We explain this phenomenon in terms of 'broken ergodicity', which is defined to occur when dynamical behavior does not match ensemble predictions on very long time-scales. We review the theoretical basis of broken ergodicity, apply it to 2-D and 3-D fluid and magnetohydrodynamic simulations of homogeneous turbulence, and show new results from simulations using GPU (graphics processing unit) computers.

  20. Repeated Random Sampling in Year 5

    ERIC Educational Resources Information Center

    Watson, Jane M.; English, Lyn D.

    2016-01-01

    As an extension to an activity introducing Year 5 students to the practice of statistics, the software "TinkerPlots" made it possible to collect repeated random samples from a finite population to informally explore students' capacity to begin reasoning with a distribution of sample statistics. This article provides background for the…

  1. Statistical Analysis Experiment for Freshman Chemistry Lab.

    ERIC Educational Resources Information Center

    Salzsieder, John C.

    1995-01-01

    Describes a laboratory experiment dissolving zinc from galvanized nails in which data can be gathered very quickly for statistical analysis. The data have sufficient significant figures and the experiment yields a nice distribution of random errors. Freshman students can gain an appreciation of the relationships between random error, number of…

  2. The Statistical Drake Equation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2010-12-01

    We provide the statistical generalization of the Drake equation. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable as the number of terms grows. This is called the Lyapunov Form of the CLT, or the Lindeberg Form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the LOGNORMAL distribution. Then, as a consequence, the mean value of this lognormal distribution is the ordinary N in the Drake equation. The standard deviation, mode, and all the moments of this lognormal N are also found. The seven factors in the ordinary Drake equation now become seven positive random variables. The probability distribution of each random variable may be ARBITRARY. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) DISTANCE between any two neighboring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cube root of N. Then, in our approach, this distance becomes a new random variable. We derive the relevant probability density function, apparently previously unknown and dubbed "Maccone distribution" by Paul Davies. DATA ENRICHMENT PRINCIPLE. It should be noticed that ANY positive number of random variables in the Statistical Drake Equation is compatible with the CLT. So, our generalization allows for many more factors to be added in the future as more refined scientific knowledge about each factor becomes known to scientists. This capability to make room for more future factors in the statistical Drake equation, we call the "Data Enrichment Principle," and we regard it as the key to more profound future results in the fields of Astrobiology and SETI. Finally, a practical example is given of how our statistical Drake equation works numerically. We work out in detail the case where each of the seven random variables is uniformly distributed around its own mean value and has a given standard deviation. For instance, the number of stars in the Galaxy is assumed to be uniformly distributed around (say) 350 billion with a standard deviation of (say) 1 billion. Then, the resulting lognormal distribution of N is computed numerically by virtue of a MathCad file that the author has written. This shows that the mean value of the lognormal random variable N is actually of the same order as the classical N given by the ordinary Drake equation, as one might expect from a good statistical generalization.
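
    The core statistical argument lends itself to a quick Monte Carlo check: multiply seven independent positive random variables and verify that the product is approximately lognormal, i.e. that its logarithm is approximately Gaussian. The uniform ranges below are made-up illustrative values, not the factor estimates used in the paper.

```python
# Monte Carlo sketch of the statistical Drake equation: the product of seven
# independent positive random variables is approximately lognormal, because
# log(product) is a sum of independent terms to which the CLT applies.
# The uniform ranges are hypothetical, chosen only for illustration.
import numpy as np

rng = np.random.default_rng(4)
samples = 200_000

factors = [
    rng.uniform(1.0e11, 4.0e11, samples),   # number of stars
    rng.uniform(0.2, 0.6, samples),         # fraction of stars with planets
    rng.uniform(0.5, 2.0, samples),         # habitable planets per system
    rng.uniform(0.1, 1.0, samples),         # fraction developing life
    rng.uniform(0.01, 0.2, samples),        # fraction developing intelligence
    rng.uniform(0.01, 0.2, samples),        # fraction that communicate
    rng.uniform(1e-6, 1e-4, samples),       # fraction of stellar lifetime
]
N = np.prod(factors, axis=0)

# If N is lognormal, log(N) is Gaussian: skewness ~ 0 and excess kurtosis ~ 0.
log_N = np.log(N)
z = (log_N - log_N.mean()) / log_N.std()
print(f"mean N = {N.mean():.3e}, median N = {np.median(N):.3e}")
print(f"skew(log N) = {(z**3).mean():+.3f}, excess kurtosis(log N) = {(z**4).mean() - 3:+.3f}")
```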

  3. Random and externally controlled occurrences of Dansgaard-Oeschger events

    NASA Astrophysics Data System (ADS)

    Lohmann, Johannes; Ditlevsen, Peter D.

    2018-05-01

    Dansgaard-Oeschger (DO) events constitute the most pronounced mode of centennial to millennial climate variability of the last glacial period. Since their discovery, many decades of research have been devoted to understanding the origin and nature of these rapid climate shifts. In recent years, a number of studies have appeared that report the emergence of DO-type variability in fully coupled general circulation models via different mechanisms. These mechanisms result in the occurrence of DO events at varying degrees of regularity, ranging from periodic to random. When examining the full sequence of DO events as captured in the North Greenland Ice Core Project (NGRIP) ice core record, one can observe high irregularity in the timing of individual events at any stage within the last glacial period. In addition to the prevailing irregularity, certain properties of the DO event sequence, such as the average event frequency or the relative distribution of cold versus warm periods, appear to be changing throughout the glacial. By using statistical hypothesis tests on simple event models, we investigate whether the observed event sequence may have been generated by stationary random processes or rather was strongly modulated by external factors. We find that the sequence of DO warming events is consistent with a stationary random process, whereas dividing the event sequence into warming and cooling events leads to inconsistency with two independent event processes. As we include external forcing, we find a particularly good fit to the observed DO sequence in a model where the average residence time in warm periods is controlled by global ice volume and that in cold periods by boreal summer insolation.
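
    The flavour of such a hypothesis test can be sketched with synthetic data: given a list of event times, compare the waiting-time distribution to the exponential law expected for a stationary Poisson process, using a Kolmogorov-Smirnov distance calibrated by Monte Carlo. The event times below are random placeholders, not the NGRIP DO chronology, and the test is a generic illustration rather than the specific event models used in the paper.

```python
# Illustrative sketch: is an event sequence consistent with a stationary Poisson
# process? Compare the KS distance of its waiting times from an exponential fit
# against a Monte Carlo null of stationary sequences. Event times are synthetic.
import numpy as np

rng = np.random.default_rng(5)

def ks_to_exponential(waits):
    """Kolmogorov-Smirnov distance between waiting times and an exponential fit."""
    w = np.sort(waits)
    ecdf = np.arange(1, len(w) + 1) / len(w)
    model_cdf = 1.0 - np.exp(-w / w.mean())
    return np.max(np.abs(ecdf - model_cdf))

# Hypothetical event times (e.g. kyr before present), made up for illustration.
events = np.sort(rng.uniform(12.0, 110.0, size=25))
d_obs = ks_to_exponential(np.diff(events))

# Null distribution: same number of events scattered uniformly over the same span.
d_null = np.array([
    ks_to_exponential(np.diff(np.sort(rng.uniform(events[0], events[-1], len(events)))))
    for _ in range(2000)
])
p_value = (d_null >= d_obs).mean()
print(f"KS distance = {d_obs:.3f}, Monte Carlo p-value = {p_value:.2f}")
```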

  4. On Fluctuations of Eigenvalues of Random Band Matrices

    NASA Astrophysics Data System (ADS)

    Shcherbina, M.

    2015-10-01

    We consider the fluctuations of linear eigenvalue statistics of random band matrices whose entries have the form with i.i.d. possessing the th moment, where the function u has a finite support, so that M has only nonzero diagonals. The parameter b (called the bandwidth) is assumed to grow with n in a way such that . Without any additional assumptions on the growth of b we prove a CLT for linear eigenvalue statistics for a rather wide class of test functions. Thus we improve and generalize the results of previous papers (Jana et al., arXiv:1412.2445; Li et al. Random Matrices 2:04, 2013), where the CLT was proven under the assumption . Moreover, we develop a method which allows us to prove automatically the CLT for linear eigenvalue statistics of smooth test functions for almost all classical models of random matrix theory: deformed Wigner and sample covariance matrices, sparse matrices, diluted random matrices, matrices with heavy tails, etc.

  5. Nonstationary envelope process and first excursion probability

    NASA Technical Reports Server (NTRS)

    Yang, J.

    1972-01-01

    A definition of the envelope of nonstationary random processes is proposed. The establishment of the envelope definition makes it possible to simulate the nonstationary random envelope directly. Envelope statistics, such as the density function, joint density function, moment function, and level crossing rate, which are relevant to analyses of catastrophic failure, fatigue, and crack propagation in structures, are derived. Applications of the envelope statistics to the prediction of structural reliability under random loadings are discussed in detail.

  6. Fracture mechanics concepts in reliability analysis of monolithic ceramics

    NASA Technical Reports Server (NTRS)

    Manderscheid, Jane M.; Gyekenyesi, John P.

    1987-01-01

    Basic design concepts for high-performance, monolithic ceramic structural components are addressed. The design of brittle ceramics differs from that of ductile metals because of the inability of ceramic materials to redistribute high local stresses caused by inherent flaws. Random flaw size and orientation require that a probabilistic analysis be performed in order to determine component reliability. The current trend in probabilistic analysis is to combine linear elastic fracture mechanics concepts with the two-parameter Weibull distribution function to predict component reliability under multiaxial stress states. Nondestructive evaluation supports this analytical effort by supplying data during verification testing. It can also help to determine statistical parameters which describe the material strength variation, in particular the material threshold strength (the third Weibull parameter), which in the past was often taken as zero for simplicity.
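
    The Weibull reliability model referred to above is easy to state in code. The sketch below evaluates the survival probability of a component under a uniaxial stress with a three-parameter Weibull law, in which the threshold strength enters as the third parameter; all numerical values are hypothetical and are not taken from the report.

```python
# Minimal sketch of a three-parameter Weibull reliability calculation for a
# brittle component under uniaxial stress. Parameter values are hypothetical.
import numpy as np

def weibull_reliability(stress_mpa, sigma_0=300.0, m=10.0, sigma_u=0.0):
    """Probability of survival at a given uniaxial stress (MPa).

    sigma_0 : characteristic (scale) strength, MPa
    m       : Weibull modulus (higher = less strength scatter)
    sigma_u : threshold strength, MPa (often set to zero for conservatism)
    """
    stress = np.asarray(stress_mpa, dtype=float)
    excess = np.clip(stress - sigma_u, 0.0, None)   # no failure below threshold
    return np.exp(-((excess / sigma_0) ** m))

for s in (150.0, 250.0, 300.0, 350.0):
    print(f"stress {s:5.0f} MPa -> reliability {weibull_reliability(s):.4f}")
```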

  7. Propagation of mechanical waves through a stochastic medium with spherical symmetry

    NASA Astrophysics Data System (ADS)

    Avendaño, Carlos G.; Reyes, J. Adrián

    2018-01-01

    We theoretically analyze the propagation of outgoing mechanical waves through an infinite isotropic elastic medium possessing spherical symmetry whose Lamé coefficients and density are spatial random functions characterized by well-defined statistical parameters. We derive the differential equation that governs the average displacement for a system whose properties depend on the radial coordinate. We show that such an equation is an extended version of the well-known Bessel differential equation whose perturbative additional terms contain coefficients that depend directly on the squared noise intensities and the autocorrelation lengths in an exponential decay fashion. We numerically solve the second order differential equation for several values of noise intensities and autocorrelation lengths and compare the corresponding displacement profiles with that of the exact analytic solution for the case of absent inhomogeneities.

  8. [Scale Relativity Theory in living beings morphogenesis: fractal, determinism and chance].

    PubMed

    Chaline, J

    2012-10-01

    The Scale Relativity Theory has many biological applications, from linear to non-linear and from classical mechanics to quantum mechanics. Self-similar laws have been used as models for the description of a huge number of biological systems. These laws may explain the origin of basal life structures. Log-periodic behaviors of acceleration or deceleration can be applied to branching macroevolution and to the time sequences of major evolutionary leaps. The existence of such a law does not mean that the role of chance in evolution is reduced, but instead that randomness and contingency may occur within a framework which may itself be structured in a partly statistical way. The scale relativity theory can open new perspectives in evolution. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  9. Statistics of time delay and scattering correlation functions in chaotic systems. I. Random matrix theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novaes, Marcel

    2015-06-15

    We consider the statistics of time delay in a chaotic cavity having M open channels, in the absence of time-reversal invariance. In the random matrix theory approach, we compute the average value of polynomial functions of the time delay matrix Q = −iħ S† dS/dE, where S is the scattering matrix. Our results do not assume M to be large. In a companion paper, we develop a semiclassical approximation to S-matrix correlation functions, from which the statistics of Q can also be derived. Together, these papers contribute to establishing the conjectured equivalence between the random matrix and the semiclassical approaches.

  10. Random dopant fluctuations and statistical variability in n-channel junctionless FETs

    NASA Astrophysics Data System (ADS)

    Akhavan, N. D.; Umana-Membreno, G. A.; Gu, R.; Antoszewski, J.; Faraone, L.

    2018-01-01

    The influence of random dopant fluctuations on the statistical variability of the electrical characteristics of the n-channel silicon junctionless nanowire transistor (JNT) has been studied using three-dimensional quantum simulations based on the non-equilibrium Green's function (NEGF) formalism. Average randomly distributed body doping densities of 2 × 10¹⁹, 6 × 10¹⁹ and 1 × 10²⁰ cm⁻³ have been considered, employing an atomistic model for JNTs with gate lengths of 5, 10 and 15 nm. We demonstrate that by properly adjusting the doping density in the JNT, a near ideal statistical variability and electrical performance can be achieved, which can pave the way for the continuation of scaling in silicon CMOS technology.

  11. Finite Element Analysis of Reverberation Chambers

    NASA Technical Reports Server (NTRS)

    Bunting, Charles F.; Nguyen, Duc T.

    2000-01-01

    The primary motivating factor behind the initiation of this work was to provide a deterministic means of establishing the validity of the statistical methods that are recommended for the determination of fields that interact in an avionics system. The application of finite element analysis to reverberation chambers is the initial step required to establish a reasonable course of inquiry in this particularly data-intensive study. The use of computational electromagnetics provides a high degree of control of the "experimental" parameters that can be utilized in a simulation of reverberating structures. As the work evolved there were four primary focus areas: 1. The eigenvalue problem for the source-free problem. 2. The development of a complex efficient eigensolver. 3. The application of a source for the TE and TM fields for statistical characterization. 4. The examination of shielding effectiveness in a reverberating environment. One early purpose of this work was to establish the utility of finite element techniques in the development of an extended low frequency statistical model for reverberation phenomena. By employing finite element techniques, structures of arbitrary complexity can be analyzed due to the use of triangular shape functions in the spatial discretization. The effects of both frequency stirring and mechanical stirring are presented. It is suggested that for low frequency operation the typical tuner size is inadequate to provide a sufficiently random field and that frequency stirring should be used. The results of the finite element analysis of the reverberation chamber illustrate the potential utility of a 2D representation for enhancing the basic statistical characteristics of the chamber when operating in a low frequency regime. The basic field statistics are verified for frequency stirring over a wide range of frequencies. Mechanical stirring is shown to provide an effective frequency deviation.

  12. Mechanisms of Change During Attention Training and Mindfulness in High Trait-Anxious Individuals: A Randomized Controlled Study.

    PubMed

    McEvoy, Peter M; Graville, Rachel; Hayes, Sarra; Kane, Robert T; Foster, Jonathan K

    2017-09-01

    The first aim of this study was to compare attention manipulation techniques deriving from metacognitive therapy (the Attention Training Technique; ATT) and mindfulness-based approaches (Mindfulness-Based Progressive Muscle Relaxation, MB-PMR) to a thought wandering control (TWC) condition, in terms of their impact on anxiety and four mechanisms: distancing, present-focused attention, metacognitive beliefs about uncontrollability and dangerousness, and cognitive flexibility (Stroop task). The second aim was to test indirect effects of the techniques on anxiety via the mechanism measures. High trait anxious participants (N = 81, M age = 23.60, SD age = 7.66, 80% female) were randomized to receive ATT, MB-PMR, or the TWC condition. Measures of cognitive and somatic anxiety, distancing, present-focused attention, metacognitive beliefs, and cognitive flexibility were administered before or after the attention manipulation task. Compared to the TWC group, ATT and MB-PMR were associated with greater changes in cognitive (but not somatic) anxiety, present-focused attention, metacognitive beliefs, and uncorrected errors for threat-related words on the Stroop task. The pattern of means was similar for distancing, but this did not reach statistical significance, and Stroop speed increased equally for all conditions. Indirect effects models revealed significant effects of condition on state anxiety via distancing, metacognitive beliefs, and present-focused attention, but not via Stroop errors. ATT and MB-PMR were associated with changes in anxiety and the mechanism measures, suggesting that the mechanisms of change may be more similar than different across these techniques. Copyright © 2017. Published by Elsevier Ltd.

  13. Effect of prone positioning during mechanical ventilation on mortality among patients with acute respiratory distress syndrome: a systematic review and meta-analysis.

    PubMed

    Sud, Sachin; Friedrich, Jan O; Adhikari, Neill K J; Taccone, Paolo; Mancebo, Jordi; Polli, Federico; Latini, Roberto; Pesenti, Antonio; Curley, Martha A Q; Fernandez, Rafael; Chan, Ming-Cheng; Beuret, Pascal; Voggenreiter, Gregor; Sud, Maneesh; Tognoni, Gianni; Gattinoni, Luciano; Guérin, Claude

    2014-07-08

    Mechanical ventilation in the prone position is used to improve oxygenation and to mitigate the harmful effects of mechanical ventilation in patients with acute respiratory distress syndrome (ARDS). We sought to determine the effect of prone positioning on mortality among patients with ARDS receiving protective lung ventilation. We searched electronic databases and conference proceedings to identify relevant randomized controlled trials (RCTs) published through August 2013. We included RCTs that compared prone and supine positioning during mechanical ventilation in patients with ARDS. We assessed risk of bias and obtained data on all-cause mortality (determined at hospital discharge or, if unavailable, after longest follow-up period). We used random-effects models for the pooled analyses. We identified 11 RCTs (n=2341) that met our inclusion criteria. In the 6 trials (n=1016) that used a protective ventilation strategy with reduced tidal volumes, prone positioning significantly reduced mortality (risk ratio 0.74, 95% confidence interval 0.59-0.95; I2=29%) compared with supine positioning. The mortality benefit remained in several sensitivity analyses. The overall quality of evidence was high. The risk of bias was low in all of the trials except one, which was small. Statistical heterogeneity was low (I2<50%) for most of the clinical and physiologic outcomes. Our analysis of high-quality evidence showed that use of the prone position during mechanical ventilation improved survival among patients with ARDS who received protective lung ventilation. © 2014 Canadian Medical Association or its licensors.

  14. Probabilistic Material Strength Degradation Model for Inconel 718 Components Subjected to High Temperature, Mechanical Fatigue, Creep and Thermal Fatigue Effects

    NASA Technical Reports Server (NTRS)

    Bast, Callie Corinne Scheidt

    1994-01-01

    This thesis presents the on-going development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes four effects that typically reduce lifetime strength: high temperature, mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for four variables, namely, high temperature, mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using the current version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of mechanical fatigue, creep, and thermal fatigue was performed. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing a combination of mechanical fatigue and high temperature effects by model to the combination by experiment were conducted. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated. Results from this limited verification study strongly supported this assumption.

  15. Endovascular Treatment of Ischemic Stroke: An Updated Meta-Analysis of Efficacy and Safety.

    PubMed

    Vidale, Simone; Agostoni, Elio

    2017-05-01

    Recent randomized trials demonstrated the superiority of mechanical thrombectomy over the best medical treatment in patients with acute ischemic stroke due to an occlusion of arteries of the proximal anterior circulation. In this updated meta-analysis, we aimed to summarize the overall clinical effects of the treatment, including the most recent trials. We performed a literature search of Randomized Controlled Trials (RCTs) published between 2010 and October 2016, comparing endovenous thrombolysis plus mechanical thrombectomy (intervention group) with best medical care alone (control group). We identified 8 trials. Primary outcomes were reduced disability at 90 days from the event and symptomatic intracranial hemorrhage. Statistical analysis was performed pooling data into the 2 groups, evaluating outcome heterogeneity. The Mantel-Haenszel method was used to calculate odds ratios (ORs). We analyzed data for 1845 patients (interventional group: 911; control group: 934). Mechanical thrombectomy contributed to a significant reduction in disability rate compared to the best medical treatment alone (OR: 2.087; 95% confidence interval [CI]: 1.718-2.535; P < .001). We calculated that for every 100 treated patients, 16 more participants have a good outcome as a result of mechanical treatment. No significant differences between groups were observed concerning the occurrence of symptomatic hemorrhage (OR: 1.021; 95% CI: 0.641-1.629; P = .739). Mechanical thrombectomy contributes to significantly increasing the functional benefit of endovenous thrombolysis in patients with acute ischemic stroke caused by arterial occlusion of the proximal anterior circulation, without a reduction in safety. These findings are relevant for the optimization of acute stroke management, including the implementation of networks between stroke centers.

  16. Effect of prone positioning during mechanical ventilation on mortality among patients with acute respiratory distress syndrome: a systematic review and meta-analysis

    PubMed Central

    Sud, Sachin; Friedrich, Jan O.; Adhikari, Neill K. J.; Taccone, Paolo; Mancebo, Jordi; Polli, Federico; Latini, Roberto; Pesenti, Antonio; Curley, Martha A.Q.; Fernandez, Rafael; Chan, Ming-Cheng; Beuret, Pascal; Voggenreiter, Gregor; Sud, Maneesh; Tognoni, Gianni; Gattinoni, Luciano; Guérin, Claude

    2014-01-01

    Background: Mechanical ventilation in the prone position is used to improve oxygenation and to mitigate the harmful effects of mechanical ventilation in patients with acute respiratory distress syndrome (ARDS). We sought to determine the effect of prone positioning on mortality among patients with ARDS receiving protective lung ventilation. Methods: We searched electronic databases and conference proceedings to identify relevant randomized controlled trials (RCTs) published through August 2013. We included RCTs that compared prone and supine positioning during mechanical ventilation in patients with ARDS. We assessed risk of bias and obtained data on all-cause mortality (determined at hospital discharge or, if unavailable, after longest follow-up period). We used random-effects models for the pooled analyses. Results: We identified 11 RCTs (n = 2341) that met our inclusion criteria. In the 6 trials (n = 1016) that used a protective ventilation strategy with reduced tidal volumes, prone positioning significantly reduced mortality (risk ratio 0.74, 95% confidence interval 0.59–0.95; I2 = 29%) compared with supine positioning. The mortality benefit remained in several sensitivity analyses. The overall quality of evidence was high. The risk of bias was low in all of the trials except one, which was small. Statistical heterogeneity was low (I2 < 50%) for most of the clinical and physiologic outcomes. Interpretation: Our analysis of high-quality evidence showed that use of the prone position during mechanical ventilation improved survival among patients with ARDS who received protective lung ventilation. PMID:24863923

  17. In Search of the Most Likely Value

    ERIC Educational Resources Information Center

    Letkowski, Jerzy

    2014-01-01

    Descriptive Statistics provides methodology and tools for user-friendly presentation of random data. Among the summary measures that describe focal tendencies in random data, the mode is given the least amount of attention and it is frequently misinterpreted in many introductory textbooks on statistics. The purpose of the paper is to provide a…

  18. Mainstreaming Remedial Mathematics Students in Introductory Statistics: Results Using a Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Logue, Alexandra W.; Watanabe-Rose, Mari

    2014-01-01

    This study used a randomized controlled trial to determine whether students, assessed by their community colleges as needing an elementary algebra (remedial) mathematics course, could instead succeed at least as well in a college-level, credit-bearing introductory statistics course with extra support (a weekly workshop). Researchers randomly…

  19. Reflexology: its effects on physiological anxiety signs and sedation needs.

    PubMed

    Akin Korhan, Esra; Khorshid, Leyla; Uyar, Mehmet

    2014-01-01

    To investigate whether reflexology has an effect on the physiological signs of anxiety and level of sedation in patients receiving mechanical ventilation support, a single-blinded, randomized controlled design with repeated measures was used in the intensive care unit of a university hospital in Turkey. Patients (n = 60) were aged between 18 and 70 years, were hospitalized in the intensive care unit, and were receiving mechanical ventilation support. Participants were randomized to a control group or an intervention group. The latter received 30 minutes of reflexology therapy on their feet, hands, and ears for 5 days. Subjects had vital signs taken immediately before the intervention and at the 10th, 20th, and 30th minutes of the intervention. In the collection of the data, the "American Association of Critical-Care Nurses Sedation Assessment Scale" was used. The reflexology therapy group had a significantly lower heart rate, systolic blood pressure, diastolic blood pressure, and respiratory rate than the control group. A statistically significant difference was found between the averages of the scores that the patients included in the experimental and control groups received from the agitation, anxiety, sleep, and patient-ventilator synchrony subscales of the American Association of Critical-Care Nurses Sedation Assessment Scale. Reflexology can serve as an effective method of decreasing the physiological signs of anxiety and the required level of sedation in patients receiving mechanical ventilation support. Nurses who have appropriate training and certification may include reflexology in routine care to reduce the physiological signs of anxiety of patients receiving mechanical ventilation.

  20. Dornase alpha compared to hypertonic saline for lung atelectasis in critically ill patients.

    PubMed

    Youness, Houssein A; Mathews, Kathryn; Elya, Marwan K; Kinasewitz, Gary T; Keddissi, Jean I

    2012-12-01

    Despite the lack of randomized trials, nebulized Dornase alpha and hypertonic saline are used empirically to treat atelectasis in mechanically ventilated patients. Our objective was to determine the clinical and radiological efficacy of these medications as an adjunct to standard therapy in critically ill patients. Mechanically ventilated patients with new onset (<48 h) lobar or multilobar atelectasis were randomized into three groups: nebulized Dornase alpha, hypertonic (7%) saline, or normal saline every 12 h. All patients received standard therapy, including chest percussion therapy, kinetic therapy, and bronchodilators. The primary endpoint was the change in the daily chest X-ray atelectasis score. A total of 33 patients met the inclusion criteria and were randomized equally into the three groups. Patients in the Dornase alpha group showed a reduction of 2.18±1.33 points in the CXR score from baseline to day 7, whereas patients in the normal saline group had a reduction of 1.00±1.79 points, and patients in the hypertonic saline group showed a score reduction of 1.09±1.51 points. Pairwise comparison of the mean change of the CXR score showed no statistical difference between hypertonic saline, normal saline, and Dornase alpha. Airway pressures as well as oxygenation, expressed as PaO2/FIO2, and time to extubation were also similar among groups. During the study period the rate of extubation was 54% (6/11), 45% (5/11), and 63% (7/11) in the normal saline, hypertonic saline, and Dornase alpha groups, respectively (p=0.09). No treatment-related complications were observed. There was no significant improvement in the chest X-ray atelectasis score in mechanically ventilated patients with new onset atelectasis who were nebulized with Dornase alpha twice a day. Hypertonic saline was no more effective than normal saline in this population. Larger randomized controlled trials are needed to confirm our results.

  1. Bayesian approach to non-Gaussian field statistics for diffusive broadband terahertz pulses.

    PubMed

    Pearce, Jeremy; Jian, Zhongping; Mittleman, Daniel M

    2005-11-01

    We develop a closed-form expression for the probability distribution function for the field components of a diffusive broadband wave propagating through a random medium. We consider each spectral component to provide an individual observation of a random variable, the configurationally averaged spectral intensity. Since the intensity determines the variance of the field distribution at each frequency, this random variable serves as the Bayesian prior that determines the form of the non-Gaussian field statistics. This model agrees well with experimental results.
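
    The compound (mixture) structure described here is easy to demonstrate numerically: conditionally on the spectral intensity the field component is Gaussian, and marginalizing over the intensity prior produces heavier-than-Gaussian tails. In the sketch below the gamma prior for the intensity is an arbitrary illustrative choice, not the distribution used in the paper.

```python
# Illustrative sketch of a compound-Gaussian (Bayesian-mixture) field model:
# the field is Gaussian given the spectral intensity, and the random intensity
# acts as a prior on the Gaussian variance. The gamma prior is an assumption.
import numpy as np

rng = np.random.default_rng(6)
n = 500_000

intensity = rng.gamma(shape=1.5, scale=1.0, size=n)   # hypothetical intensity prior
field = rng.normal(0.0, np.sqrt(intensity))           # E | I ~ Normal(0, I)
gaussian = rng.normal(0.0, field.std(), size=n)       # plain Gaussian, same variance

def excess_kurtosis(x):
    z = (x - x.mean()) / x.std()
    return (z ** 4).mean() - 3.0

print(f"excess kurtosis, compound field: {excess_kurtosis(field):+.2f}  (> 0: heavy tails)")
print(f"excess kurtosis, plain Gaussian: {excess_kurtosis(gaussian):+.2f}")
```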

  2. On the explicit construction of Parisi landscapes in finite dimensional Euclidean spaces

    NASA Astrophysics Data System (ADS)

    Fyodorov, Y. V.; Bouchaud, J.-P.

    2007-12-01

    An N-dimensional Gaussian landscape with multiscale translation-invariant logarithmic correlations has been constructed, and the statistical mechanics of a single particle in this environment has been investigated. In the limit of high dimension N → ∞, the free energy of the system in the thermodynamic limit coincides with the most general version of Derrida's generalized random energy model. The low-temperature behavior depends essentially on the spectrum of length scales involved in the construction of the landscape. The construction is argued to be valid in any finite spatial dimension N ≥ 1.

  3. Typical entanglement

    NASA Astrophysics Data System (ADS)

    Deelan Cunden, Fabio; Facchi, Paolo; Florio, Giuseppe; Pascazio, Saverio

    2013-05-01

    Let a pure state |ψ⟩ be chosen randomly in an NM-dimensional Hilbert space, and consider the reduced density matrix ρ_A of an N-dimensional subsystem. The bipartite entanglement properties of |ψ⟩ are encoded in the spectrum of ρ_A. By means of a saddle point method and using a "Coulomb gas" model for the eigenvalues, we obtain the typical spectrum of reduced density matrices. We consider the cases of an unbiased ensemble of pure states and of a fixed value of the purity. We finally obtain the eigenvalue distribution by using a statistical mechanics approach based on the introduction of a partition function.
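
    For the unbiased ensemble, the typical spectrum can be probed directly by sampling: a Haar-random pure state is a normalized complex Gaussian vector, and reshaping it into an N × M matrix of amplitudes gives the reduced density matrix of subsystem A as ρ_A = ψψ†. The sketch below checks the average purity against Lubkin's classic value (N + M)/(NM + 1); the dimensions and number of samples are arbitrary illustrative choices.

```python
# Minimal sketch: sample Haar-random pure states of an N x M bipartite system,
# build the reduced density matrix of the N-dimensional subsystem, and compare
# the mean purity Tr(rho_A^2) with Lubkin's average (N + M)/(N*M + 1).
import numpy as np

rng = np.random.default_rng(7)
N, M, trials = 8, 32, 2000

purities = []
for _ in range(trials):
    # Haar-random pure state = normalized complex Gaussian amplitude matrix.
    psi = rng.normal(size=(N, M)) + 1j * rng.normal(size=(N, M))
    psi /= np.linalg.norm(psi)
    rho_A = psi @ psi.conj().T                # partial trace over the M-dim part
    purities.append(np.real(np.trace(rho_A @ rho_A)))

print(f"mean purity Tr(rho_A^2):        {np.mean(purities):.4f}")
print(f"Lubkin average (N+M)/(N*M + 1): {(N + M) / (N * M + 1):.4f}")
```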

  4. A comparison of medical records and patient questionnaires as sources for the estimation of costs within research studies and the implications for economic evaluation.

    PubMed

    Gillespie, Paddy; O'Shea, Eamon; Smith, Susan M; Cupples, Margaret E; Murphy, Andrew W

    2016-12-01

    Data on health care utilization may be collected using a variety of mechanisms within research studies, each of which may have implications for cost and cost effectiveness. The aim of this observational study is to compare data collected from medical records searches and self-report questionnaires for the cost analysis of a cardiac secondary prevention intervention. Secondary data analysis of the Secondary Prevention of Heart Disease in General Practice (SPHERE) randomized controlled trial (RCT). Resource use data for a range of health care services were collected by research nurse searches of medical records and self-report questionnaires and costs of care estimated for each data collection mechanism. A series of statistical analyses were conducted to compare the mean costs for medical records data versus questionnaire data and to conduct incremental analyses for the intervention and control arms in the trial. Data were available to estimate costs for 95% of patients in the intervention and 96% of patients in the control using the medical records data compared to 65% and 66%, respectively, using the questionnaire data. The incremental analysis revealed a statistically significant difference in mean cost of -€796 (95% CI: -1447, -144; P-value: 0.017) for the intervention relative to the control. This compared to no significant difference in mean cost (95% CI: -1446, 860; P-value: 0.619) for the questionnaire analysis. Our findings illustrate the importance of the choice of health care utilization data collection mechanism for the conduct of economic evaluation alongside randomized trials in primary care. This choice will have implications for the costing methodology employed and potentially, for the cost and cost effectiveness outcomes generated. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  5. Welfare state retrenchment and increasing mental health inequality by educational credentials in Finland: a multicohort study

    PubMed Central

    Kokkinen, Lauri; Muntaner, Carles; Kouvonen, Anne; Koskinen, Aki; Varje, Pekka; Väänänen, Ari

    2015-01-01

    Objectives Epidemiological studies have shown an association between educational credentials and mental disorders, but have not offered any explanation for the varying strength of this association in different historical contexts. In this study, we investigate the education-specific trends in hospitalisation due to psychiatric disorders in Finnish working-age men and women between 1976 and 2010, and offer a welfare state explanation for the secular trends found. Setting Population-based setting with a 25% random sample of the population aged 30–65 years in 7 independent consecutive cohorts (1976–1980, 1981–1985, 1986–1990, 1991–1995, 1996–2000, 2001–2005, 2006–2010). Participants Participants were randomly selected from the Statistics Finland population database (n=2 865 746). These data were linked to diagnosis-specific records on hospitalisations, drawn from the National Hospital Discharge Registry using personal identification numbers. Employment rates by educational credentials were drawn from the Statistics Finland employment database. Primary and secondary outcome measures Hospitalisation and employment. Results We found an increasing trend in psychiatric hospitalisation rates among the population with only an elementary school education, and a decreasing trend in those with higher educational credentials. The employment rate of the population with only an elementary school education decreased more than that of those with higher educational credentials. Conclusions We propose that restricted employment opportunities are the main mechanism behind the increased educational inequality in hospitalisation for psychiatric disorders, while several secondary mechanisms (lack of outpatient healthcare services, welfare cuts, decreased alcohol duty) further accelerated the diverging long-term trends. All of these inequality-increasing mechanisms were activated by welfare state retrenchment, which included the liberalisation of financial markets and labour markets, severe austerity measures and narrowing down of public sector employment commitment. PMID:26041491

  6. Singular Behavior of the Leading Lyapunov Exponent of a Product of Random 2 × 2 Matrices

    NASA Astrophysics Data System (ADS)

    Genovese, Giuseppe; Giacomin, Giambattista; Greenblatt, Rafael Leon

    2017-05-01

    We consider a certain infinite product of random 2 × 2 matrices appearing in the solution of some 1- and (1 + 1)-dimensional disordered models in statistical mechanics, which depends on a parameter ε > 0 and on a real random variable with distribution μ. For a large class of μ, we prove the prediction by Derrida and Hilhorst (J Phys A 16:2641, 1983) that the Lyapunov exponent behaves like Cε^{2α} in the limit ε ↘ 0, where α ∈ (0,1) and C > 0 are determined by μ. Derrida and Hilhorst performed a two-scale analysis of the integral equation for the invariant distribution of the Markov chain associated with the matrix product and obtained a probability measure that is expected to be close to the invariant one for small ε. We introduce suitable norms and exploit contractivity properties to show that this probability measure is indeed close to the invariant one, in a sense that implies a suitable control of the Lyapunov exponent.

  7. Hurdles and sorting by inversions: combinatorial, statistical, and experimental results.

    PubMed

    Swenson, Krister M; Lin, Yu; Rajan, Vaibhav; Moret, Bernard M E

    2009-10-01

    As data about genomic architecture accumulates, genomic rearrangements have attracted increasing attention. One of the main rearrangement mechanisms, inversions (also called reversals), was characterized by Hannenhalli and Pevzner, and this characterization was in turn extended by various authors. The characterization relies on the concepts of breakpoints, cycles, and obstructions colorfully named hurdles and fortresses. In this paper, we study the probability of generating a hurdle in the process of sorting a permutation if one does not take special precautions to avoid them (as in a randomized algorithm, for instance). To do this we revisit and extend the work of Caprara and of Bergeron by providing simple and exact characterizations of the probability of encountering a hurdle in a random permutation. Using similar methods we provide the first asymptotically tight analysis of the probability that a fortress exists in a random permutation. Finally, we study other aspects of hurdles, both analytically and through experiments: when are they created in a sequence of sorting inversions, how much later are they detected, and how much work may need to be undone to return to a sorting sequence.

  8. Observers Exploit Stochastic Models of Sensory Change to Help Judge the Passage of Time

    PubMed Central

    Ahrens, Misha B.; Sahani, Maneesh

    2011-01-01

    Summary Sensory stimulation can systematically bias the perceived passage of time [1–5], but why and how this happens is mysterious. In this report, we provide evidence that such biases may ultimately derive from an innate and adaptive use of stochastically evolving dynamic stimuli to help refine estimates derived from internal timekeeping mechanisms [6–15]. A simplified statistical model based on probabilistic expectations of stimulus change derived from the second-order temporal statistics of the natural environment [16, 17] makes three predictions. First, random noise-like stimuli whose statistics violate natural expectations should induce timing bias. Second, a previously unexplored obverse of this effect is that similar noise stimuli with natural statistics should reduce the variability of timing estimates. Finally, this reduction in variability should scale with the interval being timed, so as to preserve the overall Weber law of interval timing. All three predictions are borne out experimentally. Thus, in the context of our novel theoretical framework, these results suggest that observers routinely rely on sensory input to augment their sense of the passage of time, through a process of Bayesian inference based on expectations of change in the natural environment. PMID:21256018

  9. Wigner surmises and the two-dimensional homogeneous Poisson point process.

    PubMed

    Sakhr, Jamal; Nieminen, John M

    2006-04-01

    We derive a set of identities that relate the higher-order interpoint spacing statistics of the two-dimensional homogeneous Poisson point process to the Wigner surmises for the higher-order spacing distributions of eigenvalues from the three classical random matrix ensembles. We also report a remarkable identity that equates the second-nearest-neighbor spacing statistics of the points of the Poisson process and the nearest-neighbor spacing statistics of complex eigenvalues from Ginibre's ensemble of 2 × 2 complex non-Hermitian random matrices.
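
    A numerical companion to these identities (Python; intensity, box size and the order k are arbitrary choices): simulate a homogeneous 2D Poisson point process and compare the empirical kth-nearest-neighbour spacing with the standard closed form, whose k = 2 case is the second-nearest-neighbour statistic entering the identity with the Ginibre surmise. The Wigner surmise formulas themselves are not restated here.

        import numpy as np
        from math import gamma, pi
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(1)
        rho, box = 1.0, 100.0                   # intensity and box side (arbitrary choices)
        n = rng.poisson(rho * box * box)
        pts = rng.uniform(0.0, box, size=(n, 2))

        k = 2                                   # second-nearest-neighbour spacing
        tree = cKDTree(pts)
        # column 0 of the query is the point itself, so ask for k+1 neighbours
        dist, _ = tree.query(pts, k=k + 1)
        r_k = dist[:, k]

        # For a homogeneous 2D Poisson process, rho*pi*R_k^2 ~ Gamma(k, 1), i.e.
        # f_k(r) = 2 (rho*pi)^k r^(2k-1) exp(-rho*pi*r^2) / (k-1)!,
        # so E[R_k] = Gamma(k + 1/2) / (Gamma(k) * sqrt(rho*pi)).
        analytic_mean = gamma(k + 0.5) / (gamma(k) * np.sqrt(rho * pi))

        print("empirical mean k-th NN spacing:", r_k.mean())   # edge effects ignored
        print("analytic  mean k-th NN spacing:", analytic_mean)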

  10. Comparative analysis of ferroelectric domain statistics via nonlinear diffraction in random nonlinear materials.

    PubMed

    Wang, B; Switowski, K; Cojocaru, C; Roppo, V; Sheng, Y; Scalora, M; Kisielewski, J; Pawlak, D; Vilaseca, R; Akhouayri, H; Krolikowski, W; Trull, J

    2018-01-22

    We present an indirect, non-destructive optical method for domain statistic characterization in disordered nonlinear crystals having homogeneous refractive index and spatially random distribution of ferroelectric domains. This method relies on the analysis of the wave-dependent spatial distribution of the second harmonic, in the plane perpendicular to the optical axis in combination with numerical simulations. We apply this technique to the characterization of two different media, Calcium Barium Niobate and Strontium Barium Niobate, with drastically different statistical distributions of ferroelectric domains.

  11. Deviations from Rayleigh statistics in ultrasonic speckle.

    PubMed

    Tuthill, T A; Sperry, R H; Parker, K J

    1988-04-01

    The statistics of speckle patterns in ultrasound images have potential for tissue characterization. In "fully developed speckle" from many random scatterers, the amplitude is widely recognized as possessing a Rayleigh distribution. This study examines how scattering populations and signal processing can produce non-Rayleigh distributions. The first order speckle statistics are shown to depend on random scatterer density and the amplitude and spacing of added periodic scatterers. Envelope detection, amplifier compression, and signal bandwidth are also shown to cause distinct changes in the signal distribution.
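
    The "fully developed speckle" limit is easy to reproduce numerically: the coherent sum of many unit scatterers with random phases has an envelope amplitude that approaches a Rayleigh distribution, while a sparse scatterer population departs from it. A minimal sketch (Python; the scatterer counts and pixel count are arbitrary choices, and periodic scatterers and signal-processing effects are not modelled):

        import numpy as np

        rng = np.random.default_rng(0)

        def speckle_amplitude(n_scatterers, n_pixels=20_000):
            """Envelope amplitude of a coherent sum of unit scatterers with random phases."""
            phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_pixels, n_scatterers))
            field = np.exp(1j * phases).sum(axis=1) / np.sqrt(n_scatterers)
            return np.abs(field)

        for n in (2, 5, 100):
            a = speckle_amplitude(n)
            # Fully developed (Rayleigh) speckle has amplitude SNR = mean/std of about 1.91.
            print(f"{n:4d} scatterers per resolution cell: SNR = {a.mean() / a.std():.2f}")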

  12. The living Drake equation of the Tau Zero Foundation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2011-03-01

    The living Drake equation is our statistical generalization of the Drake equation such that it can take into account any number of factors. This new result opens up the possibility to enrich the equation by inserting more new factors as long as the scientific learning increases. The adjective "Living" refers just to this continuous enrichment of the Drake equation and is the goal of a new research project that the Tau Zero Foundation has entrusted to this author as the discoverer of the statistical Drake equation described hereafter. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be arbitrarily distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov form of the CLT, or the Lindeberg form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the lognormal distribution. Then, the mean value, standard deviation, mode, median and all the moments of this lognormal N can be derived from the means and standard deviations of the seven input random variables. In fact, the seven factors in the ordinary Drake equation now become seven independent positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) distance between any two neighbouring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, this distance now becomes a new random variable. We derive the relevant probability density function, apparently previously unknown (dubbed "Maccone distribution" by Paul Davies). Data Enrichment Principle. It should be noticed that any positive number of random variables in the statistical Drake equation is compatible with the CLT. So, our generalization allows for many more factors to be added in the future as long as more refined scientific knowledge about each factor will be known to the scientists. This capability to make room for more future factors in the statistical Drake equation we call the "Data Enrichment Principle", and regard as the key to more profound, future results in Astrobiology and SETI.
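
    The lognormal limit described above is easy to check numerically. The sketch below (Python) draws the seven factors as independent positive random variables, here gamma-distributed with invented means and coefficients of variation (illustrative placeholders, not Maccone's inputs), forms their product N, and confirms that log N is close to Gaussian; it also evaluates the N^(-1/3) scaling of the inter-civilization distance up to an unspecified Galactic-volume constant.

        import numpy as np

        rng = np.random.default_rng(0)
        n_draws = 100_000

        # Illustrative means and coefficients of variation for the seven factors
        # (placeholders only; they are not Maccone's input values).
        means = np.array([7.0, 0.5, 2.0, 0.3, 0.2, 0.1, 1.0e-4])
        cvs = np.full(7, 0.5)

        # Draw each factor from an arbitrary positive distribution (here gamma),
        # as allowed by the Lyapunov/Lindeberg forms of the CLT.
        shapes = 1.0 / cvs**2                 # gamma shape kappa gives CV = 1/sqrt(kappa)
        scales = means / shapes
        factors = rng.gamma(shapes, scales, size=(n_draws, 7))

        N = factors.prod(axis=1)              # statistical Drake equation: product of 7 random factors
        logN = np.log(N)

        # If the lognormal approximation is adequate, log N is close to Gaussian.
        skew = ((logN - logN.mean()) ** 3).mean() / logN.std() ** 3
        print("mean(log N) = %.3f, std(log N) = %.3f, skewness = %.2f"
              % (logN.mean(), logN.std(), skew))

        # The average distance to the nearest communicating civilization scales as
        # N^(-1/3), up to a constant fixed by the Galactic volume (ignored here).
        print("median of N^(-1/3):", np.median(N ** (-1.0 / 3.0)))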

  13. Random Evolution of Idiotypic Networks: Dynamics and Architecture

    NASA Astrophysics Data System (ADS)

    Brede, Markus; Behn, Ulrich

    The paper deals with modelling a subsystem of the immune system, the so-called idiotypic network (INW). INWs, conceived by N.K. Jerne in 1974, are functional networks of interacting antibodies and B cells. In principle, Jerne's framework provides solutions to many issues in immunology, such as immunological memory, mechanisms for antigen recognition and self/non-self discrimination. Explaining the interconnection between the elementary components, local dynamics, network formation and architecture, and possible modes of global system function appears to be an ideal playground for statistical mechanics. We present a simple cellular automaton model, based on a graph representation of the system. From a simplified description of idiotypic interactions, rules for the random evolution of networks of occupied and empty sites on these graphs are derived. In certain biologically relevant parameter ranges the resultant dynamics leads to stationary states. A stationary state is found to correspond to a specific pattern of network organization. It turns out that even these very simple rules give rise to a multitude of different kinds of patterns. We characterize these networks by classifying 'static' and 'dynamic' network patterns. A type of 'dynamic' network is found to display many features of real INWs.

  14. The Statistical Fermi Paradox

    NASA Astrophysics Data System (ADS)

    Maccone, C.

    In this paper is provided the statistical generalization of the Fermi paradox. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable planets for man (1964). The statistical generalization of the original and by now too simplistic Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of Statistics, stating that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log- normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy may be shown to be inversely proportional to the cubic root of NHab. This distance is denoted by new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in 2008. 4. A practical example is then given of how the SEH works numerically. Each of the ten random variables is uniformly distributed around its own mean value as given by Dole (1964) and a standard deviation of 10% is assumed. The conclusion is that the average number of habitable planets in the Galaxy should be around 100 million ±200 million, and the average distance in between any two nearby habitable planets should be about 88 light years ±40 light years. 5. The SEH results are matched against the results of the Statistical Drake Equation from reference 4. As expected, the number of currently communicating ET civilizations in the Galaxy turns out to be much smaller than the number of habitable planets (about 10,000 against 100 million, i.e. one ET civilization out of 10,000 habitable planets). The average distance between any two nearby habitable planets is much smaller that the average distance between any two neighbouring ET civilizations: 88 light years vs. 2000 light years, respectively. This means an ET average distance about 20 times higher than the average distance between any pair of adjacent habitable planets. 6. Finally, a statistical model of the Fermi Paradox is derived by applying the above results to the coral expansion model of Galactic colonization. The symbolic manipulator "Macsyma" is used to solve these difficult equations. 
    A new random variable Tcol, representing the time needed to colonize a new planet, is introduced; it follows the lognormal distribution. Then the new quotient random variable Tcol/D is studied and its probability density function is derived by Macsyma. Finally, a linear transformation of random variables yields the overall time TGalaxy needed to colonize the whole Galaxy. We believe that our mathematical work in deriving this STATISTICAL Fermi Paradox is highly innovative and fruitful for the future.

  15. A new u-statistic with superior design sensitivity in matched observational studies.

    PubMed

    Rosenbaum, Paul R

    2011-09-01

    In an observational or nonrandomized study of treatment effects, a sensitivity analysis indicates the magnitude of bias from unmeasured covariates that would need to be present to alter the conclusions of a naïve analysis that presumes adjustments for observed covariates suffice to remove all bias. The power of sensitivity analysis is the probability that it will reject a false hypothesis about treatment effects allowing for a departure from random assignment of a specified magnitude; in particular, if this specified magnitude is "no departure" then this is the same as the power of a randomization test in a randomized experiment. A new family of u-statistics is proposed that includes Wilcoxon's signed rank statistic but also includes other statistics with substantially higher power when a sensitivity analysis is performed in an observational study. Wilcoxon's statistic has high power to detect small effects in large randomized experiments-that is, it often has good Pitman efficiency-but small effects are invariably sensitive to small unobserved biases. Members of this family of u-statistics that emphasize medium to large effects can have substantially higher power in a sensitivity analysis. For example, in one situation with 250 pair differences that are Normal with expectation 1/2 and variance 1, the power of a sensitivity analysis that uses Wilcoxon's statistic is 0.08 while the power of another member of the family of u-statistics is 0.66. The topic is examined by performing a sensitivity analysis in three observational studies, using an asymptotic measure called the design sensitivity, and by simulating power in finite samples. The three examples are drawn from epidemiology, clinical medicine, and genetic toxicology. © 2010, The International Biometric Society.

  16. Molecular association of pathogenetic contributors to pre-eclampsia (pre-eclampsia associome)

    PubMed Central

    2015-01-01

    Background Pre-eclampsia is the most common complication occurring during pregnancy. In the majority of cases, it is concurrent with other pathologies in a comorbid manner (frequent co-occurrences in patients), such as diabetes mellitus, gestational diabetes and obesity. Providing bronchial asthma, pulmonary tuberculosis, certain neurodegenerative diseases and cancers as examples, we have shown previously that pairs of inversely comorbid pathologies (rare co-occurrences in patients) are more closely related to each other at the molecular genetic level compared with randomly generated pairs of diseases. Data in the literature concerning the causes of pre-eclampsia are abundant. However, the key mechanisms triggering this disease that are initiated by other pathological processes are thus far unknown. The aim of this work was to analyse the characteristic features of genetic networks that describe interactions between comorbid diseases, using pre-eclampsia as a case in point. Results The use of ANDSystem, Pathway Studio and STRING computer tools based on text-mining and database-mining approaches allowed us to reconstruct associative networks, representing molecular genetic interactions between genes, associated concurrently with comorbid disease pairs, including pre-eclampsia, diabetes mellitus, gestational diabetes and obesity. It was found that these associative networks statistically differed in the number of genes and interactions between them from those built for randomly chosen pairs of diseases. The associative network connecting all four diseases was composed of 16 genes (PLAT, ADIPOQ, ADRB3, LEPR, HP, TGFB1, TNFA, INS, CRP, CSRP1, IGFBP1, MBL2, ACE, ESR1, SHBG, ADA). Such an analysis allowed us to reveal differential gene risk factors for these diseases, and to propose certain, most probable, theoretical mechanisms of pre-eclampsia development in pregnant women. The mechanisms may include the following pathways: [TGFB1 or TNFA]-[IL1B]-[pre-eclampsia]; [TNFA or INS]-[NOS3]-[pre-eclampsia]; [INS]-[HSPA4 or CLU]-[pre-eclampsia]; [ACE]-[MTHFR]-[pre-eclampsia]. Conclusions For pre-eclampsia, diabetes mellitus, gestational diabetes and obesity, we showed that the size and connectivity of the associative molecular genetic networks, which describe interactions between comorbid diseases, statistically exceeded the size and connectivity of those built for randomly chosen pairs of diseases. Recently, we have shown a similar result for inversely comorbid diseases. This suggests that comorbid and inversely comorbid diseases have common features concerning structural organization of associative molecular genetic networks. PMID:25879409

  17. Paretian Poisson Processes

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2008-05-01

    Many random populations can be modeled as a countable set of points scattered randomly on the positive half-line. The points may represent magnitudes of earthquakes and tornados, masses of stars, market values of public companies, etc. In this article we explore a specific class of such random populations, which we coin 'Paretian Poisson processes'. This class is elemental in statistical physics, connecting together, in a deep and fundamental way, diverse issues including: the Poisson distribution of the Law of Small Numbers; Paretian tail statistics; the Fréchet distribution of Extreme Value Theory; the one-sided Lévy distribution of the Central Limit Theorem; scale-invariance, renormalization and fractality; resilience to random perturbations.

  18. Stochastic space interval as a link between quantum randomness and macroscopic randomness?

    NASA Astrophysics Data System (ADS)

    Haug, Espen Gaarder; Hoff, Harald

    2018-03-01

    For many stochastic phenomena, we observe statistical distributions that have fat-tails and high-peaks compared to the Gaussian distribution. In this paper, we will explain how observable statistical distributions in the macroscopic world could be related to the randomness in the subatomic world. We show that fat-tailed (leptokurtic) phenomena in our everyday macroscopic world are ultimately rooted in Gaussian - or very close to Gaussian-distributed subatomic particle randomness, but they are not, in a strict sense, Gaussian distributions. By running a truly random experiment over a three and a half-year period, we observed a type of random behavior in trillions of photons. Combining our results with simple logic, we find that fat-tailed and high-peaked statistical distributions are exactly what we would expect to observe if the subatomic world is quantized and not continuously divisible. We extend our analysis to the fact that one typically observes fat-tails and high-peaks relative to the Gaussian distribution in stocks and commodity prices and many aspects of the natural world; these instances are all observable and documentable macro phenomena that strongly suggest that the ultimate building blocks of nature are discrete (e.g. they appear in quanta).

  19. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    NASA Astrophysics Data System (ADS)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. The hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure from LHS with simulation by LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to three other simulation methods. The results show that for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
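
    A minimal sketch of the LULHS idea (Python; the exponential covariance model, grid size and correlation length are assumptions for illustration, and the LU step is realized through the Cholesky factor of the covariance matrix): Latin Hypercube Sampling stratifies the standard-normal inputs across realizations, and the lower-triangular factor imposes the target spatial correlation.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)

        # 1D grid and an assumed exponential covariance model C(h) = sigma^2 exp(-h / a)
        n, sigma2, corr_len = 50, 1.0, 10.0
        x = np.arange(n, dtype=float)
        H = np.abs(x[:, None] - x[None, :])
        C = sigma2 * np.exp(-H / corr_len)
        L = np.linalg.cholesky(C + 1e-10 * np.eye(n))    # lower-triangular factor of the covariance

        def lhs_standard_normal(n_vars, n_real, rng):
            """Latin Hypercube Sample of independent standard normals:
            for each variable, one stratified sample per realization, randomly permuted."""
            strata = np.tile(np.arange(n_real), (n_vars, 1))
            u = (rng.permuted(strata, axis=1) + rng.uniform(size=(n_vars, n_real))) / n_real
            return norm.ppf(u)                           # shape (n_vars, n_real)

        n_real = 200                                     # number of realizations
        Z = lhs_standard_normal(n, n_real, rng)          # stratified, uncorrelated inputs
        fields = L @ Z                                   # unconditional correlated random fields

        emp_C = np.cov(fields)                           # rows = grid nodes, columns = realizations
        print("target  C[0,5]:", C[0, 5])
        print("sampled C[0,5]:", emp_C[0, 5])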

  20. Effectiveness of adjunctive subgingival administration of amino acids and sodium hyaluronate gel on clinical and immunological parameters in the treatment of chronic periodontitis

    PubMed Central

    Bevilacqua, Lorenzo; Eriani, Jessica; Serroni, Ilde; Liani, Giuliana; Borelli, Violetta; Castronovo, Gaetano; Di Lenarda, Roberto

    2012-01-01

    Aims The aim of this clinical trial was to compare clinical and biochemical healing outcomes following ultrasonic mechanical instrumentation versus ultrasonic mechanical instrumentation associated with topical subgingival application of amino acids and sodium hyaluronate gel. Methods Eleven systemically healthy subjects with moderate to severe chronic periodontitis, who had four sites with pocket probing depth and clinical attachment level greater than or equal to 5 mm, were randomly assigned to two different types of treatment: two pockets were treated with ultrasonic debridement (Control Group) and two pockets with ultrasonic mechanical instrumentation associated with 0.5 ml of amino acids and sodium hyaluronate gel (Test Group). Probing depth, clinical attachment level, plaque index and bleeding on probing were recorded at baseline and at 45 and 90 days. Levels of calprotectin and myeloperoxidase activity in gingival crevicular fluid were assessed at baseline and on days 7 and 45. Results Statistical significance was found between baseline and day 45 in relation to probing depth reduction and bleeding on probing between groups for both of the tested treatments. Significant reductions in μg/sample of calprotectin and myeloperoxidase were found after 1 week, with an increase at 45 days in both groups. There were no statistically significant differences for the other variables evaluated in this study. Conclusions These data suggest that subgingival application of hyaluronic acid following ultrasonic mechanical instrumentation is beneficial for improving periodontal parameters. PMID:23087790

  1. Statistical power in parallel group point exposure studies with time-to-event outcomes: an empirical comparison of the performance of randomized controlled trials and the inverse probability of treatment weighting (IPTW) approach.

    PubMed

    Austin, Peter C; Schuster, Tibor; Platt, Robert W

    2015-10-15

    Estimating statistical power is an important component of the design of both randomized controlled trials (RCTs) and observational studies. Methods for estimating statistical power in RCTs have been well described and can be implemented simply. In observational studies, statistical methods must be used to remove the effects of confounding that can occur due to non-random treatment assignment. Inverse probability of treatment weighting (IPTW) using the propensity score is an attractive method for estimating the effects of treatment using observational data. However, sample size and power calculations have not been adequately described for these methods. We used an extensive series of Monte Carlo simulations to compare the statistical power of an IPTW analysis of an observational study with time-to-event outcomes with that of an analysis of a similarly-structured RCT. We examined the impact of four factors on the statistical power function: number of observed events, prevalence of treatment, the marginal hazard ratio, and the strength of the treatment-selection process. We found that, on average, an IPTW analysis had lower statistical power compared to an analysis of a similarly-structured RCT. The difference in statistical power increased as the magnitude of the treatment-selection model increased. The statistical power of an IPTW analysis tended to be lower than the statistical power of a similarly-structured RCT.

  2. The Random Forests Statistical Technique: An Examination of Its Value for the Study of Reading

    ERIC Educational Resources Information Center

    Matsuki, Kazunaga; Kuperman, Victor; Van Dyke, Julie A.

    2016-01-01

    Studies investigating individual differences in reading ability often involve data sets containing a large number of collinear predictors and a small number of observations. In this article, we discuss the method of Random Forests and demonstrate its suitability for addressing the statistical concerns raised by such data sets. The method is…

  3. Eliciting and Developing Teachers' Conceptions of Random Processes in a Probability and Statistics Course

    ERIC Educational Resources Information Center

    Smith, Toni M.; Hjalmarson, Margret A.

    2013-01-01

    The purpose of this study is to examine prospective mathematics specialists' engagement in an instructional sequence designed to elicit and develop their understandings of random processes. The study was conducted with two different sections of a probability and statistics course for K-8 teachers. Thirty-two teachers participated. Video analyses…

  4. Statistics of premixed flame cells

    NASA Technical Reports Server (NTRS)

    Noever, David A.

    1991-01-01

    The statistics of random cellular patterns in premixed flames are analyzed. Agreement is found with a variety of topological relations previously found for other networks, namely, Lewis's law and Aboav's law. Despite the diverse underlying physics, flame cells are shown to share a broad class of geometric properties with other random networks: metal grains, soap foams, bioconvection, and Langmuir monolayers.

  5. Asking Sensitive Questions: A Statistical Power Analysis of Randomized Response Models

    ERIC Educational Resources Information Center

    Ulrich, Rolf; Schroter, Hannes; Striegel, Heiko; Simon, Perikles

    2012-01-01

    This article derives the power curves for a Wald test that can be applied to randomized response models when small prevalence rates must be assessed (e.g., detecting doping behavior among elite athletes). These curves enable the assessment of the statistical power that is associated with each model (e.g., Warner's model, crosswise model, unrelated…

  6. Levy Matrices and Financial Covariances

    NASA Astrophysics Data System (ADS)

    Burda, Zdzislaw; Jurkiewicz, Jerzy; Nowak, Maciej A.; Papp, Gabor; Zahed, Ismail

    2003-10-01

    In a given market, financial covariances capture the intra-stock correlations and can be used to address statistically the bulk nature of the market as a complex system. We provide a statistical analysis of three SP500 covariances with evidence for raw tail distributions. We study the stability of these tails against reshuffling for the SP500 data and show that the covariance with the strongest tails is robust, with a spectral density in remarkable agreement with random Lévy matrix theory. We study the inverse participation ratio for the three covariances. The strong localization observed at both ends of the spectral density is analogous to the localization exhibited in the random Lévy matrix ensemble. We discuss two competitive mechanisms responsible for the occurrence of an extensive and delocalized eigenvalue at the edge of the spectrum: (a) the Lévy character of the entries of the correlation matrix and (b) a sort of off-diagonal order induced by underlying inter-stock correlations. (b) can be destroyed by reshuffling, while (a) cannot. We show that the stocks with the largest scattering are the least susceptible to correlations, and likely candidates for the localized states. We introduce a simple model for price fluctuations which captures behavior of the SP500 covariances. It may be of importance for assets diversification.

  7. Adhesion promotion at a homopolymer-solid interface using random heteropolymers

    NASA Astrophysics Data System (ADS)

    Simmons, Edward Read; Chakraborty, Arup K.

    1998-11-01

    We investigate the potential uses for random heteropolymers (RHPs) as adhesion promoters between a homopolymer melt and a solid surface. We consider homopolymers of monomer (segment) type A which are naturally repelled from a solid surface. To this system we add RHPs with both A and B (attractive to the surface) type monomers to promote adhesion between the two incompatible substrates. We employ Monte Carlo simulations to investigate the effects of variations in the sequence statistics of the RHPs, amount of promoter added, and strength of the segment-segment and segment-surface interaction parameters. Clearly, the parameter space in such a system is quite large, but we are able to describe, in a qualitative manner, the optimal parameters for adhesion promotion. The optimal set of parameters yield interfacial conformational statistics for the RHPs which have a relatively high adsorbed fraction and also long loops extending away from the surface that promote entanglements with the bulk homopolymer melt. In addition, we present qualitative evidence that the concentration of RHP segments per surface site plays an important role in determining the mechanism of failure (cohesive versus adhesive) at such an interface. Our results also provide the necessary input for future simulations in which the system may be strained to the limit of fracture.

  8. Generation of pseudo-random numbers

    NASA Technical Reports Server (NTRS)

    Howell, L. W.; Rheinfurth, M. H.

    1982-01-01

    Practical methods for generating acceptable random numbers from a variety of probability distributions which are frequently encountered in engineering applications are described. The speed, accuracy, and guarantee of statistical randomness of the various methods are discussed.
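
    Two of the standard recipes such a survey covers can be stated compactly (Python; which specific methods the report recommends is not restated here): inverse-transform sampling for the exponential distribution and the Box-Muller transform for the normal, both driven by a uniform pseudo-random source.

        import numpy as np

        rng = np.random.default_rng(42)           # the underlying uniform pseudo-random source

        def exponential_inverse_transform(lam, size):
            """Inverse-transform sampling: X = -ln(1 - U) / lam is Exp(lam) distributed."""
            u = rng.uniform(size=size)
            return -np.log1p(-u) / lam

        def normal_box_muller(size):
            """Box-Muller: two independent uniforms give independent N(0, 1) variates."""
            u1 = rng.uniform(size=size)
            u2 = rng.uniform(size=size)
            return np.sqrt(-2.0 * np.log1p(-u1)) * np.cos(2.0 * np.pi * u2)

        x = exponential_inverse_transform(lam=2.0, size=100_000)
        z = normal_box_muller(size=100_000)
        print("Exp(2): sample mean (expect 0.5):", x.mean())
        print("N(0,1): sample mean, variance   :", z.mean(), z.var())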

  9. Acupressure Improves the Weaning Indices of Tidal Volumes and Rapid Shallow Breathing Index in Stable Coma Patients Receiving Mechanical Ventilation: Randomized Controlled Trial

    PubMed Central

    Maa, Suh-Hwa; Wang, Chiu-Hua; Hsu, Kuang-Hung; Lin, Horng-Chyuan; Yee, Brian; MacDonald, Karen

    2013-01-01

    Background. Acupressure has been shown to improve respiratory parameters. We investigated the effects of acupressure on weaning indices in stable coma patients receiving mechanical ventilation. Methods. Patients were randomly allocated to one of three treatments: standard care with adjunctive acupressure on one (n = 32) or two days (n = 31) and standard care (n = 31). Acupressure in the form of 10 minutes of bilateral stimulation at five acupoints was administered per treatment session. Weaning indices were collected on two days before, right after, and at 0.5 hrs, 1 hr, 1.5 hrs, 2 hrs, 2.5 hrs, 3 hrs, 3.5 hrs, and 4 hrs after the start of treatment. Results. There were statistically significant improvements in tidal volumes and index of rapid shallow breathing in the one-day and two-day adjunctive acupressure study arms compared to the standard care arm immediately after acupressure and persisting until 0.5, 1 hr, and 2 hrs after adjustment for covariates. Conclusions. In the stable ventilated coma patient, adjunctive acupressure contributes to improvements in tidal volumes and the index of rapid shallow breathing, the two indices most critical for weaning patients from mechanical ventilation. These effects tend to be immediate and likely to be sustained for 1 to 2 hours. PMID:23710234

  10. Thermodynamics and mechanics of stretch-induced crystallization in rubbers

    NASA Astrophysics Data System (ADS)

    Guo, Qiang; Zaïri, Fahmi; Guo, Xinglin

    2018-05-01

    The aim of the present paper is to provide a quantitative prediction of the stretch-induced crystallization in natural rubber, the exclusive reason for its history-dependent thermomechanical features. A constitutive model based on a micromechanism inspired molecular chain approach is formulated within the context of the thermodynamic framework. The molecular configuration of the partially crystallized single chain is analyzed and calculated by means of some statistical mechanical methods. The random thermal oscillation of the crystal orientation, considered as a continuous random variable, is treated by means of a representative angle. The physical expression of the chain free energy is derived according to a two-step strategy by separating crystallization and stretching. This strategy ensures that the stretch-induced part of the thermodynamic crystallization force is null at the initial instant and allows, without any additional constraint, the formulation of a simple linear relationship for the crystallinity evolution law. The model contains very few physically interpretable material constants to simulate the complex mechanism: two chain-scale constants, one crystallinity kinetics constant, three thermodynamic constants related to the newly formed crystallites, and a function controlling the crystal orientation with respect to the chain. The model is used to discuss some important aspects of the micromechanism and the macroresponse under the equilibrium state and the nonequilibrium state involved during stretching and recovery, and continuous relaxation.

  11. ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prigogine, I.; Balescu, R.; Henin, F.

    1960-12-01

    Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)

  12. A statistical framework for protein quantitation in bottom-up MS-based proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas

    2009-08-15

    Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and widespread, informative, missing data. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model for protein abundance in terms of peptide peak intensities, applicable to both label-based and label-free quantitation experiments. The model allows for both random and censoring missingness mechanisms and provides naturally for protein-level estimates and confidence measures. The model is also used to derive automated filtering and imputation routines. Three LC-MS datasets are used to illustrate the methods. Availability: The software has been made available in the open-source proteomics platform DAnTE (Polpitiya et al. (2008)) (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu

  13. Fluid leakage near the percolation threshold

    NASA Astrophysics Data System (ADS)

    Dapp, Wolf B.; Müser, Martin H.

    2016-02-01

    Percolation is a concept widely used in many fields of research and refers to the propagation of substances through porous media (e.g., coffee filtering), or the behaviour of complex networks (e.g., spreading of diseases). Percolation theory asserts that most percolative processes are universal, that is, the emergent power laws only depend on the general, statistical features of the macroscopic system, but not on specific details of the random realisation. In contrast, our computer simulations of the leakage through a seal, applying common assumptions of elasticity, contact mechanics, and fluid dynamics, show that the critical behaviour (how the flow ceases near the sealing point) solely depends on the microscopic details of the last constriction. It appears fundamentally impossible to accurately predict from statistical properties of the surfaces alone how strongly we have to tighten a water tap to make it stop dripping and also how it starts dripping once we loosen it again.

  14. Effects of catgut-embedding acupuncture technique on nitric oxide levels and blood pressure in patients with essential hypertension

    NASA Astrophysics Data System (ADS)

    Suhana; Srilestari, A.; Marbun, M. B. H.; Mihardja, H.

    2017-08-01

    Hypertension is a common health problem and its prevalence in Indonesia is quite high (31.7%). Catgut embedding, an acupuncture technique, is known to reduce blood pressure; however, no study has confirmed the underlying mechanism. This study examines the effect of catgut embedding on serum nitric oxide (NO) concentration and blood pressure in patients with essential hypertension. Forty hypertension patients were randomly assigned to two groups: the control group received anti-hypertensive drugs, whereas the case group received anti-hypertensive drugs and catgut embedding. Results showed a statistically significant mean difference in NO concentration (p < 0.05) and a statistically and clinically significant mean difference in systolic and diastolic blood pressure between the two groups (p < 0.05). The results confirm that catgut embedding can influence serum NO concentration and blood pressure in essential hypertension patients.

  15. From plant traits to plant communities: a statistical mechanistic approach to biodiversity.

    PubMed

    Shipley, Bill; Vile, Denis; Garnier, Eric

    2006-11-03

    We developed a quantitative method, analogous to those used in statistical mechanics, to predict how biodiversity will vary across environments, which plant species from a species pool will be found in which relative abundances in a given environment, and which plant traits determine community assembly. This provides a scaling from plant traits to ecological communities while bypassing the complications of population dynamics. Our method treats community development as a sorting process involving species that are ecologically equivalent except with respect to particular functional traits, which leads to a constrained random assembly of species; the relative abundance of each species adheres to a general exponential distribution as a function of its traits. Using data for eight functional traits of 30 herbaceous species and community-aggregated values of these traits in 12 sites along a 42-year chronosequence of secondary succession, we predicted 94% of the variance in the relative abundances.
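
    The sorting process described above reduces, computationally, to a maximum-entropy fit: relative abundances p_i proportional to exp(lambda . t_i) are chosen so that the community-aggregated traits match their observed values. The sketch below is a generic version of that calculation under invented data (the trait matrix and target trait means are placeholders, and scipy's BFGS optimizer stands in for whatever fitting procedure the authors used):

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(3)

        S, T = 30, 8                               # species in the pool, number of functional traits
        traits = rng.normal(size=(S, T))           # invented trait matrix (one row per species)
        t_bar = traits[:5].mean(axis=0)            # invented community-aggregated trait targets

        def dual_objective(lam):
            # log Z(lambda) - lambda . t_bar ; its minimizer matches the trait constraints.
            z = traits @ lam
            return np.log(np.exp(z - z.max()).sum()) + z.max() - lam @ t_bar

        lam = minimize(dual_objective, np.zeros(T), method="BFGS").x

        z = traits @ lam
        p = np.exp(z - z.max())
        p /= p.sum()                               # predicted relative abundances, p_i ~ exp(lam . t_i)

        print("max constraint residual:", np.abs(traits.T @ p - t_bar).max())
        print("largest predicted relative abundance:", p.max())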

  16. Statistical complexity measure of pseudorandom bit generators

    NASA Astrophysics Data System (ADS)

    González, C. M.; Larrondo, H. A.; Rosso, O. A.

    2005-08-01

    Pseudorandom number generators (PRNGs) are extensively used in Monte Carlo simulations, gambling machines and cryptography as substitutes for ideal random number generators (RNGs). Each application imposes different statistical requirements on PRNGs. As L’Ecuyer clearly states, “the main goal for Monte Carlo methods is to reproduce the statistical properties on which these methods are based, whereas for gambling machines and cryptology, observing the sequence of output values for some time should provide no practical advantage for predicting the forthcoming numbers better than by just guessing at random”. In accordance with these different applications, several statistical test suites have been developed to analyze the sequences generated by PRNGs. In a recent paper a new statistical complexity measure [Phys. Lett. A 311 (2003) 126] was defined. Here we propose this measure as a randomness quantifier for PRNGs. The test is applied to three very well known and widely tested PRNGs available in the literature, all of them based on mathematical algorithms. Another PRNG, based on the Lorenz 3D chaotic dynamical system, is also analyzed. PRNGs based on chaos may be considered as models for physical noise sources, and important new results have recently been reported. All the design steps of this PRNG are described, and each stage increases the PRNG randomness using a different strategy. It is shown that the MPR statistical complexity measure is capable of quantifying this randomness improvement. The PRNG based on the chaotic 3D Lorenz dynamical system is also evaluated using traditional digital signal processing tools for comparison.

  17. The Statistical Mechanics of Ideal MHD Turbulence

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    2003-01-01

    Turbulence is a universal, nonlinear phenomenon found in all energetic fluid and plasma motion. In particular, understanding magnetohydrodynamic (MHD) turbulence and incorporating its effects in the computation and prediction of the flow of ionized gases in space, for example, are great challenges that must be met if such computations and predictions are to be meaningful. Although a general solution to the "problem of turbulence" does not exist in closed form, numerical integrations allow us to explore the phase space of solutions for both ideal and dissipative flows. For homogeneous, incompressible turbulence, Fourier methods are appropriate, and phase space is defined by the Fourier coefficients of the physical fields. In the case of ideal MHD flows, a fairly robust statistical mechanics has been developed, in which the symmetry and ergodic properties of phase space are understood. A discussion of these properties will illuminate our principal discovery: coherent structure and randomness co-exist in ideal MHD turbulence. For dissipative flows, as opposed to ideal flows, progress beyond the dimensional analysis of Kolmogorov has been difficult. Here, some possible future directions that draw on the ideal results will also be discussed. Our conclusion will be that while ideal turbulence is now well understood, real turbulence still presents great challenges.

  18. Statistical mechanics of high-density bond percolation

    NASA Astrophysics Data System (ADS)

    Timonin, P. N.

    2018-05-01

    High-density (HD) percolation describes the percolation of specific κ-clusters, which are compact sets of sites each connected to at least κ filled nearest-neighbor sites. It takes place in the classical patterns of independently distributed sites or bonds in which the ordinary percolation transition also exists. Hence, the study of the series of κ-type HD percolations amounts to a description of the classical clusters' structure, for which κ-clusters constitute κ-cores nested one into another. Such data are needed for the description of a number of physical, biological, and information properties of complex systems on random lattices, graphs, and networks. They range from magnetic properties of semiconductor alloys to anomalies in supercooled water and clustering in biological and social networks. Here we present a statistical mechanics approach to study HD bond percolation on an arbitrary graph. It is shown that the generating function for the κ-clusters' size distribution can be obtained from the partition function of a specific q-state Potts-Ising model in the q → 1 limit. Using this approach we find exact κ-clusters' size distributions for the Bethe lattice and the Erdős-Rényi graph. The application of the method to Euclidean lattices is also discussed.
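
    Since the κ-clusters are the connected pieces of the κ-core of the percolation configuration, their size statistics on a finite graph can also be explored by brute force. The sketch below (Python with networkx; graph size, mean degree and bond probability are arbitrary) dilutes the bonds of an Erdős-Rényi graph and extracts the κ-core; it is a plain Monte Carlo illustration, not the Potts-Ising generating-function method of the paper.

        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(7)
        N, c = 2000, 6.0                      # nodes and mean degree of the substrate graph
        p_bond = 0.7                          # bond-occupation probability
        kappa = 3                             # order of the high-density (kappa-core) percolation

        G = nx.gnp_random_graph(N, c / N, seed=11)

        # Bond percolation: keep each edge independently with probability p_bond.
        H = nx.Graph()
        H.add_nodes_from(G.nodes)
        H.add_edges_from(e for e in G.edges if rng.random() < p_bond)

        # kappa-clusters = connected components of the kappa-core of the diluted graph.
        core = nx.k_core(H, k=kappa)
        sizes = sorted((len(cc) for cc in nx.connected_components(core)), reverse=True)
        print("number of kappa-clusters      :", len(sizes))
        print("largest kappa-cluster fraction:", (sizes[0] / N) if sizes else 0.0)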

  19. Comparison between effectiveness of Mechanical and Manual Traction combined with mobilization and exercise therapy in Patients with Cervical Radiculopathy.

    PubMed

    Bukhari, Syed Rehan Iftikhar; Shakil-Ur-Rehman, Syed; Ahmad, Shakeel; Naeem, Aamer

    2016-01-01

    Cervical radiculopathy is a common neuro-musculo-skeletal disorder causing pain and disability. Traction is part of evidence-based manual physical therapy management; its mechanical nature, the type of traction and the parameters governing its application still need to be explored further through research. Our objective was to determine the effects of mechanical versus manual traction, combined with segmental mobilization and exercise therapy, in the physical therapy management of patients with cervical radiculopathy. This randomized controlled trial was conducted at the department of physical therapy and rehabilitation, Rathore Hospital Faisalabad, from February to July 2015. Inclusion criteria were male and female patients with evident symptoms of cervical spine radiculopathy and age between 20 and 70 years. Exclusion criteria were a history of trauma, neck pain without radiculopathy, and age below 20 or above 70. A total of 72 patients with cervical radiculopathy were screened against the inclusion criteria; 42 patients were randomly selected and allocated to two groups by a toss-and-trial method, and 36 patients completed the study while 6 dropped out. Mechanical traction was applied in group A and manual traction in group B, along with the common intervention of segmental mobilization and exercise therapy in both groups, for 6 weeks. Patient outcomes were assessed by the self-reported NPRS and NDI at baseline and after completion of the 6-week program delivered 3 days per week. The data were analyzed in SPSS version 21, and a paired t-test was applied at the 95% significance level to determine the statistical difference between the two groups. Clinically, the group treated with mechanical traction managed pain (mean pre 6.26, mean post 1.43) and disability (mean pre 24.43, mean post 7.26) more effectively than the group treated with manual traction (pain mean pre 6.80, mean post 3.85; disability mean pre 21.92, mean post 12.19). Statistically, the results of both the mechanical and manual traction techniques were significant in groups A and B for pain and disability (p-value less than 0.05). Patients with cervical radiculopathy treated with mechanical traction, segmental mobilization, and exercise therapy manage pain and disability more effectively than those treated with manual traction, segmental mobilization, and exercise therapy.

  20. Quantifying, displaying and accounting for heterogeneity in the meta-analysis of RCTs using standard and generalised Q statistics

    PubMed Central

    2011-01-01

    Background Clinical researchers have often preferred to use a fixed effects model for the primary interpretation of a meta-analysis. Heterogeneity is usually assessed via the well known Q and I2 statistics, along with the random effects estimate they imply. In recent years, alternative methods for quantifying heterogeneity have been proposed, that are based on a 'generalised' Q statistic. Methods We review 18 IPD meta-analyses of RCTs into treatments for cancer, in order to quantify the amount of heterogeneity present and also to discuss practical methods for explaining heterogeneity. Results Differing results were obtained when the standard Q and I2 statistics were used to test for the presence of heterogeneity. The two meta-analyses with the largest amount of heterogeneity were investigated further, and on inspection the straightforward application of a random effects model was not deemed appropriate. Compared to the standard Q statistic, the generalised Q statistic provided a more accurate platform for estimating the amount of heterogeneity in the 18 meta-analyses. Conclusions Explaining heterogeneity via the pre-specification of trial subgroups, graphical diagnostic tools and sensitivity analyses produced a more desirable outcome than an automatic application of the random effects model. Generalised Q statistic methods for quantifying and adjusting for heterogeneity should be incorporated as standard into statistical software. Software is provided to help achieve this aim. PMID:21473747
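
    For reference, the standard Q, I² and DerSimonian-Laird computations the paper starts from fit in a few lines; the per-trial effect estimates and standard errors below are invented placeholders, and the 'generalised' Q statistic, which uses different weights, is not reproduced here.

        import numpy as np
        from scipy.stats import chi2

        # Invented per-trial effect estimates (e.g. log hazard ratios) and standard errors.
        y  = np.array([-0.30, -0.10, 0.05, -0.45, -0.20, 0.10])
        se = np.array([ 0.12,  0.20, 0.15,  0.25,  0.10, 0.18])

        w = 1.0 / se**2                               # fixed-effect (inverse-variance) weights
        y_fixed = np.sum(w * y) / np.sum(w)

        Q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q
        df = len(y) - 1
        p_het = chi2.sf(Q, df)
        I2 = max(0.0, (Q - df) / Q) * 100.0           # I^2 as a percentage

        # DerSimonian-Laird between-trial variance and the random-effects estimate it implies.
        tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_re = 1.0 / (se**2 + tau2)
        y_random = np.sum(w_re * y) / np.sum(w_re)

        print(f"Q = {Q:.2f} (df={df}, p={p_het:.3f}), I^2 = {I2:.1f}%")
        print(f"fixed effect = {y_fixed:.3f}, tau^2 = {tau2:.4f}, random effects = {y_random:.3f}")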

  1. Effectiveness of the Pilates Method in the Treatment of Chronic Mechanical Neck Pain: A Randomized Controlled Trial.

    PubMed

    de Araujo Cazotti, Luciana; Jones, Anamaria; Roger-Silva, Diego; Ribeiro, Luiza Helena Coutinho; Natour, Jamil

    2018-05-09

    To assess the effectiveness of the Pilates method on pain, function, quality of life, and consumption of pain medication in patients with mechanical neck pain. The design was a randomized controlled trial, with a blinded assessor and intention-to-treat analysis. The study took place in the outpatient clinic of the rheumatology department, referral center. Sixty-four patients with chronic mechanical neck pain were randomly allocated into 2 groups: the Pilates group (PG) and control group (CG). The PG attended 2 sessions of Pilates per week, for 12 weeks. The protocol included Pilates exercises performed on a mat and on equipment and was adapted depending on the physical fitness of each participant; the repetitions varied from 6 to 12, respecting patient reports of fatigue and pain, using a single series for each exercise. The CG received only the standard pharmacological treatment. Both groups were instructed to use acetaminophen 750 mg if necessary. Patients were evaluated at baseline after 45, 90, and 180 days. We used the numerical pain scale (NPS) for pain; the neck disability index (NDI) for function, and the SF-36 questionnaire for quality of life. The groups were homogeneous at baseline, the only exception being body mass index (BMI), with the PG showing higher BMI. Regarding the assessment between groups over time (ANOVA), statistical differences were identified for pain (p <0.001), function (p <0.001) and the SF-36 (functional capacity, p=0.019; pain, p<0.001; general health, p=0.022; vitality, p <0.001; mental health, p = 0.012) with the PG consistently achieving better results. The drug consumption was lower in patients in the PG (p = 0.037). This trial demonstrated the effectiveness of the Pilates method for the treatment of chronic mechanical neck pain, resulting in improvement of pain, function, quality of life, and reduction of the use of analgesics. Copyright © 2018. Published by Elsevier Inc.

  2. A randomized clinical trial of continuous aspiration of subglottic secretions in cardiac surgery patients.

    PubMed

    Kollef, M H; Skubas, N J; Sundt, T M

    1999-11-01

    To determine whether the application of continuous aspiration of subglottic secretions (CASS) is associated with a decreased incidence of ventilator-associated pneumonia (VAP). Prospective clinical trial. Cardiothoracic ICU (CTICU) of Barnes-Jewish Hospital, St. Louis, a university-affiliated teaching hospital. Three hundred forty-three patients undergoing cardiac surgery and requiring mechanical ventilation in the CTICU. Patients were assigned to receive either CASS, using a specially designed endotracheal tube (Hi-Lo Evac; Mallinckrodt Inc; Athlone, Ireland), or routine postoperative medical care without CASS. One hundred sixty patients were assigned to receive CASS, and 183 were assigned to receive routine postoperative medical care without CASS. The two groups were similar at the time of randomization with regard to demographic characteristics, surgical procedures performed, and severity of illness. Risk factors for the development of VAP were also similar during the study period for both treatment groups. VAP was seen in 8 patients (5.0%) receiving CASS and in 15 patients (8.2%) receiving routine postoperative medical care without CASS (relative risk, 0.61; 95% confidence interval, 0.27 to 1.40; p = 0.238). Episodes of VAP occurred statistically later among patients receiving CASS ([mean +/- SD] 5.6 +/- 2.3 days) than among patients who did not receive CASS (2.9 +/- 1.2 days) (p = 0.006). No statistically significant differences for hospital mortality, overall duration of mechanical ventilation, lengths of stay in the hospital or CTICU, or acquired organ system derangements were found between the two treatment groups. No complications related to CASS were observed in the intervention group. Our findings suggest that CASS can be safely administered to patients undergoing cardiac surgery. The occurrence of VAP can be significantly delayed among patients undergoing cardiac surgery using this simple-to-apply technique.

  3. Rare Event Simulation for T-cell Activation

    NASA Astrophysics Data System (ADS)

    Lipsmeier, Florian; Baake, Ellen

    2009-02-01

    The problem of statistical recognition is considered, as it arises in immunobiology, namely, the discrimination of foreign antigens against a background of the body's own molecules. The precise mechanism of this foreign-self-distinction, though one of the major tasks of the immune system, continues to be a fundamental puzzle. Recent progress has been made by van den Berg, Rand, and Burroughs (J. Theor. Biol. 209:465-486, 2001), who modelled the probabilistic nature of the interaction between the relevant cell types, namely, T-cells and antigen-presenting cells (APCs). Here, the stochasticity is due to the random sample of antigens present on the surface of every APC, and to the random receptor type that characterises individual T-cells. It has been shown previously (van den Berg et al. in J. Theor. Biol. 209:465-486, 2001; Zint et al. in J. Math. Biol. 57:841-861, 2008) that this model, though highly idealised, is capable of reproducing important aspects of the recognition phenomenon, and of explaining them on the basis of stochastic rare events. These results were obtained with the help of a refined large deviation theorem and were thus asymptotic in nature. Simulations have, so far, been restricted to the straightforward simple sampling approach, which does not allow for sample sizes large enough to address more detailed questions. Building on the available large deviation results, we develop an importance sampling technique that allows for a convenient exploration of the relevant tail events by means of simulation. With its help, we investigate the mechanism of statistical recognition in some depth. In particular, we illustrate how a foreign antigen can stand out against the self background if it is present in sufficiently many copies, although no a priori difference between self and nonself is built into the model.
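
    The underlying trick can be shown in miniature: to estimate a tail probability far beyond the reach of simple sampling, draw from a shifted (exponentially tilted) proposal and reweight each sample by the likelihood ratio. The sketch below estimates a Gaussian tail probability; it is a generic illustration, not the T-cell activation model itself.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        a, n = 5.0, 100_000                       # target: P(X > a) for X ~ N(0, 1)

        # Simple sampling: essentially hopeless, the event has probability ~2.9e-7.
        x = rng.standard_normal(n)
        print("simple sampling estimate    :", np.mean(x > a))

        # Importance sampling: draw from N(a, 1) and reweight by the likelihood ratio
        # phi(x) / phi(x - a) = exp(-a*x + a^2/2).
        y = rng.normal(loc=a, scale=1.0, size=n)
        weights = np.exp(-a * y + 0.5 * a * a)
        print("importance sampling estimate:", np.mean(weights * (y > a)))
        print("exact value                 :", norm.sf(a))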

  4. Influences of system uncertainties on the numerical transfer path analysis of engine systems

    NASA Astrophysics Data System (ADS)

    Acri, A.; Nijman, E.; Acri, A.; Offner, G.

    2017-10-01

    Practical mechanical systems operate with some degree of uncertainty. In numerical models, uncertainties can result from poorly known or variable parameters, from geometrical approximation, from discretization or numerical errors, from uncertain inputs or from rapidly changing forcing that can be best described in a stochastic framework. Recently, random matrix theory was introduced to take parameter uncertainties into account in numerical modeling problems. In particular, in this paper Wishart random matrix theory is applied to a multi-body dynamic system to generate random variations of the properties of system components. Multi-body dynamics is a powerful numerical tool largely implemented during the design of new engines. In this paper the influence of model parameter variability on the results obtained from the multi-body simulation of engine dynamics is investigated. The aim is to define a methodology to properly assess and rank system sources when dealing with uncertainties. Particular attention is paid to the influence of these uncertainties on the analysis and the assessment of the different engine vibration sources. The effects of different levels of uncertainty are illustrated using a representative numerical powertrain model. A numerical transfer path analysis, based on system dynamic substructuring, is used to derive and assess the internal engine vibration sources. The results obtained from this analysis are used to derive correlations between parameter uncertainties and the statistical distribution of results. The derived statistical information can be used to advance the knowledge of the multi-body analysis and the assessment of system sources when uncertainties in model parameters are considered.
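
    As a rough illustration of the random-matrix idea, the sketch below draws normalized Wishart perturbations of a nominal positive-definite system matrix, so that the nominal matrix is recovered on average and the spread shrinks as the degrees-of-freedom parameter p grows. The 2x2 stiffness matrix, the value of p and the sampling scheme are assumptions for illustration, not the paper's multi-body model.

    ```python
    # Wishart-type random perturbation of a nominal positive-definite matrix (illustrative sketch).
    import numpy as np

    rng = np.random.default_rng(1)

    def wishart_perturbed(K0, p, n_samples):
        """Draw K = L G L^T with G ~ Wishart(p, I/p), so E[G] = I and hence E[K] = K0."""
        n = K0.shape[0]
        L = np.linalg.cholesky(K0)
        samples = []
        for _ in range(n_samples):
            X = rng.standard_normal((p, n)) / np.sqrt(p)   # rows are i.i.d. N(0, I/p)
            G = X.T @ X                                    # normalized Wishart matrix, E[G] = I
            samples.append(L @ G @ L.T)
        return np.array(samples)

    K0 = np.array([[4.0, -1.0], [-1.0, 3.0]])              # assumed nominal 2x2 stiffness matrix
    Ks = wishart_perturbed(K0, p=50, n_samples=2000)
    print("sample mean:\n", Ks.mean(axis=0))                # close to K0
    print("std of K[0, 0]:", Ks[:, 0, 0].std())             # dispersion shrinks as p grows
    ```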

  5. Blessing of dimensionality: mathematical foundations of the statistical physics of data.

    PubMed

    Gorban, A N; Tyukin, I Y

    2018-04-28

    The concentrations of measure phenomena were discovered as the mathematical background to statistical mechanics at the end of the nineteenth/beginning of the twentieth century and have been explored in mathematics ever since. At the beginning of the twenty-first century, it became clear that the proper utilization of these phenomena in machine learning might transform the curse of dimensionality into the blessing of dimensionality. This paper summarizes recently discovered phenomena of measure concentration which drastically simplify some machine learning problems in high dimension, and allow us to correct legacy artificial intelligence systems. The classical concentration of measure theorems state that i.i.d. random points are concentrated in a thin layer near a surface (a sphere or equators of a sphere, an average or median-level set of energy or another Lipschitz function, etc.). The new stochastic separation theorems describe the thin structure of these thin layers: the random points are not only concentrated in a thin layer but are all linearly separable from the rest of the set, even for exponentially large random sets. The linear functionals for separation of points can be selected in the form of the linear Fisher's discriminant. All artificial intelligence systems make errors. Non-destructive correction requires separation of the situations (samples) with errors from the samples corresponding to correct behaviour by a simple and robust classifier. The stochastic separation theorems provide us with such classifiers and determine a non-iterative (one-shot) procedure for their construction. This article is part of the theme issue 'Hilbert's sixth problem'. © 2018 The Author(s).

  6. Blessing of dimensionality: mathematical foundations of the statistical physics of data

    NASA Astrophysics Data System (ADS)

    Gorban, A. N.; Tyukin, I. Y.

    2018-04-01

    The concentrations of measure phenomena were discovered as the mathematical background to statistical mechanics at the end of the nineteenth/beginning of the twentieth century and have been explored in mathematics ever since. At the beginning of the twenty-first century, it became clear that the proper utilization of these phenomena in machine learning might transform the curse of dimensionality into the blessing of dimensionality. This paper summarizes recently discovered phenomena of measure concentration which drastically simplify some machine learning problems in high dimension, and allow us to correct legacy artificial intelligence systems. The classical concentration of measure theorems state that i.i.d. random points are concentrated in a thin layer near a surface (a sphere or equators of a sphere, an average or median-level set of energy or another Lipschitz function, etc.). The new stochastic separation theorems describe the thin structure of these thin layers: the random points are not only concentrated in a thin layer but are all linearly separable from the rest of the set, even for exponentially large random sets. The linear functionals for separation of points can be selected in the form of the linear Fisher's discriminant. All artificial intelligence systems make errors. Non-destructive correction requires separation of the situations (samples) with errors from the samples corresponding to correct behaviour by a simple and robust classifier. The stochastic separation theorems provide us with such classifiers and determine a non-iterative (one-shot) procedure for their construction. This article is part of the theme issue `Hilbert's sixth problem'.
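
    The separability claim can be probed numerically with a small sketch. The test below uses i.i.d. Gaussian points (an assumption; the theorems cover broader classes of distributions) and the simplest Fisher-type linear functional, comparing the inner product with a random test point against a fixed fraction of that point's squared norm: in low dimension the test point is rarely separable from a large random set, while in higher dimension separation becomes the rule.

    ```python
    # Stochastic separation of a random point from a large random set (illustrative sketch).
    import numpy as np

    rng = np.random.default_rng(2)

    def separation_rate(dim, n_other=5_000, eps=0.2, trials=100):
        """Fraction of trials in which x is separated from all n_other points by f(z) = <x, z>."""
        successes = 0
        for _ in range(trials):
            x = rng.standard_normal(dim)
            Y = rng.standard_normal((n_other, dim))
            threshold = (1.0 - eps) * (x @ x)
            if np.all(Y @ x < threshold):
                successes += 1
        return successes / trials

    for d in (10, 20, 50):
        print(f"dimension {d:3d}: separation rate ≈ {separation_rate(d):.2f}")
    ```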

  7. A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components

    NASA Technical Reports Server (NTRS)

    Abernethy, K.

    1986-01-01

    The incorporation of a number of additional capabilities into an existing Weibull analysis computer program and the results of a Monte Carlo computer simulation study to evaluate the usefulness of the Weibull methods using samples with a very small number of failures and extensive censoring are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs that are used in the SSME Weibull analysis are described. The documented methods were supplemented by adding computer calculations of approximate (using iterative methods) confidence intervals for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described. The simulation program and the techniques used in it are also described. Simulation results are tabulated for various combinations of Weibull shape parameters and the numbers of failures in the samples.
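
    A condensed sketch of the simulation idea, not the original program, is given below: a small Weibull failure sample is censored at independent uniform times, and the shape and scale are re-estimated by maximum likelihood from the censored data. The sample size, true parameters and censoring window are assumed values.

    ```python
    # Weibull sample with random (uniform) censoring and a censored-data maximum likelihood fit.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(3)

    def simulate(shape=1.5, scale=1.0, n=10, t_max=2.0):
        t_fail = scale * rng.weibull(shape, size=n)
        t_cens = rng.uniform(0.0, t_max, size=n)       # random censoring times
        time = np.minimum(t_fail, t_cens)
        failed = t_fail <= t_cens
        return time, failed

    def neg_log_lik(params, time, failed):
        k, lam = np.exp(params)                        # log parametrization keeps k, lam > 0
        z = time / lam
        log_pdf = np.log(k / lam) + (k - 1.0) * np.log(z) - z**k   # density for observed failures
        log_surv = -(z**k)                                         # log-survival for censored units
        return -(np.sum(log_pdf[failed]) + np.sum(log_surv[~failed]))

    time, failed = simulate()
    fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(time, failed))
    print("failures observed:", int(failed.sum()), "of", failed.size)
    print("MLE shape, scale :", np.exp(fit.x))
    ```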

  8. Valid statistical inference methods for a case-control study with missing data.

    PubMed

    Tian, Guo-Liang; Zhang, Chi; Jiang, Xuejun

    2018-04-01

    The main objective of this paper is to derive the valid sampling distribution of the observed counts in a case-control study with missing data under the assumption of missing at random by employing the conditional sampling method and the mechanism augmentation method. The proposed sampling distribution, called the case-control sampling distribution, can be used to calculate the standard errors of the maximum likelihood estimates of parameters via the Fisher information matrix and to generate independent samples for constructing small-sample bootstrap confidence intervals. Theoretical comparisons of the new case-control sampling distribution with two existing sampling distributions exhibit a large difference. Simulations are conducted to investigate the influence of the three different sampling distributions on statistical inferences. One finding is that the conclusion by the Wald test for testing independence under the two existing sampling distributions could be completely different (even contradictory) from that of the Wald test for testing the equality of the success probabilities in control/case groups under the proposed distribution. A real cervical cancer data set is used to illustrate the proposed statistical methods.

  9. Propensity score to detect baseline imbalance in cluster randomized trials: the role of the c-statistic.

    PubMed

    Leyrat, Clémence; Caille, Agnès; Foucher, Yohann; Giraudeau, Bruno

    2016-01-22

    Despite randomization, baseline imbalance and confounding bias may occur in cluster randomized trials (CRTs). Covariate imbalance may jeopardize the validity of statistical inferences if it occurs on prognostic factors. Thus, the diagnosis of such an imbalance is essential for adjusting the statistical analysis if required. We developed a tool based on the c-statistic of the propensity score (PS) model to detect global baseline covariate imbalance in CRTs and assess the risk of confounding bias. We performed a simulation study to assess the performance of the proposed tool and applied this method to analyze the data from 2 published CRTs. The proposed method had good performance for large sample sizes (n = 500 per arm) and when the number of unbalanced covariates was not too small as compared with the total number of baseline covariates (≥40% of unbalanced covariates). We also provide a strategy for preselection of the covariates that need to be included in the PS model to enhance imbalance detection. The proposed tool could be useful in deciding whether covariate adjustment is required before performing statistical analyses of CRTs.
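
    A minimal sketch of the diagnostic, under simplifying assumptions (individual-level data rather than the cluster structure of a CRT, made-up covariates and imbalance), is to fit a propensity-score model for arm membership and read off its c-statistic: values near 0.5 suggest balance, and values well above 0.5 flag global baseline imbalance.

    ```python
    # c-statistic of a propensity-score model as a global imbalance diagnostic (toy sketch).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(4)
    n_per_arm = 500
    X_control = rng.normal(0.0, 1.0, size=(n_per_arm, 5))
    X_treated = rng.normal(0.0, 1.0, size=(n_per_arm, 5))
    X_treated[:, :2] += 0.3                        # assumed imbalance on 2 of the 5 baseline covariates

    X = np.vstack([X_control, X_treated])
    arm = np.repeat([0, 1], n_per_arm)

    ps_model = LogisticRegression(max_iter=1000).fit(X, arm)
    c_stat = roc_auc_score(arm, ps_model.predict_proba(X)[:, 1])
    print(f"c-statistic of the propensity-score model: {c_stat:.3f}")
    ```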

  10. Data-Driven Lead-Acid Battery Prognostics Using Random Survival Forests

    DTIC Science & Technology

    2014-10-02

    Kogalur, Blackstone , & Lauer, 2008; Ishwaran & Kogalur, 2010). Random survival forest is a sur- vival analysis extension of Random Forests (Breiman, 2001...Statistics & probability letters, 80(13), 1056–1064. Ishwaran, H., Kogalur, U. B., Blackstone , E. H., & Lauer, M. S. (2008). Random survival forests. The

  11. Investigating the Randomness of Numbers

    ERIC Educational Resources Information Center

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…

  12. Nonlinear Dynamics, Chaotic and Complex Systems

    NASA Astrophysics Data System (ADS)

    Infeld, E.; Zelazny, R.; Galkowski, A.

    2011-04-01

    Part I. Dynamic Systems Bifurcation Theory and Chaos: 1. Chaos in random dynamical systems V. M. Gunldach; 2. Controlling chaos using embedded unstable periodic orbits: the problem of optimal periodic orbits B. R. Hunt and E. Ott; 3. Chaotic tracer dynamics in open hydrodynamical flows G. Karolyi, A. Pentek, T. Tel and Z. Toroczkai; 4. Homoclinic chaos L. P. Shilnikov; Part II. Spatially Extended Systems: 5. Hydrodynamics of relativistic probability flows I. Bialynicki-Birula; 6. Waves in ionic reaction-diffusion-migration systems P. Hasal, V. Nevoral, I. Schreiber, H. Sevcikova, D. Snita, and M. Marek; 7. Anomalous scaling in turbulence: a field theoretical approach V. Lvov and I. Procaccia; 8. Abelian sandpile cellular automata M. Markosova; 9. Transport in an incompletely chaotic magnetic field F. Spineanu; Part III. Dynamical Chaos Quantum Physics and Foundations Of Statistical Mechanics: 10. Non-equilibrium statistical mechanics and ergodic theory L. A. Bunimovich; 11. Pseudochaos in statistical physics B. Chirikov; 12. Foundations of non-equilibrium statistical mechanics J. P. Dougherty; 13. Thermomechanical particle simulations W. G. Hoover, H. A. Posch, C. H. Dellago, O. Kum, C. G. Hoover, A. J. De Groot and B. L. Holian; 14. Quantum dynamics on a Markov background and irreversibility B. Pavlov; 15. Time chaos and the laws of nature I. Prigogine and D. J. Driebe; 16. Evolutionary Q and cognitive systems: dynamic entropies and predictability of evolutionary processes W. Ebeling; 17. Spatiotemporal chaos information processing in neural networks H. Szu; 18. Phase transitions and learning in neural networks C. Van den Broeck; 19. Synthesis of chaos A. Vanecek and S. Celikovsky; 20. Computational complexity of continuous problems H. Wozniakowski; Part IV. Complex Systems As An Interface Between Natural Sciences and Environmental Social and Economic Sciences: 21. Stochastic differential geometry in finance studies V. G. Makhankov; Part V. Conference Banquet Speech: Where will the future go? M. J. Feigenbaum.

  13. Packet Randomized Experiments for Eliminating Classes of Confounders

    PubMed Central

    Pavela, Greg; Wiener, Howard; Fontaine, Kevin R.; Fields, David A.; Voss, Jameson D.; Allison, David B.

    2014-01-01

    Background Although randomization is considered essential for causal inference, it is often not possible to randomize in nutrition and obesity research. To address this, we develop a framework for an experimental design—packet randomized experiments (PREs), which improves causal inferences when randomization on a single treatment variable is not possible. This situation arises when subjects are randomly assigned to a condition (such as a new roommate) which varies in one characteristic of interest (such as weight), but also varies across many others. There has been no general discussion of this experimental design, including its strengths, limitations, and statistical properties. As such, researchers are left to develop and apply PREs on an ad hoc basis, limiting its potential to improve causal inferences among nutrition and obesity researchers. Methods We introduce PREs as an intermediary design between randomized controlled trials and observational studies. We review previous research that used the PRE design and describe its application in obesity-related research, including random roommate assignments, heterochronic parabiosis, and the quasi-random assignment of subjects to geographic areas. We then provide a statistical framework to control for potential packet-level confounders not accounted for by randomization. Results PREs have successfully been used to improve causal estimates of the effect of roommates, altitude, and breastfeeding on weight outcomes. When certain assumptions are met, PREs can asymptotically control for packet-level characteristics. This has the potential to statistically estimate the effect of a single treatment even when randomization to a single treatment did not occur. Conclusions Applying PREs to obesity-related research will improve decisions about clinical, public health, and policy actions insofar as it offers researchers new insight into cause and effect relationships among variables. PMID:25444088

  14. Origin of Somatic Mutations in β-Catenin versus Adenomatous Polyposis Coli in Colon Cancer: Random Mutagenesis in Animal Models versus Nonrandom Mutagenesis in Humans.

    PubMed

    Yang, Da; Zhang, Min; Gold, Barry

    2017-07-17

    Wnt signaling is compromised early in the development of human colorectal cancer (CRC) due to truncating nonsense mutations in adenomatous polyposis coli (APC). CRC induced by chemical carcinogens, such as heterocyclic aromatic amines and azoxymethane, in mice also involves dysregulation of Wnt signaling but via activating missense mutations in the β-catenin oncogene despite the fact that genetically modified mice harboring an inactive APC allele efficiently develop CRC. In contrast, activating mutations in β-catenin are rarely observed in human CRC. Dysregulation of the Wnt signaling pathway by the two distinct mechanisms reveals insights into the etiology of human CRC. On the basis of calculations related to DNA adduct levels produced in mouse CRC models using mutagens, and the number of stem cells in the mouse colon, we show that two nonsense mutations required for biallelic disruption of APC are statistically unlikely to produce CRC in experiments using small numbers of mice. We calculate that an activating mutation in one allele near the critical GSK3β phosphorylation site on β-catenin is >10^5 times more likely to produce CRC by random mutagenesis due to chemicals than inactivating two alleles in APC, yet it does not occur in humans. Therefore, the mutagenesis mechanism in human CRC cannot be random. We explain that nonsense APC mutations predominate in human CRC because of deamination at 5-methylcytosine at CGA and CAG codons, coupled with the number of human colonic stem cells and lifespan. Our analyses, including a comparison of mutation type and age at CRC diagnosis in U.S. and Chinese patients, also indicate that APC mutations in CRC are not due to environmental mutagens that randomly damage DNA.

  15. [Effect of somatostatin-14 in simple mechanical obstruction of the small intestine].

    PubMed

    Jimenez-Garcia, A; Ahmad Araji, O; Balongo Garcia, R; Nogales Munoz, A; Salguero Villadiego, M; Cantillana Martinez, J

    1994-02-01

    In order to investigate the properties of somatostatin-14 we studied an experimental model of simple mechanical and closed loop occlusion. Forty-eight New Zealand rabbits were assigned randomly to three groups of 16: group C (controls) was operated on and treated with saline solution (4 cc/kg/h); group A was operated on and initially treated with saline solution and an equal dose of somatostatin-14 (3.5 micrograms/kg/h); and group B was operated on and treated in the same manner as group A, but later, 8 hours after the laparotomy. The animals were sacrificed 24 hours later; intestinal secretion was quantified, blood and intestinal fluid chemistries were performed and specimens of the intestine were prepared for histological examination. Descriptive statistical analysis of the results was performed with the ANOVA, a semi-quantitative test and the covariance test. Somatostatin-14 produced an improvement in the volume of intestinal secretion in the treated groups compared with the control group. The results were statistically significant in group B treated after an 8-hour delay: closed loop (ml): 6.40 +/- 1.12, 2.50 +/- 0.94, 1.85 +/- 0.83 and simple mechanical occlusion (ml): 175 +/- 33.05, 89.50 +/- 9.27, 57.18 +/- 21.23, p < 0.01, for groups C, A and B, respectively. Net secretion of Cl and Na ions was also improved, p < 0.01. (ABSTRACT TRUNCATED AT 250 WORDS)

  16. Network problem threshold

    NASA Technical Reports Server (NTRS)

    Gejji, Raghvendra, R.

    1992-01-01

    Network transmission errors such as collisions, CRC errors, misalignment, etc. are statistical in nature. Although errors can vary randomly, a high level of errors does indicate specific network problems, e.g. equipment failure. In this project, we have studied the random nature of collisions theoretically as well as by gathering statistics, and established a numerical threshold above which a network problem is indicated with high probability.
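
    One simple way to turn the Poisson picture of collisions into a numerical threshold is sketched below; the rate and false-alarm level are illustrative assumptions, not the values established in the project.

    ```python
    # Alarm threshold for a Poisson-distributed error count (illustrative sketch).
    import math

    def poisson_threshold(lam, false_alarm=1e-3):
        """Smallest k such that P(Poisson(lam) >= k) < false_alarm."""
        k = 0
        pmf = math.exp(-lam)                # P(X = 0)
        cdf = pmf
        while (1.0 - cdf) >= false_alarm:   # 1 - cdf = P(X >= k + 1)
            k += 1
            pmf *= lam / k
            cdf += pmf
        return k + 1

    # With an assumed average of 20 collisions per monitoring window, counts at or above the
    # returned threshold occur with probability below 0.1% when the network is healthy.
    print(poisson_threshold(lam=20.0))
    ```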

  17. Confidence Intervals for the Between-Study Variance in Random Effects Meta-Analysis Using Generalised Cochran Heterogeneity Statistics

    ERIC Educational Resources Information Center

    Jackson, Dan

    2013-01-01

    Statistical inference is problematic in the common situation in meta-analysis where the random effects model is fitted to just a handful of studies. In particular, the asymptotic theory of maximum likelihood provides a poor approximation, and Bayesian methods are sensitive to the prior specification. Hence, less efficient, but easily computed and…

  18. A General Framework for Power Analysis to Detect the Moderator Effects in Two- and Three-Level Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Dong, Nianbo; Spybrook, Jessaca; Kelcey, Ben

    2016-01-01

    The purpose of this study is to propose a general framework for power analyses to detect the moderator effects in two- and three-level cluster randomized trials (CRTs). The study specifically aims to: (1) develop the statistical formulations for calculating statistical power, minimum detectable effect size (MDES) and its confidence interval to…

  19. Nonlinear dynamic mechanism of vocal tremor from voice analysis and model simulations

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; Jiang, Jack J.

    2008-09-01

    Nonlinear dynamic analysis and model simulations are used to study the nonlinear dynamic characteristics of vocal folds with vocal tremor, which can typically be characterized by low-frequency modulation and aperiodicity. Tremor voices from patients with disorders such as paresis, Parkinson's disease, hyperfunction, and adductor spasmodic dysphonia show low-dimensional characteristics, differing from random noise. Correlation dimension analysis statistically distinguishes tremor voices from normal voices. Furthermore, a nonlinear tremor model is proposed to study the vibrations of the vocal folds with vocal tremor. Fractal dimensions and positive Lyapunov exponents demonstrate the evidence of chaos in the tremor model, where amplitude and frequency play important roles in governing vocal fold dynamics. Nonlinear dynamic voice analysis and vocal fold modeling may provide a useful set of tools for understanding the dynamic mechanism of vocal tremor in patients with laryngeal diseases.

  20. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages.

    PubMed

    Kim, Yoonsang; Choi, Young-Ku; Emery, Sherry

    2013-08-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods' performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages-SAS GLIMMIX Laplace and SuperMix Gaussian quadrature-perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes.

  1. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages

    PubMed Central

    Kim, Yoonsang; Emery, Sherry

    2013-01-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods’ performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages—SAS GLIMMIX Laplace and SuperMix Gaussian quadrature—perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes. PMID:24288415

  2. New Estimates of Design Parameters for Clustered Randomization Studies: Findings from North Carolina and Florida. Working Paper 43

    ERIC Educational Resources Information Center

    Xu, Zeyu; Nichols, Austin

    2010-01-01

    The gold standard in making causal inference on program effects is a randomized trial. Most randomization designs in education randomize classrooms or schools rather than individual students. Such "clustered randomization" designs have one principal drawback: They tend to have limited statistical power or precision. This study aims to…

  3. Random ambience using high fidelity images

    NASA Astrophysics Data System (ADS)

    Abu, Nur Azman; Sahib, Shahrin

    2011-06-01

    Most secure communication nowadays mandates true random keys as an input. These operations are mostly designed and taken care of by the developers of the cryptosystem. Due to the nature of confidential crypto development today, pseudorandom keys are typically designed and still preferred by the developers of the cryptosystem. However, these pseudorandom keys are predictable, periodic and repeatable, hence they carry minimal entropy. True random keys are believed to be generated only via hardware random number generators. Careful statistical analysis is still required to have any confidence that the process and apparatus generate numbers that are sufficiently random to suit cryptographic use. In this underlying research, each moment in life is considered unique in itself. The random key is unique for the given moment generated by the user whenever he or she needs the random keys in practical secure communication. An ambience captured as a high-fidelity digital image is tested for its randomness according to the NIST Statistical Test Suite. A recommendation on generating simple random cryptographic keys live at 4 megabits per second is reported.
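
    As a toy illustration of the kind of check the NIST suite performs, the sketch below applies the suite's first test, the monobit frequency test, to a candidate bit stream; here the bits are stood in for by the least-significant bits of synthetic pixel values, which is an assumption rather than the authors' extraction scheme.

    ```python
    # NIST SP 800-22 monobit frequency test on image-derived bits (toy sketch).
    import math
    import numpy as np

    rng = np.random.default_rng(5)
    pixels = rng.integers(0, 256, size=1_000_000)   # stand-in for pixel intensities of an image
    bits = pixels & 1                               # least-significant bits as candidate random bits

    def monobit_p_value(bits):
        n = bits.size
        s = np.sum(2 * bits.astype(np.int64) - 1)   # map {0, 1} -> {-1, +1} and sum
        s_obs = abs(s) / math.sqrt(n)
        return math.erfc(s_obs / math.sqrt(2.0))    # the test passes when the p-value is >= 0.01

    print(f"monobit p-value: {monobit_p_value(bits):.4f}")
    ```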

  4. Clustering, randomness and regularity in cloud fields. I - Theoretical considerations. II - Cumulus cloud fields

    NASA Technical Reports Server (NTRS)

    Weger, R. C.; Lee, J.; Zhu, Tianri; Welch, R. M.

    1992-01-01

    The current controversy regarding regularity versus clustering in cloud fields is examined by means of analysis and simulation studies based upon nearest-neighbor cumulative distribution statistics. It is shown that the Poisson representation of random point processes is superior to pseudorandom-number-generated models, which bias the observed nearest-neighbor statistics towards regularity. Interpretation of these nearest-neighbor statistics is discussed for many cases of superpositions of clustering, randomness, and regularity. A detailed analysis of cumulus cloud field spatial distributions based upon Landsat, AVHRR, and Skylab data shows that, when both large and small clouds are included in the cloud field distributions, the cloud field always has a strong clustering signal.

  5. Possible Statistics of Two Coupled Random Fields: Application to Passive Scalar

    NASA Technical Reports Server (NTRS)

    Dubrulle, B.; He, Guo-Wei; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    We use the relativity postulate of scale invariance to derive the similarity transformations between two coupled scale-invariant random fields at different scales. We find the equations leading to the scaling exponents. This formulation is applied to the case of passive scalars advected i) by a random Gaussian velocity field; and ii) by a turbulent velocity field. In the Gaussian case, we show that the passive scalar increments follow a log-Levy distribution generalizing Kraichnan's solution and, in an appropriate limit, a log-normal distribution. In the turbulent case, we show that when the velocity increments follow log-Poisson statistics, the passive scalar increments follow statistics close to log-Poisson. This result explains the experimental observations of Ruiz et al. about the temperature increments.

  6. Exploring religious mechanisms for healthy alcohol use: religious messages and drinking among Korean women in California.

    PubMed

    Ayers, John W; Hofstetter, C Richard; Hughes, Suzanne C; Irvin, Veronica L; Sim, D Eastern Kang; Hovell, Melbourne F

    2009-11-01

    This research identifies social reinforcers within religious institutions associated with alcohol consumption among Korean women in California. Data were drawn from telephone interviews with female adults (N = 591) selected from a random sampling of persons in California with Korean surnames during 2007. Approximately 70% of attempted interviews were completed, with 92% conducted in Korean. Respondents were asked about any lifetime drinking (yes/no), drinking rate (typical number of drinks consumed on drinking days among current drinkers), and messages discouraging "excessive drinking" from religious leaders or congregants. Bivariable and multivariable regressions were used for analysis. Approximately 70.4% of women reported any lifetime drinking, and drinkers drank a mean (SD) of 1.10 (1.22) drinks on drinking days. About 30.8% reported any exposure to religious leaders' messages discouraging excessive drinking, and 28.2% reported any exposure to similar messages from congregants. Each congregant's message was statistically significantly associated with a 5.1% lower probability (odds ratio = 0.775, 95% confidence interval [CI]: 0.626, 0.959) of any lifetime drinking. also, each congregant's message was associated with a 13.8% (B = -0.138; 95% CI: -0.306, 0.029) lower drinking rate, which was statistically significant after adjusting for covariates using a one-tailed test. Exposure to leaders' messages was not statistically significantly associated with any lifetime drinking or drinking rate. Social reinforcement in the form of religious messages may be one mechanism by which religious institutions influence drinking behaviors. For Korean women, messages from congregants had a unique impact beyond the traditional religiosity indicators. These social mechanisms provide public health interventionists with religious pathways to improve drinking behaviors.

  7. Exploring Religious Mechanisms for Healthy Alcohol Use: Religious Messages and Drinking Among Korean Women in California*

    PubMed Central

    Ayers, John W.; Hofstetter, C. Richard; Hughes, Suzanne C.; Irvin, Veronica L.; Kang Sim, D. Eastern; Hovell, Melbourne F.

    2009-01-01

    Objective: This research identifies social reinforcers within religious institutions associated with alcohol consumption among Korean women in California. Method: Data were drawn from telephone interviews with female adults (N = 591) selected from a random sampling of persons in California with Korean surnames during 2007. Approximately 70% of attempted interviews were completed, with 92% conducted in Korean. Respondents were asked about any lifetime drinking (yes/no), drinking rate (typical number of drinks consumed on drinking days among current drinkers), and messages discouraging “excessive drinking” from religious leaders or congregants. Bivariable and multivariable regressions were used for analysis. Results: Approximately 70.4% of women reported any lifetime drinking, and drinkers drank a mean (SD) of 1.10 (1.22) drinks on drinking days. About 30.8% reported any exposure to religious leaders' messages discouraging excessive drinking, and 28.2% reported any exposure to similar messages from congregants. Each congregant's message was statistically significantly associated with a 5.1% lower probability (odds ratio = 0.775, 95% confidence interval [CI]: 0.626, 0.959) of any lifetime drinking. Also, each congregant's message was associated with a 13.8% (B = -0.138; 95% CI: -0.306, 0.029) lower drinking rate, which was statistically significant after adjusting for covariates using a one-tailed test. Exposure to leaders' messages was not statistically significantly associated with any lifetime drinking or drinking rate. Conclusions: Social reinforcement in the form of religious messages may be one mechanism by which religious institutions influence drinking behaviors. For Korean women, messages from congregants had a unique impact beyond the traditional religiosity indicators. These social mechanisms provide public health interventionists with religious pathways to improve drinking behaviors. PMID:19895765

  8. Are There Differences in Gait Mechanics in Patients With A Fixed Versus Mobile Bearing Total Ankle Arthroplasty? A Randomized Trial.

    PubMed

    Queen, Robin M; Franck, Christopher T; Schmitt, Daniel; Adams, Samuel B

    2017-10-01

    Total ankle arthroplasty (TAA) is an alternative to arthrodesis, but no randomized trial has examined whether a fixed bearing or mobile bearing implant provides improved gait mechanics. We wished to determine if fixed- or mobile-bearing TAA results in a larger improvement in pain scores and gait mechanics from before surgery to 1 year after surgery, and to quantify differences in outcomes using statistical analysis and report the standardized effect sizes for such comparisons. Patients with end-stage ankle arthritis who were scheduled for TAA between November 2011 and June 2013 (n = 40; 16 men, 24 women; average age, 63 years; age range, 35-81 years) were prospectively recruited for this study from a single foot and ankle orthopaedic clinic. During this period, 185 patients underwent TAA, with 144 being eligible to participate in this study. Patients were eligible to participate if they were able to meet all study inclusion criteria, which were: no previous diagnosis of rheumatoid arthritis, a contralateral TAA, bilateral ankle arthritis, previous revision TAA, an ankle fusion revision, or able to walk without the use of an assistive device, weight less than 250 pounds (114 kg), a sagittal or coronal plane deformity less than 15°, no presence of avascular necrosis of the distal tibia, no current neuropathy, age older than 35 years, no history of a talar neck fracture, or an avascular talus. Of the 144 eligible patients, 40 consented to participate in our randomized trial. These 40 patients were randomly assigned to either the fixed (n = 20) or mobile bearing implant group (n = 20). Walking speed, bilateral peak dorsiflexion angle, peak plantar flexion angle, sagittal plane ankle ROM, peak ankle inversion angle, peak plantar flexion moment, peak plantar flexion power during stance, peak weight acceptance, and propulsive vertical ground reaction force were analyzed during seven self-selected speed level walking trials for 33 participants using an eight-camera motion analysis system and four force plates. Seven patients were not included in the analysis owing to cancelled surgery (one from each group) and five were lost to followup (four with fixed bearing and one with mobile bearing implants). A series of effect-size calculations and two-sample t-tests comparing postoperative and preoperative increases in outcome variables between implant types were used to determine the differences in the magnitude of improvement between the two patient cohorts from before surgery to 1 year after surgery. The sample size in this study enabled us to detect a standardized shift of 1.01 SDs between group means with 80% power and a type I error rate of 5% for all outcome variables in the study. This randomized trial did not reveal any differences in outcomes between the two implant types under study at the sample size collected. In addition to these results, effect size analysis suggests that changes in outcome differ between implant types by less than 1 SD. Detection of the largest change score or observed effect (propulsive vertical ground reaction force [Fixed: 0.1 ± 0.1; 0.0-1.0; Mobile: 0.0 ± 0.1; 0.0-0.0; p = 0.051]) in this study would require a future trial to enroll 66 patients. However, the smallest change score or observed effect (walking speed [Fixed: 0.2 ± 0.3; 0.1-0.4; Mobile: 0.2 ± 0.3; 0.0-0.3; p = 0.742]) requires a sample size of 2336 to detect a significant difference with 80% power at the observed effect sizes. 
To our knowledge, this is the first randomized study to report the observed effect size comparing improvements in outcome measures between fixed and mobile bearing implant types. This study was statistically powered to detect large effects and descriptively analyze observed effect sizes. Based on our results there were no statistically or clinically meaningful differences between the fixed and mobile bearing implants when examining gait mechanics and pain 1 year after TAA. Level II, therapeutic study.

  9. Can Propensity Score Analysis Approximate Randomized Experiments Using Pretest and Demographic Information in Pre-K Intervention Research?

    PubMed

    Dong, Nianbo; Lipsey, Mark W

    2017-01-01

    It is unclear whether propensity score analysis (PSA) based on pretest and demographic covariates will meet the ignorability assumption for replicating the results of randomized experiments. This study applies within-study comparisons to assess whether pre-Kindergarten (pre-K) treatment effects on achievement outcomes estimated using PSA based on a pretest and demographic covariates can approximate those found in a randomized experiment. Data: Four studies with samples of pre-K children each provided data on two math achievement outcome measures with baseline pretests and child demographic variables that included race, gender, age, language spoken at home, and mother's highest education. Research Design and Data Analysis: A randomized study of a pre-K math curriculum provided benchmark estimates of effects on achievement measures. Comparison samples from other pre-K studies were then substituted for the original randomized control and the effects were reestimated using PSA. The correspondence was evaluated using multiple criteria. The effect estimates using PSA were in the same direction as the benchmark estimates, had similar but not identical statistical significance, and did not differ from the benchmarks at statistically significant levels. However, the magnitude of the effect sizes differed and displayed both absolute and relative bias larger than required to show statistical equivalence with formal tests, but those results were not definitive because of the limited statistical power. We conclude that treatment effect estimates based on a single pretest and demographic covariates in PSA correspond to those from a randomized experiment on the most general criteria for equivalence.

  10. Frailty effects in networks: comparison and identification of individual heterogeneity versus preferential attachment in evolving networks

    PubMed Central

    de Blasio, Birgitte Freiesleben; Seierstad, Taral Guldahl; Aalen, Odd O

    2011-01-01

    Preferential attachment is a proportionate growth process in networks, where nodes receive new links in proportion to their current degree. Preferential attachment is a popular generative mechanism to explain the widespread observation of power-law-distributed networks. An alternative explanation for the phenomenon is a randomly grown network with large individual variation in growth rates among the nodes (frailty). We derive analytically the distribution of individual rates, which will reproduce the connectivity distribution that is obtained from a general preferential attachment process (Yule process), and the structural differences between the two types of graphs are examined by simulations. We present a statistical test to distinguish the two generative mechanisms from each other and we apply the test to both simulated data and two real data sets of scientific citation and sexual partner networks. The findings from the latter analyses argue for frailty effects as an important mechanism underlying the dynamics of complex networks. PMID:21572513
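
    The two generative mechanisms can be contrasted with a compact simulation sketch: preferential attachment, where new links choose existing nodes in proportion to their current degree, versus a frailty model, where each node carries a fixed random attractiveness. The network size and the Pareto frailty distribution below are assumptions for illustration, not the paper's analytical construction.

    ```python
    # Preferential attachment versus frailty-driven growth (illustrative sketch).
    import numpy as np

    rng = np.random.default_rng(6)

    def preferential_attachment(n_nodes=20_000):
        targets = [0, 1]                                # endpoint list doubles as a degree-weighted urn
        for new in range(2, n_nodes):
            old = targets[rng.integers(len(targets))]   # chosen with probability proportional to degree
            targets.extend([new, old])
        return np.bincount(targets, minlength=n_nodes)  # node degrees

    def frailty_growth(n_nodes=20_000):
        frailty = rng.pareto(1.5, size=n_nodes) + 1.0   # fixed, heavy-tailed individual rates (assumed)
        extra = rng.multinomial(n_nodes, frailty / frailty.sum())   # links attach in proportion to frailty
        return 1 + extra                                # every node starts with one link

    for name, deg in (("preferential", preferential_attachment()), ("frailty", frailty_growth())):
        print(f"{name:12s}  max degree = {deg.max():6d}   mean degree = {deg.mean():.2f}")
    ```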

  11. Anderson Localization in Quark-Gluon Plasma

    NASA Astrophysics Data System (ADS)

    Kovács, Tamás G.; Pittler, Ferenc

    2010-11-01

    At low temperature the low end of the QCD Dirac spectrum is well described by chiral random matrix theory. In contrast, at high temperature there is no similar statistical description of the spectrum. We show that at high temperature the lowest part of the spectrum consists of a band of statistically uncorrelated eigenvalues obeying essentially Poisson statistics and the corresponding eigenvectors are extremely localized. Going up in the spectrum the spectral density rapidly increases and the eigenvectors become more and more delocalized. At the same time the spectral statistics gradually crosses over to the bulk statistics expected from the corresponding random matrix ensemble. This phenomenon is reminiscent of Anderson localization in disordered conductors. Our findings are based on staggered Dirac spectra in quenched lattice simulations with the SU(2) gauge group.
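
    The crossover described above is between Poisson statistics and random-matrix statistics. A generic way to tell the two apart, sketched below with synthetic spectra rather than lattice Dirac eigenvalues, is the mean consecutive-spacing ratio, which is roughly 0.39 for uncorrelated (Poisson) levels and roughly 0.53 for the Gaussian orthogonal ensemble.

    ```python
    # Mean consecutive-spacing ratio for Poisson versus GOE spectra (generic illustration).
    import numpy as np

    rng = np.random.default_rng(7)

    def mean_spacing_ratio(levels):
        s = np.diff(np.sort(levels))
        r = np.minimum(s[1:], s[:-1]) / np.maximum(s[1:], s[:-1])
        return r.mean()

    n = 1000
    poisson_levels = np.cumsum(rng.exponential(1.0, size=n))   # uncorrelated level positions

    A = rng.standard_normal((n, n))
    goe_levels = np.linalg.eigvalsh((A + A.T) / 2.0)           # real symmetric Gaussian (GOE) matrix

    print("Poisson <r> ≈", round(mean_spacing_ratio(poisson_levels), 3))   # about 0.39
    print("GOE     <r> ≈", round(mean_spacing_ratio(goe_levels), 3))       # about 0.53
    ```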

  12. Ice Water Classification Using Statistical Distribution Based Conditional Random Fields in RADARSAT-2 Dual Polarization Imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Li, F.; Zhang, S.; Hao, W.; Zhu, T.; Yuan, L.; Xiao, F.

    2017-09-01

    In this paper, a Statistical Distribution based Conditional Random Fields (STA-CRF) algorithm is exploited to improve marginal ice-water classification. Pixel-level ice concentration is presented for comparison of the CRF-based methods. Furthermore, in order to explore the most effective statistical distribution model to be integrated into STA-CRF, five statistical distribution models are investigated. The STA-CRF methods are tested on two scenes around Prydz Bay and Adélie Depression, which contain a variety of ice types during the melt season. Experimental results indicate that the proposed method resolves the sea ice edge well in the Marginal Ice Zone (MIZ) and shows a robust distinction between ice and water.

  13. Statistical parameters of random heterogeneity estimated by analysing coda waves based on finite difference method

    NASA Astrophysics Data System (ADS)

    Emoto, K.; Saito, T.; Shiomi, K.

    2017-12-01

    Short-period (<1 s) seismograms are strongly affected by small-scale (<10 km) heterogeneities in the lithosphere. In general, short-period seismograms are analysed based on the statistical method by considering the interaction between seismic waves and randomly distributed small-scale heterogeneities. Statistical properties of the random heterogeneities have been estimated by analysing short-period seismograms. However, generally, the small-scale random heterogeneity is not taken into account for the modelling of long-period (>2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from the scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan at the period bands of 8-16 s, 4-8 s and 2-4 s and model them by using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation. It is not necessary to assume a uniform background velocity, body or surface waves and scattering properties considered in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range including the intensity around the corner wavenumber as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner. Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.

  14. A statistical mechanics approach to computing rare transitions in multi-stable turbulent geophysical flows

    NASA Astrophysics Data System (ADS)

    Laurie, J.; Bouchet, F.

    2012-04-01

    Many turbulent flows undergo sporadic random transitions, after long periods of apparent statistical stationarity. For instance, paths of the Kuroshio [1], the Earth's magnetic field reversal, atmospheric flows [2], MHD experiments [3], 2D turbulence experiments [4,5], 3D flows [6] show this kind of behavior. The understanding of this phenomena is extremely difficult due to the complexity, the large number of degrees of freedom, and the non-equilibrium nature of these turbulent flows. It is however a key issue for many geophysical problems. A straightforward study of these transitions, through a direct numerical simulation of the governing equations, is nearly always impracticable. This is mainly a complexity problem, due to the large number of degrees of freedom involved for genuine turbulent flows, and the extremely long time between two transitions. In this talk, we consider two-dimensional and geostrophic turbulent models, with stochastic forces. We consider regimes where two or more attractors coexist. As an alternative to direct numerical simulation, we propose a non-equilibrium statistical mechanics approach to the computation of this phenomenon. Our strategy is based on large deviation theory [7], derived from a path integral representation of the stochastic process. Among the trajectories connecting two non-equilibrium attractors, we determine the most probable one. Moreover, we also determine the transition rates, and in which cases this most probable trajectory is a typical one. Interestingly, we prove that in the class of models we consider, a mechanism exists for diffusion over sets of connected attractors. For the type of stochastic forces that allows this diffusion, the transition between attractors is not a rare event. It is then very difficult to characterize the flow as bistable. However for another class of stochastic forces, this diffusion mechanism is prevented, and genuine bistability or multi-stability is observed. We discuss how these results are probably connected to the long debated existence of multi-stability in the atmosphere and oceans.

  15. Mixed-order phase transition in a minimal, diffusion-based spin model.

    PubMed

    Fronczak, Agata; Fronczak, Piotr

    2016-07-01

    In this paper we exactly solve, within the grand canonical ensemble, a minimal spin model with the hybrid phase transition. We call the model diffusion-based because its Hamiltonian can be recovered from a simple dynamic procedure, which can be seen as an equilibrium statistical mechanics representation of a biased random walk. We outline the derivation of the phase diagram of the model, in which the triple point has the hallmarks of the hybrid transition: discontinuity in the average magnetization and algebraically diverging susceptibilities. At this point, two second-order transition curves meet in equilibrium with the first-order curve, resulting in a prototypical mixed-order behavior.

  16. Foundations of statistical mechanics from symmetries of entanglement

    DOE PAGES

    Deffner, Sebastian; Zurek, Wojciech H.

    2016-06-09

    Envariance—entanglement assisted invariance—is a recently discovered symmetry of composite quantum systems. Here, we show that thermodynamic equilibrium states are fully characterized by their envariance. In particular, the microcanonical equilibrium of a system S with Hamiltonian H_S is a fully energetically degenerate quantum state, envariant under every unitary transformation. A representation of the canonical equilibrium then follows from simply counting degenerate energy states. Finally, our conceptually novel approach is free of mathematically ambiguous notions such as ensemble, randomness, etc., and, while it does not even rely on probability, it helps to understand its role in the quantum world.

  17. Six-vertex model and Schramm-Loewner evolution.

    PubMed

    Kenyon, Richard; Miller, Jason; Sheffield, Scott; Wilson, David B

    2017-05-01

    Square ice is a statistical mechanics model for two-dimensional ice, widely believed to have a conformally invariant scaling limit. We associate a Peano (space-filling) curve to a square ice configuration, and more generally to a so-called six-vertex model configuration, and argue that its scaling limit is a space-filling version of the random fractal curve SLE_κ, Schramm-Loewner evolution with parameter κ, where 4 < κ ≤ 12 + 8√2. For square ice, κ = 12. At the "free-fermion point" of the six-vertex model, κ = 8 + 4√3. These unusual values lie outside the classical interval 2 ≤ κ ≤ 8.

  18. Neutral Evolution of Duplicated DNA: An Evolutionary Stick-Breaking Process Causes Scale-Invariant Behavior

    NASA Astrophysics Data System (ADS)

    Massip, Florian; Arndt, Peter F.

    2013-04-01

    Recently, an enrichment of identical matching sequences has been found in many eukaryotic genomes. Their length distribution exhibits a power law tail raising the question of what evolutionary mechanism or functional constraints would be able to shape this distribution. Here we introduce a simple and evolutionarily neutral model, which involves only point mutations and segmental duplications, and produces the same statistical features as observed for genomic data. Further, we extend a mathematical model for random stick breaking to analytically show that the exponent of the power law tail is -3 and universal as it does not depend on the microscopic details of the model.
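
    A toy version of the neutral mechanism, under simplifying assumptions (duplicated segments of a fixed length, point mutations placed as a Poisson process, duplication ages uniform in time), already reproduces the advertised tail: pooling the exact-match lengths, i.e. the gaps between mutations, over duplications of all ages gives a power law whose fitted exponent comes out close to -3.

    ```python
    # Pooled exact-match lengths from a duplication-plus-point-mutation toy model.
    import numpy as np

    rng = np.random.default_rng(9)

    def match_lengths(n_duplications=5_000, segment_length=10_000.0, max_density=0.2):
        """Pool gap lengths between mutations over duplications of uniformly distributed age."""
        lengths = []
        for age in rng.uniform(0.0, 1.0, size=n_duplications):   # constant duplication rate
            density = max_density * age                          # mutation density grows with age
            n_mut = rng.poisson(density * segment_length)
            if n_mut < 2:
                continue
            cuts = np.sort(rng.uniform(0.0, segment_length, size=n_mut))
            lengths.append(np.diff(cuts))                        # exact matches = gaps between mutations
        return np.concatenate(lengths)

    gaps = match_lengths()
    bins = np.logspace(np.log10(50.0), np.log10(1500.0), 20)     # fit the tail well above 1/max_density
    hist, edges = np.histogram(gaps, bins=bins, density=True)
    centers = np.sqrt(edges[1:] * edges[:-1])
    keep = hist > 0
    slope = np.polyfit(np.log(centers[keep]), np.log(hist[keep]), 1)[0]
    print("fitted tail exponent ≈", round(slope, 2))             # expected to be close to -3
    ```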

  19. Seizure clustering.

    PubMed

    Haut, Sheryl R

    2006-02-01

    Seizure clusters, also known as repetitive or serial seizures, occur commonly in epilepsy. Clustering implies that the occurrence of one seizure may influence the probability of a subsequent seizure; thus, the investigation of the clustering phenomenon yields insights into both specific mechanisms of seizure clustering and more general concepts of seizure occurrence. Seizure clustering has been defined clinically as a number of seizures per unit time and, statistically, as a deviation from a random distribution, or interseizure interval dependence. This review explores the pathophysiology, epidemiology, and clinical implications of clustering, as well as other periodic patterns of seizure occurrence. Risk factors for experiencing clusters and potential precipitants of clustering are also addressed.

  20. On the Calculation of Uncertainty Statistics with Error Bounds for CFD Calculations Containing Random Parameters and Fields

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.

    2016-01-01

    This chapter discusses the ongoing development of combined uncertainty and error bound estimates for computational fluid dynamics (CFD) calculations subject to imposed random parameters and random fields. An objective of this work is the construction of computable error bound formulas for output uncertainty statistics that guide CFD practitioners in systematically determining how accurately CFD realizations should be approximated and how accurately uncertainty statistics should be approximated for output quantities of interest. Formal error bound formulas for moment statistics that properly account for the presence of numerical errors in CFD calculations and numerical quadrature errors in the calculation of moment statistics have been previously presented in [8]. In this past work, hierarchical node-nested dense and sparse tensor product quadratures are used to calculate moment statistics integrals. In the present work, a framework has been developed that exploits the hierarchical structure of these quadratures in order to simplify the calculation of an estimate of the quadrature error needed in error bound formulas. When signed estimates of realization error are available, this signed error may also be used to estimate output quantity of interest probability densities as a means to assess the impact of realization error on these density estimates. Numerical results are presented for CFD problems with uncertainty to demonstrate the capabilities of this framework.
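
    The moment-statistics part of the framework can be illustrated with a much smaller stand-in: the sketch below uses one-dimensional Gauss-Hermite quadrature (not the node-nested rules of the chapter) to approximate the mean and variance of a toy output with a single Gaussian random parameter, and uses the difference between a coarse and a fine rule as a rough proxy for the quadrature error that enters an error-bound formula. The output function and quadrature orders are assumptions.

    ```python
    # Output moment statistics by Gauss-Hermite quadrature, with a crude quadrature-error proxy.
    import numpy as np

    def q_of_interest(xi):
        return np.exp(0.3 * xi) + 0.1 * xi**2          # stand-in for a CFD output quantity

    def moments(n_points):
        # Nodes/weights for the weight exp(-x^2 / 2); dividing by sqrt(2*pi) gives the N(0, 1) expectation.
        nodes, weights = np.polynomial.hermite_e.hermegauss(n_points)
        w = weights / np.sqrt(2.0 * np.pi)
        q = q_of_interest(nodes)
        mean = np.sum(w * q)
        var = np.sum(w * (q - mean) ** 2)
        return mean, var

    coarse = moments(5)
    fine = moments(10)
    print("mean, variance (fine rule):", fine)
    print("quadrature-error proxy    :", abs(fine[0] - coarse[0]), abs(fine[1] - coarse[1]))
    ```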

  1. Multilevel Model Prediction

    ERIC Educational Resources Information Center

    Frees, Edward W.; Kim, Jee-Seon

    2006-01-01

    Multilevel models are proven tools in social research for modeling complex, hierarchical systems. In multilevel modeling, statistical inference is based largely on quantification of random variables. This paper distinguishes among three types of random variables in multilevel modeling--model disturbances, random coefficients, and future response…

  2. The Analysis of Completely Randomized Factorial Experiments When Observations Are Lost at Random.

    ERIC Educational Resources Information Center

    Hummel, Thomas J.

    An investigation was conducted of the characteristics of two estimation procedures and corresponding test statistics used in the analysis of completely randomized factorial experiments when observations are lost at random. For one estimator, contrast coefficients for cell means did not involve the cell frequencies. For the other, contrast…

  3. Intermediate quantum maps for quantum computation

    NASA Astrophysics Data System (ADS)

    Giraud, O.; Georgeot, B.

    2005-10-01

    We study quantum maps displaying spectral statistics intermediate between Poisson and Wigner-Dyson. It is shown that they can be simulated on a quantum computer with a small number of gates, and efficiently yield information about fidelity decay or spectral statistics. We study their matrix elements and entanglement production and show that they converge with time to distributions which differ from random matrix predictions. A randomized version of these maps can be implemented even more economically and yields pseudorandom operators with original properties, enabling, for example, one to produce fractal random vectors. These algorithms are within reach of present-day quantum computers.

  4. Asymmetrically dominated choice problems, the isolation hypothesis and random incentive mechanisms.

    PubMed

    Cox, James C; Sadiraj, Vjollca; Schmidt, Ulrich

    2014-01-01

    This paper presents an experimental study of the random incentive mechanisms which are a standard procedure in economic and psychological experiments. Random incentive mechanisms have several advantages but are incentive-compatible only if responses to the single tasks are independent. This is true if either the independence axiom of expected utility theory or the isolation hypothesis of prospect theory holds. We present a simple test of this in the context of choice under risk. In the baseline (one task) treatment we observe risk behavior in a given choice problem. We show that by integrating a second, asymmetrically dominated choice problem in a random incentive mechanism risk behavior can be manipulated systematically. This implies that the isolation hypothesis is violated and the random incentive mechanism does not elicit true preferences in our example.

  5. Emergence of patterns in random processes

    NASA Astrophysics Data System (ADS)

    Newman, William I.; Turcotte, Donald L.; Malamud, Bruce D.

    2012-08-01

    Sixty years ago, it was observed that any independent and identically distributed (i.i.d.) random variable would produce a pattern of peak-to-peak sequences with, on average, three events per sequence. This outcome was employed to show that randomness could yield, as a null hypothesis for animal populations, an explanation for their apparent 3-year cycles. We show how we can explicitly obtain a universal distribution of the lengths of peak-to-peak sequences in time series and that this can be employed for long data sets as a test of their i.i.d. character. We illustrate the validity of our analysis utilizing the peak-to-peak statistics of a Gaussian white noise. We also consider the nearest-neighbor cluster statistics of point processes in time. If the time intervals are random, we show that cluster size statistics are identical to the peak-to-peak sequence statistics of time series. In order to study the influence of correlations in a time series, we determine the peak-to-peak sequence statistics for the Langevin equation of kinetic theory leading to Brownian motion. To test our methodology, we consider a variety of applications. Using a global catalog of earthquakes, we obtain the peak-to-peak statistics of earthquake magnitudes and the nearest neighbor interoccurrence time statistics. In both cases, we find good agreement with the i.i.d. theory. We also consider the interval statistics of the Old Faithful geyser in Yellowstone National Park. In this case, we find a significant deviation from the i.i.d. theory which we attribute to antipersistence. We consider the interval statistics using the AL index of geomagnetic substorms. We again find a significant deviation from i.i.d. behavior that we attribute to mild persistence. Finally, we examine the behavior of Standard and Poor's 500 stock index's daily returns from 1928-2011 and show that, while it is close to being i.i.d., there is, again, significant persistence. We expect that there will be many other applications of our methodology both to interoccurrence statistics and to time series.
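    The i.i.d. benchmark described above is easy to reproduce numerically: for any i.i.d. series, an interior point is a local maximum with probability 1/3, so peak-to-peak sequences contain three events on average. The short sketch below (illustrative sample size and seed) checks this for Gaussian white noise.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)          # Gaussian white noise (i.i.d.)

# Indices of local maxima ("peaks"): strictly larger than both neighbours.
interior = np.arange(1, len(x) - 1)
peaks = interior[(x[interior] > x[interior - 1]) & (x[interior] > x[interior + 1])]

# Peak-to-peak sequence lengths: number of events from one peak to the next.
lengths = np.diff(peaks)
print("mean peak-to-peak length:", lengths.mean())   # ~3 for any i.i.d. series
print("length distribution (2..6):",
      [float(np.mean(lengths == k)) for k in range(2, 7)])
```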

  6. Effect of mechanical behaviour of the brachial artery on blood pressure measurement during both cuff inflation and cuff deflation.

    PubMed

    Zheng, Dingchang; Pan, Fan; Murray, Alan

    2013-10-01

    The aim of this study was to investigate the effect of different mechanical behaviour of the brachial artery on blood pressure (BP) measurements during cuff inflation and deflation. BP measurements were taken from each of 40 participants, with three repeat sessions under three randomized cuff deflation/inflation conditions. Cuff pressure was linearly deflated and inflated at a standard rate of 2-3 mmHg/s and also linearly inflated at a fast rate of 5-6 mmHg/s. Manual auscultatory systolic and diastolic BPs, and pulse pressure (SBP, DBP, PP) were measured. Automated BPs were determined from digitally recorded cuff pressures by fitting a polynomial model to the oscillometric pulse amplitudes. The BPs from cuff deflation and inflation were then compared. Repeatable measurements between sessions and between the sequential order of inflation/deflation conditions (all P > 0.1) indicated stability of arterial mechanical behaviour with repeat measurements. Comparing BPs obtained by standard inflation with those from standard deflation, manual SBP was 2.6 mmHg lower (P < 0.01), manual DBP was 1.5 mmHg higher (P < 0.01), manual PP was 4.2 mmHg lower (P < 0.001), automated DBP was 6.7 mmHg higher (P < 0.001) and automated PP was 7.5 mmHg lower (P < 0.001). There was no statistically significant difference for any automated BPs between fast and standard cuff inflation. The statistically significant BP differences between inflation and deflation suggest different arterial mechanical behaviour between arterial opening and closing during BP measurement. We have shown that the mechanical behaviour of the brachial artery during BP measurement differs between cuff deflation and cuff inflation.

  7. Statistical inference of the generation probability of T-cell receptors from sequence repertoires.

    PubMed

    Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G

    2012-10-02

    Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.

  8. Finite-range Coulomb gas models of banded random matrices and quantum kicked rotors

    NASA Astrophysics Data System (ADS)

    Pandey, Akhilesh; Kumar, Avanish; Puri, Sanjay

    2017-11-01

    Dyson demonstrated an equivalence between infinite-range Coulomb gas models and classical random matrix ensembles for the study of eigenvalue statistics. We introduce finite-range Coulomb gas (FRCG) models via a Brownian matrix process, and study them analytically and by Monte Carlo simulations. These models yield new universality classes, and provide a theoretical framework for the study of banded random matrices (BRMs) and quantum kicked rotors (QKRs). We demonstrate that, for a BRM of bandwidth b and a QKR of chaos parameter α, the appropriate FRCG model has the effective range d = b^2/N = α^2/N, for large N matrix dimensionality. As d increases, there is a transition from Poisson to classical random matrix statistics.

  9. Finite-range Coulomb gas models of banded random matrices and quantum kicked rotors.

    PubMed

    Pandey, Akhilesh; Kumar, Avanish; Puri, Sanjay

    2017-11-01

    Dyson demonstrated an equivalence between infinite-range Coulomb gas models and classical random matrix ensembles for the study of eigenvalue statistics. We introduce finite-range Coulomb gas (FRCG) models via a Brownian matrix process, and study them analytically and by Monte Carlo simulations. These models yield new universality classes, and provide a theoretical framework for the study of banded random matrices (BRMs) and quantum kicked rotors (QKRs). We demonstrate that, for a BRM of bandwidth b and a QKR of chaos parameter α, the appropriate FRCG model has the effective range d=b^{2}/N=α^{2}/N, for large N matrix dimensionality. As d increases, there is a transition from Poisson to classical random matrix statistics.

  10. Information trimming: Sufficient statistics, mutual information, and predictability from effective channel states

    NASA Astrophysics Data System (ADS)

    James, Ryan G.; Mahoney, John R.; Crutchfield, James P.

    2017-06-01

    One of the most basic characterizations of the relationship between two random variables, X and Y, is the value of their mutual information. Unfortunately, calculating it analytically and estimating it empirically are often stymied by the extremely large dimension of the variables. One might hope to replace such a high-dimensional variable by a smaller one that preserves its relationship with the other. It is well known that either X (or Y) can be replaced by its minimal sufficient statistic about Y (or X) while preserving the mutual information. While intuitively reasonable, it is not obvious or straightforward that both variables can be replaced simultaneously. We demonstrate that this is in fact possible: the information X's minimal sufficient statistic preserves about Y is exactly the information that Y's minimal sufficient statistic preserves about X. We call this procedure information trimming. As an important corollary, we consider the case where one variable is a stochastic process' past and the other its future. In this case, the mutual information is the channel transmission rate between the channel's effective states. That is, the past-future mutual information (the excess entropy) is the amount of information about the future that can be predicted using the past. Translating our result about minimal sufficient statistics, this is equivalent to the mutual information between the forward- and reverse-time causal states of computational mechanics. We close by discussing multivariate extensions to this use of minimal sufficient statistics.
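    A toy numerical check of the statement that mutual information survives replacement by a sufficient statistic: below, Y depends on a four-state X only through T = X mod 2, so T is a (minimal) sufficient statistic of X about Y, and the plug-in estimates of I(X;Y) and I(T;Y) agree up to sampling noise. The distributions and noise level are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def mutual_information(a, b):
    """Plug-in estimate of I(a; b) in bits for small discrete samples."""
    joint = np.zeros((a.max() + 1, b.max() + 1))
    for ai, bi in zip(a, b):
        joint[ai, bi] += 1
    joint /= joint.sum()
    pa, pb = joint.sum(1, keepdims=True), joint.sum(0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (pa @ pb)[nz])).sum())

# X has four states, but Y depends on X only through T = X mod 2,
# so T is a sufficient statistic of X about Y.
n = 200_000
x = rng.integers(0, 4, size=n)
t = x % 2
y = (t + (rng.random(n) < 0.1)).astype(int) % 2   # noisy copy of T

print("I(X;Y) =", round(mutual_information(x, y), 4))
print("I(T;Y) =", round(mutual_information(t, y), 4))   # the same, up to sampling noise
```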

  11. graph-GPA: A graphical model for prioritizing GWAS results and investigating pleiotropic architecture.

    PubMed

    Chung, Dongjun; Kim, Hang J; Zhao, Hongyu

    2017-02-01

    Genome-wide association studies (GWAS) have identified tens of thousands of genetic variants associated with hundreds of phenotypes and diseases, which have provided clinical and medical benefits to patients with novel biomarkers and therapeutic targets. However, identification of risk variants associated with complex diseases remains challenging as they are often affected by many genetic variants with small or moderate effects. There has been accumulating evidence suggesting that different complex traits share a common risk basis, namely pleiotropy. Recently, several statistical methods have been developed to improve statistical power to identify risk variants for complex traits through a joint analysis of multiple GWAS datasets by leveraging pleiotropy. While these methods were shown to improve statistical power for association mapping compared to separate analyses, they are still limited in the number of phenotypes that can be integrated. In order to address this challenge, in this paper, we propose a novel statistical framework, graph-GPA, to integrate a large number of GWAS datasets for multiple phenotypes using a hidden Markov random field approach. Application of graph-GPA to a joint analysis of GWAS datasets for 12 phenotypes shows that graph-GPA improves statistical power to identify risk variants compared to statistical methods based on a smaller number of GWAS datasets. In addition, graph-GPA also promotes better understanding of genetic mechanisms shared among phenotypes, which can potentially be useful for the development of improved diagnosis and therapeutics. The R implementation of graph-GPA is currently available at https://dongjunchung.github.io/GGPA/.

  12. A Computational Study of Plastic Deformation in AISI 304 Induced by Surface Mechanical Attrition Treatment

    NASA Astrophysics Data System (ADS)

    Zhang, X. C.; Lu, J.; Shi, S. Q.

    2010-05-01

    As a technique of grain refinement by plastic deformation, surface mechanical attrition treatment (SMAT) has been developed to be one of the most effective ways to optimize the mechanical properties of various materials including pure metals and alloys. SMAT can significantly reduce grain size into the nanometer regime in the surface layer of bulk materials, providing tremendous opportunities for improving physical, chemical and mechanical properties of the materials. In this work, a computational model of the surface mechanical attrition treatment (SMAT) process is presented, in which the Johnson-Cook plasticity model and the finite element method were employed to study the high strain rate, elastic-plastic dynamic process of ball impact on a metallic target. AISI 304 steel with low stacking fault energy was chosen as the target material. First, a random impact model was used to analyze the statistical characteristics of ball impact, and then the plastic deformation behavior and residual stress distribution in AISI 304 stainless steel during SMAT were studied. The simulation results show that the compressive residual stress and vertical deformation of the surface structures were directly affected by the ball impact frequency, incident impact angle and ball diameter used in the SMAT process.

  13. [Effectiveness of an individualised physiotherapy program versus group therapy on neck pain and disability in patients with acute and subacute mechanical neck pain].

    PubMed

    Antúnez Sánchez, Leonardo Gregorio; de la Casa Almeida, María; Rebollo Roldán, Jesús; Ramírez Manzano, Antonio; Martín Valero, Rocío; Suárez Serrano, Carmen

    To compare the efficacy in reducing neck pain and disability of an individualised physiotherapy treatment versus group treatment in acute and subacute mechanical neck pain. Randomised clinical trial. Health Area of University Hospital Virgen del Rocío, Seville, Spain. A total of 90 patients diagnosed with mechanical neck pain of up to one month onset were distributed randomly into two groups: (i) individualised treatment; (ii) group treatment. The treatment consisted of 15 sessions of about 60 minutes for both groups. Individual treatment consisted of 15 minutes of infrared heat therapy, 17 minutes of massage, and analytical passive stretching of the trapezius muscles and angle of the scapula. The group treatment consisted of a program of active mobilisation, isometric contractions, self-stretching, and postural recommendations. Pain was measured at the beginning and end of treatment using a Visual Analogue Scale (VAS) and an algometer applied to the trapezius muscles and angle of the scapula, and neck disability was measured using the Neck Disability Index. Both treatments produced statistically significant improvements (P<.001) in all variables. Statistically significant differences (P<.001) in favour of the individualised treatment compared with the group treatment were found for all of them. Patients with acute or subacute mechanical neck pain experienced an improvement in pain and neck disability after receiving either of the physiotherapy treatments used in our study, with the individual treatment being more effective than the group treatment. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.

  14. Chaotic oscillations and noise transformations in a simple dissipative system with delayed feedback

    NASA Astrophysics Data System (ADS)

    Zverev, V. V.; Rubinstein, B. Ya.

    1991-04-01

    We analyze the statistical behavior of signals in nonlinear circuits with delayed feedback in the presence of external Markovian noise. For the special class of circuits with intense phase mixing we develop an approach for the computation of the probability distributions and multitime correlation functions based on the random phase approximation. Both Gaussian and Kubo-Andersen models of external noise statistics are analyzed and the existence of the stationary (asymptotic) random process in the long-time limit is shown. We demonstrate that a nonlinear system with chaotic behavior becomes a noise amplifier with specific statistical transformation properties.

  15. Statistical error model for a solar electric propulsion thrust subsystem

    NASA Technical Reports Server (NTRS)

    Bantell, M. H.

    1973-01-01

    The solar electric propulsion thrust subsystem statistical error model was developed as a tool for investigating the effects of thrust subsystem parameter uncertainties on navigation accuracy. The model is currently being used to evaluate the impact of electric engine parameter uncertainties on navigation system performance for a baseline mission to Encke's Comet in the 1980s. The data given represent the next generation in statistical error modeling for low-thrust applications. Principal improvements include the representation of thrust uncertainties and random process modeling in terms of random parametric variations in the thrust vector process for a multi-engine configuration.

  16. Transfusion Indication Threshold Reduction (TITRe2) randomized controlled trial in cardiac surgery: statistical analysis plan.

    PubMed

    Pike, Katie; Nash, Rachel L; Murphy, Gavin J; Reeves, Barnaby C; Rogers, Chris A

    2015-02-22

    The Transfusion Indication Threshold Reduction (TITRe2) trial is the largest randomized controlled trial to date to compare red blood cell transfusion strategies following cardiac surgery. This update presents the statistical analysis plan, detailing how the study will be analyzed and presented. The statistical analysis plan has been written following recommendations from the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use, prior to database lock and the final analysis of trial data. Outlined analyses are in line with the Consolidated Standards of Reporting Trials (CONSORT). The study aims to randomize 2000 patients from 17 UK centres. Patients are randomized to either a restrictive (transfuse if haemoglobin concentration <7.5 g/dl) or liberal (transfuse if haemoglobin concentration <9 g/dl) transfusion strategy. The primary outcome is a binary composite outcome of any serious infectious or ischaemic event in the first 3 months following randomization. The statistical analysis plan details how non-adherence with the intervention, withdrawals from the study, and the study population will be derived and dealt with in the analysis. The planned analyses of the trial primary and secondary outcome measures are described in detail, including approaches taken to deal with multiple testing, model assumptions not being met and missing data. Details of planned subgroup and sensitivity analyses and pre-specified ancillary analyses are given, along with potential issues that have been identified with such analyses and possible approaches to overcome such issues. ISRCTN70923932.

  17. An investigation of new toxicity test method performance in validation studies: 1. Toxicity test methods that have predictive capacity no greater than chance.

    PubMed

    Bruner, L H; Carr, G J; Harbell, J W; Curren, R D

    2002-06-01

    An approach commonly used to measure new toxicity test method (NTM) performance in validation studies is to divide toxicity results into positive and negative classifications, and then identify true positive (TP), true negative (TN), false positive (FP) and false negative (FN) results. After this step is completed, the contingent probability statistics (CPS), sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) are calculated. Although these statistics are widely used and often the only statistics used to assess the performance of toxicity test methods, there is little specific guidance in the validation literature on what values for these statistics indicate adequate performance. The purpose of this study was to begin developing data-based answers to this question by characterizing the CPS obtained from an NTM whose data have a completely random association with a reference test method (RTM). Determining the CPS of this worst-case scenario is useful because it provides a lower baseline from which the performance of an NTM can be judged in future validation studies. It also provides an indication of relationships in the CPS that help identify random or near-random relationships in the data. The results from this study of randomly associated tests show that the values obtained for the statistics vary significantly depending on the cut-offs chosen, that high values can be obtained for individual statistics, and that the different measures cannot be considered independently when evaluating the performance of an NTM. When the association between results of an NTM and RTM is random, the sum of the complementary pairs of statistics (sensitivity + specificity, NPV + PPV) is approximately 1, and the prevalence (i.e., the proportion of toxic chemicals in the population of chemicals) and PPV are equal. Given that combinations of high sensitivity-low specificity or low sensitivity-high specificity (i.e., the sum of the sensitivity and specificity equal to approximately 1) indicate lack of predictive capacity, an NTM having these performance characteristics should be considered no better for predicting toxicity than chance alone.
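    The key relationships reported for randomly associated tests (sensitivity + specificity ≈ 1, PPV ≈ prevalence, PPV + NPV ≈ 1) can be checked with a short simulation in which the NTM classification is drawn independently of the RTM classification; the prevalence and positive-call rate below are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(2)
n, prevalence, positive_rate = 100_000, 0.3, 0.6   # illustrative values

truth = rng.random(n) < prevalence            # reference test method (RTM) classification
ntm = rng.random(n) < positive_rate           # new test method, independent of truth

tp = np.sum(ntm & truth); fn = np.sum(~ntm & truth)
fp = np.sum(ntm & ~truth); tn = np.sum(~ntm & ~truth)

sens, spec = tp / (tp + fn), tn / (tn + fp)
ppv, npv = tp / (tp + fp), tn / (tn + fn)

print(f"sensitivity + specificity = {sens + spec:.3f}   (close to 1 for a random NTM)")
print(f"PPV = {ppv:.3f}  vs prevalence = {prevalence}   (approximately equal for a random NTM)")
print(f"NPV + PPV = {npv + ppv:.3f}   (close to 1 for a random NTM)")
```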

  18. Random elements on lattices: Review and statistical applications

    NASA Astrophysics Data System (ADS)

    Potocký, Rastislav; Villarroel, Claudia Navarro; Sepúlveda, Maritza; Luna, Guillermo; Stehlík, Milan

    2017-07-01

    We discuss important contributions to random elements on lattices. We consider both algebraic and probabilistic properties. Several applications and concepts are discussed, e.g. positive dependence, random walks and distributions on lattices, super-lattices, and learning. An application to Chilean ecology is given.

  19. Novel pseudo-random number generator based on quantum random walks.

    PubMed

    Yang, Yu-Guang; Zhao, Qian-Qian

    2016-02-04

    In this paper, we investigate the potential application of quantum computation for constructing pseudo-random number generators (PRNGs) and further construct a novel PRNG based on quantum random walks (QRWs), a famous quantum computation model. The PRNG merely relies on the equations used in the QRWs, and thus the generation algorithm is simple and the computation speed is fast. The proposed PRNG was subjected to statistical tests such as the NIST suite and successfully passed them. Compared with the representative PRNG based on quantum chaotic maps (QCM), the present QRWs-based PRNG has some advantages such as better statistical complexity and recurrence. For example, the normalized Shannon entropy and the statistical complexity of the QRWs-based PRNG are 0.999699456771172 and 1.799961178212329e-04 respectively for 8-bit words and a sequence length of, say, 16 Mbits. By contrast, the corresponding values of the QCM-based PRNG are 0.999448131481064 and 3.701210794388818e-04 respectively. Thus the statistical complexity and the normalized entropy of the QRWs-based PRNG are closer to 0 and 1 respectively than those of the QCM-based PRNG when the number of words of the analyzed sequence increases. It provides a new clue to construct PRNGs and also extends the applications of quantum computation.
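    A minimal sketch of the ingredients involved: a standard Hadamard-coined quantum walk on a cycle is iterated, and pseudo-random bytes are read off from the site occupation probabilities. The walk update follows the usual QRW equations, but the byte-extraction rule (and the burn-in it would need in practice) is an illustrative assumption, not the construction of the cited paper.

```python
import numpy as np

def qrw_prng_bytes(n_sites=64, steps=500):
    """Pseudo-random bytes derived from a Hadamard quantum walk on a cycle.

    The walk itself follows the standard coined-QRW update equations; the
    byte-extraction rule (low-order digits of site occupation probabilities)
    is an illustrative choice, not the scheme of the cited paper.
    """
    psi = np.zeros((n_sites, 2), dtype=complex)
    psi[0, 0], psi[0, 1] = 1 / np.sqrt(2), 1j / np.sqrt(2)   # symmetric initial coin state
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)             # Hadamard coin operator

    out = bytearray()
    for _ in range(steps):
        psi = psi @ H.T                                      # apply the coin at every site
        psi = np.stack([np.roll(psi[:, 0], -1),              # coin |0> shifts left
                        np.roll(psi[:, 1], +1)], axis=1)     # coin |1> shifts right
        probs = (np.abs(psi) ** 2).sum(axis=1)               # site occupation probabilities
        for p in probs:
            out.append(int(p * 2**40) & 0xFF)                # low-order digits as one byte
    return bytes(out)

stream = qrw_prng_bytes()
print(len(stream), "bytes; first ten:", list(stream[:10]))
```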

  20. Novel pseudo-random number generator based on quantum random walks

    PubMed Central

    Yang, Yu-Guang; Zhao, Qian-Qian

    2016-01-01

    In this paper, we investigate the potential application of quantum computation for constructing pseudo-random number generators (PRNGs) and further construct a novel PRNG based on quantum random walks (QRWs), a famous quantum computation model. The PRNG merely relies on the equations used in the QRWs, and thus the generation algorithm is simple and the computation speed is fast. The proposed PRNG was subjected to statistical tests such as the NIST suite and successfully passed them. Compared with the representative PRNG based on quantum chaotic maps (QCM), the present QRWs-based PRNG has some advantages such as better statistical complexity and recurrence. For example, the normalized Shannon entropy and the statistical complexity of the QRWs-based PRNG are 0.999699456771172 and 1.799961178212329e-04 respectively for 8-bit words and a sequence length of, say, 16 Mbits. By contrast, the corresponding values of the QCM-based PRNG are 0.999448131481064 and 3.701210794388818e-04 respectively. Thus the statistical complexity and the normalized entropy of the QRWs-based PRNG are closer to 0 and 1 respectively than those of the QCM-based PRNG when the number of words of the analyzed sequence increases. It provides a new clue to construct PRNGs and also extends the applications of quantum computation. PMID:26842402

  1. Linguistic Strategies for Improving Informed Consent in Clinical Trials Among Low Health Literacy Patients.

    PubMed

    Krieger, Janice L; Neil, Jordan M; Strekalova, Yulia A; Sarge, Melanie A

    2017-03-01

    Improving informed consent to participate in randomized clinical trials (RCTs) is a key challenge in cancer communication. The current study examines strategies for enhancing randomization comprehension among patients with diverse levels of health literacy and identifies cognitive and affective predictors of intentions to participate in cancer RCTs. Using a post-test-only experimental design, cancer patients (n = 500) were randomly assigned to receive one of three message conditions for explaining randomization (ie, plain language condition, gambling metaphor, benign metaphor) or a control message. All statistical tests were two-sided. Health literacy was a statistically significant moderator of randomization comprehension (P = .03). Among participants with the lowest levels of health literacy, the benign metaphor resulted in greater comprehension of randomization as compared with plain language (P = .04) and control (P = .004) messages. Among participants with the highest levels of health literacy, the gambling metaphor resulted in greater randomization comprehension as compared with the benign metaphor (P = .04). A serial mediation model showed a statistically significant negative indirect effect of comprehension on behavioral intention through personal relevance of RCTs and anxiety associated with participation in RCTs (P < .001). The effectiveness of metaphors for explaining randomization depends on health literacy, with a benign metaphor being particularly effective for patients at the lower end of the health literacy spectrum. The theoretical model demonstrates the cognitive and affective predictors of behavioral intention to participate in cancer RCTs and offers guidance on how future research should employ communication strategies to improve the informed consent processes. © The Author 2016. Published by Oxford University Press.

  2. Linguistic Strategies for Improving Informed Consent in Clinical Trials Among Low Health Literacy Patients

    PubMed Central

    Neil, Jordan M.; Strekalova, Yulia A.; Sarge, Melanie A.

    2017-01-01

    Background: Improving informed consent to participate in randomized clinical trials (RCTs) is a key challenge in cancer communication. The current study examines strategies for enhancing randomization comprehension among patients with diverse levels of health literacy and identifies cognitive and affective predictors of intentions to participate in cancer RCTs. Methods: Using a post-test-only experimental design, cancer patients (n = 500) were randomly assigned to receive one of three message conditions for explaining randomization (ie, plain language condition, gambling metaphor, benign metaphor) or a control message. All statistical tests were two-sided. Results: Health literacy was a statistically significant moderator of randomization comprehension (P = .03). Among participants with the lowest levels of health literacy, the benign metaphor resulted in greater comprehension of randomization as compared with plain language (P = .04) and control (P = .004) messages. Among participants with the highest levels of health literacy, the gambling metaphor resulted in greater randomization comprehension as compared with the benign metaphor (P = .04). A serial mediation model showed a statistically significant negative indirect effect of comprehension on behavioral intention through personal relevance of RCTs and anxiety associated with participation in RCTs (P < .001). Conclusions: The effectiveness of metaphors for explaining randomization depends on health literacy, with a benign metaphor being particularly effective for patients at the lower end of the health literacy spectrum. The theoretical model demonstrates the cognitive and affective predictors of behavioral intention to participate in cancer RCTs and offers guidance on how future research should employ communication strategies to improve the informed consent processes. PMID:27794035

  3. Statistical properties of fluctuating enzymes with dynamic cooperativity using a first passage time distribution formalism.

    PubMed

    Singh, Divya; Chaudhury, Srabanti

    2017-04-14

    We study the temporal fluctuations in catalytic rates for single enzyme reactions undergoing slow transitions between two active states. We use a first passage time distribution formalism to obtain the closed-form analytical expressions of the mean reaction time and the randomness parameter for reaction schemes where conformational fluctuations are present between two free enzyme conformers. Our studies confirm that the sole presence of free enzyme fluctuations yields a non-Michaelis-Menten equation and can lead to dynamic cooperativity. The randomness parameter, which is a measure of the dynamic disorder in the system, converges to unity at a high substrate concentration. If slow fluctuations are present between the enzyme-substrate conformers (off-pathway mechanism), dynamic disorder is present at a high substrate concentration. Our results confirm that the dynamic disorder at a high substrate concentration is determined only by the slow fluctuations between the enzyme-substrate conformers and the randomness parameter is greater than unity. Slow conformational fluctuations between free enzymes are responsible for the emergence of dynamic cooperativity in single enzymes. Our theoretical findings are well supported by comparison with experimental data on the single enzyme beta-galactosidase.
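    The randomness parameter mentioned above is the squared coefficient of variation of the single-enzyme turnover time, r = (⟨t²⟩ − ⟨t⟩²)/⟨t⟩². The sketch below estimates it by Gillespie-style sampling of a memoryless Michaelis-Menten cycle with illustrative rate constants; this simple scheme gives r ≤ 1, approaching unity at high substrate concentration, whereas the slow conformer fluctuations analyzed in the paper (not included here) can push r above unity.

```python
import numpy as np

rng = np.random.default_rng(3)

def turnover_time(substrate, k1=1.0, k_off=5.0, k_cat=2.0):
    """One catalytic turnover of a single Michaelis-Menten enzyme (Gillespie draws).

    Rates are illustrative: E + S -> ES at k1*[S], then ES either releases the
    substrate (k_off) or converts it to product (k_cat).
    """
    t = 0.0
    while True:
        t += rng.exponential(1.0 / (k1 * substrate))          # wait for E + S -> ES
        t += rng.exponential(1.0 / (k_off + k_cat))           # ES resolves either way
        if rng.random() < k_cat / (k_off + k_cat):
            return t                                          # product formed

def randomness_parameter(substrate, n=20_000):
    times = np.array([turnover_time(substrate) for _ in range(n)])
    return times.var() / times.mean() ** 2                    # r = (<t^2> - <t>^2)/<t>^2

for s in (0.1, 1.0, 100.0):
    print(f"[S] = {s:>6}: randomness parameter r = {randomness_parameter(s):.3f}")
```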

  4. On the statistical mechanics of the 2D stochastic Euler equation

    NASA Astrophysics Data System (ADS)

    Bouchet, Freddy; Laurie, Jason; Zaboronski, Oleg

    2011-12-01

    The dynamics of vortices and large scale structures is qualitatively very different in two dimensional flows compared to its three dimensional counterparts, due to the presence of multiple integrals of motion. These are believed to be responsible for a variety of phenomena observed in Euler flow such as the formation of large scale coherent structures, the existence of meta-stable states and random abrupt changes in the topology of the flow. In this paper we study stochastic dynamics of the finite dimensional approximation of the 2D Euler flow based on Lie algebra su(N) which preserves all integrals of motion. In particular, we exploit rich algebraic structure responsible for the existence of Euler's conservation laws to calculate the invariant measures and explore their properties and also study the approach to equilibrium. Unexpectedly, we find deep connections between equilibrium measures of finite dimensional su(N) truncations of the stochastic Euler equations and random matrix models. Our work can be regarded as a preparation for addressing the questions of large scale structures, meta-stability and the dynamics of random transitions between different flow topologies in stochastic 2D Euler flows.

  5. Periodicity in the autocorrelation function as a mechanism for regularly occurring zero crossings or extreme values of a Gaussian process.

    PubMed

    Wilson, Lorna R M; Hopcraft, Keith I

    2017-12-01

    The problem of zero crossings is of great historical prevalence and promises extensive application. The challenge is to establish precisely how the autocorrelation function or power spectrum of a one-dimensional continuous random process determines the density function of the intervals between the zero crossings of that process. This paper investigates the case where periodicities are incorporated into the autocorrelation function of a smooth process. Numerical simulations, and statistics about the number of crossings in a fixed interval, reveal that in this case the zero crossings segue between a random and deterministic point process depending on the relative time scales of the periodic and nonperiodic components of the autocorrelation function. By considering the Laplace transform of the density function, we show that incorporating correlation between successive intervals is essential to obtaining accurate results for the interval variance. The same method enables prediction of the density function tail in some regions, and we suggest approaches for extending this to cover all regions. In an ever-more complex world, the potential applications for this scale of regularity in a random process are far reaching and powerful.
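    A numerical illustration of the setting described above: a stationary Gaussian process is synthesized spectrally from an autocorrelation with both a decaying and a periodic component, R(τ) = exp(−τ/τ_c) cos(2π f₀ τ), and the intervals between its zero crossings are collected. The parameters and the crude spectral synthesis are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(4)
n, dt = 2**18, 0.01
freqs = np.fft.rfftfreq(n, dt)

# Target autocorrelation R(tau) = exp(-tau/tau_c) * cos(2*pi*f0*tau):
# its power spectrum is a pair of Lorentzians centred at +/- f0 (illustrative parameters).
tau_c, f0 = 2.0, 1.0
psd = (tau_c / (1 + (2 * np.pi * (freqs - f0) * tau_c) ** 2) +
       tau_c / (1 + (2 * np.pi * (freqs + f0) * tau_c) ** 2))

# Spectral synthesis of a stationary Gaussian process with this spectrum.
noise = rng.standard_normal(len(freqs)) + 1j * rng.standard_normal(len(freqs))
x = np.fft.irfft(np.sqrt(psd) * noise, n)

# Intervals between successive zero crossings.
crossings = np.flatnonzero(np.diff(np.sign(x)) != 0)
intervals = np.diff(crossings) * dt
print(f"mean interval = {intervals.mean():.3f}, CV = {intervals.std() / intervals.mean():.3f}")
```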

  6. Periodicity in the autocorrelation function as a mechanism for regularly occurring zero crossings or extreme values of a Gaussian process

    NASA Astrophysics Data System (ADS)

    Wilson, Lorna R. M.; Hopcraft, Keith I.

    2017-12-01

    The problem of zero crossings is of great historical prevalence and promises extensive application. The challenge is to establish precisely how the autocorrelation function or power spectrum of a one-dimensional continuous random process determines the density function of the intervals between the zero crossings of that process. This paper investigates the case where periodicities are incorporated into the autocorrelation function of a smooth process. Numerical simulations, and statistics about the number of crossings in a fixed interval, reveal that in this case the zero crossings segue between a random and deterministic point process depending on the relative time scales of the periodic and nonperiodic components of the autocorrelation function. By considering the Laplace transform of the density function, we show that incorporating correlation between successive intervals is essential to obtaining accurate results for the interval variance. The same method enables prediction of the density function tail in some regions, and we suggest approaches for extending this to cover all regions. In an ever-more complex world, the potential applications for this scale of regularity in a random process are far reaching and powerful.

  7. Microscopic origin of read current noise in TaOx-based resistive switching memory by ultra-low temperature measurement

    NASA Astrophysics Data System (ADS)

    Pan, Yue; Cai, Yimao; Liu, Yefan; Fang, Yichen; Yu, Muxi; Tan, Shenghu; Huang, Ru

    2016-04-01

    TaOx-based resistive random access memory (RRAM) attracts considerable attention for the development of next generation nonvolatile memories. However, read current noise in RRAM is one of the critical concerns for storage application, and its microscopic origin is still under debate. In this work, the read current noise in TaOx-based RRAM was studied thoroughly. Based on a noise power spectral density analysis at room temperature and at the ultra-low temperature of 25 K, discrete random telegraph noise (RTN) and continuous average current fluctuation (ACF) are identified and decoupled from the total read current noise in TaOx RRAM devices. A statistical comparison of noise amplitude further reveals that ACF depends strongly on the temperature, whereas RTN is independent of the temperature. Measurement results combined with conduction mechanism analysis show that RTN in TaOx RRAM devices arises from the electron trapping/detrapping process in the hopping conduction, and ACF originates from the thermal activation of conduction centers that form the percolation network. Finally, a unified model in the framework of hopping conduction is proposed to explain the underlying mechanism of both RTN and ACF noise, which can provide meaningful guidelines for designing noise-immune RRAM devices.

  8. Triggering extreme events at the nanoscale in photonic seas

    NASA Astrophysics Data System (ADS)

    Liu, C.; van der Wel, R. E. C.; Rotenberg, N.; Kuipers, L.; Krauss, T. F.; di Falco, A.; Fratalocchi, A.

    2015-04-01

    Hurricanes, tsunamis, rogue waves and tornadoes are rare natural phenomena that embed an exceptionally large amount of energy, which appears and quickly disappears in a probabilistic fashion. This makes them difficult to predict and hard to generate on demand. Here we demonstrate that we can trigger the onset of rare events akin to rogue waves controllably, and systematically use their generation to break the diffraction limit of light propagation. We illustrate this phenomenon in the case of a random field, where energy oscillates among incoherent degrees of freedom. Despite the low energy carried by each wave, we illustrate how to control a mechanism of spontaneous synchronization, which constructively builds up the spectral energy available in the whole bandwidth of the field into giant structures, whose statistics are predictable. The larger the frequency bandwidth of the random field, the larger the amplitude of rare events that are built up by this mechanism. Our system is composed of an integrated optical resonator, realized on a photonic crystal chip. Through near-field imaging experiments, we record confined rogue waves characterized by a spatial localization of 206 nm and an ultrashort duration of 163 fs at a wavelength of 1.55 μm. Such localized energy patterns are formed in a deterministic dielectric structure that does not require nonlinear properties.

  9. Toward a quantitative approach to migrants integration

    NASA Astrophysics Data System (ADS)

    Barra, A.; Contucci, P.

    2010-03-01

    Migration phenomena and all the related issues, like integration of different social groups, are intrinsically complex problems since they strongly depend on several competitive mechanisms such as economic factors, cultural differences and many others. By identifying a few essential assumptions, and using the statistical mechanics of complex systems, we propose a novel quantitative approach that provides a minimal theory for those phenomena. We show that the competitive interactions in decision making between a population of N host citizens and P immigrants, a bi-partite spin-glass, give rise to a social consciousness inside the host community in the sense of the associative memory of neural networks. The theory leads to a natural quantitative definition of migrants' "integration" inside the community. From the technical point of view this minimal picture assumes, as control parameters, only general notions like the strength of the random interactions, the ratio between the sizes of the two parties and the cultural influence. A few steps toward more refined models are discussed at the end: these include the kinds of experiences felt, additional structure in the random interaction topology (such as dilution, to move beyond the plain mean-field approach), and correlations between the experiences felt by the two parties (biasing the distribution of the couplings); there we also show the robustness of our approach.

  10. Acute effects of single and multiple level thoracic manipulations on chronic mechanical neck pain: a randomized controlled trial

    PubMed Central

    Puntumetakul, Rungthip; Suvarnnato, Thavatchai; Werasirirat, Phurichaya; Uthaikhup, Sureeporn; Yamauchi, Junichiro; Boucaut, Rose

    2015-01-01

    Background Thoracic spine manipulation has become a popular alternative to local cervical manipulative therapy for mechanical neck pain. This study investigated the acute effects of single-level and multiple-level thoracic manipulations on chronic mechanical neck pain (CMNP). Methods Forty-eight patients with CMNP were randomly allocated to single-level thoracic manipulation (STM) at T6–T7 or multiple-level thoracic manipulation (MTM), or to a control group (prone lying). Cervical range of motion (CROM), visual analog scale (VAS), and the Thai version of the Neck Disability Index (NDI-TH) scores were measured at baseline, and at 24-hour and at 1-week follow-up. Results At 24-hour and 1-week follow-up, neck disability and pain levels were significantly (P<0.05) improved in the STM and MTM groups compared with the control group. CROM in flexion and left lateral flexion were increased significantly (P<0.05) in the STM group when compared with the control group at 1-week follow-up. The CROM in right rotation was increased significantly after MTM compared to the control group (P<0.05) at 24-hour follow-up. There were no statistically significant differences in neck disability, pain level at rest, and CROM between the STM and MTM groups. Conclusion These results suggest that both single-level and multiple-level thoracic manipulation improve neck disability, pain levels, and CROM at 24-hour and 1-week follow-up in patients with CMNP. PMID:25624764

  11. Statistical process control in nursing research.

    PubMed

    Polit, Denise F; Chaboyer, Wendy

    2012-02-01

    In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
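    A minimal sketch of the kind of control chart described above, using a Shewhart individuals (XmR) chart: the centre line and 3-sigma limits are estimated from a baseline period via the average moving range, and later points falling outside the limits signal special cause variation. The data and the week-20 intervention below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Weekly compliance rates: baseline common-cause variation, then a practice
# improvement introduced at week 21 (all numbers are illustrative).
baseline = rng.normal(0.70, 0.04, 20)
post = rng.normal(0.82, 0.04, 20)
rates = np.concatenate([baseline, post])

# Shewhart individuals chart: centre line and 3-sigma control limits estimated
# from the pre-intervention period, with sigma from the average moving range.
centre = baseline.mean()
sigma = np.abs(np.diff(baseline)).mean() / 1.128     # d2 constant for moving ranges of 2
ucl, lcl = centre + 3 * sigma, centre - 3 * sigma

special_cause = np.flatnonzero((rates > ucl) | (rates < lcl))
print(f"centre = {centre:.3f}, control limits = [{lcl:.3f}, {ucl:.3f}]")
print("weeks signalling special-cause variation:", special_cause + 1)
```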

  12. The Statistical Basis of Chemical Equilibria.

    ERIC Educational Resources Information Center

    Hauptmann, Siegfried; Menger, Eva

    1978-01-01

    Describes a machine which demonstrates the statistical bases of chemical equilibrium, and in doing so conveys insight into the connections among statistical mechanics, quantum mechanics, Maxwell Boltzmann statistics, statistical thermodynamics, and transition state theory. (GA)

  13. CO2 driven endotracheal tube cuff control in critically ill patients: A randomized controlled study.

    PubMed

    De Pascale, Gennaro; Pennisi, Mariano Alberto; Vallecoccia, Maria Sole; Bello, Giuseppe; Maviglia, Riccardo; Montini, Luca; Di Gravio, Valentina; Cutuli, Salvatore Lucio; Conti, Giorgio; Antonelli, Massimo

    2017-01-01

    To determine the safety and clinical efficacy of an innovative integrated airway system (AnapnoGuard™ 100 system) that continuously monitors and controls the cuff pressure (Pcuff), while facilitating the aspiration of subglottic secretions (SS). This was a prospective, single centre, open-label, randomized, controlled feasibility and safety trial. The primary endpoint of the study was the rate of device related adverse events (AE) and serious AE (SAE) as a result of using AnapnoGuard (AG) 100 during mechanical ventilation. Secondary endpoints were: (1) mechanical complications rate; (2) ICU staff satisfaction; (3) VAP occurrence; (4) length of mechanical ventilation; (5) length of Intensive Care Unit stay and mortality; (6) volume of evacuated subglottic secretions. Sixty patients were randomized to be intubated with the AG endotracheal-tube (ETT) and connected to the AG 100 system allowing Pcuff adjustment and SS aspiration; or with an ETT combined with SS drainage and Pcuff controlled manually. No difference in adverse events rate was identified between the groups. The use of the AG system was associated with a significantly higher incidence of Pcuff determinations in the safety range (97.3% vs. 71%; p<0.01) and a trend to a greater volume of aspirated SS secretions: 192.0 [64-413] ml vs. 150 [50-200] ml, p = 0.19 (total); 57.8 [20-88.7] ml vs. 50 [18.7-62] ml, p = 0.11 (daily). No inter-group difference was detected using the AG system vs. controls in terms of post-extubation throat pain level (0 [0-2] vs. 0 [0-3]; p = 0.7), hoarseness (42.9% vs. 75%; p = 0.55) and tracheal mucosa oedema (16.7% vs. 10%; p = 0.65). Patients enrolled in the AG group had a trend toward a reduced risk of ventilator-associated pneumonia (VAP) (14.8% vs. 40%; p = 0.06), and VAP episodes in this group were more frequently monomicrobial (25% vs. 70%; p = 0.03). No statistically significant difference was observed in duration of mechanical ventilation, ICU stay, and mortality. The use of the AG 100 system and AG tube in critically ill intubated patients is safe and effective in Pcuff control and SS drainage. Its protective role against VAP needs to be confirmed in a larger randomized trial. ClinicalTrials.gov NCT01550978. Date of registration: February 21, 2012.

  14. Non-invasive high-frequency ventilation versus bi-phasic continuous positive airway pressure (BP-CPAP) following CPAP failure in infants <1250 g: a pilot randomized controlled trial.

    PubMed

    Mukerji, A; Sarmiento, K; Lee, B; Hassall, K; Shah, V

    2017-01-01

    Non-invasive high-frequency ventilation (NIHFV), a relatively new modality, is gaining popularity despite limited data. We sought to evaluate the effectiveness of NIHFV versus bi-phasic continuous positive airway pressure (BP-CPAP) in preterm infants failing CPAP. Infants with BW <1250 g on CPAP were randomly assigned to NIHFV or BP-CPAP if they met pre-determined criteria for CPAP failure. Infants were eligible for randomization after 72 h of age and until they reached 2000 g. Guidelines for adjustment of settings and criteria for failure of the assigned mode were implemented. The primary aim was to assess the feasibility of a larger trial. In addition, failure of the assigned non-invasive respiratory support (NRS) mode, invasive mechanical ventilation (MV) 72 h and 7 days post-randomization, and bronchopulmonary dysplasia (BPD) were assessed. Thirty-nine infants were randomized to NIHFV (N=16) or BP-CPAP (N=23). There were no significant differences in mean (s.d.) postmenstrual age (28.6 (1.5) versus 29.0 (2.3) weeks, P=0.47), mean (s.d.) weight at randomization (965.0 (227.0) versus 958.1 (310.4) g, P=0.94) or other baseline demographics between the groups. Failure of the assigned NRS mode was lower with NIHFV (37.5 versus 65.2%, P=0.09), although not statistically significant. There were no differences in rates of invasive MV 72 h and 7 days post-randomization or BPD. NIHFV was not superior to BP-CPAP in this pilot study. The effectiveness of NIHFV needs to be proven in larger multi-center, appropriately powered trials before widespread implementation.

  15. Correlated randomness: Some examples of exotic statistical physics

    NASA Astrophysics Data System (ADS)

    Stanley, H. Eugene

    2005-05-01

    One challenge of biology, medicine, and economics is that the systems treated by these sciences have no perfect metronome in time and no perfect spatial architecture -- crystalline or otherwise. Nonetheless, as if by magic, out of nothing but randomness one finds remarkably fine-tuned processes in time and remarkably fine-tuned structures in space. To understand this 'miracle', one might consider placing aside the human tendency to see the universe as a machine. Instead, one might address the challenge of uncovering how, through randomness (albeit, as we shall see, strongly correlated randomness), one can arrive at many spatial and temporal patterns in biology, medicine, and economics. Inspired by principles developed by statistical physics over the past 50 years -- scale invariance and universality -- we review some recent applications of correlated randomness to fields that might startle Boltzmann if he were alive today.

  16. Tensor Minkowski Functionals for random fields on the sphere

    NASA Astrophysics Data System (ADS)

    Chingangbam, Pravabati; Yogendran, K. P.; Joby, P. K.; Ganesan, Vidhya; Appleby, Stephen; Park, Changbom

    2017-12-01

    We generalize the translation invariant tensor-valued Minkowski Functionals which are defined on two-dimensional flat space to the unit sphere. We apply them to level sets of random fields. The contours enclosing boundaries of level sets of random fields give a spatial distribution of random smooth closed curves. We outline a method to compute the tensor-valued Minkowski Functionals numerically for any random field on the sphere. Then we obtain analytic expressions for the ensemble expectation values of the matrix elements for isotropic Gaussian and Rayleigh fields. The results hold on flat as well as any curved space with affine connection. We elucidate the way in which the matrix elements encode information about the Gaussian nature and statistical isotropy (or departure from isotropy) of the field. Finally, we apply the method to maps of the Galactic foreground emissions from the 2015 PLANCK data and demonstrate their high level of statistical anisotropy and departure from Gaussianity.

  17. A comparative study of restricted randomization procedures for multiarm trials with equal or unequal treatment allocation ratios.

    PubMed

    Ryeznik, Yevgen; Sverdlov, Oleksandr

    2018-06-04

    Randomization designs for multiarm clinical trials are increasingly used in practice, especially in phase II dose-ranging studies. Many new methods have been proposed in the literature; however, there is lack of systematic, head-to-head comparison of the competing designs. In this paper, we systematically investigate statistical properties of various restricted randomization procedures for multiarm trials with fixed and possibly unequal allocation ratios. The design operating characteristics include measures of allocation balance, randomness of treatment assignments, variations in the allocation ratio, and statistical characteristics such as type I error rate and power. The results from the current paper should help clinical investigators select an appropriate randomization procedure for their clinical trial. We also provide a web-based R shiny application that can be used to reproduce all results in this paper and run simulations under additional user-defined experimental scenarios. Copyright © 2018 John Wiley & Sons, Ltd.
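    As one concrete example of a restricted randomization procedure with a fixed, unequal allocation ratio, the sketch below implements permuted block randomization for a three-arm trial with a 2:2:1 target ratio; the ratio and block construction are illustrative choices, not a specific design from the paper.

```python
import numpy as np

def permuted_block_randomization(n_subjects, ratios=(2, 2, 1), seed=0):
    """Restricted randomization for a multiarm trial with a fixed (possibly
    unequal) allocation ratio, e.g. 2:2:1 across three arms.  Each block
    contains the arms in exactly the target ratio, in random order.
    """
    rng = np.random.default_rng(seed)
    block = np.repeat(np.arange(len(ratios)), ratios)   # one block = 2+2+1 assignments
    schedule = []
    while len(schedule) < n_subjects:
        rng.shuffle(block)                               # permute the block in place
        schedule.extend(block.tolist())
    return schedule[:n_subjects]

arms = permuted_block_randomization(30)
print("assignments:", arms)
print("arm counts :", [arms.count(a) for a in range(3)])   # close to the 2:2:1 target
```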

  18. Quantifying randomness in real networks

    NASA Astrophysics Data System (ADS)

    Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-10-01

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs.
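    The simplest member of the dk-series can be illustrated with degree-preserving rewiring: repeated double edge swaps randomize a network while keeping its degree sequence fixed, so any property that survives is a statistical consequence of the degree distribution alone. The sketch below uses networkx and a small stand-in graph; the full dk analysis of the paper additionally fixes degree correlations and clustering.

```python
import networkx as nx

# Degree-preserving rewiring generates a "1k-random" graph: the degree sequence
# of the original network is kept fixed while everything else is randomized.
# The dk-series of the paper goes further (2k fixes joint degree correlations,
# 2.5k adds clustering); this sketch covers only the simplest member.
G = nx.karate_club_graph()                      # small stand-in for a real network
R = G.copy()
nx.double_edge_swap(R, nswap=10 * R.number_of_edges(), max_tries=10**5)

print("degrees preserved :", sorted(d for _, d in G.degree()) ==
                             sorted(d for _, d in R.degree()))
print("clustering, real vs 1k-random:",
      round(nx.average_clustering(G), 3), round(nx.average_clustering(R), 3))
```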

  19. Random-field-induced disordering mechanism in a disordered ferromagnet: Between the Imry-Ma and the standard disordering mechanism

    NASA Astrophysics Data System (ADS)

    Andresen, Juan Carlos; Katzgraber, Helmut G.; Schechter, Moshe

    2017-12-01

    Random fields disorder Ising ferromagnets by aligning single spins in the direction of the random field in three space dimensions, or by flipping large ferromagnetic domains at dimensions two and below. While the former requires random fields of typical magnitude similar to the interaction strength, the latter Imry-Ma mechanism only requires infinitesimal random fields. Recently, it has been shown that for dilute anisotropic dipolar systems a third mechanism exists, where the ferromagnetic phase is disordered by finite-size glassy domains at a random field of finite magnitude that is considerably smaller than the typical interaction strength. Using large-scale Monte Carlo simulations and zero-temperature numerical approaches, we show that this mechanism applies to disordered ferromagnets with competing short-range ferromagnetic and antiferromagnetic interactions, suggesting its generality in ferromagnetic systems with competing interactions and an underlying spin-glass phase. A finite-size-scaling analysis of the magnetization distribution suggests that the transition might be first order.

  20. Realizations of highly heterogeneous collagen networks via stochastic reconstruction for micromechanical analysis of tumor cell invasion

    NASA Astrophysics Data System (ADS)

    Nan, Hanqing; Liang, Long; Chen, Guo; Liu, Liyu; Liu, Ruchuan; Jiao, Yang

    2018-03-01

    Three-dimensional (3D) collective cell migration in a collagen-based extracellular matrix (ECM) is among the most significant topics in developmental biology, cancer progression, tissue regeneration, and immune response. Recent studies have suggested that collagen-fiber mediated force transmission in cellularized ECM plays an important role in stress homeostasis and regulation of collective cellular behaviors. Motivated by the recent in vitro observation that oriented collagen can significantly enhance the penetration of migrating breast cancer cells into dense Matrigel which mimics the intravasation process in vivo [Han et al. Proc. Natl. Acad. Sci. USA 113, 11208 (2016), 10.1073/pnas.1610347113], we devise a procedure for generating realizations of highly heterogeneous 3D collagen networks with prescribed microstructural statistics via stochastic optimization. Specifically, a collagen network is represented via the graph (node-bond) model and the microstructural statistics considered include the cross-link (node) density, valence distribution, fiber (bond) length distribution, as well as fiber orientation distribution. An optimization problem is formulated in which the objective function is defined as the squared difference between a set of target microstructural statistics and the corresponding statistics for the simulated network. Simulated annealing is employed to solve the optimization problem by evolving an initial network via random perturbations to generate realizations of homogeneous networks with randomly oriented fibers, homogeneous networks with aligned fibers, heterogeneous networks with a continuous variation of fiber orientation along a prescribed direction, as well as a binary system containing a collagen region with aligned fibers and a dense Matrigel region with randomly oriented fibers. The generation and propagation of active forces in the simulated networks due to polarized contraction of an embedded ellipsoidal cell and a small group of cells are analyzed by considering a nonlinear fiber model incorporating strain hardening upon large stretching and buckling upon compression. Our analysis shows that oriented fibers can significantly enhance long-range force transmission in the network. Moreover, in the oriented-collagen-Matrigel system, the forces generated by a polarized cell in collagen can penetrate deeply into the Matrigel region. The stressed Matrigel fibers could provide contact guidance for the migrating cells, and thus enhance their penetration into Matrigel. This suggests a possible mechanism for the observed enhanced intravasation by oriented collagen.
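    A drastically simplified sketch of the stochastic reconstruction step: a set of fibre orientations is evolved by simulated annealing so that its histogram matches a prescribed aligned target distribution, with the squared difference between simulated and target statistics as the objective and Metropolis acceptance of random single-fibre perturbations. The full procedure in the paper operates on a node-bond network and also matches cross-link density, valence, and fibre-length statistics; all parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

n_fibers, n_bins = 500, 18
bins = np.linspace(0, np.pi, n_bins + 1)
centres = 0.5 * (bins[:-1] + bins[1:])
target = np.exp(-((centres - np.pi / 2) ** 2) / 0.1)
target /= target.sum()                                   # aligned target orientation statistics

def energy(angles):
    hist, _ = np.histogram(angles, bins=bins)
    return np.sum((hist / len(angles) - target) ** 2)    # squared-difference objective

angles = rng.uniform(0, np.pi, n_fibers)                 # initial randomly oriented "network"
E, T = energy(angles), 1e-3
for step in range(40_000):
    i = rng.integers(n_fibers)
    trial = angles.copy()
    trial[i] = rng.uniform(0, np.pi)                     # random perturbation of one fibre
    dE = energy(trial) - E
    if dE < 0 or rng.random() < np.exp(-dE / T):         # Metropolis acceptance rule
        angles, E = trial, E + dE
    T *= 0.9999                                          # slow cooling schedule

print("final objective:", E)
```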

  1. Quantum chaos in ultracold collisions of gas-phase erbium atoms.

    PubMed

    Frisch, Albert; Mark, Michael; Aikawa, Kiyotaka; Ferlaino, Francesca; Bohn, John L; Makrides, Constantinos; Petrov, Alexander; Kotochigova, Svetlana

    2014-03-27

    Atomic and molecular samples reduced to temperatures below one microkelvin, yet still in the gas phase, afford unprecedented energy resolution in probing and manipulating the interactions between their constituent particles. As a result of this resolution, atoms can be made to scatter resonantly on demand, through the precise control of a magnetic field. For simple atoms, such as alkalis, scattering resonances are extremely well characterized. However, ultracold physics is now poised to enter a new regime, where much more complex species can be cooled and studied, including magnetic lanthanide atoms and even molecules. For molecules, it has been speculated that a dense set of resonances in ultracold collision cross-sections will probably exhibit essentially random fluctuations, much as the observed energy spectra of nuclear scattering do. According to the Bohigas-Giannoni-Schmit conjecture, such fluctuations would imply chaotic dynamics of the underlying classical motion driving the collision. This would necessitate new ways of looking at the fundamental interactions in ultracold atomic and molecular systems, as well as perhaps new chaos-driven states of ultracold matter. Here we describe the experimental demonstration that random spectra are indeed found at ultralow temperatures. In the experiment, an ultracold gas of erbium atoms is shown to exhibit many Fano-Feshbach resonances, of the order of three per gauss for bosons. Analysis of their statistics verifies that their distribution of nearest-neighbour spacings is what one would expect from random matrix theory. The density and statistics of these resonances are explained by fully quantum mechanical scattering calculations that locate their origin in the anisotropy of the atoms' potential energy surface. Our results therefore reveal chaotic behaviour in the native interaction between ultracold atoms.
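    The random-matrix benchmark used in the resonance-statistics analysis can be reproduced in a few lines: diagonalize a Gaussian orthogonal ensemble matrix, crudely unfold the bulk of the spectrum to unit mean spacing, and compare the nearest-neighbour spacing histogram with the Wigner surmise P(s) = (π/2) s exp(−πs²/4); a Poisson (non-chaotic) spectrum would instead follow exp(−s). Matrix size, the crude unfolding, and the binning are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(7)

# Nearest-neighbour spacing statistics of a GOE random matrix, the benchmark
# against which measured resonance positions are compared.
n = 1000
A = rng.standard_normal((n, n))
H = (A + A.T) / 2                                   # Gaussian orthogonal ensemble
levels = np.sort(np.linalg.eigvalsh(H))

# Crude unfolding: keep the spectral bulk and rescale to unit mean spacing.
bulk = levels[n // 4: 3 * n // 4]
s = np.diff(bulk)
s /= s.mean()

# Compare the empirical spacing distribution with the Wigner surmise.
hist, edges = np.histogram(s, bins=20, range=(0, 3), density=True)
centres = 0.5 * (edges[:-1] + edges[1:])
wigner = 0.5 * np.pi * centres * np.exp(-np.pi * centres**2 / 4)
print(np.round(np.c_[centres, hist, wigner], 3))    # columns: s, empirical P(s), Wigner P(s)
```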

  2. [Efficacy of epidural steroid injections for chronic lumbar pain syndromes without neurological deficits. A randomized, double blind study as part of a multimodal treatment concept].

    PubMed

    Niemier, K; Schindler, M; Volk, T; Baum, K; Wolf, B; Eberitsch, J; Seidel, W

    2015-07-01

    Chronic lumbar pain syndromes without neurological deficits are generated by a multitude of causes. Functional, morphological and psychosocial factors are discussed. In many cases a diseased intervertebral disc is found on radiological examination, but the clinical relevance of this finding is not clear. For this study it was postulated that a diseased disc results in a local inflammatory reaction, thereby causing pain and impairing the treatability of patients. An epidural injection of steroids can reduce inflammation and therefore improve treatability and ultimately treatment outcome. A double-blind, randomized, prospective trial was carried out. Patients treated in hospital for a chronic lumbar pain syndrome without neurological deficits within a multimodal treatment program were screened for indications for an epidural steroid injection (e.g. diseased lumbar disc and intention to treat). Patients eligible for the study were randomized into two groups. The treatment group received an epidural injection of 80 mg triamcinolone and 8 ml bupivacaine 0.25 %. The control group received only an epidural injection of 8 ml bupivacaine 0.25 %. In both groups pain intensity and treatability showed a statistically significant improvement after the epidural injection. The differences between the control and treatment groups were small and not clinically relevant. A small subgroup might profit from the steroid injection. In addition, treatability was dependent on psychometric values, and the long-term outcome on a reduction of musculoskeletal dysfunctions. After the epidural injection, the decrease in pain and increase in treatability were statistically significant. The mechanism of the improvement is not clear and should be examined further. The epidural injection of a steroid in this subgroup of patients did not lead to a clinical improvement in the outcome.

  3. What's Missing in Teaching Probability and Statistics: Building Cognitive Schema for Understanding Random Phenomena

    ERIC Educational Resources Information Center

    Kuzmak, Sylvia

    2016-01-01

    Teaching probability and statistics is more than teaching the mathematics itself. Historically, the mathematics of probability and statistics was first developed through analyzing games of chance such as the rolling of dice. This article makes the case that the understanding of probability and statistics is dependent upon building a…

  4. Development and Assessment of a Preliminary Randomization-Based Introductory Statistics Curriculum

    ERIC Educational Resources Information Center

    Tintle, Nathan; VanderStoep, Jill; Holmes, Vicki-Lynn; Quisenberry, Brooke; Swanson, Todd

    2011-01-01

    The algebra-based introductory statistics course is the most popular undergraduate course in statistics. While there is a general consensus for the content of the curriculum, the recent Guidelines for Assessment and Instruction in Statistics Education (GAISE) have challenged the pedagogy of this course. Additionally, some arguments have been made…

  5. The Fluctuation-Dissipation Theorem of Colloidal Particle's energy on 2D Periodic Substrates: A Monte Carlo Study of thermal noise-like fluctuation and diffusion like Brownian motion

    NASA Astrophysics Data System (ADS)

    Najafi, Amin

    2014-05-01

    Using Monte Carlo simulations, we have calculated mean-square statistical-mechanical fluctuations, such as those of the energy of colloids arranged on square 2D periodic substrates and interacting via a long-range screened Coulomb potential on a given, fixed substrate. Random fluctuations with small deviations from thermodynamic equilibrium arise from the granular structure of the colloids and appear as thermal diffusion with a Gaussian distribution. The variations exhibit the linear form of the fluctuation-dissipation theorem for the energy of particles constituting a canonical ensemble undergoing a continuous diffusion process. The noise-like variation of the energy per particle and of the order parameter versus the Brownian displacement, the sum of a large number of random steps of the particles in the low-temperature phase, likewise indicates a Markovian process on the colloidal particle configuration.
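
    As a generic illustration of the kind of mean-square energy fluctuation computed in such Monte Carlo studies, the sketch below runs a Metropolis simulation of particles with a screened-Coulomb (Yukawa) pair potential in a periodic box and records the variance of the total energy. The minimal 2D model and all parameters are illustrative assumptions, not the system studied above.

        import numpy as np

        rng = np.random.default_rng(2)

        L, n, kappa, beta = 10.0, 25, 1.0, 2.0      # box size, particles, screening, 1/kT
        pos = rng.uniform(0, L, size=(n, 2))

        def total_energy(p):
            """Screened-Coulomb (Yukawa) energy with the minimum-image convention."""
            e = 0.0
            for i in range(n):
                d = p[i + 1:] - p[i]
                d -= L * np.round(d / L)            # periodic minimum image
                r = np.sqrt((d ** 2).sum(axis=1))
                e += np.sum(np.exp(-kappa * r) / r)
            return e

        E = total_energy(pos)
        samples = []
        for step in range(15000):
            i = rng.integers(n)
            old = pos[i].copy()
            pos[i] = (pos[i] + rng.normal(0, 0.3, 2)) % L
            dE = total_energy(pos) - E
            if dE < 0 or rng.random() < np.exp(-beta * dE):
                E += dE                             # accept the move
            else:
                pos[i] = old                        # reject and restore
            if step > 5000 and step % 10 == 0:
                samples.append(E)

        samples = np.array(samples)
        # In the canonical ensemble, Var(E) = k_B T^2 C_V, a fluctuation-dissipation-type relation.
        print("mean energy:", samples.mean())
        print("mean-square fluctuation <dE^2>:", samples.var())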

  6. Battles between an insurgent army and an advanced army - focus on strategy

    NASA Astrophysics Data System (ADS)

    Sen, Surajit; Shanahan, Linda

    2008-03-01

    Detailed and aggregate analyses of the outcomes of past battles, focusing on rates of troop losses or on the ratios of forces on each side, are at the heart of present knowledge about battles. Here we present studies, based on non-equilibrium statistical mechanics, of the possible outcomes of strategic battles fought by a "blue" army against insurgency-based attacks by a well-matched opponent, the "red" army, in red territory. We assume that the red army attacks with randomly varying force levels to potentially confuse the blue army and drive its strategies. The temporal evolution of the model battles incorporates randomness in the deployment of the reds and hence possesses an attendant history dependence. Our results reveal that while unpredictable events play a major role in battles, a balance between the risk of exposure in a battlefield and the use of short-range intelligence is needed to determine whether one side can decimate the other, and hence force a battle to end.

  7. Statistical mechanics of complex economies

    NASA Astrophysics Data System (ADS)

    Bardoscia, Marco; Livan, Giacomo; Marsili, Matteo

    2017-04-01

    In the pursuit of ever-increasing efficiency and growth, our economies have evolved to remarkable degrees of complexity, with nested production processes feeding each other in order to create products of greater sophistication from less sophisticated ones, down to raw materials. The engine of such an expansion has been competitive markets that, according to general equilibrium theory (GET), achieve efficient allocations under specific conditions. We study large random economies within the GET framework, as templates of complex economies, and we find that a non-trivial phase transition occurs: the economy freezes in a state where all production processes collapse when either the number of primary goods or the number of available technologies falls below a critical threshold. As in other examples of phase transitions in large random systems, this is an unintended consequence of the growth in complexity. Our findings suggest that the Industrial Revolution can be regarded as a sharp transition between different phases, but also imply that well-developed economies can collapse if too many intermediate goods are introduced.

  8. Charting the Replica Symmetric Phase

    NASA Astrophysics Data System (ADS)

    Coja-Oghlan, Amin; Efthymiou, Charilaos; Jaafari, Nor; Kang, Mihyun; Kapetanopoulos, Tobias

    2018-02-01

    Diluted mean-field models are spin systems whose geometry of interactions is induced by a sparse random graph or hypergraph. Such models play an eminent role in the statistical mechanics of disordered systems as well as in combinatorics and computer science. In a path-breaking paper based on the non-rigorous `cavity method', physicists predicted not only the existence of a replica symmetry breaking phase transition in such models but also sketched a detailed picture of the evolution of the Gibbs measure within the replica symmetric phase and its impact on important problems in combinatorics, computer science and physics (Krzakala et al. in Proc Natl Acad Sci 104:10318-10323, 2007). In this paper we rigorise this picture completely for a broad class of models, encompassing the Potts antiferromagnet on the random graph, the k-XORSAT model and the diluted k-spin model for even k. We also prove a conjecture about the detection problem in the stochastic block model that has received considerable attention (Decelle et al. in Phys Rev E 84:066106, 2011).

  9. The Hot (Invisible?) Hand: Can Time Sequence Patterns of Success/Failure in Sports Be Modeled as Repeated Random Independent Trials?

    PubMed Central

    Yaari, Gur; Eisenmann, Shmuel

    2011-01-01

    The long lasting debate initiated by Gilovich, Vallone and Tversky in is revisited: does a “hot hand” phenomenon exist in sports? Hereby we come back to one of the cases analyzed by the original study, but with a much larger data set: all free throws taken during five regular seasons () of the National Basketball Association (NBA). Evidence supporting the existence of the “hot hand” phenomenon is provided. However, while statistical traces of this phenomenon are observed in the data, an open question still remains: are these non random patterns a result of “success breeds success” and “failure breeds failure” mechanisms or simply “better” and “worse” periods? Although free throws data is not adequate to answer this question in a definite way, we speculate based on it, that the latter is the dominant cause behind the appearance of the “hot hand” phenomenon in the data. PMID:21998630

  10. The hot (invisible?) hand: can time sequence patterns of success/failure in sports be modeled as repeated random independent trials?

    PubMed

    Yaari, Gur; Eisenmann, Shmuel

    2011-01-01

    The long lasting debate initiated by Gilovich, Vallone and Tversky in [Formula: see text] is revisited: does a "hot hand" phenomenon exist in sports? Hereby we come back to one of the cases analyzed by the original study, but with a much larger data set: all free throws taken during five regular seasons ([Formula: see text]) of the National Basketball Association (NBA). Evidence supporting the existence of the "hot hand" phenomenon is provided. However, while statistical traces of this phenomenon are observed in the data, an open question still remains: are these non random patterns a result of "success breeds success" and "failure breeds failure" mechanisms or simply "better" and "worse" periods? Although free throws data is not adequate to answer this question in a definite way, we speculate based on it, that the latter is the dominant cause behind the appearance of the "hot hand" phenomenon in the data.
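
    A minimal way to probe the "repeated random independent trials" null hypothesis discussed above is to compare the empirical success probability after a success with that after a failure, and to calibrate the difference against shuffled sequences. The synthetic free-throw record below is an illustrative assumption, not the NBA dataset analyzed in the study.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic free-throw record: 1 = success, 0 = failure (assumed 75% shooter).
        shots = (rng.random(2000) < 0.75).astype(int)

        def hot_hand_gap(seq):
            """P(success | previous success) - P(success | previous failure)."""
            prev, curr = seq[:-1], seq[1:]
            return curr[prev == 1].mean() - curr[prev == 0].mean()

        observed = hot_hand_gap(shots)

        # Null distribution: shuffling the same shots destroys any temporal pattern.
        null = np.array([hot_hand_gap(rng.permutation(shots)) for _ in range(2000)])
        p_value = np.mean(null >= observed)

        print(f"observed gap = {observed:.3f}, permutation p-value = {p_value:.3f}")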

  11. Yoga in the schools: a systematic review of the literature.

    PubMed

    Serwacki, Michelle L; Cook-Cottone, Catherine

    2012-01-01

    The objective of this research was to examine the evidence for delivering yoga-based interventions in schools. An electronic literature search was conducted to identify peer-reviewed, published studies in which yoga and a meditative component (breathing practices or meditation) were taught to youths in a school setting. Pilot studies, single-cohort, quasi-experimental, and randomized clinical trials were considered. Study quality was evaluated and summarized. Twelve published studies were identified. Samples for which yoga was implemented as an intervention included youths with autism, intellectual disability, learning disability, and emotional disturbance, as well as typically developing youths. Although effects of participating in school-based yoga programs appeared to be beneficial for the most part, methodological limitations, including lack of randomization, small samples, limited detail regarding the intervention, and statistical ambiguities, curtailed the ability to provide definitive conclusions or recommendations. Findings speak to the need for greater methodological rigor and an increased understanding of the mechanisms of success for school-based yoga interventions.

  12. Branched flow and caustics in random media with magnetic fields

    NASA Astrophysics Data System (ADS)

    Metzger, Jakob; Fleischmann, Ragnar; Geisel, Theo

    2009-03-01

    Classical particles as well as quantum mechanical waves exhibit complex behaviour when propagating through random media. One of the dominant features of the dynamics in correlated, weak disorder potentials is the branching of the flow. This can be observed in several physical systems, most notably in the electron flow in two-dimensional electron gases [1], and has also been used to describe the formation of freak waves [2]. We present advances in the theoretical understanding and numerical simulation of classical branched flows in magnetic fields. In particular, we study branching statistics and branch density profiles. Our results have direct consequences for experiments which measure transport properties in electronic systems [3]. [1] e.g., M. A. Topinka et al., Nature 410, 183 (2001); M. P. Jura et al., Nature Physics 3, 841 (2007). [2] E. J. Heller, L. Kaplan and A. Dahlen, J. Geophys. Res. 113, C09023 (2008). [3] J. J. Metzger, R. Fleischmann and T. Geisel, in preparation.

  13. Randomized clinical trial of extended use of a hydrophobic condenser humidifier: 1 vs. 7 days.

    PubMed

    Thomachot, Laurent; Leone, Marc; Razzouk, Karim; Antonini, François; Vialet, Renaud; Martin, Claude

    2002-01-01

    To determine whether extended use (7 days) would affect the efficiency of heat and water preservation of a hydrophobic condenser humidifier, as well as the rate of ventilation-acquired pneumonia, compared with 1 day of use. Prospective, controlled, randomized, non-blinded clinical study. Twelve-bed intensive care unit of a university hospital. One hundred and fifty-five consecutive patients undergoing mechanical ventilation for > or = 48 hrs. After randomization, patients were allocated to one of the two following groups: a) heat and moisture exchangers (HMEs) changed every 24 hrs; b) HMEs changed only once a week. Devices in both groups could be changed at the discretion of the staff when signs of occlusion or increased resistance were identified. Efficient airway humidification and heating were assessed by clinical variables (numbers of tracheal suctionings and instillations required, peak and mean airway pressures). The frequency rates of bronchial colonization and ventilation-acquired pneumonia were evaluated by using clinical and microbiological criteria. Endotracheal tube occlusion, ventilatory support variables, duration of mechanical ventilation, length of intensive care, acquired multiorgan dysfunction, and mortality rates also were recorded. The two groups were similar at the time of randomization. Endotracheal tube occlusion never occurred. In the targeted population (patients ventilated for > or = 7 days), the frequency rate of ventilation-acquired pneumonia was 24% in the HME 1-day group and 17% in the HME 7-day group (p > .05, not significant). Ventilation-acquired pneumonia rates per 1000 ventilatory support days were 16.4/1000 in the HME 1-day group and 12.4/1000 in the HME 7-day group (p > .05, not significant). No statistically significant differences were found between the two groups for duration of mechanical ventilation, intensive care unit length of stay, acquired organ system derangements, and mortality rate. There was indirect evidence of very little, if any, change in HME resistance. Changing the studied hydrophobic HME after 7 days did not affect efficiency, increase resistance, or alter bacterial colonization. The frequency rate of ventilation-acquired pneumonia was also unchanged. Use of HMEs for > 24 hrs and up to 7 days is safe.

  14. Précis of statistical significance: rationale, validity, and utility.

    PubMed

    Chow, S L

    1998-04-01

    The null-hypothesis significance-test procedure (NHSTP) is defended in the context of the theory-corroboration experiment, as well as the following contrasts: (a) substantive hypotheses versus statistical hypotheses, (b) theory corroboration versus statistical hypothesis testing, (c) theoretical inference versus statistical decision, (d) experiments versus nonexperimental studies, and (e) theory corroboration versus treatment assessment. The null hypothesis can be true because it is the hypothesis that errors are randomly distributed in data. Moreover, the null hypothesis is never used as a categorical proposition. Statistical significance means only that chance influences can be excluded as an explanation of data; it does not identify the nonchance factor responsible. The experimental conclusion is drawn with the inductive principle underlying the experimental design. A chain of deductive arguments gives rise to the theoretical conclusion via the experimental conclusion. The anomalous relationship between statistical significance and the effect size often used to criticize NHSTP is more apparent than real. The absolute size of the effect is not an index of evidential support for the substantive hypothesis. Nor is the effect size, by itself, informative as to the practical importance of the research result. Being a conditional probability, statistical power cannot be the a priori probability of statistical significance. The validity of statistical power is debatable because statistical significance is determined with a single sampling distribution of the test statistic based on H0, whereas it takes two distributions to represent statistical power or effect size. Sample size should not be determined in the mechanical manner envisaged in power analysis. It is inappropriate to criticize NHSTP for nonstatistical reasons. At the same time, neither effect size, nor confidence interval estimate, nor posterior probability can be used to exclude chance as an explanation of data. Neither can any of them fulfill the nonstatistical functions expected of them by critics.

  15. Intermittent pneumatic compression to prevent venous thromboembolism in patients with high risk of bleeding hospitalized in intensive care units: the CIREA1 randomized trial.

    PubMed

    Vignon, Philippe; Dequin, Pierre-François; Renault, Anne; Mathonnet, Armelle; Paleiron, Nicolas; Imbert, Audrey; Chatellier, Delphine; Gissot, Valérie; Lhéritier, Gwenaelle; Aboyans, Victor; Prat, Gwenael; Garot, Denis; Boulain, Thierry; Diehl, Jean-Luc; Bressollette, Luc; Delluc, Aurélien; Lacut, Karine

    2013-05-01

    Venous thromboembolism (VTE) is a frequent and serious problem in intensive care units (ICU). Anticoagulant treatments have demonstrated their efficacy in preventing VTE. However, when the bleeding risk is high, they are contraindicated, and mechanical devices are recommended. To date, mechanical prophylaxis has not been rigorously evaluated in any trials in ICU patients. In this multicenter, open-label, randomized trial with blinded evaluation of endpoints, we randomly assigned 407 patients with a high risk of bleeding to receive intermittent pneumatic compression (IPC) associated with graduated compression stockings (GCS) or GCS alone for 6 days during their ICU stay. The primary endpoint was the occurrence of a VTE between days 1 and 6, including nonfatal symptomatic documented VTE, or death due to a pulmonary embolism, or asymptomatic deep vein thrombosis detected by ultrasonography systematically performed on day 6. The primary outcome was assessed in 363 patients (89.2%). By day 6, the incidence of the primary outcome was 5.6% (10 of 179 patients) in the IPC + GCS group and 9.2% (17 of 184 patients) in the GCS group (relative risk 0.60; 95% confidence interval 0.28-1.28; p = 0.19). Tolerance of IPC was poor in only 12 patients (6.0%). No intergroup difference in mortality rate was observed. With the limitation of a low statistical power, our results do not support the superiority of the combination of IPC + GCS compared to GCS alone to prevent VTE in ICU patients at high risk of bleeding.

  16. Weak Ergodicity Breaking of Receptor Motion in Living Cells Stemming from Random Diffusivity

    NASA Astrophysics Data System (ADS)

    Manzo, Carlo; Torreno-Pina, Juan A.; Massignan, Pietro; Lapeyre, Gerald J.; Lewenstein, Maciej; Garcia Parajo, Maria F.

    2015-01-01

    Molecular transport in living systems regulates numerous processes underlying biological function. Although many cellular components exhibit anomalous diffusion, only recently has the subdiffusive motion been associated with nonergodic behavior. These findings have stimulated new questions for their implications in statistical mechanics and cell biology. Is nonergodicity a common strategy shared by living systems? Which physical mechanisms generate it? What are its implications for biological function? Here, we use single-particle tracking to demonstrate that the motion of dendritic cell-specific intercellular adhesion molecule 3-grabbing nonintegrin (DC-SIGN), a receptor with unique pathogen-recognition capabilities, reveals nonergodic subdiffusion on living-cell membranes. In contrast to previous studies, this behavior is incompatible with transient immobilization, and, therefore, it cannot be interpreted according to continuous-time random-walk theory. We show that the receptor undergoes changes of diffusivity, consistent with the current view of the cell membrane as a highly dynamic and diverse environment. Simulations based on a model of an ordinary random walk in complex media quantitatively reproduce all our observations, pointing toward diffusion heterogeneity as the cause of DC-SIGN behavior. By studying different receptor mutants, we further correlate receptor motion to its molecular structure, thus establishing a strong link between nonergodicity and biological function. These results underscore the role of disorder in cell membranes and its connection with function regulation. Because of its generality, our approach offers a framework to interpret anomalous transport in other complex media where dynamic heterogeneity might play a major role, such as those found, e.g., in soft condensed matter, geology, and ecology.
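
    A toy version of the "ordinary random walk with randomly changing diffusivity" picture invoked above can be simulated directly: each trajectory repeatedly draws a new diffusion coefficient, and the spread of time-averaged mean-squared displacements across trajectories signals weak ergodicity breaking. All parameter choices below are illustrative assumptions, not the model fitted in the study.

        import numpy as np

        rng = np.random.default_rng(4)

        n_traj, n_steps, dt = 100, 2000, 1.0

        def trajectory():
            """2D random walk whose diffusivity is redrawn at random times."""
            x = np.zeros((n_steps, 2))
            d = rng.exponential(1.0)                  # initial diffusion coefficient
            for t in range(1, n_steps):
                if rng.random() < 0.002:              # occasionally switch diffusivity
                    d = rng.exponential(1.0)
                x[t] = x[t - 1] + rng.normal(0, np.sqrt(2 * d * dt), 2)
            return x

        def time_averaged_msd(x, lag):
            disp = x[lag:] - x[:-lag]
            return np.mean(np.sum(disp ** 2, axis=1))

        lag = 10
        tamsd = np.array([time_averaged_msd(trajectory(), lag) for _ in range(n_traj)])

        # For an ergodic process the time-averaged MSD is the same for every trajectory;
        # a broad spread across trajectories indicates weak ergodicity breaking.
        print("mean TA-MSD:", tamsd.mean())
        print("relative spread (std/mean):", tamsd.std() / tamsd.mean())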

  17. Persistant Spectral Hole-Burning: Photon-Gating and Fundamental Statistical Limits

    DTIC Science & Technology

    1989-11-03

    pentacene inhomogeneous line that results from the statistics of independent, additive random variables. For this data, Nil - 10'. The rms amplitude...features in inhomogeneous lines. To illustrate this, Figure 5 shows a portion of the optical spectrum of pentacene in p-terphenyl before and after a...contained in each irradiated spot of recording medium. The stress-induced variations in the local environment of the storage centers are random in nature

  18. A generalization of random matrix theory and its application to statistical physics.

    PubMed

    Wang, Duan; Zhang, Xin; Horvatic, Davor; Podobnik, Boris; Eugene Stanley, H

    2017-02-01

    To study the statistical structure of cross-correlations in empirical data, we generalize random matrix theory and propose a new method of cross-correlation analysis, known as autoregressive random matrix theory (ARRMT). ARRMT takes into account the influence of auto-correlations in the study of cross-correlations in multiple time series. We first analytically and numerically determine how auto-correlations affect the eigenvalue distribution of the correlation matrix. Then we introduce ARRMT with a detailed procedure of how to implement the method. Finally, we illustrate the method using two examples, drawn from inflation rates and from air pressure data for 95 US cities.
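
    The basic effect motivating the method above can be reproduced with a short numerical experiment: eigenvalues of the cross-correlation matrix of independent AR(1) series spread beyond the Marchenko-Pastur band expected for white noise, so a white-noise null would flag spurious "signal". This sketch only illustrates that effect under assumed parameters; it is not the authors' ARRMT procedure.

        import numpy as np

        rng = np.random.default_rng(5)

        n_series, n_obs, phi = 50, 500, 0.7          # 50 series, AR(1) coefficient 0.7

        # Independent AR(1) time series (no true cross-correlation).
        x = np.zeros((n_series, n_obs))
        for t in range(1, n_obs):
            x[:, t] = phi * x[:, t - 1] + rng.normal(size=n_series)

        # Empirical cross-correlation matrix and its eigenvalues.
        z = (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)
        corr = z @ z.T / n_obs
        eigs = np.linalg.eigvalsh(corr)

        # Marchenko-Pastur band for i.i.d. (white-noise) data with q = N/T.
        q = n_series / n_obs
        lam_min, lam_max = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2

        print("largest eigenvalue:", eigs.max())
        print("white-noise Marchenko-Pastur upper edge:", lam_max)
        print("fraction of eigenvalues outside the white-noise band:",
              np.mean((eigs < lam_min) | (eigs > lam_max)))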

  19. Effects of Interventions on Survival in Acute Respiratory Distress Syndrome: an Umbrella Review of 159 Published Randomized Trials and 29 Meta-analyses

    PubMed Central

    Tonelli, Adriano R.; Zein, Joe; Adams, Jacob; Ioannidis, John P.A.

    2014-01-01

    Purpose Multiple interventions have been tested in acute respiratory distress syndrome (ARDS). We examined the entire agenda of published randomized controlled trials (RCTs) in ARDS that reported on mortality and of respective meta-analyses. Methods We searched PubMed, the Cochrane Library and Web of Knowledge until July 2013. We included RCTs in ARDS published in English. We excluded trials of newborns and children; and those on short-term interventions, ARDS prevention or post-traumatic lung injury. We also reviewed all meta-analyses of RCTs in this field that addressed mortality. Treatment modalities were grouped in five categories: mechanical ventilation strategies and respiratory care, enteral or parenteral therapies, inhaled/intratracheal medications, nutritional support and hemodynamic monitoring. Results We identified 159 published RCTs of which 93 had overall mortality reported (n= 20,671 patients) - 44 trials (14,426 patients) reported mortality as a primary outcome. A statistically significant survival benefit was observed in 8 trials (7 interventions) and two trials reported an adverse effect on survival. Among RCTs with >50 deaths in at least 1 treatment arm (n=21), 2 showed a statistically significant mortality benefit of the intervention (lower tidal volumes and prone positioning), 1 showed a statistically significant mortality benefit only in adjusted analyses (cisatracurium) and 1 (high-frequency oscillatory ventilation) showed a significant detrimental effect. Across 29 meta-analyses, the most consistent evidence was seen for low tidal volumes and prone positioning in severe ARDS. Conclusions There is limited supportive evidence that specific interventions can decrease mortality in ARDS. While low tidal volumes and prone positioning in severe ARDS seem effective, most sporadic findings of interventions suggesting reduced mortality are not corroborated consistently in large-scale evidence including meta-analyses. PMID:24667919

  20. Randomized Clinical Trial Comparing Low Density versus High Density Meshes in Patients with Bilateral Inguinal Hernia.

    PubMed

    Carro, Jose Luis Porrero; Riu, Sol Villar; Lojo, Beatriz Ramos; Latorre, Lucia; Garcia, Maria Teresa Alonso; Pardo, Benito Alcaide; Naranjo, Oscar Bonachia; Herrero, Alberto Marcos; Cabezudo, Carlos Sanchez; Herreras, Esther Quiros

    2017-12-01

    We present a randomized clinical trial to compare postoperative pain, complications, feeling of a foreign body, and recurrence between heavyweight and lightweight meshes in patients with bilateral groin hernia. Sixty-seven patients with bilateral hernia were included in our study. In each patient, the side receiving the lightweight mesh was decided by a random-number table. Pain scores were measured by visual analogue scale on the 1st, 3rd, 5th, and 7th postoperative days, and one year after surgery. There were no statistically significant differences between the two meshes in postoperative complications. Regarding mean pain scores, statistically significant differences were found only on the 1st postoperative day (P < 0.01) and the 7th postoperative day (P < 0.05). At the one-year review, there were no statistically significant differences in any parameter. In our study, we did not find statistically significant differences between lightweight and heavyweight meshes in postoperative pain, complications, feeling of a foreign body, or recurrence, except for pain on the 1st and 7th postoperative days.

  1. A randomized trial in a massive online open course shows people don't know what a statistically significant relationship looks like, but they can learn.

    PubMed

    Fisher, Aaron; Anderson, G Brooke; Peng, Roger; Leek, Jeff

    2014-01-01

    Scatterplots are the most common way for statisticians, scientists, and the public to visually detect relationships between measured variables. At the same time, and despite widely publicized controversy, P-values remain the most commonly used measure to statistically justify relationships identified between variables. Here we measure the ability to detect statistically significant relationships from scatterplots in a randomized trial of 2,039 students in a statistics massive open online course (MOOC). Each subject was shown a random set of scatterplots and asked to visually determine if the underlying relationships were statistically significant at the P < 0.05 level. Subjects correctly classified only 47.4% (95% CI [45.1%-49.7%]) of statistically significant relationships, and 74.6% (95% CI [72.5%-76.6%]) of non-significant relationships. Adding visual aids such as a best fit line or scatterplot smooth increased the probability a relationship was called significant, regardless of whether the relationship was actually significant. Classification of statistically significant relationships improved on repeat attempts of the survey, although classification of non-significant relationships did not. Our results suggest: (1) that evidence-based data analysis can be used to identify weaknesses in theoretical procedures in the hands of average users, (2) data analysts can be trained to improve detection of statistically significant results with practice, but (3) data analysts have incorrect intuition about what statistically significant relationships look like, particularly for small effects. We have built a web tool for people to compare scatterplots with their corresponding p-values which is available here: http://glimmer.rstudio.com/afisher/EDA/.

  2. A randomized trial in a massive online open course shows people don’t know what a statistically significant relationship looks like, but they can learn

    PubMed Central

    Fisher, Aaron; Anderson, G. Brooke; Peng, Roger

    2014-01-01

    Scatterplots are the most common way for statisticians, scientists, and the public to visually detect relationships between measured variables. At the same time, and despite widely publicized controversy, P-values remain the most commonly used measure to statistically justify relationships identified between variables. Here we measure the ability to detect statistically significant relationships from scatterplots in a randomized trial of 2,039 students in a statistics massive open online course (MOOC). Each subject was shown a random set of scatterplots and asked to visually determine if the underlying relationships were statistically significant at the P < 0.05 level. Subjects correctly classified only 47.4% (95% CI [45.1%–49.7%]) of statistically significant relationships, and 74.6% (95% CI [72.5%–76.6%]) of non-significant relationships. Adding visual aids such as a best fit line or scatterplot smooth increased the probability a relationship was called significant, regardless of whether the relationship was actually significant. Classification of statistically significant relationships improved on repeat attempts of the survey, although classification of non-significant relationships did not. Our results suggest: (1) that evidence-based data analysis can be used to identify weaknesses in theoretical procedures in the hands of average users, (2) data analysts can be trained to improve detection of statistically significant results with practice, but (3) data analysts have incorrect intuition about what statistically significant relationships look like, particularly for small effects. We have built a web tool for people to compare scatterplots with their corresponding p-values which is available here: http://glimmer.rstudio.com/afisher/EDA/. PMID:25337457

  3. Testing statistical self-similarity in the topology of river networks

    USGS Publications Warehouse

    Troutman, Brent M.; Mantilla, Ricardo; Gupta, Vijay K.

    2010-01-01

    Recent work has demonstrated that the topological properties of real river networks deviate significantly from predictions of Shreve's random model. At the same time the property of mean self-similarity postulated by Tokunaga's model is well supported by data. Recently, a new class of network model called random self-similar networks (RSN) that combines self-similarity and randomness has been introduced to replicate important topological features observed in real river networks. We investigate if the hypothesis of statistical self-similarity in the RSN model is supported by data on a set of 30 basins located across the continental United States that encompass a wide range of hydroclimatic variability. We demonstrate that the generators of the RSN model obey a geometric distribution, and self-similarity holds in a statistical sense in 26 of these 30 basins. The parameters describing the distribution of interior and exterior generators are tested to be statistically different and the difference is shown to produce the well-known Hack's law. The inter-basin variability of RSN parameters is found to be statistically significant. We also test generator dependence on two climatic indices, mean annual precipitation and radiative index of dryness. Some indication of climatic influence on the generators is detected, but this influence is not statistically significant with the sample size available. Finally, two key applications of the RSN model to hydrology and geomorphology are briefly discussed.

  4. An Entropy-Based Measure of Dependence between Two Groups of Random Variables. Research Report. ETS RR-07-20

    ERIC Educational Resources Information Center

    Kong, Nan

    2007-01-01

    In multivariate statistics, the linear relationship among random variables has been fully explored in the past. This paper looks into the dependence of one group of random variables on another group of random variables using (conditional) entropy. A new measure, called the K-dependence coefficient or dependence coefficient, is defined using…

  5. Nonlinear wave chaos: statistics of second harmonic fields.

    PubMed

    Zhou, Min; Ott, Edward; Antonsen, Thomas M; Anlage, Steven M

    2017-10-01

    Concepts from the field of wave chaos have been shown to successfully predict the statistical properties of linear electromagnetic fields in electrically large enclosures. The Random Coupling Model (RCM) describes these properties by incorporating both universal features described by Random Matrix Theory and the system-specific features of particular system realizations. In an effort to extend this approach to the nonlinear domain, we add an active nonlinear frequency-doubling circuit to an otherwise linear wave chaotic system, and we measure the statistical properties of the resulting second harmonic fields. We develop an RCM-based model of this system as two linear chaotic cavities coupled by means of a nonlinear transfer function. The harmonic field strengths are predicted to be the product of two statistical quantities and the nonlinearity characteristics. Statistical results from measurement-based calculation, RCM-based simulation, and direct experimental measurements are compared and show good agreement over many decades of power.

  6. Asymptotic Linear Spectral Statistics for Spiked Hermitian Random Matrices

    NASA Astrophysics Data System (ADS)

    Passemier, Damien; McKay, Matthew R.; Chen, Yang

    2015-07-01

    Using the Coulomb Fluid method, this paper derives central limit theorems (CLTs) for linear spectral statistics of three "spiked" Hermitian random matrix ensembles. These include Johnstone's spiked model (i.e., central Wishart with spiked correlation), non-central Wishart with rank-one non-centrality, and a related class of non-central matrices. For a generic linear statistic, we derive simple and explicit CLT expressions as the matrix dimensions grow large. For all three ensembles under consideration, we find that the primary effect of the spike is to introduce a correction term to the asymptotic mean of the linear spectral statistic, which we characterize with simple formulas. The utility of our proposed framework is demonstrated through application to three different linear statistics problems: the classical likelihood ratio test for a population covariance, the capacity analysis of multi-antenna wireless communication systems with a line-of-sight transmission path, and a classical multiple sample significance testing problem.

  7. Coherent Doppler lidar signal covariance including wind shear and wind turbulence

    NASA Technical Reports Server (NTRS)

    Frehlich, R. G.

    1993-01-01

    The performance of coherent Doppler lidar is determined by the statistics of the coherent Doppler signal. The derivation and calculation of the covariance of the Doppler lidar signal is presented for random atmospheric wind fields with wind shear. The random component is described by a Kolmogorov turbulence spectrum. The signal parameters are clarified for a general coherent Doppler lidar system. There are two distinct physical regimes: one where the transmitted pulse determines the signal statistics and the other where the wind field dominates the signal statistics. The Doppler shift of the signal is identified in terms of the wind field and system parameters.

  8. Statistical and sampling issues when using multiple particle tracking

    NASA Astrophysics Data System (ADS)

    Savin, Thierry; Doyle, Patrick S.

    2007-08-01

    Video microscopy can be used to simultaneously track several microparticles embedded in a complex material. The trajectories are used to extract a sample of displacements at random locations in the material. From this sample, averaged quantities characterizing the dynamics of the probes are calculated to evaluate structural and/or mechanical properties of the assessed material. However, the sampling of measured displacements in heterogeneous systems is singular because the volume of observation with video microscopy is finite. By carefully characterizing the sampling design in the experimental output of the multiple particle tracking technique, we derive estimators for the mean and variance of the probes’ dynamics that are independent of the peculiar statistical characteristics. We expose stringent tests of these estimators using simulated and experimental complex systems with a known heterogeneous structure. Up to a certain fundamental limitation, which we characterize through a material degree of sampling by the embedded probe tracking, these estimators can be applied to quantify the heterogeneity of a material, providing an original and intelligible kind of information on complex fluid properties. More generally, we show that the precise assessment of the statistics in the multiple particle tracking output sample of observations is essential in order to provide accurate unbiased measurements.

  9. Search for Correlated Fluctuations in the Beta+ Decay of Na-22

    NASA Astrophysics Data System (ADS)

    Silverman, M. P.; Strange, W.

    2008-10-01

    Claims for a "cosmogenic" force that correlates otherwise independent stochastic events have been made for at least 10 years, based largely on visual inspection of time series of histograms whose shapes were interpreted as suggestive of recurrent patterns with semi-diurnal, diurnal, and monthly periods. Building on our earlier work to test randomness of different nuclear decay processes, we have searched for correlations in the time series of coincident positron-electron annihilations deriving from beta+ decay of Na-22. Disintegrations were counted within a narrow time window over a period of 7 days, leading to a time series of more than 1 million events. Statistical tests were performed on the raw time series, its correlation function, and its Fourier transform to search for cyclic correlations indicative of quantum-mechanics-violating deviations from Poisson statistics. The time series was then partitioned into a sequence of 167 "bags", each of 8192 events. A histogram was made of the events of each bag, where contiguous frequency classes differed by a single count. The chronological sequence of histograms was then tested for correlations within classes. In all cases the results of the tests were in accord with statistical control, giving no evidence of correlated fluctuations.
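
    The kind of statistical-control checks described above can be illustrated with a minimal sketch: bin the event counts, compare their variance to their mean (for Poisson counts the ratio is 1), check the lag-1 autocorrelation, and look for periodogram peaks that would indicate cyclic correlations. The synthetic count series and bin rate below are illustrative assumptions, not the Na-22 data.

        import numpy as np

        rng = np.random.default_rng(6)

        # Synthetic stand-in for binned decay counts (true Poisson, rate 100 per bin).
        counts = rng.poisson(100, size=10000)

        # 1) Index of dispersion: variance/mean = 1 for a Poisson process.
        dispersion = counts.var(ddof=1) / counts.mean()

        # 2) Lag-1 autocorrelation: should be ~0 if successive bins are independent.
        c = counts - counts.mean()
        lag1 = np.dot(c[:-1], c[1:]) / np.dot(c, c)

        # 3) Spectral check: a cyclic correlation would appear as a periodogram peak
        #    well above the flat (white) background.
        spectrum = np.abs(np.fft.rfft(c)) ** 2
        peak_ratio = spectrum[1:].max() / spectrum[1:].mean()

        print(f"variance/mean = {dispersion:.3f} (Poisson expectation: 1)")
        print(f"lag-1 autocorrelation = {lag1:.4f}")
        print(f"largest periodogram peak / mean = {peak_ratio:.1f}")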

  10. Convolutionless Nakajima-Zwanzig equations for stochastic analysis in nonlinear dynamical systems.

    PubMed

    Venturi, D; Karniadakis, G E

    2014-06-08

    Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima-Zwanzig-Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection-reaction problems.

  11. Quantum signature of chaos and thermalization in the kicked Dicke model

    NASA Astrophysics Data System (ADS)

    Ray, S.; Ghosh, A.; Sinha, S.

    2016-09-01

    We study the quantum dynamics of the kicked Dicke model (KDM) in terms of the Floquet operator, and we analyze the connection between chaos and thermalization in this context. The Hamiltonian map is constructed by suitably taking the classical limit of the Heisenberg equation of motion to study the corresponding phase-space dynamics, which shows a crossover from regular to chaotic motion by tuning the kicking strength. The fixed-point analysis and calculation of the Lyapunov exponent (LE) provide us with a complete picture of the onset of chaos in phase-space dynamics. We carry out a spectral analysis of the Floquet operator, which includes a calculation of the quasienergy spacing distribution and structural entropy to show the correspondence to the random matrix theory in the chaotic regime. Finally, we analyze the thermodynamics and statistical properties of the bosonic sector as well as the spin sector, and we discuss how such a periodically kicked system relaxes to a thermalized state in accordance with the laws of statistical mechanics.

  12. How measurement reversal could erroneously suggest the capability to discriminate the preparation basis of a quantum ensemble

    NASA Astrophysics Data System (ADS)

    Goyal, Sandeep K.; Singh, Rajeev; Ghosh, Sibasish

    2016-01-01

    Mixed states of a quantum system, represented by density operators, can be decomposed as a statistical mixture of pure states in a number of ways where each decomposition can be viewed as a different preparation recipe. However the fact that the density matrix contains full information about the ensemble makes it impossible to estimate the preparation basis for the quantum system. Here we present a measurement scheme to (seemingly) improve the performance of unsharp measurements. We argue that in some situations this scheme is capable of providing statistics from a single copy of the quantum system, thus making it possible to perform state tomography from a single copy. One of the by-products of the scheme is a way to distinguish between different preparation methods used to prepare the state of the quantum system. However, our numerical simulations disagree with our intuitive predictions. We show that a counterintuitive property of a biased classical random walk is responsible for the proposed mechanism not working.

  13. Quantum signature of chaos and thermalization in the kicked Dicke model.

    PubMed

    Ray, S; Ghosh, A; Sinha, S

    2016-09-01

    We study the quantum dynamics of the kicked Dicke model (KDM) in terms of the Floquet operator, and we analyze the connection between chaos and thermalization in this context. The Hamiltonian map is constructed by suitably taking the classical limit of the Heisenberg equation of motion to study the corresponding phase-space dynamics, which shows a crossover from regular to chaotic motion by tuning the kicking strength. The fixed-point analysis and calculation of the Lyapunov exponent (LE) provide us with a complete picture of the onset of chaos in phase-space dynamics. We carry out a spectral analysis of the Floquet operator, which includes a calculation of the quasienergy spacing distribution and structural entropy to show the correspondence to the random matrix theory in the chaotic regime. Finally, we analyze the thermodynamics and statistical properties of the bosonic sector as well as the spin sector, and we discuss how such a periodically kicked system relaxes to a thermalized state in accordance with the laws of statistical mechanics.

  14. Computational simulation of the creep-rupture process in filamentary composite materials

    NASA Technical Reports Server (NTRS)

    Slattery, Kerry T.; Hackett, Robert M.

    1991-01-01

    A computational simulation of the internal damage accumulation which causes the creep-rupture phenomenon in filamentary composite materials is developed. The creep-rupture process involves complex interactions between several damage mechanisms. A statistically-based computational simulation using a time-differencing approach is employed to model these progressive interactions. The finite element method is used to calculate the internal stresses. The fibers are modeled as a series of bar elements which are connected transversely by matrix elements. Flaws are distributed randomly throughout the elements in the model. Load is applied, and the properties of the individual elements are updated at the end of each time step as a function of the stress history. The simulation is continued until failure occurs. Several cases, with different initial flaw dispersions, are run to establish a statistical distribution of the time-to-failure. The calculations are performed on a supercomputer. The simulation results compare favorably with the results of creep-rupture experiments conducted at the Lawrence Livermore National Laboratory.
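
    A heavily simplified, fiber-bundle-style version of the time-stepping idea sketched above: fibers carry equal shares of the load, fail stochastically at a stress-dependent rate, and surviving fibers pick up the released load; repeating the run with different random seeds gives a distribution of times to failure. The load-sharing rule and power-law damage kinetics are illustrative assumptions, not the finite-element model described in the abstract.

        import numpy as np

        rng = np.random.default_rng(7)

        def time_to_failure(n_fibers=400, load=0.4, rho=4.0, dt=0.01):
            """Equal-load-sharing fiber bundle with a stress-dependent failure rate."""
            alive = n_fibers
            t = 0.0
            while alive > 0:
                stress = load * n_fibers / alive        # survivors share the total load
                rate = stress ** rho                    # assumed power-law damage kinetics
                p_fail = 1.0 - np.exp(-rate * dt)       # per-fiber failure probability
                alive -= rng.binomial(alive, p_fail)
                t += dt
            return t

        # Repeat with different random flaw histories to build time-to-failure statistics.
        times = np.array([time_to_failure() for _ in range(200)])
        print(f"mean time to rupture: {times.mean():.2f}")
        print(f"coefficient of variation: {times.std() / times.mean():.2f}")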

  15. Properties of branching exponential flights in bounded domains

    NASA Astrophysics Data System (ADS)

    Zoia, A.; Dumonteil, E.; Mazzolo, A.

    2012-11-01

    In a series of recent works, important results have been reported concerning the statistical properties of exponential flights evolving in bounded domains, a widely adopted model for finite-speed transport phenomena (Blanco S. and Fournier R., Europhys. Lett., 61 (2003) 168; Mazzolo A., Europhys. Lett., 68 (2004) 350; Bénichou O. et al., Europhys. Lett., 70 (2005) 42). Motivated by physical and biological systems where random spatial displacements are coupled with Galton-Watson birth-death mechanisms, such as neutron multiplication, diffusion of reproducing bacteria or spread of epidemics, in this letter we extend those results in two directions, via a Feynman-Kac formalism. First, we characterize the occupation statistics of exponential flights in the presence of absorption and branching, and give explicit moment formulas for the total length travelled by the walker and the number of performed collisions in a given domain. Then, we show that the survival and escape probability can be derived as well by resorting to a similar approach.

  16. Convolutionless Nakajima–Zwanzig equations for stochastic analysis in nonlinear dynamical systems

    PubMed Central

    Venturi, D.; Karniadakis, G. E.

    2014-01-01

    Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima–Zwanzig–Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection–reaction problems. PMID:24910519

  17. Analysis of Machine Learning Techniques for Heart Failure Readmissions.

    PubMed

    Mortazavi, Bobak J; Downing, Nicholas S; Bucholz, Emily M; Dharmarajan, Kumar; Manhapra, Ajay; Li, Shu-Xia; Negahban, Sahand N; Krumholz, Harlan M

    2016-11-01

    The current ability to predict readmissions in patients with heart failure is modest at best. It is unclear whether machine learning techniques that address higher dimensional, nonlinear relationships among variables would enhance prediction. We sought to compare the effectiveness of several machine learning algorithms for predicting readmissions. Using data from the Telemonitoring to Improve Heart Failure Outcomes trial, we compared the effectiveness of random forests, boosting, random forests combined hierarchically with support vector machines or logistic regression (LR), and Poisson regression against traditional LR to predict 30- and 180-day all-cause readmissions and readmissions because of heart failure. We randomly selected 50% of patients for a derivation set, and a validation set comprised the remaining patients, validated using 100 bootstrapped iterations. We compared C statistics for discrimination and distributions of observed outcomes in risk deciles for predictive range. In 30-day all-cause readmission prediction, the best performing machine learning model, random forests, provided a 17.8% improvement over LR (mean C statistics, 0.628 and 0.533, respectively). For readmissions because of heart failure, boosting improved the C statistic by 24.9% over LR (mean C statistic 0.678 and 0.543, respectively). For 30-day all-cause readmission, the observed readmission rates in the lowest and highest deciles of predicted risk with random forests (7.8% and 26.2%, respectively) showed a much wider separation than LR (14.2% and 16.4%, respectively). Machine learning methods improved the prediction of readmission after hospitalization for heart failure compared with LR and provided the greatest predictive range in observed readmission rates. © 2016 American Heart Association, Inc.
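
    The model comparison described above (discrimination measured by the C statistic, plus event rates in extreme risk deciles) can be reproduced in outline with scikit-learn. The synthetic data, feature count, and hyperparameters below are assumptions for illustration, not the trial dataset or the authors' exact modeling pipeline.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        # Synthetic stand-in for a readmission dataset with nonlinear structure.
        X, y = make_classification(n_samples=4000, n_features=30, n_informative=10,
                                   weights=[0.75, 0.25], random_state=0)

        # 50% derivation / 50% validation split, echoing the study design.
        X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.5,
                                                      random_state=0, stratify=y)

        models = {
            "logistic regression": LogisticRegression(max_iter=2000),
            "random forest": RandomForestClassifier(n_estimators=300, random_state=0),
        }

        for name, model in models.items():
            model.fit(X_dev, y_dev)
            p = model.predict_proba(X_val)[:, 1]
            # For a binary outcome the C statistic equals the area under the ROC curve.
            print(f"{name}: C statistic = {roc_auc_score(y_val, p):.3f}")
            # Observed event rates in the lowest and highest deciles of predicted risk.
            lo, hi = np.quantile(p, [0.1, 0.9])
            print(f"  event rate, lowest decile:  {y_val[p <= lo].mean():.3f}")
            print(f"  event rate, highest decile: {y_val[p >= hi].mean():.3f}")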

  18. People's Intuitions about Randomness and Probability: An Empirical Study

    ERIC Educational Resources Information Center

    Lecoutre, Marie-Paule; Rovira, Katia; Lecoutre, Bruno; Poitevineau, Jacques

    2006-01-01

    What people mean by randomness should be taken into account when teaching statistical inference. This experiment explored subjective beliefs about randomness and probability through two successive tasks. Subjects were asked to categorize 16 familiar items: 8 real items from everyday life experiences, and 8 stochastic items involving a repeatable…

  19. Analyzing Randomized Controlled Interventions: Three Notes for Applied Linguists

    ERIC Educational Resources Information Center

    Vanhove, Jan

    2015-01-01

    I discuss three common practices that obfuscate or invalidate the statistical analysis of randomized controlled interventions in applied linguistics. These are (a) checking whether randomization produced groups that are balanced on a number of possibly relevant covariates, (b) using repeated measures ANOVA to analyze pretest-posttest designs, and…

  20. Evaluation of Evidence of Statistical Support and Corroboration of Subgroup Claims in Randomized Clinical Trials.

    PubMed

    Wallach, Joshua D; Sullivan, Patrick G; Trepanowski, John F; Sainani, Kristin L; Steyerberg, Ewout W; Ioannidis, John P A

    2017-04-01

    Many published randomized clinical trials (RCTs) make claims for subgroup differences. To evaluate how often subgroup claims reported in the abstracts of RCTs are actually supported by statistical evidence (P < .05 from an interaction test) and corroborated by subsequent RCTs and meta-analyses. This meta-epidemiological survey examines data sets of trials with at least 1 subgroup claim, including Subgroup Analysis of Trials Is Rarely Easy (SATIRE) articles and Discontinuation of Randomized Trials (DISCO) articles. We used Scopus (updated July 2016) to search for English-language articles citing each of the eligible index articles with at least 1 subgroup finding in the abstract. Articles with a subgroup claim in the abstract with or without evidence of statistical heterogeneity (P < .05 from an interaction test) in the text and articles attempting to corroborate the subgroup findings. Study characteristics of trials with at least 1 subgroup claim in the abstract were recorded. Two reviewers extracted the data necessary to calculate subgroup-level effect sizes, standard errors, and the P values for interaction. For individual RCTs and meta-analyses that attempted to corroborate the subgroup findings from the index articles, trial characteristics were extracted. Cochran Q test was used to reevaluate heterogeneity with the data from all available trials. The number of subgroup claims in the abstracts of RCTs, the number of subgroup claims in the abstracts of RCTs with statistical support (subgroup findings), and the number of subgroup findings corroborated by subsequent RCTs and meta-analyses. Sixty-four eligible RCTs made a total of 117 subgroup claims in their abstracts. Of these 117 claims, only 46 (39.3%) in 33 articles had evidence of statistically significant heterogeneity from a test for interaction. In addition, out of these 46 subgroup findings, only 16 (34.8%) ensured balance between randomization groups within the subgroups (eg, through stratified randomization), 13 (28.3%) entailed a prespecified subgroup analysis, and 1 (2.2%) was adjusted for multiple testing. Only 5 (10.9%) of the 46 subgroup findings had at least 1 subsequent pure corroboration attempt by a meta-analysis or an RCT. In all 5 cases, the corroboration attempts found no evidence of a statistically significant subgroup effect. In addition, all effect sizes from meta-analyses were attenuated toward the null. A minority of subgroup claims made in the abstracts of RCTs are supported by their own data (ie, a significant interaction effect). For those that have statistical support (P < .05 from an interaction test), most fail to meet other best practices for subgroup tests, including prespecification, stratified randomization, and adjustment for multiple testing. Attempts to corroborate statistically significant subgroup differences are rare; when done, the initially observed subgroup differences are not reproduced.
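
    The "P < .05 from an interaction test" criterion used above corresponds to fitting a model with a treatment-by-subgroup interaction term and examining that coefficient's p-value, rather than testing the treatment effect separately within each subgroup. The sketch below only shows the computation on simulated trial data; the scenario, effect sizes, and variable names are assumptions.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(8)

        # Simulated trial: binary outcome, randomized treatment, one subgroup variable.
        n = 2000
        df = pd.DataFrame({
            "treat": rng.integers(0, 2, n),
            "subgroup": rng.integers(0, 2, n),
        })
        # Assumed scenario: treatment lowers the outcome rate only in subgroup 1.
        logit_p = -1.0 - 0.5 * df.treat * df.subgroup
        df["outcome"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

        # Logistic model with a treatment-by-subgroup interaction term.
        fit = smf.logit("outcome ~ treat * subgroup", data=df).fit(disp=0)
        # The interaction p-value is the statistical evidence for a subgroup difference.
        print(fit.pvalues[["treat", "treat:subgroup"]])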

  1. A Comparison of the Achievement of Statistics Students Enrolled in Online and Face-to-Face Settings

    ERIC Educational Resources Information Center

    Christmann, Edwin P.

    2017-01-01

    This study compared the achievement of male and female students who were enrolled in an online univariate statistics course to students enrolled in a traditional face-to-face univariate statistics course. The subjects, 47 graduate students enrolled in univariate statistics classes at a public, comprehensive university, were randomly assigned to…

  2. Pseudo-Random Number Generator Based on Coupled Map Lattices

    NASA Astrophysics Data System (ADS)

    Lü, Huaping; Wang, Shihong; Hu, Gang

    A one-way coupled chaotic map lattice is used for generating pseudo-random numbers. It is shown that with suitable cooperative applications of both chaotic and conventional approaches, the output of the spatiotemporally chaotic system can easily meet the practical requirements of random numbers, i.e., excellent random statistical properties, long periodicity of computer realizations, and fast speed of random number generations. This pseudo-random number generator system can be used as ideal synchronous and self-synchronizing stream cipher systems for secure communications.
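
    A minimal sketch in the spirit of the generator described above: a one-way coupled lattice of logistic maps is iterated and bits are extracted from the lattice state. The lattice size, coupling strength, map parameter, and bit-extraction rule are illustrative assumptions, not the authors' construction, and this toy version has none of the cryptographic guarantees discussed in the abstract.

        import numpy as np

        def cml_random_bytes(n_bytes, size=64, eps=0.95, r=3.99, seed=0.123456789):
            """Pseudo-random bytes from a one-way coupled logistic-map lattice (toy)."""
            f = lambda x: r * x * (1.0 - x)                     # chaotic logistic map
            x = np.mod(seed * np.arange(1, size + 1), 1.0)      # lattice initial condition
            out = bytearray()
            for _ in range(1000):                               # discard the transient
                x = (1 - eps) * f(x) + eps * f(np.roll(x, 1))
            while len(out) < n_bytes:
                x = (1 - eps) * f(x) + eps * f(np.roll(x, 1))   # one-way coupling step
                out.append(int(x[0] * 2 ** 32) & 0xFF)          # extract low-order bits
            return bytes(out[:n_bytes])

        sample = cml_random_bytes(100000)
        counts = np.bincount(np.frombuffer(sample, dtype=np.uint8), minlength=256)
        # A crude uniformity check: byte frequencies should be close to n/256.
        print("min/max byte count:", counts.min(), counts.max(),
              "expected ~", len(sample) // 256)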

  3. Quantum Random Number Generation Using a Quanta Image Sensor

    PubMed Central

    Amri, Emna; Felk, Yacine; Stucki, Damien; Ma, Jiaju; Fossum, Eric R.

    2016-01-01

    A new quantum random number generation method is proposed. The method is based on the randomness of the photon emission process and the single photon counting capability of the Quanta Image Sensor (QIS). It has the potential to generate high-quality random numbers with remarkable data output rate. In this paper, the principle of photon statistics and theory of entropy are discussed. Sample data were collected with QIS jot device, and its randomness quality was analyzed. The randomness assessment method and results are discussed. PMID:27367698

  4. Data-driven probability concentration and sampling on manifold

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soize, C., E-mail: christian.soize@univ-paris-est.fr; Ghanem, R., E-mail: ghanem@usc.edu

    2016-09-15

    A new methodology is proposed for generating realizations of a random vector with values in a finite-dimensional Euclidean space that are statistically consistent with a dataset of observations of this vector. The probability distribution of this random vector, while a priori not known, is presumed to be concentrated on an unknown subset of the Euclidean space. A random matrix is introduced whose columns are independent copies of the random vector and for which the number of columns is the number of data points in the dataset. The approach is based on the use of (i) the multidimensional kernel-density estimation method for estimating the probability distribution of the random matrix, (ii) an MCMC method for generating realizations for the random matrix, (iii) the diffusion-maps approach for discovering and characterizing the geometry and the structure of the dataset, and (iv) a reduced-order representation of the random matrix, which is constructed using the diffusion-maps vectors associated with the first eigenvalues of the transition matrix relative to the given dataset. The convergence aspects of the proposed methodology are analyzed and a numerical validation is explored through three applications of increasing complexity. The proposed method is found to be robust to noise levels and data complexity as well as to the intrinsic dimension of data and the size of experimental datasets. Both the methodology and the underlying mathematical framework presented in this paper contribute new capabilities and perspectives at the interface of uncertainty quantification, statistical data analysis, stochastic modeling and associated statistical inverse problems.
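
    The diffusion-maps step (item iii above, feeding the reduced-order representation in item iv) can be sketched in a few lines: build a Gaussian affinity on the data, row-normalize it into a Markov transition matrix, and keep the leading non-trivial eigenvectors as reduced coordinates. The bandwidth heuristic and the toy data are illustrative; the full methodology additionally uses kernel-density estimation and an MCMC sampler, which are omitted here.

      # Sketch of the diffusion-maps step on a small dataset.
      import numpy as np

      def diffusion_map_coords(X, epsilon=None, n_coords=3):
          d2 = np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1)   # pairwise squared distances
          if epsilon is None:
              epsilon = np.median(d2)                                 # simple bandwidth heuristic
          K = np.exp(-d2 / epsilon)                                   # Gaussian kernel
          P = K / K.sum(axis=1, keepdims=True)                        # Markov transition matrix
          evals, evecs = np.linalg.eig(P)
          order = np.argsort(-evals.real)
          idx = order[1:n_coords + 1]                                 # skip the trivial constant eigenvector
          return evals.real[idx] * evecs.real[:, idx]                 # diffusion coordinates

      X = np.random.default_rng(0).normal(size=(200, 5))
      print(diffusion_map_coords(X).shape)   # (200, 3)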

  5. Structure and Randomness of Continuous-Time, Discrete-Event Processes

    NASA Astrophysics Data System (ADS)

    Marzen, Sarah E.; Crutchfield, James P.

    2017-10-01

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ɛ -machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.
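
    For orientation, the discrete-time analogue of the entropy rate is simple to compute: for a finite Markov chain it is h = -sum_i pi_i sum_j T_ij log2 T_ij, with pi the stationary distribution. The semi-Markov case treated in the paper additionally involves dwell-time distributions and unifilar hidden states; the sketch below covers only the simpler Markov-chain case, with an illustrative transition matrix.

      # Entropy rate of a finite Markov chain (a simplified discrete-time analogue of
      # the semi-Markov processes discussed above).
      import numpy as np

      T = np.array([[0.9, 0.1],
                    [0.5, 0.5]])                       # illustrative transition matrix
      evals, evecs = np.linalg.eig(T.T)
      pi = np.real(evecs[:, np.argmax(np.real(evals))])
      pi /= pi.sum()                                    # stationary distribution
      logT = np.log2(T, where=T > 0, out=np.zeros_like(T))
      h = -np.sum(pi[:, None] * T * logT)
      print("entropy rate:", h, "bits per step")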

  6. Auditory Magnetoencephalographic Frequency-Tagged Responses Mirror the Ongoing Segmentation Processes Underlying Statistical Learning.

    PubMed

    Farthouat, Juliane; Franco, Ana; Mary, Alison; Delpouve, Julie; Wens, Vincent; Op de Beeck, Marc; De Tiège, Xavier; Peigneux, Philippe

    2017-03-01

    Humans are highly sensitive to statistical regularities in their environment. This phenomenon, usually referred to as statistical learning, is most often assessed using post-learning behavioural measures that are limited by a lack of sensitivity and do not monitor the temporal dynamics of learning. In the present study, we used magnetoencephalographic frequency-tagged responses to investigate the neural sources and temporal development of the ongoing brain activity that supports the detection of regularities embedded in auditory streams. Participants passively listened to statistical streams in which tones were grouped as triplets, and to random streams in which tones were randomly presented. Results show that during exposure to statistical (vs. random) streams, tritone frequency-related responses reflecting the learning of regularities embedded in the stream increased in the left supplementary motor area and left posterior superior temporal sulcus (pSTS), whereas tone frequency-related responses decreased in the right angular gyrus and right pSTS. Tritone frequency-related responses rapidly developed to reach significance after 3 min of exposure. These results suggest that the incidental extraction of novel regularities is subtended by a gradual shift from rhythmic activity reflecting individual tone succession toward rhythmic activity synchronised with triplet presentation, and that these rhythmic processes are subtended by distinct neural sources.

  7. Compaction Behavior of Granular Materials

    NASA Astrophysics Data System (ADS)

    Endicott, Mark R.; Kenkre, V. M.; Glass, S. Jill; Hurd, Alan J.

    1996-03-01

    We report the results of our recent study of the compaction of granular materials. A theoretical model is developed for the description of the compaction of granular materials exemplified by granulated ceramic powders. Its predictions are compared to observations of uniaxial compaction tests of ceramic granules of PMN-PT, spray-dried alumina and rutile. The theoretical model employs a volume-based statistical mechanics treatment and an activation analogy. Results of a computer simulation of random packing of discs in two dimensions are also reported. The effect of the type of particle size distribution, and of other parameters of that distribution, on the calculated quantities is discussed. We examine the implications of the results of the simulation for the theoretical model.
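
    The two-dimensional disc-packing simulation mentioned above is commonly implemented as random sequential adsorption: candidate centres are drawn uniformly and accepted only if they do not overlap previously placed discs. A minimal sketch follows; the disc radius and number of attempts are illustrative, and the monodisperse case stands in for the particle size distributions studied in the paper.

      # Sketch: random sequential adsorption of equal discs in a unit box, reporting
      # the packing (area) fraction reached after a fixed number of attempts.
      import numpy as np

      def rsa_discs(radius=0.03, n_attempts=20000, seed=1):
          rng = np.random.default_rng(seed)
          centres = []
          for _ in range(n_attempts):
              p = rng.uniform(radius, 1 - radius, size=2)       # keep discs inside the box
              if all(np.hypot(*(p - q)) >= 2 * radius for q in centres):
                  centres.append(p)
          area_fraction = len(centres) * np.pi * radius**2
          return np.array(centres), area_fraction

      centres, phi = rsa_discs()
      print(f"{len(centres)} discs placed, packing fraction ~ {phi:.3f}")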

  8. Statistical mechanics of the mixed majority minority game with random external information

    NASA Astrophysics Data System (ADS)

    DeMartino, A.; Giardina, I.; Mosetti, G.

    2003-08-01

    We study the asymptotic macroscopic properties of the mixed majority-minority game, modelling a population in which two types of heterogeneous adaptive agents, namely 'fundamentalists' driven by differentiation and 'trend-followers' driven by imitation, interact. The presence of a fraction f of trend-followers is shown to induce (a) a significant loss of informational efficiency with respect to a pure minority game (in particular, an efficient, unpredictable phase exists only for f < 1/2), and (b) a catastrophic increase of global fluctuations for f > 1/2. We solve the model by means of an approximate static (replica) theory and by a direct dynamical (generating functional) technique. The two approaches coincide and match numerical results convincingly.

  9. Learning probability distributions from smooth observables and the maximum entropy principle: some remarks

    NASA Astrophysics Data System (ADS)

    Obuchi, Tomoyuki; Monasson, Rémi

    2015-09-01

    The maximum entropy principle (MEP) is a very useful working hypothesis in a wide variety of inference problems, ranging from biological to engineering tasks. To better understand the reasons for the success of the MEP, we propose a statistical-mechanical formulation to treat the space of probability distributions constrained by the measures of (experimental) observables. In this paper we first review the results of a detailed analysis of the simplest case of randomly chosen observables. In addition, we investigate by numerical and analytical means the case of smooth observables, which is of practical relevance. Our preliminary results are presented and discussed with respect to the efficiency of the MEP.
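
    In the discrete setting, the MEP distribution constrained by measured observables <f_k> is the exponential family p(x) proportional to exp(sum_k lambda_k f_k(x)), with the multipliers found by minimizing the log-partition function minus lambda.<f>. A minimal sketch with illustrative observables and target moments follows.

      # Sketch: maximum-entropy distribution over a finite alphabet subject to
      # measured expectation values, via the dual (log-partition) problem.
      import numpy as np
      from scipy.optimize import minimize

      states = np.arange(6)                        # e.g. die faces 0..5
      features = np.stack([states, states**2])     # observables f1(x)=x, f2(x)=x^2
      targets = np.array([3.1, 11.5])              # measured <f1>, <f2> (illustrative)

      def dual(lam):
          # log Z(lam) - lam . targets ; its minimizer gives p(x) ~ exp(lam . f(x))
          logits = lam @ features
          logZ = np.log(np.sum(np.exp(logits)))
          return logZ - lam @ targets

      res = minimize(dual, x0=np.zeros(2), method="BFGS")
      p = np.exp(res.x @ features)
      p /= p.sum()
      print("MaxEnt probabilities:", np.round(p, 4))
      print("Reproduced moments:", features @ p)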

  10. The uniform quantized electron gas revisited

    NASA Astrophysics Data System (ADS)

    Lomba, Enrique; Høye, Johan S.

    2017-11-01

    In this article we continue and extend our recent work on the correlation energy of the quantized electron gas of uniform density at temperature T=0. As before, we utilize the methods, properties, and results obtained by means of classical statistical mechanics. These were extended to quantized systems via the Feynman path integral formalism. The latter translates the quantum problem into a classical polymer problem in four dimensions. Again, the well known RPA (random phase approximation) is recovered as a basic result which we then modify and improve upon. Here we analyze the condition of thermodynamic self-consistency. Our numerical calculations exhibit a remarkable agreement with well known results of a standard parameterization of Monte Carlo correlation energies.

  11. Zipf's law holds for phrases, not words.

    PubMed

    Williams, Jake Ryland; Lessard, Paul R; Desu, Suma; Clark, Eric M; Bagrow, James P; Danforth, Christopher M; Dodds, Peter Sheridan

    2015-08-11

    With Zipf's law being originally and most famously observed for word frequency, it is surprisingly limited in its applicability to human language, holding over no more than three to four orders of magnitude before hitting a clear break in scaling. Here, building on the simple observation that phrases of one or more words comprise the most coherent units of meaning in language, we show empirically that Zipf's law for phrases extends over as many as nine orders of rank magnitude. In doing so, we develop a principled and scalable statistical mechanical method of random text partitioning, which opens up a rich frontier of rigorous text analysis via a rank ordering of mixed length phrases.
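
    A simplified version of random text partitioning is easy to state: cut the word stream into phrases by placing a break between consecutive words with some fixed probability, then rank the resulting phrases by frequency. The sketch below uses a toy repeated sentence and a cut probability of 0.5; both are illustrative, and the authors' principled partitioning differs in detail.

      # Sketch: random partitioning of a text into phrases, then a Zipf-style
      # rank-frequency listing of the phrases.
      from collections import Counter
      import random

      def random_phrases(words, q=0.5, seed=0):
          random.seed(seed)
          phrases, current = [], [words[0]]
          for w in words[1:]:
              if random.random() < q:              # cut here: close the current phrase
                  phrases.append(" ".join(current))
                  current = [w]
              else:
                  current.append(w)
          phrases.append(" ".join(current))
          return phrases

      text = "the quick brown fox jumps over the lazy dog " * 200
      phrases = random_phrases(text.split())
      for rank, (phrase, freq) in enumerate(Counter(phrases).most_common(5), start=1):
          print(rank, freq, repr(phrase))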

  12. Simulation and analysis of scalable non-Gaussian statistically anisotropic random functions

    NASA Astrophysics Data System (ADS)

    Riva, Monica; Panzeri, Marco; Guadagnini, Alberto; Neuman, Shlomo P.

    2015-12-01

    Many earth and environmental (as well as other) variables, Y, and their spatial or temporal increments, ΔY, exhibit non-Gaussian statistical scaling. Previously we were able to capture some key aspects of such scaling by treating Y or ΔY as standard sub-Gaussian random functions. We were however unable to reconcile two seemingly contradictory observations, namely that whereas sample frequency distributions of Y (or its logarithm) exhibit relatively mild non-Gaussian peaks and tails, those of ΔY display peaks that grow sharper and tails that become heavier with decreasing separation distance or lag. Recently we overcame this difficulty by developing a new generalized sub-Gaussian model which captures both behaviors in a unified and consistent manner, exploring it on synthetically generated random functions in one dimension (Riva et al., 2015). Here we extend our generalized sub-Gaussian model to multiple dimensions, present an algorithm to generate corresponding random realizations of statistically isotropic or anisotropic sub-Gaussian functions and illustrate it in two dimensions. We demonstrate the accuracy of our algorithm by comparing ensemble statistics of Y and ΔY (such as, mean, variance, variogram and probability density function) with those of Monte Carlo generated realizations. We end by exploring the feasibility of estimating all relevant parameters of our model by analyzing jointly spatial moments of Y and ΔY obtained from a single realization of Y.
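
    One simple way to produce heavier-than-Gaussian behaviour of the kind described is a variance mixture: a correlated Gaussian field multiplied by a single positive random scale per realization, which yields Student-like marginals across the ensemble. The sketch below uses an inverse-chi-square scale and an exponential covariance; it is a simplified stand-in, not the generalized sub-Gaussian model of the paper.

      # Sketch: ensemble of 1-D heavy-tailed random functions built by rescaling
      # correlated Gaussian fields with one random scale per realization.
      import numpy as np

      def scaled_gaussian_fields(n_real=200, n=256, corr_len=20.0, df=6, seed=0):
          rng = np.random.default_rng(seed)
          x = np.arange(n)
          cov = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)   # exponential covariance
          L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))
          g = L @ rng.standard_normal((n, n_real))                     # correlated Gaussian columns
          w = df / rng.chisquare(df, size=n_real)                      # one random scale per realization
          return np.sqrt(w) * g                                        # Student-like marginals across the ensemble

      Y = scaled_gaussian_fields()
      dY = np.diff(Y, axis=0).ravel()
      kurt = ((dY - dY.mean())**4).mean() / dY.var()**2
      print("kurtosis of pooled increments:", round(kurt, 2), "(3 for a Gaussian field)")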

  13. Dragon-Kings, Black-Swans and Prediction (Invited)

    NASA Astrophysics Data System (ADS)

    Sornette, D.

    2010-12-01

    Extreme fluctuations or events are often associated with power law statistics. Indeed, it is a popular belief that "wild randomness" is deeply associated with distributions with power law tails characterized by small exponents. In other words, power law tails are often seen as the epitome of extreme events (the "Black Swan" story). Here, we document in very different systems that there is life beyond power law tails: power laws can be superseded by "dragon-kings", monster events that occur beyond (or changing) the power law tail. Dragon-kings reveal hidden mechanisms that are only transiently active and that amplify the normal fluctuations (often described by the power laws of the normal regime). The goal of this lecture is to catalyze the interest of the community of geophysicists across all fields of geosciences so that the "invisible gorilla" fallacy may be avoided. Our own research illustrates that new statistics or representations of data are often necessary to identify dragon-kings, with strategies guided by the underlying mechanisms. Paradoxically, the monsters may be ignored or hidden by the use of inappropriate analysis or statistical tools that amount to cutting a mammoth into small pieces, leading to the incorrect belief that only mice exist. In order to stimulate further research, we will document and discuss the dragon-king phenomenon in the statistics of financial losses, economic geography, hydrodynamic turbulence, mechanical ruptures, avalanches in complex heterogeneous media, earthquakes, and epileptic seizures. The special status of dragon-kings opens a new research program on their predictability, based on the fact that they belong to a class of their own and express specific mechanisms amplifying the normal dynamics via positive feedbacks. We will present evidence of these claims for the prediction of material rupture, financial crashes and epileptic seizures. As a bonus, a few remarks will be offered at the end on how the dragon-king phenomenon allows us to understand the present world financial crisis as underpinned by two decades of successive financial and economic bubbles, inflating the mother of all bubbles, with new monster dragon-kings on the horizon. The consequences in terms of a new "normal" are eye-opening. Ref: D. Sornette, Dragon-Kings, Black Swans and the Prediction of Crises, International Journal of Terraspace Science and Engineering 1(3), 1-17 (2009) (http://arXiv.org/abs/0907.4290) and (http://ssrn.com/abstract=1470006)

  14. Variable lung protective mechanical ventilation decreases incidence of postoperative delirium and cognitive dysfunction during open abdominal surgery.

    PubMed

    Wang, Ruichun; Chen, Junping; Wu, Guorong

    2015-01-01

    Postoperative cognitive dysfunction (POCD) is a subtle impairment of cognitive abilities and can manifest in different neuropsychological features in the early postoperative period. It has been shown that the use of mechanical ventilation (MV) increases the risk of delirium and POCD. However, the impact of variable and conventional lung protective mechanical ventilation on the incidence of POCD still remains unknown, which was the aim of this study. A total of 162 patients scheduled to undergo elective gastrointestinal tumor resection via laparotomy in Ningbo No. 2 hospital, with expected duration >2 h, from June 2013 to June 2015 were enrolled in this study. Included patients were assigned to two groups according to the scheme of lung protective MV, a variable ventilation group (VV group, n=79) and a conventional ventilation group (CV group, n=83), with allocation performed by random block randomization. The plasma levels of inflammatory cytokines, characteristics of the surgical procedure, and the incidence of delirium and POCD were collected and compared. Postoperative delirium was detected in 36 of 162 patients (22.2%); 12 of these patients (16.5%) belonged to the VV group while 24 patients (28.9%) were in the CV group (P=0.036). POCD on the seventh postoperative day in the CV group (26/83, 31.3%) was increased in comparison with the VV group (14/79, 17.7%), a statistically significant difference (P=0.045). The levels of inflammatory cytokines were all significantly higher in the CV group than in the VV group on the 1st postoperative day (P<0.05). On the 7th postoperative day, the levels of IL-6 and TNF-α in the CV group remained much higher compared with the VV group (P<0.05). Variable vs conventional lung protective MV decreased the incidence of postoperative delirium and POCD by reducing the systemic proinflammatory response.

  15. Introduction

    NASA Astrophysics Data System (ADS)

    Cohen, E. G. D.

    The lecture notes are organized around the key word dissipation, while focusing on a presentation of modern theoretical developments in the study of irreversible phenomena. A broad cross-disciplinary perspective towards non-equilibrium statistical mechanics is backed by the general theory of nonlinear and complex dynamical systems. The classical-quantum intertwine and the semiclassical dissipative borderline issue (decoherence, "classical out of quantum") are included here. Special emphasis is put on links between the theory of classical and quantum dynamical systems (temporal disorder, dynamical chaos and transport processes) and central problems of non-equilibrium statistical mechanics, e.g. the connection between dynamics and thermodynamics, relaxation towards equilibrium states, and mechanisms capable of driving and then maintaining the physical system far from equilibrium, in a non-equilibrium steady (stationary) state. The notion of an equilibrium state - towards which a system naturally evolves if left undisturbed - is a fundamental concept of equilibrium statistical mechanics. It is taken as a primitive point of reference that allows one to give an unambiguous status to near-equilibrium and far-from-equilibrium systems, together with the dynamical notion of relaxation (decay) towards a prescribed asymptotic invariant measure or probability distribution (properties of ergodicity and mixing are implicit). A related issue is to keep under control the process of driving a physical system away from an initial state of equilibrium and either keeping it in another (non-equilibrium) steady state or allowing it to restore the initial data (return, relax). To this end various models of the environment (heat bath, reservoir, thermostat, measuring instrument, etc.) and the environment-system coupling are analyzed. The central theme of the book is the dynamics of dissipation and the various mechanisms responsible for the irreversible behaviour (transport properties) of open systems on classical and quantum levels of description. A distinguishing feature of these lecture notes is that microscopic foundations of irreversibility are investigated basically in terms of "small" systems, where the "system" and/or "environment" may have a finite (and small) number of degrees of freedom and may be bounded. This is to be contrasted with the casual understanding of statistical mechanics, which is regarded as referring to systems with a very large number of degrees of freedom. In fact, it is commonly accepted that the accumulation of effects due to many (on the order of the Avogadro number) particles is required for statistical mechanics reasoning. Yet those large numbers are not by themselves sufficient for transport properties. A helpful hint towards this conceptual turnover comes from the observation that for chaotic dynamical systems the random time evolution proves to be compatible with the underlying purely deterministic laws of motion. Chaotic features of the classical dynamics already appear in systems with two degrees of freedom, and such systems need to be described in statistical terms if we wish to quantify the dynamics of relaxation towards an invariant ergodic measure. The relaxation towards equilibrium finds a statistical description through an analysis of statistical ensembles. This entails an extension of the range of validity of statistical mechanics to small classical systems.
On the other hand, the dynamics of fluctuations in macroscopic dissipative systems (due to their molecular composition and thermal mobility) may justify a characterization of such systems as chaotic. That motivates attempts at understanding the role of microscopic chaos and various "chaotic hypotheses" - the dynamical-systems approach is pushed down to the level of atoms, molecules and complex matter constituents, whose natural substitutes are low-dimensional model subsystems (encompassing as well the mesoscopic "quantum chaos") - in non-equilibrium transport phenomena. Along the way a number of questions are addressed, e.g.: is there, or what is the nature of, a connection between chaos (the modern theory of dynamical systems) and irreversible thermodynamics; can quantum chaos really explain some peculiar features of quantum transport? The answer in both cases is positive, modulo a careful discrimination between viewing dynamical chaos as a necessary or a sufficient basis for irreversibility. In those dynamical contexts, another key term, dynamical semigroups, refers to major technical tools appropriate for the "dissipative mathematics", modelling irreversible behaviour on the classical and quantum levels of description. Dynamical systems theory and "quantum chaos" research involve both a high level of mathematical sophistication and heavy computer "experimentation". One of the present volume's specific flavors is tutorial access to quite advanced mathematical tools. They gradually penetrate the classical and quantum dynamical semigroup description, culminating in the noncommutative Brillouin zone construction as a prerequisite for understanding transport in aperiodic solids. The lecture notes are structured into chapters to give a better insight into the major conceptual streamlines. Chapter I is devoted to a discussion of non-equilibrium steady states and, through the so-called chaotic hypothesis combined with suitable fluctuation theorems, elucidates the role of the Sinai-Ruelle-Bowen distribution in both equilibrium and non-equilibrium statistical physics frameworks (E. G. D. Cohen). Links between dynamics and statistics (Boltzmann versus Tsallis) are also discussed. Fluctuation relations and a survey of deterministic thermostats are given in the context of non-equilibrium steady states of fluids (L. Rondoni). The response of systems driven far from equilibrium is analyzed on the basis of a central assertion about the existence of a statistical representation in terms of an ensemble of dynamical realizations of the driving process. A non-equilibrium work relation is deduced for irreversible processes (C. Jarzynski). The survey of non-equilibrium steady states in the statistical mechanics of classical and quantum systems employs heat bath models and random matrix theory input. The quantum heat bath analysis and derivation of fluctuation-dissipation theorems is performed by means of the influence functional technique adopted to solve quantum master equations (D. Kusnezov). Chapter II deals with the issue of relaxation and its dynamical theory in both classical and quantum contexts. The Pollicott-Ruelle resonance background for the exponential decay scenario is discussed for irreversible processes of diffusion in the Lorentz gas and multibaker models (P. Gaspard).
The Pollicott-Ruelle theory reappears as a major inspiration in the survey of the behaviour of ensembles of chaotic systems, with a focus on model systems for which no rigorous results concerning the exponential decay of correlations in time are available (S. Fishman). The observation that non-equilibrium transport processes in simple classical chaotic systems can be described in terms of fractal structures developing in the system phase space links their formation and properties with the entropy production in the course of diffusion processes displaying a low-dimensional deterministic (chaotic) origin (J. R. Dorfman). Chapter III offers an introduction to the theory of dynamical semigroups. Asymptotic properties of Markov operators and Markov semigroups acting in the set of probability densities (the statistical ensemble notion is implicit) are analyzed. Ergodicity, mixing, strong (complete) mixing and sweeping are discussed in the familiar setting of "noise, chaos and fractals" (R. Rudnicki). The next step comprises a passage to quantum dynamical semigroups and completely positive dynamical maps, with the ultimate goal of introducing a consistent framework for the analysis of irreversible phenomena in open quantum systems, where dissipation and decoherence are crucial concepts (R. Alicki). Friction and damping in the classical and quantum mechanics of finite dissipative systems are analyzed by means of Markovian quantum semigroups with special emphasis on the issue of complete positivity (M. Fannes). Specific two-level model systems of elementary particle physics (kaons) and rudiments of neutron interferometry are employed to elucidate the distinction between positivity and complete positivity (F. Benatti). Quantization of the dynamics of stochastic models related to equilibrium Gibbs states results in dynamical maps which form quantum stochastic dynamical semigroups (W. A. Majewski). Chapter IV addresses diverse but deeply interrelated features of driven chaotic (mesoscopic) classical and quantum systems, their dissipative properties, and notions of quantum irreversibility, entanglement, dephasing and decoherence. A survey of non-perturbative quantum effects for open quantum systems is concluded by outlining the discrepancies between random matrix theory and non-perturbative semiclassical predictions (D. Cohen). As a useful supplement to the subject of bounded open systems, methods of quantum state control in a cavity (coherent versus incoherent dynamics and dissipation) are described for low-dimensional quantum systems (A. Buchleitner). The dynamics of open quantum systems can alternatively be described by means of a non-Markovian stochastic Schrödinger equation, jointly for an open system and its environment, which moves us beyond the Lindblad evolution scenario of Markovian dynamical semigroups. Quantum Brownian motion is considered (W. Strunz). Chapter V marks a conceptual transition from "small" to "large" systems, with emphasis on the irreversible thermodynamics of quantum transport. Typical features of the statistical mechanics of infinitely extended systems and the dynamical (small) systems approach are described by means of representative examples of relaxation towards asymptotic steady states: a quantum one-dimensional lattice conductor and an open multibaker map (S. Tasaki). Dissipative transport in aperiodic solids is reviewed by invoking methods of noncommutative geometry. The anomalous Drude formula is derived. The occurrence of quantum chaos is discussed together with its main consequences (J.
Bellissard). The chapter is concluded by a survey of scaling limits of the N-body Schrödinger quantum dynamics, where classical evolution equations of irreversible statistical mechanics (linear Boltzmann, Hartree, Vlasov) emerge "out of quantum". In particular, a scaling limit of one-body quantum dynamics with impurities (static random potential) and that of quantum dynamics with weakly coupled phonons are shown to yield the linear Boltzmann equation (L. Erdös). Various interrelations between chapters and individual lectures, plus detailed, fine-tuned information about the subject-matter coverage of the volume, can be recovered by examining an extensive index.

  16. Exploring psychological mechanisms of clinical response to an internet-delivered psychological pain management program.

    PubMed

    Gandy, M; Karin, E; Jones, M P; McDonald, S; Sharpe, L; Titov, N; Dear, B F

    2018-05-13

    The evidence for Internet-delivered pain management programs for chronic pain is growing, but there is little empirical understanding of how they effect change. Understanding mechanisms of clinical response to these programs could inform their effective development and delivery. A large sample (n = 396) from a previous randomized controlled trial of a validated internet-delivered psychological pain management program, the Pain Course, was used to examine the influence of three potential psychological mechanisms (pain acceptance, pain self-efficacy, fear of movement/re-injury) on treatment-related change in disability, depression, anxiety and average pain. Analyses involved generalized estimating equation models for clinical outcomes that adjusted for co-occurring change in psychological variables. This was paired with cross-lagged analysis to assess for evidence of causality. Analyses involved two time points, pre-treatment and post-treatment. Changes in pain-acceptance were strongly associated with changes in three (depression, anxiety and average pain) of the four clinical outcomes. Changes in self-efficacy were also strongly associated with two (anxiety and average pain) clinical outcomes. These findings suggest that participants were unlikely to improve in these clinical outcomes without also experiencing increases in their pain self-efficacy and pain acceptance. However, there was no clear evidence from cross-lagged analyses to currently support these psychological variables as direct mechanisms of clinical improvements. There was only statistical evidence to suggest higher levels of self-efficacy moderated improvements in depression. The findings suggest that, while clinical improvements are closely associated with improvements in pain acceptance and self-efficacy, these psychological variables may not drive the treatment effects observed. This study employed robust statistical techniques to assess the psychological mechanisms of an established internet-delivered pain management program. While clinical improvements (e.g. depression, anxiety, pain) were closely associated with improvements in psychological variables (e.g. pain self-efficacy and pain acceptance), these variables do not appear to be treatment mechanisms. © 2018 European Pain Federation - EFIC®.

  17. Response statistics of rotating shaft with non-linear elastic restoring forces by path integration

    NASA Astrophysics Data System (ADS)

    Gaidai, Oleg; Naess, Arvid; Dimentberg, Michael

    2017-07-01

    Extreme statistics of random vibrations are studied for a Jeffcott rotor under uniaxial white noise excitation. The restoring force is modelled as elastic and non-linear; a comparison is made with a linearized restoring force to see the effect of the force non-linearity on the response statistics. While analytical solutions and stability conditions are available for the linear model, this is generally not the case for the non-linear system, except in some special cases. The statistics of the non-linear case are studied by applying the path integration (PI) method, which is based on the Markov property of the coupled dynamic system. The Jeffcott rotor response statistics can be obtained by solving the Fokker-Planck (FP) equation of the 4D dynamic system. An efficient implementation of the PI algorithm is applied: the fast Fourier transform (FFT) is used to simulate the dynamic system's additive noise. This allows the computational time to be reduced significantly compared to classical PI. The excitation is modelled as Gaussian white noise; however, white noise with any distribution can be implemented with the same PI technique. Multidirectional Markov noise can also be modelled with PI in the same way as unidirectional noise. PI is accelerated by using a Monte Carlo (MC) estimate of the joint probability density function (PDF) as the initial input. The symmetry of the dynamic system was utilized to afford higher mesh resolution. Both internal (rotating) and external damping are included in the mechanical model of the rotor. The main advantage of using PI rather than MC is that PI offers high accuracy in the probability distribution tail, which is of critical importance for, e.g., extreme value statistics, system reliability, and first passage probability.
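
    Path integration itself is beyond a short snippet, but the baseline it is compared against, Monte Carlo estimation of the stationary response of a nonlinear oscillator under white noise, is easy to sketch with an Euler-Maruyama scheme. The single-degree-of-freedom cubic (Duffing-type) oscillator and the parameter values below are illustrative, not the 4D Jeffcott rotor model of the paper; the tail probability printed at the end is exactly the quantity for which PI is more accurate than plain MC.

      # Stand-in sketch (not path integration): Euler-Maruyama Monte Carlo for a
      # single-DOF oscillator with a cubic restoring force under white noise.
      import numpy as np

      rng = np.random.default_rng(3)
      n_paths, n_steps, dt = 1000, 50_000, 1e-3
      zeta, alpha, intensity = 0.1, 0.5, 1.0       # damping ratio, nonlinearity, noise intensity

      x = np.zeros(n_paths)
      v = np.zeros(n_paths)
      for _ in range(n_steps):
          dw = rng.standard_normal(n_paths) * np.sqrt(dt)
          a = -2 * zeta * v - x - alpha * x**3     # linear + cubic restoring force
          x = x + v * dt
          v = v + a * dt + np.sqrt(intensity) * dw

      print("stationary std of displacement:", x.std())
      print("P(|x| > 3*std):", np.mean(np.abs(x) > 3 * x.std()))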

  18. 3D Representative Volume Element Reconstruction of Fiber Composites via Orientation Tensor and Substructure Features

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yi; Chen, Wei; Xu, Hongyi

    To provide a seamless integration of manufacturing processing simulation and fiber microstructure modeling, two new stochastic 3D microstructure reconstruction methods are proposed for two types of random fiber composites: random short fiber composites, and Sheet Molding Compounds (SMC) chopped fiber composites. A Random Sequential Adsorption (RSA) algorithm is first developed to embed statistical orientation information into 3D RVE reconstruction of random short fiber composites. For the SMC composites, an optimized Voronoi diagram based approach is developed for capturing the substructure features of SMC chopped fiber composites. The proposed methods are distinguished from other reconstruction works by providing a way of integrating statistical information (fiber orientation tensor) obtained from material processing simulation, as well as capturing the multiscale substructures of the SMC composites.

  19. Quantifying the impact of between-study heterogeneity in multivariate meta-analyses

    PubMed Central

    Jackson, Dan; White, Ian R; Riley, Richard D

    2012-01-01

    Measures that quantify the impact of heterogeneity in univariate meta-analysis, including the very popular I2 statistic, are now well established. Multivariate meta-analysis, where studies provide multiple outcomes that are pooled in a single analysis, is also becoming more commonly used. The question of how to quantify heterogeneity in the multivariate setting is therefore raised. It is the univariate R2 statistic, the ratio of the variance of the estimated treatment effect under the random and fixed effects models, that generalises most naturally, so this statistic provides our basis. This statistic is then used to derive a multivariate analogue of I2, which we call . We also provide a multivariate H2 statistic, the ratio of a generalisation of Cochran's heterogeneity statistic and its associated degrees of freedom, with an accompanying generalisation of the usual I2 statistic, . Our proposed heterogeneity statistics can be used alongside all the usual estimates and inferential procedures used in multivariate meta-analysis. We apply our methods to some real datasets and show how our statistics are equally appropriate in the context of multivariate meta-regression, where study level covariate effects are included in the model. Our heterogeneity statistics may be used when applying any procedure for fitting the multivariate random effects model. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22763950
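
    The univariate building blocks that the paper generalizes are Cochran's Q and the derived H2 and I2 statistics, computed from study-level effects and standard errors as sketched below with illustrative numbers.

      # Univariate heterogeneity statistics: Cochran's Q, H^2 and I^2.
      import numpy as np

      def heterogeneity(effects, ses):
          effects, ses = np.asarray(effects, float), np.asarray(ses, float)
          w = 1.0 / ses**2
          pooled = np.sum(w * effects) / np.sum(w)
          Q = np.sum(w * (effects - pooled)**2)
          df = len(effects) - 1
          H2 = Q / df
          I2 = max(0.0, (Q - df) / Q) if Q > 0 else 0.0
          return Q, H2, I2

      # Illustrative study effects and standard errors.
      print(heterogeneity([0.10, 0.35, 0.22, 0.55], [0.08, 0.10, 0.12, 0.09]))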

  20. Effect of physical exercise on spontaneous physical activity energy expenditure and energy intake in overweight adults (the EFECT study): a study protocol for a randomized controlled trial.

    PubMed

    Paravidino, Vitor Barreto; Mediano, Mauro Felippe Felix; Silva, Inácio Crochemore M; Wendt, Andrea; Del Vecchio, Fabrício Boscolo; Neves, Fabiana Alves; Terra, Bruno de Souza; Gomes, Erika Alvarenga Corrêa; Moura, Anibal Sanchez; Sichieri, Rosely

    2018-03-07

    Physical exercise interventions have been extensively advocated for the treatment of obesity; however, clinical trials evaluating the effectiveness of exercise interventions on weight control show controversial results. Compensatory mechanisms, through a decrease in energy expenditure and/or an increase in caloric consumption, are a possible explanation. Several physiological mechanisms involved in the energy balance could explain compensatory mechanisms, but the influences of physical exercise on these adjustments are still unclear. Therefore, the present trial aims to evaluate the effects of exercise on non-exercise physical activity energy expenditure, energy intake and appetite sensations among active overweight/obese adults, as well as to investigate hormonal changes associated with physical exercise. This study is a randomized controlled trial with parallel, three-group experimental arms. Eighty-one overweight/obese adults will be randomly allocated (1:1:1 ratio) to a vigorous exercise group, moderate exercise group or control group. The trial will be conducted at a military institution and the intervention groups will be submitted to exercise sessions in the evening, three times a week for 65 min, during a 2-week period. The primary outcome will be total spontaneous physical activity energy expenditure during a 2-week period. Secondary outcomes will be caloric intake, appetite sensations and laboratory biomarkers. Intention-to-treat analysis will be performed using linear mixed-effects models to evaluate the effect of the treatment-by-time interaction on primary and secondary outcomes. Data analysis will be performed using SAS 9.3 and statistical significance will be set at p < 0.05. The results of the present study will help to understand the effect of physical exercise training on subsequent non-exercise physical activity, appetite and energy intake, as well as the physiological mechanisms underlying a possible compensatory phenomenon, supporting the development of more effective interventions for the prevention and treatment of obesity. Physical Exercise and Energy Balance trial registry, trial registration number: NCT 03138187. Registered on 30 April 2017.

  1. Resource-constrained Data Collection and Fusion for Identifying Weak Distributed Patterns in Networks

    DTIC Science & Technology

    2013-10-15

    [Abstract unavailable for this record; only reference-list fragments were captured. They cite work on scan statistics for detecting activity and anomalies in graphs ("Detecting activity in graphs via the Graph Ellipsoid Scan Statistic," AISTATS 2013; "Near-optimal anomaly detection in graphs using Lovász Extended Scan Statistic"), related AISTATS 2010-2013 papers, and D. Aldous, "The random walk construction of uniform spanning trees".]

  2. Two-dimensional random surface model for asperity-contact in elastohydrodynamic lubrication

    NASA Technical Reports Server (NTRS)

    Coy, J. J.; Sidik, S. M.

    1979-01-01

    Relations for the asperity-contact time function during elastohydrodynamic lubrication of a ball bearing are presented. The analysis is based on a two-dimensional random surface model, and actual profile traces of the bearing surfaces are used as statistical sample records. The results of the analysis show that the transition from 90 percent contact to 1 percent contact occurs within a dimensionless film thickness range of approximately four to five. This thickness ratio is several times larger than that reported in the literature, where one-dimensional random surface models were used. It is shown that low-pass filtering of the statistical records will bring agreement between the present results and those in the literature.

  3. Simulation of flight maneuver-load distributions by utilizing stationary, non-Gaussian random load histories

    NASA Technical Reports Server (NTRS)

    Leybold, H. A.

    1971-01-01

    Random numbers were generated with the aid of a digital computer and transformed such that the probability density function of a discrete random load history composed of these random numbers had one of the following non-Gaussian distributions: Poisson, binomial, log-normal, Weibull, and exponential. The resulting random load histories were analyzed to determine their peak statistics and were compared with cumulative peak maneuver-load distributions for fighter and transport aircraft in flight.
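
    The transformation step can be done by the inverse-CDF method for the continuous distributions named above; a sketch with illustrative parameters (not those of the flight-load study) follows.

      # Sketch: turning uniform(0,1) random numbers into non-Gaussian samples
      # (Weibull, exponential, log-normal) via the inverse-CDF method.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      u = rng.uniform(size=100_000)

      weibull     = 2.0 * (-np.log1p(-u)) ** (1.0 / 1.5)    # scale 2, shape 1.5
      exponential = -3.0 * np.log1p(-u)                      # mean 3
      lognormal   = np.exp(0.0 + 0.5 * stats.norm.ppf(u))    # mu=0, sigma=0.5

      for name, x in [("Weibull", weibull), ("exponential", exponential), ("log-normal", lognormal)]:
          print(f"{name}: mean={x.mean():.3f}, skew={stats.skew(x):.3f}")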

  4. Two spatial light modulator system for laboratory simulation of random beam propagation in random media.

    PubMed

    Wang, Fei; Toselli, Italo; Korotkova, Olga

    2016-02-10

    An optical system consisting of a laser source and two independent consecutive phase-only spatial light modulators (SLMs) is shown to accurately simulate a generated random beam (first SLM) after interaction with a stationary random medium (second SLM). To illustrate the range of possibilities, a recently introduced class of random optical frames is examined on propagation in free space and several weak turbulent channels with Kolmogorov and non-Kolmogorov statistics.

  5. Impact of Simulated Microgravity on Cytoskeleton and Viscoelastic Properties of Endothelial Cell

    NASA Astrophysics Data System (ADS)

    Janmaleki, M.; Pachenari, M.; Seyedpour, S. M.; Shahghadami, R.; Sanati-Nezhad, A.

    2016-09-01

    This study focused on the effects of simulated microgravity (s-μg) on mechanical properties, major cytoskeleton biopolymers, and morphology of endothelial cells (ECs). The structural and functional integrity of ECs are vital to regulate vascular homeostasis and prevent atherosclerosis. Furthermore, these highly gravity sensitive cells play a key role in pathogenesis of many diseases. In this research, impacts of s-μg on mechanical behavior of human umbilical vein endothelial cells were investigated by utilizing a three-dimensional random positioning machine (3D-RPM). Results revealed a considerable drop in cell stiffness and viscosity after 24 hrs of being subjected to weightlessness. Cortical rigidity experienced relatively immediate and significant decline comparing to the stiffness of whole cell body. The cells became rounded in morphology while western blot analysis showed reduction of the main cytoskeletal components. Moreover, fluorescence staining confirmed disorganization of both actin filaments and microtubules (MTs). The results were compared statistically among test and control groups and it was concluded that s-μg led to a significant alteration in mechanical behavior of ECs due to remodeling of cell cytoskeleton.

  6. Analysis of Uniform Random Numbers Generated by RANDU and URN Using Ten Different Seeds.

    DTIC Science & Technology

    The statistical properties of the numbers generated by two uniform random number generators, RANDU and URN, each using ten different seeds, are... The testing is performed on a sequence of 50,000 numbers generated by each uniform random number generator using each of the ten seeds. (Author)
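
    RANDU is the classic multiplicative congruential generator x(n+1) = 65539 x(n) mod 2^31. The sketch below reimplements it and applies a simple chi-square test of uniformity to a sequence of 50,000 numbers; the particular test battery used in the report is not specified here, so the chi-square test is only an illustrative stand-in.

      # RANDU reimplementation plus a simple chi-square test of uniformity.
      import numpy as np
      from scipy import stats

      def randu(seed, n):
          x, out = seed, np.empty(n)
          for i in range(n):
              x = (65539 * x) % 2**31
              out[i] = x / 2**31
          return out

      u = randu(seed=1, n=50_000)
      observed, _ = np.histogram(u, bins=20, range=(0, 1))
      chi2, p = stats.chisquare(observed)        # expected: equal counts per bin
      print(f"chi-square={chi2:.2f}, p={p:.3f}")
      # RANDU's well-known flaw shows up in higher dimensions: consecutive triples
      # fall on only 15 planes in the unit cube, which 1-D uniformity tests miss.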

  7. Bayesian statistical inference enhances the interpretation of contemporary randomized controlled trials.

    PubMed

    Wijeysundera, Duminda N; Austin, Peter C; Hux, Janet E; Beattie, W Scott; Laupacis, Andreas

    2009-01-01

    Randomized trials generally use "frequentist" statistics based on P-values and 95% confidence intervals. Frequentist methods have limitations that might be overcome, in part, by Bayesian inference. To illustrate these advantages, we re-analyzed randomized trials published in four general medical journals during 2004. We used Medline to identify randomized superiority trials with two parallel arms, individual-level randomization and dichotomous or time-to-event primary outcomes. Studies with P<0.05 in favor of the intervention were deemed "positive"; otherwise, they were "negative." We used several prior distributions and exact conjugate analyses to calculate Bayesian posterior probabilities for clinically relevant effects. Of 88 included studies, 39 were positive using a frequentist analysis. Although the Bayesian posterior probabilities of any benefit (relative risk or hazard ratio<1) were high in positive studies, these probabilities were lower and variable for larger benefits. The positive studies had only moderate probabilities for exceeding the effects that were assumed for calculating the sample size. By comparison, there were moderate probabilities of any benefit in negative studies. Bayesian and frequentist analyses complement each other when interpreting the results of randomized trials. Future reports of randomized trials should include both.
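
    The exact conjugate analysis referred to above is straightforward for a dichotomous outcome: with beta priors on each arm's event probability, the posterior probability of any benefit (or of a larger, clinically relevant benefit) can be estimated by posterior simulation. The counts and priors in the sketch below are illustrative, not taken from the re-analyzed trials.

      # Sketch: conjugate beta-binomial re-analysis of a two-arm trial with a
      # dichotomous outcome, via posterior simulation.
      import numpy as np

      rng = np.random.default_rng(11)
      events_t, n_t = 30, 200        # intervention arm (illustrative counts)
      events_c, n_c = 45, 200        # control arm
      a0, b0 = 1.0, 1.0              # uniform Beta(1, 1) priors

      p_t = rng.beta(a0 + events_t, b0 + n_t - events_t, size=200_000)
      p_c = rng.beta(a0 + events_c, b0 + n_c - events_c, size=200_000)
      rr = p_t / p_c

      print("P(any benefit, RR < 1)      :", np.mean(rr < 1.0))
      print("P(larger benefit, RR < 0.75):", np.mean(rr < 0.75))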

  8. Perception of randomness: On the time of streaks.

    PubMed

    Sun, Yanlong; Wang, Hongbin

    2010-12-01

    People tend to think that streaks in random sequential events are rare and remarkable. When they actually encounter streaks, they tend to consider the underlying process as non-random. The present paper examines the time of pattern occurrences in sequences of Bernoulli trials, and shows that among all patterns of the same length, a streak is the most delayed pattern for its first occurrence. It is argued that when time is of the essence, how often a pattern is to occur (mean time, or frequency) and when a pattern is to first occur (waiting time) are different questions and bear different psychological relevance. The waiting time statistics may provide a quantitative measure of the psychological distance when people are expecting a probabilistic event, and such a measure is consistent with both the representativeness and availability heuristics in people's perception of randomness. We discuss some of the recent empirical findings and suggest that people's judgment and generation of random sequences may be guided by their actual experiences of the waiting time statistics. Published by Elsevier Inc.
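
    The distinction between frequency and waiting time is easy to verify by simulation: in fair coin flips, the streak HHH and the pattern HTH occur equally often in the long run, yet the expected number of flips until the first occurrence is 14 for HHH and 10 for HTH. A minimal sketch follows.

      # Waiting-time simulation: first occurrence of HHH versus HTH in fair coin flips.
      import random

      def first_occurrence(pattern, rng):
          seq = ""
          while True:
              seq += "H" if rng.random() < 0.5 else "T"
              if seq.endswith(pattern):
                  return len(seq)

      rng = random.Random(0)
      n = 50_000
      for pattern in ("HHH", "HTH"):
          mean_wait = sum(first_occurrence(pattern, rng) for _ in range(n)) / n
          print(pattern, round(mean_wait, 2))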

  9. Statistical mechanics based on fractional classical and quantum mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korichi, Z.; Meftah, M. T., E-mail: mewalid@yahoo.com

    2014-03-15

    The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. At the first stage we have presented the thermodynamical properties of the classical ideal gas and the system of N classical oscillators. In both cases, the Hamiltonian contains fractional exponents of the phase space (position and momentum). At the second stage, in the context of fractional quantum mechanics, we have calculated the thermodynamical properties for black-body radiation, studied Bose-Einstein statistics with the related problem of condensation, and studied Fermi-Dirac statistics.

  10. Philosophers assess randomized clinical trials: the need for dialogue.

    PubMed

    Miké, V

    1989-09-01

    In recent years a growing number of professional philosophers have joined in the controversy over ethical aspects of randomized clinical trials (RCTs). Morally questionable in their utilitarian approach, RCTs are claimed by some to be in direct violation of the second form of Kant's Categorical Imperative. But the arguments used in these critiques at times derive from a lack of insight into basic statistical procedures and the realities of the biomedical research process. Presented to physicians and other nonspecialists, including the lay public, such distortions can be harmful. Given the great complexity of statistical methodology and the anomalous nature of concepts of evidence, more sustained input into the interdisciplinary dialogue is needed from the statistical profession.

  11. How Statistics "Excel" Online.

    ERIC Educational Resources Information Center

    Chao, Faith; Davis, James

    2000-01-01

    Discusses the use of Microsoft Excel software and provides examples of its use in an online statistics course at Golden Gate University in the areas of randomness and probability, sampling distributions, confidence intervals, and regression analysis. (LRW)

  12. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  13. Statistical power and optimal design in experiments in which samples of participants respond to samples of stimuli.

    PubMed

    Westfall, Jacob; Kenny, David A; Judd, Charles M

    2014-10-01

    Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.

  14. Many-Body Localization and Thermalization in Quantum Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Nandkishore, Rahul; Huse, David A.

    2015-03-01

    We review some recent developments in the statistical mechanics of isolated quantum systems. We provide a brief introduction to quantum thermalization, paying particular attention to the eigenstate thermalization hypothesis (ETH) and the resulting single-eigenstate statistical mechanics. We then focus on a class of systems that fail to quantum thermalize and whose eigenstates violate the ETH: These are the many-body Anderson-localized systems; their long-time properties are not captured by the conventional ensembles of quantum statistical mechanics. These systems can forever locally remember information about their local initial conditions and are thus of interest for possibilities of storing quantum information. We discuss key features of many-body localization (MBL) and review a phenomenology of the MBL phase. Single-eigenstate statistical mechanics within the MBL phase reveal dynamically stable ordered phases, and phase transitions among them, that are invisible to equilibrium statistical mechanics and can occur at high energy and low spatial dimensionality, where equilibrium ordering is forbidden.

  15. [Mechanical thrombectomy in acute ischemic stroke. What is the position after the latest study results?].

    PubMed

    Hacke, W; Diener, H-C

    2015-06-01

    Mechanical devices for the recanalization of vessel occlusions in severe acute ischemic stroke have been developed for more than a decade. Several devices have been approved for clinical use on the basis of uncontrolled case series. Many neurologists have asked for randomized clinical trials comparing the new devices with standard treatment, e.g. thrombolytic therapy within a 4.5 h time window. The first 3 investigator initiated randomized trials published in 2013 failed to show superiority of mechanical thrombectomy over standard treatment. In the aftermath of these negative results several new trials with changes in design (e.g. shorter time window and only proximal vessel occlusions) and the use of modern devices with proven higher recanalization rates, so called stent retrievers, have been launched. In October 2014 the first of these new trials was presented and showed a clear superiority of thrombectomy. Based on this result interim analyses of five other studies were performed and most were prematurely terminated because of overwhelming efficacy. Only one trial testing another type of recanalization device failed to reach a statistically significant result. Currently five studies have already been published and two more studies have been presented at scientific conferences. This article provides an overview of the study protocols and the results of the individual studies, their common features and the characteristics of patients who benefit from this treatment. Finally, the consequences that these results may have for the treatment of patients with severe stroke caused by proximal vessel occlusion are discussed.

  16. [Efficiency of novel splash-proof ventilator circuit component on VAP and the colonization of multiple-drug resistant bacteria prevention in patients undergoing mechanical ventilation: a prospective randomized controlled intervention study with 318 patients].

    PubMed

    Xu, Songao; Yu, Huijie; Sun, Hui; Zhu, Xiangyun; Xu, Xiaoqin; Xu, Jun; Cao, Weizhong

    2017-01-01

    To investigate the efficiency of a closed tracheal suction system (CTSS) using a novel splash-proof ventilator circuit component in preventing ventilator-associated pneumonia (VAP) and the colonization of multiple-drug resistant bacteria (MDR) in patients undergoing mechanical ventilation (MV). A prospective single-blinded randomized parallel controlled intervention study was conducted. A total of 330 critically ill patients admitted to the intensive care unit (ICU) of the First Hospital of Jiaxing from January 2014 to May 2016 were enrolled, and they were divided equally into an open tracheal suction group, a closed tracheal suction group, and a splash-proof suction group using a random number table. The patients in the three groups used a conventional ventilator circuit component, a conventional CTSS, and a CTSS with the novel splash-proof ventilator circuit component for MV and sputum suction, respectively. The incidence of VAP, airway bacterial colonization rate, MDR and fungal colonization rates, duration of MV, length of ICU and hospital stay, and financial expenditure during hospitalization, as well as the in-hospital prognosis, were recorded. After excluding patients who did not meet the inclusion criteria, had incomplete data, or withdrew, 318 patients were finally included in the analysis. Compared with the open tracheal suction group, the total incidence of VAP was decreased in the closed tracheal suction group and the splash-proof suction group [20.95% (22/105), 21.90% (23/105) vs. 29.63% (32/108)], but no statistically significant difference was found (both P > 0.05), and the incidence of VAP infections/1 000 MV days showed the same trend (cases: 14.56, 17.35 vs. 23.07). The rate of airway bacterial colonization and the rate of MDR colonization in the open tracheal suction group and the splash-proof suction group were markedly lower than those of the closed tracheal suction group [32.41% (35/108), 28.57% (30/105) vs. 46.67% (49/105); 20.37% (22/108), 15.24% (16/105) vs. 39.05% (41/105)], with statistically significant differences (all P < 0.05). In addition, no statistically significant difference was found in the fungal colonization rate among the open tracheal, closed tracheal, and splash-proof suction groups (4.63%, 3.81% and 6.67%, respectively, P > 0.05). Compared with the closed tracheal suction group, the duration of MV and the lengths of ICU and hospital stay were shorter in the open tracheal suction group and the splash-proof suction group [duration of MV (days): 8.00 (4.00, 13.75), 8.00 (5.00, 13.00) vs. 9.00 (5.00, 16.00); length of ICU stay (days): 10.00 (6.00, 16.00), 11.00 (7.00, 19.00) vs. 13.00 (7.50, 22.00); length of hospital stay (days): 16.50 (9.25, 32.00), 19.00 (10.50, 32.50) vs. 21.00 (10.00, 36.00)], and financial expenditure during hospitalization was lower [10 thousand Yuan: 4.95 (3.13, 8.62), 5.47 (3.84, 9.41) vs. 6.52 (3.99, 11.02)], but without statistically significant differences (all P > 0.05). Moreover, no statistically significant difference in in-hospital prognosis was found among the three groups. A CTSS performed with the novel splash-proof ventilator circuit component shared similar advantages in preventing VAP with the conventional CTSS; meanwhile, it is superior in that it prevented MDR colonization and avoids the high price of the conventional CTSS. Clinical Trial Registration: Chinese Clinical Trial Registry, ChiCTR-IOR-16009694.

  17. Neither fixed nor random: weighted least squares meta-regression.

    PubMed

    Stanley, T D; Doucouliagos, Hristos

    2017-03-01

    Our study revisits and challenges two core conventional meta-regression estimators: the prevalent use of 'mixed-effects' or random-effects meta-regression analysis and the correction of standard errors that defines fixed-effects meta-regression analysis (FE-MRA). We show how and explain why an unrestricted weighted least squares MRA (WLS-MRA) estimator is superior to conventional random-effects (or mixed-effects) meta-regression when there is publication (or small-sample) bias, that is as good as FE-MRA in all cases and better than fixed effects in most practical applications. Simulations and statistical theory show that WLS-MRA provides satisfactory estimates of meta-regression coefficients that are practically equivalent to mixed effects or random effects when there is no publication bias. When there is publication selection bias, WLS-MRA always has smaller bias than mixed effects or random effects. In practical applications, an unrestricted WLS meta-regression is likely to give practically equivalent or superior estimates to fixed-effects, random-effects, and mixed-effects meta-regression approaches. However, random-effects meta-regression remains viable and perhaps somewhat preferable if selection for statistical significance (publication bias) can be ruled out and when random, additive normal heterogeneity is known to directly affect the 'true' regression coefficient. Copyright © 2016 John Wiley & Sons, Ltd.
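
    The unrestricted WLS-MRA estimator described above is ordinary weighted least squares with inverse-variance weights and an unrestricted multiplicative dispersion term (not forced to 1 as in fixed effects, and with no additive between-study variance as in random effects). A minimal numpy sketch with illustrative data follows.

      # Sketch: unrestricted weighted least squares meta-regression (WLS-MRA)
      # with inverse-variance weights and a multiplicative dispersion estimate.
      import numpy as np

      def wls_mra(effects, ses, covariate):
          y = np.asarray(effects, float)
          se = np.asarray(ses, float)
          X = np.column_stack([np.ones_like(y), np.asarray(covariate, float)])
          W = np.diag(1.0 / se**2)
          XtWX = X.T @ W @ X
          beta = np.linalg.solve(XtWX, X.T @ W @ y)
          resid = y - X @ beta
          phi = (resid @ W @ resid) / (len(y) - X.shape[1])   # unrestricted dispersion
          cov_beta = phi * np.linalg.inv(XtWX)
          return beta, np.sqrt(np.diag(cov_beta))

      # Illustrative study effects, standard errors, and a study-level moderator.
      effects = [0.12, 0.30, 0.05, 0.45, 0.22]
      ses     = [0.10, 0.08, 0.15, 0.09, 0.12]
      moderator = [0, 1, 0, 1, 0]
      print(wls_mra(effects, ses, moderator))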

  18. A new test statistic for climate models that includes field and spatial dependencies using Gaussian Markov random fields

    DOE PAGES

    Nosedal-Sanchez, Alvaro; Jackson, Charles S.; Huerta, Gabriel

    2016-07-20

    A new test statistic for climate model evaluation has been developed that potentially mitigates some of the limitations that exist for observing and representing field and space dependencies of climate phenomena. Traditionally such dependencies have been ignored when climate models have been evaluated against observational data, which makes it difficult to assess whether any given model is simulating observed climate for the right reasons. The new statistic uses Gaussian Markov random fields for estimating field and space dependencies within a first-order grid point neighborhood structure. We illustrate the ability of Gaussian Markov random fields to represent empirical estimates of field and space covariances using "witch hat" graphs. We further use the new statistic to evaluate the tropical response of a climate model (CAM3.1) to changes in two parameters important to its representation of cloud and precipitation physics. Overall, the inclusion of dependency information did not alter significantly the recognition of those regions of parameter space that best approximated observations. However, there were some qualitative differences in the shape of the response surface that suggest how such a measure could affect estimates of model uncertainty.
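
    As a rough illustration of the first-order neighborhood structure mentioned above, the sketch below assembles a sparse Gaussian Markov random field precision matrix on a small grid and evaluates the quadratic form that enters its Gaussian log-density. The grid size and precision parameters are arbitrary stand-ins, not the settings of the paper's test statistic.

```python
# Sketch of a first-order Gaussian Markov random field on a small grid: the
# sparse precision matrix Q couples each grid point to its N/S/E/W neighbors.
# Hypothetical parameters; not the authors' statistic.
import numpy as np
import scipy.sparse as sp

ny, nx = 8, 12
n = ny * nx
kappa, tau = 0.5, 1.0          # assumed precision parameters

# Off-diagonal part of the graph Laplacian of the first-order neighborhood
rows, cols, vals = [], [], []
def idx(i, j): return i * nx + j
for i in range(ny):
    for j in range(nx):
        for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            ii, jj = i + di, j + dj
            if 0 <= ii < ny and 0 <= jj < nx:
                rows.append(idx(i, j)); cols.append(idx(ii, jj)); vals.append(-1.0)
L = sp.coo_matrix((vals, (rows, cols)), shape=(n, n)).tocsr()
deg = -np.asarray(L.sum(axis=1)).ravel()
Q = tau * (sp.diags(deg + kappa) + L)   # sparse, positive-definite GMRF precision

# Quadratic form x'Qx that enters the Gaussian log-density of a field x
x = np.random.default_rng(1).normal(size=n)
print(float(x @ (Q @ x)))
```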

  19. Understanding regulatory networks requires more than computing a multitude of graph statistics. Comment on "Drivers of structural features in gene regulatory networks: From biophysical constraints to biological function" by O.C. Martin et al.

    NASA Astrophysics Data System (ADS)

    Tkačik, Gašper

    2016-07-01

    The article by O. Martin and colleagues provides a much needed systematic review of a body of work that relates the topological structure of genetic regulatory networks to evolutionary selection for function. This connection is very important. Using the current wealth of genomic data, statistical features of regulatory networks (e.g., degree distributions, motif composition, etc.) can be quantified rather easily; it is, however, often unclear how to interpret the results. On a graph theoretic level the statistical significance of the results can be evaluated by comparing observed graphs to "randomized" ones (bravely ignoring the issue of how precisely to randomize!) and comparing the frequency of appearance of a particular network structure relative to a randomized null expectation. While this is a convenient operational test for statistical significance, its biological meaning is questionable. In contrast, an in-silico genotype-to-phenotype model makes explicit the assumptions about the network function, and thus clearly defines the expected network structures that can be compared to the case of no selection for function and, ultimately, to data.
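
    The "randomized" null expectation referred to above is often implemented as degree-preserving edge rewiring. The sketch below uses networkx and an arbitrary stand-in network to compare an observed triangle count against such a rewired null; it illustrates the operational test discussed in the comment, not any specific published pipeline.

```python
# Sketch of the graph-randomization null: compare a motif count (here, triangles)
# in an "observed" network against degree-preserving rewired versions.
# Illustrative only; the network and parameter choices are ad hoc.
import networkx as nx
import numpy as np

G = nx.gnm_random_graph(100, 300, seed=2)          # stand-in "observed" network
observed = sum(nx.triangles(G).values()) // 3      # each triangle is counted three times

null_counts = []
for _ in range(200):
    H = G.copy()
    nx.double_edge_swap(H, nswap=4 * H.number_of_edges(), max_tries=10**5)
    null_counts.append(sum(nx.triangles(H).values()) // 3)

null_counts = np.array(null_counts)
z = (observed - null_counts.mean()) / null_counts.std()
print(f"triangles: observed={observed}, null mean={null_counts.mean():.1f}, z={z:.2f}")
```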

  20. Phylogeography Takes a Relaxed Random Walk in Continuous Space and Time

    PubMed Central

    Lemey, Philippe; Rambaut, Andrew; Welch, John J.; Suchard, Marc A.

    2010-01-01

    Research aimed at understanding the geographic context of evolutionary histories is burgeoning across biological disciplines. Recent endeavors attempt to interpret contemporaneous genetic variation in the light of increasingly detailed geographical and environmental observations. Such interest has promoted the development of phylogeographic inference techniques that explicitly aim to integrate such heterogeneous data. One promising development involves reconstructing phylogeographic history on a continuous landscape. Here, we present a Bayesian statistical approach to infer continuous phylogeographic diffusion using random walk models while simultaneously reconstructing the evolutionary history in time from molecular sequence data. Moreover, by accommodating branch-specific variation in dispersal rates, we relax the most restrictive assumption of the standard Brownian diffusion process and demonstrate increased statistical efficiency in spatial reconstructions of overdispersed random walks by analyzing both simulated and real viral genetic data. We further illustrate how drawing inference about summary statistics from a fully specified stochastic process over both sequence evolution and spatial movement reveals important characteristics of a rabies epidemic. Together with recent advances in discrete phylogeographic inference, the continuous model developments furnish a flexible statistical framework for biogeographical reconstructions that is easily expanded upon to accommodate various landscape genetic features. PMID:20203288

  1. About influence of input rate random part of nonstationary queue system on statistical estimates of its macroscopic indicators

    NASA Astrophysics Data System (ADS)

    Korelin, Ivan A.; Porshnev, Sergey V.

    2018-05-01

    A model of a non-stationary queueing system (NQS) is described. The input of this model receives a flow of requests with input rate λ = λdet(t) + λrnd(t), where λdet(t) is a deterministic function of time and λrnd(t) is a random function. The parameters of λdet(t) and λrnd(t) were identified from statistical information on visitor flows collected at various Russian football stadiums. Statistical modeling of the NQS was carried out and the following average dependences were obtained: the length of the queue of requests waiting for service, the average waiting time for service, and the number of visitors who have entered the stadium, each as a function of time. It is shown that these dependences can be characterized by the following parameters: the number of visitors who have entered by the time of the match; the time required to serve all incoming visitors; the maximum value; and the argument at which the studied dependence reaches its maximum. The dependence of these parameters on the energy ratio of the deterministic and random components of the input rate is investigated.
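
    A minimal simulation in the spirit of the described model is sketched below: arrival times are drawn from a nonhomogeneous Poisson process with rate λ(t) = λdet(t) + λrnd(t) by thinning and served by several parallel FIFO servers. All functional forms, rates, and service parameters here are assumptions for illustration and are not taken from the paper.

```python
# Sketch of a nonstationary queue: arrivals with rate lambda(t) = lambda_det(t)
# + lambda_rnd(t) generated by thinning, served by c parallel FIFO servers.
import numpy as np
import heapq

rng = np.random.default_rng(3)
T = 120.0                                                      # minutes before kickoff
lam_det = lambda t: 40.0 * np.exp(-((t - 90.0) / 25.0) ** 2)   # assumed deterministic peak
lam_rnd = lambda t: rng.uniform(0.0, 10.0)                     # crude stand-in random part
lam_max = 55.0                                                 # upper bound for thinning

# Thinning: propose points at rate lam_max, keep each with probability lam(t)/lam_max
t, arrivals = 0.0, []
while True:
    t += rng.exponential(1.0 / lam_max)
    if t > T:
        break
    if rng.uniform() < (lam_det(t) + lam_rnd(t)) / lam_max:
        arrivals.append(t)

c, service_mean = 6, 0.15                  # 6 turnstiles, ~9 s mean service time
free_at = [0.0] * c                        # min-heap of server free times
heapq.heapify(free_at)
waits = []
for a in arrivals:
    s = heapq.heappop(free_at)
    start = max(a, s)
    waits.append(start - a)
    heapq.heappush(free_at, start + rng.exponential(service_mean))

print(f"visitors: {len(arrivals)}, mean wait: {np.mean(waits):.2f} min, "
      f"all served by t = {max(free_at):.1f} min")
```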

  2. Method for simulating atmospheric turbulence phase effects for multiple time slices and anisoplanatic conditions.

    PubMed

    Roggemann, M C; Welsh, B M; Montera, D; Rhoadarmer, T A

    1995-07-10

    Simulating the effects of atmospheric turbulence on optical imaging systems is an important aspect of understanding the performance of these systems. Simulations are particularly important for understanding the statistics of some adaptive-optics system performance measures, such as the mean and variance of the compensated optical transfer function, and for understanding the statistics of estimators used to reconstruct intensity distributions from turbulence-corrupted image measurements. Current methods of simulating the performance of these systems typically make use of random phase screens placed in the system pupil. Methods exist for making random draws of phase screens that have the correct spatial statistics. However, simulating temporal effects and anisoplanatism requires one or more phase screens at different distances from the aperture, possibly moving with different velocities. We describe and demonstrate a method for creating random draws of phase screens with the correct space-time statistics for arbitrary turbulence and wind-velocity profiles, which can be placed in the telescope pupil in simulations. Results are provided for both the von Kármán and the Kolmogorov turbulence spectra. We also show how to simulate anisoplanatic effects with this technique.
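
    The standard FFT filtering recipe for drawing a single spatial phase screen with a Kolmogorov spectrum is sketched below: complex white noise is filtered by the square root of the phase power spectral density. Normalization conventions and the usual low-frequency (subharmonic) correction are glossed over, and the space-time extension described in the paper is not reproduced here.

```python
# Sketch of FFT-based generation of a random Kolmogorov phase screen:
# filter complex white noise by sqrt(phase PSD). Normalization is approximate.
import numpy as np

N, L, r0 = 256, 2.0, 0.1             # grid points, screen size [m], Fried parameter [m]
df = 1.0 / L                          # frequency grid spacing [1/m]
fx = np.fft.fftfreq(N, d=L / N)
FX, FY = np.meshgrid(fx, fx)
f = np.hypot(FX, FY)
f[0, 0] = np.inf                      # suppress the singular DC term

psd_phi = 0.023 * r0 ** (-5.0 / 3.0) * f ** (-11.0 / 3.0)   # Kolmogorov phase PSD

rng = np.random.default_rng(4)
noise = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
screen = np.real(np.fft.ifft2(noise * np.sqrt(psd_phi) * df)) * N ** 2
print(screen.shape, screen.std())     # phase screen in radians (up to normalization)
```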

  3. Effect of a stress management program on subjects with neck pain: A pilot randomized controlled trial.

    PubMed

    Metikaridis, T Damianos; Hadjipavlou, Alexander; Artemiadis, Artemios; Chrousos, George; Darviri, Christina

    2016-05-20

    Studies have shown that stress is implicated in the development of neck pain (NP). The purpose of this study is to examine the effect of a simple, zero-cost stress management program on patients suffering from NP. This is a parallel-type randomized clinical study. People suffering from chronic non-specific NP were randomly assigned to an eight-week stress management program (N = 28) (including diaphragmatic breathing and progressive muscle relaxation) or to a no-intervention control condition (N = 25). Self-report measures were used to evaluate various variables at the beginning and at the end of the eight-week monitoring period. Descriptive and inferential statistical methods were used for the analysis. At the end of the monitoring period, the intervention group showed a statistically significant reduction in stress and anxiety (p = 0.03, p = 0.01), reports of stress-related symptoms (p = 0.003), percentage of disability due to NP (p = 0.000) and NP intensity (p = 0.002). At the same time, daily routine satisfaction levels were elevated (p = 0.019). No statistically significant difference was observed in cortisol measurements. Stress management has positive effects on NP patients.

  4. A new test statistic for climate models that includes field and spatial dependencies using Gaussian Markov random fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nosedal-Sanchez, Alvaro; Jackson, Charles S.; Huerta, Gabriel

    A new test statistic for climate model evaluation has been developed that potentially mitigates some of the limitations that exist for observing and representing field and space dependencies of climate phenomena. Traditionally such dependencies have been ignored when climate models have been evaluated against observational data, which makes it difficult to assess whether any given model is simulating observed climate for the right reasons. The new statistic uses Gaussian Markov random fields for estimating field and space dependencies within a first-order grid point neighborhood structure. We illustrate the ability of Gaussian Markov random fields to represent empirical estimates of field and space covariances using "witch hat" graphs. We further use the new statistic to evaluate the tropical response of a climate model (CAM3.1) to changes in two parameters important to its representation of cloud and precipitation physics. Overall, the inclusion of dependency information did not alter significantly the recognition of those regions of parameter space that best approximated observations. However, there were some qualitative differences in the shape of the response surface that suggest how such a measure could affect estimates of model uncertainty.

  5. Unbiased split variable selection for random survival forests using maximally selected rank statistics.

    PubMed

    Wright, Marvin N; Dankowski, Theresa; Ziegler, Andreas

    2017-04-15

    The most popular approach for analyzing survival data is the Cox regression model. The Cox model may, however, be misspecified, and its proportionality assumption may not always be fulfilled. An alternative approach for survival prediction is random forests for survival outcomes. The standard split criterion for random survival forests is the log-rank test statistic, which favors splitting variables with many possible split points. Conditional inference forests avoid this split variable selection bias. However, linear rank statistics are utilized by default in conditional inference forests to select the optimal splitting variable, which cannot detect non-linear effects in the independent variables. An alternative is to use maximally selected rank statistics for the split point selection. As in conditional inference forests, splitting variables are compared on the p-value scale. However, instead of the conditional Monte-Carlo approach used in conditional inference forests, p-value approximations are employed. We describe several p-value approximations and the implementation of the proposed random forest approach. A simulation study demonstrates that unbiased split variable selection is possible. However, there is a trade-off between unbiased split variable selection and runtime. In benchmark studies of prediction performance on simulated and real datasets, the new method performs better than random survival forests if informative dichotomous variables are combined with uninformative variables with more categories and better than conditional inference forests if non-linear covariate effects are included. In a runtime comparison, the method proves to be computationally faster than both alternatives, if a simple p-value approximation is used. Copyright © 2017 John Wiley & Sons, Ltd.
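
    The split-point search at the heart of maximally selected rank statistics can be sketched as follows for a single candidate variable: scan interior cut points, compute the log-rank statistic for each induced two-group split, and keep the maximum. The example below uses synthetic data and the lifelines package as a stand-in for the log-rank computation; the p-value approximation for the maximum, which is the method's key ingredient, is not reproduced.

```python
# Sketch of a maximally selected log-rank statistic for one candidate split
# variable; synthetic survival data, illustrative only.
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(5)
n = 200
x = rng.uniform(size=n)                        # candidate split variable
time = rng.exponential(1.0 + 2.0 * (x > 0.6))  # longer survival above an (unknown) cut
event = rng.uniform(size=n) < 0.8              # ~20% censoring

best_stat, best_cut = -np.inf, None
for cut in np.quantile(x, np.linspace(0.1, 0.9, 17)):   # interior cut points only
    left = x <= cut
    res = logrank_test(time[left], time[~left],
                       event_observed_A=event[left], event_observed_B=event[~left])
    if res.test_statistic > best_stat:
        best_stat, best_cut = res.test_statistic, cut

print(f"maximally selected log-rank statistic {best_stat:.2f} at cut {best_cut:.2f}")
```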

  6. North-south asymmetry of solar activity as a superposition of two realizations - the sign and absolute value

    NASA Astrophysics Data System (ADS)

    Badalyan, O. G.; Obridko, V. N.

    2017-07-01

    Context. Since the occurrence of north-south asymmetry (NSA) of alternating sign may be determined by different mechanisms, the frequency and amplitude characteristics of this phenomenon should be considered separately. Aims: We propose a new approach to the description of the NSA of solar activity. Methods: The asymmetry, defined as A = (N - S)/(N + S) (where N and S are, respectively, the activity indices of the northern and southern hemispheres), is treated as a superposition of two functions: the sign of the asymmetry (signature) and its absolute value (modulus). This approach is applied to the analysis of the NSA of sunspot group areas for the period 1874-2013. Results: We show that the sign of the asymmetry provides information on the behavior of the asymmetry. In particular, it displays quasi-periodic variation with a period of 12 yr and quasi-biennial oscillations, as does the asymmetry itself. The statistics of the so-called monochrome intervals (long periods of positive or negative asymmetry) are considered, and it is shown that the distribution of these intervals is described by a random distribution law. This means that the dynamo mechanisms governing the cyclic variation of solar activity must involve random processes. At the same time, the asymmetry modulus has completely different statistical properties and is probably associated with processes that determine the amplitude of the cycle. One can reliably isolate an 11-yr cycle in the behavior of the asymmetry absolute value, shifted by half a period with respect to the Wolf numbers. It is shown that the asymmetry modulus has a significant prognostic value: the higher the maximum of the asymmetry modulus, the lower the following Wolf number maximum. Conclusions: The fundamental nature of this concept of NSA is discussed in the context of the general methodology of cognizing the world. It is supposed that the proposed description of the NSA will help clarify the nature of this phenomenon.
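
    The proposed decomposition is straightforward to compute; the short sketch below applies it to synthetic hemispheric activity indices (not the sunspot group areas analyzed in the paper).

```python
# Tiny sketch of decomposing the north-south asymmetry A = (N - S)/(N + S)
# into its sign (signature) and absolute value (modulus), on synthetic data.
import numpy as np

rng = np.random.default_rng(6)
N = rng.gamma(2.0, 50.0, size=500)      # stand-in northern-hemisphere activity index
S = rng.gamma(2.0, 50.0, size=500)      # stand-in southern-hemisphere activity index

A = (N - S) / (N + S)
signature = np.sign(A)                  # carries the timing/frequency information
modulus = np.abs(A)                     # carries the amplitude information
print(signature[:10], modulus.mean())
```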

  7. Conventional versus computer-navigated TKA: a prospective randomized study.

    PubMed

    Todesca, Alessandro; Garro, Luca; Penna, Massimo; Bejui-Hugues, Jacques

    2017-06-01

    The purpose of this study was to assess the midterm results of total knee arthroplasty (TKA) implanted with a specific computer navigation system in one group of patients (NAV) and the same prosthesis implanted with the conventional technique in another group (CON); we hypothesized that computer-navigated surgery would improve implant alignment, functional scores and survival of the implant compared to the conventional technique. From 2008 to 2009, 225 patients were enrolled in the study and randomly assigned to the CON and NAV groups; 240 consecutive mobile-bearing ultra-congruent SCORE (Amplitude, Valence, France) TKAs were performed by a single surgeon, 117 using the conventional method and 123 using the computer-navigated approach. Clinical outcome assessment was based on the Knee Society Score (KSS), the Hospital for Special Surgery knee score and the Western Ontario and McMaster Universities index score. Component survival was calculated by Kaplan-Meier analysis. Median follow-up was 6.4 years (range 6-7 years). Two patients were lost to follow-up. No differences were seen between the two groups in age, sex, BMI or side of implantation. Three patients in the CON group reported feelings of instability during walking, but clinical tests were all negative. The NAV group showed a statistically significantly better KSS score, wider ROM, and fewer outliers from the neutral mechanical axis, lateral distal femoral angle, medial proximal tibial angle and tibial slope in the post-operative radiographic assessment. There was one case of early post-operative superficial infection (caused by Staphylococcus aureus) successfully treated with antibiotics. No mechanical loosening, mobile-bearing dislocation or patellofemoral complication was seen. At 7 years of follow-up, component survival in relation to the risk of aseptic loosening or other complications was 100%. There were no implant revisions. This study demonstrates superior accuracy in implant positioning and statistically significantly better functional outcomes for computer-navigated TKA. Computer navigation for TKAs should be used routinely in primary implants. Level of evidence: II.

  8. Effects of a saw palmetto herbal blend in men with symptomatic benign prostatic hyperplasia.

    PubMed

    Marks, L S; Partin, A W; Epstein, J I; Tyler, V E; Simon, I; Macairan, M L; Chan, T L; Dorey, F J; Garris, J B; Veltri, R W; Santos, P B; Stonebrook, K A; deKernion, J B

    2000-05-01

    We tested the effects of a saw palmetto herbal blend in men with symptomatic benign prostatic hyperplasia (BPH) in a randomized, placebo-controlled trial. We randomized 44 men 45 to 80 years old with symptomatic BPH into a trial of a saw palmetto herbal blend versus placebo. End points included routine clinical measures (symptom score, uroflowmetry and post-void residual urine volume), blood chemistry studies (prostate specific antigen, sex hormones and multiphasic analysis), prostate volumetrics by magnetic resonance imaging, and prostate biopsy for zonal tissue morphometry and semiquantitative histology studies. The saw palmetto herbal blend and placebo groups both had improved clinical parameters, with a slight advantage in the saw palmetto group (not statistically significant). Neither prostate specific antigen nor prostate volume changed from baseline. Prostate epithelial contraction was noted, especially in the transition zone, where percent epithelium decreased from 17.8% at baseline to 10.7% after 6 months of saw palmetto herbal blend (p <0.01). Histological studies showed that the percent of atrophic glands increased from 25.2% to 40.9% after treatment with saw palmetto herbal blend (p <0.01). The mechanism of action appeared to be nonhormonal, but it was not identified by tissue studies of apoptosis, cellular proliferation, angiogenesis, growth factors or androgen receptor expression. We noted no adverse effects of saw palmetto herbal blend. When the study was no longer blinded, 41 men elected to continue therapy in an open label extension. Saw palmetto herbal blend appears to be a safe, highly desirable option for men with moderately symptomatic BPH. The secondary outcome measures of clinical effect in our study were only slightly better for saw palmetto herbal blend than placebo (not statistically significant). However, saw palmetto herbal blend therapy was associated with epithelial contraction, especially in the transition zone (p <0.01), indicating a possible mechanism of action underlying the clinical significance detected in other studies.

  9. Compendium of Abstracts on Statistical Applications in Geotechnical Engineering.

    DTIC Science & Technology

    1983-09-01

    research in the application of probabilistic and statistical methods to soil mechanics, rock mechanics, and engineering geology problems have grown markedly...probability, statistics, soil mechanics, rock mechanics, and engineering geology. 2. The purpose of this report is to make available to the U. S...Deformation Dynamic Response Analysis Seepage, Soil Permeability and Piping Earthquake Engineering, Seismology, Settlement and Heave Seismic Risk Analysis

  10. A Statistical Method to Distinguish Functional Brain Networks

    PubMed Central

    Fujita, André; Vidal, Maciel C.; Takahashi, Daniel Y.

    2017-01-01

    One major problem in neuroscience is the comparison of functional brain networks of different populations, e.g., distinguishing the networks of controls and patients. Traditional algorithms are based on searching for isomorphism between networks, assuming that they are deterministic. However, biological networks present randomness that cannot be well modeled by those algorithms. For instance, functional brain networks of distinct subjects of the same population can be different due to individual characteristics. Moreover, networks of subjects from different populations can be generated through the same stochastic process. Thus, a better hypothesis is that networks are generated by random processes. In this case, subjects from the same group are samples from the same random process, whereas subjects from different groups are generated by distinct processes. Using this idea, we developed a statistical test called ANOGVA to test whether two or more populations of graphs are generated by the same random graph model. Our simulations' results demonstrate that we can precisely control the rate of false positives and that the test is powerful to discriminate random graphs generated by different models and parameters. The method also proved to be robust for unbalanced data. As an example, we applied ANOGVA to an fMRI dataset composed of controls and patients diagnosed with autism or Asperger syndrome. ANOGVA identified the cerebellar functional sub-network as statistically different between controls and autism (p < 0.001). PMID:28261045

  11. A Statistical Method to Distinguish Functional Brain Networks.

    PubMed

    Fujita, André; Vidal, Maciel C; Takahashi, Daniel Y

    2017-01-01

    One major problem in neuroscience is the comparison of functional brain networks of different populations, e.g., distinguishing the networks of controls and patients. Traditional algorithms are based on searching for isomorphism between networks, assuming that they are deterministic. However, biological networks present randomness that cannot be well modeled by those algorithms. For instance, functional brain networks of distinct subjects of the same population can be different due to individual characteristics. Moreover, networks of subjects from different populations can be generated through the same stochastic process. Thus, a better hypothesis is that networks are generated by random processes. In this case, subjects from the same group are samples from the same random process, whereas subjects from different groups are generated by distinct processes. Using this idea, we developed a statistical test called ANOGVA to test whether two or more populations of graphs are generated by the same random graph model. Our simulations' results demonstrate that we can precisely control the rate of false positives and that the test is powerful to discriminate random graphs generated by different models and parameters. The method also proved to be robust for unbalanced data. As an example, we applied ANOGVA to an fMRI dataset composed of controls and patients diagnosed with autism or Asperger syndrome. ANOGVA identified the cerebellar functional sub-network as statistically different between controls and autism (p < 0.001).

  12. Robust inference from multiple test statistics via permutations: a better alternative to the single test statistic approach for randomized trials.

    PubMed

    Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie

    2013-01-01

    Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is modest. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
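
    A minimal sketch of the minimum p-value approach on synthetic two-arm data is given below: two pre-specified statistics are computed, the smaller p-value is taken as the observed statistic, and its reference distribution is obtained by permuting treatment labels. The particular statistics and the use of scipy here are illustrative assumptions, not the paper's choices.

```python
# Sketch of the minimum p-value statistic calibrated by label permutations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
treat = rng.normal(0.4, 1.0, size=60)
control = rng.normal(0.0, 1.0, size=60)

def min_p(a, b):
    p_t = stats.ttest_ind(a, b).pvalue                               # parametric statistic
    p_w = stats.mannwhitneyu(a, b, alternative="two-sided").pvalue   # rank-based statistic
    return min(p_t, p_w)

observed = min_p(treat, control)

pooled = np.concatenate([treat, control])
n_perm, count = 2000, 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)
    if min_p(perm[:60], perm[60:]) <= observed:
        count += 1

print(f"observed min-p = {observed:.4f}, "
      f"permutation p-value = {(count + 1) / (n_perm + 1):.4f}")
```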

  13. Interpreting “statistical hypothesis testing” results in clinical research

    PubMed Central

    Sarmukaddam, Sanjeev B.

    2012-01-01

    The difference between “clinical significance” and “statistical significance” should be kept in mind while interpreting “statistical hypothesis testing” results in clinical research. This fact is already known to many, but it is pointed out here again because the philosophy of “statistical hypothesis testing” is sometimes criticized unnecessarily, mainly due to a failure to consider this distinction. Randomized controlled trials are sometimes wrongly criticized in the same way. That a scientific method may not be applicable in some peculiar/particular situation does not mean that the method is useless. Also remember that “statistical hypothesis testing” is not for decision making, and the field of “decision analysis” is very much an integral part of the science of statistics. It is not correct to say that “confidence intervals have nothing to do with confidence” unless one understands the meaning of the word “confidence” as used in the context of a confidence interval. Interpretation of the results of every study should always consider all possible alternative explanations such as chance, bias, and confounding. Statistical tests in inferential statistics are, in general, designed to answer the question “How likely is it that the difference found in the random sample(s) is due to chance?”; therefore, relying only on statistical significance in making clinical decisions should be avoided. PMID:22707861

  14. The efficacy of a behavioral activation intervention among depressed US Latinos with limited English language proficiency: study protocol for a randomized controlled trial.

    PubMed

    Collado, Anahi; Long, Katherine E; MacPherson, Laura; Lejuez, Carl W

    2014-06-18

    Major depressive disorder is highly prevalent among Latinos with limited English language proficiency in the United States. Although major depressive disorder is highly treatable, barriers to depression treatment have historically prevented Latinos with limited English language proficiency from accessing effective interventions. The project seeks to evaluate the efficacy of behavioral activation treatment for depression, an empirically supported treatment for depression, as an intervention that may address some of the disparities surrounding the receipt of efficacious mental health care for this population. Following a pilot study of behavioral activation treatment for depression with 10 participants which yielded very promising results, the current study is a randomized controlled trial testing behavioral activation treatment for depression versus a supportive counseling treatment for depression. We are in the process of recruiting 60 Latinos with limited English language proficiency meeting criteria for major depressive disorder according to the Diagnostic and Statistical Manual of Mental Disorders, 4th and 5th Editions, for participation in a single-center efficacy trial. Participants are randomized to receive 10 sessions of behavioral activation treatment for depression (n = 30) or 10 sessions of supportive counseling (n = 30). Assessments occur prior to each session and at 1 month after completing treatment. Intervention targets include depressive symptomatology and the proposed mechanisms of behavioral activation treatment for depression: activity level and environmental reward. We will also examine other factors related to treatment outcome such as treatment adherence, treatment satisfaction, and therapeutic alliance. This randomized controlled trial will allow us to determine the efficacy of behavioral activation treatment for depression in a fast-growing, yet highly underserved population in US mental health services. The study is also among the first to examine the effect of the proposed mechanisms of change of behavioral activation treatment for depression (that is, activity level and environmental reward) on depression over time. To our knowledge, this is the first randomized controlled trial to compare an empirically supported treatment to a control supportive counseling condition in a sample of depressed, Spanish-speaking Latinos in the United States. Clinical Trials Register: NCT01958840; registered 8 October 2013.

  15. Twice random, once mixed: applying mixed models to simultaneously analyze random effects of language and participants.

    PubMed

    Janssen, Dirk P

    2012-03-01

    Psychologists, psycholinguists, and other researchers using language stimuli have been struggling for more than 30 years with the problem of how to analyze experimental data that contain two crossed random effects (items and participants). The classical analysis of variance does not apply; alternatives have been proposed but have failed to catch on, and a statistically unsatisfactory procedure of using two approximations (known as F(1) and F(2)) has become the standard. A simple and elegant solution using mixed model analysis has been available for 15 years, and recent improvements in statistical software have made mixed-model analysis widely available. The aim of this article is to increase the use of mixed models by giving a concise practical introduction and by giving clear directions for undertaking the analysis in the most popular statistical packages. The article also introduces the DJMIXED add-on package for SPSS, which makes entering the models and reporting their results as straightforward as possible.

  16. Does rational selection of training and test sets improve the outcome of QSAR modeling?

    PubMed

    Martin, Todd M; Harten, Paul; Young, Douglas M; Muratov, Eugene N; Golbraikh, Alexander; Zhu, Hao; Tropsha, Alexander

    2012-10-22

    Prior to using a quantitative structure-activity relationship (QSAR) model for external predictions, its predictive power should be established and validated. In the absence of a true external data set, the best way to validate the predictive ability of a model is to perform its statistical external validation. In statistical external validation, the overall data set is divided into training and test sets. Commonly, this splitting is performed using random division. Rational splitting methods can divide data sets into training and test sets in an intelligent fashion. The purpose of this study was to determine whether rational division methods lead to more predictive models compared to random division. A special data splitting procedure was used to facilitate the comparison between random and rational division methods. For each toxicity end point, the overall data set was divided into a modeling set (80% of the overall set) and an external evaluation set (20% of the overall set) using random division. The modeling set was then subdivided into a training set (80% of the modeling set) and a test set (20% of the modeling set) using rational division methods and by using random division. The Kennard-Stone, minimal test set dissimilarity, and sphere exclusion algorithms were used as the rational division methods. The hierarchical clustering, random forest, and k-nearest neighbor (kNN) methods were used to develop QSAR models based on the training sets. For kNN QSAR, multiple training and test sets were generated, and multiple QSAR models were built. The results of this study indicate that models based on rational division methods generate better statistical results for the test sets than models based on random division, but the predictive power of both types of models is comparable.
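
    Of the rational division methods compared, the Kennard-Stone algorithm is the simplest to sketch: start from the two most distant samples, then repeatedly add the sample whose minimum distance to the already-selected set is largest. The toy descriptor matrix below is an assumption of the sketch; a real QSAR workflow would normalize descriptors first.

```python
# Sketch of Kennard-Stone rational splitting into training and test indices.
import numpy as np
from scipy.spatial.distance import cdist

def kennard_stone(X, n_train):
    d = cdist(X, X)
    selected = list(np.unravel_index(np.argmax(d), d.shape))   # two most distant samples
    remaining = [i for i in range(len(X)) if i not in selected]
    while len(selected) < n_train:
        # distance of each remaining sample to its nearest already-selected sample
        min_d = d[np.ix_(remaining, selected)].min(axis=1)
        pick = remaining.pop(int(np.argmax(min_d)))
        selected.append(pick)
    return selected, remaining   # training indices, test indices

X = np.random.default_rng(8).normal(size=(50, 5))   # toy descriptor matrix
train_idx, test_idx = kennard_stone(X, n_train=40)
print(len(train_idx), len(test_idx))
```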

  17. TVT-Exact and midurethral sling (SLING-IUFT) operative procedures: a randomized study

    PubMed Central

    Aniulis, Povilas; Skaudickas, Darijus

    2015-01-01

    Objectives The aim of the study is to compare the results, effectiveness and complications of TVT-Exact and midurethral sling (SLING-IUFT) operations in the treatment of female stress urinary incontinence (SUI). Methods A single center, nonblind, randomized study of women with SUI who were randomized to TVT-Exact and SLING-IUFT was performed by one surgeon from April 2009 to April 2011. SUI was diagnosed by coughing and Valsalva tests, and urodynamics (cystometry and uroflowmetry) were assessed before operation and 1 year after surgery. This was a prospective randomized study. The follow-up period was 12 months. 76 patients were operated on using the TVT-Exact procedure and 78 patients using the SLING-IUFT procedure. There were no statistically significant differences between groups in BMI, parity, menopausal status and prolapse stage (no patients had cystocele greater than stage II). Results Mean operative time was significantly shorter in the SLING-IUFT group (19 ± 5.6 min) compared with the TVT-Exact group (27 ± 7.1 min). There were statistically significant differences in the effectiveness of the two procedures: TVT-Exact at 94.5% and SLING-IUFT at 61.2% after one year. Hospital stay was statistically significantly shorter in the SLING-IUFT group (1.2 ± 0.5 days) compared with the TVT-Exact group (3.5 ± 1.5 days). Statistically significantly fewer complications occurred in the SLING-IUFT group. Conclusion The TVT-Exact and SLING-IUFT operations are both effective for surgical treatment of female stress urinary incontinence. The SLING-IUFT involved a shorter operation time and a lower complication rate; the TVT-Exact procedure had statistically significantly more complications than the SLING-IUFT operation, but higher effectiveness. PMID:28352711

  18. TVT-Exact and midurethral sling (SLING-IUFT) operative procedures: a randomized study.

    PubMed

    Aniuliene, Rosita; Aniulis, Povilas; Skaudickas, Darijus

    2015-01-01

    The aim of the study is to compare the results, effectiveness and complications of TVT-Exact and midurethral sling (SLING-IUFT) operations in the treatment of female stress urinary incontinence (SUI). A single center, nonblind, randomized study of women with SUI who were randomized to TVT-Exact and SLING-IUFT was performed by one surgeon from April 2009 to April 2011. SUI was diagnosed by coughing and Valsalva tests, and urodynamics (cystometry and uroflowmetry) were assessed before operation and 1 year after surgery. This was a prospective randomized study. The follow-up period was 12 months. 76 patients were operated on using the TVT-Exact procedure and 78 patients using the SLING-IUFT procedure. There were no statistically significant differences between groups in BMI, parity, menopausal status and prolapse stage (no patients had cystocele greater than stage II). Mean operative time was significantly shorter in the SLING-IUFT group (19 ± 5.6 min) compared with the TVT-Exact group (27 ± 7.1 min). There were statistically significant differences in the effectiveness of the two procedures: TVT-Exact at 94.5% and SLING-IUFT at 61.2% after one year. Hospital stay was statistically significantly shorter in the SLING-IUFT group (1.2 ± 0.5 days) compared with the TVT-Exact group (3.5 ± 1.5 days). Statistically significantly fewer complications occurred in the SLING-IUFT group. The TVT-Exact and SLING-IUFT operations are both effective for surgical treatment of female stress urinary incontinence. The SLING-IUFT involved a shorter operation time and a lower complication rate; the TVT-Exact procedure had statistically significantly more complications than the SLING-IUFT operation, but higher effectiveness.

  19. Coordinate based random effect size meta-analysis of neuroimaging studies.

    PubMed

    Tench, C R; Tanasescu, Radu; Constantinescu, C S; Auer, D P; Cottam, W J

    2017-06-01

    Low power in neuroimaging studies can make them difficult to interpret, and coordinate-based meta-analysis (CBMA) may go some way to mitigating this issue. CBMA has been used in many analyses to detect where published functional MRI or voxel-based morphometry studies testing similar hypotheses report significant summary results (coordinates) consistently. Only the reported coordinates and possibly t statistics are analysed, and statistical significance of clusters is determined by coordinate density. Here a method of performing coordinate-based random effect size meta-analysis and meta-regression is introduced. The algorithm (ClusterZ) analyses both coordinates and reported t statistics or Z scores, standardised by the number of subjects. Statistical significance is determined not by coordinate density, but by a random-effects meta-analysis of reported effects performed cluster-wise using standard statistical methods and taking account of censoring inherent in the published summary results. Type 1 error control is achieved using the false cluster discovery rate (FCDR), which is based on the false discovery rate. This controls both the family-wise error rate under the null hypothesis that coordinates are randomly drawn from a standard stereotaxic space, and the proportion of significant clusters that are expected under the null. Such control is necessary to avoid propagating and even amplifying the very issues motivating the meta-analysis in the first place. ClusterZ is demonstrated on both numerically simulated data and on real data from reports of grey matter loss in multiple sclerosis (MS) and syndromes suggestive of MS, and of painful stimulus in healthy controls. The software implementation is available to download and use freely. Copyright © 2017 Elsevier Inc. All rights reserved.
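
    The cluster-wise random-effects meta-analysis that the method builds on can be illustrated with a standard DerSimonian-Laird estimator, sketched below on simulated standardized effects. This is not ClusterZ: the censoring handling and the false cluster discovery rate machinery are omitted, and the variance model is a rough assumption.

```python
# Sketch of a DerSimonian-Laird random-effects meta-analysis of reported effects.
import numpy as np

def dersimonian_laird(y, v):
    """y: reported effects, v: their within-study variances."""
    w = 1.0 / v
    y_fe = np.sum(w * y) / np.sum(w)                 # fixed-effect pooled estimate
    q = np.sum(w * (y - y_fe) ** 2)                  # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)          # between-study variance
    w_re = 1.0 / (v + tau2)
    y_re = np.sum(w_re * y) / np.sum(w_re)           # random-effects pooled estimate
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return y_re, se_re, tau2

rng = np.random.default_rng(9)
n_subj = rng.integers(15, 60, size=12)               # subjects in each reporting study
v = 1.0 / n_subj                                     # crude variance of a standardized effect
y = rng.normal(0.5, np.sqrt(v + 0.05))               # simulated reported effects
print(dersimonian_laird(y, v))
```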

  20. Gaps between avalanches in one-dimensional random-field Ising models

    NASA Astrophysics Data System (ADS)

    Nampoothiri, Jishnu N.; Ramola, Kabir; Sabhapandit, Sanjib; Chakraborty, Bulbul

    2017-09-01

    We analyze the statistics of gaps (ΔH) between successive avalanches in one-dimensional random-field Ising models (RFIMs) in an external field H at zero temperature. In the first part of the paper we study the nearest-neighbor ferromagnetic RFIM. We map the sequence of avalanches in this system to a nonhomogeneous Poisson process with an H-dependent rate ρ(H). We use this to analytically compute the distribution of gaps P(ΔH) between avalanches as the field is increased monotonically from -∞ to +∞. We show that P(ΔH) tends to a constant C(R) as ΔH → 0+, which displays a nontrivial behavior with the strength of disorder R. We verify our predictions with numerical simulations. In the second part of the paper, motivated by avalanche gap distributions in driven disordered amorphous solids, we study a long-range antiferromagnetic RFIM. This model displays a gapped behavior P(ΔH) = 0 up to a system-size-dependent offset value ΔH_off, and P(ΔH) ~ (ΔH - ΔH_off)^θ as ΔH → ΔH_off+. We perform numerical simulations on this model and determine θ ≈ 0.95(5). We also discuss mechanisms which would lead to a nonzero exponent θ for general spin models with quenched random fields.
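
    The mapping to a nonhomogeneous Poisson process suggests a simple way to sample avalanche fields and gap statistics by thinning; the sketch below does this for an arbitrary stand-in rate ρ(H), not the rate derived in the paper.

```python
# Sketch of sampling avalanche fields from a nonhomogeneous Poisson process with
# an H-dependent rate rho(H) via thinning, and collecting the gaps DeltaH.
import numpy as np

rng = np.random.default_rng(10)
rho = lambda H: 2.0 + 1.5 * np.cos(H)        # assumed H-dependent avalanche rate
rho_max, H_min, H_max = 3.5, -20.0, 20.0

H, events = H_min, []
while True:
    H += rng.exponential(1.0 / rho_max)       # candidate field increments
    if H > H_max:
        break
    if rng.uniform() < rho(H) / rho_max:      # thinning keeps events with prob rho/rho_max
        events.append(H)

gaps = np.diff(events)                        # DeltaH between successive avalanches
hist, edges = np.histogram(gaps, bins=30, density=True)
print(f"{len(events)} avalanches, mean gap {gaps.mean():.3f}, "
      f"P(DeltaH -> 0) ~ {hist[0]:.2f}")
```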
