Sample records for prequantum classical statistical

  1. Prequantum classical statistical field theory: background field as a source of everything?

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2011-07-01

    Prequantum classical statistical field theory (PCSFT) is a new attempt to consider quantum mechanics (QM) as an emergent phenomenon, cf. De Broglie's "double solution" approach, Bohmian mechanics, stochastic electrodynamics (SED), Nelson's stochastic QM and its generalization by Davidson, and 't Hooft's models and their development by Elze. PCSFT is a comeback to a purely wave viewpoint on QM, cf. the early Schrödinger. There are no quantum particles at all, only waves. In particular, photons are simply wave-pulses of the classical electromagnetic field, cf. SED. Moreover, even massive particles are special "prequantum fields": the electron field, the neutron field, and so on. PCSFT claims that (sooner or later) people will be able to measure components of these fields: components of the "photonic field" (the classical electromagnetic field of low intensity), the electronic field, the neutronic field, and so on. At the moment we are able to produce quantum correlations as correlations of classical Gaussian random fields. In this paper we are interested in the mathematical and physical reasons for the use of Gaussian fields. We consider prequantum signals (corresponding to quantum systems) as composed of a huge number of wave-pulses (on a very fine prequantum time scale). We speculate that the prequantum background field (the field of "vacuum fluctuations") might play the role of a source of such pulses, i.e., the source of everything.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khrennikov, Andrei

    We present the fundamentals of a prequantum model with hidden variables of the classical field type. In some sense this is the comeback of classical wave mechanics. Our approach can also be considered as an incorporation of quantum mechanics into classical signal theory. All quantum averages (including correlations of entangled systems) can be represented as classical signal averages and correlations.

  3. Quantum correlations and dynamics from classical random fields valued in complex Hilbert spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khrennikov, Andrei

    2010-08-15

    One of the crucial differences between mathematical models of classical and quantum mechanics (QM) is the use of the tensor product of the state spaces of subsystems as the state space of the corresponding composite system. (To describe an ensemble of classical composite systems, one uses random variables taking values in the Cartesian product of the state spaces of subsystems.) We show that, nevertheless, it is possible to establish a natural correspondence between the classical and the quantum probabilistic descriptions of composite systems. Quantum averages for composite systems (including entangled ones) can be represented as averages with respect to classical random fields. It is essentially what Albert Einstein dreamed of. QM is represented as classical statistical mechanics with an infinite-dimensional phase space. While the mathematical construction is completely rigorous, its physical interpretation is a complicated problem. We present the basic physical interpretation of prequantum classical statistical field theory in Sec. II. However, this is only the first step toward a real physical theory.
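The representation of quantum averages by classical random-field averages admits a simple numerical illustration: if a complex Gaussian random field φ has covariance operator equal to a density matrix ρ, the classical average of the quadratic form ⟨φ, Aφ⟩ equals Tr(ρA). A minimal sketch (the matrices below are random illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4

# Illustrative density operator rho (positive, trace one), playing the role
# of the covariance operator of a classical complex Gaussian random field.
M = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
rho = M @ M.conj().T
rho /= np.trace(rho).real

# Hermitian observable A (also an arbitrary illustrative choice).
H = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
A = (H + H.conj().T) / 2

# Sample classical random fields phi with E[phi phi^dagger] = rho.
L = np.linalg.cholesky(rho + 1e-12 * np.eye(d))
n = 200_000
z = (rng.normal(size=(n, d)) + 1j * rng.normal(size=(n, d))) / np.sqrt(2)
phi = z @ L.T  # each row is one realization of the field

# Classical average of the quadratic form phi^dagger A phi ...
classical_avg = np.mean(np.einsum("ni,ij,nj->n", phi.conj(), A, phi)).real
# ... reproduces the quantum average Tr(rho A) up to Monte Carlo error.
quantum_avg = np.trace(rho @ A).real
print(classical_avg, quantum_avg)
```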

  4. Quantum epistemology from subquantum ontology: Quantum mechanics from theory of classical random fields

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2017-02-01

    The scientific methodology based on two descriptive levels, ontic (reality as it is) and epistemic (observational), is briefly presented. Following Schrödinger, we point to the possible gap between these two descriptions. Our main aim is to show that, although ontic entities may be inaccessible to observation, they can be useful for clarifying the physical nature of operational epistemic entities. We illustrate this thesis with a concrete example: starting with a concrete ontic model preceding quantum mechanics (the latter is treated as an epistemic model), namely prequantum classical statistical field theory (PCSFT), we propose a natural physical interpretation for the basic quantum mechanical entity, the quantum state ("wave function"). The correspondence PCSFT ↦ QM is not straightforward: it couples the covariance operators of classical (prequantum) random fields with the quantum density operators. We use this correspondence to clarify the physical meaning of the pure quantum state and the superposition principle by using the formalism of classical field correlations. In classical mechanics the phase space description can be considered as the ontic description; here states are given by points λ = (x, p) of phase space. The dynamics of the ontic state is given by the system of Hamiltonian equations. We can also consider probability distributions on the phase space (or, equivalently, random variables valued in it). We call them probabilistic ontic states. The dynamics of probabilistic ontic states is given by the Liouville equation. In classical physics we can (at least in principle) measure both the coordinate and the momentum, and hence ontic states can be treated as epistemic states as well (or, better, epistemic states can here be treated as ontic states).
Probabilistic ontic states represent probabilities for outcomes of joint measurement of position and momentum. However, this is a very special, although very important, example of the description of physical phenomena. In general there is no reason to expect that properties of ontic states are approachable through our measurements. There is a gap between the ontic and epistemic descriptions, cf. also 't Hooft [49,50] and G. Groessing et al. [51]. In general the presence of such a gap also implies unapproachability of the probabilistic ontic states, i.e., probability distributions on the space of ontic states. De Broglie [28] called such probability distributions hidden probabilities and distinguished them sharply from probability distributions of measurement outcomes, see also Lochak [29]. (The latter distributions are described by the quantum formalism.) This ontic-epistemic approach based on the combination of two descriptive levels for natural phenomena is closely related to the old Bild conception, which originated in the works of Hertz. Later it was heavily explored by Schrödinger in the quantum domain; see, e.g., [8,11] for a detailed analysis. According to Hertz, one cannot expect to construct a complete theoretical model based explicitly on observable quantities. The complete theoretical model can contain quantities which are unapproachable for external measurement inspection. For example, Hertz, in trying to create a mechanical model of Maxwell's electromagnetism, invented hidden masses. The main distinguishing property of a theoretical model (in contrast to an observational model) is the continuity of description, i.e., the absence of gaps in the description. From this viewpoint the quantum mechanical description is not continuous: there is a gap between premeasurement dynamics and the measurement outcome. QM cannot say anything about what happens in the process of measurement; this is the well-known measurement problem of QM [32], cf. [52,53].
Continuity of description is closely related to causality; however, we cannot go into more detail here, see [8,11]. An important question concerns the interrelation between the two levels of description, ontic-epistemic (or theoretical-observational). In the introduction we already cited Schrödinger, who emphasized the possible complexity of this interrelation. In particular, in general there is no reason to expect a straightforward coupling between the two levels, cf. [9,10].
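The covariance-density coupling mentioned in the abstract can be written compactly. A sketch, following the standard PCSFT correspondence (with D the covariance operator of the prequantum field φ, and the quadratic functional f_A as the classical counterpart of the observable A):

```latex
% Classical quadratic observable of the prequantum field \phi:
f_A(\phi) = \langle \phi, A\phi \rangle .
% Its average over the random field with covariance operator D:
\langle f_A \rangle = \operatorname{Tr}(DA) .
% Normalizing the covariance operator yields the density operator:
\rho = \frac{D}{\operatorname{Tr} D}, \qquad
\langle f_A \rangle = \operatorname{Tr} D \cdot \operatorname{Tr}(\rho A),
% so classical field correlations reproduce quantum averages
% up to the overall factor \operatorname{Tr} D.
```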

  5. Super-Lie n-algebra extensions, higher WZW models and super-p-branes with tensor multiplet fields

    NASA Astrophysics Data System (ADS)

    Fiorenza, Domenico; Sati, Hisham; Schreiber, Urs

    2015-12-01

    We formalize higher-dimensional and higher gauge WZW-type sigma-model local prequantum field theory, and discuss its rationalized/perturbative description in (super-)Lie n-algebra homotopy theory (the true home of the "FDA"-language used in the supergravity literature). We show generally how the intersection laws for such higher WZW-type σ-model branes (open brane ending on background brane) are encoded precisely in (super-)L∞-extension theory and how the resulting "extended (super-)space-times" formalize spacetimes containing σ-model brane condensates. As an application we prove in Lie n-algebra homotopy theory that the complete super-p-brane spectrum of superstring/M-theory is realized this way, including the pure σ-model branes (the "old brane scan") but also the branes with tensor multiplet worldvolume fields, notably the D-branes and the M5-brane. For instance the degree-0 piece of the higher symmetry algebra of 11-dimensional (11D) spacetime with an M2-brane condensate turns out to be the "M-theory super-Lie algebra". We also observe that in this formulation there is a simple formal proof of the fact that type IIA spacetime with a D0-brane condensate is the 11D sugra/M-theory spacetime, and of (prequantum) S-duality for type IIB string theory. Finally we give the non-perturbative description of all this by higher WZW-type σ-models on higher super-orbispaces with higher WZW terms in stacky differential cohomology.

  6. The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes

    ERIC Educational Resources Information Center

    Cartier, Stephen F.

    2011-01-01

    A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…

  7. Accessible Information Without Disturbing Partially Known Quantum States on a von Neumann Algebra

    NASA Astrophysics Data System (ADS)

    Kuramochi, Yui

    2018-04-01

    This paper addresses the problem of how much information we can extract without disturbing a statistical experiment, which is a family of partially known normal states on a von Neumann algebra. We define the classical part of a statistical experiment as the restriction of the equivalent minimal sufficient statistical experiment to the center of the outcome space, which, in the case of density operators on a Hilbert space, corresponds to the classical probability distributions appearing in the maximal decomposition by Koashi and Imoto (Phys. Rev. A 66, 022,318 2002). We show that we can access by a Schwarz or completely positive channel at most the classical part of a statistical experiment if we do not disturb the states. We apply this result to the broadcasting problem of a statistical experiment. We also show that the classical part of the direct product of statistical experiments is the direct product of the classical parts of the statistical experiments. The proof of the latter result is based on the theorem that the direct product of minimal sufficient statistical experiments is also minimal sufficient.

  8. Statistical mechanics based on fractional classical and quantum mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korichi, Z.; Meftah, M. T., E-mail: mewalid@yahoo.com

    2014-03-15

    The purpose of this work is to study some problems in statistical mechanics based on the fractional classical and quantum mechanics. At first stage we have presented the thermodynamical properties of the classical ideal gas and the system of N classical oscillators. In both cases, the Hamiltonian contains fractional exponents of the phase space (position and momentum). At the second stage, in the context of the fractional quantum mechanics, we have calculated the thermodynamical properties for the black body radiation, studied the Bose-Einstein statistics with the related problem of the condensation and the Fermi-Dirac statistics.
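The classical fractional ideal gas admits a quick numerical check: for a Hamiltonian with a fractional exponent, H = |p|^α, generalized equipartition gives ⟨H⟩ = k_B T/α per degree of freedom. A sketch under these assumptions (the grid and parameters are illustrative, not from the paper; k_B = 1):

```python
import numpy as np

# Classical 1D "ideal gas" with fractional kinetic energy H = |p|**alpha.
# Generalized equipartition predicts <H> = T / alpha per degree of freedom.
def mean_energy(alpha, T=1.0, pmax=50.0, n=200_001):
    p = np.linspace(-pmax, pmax, n)
    w = np.exp(-np.abs(p) ** alpha / T)        # Boltzmann weight
    return np.sum(np.abs(p) ** alpha * w) / np.sum(w)

print(mean_energy(2.0))   # ordinary case: the usual T/2
print(mean_energy(1.5))   # fractional case: T/1.5
```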

  9. Classical Statistics and Statistical Learning in Imaging Neuroscience

    PubMed Central

    Bzdok, Danilo

    2017-01-01

    Brain-imaging research has predominantly generated insight by means of classical statistics, including regression-type analyses and null-hypothesis testing using t-test and ANOVA. Throughout recent years, statistical learning methods enjoy increasing popularity especially for applications in rich and complex data, including cross-validated out-of-sample prediction using pattern classification and sparsity-inducing regression. This concept paper discusses the implications of inferential justifications and algorithmic methodologies in common data analysis scenarios in neuroimaging. It is retraced how classical statistics and statistical learning originated from different historical contexts, build on different theoretical foundations, make different assumptions, and evaluate different outcome metrics to permit differently nuanced conclusions. The present considerations should help reduce current confusion between model-driven classical hypothesis testing and data-driven learning algorithms for investigating the brain with imaging techniques. PMID:29056896
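The contrast between classical hypothesis testing and cross-validated out-of-sample prediction can be made concrete on synthetic data. A hedged sketch (the two-group data, the nearest-centroid rule, and all parameters are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)

# Two synthetic groups with a mean difference (stand-in for one imaging
# measurement in two subject groups).
a = rng.normal(0.0, 1.0, 60)
b = rng.normal(1.0, 1.0, 60)

# Classical inference: two-sample t statistic (pooled variance).
na, nb = len(a), len(b)
sp2 = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
t = (b.mean() - a.mean()) / np.sqrt(sp2 * (1 / na + 1 / nb))

# Statistical learning: leave-one-out accuracy of a nearest-centroid rule,
# an out-of-sample prediction metric rather than an in-sample test.
x = np.concatenate([a, b])
y = np.concatenate([np.zeros(na), np.ones(nb)])
correct = 0
for i in range(len(x)):
    mask = np.arange(len(x)) != i
    m0 = x[mask][y[mask] == 0].mean()
    m1 = x[mask][y[mask] == 1].mean()
    correct += (abs(x[i] - m1) < abs(x[i] - m0)) == (y[i] == 1)
accuracy = correct / len(x)

print(t, accuracy)  # a significance statistic vs. a predictive accuracy
```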

  10. Quantum formalism for classical statistics

    NASA Astrophysics Data System (ADS)

    Wetterich, C.

    2018-06-01

    In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, so the superposition principle holds for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can also be found in classical statistics once it is formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.
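The wave-function picture of boundary-to-bulk information transport can be sketched on the simplest static classical system, a 1D Ising chain: forward and backward "wave functions" evolve linearly under the transfer matrix, and their product gives the local probabilistic information. An illustrative sketch (parameters and boundary conditions are assumptions, not from the paper):

```python
import numpy as np

beta = 0.7           # illustrative inverse temperature
N = 20               # chain length (sites 0..N)

# Transfer matrix of the 1D Ising chain: T[s, s'] = exp(beta * s * s').
spins = np.array([1.0, -1.0])
T = np.exp(beta * np.outer(spins, spins))

# Boundary "wave functions": spin fixed to +1 at site 0, free at site N.
f = np.zeros((N + 1, 2))
fbar = np.zeros((N + 1, 2))
f[0] = [1.0, 0.0]        # information flowing in from the left boundary
fbar[N] = [1.0, 1.0]     # information flowing in from the right boundary
for t in range(N):
    f[t + 1] = T @ f[t]            # Schroedinger-like linear evolution
for t in range(N, 0, -1):
    fbar[t - 1] = T @ fbar[t]

# Local probabilistic information: p_s(t) is proportional to
# fbar_s(t) * f_s(t), the classical-statistics analogue of |psi|^2.
p = fbar * f
p /= p.sum(axis=1, keepdims=True)
mag = p @ spins          # local magnetization <s_t> along the chain
print(mag[0], mag[N // 2])  # boundary memory decays into the bulk
```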

  11. On Ruch's Principle of Decreasing Mixing Distance in classical statistical physics

    NASA Astrophysics Data System (ADS)

    Busch, Paul; Quadt, Ralf

    1990-10-01

    Ruch's Principle of Decreasing Mixing Distance is reviewed as a statistical physical principle, and its basic support and geometric interpretation, the Ruch-Schranner-Seligman theorem, is generalized to be applicable to a large representative class of classical statistical systems.

  12. Teaching Statistics Using Classic Psychology Research: An Activities-Based Approach

    ERIC Educational Resources Information Center

    Holmes, Karen Y.; Dodd, Brett A.

    2012-01-01

    In this article, we discuss a collection of active learning activities derived from classic psychology studies that illustrate the appropriate use of descriptive and inferential statistics. (Contains 2 tables.)

  13. The Development of Bayesian Theory and Its Applications in Business and Bioinformatics

    NASA Astrophysics Data System (ADS)

    Zhang, Yifei

    2018-03-01

    Bayesian Theory originated from an essay of the British mathematician Thomas Bayes in 1763, and after its development in the 20th century, Bayesian Statistics has been playing a significant part in statistical study across all fields. Due to recent breakthroughs in high-dimensional integration, Bayesian Statistics has been improved and perfected, and it can now be used to solve problems that Classical Statistics failed to solve. This paper summarizes the history, concepts and applications of Bayesian Statistics, illustrated in five parts: the history of Bayesian Statistics, the weaknesses of Classical Statistics, Bayesian Theory, its development, and its applications. The first two parts compare Bayesian Statistics and Classical Statistics in a macroscopic aspect, and the last three parts focus on Bayesian Theory specifically, from introducing particular Bayesian concepts to describing their development and applications.

  14. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling.

    PubMed

    Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall

    2016-01-01

    Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to the analysis of medical time series data: (1) classical statistical approaches, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are the cervical cancer risk assessments produced by the three methods. Our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, interpretation of results, and model validation. Our study shows that the Bayesian approach (1) is much more flexible in terms of modeling effort, and (2) offers individualized risk assessment, which is more cumbersome for classical statistical approaches.
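For reference, the classical side of the comparison, the Kaplan-Meier product-limit estimator, is short enough to sketch directly. The follow-up data below are hypothetical toy values, not the screening data used in the study:

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit estimate of the survival function S(t).

    times  : observed follow-up times
    events : 1 if the event occurred, 0 if the observation was censored
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]

    survival = 1.0
    steps = []                       # (time, S(t)) after each event time
    n_at_risk = len(times)
    for t in np.unique(times):
        at_t = times == t
        d = events[at_t].sum()       # events observed at time t
        if d > 0:
            survival *= 1.0 - d / n_at_risk
            steps.append((float(t), survival))
        n_at_risk -= at_t.sum()      # events and censorings leave the risk set
    return steps

# Toy follow-up data: times with event indicators (1 = event, 0 = censored).
steps = kaplan_meier([2, 3, 3, 5, 8, 8, 12], [1, 1, 0, 1, 1, 0, 0])
print(steps)  # stepwise survival estimates after each event time
```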

  15. Quantum mechanics as classical statistical mechanics with an ontic extension and an epistemic restriction.

    PubMed

    Budiyono, Agung; Rohrlich, Daniel

    2017-11-03

    Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.

  16. Statistical mechanics in the context of special relativity. II.

    PubMed

    Kaniadakis, G

    2005-09-01

    The special relativity laws emerge as one-parameter (light speed) generalizations of the corresponding laws of classical physics. These generalizations, imposed by the Lorentz transformations, affect both the definition of the various physical observables (e.g., momentum, energy, etc.), as well as the mathematical apparatus of the theory. Here, following the general lines of [Phys. Rev. E 66, 056125 (2002)], we show that the Lorentz transformations impose also a proper one-parameter generalization of the classical Boltzmann-Gibbs-Shannon entropy. The obtained relativistic entropy permits us to construct a coherent and self-consistent relativistic statistical theory, preserving the main features of the ordinary statistical theory, which is recovered in the classical limit. The predicted distribution function is a one-parameter continuous deformation of the classical Maxwell-Boltzmann distribution and has a simple analytic form, showing power law tails in accordance with the experimental evidence. Furthermore, this statistical mechanics can be obtained as the stationary case of a generalized kinetic theory governed by an evolution equation obeying the H theorem and reproducing the Boltzmann equation of the ordinary kinetics in the classical limit.
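The one-parameter deformation can be made explicit: the Kaniadakis κ-exponential, exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ), reduces to the ordinary exponential as κ → 0 and produces power-law tails for large arguments. A small numerical illustration (the parameter values are arbitrary):

```python
import numpy as np

def exp_kappa(x, kappa):
    """Kaniadakis kappa-exponential; reduces to exp(x) as kappa -> 0."""
    if kappa == 0.0:
        return np.exp(x)
    return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

x = np.array([1.0, 10.0, 100.0])
print(exp_kappa(-x, 0.3))   # power-law tail, roughly (2*kappa*x)**(-1/kappa)
print(np.exp(-x))           # the ordinary exponential decays much faster
```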

  17. Free Fermions and the Classical Compact Groups

    NASA Astrophysics Data System (ADS)

    Cunden, Fabio Deelan; Mezzadri, Francesco; O'Connell, Neil

    2018-06-01

    There is a close connection between the ground state of non-interacting fermions in a box with classical (absorbing, reflecting, and periodic) boundary conditions and the eigenvalue statistics of the classical compact groups. The associated determinantal point processes can be extended in two natural directions: (i) we consider the full family of admissible quantum boundary conditions (i.e., self-adjoint extensions) for the Laplacian on a bounded interval, and the corresponding projection correlation kernels; (ii) we construct the grand canonical extensions at finite temperature of the projection kernels, interpolating from Poisson to random matrix eigenvalue statistics. The scaling limits in the bulk and at the edges are studied in a unified framework, and the question of universality is addressed. Whether the finite temperature determinantal processes correspond to the eigenvalue statistics of some matrix models is, a priori, not obvious. We complete the picture by constructing a finite temperature extension of the Haar measure on the classical compact groups. The eigenvalue statistics of the resulting grand canonical matrix models (of random size) corresponds exactly to the grand canonical measure of free fermions with classical boundary conditions.

  18. A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sinitskiy, Anton V.; Voth, Gregory A., E-mail: gavoth@uchicago.edu

    2015-09-07

    Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman's imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representations of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments.

  20. Active control on high-order coherence and statistic characterization on random phase fluctuation of two classical point sources.

    PubMed

    Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan

    2016-03-29

    Young's double-slit or two-beam interference is of fundamental importance for understanding various interference effects, in which the stationary phase difference between the two beams plays the key role in first-order coherence. In contrast to the case of first-order coherence, in high-order optical coherence the statistical behavior of the optical phase plays the key role. In this article, by employing a fundamental interfering configuration with two classical point sources, we show that the high-order optical coherence between two classical point sources can be actively designed by controlling the statistical behavior of the relative phase difference between the two point sources. Synchronous-position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, in which λ is the wavelength of the point sources and M is an integer not larger than N. Interestingly, we found that the synchronous-position Nth-order interference fringe fingerprints the statistical trace of the random phase fluctuation of the two classical point sources; therefore, it provides an effective way to characterize the statistical properties of phase fluctuation for incoherent light sources.

  1. Lenard-Balescu calculations and classical molecular dynamics simulations of electrical and thermal conductivities of hydrogen plasmas

    DOE PAGES

    Whitley, Heather D.; Scullard, Christian R.; Benedict, Lorin X.; ...

    2014-12-04

    Here, we present a discussion of kinetic theory treatments of linear electrical and thermal transport in hydrogen plasmas, for a regime of interest to inertial confinement fusion applications. In order to assess the accuracy of one of the more involved of these approaches, classical Lenard-Balescu theory, we perform classical molecular dynamics simulations of hydrogen plasmas using 2-body quantum statistical potentials and compute both electrical and thermal conductivity from our particle trajectories using the Kubo approach. Our classical Lenard-Balescu results employing the identical statistical potentials agree well with the simulations.
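The Kubo (Green-Kubo) step, integrating a current autocorrelation function to obtain a transport coefficient, can be sketched on a synthetic time series. The Ornstein-Uhlenbeck "current" below is a stand-in for MD output, with all physical prefactors omitted:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "current" time series J(t) standing in for MD output:
# an Ornstein-Uhlenbeck process with correlation time tau, unit variance.
dt, n, tau = 0.01, 200_000, 0.5
noise = rng.normal(size=n) * np.sqrt(2 * dt / tau)
J = np.empty(n)
J[0] = 0.0
for i in range(1, n):
    J[i] = J[i - 1] * (1 - dt / tau) + noise[i]

# Green-Kubo: the transport coefficient is proportional to the time
# integral of the current autocorrelation function <J(0) J(t)>.
max_lag = int(5 * tau / dt)                      # integrate out to 5 tau
acf = np.array([np.mean(J[: n - k] * J[k:]) for k in range(max_lag)])
sigma = acf.sum() * dt                           # crude quadrature
print(sigma)  # should be close to <J^2> * tau = 0.5 for this process
```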

  2. ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prigogine, I.; Balescu, R.; Henin, F.

    1960-12-01

    Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)

  3. Western classical music development: a statistical analysis of composers similarity, differentiation and evolution.

    PubMed

    Georges, Patrick

    2017-01-01

    This paper proposes a statistical analysis that captures similarities and differences between classical music composers with the eventual aim to understand why particular composers 'sound' different even if their 'lineages' (influences network) are similar or why they 'sound' alike if their 'lineages' are different. In order to do this we use statistical methods and measures of association or similarity (based on presence/absence of traits such as specific 'ecological' characteristics and personal musical influences) that have been developed in biosystematics, scientometrics, and bibliographic coupling. This paper also represents a first step towards a more ambitious goal of developing an evolutionary model of Western classical music.
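Presence/absence association measures of the kind borrowed from biosystematics and bibliographic coupling can be sketched with the Jaccard index. The composers and trait sets below are hypothetical placeholders, not data from the paper:

```python
def jaccard(a, b):
    """Presence/absence similarity of two trait sets (Jaccard index)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Hypothetical trait lists (era, genres, influences), for illustration only.
traits = {
    "Composer X": {"baroque", "counterpoint", "organ", "lutheran"},
    "Composer Y": {"baroque", "counterpoint", "opera", "violin"},
    "Composer Z": {"romantic", "opera", "piano"},
}
print(jaccard(traits["Composer X"], traits["Composer Y"]))  # shared era/style
print(jaccard(traits["Composer X"], traits["Composer Z"]))  # no shared traits
```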

  4. Gaussian orthogonal ensemble statistics in graphene billiards with the shape of classically integrable billiards.

    PubMed

    Yu, Pei; Li, Zi-Yuan; Xu, Hong-Ya; Huang, Liang; Dietz, Barbara; Grebogi, Celso; Lai, Ying-Cheng

    2016-12-01

    A crucial result in quantum chaos, which has been established for a long time, is that the spectral properties of classically integrable systems generically are described by Poisson statistics, whereas those of time-reversal symmetric, classically chaotic systems coincide with those of random matrices from the Gaussian orthogonal ensemble (GOE). Does this result hold for two-dimensional Dirac material systems? To address this fundamental question, we investigate the spectral properties in a representative class of graphene billiards with shapes of classically integrable circular-sector billiards. Naively one may expect to observe Poisson statistics, which is indeed true for energies close to the band edges where the quasiparticle obeys the Schrödinger equation. However, for energies near the Dirac point, where the quasiparticles behave like massless Dirac fermions, Poisson statistics is extremely rare in the sense that it emerges only under quite strict symmetry constraints on the straight boundary parts of the sector. An arbitrarily small amount of imperfection of the boundary results in GOE statistics. This implies that, for circular-sector confinements with arbitrary angle, the spectral properties will generically be GOE. These results are corroborated by extensive numerical computation. Furthermore, we provide a physical understanding for our results.
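The Poisson-versus-GOE dichotomy can be probed numerically with the consecutive-spacing-ratio statistic, which requires no spectral unfolding; its asymptotic mean is about 0.386 for Poisson spectra and about 0.536 for GOE spectra. A sketch (generic random spectra, not the graphene computation itself):

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_gap_ratio(levels):
    """Mean of r = min(s_i, s_{i+1}) / max(s_i, s_{i+1}) over consecutive
    level spacings s_i; no spectral unfolding is needed for this statistic."""
    s = np.diff(np.sort(levels))
    r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
    return r.mean()

# Poisson spectrum: independent levels (integrable-like statistics).
poisson_r = mean_gap_ratio(rng.uniform(size=5000))

# GOE spectrum: eigenvalues of a real symmetric Gaussian random matrix.
A = rng.normal(size=(1000, 1000))
goe_r = mean_gap_ratio(np.linalg.eigvalsh((A + A.T) / np.sqrt(2)))

print(poisson_r, goe_r)  # roughly 0.39 (Poisson) vs 0.54 (GOE)
```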

  6. Data Analysis Techniques for Physical Scientists

    NASA Astrophysics Data System (ADS)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  7. Pauli structures arising from confined particles interacting via a statistical potential

    NASA Astrophysics Data System (ADS)

    Batle, Josep; Ciftja, Orion; Farouk, Ahmed; Alkhambashi, Majid; Abdalla, Soliman

    2017-09-01

    There have been suggestions that the Pauli exclusion principle alone can lead a non-interacting (free) system of identical fermions to form crystalline structures dubbed Pauli crystals. Single-shot imaging experiments for the case of ultra-cold systems of free spin-polarized fermionic atoms in a two-dimensional harmonic trap appear to show geometric arrangements that cannot be characterized as Wigner crystals. This work explores this idea and considers a well-known approach that enables one to treat a quantum system of free fermions as a system of classical particles interacting with a statistical interaction potential. The model under consideration, though classical in nature, incorporates the quantum statistics by endowing the classical particles with an effective interaction potential. The reasonable expectation is that possible Pauli crystal features seen in experiments may also manifest in this model, which captures the correct quantum statistics as a first-order correction. We use the Monte Carlo simulated annealing method to obtain the most stable configurations of finite two-dimensional systems of confined particles that interact with an appropriate statistical repulsion potential. We consider both an isotropic harmonic and a hard-wall confinement potential. Despite minor differences, the most stable configurations observed in our model correspond to the reported Pauli crystals in single-shot imaging experiments of free spin-polarized fermions in a harmonic trap. The crystalline configurations observed appear to be different from the expected classical Wigner crystal structures that would emerge had the confined classical particles interacted with a pair-wise Coulomb repulsion.
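
    The simulated-annealing search described above can be sketched in a few lines. This is a generic illustration, not the authors' implementation: the isotropic harmonic trap plus a softened inverse-square repulsion is a hypothetical stand-in for their statistical potential, and the cooling schedule and move size are arbitrary choices.

```python
import math, random

def energy(pts):
    """Hypothetical energy: isotropic harmonic trap + softened pair repulsion."""
    e = sum(x * x + y * y for x, y in pts)
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            dx, dy = pts[i][0] - pts[j][0], pts[i][1] - pts[j][1]
            e += 1.0 / (dx * dx + dy * dy + 1e-12)
    return e

def anneal(n=6, steps=20000, t0=1.0, seed=1):
    """Simulated annealing with single-particle Gaussian moves, linear cooling."""
    rng = random.Random(seed)
    pts = [(rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(n)]
    e0 = e = energy(pts)
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-4           # temperature at this step
        i = rng.randrange(n)
        trial = list(pts)
        trial[i] = (pts[i][0] + rng.gauss(0, 0.05),
                    pts[i][1] + rng.gauss(0, 0.05))
        de = energy(trial) - e
        if de < 0 or rng.random() < math.exp(-de / t):   # Metropolis acceptance
            pts, e = trial, e + de
    return pts, e0, e

pts, e_init, e_final = anneal()
```

    For real Pauli-crystal studies the pair potential would be the statistical interaction derived from quantum exchange, and many annealing runs from independent initial conditions would be compared to identify the most stable configuration.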

  8. Nonclassical light revealed by the joint statistics of simultaneous measurements.

    PubMed

    Luis, Alfredo

    2016-04-15

    Nonclassicality cannot be a single-observable property, since the statistics of any quantum observable is compatible with classical physics. We develop a general procedure to reveal nonclassical behavior of light states from the joint statistics arising in the practical measurement of multiple observables. Beside embracing previous approaches, this protocol can disclose nonclassical features for standard examples of classical-like behavior, such as SU(2) and Glauber coherent states. When combined with other criteria, this would imply that every light state is nonclassical.

  9. Pure Rotational Spectroscopy of Asymmetric Tops in the Undergraduate Classroom or Laboratory

    NASA Astrophysics Data System (ADS)

    Minei, A. J.; Cooke, S. A.

    2013-06-01

    Due to concerns of complexity, the asymmetric top, for which κ = (2B - A - C)/(A - C) ≠ ±1, is feared, or at least avoided, by many instructors when explaining the rigid rotor. However, the spectral patterns formed by cold asymmetric rigid rotors in the centimeter-wave region of the electromagnetic spectrum can be easily identified. We will present some techniques for spectral analyses that we have successfully employed with undergraduate students who are either ``pre-quantum mechanics" or are currently enrolled in a chemical quantum mechanics class. The activities are simple, requiring the students to first locate repeating patterns and then apply simple algebraic expressions in order to determine all three rotational constants. The method will be illustrated using the spectra of 2,2,3,3-tetrafluoropropyl trifluoroacetate (CF_3C(=O)OCH_2CF_2CHF_2), (E)-1,3,3,3-tetrafluoropropene (CF_3CH=CHF), 1H,1H,2H-perfluorocyclobutane (CF_2CF_2CHFCH_2), and 2H-nonafluorobutane (CF_3CHFCF_2CF_3). The first two of these species have predominantly a-type spectra, the third has a predominantly b-type spectrum, the fourth has a predominantly c-type spectrum.

  10. Aspects of Geodesical Motion with Fisher-Rao Metric: Classical and Quantum

    NASA Astrophysics Data System (ADS)

    Ciaglia, Florio M.; Cosmo, Fabio Di; Felice, Domenico; Mancini, Stefano; Marmo, Giuseppe; Pérez-Pardo, Juan M.

    The purpose of this paper is to exploit the geometric structure of quantum mechanics and of statistical manifolds to study the qualitative effect that the quantum properties have in the statistical description of a system. We show that the end points of geodesics in the classical setting coincide with the probability distributions that minimise Shannon’s entropy, i.e. with distributions of zero dispersion. In the quantum setting this happens only for particular initial conditions, which in turn correspond to classical submanifolds. This result can be interpreted as a geometric manifestation of the uncertainty principle.

  11. Perturbative thermodynamic geometry of nonextensive ideal classical, Bose, and Fermi gases.

    PubMed

    Mohammadzadeh, Hosein; Adli, Fereshteh; Nouri, Sahereh

    2016-12-01

    We investigate perturbative thermodynamic geometry of nonextensive ideal classical, Bose, and Fermi gases. We show that the intrinsic statistical interaction of the nonextensive Bose (Fermi) gas is attractive (repulsive), similar to the extensive case, but the value of the thermodynamic curvature is changed by the nonextensive parameter. In contrast to the extensive ideal classical gas, the nonextensive one may be divided into two different regimes. According to the deviation parameter of the system from the extensive case, one can find a special value of the fugacity, z^{*}, where the sign of the thermodynamic curvature changes. Therefore, we argue that the nonextensive parameter induces an attractive (repulsive) statistical interaction for z<z^{*} (z>z^{*}) for an ideal classical gas. Also, from the singular point of the thermodynamic curvature, we consider the condensation of the nonextensive Bose gas.

  12. Strong correlations between the exponent α and the particle number for a Renyi monoatomic gas in Gibbs' statistical mechanics.

    PubMed

    Plastino, A; Rocca, M C

    2017-06-01

    Appealing to the 1902 Gibbs formalism for classical statistical mechanics (SM), the first axiomatic SM theory ever to successfully explain equilibrium thermodynamics, we show that already at the classical level there is a strong correlation between Renyi's exponent α and the number of particles for very simple systems. No reference to heat baths is needed for such a purpose.
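
    The exponent α enters through the Rényi entropy, H_α(p) = ln(Σ_i p_i^α)/(1 − α), which reduces to the Shannon entropy as α → 1. A minimal sketch of the definition and that limit (toy distribution, unrelated to the paper's monoatomic-gas calculation):

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) = ln(sum p_i^alpha) / (1 - alpha), natural log."""
    if abs(alpha - 1.0) < 1e-12:          # alpha -> 1 recovers Shannon entropy
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

p = [0.5, 0.25, 0.25]
h_shannon = renyi_entropy(p, 1.0)
h_near_1 = renyi_entropy(p, 1.001)     # should sit close to the Shannon value
print(h_shannon, h_near_1)
```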

  13. Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.

    PubMed

    Verde, Pablo E

    2010-12-30

    In the last decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity and can be regarded as so far removed from the classical domain of meta-analysis that they provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. Meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach also allows substantial model checking, model diagnostics, and model selection. Statistical computations are implemented in public domain statistical software (WinBUGS and R) and illustrated with real data examples. Copyright © 2010 John Wiley & Sons, Ltd.

  14. Assessing the Kansas water-level monitoring program: An example of the application of classical statistics to a geological problem

    USGS Publications Warehouse

    Davis, J.C.

    2000-01-01

    Geologists may feel that geological data are not amenable to statistical analysis, or at best require specialized approaches such as nonparametric statistics and geostatistics. However, there are many circumstances, particularly in systematic studies conducted for environmental or regulatory purposes, where traditional parametric statistical procedures can be beneficial. An example is the application of analysis of variance to data collected in an annual program of measuring groundwater levels in Kansas. Influences such as well conditions, operator effects, and use of the water can be assessed and wells that yield less reliable measurements can be identified. Such statistical studies have resulted in yearly improvements in the quality and reliability of the collected hydrologic data. Similar benefits may be achieved in other geological studies by the appropriate use of classical statistical tools.
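
    The analysis-of-variance assessment described above boils down to a one-way ANOVA F statistic: the ratio of between-group to within-group mean squares. A minimal sketch with hypothetical water-level readings grouped by operator (synthetic numbers, not the Kansas data):

```python
def one_way_anova_f(groups):
    """One-way ANOVA: F = between-group mean square / within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical water levels (ft) recorded by three operators on comparable wells.
ops = [[10.1, 10.3, 9.9, 10.2], [11.0, 11.2, 10.8, 11.1], [10.0, 10.1, 10.2, 9.9]]
f = one_way_anova_f(ops)
print(f)   # a large F relative to the F distribution suggests an operator effect
```

    In practice the F value would be compared against the F(k-1, n-k) distribution at the chosen significance level before concluding that an operator effect exists.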

  15. Microgravity experiments on vibrated granular gases in a dilute regime: non-classical statistics

    NASA Astrophysics Data System (ADS)

    Leconte, M.; Garrabos, Y.; Falcon, E.; Lecoutre-Chabot, C.; Palencia, F.; Évesque, P.; Beysens, D.

    2006-07-01

    We report on an experimental study of a dilute gas of steel spheres colliding inelastically and excited by a piston performing sinusoidal vibration, in low gravity. Using improved experimental apparatus, here we present some results concerning the collision statistics of particles on a wall of the container. We also propose a simple model where the non-classical statistics obtained from our data are attributed to the boundary condition playing the role of a 'velostat' instead of a thermostat. The significant differences from the kinetic theory of usual gas are related to the inelasticity of collisions.

  16. Zubarev's Nonequilibrium Statistical Operator Method in the Generalized Statistics of Multiparticle Systems

    NASA Astrophysics Data System (ADS)

    Glushak, P. A.; Markiv, B. B.; Tokarchuk, M. V.

    2018-01-01

    We present a generalization of Zubarev's nonequilibrium statistical operator method based on the principle of maximum Renyi entropy. In the framework of this approach, we obtain transport equations for the basic set of parameters of the reduced description of nonequilibrium processes in a classical system of interacting particles using Liouville equations with fractional derivatives. For a classical system of particles in a medium with a fractal structure, we obtain a non-Markovian diffusion equation with fractional spatial derivatives. For a concrete model of the frequency dependence of a memory function, we obtain a generalized Cattaneo-type diffusion equation with the spatial and temporal fractality taken into account. We present a generalization of nonequilibrium thermofield dynamics in Zubarev's nonequilibrium statistical operator method in the framework of Renyi statistics.

  17. Quantum Mechanics From the Cradle?

    ERIC Educational Resources Information Center

    Martin, John L.

    1974-01-01

    States that the major problem in learning quantum mechanics is often the student's ignorance of classical mechanics and that one conceptual hurdle in quantum mechanics is its statistical nature, in contrast to the determinism of classical mechanics. (MLH)

  18. A Revelation: Quantum-Statistics and Classical-Statistics are Analytic-Geometry Conic-Sections and Numbers/Functions: Euler, Riemann, Bernoulli Generating-Functions: Conics to Numbers/Functions Deep Subtle Connections

    NASA Astrophysics Data System (ADS)

    Descartes, R.; Rota, G.-C.; Euler, L.; Bernoulli, J. D.; Siegel, Edward Carl-Ludwig

    2011-03-01

    Quantum-statistics Dichotomy: Fermi-Dirac(FDQS) Versus Bose-Einstein(BEQS), respectively with contact-repulsion/non-condensation(FDCR) versus attraction/condensation (BEC), are manifestly-demonstrated by Taylor-expansion ONLY of their denominator exponential, identified BOTH as Descartes analytic-geometry conic-sections, FDQS as Ellipse (homotopy to rectangle FDQS distribution-function), VIA Maxwell-Boltzmann classical-statistics(MBCS) to Parabola MORPHISM, VS. BEQS to Hyperbola, Archimedes' HYPERBOLICITY INEVITABILITY, and as well generating-functions[Abramowitz-Stegun, Handbook Math.-Functions--p. 804!!!], respectively of Euler-numbers/functions, (via Riemann zeta-function(domination of quantum-statistics: [Pathria, Statistical-Mechanics; Huang, Statistical-Mechanics]) VS. Bernoulli-numbers/functions. Much can be learned about statistical-physics from Euler-numbers/functions via Riemann zeta-function(s) VS. Bernoulli-numbers/functions [Conway-Guy, Book of Numbers] and about Euler-numbers/functions, via Riemann zeta-function(s) MORPHISM, VS. Bernoulli-numbers/functions, vice versa!!! Ex.: Riemann-hypothesis PHYSICS proof PARTLY as BEQS BEC/BEA!!!

  19. Statistical Thermodynamics and Microscale Thermophysics

    NASA Astrophysics Data System (ADS)

    Carey, Van P.

    1999-08-01

    Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.

  20. A heuristic statistical stopping rule for iterative reconstruction in emission tomography.

    PubMed

    Ben Bouallègue, F; Crouzet, J F; Mariano-Goulart, D

    2013-01-01

    We propose a statistical stopping criterion for iterative reconstruction in emission tomography based on a heuristic statistical description of the reconstruction process. The method was assessed for MLEM reconstruction. Based on Monte Carlo numerical simulations and using a perfectly modeled system matrix, our method was compared with classical iterative reconstruction followed by low-pass filtering in terms of Euclidean distance to the exact object, noise, and resolution. The stopping criterion was then evaluated with realistic PET data of a Hoffman brain phantom produced using the GATE platform for different count levels. The numerical experiments showed that compared with the classical method, our technique yielded significant improvement of the noise-resolution tradeoff for a wide range of counting statistics compatible with routine clinical settings. When working with realistic data, the stopping rule allowed a qualitatively and quantitatively efficient determination of the optimal image. Our method appears to give a reliable estimation of the optimal stopping point for iterative reconstruction. It should thus be of practical interest as it produces images with similar or better quality than classical post-filtered iterative reconstruction with a mastered computation time.
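
    The MLEM iteration the stopping rule is built around has the multiplicative update x_j ← (x_j / Σ_i a_ij) Σ_i a_ij y_i / (Ax)_i. A minimal sketch on a noiseless two-pixel toy system (this illustrates the standard MLEM update only, not the authors' stopping criterion):

```python
def mlem(A, y, n_iter=200):
    """MLEM update: x_j <- x_j / sens_j * sum_i A[i][j] * y[i] / (A x)_i."""
    m, n = len(A), len(A[0])
    x = [1.0] * n                                               # positive start
    sens = [sum(A[i][j] for i in range(m)) for j in range(n)]   # sensitivity
    for _ in range(n_iter):
        proj = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        ratio = [y[i] / proj[i] for i in range(m)]
        x = [x[j] / sens[j] * sum(A[i][j] * ratio[i] for i in range(m))
             for j in range(n)]
    return x

# Toy 2-pixel object seen through a known system matrix, noiseless data.
A = [[0.8, 0.2], [0.3, 0.7]]
x_true = [2.0, 5.0]
y = [sum(a * t for a, t in zip(row, x_true)) for row in A]
x_hat = mlem(A, y)
```

    With noisy data the iterates first approach the true object and then progressively amplify noise, which is exactly why a stopping rule (or post-filtering) is needed in practice.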

  1. Information transport in classical statistical systems

    NASA Astrophysics Data System (ADS)

    Wetterich, C.

    2018-02-01

    For "static memory materials" the bulk properties depend on boundary conditions. Such materials can be realized by classical statistical systems which admit no unique equilibrium state. We describe the propagation of information from the boundary to the bulk by classical wave functions. The dependence of wave functions on the location of hypersurfaces in the bulk is governed by a linear evolution equation that can be viewed as a generalized Schrödinger equation. Classical wave functions obey the superposition principle, with local probabilities realized as bilinears of wave functions. For static memory materials the evolution within a subsector is unitary, as characteristic for the time evolution in quantum mechanics. The space-dependence in static memory materials can be used as an analogue representation of the time evolution in quantum mechanics - such materials are "quantum simulators". For example, an asymmetric Ising model on a Euclidean two-dimensional lattice represents the time evolution of free relativistic fermions in two-dimensional Minkowski space.

  2. Quantum theory of multiscale coarse-graining.

    PubMed

    Han, Yining; Jin, Jaehyeok; Wagner, Jacob W; Voth, Gregory A

    2018-03-14

    Coarse-grained (CG) models serve as a powerful tool to simulate molecular systems at much longer temporal and spatial scales. Previously, CG models and methods have been built upon classical statistical mechanics. The present paper develops a theory and numerical methodology for coarse-graining in quantum statistical mechanics, by generalizing the multiscale coarse-graining (MS-CG) method to quantum Boltzmann statistics. A rigorous derivation of the sufficient thermodynamic consistency condition is first presented via imaginary time Feynman path integrals. It identifies the optimal choice of CG action functional and effective quantum CG (qCG) force field to generate a quantum MS-CG (qMS-CG) description of the equilibrium system that is consistent with the quantum fine-grained model projected onto the CG variables. A variational principle then provides a class of algorithms for optimally approximating the qMS-CG force fields. Specifically, a variational method based on force matching, which was also adopted in the classical MS-CG theory, is generalized to quantum Boltzmann statistics. The qMS-CG numerical algorithms and practical issues in implementing this variational minimization procedure are also discussed. Then, two numerical examples are presented to demonstrate the method. Finally, as an alternative strategy, a quasi-classical approximation for the thermal density matrix expressed in the CG variables is derived. This approach provides an interesting physical picture for coarse-graining in quantum Boltzmann statistical mechanics in which the consistency with the quantum particle delocalization is obviously manifest, and it opens up an avenue for using path integral centroid-based effective classical force fields in a coarse-graining methodology.

  3. Thermodynamics and statistical mechanics. [thermodynamic properties of gases

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The basic thermodynamic properties of gases are reviewed and the relations between them are derived from the first and second laws. The elements of statistical mechanics are then formulated and the partition function is derived. The classical form of the partition function is used to obtain the Maxwell-Boltzmann distribution of kinetic energies in the gas phase and the equipartition of energy theorem is given in its most general form. The thermodynamic properties are all derived as functions of the partition function. Quantum statistics are reviewed briefly and the differences between the Boltzmann distribution function for classical particles and the Fermi-Dirac and Bose-Einstein distributions for quantum particles are discussed.
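
    The derivation of thermodynamic properties from the partition function can be sketched for a single quantum harmonic oscillator, whose high-temperature average energy should recover the classical equipartition value kT (one kT/2 per quadratic degree of freedom, two for a harmonic mode). A minimal numerical check over a truncated spectrum:

```python
import math

def average_energy(levels, beta):
    """<E> = sum E_n exp(-beta E_n) / Z, with Z = sum exp(-beta E_n)."""
    weights = [math.exp(-beta * e) for e in levels]
    z = sum(weights)                       # canonical partition function
    return sum(e * w for e, w in zip(levels, weights)) / z

hbar_omega = 1.0
levels = [hbar_omega * (n + 0.5) for n in range(2000)]  # truncated HO spectrum

kT = 50.0              # high-temperature (classical) regime, kT >> hbar*omega
e_avg = average_energy(levels, 1.0 / kT)
print(e_avg, kT)       # equipartition: <E> approaches kT from above
```

    At low temperature the same sum instead approaches the zero-point energy hbar*omega/2, the regime where quantum statistics departs from the classical result.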

  4. Quick Overview Scout 2008 Version 1.0

    EPA Science Inventory

    The Scout 2008 version 1.0 statistical software package has been updated from past DOS and Windows versions to provide classical and robust univariate and multivariate graphical and statistical methods that are not typically available in commercial or freeware statistical software...

  5. The Relationship between Background Classical Music and Reading Comprehension on Seventh and Eighth Grade Students

    ERIC Educational Resources Information Center

    Falcon, Evelyn

    2017-01-01

    The purpose of this study was to examine if there is any relationship on reading comprehension when background classical music is played in the setting of a 7th and 8th grade classroom. This study also examined if there was a statistically significant difference in test anxiety when listening to classical music while completing a test. Reading…

  6. Algorithms for tensor network renormalization

    NASA Astrophysics Data System (ADS)

    Evenbly, G.

    2017-01-01

    We discuss in detail algorithms for implementing tensor network renormalization (TNR) for the study of classical statistical and quantum many-body systems. First, we recall established techniques for how the partition function of a 2 D classical many-body system or the Euclidean path integral of a 1 D quantum system can be represented as a network of tensors, before describing how TNR can be implemented to efficiently contract the network via a sequence of coarse-graining transformations. The efficacy of the TNR approach is then benchmarked for the 2 D classical statistical and 1 D quantum Ising models; in particular the ability of TNR to maintain a high level of accuracy over sustained coarse-graining transformations, even at a critical point, is demonstrated.
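
    The simplest instance of a partition function written as a contraction of local tensors is the 1D classical Ising chain, where the "network" collapses to a product of 2x2 transfer matrices and Z = Tr T^N = λ₊^N + λ₋^N. A minimal sketch checking this against brute-force enumeration (this is the elementary warm-up, not the 2D TNR algorithm of the paper):

```python
import math
from itertools import product

def z_exact(n, beta_j):
    """Brute-force Z = sum over spin configs of exp(beta*J*sum s_i s_{i+1}), PBC."""
    z = 0.0
    for spins in product((-1, 1), repeat=n):
        e = sum(spins[i] * spins[(i + 1) % n] for i in range(n))
        z += math.exp(beta_j * e)
    return z

def z_transfer(n, beta_j):
    """Z = Tr T^n: eigenvalues of the 2x2 transfer matrix are 2cosh, 2sinh."""
    return (2.0 * math.cosh(beta_j)) ** n + (2.0 * math.sinh(beta_j)) ** n

n, beta_j = 10, 0.4
print(z_exact(n, beta_j), z_transfer(n, beta_j))
```

    TNR generalizes this idea to 2D, where the network cannot be contracted exactly and coarse-graining transformations are needed to keep the contraction tractable.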

  7. Frequent statistics of link-layer bit stream data based on AC-IM algorithm

    NASA Astrophysics Data System (ADS)

    Cao, Chenghong; Lei, Yingke; Xu, Yiming

    2017-08-01

    At present, there is much research on data processing using classical pattern matching and its improved algorithms, but little on frequent-sequence statistics for link-layer bit stream data. This paper adopts a frequent statistical method for link-layer bit stream data based on the AC-IM algorithm, because classical multi-pattern matching algorithms such as the AC algorithm have high computational complexity and low efficiency, and cannot be applied directly to binary bit stream data. The method's maximum jump distance in the pattern tree is the length of the shortest pattern string plus 3, with no matches missed. This paper first gives a theoretical analysis of the algorithm's construction; the experimental results then show that the algorithm adapts to the binary bit stream environment and extracts frequent sequences accurately. Moreover, compared with the classical AC algorithm and other improved algorithms, the AC-IM algorithm achieves a greater maximum jump distance and lower running time.
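
    The baseline the paper improves on, the classical Aho-Corasick (AC) automaton, counts occurrences of many patterns in a single pass over the stream. A minimal sketch for '0'/'1' bit streams (standard AC only; the AC-IM jump-distance optimization described in the abstract is not reproduced here):

```python
from collections import deque

def build_ac(patterns):
    """Build an Aho-Corasick automaton: trie + failure links + output sets."""
    goto, fail, out = [{}], [0], [set()]
    for p in patterns:
        state = 0
        for ch in p:
            if ch not in goto[state]:
                goto.append({}); fail.append(0); out.append(set())
                goto[state][ch] = len(goto) - 1
            state = goto[state][ch]
        out[state].add(p)
    queue = deque(goto[0].values())          # depth-1 states fail to the root
    while queue:
        s = queue.popleft()
        for ch, t in goto[s].items():
            queue.append(t)
            f = fail[s]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[t] = goto[f].get(ch, 0)
            out[t] |= out[fail[t]]           # inherit matches ending here
    return goto, fail, out

def count_matches(stream, patterns):
    """Count (overlapping) occurrences of each pattern in one pass."""
    goto, fail, out = build_ac(patterns)
    counts = {p: 0 for p in patterns}
    state = 0
    for ch in stream:
        while state and ch not in goto[state]:
            state = fail[state]
        state = goto[state].get(ch, 0)
        for p in out[state]:
            counts[p] += 1
    return counts

print(count_matches("1010101", ["101", "0101"]))
```

    Improved variants such as AC-IM aim to skip ahead in the stream (the abstract's "maximum jump distance") rather than advancing one symbol at a time as this sketch does.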

  8. Asymptotic Linear Spectral Statistics for Spiked Hermitian Random Matrices

    NASA Astrophysics Data System (ADS)

    Passemier, Damien; McKay, Matthew R.; Chen, Yang

    2015-07-01

    Using the Coulomb Fluid method, this paper derives central limit theorems (CLTs) for linear spectral statistics of three "spiked" Hermitian random matrix ensembles. These include Johnstone's spiked model (i.e., central Wishart with spiked correlation), non-central Wishart with rank-one non-centrality, and a related class of non-central matrices. For a generic linear statistic, we derive simple and explicit CLT expressions as the matrix dimensions grow large. For all three ensembles under consideration, we find that the primary effect of the spike is to introduce a correction term to the asymptotic mean of the linear spectral statistic, which we characterize with simple formulas. The utility of our proposed framework is demonstrated through application to three different linear statistics problems: the classical likelihood ratio test for a population covariance, the capacity analysis of multi-antenna wireless communication systems with a line-of-sight transmission path, and a classical multiple sample significance testing problem.

  9. Continuous quantum measurement and the quantum to classical transition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharya, Tanmoy; Habib, Salman; Jacobs, Kurt

    2003-04-01

    While ultimately they are described by quantum mechanics, macroscopic mechanical systems are nevertheless observed to follow the trajectories predicted by classical mechanics. Hence, in the regime defining macroscopic physics, the trajectories of the correct classical motion must emerge from quantum mechanics, a process referred to as the quantum to classical transition. Extending previous work [Bhattacharya, Habib, and Jacobs, Phys. Rev. Lett. 85, 4852 (2000)], here we elucidate this transition in some detail, showing that once the measurement processes that affect all macroscopic systems are taken into account, quantum mechanics indeed predicts the emergence of classical motion. We derive inequalities that describe the parameter regime in which classical motion is obtained, and provide numerical examples. We also demonstrate two further important properties of the classical limit: first, that multiple observers all agree on the motion of an object, and second, that classical statistical inference may be used to correctly track the classical motion.

  10. Least Squares Procedures.

    ERIC Educational Resources Information Center

    Hester, Yvette

    Least squares methods are sophisticated mathematical curve fitting procedures used in all classical parametric methods. The linear least squares approximation is most often associated with finding the "line of best fit" or the regression line. Since all statistical analyses are correlational and all classical parametric methods are least…
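
    The "line of best fit" has a simple closed form: slope b = cov(x, y)/var(x) and intercept a = ȳ − b·x̄. A minimal sketch on exact toy data:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x: b = cov(x, y) / var(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx                     # line passes through the mean point
    return a, b

a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])   # data lie exactly on y = 1 + 2x
print(a, b)
```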

  11. Scout 2008 Version 1.0 User Guide

    EPA Science Inventory

    The Scout 2008 version 1.0 software package provides a wide variety of classical and robust statistical methods that are not typically available in other commercial software packages. A major part of Scout deals with classical, robust, and resistant univariate and multivariate ou...

  12. A statistical physics view of pitch fluctuations in the classical music from Bach to Chopin: evidence for scaling.

    PubMed

    Liu, Lu; Wei, Jianrong; Zhang, Huishu; Xin, Jianhong; Huang, Jiping

    2013-01-01

    Because classical music has greatly affected our life and culture in its long history, it has attracted extensive attention from researchers to understand laws behind it. Based on statistical physics, here we use a different method to investigate classical music, namely, by analyzing cumulative distribution functions (CDFs) and autocorrelation functions of pitch fluctuations in compositions. We analyze 1,876 compositions of five representative classical music composers across 164 years from Bach, to Mozart, to Beethoven, to Mendelssohn, and to Chopin. We report that the biggest pitch fluctuations of a composer gradually increase as time evolves from Bach time to Mendelssohn/Chopin time. In particular, for the compositions of a composer, the positive and negative tails of a CDF of pitch fluctuations are distributed not only in power laws (with the scale-free property), but also in symmetry (namely, the probability of a treble following a bass and that of a bass following a treble are basically the same for each composer). The power-law exponent decreases as time elapses. Further, we also calculate the autocorrelation function of the pitch fluctuation. The autocorrelation function shows a power-law distribution for each composer. Especially, the power-law exponents vary with the composers, indicating their different levels of long-range correlation of notes. This work not only suggests a way to understand and develop music from a viewpoint of statistical physics, but also enriches the realm of traditional statistical physics by analyzing music.
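
    The two ingredients of the analysis, pitch fluctuations (differences of successive pitches) and their sample autocorrelation, can be sketched on a hypothetical toy melody (MIDI note numbers invented for illustration; real compositions would be parsed from score or MIDI data):

```python
def autocorrelation(series, lag):
    """Sample autocorrelation of a series at the given lag."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[i] - mean) * (series[i + lag] - mean)
              for i in range(n - lag))
    return cov / var

# Pitch fluctuations = differences between successive MIDI note numbers.
notes = [60, 62, 64, 62, 60, 62, 64, 64, 62, 60]   # hypothetical melody
fluct = [b - a for a, b in zip(notes, notes[1:])]
r1 = autocorrelation(fluct, 1)
print(fluct, r1)
```

    In the paper's setting the autocorrelation would be computed over many lags and its power-law decay exponent fitted per composer.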

  13. Phase-Sensitive Coherence and the Classical-Quantum Boundary in Ghost Imaging

    NASA Technical Reports Server (NTRS)

    Erkmen, Baris I.; Hardy, Nicholas D.; Venkatraman, Dheera; Wong, Franco N. C.; Shapiro, Jeffrey H.

    2011-01-01

    The theory of partial coherence has a long and storied history in classical statistical optics. The vast majority of this work addresses fields that are statistically stationary in time, hence their complex envelopes only have phase-insensitive correlations. The quantum optics of squeezed-state generation, however, depends on nonlinear interactions producing baseband field operators with phase-insensitive and phase-sensitive correlations. Utilizing quantum light to enhance imaging has been a topic of considerable current interest, much of it involving biphotons, i.e., streams of entangled-photon pairs. Biphotons have been employed for quantum versions of optical coherence tomography, ghost imaging, holography, and lithography. However, their seemingly quantum features have been mimicked with classical-state light, raising the question of where the classical-quantum boundary lies. We have shown, for the case of Gaussian-state light, that this boundary is intimately connected to the theory of phase-sensitive partial coherence. Here we present that theory, contrasting it with the familiar case of phase-insensitive partial coherence, and use it to elucidate the classical-quantum boundary of ghost imaging. We show, both theoretically and experimentally, that classical phase-sensitive light produces ghost images most closely mimicking those obtained with biphotons, and we derive the spatial resolution, image contrast, and signal-to-noise ratio of a standoff-sensing ghost imager, taking into account target-induced speckle.

  14. Actinic cheilitis: aesthetic and functional comparative evaluation of vermilionectomy using the classic and W-plasty techniques.

    PubMed

    Rossoe, Ed Wilson Tsuneo; Tebcherani, Antonio José; Sittart, José Alexandre; Pires, Mario Cezar

    2011-01-01

    Chronic actinic cheilitis is actinic keratosis located on the vermilion border. Treatment is essential because of the potential for malignant transformation. To evaluate the aesthetic and functional results of vermilionectomy using the classic and W-plasty techniques in actinic cheilitis. In the classic technique, the scar is linear and in the W-plasty one, it is a broken line. 32 patients with clinical and histopathological diagnosis of actinic cheilitis were treated. Out of the 32 patients, 15 underwent the W-plasty technique and 17 underwent the classic one. We evaluated parameters such as scar retraction and functional changes. A statistically significant association between the technique used and scar retraction was found, which was positive when using the classic technique (p = 0.01 with Yates' correction). The odds ratio was calculated at 11.25, i.e., there was a greater chance of retraction in patients undergoing the classic technique. Both techniques revealed no functional changes. We evaluated postoperative complications such as the presence of crusts, dry lips, paresthesia, and suture dehiscence. There was no statistically significant association between complications and the technique used (p = 0.69). We concluded that vermilionectomy using the W-plasty technique shows better cosmetic results and similar complication rates.
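
    The statistics reported above can be reproduced for any 2x2 outcome table. The sketch below is illustrative only (the counts in the usage note are hypothetical, not the study's data); it computes the chi-square statistic with Yates' continuity correction and the odds ratio:

```python
def yates_chi2_and_odds_ratio(a, b, c, d):
    """Chi-square with Yates' continuity correction and the odds ratio
    for the 2x2 table [[a, b], [c, d]] (rows: technique, cols: outcome).
    Assumes all cells and margins are nonzero."""
    n = a + b + c + d
    denom = (a + b) * (c + d) * (a + c) * (b + d)
    # Yates' correction subtracts n/2 from |ad - bc| before squaring.
    chi2 = n * (abs(a * d - b * c) - n / 2.0) ** 2 / denom
    odds_ratio = (a * d) / (b * c)
    return chi2, odds_ratio
```

    For instance, a hypothetical table with 10/5 retractions in one group and 4/20 in the other gives an odds ratio of 10 and a corrected chi-square near 8, i.e., p < 0.01 at one degree of freedom.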

  15. Probability and Statistics: A Prelude.

    ERIC Educational Resources Information Center

    Goodman, A. F.; Blischke, W. R.

    Probability and statistics have become indispensable to scientific, technical, and management progress. They serve as essential dialects of mathematics, the classical language of science, and as instruments necessary for intelligent generation and analysis of information. A prelude to probability and statistics is presented by examination of the…

  16. Use of Fermi-Dirac statistics for defects in solids

    NASA Astrophysics Data System (ADS)

    Johnson, R. A.

    1981-12-01

    The Fermi-Dirac distribution function is an approximation describing a special case of Boltzmann statistics. A general occupation probability formula is derived and a criterion given for the use of Fermi-Dirac statistics. Application to classical problems of defects in solids is discussed.
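
    As a small illustration of the criterion discussed, the Fermi-Dirac occupation probability of a level and its Boltzmann limit can be compared directly; this is a generic sketch, not the paper's derivation:

```python
import math

def fermi_dirac(E, mu, kT):
    """Fermi-Dirac occupation probability of a level at energy E,
    chemical potential mu, thermal energy kT (same units)."""
    return 1.0 / (math.exp((E - mu) / kT) + 1.0)

def boltzmann_limit(E, mu, kT):
    """Boltzmann approximation, valid when E - mu >> kT."""
    return math.exp(-(E - mu) / kT)
```

    When E - mu is many times kT, the two expressions agree to high accuracy; near E = mu they diverge (Fermi-Dirac saturates at 1/2), which is where the full statistics must be used.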

  17. Comparison of Classical and Quantum Mechanical Uncertainties.

    ERIC Educational Resources Information Center

    Peslak, John, Jr.

    1979-01-01

    Comparisons are made for the particle-in-a-box, the harmonic oscillator, and the one-electron atom. A classical uncertainty principle is derived and compared with its quantum-mechanical counterpart. The results are discussed in terms of the statistical interpretation of the uncertainty principle. (Author/BB)

  18. On Some Assumptions of the Null Hypothesis Statistical Testing

    ERIC Educational Resources Information Center

    Patriota, Alexandre Galvão

    2017-01-01

    Bayesian and classical statistical approaches are based on different types of logical principles. In order to avoid mistaken inferences and misguided interpretations, the practitioner must respect the inference rules embedded into each statistical method. Ignoring these principles leads to the paradoxical conclusions that the hypothesis…

  19. Robust Statistics: What They Are, and Why They Are So Important

    ERIC Educational Resources Information Center

    Corlu, Sencer M.

    2009-01-01

    The problem with "classical" statistics all invoking the mean is that these estimates are notoriously influenced by atypical scores (outliers), partly because the mean itself is differentially influenced by outliers. In theory, "modern" statistics may generate more replicable characterizations of data, because at least in some…
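
    The outlier sensitivity of the mean versus a robust alternative is easy to demonstrate; the following sketch (illustrative scores, not from the article) contrasts the mean with a trimmed mean:

```python
import statistics

def trimmed_mean(scores, proportion=0.2):
    """Mean after discarding the lowest and highest `proportion` of scores,
    a simple robust estimator of location."""
    scores = sorted(scores)
    k = int(len(scores) * proportion)
    return statistics.fmean(scores[k:len(scores) - k])
```

    On the scores [10, 11, 12, 13, 14], one outlier of 100 drags the mean from 12 to roughly 26.7, while the median and the 20% trimmed mean both stay at 12.5.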

  20. Teaching Classical Statistical Mechanics: A Simulation Approach.

    ERIC Educational Resources Information Center

    Sauer, G.

    1981-01-01

    Describes a one-dimensional model for an ideal gas to study development of disordered motion in Newtonian mechanics. A Monte Carlo procedure for simulation of the statistical ensemble of an ideal gas with fixed total energy is developed. Compares both approaches for a pseudoexperimental foundation of statistical mechanics. (Author/JN)
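
    A Monte Carlo simulation of the kind described, an ensemble at fixed total energy, can be sketched as follows. This is a toy reconstruction under assumed rules (randomly redistributing the pooled energy of particle pairs), not the article's actual procedure:

```python
import random

def mc_ideal_gas(n_particles=100, total_energy=100.0, steps=50000, seed=1):
    """Monte Carlo on the constant-energy shell of an ideal gas:
    repeatedly pick two particles and redistribute their combined
    energy at random. Every move conserves the total energy exactly,
    while the single-particle energies disorder over time."""
    rng = random.Random(seed)
    e = [total_energy / n_particles] * n_particles
    for _ in range(steps):
        i, j = rng.randrange(n_particles), rng.randrange(n_particles)
        if i == j:
            continue
        pool = e[i] + e[j]
        e[i] = rng.uniform(0.0, pool)
        e[j] = pool - e[i]
    return e
```

    Starting from equal energies, the histogram of single-particle energies relaxes toward a Boltzmann-like (exponential) shape, which is the pseudoexperimental foundation the article compares against Newtonian dynamics.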

  1. Classical Electrodynamics: Lecture notes

    NASA Astrophysics Data System (ADS)

    Likharev, Konstantin K.

    2018-06-01

    Essential Advanced Physics is a series comprising four parts: Classical Mechanics, Classical Electrodynamics, Quantum Mechanics and Statistical Mechanics. Each part consists of two volumes, Lecture notes and Problems with solutions, further supplemented by an additional collection of test problems and solutions available to qualifying university instructors. This volume, Classical Electrodynamics: Lecture notes, is intended to be the basis for a two-semester graduate-level course on electricity and magnetism, including not only the interaction and dynamics of charged point particles, but also properties of dielectric, conducting, and magnetic media. The course also covers special relativity, including its kinematics and particle-dynamics aspects, and electromagnetic radiation by relativistic particles.

  2. Ehrenfest dynamics is purity non-preserving: A necessary ingredient for decoherence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alonso, J. L.; Instituto de Biocomputacion y Fisica de Sistemas Complejos; Unidad Asociada IQFR-BIFI, Universidad de Zaragoza, Mariano Esquillor s/n, E-50018 Zaragoza

    2012-08-07

    We discuss the evolution of purity in mixed quantum/classical approaches to electronic nonadiabatic dynamics in the context of the Ehrenfest model. As it is impossible to exactly determine initial conditions for a realistic system, we choose to work in the statistical Ehrenfest formalism that we introduced in Alonso et al. [J. Phys. A: Math. Theor. 44, 396004 (2011)]. From it, we develop a new framework to determine exactly the change in the purity of the quantum subsystem along the evolution of a statistical Ehrenfest system. In a simple case, we verify how and to what extent Ehrenfest statistical dynamics makes a system with more than one classical trajectory and an initial quantum pure state become a quantum mixed one. We prove this numerically, showing how the evolution of purity depends on time, on the dimension of the quantum state space D, and on the number of classical trajectories N of the initial distribution. The results in this work open new perspectives for studying decoherence with Ehrenfest dynamics.

  3. Generalized relative entropies in the classical limit

    NASA Astrophysics Data System (ADS)

    Kowalski, A. M.; Martin, M. T.; Plastino, A.

    2015-03-01

    Our protagonists are (i) the Cressie-Read family of divergences (characterized by the parameter γ), (ii) Tsallis' generalized relative entropies (characterized by the parameter q), and, as a particular instance of both, (iii) the Kullback-Leibler (KL) relative entropy. In their normalized versions, we ascertain the equivalence between (i) and (ii). Additionally, we employ these three entropic quantifiers to provide a statistical investigation of the classical limit of a semiclassical model whose properties are well known from a purely dynamical viewpoint. This places us in a good position to assess the appropriateness of our statistical quantifiers for describing involved systems. We compare the behaviour of (i), (ii), and (iii) as one proceeds towards the classical limit, and determine optimal ranges for γ and/or q. It is shown that the Tsallis quantifier is better than KL's for 1.5 < q < 2.5.
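
    For reference, the Kullback-Leibler relative entropy and one common (non-normalized) form of the Tsallis generalized relative entropy can be written down directly; this sketch is a generic illustration, not the paper's normalized quantifiers:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler relative entropy D(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0.0)

def tsallis_relative_entropy(p, q, qpar):
    """A common form of the Tsallis generalized relative entropy,
    D_q(p || q) = (sum_i p_i^q q_i^(1-q) - 1) / (q - 1);
    it reduces to D(p || q) in the limit qpar -> 1."""
    s = sum(pi ** qpar * qi ** (1.0 - qpar) for pi, qi in zip(p, q) if pi > 0.0)
    return (s - 1.0) / (qpar - 1.0)
```

    Taking qpar close to 1 recovers the KL value numerically, which is the sense in which (iii) is a particular instance of (ii).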

  4. Dynamically biased statistical model for the ortho/para conversion in the H2 + H3+ → H3+ + H2 reaction.

    PubMed

    Gómez-Carrasco, Susana; González-Sánchez, Lola; Aguado, Alfredo; Sanz-Sanz, Cristina; Zanchet, Alexandre; Roncero, Octavio

    2012-09-07

    In this work we present a dynamically biased statistical model to describe the evolution of the title reaction from a statistical to a more direct mechanism, using quasi-classical trajectories (QCT). The method is based on the one previously proposed by Park and Light [J. Chem. Phys. 126, 044305 (2007)]. A recent global potential energy surface is used here to calculate the capture probabilities, instead of the long-range ion-induced dipole interactions. The dynamical constraints are introduced by considering a scrambling matrix which depends on energy and determines the probability of the identity/hop/exchange mechanisms. These probabilities are calculated using QCT. It is found that the high zero-point energy of the fragments is transferred to the rest of the degrees of freedom, which shortens the lifetime of H5+ complexes and, as a consequence, reduces the proportion of the exchange mechanism. The zero-point energy (ZPE) is not properly described in quasi-classical trajectory calculations, so an approximation is made in which the initial ZPE of the reactants is reduced in the QCT calculations to obtain a new ZPE-biased scrambling matrix. This reduction of the ZPE is explained by the need to correct the purely classical level number of the H5+ complex, as done in classical simulations of unimolecular processes and to obtain equivalent quantum and classical rate constants using Rice-Ramsperger-Kassel-Marcus theory. This matrix allows us to obtain a ratio of hop/exchange mechanisms, α(T), in rather good agreement with recent experimental results by Crabtree et al. [J. Chem. Phys. 134, 194311 (2011)] at room temperature. At lower temperatures, however, the present simulations predict too high ratios because the biased scrambling matrix is not statistical enough. This demonstrates the importance of applying quantum methods to simulate this reaction at the low temperatures of astrophysical interest.

  5. Dynamically biased statistical model for the ortho/para conversion in the H2 + H3+ → H3+ + H2 reaction

    NASA Astrophysics Data System (ADS)

    Gómez-Carrasco, Susana; González-Sánchez, Lola; Aguado, Alfredo; Sanz-Sanz, Cristina; Zanchet, Alexandre; Roncero, Octavio

    2012-09-01

    In this work we present a dynamically biased statistical model to describe the evolution of the title reaction from a statistical to a more direct mechanism, using quasi-classical trajectories (QCT). The method is based on the one previously proposed by Park and Light [J. Chem. Phys. 126, 044305 (2007), 10.1063/1.2430711]. A recent global potential energy surface is used here to calculate the capture probabilities, instead of the long-range ion-induced dipole interactions. The dynamical constraints are introduced by considering a scrambling matrix which depends on energy and determines the probability of the identity/hop/exchange mechanisms. These probabilities are calculated using QCT. It is found that the high zero-point energy of the fragments is transferred to the rest of the degrees of freedom, which shortens the lifetime of H_5^+ complexes and, as a consequence, reduces the proportion of the exchange mechanism. The zero-point energy (ZPE) is not properly described in quasi-classical trajectory calculations, so an approximation is made in which the initial ZPE of the reactants is reduced in the QCT calculations to obtain a new ZPE-biased scrambling matrix. This reduction of the ZPE is explained by the need to correct the purely classical level number of the H_5^+ complex, as done in classical simulations of unimolecular processes and to obtain equivalent quantum and classical rate constants using Rice-Ramsperger-Kassel-Marcus theory. This matrix allows us to obtain a ratio of hop/exchange mechanisms, α(T), in rather good agreement with recent experimental results by Crabtree et al. [J. Chem. Phys. 134, 194311 (2011), 10.1063/1.3587246] at room temperature. At lower temperatures, however, the present simulations predict too high ratios because the biased scrambling matrix is not statistical enough. This demonstrates the importance of applying quantum methods to simulate this reaction at the low temperatures of astrophysical interest.

  6. APPROACH TO EQUILIBRIUM OF A QUANTUM PLASMA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balescu, R.

    1961-01-01

    The treatment of irreversible processes in a classical plasma (R. Balescu, Phys. Fluids 3, 62(1960)) was extended to a gas of charged particles obeying quantum statistics. The various contributions to the equation of evolution for the reduced one-particle Wigner function were written in a form analogous to the classical formalism. The summation was then performed in a straightforward manner. The resulting equation describes collisions between particles "dressed" by their polarization clouds, exactly as in the classical situation. (auth)

  7. Unbiased estimators for spatial distribution functions of classical fluids

    NASA Astrophysics Data System (ADS)

    Adib, Artur B.; Jarzynski, Christopher

    2005-01-01

    We use a statistical-mechanical identity closely related to the familiar virial theorem to derive unbiased estimators for spatial distribution functions of classical fluids. In particular, we obtain estimators for both the fluid density ρ(r) in the vicinity of a fixed solute and the pair correlation g(r) of a homogeneous classical fluid. We illustrate the utility of our estimators with numerical examples, which reveal advantages over traditional histogram-based methods of computing such distributions.

  8. Realistic finite temperature simulations of magnetic systems using quantum statistics

    NASA Astrophysics Data System (ADS)

    Bergqvist, Lars; Bergman, Anders

    2018-01-01

    We have performed realistic atomistic simulations at finite temperatures using Monte Carlo and atomistic spin dynamics simulations incorporating quantum (Bose-Einstein) statistics. The description is much improved at low temperatures compared to the classical (Boltzmann) statistics normally used in this kind of simulation, while at higher temperatures the classical statistics are recovered. This corrected low-temperature description is reflected in both the magnetization and the magnetic specific heat, the latter allowing for improved modeling of the magnetic contribution to free energies. A central property in the method is the magnon density of states at finite temperatures, and we have compared several different implementations for obtaining it. The method has no restrictions regarding chemical and magnetic order of the considered materials. This is demonstrated by applying the method to elemental ferromagnetic systems, including Fe and Ni, as well as Fe-Co random alloys and the ferrimagnetic system GdFe3.
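
    The quantum-versus-classical occupation statistics at the heart of the method can be illustrated with the magnon occupation number; a minimal sketch (generic textbook formulas, not the authors' implementation):

```python
import math

def bose_einstein(E, kT):
    """Bose-Einstein occupation number of a magnon mode of energy E."""
    return 1.0 / math.expm1(E / kT)  # expm1 avoids loss of precision for small E/kT

def classical_occupation(E, kT):
    """Classical (Boltzmann/equipartition) limit, kT/E."""
    return kT / E
```

    At high temperature (kT >> E) the two agree, while at low temperature the Bose-Einstein occupation is exponentially suppressed instead of diverging classically; this suppression is what corrects the low-temperature magnetization and specific heat.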

  9. [The new methods in gerontology for life expectancy prediction of the indigenous population of Yugra].

    PubMed

    Gavrilenko, T V; Es'kov, V M; Khadartsev, A A; Khimikova, O I; Sokolova, A A

    2014-01-01

    The behavior of the state vector of the human cardiovascular system in different age groups was investigated according to methods of the theory of chaos-self-organization and methods of classical statistics. Observations were made on the indigenous people of the North of the Russian Federation. Using methods of the theory of chaos-self-organization, differences in the parameters of quasi-attractors of the state vector of the human cardiovascular system of the people of the Russian Federation North were shown. A comparison with the results obtained by classical statistics was made.

  10. The Multiphoton Interaction of Lambda Model Atom and Two-Mode Fields

    NASA Technical Reports Server (NTRS)

    Liu, Tang-Kun

    1996-01-01

    The system of two-mode fields interacting with an atom by means of multiphotons is addressed, and the non-classical statistical properties of the interacting two-mode fields are discussed. Through mathematical calculation, some new rules for the time evolution of the non-classical effects of two-mode fields are established.

  11. A Review of Classical Methods of Item Analysis.

    ERIC Educational Resources Information Center

    French, Christine L.

    Item analysis is a very important consideration in the test development process. It is a statistical procedure to analyze test items that combines methods used to evaluate the important characteristics of test items, such as difficulty, discrimination, and distractibility of the items in a test. This paper reviews some of the classical methods for…

  12. Non-classical State via Superposition of Two Opposite Coherent States

    NASA Astrophysics Data System (ADS)

    Ren, Gang; Du, Jian-ming; Yu, Hai-jun

    2018-04-01

    We study the non-classical properties of the states generated by superpositions of two opposite coherent states with arbitrary relative phase factors. We show that the relative phase factors play an important role in these superpositions. We demonstrate this result by discussing their squeezing properties, quantum statistical properties, and fidelity in principle.

  13. For a statistical interpretation of Helmholtz' thermal displacement

    NASA Astrophysics Data System (ADS)

    Podio-Guidugli, Paolo

    2016-11-01

    Starting from the classic papers by Einstein and Langevin on Brownian motion, two consistent statistical interpretations are given for the thermal displacement, a scalar field formally introduced by Helmholtz whose time derivative is, by definition, the absolute temperature.

  14. Bayes and the Law

    PubMed Central

    Fenton, Norman; Neil, Martin; Berger, Daniel

    2016-01-01

    Although the last forty years has seen considerable growth in the use of statistics in legal proceedings, it is primarily classical statistical methods rather than Bayesian methods that have been used. Yet the Bayesian approach avoids many of the problems of classical statistics and is also well suited to a broader range of problems. This paper reviews the potential and actual use of Bayes in the law and explains the main reasons for its lack of impact on legal practice. These include misconceptions by the legal community about Bayes’ theorem, over-reliance on the use of the likelihood ratio and the lack of adoption of modern computational methods. We argue that Bayesian Networks (BNs), which automatically produce the necessary Bayesian calculations, provide an opportunity to address most concerns about using Bayes in the law. PMID:27398389
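
    Bayes' theorem in its odds form, which underlies the likelihood-ratio reasoning discussed above, can be stated in a few lines; the numbers in the usage note are hypothetical:

```python
def posterior_odds(prior_odds, likelihood_ratio):
    """Bayes' theorem in odds form: posterior odds = prior odds * LR."""
    return prior_odds * likelihood_ratio

def odds_to_probability(odds):
    """Convert odds o to a probability o / (1 + o)."""
    return odds / (1.0 + odds)
```

    For example, a prior probability of 1 in 1000 combined with evidence whose likelihood ratio is 100 yields a posterior probability of roughly 9%, not the near-certainty a naive reading of "the evidence is 100 times more likely under guilt" (the prosecutor's fallacy) would suggest.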

  15. Bayes and the Law.

    PubMed

    Fenton, Norman; Neil, Martin; Berger, Daniel

    2016-06-01

    Although the last forty years has seen considerable growth in the use of statistics in legal proceedings, it is primarily classical statistical methods rather than Bayesian methods that have been used. Yet the Bayesian approach avoids many of the problems of classical statistics and is also well suited to a broader range of problems. This paper reviews the potential and actual use of Bayes in the law and explains the main reasons for its lack of impact on legal practice. These include misconceptions by the legal community about Bayes' theorem, over-reliance on the use of the likelihood ratio and the lack of adoption of modern computational methods. We argue that Bayesian Networks (BNs), which automatically produce the necessary Bayesian calculations, provide an opportunity to address most concerns about using Bayes in the law.

  16. Uniform quantized electron gas

    NASA Astrophysics Data System (ADS)

    Høye, Johan S.; Lomba, Enrique

    2016-10-01

    In this work we study the correlation energy of the quantized electron gas of uniform density at temperature T  =  0. To do so we utilize methods from classical statistical mechanics. The basis for this is the Feynman path integral for the partition function of quantized systems. With this representation the quantum mechanical problem can be interpreted as, and is equivalent to, a classical polymer problem in four dimensions where the fourth dimension is imaginary time. Thus methods, results, and properties obtained in the statistical mechanics of classical fluids can be utilized. From this viewpoint we recover the well known RPA (random phase approximation). Then to improve it we modify the RPA by requiring the corresponding correlation function to be such that electrons with equal spins cannot be at the same position. Numerical evaluations are compared with well known results of a standard parameterization of Monte Carlo correlation energies.

  17. Classical Electrodynamics: Problems with solutions

    NASA Astrophysics Data System (ADS)

    Likharev, Konstantin K.

    2018-06-01

    Essential Advanced Physics is a series comprising four parts: Classical Mechanics, Classical Electrodynamics, Quantum Mechanics and Statistical Mechanics. Each part consists of two volumes, Lecture notes and Problems with solutions, further supplemented by an additional collection of test problems and solutions available to qualifying university instructors. This volume, Classical Electrodynamics: Problems with solutions, accompanies the two-semester graduate-level course on electricity and magnetism, including not only the interaction and dynamics of charged point particles, but also properties of dielectric, conducting, and magnetic media. The course also covers special relativity, including its kinematics and particle-dynamics aspects, and electromagnetic radiation by relativistic particles.

  18. Evaluation of anterior knee pain in a PS total knee arthroplasty: the role of patella-friendly femoral component and patellar size.

    PubMed

    Atzori, F; Sabatini, L; Deledda, D; Schirò, M; Lo Baido, R; Massè, A

    2015-04-01

    Total knee arthroplasty gives excellent objective results. Nevertheless, the subjective findings do not match the perception of a normal knee; often, this depends on the onset of patellar pain. In this study, we analyzed clinical and radiological items that can affect resurfaced patellar tracking, and the role of a patella-friendly femoral component and patellar size in the onset of patellar pain. Thirty consecutive patients were implanted with the same cemented posterior-stabilized TKA associated with patella resurfacing. Fifteen patients were implanted using a classical femoral component, while another 15 patients were implanted using a patella-friendly femoral component. The statistical analysis was set to detect a significant difference (p < 0.05) in clinical and radiological outcomes related to several surgical parameters. Clinical and functional outcomes were recorded using the Knee Society Scoring System (KSS) and patellar pain with the Burnett questionnaire. Mean follow-up was 25 months. KSS results were excellent in both groups. Group 2 (patella-friendly femoral model) reached a higher percentage of 100 points in the clinical and functional KSS, but there was no statistical difference. Also, no statistical differences in Burnett questionnaire results were recorded. We had one case of patellar clunk syndrome in the standard femoral component group and one poor result in the second group. Postoperative radiographic measurements evidenced no statistical differences between the groups. In group 1 (classical femoral component), a significantly better result (p < 0.05) was recorded at clinical evaluation according to the Knee Society Scoring System (KSS) when a wider patellar component was resurfaced. The present study reveals no statistically significant difference in the incidence of anterior knee pain between classical and "patella-friendly" femoral components. With the particular type of implant design utilized in this study, when the classical femoral component is used, bigger patellar implant sizes (38 and 41 mm) showed superior clinical outcomes.

  19. Effect of Turkish classical music on blood pressure: a randomized controlled trial in hypertensive elderly patients.

    PubMed

    Bekiroğlu, Tansel; Ovayolu, Nimet; Ergün, Yusuf; Ekerbiçer, Hasan Çetin

    2013-06-01

    Existing studies suggest that music therapy can have favorable effects on hypertension and anxiety. We therefore set out to investigate whether Turkish classical music has positive effects on blood pressure and anxiety levels in elderly patients. This was a randomized controlled trial performed on 60 hypertensive patients living in a local elderly home in Adana, Turkey. Following the completion of a socio-demographic form for each patient, the Hamilton anxiety scale (HAM-A) was applied. Thereafter, the subjects were randomly divided into two equal-size groups and were allowed to either listen to Turkish classical music (music therapy group) or have a resting period (control group) for 25 min. The primary and secondary outcome measures were blood pressure and Hamilton anxiety scale scores, respectively. The mean reduction in systolic blood pressure was 13.00 mmHg in the music therapy group and 6.50 mmHg in the control group; the baseline-adjusted between-group difference was not statistically significant (95% CI 6.80-9.36). The median reductions in diastolic blood pressure were 10 mmHg in both the music therapy and control groups; the between-group difference was not statistically significant (Mann-Whitney U test, P = 0.839). The mean reduction in HAM-A was 1.63 in the music therapy group and 0.77 in the control group; the baseline-adjusted between-group difference was not statistically significant (95% CI 0.82-1.92). The study demonstrated that both Turkish classical music and resting alone have positive effects on blood pressure in patients with hypertension. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. A brief overview of current relationships of geography, statistics, and taxonomy with the classical integrated control concept

    USDA-ARS?s Scientific Manuscript database

    A classic paper on the integrated control concept appeared in the latter part of the 1950s, led by Vernon Stern, Ray Smith, Robert van den Bosch, and Kenneth Hagen. Numerous concepts and definitions were formulated at that time. In this presentation, a short philosophical summary will be presented...

  1. MSUSTAT.

    ERIC Educational Resources Information Center

    Mauriello, David

    1984-01-01

    Reviews an interactive statistical analysis package (designed to run on 8- and 16-bit machines that utilize CP/M 80 and MS-DOS operating systems), considering its features and uses, documentation, operation, and performance. The package consists of 40 general purpose statistical procedures derived from the classic textbook "Statistical…

  2. White matter pathology in ALS and lower motor neuron ALS variants: a diffusion tensor imaging study using tract-based spatial statistics.

    PubMed

    Prudlo, Johannes; Bißbort, Charlotte; Glass, Aenne; Grossmann, Annette; Hauenstein, Karlheinz; Benecke, Reiner; Teipel, Stefan J

    2012-09-01

    The aim of this work was to investigate white-matter microstructural changes within and outside the corticospinal tract in classical amyotrophic lateral sclerosis (ALS) and in lower motor neuron (LMN) ALS variants by means of diffusion tensor imaging (DTI). We investigated 22 ALS patients and 21 age-matched controls utilizing a whole-brain approach with a 1.5-T scanner for DTI. The patient group comprised 15 classical ALS and seven LMN ALS-variant patients (progressive muscular atrophy, flail arm and flail leg syndrome). Disease severity was measured by the revised version of the functional rating scale. White matter fractional anisotropy (FA) was assessed using tract-based spatial statistics (TBSS) and a region of interest (ROI) approach. We found significant FA reductions in motor and extra-motor cerebral fiber tracts in classical ALS and in the LMN ALS-variant patients compared to controls. The voxel-based TBSS results were confirmed by the ROI findings. The white matter damage correlated with the disease severity in the patient group and was found in a similar distribution, but to a lesser extent, among the LMN ALS-variant subgroup. ALS and LMN ALS variants are multisystem degenerations. DTI shows the potential to enable an earlier diagnosis, particularly in LMN ALS variants. The statistically identical findings of white matter lesions in classical ALS and LMN variants as ascertained by DTI further underline that these variants should be regarded as part of the ALS spectrum.

  3. Interference in the classical probabilistic model and its representation in complex Hilbert space

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei Yu.

    2005-10-01

    The notion of a context (complex of physical conditions, that is to say: specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are present already in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: "first context and only then probability". We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as Hilbert space projections of a realistic dynamics in a "prespace". The basic condition for representing the prespace dynamics is the law of statistical conservation of energy, that is, conservation of probabilities. In general the Hilbert space projection of the "prespace" dynamics can be nonlinear and even irreversible (but it is always unitary). Methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems. In particular, it was recently found in some psychological experiments.
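
    The interference of probabilities central to this contextual approach follows the standard quantum-like formula; a minimal sketch (the generic formula, not the paper's full construction):

```python
import math

def contextual_probability(p1, p2, theta):
    """Quantum-like interference of probabilities:
    P = p1 + p2 + 2*sqrt(p1*p2)*cos(theta), where theta encodes the
    context dependence of the conditional probabilities."""
    return p1 + p2 + 2.0 * math.sqrt(p1 * p2) * math.cos(theta)
```

    For theta = pi/2 the classical (Kolmogorov) formula of total probability P = p1 + p2 is recovered; any other angle produces the interference term that classical additivity lacks.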

  4. Epistemic View of Quantum States and Communication Complexity of Quantum Channels

    NASA Astrophysics Data System (ADS)

    Montina, Alberto

    2012-09-01

    The communication complexity of a quantum channel is the minimal amount of classical communication required for classically simulating a process of state preparation, transmission through the channel and subsequent measurement. It establishes a limit on the power of quantum communication in terms of classical resources. We show that classical simulations employing a finite amount of communication can be derived from a special class of hidden variable theories where quantum states represent statistical knowledge about the classical state and not an element of reality. This special class has attracted strong interest very recently. The communication cost of each derived simulation is given by the mutual information between the quantum state and the classical state of the parent hidden variable theory. Finally, we find that the communication complexity for single qubits is smaller than 1.28 bits. The previous known upper bound was 1.85 bits.

  5. Statistical benchmark for BosonSampling

    NASA Astrophysics Data System (ADS)

    Walschaers, Mattia; Kuipers, Jack; Urbina, Juan-Diego; Mayer, Klaus; Tichy, Malte Christopher; Richter, Klaus; Buchleitner, Andreas

    2016-03-01

    Boson samplers—set-ups that generate complex many-particle output states through the transmission of elementary many-particle input states across a multitude of mutually coupled modes—promise the efficient quantum simulation of a classically intractable computational task, and challenge the extended Church-Turing thesis, one of the fundamental dogmas of computer science. However, as in all experimental quantum simulations of truly complex systems, one crucial problem remains: how to certify that a given experimental measurement record unambiguously results from enforcing the claimed dynamics, on bosons, fermions or distinguishable particles? Here we offer a statistical solution to the certification problem, identifying an unambiguous statistical signature of many-body quantum interference upon transmission across a multimode, random scattering device. We show that statistical analysis of only partial information on the output state allows one to characterise the imparted dynamics through particle type-specific features of the emerging interference patterns. The relevant statistical quantifiers are classically computable, define a falsifiable benchmark for BosonSampling, and reveal distinctive features of many-particle quantum dynamics, which go much beyond mere bunching or anti-bunching effects.

  6. The Shock and Vibration Bulletin. Part 2. Invited Papers, Structural Dynamics

    DTIC Science & Technology

    1974-08-01

    VIKING LANDER DYNAMICS 41 Mr. Joseph C. Pohlen, Martin Marietta Aerospace, Denver, Colorado Structural Dynamics PERFORMANCE OF STATISTICAL ENERGY ANALYSIS 47 ... aerospace structures. Analytical prediction of these environments is beyond the current scope of classical modal techniques. Statistical energy analysis methods ... have been developed that circumvent the difficulties of high-frequency modal analysis. These statistical energy analysis methods are evaluated

  7. Statistical Interpretation of the Local Field Inside Dielectrics.

    ERIC Educational Resources Information Center

    Berrera, Ruben G.; Mello, P. A.

    1982-01-01

    Compares several derivations of the Clausius-Mossotti relation to analyze consistently the nature of approximations used and their range of applicability. Also presents a statistical-mechanical calculation of the local field for classical system of harmonic oscillators interacting via the Coulomb potential. (Author/SK)

  8. Bayesian Statistics for Biological Data: Pedigree Analysis

    ERIC Educational Resources Information Center

    Stanfield, William D.; Carlton, Matthew A.

    2004-01-01

    The use of Bayes' formula is applied to the biological problem of pedigree analysis to show that Bayes' formula and non-Bayesian or "classical" methods of probability calculation give different answers. First-year college students of biology can be introduced to Bayesian statistics.
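    A minimal sketch of the kind of calculation the article describes, assuming a standard textbook pedigree problem (two known carrier parents of a recessive trait) that is not taken from the article itself:

```python
from fractions import Fraction

# Hypothetical pedigree: both parents are known carriers (Aa) of a recessive
# trait. Given that a child is phenotypically unaffected, what is the
# probability that the child is a carrier?

# Prior genotype probabilities for an Aa x Aa cross.
priors = {"AA": Fraction(1, 4), "Aa": Fraction(1, 2), "aa": Fraction(1, 4)}

# Likelihood of being unaffected given each genotype (only aa is affected).
likelihood = {"AA": 1, "Aa": 1, "aa": 0}

# Bayes' formula: P(genotype | unaffected) ∝ P(unaffected | genotype) * P(genotype)
evidence = sum(likelihood[g] * priors[g] for g in priors)
posterior = {g: likelihood[g] * priors[g] / evidence for g in priors}

print(posterior["Aa"])  # → 2/3, versus the unconditioned "classical" 1/2
```

    The classical calculation reports the prior 1/2; conditioning on the observed phenotype shifts the answer to 2/3, which is the kind of discrepancy the article uses pedagogically.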

  9. Rotation of EOFs by the Independent Component Analysis: Towards A Solution of the Mixing Problem in the Decomposition of Geophysical Time Series

    NASA Technical Reports Server (NTRS)

    Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)

    2001-01-01

    The Independent Component Analysis is a recently developed technique for component extraction. This new method requires the statistical independence of the extracted components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. This technique has been used recently for the analysis of geophysical time series with the goal of investigating the causes of variability in observed data (i.e., an exploratory approach). We demonstrate with a data simulation experiment that, if initialized with a Principal Component Analysis, the Independent Component Analysis performs a rotation of the classical PCA (or EOF) solution. This rotation uses no localization criterion, unlike other Rotation Techniques (RT); only the global generalization of decorrelation by statistical independence is used. This rotation of the PCA solution appears to resolve the tendency of PCA to mix several physical phenomena, even when the signal is just their linear sum.
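    A toy numerical sketch of the PCA-then-rotation procedure described above, assuming simulated two-channel data and a kurtosis-based independence criterion (numpy); none of this reproduces the authors' geophysical analysis:

```python
import numpy as np

# Two independent (sub-Gaussian) sources, linearly mixed — the situation in
# which plain PCA mixes the underlying signals.
rng = np.random.default_rng(0)
s = rng.uniform(-1, 1, size=(2, 5000))      # assumed independent sources
a = np.array([[1.0, 0.6], [0.4, 1.0]])      # assumed mixing matrix
x = a @ s                                    # observed mixtures

# PCA step: whiten, so the components are merely decorrelated (second order).
cov = np.cov(x)
w, v = np.linalg.eigh(cov)
z = np.diag(w ** -0.5) @ v.T @ (x - x.mean(axis=1, keepdims=True))

def kurt(y):
    # Excess kurtosis per row: a higher-order (fourth-moment) statistic.
    return np.mean(y ** 4, axis=1) / np.mean(y ** 2, axis=1) ** 2 - 3.0

def rotate(theta):
    c, si = np.cos(theta), np.sin(theta)
    return np.array([[c, -si], [si, c]]) @ z

# ICA step: rotate the whitened (PCA) solution to maximize non-Gaussianity.
thetas = np.linspace(0.0, np.pi / 2, 400)
best = max(thetas, key=lambda t: np.sum(kurt(rotate(t)) ** 2))
recovered = rotate(best)
```

    Up to order, sign and scale, `recovered` matches the original sources, whereas the whitened PCA components `z` are still mixtures: the independence criterion only chooses the rotation angle, exactly as in the abstract.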

  10. Properties of the Boltzmann equation in the classical approximation

    DOE PAGES

    Epelbaum, Thomas; Gelis, François; Tanji, Naoto; ...

    2014-12-30

    We examine the Boltzmann equation with elastic point-like scalar interactions in two different versions of the classical approximation. Since the Boltzmann equation with the unapproximated collision term can be solved numerically without difficulty, this setup allows one to study the effect of the ultraviolet cutoff in these approximations. This cutoff dependence in the classical approximations of the Boltzmann equation is closely related to the non-renormalizability of the classical statistical approximation of the underlying quantum field theory. The kinetic theory setup that we consider here allows one to study the dependence on the ultraviolet cutoff in a much simpler way, since one also has access to the non-approximated result for comparison.

  11. Relationships among Classical Test Theory and Item Response Theory Frameworks via Factor Analytic Models

    ERIC Educational Resources Information Center

    Kohli, Nidhi; Koran, Jennifer; Henn, Lisa

    2015-01-01

    There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…

  12. Evidence of non-classical (squeezed) light in biological systems

    NASA Astrophysics Data System (ADS)

    Popp, F. A.; Chang, J. J.; Herzog, A.; Yan, Z.; Yan, Y.

    2002-01-01

    By use of coincidence measurements on “ultraweak” photon emission, the photocount statistics (PCS) of artificial visible light turns out, as expected, to follow super-Poissonian PCS. Biophotons, the spontaneous or light-induced photon emission of living systems, display super-Poissonian, Poissonian and even sub-Poissonian PCS. This result provides the first evidence of non-classical (squeezed) light in living tissues.

  13. Rydberg Atoms in Strong Fields: a Testing Ground for Quantum Chaos.

    NASA Astrophysics Data System (ADS)

    Courtney, Michael

    1995-01-01

    Rydberg atoms in strong static electric and magnetic fields provide experimentally accessible systems for studying the connections between classical chaos and quantum mechanics in the semiclassical limit. This experimental accessibility has motivated the development of reliable quantum mechanical solutions. This thesis uses both experimental and computed quantum spectra to test the central approaches to quantum chaos. These central approaches consist mainly of developing methods to compute the spectra of quantum systems in non-perturbative regimes, correlating statistical descriptions of eigenvalues with the classical behavior of the same Hamiltonian, and the development of semiclassical methods such as periodic-orbit theory. Particular emphasis is given to identifying the spectral signature of recurrences: quantum wave packets which follow classical orbits. The new findings include: the breakdown of the connection between energy-level statistics and classical chaos in odd-parity diamagnetic lithium, the discovery of the signature of very long period orbits in atomic spectra, quantitative evidence for the scattering of recurrences by the alkali-metal core, quantitative description of the behavior of recurrences near bifurcations, and a semiclassical interpretation of the evolution of continuum Stark spectra. (Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.)

  14. Hidden Statistics of Schroedinger Equation

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2011-01-01

    Work was carried out to determine the mathematical origin of randomness in quantum mechanics and to create a hidden statistics of the Schrödinger equation; i.e., to expose the transitional stochastic process as a "bridge" to the quantum world. The governing equations of hidden statistics would preserve such properties of quantum physics as superposition, entanglement, and direct-product decomposability while allowing one to measure its state variables using classical methods.

  15. Expected values and variances of Bragg peak intensities measured in a nanocrystalline powder diffraction experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Öztürk, Hande; Noyan, I. Cevdet

    A rigorous study of sampling and intensity statistics applicable for a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviations for both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742–753] appears as a special case, limited to large crystallite sizes, here. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.

  16. Expected values and variances of Bragg peak intensities measured in a nanocrystalline powder diffraction experiment

    DOE PAGES

    Öztürk, Hande; Noyan, I. Cevdet

    2017-08-24

    A rigorous study of sampling and intensity statistics applicable for a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviations for both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742–753] appears as a special case, limited to large crystallite sizes, here. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.

  17. Statistical measures of Planck scale signal correlations in interferometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Craig J.; Kwon, Ohkyung

    2015-06-22

    A model-independent statistical framework is presented to interpret data from systems where the mean time derivative of positional cross correlation between world lines, a measure of spreading in a quantum geometrical wave function, is measured with a precision smaller than the Planck time. The framework provides a general way to constrain possible departures from perfect independence of classical world lines, associated with Planck scale bounds on positional information. A parametrized candidate set of possible correlation functions is shown to be consistent with the known causal structure of the classical geometry measured by an apparatus, and the holographic scaling of information suggested by gravity. Frequency-domain power spectra are derived that can be compared with interferometer data. As a result, simple projections of sensitivity for specific experimental set-ups suggest that measurements will directly yield constraints on a universal time derivative of the correlation function, and thereby confirm or rule out a class of Planck scale departures from classical geometry.

  18. An Introduction to Confidence Intervals for Both Statistical Estimates and Effect Sizes.

    ERIC Educational Resources Information Center

    Capraro, Mary Margaret

    This paper summarizes methods of estimating confidence intervals, including classical intervals and intervals for effect sizes. The recent American Psychological Association (APA) Task Force on Statistical Inference report suggested that confidence intervals should always be reported, and the fifth edition of the APA "Publication Manual"…

  19. Chemical Potential for the Interacting Classical Gas and the Ideal Quantum Gas Obeying a Generalized Exclusion Principle

    ERIC Educational Resources Information Center

    Sevilla, F. J.; Olivares-Quiroz, L.

    2012-01-01

    In this work, we address the concept of the chemical potential [mu] in classical and quantum gases towards the calculation of the equation of state [mu] = [mu](n, T) where n is the particle density and "T" the absolute temperature using the methods of equilibrium statistical mechanics. Two cases seldom discussed in elementary textbooks are…

  20. Recurrence theorems: A unified account

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wallace, David, E-mail: david.wallace@balliol.ox.ac.uk

    I discuss classical and quantum recurrence theorems in a unified manner, treating both as generalisations of the fact that a system with a finite state space only has so many places to go. Along the way, I prove versions of the recurrence theorem applicable to dynamics on linear and metric spaces and make some comments about applications of the classical recurrence theorem in the foundations of statistical mechanics.

  1. Fisher information as a generalized measure of coherence in classical and quantum optics.

    PubMed

    Luis, Alfredo

    2012-10-22

    We show that metrological resolution in the detection of small phase shifts provides a suitable generalization of the degrees of coherence and polarization. Resolution is estimated via Fisher information. Besides the standard two-beam Gaussian case, this approach also provides good results for multiple field components and non-Gaussian statistics. The approach works equally well in quantum and classical optics.
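    A toy sketch of resolution estimated via Fisher information, assuming a simple Poisson-detected interference fringe; the model and parameter values are illustrative assumptions, not the paper's formalism:

```python
import math

# Assumed fringe model: mean count I(phi) = n0 * (1 + v * cos(phi)).
# For a Poisson-distributed count, the Fisher information about the phase is
#   F(phi) = (dI/dphi)^2 / I(phi),
# and the Cramér-Rao bound on the phase resolution is 1 / sqrt(F(phi)).
n0, v = 1000.0, 0.8            # assumed mean photon number and fringe visibility

def mean_count(phi):
    return n0 * (1.0 + v * math.cos(phi))

def fisher(phi):
    d = -n0 * v * math.sin(phi)            # dI/dphi
    return d * d / mean_count(phi)

phi = 1.0
h = 1e-6                                    # numerical check of dI/dphi
d_num = (mean_count(phi + h) - mean_count(phi - h)) / (2.0 * h)
print(fisher(phi), 1.0 / math.sqrt(fisher(phi)))
```

    Higher visibility v yields larger Fisher information and hence finer phase resolution, which is the sense in which resolution generalizes the degree of coherence.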

  2. Quantum approach to classical statistical mechanics.

    PubMed

    Somma, R D; Batista, C D; Ortiz, G

    2007-07-20

    We present a new approach to study the thermodynamic properties of d-dimensional classical systems by reducing the problem to the computation of ground state properties of a d-dimensional quantum model. This classical-to-quantum mapping allows us to extend the scope of standard optimization methods by unifying them under a general framework. The quantum annealing method is naturally extended to simulate classical systems at finite temperatures. We derive the rates to assure convergence to the optimal thermodynamic state using the adiabatic theorem of quantum mechanics. For simulated and quantum annealing, we obtain the asymptotic rates T(t) ≈ pN/(k_B log t) and γ(t) ≈ (Nt)^(−c/N) for the temperature and magnetic field, respectively. Other annealing strategies are also discussed.
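    The quoted temperature schedule can be sketched on a toy optimization problem. The prefactor below absorbs p and k_B and is an assumption of this sketch, as is the 1-D Ising chain being annealed; nothing here reproduces the paper's quantum construction:

```python
import math
import random

# Classical simulated annealing with a logarithmic schedule T(t) ~ pN / (k_B log t),
# applied to a toy ferromagnetic open Ising chain.
random.seed(1)
N = 12
spins = [random.choice([-1, 1]) for _ in range(N)]

def energy(s):
    # E = -sum_i s_i * s_{i+1}; the ground-state energy is -(N - 1).
    return -sum(s[i] * s[i + 1] for i in range(len(s) - 1))

for t in range(2, 40000):
    temp = N / (2.0 * math.log(t))          # assumed prefactor for this sketch
    i = random.randrange(N)
    old = energy(spins)
    spins[i] = -spins[i]                    # propose a single-spin flip
    delta = energy(spins) - old
    if delta > 0 and random.random() >= math.exp(-delta / temp):
        spins[i] = -spins[i]                # reject the uphill move

print(energy(spins))  # settles near the ground-state energy -(N - 1)
```

    The slow logarithmic cooling is the schedule for which convergence guarantees of this type are stated; faster schedules trade those guarantees for speed.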

  3. Limit Theorems for Dispersing Billiards with Cusps

    NASA Astrophysics Data System (ADS)

    Bálint, P.; Chernov, N.; Dolgopyat, D.

    2011-12-01

    Dispersing billiards with cusps are deterministic dynamical systems with a mild degree of chaos, exhibiting "intermittent" behavior that alternates between regular and chaotic patterns. Their statistical properties are therefore weak and delicate. They are characterized by a slow (power-law) decay of correlations, and as a result the classical central limit theorem fails. We prove that a non-classical central limit theorem holds, with a scaling factor of √(n log n) replacing the standard √n. We also derive the respective Weak Invariance Principle, and we identify the class of observables for which the classical CLT still holds.

  4. Quantum probabilistic logic programming

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite-valued ones, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case, H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason with situations involving uncertainty.

  5. A quantum–quantum Metropolis algorithm

    PubMed Central

    Yung, Man-Hong; Aspuru-Guzik, Alán

    2012-01-01

    The classical Metropolis sampling method is a cornerstone of many statistical modeling applications that range from physics, chemistry, and biology to economics. This method is particularly suitable for sampling the thermal distributions of classical systems. The challenge of extending this method to the simulation of arbitrary quantum systems is that, in general, eigenstates of quantum Hamiltonians cannot be obtained efficiently with a classical computer. However, this challenge can be overcome by quantum computers. Here, we present a quantum algorithm which fully generalizes the classical Metropolis algorithm to the quantum domain. The meaning of quantum generalization is twofold: The proposed algorithm is not only applicable to both classical and quantum systems, but also offers a quantum speedup relative to the classical counterpart. Furthermore, unlike the classical method of quantum Monte Carlo, this quantum algorithm does not suffer from the negative-sign problem associated with fermionic systems. Applications of this algorithm include the study of low-temperature properties of quantum systems, such as the Hubbard model, and preparing the thermal states of sizable molecules to simulate, for example, chemical reactions at an arbitrary temperature. PMID:22215584
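    A minimal sketch of the classical Metropolis method the abstract refers to, here sampling an assumed standard-normal target with a symmetric random-walk proposal (a toy choice; the paper's subject is the quantum generalization, not this routine):

```python
import math
import random
import statistics

# Metropolis sampling of p(x) ∝ exp(-x² / 2).
random.seed(42)
x, samples = 0.0, []
for step in range(200000):
    proposal = x + random.uniform(-1.0, 1.0)
    # Accept with probability min(1, p(proposal) / p(x)); the normalizing
    # constant cancels, which is the method's key convenience.
    if random.random() < min(1.0, math.exp((x * x - proposal * proposal) / 2.0)):
        x = proposal
    if step >= 1000:                        # discard burn-in
        samples.append(x)

# The sample mean and variance approach 0 and 1, the moments of the target.
print(statistics.mean(samples), statistics.variance(samples))
```

    Replacing the target density with a Boltzmann weight exp(-E(x)/kT) turns this into the thermal-distribution sampler the abstract describes; the quantum algorithm generalizes exactly this acceptance step.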

  6. Turbulent scaling laws as solutions of the multi-point correlation equation using statistical symmetries

    NASA Astrophysics Data System (ADS)

    Oberlack, Martin; Rosteck, Andreas; Avsarkisov, Victor

    2013-11-01

    Text-book knowledge proclaims that Lie symmetries such as the Galilean transformation lie at the heart of fluid dynamics. These important properties also carry over to the statistical description of turbulence, i.e. to the Reynolds stress transport equations and their generalization, the multi-point correlation equations (MPCE). Interestingly, the MPCE admit a much larger set of symmetries, in fact infinite dimensional, subsequently named statistical symmetries. Most importantly, these new symmetries have important consequences for our understanding of turbulent scaling laws. The symmetries form the essential foundation for constructing exact solutions to the infinite set of MPCE, which in turn are identified as classical and new turbulent scaling laws. Examples of various classical and new shear flow scaling laws, including higher-order moments, will be presented. New scaling laws have even been predicted from these symmetries and in turn validated by DNS. Turbulence modellers have implicitly recognized at least one of the statistical symmetries, as this is the basis for the usual log-law which has been employed for calibrating essentially all engineering turbulence models. An obvious conclusion is to generally make turbulence models consistent with the new statistical symmetries.

  7. Modulation Doped GaAs/Al sub xGA sub (1-x)As Layered Structures with Applications to Field Effect Transistors.

    DTIC Science & Technology

    1982-02-15

    function of the doping density at 300 and 77 K for the classical Boltzmann statistics or depletion approximation (solid line) and for the approximate ... Fermi-Dirac statistics (equation (19), dotted line). This comparison demonstrates that the deviation from Boltzmann statistics is quite noticeable ... tunneling Schottky barriers cannot be obtained at these doping levels. The dotted lines are obtained when Boltzmann statistics are used in the Al Ga

  8. Reliability of a Measure of Institutional Discrimination against Minorities

    DTIC Science & Technology

    1979-12-01

    samples are presented. The first is based upon classical statistical theory and the second derives from a series of computer-generated Monte Carlo ... Institutional racism and sexism. Englewood Cliffs, N.J.: Prentice-Hall, Inc., 1978. Hays, W. L. and Winkler, R. L. Statistics: probability, inference ... statistical measure of the e of institutional discrimination are discussed. Two methods of dealing with the problem of reliability of the measure in small

  9. Quantum vertex model for reversible classical computing.

    PubMed

    Chamon, C; Mucciolo, E R; Ruckenstein, A E; Yang, Z-C

    2017-05-12

    Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without 'learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.

  10. Quantum vertex model for reversible classical computing

    NASA Astrophysics Data System (ADS)

    Chamon, C.; Mucciolo, E. R.; Ruckenstein, A. E.; Yang, Z.-C.

    2017-05-01

    Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without `learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.

  11. High-Speed Imaging Analysis of Register Transitions in Classically and Jazz-Trained Male Voices.

    PubMed

    Dippold, Sebastian; Voigt, Daniel; Richter, Bernhard; Echternach, Matthias

    2015-01-01

    Little data are available concerning register functions in different styles of singing such as classically or jazz-trained voices. Differences between registers seem to be much more audible in jazz singing than in classical singing, and so we hypothesized that classically trained singers exhibit a smoother register transition, stemming from more regular vocal fold oscillation patterns. High-speed digital imaging (HSDI) was used for 19 male singers (10 jazz-trained, 9 classically trained) who performed a glissando from modal to falsetto register across the register transition. Vocal fold oscillation patterns were analyzed in terms of different parameters of regularity such as relative average perturbation (RAP), correlation dimension (D2) and shimmer. HSDI observations showed more regular vocal fold oscillation patterns during the register transition for the classically trained singers. Additionally, the RAP and D2 values were generally lower and more consistent for the classically trained singers compared to the jazz singers. However, intergroup comparisons showed no statistically significant differences. Some of our results may support the hypothesis that classically trained singers exhibit a smoother register transition from modal to falsetto register. © 2015 S. Karger AG, Basel.

  12. On the early history of field emission including attempts of tunneling spectroscopy

    NASA Astrophysics Data System (ADS)

    Kleint, C.

    1993-04-01

    Field emission is certainly one of the oldest surface science techniques, its roots reaching back about 250 years to the time of enlightenment. An account of very early studies and of later work is given but mostly restricted to Leipzig and to pre-Müllerian investigations. Studies of field emission from metal tips were carried out in the 18th century by Johann Heinrich Winkler who used vacuum pumps built by Jacob Leupold, a famous Leipzig mechanic. A short account of the career of Winkler will be given and his field emission experiments are illustrated. Field emission was investigated again in Leipzig much later by Julius Edgar Lilienfeld who worked on the improvement of X-ray tubes. He coined the terms ‘autoelektronische Entladung’ and ‘Äona-Effekt’ in 1922, and developed degassing procedures which are very similar to modern ultra-high vacuum processing. A pre-quantum mechanical explanation of the field emission phenomena was undertaken by Walter Schottky. Cunradi (1926) tried to measure temperature changes during field emission. Franz Rother, in a thesis (1914) suggested by Otto Wiener, dealt with the distance dependence of currents in vacuum between electrodes down to 20 nm. His habilitation in 1926 was an extension of his early work but now with field emission tips as a cathode. We might look at his measurements of the field emission characteristics in dependence on distance as a precursor to modern tunneling spectroscopy as well.

  13. Statistical methods for biodosimetry in the presence of both Berkson and classical measurement error

    NASA Astrophysics Data System (ADS)

    Miller, Austin

    In radiation epidemiology, the true dose received by those exposed cannot be assessed directly. Physical dosimetry uses a deterministic function of the source term, distance and shielding to estimate dose. For the atomic bomb survivors, the physical dosimetry system is well established. The classical measurement errors plaguing the location and shielding inputs to the physical dosimetry system are well known. Adjusting for the associated biases requires an estimate for the classical measurement error variance, for which no data-driven estimate exists. In this case, an instrumental variable solution is the most viable option to overcome the classical measurement error indeterminacy. Biological indicators of dose may serve as instrumental variables. Specification of the biodosimeter dose-response model requires identification of the radiosensitivity variables, for which we develop statistical definitions and variables. More recently, researchers have recognized Berkson error in the dose estimates, introduced by averaging assumptions for many components in the physical dosimetry system. We show that Berkson error induces a bias in the instrumental variable estimate of the dose-response coefficient, and then address the estimation problem. This model is specified by developing an instrumental variable mixed measurement error likelihood function, which is then maximized using a Monte Carlo EM Algorithm. These methods produce dose estimates that incorporate information from both physical and biological indicators of dose, as well as the first instrumental variable based data-driven estimate for the classical measurement error variance.
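    The qualitative contrast between classical and Berkson error described above can be sketched with an assumed linear toy model (numpy); the parameter values are illustrative, not dosimetry data:

```python
import numpy as np

# Classical error attenuates the estimated dose-response slope; Berkson error
# leaves it approximately unbiased (but inflates residual variance).
rng = np.random.default_rng(7)
n, beta = 50000, 2.0
true_dose = rng.normal(10.0, 2.0, n)
response = beta * true_dose + rng.normal(0.0, 1.0, n)

# Classical error: the *measured* dose scatters around the true dose.
measured = true_dose + rng.normal(0.0, 2.0, n)
slope_classical = np.polyfit(measured, response, 1)[0]

# Berkson error: the *true* dose scatters around an assigned (e.g. grouped) dose.
assigned = rng.normal(10.0, 2.0, n)
true_b = assigned + rng.normal(0.0, 2.0, n)
response_b = beta * true_b + rng.normal(0.0, 1.0, n)
slope_berkson = np.polyfit(assigned, response_b, 1)[0]

print(slope_classical)  # attenuated toward beta * var(X)/(var(X)+var(U)) = 1.0
print(slope_berkson)    # approximately unbiased, near beta = 2.0
```

    Correcting the classical attenuation requires the error variance, which is exactly the quantity the thesis estimates via an instrumental variable.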

  14. Statistical Methodology for the Analysis of Repeated Duration Data in Behavioral Studies

    ERIC Educational Resources Information Center

    Letué, Frédérique; Martinez, Marie-José; Samson, Adeline; Vilain, Anne; Vilain, Coriandre

    2018-01-01

    Purpose: Repeated duration data are frequently used in behavioral studies. Classical linear or log-linear mixed models are often inadequate to analyze such data, because they usually consist of nonnegative and skew-distributed variables. Therefore, we recommend use of a statistical methodology specific to duration data. Method: We propose a…

  15. Integration of ecological indices in the multivariate evaluation of an urban inventory of street trees

    Treesearch

    J. Grabinsky; A. Aldama; A. Chacalo; H. J. Vazquez

    2000-01-01

    Inventory data of Mexico City's street trees were studied using classical arboricultural and ecological statistical approaches. Multivariate techniques were applied to both. Results did not differ substantially and were complementary. It was possible to reduce inventory data and to group species, boroughs, blocks, and variables.

  16. Seed Dispersal Near and Far: Patterns Across Temperate and Tropical Forests

    Treesearch

    James S. Clark; Miles Silman; Ruth Kern; Eric Macklin; Janneke HilleRisLambers

    1999-01-01

    Dispersal affects community dynamics and vegetation response to global change. Understanding these effects requires descriptions of dispersal at local and regional scales and statistical models that permit estimation. Classical models of dispersal describe local or long-distance dispersal, but not both. The lack of statistical methods means that models have rarely been...

  17. Teaching Bayesian Statistics to Undergraduate Students through Debates

    ERIC Educational Resources Information Center

    Stewart, Sepideh; Stewart, Wayne

    2014-01-01

    This paper describes a lecturer's approach to teaching Bayesian statistics to students who were only exposed to the classical paradigm. The study shows how the lecturer extended himself by making use of ventriloquist dolls to grab hold of students' attention and embed important ideas in revealing the differences between the Bayesian and classical…

  18. Non-Gaussian statistics and nanosecond dynamics of electrostatic fluctuations affecting optical transitions in proteins.

    PubMed

    Martin, Daniel R; Matyushov, Dmitry V

    2012-08-30

    We show that electrostatic fluctuations of the protein-water interface are globally non-Gaussian. The electrostatic component of the optical transition energy (energy gap) in a hydrated green fluorescent protein is studied here by classical molecular dynamics simulations. The distribution of the energy gap displays a high excess in the breadth of electrostatic fluctuations over the prediction of the Gaussian statistics. The energy gap dynamics include a nanosecond component. When simulations are repeated with frozen protein motions, the statistics shifts to the expectations of linear response and the slow dynamics disappear. We therefore suggest that both the non-Gaussian statistics and the nanosecond dynamics originate largely from global, low-frequency motions of the protein coupled to the interfacial water. The non-Gaussian statistics can be experimentally verified from the temperature dependence of the first two spectral moments measured at constant-volume conditions. Simulations at different temperatures are consistent with other indicators of the non-Gaussian statistics. In particular, the high-temperature part of the energy gap variance (second spectral moment) scales linearly with temperature and extrapolates to zero at a temperature characteristic of the protein glass transition. This result, violating the classical limit of the fluctuation-dissipation theorem, leads to a non-Boltzmann statistics of the energy gap and corresponding non-Arrhenius kinetics of radiationless electronic transitions, empirically described by the Vogel-Fulcher-Tammann law.

  19. Fiber Breakage Model for Carbon Composite Stress Rupture Phenomenon: Theoretical Development and Applications

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Phoenix, S. Leigh; Grimes-Ledesma, Lorie

    2010-01-01

    Stress rupture failure of Carbon Composite Overwrapped Pressure Vessels (COPVs) is of serious concern to the Science Mission and Constellation programs, since a number of COPVs on board space vehicles store gases under high pressure for long durations. It has become customary to establish the reliability of these vessels using so-called classic models, which are based on Weibull statistics fitted to observed stress rupture data. These stochastic models cannot account for additional damage due to the complex pressure-time histories characteristic of COPVs being supplied for NASA missions. In particular, it is suspected that the effects of proof testing, which every flight vessel undergoes, could significantly reduce the stress rupture lifetime of COPVs; the classic model cannot accommodate such damage. The focus of this paper is an analytical appraisal of a model that incorporates damage due to proof testing. The model examined here is based on physical mechanisms, namely micromechanics-based load sharing concepts coupled with creep rupture and Weibull statistics. The paper compares the current model to the classic model through a number of examples. In addition, several applications of the model to current ISS and Constellation program issues are examined.

  20. Color stability of shade guides after autoclave sterilization.

    PubMed

    Schmeling, Max; Sartori, Neimar; Monteiro, Sylvio; Baratieri, Luiz

    2014-01-01

    This study evaluated the influence of 120 autoclave sterilization cycles on the color stability of two commercial shade guides (Vita Classical and Vita System 3D-Master). The specimens were evaluated by spectrophotometer before and after the sterilization cycles. The color was described using the three-dimensional CIELab system. The statistical analysis was performed in three chromaticity coordinates, before and after sterilization cycles, using the paired samples t test. All specimens became darker after autoclave sterilization cycles. However, specimens of Vita Classical became redder, while those of the Vita System 3D-Master became more yellow. Repeated cycles of autoclave sterilization caused statistically significant changes in the color coordinates of the two shade guides. However, these differences are considered clinically acceptable.

  1. Much Polyphony but Little Harmony: Otto Sackur's Groping for a Quantum Theory of Gases

    NASA Astrophysics Data System (ADS)

    Badino, Massimiliano; Friedrich, Bretislav

    2013-09-01

    The endeavor of Otto Sackur (1880-1914) was driven, on the one hand, by his interest in Nernst's heat theorem, statistical mechanics, and the problem of chemical equilibrium and, on the other hand, by his goal to shed light on classical mechanics from the quantum vantage point. Inspired by the interplay between classical physics and quantum theory, Sackur chanced to expound his personal take on the role of the quantum in the changing landscape of physics in the turbulent 1910s. We tell the story of this enthusiastic practitioner of the old quantum theory and early contributor to quantum statistical mechanics, whose scientific ontogenesis provides a telling clue about the phylogeny of his contemporaries.

  2. Finite-range Coulomb gas models of banded random matrices and quantum kicked rotors

    NASA Astrophysics Data System (ADS)

    Pandey, Akhilesh; Kumar, Avanish; Puri, Sanjay

    2017-11-01

    Dyson demonstrated an equivalence between infinite-range Coulomb gas models and classical random matrix ensembles for the study of eigenvalue statistics. We introduce finite-range Coulomb gas (FRCG) models via a Brownian matrix process, and study them analytically and by Monte Carlo simulations. These models yield new universality classes, and provide a theoretical framework for the study of banded random matrices (BRMs) and quantum kicked rotors (QKRs). We demonstrate that, for a BRM of bandwidth b and a QKR of chaos parameter α, the appropriate FRCG model has the effective range d = b²/N = α²/N for large matrix dimensionality N. As d increases, there is a transition from Poisson to classical random matrix statistics.
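
The Poisson-to-random-matrix transition referred to above can be illustrated without any FRCG machinery: the eigenvalue spacing of a 2×2 GOE matrix follows the Wigner surmise, while spacings of uncorrelated (Poisson) levels are exponential. The scale-free moment ratio ⟨s²⟩/⟨s⟩² separates the two limits (4/π ≈ 1.27 for Wigner, 2 for Poisson). A small Monte Carlo sketch, not the authors' code:

```python
import math
import random

random.seed(0)

def goe2_spacing():
    """Eigenvalue spacing of a 2x2 GOE matrix [[a, b], [b, c]]:
    sqrt((a - c)^2 + 4 b^2), with the standard GOE variances."""
    a = random.gauss(0, 1)
    c = random.gauss(0, 1)
    b = random.gauss(0, 1 / math.sqrt(2))
    return math.sqrt((a - c) ** 2 + 4 * b * b)

def poisson_spacing():
    """Spacing of consecutive points of a unit-rate Poisson process."""
    return random.expovariate(1.0)

def moment_ratio(sample):
    """Scale-invariant ratio <s^2>/<s>^2: 2 for Poisson, 4/pi for Wigner."""
    m1 = sum(sample) / len(sample)
    m2 = sum(s * s for s in sample) / len(sample)
    return m2 / (m1 * m1)

goe = [goe2_spacing() for _ in range(20000)]
poi = [poisson_spacing() for _ in range(20000)]
print(round(moment_ratio(goe), 2), round(moment_ratio(poi), 2))
```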

  3. Quantum gas-liquid condensation in an attractive Bose gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koh, Shun-ichiro

    Gas-liquid condensation (GLC) in an attractive Bose gas is studied on the basis of statistical mechanics. Using some results in combinatorial mathematics, the following are derived. (1) With decreasing temperature, the Bose-statistical coherence grows in the many-body wave function, which gives rise to the divergence of the grand partition function prior to Bose-Einstein condensation. It is a quantum-mechanical analogue to the GLC in a classical gas (quantum GLC). (2) This GLC is triggered by the bosons with zero momentum. Compared with the classical GLC, an incomparably weaker attractive force creates it. As a system showing the quantum GLC, we discuss a cold helium-4 gas at sufficiently low pressure.

  4. Finite-range Coulomb gas models of banded random matrices and quantum kicked rotors.

    PubMed

    Pandey, Akhilesh; Kumar, Avanish; Puri, Sanjay

    2017-11-01

    Dyson demonstrated an equivalence between infinite-range Coulomb gas models and classical random matrix ensembles for the study of eigenvalue statistics. We introduce finite-range Coulomb gas (FRCG) models via a Brownian matrix process, and study them analytically and by Monte Carlo simulations. These models yield new universality classes, and provide a theoretical framework for the study of banded random matrices (BRMs) and quantum kicked rotors (QKRs). We demonstrate that, for a BRM of bandwidth b and a QKR of chaos parameter α, the appropriate FRCG model has the effective range d = b²/N = α²/N for large matrix dimensionality N. As d increases, there is a transition from Poisson to classical random matrix statistics.

  5. The fracture load and failure types of veneered anterior zirconia crowns: an analysis of normal and Weibull distribution of complete and censored data.

    PubMed

    Stawarczyk, Bogna; Ozcan, Mutlu; Hämmerle, Christoph H F; Roos, Malgorzata

    2012-05-01

    The aim of this study was to compare the fracture load of veneered anterior zirconia crowns using normal and Weibull distributions of complete and censored data. Standardized zirconia frameworks for maxillary canines were milled using a CAD/CAM system and randomly divided into 3 groups (N=90, n=30 per group). They were veneered with three veneering ceramics, namely GC Initial ZR, Vita VM9, and IPS e.max Ceram, using the layering technique. The crowns were cemented with glass ionomer cement on metal abutments and then loaded to fracture (1 mm/min) in a universal testing machine. The data were analyzed using the classical method (normal data distribution (μ, σ); Levene test and one-way ANOVA) and according to Weibull statistics (s, m). In addition, fracture load results were analyzed depending on complete and censored failure types (only chipping vs. total fracture together with chipping). When computed with complete data, significantly higher mean fracture loads (N) were observed for GC Initial ZR (μ=978, σ=157; s=1043, m=7.2) and VITA VM9 (μ=1074, σ=179; s=1139, m=7.8) than for IPS e.max Ceram (μ=798, σ=174; s=859, m=5.8) (p<0.05) by classical and Weibull statistics, respectively. When the data were censored for only total fracture, IPS e.max Ceram presented the lowest fracture load for chipping with both the classical distribution (μ=790, σ=160) and Weibull statistics (s=836, m=6.5). When total fracture with chipping (classical distribution) was considered as failure, the fracture load of IPS e.max Ceram for total fracture (μ=1054, σ=110) did not differ significantly from that of the other groups (GC Initial ZR: μ=1039, σ=152; VITA VM9: μ=1170, σ=166). According to the Weibull-distributed data, VITA VM9 showed a significantly higher fracture load (s=1228, m=9.4) than the other groups. Both the classical distribution and Weibull statistics for complete data yielded similar outcomes. Censored data analysis of all-ceramic systems based on failure types is essential and brings additional information regarding the susceptibility to chipping or total fracture. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
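
The Weibull parameters (s, m) quoted in abstracts like this one are conventionally obtained by the linearized median-rank method: plotting ln(−ln(1−F)) against ln(load) gives a line of slope m (the Weibull modulus) whose intercept fixes the characteristic strength s. A sketch on synthetic fracture loads; the data below are simulated, not the study's:

```python
import math
import random

def weibull_fit(loads):
    """Estimate Weibull modulus m and characteristic value s from a sample
    by the standard linearized least-squares method with the median-rank
    probability estimator F_i = (i - 0.5)/n."""
    xs = sorted(loads)
    n = len(xs)
    X, Y = [], []
    for i, x in enumerate(xs, start=1):
        F = (i - 0.5) / n
        X.append(math.log(x))
        Y.append(math.log(-math.log(1.0 - F)))
    mx = sum(X) / n
    my = sum(Y) / n
    m = sum((a - mx) * (b - my) for a, b in zip(X, Y)) / \
        sum((a - mx) ** 2 for a in X)
    s = math.exp(mx - my / m)   # from intercept: Y = m*X - m*ln(s)
    return m, s

random.seed(1)
# synthetic "fracture loads" drawn from Weibull(m=7, s=1000 N) by inverse transform
sample = [1000.0 * (-math.log(1.0 - random.random())) ** (1.0 / 7.0)
          for _ in range(200)]
m_hat, s_hat = weibull_fit(sample)
print(round(m_hat, 1), round(s_hat, 1))
```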

  6. JOURNAL SCOPE GUIDELINES: Paper classification scheme

    NASA Astrophysics Data System (ADS)

    2005-06-01

    This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org). 1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory. 2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models. 3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods. 4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics. 5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials. 6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Craig

    It is argued by extrapolation of general relativity and quantum mechanics that a classical inertial frame corresponds to a statistically defined observable that rotationally fluctuates due to Planck scale indeterminacy. Physical effects of exotic nonlocal rotational correlations on large scale field states are estimated. Their entanglement with the strong interaction vacuum is estimated to produce a universal, statistical centrifugal acceleration that resembles the observed cosmological constant.

  8. Computational algorithms dealing with the classical and statistical mechanics of celestial scale polymers in space elevator technology

    NASA Astrophysics Data System (ADS)

    Knudsen, Steven; Golubovic, Leonardo

    Prospects to build Space Elevator (SE) systems have become realistic with ultra-strong materials such as carbon nanotubes and diamond nanothreads. At cosmic length scales, space elevators can be modeled as polymer-like floppy strings of tethered mass beads. A new venue in SE science has emerged with the introduction of the Rotating Space Elevator (RSE) concept, supported by novel algorithms discussed in this presentation. An RSE is a loopy string reaching into outer space. Unlike the classical geostationary SE concepts of Tsiolkovsky, Artsutanov, and Pearson, our RSE exhibits an internal rotation. Thanks to this, objects sliding along the RSE loop spontaneously oscillate between two turning points, one of which is close to the Earth whereas the other is in outer space. The RSE concept thus solves a major problem in SE technology: how to supply energy to the climbers moving along space elevator strings. The investigation of the classical and statistical mechanics of a floppy string interacting with objects sliding along it required the development of subtle computational algorithms, described in this presentation.

  9. The effect of live classical piano music on the vital signs of patients undergoing ophthalmic surgery.

    PubMed

    Camara, Jorge G; Ruszkowski, Joseph M; Worak, Sandra R

    2008-06-25

    Music and surgery. To determine the effect of live classical piano music on vital signs of patients undergoing ophthalmic surgery. Retrospective case series. 203 patients who underwent various ophthalmologic procedures in a period during which a piano was present in the operating room of St. Francis Medical Center. [Note: St. Francis Medical Center has recently been renamed Hawaii Medical Center East.] Demographic data, surgical procedures, and the vital signs of 203 patients who underwent ophthalmic procedures were obtained from patient records. Blood pressure, heart rate, and respiratory rate measured in the preoperative holding area were compared with the same parameters taken in the operating room, with and without exposure to live piano music. A paired t-test was used for statistical analysis. Mean arterial pressure, heart rate, and respiratory rate. 115 patients who were exposed to live piano music showed a statistically significant decrease in mean arterial blood pressure, heart rate, and respiratory rate in the operating room compared with their vital signs measured in the preoperative holding area (P < .0001). The control group of 88 patients not exposed to live piano music showed a statistically significant increase in mean arterial blood pressure (P < .0002) and heart rate and respiratory rate (P < .0001). Live classical piano music lowered the blood pressure, heart rate, and respiratory rate in patients undergoing ophthalmic surgery.
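
The paired t-test used in this study compares each patient with themselves across the two settings (holding area vs. operating room). A minimal sketch with hypothetical blood-pressure values, not the study's data:

```python
import math

def paired_t(before, after):
    """Paired t statistic for before/after measurements on the same
    subjects (df = n - 1); the test used in the study above."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# hypothetical mean arterial pressures (mmHg): holding area vs operating room
holding = [102, 98, 110, 95, 105, 99, 108, 101]
theatre = [96, 95, 104, 92, 100, 94, 103, 97]
t = paired_t(holding, theatre)
print(round(t, 2))
```

A large positive t here indicates a systematic drop under music exposure; the p-value would come from the t distribution with n−1 degrees of freedom (e.g. via scipy.stats).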

  10. Statistical mechanical foundation of the peridynamic nonlocal continuum theory: energy and momentum conservation laws.

    PubMed

    Lehoucq, R B; Sears, Mark P

    2011-09-01

    The purpose of this paper is to derive the energy and momentum conservation laws of the peridynamic nonlocal continuum theory using the principles of classical statistical mechanics. The peridynamic laws allow the consideration of discontinuous motion, or deformation, by relying on integral operators. These operators sum forces and power expenditures separated by a finite distance and so represent nonlocal interaction. The integral operators replace the differential divergence operators conventionally used, thereby obviating special treatment at points of discontinuity. The derivation presented employs a general multibody interatomic potential, avoiding the standard assumption of a pairwise decomposition. The integral operators are also expressed in terms of a stress tensor and heat flux vector under the assumption that these fields are differentiable, demonstrating that the classical continuum energy and momentum conservation laws are consequences of the more general peridynamic laws. An important conclusion is that nonlocal interaction is intrinsic to continuum conservation laws when derived using the principles of statistical mechanics.

  11. On Probability Domains IV

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one, a quantum phenomenon; dually, an observable can map a crisp random event to a genuine fuzzy random event, a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  12. RANDOMNESS of Numbers DEFINITION(QUERY:WHAT? V HOW?) ONLY Via MAXWELL-BOLTZMANN CLASSICAL-Statistics(MBCS) Hot-Plasma VS. Digits-Clumping Log-Law NON-Randomness Inversion ONLY BOSE-EINSTEIN QUANTUM-Statistics(BEQS) .

    NASA Astrophysics Data System (ADS)

    Siegel, Z.; Siegel, Edward Carl-Ludwig

    2011-03-01

    RANDOMNESS of Numbers cognitive-semantics DEFINITION VIA Cognition QUERY: WHAT???, NOT HOW?) VS. computer-``science" mindLESS number-crunching (Harrel-Sipser-...) algorithmics Goldreich "PSEUDO-randomness"[Not.AMS(02)] mea-culpa is ONLY via MAXWELL-BOLTZMANN CLASSICAL-STATISTICS(NOT FDQS!!!) "hot-plasma" REPULSION VERSUS Newcomb(1881)-Weyl(1914;1916)-Benford(1938) "NeWBe" logarithmic-law digit-CLUMPING/ CLUSTERING NON-Randomness simple Siegel[AMS Joint.Mtg.(02)-Abs. # 973-60-124] algebraic-inversion to THE QUANTUM and ONLY BEQS preferentially SEQUENTIALLY lower-DIGITS CLUMPING/CLUSTERING with d = 0 BEC, is ONLY VIA Siegel-Baez FUZZYICS=CATEGORYICS (SON OF TRIZ)/"Category-Semantics"(C-S), latter intersection/union of Lawvere(1964)-Siegel(1964)] category-theory (matrix: MORPHISMS V FUNCTORS) "+" cognitive-semantics'' (matrix: ANTONYMS V SYNONYMS) yields Siegel-Baez FUZZYICS=CATEGORYICS/C-S tabular list-format matrix truth-table analytics: MBCS RANDOMNESS TRUTH/EMET!!!

  13. Reversibility in Quantum Models of Stochastic Processes

    NASA Astrophysics Data System (ADS)

    Gier, David; Crutchfield, James; Mahoney, John; James, Ryan

    Natural phenomena such as time series of neural firing, orientation of layers in crystal stacking and successive measurements in spin systems are inherently probabilistic. The provably minimal classical models of such stochastic processes are ɛ-machines, which consist of internal states, transition probabilities between states and output values. The topological properties of the ɛ-machine for a given process characterize the structure, memory and patterns of that process. However, ɛ-machines are often not ideal because their statistical complexity (Cμ) is demonstrably greater than the excess entropy (E) of the processes they represent. Quantum models (q-machines) of the same processes can do better in that their statistical complexity (Cq) obeys the relation Cμ ≥ Cq ≥ E. q-machines can be constructed to consider longer code words, resulting in greater compression. With sufficiently long code words, the statistical complexity becomes time-symmetric, a feature apparently novel to this quantum representation. This result has ramifications for the compression of classical information in quantum computing and quantum communication technology.
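
The statistical complexity Cμ mentioned above is the Shannon entropy over the stationary distribution of the ɛ-machine's causal states. A small sketch for the golden-mean process, a standard textbook example rather than one of the abstract's systems:

```python
import math

def stationary(T):
    """Stationary distribution of a 2-state Markov chain by power iteration."""
    pi = [0.5, 0.5]
    for _ in range(200):
        pi = [pi[0] * T[0][0] + pi[1] * T[1][0],
              pi[0] * T[0][1] + pi[1] * T[1][1]]
    return pi

def shannon_bits(p):
    """Shannon entropy in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# epsilon-machine of the golden-mean process ("no consecutive 1s"):
# state A may emit 0 (stay) or 1 (go to B); state B must emit 0 (return to A)
T = [[0.5, 0.5],
     [1.0, 0.0]]
pi = stationary(T)                  # stationary state distribution: [2/3, 1/3]
C_mu = shannon_bits(pi)             # statistical complexity over causal states
print(round(C_mu, 3))  # → 0.918
```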

  14. The ambiguity of simplicity in quantum and classical simulation

    NASA Astrophysics Data System (ADS)

    Aghamohammadi, Cina; Mahoney, John R.; Crutchfield, James P.

    2017-04-01

    A system's perceived simplicity depends on whether it is represented classically or quantally. This is not so surprising, as classical and quantum physics are descriptive frameworks built on different assumptions that capture, emphasize, and express different properties and mechanisms. What is surprising is that, as we demonstrate, simplicity is ambiguous: the relative simplicity between two systems can change sign when moving between classical and quantum descriptions. Here, we associate simplicity with small model-memory. We see that the notions of absolute physical simplicity at best form a partial, not a total, order. This suggests that appeals to principles of physical simplicity, via Ockham's Razor or to the "elegance" of competing theories, may be fundamentally subjective. Recent rapid progress in quantum computation and quantum simulation suggests that the ambiguity of simplicity will strongly impact statistical inference and, in particular, model selection.

  15. Chance, determinism and the classical theory of probability.

    PubMed

    Vasudevan, Anubav

    2018-02-01

    This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Power-law distributions for a trapped ion interacting with a classical buffer gas.

    PubMed

    DeVoe, Ralph G

    2009-02-13

    Classical collisions with an ideal gas generate non-Maxwellian distribution functions for a single ion in a radio frequency ion trap. The distributions have power-law tails whose exponent depends on the ratio of buffer gas to ion mass. This provides a statistical explanation for the previously observed transition from cooling to heating. Monte Carlo results approximate a Tsallis distribution over a wide range of parameters and have ab initio agreement with experiment.
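
Power-law tail exponents of the kind reported above are commonly estimated from the largest order statistics of a sample, e.g. with the Hill estimator. A sketch on synthetic Pareto data, illustrative only; the paper's Monte Carlo of ion-buffer-gas collisions is considerably more involved:

```python
import math
import random

random.seed(3)

def hill_estimator(sample, k):
    """Hill estimator of a power-law tail exponent alpha from the
    k largest values of the sample (tail P(X > x) ~ x^-alpha)."""
    xs = sorted(sample, reverse=True)
    return k / sum(math.log(xs[i] / xs[k]) for i in range(k))

# synthetic heavy-tailed "ion energies": Pareto with true tail exponent alpha = 3
alpha_true = 3.0
data = [random.paretovariate(alpha_true) for _ in range(5000)]
alpha_hat = hill_estimator(data, 500)
print(round(alpha_hat, 1))
```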

  17. Study on elevated-temperature flow behavior of Ni-Cr-Mo-B ultra-heavy-plate steel via experiment and modelling

    NASA Astrophysics Data System (ADS)

    Gao, Zhi-yu; Kang, Yu; Li, Yan-shuai; Meng, Chao; Pan, Tao

    2018-04-01

    Elevated-temperature flow behavior of a novel Ni-Cr-Mo-B ultra-heavy-plate steel was investigated by conducting hot compressive deformation tests on a Gleeble-3800 thermo-mechanical simulator over a temperature range of 1123 K to 1423 K, with strain rates from 0.01 s−1 to 10 s−1 and a height reduction of 70%. Based on the experimental results, a classic strain-compensated Arrhenius-type model, a new revised strain-compensated Arrhenius-type model, and a classic modified Johnson-Cook constitutive model were developed for predicting the high-temperature deformation behavior of the steel. The predictability of these models was comparatively evaluated in terms of statistical parameters including the correlation coefficient (R), average absolute relative error (AARE), average root mean square error (RMSE), normalized mean bias error (NMBE), and relative error. The statistical results indicate that the new revised strain-compensated Arrhenius-type model accurately predicts the elevated-temperature flow stress of the steel over the entire range of process conditions. The predictions of the classic modified Johnson-Cook model did not agree well with the experimental values; the classic strain-compensated Arrhenius-type model tracked the deformation behavior more accurately than the modified Johnson-Cook model, but less accurately than the new revised strain-compensated Arrhenius-type model. In addition, reasons for the differences in predictability of these models are discussed in detail.
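
The goodness-of-fit measures used above (R, AARE, RMSE) are straightforward to compute from paired experimental and predicted stresses. A sketch with hypothetical flow-stress values, not the paper's data:

```python
import math

def fit_metrics(exp, pred):
    """Correlation coefficient R, average absolute relative error (AARE, %),
    and root mean square error (RMSE) between experimental and predicted values."""
    n = len(exp)
    me = sum(exp) / n
    mp = sum(pred) / n
    cov = sum((e - me) * (p - mp) for e, p in zip(exp, pred))
    R = cov / math.sqrt(sum((e - me) ** 2 for e in exp) *
                        sum((p - mp) ** 2 for p in pred))
    AARE = 100.0 / n * sum(abs((e - p) / e) for e, p in zip(exp, pred))
    RMSE = math.sqrt(sum((e - p) ** 2 for e, p in zip(exp, pred)) / n)
    return R, AARE, RMSE

# hypothetical flow-stress values (MPa), for illustration only
measured  = [120.0, 150.0, 180.0, 210.0, 240.0]
predicted = [118.0, 153.0, 176.0, 214.0, 238.0]
R, aare, rmse = fit_metrics(measured, predicted)
print(round(R, 4), round(aare, 2), round(rmse, 2))
```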

  18. Acute effect of scapular proprioceptive neuromuscular facilitation (PNF) techniques and classic exercises in adhesive capsulitis: a randomized controlled trial

    PubMed Central

    Balcı, Nilay Comuk; Yuruk, Zeliha Ozlem; Zeybek, Aslican; Gulsen, Mustafa; Tekindal, Mustafa Agah

    2016-01-01

    [Purpose] The aim of our study was to compare the initial effects of scapular proprioceptive neuromuscular facilitation techniques and classic exercise interventions with physiotherapy modalities on pain, scapular dyskinesis, range of motion, and function in adhesive capsulitis. [Subjects and Methods] Fifty-three subjects were allocated to 3 groups: scapular proprioceptive neuromuscular facilitation exercises and physiotherapy modalities, classic exercise and physiotherapy modalities, and physiotherapy modalities only. The intervention was applied in a single session. The Visual Analog Scale, Lateral Scapular Slide Test, range of motion, and Simple Shoulder Test were evaluated before and immediately after the one-hour intervention, all within the same session. [Results] All of the groups showed significant differences in shoulder flexion and abduction range of motion and Simple Shoulder Test scores. There were statistically significant differences in Visual Analog Scale scores in the proprioceptive neuromuscular facilitation and control groups, and no treatment method had a significant effect on the Lateral Scapular Slide Test results. There were no statistically significant differences between the groups before and after the intervention. [Conclusion] Proprioceptive neuromuscular facilitation, classic exercise, and physiotherapy modalities had immediate effects on adhesive capsulitis in our study. However, there was no additional benefit of exercises in one session over physiotherapy modalities. Also, an effective treatment regimen for shoulder rehabilitation of adhesive capsulitis patients should include scapular exercises. PMID:27190456

  19. Malnutrition and Environmental Enrichment: A Statistical Reappraisal of the Findings of the Adoption Study of Winick et al. (1975).

    ERIC Educational Resources Information Center

    Trueman, Mark

    1985-01-01

    Critically reviews the influential study "Malnutrition and Environmental Enrichment" by Winick et al. (1975) and highlights what are considered to be statistical flaws in its analysis. Data in the classic study of height, weight, and IQ changes in three groups of adopted, malnourished Korean girls are reanalysed and conclusions…

  20. Information flow and quantum cryptography using statistical fluctuations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Home, D.; Whitaker, M.A.B.

    2003-02-01

    A procedure is formulated, using the quantum teleportation arrangement, that communicates knowledge of an apparatus setting between the wings of the experiment, using statistical fluctuations in a sequence of measurement results. It requires an entangled state, and transmission of classical information totally unrelated to the apparatus setting actually communicated. Our procedure has conceptual interest, and has applications to quantum cryptography.

  1. Wigner surmises and the two-dimensional homogeneous Poisson point process.

    PubMed

    Sakhr, Jamal; Nieminen, John M

    2006-04-01

    We derive a set of identities that relate the higher-order interpoint spacing statistics of the two-dimensional homogeneous Poisson point process to the Wigner surmises for the higher-order spacing distributions of eigenvalues from the three classical random matrix ensembles. We also report a remarkable identity that equates the second-nearest-neighbor spacing statistics of the points of the Poisson process and the nearest-neighbor spacing statistics of complex eigenvalues from Ginibre's ensemble of 2 × 2 complex non-Hermitian random matrices.
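
One ingredient behind identities of this kind is that the nearest-neighbour distance of a 2D homogeneous Poisson process is Rayleigh distributed, i.e. it has exactly the shape of the GOE Wigner surmise, with mean 1/(2√ρ) at intensity ρ. A quick Monte Carlo check on a torus (periodic distances avoid edge effects); illustrative code, not the authors':

```python
import math
import random

random.seed(2)

def torus_nn_distances(n, L=1.0):
    """Nearest-neighbour distances of n uniform points on an L x L torus."""
    pts = [(random.uniform(0, L), random.uniform(0, L)) for _ in range(n)]
    out = []
    for i, (xi, yi) in enumerate(pts):
        best = float("inf")
        for j, (xj, yj) in enumerate(pts):
            if i == j:
                continue
            dx = min(abs(xi - xj), L - abs(xi - xj))  # periodic distance
            dy = min(abs(yi - yj), L - abs(yi - yj))
            best = min(best, dx * dx + dy * dy)
        out.append(math.sqrt(best))
    return out

n = 400
d = torus_nn_distances(n)
mean_d = sum(d) / len(d)
rho = n  # intensity on the unit torus
ratio = mean_d * 2 * math.sqrt(rho)  # Rayleigh law predicts <d> = 1/(2 sqrt(rho))
print(round(ratio, 2))
```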

  2. Quantum and classical behavior in interacting bosonic systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertzberg, Mark P.

    It is understood that in free bosonic theories, the classical field theory accurately describes the full quantum theory when the occupancy numbers of systems are very large. However, the situation is less understood in interacting theories, especially on time scales longer than the dynamical relaxation time. Recently there have been claims that the quantum theory deviates spectacularly from the classical theory on this time scale, even if the occupancy numbers are extremely large. Furthermore, it is claimed that the quantum theory quickly thermalizes while the classical theory does not. The evidence for these claims comes from noticing a spectacular difference in the time evolution of expectation values of quantum operators compared to the classical micro-state evolution. If true, this would have dramatic consequences for many important phenomena, including laboratory studies of interacting BECs, dark matter axions, preheating after inflation, etc. In this work we critically examine these claims. We show that in fact the classical theory can describe the quantum behavior in the high occupancy regime, even when interactions are large. The connection is that the expectation values of quantum operators in a single quantum micro-state are approximated by a corresponding classical ensemble average over many classical micro-states. Furthermore, by the ergodic theorem, a classical ensemble average of local fields with statistical translation invariance is the spatial average of a single micro-state. So the correlation functions of the quantum and classical field theories of a single micro-state approximately agree at high occupancy, even in interacting systems. Furthermore, both quantum and classical field theories can thermalize, when appropriate coarse graining is introduced, with the classical case requiring a cutoff on low occupancy UV modes. We discuss applications of our results.

  3. Quantum-Classical Correspondence Principle for Work Distributions

    NASA Astrophysics Data System (ADS)

    Jarzynski, Christopher; Quan, H. T.; Rahav, Saar

    2015-07-01

    For closed quantum systems driven away from equilibrium, work is often defined in terms of projective measurements of initial and final energies. This definition leads to statistical distributions of work that satisfy nonequilibrium work and fluctuation relations. While this two-point measurement definition of quantum work can be justified heuristically by appeal to the first law of thermodynamics, its relationship to the classical definition of work has not been carefully examined. In this paper, we employ semiclassical methods, combined with numerical simulations of a driven quartic oscillator, to study the correspondence between classical and quantal definitions of work in systems with 1 degree of freedom. We find that a semiclassical work distribution, built from classical trajectories that connect the initial and final energies, provides an excellent approximation to the quantum work distribution when the trajectories are assigned suitable phases and are allowed to interfere. Neglecting the interferences between trajectories reduces the distribution to that of the corresponding classical process. Hence, in the semiclassical limit, the quantum work distribution converges to the classical distribution, decorated by a quantum interference pattern. We also derive the form of the quantum work distribution at the boundary between classically allowed and forbidden regions, where this distribution tunnels into the forbidden region. Our results clarify how the correspondence principle applies in the context of quantum and classical work distributions and contribute to the understanding of work and nonequilibrium work relations in the quantum regime.

  4. Characterizing chaotic melodies in automatic music composition

    NASA Astrophysics Data System (ADS)

    Coca, Andrés E.; Tost, Gerard O.; Zhao, Liang

    2010-09-01

    In this paper, we initially present an algorithm for automatic composition of melodies using chaotic dynamical systems. Afterward, we characterize chaotic music in a comprehensive way as comprising three perspectives: musical discrimination, dynamical influence on musical features, and musical perception. With respect to the first perspective, the coherence between generated chaotic melodies (continuous as well as discrete chaotic melodies) and a set of classical reference melodies is characterized by statistical descriptors and melodic measures. The significant differences among the three types of melodies are determined by discriminant analysis. Regarding the second perspective, the influence of dynamical features of chaotic attractors, e.g., Lyapunov exponent, Hurst coefficient, and correlation dimension, on melodic features is determined by canonical correlation analysis. The last perspective is related to perception of originality, complexity, and degree of melodiousness (Euler's gradus suavitatis) of chaotic and classical melodies by nonparametric statistical tests.
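
    As a flavor of the first step, here is a minimal sketch (my own toy illustration, not the authors' composition algorithm): iterate the logistic map in its chaotic regime and quantize each orbit value onto a scale to obtain a discrete melody.

```python
# C-major pitch classes used to quantize the chaotic orbit.
C_MAJOR = ["C", "D", "E", "F", "G", "A", "B"]

def chaotic_melody(x0=0.123, r=3.99, length=16):
    """Map a logistic-map orbit onto scale degrees."""
    notes, x = [], x0
    for _ in range(length):
        x = r * x * (1.0 - x)                       # logistic map, chaotic regime
        notes.append(C_MAJOR[int(x * len(C_MAJOR)) % len(C_MAJOR)])
    return notes

melody = chaotic_melody()
print(" ".join(melody))
```

    Changing the initial condition x0 slightly produces a completely different melody, which is exactly the sensitivity the dynamical measures (Lyapunov exponent, correlation dimension) quantify.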

  5. The Effect of Live Classical Piano Music on the Vital Signs of Patients Undergoing Ophthalmic Surgery

    PubMed Central

    Camara, Jorge G.; Ruszkowski, Joseph M.; Worak, Sandra R.

    2008-01-01

    Context Music and surgery. Objective To determine the effect of live classical piano music on vital signs of patients undergoing ophthalmic surgery. Design Retrospective case series. Setting and Patients 203 patients who underwent various ophthalmologic procedures in a period during which a piano was present in the operating room of St. Francis Medical Center. [Note: St. Francis Medical Center has recently been renamed Hawaii Medical Center East.] Intervention Demographic data, surgical procedures, and the vital signs of 203 patients who underwent ophthalmic procedures were obtained from patient records. Blood pressure, heart rate, and respiratory rate measured in the preoperative holding area were compared with the same parameters taken in the operating room, with and without exposure to live piano music. A paired t-test was used for statistical analysis. Main outcome measure Mean arterial pressure, heart rate, and respiratory rate. Results 115 patients who were exposed to live piano music showed a statistically significant decrease in mean arterial blood pressure, heart rate, and respiratory rate in the operating room compared with their vital signs measured in the preoperative holding area (P < .0001). The control group of 88 patients not exposed to live piano music showed a statistically significant increase in mean arterial blood pressure (P < .0002) and heart rate and respiratory rate (P < .0001). Conclusion Live classical piano music lowered the blood pressure, heart rate, and respiratory rate in patients undergoing ophthalmic surgery. PMID:18679538
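
    The analysis is a standard paired t-test on before/after vital signs. A sketch with synthetic numbers (the effect size below is invented for illustration; these are not the patients' data):

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical mean arterial pressures (mmHg) for 115 patients.
holding_area = rng.normal(95.0, 8.0, size=115)
operating_room = holding_area - rng.normal(4.0, 3.0, size=115)  # drop with music

# Paired t statistic: mean difference over its standard error.
d = operating_room - holding_area
t = d.mean() / (d.std(ddof=1) / np.sqrt(d.size))
print(f"paired t = {t:.2f} on {d.size - 1} df")
```

    A large-magnitude negative t on 114 degrees of freedom corresponds to the very small P values reported in the abstract.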

  6. Quantum probability, choice in large worlds, and the statistical structure of reality.

    PubMed

    Ross, Don; Ladyman, James

    2013-06-01

    Classical probability models of incentive response are inadequate in "large worlds," where the dimensions of relative risk and the dimensions of similarity in outcome comparisons typically differ. Quantum probability models for choice in large worlds may be motivated pragmatically - there is no third theory - or metaphysically: statistical processing in the brain adapts to the true scale-relative structure of the universe.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paavola, Janika; Hall, Michael J. W.; Paris, Matteo G. A.

    The transition from quantum to classical, in the case of a quantum harmonic oscillator, is typically identified with the transition from a quantum superposition of macroscopically distinguishable states, such as the Schroedinger-cat state, into the corresponding statistical mixture. This transition is commonly characterized by the asymptotic loss of the interference term in the Wigner representation of the cat state. In this paper we show that the quantum-to-classical transition has different dynamical features depending on the measure for nonclassicality used. Measures based on an operatorial definition have well-defined physical meaning and allow a deeper understanding of the quantum-to-classical transition. Our analysis shows that, for most nonclassicality measures, the Schroedinger-cat state becomes classical after a finite time. Moreover, our results challenge the prevailing idea that more macroscopic states are more susceptible to decoherence in the sense that the transition from quantum to classical occurs faster. Since nonclassicality is a prerequisite for entanglement generation, our results also bridge the gap between decoherence, which is lost only asymptotically, and entanglement, which may show a "sudden death". In fact, whereas the loss of coherences still remains asymptotic, we emphasize that the transition from quantum to classical can indeed occur at a finite time.

  8. Tsallis non-extensive statistics and solar wind plasma complexity

    NASA Astrophysics Data System (ADS)

    Pavlos, G. P.; Iliopoulos, A. C.; Zastenker, G. N.; Zelenyi, L. M.; Karakatsanis, L. P.; Riazantseva, M. O.; Xenakis, M. N.; Pavlos, E. G.

    2015-03-01

    This article presents novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event, which took place on 26 September 2011. Solar wind plasma is a typical case of a stochastic spatiotemporal distribution of physical state variables such as force fields (B, E) and matter fields (particle and current densities or bulk plasma distributions). This study clearly shows the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level. It also underlines the inefficiency of classical magneto-hydro-dynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), to explain the complexity of the solar wind dynamics, since these theories include smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian non-extensive statistics with heavy-tailed probability distribution functions, which are related to the q-extension of the CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion theory and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992).
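
    The heavy-tail point can be illustrated numerically: q-Gaussians coincide with Student-t distributions, so a quick tail-weight check (toy samples of my choosing, not the solar-wind data) compares the excess kurtosis of Gaussian and q-Gaussian draws.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
gauss = rng.normal(size=n)
heavy = rng.standard_t(df=5, size=n)   # a q-Gaussian with q = (df + 3)/(df + 1)

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (zero for a Gaussian)."""
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2) ** 2 - 3.0

print(excess_kurtosis(gauss))   # near 0
print(excess_kurtosis(heavy))   # clearly positive: heavy tails
```

    The Gaussian sample sits near zero while the Student-t sample shows large positive excess kurtosis, the fingerprint of the heavy-tailed, non-extensive statistics described above.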

  9. A methodology for the stochastic generation of hourly synthetic direct normal irradiation time series

    NASA Astrophysics Data System (ADS)

    Larrañeta, M.; Moreno-Tejera, S.; Lillo-Bravo, I.; Silva-Pérez, M. A.

    2018-02-01

    Many of the available solar radiation databases only provide global horizontal irradiance (GHI), while there is a growing need for extensive databases of direct normal irradiance (DNI), mainly for the development of concentrated solar power and concentrated photovoltaic technologies. In the present work, we propose a methodology for the generation of synthetic DNI hourly data from hourly average GHI values by dividing the irradiance into a deterministic and a stochastic component, intending to emulate the dynamics of the solar radiation. The deterministic component is modeled through a simple classical model. The stochastic component is fitted to measured data in order to maintain the consistency of the synthetic data with the state of the sky, generating statistically significant DNI data with a cumulative frequency distribution very similar to that of the measured data. The adaptation and application of the model to the location of Seville shows significant improvements in terms of frequency distribution over the classical models. The proposed methodology, applied to other locations with different climatological characteristics, also yields better results than the classical models in terms of frequency distribution, reaching a 50% reduction in the Finkelstein-Schafer (FS) and Kolmogorov-Smirnov test integral (KSI) statistics.
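
    The goodness-of-fit checks named at the end are distribution-distance statistics. A hedged sketch of the two-sample Kolmogorov-Smirnov statistic on synthetic stand-in samples (not the Seville measurements; the gamma shapes below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
# Stand-in "measured" and "synthetic" hourly DNI samples (W/m^2).
measured = rng.gamma(shape=2.0, scale=300.0, size=5000)
synthetic = rng.gamma(shape=2.0, scale=300.0, size=5000)

def ks_statistic(a, b):
    """Max absolute gap between the two empirical CDFs."""
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / a.size
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / b.size
    return float(np.max(np.abs(cdf_a - cdf_b)))

D = ks_statistic(measured, synthetic)
print(D)   # small when the frequency distributions agree
```

    A synthetic series that reproduces the measured cumulative frequency distribution drives D (and the FS/KSI integrals built from the same CDF gap) toward zero.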

  10. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis.

    PubMed

    Carvalho, Pedro; Marques, Rui Cunha

    2016-02-15

    This study aims to search for economies of size and scope in the Portuguese water sector applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). This study proves the usefulness and advantages of the application of Bayesian statistics for making inference in SFA over traditional SFA which just uses classical statistics. The resulting Bayesian methods allow overcoming some problems that arise in the application of the traditional SFA, such as the bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained we found that there are important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing out to the huge advantages in undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. An Update on Statistical Boosting in Biomedicine.

    PubMed

    Mayr, Andreas; Hofner, Benjamin; Waldmann, Elisabeth; Hepp, Tobias; Meyer, Sebastian; Gefeller, Olaf

    2017-01-01

    Statistical boosting algorithms have triggered a lot of research during the last decade. They combine a powerful machine learning approach with classical statistical modelling, offering various practical advantages like automated variable selection and implicit regularization of effect estimates. They are extremely flexible, as the underlying base-learners (regression functions defining the type of effect for the explanatory variables) can be combined with any kind of loss function (target function to be optimized, defining the type of regression setting). In this review article, we highlight the most recent methodological developments on statistical boosting regarding variable selection, functional regression, and advanced time-to-event modelling. Additionally, we provide a short overview on relevant applications of statistical boosting in biomedicine.
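
    A minimal sketch of the core idea (componentwise L2-boosting with a shrinkage parameter, one concrete member of the family reviewed; the data below are synthetic): at each step, fit every single-variable base-learner to the current residuals and update only the best one, which yields the automated variable selection and implicit regularization mentioned above.

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 200, 10
X = rng.normal(size=(n, p))
y = 3.0 * X[:, 0] - 2.0 * X[:, 3] + rng.normal(scale=0.5, size=n)  # 2 true signals

beta, nu, resid = np.zeros(p), 0.1, y.copy()
for _ in range(300):
    # Least-squares slope of each column against the residuals.
    slopes = X.T @ resid / np.sum(X**2, axis=0)
    # Residual sum of squares after each candidate update.
    sse = np.sum(resid**2) - slopes**2 * np.sum(X**2, axis=0)
    j = int(np.argmin(sse))            # best-fitting base-learner
    beta[j] += nu * slopes[j]          # shrunken update of that one effect
    resid = y - X @ beta

print(np.round(beta, 2))   # mass concentrates on columns 0 and 3
```

    Stopping the loop early keeps the noise coefficients near zero, which is how boosting trades a little bias for sparsity and stability.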

  12. On information, negentropy and H-theorem

    NASA Astrophysics Data System (ADS)

    Chakrabarti, C. G.; Sarker, N. G.

    1983-09-01

    The paper deals with the importance of the Kullback discrimination information in the statistical characterization of the negentropy of a non-equilibrium state and the irreversibility of a classical dynamical system. The theory, based on the Kullback discrimination information as the H-function, gives new insight into the interrelation between the concepts of coarse-graining and the principle of sufficiency, leading to an important statistical characterization of the thermal equilibrium of a closed system.
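
    The H-function in question is the Kullback discrimination information K(p||q) = Σ_i p_i ln(p_i / q_i). A small sketch with toy distributions of my choosing:

```python
import numpy as np

def kullback(p, q):
    """Kullback discrimination information K(p||q) in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

equilibrium = np.array([0.25, 0.25, 0.25, 0.25])   # coarse-grained equilibrium
nonequil = np.array([0.70, 0.10, 0.10, 0.10])

# Negentropy-like behavior: zero exactly at equilibrium, positive otherwise.
print(kullback(nonequil, equilibrium))
print(kullback(equilibrium, equilibrium))   # 0.0
```

    Because K(p||q) ≥ 0 with equality only at p = q, its decay toward zero tracks relaxation toward equilibrium, which is why it can serve as an H-function.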

  13. Minimum Uncertainty Coherent States Attached to Nondegenerate Parametric Amplifiers

    NASA Astrophysics Data System (ADS)

    Dehghani, A.; Mojaveri, B.

    2015-06-01

    Exact analytical solutions for the two-mode nondegenerate parametric amplifier have been obtained by using the transformation from the two-dimensional harmonic oscillator Hamiltonian. Some important physical properties such as quantum statistics and quadrature squeezing of the corresponding states are investigated. In addition, these states carry classical features such as Poissonian statistics and minimize the Heisenberg uncertainty relation of a pair of the coordinate and the momentum operators.

  14. A Photon Interference Detector with Continuous Display.

    ERIC Educational Resources Information Center

    Gilmore, R. S.

    1978-01-01

    Describes an apparatus which attempts to give a direct visual impression of the random detection of individual photons coupled with the recognition of the classical intensity distribution as a result of fairly high photon statistics. (Author/GA)

  15. Colors of Inner Disk Classical Kuiper Belt Objects

    NASA Astrophysics Data System (ADS)

    Romanishin, W.; Tegler, S. C.; Consolmagno, G. J.

    2010-07-01

    We present new optical broadband colors, obtained with the Keck 1 and Vatican Advanced Technology telescopes, for six objects in the inner classical Kuiper Belt. Objects in the inner classical Kuiper Belt are of interest as they may represent the surviving members of the primordial Kuiper Belt that formed interior to the current position of the 3:2 resonance with Neptune, the current position of the plutinos, or, alternatively, they may be objects formed at a different heliocentric distance that were then moved to their present locations. The six new colors, combined with four previously published, show that the ten inner belt objects with known colors form a neutral clump and a reddish clump in B-R color. Nonparametric statistical tests show no significant difference between the B-R color distribution of the inner disk objects compared to the color distributions of Centaurs, plutinos, or scattered disk objects. However, the B-R color distribution of the inner classical Kuiper Belt Objects does differ significantly from the distribution of colors in the cold (low inclination) main classical Kuiper Belt. The cold main classical objects are predominately red, while the inner classical belt objects are a mixture of neutral and red. The color difference may reveal the existence of a gradient in the composition and/or surface processing history in the primordial Kuiper Belt, or indicate that the inner disk objects are not dynamically analogous to the cold main classical belt objects.

  16. COLORS OF INNER DISK CLASSICAL KUIPER BELT OBJECTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romanishin, W.; Tegler, S. C.; Consolmagno, G. J., E-mail: wromanishin@ou.ed, E-mail: Stephen.Tegler@nau.ed, E-mail: gjc@specola.v

    2010-07-15

    We present new optical broadband colors, obtained with the Keck 1 and Vatican Advanced Technology telescopes, for six objects in the inner classical Kuiper Belt. Objects in the inner classical Kuiper Belt are of interest as they may represent the surviving members of the primordial Kuiper Belt that formed interior to the current position of the 3:2 resonance with Neptune, the current position of the plutinos, or, alternatively, they may be objects formed at a different heliocentric distance that were then moved to their present locations. The six new colors, combined with four previously published, show that the ten inner belt objects with known colors form a neutral clump and a reddish clump in B-R color. Nonparametric statistical tests show no significant difference between the B-R color distribution of the inner disk objects compared to the color distributions of Centaurs, plutinos, or scattered disk objects. However, the B-R color distribution of the inner classical Kuiper Belt Objects does differ significantly from the distribution of colors in the cold (low inclination) main classical Kuiper Belt. The cold main classical objects are predominately red, while the inner classical belt objects are a mixture of neutral and red. The color difference may reveal the existence of a gradient in the composition and/or surface processing history in the primordial Kuiper Belt, or indicate that the inner disk objects are not dynamically analogous to the cold main classical belt objects.

  17. Statistics of transmission eigenvalues in two-dimensional quantum cavities: Ballistic versus stochastic scattering

    NASA Astrophysics Data System (ADS)

    Rotter, Stefan; Aigner, Florian; Burgdörfer, Joachim

    2007-03-01

    We investigate the statistical distribution of transmission eigenvalues in phase-coherent transport through quantum dots. In two-dimensional ab initio simulations for both clean and disordered two-dimensional cavities, we find markedly different quantum-to-classical crossover scenarios for these two cases. In particular, we observe the emergence of “noiseless scattering states” in clean cavities, irrespective of sharp-edged entrance and exit lead mouths. We find the onset of these “classical” states to be largely independent of the cavity’s classical chaoticity, but very sensitive with respect to bulk disorder. Our results suggest that for weakly disordered cavities, the transmission eigenvalue distribution is determined both by scattering at the disorder potential and the cavity walls. To properly account for this intermediate parameter regime, we introduce a hybrid crossover scheme, which combines previous models that are valid in the ballistic and the stochastic limit, respectively.

  18. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-01-01

    The results of a comprehensive investigation of interconnect fatigue that has led to the definition of useful reliability-design and life-prediction algorithms are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To remedy this shortcoming, the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous--i.e., quantitative--interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.
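
    The move from a mean fatigue curve to statistical fatigue curves can be sketched as follows. This is an illustrative construction with invented parameters (a Coffin-Manson-style mean life with assumed lognormal scatter), not the report's fitted data:

```python
from math import erf, log, sqrt

C, b = 1.0e3, 2.0   # hypothetical Coffin-Manson parameters: N_f = C * strain^-b
sigma = 0.5         # assumed log-scale scatter of cycles to failure

def mean_life(strain):
    """Mean cycles to failure from the classical strain-cycle curve."""
    return C * strain ** (-b)

def failure_probability(strain, n_cycles):
    """P(N_f <= n_cycles) under lognormal scatter about the mean curve."""
    z = (log(n_cycles) - log(mean_life(strain))) / sigma
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# At the mean life the failure probability is 50% by construction; fixing
# other probabilities (e.g. 1%, 10%) traces out the statistical fatigue curves.
print(failure_probability(0.01, mean_life(0.01)))
```

    Inverting failure_probability at a fixed probability gives one curve of the family with "failure probability as a parameter" described in the abstract.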

  19. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-03-01

    The results of a comprehensive investigation of interconnect fatigue that has led to the definition of useful reliability-design and life-prediction algorithms are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To remedy this shortcoming, the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous--i.e., quantitative--interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.

  20. Turbulent statistics and intermittency enhancement in coflowing superfluid 4He

    NASA Astrophysics Data System (ADS)

    Biferale, L.; Khomenko, D.; L'vov, V.; Pomyalov, A.; Procaccia, I.; Sahoo, G.

    2018-02-01

    The large-scale turbulent statistics of mechanically driven superfluid 4He was shown experimentally to follow the classical counterpart. In this paper, we use direct numerical simulations to study the whole range of scales in a range of temperatures T ∈ [1.3, 2.1] K. The numerics employ self-consistent and nonlinearly coupled normal and superfluid components. The main results are that (i) the velocity fluctuations of normal and super components are well correlated in the inertial range of scales, but decorrelate at small scales. (ii) The energy transfer by mutual friction between components is particularly efficient in the temperature range between 1.8 and 2 K, leading to enhancement of small-scale intermittency for these temperatures. (iii) At low T and close to Tλ, the scaling properties of the energy spectra and structure functions of the two components approach those of classical hydrodynamic turbulence.

  1. Quantum chaos: an introduction via chains of interacting spins-1/2

    NASA Astrophysics Data System (ADS)

    Gubin, Aviva; Santos, Lea

    2012-02-01

    We discuss aspects of quantum chaos by focusing on spectral statistical properties and structures of eigenstates of quantum many-body systems. Quantum systems whose classical counterparts are chaotic have properties that differ from those of quantum systems whose classical counterparts are regular. One of the main signatures of what became known as quantum chaos is a spectrum showing repulsion of the energy levels. We show how level repulsion may develop in one-dimensional systems of interacting spins-1/2 which are devoid of random elements and involve only two-body interactions. We present a simple recipe to unfold the spectrum and emphasize the importance of taking into account the symmetries of the system. In addition to the statistics of eigenvalues, we also analyze how the structure of the eigenstates may indicate chaos. This is done by computing quantities that measure the level of delocalization of the eigenstates.
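
    The level-repulsion signature can be sketched with a random-matrix surrogate (a generic GOE-like matrix rather than the paper's spin-1/2 chains, and with only a crude unfolding): nearest-neighbor spacings of its eigenvalues avoid zero, while independent random levels do not.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1000
A = rng.normal(size=(n, n))
H = (A + A.T) / 2.0                        # real symmetric (GOE-like) matrix

levels = np.sort(np.linalg.eigvalsh(H))
bulk = levels[n // 4: 3 * n // 4]          # keep the bulk of the spectrum
s = np.diff(bulk)
s /= s.mean()                              # crude unfolding to unit mean spacing

# Reference: spacings of independent uniform levels (Poisson statistics).
poisson = np.diff(np.sort(rng.uniform(0, 1, size=bulk.size)))
poisson /= poisson.mean()

# Fraction of very small spacings: suppressed by level repulsion.
frac_goe = float(np.mean(s < 0.1))
frac_poisson = float(np.mean(poisson < 0.1))
print(frac_goe, frac_poisson)
```

    The chaotic (Wigner-Dyson) case shows a depleted fraction of near-degenerate levels compared with the Poisson reference, which is the repulsion the abstract describes; a proper analysis would use the system-specific unfolding and symmetry resolution emphasized there.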

  2. Finite-size effect on optimal efficiency of heat engines.

    PubMed

    Tajima, Hiroyasu; Hayashi, Masahito

    2017-07-01

    The optimal efficiency of quantum (or classical) heat engines whose heat baths are n-particle systems is given by the strong large deviation. We give the optimal work extraction process as a concrete energy-preserving unitary time evolution among the heat baths and the work storage. We show that our optimal work extraction turns the disordered energy of the heat baths to the ordered energy of the work storage, by evaluating the ratio of the entropy difference to the energy difference in the heat baths and the work storage, respectively. By comparing the statistical mechanical optimal efficiency with the macroscopic thermodynamic bound, we evaluate the accuracy of the macroscopic thermodynamics with finite-size heat baths from the statistical mechanical viewpoint. We also evaluate the quantum coherence effect on the optimal efficiency of the cycle processes without restricting their cycle time by comparing the classical and quantum optimal efficiencies.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gevorkyan, A. S., E-mail: g-ashot@sci.am; Sahakyan, V. V.

    We study classical 1D Heisenberg spin glasses in the framework of the nearest-neighbor model. Based on the Hamilton equations, we obtain a system of recurrence equations which allows node-by-node calculations of a spin chain. It is shown that calculation from the first principles of classical mechanics leads to an NP-hard problem, which however in the limit of statistical equilibrium can be solved by a polynomial-time (P) algorithm. For the partition function of the ensemble a new representation is offered in the form of a one-dimensional integral over the spin chain's energy distribution.

  4. A classical density-functional theory for describing water interfaces.

    PubMed

    Hughes, Jessica; Krebs, Eric J; Roundy, David

    2013-01-14

    We develop a classical density functional for water which combines the White Bear fundamental-measure theory (FMT) functional for the hard sphere fluid with attractive interactions based on the statistical associating fluid theory variable range (SAFT-VR). This functional reproduces the properties of water at both long and short length scales over a wide range of temperatures and is computationally efficient, comparable to the cost of FMT itself. We demonstrate our functional by applying it to systems composed of two hard rods, four hard rods arranged in a square, and hard spheres in water.

  5. Quantum communication complexity advantage implies violation of a Bell inequality

    PubMed Central

    Buhrman, Harry; Czekaj, Łukasz; Grudka, Andrzej; Horodecki, Michał; Horodecki, Paweł; Markiewicz, Marcin; Speelman, Florian; Strelchuk, Sergii

    2016-01-01

    We obtain a general connection between a large quantum advantage in communication complexity and Bell nonlocality. We show that given any protocol offering a sufficiently large quantum advantage in communication complexity, there exists a way of obtaining measurement statistics that violate some Bell inequality. Our main tool is port-based teleportation. If the gap between quantum and classical communication complexity can grow arbitrarily large, the ratio of the quantum value to the classical value of the Bell quantity becomes unbounded with the increase in the number of inputs and outputs. PMID:26957600
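
    The Bell quantity involved can be illustrated with the CHSH inequality: a brute-force check that deterministic classical strategies cap at 2, while the quantum singlet correlations at the standard measurement angles reach 2√2. This is a textbook sketch, not the port-based-teleportation construction of the paper.

```python
import numpy as np

def quantum_chsh():
    """CHSH value for the singlet state, E(a, b) = -cos(a - b), optimal angles."""
    a0, a1, b0, b1 = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
    E = lambda a, b: -np.cos(a - b)
    return abs(E(a0, b0) + E(a0, b1) + E(a1, b0) - E(a1, b1))

def classical_chsh():
    """Best CHSH value over all deterministic +/-1 assignments (local strategies)."""
    best = 0.0
    for A0 in (-1, 1):
        for A1 in (-1, 1):
            for B0 in (-1, 1):
                for B1 in (-1, 1):
                    best = max(best, abs(A0 * B0 + A0 * B1 + A1 * B0 - A1 * B1))
    return best

print(classical_chsh(), quantum_chsh())   # 2 versus 2*sqrt(2)
```

    The abstract's claim is that for protocols with a large enough communication advantage, the analogous quantum-to-classical ratio of a (generally more complicated) Bell quantity can be made arbitrarily large, not just the fixed √2 factor of CHSH.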

  6. Exploring Attitudes of Indian Classical Singers Toward Seeking Vocal Health Care.

    PubMed

    Gunjawate, Dhanshree R; Aithal, Venkataraja U; Guddattu, Vasudeva; Kishore, Amrutha; Bellur, Rajashekhar

    2016-11-01

    The attitude of Indian classical singers toward seeking vocal health care is a dimension yet to be explored. The current study aimed to determine the attitudes of these singers toward seeking vocal health care and to understand the influence of age and gender. Cross-sectional. A 10-item self-report questionnaire adapted from a study on contemporary commercial music singers was used. An additional question asked whether the singer was aware of the profession and role of speech-language pathologists (SLPs). The questionnaire was administered to 55 randomly selected self-identified trained Indian classical singers who rated the items using a five-point Likert scale. Demographic variables were summarized using descriptive statistics, and the t test was used to compare mean scores between genders and age groups. Of the singers, 78.2% were likely to see a doctor for health-related problems, whereas 81.8% were unlikely to seek medical care for voice-related problems; the difference was statistically significant (P < 0.001). Responses for the questions assessing attitudes toward findings from a medical examination by a specialist revealed a statistically significant difference (P = 0.02) between the genders. Age did not have a significant influence on the responses. Only 23.6% of the respondents were aware of the profession and role of SLPs. The findings are in line with Western literature reporting the hesitation of singers toward seeking vocal health care, and they draw the attention of SLPs to the need to promote their role in vocal health awareness and management. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  7. Comparison of classical statistical methods and artificial neural network in traffic noise prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nedic, Vladimir, E-mail: vnedic@kg.ac.rs; Despotovic, Danijela, E-mail: ddespotovic@kg.ac.rs; Cvetanovic, Slobodan, E-mail: slobodan.cvetanovic@eknfak.ni.ac.rs

    2014-11-15

    Traffic is the main source of noise in urban environments and significantly affects human mental and physical health and labor productivity. Therefore it is very important to model the noise produced by various vehicles. Techniques for traffic noise prediction are mainly based on regression analysis, which generally is not good enough to describe the trends of noise. In this paper the application of artificial neural networks (ANNs) for the prediction of traffic noise is presented. As input variables of the neural network, the proposed structure of the traffic flow and the average speed of the traffic flow are chosen. The output variable of the network is the equivalent noise level in the given time period L{sub eq}. Based on these parameters, the network is modeled, trained and tested through a comparative analysis of the calculated values and measured levels of traffic noise using the originally developed user friendly software package. It is shown that artificial neural networks can be a useful tool for the prediction of noise with sufficient accuracy. In addition, the measured values were also used to calculate the equivalent noise level by means of classical methods, and a comparative analysis is given. The results clearly show that the ANN approach is superior in traffic noise level prediction to any other statistical method. - Highlights: • We proposed an ANN model for prediction of traffic noise. • We developed an originally designed user friendly software package. • The results are compared with classical statistical methods. • The results show the much better predictive capability of the ANN model.
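
    The regression-versus-network comparison can be sketched on synthetic data (my own toy nonlinearity, not the paper's traffic measurements or software): a linear fit cannot capture curvature in the input-output relation, while a small one-hidden-layer network can.

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(-3, 3, 200)[:, None]      # stand-in predictor (e.g. traffic flow)
y = np.sin(x) + rng.normal(scale=0.1, size=x.shape)  # nonlinear noise response

# Linear regression baseline (with intercept).
X = np.hstack([x, np.ones_like(x)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
mse_lin = float(np.mean((X @ coef - y) ** 2))

# One-hidden-layer tanh network trained by full-batch gradient descent.
h, lr = 20, 0.01
W1 = 0.5 * rng.normal(size=(1, h)); b1 = np.zeros(h)
W2 = 0.5 * rng.normal(size=(h, 1)); b2 = np.zeros(1)
for _ in range(30000):
    a = np.tanh(x @ W1 + b1)
    pred = a @ W2 + b2
    g = 2.0 * (pred - y) / len(x)            # dMSE/dpred
    gh = (g @ W2.T) * (1.0 - a**2)           # backprop through tanh
    W2 -= lr * a.T @ g;  b2 -= lr * g.sum(axis=0)
    W1 -= lr * x.T @ gh; b1 -= lr * gh.sum(axis=0)

mse_nn = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))
print(mse_lin, mse_nn)   # the network attains the lower error
```

    The gap between the two errors is the quantitative sense in which the ANN "describes the trends of noise" better than a regression line.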

  8. Quantum-Like Representation of Non-Bayesian Inference

    NASA Astrophysics Data System (ADS)

    Asano, M.; Basieva, I.; Khrennikov, A.; Ohya, M.; Tanaka, Y.

    2013-01-01

    This research is related to the problem of "irrational decision making or inference" that has been discussed in cognitive psychology. There are some experimental studies, and their statistical data cannot be described by classical probability theory. The process of decision making generating these data cannot be reduced to classical Bayesian inference. For this problem, a number of quantum-like cognitive models of decision making have been proposed. Our previous work represented classical Bayesian inference in a natural way in the framework of quantum mechanics. Using this representation, in this paper we discuss the non-Bayesian (irrational) inference that is biased by effects like quantum interference. Further, we describe the "psychological factor" disturbing "rationality" as an "environment" correlating with the "main system" of the usual Bayesian inference.

  9. Non-classical Signature of Parametric Fluorescence and its Application in Metrology

    NASA Astrophysics Data System (ADS)

    Hamar, M.; Michálek, V.; Pathak, A.

    2014-08-01

    The article provides a short theoretical background on what non-classical light means. We apply the criterion for the existence of non-classical effects derived by C.T. Lee to parametric fluorescence. The criterion was originally derived for the study of two light beams with one mode per beam. We checked through numerical simulations whether the criterion still works for two multimode beams of parametric down-conversion. The theoretical results were tested by measurement of the photon number statistics of twin beams emitted by a nonlinear BBO crystal pumped by an intense femtosecond UV pulse. We used an ICCD camera as the detector of photons in both beams. It appears that the criterion can be used for the measurement of the quantum efficiencies of ICCD cameras.

  10. Introduction

    NASA Astrophysics Data System (ADS)

    Cohen, E. G. D.

    Lecture notes are organized around the key word dissipation, while focusing on a presentation of modern theoretical developments in the study of irreversible phenomena. A broad cross-disciplinary perspective towards non-equilibrium statistical mechanics is backed by the general theory of nonlinear and complex dynamical systems. The classical-quantum intertwine and the semiclassical dissipative borderline issue (decoherence, "classical out of quantum") are included here. Special emphasis is put on links between the theory of classical and quantum dynamical systems (temporal disorder, dynamical chaos and transport processes) and central problems of non-equilibrium statistical mechanics, e.g. the connection between dynamics and thermodynamics, relaxation towards equilibrium states, and mechanisms capable of driving and then maintaining a physical system far from equilibrium, in a non-equilibrium steady (stationary) state. The notion of an equilibrium state - towards which a system naturally evolves if left undisturbed - is a fundamental concept of equilibrium statistical mechanics. Taken as a primitive point of reference, it gives an unambiguous status to near-equilibrium and far-from-equilibrium systems, together with the dynamical notion of relaxation (decay) towards a prescribed asymptotic invariant measure or probability distribution (properties of ergodicity and mixing are implicit). A related issue is to keep under control the process of driving a physical system away from an initial state of equilibrium and either keeping it in another (non-equilibrium) steady state or allowing it to restore the initial data (return back, relax). To this end various models of the environment (heat bath, reservoir, thermostat, measuring instrument, etc.) and of the environment-system coupling are analyzed.
The central theme of the book is the dynamics of dissipation and the various mechanisms responsible for the irreversible behaviour (transport properties) of open systems on the classical and quantum levels of description. A distinguishing feature of these lecture notes is that the microscopic foundations of irreversibility are investigated basically in terms of "small" systems, where the "system" and/or "environment" may have a finite (and small) number of degrees of freedom and may be bounded. This is to be contrasted with the customary understanding of statistical mechanics, which is regarded as referring to systems with a very large number of degrees of freedom. In fact, it is commonly accepted that the accumulation of effects due to many particles (of the order of Avogadro's number) is required for statistical mechanics reasoning. Yet those large numbers alone are not sufficient to account for transport properties. A helpful hint towards this conceptual turnover comes from the observation that for chaotic dynamical systems the random time evolution proves to be compatible with the underlying purely deterministic laws of motion. Chaotic features of classical dynamics already appear in systems with two degrees of freedom, and such systems need to be described in statistical terms if we wish to quantify the dynamics of relaxation towards an invariant ergodic measure. The relaxation towards equilibrium finds a statistical description through an analysis of statistical ensembles. This entails an extension of the range of validity of statistical mechanics to small classical systems. On the other hand, the dynamics of fluctuations in macroscopic dissipative systems (due to their molecular composition and thermal mobility) may justify characterizing such systems as chaotic.
That motivates attempts at understanding the role of microscopic chaos and various "chaotic hypotheses" in non-equilibrium transport phenomena - the dynamical systems approach is being pushed down to the level of atoms, molecules and complex matter constituents, whose natural substitutes are low-dimensional model subsystems (encompassing as well the mesoscopic "quantum chaos"). Along the way a number of questions are addressed, e.g.: is there, or what is the nature of, a connection between chaos (the modern theory of dynamical systems) and irreversible thermodynamics; can quantum chaos really explain some peculiar features of quantum transport? The answer in both cases is positive, modulo a careful discrimination between viewing dynamical chaos as a necessary or a sufficient basis for irreversibility. In those dynamical contexts, another key term, dynamical semigroups, refers to major technical tools appropriate for the "dissipative mathematics" modelling irreversible behaviour on the classical and quantum levels of description. Dynamical systems theory and "quantum chaos" research involve both a high level of mathematical sophistication and heavy computer "experimentation". One of the present volume's specific flavors is a tutorial access to quite advanced mathematical tools. They gradually penetrate the classical and quantum dynamical semigroup description, culminating in the noncommutative Brillouin zone construction as a prerequisite to understanding transport in aperiodic solids. The lecture notes are structured into chapters to give a better insight into the major conceptual streamlines. Chapter I is devoted to a discussion of non-equilibrium steady states and, through the so-called chaotic hypothesis combined with suitable fluctuation theorems, elucidates the role of the Sinai-Ruelle-Bowen distribution in both equilibrium and non-equilibrium statistical physics frameworks (E. G. D. Cohen). Links between dynamics and statistics (Boltzmann versus Tsallis) are also discussed.
Fluctuation relations and a survey of deterministic thermostats are given in the context of non-equilibrium steady states of fluids (L. Rondoni). The response of systems driven far from equilibrium is analyzed on the basis of a central assertion about the existence of a statistical representation in terms of an ensemble of dynamical realizations of the driving process. A non-equilibrium work relation is deduced for irreversible processes (C. Jarzynski). The survey of non-equilibrium steady states in the statistical mechanics of classical and quantum systems employs heat bath models and input from random matrix theory. The quantum heat bath analysis and the derivation of fluctuation-dissipation theorems are performed by means of the influence functional technique, adapted to solve quantum master equations (D. Kusnezov). Chapter II deals with the issue of relaxation and its dynamical theory in both classical and quantum contexts. The Pollicott-Ruelle resonance background for the exponential decay scenario is discussed for irreversible processes of diffusion in the Lorentz gas and multibaker models (P. Gaspard). The Pollicott-Ruelle theory reappears as a major inspiration in the survey of the behaviour of ensembles of chaotic systems, with a focus on model systems for which no rigorous results concerning the exponential decay of correlations in time are available (S. Fishman). The observation that non-equilibrium transport processes in simple classical chaotic systems can be described in terms of fractal structures developing in the system phase space links their formation and properties with the entropy production in the course of diffusion processes displaying a low-dimensional deterministic (chaotic) origin (J. R. Dorfman). Chapter III offers an introduction to the theory of dynamical semigroups. Asymptotic properties of Markov operators and Markov semigroups acting on the set of probability densities (the statistical ensemble notion is implicit) are analyzed.
Ergodicity, mixing, strong (complete) mixing and sweeping are discussed in the familiar setting of "noise, chaos and fractals" (R. Rudnicki). The next step comprises a passage to quantum dynamical semigroups and completely positive dynamical maps, with the ultimate goal of introducing a consistent framework for the analysis of irreversible phenomena in open quantum systems, where dissipation and decoherence are crucial concepts (R. Alicki). Friction and damping in the classical and quantum mechanics of finite dissipative systems are analyzed by means of Markovian quantum semigroups, with special emphasis on the issue of complete positivity (M. Fannes). Specific two-level model systems of elementary particle physics (kaons) and rudiments of neutron interferometry are employed to elucidate the distinction between positivity and complete positivity (F. Benatti). Quantization of the dynamics of stochastic models related to equilibrium Gibbs states results in dynamical maps which form quantum stochastic dynamical semigroups (W. A. Majewski). Chapter IV addresses diverse but deeply interrelated features of driven chaotic (mesoscopic) classical and quantum systems, their dissipative properties, and notions of quantum irreversibility, entanglement, dephasing and decoherence. A survey of non-perturbative quantum effects for open quantum systems is concluded by outlining the discrepancies between random matrix theory and non-perturbative semiclassical predictions (D. Cohen). As a useful supplement to the subject of bounded open systems, methods of quantum state control in a cavity (coherent versus incoherent dynamics and dissipation) are described for low-dimensional quantum systems (A. Buchleitner). The dynamics of open quantum systems can alternatively be described by means of a non-Markovian stochastic Schrödinger equation, jointly for an open system and its environment, which moves us beyond the Lindblad evolution scenario of Markovian dynamical semigroups.
Quantum Brownian motion is considered (W. Strunz). Chapter V enforces a conceptual transition from "small" to "large" systems, with emphasis on the irreversible thermodynamics of quantum transport. Typical features of the statistical mechanics of infinitely extended systems and of the dynamical (small) systems approach are described by means of representative examples of relaxation towards asymptotic steady states: a quantum one-dimensional lattice conductor and an open multibaker map (S. Tasaki). Dissipative transport in aperiodic solids is reviewed by invoking methods of noncommutative geometry. The anomalous Drude formula is derived. The occurrence of quantum chaos is discussed together with its main consequences (J. Bellissard). The chapter is concluded by a survey of scaling limits of the N-body Schrödinger quantum dynamics, where classical evolution equations of irreversible statistical mechanics (linear Boltzmann, Hartree, Vlasov) emerge "out of quantum". In particular, a scaling limit of one-body quantum dynamics with impurities (static random potential) and that of quantum dynamics with weakly coupled phonons are shown to yield the linear Boltzmann equation (L. Erdös). Various interrelations between chapters and individual lectures, plus detailed, fine-tuned information about the subject-matter coverage of the volume, can be recovered by examining an extensive index.

  11. Hearing the shape of the Ising model with a programmable superconducting-flux annealer.

    PubMed

    Vinci, Walter; Markström, Klas; Boixo, Sergio; Roy, Aidan; Spedalieri, Federico M; Warburton, Paul A; Severini, Simone

    2014-07-16

    Two objects can be distinguished if they have different measurable properties. Thus, distinguishability depends on the physics of the objects. In considering graphs, we revisit the Ising model as a framework to define physically meaningful spectral invariants. In this context, we introduce a family of refinements of the classical spectrum and consider the quantum partition function. We demonstrate that the energy spectrum of the quantum Ising Hamiltonian is a stronger invariant than the classical one without refinements. For the purpose of implementing the related physical systems, we perform experiments on a programmable annealer with superconducting flux technology. Departing from the paradigm of adiabatic computation, we take advantage of a noisy evolution of the device to generate statistics of low energy states. The graphs considered in the experiments have the same classical partition functions, but different quantum spectra. The data obtained from the annealer distinguish non-isomorphic graphs via information contained in the classical refinements of the functions but not via the differences in the quantum spectra.
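
    As a concrete illustration of the classical side of this construction, the zero-field Ising partition function of a small graph can be computed by brute-force enumeration. The sketch below (illustrative code, not the authors') uses two non-isomorphic trees on four vertices, the path P4 and the star K1,3, which share the same classical partition function; this is exactly the kind of degeneracy that the refinements and the quantum spectrum are meant to break:

```python
import itertools
import math

def ising_partition_function(edges, n, beta=1.0):
    """Z = sum over spin configurations of exp(-beta * H(s)),
    with H(s) = -sum_{(i,j) in edges} s_i * s_j (zero external field)."""
    total = 0.0
    for spins in itertools.product((-1, 1), repeat=n):
        energy = -sum(spins[i] * spins[j] for i, j in edges)
        total += math.exp(-beta * energy)
    return total

# Two non-isomorphic trees on 4 vertices:
path = [(0, 1), (1, 2), (2, 3)]   # P4
star = [(0, 1), (0, 2), (0, 3)]   # K1,3
print(ising_partition_function(path, 4), ising_partition_function(star, 4))
```

    For any tree with m edges the zero-field partition function collapses to 2(2 cosh β)^m, so all trees of equal size collide classically; telling them apart requires the refined or quantum invariants discussed in the abstract.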

  12. Minimized state complexity of quantum-encoded cryptic processes

    NASA Astrophysics Data System (ADS)

    Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.

    2016-05-01

    The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.

  13. Single-snapshot DOA estimation by using Compressed Sensing

    NASA Astrophysics Data System (ADS)

    Fortunati, Stefano; Grasso, Raffaele; Gini, Fulvio; Greco, Maria S.; LePage, Kevin

    2014-12-01

    This paper deals with the problem of estimating the directions of arrival (DOA) of multiple source signals from a single observation vector of array data. In particular, four estimation algorithms based on the theory of compressed sensing (CS) are analyzed: the classical ℓ1 minimization (or Least Absolute Shrinkage and Selection Operator, LASSO), the fast smoothed ℓ0 minimization, the Sparse Iterative Covariance-based Estimator (SPICE), and the Iterative Adaptive Approach for Amplitude and Phase Estimation (IAA-APES). Their statistical properties are investigated and compared with those of the classical Fourier beamformer (FB) in different simulated scenarios. We show that, unlike the classical FB, a CS-based beamformer (CSB) has some desirable properties typical of adaptive algorithms (e.g., Capon and MUSIC) even in the single-snapshot case. Particular attention is devoted to the super-resolution property. Theoretical arguments and simulation analysis provide evidence that a CS-based beamformer can achieve resolution beyond the classical Rayleigh limit. Finally, the theoretical findings are validated by processing a real sonar dataset.
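
    To make the single-snapshot CS formulation concrete, the sketch below poses DOA estimation as sparse recovery of a spatial spectrum over a grid of steering vectors and solves the LASSO problem with plain ISTA (iterative soft thresholding). This is an illustrative toy, not the paper's algorithms: the array size, grid, regularization weight and source angles are all assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
M = 16                                            # sensors, half-wavelength spacing
grid = np.deg2rad(np.arange(-90.0, 91.0, 1.0))    # 1-degree DOA grid
A = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(grid)))  # steering dictionary

# Single snapshot: two unit-amplitude sources at -20 and +23 degrees plus weak noise.
true_deg = [-20.0, 23.0]
y = sum(np.exp(1j * np.pi * np.arange(M) * np.sin(np.deg2rad(d))) for d in true_deg)
y = y + 0.05 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))

# ISTA for the LASSO problem min_x 0.5*||y - A x||^2 + lam*||x||_1,
# with complex soft thresholding as the proximal step.
lam = 1.0
step = 1.0 / np.linalg.norm(A, 2) ** 2            # 1 / Lipschitz constant of the gradient
x = np.zeros(A.shape[1], dtype=complex)
for _ in range(500):
    r = x + step * (A.conj().T @ (y - A @ x))     # gradient step
    x = np.maximum(np.abs(r) - step * lam, 0.0) * np.exp(1j * np.angle(r))

peaks = np.rad2deg(grid[np.abs(x) > 0.5 * np.abs(x).max()])
print(peaks)
```

    The recovered spectrum concentrates near the true angles from a single snapshot, which is the regime where the classical Fourier beamformer needs either more snapshots or wider angular separation.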

  14. Canonical partition functions: ideal quantum gases, interacting classical gases, and interacting quantum gases

    NASA Astrophysics Data System (ADS)

    Zhou, Chi-Chun; Dai, Wu-Sheng

    2018-02-01

    In statistical mechanics, for a system with a fixed number of particles, e.g. a finite-size system, strictly speaking the thermodynamic quantities need to be calculated in the canonical ensemble. Nevertheless, the calculation of the canonical partition function is difficult. In this paper, based on the mathematical theory of symmetric functions, we suggest a method for the calculation of the canonical partition function of ideal quantum gases, including ideal Bose, Fermi, and Gentile gases. Moreover, we express the canonical partition functions of interacting classical and quantum gases, given by the classical and quantum cluster expansion methods, in terms of Bell polynomials. The virial coefficients of ideal Bose, Fermi, and Gentile gases are calculated from the exact canonical partition function. The virial coefficients of interacting classical and quantum gases are calculated from the canonical partition function by using the expansion of the Bell polynomial, rather than from the grand canonical potential.
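
    The symmetric-function structure the authors exploit underlies a standard recursion for the ideal-gas canonical partition function (a form of Newton's identities relating power sums to the elementary and complete symmetric polynomials). The sketch below implements that textbook recursion for the Bose and Fermi cases; it is background illustration, not the paper's Gentile-gas machinery:

```python
import math

def canonical_Z(energies, N, beta, sign):
    """Canonical partition function of N ideal bosons (sign=+1) or fermions
    (sign=-1) via the symmetric-function recursion
    Z_N = (1/N) * sum_{k=1..N} sign**(k+1) * z(k*beta) * Z_{N-k},
    where z(b) = sum_i exp(-b * E_i) is the one-particle partition function."""
    def z(b):
        return sum(math.exp(-b * e) for e in energies)
    Z = [1.0]                       # Z_0 = 1
    for n in range(1, N + 1):
        Z.append(sum(sign ** (k + 1) * z(k * beta) * Z[n - k]
                     for k in range(1, n + 1)) / n)
    return Z[N]

E = [0.0, 1.0, 2.5]                 # an assumed three-level spectrum
print(canonical_Z(E, 2, 0.7, +1), canonical_Z(E, 2, 0.7, -1))
```

    For N = 2 the recursion reduces to Z₂ = (z(β)² ± z(2β))/2, i.e., a sum over unordered pairs of levels with (bosons) or without (fermions) double occupancy.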

  15. The modification of generalized uncertainty principle applied in the detection technique of femtosecond laser

    NASA Astrophysics Data System (ADS)

    Li, Ziyi

    2017-12-01

    The generalized uncertainty principle (GUP), also known as the generalized uncertainty relationship, is a modified form of the classical Heisenberg uncertainty principle in special cases. When we apply quantum gravity theories such as string theory, the theoretical results suggest that there should be a "minimum length of observation", which is about the size of the Planck scale (10⁻³⁵ m). Taking this basic scale of existence into account, we need to fix a new common form of the Heisenberg uncertainty principle in the thermodynamic system and make effective corrections to statistical physics questions concerning the quantum density of states. Especially at high temperatures and high energy levels, generalized uncertainty calculations have a disruptive impact on classical statistical physics theories, but the present theory of the femtosecond laser is still established on the classical Heisenberg uncertainty principle. In order to improve the detection accuracy and temporal resolution of the femtosecond laser, we applied the modified form of the generalized uncertainty principle to the wavelength, energy and pulse time of the femtosecond laser in our work. We designed three typical systems, from micro to macro size, to estimate the feasibility of our theoretical model and method, respectively in a chemical solution, a crystal lattice and a nuclear fission reactor.

  16. Dielectric properties of classical and quantized ionic fluids.

    PubMed

    Høye, Johan S

    2010-06-01

    We study time-dependent correlation functions of classical and quantum gases using methods of equilibrium statistical mechanics, for systems of uniform as well as nonuniform densities. The basis of our approach is the path integral formalism of quantum mechanical systems. With this approach the statistical mechanics of a quantum mechanical system becomes equivalent to a classical polymer problem in four dimensions, where imaginary time is the fourth dimension. Several nontrivial results for quantum systems have been obtained earlier by this analogy. Here we focus on the presence of a time-dependent electromagnetic pair interaction, in which the electromagnetic vector potential, which depends upon the currents, is present. Thus both density and current correlations are needed to evaluate the influence of this interaction. We then utilize the fact that densities and currents can be expressed through polarizations, by which the ionic fluid can be regarded as a dielectric one for which a nonlocal susceptibility is found. A consequence of this nonlocality is that we find no contribution from a possible transverse electric zero-frequency mode to the Casimir force between metallic plates. Further, we establish expressions for a leading correction to ab initio calculations of the energies of the quantized electrons of molecules, where retardation effects are now also taken into account.

  17. Nonparametric functional data estimation applied to ozone data: prediction and extreme value analysis.

    PubMed

    Quintela-del-Río, Alejandro; Francisco-Fernández, Mario

    2011-02-01

    The study of extreme values and prediction of ozone data is an important topic of research when dealing with environmental problems. Classical extreme value theory is usually used in air-pollution studies. It consists in fitting a parametric generalised extreme value (GEV) distribution to a data set of extreme values, and using the estimated distribution to compute return levels and other quantities of interest. Here, we propose to estimate these values using nonparametric functional data methods. Functional data analysis is a relatively new statistical methodology that generally deals with data consisting of curves or multi-dimensional variables. In this paper, we use this technique, jointly with nonparametric curve estimation, to provide alternatives to the usual parametric statistical tools. The nonparametric estimators are applied to real samples of maximum ozone values obtained from several monitoring stations belonging to the Automatic Urban and Rural Network (AURN) in the UK. The results show that nonparametric estimators work satisfactorily, outperforming the behaviour of classical parametric estimators. Functional data analysis is also used to predict stratospheric ozone concentrations. We show an application, using the data set of mean monthly ozone concentrations in Arosa, Switzerland, and the results are compared with those obtained by classical time series (ARIMA) analysis.
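
    The parametric baseline that the nonparametric estimators are compared against can be sketched as follows, in the Gumbel (shape ξ = 0) special case of the GEV and with simple moment estimators rather than the maximum-likelihood fits used in practice; the data here are synthetic, not the AURN or Arosa series:

```python
import math
import random

def gumbel_fit_moments(maxima):
    """Moment estimators for the Gumbel distribution (GEV with shape xi = 0):
    scale = s * sqrt(6) / pi, loc = mean - gamma * scale
    (gamma = 0.5772... is the Euler-Mascheroni constant)."""
    n = len(maxima)
    mean = sum(maxima) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in maxima) / (n - 1))
    scale = s * math.sqrt(6.0) / math.pi
    return mean - 0.5772156649 * scale, scale

def return_level(loc, scale, m):
    """Level exceeded on average once every m blocks: the (1 - 1/m) quantile
    of the fitted Gumbel law."""
    return loc - scale * math.log(-math.log(1.0 - 1.0 / m))

# Synthetic 'annual maxima' drawn from a Gumbel(loc=80, scale=8) law:
random.seed(1)
maxima = [80.0 - 8.0 * math.log(-math.log(random.random())) for _ in range(200)]
loc, scale = gumbel_fit_moments(maxima)
print(loc, scale, return_level(loc, scale, 10))
```

    The fitted 10-block return level sits near the true quantile 80 − 8 ln(−ln 0.9) ≈ 98; the nonparametric alternatives in the paper aim to reach such quantities without committing to the GEV form.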

  18. Playing-related disabling musculoskeletal disorders in young and adult classical piano students.

    PubMed

    Bruno, S; Lorusso, A; L'Abbate, N

    2008-07-01

    To determine the prevalence of instrument-related musculoskeletal problems in classical piano students and to investigate piano-specific risk factors, a specially developed four-part questionnaire was administered to classical piano students of two Apulian conservatories in southern Italy. A cross-sectional design was used. Prevalences of playing-related musculoskeletal disorders (MSDs) were calculated and cases were compared with non-cases. A total of 195 out of the 224 piano students responded (87%). Among the 195 responders, 75 (38.4%) were considered affected according to the pre-established criteria. Disabling MSDs showed similar prevalence rates for the neck (29.3%), thoracic spine (21.3%) and upper limbs (from 20.0 to 30.4%) in the affected group. Univariate analyses showed statistical differences in mean age, number of hours per week spent playing, more than 60 min of continuous playing without breaks, lack of sport practice, and acceptance of the "No pain, no gain" criterion between students with music-related pain and pianists not affected. A statistical correlation was found only between upper-limb disorders in pianists and hand size. No correlation with the model of piano played was found in the affected group. Multivariate analyses performed by logistic regression confirmed the independent correlation of the risk factors age, lack of sport practice and acceptance of the "No pain, no gain" criterion. Our study showed MSDs to be a common problem among classical piano students. At variance with several reported studies, older students appeared to be more frequently affected by disabling MSDs, and no difference in the prevalence rate of the disorders was found for females.

  19. Objective Dysphonia Quantification in Vocal Fold Paralysis: Comparing Nonlinear with Classical Measures

    PubMed Central

    Little, Max A.; Costello, Declan A. E.; Harries, Meredydd L.

    2010-01-01

    Clinical acoustic voice-recording analysis is usually performed using classical perturbation measures, including jitter, shimmer, and noise-to-harmonic ratios (NHRs). However, restrictive mathematical limitations of these measures prevent analysis for severely dysphonic voices. Previous studies of alternative nonlinear random measures addressed wide varieties of vocal pathologies. Here, we analyze a single vocal pathology cohort, testing the performance of these alternative measures alongside classical measures. We present voice analysis pre- and postoperatively in 17 patients with unilateral vocal fold paralysis (UVFP). The patients underwent standard medialization thyroplasty surgery, and the voices were analyzed using jitter, shimmer, NHR, nonlinear recurrence period density entropy (RPDE), detrended fluctuation analysis (DFA), and correlation dimension. In addition, we similarly analyzed 11 healthy controls. Systematizing the preanalysis editing of the recordings, we found that the novel measures were more stable and, hence, more reliable than the classical measures on healthy controls. RPDE and jitter are sensitive to improvements pre- to postoperation. Shimmer, NHR, and DFA showed no significant change (P > 0.05). All measures detect statistically significant and clinically important differences between controls and patients, both treated and untreated (P < 0.001, area under curve [AUC] > 0.7). Pre- to postoperation grade, roughness, breathiness, asthenia, and strain (GRBAS) ratings show statistically significant and clinically important improvement in overall dysphonia grade (G) (AUC = 0.946, P < 0.001). Recalculating AUCs from other study data, we compare these results in terms of clinical importance. We conclude that, when preanalysis editing is systematized, nonlinear random measures may be useful for monitoring UVFP-treatment effectiveness, and there may be applications to other forms of dysphonia. PMID:19900790
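
    The classical perturbation measures mentioned here have simple definitions; the sketch below gives the usual "local" jitter and shimmer formulas (cycle-to-cycle variability relative to the mean), which may differ in preprocessing details from the exact implementations used in the study:

```python
def local_jitter(periods):
    """Local jitter (%): mean absolute difference between consecutive
    glottal cycle lengths, divided by the mean cycle length."""
    diffs = [abs(a - b) for a, b in zip(periods, periods[1:])]
    return 100.0 * (sum(diffs) / len(diffs)) / (sum(periods) / len(periods))

def local_shimmer(amplitudes):
    """Local shimmer (%): the same construction applied to peak amplitudes."""
    diffs = [abs(a - b) for a, b in zip(amplitudes, amplitudes[1:])]
    return 100.0 * (sum(diffs) / len(diffs)) / (sum(amplitudes) / len(amplitudes))

# A nearly steady 5 ms voice period with small cycle-to-cycle perturbations:
periods = [5.0, 5.1, 4.95, 5.05, 5.0, 5.1]
print(local_jitter(periods))
```

    The mathematical limitation the abstract alludes to is visible here: both measures presuppose that cycle boundaries can be reliably extracted, which fails for severely dysphonic, nearly aperiodic voices.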

  20. Quantifying the statistical importance of utilizing regression over classic energy intensity calculations for tracking efficiency improvements in industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nimbalkar, Sachin U.; Wenning, Thomas J.; Guo, Wei

    In the United States, manufacturing facilities accounted for about 32% of total domestic energy consumption in 2014. Robust energy tracking methodologies are critical to understanding energy performance in manufacturing facilities. Due to its simplicity and intuitiveness, the classic energy intensity method (i.e., the ratio of total energy use to total production) is the most widely adopted. However, the classic energy intensity method does not take into account the variation of other relevant parameters (i.e., product type, feedstock type, weather, etc.). Furthermore, the energy intensity method assumes that the facility's base energy consumption (energy use at zero production) is zero, which rarely holds true. Therefore, it is commonly recommended to utilize regression models rather than the energy intensity approach for tracking improvements at the facility level. Unfortunately, many energy managers have difficulty understanding why regression models are statistically better than the classic energy intensity method. While anecdotes and qualitative information may convince some, many have major reservations about the accuracy of regression models and whether it is worth the time and effort to gather data and build quality regression models. This paper explains why regression models are theoretically and quantitatively more accurate for tracking energy performance improvements. Based on the analysis of data from 114 manufacturing plants over 12 years, this paper presents quantitative results on the importance of utilizing regression models over the energy intensity methodology. This paper also documents scenarios where regression models do not have significant relevance over the energy intensity method.
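
    The base-load argument can be made quantitative in a few lines: on synthetic monthly data with a fixed base load, a one-variable regression recovers the base and the marginal rate separately, while the classic intensity ratio mixes the two. All numbers below are invented for illustration, not taken from the 114-plant dataset:

```python
import random

def fit_line(production, energy):
    """Ordinary least squares for the model energy = base + rate * production."""
    n = len(production)
    mp = sum(production) / n
    me = sum(energy) / n
    rate = sum((p - mp) * (e - me) for p, e in zip(production, energy)) \
        / sum((p - mp) ** 2 for p in production)
    return me - rate * mp, rate          # (base load, marginal energy per unit)

# Synthetic monthly data: a 400 kWh base load plus 2 kWh per unit produced.
random.seed(0)
production = [random.uniform(50.0, 150.0) for _ in range(36)]
energy = [400.0 + 2.0 * p + random.gauss(0.0, 10.0) for p in production]

base, rate = fit_line(production, energy)
intensity = sum(energy) / sum(production)     # classic kWh-per-unit ratio

# The regression separates base load from marginal rate; the intensity ratio
# folds the base load into the per-unit figure and so drifts with volume
# even when efficiency is unchanged.
print(base, rate, intensity)
```

    With a nonzero base load the intensity ratio roughly equals rate + base/average production, so it improves when volume rises and worsens when volume falls, regardless of any real efficiency change.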

  1. A Gaussian wave packet phase-space representation of quantum canonical statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coughtrie, David J.; Tew, David P.

    2015-07-28

    We present a mapping of quantum canonical statistical averages onto a phase-space average over thawed Gaussian wave-packet (GWP) parameters, which is exact for harmonic systems at all temperatures. The mapping invokes an effective potential surface, experienced by the wave packets, and a temperature-dependent phase-space integrand, to correctly transition from the GWP average at low temperature to classical statistics at high temperature. Numerical tests on weakly and strongly anharmonic model systems demonstrate that thermal averages of the system energy and geometric properties are accurate to within 1% of the exact quantum values at all temperatures.
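
    The harmonic benchmark against which the mapping is exact has a closed-form thermal energy, which makes the low- and high-temperature limits described above easy to check; a minimal sketch in reduced units (ħ = k_B = 1), independent of the paper's GWP machinery:

```python
import math

def ho_thermal_energy(omega, T):
    """Exact canonical mean energy of a harmonic oscillator in reduced units
    (hbar = k_B = 1):  E = (omega / 2) * coth(omega / (2 * T)).
    Approaches the zero-point energy omega/2 as T -> 0 and the classical
    equipartition value T when T >> omega."""
    return (omega / 2.0) / math.tanh(omega / (2.0 * T))

print(ho_thermal_energy(1.0, 0.1), ho_thermal_energy(1.0, 100.0))
```

    Any phase-space mapping that is exact for harmonic systems must reproduce this curve at every temperature, including the quantum-to-classical crossover near T ≈ ω.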

  2. [Bayesian statistics in medicine -- part II: main applications and inference].

    PubMed

    Montomoli, C; Nichelatti, M

    2008-01-01

    Bayesian statistics is not only used when one is dealing with 2-way tables; it can also be used for inferential purposes. Using the basic concepts presented in the first part, this paper aims to give a simple overview of Bayesian methods by introducing their foundation (Bayes' theorem) and then applying this rule to a very simple practical example; whenever possible, the elementary processes underlying the analysis are compared to those of frequentist (classical) statistical analysis. Bayesian reasoning is naturally connected to medical activity, since it appears to be quite similar to a diagnostic process.
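
    The diagnostic analogy can be made concrete with the standard textbook application of Bayes' theorem to a screening test; the prevalence, sensitivity and specificity below are invented for illustration:

```python
def posterior_positive(prevalence, sensitivity, specificity):
    """Bayes' theorem for a diagnostic test:
    P(D | +) = P(+ | D) P(D) / [P(+ | D) P(D) + P(+ | not D) P(not D)]."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# A 95%-sensitive, 90%-specific test for a condition with 2% prevalence:
print(posterior_positive(0.02, 0.95, 0.90))
```

    Despite the accurate test, a positive result here yields only about a 16% posterior probability of disease, because the 2% prior dominates; this prior-to-posterior updating is exactly the inferential step the paper builds on.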

  3. Models of dyadic social interaction.

    PubMed Central

    Griffin, Dale; Gonzalez, Richard

    2003-01-01

    We discuss the logic of research designs for dyadic interaction and present statistical models with parameters that are tied to psychologically relevant constructs. Building on Karl Pearson's classic nineteenth-century statistical analysis of within-organism similarity, we describe several approaches to indexing dyadic interdependence and provide graphical methods for visualizing dyadic data. We also describe several statistical and conceptual solutions to the 'levels of analysis' problem in analysing dyadic data. These analytic strategies allow the researcher to examine and measure psychological questions of interdependence and social influence. We provide illustrative data from casually interacting and romantic dyads. PMID:12689382

  4. Fisher, Neyman, and Bayes at FDA.

    PubMed

    Rubin, Donald B

    2016-01-01

    The wise use of statistical ideas in practice essentially requires some Bayesian thinking, in contrast to the classical rigid frequentist dogma. This dogma too often has seemed to influence the applications of statistics, even at agencies like the FDA. Greg Campbell was one of the most important advocates there for more nuanced modes of thought, especially Bayesian statistics. Because two brilliant statisticians, Ronald Fisher and Jerzy Neyman, are often credited with instilling the traditional frequentist approach in current practice, I argue that both men were actually seeking very Bayesian answers, and neither would have endorsed the rigid application of their ideas.

  5. Classical boson sampling algorithms with superior performance to near-term experiments

    NASA Astrophysics Data System (ADS)

    Neville, Alex; Sparrow, Chris; Clifford, Raphaël; Johnston, Eric; Birchall, Patrick M.; Montanaro, Ashley; Laing, Anthony

    2017-12-01

    It is predicted that quantum computers will dramatically outperform their conventional counterparts. However, large-scale universal quantum computers are yet to be built. Boson sampling is a rudimentary quantum algorithm tailored to the platform of linear optics, which has sparked interest as a rapid way to demonstrate such quantum supremacy. Photon statistics are governed by intractable matrix functions, which suggests that sampling from the distribution obtained by injecting photons into a linear optical network could be solved more quickly by a photonic experiment than by a classical computer. The apparently low resource requirements for large boson sampling experiments have raised expectations of a near-term demonstration of quantum supremacy by boson sampling. Here we present classical boson sampling algorithms and theoretical analyses of prospects for scaling boson sampling experiments, showing that near-term quantum supremacy via boson sampling is unlikely. Our classical algorithm, based on Metropolised independence sampling, allowed the boson sampling problem to be solved for 30 photons with standard computing hardware. Compared to current experiments, a demonstration of quantum supremacy over a successful implementation of these classical methods on a supercomputer would require the number of photons and experimental components to increase by orders of magnitude, while tackling exponentially scaling photon loss.
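
    The "intractable matrix functions" governing photon statistics are matrix permanents; Ryser's formula, sketched below, is the classic exact algorithm and makes the exponential cost explicit. This is background illustration, not the Metropolised independence sampler of the paper:

```python
def permanent_ryser(A):
    """Matrix permanent via Ryser's formula:
    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^|S| * prod_i sum_{j in S} A[i][j].
    Cost O(2^n * n^2): exact evaluation is exponential in the matrix size,
    which for boson sampling is the photon number."""
    n = len(A)
    total = 0.0
    for mask in range(1, 1 << n):                 # nonempty column subsets as bitmasks
        size = bin(mask).count("1")
        prod = 1.0
        for row in A:
            prod *= sum(row[j] for j in range(n) if mask >> j & 1)
        total += (-1) ** size * prod
    return (-1) ** n * total

print(permanent_ryser([[1.0, 2.0], [3.0, 4.0]]))   # 1*4 + 2*3
```

    Even with Ryser's 2^n savings over the naive n! expansion, exact permanents beyond a few tens of photons are out of reach, which is why approximate sampling schemes such as the one in this paper are needed at all.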

  6. Universal scaling for the quantum Ising chain with a classical impurity

    NASA Astrophysics Data System (ADS)

    Apollaro, Tony J. G.; Francica, Gianluca; Giuliano, Domenico; Falcone, Giovanni; Palma, G. Massimo; Plastina, Francesco

    2017-10-01

    We study finite-size scaling for the magnetic observables of an impurity residing at the end point of an open quantum Ising chain with transverse magnetic field, realized by locally rescaling the field by a factor μ ≠ 1. In the homogeneous chain limit at μ = 1, we find the expected finite-size scaling for the longitudinal impurity magnetization, with no specific scaling for the transverse magnetization. At variance, in the classical impurity limit μ = 0, we recover finite-size scaling for the longitudinal magnetization, while the transverse one basically does not scale. We provide both approximate analytic expressions for the magnetization and the susceptibility, as well as numerical evidence for the scaling behavior. At intermediate values of μ, finite-size scaling is violated, and we provide a possible explanation of this result in terms of the appearance of a second, impurity-related length scale. Finally, by going along the standard quantum-to-classical mapping between statistical models, we derive the classical counterpart of the quantum Ising chain with an end-point impurity as a classical Ising model on a square lattice wrapped on a half-infinite cylinder, with the links along the first circle modified as a function of μ.

  7. Statistical speed of quantum states: Generalized quantum Fisher information and Schatten speed

    NASA Astrophysics Data System (ADS)

    Gessner, Manuel; Smerzi, Augusto

    2018-02-01

    We analyze families of measures for the quantum statistical speed which include as special cases the quantum Fisher information, the trace speed, i.e., the quantum statistical speed obtained from the trace distance, and more general quantifiers obtained from the family of Schatten norms. These measures quantify the statistical speed under generic quantum evolutions and are obtained by maximizing classical measures over all possible quantum measurements. We discuss general properties, optimal measurements, and upper bounds on the speed of separable states. We further provide a physical interpretation for the trace speed by linking it to an analog of the quantum Cramér-Rao bound for median-unbiased quantum phase estimation.
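The quantities discussed (Schatten norms and the trace distance they induce) can be sketched concretely. This is a minimal illustration with my own function names and a standard qubit example, not the paper's formalism:

```python
import numpy as np

def schatten_norm(M, p):
    """Schatten p-norm: the l_p norm of the singular values of M."""
    s = np.linalg.svd(M, compute_uv=False)
    return float(np.sum(s ** p) ** (1.0 / p))

def trace_distance(rho, sigma):
    """Trace distance = half the Schatten-1 (trace) norm of the difference."""
    return 0.5 * schatten_norm(rho - sigma, 1)

def density(bloch):
    """Qubit density matrix rho = (I + r.sigma)/2 from a Bloch vector."""
    rx, ry, rz = bloch
    return 0.5 * np.array([[1.0 + rz, rx - 1j * ry],
                           [rx + 1j * ry, 1.0 - rz]])
```

For qubits the trace distance equals half the Euclidean distance between Bloch vectors, so orthogonal pure states are at the maximal distance 1.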

  8. Quantum work in the Bohmian framework

    NASA Astrophysics Data System (ADS)

    Sampaio, R.; Suomela, S.; Ala-Nissila, T.; Anders, J.; Philbin, T. G.

    2018-01-01

    At nonzero temperature classical systems exhibit statistical fluctuations of thermodynamic quantities arising from the variation of the system's initial conditions and its interaction with the environment. The fluctuating work, for example, is characterized by the ensemble of system trajectories in phase space and, by including the probabilities for various trajectories to occur, a work distribution can be constructed. However, without phase-space trajectories, the task of constructing a work probability distribution in the quantum regime has proven elusive. Here we use quantum trajectories in phase space and define fluctuating work as power integrated along the trajectories, in complete analogy to classical statistical physics. The resulting work probability distribution is valid for any quantum evolution, including cases with coherences in the energy basis. We demonstrate the quantum work probability distribution and its properties with an exactly solvable example of a driven quantum harmonic oscillator. An important feature of the work distribution is its dependence on the initial statistical mixture of pure states, which is reflected in higher moments of the work. The proposed approach introduces a fundamentally different perspective on quantum thermodynamics, allowing full thermodynamic characterization of the dynamics of quantum systems, including the measurement process.

  9. Classical subjective expected utility.

    PubMed

    Cerreia-Vioglio, Simone; Maccheroni, Fabio; Marinacci, Massimo; Montrucchio, Luigi

    2013-04-23

    We consider decision makers who know that payoff-relevant observations are generated by a process that belongs to a given class M, as postulated in Wald [Wald A (1950) Statistical Decision Functions (Wiley, New York)]. We incorporate this Waldean piece of objective information within an otherwise subjective setting à la Savage [Savage LJ (1954) The Foundations of Statistics (Wiley, New York)] and show that this leads to a two-stage subjective expected utility model that accounts for both state and model uncertainty.

  10. Specificity and timescales of cortical adaptation as inferences about natural movie statistics.

    PubMed

    Snow, Michoel; Coen-Cagli, Ruben; Schwartz, Odelia

    2016-10-01

    Adaptation is a phenomenological umbrella term under which a variety of temporal contextual effects are grouped. Previous models have shown that some aspects of visual adaptation reflect optimal processing of dynamic visual inputs, suggesting that adaptation should be tuned to the properties of natural visual inputs. However, the link between natural dynamic inputs and adaptation is poorly understood. Here, we extend a previously developed Bayesian modeling framework for spatial contextual effects to the temporal domain. The model learns temporal statistical regularities of natural movies and links these statistics to adaptation in primary visual cortex via divisive normalization, a ubiquitous neural computation. In particular, the model divisively normalizes the present visual input by the past visual inputs only to the degree that these are inferred to be statistically dependent. We show that this flexible form of normalization reproduces classical findings on how brief adaptation affects neuronal selectivity. Furthermore, prior knowledge acquired by the Bayesian model from natural movies can be modified by prolonged exposure to novel visual stimuli. We show that this updating can explain classical results on contrast adaptation. We also simulate the recent finding that adaptation maintains population homeostasis, namely, a balanced level of activity across a population of neurons with different orientation preferences. Consistent with previous disparate observations, our work further clarifies the influence of stimulus-specific and neuronal-specific normalization signals in adaptation.

  11. Specificity and timescales of cortical adaptation as inferences about natural movie statistics

    PubMed Central

    Snow, Michoel; Coen-Cagli, Ruben; Schwartz, Odelia

    2016-01-01

    Adaptation is a phenomenological umbrella term under which a variety of temporal contextual effects are grouped. Previous models have shown that some aspects of visual adaptation reflect optimal processing of dynamic visual inputs, suggesting that adaptation should be tuned to the properties of natural visual inputs. However, the link between natural dynamic inputs and adaptation is poorly understood. Here, we extend a previously developed Bayesian modeling framework for spatial contextual effects to the temporal domain. The model learns temporal statistical regularities of natural movies and links these statistics to adaptation in primary visual cortex via divisive normalization, a ubiquitous neural computation. In particular, the model divisively normalizes the present visual input by the past visual inputs only to the degree that these are inferred to be statistically dependent. We show that this flexible form of normalization reproduces classical findings on how brief adaptation affects neuronal selectivity. Furthermore, prior knowledge acquired by the Bayesian model from natural movies can be modified by prolonged exposure to novel visual stimuli. We show that this updating can explain classical results on contrast adaptation. We also simulate the recent finding that adaptation maintains population homeostasis, namely, a balanced level of activity across a population of neurons with different orientation preferences. Consistent with previous disparate observations, our work further clarifies the influence of stimulus-specific and neuronal-specific normalization signals in adaptation. PMID:27699416

  12. Influence of complaints and singing style in singers voice handicap.

    PubMed

    Moreti, Felipe; Ávila, Maria Emília Barros de; Rocha, Clara; Borrego, Maria Cristina de Menezes; Oliveira, Gisele; Behlau, Mara

    2012-01-01

The aim of this research was to verify whether differences in singing style and the presence of vocal complaints influence singers' perception of voice handicap. One hundred eighteen singing voice handicap self-assessment protocols were selected: 17 popular singers with vocal complaints, 42 popular singers without complaints, 17 classical singers with complaints, and 42 classical singers without complaints. The groups were similar regarding age, gender, and voice types. Both protocols used--the Modern Singing Handicap Index (MSHI) and the Classical Singing Handicap Index (CSHI)--have questions specific to their respective singing styles, and consist of 30 items equally divided into three subscales: disability (functional domain), handicap (emotional domain), and impairment (organic domain), answered according to frequency of occurrence. Each subscale has a maximum of 40 points, and the total score is 120 points; the higher the score, the greater the perceived singing voice handicap. For statistical analysis, we used the ANOVA test with a 5% significance level. Classical and popular singers reported greater impairment, followed by disability and handicap. However, the degree of this perception varied with singing style and the presence of vocal complaints. Classical singers with vocal complaints showed a higher voice handicap than popular singers with vocal complaints, while classical singers without complaints reported a lower handicap than popular singers without complaints. This indicates that classical singers have a higher perception of their own voice, and that vocal disturbances in this group may cause a greater voice handicap than in popular singers.

  13. Quantum walks with tuneable self-avoidance in one dimension

    PubMed Central

    Camilleri, Elizabeth; Rohde, Peter P.; Twamley, Jason

    2014-01-01

Quantum walks exhibit many unique characteristics compared to classical random walks. In the classical setting, self-avoiding random walks have been studied as a variation on the usual classical random walk. Here the walker has memory of its previous locations and preferentially avoids stepping back to locations where it has previously resided. Classical self-avoiding random walks have found numerous algorithmic applications, most notably in the modelling of protein folding. We consider the analogous problem in the quantum setting – a quantum walk in one dimension with tunable levels of self-avoidance. We complement a quantum walk with a memory register that records where the walker has previously resided. The walker is then able to avoid returning to previously visited sites or apply more general memory-conditioned operations to control the walk. We characterise this walk by examining the variance of the walker's distribution against time, the standard metric for quantifying how quantum or classical a walk is. We parameterise the strength of the memory recording and the strength of the memory back-action on the walker, and investigate their effect on the dynamics of the walk. We find that by manipulating these parameters, which dictate the degree of self-avoidance, the walk can be made to reproduce ideal quantum or classical random walk statistics, or a plethora of more elaborate diffusive phenomena. In some parameter regimes we observe a close correspondence between classical self-avoiding random walks and the quantum self-avoiding walk. PMID:24762398
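The variance diagnostic mentioned above distinguishes quantum (ballistic, variance ~ t^2) from classical (diffusive, variance ~ t) walk statistics. A minimal sketch, assuming a standard Hadamard coin and no memory register (this illustrates only the memoryless limit, not the authors' self-avoiding construction):

```python
import numpy as np

def hadamard_walk(steps):
    """Discrete-time quantum walk on a line with a Hadamard coin.

    amp[x, c] is the amplitude at position index x with coin state c;
    returns the final position probability distribution."""
    n = 2 * steps + 1
    amp = np.zeros((n, 2), dtype=complex)
    amp[steps, 0] = 1.0  # walker at the origin, coin |0>
    H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
    for _ in range(steps):
        amp = amp @ H.T                 # apply the coin at every site
        shifted = np.zeros_like(amp)
        shifted[1:, 0] = amp[:-1, 0]    # coin |0> steps right
        shifted[:-1, 1] = amp[1:, 1]    # coin |1> steps left
        amp = shifted
    return (np.abs(amp) ** 2).sum(axis=1)

def variance(dist, steps):
    """Variance of the position distribution over sites -steps..steps."""
    x = np.arange(-steps, steps + 1)
    mean = (dist * x).sum()
    return float((dist * x ** 2).sum() - mean ** 2)
```

After t steps the quantum variance grows like t^2, far above the classical diffusive value of t for an unbiased random walk.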

  14. Classical molecular dynamics simulations for non-equilibrium correlated plasmas

    NASA Astrophysics Data System (ADS)

    Ferri, S.; Calisti, A.; Talin, B.

    2017-03-01

    A classical molecular dynamics model was recently extended to simulate neutral multi-component plasmas where various charge states of the same atom and electrons coexist. It is used to investigate the plasma effects on the ion charge and on the ionization potential in dense plasmas. Different simulated statistical properties will show that the concept of isolated particles is lost in such correlated plasmas. The charge equilibration is discussed for a carbon plasma at solid density and investigation on the charge distribution and on the ionization potential depression (IPD) for aluminum plasmas is discussed with reference to existing experiments.

  15. Signatures of chaos in the Brillouin zone.

    PubMed

    Barr, Aaron; Barr, Ariel; Porter, Max D; Reichl, Linda E

    2017-10-01

    When the classical dynamics of a particle in a finite two-dimensional billiard undergoes a transition to chaos, the quantum dynamics of the particle also shows manifestations of chaos in the form of scarring of wave functions and changes in energy level spacing distributions. If we "tile" an infinite plane with such billiards, we find that the Bloch states on the lattice undergo avoided crossings, energy level spacing statistics change from Poisson-like to Wigner-like, and energy sheets of the Brillouin zone begin to "mix" as the classical dynamics of the billiard changes from regular to chaotic behavior.
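The Poisson-to-Wigner transition in level-spacing statistics can be illustrated with the two limiting spacing densities. A sketch with my own function names; the Wigner surmise is the standard closed-form approximation to the exact GOE result:

```python
import numpy as np

def wigner_surmise(s):
    """GOE nearest-neighbour spacing density (level repulsion, chaotic case)."""
    return (np.pi / 2.0) * s * np.exp(-np.pi * s ** 2 / 4.0)

def poisson_spacing(s):
    """Spacing density for uncorrelated levels (regular/integrable case)."""
    return np.exp(-s)

def trapezoid(y, x):
    """Simple trapezoid rule on a uniform grid."""
    dx = x[1] - x[0]
    return float((y[:-1] + y[1:]).sum() * dx / 2.0)
```

Both densities are normalized with unit mean spacing; the qualitative difference is at s = 0, where the Wigner surmise vanishes (level repulsion) while the Poisson law does not.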

  16. ASSESSING THE IMPACTS OF ANTHROPOGENIC STRESSORS ON MACROINVERTEBRATE INDICATORS IN OHIO

    EPA Science Inventory

    In the past few years, there has been increasing interest in using biological community data to provide information about specific anthropogenic factors impacting streams. Previous studies have used statistical approaches that are variants of classical and modern multiple regres...

  17. Performance Characterization of an Instrument.

    ERIC Educational Resources Information Center

    Salin, Eric D.

    1984-01-01

Describes an experiment designed to teach students to apply the same statistical awareness to instrumentation that they commonly apply to classical techniques. Uses propagation-of-error techniques to pinpoint instrumental limitations and breakdowns and to demonstrate capabilities and limitations of volumetric and gravimetric methods. Provides lists of…

  18. Analytic Methods for Adjusting Subjective Rating Schemes.

    ERIC Educational Resources Information Center

    Cooper, Richard V. L.; Nelson, Gary R.

    Statistical and econometric techniques of correcting for supervisor bias in models of individual performance appraisal were developed, using a variant of the classical linear regression model. Location bias occurs when individual performance is systematically overestimated or underestimated, while scale bias results when raters either exaggerate…

  19. Polymer Principles in the Undergraduate Physical Chemistry Course. Part 2.

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 1985

    1985-01-01

Part 1 (SE 538 305) covered application of classical thermodynamics, polymer crystallinity, and phase diagrams to teaching physical chemistry. This part covers statistical thermodynamics, conformation, molecular weights, rubber elasticity and viscoelasticity, and kinetics of polymerization. Eight polymer-oriented, multiple-choice test questions…

  20. Semi-classical statistical description of Fröhlich condensation.

    PubMed

    Preto, Jordane

    2017-06-01

Fröhlich's model equations describing phonon condensation in open systems of biological relevance are reinvestigated within a semi-classical statistical framework. The main assumptions needed to deduce Fröhlich's rate equations are identified, and it is shown how they lead us to write an appropriate form for the corresponding master equation. It is shown how solutions of the master equation can be computed numerically and can highlight typical features of the condensation effect. Our approach provides much more information than existing ones, as it allows us to investigate the time evolution of the probability density function instead of following single averaged quantities. The current work is also motivated, on the one hand, by recent experimental evidence of long-lived excited modes in the protein structure of hen-egg-white lysozyme, which were reported as a consequence of the condensation effect, and, on the other hand, by a growing interest in investigating long-range effects of electromagnetic origin and their influence on the dynamics of biochemical reactions.

  1. A Monte Carlo–Based Bayesian Approach for Measuring Agreement in a Qualitative Scale

    PubMed Central

    Pérez Sánchez, Carlos Javier

    2014-01-01

Agreement analysis has been an active research area whose techniques have been widely applied in psychology and other fields. However, statistical agreement among raters has been mainly considered from a classical statistics point of view. Bayesian methodology is a viable alternative that allows the inclusion of subjective initial information coming from expert opinions, personal judgments, or historical data. A Bayesian approach is proposed by providing a unified Monte Carlo–based framework to estimate all types of measures of agreement in a qualitative scale of response. The approach is conceptually simple and has a low computational cost. Both informative and non-informative scenarios are considered. In case no initial information is available, the results are in line with the classical methodology, but provide more information on the measures of agreement. For the informative case, some guidelines are presented to elicit the prior distribution. The approach has been applied to two applications related to schizophrenia diagnosis and sensory analysis. PMID:29881002
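A minimal sketch of the kind of Monte Carlo Bayesian agreement estimate described, here specialized to Cohen's kappa with a Dirichlet prior over the contingency-table cells. This is an assumed simplification of the paper's unified framework; the function names are my own:

```python
import numpy as np

def kappa(p):
    """Cohen's kappa from a k x k joint probability table for two raters."""
    po = np.trace(p)                        # observed agreement
    pe = p.sum(axis=1) @ p.sum(axis=0)      # chance agreement from the margins
    return (po - pe) / (1.0 - pe)

def posterior_kappa(counts, draws=20000, prior=1.0, seed=0):
    """Monte Carlo posterior of kappa: Dirichlet(counts + prior) on the cells."""
    rng = np.random.default_rng(seed)
    alpha = counts.flatten() + prior
    samples = rng.dirichlet(alpha, size=draws)
    return np.array([kappa(s.reshape(counts.shape)) for s in samples])
```

With a flat prior (prior = 1.0) the posterior concentrates near the classical point estimate, while a subject-matter prior can be encoded by choosing other pseudo-counts.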

2. Book Review: Maxwell's Demon 2: Entropy, classical and quantum information, computing. Harvey Leff and Andrew Rex (Eds.); Institute of Physics, Bristol, 2003, 500pp., US$55, ISBN 0750307595

    NASA Astrophysics Data System (ADS)

    Shenker, Orly R.

    2004-09-01

In 1867, James Clerk Maxwell proposed a perpetuum mobile of the second kind, that is, a counterexample to the Second Law of thermodynamics, which came to be known as "Maxwell's Demon." Unlike any other perpetual motion machine, this one escaped attempts by the best scientists and philosophers to show that the Second Law or its statistical mechanical counterparts are universal after all. "Maxwell's demon lives on. After more than 130 years of uncertain life and at least two pronouncements of death, this fanciful character seems more vibrant than ever." These words of Harvey Leff and Andrew Rex (1990), which open their introduction to Maxwell's Demon 2: Entropy, Classical and Quantum Information, Computing (hereafter MD2), are very true: the Demon is as challenging and as intriguing as ever, and forces us to think and rethink about the foundations of thermodynamics and of statistical mechanics.

  3. Space-time models based on random fields with local interactions

    NASA Astrophysics Data System (ADS)

    Hristopulos, Dionissios T.; Tsantili, Ivi C.

    2016-08-01

    The analysis of space-time data from complex, real-life phenomena requires the use of flexible and physically motivated covariance functions. In most cases, it is not possible to explicitly solve the equations of motion for the fields or the respective covariance functions. In the statistical literature, covariance functions are often based on mathematical constructions. In this paper, we propose deriving space-time covariance functions by solving “effective equations of motion”, which can be used as statistical representations of systems with diffusive behavior. In particular, we propose to formulate space-time covariance functions based on an equilibrium effective Hamiltonian using the linear response theory. The effective space-time dynamics is then generated by a stochastic perturbation around the equilibrium point of the classical field Hamiltonian leading to an associated Langevin equation. We employ a Hamiltonian which extends the classical Gaussian field theory by including a curvature term and leads to a diffusive Langevin equation. Finally, we derive new forms of space-time covariance functions.

  4. Strongly magnetized classical plasma models

    NASA Technical Reports Server (NTRS)

    Montgomery, D. C.

    1972-01-01

    The class of plasma processes for which the so-called Vlasov approximation is inadequate is investigated. Results from the equilibrium statistical mechanics of two-dimensional plasmas are derived. These results are independent of the presence of an external dc magnetic field. The nonequilibrium statistical mechanics of the electrostatic guiding-center plasma, a two-dimensional plasma model, is discussed. This model is then generalized to three dimensions. The guiding-center model is relaxed to include finite Larmor radius effects for a two-dimensional plasma.

  5. Classical statistical mechanics approach to multipartite entanglement

    NASA Astrophysics Data System (ADS)

    Facchi, P.; Florio, G.; Marzolino, U.; Parisi, G.; Pascazio, S.

    2010-06-01

    We characterize the multipartite entanglement of a system of n qubits in terms of the distribution function of the bipartite purity over balanced bipartitions. We search for maximally multipartite entangled states, whose average purity is minimal, and recast this optimization problem into a problem of statistical mechanics, by introducing a cost function, a fictitious temperature and a partition function. By investigating the high-temperature expansion, we obtain the first three moments of the distribution. We find that the problem exhibits frustration.
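The bipartite purity at the center of this construction can be sketched directly. A minimal implementation of my own; the Haar-average value (d_A + d_B)/(d_A d_B + 1) is a known result, used here only as a sanity check rather than anything specific to this paper:

```python
import numpy as np

def random_state(dim, rng):
    """Haar-random pure state: a normalized complex Gaussian vector."""
    v = rng.standard_normal(dim) + 1j * rng.standard_normal(dim)
    return v / np.linalg.norm(v)

def bipartite_purity(psi, da, db):
    """Tr(rho_A^2) for a pure state psi on C^da (x) C^db."""
    M = psi.reshape(da, db)          # coefficient matrix of the bipartition
    rho_a = M @ M.conj().T           # reduced state of subsystem A
    return float(np.real(np.trace(rho_a @ rho_a)))
```

Purity 1 corresponds to a product state and 1/d_A to maximal entanglement across the cut; maximally multipartite entangled states minimize the purity averaged over all balanced bipartitions.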

  6. Dimensionally regularized Tsallis' statistical mechanics and two-body Newton's gravitation

    NASA Astrophysics Data System (ADS)

    Zamora, J. D.; Rocca, M. C.; Plastino, A.; Ferri, G. L.

    2018-05-01

Typical quantifiers of Tsallis' statistical mechanics, such as the partition function Z and the mean energy 〈U〉, exhibit poles. The poles appear at distinctive values of Tsallis' characteristic real parameter q, on a numerable set of rational numbers of the q-line. These poles are treated with dimensional-regularization techniques. The physical effects of these poles on the specific heats are studied here for the two-body classical gravitational potential.
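A hedged illustration of the deformation underlying these quantifiers: the Tsallis q-exponential, which replaces the ordinary exponential in the generalized Boltzmann factor and reduces to it as q → 1. The pole and regularization analysis itself is beyond this sketch; the function name is my own:

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential: [1 + (1-q)x]_+^(1/(1-q)); -> exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    if base <= 0.0:
        return 0.0  # Tsallis cutoff: the bracket is clipped at zero
    return base ** (1.0 / (1.0 - q))
```

For q < 1 the factor has compact support (the cutoff above), while for q > 1 it decays as a power law, which is the origin of the divergences that the paper regularizes.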

  7. Quantum Behavior of an Autonomous Maxwell Demon

    NASA Astrophysics Data System (ADS)

    Chapman, Adrian; Miyake, Akimasa

    2015-03-01

    A Maxwell Demon is an agent that can exploit knowledge of a system's microstate to perform useful work. The second law of thermodynamics is only recovered upon taking into account the work required to irreversibly update the demon's memory, bringing information theoretic concepts into a thermodynamic framework. Recently, there has been interest in modeling a classical Maxwell demon as an autonomous physical system to study this information-work tradeoff explicitly. Motivated by the idea that states with non-local entanglement structure can be used as a computational resource, we ask whether these states have thermodynamic resource quality as well by generalizing a particular classical autonomous Maxwell demon to the quantum regime. We treat the full quantum description using a matrix product operator formalism, which allows us to handle quantum and classical correlations in a unified framework. Applying this, together with techniques from statistical mechanics, we are able to approximate nonlocal quantities such as the erasure performed on the demon's memory register when correlations are present. Finally, we examine how the demon may use these correlations as a resource to outperform its classical counterpart.

  8. From classical to quantum and back: Hamiltonian adaptive resolution path integral, ring polymer, and centroid molecular dynamics

    NASA Astrophysics Data System (ADS)

    Kreis, Karsten; Kremer, Kurt; Potestio, Raffaello; Tuckerman, Mark E.

    2017-12-01

    Path integral-based methodologies play a crucial role for the investigation of nuclear quantum effects by means of computer simulations. However, these techniques are significantly more demanding than corresponding classical simulations. To reduce this numerical effort, we recently proposed a method, based on a rigorous Hamiltonian formulation, which restricts the quantum modeling to a small but relevant spatial region within a larger reservoir where particles are treated classically. In this work, we extend this idea and show how it can be implemented along with state-of-the-art path integral simulation techniques, including path-integral molecular dynamics, which allows for the calculation of quantum statistical properties, and ring-polymer and centroid molecular dynamics, which allow the calculation of approximate quantum dynamical properties. To this end, we derive a new integration algorithm that also makes use of multiple time-stepping. The scheme is validated via adaptive classical-path-integral simulations of liquid water. Potential applications of the proposed multiresolution method are diverse and include efficient quantum simulations of interfaces as well as complex biomolecular systems such as membranes and proteins.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoilova, N. I.

Generalized quantum statistics, such as paraboson and parafermion statistics, are characterized by triple relations related to Lie (super)algebras of type B. The correspondence of the Fock spaces of parabosons and parafermions, as well as the Fock space of a combined system of parafermions and parabosons, to irreducible representations of (super)algebras of type B will be pointed out. An example of generalized quantum statistics connected to the basic classical Lie superalgebra B(1|1) ≡ osp(3|2), with interesting physical properties such as noncommutative coordinates, will be given. The article therefore focuses on the question, addressed already in 1950 by Wigner: do the equations of motion determine the quantum mechanical commutation relations?

  10. Prospective randomized clinical trial: single and weekly viscosupplementation

    PubMed Central

    Zóboli, Alejandro Agustin Carri; de Rezende, Márcia Uchôa; de Campos, Gustavo Constantino; Pasqualin, Thiago; Frucchi, Renato; de Camargo, Olavo Pires

    2013-01-01

OBJECTIVE: To compare two different dosages of an intermediate-molecular-weight sodium hyaluronate (HA) (Osteonil®-TRB Pharma), assessing whether a single 6 ml application of this HA is as effective as the classical three-weekly 2 ml dose. METHODS: 108 patients with knee osteoarthritis were randomized into two groups of 54 patients each, designated "single" (S) and "weekly" (W). Patients in group S underwent viscosupplementation with a single application of 6 ml of sodium hyaluronate and 1 ml of triamcinolone hexacetonide. Patients in group W underwent viscosupplementation through three applications of 2 ml of sodium hyaluronate at one-week intervals, with the first application also including an infiltration of 1 ml (20 mg) of triamcinolone hexacetonide. Both groups were assessed before, at one month, and at three months after application, by responding to the WOMAC, Lequesne, IKDC, and VAS questionnaires. RESULTS: There was no statistical difference between the single application of 6 ml of sodium hyaluronate and the classical application of three weekly injections. However, only the classical regimen showed statistically significant improvement in baseline pain (WOMAC pain and VAS). CONCLUSION: Our results suggest that both application schemes improve function, but the three-weekly regimen of 2 ml was more effective in reducing pain. Level of Evidence I, Prospective Randomized Clinical Trial. PMID:24453681

  11. Quantum-mechanical analysis of low-gain free-electron laser oscillators

    NASA Astrophysics Data System (ADS)

    Fares, H.; Yamada, M.; Chiadroni, E.; Ferrario, M.

    2018-05-01

In previous classical theories of low-gain free-electron laser (FEL) oscillators, the electron is described as a point-like particle, a delta function in position space. In previous quantum treatments, on the other hand, the electron is described as a plane wave with a single momentum state, a delta function in momentum space. In reality, an electron must have statistical uncertainties in both the position and momentum domains; it is neither a point-like charge nor a plane wave of a single momentum. In this paper, we rephrase the theory of the low-gain FEL with the interacting electron represented quantum mechanically by a plane wave with a finite spreading length (i.e., a wave packet). Using the concepts of transformation between reference frames and statistical quantum mechanics, an expression for the single-pass radiation gain is derived. The spectral broadening of the radiation is expressed in terms of the spreading length of an electron, the relaxation time characterizing the energy spread of electrons, and the interaction time. We compare our results with those of the known classical analyses and find good agreement. Alongside this correspondence, novel insights into the electron dynamics and the interaction mechanism are presented.

  12. The Gibbs paradox and the physical criteria for indistinguishability of identical particles

    NASA Astrophysics Data System (ADS)

    Unnikrishnan, C. S.

    2016-08-01

The Gibbs paradox in the context of statistical mechanics addresses the issue of additivity of the entropy of mixing gases. The usual discussion attributes the paradoxical situation to classical distinguishability of identical particles and credits quantum theory for enabling indistinguishability of identical particles to solve the problem. We argue that indistinguishability of identical particles is already a feature in classical mechanics, and this is clearly brought out when the problem is treated in the language of information and associated entropy. We pinpoint the physical criteria for indistinguishability that are crucial for the treatment of Gibbs' problem and the consistency of its solution with conventional thermodynamics. Quantum mechanics provides a quantitative criterion, not possible in the classical picture, for the degree of indistinguishability in terms of visibility of quantum interference, or overlap of the states as pointed out by von Neumann, thereby endowing the entropy expression with mathematical continuity and physical reasonableness.

  13. Structural aspects of the solvation shell of lysine and acetylated lysine: A Car-Parrinello and classical molecular dynamics investigation

    NASA Astrophysics Data System (ADS)

    Carnevale, V.; Raugei, S.

    2009-12-01

    Lysine acetylation is a post-translational modification, which modulates the affinity of protein-protein and/or protein-DNA complexes. Its crucial role as a switch in signaling pathways highlights the relevance of charged chemical groups in determining the interactions between water and biomolecules. A great effort has been recently devoted to assess the reliability of classical molecular dynamics simulations in describing the solvation properties of charged moieties. In the spirit of these investigations, we performed classical and Car-Parrinello molecular dynamics simulations on lysine and acetylated-lysine in aqueous solution. A comparative analysis between the two computational schemes is presented with a focus on the first solvation shell of the charged groups. An accurate structural analysis unveils subtle, yet statistically significant, differences which are discussed in connection to the significant electronic density charge transfer occurring between the solute and the surrounding water molecules.

  14. Nonclassicality Criteria in Multiport Interferometry

    NASA Astrophysics Data System (ADS)

    Rigovacca, L.; Di Franco, C.; Metcalf, B. J.; Walmsley, I. A.; Kim, M. S.

    2016-11-01

    Interference lies at the heart of the behavior of classical and quantum light. It is thus crucial to understand the boundaries between which interference patterns can be explained by a classical electromagnetic description of light and which, on the other hand, can only be understood with a proper quantum mechanical approach. While the case of two-mode interference has received a lot of attention, the multimode case has not yet been fully explored. Here we study a general scenario of intensity interferometry: we derive a bound on the average correlations between pairs of output intensities for the classical wavelike model of light, and we show how it can be violated in a quantum framework. As a consequence, this violation acts as a nonclassicality witness, able to detect the presence of sources with sub-Poissonian photon-number statistics. We also develop a criterion that can certify the impossibility of dividing a given interferometer into two independent subblocks.

  15. Probing quantum and classical turbulence analogy in von Kármán liquid helium, nitrogen, and water experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saint-Michel, B.; Aix Marseille Université, CNRS, Centrale Marseille, IRPHE UMR 7342, 13384 Marseille; Herbert, E.

    2014-12-15

    We report measurements of the dissipation in the Superfluid helium high REynolds number von Kármán flow experiment for different forcing conditions. Statistically steady flows are reached; they display a hysteretic behavior similar to what has been observed in a 1:4 scale water experiment. Our macroscopic measurements indicate no noticeable difference between classical and superfluid flows, thereby providing evidence of the same dissipation scaling laws in the two phases. A detailed study of the evolution of the hysteresis cycle with the Reynolds number supports the idea that the stability of the steady states of classical turbulence in this closed flow is partly governed by the dissipative scales. It also supports the idea that the normal and the superfluid components at these temperatures (1.6 K) are locked down to the dissipative length scale.

  16. A Critical Review and Appropriation of Pierre Bourdieu's Analysis of Social and Cultural Reproduction

    ERIC Educational Resources Information Center

    Shirley, Dennis

    1986-01-01

    Makes accessible Bourdieu's comprehensive and systematic sociology of French education, which integrates classical sociological theory and statistical analysis. Isolates and explicates key terminology, links these concepts together, and critiques the work from the perspective of the philosophy of praxis. (LHW)

  17. Theory-Based Causal Induction

    ERIC Educational Resources Information Center

    Griffiths, Thomas L.; Tenenbaum, Joshua B.

    2009-01-01

    Inducing causal relationships from observations is a classic problem in scientific inference, statistics, and machine learning. It is also a central part of human learning, and a task that people perform remarkably well given its notorious difficulties. People can learn causal structure in various settings, from diverse forms of data: observations…

  18. NEUROBEHAVIORAL EVALUATIONS OF BINARY AND TERTIARY MIXTURES OF CHEMICALS: LESSONS LEARNED.

    EPA Science Inventory

    The classical approach to the statistical analysis of binary chemical mixtures is to construct full dose-response curves for one compound in the presence of a range of doses of the second compound (isobolographic analyses). For interaction studies using more than two chemicals, ...

  19. Linking Performance Measures to Resource Allocation: Exploring Unmapped Terrain.

    ERIC Educational Resources Information Center

    Ewell, Peter T.

    1999-01-01

    Examination of how (and whether) particular types of institutional performance measures can be beneficially used in making resource allocation decisions finds that only easily verifiable "hard" statistics should be used in classic performance funding approaches, although surveys and the use of good practices by institutions may…

  20. Spear Phishing Attack Detection

    DTIC Science & Technology

    2011-03-24

    the insider amongst senior leaders of an organization [Mes08], the undercover detective within a drug cartel, or the classic secret agent planted in...to a mimicry attack that shapes the embedded malware to have a statistical distribution similar to "normal" or benign behavior. 2.3.1.3

  1. Molecular activity prediction by means of supervised subspace projection based ensembles of classifiers.

    PubMed

    Cerruela García, G; García-Pedrajas, N; Luque Ruiz, I; Gómez-Nieto, M Á

    2018-03-01

    This paper proposes a method for molecular activity prediction in QSAR studies using ensembles of classifiers constructed by means of two supervised subspace projection methods, namely nonparametric discriminant analysis (NDA) and hybrid discriminant analysis (HDA). We studied the performance of the proposed ensembles compared to classical ensemble methods using four molecular datasets and eight different models for the representation of the molecular structure. Using several measures and statistical tests for classifier comparison, we observe that our proposal improves the classification results with respect to classical ensemble methods. Therefore, we show that ensembles constructed using supervised subspace projections offer an effective way of creating classifiers in cheminformatics.
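    The NDA and HDA projections are specific to the paper above, but the general idea of building an ensemble from classifiers trained on different feature subspaces can be sketched with a plain random-subspace ensemble of nearest-centroid classifiers. Everything below (the classifier choice, the toy data, all names) is illustrative, not the authors' method:

    ```python
    import random
    from collections import Counter

    def centroid_fit(X, y, feats):
        """Per-class mean vector over the selected feature subset."""
        cents = {}
        for label in set(y):
            rows = [[x[f] for f in feats] for x, lab in zip(X, y) if lab == label]
            cents[label] = [sum(col) / len(rows) for col in zip(*rows)]
        return cents

    def centroid_predict(cents, feats, x):
        """Assign x to the class with the nearest centroid in the subspace."""
        v = [x[f] for f in feats]
        return min(cents, key=lambda lab: sum((a - b) ** 2 for a, b in zip(v, cents[lab])))

    def subspace_ensemble(X, y, n_models=15, k=2, rng=None):
        """Train n_models nearest-centroid classifiers, each on k random features."""
        rng = rng or random.Random(0)
        d = len(X[0])
        models = [(feats, centroid_fit(X, y, feats))
                  for feats in (rng.sample(range(d), k) for _ in range(n_models))]
        def predict(x):
            votes = Counter(centroid_predict(c, f, x) for f, c in models)
            return votes.most_common(1)[0][0]  # majority vote across subspaces
        return predict

    # Toy separable data: class 0 near the origin, class 1 shifted in every feature.
    rng = random.Random(42)
    X = [[rng.gauss(0, 1) for _ in range(5)] for _ in range(60)] + \
        [[rng.gauss(3, 1) for _ in range(5)] for _ in range(60)]
    y = [0] * 60 + [1] * 60
    predict = subspace_ensemble(X, y)
    acc = sum(predict(x) == lab for x, lab in zip(X, y)) / len(X)
    print(acc)  # near 1.0 on this easy data
    ```

    The supervised projections in the paper replace the random feature sampling step with learned discriminant directions; the voting machinery stays the same.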

  2. From classical to quantum mechanics: ``How to translate physical ideas into mathematical language''

    NASA Astrophysics Data System (ADS)

    Bergeron, H.

    2001-09-01

    Following previous works by E. Prugovečki [Physica A 91A, 202 (1978) and Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)] on common features of classical and quantum mechanics, we develop a unified mathematical framework for classical and quantum mechanics (based on L2-spaces over classical phase space), in order to investigate to what extent quantum mechanics can be obtained as a simple modification of classical mechanics (on both logical and analytical levels). To obtain this unified framework, we split quantum theory into two parts: (i) general quantum axiomatics (a system is described by a state in a Hilbert space, observables are self-adjoint operators, and so on) and (ii) quantum mechanics proper, which specifies the Hilbert space as L2(Rn), the Heisenberg rule [pi,qj]=-iℏδij with p=-iℏ∇, the free Hamiltonian H=-ℏ²Δ/2m, and so on. We show that general quantum axiomatics (up to a supplementary "axiom of classicity") can be used as a nonstandard mathematical ground to formulate the physical ideas and equations of ordinary classical statistical mechanics. So the question of a "true quantization" with "ℏ" must be seen as an independent physical problem, not directly related to the quantum formalism. At this stage, we show that this nonstandard formulation of classical mechanics exhibits a new kind of operation that has no classical counterpart: this operation is related to the "quantization process," and we show why quantization physically depends on group theory (the Galilei group). This analytical procedure of quantization replaces the "correspondence principle" (or canonical quantization) and allows us to map classical mechanics into quantum mechanics, giving all the operators of quantum dynamics and the Schrödinger equation. The great advantage of this point of view is that quantization is based on concrete physical arguments and is not derived from some "pure algebraic rule" (we also exhibit some limits of the correspondence principle). Moreover, spins for particles are naturally generated, including an approximation of their interaction with magnetic fields. We also recover by this approach the semiclassical formalism developed by E. Prugovečki [Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)].

  3. Ten reasons why a thermalized system cannot be described by a many-particle wave function

    NASA Astrophysics Data System (ADS)

    Drossel, Barbara

    2017-05-01

    It is widely believed that the underlying reality behind statistical mechanics is a deterministic and unitary time evolution of a many-particle wave function, even though this is in conflict with the irreversible, stochastic nature of statistical mechanics. The usual attempts to resolve this conflict, for instance by appealing to decoherence or eigenstate thermalization, are riddled with problems. This paper considers theoretical physics of thermalized systems as it is done in practice and shows that all approaches to thermalized systems presuppose in some form limits to linear superposition and deterministic time evolution. These considerations include, among others, the classical limit, extensivity, the concepts of entropy and equilibrium, and symmetry breaking in phase transitions and quantum measurement. As a conclusion, the paper suggests that the irreversibility and stochasticity of statistical mechanics should be taken as a real property of nature. It follows that a gas of a macroscopic number N of atoms in thermal equilibrium is best represented by a collection of N wave packets of a size of the order of the thermal de Broglie wave length, which behave quantum mechanically below this scale but classically sufficiently far beyond this scale. In particular, these wave packets must localize again after scattering events, which requires stochasticity and indicates a connection to the measurement process.

  4. A quantitative approach to evolution of music and philosophy

    NASA Astrophysics Data System (ADS)

    Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira, Osvaldo N., Jr.; da Fontoura Costa, Luciano

    2012-08-01

    The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.

  5. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741

  6. On the transition from the quantum to the classical regime for massive scalar particles: A spatiotemporal approach

    NASA Astrophysics Data System (ADS)

    Lusanna, Luca; Pauri, Massimo

    2014-08-01

    If the classical structure of space-time is assumed to define an a priori scenario for the formulation of quantum theory (QT), the coordinate representation of the solutions of the Schroedinger equation of a quantum system containing one (N) massive scalar particle has a preferred status. Let us consider all of the solutions admitting a multipolar expansion of the probability density function (and more generally of the Wigner function) around a space-time trajectory to be properly selected. For every normalized solution there is a privileged trajectory implying the vanishing of the dipole moment of the multipolar expansion: it is given by the expectation value of the position operator. Then, the special subset of solutions which satisfy Ehrenfest's theorem (named thereby Ehrenfest monopole wave functions, EMWF) have the important property that this privileged classical trajectory is determined by a closed Newtonian equation of motion in which the effective force is the Newtonian force plus non-Newtonian terms (of order ħ² or higher) depending on the higher multipoles of the probability distribution ρ. Note that the superposition of two EMWFs is not an EMWF, a result to be strongly hoped for, given the possible unwanted implications concerning classical spatial perception. These results can be extended to N-particle systems in such a way that, when N classical trajectories with all the dipole moments vanishing and satisfying Ehrenfest's theorem are associated with the normalized wave functions of the N-body system, we get a natural transition from the 3N-dimensional configuration space to space-time. Moreover, these results can be extended to relativistic quantum mechanics. Consequently, in suitable states of N quantum particles which are EMWF, we get the "emergence" of corresponding "classical particles" following Newton-like trajectories in space-time. Note that all this holds true in the standard framework of quantum mechanics, i.e., assuming, in particular, the validity of Born's rule and the individual-system interpretation of the wave function (no ensemble interpretation). These results are valid without any approximation (like ħ → 0, big quantum numbers, etc.). Moreover, we do not commit ourselves to any specific ontological interpretation of quantum theory (such as, e.g., the Bohmian one). We will argue that, in substantial agreement with Bohr's viewpoint, the macroscopic description of the preparation, certain intermediate steps, and the detection of the final outcome of experiments involving massive particles are dominated by these classical "effective" trajectories. This approach can be applied to the point of view of decoherence in the case of a diagonal reduced density matrix ρ_red (an improper mixture) depending on the position variables of a massive particle and of a pointer. When both the particle and the pointer wave functions appearing in ρ_red are EMWF, the expectation value of the particle and pointer position variables becomes a statistical average over a classical ensemble. In these cases an improper quantum mixture becomes a classical statistical one, thus providing a particular answer to an open problem of decoherence about the emergence of classicality.

  7. An Integrated Approach to Thermodynamics in the Introductory Physics Course.

    ERIC Educational Resources Information Center

    Alonso, Marcelo; Finn, Edward J.

    1995-01-01

    Presents an approach to combine the empirical approach of classical thermodynamics with the structural approach of statistical mechanics. Topics covered include dynamical foundation of the first law; mechanical work, heat, radiation, and the first law; thermal equilibrium; thermal processes; thermodynamic probability; entropy; the second law;…

  8. A New Challenge for Compression Algorithms: Genetic Sequences.

    ERIC Educational Resources Information Center

    Grumbach, Stephane; Tahi, Fariza

    1994-01-01

    Analyzes the properties of genetic sequences that cause the failure of classical algorithms used for data compression. A lossless algorithm, which compresses the information contained in DNA and RNA sequences by detecting regularities such as palindromes, is presented. This algorithm combines substitutional and statistical methods and appears to…

  9. Spacecraft Formation Control and Estimation Via Improved Relative Motion Dynamics

    DTIC Science & Technology

    2017-03-30

    statistical (e.g. batch least-squares or Extended Kalman Filter) estimator. In addition, the IROD approach can be applied to classical (ground-based...covariance • Test the viability of IROD solutions by injecting them into precise orbit determination schemes (e.g. various strains of Kalman filters

  10. Neuroimaging Research: from Null-Hypothesis Falsification to Out-Of-Sample Generalization

    ERIC Educational Resources Information Center

    Bzdok, Danilo; Varoquaux, Gaël; Thirion, Bertrand

    2017-01-01

    Brain-imaging technology has boosted the quantification of neurobiological phenomena underlying human mental operations and their disturbances. Since its inception, drawing inference on neurophysiological effects hinged on classical statistical methods, especially, the general linear model. The tens of thousands of variables per brain scan were…

  11. Effects of the Family Environment: Gene-Environment Interaction and Passive Gene-Environment Correlation

    ERIC Educational Resources Information Center

    Price, Thomas S.; Jaffee, Sara R.

    2008-01-01

    The classical twin study provides a useful resource for testing hypotheses about how the family environment influences children's development, including how genes can influence sensitivity to environmental effects. However, existing statistical models do not account for the possibility that children can inherit exposure to family environments…

  12. Computer Simulation of Classic Studies in Psychology.

    ERIC Educational Resources Information Center

    Bradley, Drake R.

    This paper describes DATASIM, a comprehensive software package which generates simulated data for actual or hypothetical research designs. DATASIM is primarily intended for use in statistics and research methods courses, where it is used to generate "individualized" datasets for students to analyze, and later to correct their answers.…

  13. Structural Equation Modeling: Possibilities for Language Learning Researchers

    ERIC Educational Resources Information Center

    Hancock, Gregory R.; Schoonen, Rob

    2015-01-01

    Although classical statistical techniques have been a valuable tool in second language (L2) research, L2 research questions have started to grow beyond those techniques' capabilities, and indeed are often limited by them. Questions about how complex constructs relate to each other or to constituent subskills, about longitudinal development in…

  14. Urns and Chameleons: two metaphors for two different types of measurements

    NASA Astrophysics Data System (ADS)

    Accardi, Luigi

    2013-09-01

    The awareness of the physical possibility of models of space alternative to the Euclidean one began to emerge towards the end of the 19th century. At the end of the 20th century a similar awareness emerged concerning the physical possibility of models of the laws of chance alternative to the classical probabilistic models (the Kolmogorov model). In geometry, the mathematical construction of several non-Euclidean models of space preceded by about a century their applications in physics, which came with the theory of relativity. In physics the opposite situation took place. In fact, while the first examples of non-Kolmogorov probabilistic models emerged in quantum physics approximately one century ago, at the beginning of the 1900s, the awareness that this new mathematical formalism reflected a new mathematical model of the laws of chance had to wait until the early 1980s. In this long time interval the classical and the new probabilistic models were both used in the description and interpretation of quantum phenomena and interfered negatively with each other because of the absence (for many decades) of a mathematical theory that clearly delimited their respective domains of application. The result of this interference was the emergence of the so-called "paradoxes of quantum theory". For several decades there have been many different attempts to solve these paradoxes, giving rise to what K. Popper baptized "the great quantum muddle": a debate which has been at the core of the philosophy of science for more than 50 years. However, these attempts have led to contradictions between the two fundamental theories of contemporary physics: quantum theory and the theory of relativity. Quantum probability identifies the reason for the emergence of non-Kolmogorov models, and therefore of the so-called paradoxes of quantum theory, in the difference between passive measurements that "read pre-existent properties" (the urn metaphor) and measurements consisting in reading "a response to an interaction" (the chameleon metaphor). The non-trivial point is that one can prove that, while the urn scheme cannot lead to empirical data outside of classical probability, response-based measurements can give rise to non-classical statistics. The talk will include entirely classical examples of non-classical statistics and potential applications to economic, sociological or biomedical phenomena.

  15. Diabetes mellitus in classical trigeminal neuralgia: A predisposing factor for its development.

    PubMed

    Xu, Zhenq; Zhang, Ping; Long, Li; He, Huiy; Zhang, Jianch; Sun, Shup

    2016-12-01

    A higher prevalence of diabetes mellitus in classical trigeminal neuralgia patients was observed in a few pilot surveys. This study aimed to investigate whether diabetes mellitus is a predisposing factor for developing trigeminal neuralgia. Patients with classical trigeminal neuralgia were enrolled in the case study group. The control group consisted of the same number of age- and gender-matched, randomly sampled subjects without trigeminal neuralgia. Characteristics of the classical trigeminal neuralgia cases were analyzed. The prevalence of diabetes mellitus in the cases and controls was compared using the Chi-square test. The onset age ranged from 31 to 93 in the 256 patients affected by classical trigeminal neuralgia (162 females; 94 males), with a peak between the fifth and seventh decades; right-side involvement and mandibular branch affliction occurred at a greater frequency. 21.9% of patients in the study group were affected by diabetes mellitus compared to 12.9% of controls. The increased prevalence of diabetes mellitus in the trigeminal neuralgia group was statistically significant (P=0.01). Diabetes is a risk factor for the development of classical trigeminal neuralgia, and nerve damage due to hyperglycemia might be the link between the two diseases. More work should be done to consolidate the correlation and to clarify the underlying mechanism of the positive association, which would provide new insight into the pathogenesis of trigeminal neuralgia and may open new therapeutic perspectives. Copyright © 2016 Elsevier B.V. All rights reserved.
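    The Chi-square comparison above can be reproduced from the reported percentages. The counts below (56/256 cases, 33/256 controls with diabetes) are a reconstruction consistent with the stated 21.9% and 12.9%, not figures taken from the paper; with Yates' continuity correction the result lands on the reported P=0.01:

    ```python
    from math import sqrt, erfc

    def yates_chi2_2x2(a, b, c, d):
        """Chi-square with Yates continuity correction for a 2x2 table (1 df)."""
        n = a + b + c + d
        chi2 = n * (abs(a * d - b * c) - n / 2) ** 2 / (
            (a + b) * (c + d) * (a + c) * (b + d))
        # Survival function of a chi-square variable with 1 degree of freedom.
        p = erfc(sqrt(chi2 / 2))
        return chi2, p

    # Rows: cases, controls; columns: diabetic, non-diabetic (reconstructed counts).
    chi2, p = yates_chi2_2x2(56, 200, 33, 223)
    print(round(chi2, 2), round(p, 3))  # roughly 6.58 and 0.01
    ```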

  16. Pre-attentive auditory discrimination skill in Indian classical vocal musicians and non-musicians.

    PubMed

    Sanju, Himanshu Kumar; Kumar, Prawin

    2016-09-01

    To test for pre-attentive auditory discrimination skills in Indian classical vocal musicians and non-musicians. Mismatch negativity (MMN) was recorded to test for pre-attentive auditory discrimination skills with a pair of stimuli of 1000 Hz and 1100 Hz, with 1000 Hz as the frequent stimulus and 1100 Hz as the infrequent stimulus. Onset, offset and peak latencies were the latency parameters considered, whereas peak amplitude and area under the curve were considered for amplitude analysis. Fifty participants were included in the study: the experimental group comprised 25 adult Indian classical vocal musicians, and 25 age-matched non-musicians served as the control group. Experimental group participants had a minimum of 10 years of professional experience in Indian classical vocal music. However, control group participants did not have any formal training in music. Descriptive statistics showed better waveform morphology in the experimental group than in the control group. MANOVA showed significantly better onset latency, peak amplitude and area under the curve in the experimental group but no significant difference in the offset and peak latencies between the two groups. The present study probably points towards an enhancement of pre-attentive auditory discrimination skills in Indian classical vocal musicians compared to non-musicians. It indicates that Indian classical musical training enhances pre-attentive auditory discrimination skills in musicians, leading to higher peak amplitude and a greater area under the curve compared to non-musicians.

  17. Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization

    NASA Astrophysics Data System (ADS)

    Eroglu, Sertac

    2014-10-01

    The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, is termed the statistical mechanical Menzerath-Altmann model. The derived model allows the model parameters to be interpreted in terms of physical concepts. We also propose that many organizations exhibiting Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined with the transformed distribution model through a properly defined structure-dependent parameter and the energy-associated states.
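    The classical Menzerath-Altmann form y(x) = a·x^b·e^(cx) is linear in its parameters after taking logarithms, so it can be fitted by ordinary least squares. A minimal sketch on noise-free synthetic data (the parameter values are invented for illustration; sign conventions for b and c vary across the literature):

    ```python
    from math import log, exp

    def solve3(A, v):
        """Gauss-Jordan elimination with partial pivoting for a 3x3 system."""
        M = [row[:] + [v[i]] for i, row in enumerate(A)]
        for i in range(3):
            piv = max(range(i, 3), key=lambda r: abs(M[r][i]))
            M[i], M[piv] = M[piv], M[i]
            for r in range(3):
                if r != i:
                    f = M[r][i] / M[i][i]
                    M[r] = [x - f * y for x, y in zip(M[r], M[i])]
        return [M[i][3] / M[i][i] for i in range(3)]

    def fit_menzerath(xs, ys):
        """Least-squares fit of ln y = ln a + b ln x + c x via normal equations."""
        rows = [(1.0, log(x), float(x)) for x in xs]
        t = [log(y) for y in ys]
        A = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
        v = [sum(r[i] * ti for r, ti in zip(rows, t)) for i in range(3)]
        la, b, c = solve3(A, v)
        return exp(la), b, c

    # Noise-free data generated from known parameters is recovered exactly.
    a0, b0, c0 = 2.0, -0.5, -0.1
    xs = range(1, 20)
    ys = [a0 * x ** b0 * exp(c0 * x) for x in xs]
    a, b, c = fit_menzerath(xs, ys)
    print(round(a, 3), round(b, 3), round(c, 3))  # 2.0 -0.5 -0.1
    ```

    The transformed model of the paper generalizes this form; the same log-linearization strategy applies whenever the extra parameters enter multiplicatively.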

  18. Irradiation-hyperthermia in canine hemangiopericytomas: large-animal model for therapeutic response.

    PubMed

    Richardson, R C; Anderson, V L; Voorhees, W D; Blevins, W E; Inskeep, T K; Janas, W; Shupe, R E; Babbs, C F

    1984-11-01

    Results of irradiation-hyperthermia treatment in 11 dogs with naturally occurring hemangiopericytoma were reported. Similarities of canine and human hemangiopericytomas were described. Orthovoltage X-irradiation followed by microwave-induced hyperthermia resulted in a 91% objective response rate. A statistical procedure was given to evaluate quantitatively the clinical behavior of locally invasive, nonmetastatic tumors in dogs that were undergoing therapy for control of local disease. The procedure used a small sample size and demonstrated distribution of the data on a scaled response as well as transformation of the data through classical parametric and nonparametric statistical methods. These statistical methods set confidence limits on the population mean and placed tolerance limits on a population percentage. Application of the statistical methods to human and animal clinical trials was apparent.

  19. Statistical Optics

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  20. Statistics of excitations in the electron glass model

    NASA Astrophysics Data System (ADS)

    Palassini, Matteo

    2011-03-01

    We study the statistics of elementary excitations in the classical electron glass model of localized electrons interacting via the unscreened Coulomb interaction in the presence of disorder. We reconsider the long-standing puzzle of the exponential suppression of the single-particle density of states near the Fermi level, by measuring accurately the density of states of charged and electron-hole pair excitations via finite temperature Monte Carlo simulation and zero-temperature relaxation. We also investigate the statistics of large charge rearrangements after a perturbation of the system, which may shed some light on the slow relaxation and glassy phenomena recently observed in a variety of Anderson insulators. In collaboration with Martin Goethe.

  1. Negative values of quasidistributions and quantum wave and number statistics

    NASA Astrophysics Data System (ADS)

    Peřina, J.; Křepelka, J.

    2018-04-01

    We consider nonclassical wave and number quantum statistics, and perform a decomposition of quasidistributions for nonlinear optical down-conversion processes using Bessel functions. We show that negative values of the quasidistribution do not directly represent probabilities; however, they directly influence measurable number statistics. Negative terms in the decomposition related to the nonclassical behavior with negative amplitudes of probability can be interpreted as positive amplitudes of probability in the negative orthogonal Bessel basis, whereas positive amplitudes of probability in the positive basis describe classical cases. However, probabilities are positive in all cases, including negative values of quasidistributions. Negative and positive contributions of decompositions to quasidistributions are estimated. The approach can be adapted to quantum coherence functions.

  2. Towards Solving the Mixing Problem in the Decomposition of Geophysical Time Series by Independent Component Analysis

    NASA Technical Reports Server (NTRS)

    Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)

    2000-01-01

    The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned, in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires the statistical independence of components (a stronger constraint that uses higher-order statistics) instead of the classical decorrelation (a weaker constraint that uses only second-order statistics). Furthermore, ICA does not require additional a priori information such as the localization constraint used in Rotational Techniques.
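    The second-order vs. higher-order distinction above can be demonstrated in a few lines. Two independent uniform sources mixed by a rotation remain uncorrelated under any candidate unmixing rotation, so decorrelation (PCA) cannot pick the right axes; excess kurtosis, a fourth-order statistic, can. This is a minimal one-parameter illustration of the ICA principle, not the algorithm of the paper:

    ```python
    import random
    from math import cos, sin, pi

    rng = random.Random(1)
    n = 20000
    s1 = [rng.uniform(-1, 1) for _ in range(n)]
    s2 = [rng.uniform(-1, 1) for _ in range(n)]

    theta0 = 0.5  # mixing angle; a rotation keeps white sources white
    x1 = [cos(theta0) * a - sin(theta0) * b for a, b in zip(s1, s2)]
    x2 = [sin(theta0) * a + cos(theta0) * b for a, b in zip(s1, s2)]

    def excess_kurtosis(z):
        m = sum(z) / len(z)
        v = sum((u - m) ** 2 for u in z) / len(z)
        return sum((u - m) ** 4 for u in z) / len(z) / v ** 2 - 3.0

    def total_kurtosis(theta):
        """Summed excess kurtosis of both components after un-rotating by theta."""
        y1 = [cos(theta) * a + sin(theta) * b for a, b in zip(x1, x2)]
        y2 = [-sin(theta) * a + cos(theta) * b for a, b in zip(x1, x2)]
        return excess_kurtosis(y1) + excess_kurtosis(y2)

    # Uniform sources are sub-Gaussian, so the unmixing angle minimizes total kurtosis.
    angles = [i * pi / 2 / 200 for i in range(200)]
    best = min(angles, key=total_kurtosis)
    print(round(best, 2))  # close to the mixing angle 0.5
    ```

    Any second-order criterion is flat in theta here, which is exactly the degeneracy that higher-order statistics resolve.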

  3. New selection effect in statistical investigations of supernova remnants

    NASA Astrophysics Data System (ADS)

    Allakhverdiev, A. O.; Guseinov, O. Kh.; Kasumov, F. K.

    1986-01-01

    The influence of H II regions on the parameters of supernova remnants (SNR) is investigated. It has been shown that the projection of such regions on the SNRs leads to: a) local changes of morphological structure of young shell-type SNRs and b) considerable distortions of integral parameters of evolved shell-type SNRs (with D > 10 pc) and plerions, up to their complete undetectability on the background of classical and gigantic H II regions. A new selection effect, in fact, arises from these factors connected with additional limitations made by the real structure of the interstellar medium on the statistical investigations of SNRs. The influence of this effect on the statistical completeness of objects has been estimated.

  4. Nonclassical acoustics

    NASA Technical Reports Server (NTRS)

    Kentzer, C. P.

    1976-01-01

    A statistical approach to sound propagation is considered in situations where, due to the presence of large gradients of properties of the medium, the classical (deterministic) treatment of wave motion is inadequate. Mathematical methods for wave motions not restricted to small wavelengths (analogous to known methods of quantum mechanics) are used to formulate a wave theory of sound in nonuniform flows. Nonlinear transport equations for field probabilities are derived for the limiting case of noninteracting sound waves and it is postulated that such transport equations, appropriately generalized, may be used to predict the statistical behavior of sound in arbitrary flows.

  5. A Critical Examination of Figure of Merit (FOM). Assessing the Goodness-of-Fit in Gamma/X-ray Peak Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croft, S.; Favalli, Andrea; Weaver, Brian Phillip

    2015-10-06

    In this paper we develop and investigate several criteria for assessing how well a proposed spectral form fits observed spectra. We consider the classical improved figure of merit (FOM) along with several modifications, as well as criteria motivated by Poisson regression from the statistical literature. We also develop a new FOM that is based on the statistical idea of the bootstrap. A spectral simulator has been developed to assess the performance of these different criteria under multiple data configurations.
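    The bootstrap-based FOM described above can be illustrated with a small sketch: resample pseudo-spectra around a fitted model and compare the observed figure of merit with the resampled distribution. All counts and the chi-square-style FOM definition below are hypothetical illustrations, not the paper's actual criteria or data; Poisson counts are drawn via a normal approximation, which is adequate for the large channel counts used here.

    ```python
    import math
    import random

    def fom(observed, expected):
        """Chi-square-style figure of merit: mean squared residual per
        channel, normalized by the expected (Poisson) variance."""
        return sum((o - e) ** 2 / e for o, e in zip(observed, expected)) / len(observed)

    def bootstrap_fom(expected, n_boot=1000, seed=0):
        """Bootstrap reference distribution of the FOM: draw pseudo-spectra
        around the fitted model and recompute the FOM for each."""
        rng = random.Random(seed)
        samples = []
        for _ in range(n_boot):
            pseudo = [max(0.0, rng.gauss(e, math.sqrt(e))) for e in expected]
            samples.append(fom(pseudo, expected))
        return samples

    expected = [100.0, 400.0, 900.0, 400.0, 100.0]  # fitted peak model (hypothetical)
    observed = [110, 385, 930, 410, 95]             # measured channel counts (hypothetical)
    boot = bootstrap_fom(expected)
    observed_fom = fom(observed, expected)
    # Fraction of bootstrap FOMs at least as large as the observed one:
    p_value = sum(b >= observed_fom for b in boot) / len(boot)
    print(0.0 <= p_value <= 1.0)
    ```

    A large p-value here indicates that the observed misfit is unremarkable under the fitted model, which is the kind of goodness-of-fit judgment the criteria in the paper are designed to support.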

  6. Acceleration techniques for dependability simulation. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Barnette, James David

    1995-01-01

    As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.

  7. Quantum mechanics of black holes.

    PubMed

    Witten, Edward

    2012-08-03

    The popular conception of black holes reflects the behavior of the massive black holes found by astronomers and described by classical general relativity. These objects swallow up whatever comes near and emit nothing. Physicists who have tried to understand the behavior of black holes from a quantum mechanical point of view, however, have arrived at quite a different picture. The difference is analogous to the difference between thermodynamics and statistical mechanics. The thermodynamic description is a good approximation for a macroscopic system, but statistical mechanics describes what one will see if one looks more closely.

  8. Watching entangled circular DNA in real time with super-resolution

    NASA Astrophysics Data System (ADS)

    Jee, Ah-Young; Kim, Hyeongju; Granick, Steve

    In this talk, we will show how we unraveled the conformational dynamics of entangled ring-shaped polymers in a network, one of the best-known problems in polymer physics, using deep imaging based on super-resolution fluorescence imaging, stimulated emission depletion (STED) microscopy. Using home-written software, we obtained the statistics of each of hundreds of molecules, mapping out a large statistical distribution. Through inspection we confirmed not only some aspects of the classic understanding of polymers, but found some surprising aspects as well.

  9. Playing-related musculoskeletal disorders among icelandic music students: differences between students playing classical vs rhythmic music.

    PubMed

    Arnason, Kári; Arnason, Arni; Briem, Kristín

    2014-06-01

    Most research studies investigating the prevalence of musculoskeletal disorders affecting musicians and music students have focused on classical music, while less is known about their prevalence in other music genres. The purpose of this study was to document cumulative and point prevalence of playing-related musculoskeletal disorders (PRMD) among music students in Iceland and, specifically, to identify differences between those studying classical vs rhythmic music. We hypothesized that students of classical music would report more frequent and more severe musculoskeletal disorders than students involved in rhythmic music, as classical instruments and composition typically require more demanding, sustained postures during practice and performance. A total of 74 students from two classical music schools (schools A and B) and 1 rhythmic school (school C) participated in the study by answering a questionnaire assessing PRMDs. The results showed that 62% of participants had, at some point in their musical career, suffered a PRMD. The cumulative prevalence was highest in music school A (71.4%) and lowest in music school C (38.9%). A statistically significant difference was identified between the cumulative prevalence of PRMD from schools A and B combined compared to music school C (p=0.019). Over 40% of participants reported a "current PRMD," and a significant difference was identified between the three schools (p=0.011), with the highest point prevalence being registered in music school A (66.6%) and the lowest in music school C (22.2%). The prevalence of PRMDs among Icelandic music students was high. The difference found between students who play classical vs rhythmic music may be explained by different demands of the instruments and composition on playing posture.

  10. Association of presenile cataract with galactose-1-phosphate uridyl transferase gene mutations.

    PubMed

    Nema, Nitin; Kumar, Ravindra; Verma, Abha; Verma, Sonam; Chaturvedi, Kiran

    2017-01-01

    Presenile cataract is commonly idiopathic in origin. However, patients with presenile cataract could have an underlying genetic abnormality of galactose metabolism. We studied the association, if any, between idiopathic presenile cataract and galactose-1-phosphate uridyl transferase (GALT) gene mutation. We selected 50 patients with idiopathic presenile cataract, <45 years of age, and 50 age- and sex-matched controls for the study. Mutations in the GALT gene were determined by polymerase chain reaction restriction fragment length polymorphism. Classical galactosaemia was characterized by Q188R and K285N mutations, whereas Duarte galactosaemia by N314D mutations (Duarte-2: N314D with IVS5-24G>A and Duarte-1: N314D without IVS5-24G>A). The most common mutation observed was the N314D (Duarte) mutation. The frequencies of classical and N314D alleles in patients with presenile cataract (16%) and controls (26%) were not statistically different (p=0.32, OR 0.54, 95% CI 0.20-1.45). Similarly, there was no statistically significant difference in the frequency distribution of Duarte-1 (p=0.77, OR 0.77, 95% CI 0.23-0.24) and Duarte-2 (p=0.44, OR 0.38, 95% CI 0.07-2.03) galactosaemia mutations in patients and controls. Duarte galactosaemia, a milder form of the disease, is more common than classical galactosaemia in the Indian population. Duarte galactosaemia is unlikely to be a causative factor in presenile cataract.
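    The odds-ratio comparison reported above can be sketched with the standard 2x2-table calculation and a Wald-style confidence interval on the log scale. The counts below are back-solved from the reported allele frequencies (16% of 50 patients = 8, 26% of 50 controls = 13) and are an assumption for illustration, not the paper's raw data; with them the sketch reproduces the reported OR of 0.54 (95% CI 0.20-1.45).

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Odds ratio for a 2x2 table [[a, b], [c, d]] with a Wald-style
        95% confidence interval computed on the log scale."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # Hypothetical counts back-solved from the reported percentages:
    # 8 of 50 patients vs 13 of 50 controls carrying a mutant allele.
    or_, lo, hi = odds_ratio_ci(8, 42, 13, 37)
    print(round(or_, 2))  # → 0.54, matching the reported odds ratio
    ```

    Because the interval straddles 1, the difference between groups is not statistically significant, which is exactly the conclusion drawn in the abstract.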

  11. Deconstructing multivariate decoding for the study of brain function.

    PubMed

    Hebart, Martin N; Baker, Chris I

    2017-08-04

    Multivariate decoding methods were developed originally as tools to enable accurate predictions in real-world applications. The realization that these methods can also be employed to study brain function has led to their widespread adoption in the neurosciences. However, prior to the rise of multivariate decoding, the study of brain function was firmly embedded in a statistical philosophy grounded on univariate methods of data analysis. In this way, multivariate decoding for brain interpretation grew out of two established frameworks: multivariate decoding for predictions in real-world applications, and classical univariate analysis based on the study and interpretation of brain activation. We argue that this led to two confusions, one reflecting a mixture of multivariate decoding for prediction or interpretation, and the other a mixture of the conceptual and statistical philosophies underlying multivariate decoding and classical univariate analysis. Here we attempt to systematically disambiguate multivariate decoding for the study of brain function from the frameworks it grew out of. After elaborating these confusions and their consequences, we describe six, often unappreciated, differences between classical univariate analysis and multivariate decoding. We then focus on how the common interpretation of what is signal and noise changes in multivariate decoding. Finally, we use four examples to illustrate where these confusions may impact the interpretation of neuroimaging data. We conclude with a discussion of potential strategies to help resolve these confusions in interpreting multivariate decoding results, including the potential departure from multivariate decoding methods for the study of brain function. Copyright © 2017. Published by Elsevier Inc.

  12. Comparative spectral analysis of veterinary powder product by continuous wavelet and derivative transforms

    NASA Astrophysics Data System (ADS)

    Dinç, Erdal; Kanbur, Murat; Baleanu, Dumitru

    2007-10-01

    Comparative simultaneous determination of chlortetracycline and benzocaine in a commercial veterinary powder product was carried out by continuous wavelet transform (CWT) and classical derivative transform (classical derivative spectrophotometry). In this quantitative spectral analysis, the two proposed analytical methods do not require any chemical separation process. In the first step, several wavelet families were tested to find an optimal CWT for the overlapping signal processing of the analyzed compounds. Subsequently, we observed that the coiflets (COIF-CWT) method with dilation parameter a = 400 gives suitable results for this analytical application. For comparison, the classical derivative spectrophotometry (CDS) approach was also applied to the simultaneous quantitative resolution of the same analytical problem. Calibration functions were obtained by measuring the transform amplitudes corresponding to zero-crossing points for both the CWT and CDS methods. The utility of these two analytical approaches was verified by analyzing various synthetic mixtures of chlortetracycline and benzocaine, and they were applied to real samples of the veterinary powder formulation. The experimental results obtained from the COIF-CWT approach were statistically compared with those obtained by classical derivative spectrophotometry, and successful results were reported.

  13. A Study of Two Instructional Sequences Informed by Alternative Learning Progressions in Genetics

    NASA Astrophysics Data System (ADS)

    Duncan, Ravit Golan; Choi, Jinnie; Castro-Faix, Moraima; Cavera, Veronica L.

    2017-12-01

    Learning progressions (LPs) are hypothetical models of how learning in a domain develops over time with appropriate instruction. In the domain of genetics, there are two independently developed alternative LPs. The main difference between the two progressions hinges on their assumptions regarding the accessibility of classical (Mendelian) versus molecular genetics and the order in which they should be taught. In order to determine the relative difficulty of the different genetic ideas included in the two progressions, and to test which one is a better fit with students' actual learning, we developed two modules in classical and molecular genetics and alternated their sequence in an implementation study with 11th grade students studying biology. We developed a set of 56 ordered multiple-choice items that collectively assessed both molecular and classical genetic ideas. We found significant gains in students' learning in both molecular and classical genetics, with the largest gain relating to understanding the informational content of genes and the smallest gain in understanding modes of inheritance. Using multidimensional item response modeling, we found no statistically significant differences between the two instructional sequences. However, there was a trend of slightly higher gains for the molecular-first sequence for all genetic ideas.

  14. Potentials of Mean Force With Ab Initio Mixed Hamiltonian Models of Solvation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dupuis, Michel; Schenter, Gregory K.; Garrett, Bruce C.

    2003-08-01

    We give an account of a computationally tractable and efficient procedure for the calculation of potentials of mean force using mixed Hamiltonian models of electronic structure where quantum subsystems are described with computationally intensive ab initio wavefunctions. The mixed Hamiltonian is mapped into an all-classical Hamiltonian that is amenable to a thermodynamic perturbation treatment for the calculation of free energies. A small number of statistically uncorrelated (solute-solvent) configurations are selected from the Monte Carlo random walk generated with the all-classical Hamiltonian approximation. Those are used in the averaging of the free energy using the mixed quantum/classical Hamiltonian. The methodology is illustrated for the micro-solvated SN2 substitution reaction of methyl chloride by hydroxide. We also compare the potential of mean force calculated with the above protocol with an approximate formalism, one in which the potential of mean force calculated with the all-classical Hamiltonian is simply added to the energy of the isolated (non-solvated) solute along the reaction path. Interestingly the latter approach is found to be in semi-quantitative agreement with the full mixed Hamiltonian approximation.

  15. Rage against the Machine: Evaluation Metrics in the 21st Century

    ERIC Educational Resources Information Center

    Yang, Charles

    2017-01-01

    I review the classic literature in generative grammar and Marr's three-level program for cognitive science to defend the Evaluation Metric as a psychological theory of language learning. Focusing on well-established facts of language variation, change, and use, I argue that optimal statistical principles embodied in Bayesian inference models are…

  16. Comparing and Contrasting Neural Net Solutions to Classical Statistical Solutions.

    ERIC Educational Resources Information Center

    Van Nelson, C.; Neff, Kathryn J.

    Data from two studies in which subjects were classified as successful or unsuccessful were analyzed using neural net technology after being analyzed with a linear regression function. Data were obtained from admission records of 201 students admitted to undergraduate and 285 students admitted to graduate programs. Data included grade point…

  17. Turbulent Chemically Reacting Flows According to a Kinetic Theory. Ph.D. Thesis; [statistical analysis/gas flow

    NASA Technical Reports Server (NTRS)

    Hong, Z. C.

    1975-01-01

    A review of various methods of calculating turbulent chemically reacting flow such as the Green Function, Navier-Stokes equation, and others is presented. Nonequilibrium degrees of freedom were employed to study the mixing behavior of a multiscale turbulence field. Classical and modern theories are discussed.

  18. Performance Evaluation Methods for Army Finance and Accounting Offices.

    DTIC Science & Technology

    1981-12-01

    FINOPS and FINES. FINOPS provides data through command channels to USAFAC, which is the basis for management to ascertain the overall performance of... It should be emphasized that these tests do not constitute a classical statistical, controlled experiment to

  19. A Study of Statistics through Tootsie Pops

    ERIC Educational Resources Information Center

    Aaberg, Shelby; Vitosh, Jason; Smith, Wendy

    2016-01-01

    A classic TV commercial once asked, "How many licks does it take to get to the center of a Tootsie Roll Tootsie Pop?" The narrator claims, "The world may never know" (Tootsie Roll 2012), but an Internet search returns a multitude of answers, some of which include rigorous systematic approaches by academics to address the…

  20. Assessing the Accuracy and Consistency of Language Proficiency Classification under Competing Measurement Models

    ERIC Educational Resources Information Center

    Zhang, Bo

    2010-01-01

    This article investigates how measurement models and statistical procedures can be applied to estimate the accuracy of proficiency classification in language testing. The paper starts with a concise introduction of four measurement models: the classical test theory (CTT) model, the dichotomous item response theory (IRT) model, the testlet response…

  1. Some Research Orientations for Research in Social Studies Education. [Draft].

    ERIC Educational Resources Information Center

    van Manen, M. J. Max

    The need for a different conception of research from the classical statistical approach to theory development in social studies teaching is addressed in this paper. In a schema of dominant orientations of social theory, the outstanding epistemological features of the three main schools of contemporary metascience are outlined. Three systems of…

  2. The Public/Private Divide in Higher Education: A Global Revision

    ERIC Educational Resources Information Center

    Marginson, Simon

    2007-01-01

    Our common understandings of the public/private distinction in higher education are drawn from neo-classical economics and/or statist political philosophy. However, the development of competition and markets at the national level, and the new potentials for private and public goods created by globalisation in higher education, have exposed…

  3. Modeling Conditional Probabilities in Complex Educational Assessments. CSE Technical Report.

    ERIC Educational Resources Information Center

    Mislevy, Robert J.; Almond, Russell; Dibello, Lou; Jenkins, Frank; Steinberg, Linda; Yan, Duanli; Senturk, Deniz

    An active area in psychometric research is coordinated task design and statistical analysis built around cognitive models. Compared with classical test theory and item response theory, there is often less information from observed data about the measurement-model parameters. On the other hand, there is more information from the grounding…

  4. Conquering the Physics GRE

    NASA Astrophysics Data System (ADS)

    Kahn, Yoni; Anderson, Adam

    2018-03-01

    Preface; How to use this book; Resources; 1. Classical mechanics; 2. Electricity and magnetism; 3. Optics and waves; 4. Thermodynamics and statistical mechanics; 5. Quantum mechanics and atomic physics; 6. Special relativity; 7. Laboratory methods; 8. Specialized topics; 9. Special tips and tricks for the Physics GRE; Sample exams and solutions; References; Equation index; Subject index; Problems index.

  5. Child and Family: Demographic Developments in the OECD Countries.

    ERIC Educational Resources Information Center

    Le Bras, Herve

    This study of early childhood and the family in member countries of the Organisation for Economic Co-Operation and Development (OECD) employs two statistical approaches to the problem of providing an accurate picture of modern conditions of family life. A classical demographic approach to population studies is initially used, then is critiqued,…

  6. The Information Function for the One-Parameter Logistic Model: Is it Reliability?

    ERIC Educational Resources Information Center

    Doran, Harold C.

    2005-01-01

    The information function is an important statistic in item response theory (IRT) applications. Although the information function is often described as the IRT version of reliability, it differs from the classical notion of reliability from a critical perspective: replication. This article first explores the information function for the…

  7. [Comparison of efficacy of heel ulcer prevention between classic padded bandage and polyurethane heel in a medium-stay hospital: randomized controlled trial].

    PubMed

    Ferrer Solà, Marta; Espaulella Panicot, Joan; Altimires Roset, Jacint; Ylla-Català Borè, Elisenda; Moreno Susi, María

    2013-01-01

    The aim of the study was to determine the incidence of heel pressure ulcers (UPPT) and to compare two systems for UPPT prevention: the classic padded bandage and the polyurethane heel. Prospective intervention study in a medium- to long-stay hospital of all admitted patients who had no UPPT but were at risk of UPPT according to the Braden scale or clinical judgment. The patients were randomized to prevention with the classic padded bandage or the polyurethane heel. The outcome variable was the incidence of UPPT for each study group, which was recorded every 15 days or when there were clinical changes. Of the 940 patients evaluated, 409, with a mean age of 80.5 years and 59.1% women, were included in the study. Of these, 78% had a Barthel score ≤30; 28.6% dementia; 37.6% delirium; 27.6% diabetes; and 19.6% other pressure ulcers. The overall incidence of UPPT was 2.9%: 2.49% in the classic padded bandage group and 3.37% in the polyurethane heel group (p=0.82). No statistically significant differences were observed between the group with the classical dressing and the group with the polyurethane heel dressing. The use of multiple preventive measures achieved a low incidence of UPPT. Copyright © 2011 SEGG. Published by Elsevier Espana. All rights reserved.

  8. Responses of calcification of massive and encrusting corals to past, present, and near-future ocean carbon dioxide concentrations.

    PubMed

    Iguchi, Akira; Kumagai, Naoki H; Nakamura, Takashi; Suzuki, Atsushi; Sakai, Kazuhiko; Nojiri, Yukihiro

    2014-12-15

    In this study, we report the impact of acidification mimicking the pre-industrial, present, and near-future oceans on calcification of two coral species (Porites australiensis, Isopora palifera), using a precise pCO2 control system that can produce acidified seawater at stable pCO2 values with low variations. In the analyses, we applied Bayesian modeling approaches incorporating the variations of pCO2 and compared the results between our modeling approach and a classical statistical one. The results showed the highest calcification rates at the pre-industrial pCO2 level and gradual decreases of calcification at the near-future ocean acidification level, which suggests that ongoing and near-future ocean acidification would negatively impact coral calcification. In addition, the variations of the carbon chemistry parameters may affect the inference of the best model of calcification responses to these parameters, between the Bayesian modeling approach and the classical statistical one, even under stable pCO2 values with low variations. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. A pedestrian approach to the measurement problem in quantum mechanics

    NASA Astrophysics Data System (ADS)

    Boughn, Stephen; Reginatto, Marcel

    2013-09-01

    The quantum theory of measurement has been a matter of debate for over eighty years. Most of the discussion has focused on theoretical issues with the consequence that other aspects (such as the operational prescriptions that are an integral part of experimental physics) have been largely ignored. This has undoubtedly exacerbated attempts to find a solution to the "measurement problem". How the measurement problem is defined depends to some extent on how the theoretical concepts introduced by the theory are interpreted. In this paper, we fully embrace the minimalist statistical (ensemble) interpretation of quantum mechanics espoused by Einstein, Ballentine, and others. According to this interpretation, the quantum state description applies only to a statistical ensemble of similarly prepared systems rather than representing an individual system. Thus, the statistical interpretation obviates the need to entertain reduction of the state vector, one of the primary dilemmas of the measurement problem. The other major aspect of the measurement problem, the necessity of describing measurements in terms of classical concepts that lay outside of quantum theory, remains. A consistent formalism for interacting quantum and classical systems, like the one based on ensembles on configuration space that we refer to in this paper, might seem to eliminate this facet of the measurement problem; however, we argue that the ultimate interface with experiments is described by operational prescriptions and not in terms of the concepts of classical theory. There is no doubt that attempts to address the measurement problem have yielded important advances in fundamental physics; however, it is also very clear that the measurement problem is still far from being resolved. The pedestrian approach presented here suggests that this state of affairs is in part the result of searching for a theoretical/mathematical solution to what is fundamentally an experimental/observational question. It suggests also that the measurement problem is, in some sense, ill-posed and might never be resolved. This point of view is tenable so long as one is willing to view physical theories as providing models of nature rather than complete descriptions of reality. Among other things, these considerations lead us to suggest that the Copenhagen interpretation's insistence on the classicality of the measurement apparatus should be replaced by the requirement that a measurement, which is specified operationally, should simply be of sufficient precision.

  10. New insights into faster computation of uncertainties

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Atreyee

    2012-11-01

    Heavy computation power, lengthy simulations, and an exhaustive number of model runs—often these seem like the only statistical tools that scientists have at their disposal when computing uncertainties associated with predictions, particularly in cases of environmental processes such as groundwater movement. However, calculation of uncertainties need not be as lengthy, a new study shows. Comparing two approaches—the classical Bayesian “credible interval” and a less commonly used regression-based “confidence interval” method—Lu et al. show that for many practical purposes both methods provide similar estimates of uncertainties. The advantage of the regression method is that it demands 10-1000 model runs, whereas the classical Bayesian approach requires 10,000 to millions of model runs.

  11. Super-resolution from single photon emission: toward biological application

    NASA Astrophysics Data System (ADS)

    Moreva, E.; Traina, P.; Forneris, J.; Ditalia Tchernij, S.; Guarina, L.; Franchino, C.; Picollo, F.; Ruo Berchera, I.; Brida, G.; Degiovanni, I. P.; Carabelli, V.; Olivero, P.; Genovese, M.

    2017-08-01

    Properties of quantum light represent a tool for overcoming limits of classical optics. Several experiments have demonstrated this advantage ranging from quantum enhanced imaging to quantum illumination. In this work, experimental demonstration of quantum-enhanced resolution in confocal fluorescence microscopy will be presented. This is achieved by exploiting the non-classical photon statistics of fluorescence emission of single nitrogen-vacancy (NV) color centers in diamond. By developing a general model of super-resolution based on the direct sampling of the kth-order autocorrelation function of the photoluminescence signal, we show the possibility to resolve, in principle, arbitrarily close emitting centers. Finally, possible applications of NV-based fluorescent nanodiamonds in biosensing and future developments will be presented.

  12. Exact Extremal Statistics in the Classical 1D Coulomb Gas

    NASA Astrophysics Data System (ADS)

    Dhar, Abhishek; Kundu, Anupam; Majumdar, Satya N.; Sabhapandit, Sanjib; Schehr, Grégory

    2017-08-01

    We consider a one-dimensional classical Coulomb gas of N like charges in a harmonic potential—also known as the one-dimensional one-component plasma. We compute, analytically, the probability distribution of the position xmax of the rightmost charge in the limit of large N. We show that the typical fluctuations of xmax around its mean are described by a nontrivial scaling function, with asymmetric tails. This distribution is different from the Tracy-Widom distribution of xmax for Dyson's log gas. We also compute the large deviation functions of xmax explicitly and show that the system exhibits a third-order phase transition, as in the log gas. Our theoretical predictions are verified numerically.

  13. Conservative classical and quantum resolution limits for incoherent imaging

    NASA Astrophysics Data System (ADS)

    Tsang, Mankei

    2018-06-01

    I propose classical and quantum limits to the statistical resolution of two incoherent optical point sources from the perspective of minimax parameter estimation. Unlike earlier results based on the Cramér-Rao bound (CRB), the limits proposed here, based on the worst-case error criterion and a Bayesian version of the CRB, are valid for any biased or unbiased estimator and obey photon-number scalings that are consistent with the behaviours of actual estimators. These results prove that, from the minimax perspective, the spatial-mode demultiplexing measurement scheme recently proposed by Tsang, Nair, and Lu [Phys. Rev. X 6, 031033 (2016)] remains superior to direct imaging for sufficiently high photon numbers.

  14. Mean-field approximation for spacing distribution functions in classical systems

    NASA Astrophysics Data System (ADS)

    González, Diego Luis; Pimpinelli, Alberto; Einstein, T. L.

    2012-01-01

    We propose a mean-field method to calculate approximately the spacing distribution functions p(n)(s) in one-dimensional classical many-particle systems. We compare our method with two other commonly used methods, the independent interval approximation and the extended Wigner surmise. In our mean-field approach, p(n)(s) is calculated from a set of Langevin equations, which are decoupled by using a mean-field approximation. We find that in spite of its simplicity, the mean-field approximation provides good results in several systems. We offer many examples illustrating that the three previously mentioned methods give a reasonable description of the statistical behavior of the system. The physical interpretation of each method is also discussed.
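    The nearest-neighbor spacing distribution p(0)(s) discussed above can be estimated numerically in the simplest reference case, uncorrelated (Poisson) points on a line, where the exact rescaled result is p(s) = exp(-s). This is a hedged illustration of the statistic itself, not of the paper's mean-field Langevin method.

    ```python
    import math
    import random

    def rescaled_spacings(n_points=200000, seed=1):
        """Nearest-neighbour spacings of uniform (Poisson) points on a line,
        rescaled to unit mean; for this case p(s) = exp(-s) exactly."""
        rng = random.Random(seed)
        pts = sorted(rng.random() for _ in range(n_points))
        gaps = [b - a for a, b in zip(pts, pts[1:])]
        mean = sum(gaps) / len(gaps)
        return [g / mean for g in gaps]

    s = rescaled_spacings()
    # Empirical survival probability P(s > 1) versus the exact value e^-1:
    frac = sum(1 for g in s if g > 1.0) / len(s)
    print(abs(frac - math.exp(-1)) < 0.01)
    ```

    Interacting systems such as those treated in the paper deviate from this exponential form (e.g. level repulsion suppresses small spacings), which is what the mean-field approximation is designed to capture.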

  15. Performance of quantum annealing on random Ising problems implemented using the D-Wave Two

    NASA Astrophysics Data System (ADS)

    Wang, Zhihui; Job, Joshua; Rønnow, Troels F.; Troyer, Matthias; Lidar, Daniel A.; USC Collaboration; ETH Collaboration

    2014-03-01

    Detecting a possible speedup of quantum annealing compared to classical algorithms is a pressing task in experimental adiabatic quantum computing. In this talk, we discuss the performance of the D-Wave Two quantum annealing device on Ising spin glass problems. The expected time to solution for the device to solve random instances with up to 503 spins and with specified coupling ranges is evaluated while carefully addressing the issue of statistical errors. We perform a systematic comparison of the expected time to solution between the D-Wave Two and classical stochastic solvers, specifically simulated annealing, and simulated quantum annealing based on quantum Monte Carlo, and discuss the question of speedup.
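    The "expected time to solution" criterion used in such comparisons can be sketched directly: repeat anneals of duration t_run until the ground state has been found at least once with a target probability (conventionally 0.99). This is a minimal illustration of that standard definition, not code from the study, and the numbers below are hypothetical.

```python
import math

def time_to_solution(t_run, p_success, target=0.99):
    """Total annealing time needed so that at least one of the repeated runs
    finds the ground state with probability `target`, given the single-run
    success probability `p_success`."""
    if p_success >= target:
        return t_run
    repetitions = math.log(1 - target) / math.log(1 - p_success)
    return t_run * repetitions

# Hypothetical instance: 20-microsecond anneals succeeding 5% of the time.
tts = time_to_solution(20e-6, 0.05)
print(f"expected time to solution: {tts * 1e3:.2f} ms")
```

    For these numbers the repetition factor is ln(0.01)/ln(0.95) ≈ 89.8, so the total time is dominated by the repeats rather than by a single anneal.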

  16. A simple white noise analysis of neuronal light responses.

    PubMed

    Chichilnisky, E J

    2001-05-01

    A white noise technique is presented for estimating the response properties of spiking visual system neurons. The technique is simple, robust, efficient and well suited to simultaneous recordings from multiple neurons. It provides a complete and easily interpretable model of light responses even for neurons that display a common form of response nonlinearity that precludes classical linear systems analysis. A theoretical justification of the technique is presented that relies only on elementary linear algebra and statistics. Implementation is described with examples. The technique and the underlying model of neural responses are validated using recordings from retinal ganglion cells, and in principle are applicable to other neurons. Advantages and disadvantages of the technique relative to classical approaches are discussed.
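    The white-noise technique described above can be illustrated with a spike-triggered average applied to a simulated linear-nonlinear (LN) neuron. This is a hedged sketch of the general reverse-correlation idea, not the paper's implementation; the filter shape, nonlinearity, and constants are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated LN neuron: linear filter, static nonlinearity, Poisson spiking.
T, K = 50_000, 15                       # stimulus length, filter length
t = np.arange(K)
true_filter = np.exp(-t / 4.0) * np.sin(t / 2.0)   # arbitrary filter shape
true_filter /= np.linalg.norm(true_filter)

stimulus = rng.normal(size=T)           # Gaussian white-noise stimulus

# Each row of X is the K-sample stimulus window feeding one time bin.
X = np.lib.stride_tricks.sliding_window_view(stimulus, K)
generator = X @ true_filter                         # linear stage
rate = 0.1 * np.maximum(generator, 0.0) ** 2        # rectifying nonlinearity
spikes = rng.poisson(rate)                          # spike counts per bin

# Spike-triggered average: spike-count-weighted mean stimulus window.
sta = (spikes @ X) / spikes.sum()
sta /= np.linalg.norm(sta)

print(f"filter recovery |correlation|: {abs(sta @ true_filter):.3f}")
```

    For Gaussian white-noise stimuli the spike-triggered average is proportional to the true filter even through the nonlinearity, which is why the recovered correlation comes out close to 1.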

  17. The uniform quantized electron gas revisited

    NASA Astrophysics Data System (ADS)

    Lomba, Enrique; Høye, Johan S.

    2017-11-01

    In this article we continue and extend our recent work on the correlation energy of the quantized electron gas of uniform density at temperature T=0 . As before, we utilize the methods, properties, and results obtained by means of classical statistical mechanics. These were extended to quantized systems via the Feynman path integral formalism. The latter translates the quantum problem into a classical polymer problem in four dimensions. Again, the well known RPA (random phase approximation) is recovered as a basic result which we then modify and improve upon. Here we analyze the condition of thermodynamic self-consistency. Our numerical calculations exhibit a remarkable agreement with well known results of a standard parameterization of Monte Carlo correlation energies.

  18. Convexity of quantum χ²-divergence.

    PubMed

    Hansen, Frank

    2011-06-21

    The general quantum χ²-divergence has recently been introduced by Temme et al. [Temme K, Kastoryano M, Ruskai M, Wolf M, Verstraete F (2010) J Math Phys 51:122201] and applied to quantum channels (quantum Markov processes). The quantum χ²-divergence is not unique, as opposed to the classical χ²-divergence, but depends on the choice of quantum statistics. It was noticed that the elements in a particular one-parameter family of quantum χ²-divergences are convex functions in the density matrices (ρ,σ), thus mirroring the convexity of the classical χ²(p,q)-divergence in probability distributions (p,q). We prove that any quantum χ²-divergence is a convex function in its two arguments.
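    The classical fact being mirrored here, joint convexity of χ²(p, q) = Σᵢ (pᵢ − qᵢ)²/qᵢ in the pair (p, q), can be probed numerically. A minimal sketch (the distribution size and trial count are arbitrary):

```python
import numpy as np

def chi2(p, q):
    """Classical chi-squared divergence: sum_i (p_i - q_i)^2 / q_i."""
    return float(np.sum((p - q) ** 2 / q))

rng = np.random.default_rng(1)

def random_dist(n):
    w = rng.random(n) + 1e-3   # keep probabilities bounded away from zero
    return w / w.sum()

# Numerically probe joint convexity of chi2 in the pair (p, q).
ok = True
for _ in range(1000):
    p1, q1, p2, q2 = (random_dist(5) for _ in range(4))
    lam = rng.random()
    lhs = chi2(lam * p1 + (1 - lam) * p2, lam * q1 + (1 - lam) * q2)
    rhs = lam * chi2(p1, q1) + (1 - lam) * chi2(p2, q2)
    ok = ok and lhs <= rhs + 1e-12
print("joint convexity held in all trials:", ok)   # → True
```

    The inequality holds in every trial, as expected: the classical χ² is an f-divergence, and all f-divergences are jointly convex.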

  19. Nonequilibrium Precondensation of Classical Waves in Two Dimensions Propagating through Atomic Vapors

    NASA Astrophysics Data System (ADS)

    Šantić, Neven; Fusaro, Adrien; Salem, Sabeur; Garnier, Josselin; Picozzi, Antonio; Kaiser, Robin

    2018-02-01

    The nonlinear Schrödinger equation, used to describe the dynamics of quantum fluids, is known to be valid not only for massive particles but also for the propagation of light in a nonlinear medium, predicting condensation of classical waves. Here we report on the initial evolution of random waves with Gaussian statistics using atomic vapors as an efficient two-dimensional nonlinear medium. Experimental and theoretical analysis of near-field images reveal a phenomenon of nonequilibrium precondensation, characterized by a fast relaxation towards a precondensate fraction of up to 75%. Such precondensation is in contrast to complete thermalization to the Rayleigh-Jeans equilibrium distribution, which would require prohibitively long interaction lengths.

  20. Statistical analysis of 4 types of neck whiplash injuries based on classical meridian theory.

    PubMed

    Chen, Yemeng; Zhao, Yan; Xue, Xiaolin; Li, Hui; Wu, Xiuyan; Zhang, Qunce; Zheng, Xin; Wang, Tianfang

    2015-01-01

    As one component of the Chinese medicine meridian system, the meridian sinew (Jingjin, (see text), tendino-musculo) is specifically described as a basis for acupuncture treatment of the musculoskeletal system because of its dynamic attributes and tender-point correlations. In recent decades, the therapeutic importance of the sinew meridian has been reappraised in clinical application. Based on this theory, the authors have established therapeutic strategies for acupuncture treatment of Whiplash-Associated Disorders (WAD) by categorizing four types of neck symptom presentation. The advantage of this new system is that it makes it much easier for the clinician to find effective acupuncture points. This study attempts to demonstrate the significance of the proposed therapeutic strategies by analyzing data collected in a clinical survey of various WAD cases using non-supervised statistical methods, such as correlation analysis, factor analysis, and cluster analysis. The clinical survey data successfully verified the discrete characteristics of the four neck syndromes, based upon the range of motion (ROM) and tender-point location findings. A summary of the relationships among the symptoms of the four neck syndromes showed statistically significant correlation coefficients (P < 0.01 or P < 0.05), especially with regard to ROM. Furthermore, factor and cluster analyses yielded a total of 11 categories of general symptoms, which implies that the syndrome factors are more related to the Liver, as originally described in classical theory. The hypothesis of meridian sinew syndromes in WAD is clearly supported by the statistical analysis of the clinical data. This new finding should be beneficial in improving therapeutic outcomes.

  1. Scaling and stochastic cascade properties of NEMO oceanic simulations and their potential value for GCM evaluation and downscaling

    NASA Astrophysics Data System (ADS)

    Verrier, Sébastien; Crépon, Michel; Thiria, Sylvie

    2014-09-01

    Spectral scaling properties have already been evidenced in oceanic numerical simulations and have been subject to several interpretations. They can be used to evaluate classical turbulence theories that predict scaling with specific exponents, and to evaluate the quality of GCM outputs from a statistical and multiscale point of view. However, a more complete framework based on multifractal cascades generalizes the classical but restrictive second-order spectral framework to other moment orders, providing an accurate description of the probability distributions of the fields at multiple scales. The predictions of this formalism still needed systematic verification in oceanic GCMs, although they have recently been confirmed for their atmospheric counterparts in several papers. The present paper is devoted to a systematic analysis of several oceanic fields produced by the NEMO oceanic GCM. Attention is focused on regional, idealized configurations that make it possible to evaluate the NEMO engine core from a scaling point of view, free of the limitations introduced by land masks. Based on classical multifractal analysis tools, multifractal properties were evidenced for several oceanic state variables (sea surface temperature and salinity, velocity components, etc.). While first-order structure functions estimated a different nonconservativity parameter H in two scaling ranges, the multiorder statistics of turbulent fluxes were scaling over almost the whole available range. This multifractal scaling was then parameterized within the universal multifractal framework, providing parameters that are coherent with the existing empirical literature. Finally, we argue that knowledge of these properties may be useful for oceanographers. The framework seems well suited for the statistical evaluation of OGCM outputs, and it also provides practical solutions for stochastically simulating subpixel variability for GCM downscaling purposes. As an independent perspective, the existence of multifractal properties in oceanic flows also seems interesting for investigating scale dependencies in remote sensing inversion algorithms.

  2. New robust statistical procedures for the polytomous logistic regression models.

    PubMed

    Castilla, Elena; Ghosh, Abhik; Martin, Nirian; Pardo, Leandro

    2018-05-17

    This article derives a new family of estimators, namely the minimum density power divergence estimators, as a robust generalization of the maximum likelihood estimator for the polytomous logistic regression model. Based on these estimators, a family of Wald-type test statistics for linear hypotheses is introduced. Robustness properties of both the proposed estimators and the test statistics are studied theoretically through classical influence-function analysis. Appropriate real-life examples are presented to justify the need for suitable robust statistical procedures in place of likelihood-based inference for the polytomous logistic regression model. The validity of the theoretical results established in the article is further confirmed empirically through suitable simulation studies. Finally, an approach for the data-driven selection of the robustness tuning parameter is proposed with empirical justification. © 2018, The International Biometric Society.
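    The minimum density power divergence idea can be illustrated in a far simpler setting than the polytomous logistic regression model of the article: a normal location model with known scale. In this sketch (the contamination level, the tuning parameter α = 0.5, and the grid search are all arbitrary illustrative choices) the MDPDE stays near the true mean while the MLE, the sample mean, is dragged by outliers:

```python
import numpy as np

rng = np.random.default_rng(2)

# 90% clean N(0, 1) data plus 10% gross outliers at 10.
data = np.concatenate([rng.normal(0.0, 1.0, 180), np.full(20, 10.0)])

alpha = 0.5   # robustness tuning parameter (larger = more robust, less efficient)

def dpd_objective(mu):
    # Density power divergence objective for a N(mu, 1) model:
    #   integral of f^(1+alpha)  -  (1 + 1/alpha) * mean of f(x_i)^alpha
    f = np.exp(-0.5 * (data - mu) ** 2) / np.sqrt(2 * np.pi)
    integral = (2 * np.pi) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return integral - (1 + 1 / alpha) * np.mean(f ** alpha)

grid = np.linspace(-2.0, 12.0, 2801)
mdpde = grid[np.argmin([dpd_objective(m) for m in grid])]

print(f"sample mean (MLE):  {data.mean():.2f}")   # dragged toward the outliers
print(f"MDPDE (alpha=0.5):  {mdpde:.2f}")         # stays near the true mean 0
```

    The second term downweights observations where the model density is tiny, which is what makes distant outliers nearly invisible to the estimator.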

  3. Null but not void: considerations for hypothesis testing.

    PubMed

    Shaw, Pamela A; Proschan, Michael A

    2013-01-30

    Standard statistical theory teaches us that once the null and alternative hypotheses have been defined for a parameter, the choice of the statistical test is clear. Standard theory does not teach us how to choose the null or alternative hypothesis appropriate to the scientific question of interest. Neither does it tell us that in some cases, depending on which alternatives are realistic, we may want to define our null hypothesis differently. Problems in statistical practice are frequently not as pristinely summarized as the classic theory in our textbooks. In this article, we present examples in statistical hypothesis testing in which seemingly simple choices are in fact rich with nuance that, when given full consideration, make the choice of the right hypothesis test much less straightforward. Published 2012. This article is a US Government work and is in the public domain in the USA.

  4. Demineralization of resin-sealed enamel by soft drinks in a clinically relevant pH cycling model.

    PubMed

    Bartels, Agata A; Evans, Carla A; Viana, Grace; Bedran-Russo, Ana K

    2016-04-01

    To compare the in vitro protective effect of orthodontic sealants on enamel demineralization under a soft-drink-induced erosive challenge. The facial surfaces of bovine incisors were sectioned into 5 mm x 4 mm x 4 mm enamel blocks. Specimens were randomly assigned to three surface protection measures: control (exposed enamel), coating with Transbond XT (unfilled resin primer), or coating with Opal Seal (filled, fluoride-releasing primer). Thermocycling was used to simulate aging. The specimens were pH cycled through an acidic buffer, a test beverage, and a neutral buffer for a total of 7 days. Test beverages included water, Diet Mountain Dew, and Coke Classic. Quantitative light-induced fluorescence (QLF) images were taken at baseline and after aging. Final QLF images were taken to evaluate the demineralization of enamel. Data were analyzed statistically using two-way ANOVA to compare the interaction between enamel surface protection and beverage, as well as one-way ANOVA to compare the surface protection and test beverage levels. A statistically significant interaction was found between the surface protection groups and the test beverage groups (P < 0.05). Statistically significant differences were found among the test beverage groups (P < 0.05) and among the surface protection groups (P < 0.05). Coke Classic penetrated the sealant layer, resulting in high enamel demineralization. Enamel coating with Opal Seal significantly reduced the erosive attack of beverages.

  5. Assessing compositional variability through graphical analysis and Bayesian statistical approaches: case studies on transgenic crops.

    PubMed

    Harrigan, George G; Harrison, Jay M

    2012-01-01

    New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.

  6. Revised standards for statistical evidence.

    PubMed

    Johnson, Valen E

    2013-11-26

    Recent advances in Bayesian hypothesis testing have led to the development of uniformly most powerful Bayesian tests, which represent an objective, default class of Bayesian hypothesis tests that have the same rejection regions as classical significance tests. Based on the correspondence between these two classes of tests, it is possible to equate the size of classical hypothesis tests with evidence thresholds in Bayesian tests, and to equate P values with Bayes factors. An examination of these connections suggests that recent concerns over the lack of reproducibility of scientific studies can be attributed largely to the conduct of significance tests at unjustifiably high levels of significance. To correct this problem, evidence thresholds required for the declaration of a significant finding should be increased to 25-50:1, and to 100-200:1 for the declaration of a highly significant finding. In terms of classical hypothesis tests, these evidence standards mandate the conduct of tests at the 0.005 or 0.001 level of significance.
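    For a one-sided normal-mean (z) test, the correspondence invoked above can be sketched as mapping a significance level α to an evidence threshold γ = exp(z_α²/2), where z_α is the upper-α normal quantile; treat the exact formula here as an assumption distilled from the uniformly-most-powerful-Bayesian-test construction rather than a quote from the paper:

```python
from math import exp
from statistics import NormalDist

# Map classical significance levels to Bayes-factor evidence thresholds via
# gamma = exp(z_alpha^2 / 2) for a one-sided normal-mean (z) test.
for alpha in (0.05, 0.005, 0.001):
    z_alpha = NormalDist().inv_cdf(1 - alpha)
    gamma = exp(z_alpha ** 2 / 2)
    print(f"alpha = {alpha:<6} ->  evidence threshold about {gamma:.0f}:1")
```

    With this mapping, α = 0.005 lands in the 25-50:1 band and α = 0.001 in the 100-200:1 band quoted in the abstract, while the conventional α = 0.05 corresponds to an unimpressive threshold of only a few to one.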

  7. Confidence of compliance: a Bayesian approach for percentile standards.

    PubMed

    McBride, G B; Ellis, J C

    2001-04-01

    Rules for assessing compliance with percentile standards commonly limit the number of exceedances permitted in a batch of samples taken over a defined assessment period. Such rules are commonly developed using classical statistical methods. Results from alternative Bayesian methods are presented (using beta-distributed prior information and a binomial likelihood), resulting in "confidence of compliance" graphs. These allow simple reading of the consumer's risk and the supplier's risks for any proposed rule. The influence of the prior assumptions required by the Bayesian technique on the confidence results is demonstrated, using two reference priors (uniform and Jeffreys') and also using optimistic and pessimistic user-defined priors. All four give less pessimistic results than does the classical technique, because interpreting classical results as "confidence of compliance" actually invokes a Bayesian approach with an extreme prior distribution. Jeffreys' prior is shown to be the most generally appropriate choice of prior distribution. Cost savings can be expected using rules based on this approach.
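    The "confidence of compliance" computation can be sketched with a beta prior and binomial likelihood: the posterior for the exceedance probability is Beta(a + x, b + n − x), and the compliance confidence is its mass below the allowed exceedance rate. The rule, sample size, and crude midpoint integration below are illustrative choices, not the paper's implementation; the default a = b = 0.5 is Jeffreys' prior, which the abstract singles out.

```python
import math

def compliance_confidence(x, n, p0=0.05, a=0.5, b=0.5):
    """P(true exceedance rate <= p0 | x exceedances in n samples),
    under a Beta(a, b) prior and a binomial likelihood.
    Defaults to Jeffreys' prior (a = b = 0.5)."""
    # Posterior is Beta(a + x, b + n - x); integrate its density up to p0.
    aa, bb = a + x, b + n - x
    log_norm = math.lgamma(aa + bb) - math.lgamma(aa) - math.lgamma(bb)
    m = 20000
    total = 0.0
    for i in range(m):
        p = p0 * (i + 0.5) / m   # midpoint rule on [0, p0]
        total += math.exp(log_norm + (aa - 1) * math.log(p)
                          + (bb - 1) * math.log(1 - p))
    return total * p0 / m

# A 95th-percentile standard (p0 = 0.05) assessed with 20 samples:
print(f"0 exceedances: {compliance_confidence(0, 20):.3f}")
print(f"1 exceedance:  {compliance_confidence(1, 20):.3f}")
print(f"3 exceedances: {compliance_confidence(3, 20):.3f}")
```

    Reading such values for each possible exceedance count is exactly what a "confidence of compliance" graph makes simple, for both the consumer's and the supplier's risk.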

  8. Thermal helium clusters at 3.2 Kelvin in classical and semiclassical simulations

    NASA Astrophysics Data System (ADS)

    Schulte, J.

    1993-03-01

    The thermodynamic stability of 4He4-13 at 3.2 K is investigated with the classical Monte Carlo method, with the semiclassical path-integral Monte Carlo (PIMC) method, and with the semiclassical all-order many-body method. In the all-order many-body simulation the dipole-dipole approximation including a short-range correction is used. The resulting stability plots are discussed and related to recent TOF experiments by Stephens and King. It is found that classical Monte Carlo, as expected, cannot resolve the characteristics of the measured mass spectrum. With PIMC, switching on more and more quantum mechanics by raising the number of virtual time steps results in more structure in the stability plot, but this did not lead to sufficient agreement with the TOF experiment. Only the all-order many-body method resolved the characteristic structures of the measured mass spectrum, including magic numbers. The result shows the influence of quantum statistics and quantum mechanics on the stability of small neutral helium clusters.

  9. Wave chaos in the elastic disk.

    PubMed

    Sondergaard, Niels; Tanner, Gregor

    2002-12-01

    The relation between the elastic wave equation for plane, isotropic bodies and an underlying classical ray dynamics is investigated. We study, in particular, the eigenfrequencies of an elastic disk with free boundaries and their connection to periodic rays inside the circular domain. Even though the problem is separable, wave mixing between the shear and pressure component of the wave field at the boundary leads to an effective stochastic part in the ray dynamics. This introduces phenomena typically associated with classical chaos as, for example, an exponential increase in the number of periodic orbits. Classically, the problem can be decomposed into an integrable part and a simple binary Markov process. Similarly, the wave equation can, in the high-frequency limit, be mapped onto a quantum graph. Implications of this result for the level statistics are discussed. Furthermore, a periodic trace formula is derived from the scattering matrix based on the inside-outside duality between eigenmodes and scattering solutions and periodic orbits are identified by Fourier transforming the spectral density.

  10. Parametric resonance in tunable superconducting cavities

    NASA Astrophysics Data System (ADS)

    Wustmann, Waltraut; Shumeiko, Vitaly

    2013-05-01

    We develop a theory of parametric resonance in tunable superconducting cavities. The nonlinearity introduced by the superconducting quantum interference device (SQUID) attached to the cavity and damping due to connection of the cavity to a transmission line are taken into consideration. We study in detail the nonlinear classical dynamics of the cavity field below and above the parametric threshold for the degenerate parametric resonance, featuring regimes of multistability and parametric radiation. We investigate the phase-sensitive amplification of external signals on resonance, as well as amplification of detuned signals, and relate the amplifier performance to that of linear parametric amplifiers. We also discuss applications of the device for dispersive qubit readout. Beyond the classical response of the cavity, we investigate small quantum fluctuations around the amplified classical signals. We evaluate the noise power spectrum both for the internal field in the cavity and the output field. Other quantum-statistical properties of the noise are addressed such as squeezing spectra, second-order coherence, and two-mode entanglement.

  11. A System Computational Model of Implicit Emotional Learning

    PubMed Central

    Puviani, Luca; Rama, Sidita

    2016-01-01

    Nowadays, the experimental study of emotional learning is commonly based on classical conditioning paradigms and models, which have been thoroughly investigated in the last century. Unfortunately, models based on classical conditioning are unable to explain or predict important psychophysiological phenomena, such as the failure of extinction of emotional responses in certain circumstances (for instance, those observed in evaluative conditioning, in post-traumatic stress disorders, and in panic attacks). In this manuscript, starting from the experimental results available in the literature, a computational model of implicit emotional learning based both on prediction-error computation and on statistical inference is developed. The model quantitatively predicts (a) the occurrence of evaluative conditioning, (b) the dynamics and the resistance to extinction of the traumatic emotional responses, and (c) the mathematical relation between classical conditioning and unconditioned-stimulus revaluation. Moreover, we discuss how the derived computational model can lead to the development of new animal models for extinction-resistant emotional reactions and novel methodologies of emotion modulation. PMID:27378898

  13. Optical nonclassicality test based on third-order intensity correlations

    NASA Astrophysics Data System (ADS)

    Rigovacca, L.; Kolthammer, W. S.; Di Franco, C.; Kim, M. S.

    2018-03-01

    We develop a nonclassicality criterion for the interference of three delayed, but otherwise identical, light fields in a three-mode Bell interferometer. We do so by comparing the prediction of quantum mechanics with those of a classical framework in which independent sources emit electric fields with random phases. In particular, we evaluate third-order correlations among output intensities as a function of the delays, and show how the presence of a correlation revival for small delays cannot be explained by the classical model of light. The observation of a revival is thus a nonclassicality signature, which can be achieved only by sources with a photon-number statistics that is highly sub-Poissonian. Our analysis provides strong evidence for the nonclassicality of the experiment discussed by Menssen et al. [Phys. Rev. Lett. 118, 153603 (2017), 10.1103/PhysRevLett.118.153603], and shows how a collective "triad" phase affects the interference of any three or more light fields, irrespective of their quantum or classical character.

  14. A Semiclassical Derivation of the QCD Coupling

    NASA Technical Reports Server (NTRS)

    Batchelor, David

    2009-01-01

    The measured value of the QCD coupling α_s at the energy M_Z0, the variation of α_s as a function of energy in QCD, and classical relativistic dynamics are used to investigate virtual pairs of quarks and antiquarks in vacuum fluctuations. For virtual pairs of bottom quarks and antiquarks, the pair lifetime in the classical model agrees with the lifetime from quantum mechanics to good approximation, and the action integral in the classical model likewise agrees with the action that follows from the Uncertainty Principle. This suggests that the particles might have small de Broglie wavelengths and behave with well-localized, pointlike dynamics. It also permits α_s at the mass energy twice the bottom quark mass to be expressed as a simple fraction, 3/16, accurate to approximately 10%. The model in this paper predicts the measured value of α_s(M_Z0) to be 0.121, in agreement with recent measurements within statistical uncertainties.

  15. Reflections on Quantum Data Hiding

    NASA Astrophysics Data System (ADS)

    Winter, Andreas

    Quantum data hiding, originally invented as a limitation on local operations and classical communication (LOCC) in distinguishing globally orthogonal states, is actually a phenomenon arising generically in statistics whenever a `strong' set of measurements (i.e., decision rules) is compared with a `weak' one. The classical statistical analogue of this would be secret sharing, in which two perfectly distinguishable multi-partite hypotheses appear to be indistinguishable when accessing only a marginal. The quantum versions are richer in that, for example, LOCC allows for state tomography, so the states cannot become perfectly indistinguishable but only nearly so, and hence the question is one of efficiency. I will discuss two concrete examples and associated sets of problems: 1. Gaussian operations and classical computation (GOCC): Not very surprisingly, GOCC cannot distinguish optimally even two coherent states of a single mode. Here we find states, each a mixture of multi-mode coherent states, which are almost perfectly distinguishable by suitable measurements, but when restricted to GOCC, i.e., linear optics and post-processing, the states appear almost identical. The construction is random and relies on coding arguments. Open questions include whether one can give a constructive version of the argument, whether for instance even thermal states can be used, and how efficient the hiding is. 2. Local operations and classical communication (LOCC): It is well known that in a bipartite d x d system, asymptotically log d bits can be hidden. Here we show for the first time, using the calculus of min-entropies, that this is asymptotically optimal. In fact, we get bounds on the data hiding capacity of any preparation system; these are however not always tight. While it is known that data hiding by separable states is possible (i.e., the state preparation can be done by LOCC), it is open whether the optimal information efficiency of (asymptotically) log d bits can be achieved by separable states.

  16. EHME: a new word database for research in Basque language.

    PubMed

    Acha, Joana; Laka, Itziar; Landa, Josu; Salaburu, Pello

    2014-11-14

    This article presents EHME, the frequency dictionary of Basque structure, an online program that enables researchers in psycholinguistics to extract word and nonword stimuli, based on a broad range of statistics concerning the properties of Basque words. The database consists of 22.7 million tokens, and available properties include morphological-structure frequency and word-similarity measures, in addition to the classical indexes: word frequency, orthographic structure, orthographic similarity, bigram and biphone frequency, and syllable-based measures. Measures are indexed at the lemma, morpheme, and word level. We include reliability and validation analyses. The application is freely available, and it enables the user to extract words matching specific statistical criteria, as well as to obtain statistical characteristics for a list of words.
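    Classical indexes such as token frequency per million and character-bigram frequency can be computed directly from a corpus. Below is a toy sketch in which a tiny invented word list stands in for the 22.7-million-token EHME corpus (all words and counts are hypothetical):

```python
from collections import Counter

# Toy stand-in for a real corpus (hypothetical Basque-looking tokens).
corpus = "etxe etxean etxeko mendi mendian itsaso itsasoan etxe mendi etxe".split()

tokens = len(corpus)
word_freq = Counter(corpus)

# Token-weighted character-bigram frequency, as in classical orthographic indexes.
bigram_freq = Counter()
for word in corpus:
    for i in range(len(word) - 1):
        bigram_freq[word[i:i + 2]] += 1

def per_million(count):
    return count / tokens * 1_000_000

print("freq(etxe) per million:", per_million(word_freq["etxe"]))  # → 300000.0
print("most common bigrams:", bigram_freq.most_common(3))
```

    Real word databases add smoothing, lemmatization, and positional bigram counts on top of this, but the counting core is the same.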

  17. Bayesian theories of conditioning in a changing world.

    PubMed

    Courville, Aaron C; Daw, Nathaniel D; Touretzky, David S

    2006-07-01

    The recent flowering of Bayesian approaches invites the re-examination of classic issues in behavior, even in areas as venerable as Pavlovian conditioning. A statistical account can offer a new, principled interpretation of behavior, and previous experiments and theories can inform many unexplored aspects of the Bayesian enterprise. Here we consider one such issue: the finding that surprising events provoke animals to learn faster. We suggest that, in a statistical account of conditioning, surprise signals change, and therefore uncertainty and the need for new learning. We discuss inference in a world that changes and show how experimental results involving surprise can be interpreted from this perspective, and also how, thus understood, these phenomena help constrain statistical theories of animal and human learning.

  18. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    PubMed

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for a binary outcome was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, this likelihood function offers an alternative, indirect way to identify the potential cluster, and the test statistic is the extreme value of the likelihood function. As in Kulldorff's methods, we adopt a Monte Carlo test for the test of significance. Both methods are applied to detecting spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. A simulation on independent benchmark data indicates that the test statistic based on the hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise Kulldorff's statistics are superior.
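    The window likelihood under a hypergeometric null can be sketched directly: with K cases among M people overall, the number of cases in a window of m people follows a hypergeometric distribution under random case labelling, and the most unusual window is the one whose observed count is least probable. This toy version (the data, the candidate windows, and the use of the raw pmf as the scan criterion are illustrative simplifications) omits the Monte Carlo significance step:

```python
import math

def hypergeom_pmf(x, K, m, M):
    """P(X = x) when a window of m people is drawn from M people
    containing K cases in total (the null of random case labelling)."""
    return math.comb(K, x) * math.comb(M - K, m - x) / math.comb(M, m)

def scan(window_cases, window_pops, K, M):
    """Flag the window whose observed case count is least probable
    under the hypergeometric null (raw-pmf criterion, a simplification)."""
    idx = min(range(len(window_cases)),
              key=lambda i: hypergeom_pmf(window_cases[i], K, window_pops[i], M))
    return idx, hypergeom_pmf(window_cases[idx], K, window_pops[idx], M)

# Toy data: three candidate windows of 100 people; window 1 is a cluster.
window_cases = [2, 9, 3]
window_pops = [100, 100, 100]
K, M = 14, 300                 # 14 cases among 300 people overall

idx, p = scan(window_cases, window_pops, K, M)
print("flagged window:", idx, "null probability: %.2e" % p)
```

    In the full method the scan statistic's significance would be calibrated by recomputing it on many random relabellings of the cases, i.e., the Monte Carlo test mentioned in the abstract.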

  19. A Pedagogical Approach to the Boltzmann Factor through Experiments and Simulations

    ERIC Educational Resources Information Center

    Battaglia, O. R.; Bonura, A.; Sperandeo-Mineo, R. M.

    2009-01-01

    The Boltzmann factor is the basis of a huge amount of thermodynamic and statistical physics, both classical and quantum. It governs the behaviour of all systems in nature that are exchanging energy with their environment. To understand why the expression has this specific form involves a deep mathematical analysis, whose flow of logic is hard to…
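    The kind of simulation the paper advocates can be sketched with an Einstein solid: oscillators randomly exchanging energy quanta settle into level populations with the constant neighbouring-level ratio that defines the Boltzmann factor e^(-ΔE/kT). A minimal sketch (system size, quanta count, and step count are arbitrary choices):

```python
import random
from collections import Counter

random.seed(0)

# Einstein solid: N oscillators sharing Q energy quanta, exchanged at random.
N, Q, STEPS = 1000, 3000, 2_000_000
quanta = [Q // N] * N                  # start with 3 quanta per oscillator

for _ in range(STEPS):
    i, j = random.randrange(N), random.randrange(N)
    if quanta[i] > 0:                  # move one quantum from i to j
        quanta[i] -= 1
        quanta[j] += 1

counts = Counter(quanta)
# Boltzmann prediction: P(n) proportional to exp(-n dE/kT), so neighbouring
# level populations should have a constant ratio, here (Q + N)/Q ≈ 1.33.
for n in range(4):
    print(n, round(counts[n] / counts[n + 1], 2))
```

    The exponential form thus emerges from nothing but random energy exchange with a large reservoir, which is the pedagogical point the abstract makes.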

  20. Smoking and Cancers: Case-Robust Analysis of a Classic Data Set

    ERIC Educational Resources Information Center

    Bentler, Peter M.; Satorra, Albert; Yuan, Ke-Hai

    2009-01-01

    A typical structural equation model is intended to reproduce the means, variances, and correlations or covariances among a set of variables based on parameter estimates of a highly restricted model. It is not widely appreciated that the sample statistics being modeled can be quite sensitive to outliers and influential observations, leading to bias…

  1. A Formal Derivation of the Gibbs Entropy for Classical Systems Following the Schrodinger Quantum Mechanical Approach

    ERIC Educational Resources Information Center

    Santillan, M.; Zeron, E. S.; Del Rio-Correa, J. L.

    2008-01-01

    In the traditional statistical mechanics textbooks, the entropy concept is first introduced for the microcanonical ensemble and then extended to the canonical and grand-canonical cases. However, in the authors' experience, this procedure makes it difficult for the student to see the bigger picture and, although quite ingenuous, the subtleness of…

  2. Comparing the Effectiveness of SPSS and EduG Using Different Designs for Generalizability Theory

    ERIC Educational Resources Information Center

    Teker, Gulsen Tasdelen; Guler, Nese; Uyanik, Gulden Kaya

    2015-01-01

    Generalizability theory (G theory) provides a broad conceptual framework for social sciences such as psychology and education, and a comprehensive construct for numerous measurement events by using analysis of variance, a strong statistical method. G theory, as an extension of both classical test theory and analysis of variance, is a model which…

  3. Selected Topics in Experimental Statistics with Army Applications

    DTIC Science & Technology

    1983-12-01

    could employ the sum indicated by ’t pjXij + f, as the model. As is usual and for use in significance tests, we will...were taken from a study and classical example of Bortkiewicz (Ref. 40), which describes the number of deaths from kicks of horses in the Prussian Army

  4. Developing a Test for Assessing Elementary Students' Comprehension of Science Texts

    ERIC Educational Resources Information Center

    Wang, Jing-Ru; Chen, Shin-Feng; Tsay, Reuy-Fen; Chou, Ching-Ting; Lin, Sheau-Wen; Kao, Huey-Lien

    2012-01-01

    This study reports on the process of developing a test to assess students' reading comprehension of scientific materials and on the statistical results of the verification study. A combination of classic test theory and item response theory approaches was used to analyze the assessment data from a verification study. Data analysis indicates the…

  5. Prerequisites for Systems Analysts: Analytic and Management Demands of a New Approach to Educational Administration.

    ERIC Educational Resources Information Center

    Ammentorp, William

    There is much to be gained by using systems analysis in educational administration. Most administrators, presently relying on classical statistical techniques restricted to problems having few variables, should be trained to use more sophisticated tools such as systems analysis. The systems analyst, interested in the basic processes of a group or…

  6. A Model for Post Hoc Evaluation.

    ERIC Educational Resources Information Center

    Theimer, William C., Jr.

    Often a research department in a school system is called on to make an after the fact evaluation of a program or project. Although the department is operating under a handicap, it can still provide some data useful for evaluative purposes. It is suggested that all the classical methods of descriptive statistics be brought into play. The use of…

  7. Spin flip statistics and spin wave interference patterns in Ising ferromagnetic films: A Monte Carlo study.

    PubMed

    Acharyya, Muktish

    2017-07-01

    Spin wave interference is studied in a two-dimensional Ising ferromagnet driven by two coherent spherical magnetic field waves, using Monte Carlo simulation. The spin waves are found to propagate and interfere according to the classic rule for interference patterns generated by two point sources. The interference pattern of the spin waves is observed at one boundary of the lattice, and is detected and studied through spin flip statistics at high and low temperatures. Destructive interference is manifested as a large number of spin flips, and vice versa.
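
    A minimal sketch of spin-flip counting in a Metropolis-simulated 2D Ising ferromagnet (without the coherent driving fields of the paper; lattice size and temperatures are illustrative). It only shows that the spin-flip statistic is strongly temperature dependent:

```python
import numpy as np

def count_spin_flips(L=16, T=2.0, sweeps=100, seed=0):
    """Metropolis dynamics on an L x L Ising ferromagnet (J = 1, periodic
    boundaries); returns the number of accepted spin flips, a crude
    spin-flip statistic probing local activity."""
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(L, L))
    flips = 0
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.integers(0, L, size=2)
            nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2.0 * spins[i, j] * nb   # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                spins[i, j] *= -1
                flips += 1
    return flips
```

    Well above the critical temperature (T_c ≈ 2.27 for the square lattice) flips are frequent; well below it the lattice orders and freezes.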

  8. Degraded Chinese rubbing images thresholding based on local first-order statistics

    NASA Astrophysics Data System (ADS)

    Wang, Fang; Hou, Ling-Ying; Huang, Han

    2017-06-01

    Segmenting Chinese characters from degraded document images is a necessary step in optical character recognition (OCR); however, it is challenging due to the various kinds of noise in such images. In this paper, we present three local first-order statistics methods for adaptive thresholding to segment text and non-text regions of Chinese rubbing images. The segmentation results were evaluated both by visual inspection and numerically. In experiments, the methods obtained better results than classical techniques in the binarization of real Chinese rubbing images and the PHIBD 2012 datasets.
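
    The abstract does not specify the three statistics used; a well-known local first-order-statistics binarization in the same spirit is Niblack's rule, sketched below (window size and k are illustrative, not the paper's parameters):

```python
import numpy as np

def niblack_threshold(img, window=15, k=-0.2):
    """Binarize a grayscale image with Niblack's rule: a pixel is 'text'
    (foreground) if it is darker than mu + k*sigma, where mu and sigma are
    the local first-order statistics (mean and standard deviation) over a
    window centred on the pixel."""
    h, w = img.shape
    r = window // 2
    padded = np.pad(img.astype(float), r, mode='reflect')
    out = np.zeros((h, w), dtype=bool)
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + window, j:j + window]
            mu, sigma = patch.mean(), patch.std()
            out[i, j] = img[i, j] < mu + k * sigma
    return out
```

    On a synthetic "rubbing" (light background with a dark stroke), the rule marks the stroke as foreground and leaves uniform background untouched. Faster implementations compute the local mean and variance with integral images instead of explicit loops.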

  9. Full statistical mode reconstruction of a light field via a photon-number-resolved measurement

    NASA Astrophysics Data System (ADS)

    Burenkov, I. A.; Sharma, A. K.; Gerrits, T.; Harder, G.; Bartley, T. J.; Silberhorn, C.; Goldschmidt, E. A.; Polyakov, S. V.

    2017-05-01

    We present a method to reconstruct the complete statistical mode structure and optical losses of multimode conjugated optical fields using an experimentally measured joint photon-number probability distribution. We demonstrate that this method evaluates classical and nonclassical properties using a single measurement technique and is well suited for quantum mesoscopic state characterization. We obtain a nearly perfect reconstruction of a field comprised of up to ten modes based on a minimal set of assumptions. To show the utility of this method, we use it to reconstruct the mode structure of an unknown bright parametric down-conversion source.

  10. Central Limit Theorems for Linear Statistics of Heavy Tailed Random Matrices

    NASA Astrophysics Data System (ADS)

    Benaych-Georges, Florent; Guionnet, Alice; Male, Camille

    2014-07-01

    We show central limit theorems (CLTs) for the linear statistics of symmetric matrices with independent heavy tailed entries, including entries in the domain of attraction of α-stable laws and entries with moments exploding with the dimension, as in the adjacency matrices of Erdős-Rényi graphs. For the second model, we also prove a central limit theorem for the moments of its empirical eigenvalue distribution. The limit laws are Gaussian but, unlike the case of standard Wigner matrices, the normalization is that of the classical CLT for independent random variables.

  11. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    PubMed

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. A learning styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, the SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology such as SOCR in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention.

  12. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses

    PubMed Central

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2009-01-01

    Technology-based instruction represents a recent pedagogical paradigm rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. A learning styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, the SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology such as SOCR in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention. PMID:19750185

  13. Single, Complete, Probability Spaces Consistent With EPR-Bohm-Bell Experimental Data

    NASA Astrophysics Data System (ADS)

    Avis, David; Fischer, Paul; Hilbert, Astrid; Khrennikov, Andrei

    2009-03-01

    We show that paradoxical consequences of violations of Bell's inequality are induced by the use of an unsuitable probabilistic description for the EPR-Bohm-Bell experiment. The conventional description (due to Bell) is based on a combination of statistical data collected for different settings of polarization beam splitters (PBSs). In fact, such data consist of conditional probabilities which only partially define a probability space. Ignoring this conditioning leads to apparent contradictions in the classical probabilistic model (due to Kolmogorov). We show how to construct a completely consistent probabilistic model by taking into account the probabilities of selecting the settings of the PBSs. Our model both matches the experimental data and is consistent with classical probability theory.

  14. Two-dimensional collective electron magnetotransport, oscillations, and chaos in a semiconductor superlattice

    NASA Astrophysics Data System (ADS)

    Bonilla, L. L.; Carretero, M.; Segura, A.

    2017-12-01

    When quantized, traces of classically chaotic single-particle systems include eigenvalue statistics and scars in eigenfunctions. Since 2001, many theoretical and experimental works have argued that classically chaotic single-electron dynamics influences and controls collective electron transport. For transport in semiconductor superlattices under tilted magnetic and electric fields, these theories rely on a reduction to a one-dimensional self-consistent drift model. A two-dimensional theory based on self-consistent Boltzmann transport does not support the claim that single-electron chaos influences collective transport. This theory agrees with existing experimental evidence of current self-oscillations, predicts spontaneous collective chaos via a period-doubling scenario, and could be tested unambiguously by measuring the electric potential inside the superlattice under a tilted magnetic field.

  15. Mean-field approximation for spacing distribution functions in classical systems.

    PubMed

    González, Diego Luis; Pimpinelli, Alberto; Einstein, T L

    2012-01-01

    We propose a mean-field method to calculate approximately the spacing distribution functions p^{(n)}(s) in one-dimensional classical many-particle systems. We compare our method with two other commonly used methods, the independent interval approximation and the extended Wigner surmise. In our mean-field approach, p^{(n)}(s) is calculated from a set of Langevin equations, which are decoupled by using a mean-field approximation. We find that in spite of its simplicity, the mean-field approximation provides good results in several systems. We offer many examples illustrating that the three previously mentioned methods give a reasonable description of the statistical behavior of the system. The physical interpretation of each method is also discussed. © 2012 American Physical Society

  16. Two-dimensional collective electron magnetotransport, oscillations, and chaos in a semiconductor superlattice.

    PubMed

    Bonilla, L L; Carretero, M; Segura, A

    2017-12-01

    When quantized, traces of classically chaotic single-particle systems include eigenvalue statistics and scars in eigenfunctions. Since 2001, many theoretical and experimental works have argued that classically chaotic single-electron dynamics influences and controls collective electron transport. For transport in semiconductor superlattices under tilted magnetic and electric fields, these theories rely on a reduction to a one-dimensional self-consistent drift model. A two-dimensional theory based on self-consistent Boltzmann transport does not support the claim that single-electron chaos influences collective transport. This theory agrees with existing experimental evidence of current self-oscillations, predicts spontaneous collective chaos via a period-doubling scenario, and could be tested unambiguously by measuring the electric potential inside the superlattice under a tilted magnetic field.

  17. Bacterial turbulence in motion

    NASA Astrophysics Data System (ADS)

    Rusconi, Roberto; Smriga, Steven; Stocker, Roman; Secchi, Eleonora; Buzzaccaro, Stefano; Piazza, Roberto

    2014-11-01

    Dense suspensions of motile bacteria exhibit collective dynamics akin to those observed in classic, high-Reynolds-number turbulence, yet this analogy has remained largely qualitative. Here we present experiments in which a dense suspension of Bacillus subtilis bacteria was driven through narrow microchannels and the velocity statistics of the flowing suspension were accurately quantified with a recently developed velocimetry technique. This revealed a robust intermittency phenomenon, whereby the average velocity profile of the flowing suspension oscillated between a plug-like flow and a parabolic flow. This intermittency is a hallmark of classic turbulence and was associated with the presence of collective structures in the suspension. Furthermore, quantification of the Reynolds stress profile revealed a direct link between the turbulent nature of the suspension and its anomalous viscosity.

  18. Liquid-based cytology improves preoperative diagnostic accuracy of the tall cell variant of papillary thyroid carcinoma.

    PubMed

    Lee, Sung Hak; Jung, Chan Kwon; Bae, Ja Seong; Jung, So Lyung; Choi, Yeong Jin; Kang, Chang Suk

    2014-01-01

    The tall cell variant (TCV) of papillary thyroid carcinoma (PTC) is the most common among the aggressive variants of the disease. We aimed to investigate the clinicopathologic characteristics of TCV, and evaluate the diagnostic efficacy of liquid-based cytology (LBC) in TCV detection compared with conventional smear in thyroid fine needle aspiration (FNA). A total of 266 consecutive patients (220 women and 46 men) with PTC were enrolled. We analyzed tumor characteristics according to histologic growth patterns as classic, classic PTC with tall cell features, and TCV. The cytomorphologic features of these subtypes were investigated according to the preparation methods of conventional smear and LBC. TCV and classic PTC with tall cell features comprised 4.9% and 6.0% of all tumors, respectively, and were significantly associated with older age at presentation, larger tumor size, high frequency of extrathyroid extension, and BRAF mutation in comparison with classic PTC. However, there was no statistically significant difference in clinicopathologic features between TCV and classic PTC with tall cell features. Tall cells were more easily detected by LBC than by conventional smear. The percentage of tall cells identified using LBC was well correlated with three histologic subtypes. Our results demonstrate that TCV is more common than previously recognized in Korea and any PTC containing tall cells may have identical biological behavior regardless of the precise proportions of tall cells. It is possible to make a preoperative diagnosis of TCV using LBC. Copyright © 2013 Wiley Periodicals, Inc.

  19. On the emergence of classical gravity

    NASA Astrophysics Data System (ADS)

    Larjo, Klaus

    In this thesis I will discuss how certain black holes arise as an effective, thermodynamical description from non-singular microstates in string theory. This provides a possible solution to the information paradox, and strengthens the case for treating black holes as thermodynamical objects. I will characterize the data defining a microstate of a black hole in several settings, and demonstrate that most of the data is unmeasurable for a classical observer. I will further show that the data that is measurable is universal for nearly all microstates, making it impossible for a classical observer to distinguish between microstates, thus giving rise to an effective statistical description for the black hole. In the first half of the thesis I will work with two specific systems: the half-BPS sector of N = 4 super Yang-Mills and the conformal field theory corresponding to the D1/D5 system; in both cases the high degree of symmetry present provides great control over potentially intractable computations. For these systems, I will further specify the conditions a quantum mechanical microstate must satisfy in order to have a classical description in terms of a unique metric, and define a 'metric operator' whose eigenstates correspond to classical geometries. In the second half of the thesis I will consider a much broader setting, general N = 1 superconformal quiver gauge theories and their dual gravity theories, and demonstrate that a similar effective description arises also in this setting.

  20. Classicality condition on a system observable in a quantum measurement and a relative-entropy conservation law

    NASA Astrophysics Data System (ADS)

    Kuramochi, Yui; Ueda, Masahito

    2015-03-01

    We consider the information flow on a system observable X corresponding to a positive-operator-valued measure under a quantum measurement process Y described by a completely positive instrument, from the viewpoint of the relative entropy. We establish a sufficient condition for the relative-entropy conservation law, which states that the average decrease in the relative entropy of the system observable X equals the relative entropy of the measurement outcome of Y, i.e., the information gain due to measurement. This sufficient condition is interpreted as an assumption of classicality, in the sense that there exists a sufficient statistic in a joint successive measurement of Y followed by X such that the probability distribution of the statistic coincides with that of a single measurement of X for the premeasurement state. We show that in the case when X is a discrete projection-valued measure and Y is discrete, the classicality condition is equivalent to relative-entropy conservation for arbitrary states. The general theory of relative-entropy conservation is applied to typical quantum measurement models, namely, quantum nondemolition measurement, destructive sharp measurements on two-level systems, photon counting, quantum counting, and homodyne and heterodyne measurements. The examples other than the nondemolition and photon-counting measurements do not satisfy the known Shannon-entropy conservation law proposed by Ban [M. Ban, J. Phys. A: Math. Gen. 32, 1643 (1999), 10.1088/0305-4470/32/9/012], implying that our approach based on the relative entropy is applicable to a wider class of quantum measurements.

  1. Influence of velocity effects on the shape of N2 (and air) broadened H2O lines revisited with classical molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Ngo, N. H.; Tran, H.; Gamache, R. R.; Bermejo, D.; Domenech, J.-L.

    2012-08-01

    The modeling of the shape of H2O lines perturbed by N2 (and air) using the Keilson-Storer (KS) kernel for collision-induced velocity changes is revisited with classical molecular dynamics simulations (CMDS). The latter have been performed for a large number of molecules starting from intermolecular-potential surfaces. Contrary to the assumption made in a previous study [H. Tran, D. Bermejo, J.-L. Domenech, P. Joubert, R. R. Gamache, and J.-M. Hartmann, J. Quant. Spectrosc. Radiat. Transf. 108, 126 (2007)], 10.1016/j.jqsrt.2007.03.009, the results of these CMDS show that the velocity-orientation and -modulus changes statistically occur at the same time scale. This validates the use of a single memory parameter in the Keilson-Storer kernel to describe both the velocity-orientation and -modulus changes. The CMDS results also show that velocity- and rotational state-changing collisions are statistically partially correlated. A partially correlated speed-dependent Keilson-Storer model has thus been used to describe the line-shape. For this, the velocity changes KS kernel parameters have been directly determined from CMDS, while the speed-dependent broadening and shifting coefficients have been calculated with a semi-classical approach. Comparisons between calculated spectra and measurements of several lines of H2O broadened by N2 (and air) in the ν3 and 2ν1 + ν2 + ν3 bands for a wide range of pressure show very satisfactory agreement. The evolution of non-Voigt effects from Doppler to collisional regimes is also presented and discussed.

  2. Estimation of plant sampling uncertainty: an example based on chemical analysis of moss samples.

    PubMed

    Dołęgowska, Sabina

    2016-11-01

    In order to estimate the level of uncertainty arising from sampling, 54 samples (primary and duplicate) of the moss species Pleurozium schreberi (Brid.) Mitt. were collected within three forested areas (Wierna Rzeka, Piaski, Posłowice Range) in the Holy Cross Mountains (south-central Poland). During the fieldwork, each primary sample, composed of 8 to 10 increments (subsamples), was taken over an area of 10 m², whereas duplicate samples were collected in the same way at a distance of 1-2 m. Subsequently, all samples were triple rinsed with deionized water, dried, milled, and digested (8 mL HNO3 (1:1) + 1 mL 30% H2O2) in a closed microwave system Multiwave 3000. The prepared solutions were analyzed twice for Cu, Fe, Mn, and Zn using FAAS and GFAAS techniques. All datasets were checked for normality. For normally distributed elements (Cu from Piaski, Zn from Posłowice, and Fe and Zn from Wierna Rzeka), the sampling uncertainty was computed with (i) classical ANOVA, (ii) classical RANOVA, (iii) modified RANOVA, and (iv) range statistics. For the remaining elements, the sampling uncertainty was calculated with traditional and/or modified RANOVA (if the amount of outliers did not exceed 10%) or classical ANOVA after Box-Cox transformation (if the amount of outliers exceeded 10%). The highest concentrations of all elements were found in moss samples from Piaski, whereas the sampling uncertainty calculated with the different statistical methods ranged from 4.1 to 22%.
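
    Of the four methods listed, range statistics is the simplest to sketch. The following is a generic duplicate-pair estimate (using Hartley's constant d2 = 1.128 for subgroups of size two), not the authors' exact implementation:

```python
import numpy as np

D2 = 1.128  # Hartley's constant for subgroups of size 2

def range_statistics_uncertainty(primary, duplicate):
    """Estimate the combined sampling (plus measurement) standard deviation
    from primary/duplicate sample pairs using range statistics:
    s = mean(|primary - duplicate|) / d2.
    Returns (s, relative uncertainty as a percentage of the grand mean)."""
    primary = np.asarray(primary, dtype=float)
    duplicate = np.asarray(duplicate, dtype=float)
    ranges = np.abs(primary - duplicate)
    s = ranges.mean() / D2
    rel = 100.0 * s / np.concatenate([primary, duplicate]).mean()
    return s, rel
```

    The rule works because, for a pair of values each with standard deviation s, the expected absolute difference is d2·s; averaging the pair ranges and dividing by d2 therefore recovers s.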

  3. Semi-Poisson statistics in quantum chaos.

    PubMed

    García-García, Antonio M; Wang, Jiao

    2006-03-01

    We investigate the quantum properties of a nonrandom Hamiltonian with a steplike singularity. It is shown that the eigenfunctions are multifractals and, in a certain range of parameters, the level statistics is described exactly by semi-Poisson statistics (SP) typical of pseudointegrable systems. It is also shown that our results are universal, namely, they depend exclusively on the presence of the steplike singularity and are not modified by smooth perturbations of the potential or the addition of a magnetic flux. Although the quantum properties of our system are similar to those of a disordered conductor at the Anderson transition, we report important quantitative differences in both the level statistics and the multifractal dimensions controlling the transition. Finally, the study of quantum transport properties suggests that the classical singularity induces quantum anomalous diffusion. We discuss how these findings may be experimentally corroborated by using ultracold atoms techniques.
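
    For reference, the standard nearest-neighbour level-spacing distributions involved (textbook surmises with unit mean spacing, not derived in the paper itself):

```latex
% Nearest-neighbour level-spacing distributions, unit mean spacing:
P_{\mathrm{Poisson}}(s) = e^{-s}
    % integrable systems
P_{\mathrm{semi\text{-}Poisson}}(s) = 4\,s\,e^{-2s}
    % pseudointegrable / critical statistics, as in the abstract
P_{\mathrm{Wigner}}(s) = \frac{\pi s}{2}\,e^{-\pi s^{2}/4}
    % chaotic systems (GOE Wigner surmise)
```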

  4. An astronomer's guide to period searching

    NASA Astrophysics Data System (ADS)

    Schwarzenberg-Czerny, A.

    2003-03-01

    We concentrate on the analysis of unevenly sampled time series, interrupted by periodic gaps, as often encountered in astronomy. While some of our conclusions may appear surprising, all are based on the classical statistical principles of Fisher and his successors. Except for the discussion of resolution issues, it is best for the reader to forget temporarily about Fourier transforms and to concentrate on the problem of fitting a time series with a model curve. According to their statistical content, we divide the issues into several sections: (ii) statistical and numerical aspects of model fitting, (iii) evaluation of fitted models as hypothesis testing, (iv) the role of orthogonal models in signal detection, (v) conditions for the equivalence of periodograms, and (vi) rating sensitivity by test power. An experienced observer working with individual objects would benefit little from a formalized statistical approach. However, we demonstrate the usefulness of this approach in evaluating the performance of periodograms and in the quantitative design of large variability surveys.
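
    The "fitting a time series with a model curve" viewpoint can be sketched as a brute-force least-squares periodogram: fit a single harmonic at each trial frequency and keep the frequency with the smallest residuals. This is a generic sketch on unevenly sampled data, not the author's specific periodograms:

```python
import numpy as np

def ls_period_search(t, y, freqs):
    """For each trial frequency f, fit y ~ a + b sin(2*pi*f*t) + c cos(2*pi*f*t)
    by linear least squares and record the residual sum of squares.
    Returns the frequency minimizing the residuals (the periodogram peak)."""
    best_f, best_rss = None, np.inf
    for f in freqs:
        X = np.column_stack([np.ones_like(t),
                             np.sin(2 * np.pi * f * t),
                             np.cos(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ coef) ** 2)
        if rss < best_rss:
            best_f, best_rss = f, rss
    return best_f
```

    The sin/cos pair absorbs the unknown phase, so the fit is linear in the coefficients; irregular sampling poses no difficulty because no FFT is involved.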

  5. Quantum statistics and squeezing for a microwave-driven interacting magnon system.

    PubMed

    Haghshenasfard, Zahra; Cottam, Michael G

    2017-02-01

    Theoretical studies are reported for the statistical properties of a microwave-driven interacting magnon system. Both the magnetic dipole-dipole and the exchange interactions are included, and the theory is developed for the case of parallel pumping, allowing for the inclusion of the nonlinear processes due to the four-magnon interactions. The method of second quantization is used to transform the total Hamiltonian from spin operators to boson creation and annihilation operators. By using the coherent magnon state representation, we have studied the magnon occupation number and the statistical behavior of the system. In particular, it is shown that the nonlinearities introduced by the parallel pumping field and the four-magnon interactions lead to non-classical quantum statistical properties of the system, such as magnon squeezing. Control of the collapse-and-revival phenomena for the time evolution of the average magnon number, by varying the parallel pumping amplitude and the four-magnon coupling, is also demonstrated.

  6. Bayesian demography 250 years after Bayes

    PubMed Central

    Bijak, Jakub; Bryant, John

    2016-01-01

    Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms. PMID:26902889
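
    The integration of prior information with data described above can be illustrated with the simplest conjugate model, the beta-binomial update; the numbers in the usage note are hypothetical, not from the paper:

```python
def beta_binomial_update(alpha, beta, successes, failures):
    """Conjugate Bayesian update: a Beta(alpha, beta) prior on a proportion,
    combined with binomial data, yields a Beta posterior.
    Returns the posterior parameters and the posterior mean."""
    a_post = alpha + successes
    b_post = beta + failures
    mean = a_post / (a_post + b_post)
    return a_post, b_post, mean
```

    For example, a Beta(2, 8) prior (prior mean 0.2, say from earlier surveys) combined with 30 successes in 100 trials gives a Beta(32, 78) posterior with mean ≈ 0.291, sitting between the prior mean and the frequentist point estimate 0.30, and shrinking towards the data as the sample grows.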

  7. Body composition in patients with classical homocystinuria: body mass relates to homocysteine and choline metabolism.

    PubMed

    Poloni, Soraia; Leistner-Segal, Sandra; Bandeira, Isabel Cristina; D'Almeida, Vânia; de Souza, Carolina Fischinger Moura; Spritzer, Poli Mara; Castro, Kamila; Tonon, Tássia; Nalin, Tatiéle; Imbard, Apolline; Blom, Henk J; Schwartz, Ida V D

    2014-08-10

    Classical homocystinuria is a rare genetic disease caused by cystathionine β-synthase deficiency, resulting in homocysteine accumulation. Growing evidence suggests that reduced fat mass in patients with classical homocystinuria may be associated with alterations in choline and homocysteine pathways. This study aimed to evaluate the body composition of patients with classical homocystinuria, identifying changes in body fat percentage and correlating findings with biochemical markers of homocysteine and choline pathways, lipoprotein levels and bone mineral density (BMD) T-scores. Nine patients with classical homocystinuria were included in the study. Levels of homocysteine, methionine, cysteine, choline, betaine, dimethylglycine and ethanolamine were determined. Body composition was assessed by bioelectrical impedance analysis (BIA) in patients and in 18 controls. Data on the last BMD measurement and lipoprotein profile were obtained from medical records. Of 9 patients, 4 (44%) had a low body fat percentage, but no statistically significant differences were found between patients and controls. Homocysteine and methionine levels were negatively correlated with body mass index (BMI), while cysteine showed a positive correlation with BMI (p<0.05). There was a trend between total choline levels and body fat percentage (r=0.439, p=0.07). HDL cholesterol correlated with choline and ethanolamine levels (r=0.757, p=0.049; r=0.847, p=0.016, respectively), and total cholesterol also correlated with choline levels (r=0.775, p=0.041). There was no association between BMD T-scores and body composition. These results suggest that reduced fat mass is common in patients with classical homocystinuria, and that alterations in homocysteine and choline pathways affect body mass and lipid metabolism. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Principle of minimal work fluctuations.

    PubMed

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^(-βW)⟩ = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, where the work protocol is realizable by an adiabatic process, the classical adiabatic process likewise yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)].
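The convergence issue the abstract describes can be made concrete with a small numerical sketch (Python standard library only; all parameter values below are illustrative, not taken from the paper). For a Gaussian work distribution, the Jarzynski equality holds when the mean work exceeds ΔF by the dissipated part βσ²/2, and the spread of e^(-βW) controls how fast the exponential average converges to e^(-βΔF):

```python
import math
import random

def jarzynski_free_energy(works, beta):
    """Estimate dF from work samples W via <exp(-beta*W)> = exp(-beta*dF)."""
    avg = sum(math.exp(-beta * w) for w in works) / len(works)
    return -math.log(avg) / beta

random.seed(0)
beta, dF, sigma = 1.0, 2.0, 0.5   # illustrative values
# For a Gaussian work distribution, the equality holds when the mean work
# exceeds dF by the dissipated part beta * sigma**2 / 2.
mean_w = dF + beta * sigma**2 / 2
works = [random.gauss(mean_w, sigma) for _ in range(200_000)]
est = jarzynski_free_energy(works, beta)   # should approach dF = 2.0
```

Shrinking sigma, as an adiabatic protocol would, tightens the fluctuations of e^(-βW) and lets far fewer samples reach the same accuracy, which is the practical content of the minimal-fluctuation principle.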

  9. Emotion Recognition From Singing Voices Using Contemporary Commercial Music and Classical Styles.

    PubMed

    Hakanpää, Tua; Waaramaa, Teija; Laukkanen, Anne-Maria

    2018-02-22

    This study examines the recognition of emotion in contemporary commercial music (CCM) and classical styles of singing. This information may be useful in improving the training of interpretation in singing. This is an experimental comparative study. Thirteen singers (11 female, 2 male) with a minimum of 3 years' professional-level singing studies (in CCM or classical technique or both) participated. They sang at three pitches (females: a, e1, a1, males: one octave lower) expressing anger, sadness, joy, tenderness, and a neutral state. Twenty-nine listeners listened to 312 short (0.63- to 4.8-second) voice samples, 135 of which were sung using a classical singing technique and 165 of which were sung in a CCM style. The listeners were asked which emotion they heard. Activity and valence were derived from the chosen emotions. The percentage of correct recognitions out of all the answers in the listening test (N = 9048) was 30.2%. The recognition percentage for the CCM-style singing technique was higher (34.5%) than for the classical-style technique (24.5%). Valence and activation were better perceived than the emotions themselves, and activity was better recognized than valence. A higher pitch was more likely to be perceived as joy or anger, and a lower pitch as sorrow. Both valence and activation were better recognized in the female CCM samples than in the other samples. There are statistically significant differences in the recognition of emotions between classical and CCM styles of singing. Furthermore, in the singing voice, pitch affects the perception of emotions, and valence and activity are more easily recognized than emotions. Copyright © 2018 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  10. Acute Auditory Stimulation with Different Styles of Music Influences Cardiac Autonomic Regulation in Men

    PubMed Central

    da Silva, Sheila Ap. F.; Guida, Heraldo L.; dos Santos Antonio, Ana Marcia; de Abreu, Luiz Carlos; Monteiro, Carlos B. M.; Ferreira, Celso; Ribeiro, Vivian F.; Barnabe, Viviani; Silva, Sidney B.; Fonseca, Fernando L. A.; Adami, Fernando; Petenusso, Marcio; Raimundo, Rodrigo D.; Valenti, Vitor E.

    2014-01-01

    Background: No clear evidence is available in the literature regarding the acute effect of different styles of music on cardiac autonomic control. Objectives: The present study aimed to evaluate the acute effects of classical baroque and heavy metal musical auditory stimulation on Heart Rate Variability (HRV) in healthy men. Patients and Methods: In this study, HRV was analyzed in the time domain (SDNN, RMSSD, NN50, and pNN50) and the frequency domain (LF, HF, and LF/HF) in 12 healthy men. HRV was recorded at seated rest for 10 minutes. Subsequently, the participants were exposed to classical baroque or heavy metal music for five minutes through an earphone at seated rest. After exposure to the first song, they remained at rest for five minutes and were then exposed to classical baroque or heavy metal music again. The music sequence was random for each individual. Standard statistical methods were used for calculation of means and standard deviations, and ANOVA and the Friedman test were used for parametric and non-parametric distributions, respectively. Results: While listening to heavy metal music, SDNN was reduced compared to the baseline (P = 0.023). In addition, the LF index (ms² and nu) was reduced during exposure to both heavy metal and classical baroque musical auditory stimulation compared to the control condition (P = 0.010 and P = 0.048, respectively). However, the HF index (ms²) was reduced only during auditory stimulation with heavy metal music (P = 0.01). The LF/HF ratio, on the other hand, decreased during auditory stimulation with classical baroque music (P = 0.019). Conclusions: Acute auditory stimulation with the selected heavy metal music decreased both sympathetic and parasympathetic modulation of the heart, while exposure to the selected classical baroque music reduced sympathetic regulation of the heart. PMID:25177673
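The time-domain indices named above (SDNN, RMSSD, NN50, pNN50) have standard definitions over a series of RR intervals. A minimal sketch with a made-up RR series, Python standard library only (note that the pNN50 denominator convention varies slightly between implementations; successive-difference count is used here):

```python
import math

def hrv_time_domain(rr_ms):
    """Time-domain HRV indices from a list of RR intervals in milliseconds."""
    n = len(rr_ms)
    mean = sum(rr_ms) / n
    # SDNN: sample standard deviation of all RR intervals
    sdnn = math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / (n - 1))
    diffs = [rr_ms[i + 1] - rr_ms[i] for i in range(n - 1)]
    # RMSSD: root mean square of successive differences
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    # NN50 / pNN50: successive differences exceeding 50 ms
    nn50 = sum(1 for d in diffs if abs(d) > 50)
    pnn50 = 100.0 * nn50 / len(diffs)
    return {"SDNN": sdnn, "RMSSD": rmssd, "NN50": nn50, "pNN50": pnn50}

rr = [800, 810, 790, 870, 820, 815, 805, 900, 810, 800]  # invented series
m = hrv_time_domain(rr)
```

A drop in SDNN, as reported during heavy metal exposure, reflects reduced overall variability; RMSSD and pNN50 track the short-term, parasympathetically mediated component.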

  11. Quantum theory of the classical: quantum jumps, Born's Rule and objective classical reality via quantum Darwinism.

    PubMed

    Zurek, Wojciech Hubert

    2018-07-13

    The emergence of the classical world from the quantum substrate of our Universe is a long-standing conundrum. In this paper, I describe three insights into the transition from quantum to classical that are based on the recognition of the role of the environment. I begin with the derivation of preferred sets of states that help to define what exists: our everyday classical reality. They emerge as a result of the breaking of the unitary symmetry of the Hilbert space, which happens when the unitarity of quantum evolutions encounters nonlinearities inherent in the process of amplification, i.e., of replicating information. This derivation is accomplished without the usual tools of decoherence, and accounts for the appearance of quantum jumps and the emergence of preferred pointer states consistent with those obtained via environment-induced superselection, or einselection. The pointer states obtained in this way determine what can happen, i.e., define events, without appealing to Born's Rule for probabilities. Therefore, p_k = |ψ_k|² can now be deduced from the entanglement-assisted invariance, or envariance, a symmetry of entangled quantum states. With probabilities at hand, one also gains new insights into the foundations of quantum statistical physics. Moreover, one can now analyse the information flows responsible for decoherence. These information flows explain how the perception of objective classical reality arises from the quantum substrate: the effective amplification that they represent accounts for the objective existence of the einselected states of macroscopic quantum systems through the redundancy of pointer state records in their environment, through quantum Darwinism. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).

  12. Overuse Injuries in Professional Ballet

    PubMed Central

    Sobrino, Francisco José; de la Cuadra, Crótida; Guillén, Pedro

    2015-01-01

    Background Despite overuse injuries being previously described as the most frequent in ballet, there are no studies on professional dancers providing the specific clinical diagnoses or type of injury based on the discipline. Hypothesis Overuse injuries are the most frequent injuries in ballet, with differences in the type and frequency of injuries based on discipline. Study Design Cross-sectional study; Level of evidence, 3. Methods This was a descriptive cross-sectional study performed between January 1, 2005, and October 10, 2010, on injuries occurring in professional dancers from leading Spanish dance companies who practiced disciplines such as classical, neoclassical, contemporary, and Spanish ballet. Data, including type of injury, were obtained from specialized medical services at the Trauma Service, Fremap, Madrid, Spain. Results A total of 486 injuries were evaluated, a significant number of which were overuse disorders (P < .0001), especially in the most technically demanding discipline of classical ballet (82.60%). Injuries were more frequent among female dancers (75.90%) and classical ballet (83.60%). A statistically significant prevalence of patellofemoral pain syndrome was found in the classical discipline (P = .007). Injuries of the adductor muscles of the thigh (P = .001) and of the low back facet (P = .02) in the Spanish ballet discipline and lateral snapping hip (P = .02) in classical and Spanish ballet disciplines were significant. Conclusion Overuse injuries were the most frequent injuries among the professional dancers included in this study. The prevalence of injuries was greater for the most technically demanding discipline (classical ballet) as well as for women. Patellofemoral pain syndrome was the most prevalent overuse injury, followed by Achilles tendinopathy, patellar tendinopathy, and mechanical low back pain. 
Clinical Relevance Specific clinical diagnoses and injury-based differences between the disciplines are a key factor in ballet. PMID:26665100

  13. Overuse Injuries in Professional Ballet: Injury-Based Differences Among Ballet Disciplines.

    PubMed

    Sobrino, Francisco José; de la Cuadra, Crótida; Guillén, Pedro

    2015-06-01

    Despite overuse injuries being previously described as the most frequent in ballet, there are no studies on professional dancers providing the specific clinical diagnoses or type of injury based on the discipline. Overuse injuries are the most frequent injuries in ballet, with differences in the type and frequency of injuries based on discipline. Cross-sectional study; Level of evidence, 3. This was a descriptive cross-sectional study performed between January 1, 2005, and October 10, 2010, on injuries occurring in professional dancers from leading Spanish dance companies who practiced disciplines such as classical, neoclassical, contemporary, and Spanish ballet. Data, including type of injury, were obtained from specialized medical services at the Trauma Service, Fremap, Madrid, Spain. A total of 486 injuries were evaluated, a significant number of which were overuse disorders (P < .0001), especially in the most technically demanding discipline of classical ballet (82.60%). Injuries were more frequent among female dancers (75.90%) and classical ballet (83.60%). A statistically significant prevalence of patellofemoral pain syndrome was found in the classical discipline (P = .007). Injuries of the adductor muscles of the thigh (P = .001) and of the low back facet (P = .02) in the Spanish ballet discipline and lateral snapping hip (P = .02) in classical and Spanish ballet disciplines were significant. Overuse injuries were the most frequent injuries among the professional dancers included in this study. The prevalence of injuries was greater for the most technically demanding discipline (classical ballet) as well as for women. Patellofemoral pain syndrome was the most prevalent overuse injury, followed by Achilles tendinopathy, patellar tendinopathy, and mechanical low back pain. Specific clinical diagnoses and injury-based differences between the disciplines are a key factor in ballet.

  14. Whose statistical reasoning is facilitated by a causal structure intervention?

    PubMed

    McNair, Simon; Feeney, Aidan

    2015-02-01

    People often struggle when making Bayesian probabilistic estimates on the basis of competing sources of statistical evidence. Recently, Krynski and Tenenbaum (Journal of Experimental Psychology: General, 136, 430-450, 2007) proposed that a causal Bayesian framework accounts for people's errors in Bayesian reasoning and showed that, by clarifying the causal relations among the pieces of evidence, judgments on a classic statistical reasoning problem could be significantly improved. We aimed to understand whose statistical reasoning is facilitated by the causal structure intervention. In Experiment 1, although we observed causal facilitation effects overall, the effect was confined to participants high in numeracy. We did not find an overall facilitation effect in Experiment 2 but did replicate the earlier interaction between numerical ability and the presence or absence of causal content. This effect held when we controlled for general cognitive ability and thinking disposition. Our results suggest that clarifying causal structure facilitates Bayesian judgments, but only for participants with sufficient understanding of basic concepts in probability and statistics.
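The kind of problem such participants face can be illustrated with the classic diagnosis task often used in this literature (the base rate, hit rate, and false-alarm rate below are standard textbook values, not figures from this study). Bayes' rule gives the normatively correct answer that untrained reasoners typically overestimate:

```python
def bayes_posterior(prior, sensitivity, false_positive_rate):
    """P(hypothesis | positive evidence) via Bayes' rule."""
    # Total probability of a positive result, from true and false positives
    p_pos = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_pos

# Classic diagnosis problem: 1% base rate, 80% hit rate, 9.6% false-alarm rate.
posterior = bayes_posterior(0.01, 0.80, 0.096)   # about 0.078
```

The posterior of roughly 7.8% sits far below the test's 80% hit rate; neglecting the base rate and the false-positive source of evidence is precisely the error that clarified causal structure is claimed to reduce.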

  15. Statistical tests for detecting associations with groups of genetic variants: generalization, evaluation, and implementation

    PubMed Central

    Ferguson, John; Wheeler, William; Fu, YiPing; Prokunina-Olsson, Ludmila; Zhao, Hongyu; Sampson, Joshua

    2013-01-01

    With recent advances in sequencing, genotyping arrays, and imputation, GWAS now aim to identify associations with rare and uncommon genetic variants. Here, we describe and evaluate a class of statistics, generalized score statistics (GSS), that can test for an association between a group of genetic variants and a phenotype. GSS are a simple weighted sum of single-variant statistics and their cross-products. We show that the majority of statistics currently used to detect associations with rare variants are equivalent to choosing a specific set of weights within this framework. We then evaluate the power of various weighting schemes as a function of variant characteristics, such as MAF, the proportion associated with the phenotype, and the direction of effect. Ultimately, we find that two classical tests are robust and powerful, but details are provided as to when other GSS may perform favorably. The software package CRaVe is available at our website (http://dceg.cancer.gov/bb/tools/crave). PMID:23092956
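The GSS framework described above, a weighted sum of single-variant statistics and their cross-products, can be written as the quadratic form Q = zᵀWz. A minimal sketch (Python, illustrative numbers; the real tests also require the null distribution of Q, which is omitted here) showing how burden-style and variance-component-style tests fall out as specific weight choices:

```python
def gss(z, W):
    """Generalized score statistic Q = z' W z: a weighted sum of
    single-variant score statistics z_j and their cross-products."""
    n = len(z)
    return sum(W[i][j] * z[i] * z[j] for i in range(n) for j in range(n))

z = [1.2, -0.4, 2.1]   # per-variant score statistics (illustrative)
w = [1.0, 1.0, 1.0]    # per-variant weights
n = len(w)

# Burden-style test: W = w w^T, so Q = (sum_j w_j z_j)^2
burden_W = [[wi * wj for wj in w] for wi in w]
q_burden = gss(z, burden_W)

# Variance-component (SKAT-like) test: W diagonal, so Q = sum_j w_j^2 z_j^2
diag_W = [[w[i] ** 2 if i == j else 0.0 for j in range(n)] for i in range(n)]
q_vc = gss(z, diag_W)
```

The burden form gains power when most variants act in the same direction, while the diagonal form is robust to mixed effect directions, which matches the paper's finding that two classical tests cover most situations well.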

  16. Discriminating strength: a bona fide measure of non-classical correlations

    NASA Astrophysics Data System (ADS)

    Farace, A.; De Pasquale, A.; Rigovacca, L.; Giovannetti, V.

    2014-07-01

    A new measure of non-classical correlations is introduced and characterized. It tests the ability to use a state ρ of a composite system AB as a probe for a quantum illumination task (e.g. see Lloyd 2008 Science 321 1463), in which one is asked to remotely discriminate between the two following scenarios: (i) either nothing happens to the probe, or (ii) the subsystem A is transformed via a local unitary R_A whose properties are partially unspecified when producing ρ. This new measure can be seen as the discrete version of the recently introduced interferometric power measure (Girolami et al 2013 e-print arXiv:1309.1472) and, at least for the case in which A is a qubit, it is shown to coincide (up to an irrelevant scaling factor) with the local quantum uncertainty measure of Girolami, Tufarelli and Adesso (2013 Phys. Rev. Lett. 110 240402). Analytical expressions are derived which allow us to formally prove that, within the set of separable configurations, the maximum value of our non-classicality measure is achieved over the set of quantum-classical states (i.e. states ρ which admit a statistical unravelling where each element of the associated ensemble is distinguishable via local measurements on B).

  17. [Small infundibulectomy versus ventriculotomy in tetralogy of Fallot].

    PubMed

    Bojórquez-Ramos, Julio César

    2013-01-01

    The surgical correction of tetralogy of Fallot (TOF) is standardized in the way the septal defect is closed, but differs in the way the right ventricular outflow tract (RVOT) is enlarged. The aim was to compare the early postoperative clinical course after RVOT enlargement by the classical ventriculotomy technique versus small infundibulectomy (SI). We analyzed the database of the pediatric heart surgery service from 2008 to 2011. Patients with non-complex TOF undergoing complete correction by classical ventriculotomy or SI were selected. ANOVA, χ² and Fisher's exact tests were applied. The data included 47 patients, 55% (26) male, mean age 43 months (6-172); classical ventriculotomy was performed in 61.7% (29). This group had higher peak lactate levels (9.07 versus 6.8 mmol/L, p = 0.049) and a greater bleeding index per kg in the first 12 hours (39.1 versus 20.3 mL/kg, p = 0.016). Death occurred in 9 cases (31.03%) versus one (5.6%) in the SI group (p = 0.037); complications exclusive to this group, such as acute renal failure, hemopneumothorax, pneumonia, permanent AV block and multiple organ failure, were observed. Morbidity and mortality were higher in the classical ventriculotomy group than with SI, possibly associated with greater blood loss.

  18. Asymptotic modal analysis and statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Dowell, Earl H.

    1992-01-01

    Asymptotic Modal Analysis (AMA) is a method which is used to model linear dynamical systems with many participating modes. The AMA method was originally developed to show the relationship between statistical energy analysis (SEA) and classical modal analysis (CMA). In the limit of a large number of modes of a vibrating system, the classical modal analysis result can be shown to be equivalent to the statistical energy analysis result. As the CMA result evolves into the SEA result, a number of systematic assumptions are made. Most of these assumptions are based upon the supposition that the number of modes approaches infinity. It is for this reason that the term 'asymptotic' is used. AMA is the asymptotic result of taking the limit of CMA as the number of modes approaches infinity. AMA refers to any of the intermediate results between CMA and SEA, as well as the SEA result which is derived from CMA. The main advantage of the AMA method is that individual modal characteristics are not required in the model or computations. By contrast, CMA requires that each modal parameter be evaluated at each frequency. In the latter, contributions from each mode are computed and the final answer is obtained by summing over all the modes in the particular band of interest. AMA evaluates modal parameters only at their center frequency and does not sum the individual contributions from each mode in order to obtain a final result. The method is similar to SEA in this respect. However, SEA is only capable of obtaining spatial averages or means, as it is a statistical method. Since AMA is systematically derived from CMA, it can obtain local spatial information as well.

  19. Rigorous Statistical Bounds in Uncertainty Quantification for One-Layer Turbulent Geophysical Flows

    NASA Astrophysics Data System (ADS)

    Qi, Di; Majda, Andrew J.

    2018-04-01

    Statistical bounds controlling the total fluctuations in mean and variance about a basic steady-state solution are developed for the truncated barotropic flow over topography. Statistical ensemble prediction is an important topic in weather and climate research. Here, the evolution of an ensemble of trajectories is considered using statistical instability analysis and is compared and contrasted with the classical deterministic instability for the growth of perturbations in one pointwise trajectory. The maximum growth of the total statistics in fluctuations is derived relying on the statistical conservation principle of the pseudo-energy. The saturation bound of the statistical mean fluctuation and variance in the unstable regimes with non-positive-definite pseudo-energy is achieved by linking with a class of stable reference states and minimizing the stable statistical energy. Two cases with dependence on initial statistical uncertainty and on external forcing and dissipation are compared and unified under a consistent statistical stability framework. The flow structures and statistical stability bounds are illustrated and verified by numerical simulations among a wide range of dynamical regimes, where subtle transient statistical instability exists in general with positive short-time exponential growth in the covariance even when the pseudo-energy is positive-definite. Among the various scenarios in this paper, there exist strong forward and backward energy exchanges between different scales which are estimated by the rigorous statistical bounds.

  20. surrosurv: An R package for the evaluation of failure time surrogate endpoints in individual patient data meta-analyses of randomized clinical trials.

    PubMed

    Rotolo, Federico; Paoletti, Xavier; Michiels, Stefan

    2018-03-01

    Surrogate endpoints are attractive for use in clinical trials instead of well-established endpoints because of practical convenience. To validate a surrogate endpoint, two important measures can be estimated in a meta-analytic context when individual patient data are available: the R²_indiv or the Kendall's τ at the individual level, and the R²_trial at the trial level. We aimed at providing an R implementation of classical and well-established as well as more recent statistical methods for surrogacy assessment with failure time endpoints. We also intended to incorporate utilities for model checking and visualization, and the data-generating methods described in the literature to date. In the case of failure time endpoints, the classical approach is based on two steps. First, a Kendall's τ is estimated as a measure of individual-level surrogacy using a copula model. Then, the R²_trial is computed via a linear regression of the estimated treatment effects; at this second step, the estimation uncertainty can be accounted for via a measurement-error model or via weights. In addition to the classical approach, we recently developed an approach based on bivariate auxiliary Poisson models with individual random effects to measure the Kendall's τ and treatment-by-trial interactions to measure the R²_trial. The most common data simulation models described in the literature are based on copula models, mixed proportional hazard models, and mixtures of half-normal and exponential random variables. The R package surrosurv implements the classical two-step method with Clayton, Plackett, and Hougaard copulas. It also optionally allows adjusting the second-step linear regression for measurement error. The mixed Poisson approach is implemented with different reduced models in addition to the full model. 
We present the package functions for estimating the surrogacy models, for checking their convergence, for performing leave-one-trial-out cross-validation, and for plotting the results. We illustrate their use in practice on individual patient data from a meta-analysis of 4069 patients with advanced gastric cancer from 20 trials of chemotherapy. The surrosurv package provides an R implementation of classical and recent statistical methods for surrogacy assessment of failure time endpoints. Flexible simulation functions are available to generate data according to the methods described in the literature. Copyright © 2017 Elsevier B.V. All rights reserved.
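The individual-level Kendall's τ mentioned above is, at its core, a count of concordant versus discordant patient pairs. The sketch below (Python standard library, invented endpoint values) computes the uncensored tau-a; the actual surrosurv estimation uses copula models precisely because censored failure times make this naive pairwise count invalid:

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's tau-a: (concordant - discordant) pairs over all pairs,
    a rank-based measure of individual-level association (no tie or
    censoring correction)."""
    n = len(x)
    conc = disc = 0
    for i, j in combinations(range(n), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            conc += 1
        elif s < 0:
            disc += 1
    return (conc - disc) / (n * (n - 1) / 2)

# Invented surrogate / true endpoint times for 6 patients
surrogate = [2.0, 3.5, 1.0, 5.0, 4.2, 2.8]
true_end = [3.0, 4.0, 1.5, 6.5, 5.0, 2.0]
tau = kendall_tau(surrogate, true_end)   # 13/15, strong individual-level agreement
```

A τ near 1 says that patients ranked earlier on the surrogate endpoint are almost always ranked earlier on the true endpoint, which is the individual-level half of the validation criterion; R²_trial addresses the trial-level half.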

  1. Integrative genetic analysis of transcription modules: towards filling the gap between genetic lociand inherited traits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Hongqiang; Chen, Hao; Bao, Lei

    2005-01-01

    Genetic loci that regulate inherited traits are routinely identified using quantitative trait locus (QTL) mapping methods. However, the genotype-phenotype associations do not provide information on the gene expression program through which the genetic loci regulate the traits. Transcription modules are 'self-consistent regulatory units' and are closely related to the modular components of gene regulatory networks [Ihmels, J., Friedlander, G., Bergmann, S., Sarig, O., Ziv, Y. and Barkai, N. (2002) Revealing modular organization in the yeast transcriptional network. Nat. Genet., 31, 370-377; Segal, E., Shapira, M., Regev, A., Pe'er, D., Botstein, D., Koller, D. and Friedman, N. (2003) Module networks: identifying regulatory modules and their condition-specific regulators from gene expression data. Nat. Genet., 34, 166-176]. We used genome-wide genotype and gene expression data of a genetic reference population that consists of mice of 32 recombinant inbred strains to identify the transcription modules and the genetic loci regulating them. Twenty-nine transcription modules defined by genetic variations were identified. Statistically significant associations between the transcription modules and 18 classical physiological and behavioral traits were found. Genome-wide interval mapping showed that major QTLs regulating the transcription modules are often co-localized with the QTLs regulating the associated classical traits. The association and the possible co-regulation of the classical trait and transcription module indicate that the transcription module may be involved in the gene pathways connecting the QTL and the classical trait. Our results show that a transcription module may associate with multiple seemingly unrelated classical traits and a classical trait may associate with different modules. 
Literature mining results provided strong independent evidence for the relations among genes of the transcription modules, genes in the regions of the QTLs regulating the transcription modules, and the keywords representing the classical traits.

  2. Wilson's Disease: a challenge of diagnosis. The 5-year experience of a tertiary centre.

    PubMed

    Gheorghe, Liana; Popescu, Irinel; Iacob, Speranta; Gheorghe, Cristian; Vaidan, Roxana; Constantinescu, Alexandra; Iacob, Razvan; Becheanu, Gabriel; Angelescu, Corina; Diculescu, Mircea

    2004-09-01

    Because molecular diagnosis is considered impractical and no pathognomonic features have been described, diagnosis of Wilson's disease (WD) using clinical and biochemical findings is still challenging. We analysed predictive factors for the diagnosis in 55 patients with WD diagnosed in our centre between 1st January 1999 and 1st April 2004. All patients presented predominant liver disease classified as: 1) asymptomatic, found incidentally, 2) chronic hepatitis or cirrhosis, or 3) fulminant hepatic failure. Diagnosis was considered classic (two out of the three following criteria: 1) serum ceruloplasmin < 20 mg/dl, 2) the presence of Kayser-Fleischer rings, and/or 3) hepatic copper > 250 µg/g dry weight liver tissue) or non-classic (clinical manifestations plus laboratory parameters suggesting impaired copper metabolism). The association between the predictive factors and non-classic diagnosis was assessed based on the level of statistical significance (p value < 0.05) associated with the chi-squared test in contingency tables. Multivariate analysis was performed by logistic regression using SPSS 10. There were 31 males (56.3%) and 24 females (43.7%) with a mean age at diagnosis of 20.92 +/- 9.97 years (4-52 years); 51 patients (92.7%) were younger than 40 years. Asymptomatic WD was diagnosed in 14 patients (25.4%), chronic liver disease due to WD in 29 patients (52.8%) and fulminant hepatic failure in 12 patients (21.8%). The classic diagnosis was made in 32 patients (58.18%). In the univariate analysis the non-classic diagnosis was associated with: age > 18 years (p = 0.03), increased copper excretion (p < 0.0001), Coombs-negative hemolysis (p = 0.03), and absence of neurological manifestations (p < 0.0001). Multivariate analysis identified age over 18 years, increased urinary copper, and isolated hepatic involvement as independent predictors. In clinical practice, WD should also be considered in patients who do not fulfil the classic criteria. 
Independent factors associated with non-classic diagnosis were age over 18 years, increased urinary copper excretion and isolated liver disease.

  3. Multilevel Assessment of the Predictive Validity of Teacher Made Tests in the Zimbabwean Primary Education Sector

    ERIC Educational Resources Information Center

    Machingambi, Zadzisai

    2017-01-01

    The principal focus of this study was to undertake a multilevel assessment of the predictive validity of teacher made tests in the Zimbabwean primary education sector. A correlational research design was adopted for the study, mainly to allow for statistical treatment of data and subsequent classical hypothesis testing using Spearman's rho.…

  4. Surface Impact Simulations of Helium Nanodroplets

    DTIC Science & Technology

    2015-06-30

    mechanical delocalization of the individual helium atoms in the droplet and the quantum statistical effects that accompany the interchange of identical...incorporates the effects of atomic delocalization by treating individual atoms as smeared-out probability distributions that move along classical...probability density distributions to give effective interatomic potential energy curves that have zero-point averaging effects built into them [25]

  5. On the Benefits of Latent Variable Modeling for Norming Scales: The Case of the "Supports Intensity Scale-Children's Version"

    ERIC Educational Resources Information Center

    Seo, Hyojeong; Little, Todd D.; Shogren, Karrie A.; Lang, Kyle M.

    2016-01-01

    Structural equation modeling (SEM) is a powerful and flexible analytic tool to model latent constructs and their relations with observed variables and other constructs. SEM applications offer advantages over classical models in dealing with statistical assumptions and in adjusting for measurement error. So far, however, SEM has not been fully used…

  6. Work distributions of one-dimensional fermions and bosons with dual contact interactions

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Zhang, Jingning; Quan, H. T.

    2018-05-01

    We extend the well-known static duality [M. Girardeau, J. Math. Phys. 1, 516 (1960), 10.1063/1.1703687; T. Cheon and T. Shigehara, Phys. Rev. Lett. 82, 2536 (1999), 10.1103/PhysRevLett.82.2536] between one-dimensional (1D) bosons and 1D fermions to the dynamical version. By utilizing this dynamical duality, we find the duality of nonequilibrium work distributions between interacting 1D bosonic (Lieb-Liniger model) and 1D fermionic (Cheon-Shigehara model) systems with dual contact interactions. As a special case, the work distribution of the Tonks-Girardeau gas is identical to that of 1D noninteracting fermionic system even though their momentum distributions are significantly different. In the classical limit, the work distributions of Lieb-Liniger models (Cheon-Shigehara models) with arbitrary coupling strength converge to that of the 1D noninteracting distinguishable particles, although their elementary excitations (quasiparticles) obey different statistics, e.g., the Bose-Einstein, the Fermi-Dirac, and the fractional statistics. We also present numerical results of the work distributions of Lieb-Liniger model with various coupling strengths, which demonstrate the convergence of work distributions in the classical limit.

  7. Statistical mechanics of high-density bond percolation

    NASA Astrophysics Data System (ADS)

    Timonin, P. N.

    2018-05-01

    High-density (HD) percolation describes the percolation of specific κ-clusters, which are compact sets of sites each connected to at least κ nearest filled sites. It takes place in the classical patterns of independently distributed sites or bonds in which the ordinary percolation transition also exists. Hence, the study of the series of κ-type HD percolations amounts to a description of the classical clusters' structure, for which κ-clusters constitute κ-cores nested one into another. Such data are needed for the description of a number of physical, biological, and information properties of complex systems on random lattices, graphs, and networks, ranging from magnetic properties of semiconductor alloys to anomalies in supercooled water and clustering in biological and social networks. Here we present a statistical mechanics approach to studying HD bond percolation on an arbitrary graph. It is shown that the generating function for the κ-clusters' size distribution can be obtained from the partition function of a specific q-state Potts-Ising model in the q → 1 limit. Using this approach we find exact κ-clusters' size distributions for the Bethe lattice and the Erdos-Renyi graph. The application of the method to Euclidean lattices is also discussed.
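The κ-clusters described above are, in graph-theoretic terms, k-cores: the maximal subgraph in which every surviving vertex keeps at least κ neighbors. A minimal sketch of the standard iterative-pruning construction on a toy Erdős-Rényi graph (an illustration of the object being counted, not the paper's Potts-model generating-function method):

```python
import random
from collections import deque

def erdos_renyi(n, p, seed=0):
    """Adjacency sets of a G(n, p) random graph."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def k_core(adj, k):
    """Iteratively strip vertices of degree < k; the survivors form the k-core."""
    deg = {v: len(nb) for v, nb in adj.items()}
    queue = deque(v for v, d in deg.items() if d < k)
    removed = set()
    while queue:
        v = queue.popleft()
        if v in removed:
            continue
        removed.add(v)
        for w in adj[v]:
            if w not in removed:
                deg[w] -= 1
                if deg[w] < k:
                    queue.append(w)
    return set(adj) - removed

adj = erdos_renyi(200, 0.05)
for k in (1, 2, 3):
    print(k, len(k_core(adj, k)))  # core sizes shrink as k grows
```

The nesting of the cores (every (k+1)-core is inside the k-core) is exactly the nesting of κ-clusters the abstract refers to.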

  8. Computational methods in the exploration of the classical and statistical mechanics of celestial scale strings: Rotating Space Elevators

    NASA Astrophysics Data System (ADS)

    Knudsen, Steven; Golubovic, Leonardo

    2015-04-01

    With the advent of ultra-strong materials, the Space Elevator has changed from science fiction to real science. We discuss computational and theoretical methods we developed to explore the classical and statistical mechanics of rotating Space Elevators (RSE). An RSE is a loopy string reaching deep into outer space. The floppy RSE loop executes a motion which is nearly a superposition of two rotations: a geosynchronous rotation around the Earth, and a second, faster rotation of the string about a line perpendicular to the Earth's surface at its equator. Strikingly, objects sliding along the RSE loop spontaneously oscillate between two turning points, one close to the Earth (the starting point) and the other deep in outer space. The RSE concept thus solves a major problem in space elevator science: how to supply energy to the climbers moving along space elevator strings. The exploration of the dynamics of a floppy string interacting with objects sliding along it has required the development of novel finite element algorithms described in this presentation. We thank Prof. Duncan Lorimer of WVU for kindly providing us access to his computational facility.

  9. Quantum weak turbulence with applications to semiconductor lasers

    NASA Astrophysics Data System (ADS)

    Lvov, Yuri Victorovich

    Based on a model Hamiltonian appropriate for the description of fermionic systems such as semiconductor lasers, we describe a natural asymptotic closure of the BBGKY hierarchy in complete analogy with that derived for classical weak turbulence. The main features of the interaction Hamiltonian are the inclusion of full Fermi statistics containing Pauli blocking and a simple, phenomenological, uniformly weak two-particle interaction potential equivalent to the static screening approximation. The resulting asymptotic closure and quantum kinetic Boltzmann equation are derived in a self-consistent manner without resorting to a priori statistical hypotheses or cumulant discard assumptions. We find a new class of solutions to the quantum kinetic equation which are analogous to the Kolmogorov spectra of hydrodynamics and classical weak turbulence. They involve finite fluxes of particles and energy across momentum space and are particularly relevant for describing the behavior of systems containing sources and sinks. We explore these solutions by using a differential approximation to the collision integral. We make a prima facie case that these finite flux solutions can be important in the context of semiconductor lasers. We show that semiconductor laser output efficiency can be improved by exciting these finite flux solutions. Numerical simulations of the semiconductor Maxwell-Bloch equations support the claim.

  10. On the statistical distribution in a deformed solid

    NASA Astrophysics Data System (ADS)

    Gorobei, N. N.; Luk'yanenko, A. S.

    2017-09-01

    A modification of the Gibbs distribution in a thermally insulated, mechanically deformed solid is proposed, in which its linear dimensions (shape parameters) are excluded from statistical averaging and included among the macroscopic parameters of state alongside the temperature. Formally, this modification reduces to corresponding additional conditions when calculating the statistical sum. The shape parameters and the temperature themselves are found from the conditions of mechanical and thermal equilibrium of a body, and their change is determined using the first law of thermodynamics. Known thermodynamic phenomena are analyzed for a simple model of a solid, i.e., an ensemble of anharmonic oscillators, within the proposed formalism to first order in the anharmonicity constant. The modified distribution is considered separately for the classical and quantum temperature regions.

  11. Applications of quantum entropy to statistics

    NASA Astrophysics Data System (ADS)

    Silver, R. N.; Martz, H. F.

    This paper develops two generalizations of the maximum entropy (ME) principle. First, Shannon classical entropy is replaced by von Neumann quantum entropy to yield a broader class of information divergences (or penalty functions) for statistics applications. Negative relative quantum entropy enforces convexity, positivity, non-local extensivity and prior correlations such as smoothness. This enables the extension of ME methods from their traditional domain of ill-posed inverse problems to new applications such as non-parametric density estimation. Second, given a choice of information divergence, a combination of ME and Bayes rule is used to assign both prior and posterior probabilities. Hyperparameters are interpreted as Lagrange multipliers enforcing constraints. Conservation principles, such as conservation of information and smoothness, are proposed to set statistical regularization and other hyperparameters. ME provides an alternative to hierarchical Bayes methods.

  12. Maximum entropy models of ecosystem functioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertram, Jason, E-mail: jason.bertram@anu.edu.au

    2014-12-05

    Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes’ broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense, using a savanna plant ecology model as an example.
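The MaxEnt algorithm invoked above has a compact generic form: maximizing Shannon entropy subject to a mean constraint yields exponential-family weights p_i ∝ exp(-λx_i), with the Lagrange multiplier λ fixed by the constraint. A minimal sketch under that textbook setup (a generic illustration, not the savanna model of the paper; the bisection bounds are assumptions):

```python
import math

def maxent_given_mean(values, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution over discrete states with a fixed mean.
    Entropy maximization gives p_i ∝ exp(-lam * x_i); lam is found by
    bisection, since the constrained mean is monotone decreasing in lam."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid   # mean too high -> need larger lam
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# With the target equal to the unconstrained average, the multiplier
# vanishes and MaxEnt returns the uniform distribution, as expected.
p = maxent_given_mean([0.0, 1.0, 2.0, 3.0], 1.5)
print(p)
```

Tightening the target below the plain average skews the distribution toward low-x states, the discrete analogue of the Boltzmann weighting mentioned in the thermodynamic parallel.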

  13. Walking through the statistical black boxes of plant breeding.

    PubMed

    Xavier, Alencar; Muir, William M; Craig, Bruce; Rainey, Katy Martin

    2016-10-01

    The main statistical procedures in plant breeding are based on Gaussian processes and can be computed through mixed linear models. Intelligent decision making relies on our ability to extract useful information from data to help us achieve our goals more efficiently. Many plant breeders and geneticists perform statistical analyses without understanding the underlying assumptions of the methods or their strengths and pitfalls. In other words, they treat these statistical methods (software and programs) like black boxes. Black boxes represent complex pieces of machinery with contents that are not fully understood by the user. The user sees the inputs and outputs without knowing how the outputs are generated. By providing a general background on statistical methodologies, this review aims (1) to introduce basic concepts of machine learning and its applications to plant breeding; (2) to link classical selection theory to current statistical approaches; (3) to show how to solve mixed models and extend their application to pedigree-based and genomic-based prediction; and (4) to clarify how the algorithms of genome-wide association studies work, including their assumptions and limitations.
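The mixed linear models referred to above are classically solved through Henderson's mixed-model equations, [X'X, X'Z; Z'X, Z'Z + λI][b; u] = [X'y; Z'y], where λ is the residual-to-genetic variance ratio. A minimal dense pure-Python sketch, assuming λ is known (real genomic evaluations use sparse solvers and REML-estimated variances):

```python
def mat_t(A):
    return [list(r) for r in zip(*A)]

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def henderson_mme(X, Z, y, lam):
    """Assemble and solve Henderson's mixed-model equations for
    fixed effects b and random effects u (lam = var_e / var_u, assumed known)."""
    Xt, Zt = mat_t(X), mat_t(Z)
    yv = [[v] for v in y]
    top = [r1 + r2 for r1, r2 in zip(mat_mul(Xt, X), mat_mul(Xt, Z))]
    ZtZ = mat_mul(Zt, Z)
    for i in range(len(ZtZ)):
        ZtZ[i][i] += lam           # shrinkage of random effects
    bot = [r1 + r2 for r1, r2 in zip(mat_mul(Zt, X), ZtZ)]
    rhs = [r[0] for r in mat_mul(Xt, yv)] + [r[0] for r in mat_mul(Zt, yv)]
    sol = solve(top + bot, rhs)
    p = len(X[0])
    return sol[:p], sol[p:]

# toy data: overall mean (fixed) + two family effects (random)
X = [[1.0]] * 4
Z = [[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]]
y = [1.0, 2.0, 3.0, 4.0]
b, u = henderson_mme(X, Z, y, lam=2.0)
print(b, u)  # -> [2.5] [-0.5, 0.5]
```

The λI term is what shrinks family estimates toward zero, the key difference from treating families as fixed effects.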

  14. ‘… a metal conducts and a non-metal doesn't’

    PubMed Central

    Edwards, P. P.; Lodge, M. T. J.; Hensel, F.; Redmer, R.

    2010-01-01

    In a letter to one of the authors, Sir Nevill Mott, then in his tenth decade, highlighted the fact that the statement ‘… a metal conducts, and a non-metal doesn’t’ can be true only at the absolute zero of temperature, T=0 K. But, of course, experimental studies of metals, non-metals and, indeed, the electronic and thermodynamic transition between these canonical states of matter must always occur above T=0 K, and, in many important cases, for temperatures far above the absolute zero. Here, we review the issues—theoretical and experimental—attendant on studies of the metal to non-metal transition in doped semiconductors at temperatures close to absolute zero (T=0.03 K) and fluid chemical elements at temperatures far above absolute zero (T>1000 K). We attempt to illustrate Mott’s insights for delving into such complex phenomena and experimental systems, finding intuitively the dominant features of the science, and developing a coherent picture of the different competing electronic processes. A particular emphasis is placed on the idea of a ‘Mott metal to non-metal transition’ in the nominally metallic chemical elements rubidium, caesium and mercury, and the converse metallization transition in the nominally non-metal elements hydrogen and oxygen. We also review major innovations by D. A. Goldhammer (Goldhammer 1913 Dispersion und absorption des lichtes) and K. F. Herzfeld (Herzfeld 1927 Phys. Rev. 29, 701–705. (doi:10.1103/PhysRev.29.701)) in a pre-quantum theory description of the metal–non-metal transition, which emphasize the pivotal role of atomic properties in dictating the metallic or non-metallic status of the chemical elements of the periodic table under ambient and extreme conditions; a link with Pauling’s ‘metallic orbital’ is also established here. PMID:20123742

  15. Expectation maximization for hard X-ray count modulation profiles

    NASA Astrophysics Data System (ADS)

    Benvenuto, F.; Schwartz, R.; Piana, M.; Massone, A. M.

    2013-07-01

    Context. This paper is concerned with the image reconstruction problem when the measured data are solar hard X-ray modulation profiles obtained from the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) instrument. Aims: Our goal is to demonstrate that a statistical iterative method classically applied to the image deconvolution problem is very effective when utilized to analyze count modulation profiles in solar hard X-ray imaging based on rotating modulation collimators. Methods: The algorithm described in this paper solves the maximum likelihood problem iteratively and encodes a positivity constraint into the iterative optimization scheme. The result is therefore a classical expectation maximization method this time applied not to an image deconvolution problem but to image reconstruction from count modulation profiles. The technical reason that makes our implementation particularly effective in this application is the use of a very reliable stopping rule which is able to regularize the solution providing, at the same time, a very satisfactory Cash-statistic (C-statistic). Results: The method is applied to both reproduce synthetic flaring configurations and reconstruct images from experimental data corresponding to three real events. In this second case, the performance of expectation maximization, when compared to Pixon image reconstruction, shows a comparable accuracy and a notably reduced computational burden; when compared to CLEAN, shows a better fidelity with respect to the measurements with a comparable computational effectiveness. Conclusions: If optimally stopped, expectation maximization represents a very reliable method for image reconstruction in the RHESSI context when count modulation profiles are used as input data.
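The iteration described, maximum likelihood for counts with a built-in positivity constraint, is the classical expectation-maximization (Richardson-Lucy) update. A minimal sketch with a hypothetical 2x2 response matrix standing in for the RHESSI modulation operator (the C-statistic stopping rule the paper relies on is omitted; a fixed iteration count is used instead):

```python
def em_reconstruct(H, y, n_iter=100):
    """EM (Richardson-Lucy) iteration for y ≈ H x with x >= 0:
        x_j <- x_j * [sum_i H_ij * y_i / (Hx)_i] / sum_i H_ij
    Each update preserves positivity and increases the Poisson likelihood."""
    m, n = len(H), len(H[0])
    x = [1.0] * n                       # flat positive start
    col = [sum(H[i][j] for i in range(m)) for j in range(n)]
    for _ in range(n_iter):
        hx = [sum(H[i][j] * x[j] for j in range(n)) for i in range(m)]
        ratio = [y[i] / hx[i] if hx[i] > 0 else 0.0 for i in range(m)]
        x = [x[j] * sum(H[i][j] * ratio[i] for i in range(m)) / col[j]
             for j in range(n)]
    return x

# toy "modulation" operator mixing two source pixels
H = [[0.8, 0.2], [0.2, 0.8]]
x_true = [2.0, 6.0]
y = [sum(H[i][j] * x_true[j] for j in range(2)) for i in range(2)]
x_rec = em_reconstruct(H, y)
print(x_rec)  # converges toward [2.0, 6.0]
```

With noisy counts the iteration eventually amplifies noise, which is exactly why an early-stopping rule acts as the regularizer in the paper.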

  16. Topics in quantum chaos

    NASA Astrophysics Data System (ADS)

    Jordan, Andrew Noble

    2002-09-01

    In this dissertation, we study the quantum mechanics of classically chaotic dynamical systems. We begin by considering the decoherence effects a quantum chaotic system has on a simple quantum few state system. Typical time evolution of a quantum system whose classical limit is chaotic generates structures in phase space whose size is much smaller than Planck's constant. A naive application of Heisenberg's uncertainty principle indicates that these structures are not physically relevant. However, if we take the quantum chaotic system in question to be an environment which interacts with a simple two state quantum system (qubit), we show that these small phase-space structures cause the qubit to generically lose quantum coherence if and only if the environment has many degrees of freedom, such as a dilute gas. This implies that many-body environments may be crucial for the phenomenon of quantum decoherence. Next, we turn to an analysis of statistical properties of time correlation functions and matrix elements of quantum chaotic systems. A semiclassical evaluation of matrix elements of an operator indicates that the dominant contribution will be related to a classical time correlation function over the energy surface. For a highly chaotic class of dynamics, these correlation functions may be decomposed into sums of Ruelle resonances, which control exponential decay to the ergodic distribution. The theory is illustrated both numerically and theoretically on the Baker map. For this system, we are able to isolate individual Ruelle modes. We further consider dynamical systems whose approach to ergodicity is given by a power law rather than an exponential in time. We propose a billiard with diffusive boundary conditions, whose classical solution may be calculated analytically. We go on to compare the exact solution with an approximation scheme, as well as calculate asymptotic corrections.
Quantum spectral statistics are calculated assuming the validity of the Agam, Altshuler, and Andreev ansatz. We find singular behavior of the two point spectral correlator in the limit of small spacing. Finally, we analyse the effect that slow decay to ergodicity has on the structure of the quantum propagator, as well as wavefunction localization. We introduce a statistical quantum description of systems that are composed of both an orderly region and a random region. By averaging over the random region only, we find that measures of localization in momentum space semiclassically diverge with the dimension of the Hilbert space. We illustrate this numerically with quantum maps and suggest various other systems where this behavior should be important.

  17. New features in the structure of the classical Kuiper Belt

    NASA Astrophysics Data System (ADS)

    Gladman, Brett; Bannister, Michele T.; Alexandersen, Mike; Chen, Ying-Tung; Gwyn, Stephen; Kavelaars, J. J.; Petit, Jean-Marc; Volk, Kathryn; OSSOS Collaboration

    2016-10-01

    We report fascinating new dynamical structures emerging from a higher precision view of the classical Kuiper belt (the plentiful non-resonant orbits with semimajor axes in roughly the a=35-60 au range). The classical Kuiper Belt divides into multiple sub-populations: an 'inner' classical belt (a small group of non-resonant objects with a<39.4 au, where the 3:2 resonance is located), an abundant 'main' classical belt (between the 3:2 and the 2:1 at a=47.4 au), and a difficult to study outer classical belt beyond the 2:1. We examine the dynamical structure, as precisely revealed in the detections from OSSOS (the Outer Solar System Origins Survey); the data set is of superb quality in terms of orbital elements and number of detections (Kavelaars et al., this meeting). The previous CFEPS survey showed that the main classical belt requires a complex dynamical substructure that goes beyond a simple 'hot versus cold' division based primarily on orbital inclination; the 'cold' inclination component requires two sub-components in the semimajor axis and perihelion distance q space (Petit et al. 2011). CFEPS modelled this as a 'stirred' component present at all a=40-47 au semimajor axes, with a dense superposed 'kernel' near a=44 au at low eccentricity; the first OSSOS data release remained consistent with this (Bannister et al. 2016). As with the main asteroid belt, as statistics and orbital quality improve, we see additional significant substructure emerging in the classical belt's orbital distribution. OSSOS continues to add evidence that the cold stirred component extends smoothly beyond the 2:1 (Bannister et al. 2016). Unexpectedly, the data also reveal a clear paucity of orbits just beyond the outer edge of the kernel; there are significantly fewer TNOs in the narrow semimajor axis band from a=44.5-45.0 au.
This may be related to the kernel population's creation, or it may be an independent feature created by planet migration as resonances moved in the primordial Kuiper Belt.

  18. Statistical correlation analysis for comparing vibration data from test and analysis

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.; Purves, L. R.; Hershfeld, D. J.

    1986-01-01

    A theory was developed to compare vibration modes obtained by NASTRAN analysis with those obtained experimentally. Because many more analytical modes can be obtained than experimental modes, the analytical set was treated as expansion functions for putting both sources in comparative form. The dimensional symmetry was developed for three general cases: a nonsymmetric whole model compared with a nonsymmetric whole structural test, a symmetric analytical portion compared with a symmetric experimental portion, and an analytical symmetric portion compared with a whole experimental test. The theory was coded and a statistical correlation program was installed as a utility. The theory is established with small classical structures.

  19. Entropy in sound and vibration: towards a new paradigm.

    PubMed

    Le Bot, A

    2017-01-01

    This paper describes a discussion on the method and the status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of exchange of energy analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of the information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart.

  20. Constraints on cosmological parameters from the analysis of the cosmic lens all sky survey radio-selected gravitational lens statistics.

    PubMed

    Chae, K-H; Biggs, A D; Blandford, R D; Browne, I W A; De Bruyn, A G; Fassnacht, C D; Helbig, P; Jackson, N J; King, L J; Koopmans, L V E; Mao, S; Marlow, D R; McKean, J P; Myers, S T; Norbury, M; Pearson, T J; Phillips, P M; Readhead, A C S; Rusin, D; Sykes, C M; Wilkinson, P N; Xanthopoulos, E; York, T

    2002-10-07

    We derive constraints on cosmological parameters and the properties of the lensing galaxies from gravitational lens statistics based on the final Cosmic Lens All Sky Survey data. For a flat universe with a classical cosmological constant, we find that the present matter fraction of the critical density is Omega_m = 0.31 (+0.27/-0.14) (68%) (+0.12/-0.10) (syst). For a flat universe with a constant equation of state for dark energy, w = p_x (pressure) / rho_x (energy density), we find w < -0.55 (+0.18/-0.11) (68%).

  1. Suggestions for presenting the results of data analyses

    USGS Publications Warehouse

    Anderson, David R.; Link, William A.; Johnson, Douglas H.; Burnham, Kenneth P.

    2001-01-01

    We give suggestions for the presentation of research results from frequentist, information-theoretic, and Bayesian analysis paradigms, followed by several general suggestions. The information-theoretic and Bayesian methods offer alternative approaches to data analysis and inference compared to traditionally used methods. Guidance is lacking on the presentation of results under these alternative procedures and on nontesting aspects of classical frequentist methods of statistical analysis. Null hypothesis testing has come under intense criticism. We recommend less reporting of the results of statistical tests of null hypotheses in cases where the null is surely false anyway, or where the null hypothesis is of little interest to science or management.

  2. Contribution of artificial intelligence to the knowledge of prognostic factors in laryngeal carcinoma.

    PubMed

    Zapater, E; Moreno, S; Fortea, M A; Campos, A; Armengot, M; Basterra, J

    2000-11-01

    Many studies have investigated prognostic factors in laryngeal carcinoma, with sometimes conflicting results. Apart from the importance of environmental factors, the different statistical methods employed may have influenced such discrepancies. A program based on artificial intelligence techniques is designed to determine the prognostic factors in a series of 122 laryngeal carcinomas. The results obtained are compared with those derived from two classical statistical methods (Cox regression and mortality tables). Tumor location was found to be the most important prognostic factor by all methods. The proposed intelligent system is found to be a sound method capable of detecting exceptional cases.

  3. Molecular dynamics studies of electron-ion temperature equilibration in hydrogen plasmas within the coupled-mode regime

    DOE PAGES

    Benedict, Lorin X.; Surh, Michael P.; Stanton, Liam G.; ...

    2017-04-10

    Here, we use classical molecular dynamics (MD) to study electron-ion temperature equilibration in two-component plasmas in regimes for which the presence of coupled collective modes has been predicted to substantively reduce the equilibration rate. Guided by previous kinetic theory work, we examine hydrogen plasmas at a density of n = 10^26 cm^-3, with T_i = 10^5 K and 10^7 K < T_e < 10^9 K. The nonequilibrium classical MD simulations are performed with interparticle interactions modeled by quantum statistical potentials (QSPs). Our MD results indicate (i) a large effect from time-varying potential energy, which we quantify by appealing to an adiabatic two-temperature equation of state, and (ii) a notable deviation in the energy equilibration rate when compared to calculations from classical Lenard-Balescu theory including the QSPs. In particular, it is shown that the energy equilibration rates from MD are more similar to those of the theory when coupled modes are neglected. We suggest possible reasons for this surprising result and propose directions of further research along these lines.

  4. PDF-based heterogeneous multiscale filtration model.

    PubMed

    Gong, Jian; Rutland, Christopher J

    2015-04-21

    Motivated by modeling of gasoline particulate filters (GPFs), a probability density function (PDF) based heterogeneous multiscale filtration (HMF) model is developed to calculate filtration efficiency of clean particulate filters. A new methodology based on statistical theory and classic filtration theory is developed in the HMF model. Based on the analysis of experimental porosimetry data, a pore size probability density function is introduced to represent heterogeneity and multiscale characteristics of the porous wall. The filtration efficiency of a filter can be calculated as the sum of the contributions of individual collectors. The resulting HMF model overcomes the limitations of classic mean filtration models which rely on tuning of the mean collector size. Sensitivity analysis shows that the HMF model recovers the classical mean model when the pore size variance is very small. The HMF model is validated by fundamental filtration experimental data from different scales of filter samples. The model shows a good agreement with experimental data at various operating conditions. The effects of the microstructure of filters on filtration efficiency as well as the most penetrating particle size are correctly predicted by the model.
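The HMF idea, weighting single-collector efficiencies by a pore-size probability density instead of using one mean collector size, can be sketched generically. The lognormal PDF and the toy single-pore efficiency below are illustrative assumptions, not the paper's calibrated model:

```python
import math

def lognormal_pdf(d, mu, sigma):
    """Lognormal PDF of pore diameter d (same units as exp(mu))."""
    return math.exp(-(math.log(d) - mu) ** 2 / (2 * sigma ** 2)) \
           / (d * sigma * math.sqrt(2 * math.pi))

def hmf_efficiency(eta, mu, sigma, d_min=0.1, d_max=100.0, n=4000):
    """Overall filtration efficiency as the PDF-weighted average of a
    single-collector efficiency eta(d), by trapezoidal quadrature.
    Dividing by the integrated weight corrects for range truncation."""
    h = (d_max - d_min) / n
    num = den = 0.0
    for i in range(n + 1):
        d = d_min + i * h
        w = (0.5 if i in (0, n) else 1.0) * lognormal_pdf(d, mu, sigma)
        num += w * eta(d)
        den += w
    return num / den

# toy single-pore efficiency: smaller pores capture better (assumption)
eta = lambda d: 1.0 / (1.0 + 0.1 * d)
print(hmf_efficiency(eta, mu=math.log(10.0), sigma=0.5))
```

As the variance sigma shrinks, the weighted average collapses onto eta evaluated at the median pore size, which mirrors the paper's observation that the HMF model recovers the classical mean-collector model for small pore-size variance.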

  5. Grand Canonical adaptive resolution simulation for molecules with electrons: A theoretical framework based on physical consistency

    NASA Astrophysics Data System (ADS)

    Delle Site, Luigi

    2018-01-01

    A theoretical scheme for the treatment of an open molecular system with electrons and nuclei is proposed. The idea is based on the Grand Canonical description of a quantum region embedded in a classical reservoir of molecules. Electronic properties of the quantum region are calculated at a constant electronic chemical potential equal to that of the corresponding (large) bulk system treated at the full quantum level. Instead, the exchange of molecules between the quantum region and the classical environment occurs at the chemical potential of the macroscopic thermodynamic conditions. The Grand Canonical Adaptive Resolution Scheme is proposed for the treatment of the classical environment; such an approach can treat the exchange of molecules according to the first principles of statistical mechanics and thermodynamics. The overall scheme is built on the basis of physical consistency, with a corresponding definition of numerical criteria to control the approximations implied by the coupling. Given the wide range of expertise required, this work is intended to provide guiding principles for the construction of a well-founded computational protocol for actual multiscale simulations from the electronic to the mesoscopic scale.

  6. A novel semiconductor-based, fully incoherent amplified spontaneous emission light source for ghost imaging

    PubMed Central

    Hartmann, Sébastien; Elsäßer, Wolfgang

    2017-01-01

    Initially, ghost imaging (GI) was demonstrated with entangled light from parametric down conversion. Later, classical light sources were introduced with the development of thermal light GI concepts. State-of-the-art classical GI light sources rely either on complex combinations of coherent light with spatially randomizing optical elements or on incoherent lamps with monochromating optics, which however suffer strong losses of efficiency and directionality. Here, a broad-area superluminescent diode is proposed as a new light source for classical ghost imaging. The coherence behavior of this spectrally broadband emitting opto-electronic light source is investigated in detail. An interferometric two-photon detection technique is exploited in order to resolve the ultra-short correlation timescales. We thereby quantify the coherence time, the photon statistics as well as the number of spatial modes, unveiling completely incoherent light behavior. With a one-dimensional proof-of-principle GI experiment, we introduce these compact emitters to the field, which could be beneficial for high-speed GI systems as well as for long range GI sensing in future applications. PMID:28150737

  7. InGaAs tunnel diodes for the calibration of semi-classical and quantum mechanical band-to-band tunneling models

    NASA Astrophysics Data System (ADS)

    Smets, Quentin; Verreck, Devin; Verhulst, Anne S.; Rooyackers, Rita; Merckling, Clément; Van De Put, Maarten; Simoen, Eddy; Vandervorst, Wilfried; Collaert, Nadine; Thean, Voon Y.; Sorée, Bart; Groeseneken, Guido; Heyns, Marc M.

    2014-05-01

    Promising predictions have been made for III-V tunnel field-effect transistors (TFETs), but there is still uncertainty about the parameters used in band-to-band tunneling models. Therefore, two simulators are calibrated in this paper: the first uses a semi-classical tunneling model based on Kane's formalism, and the second is a quantum mechanical simulator implemented with an envelope function formalism. The calibration is done for In0.53Ga0.47As using several p+/intrinsic/n+ diodes with different intrinsic region thicknesses. The dopant profile is determined by SIMS and capacitance-voltage measurements. Error bars are used based on statistical and systematic uncertainties in the measurement techniques. The obtained parameters are in close agreement with theoretically predicted values and validate the semi-classical and quantum mechanical models. Finally, the models are applied to predict the input characteristics of In0.53Ga0.47As n- and p-line TFETs, with the n-line TFET showing competitive performance compared to the MOSFET.

  8. Track and vertex reconstruction: From classical to adaptive methods

    NASA Astrophysics Data System (ADS)

    Strandlie, Are; Frühwirth, Rudolf

    2010-04-01

    This paper reviews classical and adaptive methods of track and vertex reconstruction in particle physics experiments. Adaptive methods have been developed to meet the experimental challenges at high-energy colliders, in particular, the CERN Large Hadron Collider. They can be characterized by the obliteration of the traditional boundaries between pattern recognition and statistical estimation, by the competition between different hypotheses about what constitutes a track or a vertex, and by a high level of flexibility and robustness achieved with a minimum of assumptions about the data. The theoretical background of some of the adaptive methods is described, and it is shown that there is a close connection between the two main branches of adaptive methods: neural networks and deformable templates, on the one hand, and robust stochastic filters with annealing, on the other hand. As both classical and adaptive methods of track and vertex reconstruction presuppose precise knowledge of the positions of the sensitive detector elements, the paper includes an overview of detector alignment methods and a survey of the alignment strategies employed by past and current experiments.

  9. The development of ensemble theory. A new glimpse at the history of statistical mechanics

    NASA Astrophysics Data System (ADS)

    Inaba, Hajime

    2015-12-01

    This paper investigates the history of statistical mechanics from the viewpoint of the development of the ensemble theory from 1871 to 1902. In 1871, Ludwig Boltzmann introduced a prototype model of an ensemble that represents a polyatomic gas. In 1879, James Clerk Maxwell defined an ensemble as copies of systems of the same energy. Inspired by H.W. Watson, he called his approach "statistical". Boltzmann and Maxwell regarded the ensemble theory as a much more general approach than the kinetic theory. In the 1880s, influenced by Hermann von Helmholtz, Boltzmann made use of ensembles to establish thermodynamic relations. In Elementary Principles in Statistical Mechanics of 1902, Josiah Willard Gibbs tried to get his ensemble theory to mirror thermodynamics, including thermodynamic operations in its scope. Thermodynamics played the role of a "blind guide". His theory of ensembles can be characterized as more mathematically oriented than Einstein's theory proposed in the same year. Mechanical, empirical, and statistical approaches to foundations of statistical mechanics are presented. Although it was formulated in classical terms, the ensemble theory provided an infrastructure still valuable in quantum statistics because of its generality.

  10. Power spectra as a diagnostic tool in probing statistical/nonstatistical behavior in unimolecular reactions

    NASA Astrophysics Data System (ADS)

    Chang, Xiaoyen Y.; Sewell, Thomas D.; Raff, Lionel M.; Thompson, Donald L.

    1992-11-01

    The possibility of utilizing different types of power spectra obtained from classical trajectories as a diagnostic tool to identify the presence of nonstatistical dynamics is explored by using the unimolecular bond-fission reactions of 1,2-difluoroethane and the 2-chloroethyl radical as test cases. In previous studies, the reaction rates for these systems were calculated by using a variational transition-state theory and classical trajectory methods. A comparison of the results showed that 1,2-difluoroethane is a nonstatistical system, while the 2-chloroethyl radical behaves statistically. Power spectra for these two systems have been generated under various conditions. The characteristics of these spectra are as follows: (1) The spectra for the 2-chloroethyl radical are always broader and more coupled to other modes than is the case for 1,2-difluoroethane. This is true even at very low levels of excitation. (2) When an internal energy near or above the dissociation threshold is initially partitioned into a local C-H stretching mode, the power spectra for 1,2-difluoroethane broaden somewhat, but discrete and somewhat isolated bands are still clearly evident. In contrast, the analogous power spectra for the 2-chloroethyl radical exhibit a near complete absence of isolated bands. The general appearance of the spectrum suggests a very high level of mode-to-mode coupling, large intramolecular vibrational energy redistribution (IVR) rates, and global statistical behavior. (3) The appearance of the power spectrum for the 2-chloroethyl radical is unaltered regardless of whether the initial C-H excitation is in the CH2 or the CH2Cl group. This result also suggests statistical behavior. These results are interpreted to mean that power spectra may be used as a diagnostic tool to assess the statistical character of a system. 
The presence of a diffuse spectrum exhibiting a nearly complete loss of isolated structures indicates that the dissociation dynamics of the molecule will be well described by statistical theories. If, however, the power spectrum maintains its discrete, isolated character, as is the case for 1,2-difluoroethane, the opposite conclusion is suggested. Since power spectra are very easily computed, this diagnostic method may prove to be useful.
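    The diagnostic described above rests on a simple computation: Fourier-transform a coordinate time series from a trajectory and inspect the power spectrum. A minimal sketch with NumPy follows, using a synthetic signal in which the 10 Hz mode, noise level, and time step are illustrative stand-ins for a real trajectory coordinate:

```python
import numpy as np

# Synthetic stand-in for a trajectory coordinate: a 10 Hz vibrational mode
# plus weak broadband noise (values chosen for illustration only).
dt = 1e-3                                   # time step
t = np.arange(0.0, 2.0, dt)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 10.0 * t) + 0.05 * rng.standard_normal(t.size)

# Power spectrum: squared magnitude of the FFT of the mean-removed signal.
spectrum = np.abs(np.fft.rfft(x - x.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=dt)
peak = freqs[np.argmax(spectrum)]           # dominant frequency, ~10 Hz
```

    A discrete, isolated peak (as here) would suggest weak mode coupling; a broad, diffuse spectrum with no isolated bands would suggest strong coupling and statistical behavior, in the sense used by the abstract.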

  11. Quantum speedup of Monte Carlo methods.

    PubMed

    Montanaro, Ashley

    2015-09-08

    Monte Carlo methods use random sampling to estimate numerical quantities which are hard to compute deterministically. One important example is the use in statistical physics of rapidly mixing Markov chains to approximately compute partition functions. In this work, we describe a quantum algorithm which can accelerate Monte Carlo methods in a very general setting. The algorithm estimates the expected output value of an arbitrary randomized or quantum subroutine with bounded variance, achieving a near-quadratic speedup over the best possible classical algorithm. Combining the algorithm with the use of quantum walks gives a quantum speedup of the fastest known classical algorithms with rigorous performance bounds for computing partition functions, which use multiple-stage Markov chain Monte Carlo techniques. The quantum algorithm can also be used to estimate the total variation distance between probability distributions efficiently.
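    For contrast with the quantum algorithm, the classical baseline it accelerates can be sketched in a few lines: estimate the expected output of a randomized subroutine from n samples, with standard error shrinking as O(1/sqrt(n)), i.e. roughly O(1/eps^2) samples for accuracy eps, where the quantum algorithm needs roughly O(1/eps). The subroutine below is a hypothetical example with bounded variance, not taken from the paper:

```python
import random
import statistics

def subroutine(rng):
    """A randomized subroutine with bounded variance: the square of a
    uniform draw on [0, 1]. Its true expectation is 1/3."""
    return rng.random() ** 2

def mc_estimate(n, seed=1):
    """Classical Monte Carlo estimate of the subroutine's mean output,
    with the usual standard error of the mean."""
    rng = random.Random(seed)
    samples = [subroutine(rng) for _ in range(n)]
    return statistics.fmean(samples), statistics.stdev(samples) / n ** 0.5

mean, stderr = mc_estimate(100_000)
# mean is close to 1/3; halving the error requires four times the samples.
```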

  12. Region growing using superpixels with learned shape prior

    NASA Astrophysics Data System (ADS)

    Borovec, Jiří; Kybic, Jan; Sugimoto, Akihiro

    2017-11-01

    Region growing is a classical image segmentation method based on hierarchical region aggregation using local similarity rules. Our proposed method differs from classical region growing in three important aspects. First, it works on the level of superpixels instead of pixels, which leads to a substantial speed-up. Second, our method uses learned statistical shape properties that encourage plausible shapes. In particular, we use ray features to describe the object boundary. Third, our method can segment multiple objects and ensure that the segmentations do not overlap. The problem is represented as an energy minimization and is solved either greedily or iteratively using graph cuts. We demonstrate the performance of the proposed method and compare it with alternative approaches on the task of segmenting individual eggs in microscopy images of Drosophila ovaries.

  13. Occam’s Quantum Strop: Synchronizing and Compressing Classical Cryptic Processes via a Quantum Channel

    NASA Astrophysics Data System (ADS)

    Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.

    2016-02-01

    A stochastic process’ statistical complexity stands out as a fundamental property: the minimum information required to synchronize one process generator to another. How much information is required, though, when synchronizing over a quantum channel? Recent work demonstrated that representing causal similarity as quantum state-indistinguishability provides a quantum advantage. We generalize this to synchronization and offer a sequence of constructions that exploit extended causal structures, finding a substantial increase in the quantum advantage. We demonstrate that maximum compression is determined by the process’ cryptic order, a classical, topological property closely allied to Markov order, itself a measure of historical dependence. We introduce an efficient algorithm that computes the quantum advantage and close by noting that the advantage comes at a cost: one trades off prediction for generation complexity.

  14. Classical heat transport in anharmonic molecular junctions: exact solutions.

    PubMed

    Liu, Sha; Agarwalla, Bijay Kumar; Wang, Jian-Sheng; Li, Baowen

    2013-02-01

    We study full counting statistics for classical heat transport through anharmonic or nonlinear molecular junctions formed by interacting oscillators. An analytical result of the steady-state heat flux for an overdamped anharmonic junction with arbitrary temperature bias is obtained. It is found that the thermal conductance can be expressed in terms of a temperature-dependent effective force constant. The role of anharmonicity is identified. We also give the general formula for the second cumulant of heat in steady state, as well as the average geometric heat flux when two system parameters are modulated adiabatically. We present an anharmonic example for which all cumulants for heat can be obtained exactly. For a bounded single oscillator model with mass we found that the cumulants are independent of the nonlinear potential.

  15. Occam's Quantum Strop: Synchronizing and Compressing Classical Cryptic Processes via a Quantum Channel.

    PubMed

    Mahoney, John R; Aghamohammadi, Cina; Crutchfield, James P

    2016-02-15

    A stochastic process' statistical complexity stands out as a fundamental property: the minimum information required to synchronize one process generator to another. How much information is required, though, when synchronizing over a quantum channel? Recent work demonstrated that representing causal similarity as quantum state-indistinguishability provides a quantum advantage. We generalize this to synchronization and offer a sequence of constructions that exploit extended causal structures, finding a substantial increase in the quantum advantage. We demonstrate that maximum compression is determined by the process' cryptic order, a classical, topological property closely allied to Markov order, itself a measure of historical dependence. We introduce an efficient algorithm that computes the quantum advantage and close by noting that the advantage comes at a cost: one trades off prediction for generation complexity.

  16. Quantum speedup of Monte Carlo methods

    PubMed Central

    Montanaro, Ashley

    2015-01-01

    Monte Carlo methods use random sampling to estimate numerical quantities which are hard to compute deterministically. One important example is the use in statistical physics of rapidly mixing Markov chains to approximately compute partition functions. In this work, we describe a quantum algorithm which can accelerate Monte Carlo methods in a very general setting. The algorithm estimates the expected output value of an arbitrary randomized or quantum subroutine with bounded variance, achieving a near-quadratic speedup over the best possible classical algorithm. Combining the algorithm with the use of quantum walks gives a quantum speedup of the fastest known classical algorithms with rigorous performance bounds for computing partition functions, which use multiple-stage Markov chain Monte Carlo techniques. The quantum algorithm can also be used to estimate the total variation distance between probability distributions efficiently. PMID:26528079

  17. The Effect of Substituting p for alpha on the Unconditional and Conditional Powers of a Null Hypothesis Test.

    ERIC Educational Resources Information Center

    Martuza, Victor R.; Engel, John D.

    Results from classical power analysis (Brewer, 1972) suggest that a researcher should not set α = p (when p is less than α) in a posteriori fashion when a study yields statistically significant results because of a resulting decrease in power. The purpose of the present report is to use Bayesian theory in examining the validity of this…

  18. The Importance of Variance in Statistical Analysis: Don't Throw Out the Baby with the Bathwater.

    ERIC Educational Resources Information Center

    Peet, Martha W.

    This paper analyzes what happens to the effect size of a given dataset when the variance is removed by categorization for the purpose of applying "OVA" methods (analysis of variance, analysis of covariance). The dataset is from a classic study by Holzinger and Swineford (1939) in which more than 20 ability tests were administered to 301…

  19. Decomposition of the Inequality of Income Distribution by Income Types—Application for Romania

    NASA Astrophysics Data System (ADS)

    Andrei, Tudorel; Oancea, Bogdan; Richmond, Peter; Dhesi, Gurjeet; Herteliu, Claudiu

    2017-09-01

    This paper identifies the salient factors that characterize the inequality income distribution for Romania. Data analysis is rigorously carried out using sophisticated techniques borrowed from classical statistics (Theil). Decomposition of the inequalities measured by the Theil index is also performed. This study relies on an exhaustive (11.1 million records for 2014) data-set for total personal gross income of Romanian citizens.
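    The Theil index used in this study, together with its standard decomposition into within-group and between-group components, can be sketched as follows; the income types and figures below are invented for illustration and are not the Romanian data:

```python
import math

def theil(incomes):
    """Theil T index: mean of (x/mu) * ln(x/mu) over positive incomes."""
    mu = sum(incomes) / len(incomes)
    return sum((x / mu) * math.log(x / mu) for x in incomes) / len(incomes)

def theil_decomposition(groups):
    """Split the overall Theil index into within-group and between-group
    components; `groups` maps an income type to a list of incomes."""
    all_incomes = [x for g in groups.values() for x in g]
    n = len(all_incomes)
    mu = sum(all_incomes) / n
    within = between = 0.0
    for g in groups.values():
        share = sum(g) / (n * mu)                  # group's share of income
        within += share * theil(g)
        between += share * math.log((sum(g) / len(g)) / mu)
    return within, between

# Invented figures: two hypothetical income types.
groups = {"wages": [10, 12, 14, 40], "capital": [5, 50, 80]}
w, b = theil_decomposition(groups)
# The two components reconstruct the overall index: w + b == theil(all).
```

    The additive decomposability shown in the last comment is the property that makes the Theil index (rather than, say, the Gini coefficient) the natural choice for decomposition by income type.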

  20. Dynamics of Markets

    NASA Astrophysics Data System (ADS)

    McCauley, Joseph L.

    2009-09-01

    Preface; 1. Econophysics: why and what; 2. Neo-classical economic theory; 3. Probability and stochastic processes; 4. Introduction to financial economics; 5. Introduction to portfolio selection theory; 6. Scaling, pair correlations, and conditional densities; 7. Statistical ensembles: deducing dynamics from time series; 8. Martingale option pricing; 9. FX market globalization: evolution of the dollar to worldwide reserve currency; 10. Macroeconomics and econometrics: regression models vs. empirically based modeling; 11. Complexity; Index.

  1. Academician Nikolai Nikolaevich Bogolyubov (for the 100th anniversary of his birth)

    NASA Astrophysics Data System (ADS)

    Martynyuk, A. A.; Mishchenko, E. F.; Samoilenko, A. M.; Sukhanov, A. D.

    2009-07-01

    This paper is dedicated to the memory of N. N. Bogolyubov in recognition of his towering stature in nonlinear mechanics and theoretical physics, his remarkable many-sided genius, and the originality and depth of his contribution to the world's science. The paper briefly describes Bogolyubov's achievements in nonlinear mechanics, classical statistical physics, theory of superconductivity, quantum field theory, and strong interaction theory.

  2. On the Benefits of Latent Variable Modeling for Norming Scales: The Case of the "Supports Intensity Scale--Children's Version"

    ERIC Educational Resources Information Center

    Seo, Hyojeong; Little, Todd D.; Shogren, Karrie A.; Lang, Kyle M.

    2016-01-01

    Structural equation modeling (SEM) is a powerful and flexible analytic tool to model latent constructs and their relations with observed variables and other constructs. SEM applications offer advantages over classical models in dealing with statistical assumptions and in adjusting for measurement error. So far, however, SEM has not been fully used…

  3. On the potential for the Partial Triadic Analysis to grasp the spatio-temporal variability of groundwater hydrochemistry

    NASA Astrophysics Data System (ADS)

    Gourdol, L.; Hissler, C.; Pfister, L.

    2012-04-01

    The Luxembourg sandstone aquifer is of major relevance for the national supply of drinking water in Luxembourg. The city of Luxembourg (20% of the country's population) gets almost 2/3 of its drinking water from this aquifer. As a consequence, the study of the groundwater hydrochemistry, as well as of its spatial and temporal variations, is considered of the highest priority. Since 2005, a monitoring network has been implemented by the Water Department of Luxembourg City, with a view to a more sustainable management of this strategic water resource. The data collected to date form a large and complex dataset describing spatial and temporal variations of many hydrochemical parameters. The issue of data treatment is tightly connected to this kind of water monitoring program and the complex databases it produces. Standard multivariate statistical techniques, such as principal components analysis and hierarchical cluster analysis, have been widely used as unbiased methods for extracting meaningful information from groundwater quality data and are now classically used in many hydrogeological studies, in particular to characterize temporal or spatial hydrochemical variations induced by natural and anthropogenic factors. However, these classical multivariate methods deal with two-way matrices, usually parameters/sites or parameters/time, while the dataset resulting from a water quality monitoring program should often be seen as a datacube of parameters/sites/time. Three-way matrices, such as the one we propose here, are difficult to handle and analyse with classical multivariate statistical tools and should therefore be treated with approaches designed for three-way data structures. One possible approach is partial triadic analysis (PTA). PTA has previously been used with success in many ecological studies but never, to date, in the domain of hydrogeology. 
Applied to the dataset of the Luxembourg sandstone aquifer, PTA appears to be a promising new statistical instrument for hydrogeologists, in particular for characterizing temporal and spatial hydrochemical variations induced by natural and anthropogenic factors. This new approach to groundwater management offers potential for 1) identifying a common multivariate spatial structure, 2) uncovering the different hydrochemical patterns and explaining their controlling factors, and 3) analysing the temporal variability of this structure and grasping hydrochemical changes.

  4. Detecting central fixation by means of artificial neural networks in a pediatric vision screener using retinal birefringence scanning.

    PubMed

    Gramatikov, Boris I

    2017-04-27

    Reliable detection of central fixation and eye alignment is essential in the diagnosis of amblyopia ("lazy eye"), which can lead to blindness. Our lab has developed and reported earlier a pediatric vision screener that performs scanning of the retina around the fovea and analyzes changes in the polarization state of light as the scan progresses. Depending on the direction of gaze and the instrument design, the screener produces several signal frequencies that can be utilized in the detection of central fixation. The objective of this study was to compare artificial neural networks with classical statistical methods, with respect to their ability to detect central fixation reliably. A classical feedforward, pattern recognition, two-layer neural network architecture was used, consisting of one hidden layer and one output layer. The network has four inputs, representing normalized spectral powers at four signal frequencies generated during retinal birefringence scanning. The hidden layer contains four neurons. The output suggests presence or absence of central fixation. Backpropagation was used to train the network, using the gradient descent algorithm and the cross-entropy error as the performance function. The network was trained, validated and tested on a set of controlled calibration data obtained from 600 measurements from ten eyes in a previous study, and was additionally tested on a clinical set of 78 eyes, independently diagnosed by an ophthalmologist. In the first part of this study, a neural network was designed around the calibration set. With a proper architecture and training, the network provided performance that was comparable to classical statistical methods, allowing perfect separation between the central and paracentral fixation data, with both the sensitivity and the specificity of the instrument being 100%. In the second part of the study, the neural network was applied to the clinical data. 
It allowed reliable separation between normal subjects and affected subjects, its accuracy again matching that of the statistical methods. With a proper choice of a neural network architecture and a good, uncontaminated training data set, the artificial neural network can be an efficient classification tool for detecting central fixation based on retinal birefringence scanning.
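    A minimal sketch of the architecture the abstract describes, 4 inputs, one hidden layer of 4 sigmoid neurons, and one output trained by gradient descent on the cross-entropy error, is given below with NumPy. The data are random stand-ins (the real inputs were normalized spectral powers from retinal birefringence scanning), and the labeling rule is invented purely so that the toy problem is learnable:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 4 "normalized spectral powers" per sample;
# the labeling rule below is invented for this toy demonstration.
X = rng.random((600, 4))
y = (X[:, 0] + X[:, 1] > X[:, 2] + X[:, 3]).astype(float).reshape(-1, 1)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# 4-4-1 feedforward network, as in the abstract.
W1, b1 = rng.normal(0.0, 0.5, (4, 4)), np.zeros(4)
W2, b2 = rng.normal(0.0, 0.5, (4, 1)), np.zeros(1)

lr = 1.0
for _ in range(3000):                       # full-batch gradient descent
    h = sigmoid(X @ W1 + b1)                # hidden layer
    p = sigmoid(h @ W2 + b2)                # output: P(central fixation)
    grad_out = (p - y) / len(X)             # dL/dz for cross-entropy + sigmoid
    grad_h = (grad_out @ W2.T) * h * (1 - h)    # backpropagated error
    W2 -= lr * (h.T @ grad_out); b2 -= lr * grad_out.sum(axis=0)
    W1 -= lr * (X.T @ grad_h);   b1 -= lr * grad_h.sum(axis=0)

p = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
accuracy = ((p > 0.5) == (y > 0.5)).mean()
```

    On this separable toy problem the training accuracy ends up high; the abstract's point is that such a small network can match classical statistical classifiers on the real calibration data.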

  5. Spectra of turbulently advected scalars that have small Schmidt number

    NASA Astrophysics Data System (ADS)

    Hill, Reginald J.

    2017-09-01

    Exact statistical equations are derived for turbulent advection of a passive scalar having diffusivity much larger than the kinematic viscosity, i.e., small Schmidt number. The equations contain all terms needed for precise direct numerical simulation (DNS) quantification. In the appropriate limit, the equations reduce to the classical theory for which the scalar spectrum is proportional to the energy spectrum multiplied by k-4, which, in turn, results in the inertial-diffusive range power law, k-17 /3. The classical theory was derived for the case of isotropic velocity and scalar fields. The exact equations are simplified for less restrictive cases: (1) locally isotropic scalar fluctuations at dissipation scales with no restriction on symmetry of the velocity field, (2) isotropic velocity field with averaging over all wave-vector directions with no restriction on the symmetry of the scalar, motivated by that average being used for DNS, and (3) isotropic velocity field with axisymmetric scalar fluctuations, motivated by the mean-scalar-gradient-source case. The equations are applied to recently published DNSs of passive scalars for the cases of a freely decaying scalar and a mean-scalar-gradient source. New terms in the exact equations are estimated for those cases and are found to be significant; those terms cause the deviations from the classical theory found by the DNS studies. A new formula for the mean-scalar-gradient case explains the variation of the scalar spectra for the DNS of the smallest Schmidt-number cases. Expansion in Legendre polynomials reveals the effect of axisymmetry. Inertial-diffusive-range formulas for both the zero- and second-order Legendre contributions are given. Exact statistical equations reveal what must be quantified using DNS to determine what causes deviations from asymptotic relationships.

  6. Experimental design and statistical methods for improved hit detection in high-throughput screening.

    PubMed

    Malo, Nathalie; Hanley, James A; Carlile, Graeme; Liu, Jing; Pelletier, Jerry; Thomas, David; Nadon, Robert

    2010-09-01

    Identification of active compounds in high-throughput screening (HTS) contexts can be substantially improved by applying classical experimental design and statistical inference principles to all phases of HTS studies. The authors present both experimental and simulated data to illustrate how true-positive rates can be maximized without increasing false-positive rates by the following analytical process. First, the use of robust data preprocessing methods reduces unwanted variation by removing row, column, and plate biases. Second, replicate measurements allow estimation of the magnitude of the remaining random error and the use of formal statistical models to benchmark putative hits relative to what is expected by chance. Receiver Operating Characteristic (ROC) analyses revealed superior power for data preprocessed by a trimmed-mean polish method combined with the RVM t-test, particularly for small- to moderate-sized biological hits.
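    A trimmed-mean polish of the kind mentioned above can be sketched as iterative subtraction of robust row and column effects from a plate of readings. The plate data and bias pattern below are synthetic, and the details (trim fraction, iteration count) are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np

def trimmed_mean(a, axis, trim=0.1):
    """Mean along `axis` after discarding the lowest and highest `trim`
    fraction of values (a robust location estimate)."""
    a = np.sort(a, axis=axis)
    k = int(a.shape[axis] * trim)
    idx = [slice(None)] * a.ndim
    idx[axis] = slice(k, a.shape[axis] - k or None)
    return a[tuple(idx)].mean(axis=axis, keepdims=True)

def polish(plate, n_iter=5, trim=0.1):
    """Iteratively subtract trimmed-mean row and column effects, removing
    systematic row/column biases from a plate of measurements."""
    resid = plate - plate.mean()
    for _ in range(n_iter):
        resid = resid - trimmed_mean(resid, axis=1, trim=trim)  # row bias
        resid = resid - trimmed_mean(resid, axis=0, trim=trim)  # column bias
    return resid

rng = np.random.default_rng(0)
plate = rng.normal(size=(8, 12)) + 0.5 * np.arange(8)[:, None]  # row drift
cleaned = polish(plate)
```

    Using a trimmed mean rather than a plain mean keeps a few strong hits in a row or column from being mistaken for bias and subtracted away, which is the point of robust preprocessing here.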

  7. Statistics of extreme waves in the framework of one-dimensional Nonlinear Schrodinger Equation

    NASA Astrophysics Data System (ADS)

    Agafontsev, Dmitry; Zakharov, Vladimir

    2013-04-01

    We examine the statistics of extreme waves for the one-dimensional classical focusing Nonlinear Schrodinger (NLS) equation, iΨ_t + Ψ_xx + |Ψ|²Ψ = 0 (1), as well as the influence of the first nonlinear term beyond Eq. (1), the six-wave interactions, on the statistics of waves in the framework of a generalized NLS equation (2) that accounts for six-wave interactions, damping (linear dissipation, two- and three-photon absorption) and pumping terms. We solve these equations numerically in a box with periodic boundary conditions, starting from initial data consisting of F(x) plus a small stochastic noise term with fixed statistical properties, where F(x) is an exact modulationally unstable solution of Eq. (1). We examine two types of initial conditions F(x): (a) the condensate state F(x) = 1 for Eqs. (1)-(2) and (b) a cnoidal wave for Eq. (1). The development of modulation instability in Eqs. (1)-(2) leads to the formation of one-dimensional wave turbulence. In the integrable case the turbulence is called integrable and relaxes to one of infinitely many possible stationary states. Addition of the six-wave interaction term leads to the appearance of collapses that are eventually regularized by the damping terms. The energy lost during the regularization of collapses in Eq. (2) is restored by the pumping term. In the latter case the system does not demonstrate relaxation-like behavior. We measure the evolution of the spectra I_k = ⟨|Ψ_k|²⟩, the spatial correlation functions, and the PDFs of wave amplitudes |Ψ|, paying special attention to the formation of "fat tails" in the PDFs. For the classical integrable NLS equation (1) with the condensate initial condition we observe Rayleigh tails for extremely large waves and a "breathing region" for waves of intermediate amplitude, with oscillations in the frequency of wave appearance over time, while the nonintegrable NLS equation with damping and pumping terms (2), in the absence of six-wave interactions (α = 0), demonstrates perfectly Rayleigh PDFs without any oscillations in time. 
In the case of the cnoidal wave initial condition we observe severely non-Rayleigh PDFs for the classical NLS equation (1), with the regions corresponding to 2-, 3-, and higher-order soliton collisions clearly visible in the PDFs. Addition of six-wave interactions in Eq. (2) for the condensate initial condition results in a non-Rayleigh addition to the PDFs that increases with the six-wave interaction constant α and disappears in the absence of six-wave interactions (α = 0). References: [1] D.S. Agafontsev, V.E. Zakharov, Rogue waves statistics in the framework of one-dimensional Generalized Nonlinear Schrodinger Equation, arXiv:1202.5763v3.

  8. A Matched Filter Technique for Slow Radio Transient Detection and First Demonstration with the Murchison Widefield Array

    NASA Astrophysics Data System (ADS)

    Feng, L.; Vaulin, R.; Hewitt, J. N.; Remillard, R.; Kaplan, D. L.; Murphy, Tara; Kudryavtseva, N.; Hancock, P.; Bernardi, G.; Bowman, J. D.; Briggs, F.; Cappallo, R. J.; Deshpande, A. A.; Gaensler, B. M.; Greenhill, L. J.; Hazelton, B. J.; Johnston-Hollitt, M.; Lonsdale, C. J.; McWhirter, S. R.; Mitchell, D. A.; Morales, M. F.; Morgan, E.; Oberoi, D.; Ord, S. M.; Prabu, T.; Udaya Shankar, N.; Srivani, K. S.; Subrahmanyan, R.; Tingay, S. J.; Wayth, R. B.; Webster, R. L.; Williams, A.; Williams, C. L.

    2017-03-01

    Many astronomical sources produce transient phenomena at radio frequencies, but the transient sky at low frequencies (<300 MHz) remains relatively unexplored. Blind surveys with new wide-field radio instruments are setting increasingly stringent limits on the transient surface density on various timescales. Although many of these instruments are limited by classical confusion noise from an ensemble of faint, unresolved sources, one can in principle detect transients below the classical confusion limit to the extent that the classical confusion noise is independent of time. We develop a technique for detecting radio transients that is based on temporal matched filters applied directly to time series of images, rather than relying on source-finding algorithms applied to individual images. This technique has well-defined statistical properties and is applicable to variable and transient searches for both confusion-limited and non-confusion-limited instruments. Using the Murchison Widefield Array as an example, we demonstrate that the technique works well on real data despite the presence of classical confusion noise, sidelobe confusion noise, and other systematic errors. We searched for transients lasting between 2 minutes and 3 months. We found no transients and set improved upper limits on the transient surface density at 182 MHz for flux densities between ˜20 and 200 mJy, providing the best limits to date for hour- and month-long transients.
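    The core of the technique, correlating each pixel's image time series with a transient template and thresholding the resulting signal-to-noise ratio, can be sketched as follows; the Gaussian flare template, noise level, and injected amplitude are illustrative assumptions, not values from the paper:

```python
import numpy as np

def matched_filter_snr(light_curve, template, noise_sigma):
    """Correlate one pixel's image time series with a transient template
    (normalized to unit energy) and return the peak signal-to-noise ratio."""
    template = template / np.sqrt(np.sum(template ** 2))
    snr = np.correlate(light_curve, template, mode="valid") / noise_sigma
    return snr.max()

rng = np.random.default_rng(2)
sigma = 1.0                                    # per-epoch image noise
template = np.exp(-0.5 * ((np.arange(20) - 10) / 3.0) ** 2)  # Gaussian flare

quiet = sigma * rng.standard_normal(200)       # steady (noise-only) pixel
transient = quiet.copy()
transient[90:110] += 5.0 * template            # inject a transient

snr_quiet = matched_filter_snr(quiet, template, sigma)
snr_transient = matched_filter_snr(transient, template, sigma)
```

    Because the filter operates on the time series rather than on per-epoch source detections, a time-independent classical confusion background contributes no variability signal, which is why the method can probe below the classical confusion limit.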

  9. Intertooth patterns of hypoplasia expression: implications for childhood health in the classic Maya collapse.

    PubMed

    Wright, L E

    1997-02-01

    Enamel hypoplasias, which record interacting stresses of nutrition and illness during the period of tooth formation, are a key tool in the study of childhood health in prehistory. But interpretation of the age of peak morbidity is complicated by differences in susceptibility to stress both between tooth positions and within a single tooth. Here, hypoplasias are used to evaluate the prevailing ecological model for the collapse of Classic Period Lowland Maya civilization, circa AD 900. Hypoplasias were recorded in the full dentition of 160 adult skeletons from six archaeological sites in the Pasion River region of Guatemala. Instead of constructing a composite scale of stress experience, teeth are considered separately by position in the analysis. No statistical differences are found in the proportion of teeth affected by hypoplasia between "Early," Late Classic, and Terminal Classic Periods for anterior teeth considered to be most susceptible to stress, indicating stability in the overall stress loads affecting children of the three chronological periods. However, hypoplasia trends in posterior teeth may imply a change in the ontogenetic timing of more severe stress episodes during the final occupation and perhaps herald a shift in child-care practices. These results provide little support for the ecological model of collapse but do call attention to the potential of posterior teeth to reveal subtle changes in childhood morbidity when considered individually.

  10. Life satisfaction and school performance of children exposed to classic and cyber peer bullying.

    PubMed

    Bilić, Vesna; Flander, Gordana Buljan; Rafajac, Branko

    2014-03-01

    This paper analyses the relationship between the exposure of school children to various forms of peer bullying (classic/cyber) and their life satisfaction in the domains of school, family, friends, and school performance. The sample included 562 children from rural and urban areas of Croatia who were attending the seventh and eighth grades of primary school. Results show that children were more often exposed to classic forms of peer bullying, especially verbal, followed by physical bullying. Cyber bullying, on the other hand, most often comprised harassment in forums, blogs, chats or social networks, followed by harassment on the web, by e-mail, and by mobile phone. Almost half of the examinees knew the identity of the bully, while a minority believed that the bullies were the same ones who also physically abused them at school. We found that children exposed to all forms of both classic and cyber bullying, unlike their peers without such experience, showed less satisfaction with friends, while those exposed to physical and cyber bullying showed dissatisfaction with their family, too. However, no statistically significant difference was found in their satisfaction with school. Children exposed to physical bullying showed poorer school performance and poorer achievement in Croatian and math, while children exposed to verbal and cyber bullying and children who were not exposed to such forms of bullying showed no differences in their school achievement.

  11. The midline central artery forehead flap: a valid alternative to supratrochlear-based forehead flaps.

    PubMed

    Faris, Callum; van der Eerden, Paul; Vuyk, Hade

    2015-01-01

    This study clarifies the pedicle geometry and vascular supply of a midline forehead flap for nasal reconstruction. It reports on the vascular reliability of this flap and its ability to reduce hair transposition to the nose, a major complicating factor of previous forehead flap designs. To compare the vascular reliability of 3 different pedicle designs of the forehead flap in nasal reconstruction (classic paramedian, glabellar paramedian, and central artery flap design) and evaluate hair transposition rates and aesthetic results. Retrospective analysis of patient data and outcomes retrieved from computer files generated at the time of surgery, supplemented by data from the patient medical records and photographic documentation, from a tertiary referral nasal reconstructive practice, within a secondary-care hospital setting. The study population included all consecutive patients over a 19-year period who underwent primary forehead flap repair of nasal defects, with more than 3 months of postoperative follow-up and photographic documentation. Three sequential forehead flap patterns were used (classic paramedian flap, glabella flap, and central artery flap) for nasal reconstruction over the study duration. Data collected included patient characteristics, method of repair, complications, functional outcome, and patient satisfaction score. For cosmetic outcome, photographic documentation was scored by a medical juror. No forehead flap had vascular compromise in the first stage. Partial flap necrosis was reported in subsequent stages in 4 patients (1%), with no statistical difference in the rate of vascular compromise between the 3 flap designs. Hair transposition to the nose was lower in the central artery forehead flap (7%) compared with the classic paramedian (23%) and glabellar paramedian (13%) flaps (P < .05). Photographic evaluation in 227 patients showed that brow position (98%) and color match (83%) were good in the majority of the patients. 
In this series, the central artery forehead flap was as reliable (in terms of vascularity) as the glabellar and classic paramedian forehead flaps, and its use resulted in a statistically significant reduction in the transfer of hair to the nose.

  12. Delta13C and delta18O isotopic composition of CaCO3 measured by continuous flow isotope ratio mass spectrometry: statistical evaluation and verification by application to Devils Hole core DH-11 calcite.

    PubMed

    Révész, Kinga M; Landwehr, Jurate M

    2002-01-01

    A new method was developed to analyze the stable carbon and oxygen isotope ratios of small samples (400 ± 20 µg) of calcium carbonate. This new method streamlines the classical phosphoric acid/calcium carbonate (H3PO4/CaCO3) reaction method by making use of a recently available Thermoquest-Finnigan GasBench II preparation device and a Delta Plus XL continuous flow isotope ratio mass spectrometer. Conditions for which the H3PO4/CaCO3 reaction produced reproducible and accurate results with minimal error had to be determined. When the acid/carbonate reaction temperature was kept at 26 °C and the reaction time was between 24 and 54 h, the precision of the carbon and oxygen isotope ratios for pooled samples from three reference standard materials was

  13. A Bayesian approach to meta-analysis of plant pathology studies.

    PubMed

    Mila, A L; Ngugi, H K

    2011-01-01

    Bayesian statistical methods are used for meta-analysis in many disciplines, including medicine, molecular biology, and engineering, but have not yet been applied for quantitative synthesis of plant pathology studies. In this paper, we illustrate the key concepts of Bayesian statistics and outline the differences between Bayesian and classical (frequentist) methods in the way parameters describing population attributes are considered. We then describe a Bayesian approach to meta-analysis and present a plant pathological example based on studies evaluating the efficacy of plant protection products that induce systemic acquired resistance for the management of fire blight of apple. In a simple random-effects model assuming a normal distribution of effect sizes and no prior information (i.e., a noninformative prior), the results of the Bayesian meta-analysis are similar to those obtained with classical methods. Implementing the same model with a Student's t distribution and a noninformative prior for the effect sizes, instead of a normal distribution, yields similar results for all but acibenzolar-S-methyl (Actigard), which was evaluated in only seven studies in this example. Whereas both the classical (P = 0.28) and the Bayesian analysis with a noninformative prior (95% credibility interval [CRI] for the log response ratio: -0.63 to 0.08) indicate a nonsignificant effect for Actigard, specifying a t distribution resulted in a significant, albeit variable, effect for this product (CRI: -0.73 to -0.10). These results confirm the sensitivity of the analytical outcome (i.e., the posterior distribution) to the choice of prior in Bayesian meta-analyses involving a limited number of studies. 
We review some pertinent literature on more advanced topics, including modeling of among-study heterogeneity, publication bias, analyses involving a limited number of studies, and methods for dealing with missing data, and show how these issues can be approached in a Bayesian framework. Bayesian meta-analysis can readily include information not easily incorporated in classical methods, and allow for a full evaluation of competing models. Given the power and flexibility of Bayesian methods, we expect them to become widely adopted for meta-analysis of plant pathology studies.
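A minimal sketch of the simple random-effects setup described above (illustrative, not the authors' implementation): with a flat (noninformative) prior on the overall effect and a known between-study variance tau², the posterior of the overall log response ratio is Gaussian with an inverse-variance weighted mean, which is why the noninformative-prior Bayesian results coincide with the classical ones. Function names are assumptions.

```python
import math

def bayes_meta(y, v, tau2=0.0):
    """Posterior of the overall effect mu under a normal random-effects
    model with known between-study variance tau2 and a flat prior on mu.
    y: per-study effect estimates; v: per-study sampling variances."""
    w = [1.0 / (vi + tau2) for vi in v]
    wsum = sum(w)
    mean = sum(wi * yi for wi, yi in zip(w, y)) / wsum
    var = 1.0 / wsum
    return mean, var

def credible_interval(mean, var, z=1.96):
    """Central 95% credibility interval of the Gaussian posterior."""
    sd = math.sqrt(var)
    return mean - z * sd, mean + z * sd
```

With tau2 = 0 this reduces to the classical fixed-effects estimate, illustrating the agreement noted in the abstract.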

  14. Sanov and central limit theorems for output statistics of quantum Markov chains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horssen, Merlijn van, E-mail: merlijn.vanhorssen@nottingham.ac.uk; Guţă, Mădălin, E-mail: madalin.guta@nottingham.ac.uk

    2015-02-15

    In this paper, we consider the statistics of repeated measurements on the output of a quantum Markov chain. We establish a large deviations result analogous to Sanov’s theorem for the multi-site empirical measure associated to finite sequences of consecutive outcomes of a classical stochastic process. Our result relies on the construction of an extended quantum transition operator (which keeps track of previous outcomes) in terms of which we compute moment generating functions, and whose spectral radius is related to the large deviations rate function. As a corollary to this, we obtain a central limit theorem for the empirical measure. Such higher level statistics may be used to uncover critical behaviour such as dynamical phase transitions, which are not captured by lower level statistics such as the sample mean. As a step in this direction, we give an example of a finite system whose level-1 (empirical mean) rate function is independent of a model parameter while the level-2 (empirical measure) rate is not.

  15. Quantile regression for the statistical analysis of immunological data with many non-detects.

    PubMed

    Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth

    2012-07-07

    Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an application to real data from a clinical trial. We show that by using quantile regression, groups can be compared and meaningful linear trends can be computed, even if more than half of the data consists of non-detects. Quantile regression is a valuable addition to the statistical methods that can be used for the analysis of immunological datasets with non-detects.
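A minimal sketch of the idea (not the authors' implementation): median regression minimizes the pinball (check) loss, so non-detects can simply be set to the detection limit before fitting, and the median fit is unaffected as long as fewer than half the observations are censored. The helper names are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def pinball(res, q):
    """Check-function loss: q*res for res >= 0, (q-1)*res otherwise."""
    return np.where(res >= 0, q * res, (q - 1) * res).sum()

def quantreg(x, y, q=0.5):
    """Fit y ~ a + b*x at quantile q by minimizing the pinball loss,
    starting from the least-squares line and refining derivative-free."""
    def loss(beta):
        return pinball(y - (beta[0] + beta[1] * x), q)
    b0 = np.polyfit(x, y, 1)[::-1]           # [intercept, slope] start
    res = minimize(loss, b0, method="Nelder-Mead",
                   options={"xatol": 1e-8, "fatol": 1e-8})
    return res.x                              # [intercept, slope]
```

For data with non-detects one would first do `y = np.maximum(y, limit)` with `limit` the detection limit; the q = 0.5 fit is insensitive to the exact values assigned below the limit.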

  16. Inverse statistical estimation via order statistics: a resolution of the ill-posed inverse problem of PERT scheduling

    NASA Astrophysics Data System (ADS)

    Pickard, William F.

    2004-10-01

    The classical PERT inverse statistics problem requires estimation of the mean, m̄, and standard deviation, s, of a unimodal distribution given estimates of its mode, m, and of the smallest, a, and largest, b, values likely to be encountered. After placing the problem in historical perspective and showing that it is ill-posed because it is underdetermined, this paper offers an approach to resolve the ill-posedness: (a) by interpreting a and b as modes of order statistic distributions; (b) by requiring also an estimate of the number of samples, N, considered in estimating the set {m, a, b}; and (c) by maximizing a suitable likelihood, having made the traditional assumption that the underlying distribution is beta. Exact formulae relating the four parameters of the beta distribution to {m, a, b, N} and the assumed likelihood function are then used to compute the four underlying parameters of the beta distribution; and from them, m̄ and s are computed using exact formulae.
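The traditional point estimates that this problem generalizes are the classical PERT approximations, m̄ ≈ (a + 4m + b)/6 and s ≈ (b − a)/6; the paper replaces these with an order-statistics maximum-likelihood treatment. A minimal sketch of the classical formulas only:

```python
def pert_estimates(a, m, b):
    """Classical PERT approximations (not the paper's order-statistic MLE):
    mean ~ (a + 4m + b)/6, standard deviation ~ (b - a)/6."""
    mean = (a + 4.0 * m + b) / 6.0
    sd = (b - a) / 6.0
    return mean, sd
```

For example, optimistic a = 2, most likely m = 5, pessimistic b = 14 gives m̄ = 6 and s = 2.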

  17. Comment on Pisarenko et al., "Characterization of the Tail of the Distribution of Earthquake Magnitudes by Combining the GEV and GPD Descriptions of Extreme Value Theory"

    NASA Astrophysics Data System (ADS)

    Raschke, Mathias

    2016-02-01

    In this short note, I comment on the research of Pisarenko et al. (Pure Appl. Geophys 171:1599-1624, 2014) regarding extreme value theory and statistics in the case of earthquake magnitudes. The link between the generalized extreme value distribution (GEVD) as an asymptotic model for the block maxima of a random variable and the generalized Pareto distribution (GPD) as a model for the peaks over threshold (POT) of the same random variable is presented more clearly. Pisarenko et al. (Pure Appl. Geophys 171:1599-1624, 2014) have inappropriately neglected to note that the approximations by the GEVD and GPD work only asymptotically in most cases. This is particularly the case for the truncated exponential distribution (TED), a popular distribution model for earthquake magnitudes. I explain why the classical models and methods of extreme value theory and statistics do not work well for truncated exponential distributions. Consequently, these classical methods should not be used for the estimation of the upper bound magnitude and corresponding parameters. Furthermore, I comment on various issues of statistical inference in Pisarenko et al. and propose alternatives. I argue why the GPD and GEVD would work for various types of stochastic earthquake processes in time, and not only for the homogeneous (stationary) Poisson process assumed by Pisarenko et al. (Pure Appl. Geophys 171:1599-1624, 2014). The crucial point for earthquake magnitudes is the poor convergence of their tail distribution to the GPD, not the earthquake process over time.

  18. Geostatistics - a tool applied to the distribution of Legionella pneumophila in a hospital water system.

    PubMed

    Laganà, Pasqualina; Moscato, Umberto; Poscia, Andrea; La Milia, Daniele Ignazio; Boccia, Stefania; Avventuroso, Emanuela; Delia, Santi

    2015-01-01

    Legionnaires' disease is normally acquired by inhalation of legionellae from a contaminated environmental source. Water systems of large buildings, such as hospitals, are often contaminated with legionellae and therefore represent a potential risk for the hospital population. The aim of this study was to evaluate the potential contamination by Legionella pneumophila (LP) in a large hospital in Italy through georeferential statistical analysis, to assess the possible sources of dispersion and, consequently, the risk of exposure for both health care staff and patients. The distribution of LP serogroups 1 and 2-14 was considered in the wards housed on two consecutive floors of the hospital building. On the basis of information provided by 53 bacteriological analyses, a 'random' grid of points was chosen, and spatial geostatistics (FAIk kriging) was applied and compared with the results of classical statistical analysis. Over 50% of the examined samples were positive for Legionella pneumophila. LP 1 was isolated in 69% of samples from the ground floor and in 60% of samples from the first floor; LP 2-14 in 36% of samples from the ground floor and 24% from the first. The iso-estimation maps clearly show the most contaminated pipe and the difference in the diffusion of the different L. pneumophila serogroups. This experimental work demonstrates that geostatistical methods applied to the microbiological analysis of water matrices allow better modeling of the phenomenon under study, greater potential for risk management, and a wider choice of methods of prevention and environmental recovery compared with classical statistical analysis.

  19. The coordinate-based meta-analysis of neuroimaging data.

    PubMed

    Samartsidis, Pantelis; Montagna, Silvia; Nichols, Thomas E; Johnson, Timothy D

    2017-01-01

    Neuroimaging meta-analysis is an area of growing interest in statistics. The special characteristics of neuroimaging data render classical meta-analysis methods inapplicable and therefore new methods have been developed. We review existing methodologies, explaining the benefits and drawbacks of each. A demonstration on a real dataset of emotion studies is included. We discuss some still-open problems in the field to highlight the need for future research.

  20. [Organic brain syndrome].

    PubMed

    Hojaij, C R

    1984-12-01

    Organic Brain Syndrome (OBS) is an expression found in the Diagnostic and Statistical Manual of Mental Disorders, belonging to the broad chapter of Organic Mental Disorders. With this meaning, it has been used in psychiatric centers outside the United States. Beginning with a review of the major aspects of the OBS, a critical revision is formulated from the methodological and conceptual viewpoints of psychopathology. To that end, classic authors from Bonhoeffer to Weitbrecht are reviewed.

  1. Statistical Inference and Reverse Engineering of Gene Regulatory Networks from Observational Expression Data

    PubMed Central

    Emmert-Streib, Frank; Glazko, Galina V.; Altay, Gökmen; de Matos Simoes, Ricardo

    2012-01-01

    In this paper, we present a systematic and conceptual overview of methods for inferring gene regulatory networks from observational gene expression data. Further, we discuss two classic approaches to infer causal structures and compare them with contemporary methods by providing a conceptual categorization thereof. We complement the above by surveying global and local evaluation measures for assessing the performance of inference algorithms. PMID:22408642

  2. The coordinate-based meta-analysis of neuroimaging data

    PubMed Central

    Samartsidis, Pantelis; Montagna, Silvia; Nichols, Thomas E.; Johnson, Timothy D.

    2017-01-01

    Neuroimaging meta-analysis is an area of growing interest in statistics. The special characteristics of neuroimaging data render classical meta-analysis methods inapplicable and therefore new methods have been developed. We review existing methodologies, explaining the benefits and drawbacks of each. A demonstration on a real dataset of emotion studies is included. We discuss some still-open problems in the field to highlight the need for future research. PMID:29545671

  3. Introduction to Geostatistics

    NASA Astrophysics Data System (ADS)

    Kitanidis, P. K.

    1997-05-01

    Introduction to Geostatistics presents practical techniques for engineers and earth scientists who routinely encounter interpolation and estimation problems when analyzing data from field observations. Requiring no background in statistics, and with a unique approach that synthesizes classic and geostatistical methods, this book offers linear estimation methods for practitioners and advanced students. Well illustrated with exercises and worked examples, Introduction to Geostatistics is designed for graduate-level courses in earth sciences and environmental engineering.

  4. Solar-cell interconnect design for terrestrial photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1984-01-01

    Useful solar cell interconnect reliability design and life prediction algorithms are presented, together with experimental data indicating that the classical strain cycle (fatigue) curve for the interconnect material does not account for the statistical scatter that is required in reliability predictions. This shortcoming is presently addressed by fitting a functional form to experimental cumulative interconnect failure rate data, which thereby yields statistical fatigue curves enabling not only the prediction of cumulative interconnect failures during the design life of an array field, but also the quantitative interpretation of data from accelerated thermal cycling tests. Optimal interconnect cost reliability design algorithms are also derived which may allow the minimization of energy cost over the design life of the array field.
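A common functional form for fitting cumulative failure data such as that described above is the two-parameter Weibull distribution (an assumption for illustration; the paper does not specify its functional form). The sketch below recovers the shape (scatter) and scale parameters by median-rank regression on the linearized CDF, ln(−ln(1 − F)) = β ln n − β ln η:

```python
import numpy as np

def weibull_fit(cycles):
    """Fit a two-parameter Weibull to cycles-to-failure data by
    median-rank regression on ln(-ln(1-F)) = beta*ln(n) - beta*ln(eta)."""
    n = np.sort(np.asarray(cycles, dtype=float))
    k = len(n)
    F = (np.arange(1, k + 1) - 0.3) / (k + 0.4)   # Bernard's median ranks
    yy = np.log(-np.log(1.0 - F))
    xx = np.log(n)
    beta, c = np.polyfit(xx, yy, 1)               # slope, intercept
    eta = np.exp(-c / beta)                       # scale from intercept
    return beta, eta
```

The fitted shape parameter β quantifies exactly the statistical scatter that a single deterministic strain-cycle curve cannot express.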

  5. Statistical Analysis for Collision-free Boson Sampling.

    PubMed

    Huang, He-Liang; Zhong, Han-Sen; Li, Tan; Li, Feng-Guang; Fu, Xiang-Qun; Zhang, Shuo; Wang, Xiang; Bao, Wan-Su

    2017-11-10

    Boson sampling is strongly believed to be intractable for classical computers but solvable with photons in linear optics, and has therefore attracted widespread attention as a rapid way to demonstrate quantum supremacy. However, because its solution is mathematically unverifiable, certifying the experimental results is a major difficulty in boson sampling experiments. Here, we develop a statistical analysis scheme to experimentally certify collision-free boson sampling. Numerical simulations are performed to show the feasibility and practicability of our scheme, and the effects of realistic experimental conditions are also considered, demonstrating that our proposed scheme is experimentally friendly. Moreover, our broad approach is expected to apply generally to the investigation of multi-particle coherent dynamics beyond boson sampling.

  6. Meteorological regimes for the classification of aerospace air quality predictions for NASA-Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Stephens, J. B.; Sloan, J. C.

    1976-01-01

    A method is described for developing a statistical air quality assessment for the launch of an aerospace vehicle from the Kennedy Space Center in terms of existing climatological data sets. The procedure can be refined as developing meteorological conditions are identified for use with the NASA-Marshall Space Flight Center Rocket Exhaust Effluent Diffusion (REED) description. Classical climatological regimes for the long range analysis can be narrowed as the synoptic and mesoscale structure is identified. Only broad synoptic regimes are identified at this stage of analysis. As the statistical data matrix is developed, synoptic regimes will be refined in terms of the resulting eigenvectors as applicable to aerospace air quality predictions.

  7. Statistics of work performed on a forced quantum oscillator.

    PubMed

    Talkner, Peter; Burada, P Sekhar; Hänggi, Peter

    2008-07-01

    Various aspects of the statistics of work performed by an external classical force on a quantum mechanical system are elucidated for a driven harmonic oscillator. In this special case two parameters are introduced that are sufficient to completely characterize the force protocol. Explicit results for the characteristic function of work and the corresponding probability distribution are provided and discussed for three different types of initial states of the oscillator: microcanonical, canonical, and coherent states. Depending on the choice of the initial state the probability distributions of the performed work may greatly differ. This result in particular also holds true for identical force protocols. General fluctuation and work theorems holding for microcanonical and canonical initial states are confirmed.

  8. Solar-cell interconnect design for terrestrial photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1984-11-01

    Useful solar cell interconnect reliability design and life prediction algorithms are presented, together with experimental data indicating that the classical strain cycle (fatigue) curve for the interconnect material does not account for the statistical scatter that is required in reliability predictions. This shortcoming is presently addressed by fitting a functional form to experimental cumulative interconnect failure rate data, which thereby yields statistical fatigue curves enabling not only the prediction of cumulative interconnect failures during the design life of an array field, but also the quantitative interpretation of data from accelerated thermal cycling tests. Optimal interconnect cost reliability design algorithms are also derived which may allow the minimization of energy cost over the design life of the array field.

  9. Use of the Digamma Function in Statistical Astrophysics Distributions

    NASA Astrophysics Data System (ADS)

    Cahill, Michael

    2017-06-01

    Relaxed astrophysical statistical distributions may be constructed by using the inverse of a most-probable energy distribution equation giving the energy e_i of each particle in cell i in terms of the cell's particle population N_i. The digamma-mediated equation is A + B e_i = Ψ(1 + N_i), where the constants A and B are Lagrange multipliers and Ψ is the digamma function given by Ψ(1 + x) = d ln(x!)/dx. Results are discussed for a monatomic ideal gas, atmospheres of spherical planets or satellites, and spherical globular clusters. These distributions are self-terminating even if other factors do not cause a cutoff. The examples are discussed classically, but relativistic extensions are possible.
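Because Ψ(1 + N) is strictly increasing in N, the mediated equation can be inverted numerically to recover a cell population from its energy. A minimal sketch (illustrative; in practice A and B would be fixed by the particle-number and energy constraints):

```python
from scipy.optimize import brentq
from scipy.special import digamma

def population(e, A, B):
    """Invert A + B*e = psi(1 + N) for the cell population N,
    exploiting monotonicity of the digamma function psi(1 + N)."""
    target = A + B * e
    # bracket is wide: psi(1+N) runs from ~-0.577 up to ~ln(1e12)
    return brentq(lambda N: digamma(1.0 + N) - target, 1e-12, 1e12)
```

For instance, choosing the right-hand side equal to Ψ(6) recovers N = 5.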

  10. Statistical thermodynamics of long straight rigid rods on triangular lattices: nematic order and adsorption thermodynamic functions.

    PubMed

    Matoz-Fernandez, D A; Linares, D H; Ramirez-Pastor, A J

    2012-09-04

    The statistical thermodynamics of straight rigid rods of length k on triangular lattices was developed on a generalization in the spirit of the lattice-gas model and the classical Guggenheim-DiMarzio approximation. In this scheme, the Helmholtz free energy and its derivatives were written in terms of the order parameter, δ, which characterizes the nematic phase occurring in the system at intermediate densities. Then, using the principle of minimum free energy with δ as a parameter, the main adsorption properties were calculated. Comparisons with Monte Carlo simulations and experimental data were performed in order to evaluate the outcome and limitations of the theoretical model.

  11. Entropy in sound and vibration: towards a new paradigm

    PubMed Central

    2017-01-01

    This paper describes a discussion of the method and status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of exchange of energy analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of the information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart. PMID:28265190

  12. Statistical power and utility of meta-analysis methods for cross-phenotype genome-wide association studies.

    PubMed

    Zhu, Zhaozhong; Anttila, Verneri; Smoller, Jordan W; Lee, Phil H

    2018-01-01

    Recent advances in genome-wide association studies (GWAS) suggest that pleiotropic effects on human complex traits are widespread. A number of classic and recent meta-analysis methods have been used to identify genetic loci with pleiotropic effects, but the overall performance of these methods is not well understood. In this work, we use extensive simulations and case studies of GWAS datasets to investigate the power and type-I error rates of ten meta-analysis methods. We specifically focus on three conditions commonly encountered in studies of multiple traits: (1) extensive heterogeneity of genetic effects; (2) characterization of trait-specific association; and (3) inflated correlation of GWAS due to overlapping samples. Although statistical power is highly variable under distinct study conditions, we found that several methods have superior power under diverse heterogeneity. In particular, the classic fixed-effects model showed surprisingly good performance when a variant is associated with more than half of the study traits. As the number of traits with null effects increases, ASSET performed best, with competitive specificity and sensitivity. With opposite directional effects, CPASSOC showed first-rate power. However, caution is advised when using CPASSOC for studying genetically correlated traits with overlapping samples. We conclude with a discussion of unresolved issues and directions for future research.
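The classic fixed-effects model referred to above is inverse-variance weighting of per-trait effect estimates; a minimal sketch, with Cochran's Q as the usual heterogeneity check (illustrative, not the paper's code):

```python
import math

def fixed_effects(betas, ses):
    """Inverse-variance fixed-effects meta-analysis: pooled effect,
    its standard error, the z statistic, and Cochran's Q heterogeneity."""
    w = [1.0 / s ** 2 for s in ses]
    pooled = sum(wi * bi for wi, bi in zip(w, betas)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    q = sum(wi * (bi - pooled) ** 2 for wi, bi in zip(w, betas))
    return pooled, se, pooled / se, q
```

Opposite directional effects cancel in `pooled` while inflating `q`, which is exactly why the fixed-effects model loses power in that regime and methods such as CPASSOC gain.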

  13. Tuning the Photon Statistics of a Strongly Coupled Nanophotonic System

    NASA Astrophysics Data System (ADS)

    Dory, C.; Fischer, K. A.; Müller, K.; Lagoudakis, K. G.; Sarmiento, T.; Rundquist, A.; Zhang, J. L.; Kelaita, Y.; Sapra, N. V.; Vučković, J.

    Strongly coupled quantum-dot-photonic-crystal cavity systems provide a nonlinear ladder of hybridized light-matter states, which are a promising platform for non-classical light generation. The transmission of light through such systems enables light generation with tunable photon counting statistics. By detuning the frequencies of quantum emitter and cavity, we can tune the transmission of light to strongly enhance either single- or two-photon emission processes. However, these nanophotonic systems show a strongly dissipative nature and classical light obscures any quantum character of the emission. In this work, we utilize a self-homodyne interference technique combined with frequency-filtering to overcome this obstacle. This allows us to generate emission with a strong two-photon component in the multi-photon regime, where we measure a second-order coherence value of g(2)[0] = 1.490 ± 0.034. We propose rate equation models that capture the dominant processes of emission both in the single- and multi-photon regimes and support them by quantum-optical simulations that fully capture the frequency filtering of emission from our solid-state system. Finally, we simulate a third-order coherence value of g(3)[0] = 0.872 ± 0.021. Army Research Office (ARO) (W911NF1310309), National Science Foundation (1503759), Stanford Graduate Fellowship.

  14. Modeling failure in brittle porous ceramics

    NASA Astrophysics Data System (ADS)

    Keles, Ozgur

    Brittle porous materials (BPMs) are used for battery, fuel cell, catalyst, membrane, filter, bone graft, and pharmacy applications due to the multi-functionality of their underlying porosity. However, in spite of its technological benefits, the effects of porosity on BPM fracture strength and Weibull statistics are not fully understood, limiting wider use. In this context, classical fracture mechanics was combined with two-dimensional finite element simulations not only to account for pore-pore stress interactions, but also to numerically quantify the relationship between the local pore volume fraction and fracture statistics. Simulations show that even microstructures with the same porosity level and pore size differ substantially in fracture strength. The maximum reliability of BPMs was shown to be limited by the underlying pore-pore interactions. The fracture strength of BPMs decreases at a faster rate under biaxial loading than under uniaxial loading. Three different types of deviation from classic Weibull behavior are identified: P-type, corresponding to a positive lower-tail deviation; N-type, corresponding to a negative lower-tail deviation; and S-type, corresponding to both positive upper- and lower-tail deviations. Pore-pore interactions result in either P-type or N-type deviation in the limit of low porosity, whereas S-type behavior occurs when clusters of low and high fracture strengths coexist in the fracture data.

  15. Nonlinear multivariate and time series analysis by neural network methods

    NASA Astrophysics Data System (ADS)

    Hsieh, William W.

    2004-03-01

    Methods in multivariate statistical analysis are essential for working with large amounts of geophysical data, data from observational arrays, from satellites, or from numerical model output. In classical multivariate statistical analysis, there is a hierarchy of methods, starting with linear regression at the base, followed by principal component analysis (PCA) and finally canonical correlation analysis (CCA). A multivariate time series method, the singular spectrum analysis (SSA), has been a fruitful extension of the PCA technique. The common drawback of these classical methods is that only linear structures can be correctly extracted from the data. Since the late 1980s, neural network methods have become popular for performing nonlinear regression and classification. More recently, neural network methods have been extended to perform nonlinear PCA (NLPCA), nonlinear CCA (NLCCA), and nonlinear SSA (NLSSA). This paper presents a unified view of the NLPCA, NLCCA, and NLSSA techniques and their applications to various data sets of the atmosphere and the ocean (especially for the El Niño-Southern Oscillation and the stratospheric quasi-biennial oscillation). These data sets reveal that the linear methods are often too simplistic to describe real-world systems, with a tendency to scatter a single oscillatory phenomenon into numerous unphysical modes or higher harmonics, which can be largely alleviated in the new nonlinear paradigm.

  16. Context-invariant quasi hidden variable (qHV) modelling of all joint von Neumann measurements for an arbitrary Hilbert space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loubenets, Elena R.

    We prove the existence for each Hilbert space of the two new quasi hidden variable (qHV) models, statistically noncontextual and context-invariant, reproducing all the von Neumann joint probabilities via non-negative values of real-valued measures and all the quantum product expectations—via the qHV (classical-like) average of the product of the corresponding random variables. In a context-invariant model, a quantum observable X can be represented by a variety of random variables satisfying the functional condition required in quantum foundations but each of these random variables equivalently models X under all joint von Neumann measurements, regardless of their contexts. The proved existence of this model negates the general opinion that, in terms of random variables, the Hilbert space description of all the joint von Neumann measurements for dim H ≥ 3 can be reproduced only contextually. The existence of a statistically noncontextual qHV model, in particular, implies that every N-partite quantum state admits a local quasi hidden variable model introduced in Loubenets [J. Math. Phys. 53, 022201 (2012)]. The new results of the present paper point also to the generality of the quasi-classical probability model proposed in Loubenets [J. Phys. A: Math. Theor. 45, 185306 (2012)].

  17. Experimental evaluation of nonclassical correlations between measurement outcomes and target observable in a quantum measurement

    NASA Astrophysics Data System (ADS)

    Iinuma, Masataka; Suzuki, Yutaro; Nii, Taiki; Kinoshita, Ryuji; Hofmann, Holger F.

    2016-03-01

    In general, it is difficult to evaluate measurement errors when the initial and final conditions of the measurement make it impossible to identify the correct value of the target observable. Ozawa proposed a solution based on the operator algebra of observables which has recently been used in experiments investigating the error-disturbance trade-off of quantum measurements. Importantly, this solution makes surprisingly detailed statements about the relations between measurement outcomes and the unknown target observable. In the present paper, we investigate this relation by performing a sequence of two measurements on the polarization of a photon, so that the first measurement commutes with the target observable and the second measurement is sensitive to a complementary observable. While the initial measurement can be evaluated using classical statistics, the second measurement introduces the effects of quantum correlations between the noncommuting physical properties. By varying the resolution of the initial measurement, we can change the relative contribution of the nonclassical correlations and identify their role in the evaluation of the quantum measurement. It is shown that the most striking deviation from classical expectations is obtained at the transition between weak and strong measurements, where the competition between different statistical effects results in measurement values well outside the range of possible eigenvalues.

  18. Quantum-Like Bayesian Networks for Modeling Decision Making

    PubMed Central

    Moreira, Catarina; Wichert, Andreas

    2016-01-01

    In this work, we explore an alternative quantum structure to perform quantum probabilistic inferences to accommodate the paradoxical findings of the Sure Thing Principle. We propose a Quantum-Like Bayesian Network, which replaces classical probabilities with quantum probability amplitudes. However, since this approach suffers from exponential growth in the number of quantum parameters, we also propose a similarity heuristic that automatically fits quantum parameters through vector similarities. This makes the proposed model general and predictive, in contrast to the current state-of-the-art models, which cannot be generalized to more complex decision scenarios and only provide an explanatory account of the observed paradoxes. In the end, the model that we propose is a nonparametric method for estimating inference effects from a statistical point of view. It is a statistical model that is simpler than the quantum dynamic and quantum-like models previously proposed in the literature. We tested the proposed network with several empirical datasets from the literature, mainly from the Prisoner's Dilemma game and the Two Stage Gambling game. The results show that the proposed quantum-like Bayesian network is a general method that can accommodate violations of the laws of classical probability theory and make accurate predictions regarding human decision-making in these scenarios. PMID:26858669
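The core replacement the abstract describes, probabilities from sums of amplitudes rather than sums of probabilities, can be sketched with the quantum-like law of total probability: an interference term is added to the classical law and vanishes at theta = π/2. The names and parametrization below are illustrative assumptions, not the paper's network.

```python
import math

def classical_total(p_b, p_a_given_b, p_a_given_notb):
    """Classical law of total probability: P(A) = sum_b P(B=b) P(A|B=b)."""
    return p_b * p_a_given_b + (1.0 - p_b) * p_a_given_notb

def quantum_like_total(p_b, p_a_given_b, p_a_given_notb, theta):
    """Quantum-like version: amplitudes add before squaring, producing
    an interference term 2*sqrt(...)*cos(theta) absent classically."""
    classical = classical_total(p_b, p_a_given_b, p_a_given_notb)
    interference = 2.0 * math.sqrt(p_b * p_a_given_b *
                                   (1.0 - p_b) * p_a_given_notb) * math.cos(theta)
    return classical + interference
```

Fitting theta (here a free phase) from data is the role played by the similarity heuristic in the proposed network.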

  19. Characteristic functions of quantum heat with baths at different temperatures

    NASA Astrophysics Data System (ADS)

    Aurell, Erik

    2018-06-01

    This paper is about quantum heat defined as the change in energy of a bath during a process. The presentation takes into account recent developments in classical strong-coupling thermodynamics and addresses a version of quantum heat that satisfies quantum-classical correspondence. The characteristic function and the full counting statistics of quantum heat are shown to be formally similar. The paper further shows that the method can be extended to more than one bath, e.g., two baths at different temperatures, which opens up the prospect of studying correlations and heat flow. The paper extends earlier results on the expected quantum heat in the setting of one bath [E. Aurell and R. Eichhorn, New J. Phys. 17, 065007 (2015), 10.1088/1367-2630/17/6/065007; E. Aurell, Entropy 19, 595 (2017), 10.3390/e19110595].

  20. New class of generalized photon-added coherent states and some of their non-classical properties

    NASA Astrophysics Data System (ADS)

    Mojaveri, B.; Dehghani, A.; Mahmoodi, S.

    2014-08-01

    In this paper, we construct a new class of generalized photon-added coherent states (GPACSs), |z, m⟩_r, by excitations on a newly introduced family of generalized coherent states (GCSs) |z⟩_r (A Dehghani and B Mojaveri 2012 J. Phys. A: Math. Theor. 45 095304), obtained via generalized hypergeometric-type displacement operators acting on the vacuum state of the simple harmonic oscillator. We show that these states realize the resolution-of-the-identity property through positive definite measures on the complex plane. Meanwhile, we demonstrate that the introduced states can also be interpreted as nonlinear coherent states (NLCSs) with a special nonlinearity function. Finally, some of their non-classical features as well as their quantum statistical properties are compared with Agarwal's photon-added coherent states (PACSs), |z, m⟩.

  1. The Phenomenology of Small-Scale Turbulence

    NASA Astrophysics Data System (ADS)

    Sreenivasan, K. R.; Antonia, R. A.

    I have sometimes thought that what makes a man's work classic is often just this multiplicity [of interpretations], which invites and at the same time resists our craving for a clear understanding. Wright (1982, p. 34), on Wittgenstein's philosophy Small-scale turbulence has been an area of especially active research in the recent past, and several useful research directions have been pursued. Here, we selectively review this work. The emphasis is on scaling phenomenology and kinematics of small-scale structure. After providing a brief introduction to the classical notions of universality due to Kolmogorov and others, we survey the existing work on intermittency, refined similarity hypotheses, anomalous scaling exponents, derivative statistics, intermittency models, and the structure and kinematics of small-scale structure - the latter aspect coming largely from the direct numerical simulation of homogeneous turbulence in a periodic box.

  2. Efficient multidimensional free energy calculations for ab initio molecular dynamics using classical bias potentials

    NASA Astrophysics Data System (ADS)

    VandeVondele, Joost; Rothlisberger, Ursula

    2000-09-01

    We present a method for calculating multidimensional free energy surfaces within the limited time scale of a first-principles molecular dynamics scheme. The sampling efficiency is enhanced using selected terms of a classical force field as a bias potential. This simple procedure yields a very substantial increase in sampling accuracy while retaining the high quality of the underlying ab initio potential surface and can thus be used for a parameter free calculation of free energy surfaces. The success of the method is demonstrated by the applications to two gas phase molecules, ethane and peroxynitrous acid, as test case systems. A statistical analysis of the results shows that the entire free energy landscape is well converged within a 40 ps simulation at 500 K, even for a system with barriers as high as 15 kcal/mol.

  3. Ghirardi-Rimini-Weber model with massive flashes

    NASA Astrophysics Data System (ADS)

    Tilloy, Antoine

    2018-01-01

    I introduce a modification of the Ghirardi-Rimini-Weber (GRW) model in which the flashes (or space-time collapse events) source a classical gravitational field. The resulting semiclassical theory of Newtonian gravity preserves the statistical interpretation of quantum states of matter, in contrast with mean-field approaches. It can be seen as a discrete version of recent proposals of consistent hybrid quantum-classical theories. The model is in agreement with known experimental data and introduces new falsifiable predictions: (1) single particles do not self-interact, (2) the 1/r gravitational potential of Newtonian gravity is cut off at short (≲ 10⁻⁷ m) distances, and (3) gravity makes spatial superpositions decohere at a rate inversely proportional to that coming from the vanilla GRW model. Together, the last two predictions make the model experimentally falsifiable for all values of its parameters.

  4. Quantum optical signatures in strong-field laser physics: Infrared photon counting in high-order-harmonic generation.

    PubMed

    Gonoskov, I A; Tsatrafyllis, N; Kominis, I K; Tzallas, P

    2016-09-07

    We analytically describe the strong-field light-electron interaction using a quantized coherent laser state with arbitrary photon number. We obtain a light-electron wave function which is a closed-form solution of the time-dependent Schrödinger equation (TDSE). This wave function provides information about the quantum optical features of the interaction that is not accessible to semi-classical theories. With this approach we can reveal the quantum optical properties of the high-harmonic generation (HHG) process in gases by measuring the photon statistics of the transmitted infrared (IR) laser radiation. This work can lead to novel experiments in high-resolution spectroscopy in the extreme ultraviolet (XUV) and attosecond science without the need to measure the XUV light, and it can pave the way for the development of intense non-classical light sources.

  5. Effect of the Modified Glasgow Coma Scale Score Criteria for Mild Traumatic Brain Injury on Mortality Prediction: Comparing Classic and Modified Glasgow Coma Scale Score Model Scores of 13

    PubMed Central

    Mena, Jorge Humberto; Sanchez, Alvaro Ignacio; Rubiano, Andres M.; Peitzman, Andrew B.; Sperry, Jason L.; Gutierrez, Maria Isabel; Puyana, Juan Carlos

    2011-01-01

    Objective The Glasgow Coma Scale (GCS) classifies Traumatic Brain Injuries (TBI) as Mild (14–15), Moderate (9–13), or Severe (3–8). The ATLS modified this classification so that a GCS score of 13 is categorized as mild TBI. We investigated the effect of this modification on mortality prediction, comparing patients with a GCS of 13 classified as moderate TBI (Classic Model) to patients with a GCS of 13 classified as mild TBI (Modified Model). Methods We selected adult TBI patients from the Pennsylvania Trauma Outcome Study database (PTOS). Logistic regressions adjusting for age, sex, cause, severity, trauma center level, comorbidities, and isolated TBI were performed. A second evaluation included the time trend of mortality. A third evaluation also included hypothermia, hypotension, mechanical ventilation, screening for drugs, and severity of TBI. Discrimination of the models was evaluated using the area under the receiver operating characteristic curve (AUC). Calibration was evaluated using the Hosmer-Lemeshow goodness-of-fit (GOF) test. Results In the first evaluation, the AUCs were 0.922 (95% CI, 0.917–0.926) and 0.908 (95% CI, 0.903–0.912) for the classic and modified models, respectively. Both models showed poor calibration (p<0.001). In the third evaluation, the AUCs were 0.946 (95% CI, 0.943–0.949) and 0.938 (95% CI, 0.934–0.940) for the classic and modified models, respectively, with improvements in calibration (p=0.30 and p=0.02 for the classic and modified models, respectively). Conclusion The lack of overlap between the ROC curves of the two models reveals a statistically significant difference in their ability to predict mortality. The classic model demonstrated better GOF than the modified model. A GCS of 13 classified as moderate TBI in a multivariate logistic regression model performed better than a GCS of 13 classified as mild. PMID:22071923
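The discrimination comparison above rests on the AUC, which equals the probability that a randomly chosen death receives a higher predicted risk than a randomly chosen survivor (the Mann-Whitney U equivalence). A minimal sketch of that rank-based computation; the scores and outcomes below are invented toy values, not PTOS data:

```python
# AUC from predicted risk scores and binary outcomes (1 = died, 0 = survived),
# counting how often a positive case outranks a negative one (ties count 0.5).
def auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# toy example: a model that ranks every death above every survivor
print(auc([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0]))  # → 1.0
```

A model with AUC 0.922 versus 0.908, as reported above, ranks cases correctly slightly more often; whether that gap matters is what the confidence intervals address.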

  6. Hidden Statistics Approach to Quantum Simulations

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2010-01-01

    Recent advances in quantum information theory have inspired an explosion of interest in new quantum algorithms for solving hard computational (quantum and non-quantum) problems. The basic principle of quantum computation is that quantum properties can be used to represent structured data, and that quantum mechanisms can be devised and built to perform operations on this data. Three basic non-classical properties of quantum mechanics (superposition, entanglement, and direct-product decomposability) were the main reasons for optimism about the capabilities of quantum computers, which promised simultaneous processing of large arrays of highly correlated data. Unfortunately, these advantages of quantum mechanics came with a high price. One major problem is keeping the components of the computer in a coherent state, as the slightest interaction with the external world would cause the system to decohere. That is why the hardware implementation of a quantum computer is still unsolved. The basic idea of this work is to create a new kind of dynamical system that would preserve the three main properties of quantum physics (superposition, entanglement, and direct-product decomposability) while allowing one to measure its state variables using classical methods. In other words, such a system would reinforce the advantages and minimize the limitations of both quantum and classical aspects. Based upon a concept of hidden statistics, a new kind of dynamical system for simulation of the Schroedinger equation is proposed. The system represents a modified Madelung version of the Schroedinger equation. It preserves superposition, entanglement, and direct-product decomposability while allowing one to measure its state variables using classical methods. Such an optimal combination of characteristics is a perfect match for simulating quantum systems. The model includes a transitional component of the quantum potential (which has been overlooked in previous treatments of the Madelung equation). The role of the transitional potential is to provide a jump from a deterministic state to a random state with prescribed probability density. This jump is triggered by a blowup instability due to violation of the Lipschitz condition generated by the quantum potential. As a result, the dynamics attains quantum properties on a classical scale. The model can be implemented physically as an analog VLSI-based (very-large-scale-integration-based) computer, or numerically on a digital computer. This work opens a way to develop fundamentally new algorithms for quantum simulations of exponentially complex problems, expanding NASA's capabilities in conducting space activities. It has been illustrated that the complexity of simulating particle interactions can be reduced from exponential to polynomial.

  7. SU-D-BRB-05: Quantum Learning for Knowledge-Based Response-Adaptive Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    El Naqa, I; Ten, R

    Purpose: There is tremendous excitement in radiotherapy about applying data-driven methods to develop personalized clinical decisions for real-time response-based adaptation. However, classical statistical learning methods lack efficiency and the ability to predict outcomes under conditions of uncertainty and incomplete information. Therefore, we are investigating physics-inspired machine learning approaches that utilize quantum principles to develop a robust framework for dynamically adapting treatments to individual patients' characteristics and optimizing outcomes. Methods: We studied 88 liver SBRT patients, 35 on non-adaptive and 53 on adaptive protocols. Adaptation was based on liver function using a split course of 3+2 fractions with a month break. The radiotherapy environment was modeled as a Markov decision process (MDP) of baseline and one-month-into-treatment states. The patient environment was modeled by a 5-variable state representing the patient's clinical and dosimetric covariates. For comparison of classical and quantum learning methods, the decision to adapt at one month was considered. The MDP objective was defined by the complication-free tumor control, P⁺ = TCP × (1 − NTCP). A simple regression model represented the state-action mapping. A single bit in the classical MDP and a qubit of two superimposed states in the quantum MDP represented the decision actions. Classical decision selection used reinforcement Q-learning, and quantum searching was performed using Grover's algorithm, which applies a uniform superposition over possible states and yields a quadratic speed-up. Results: Classical/quantum MDPs suggested adaptation (probability amplitude ≥0.5) 79% of the time for split courses and 100% for continuous courses. However, the classical MDP had an average adaptation probability of 0.5±0.22, while the quantum algorithm reached 0.76±0.28.
    In cases where adaptation failed, the classical MDP yielded an average amplitude of 0.31±0.26, while the quantum approach averaged a more optimistic 0.57±0.4, but with high phase fluctuations. Conclusion: Our results demonstrate that quantum machine learning approaches provide a feasible and promising framework for real-time, sequential clinical decision-making in adaptive radiotherapy.
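Grover's algorithm, invoked above for the quantum decision search, can be simulated classically on a statevector. A minimal sketch in pure Python (N = 64 basis states and the target index are hypothetical, not taken from the study):

```python
import math

# Statevector simulation of Grover's search: an oracle flips the sign of the
# target amplitude, then the diffusion step reflects all amplitudes about
# their mean. After ~ (pi/4) * sqrt(N) iterations the target dominates.
def grover(n_states, target, iterations):
    amp = [1 / math.sqrt(n_states)] * n_states   # uniform superposition
    for _ in range(iterations):
        amp[target] = -amp[target]               # oracle: flip the target's sign
        mean = sum(amp) / n_states
        amp = [2 * mean - a for a in amp]        # diffusion: inversion about the mean
    return [a * a for a in amp]                  # measurement probabilities

probs = grover(64, target=5, iterations=6)       # 6 ≈ (pi/4) * sqrt(64)
print(probs[5])                                  # close to 1; other entries near zero
```

The quadratic speed-up mentioned in the abstract is exactly this: roughly sqrt(N) oracle calls instead of the ~N/2 a classical scan would need.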

  8. Experimental investigations into visual and electronic tooth color measurement.

    PubMed

    Ratzmann, Anja; Treichel, Anja; Langforth, Gabriele; Gedrange, Tomasz; Welk, Alexander

    2011-04-01

    The present study aimed to examine the validity of visual color assessment and of an electronic tooth color measurement system (Shade Inspector™) in comparison with a gold standard. Additionally, the reproducibility of the electronic measurements was demonstrated by means of two reference systems. Ceramic specimens of two thicknesses (h=1.6 mm, h=2.6 mm) were used. Three experienced dental technicians using the VITAPAN Classical® color scale carried out all visual tests. Validity of the visual assessment and the electronic measurements was confirmed separately for both thicknesses by means of lightness and hue of the VITAPAN Classical® color scale. Reproducibility of the electronic measurements was confirmed by means of the VITAPAN Classical® and 3D-Master® systems. The 3D-Master® data were calculated according to lightness, hue, and chroma. The intraclass correlation coefficient (ICC) was used to assess validity/reproducibility for lightness and chroma; Kappa statistics were used for hue. A level ≥0.75 was pre-established for the ICC and ≥0.60 for the Kappa index. RESULTS OF VISUAL COLOR ASSESSMENT: Validity for lightness was good for both thicknesses; agreement rates for hue were inconsistent. ELECTRONIC MEASUREMENT: Validity for lightness was fair to good; hue values were below 0.60. Reproducibility of lightness was good to very good for both reference systems. Hue values (VITAPAN Classical®) were above 0.60 for the 1.6 mm test specimens and below 0.60 for the 2.6 mm specimens; Kappa values for 3D-Master® were ≥0.60 for all measurements, and reproducibility of chroma was very good. Validity was better for visual than for electronic color assessment. Reproducibility of the electronic device (Shade Inspector™) was given for the VITAPAN Classical® and 3D-Master® systems.

  9. Statistical inference and Aristotle's Rhetoric.

    PubMed

    Macdonald, Ranald R

    2004-11-01

    Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.
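The Aristotelian argument paraphrased above, that too many outcomes went in one direction to be plausibly attributed to chance, is essentially a sign test. A minimal sketch with invented counts:

```python
from math import comb

# One-sided binomial tail: probability of seeing k or more of n independent
# cases go in the same direction under the 50/50 "chance hypothesis".
def binomial_tail(n, k):
    return sum(comb(n, j) for j in range(k, n + 1)) / 2 ** n

# toy numbers: 14 of 16 outcomes in one direction is hard to credit to chance
p = binomial_tail(16, 14)
print(p)          # ≈ 0.0021
assert p < 0.01   # the "sign" points away from the chance hypothesis
```

In Fisher's terms, the small tail probability is the sign that the chance hypothesis is unlikely, which is the enthymematic structure the paper attributes to statistical tests.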

  10. Prediction of the Electromagnetic Field Distribution in a Typical Aircraft Using the Statistical Energy Analysis

    NASA Astrophysics Data System (ADS)

    Kovalevsky, Louis; Langley, Robin S.; Caro, Stephane

    2016-05-01

    Due to the high cost of experimental EMI measurements, significant attention has been focused on numerical simulation. Classical methods such as the Method of Moments or Finite-Difference Time-Domain are not well suited for this type of problem, as they require a fine discretisation of space and fail to take uncertainties into account. In this paper, the authors show that Statistical Energy Analysis (SEA) is well suited for this type of application. SEA is a statistical approach employed to solve high-frequency problems of electromagnetically reverberant cavities at a reduced computational cost. The key aspects of this approach are (i) to consider an ensemble of systems that share the same gross parameters, and (ii) to avoid solving Maxwell's equations inside the cavity by using the power balance principle. The output is an estimate of the field-magnitude distribution in each cavity. The method is applied to a typical aircraft structure.

  11. Noninformative prior in the quantum statistical model of pure states

    NASA Astrophysics Data System (ADS)

    Tanaka, Fuyuhiko

    2012-06-01

    In the present paper, we consider a suitable definition of a noninformative prior on the quantum statistical model of pure states. While the full pure-states model is invariant under unitary rotation and admits the Haar measure, restricted models, which we often see in quantum channel estimation and quantum process tomography, have less symmetry and no compelling rationale for any particular choice. We adopt a game-theoretic approach that is applicable in classical Bayesian statistics and yields a noninformative prior for a general class of probability distributions. We define the quantum detection game and show that noninformative priors exist for a general class of pure-states models. Theoretically, this gives one way to represent ignorance about a given quantum system with partial information. Practically, our method proposes a default distribution on the model for using the Bayesian technique in quantum-state tomography with a small sample.

  12. Observation of non-classical correlations in sequential measurements of photon polarization

    NASA Astrophysics Data System (ADS)

    Suzuki, Yutaro; Iinuma, Masataka; Hofmann, Holger F.

    2016-10-01

    A sequential measurement of two non-commuting quantum observables results in a joint probability distribution for all output combinations that can be explained in terms of an initial joint quasi-probability of the non-commuting observables, modified by the resolution errors and back-action of the initial measurement. Here, we show that the error statistics of a sequential measurement of photon polarization performed at different measurement strengths can be described consistently by an imaginary correlation between the statistics of resolution and back-action. The experimental setup was designed to realize variable strength measurements with well-controlled imaginary correlation between the statistical errors caused by the initial measurement of diagonal polarizations, followed by a precise measurement of the horizontal/vertical polarization. We perform the experimental characterization of an elliptically polarized input state and show that the same complex joint probability distribution is obtained at any measurement strength.

  13. Statistical methods for thermonuclear reaction rates and nucleosynthesis simulations

    NASA Astrophysics Data System (ADS)

    Iliadis, Christian; Longland, Richard; Coc, Alain; Timmes, F. X.; Champagne, Art E.

    2015-03-01

    Rigorous statistical methods for estimating thermonuclear reaction rates and nucleosynthesis are becoming increasingly established in nuclear astrophysics. The main challenge being faced is that experimental reaction rates are highly complex quantities derived from a multitude of different measured nuclear parameters (e.g., astrophysical S-factors, resonance energies and strengths, particle and γ-ray partial widths). We discuss the application of the Monte Carlo method to two distinct, but related, questions. First, given a set of measured nuclear parameters, how can one best estimate the resulting thermonuclear reaction rates and associated uncertainties? Second, given a set of appropriate reaction rates, how can one best estimate the abundances from nucleosynthesis (i.e., reaction network) calculations? The techniques described here provide probability density functions that can be used to derive statistically meaningful reaction rates and final abundances for any desired coverage probability. Examples are given for applications to s-process neutron sources, core-collapse supernovae, classical novae, and Big Bang nucleosynthesis.
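The first question, propagating measured-parameter uncertainties into a rate, can be sketched with a toy Monte Carlo. The lognormal inputs below are hypothetical stand-ins for real resonance data, not values from the paper:

```python
import random, math

random.seed(0)  # reproducible toy run

# Sample a total rate as the sum of two hypothetical resonance contributions,
# each with a lognormal uncertainty (median, log-standard-deviation).
def sample_rate():
    r1 = math.exp(random.gauss(math.log(2.0e-3), 0.2))
    r2 = math.exp(random.gauss(math.log(5.0e-4), 0.4))
    return r1 + r2

# Build the probability density of the derived rate and read off the usual
# "low/median/high" tabulation as the 16th/50th/84th percentiles.
samples = sorted(sample_rate() for _ in range(20000))
low, median, high = (samples[int(f * len(samples))] for f in (0.16, 0.50, 0.84))
print(low, median, high)  # a 68% coverage interval around the recommended rate
```

The same sampling idea extends to the second question: feed each sampled rate set through a reaction network and collect the resulting abundance distributions.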

  14. Transition from Poissonian to Gaussian-orthogonal-ensemble level statistics in a modified Artin's billiard

    NASA Astrophysics Data System (ADS)

    Csordás, A.; Graham, R.; Szépfalusy, P.; Vattay, G.

    1994-01-01

    One wall of an Artin's billiard on the Poincaré half-plane is replaced by a one-parameter (cp) family of nongeodesic walls. A brief description of the classical phase space of this system is given. In the quantum domain, the continuous and gradual transition from Poisson-like to Gaussian-orthogonal-ensemble (GOE) level statistics due to small perturbations breaking the symmetry responsible for the "arithmetic chaos" at cp=1 is studied. Another GOE-to-Poisson transition, due to the mixed phase space at large perturbations, is also investigated. A satisfactory description of the intermediate level statistics by the Brody distribution was found in both cases. The study supports the existence of a scaling region around cp=1. A finite-size scaling relation for the Brody parameter as a function of 1-cp and the number of levels considered can be established.
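The Brody distribution used above to describe the intermediate statistics interpolates between the Poisson (β = 0) and Wigner/GOE (β = 1) nearest-neighbour spacing laws. A minimal sketch of its density, with the two limits checked numerically:

```python
import math

# Brody nearest-neighbour spacing distribution:
#   P_beta(s) = a (beta+1) s^beta exp(-a s^(beta+1)),
#   a = Gamma((beta+2)/(beta+1))^(beta+1),
# normalized and with unit mean spacing for every beta in [0, 1].
def brody_pdf(s, beta):
    a = math.gamma((beta + 2) / (beta + 1)) ** (beta + 1)
    return a * (beta + 1) * s ** beta * math.exp(-a * s ** (beta + 1))

# beta = 0 recovers Poisson, exp(-s); beta = 1 recovers the GOE Wigner
# surmise, (pi/2) s exp(-pi s^2 / 4)
assert abs(brody_pdf(1.0, 0.0) - math.exp(-1.0)) < 1e-12
assert abs(brody_pdf(1.0, 1.0) - (math.pi / 2) * math.exp(-math.pi / 4)) < 1e-12
```

Fitting β to an observed spacing histogram is what produces the Brody parameter whose finite-size scaling the abstract discusses.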

  15. On the Chronological Structure of the Solutrean in Southern Iberia

    PubMed Central

    Cascalheira, João; Bicho, Nuno

    2015-01-01

    The Solutrean techno-complex has gained particular significance over time for representing a clear demographic and techno-typological deviation from the developments that occurred during the course of the Upper Paleolithic in Western Europe. Some of the Solutrean’s most relevant features are the diversity and techno-typological characteristics of the lithic armatures. These have been recurrently used as pivotal elements in numerous Solutrean-related debates, including the chronological organization of the techno-complex across Iberia and Southwestern France. In Southern Iberia, patterns of presence and/or absence of specific point types in stratified sequences tend to validate the classical ordering of the techno-complex into Lower, Middle and Upper phases, although some evidence, namely radiocarbon determinations, has not always been corroborative. Here we present the first comprehensive analysis of the currently available radiocarbon data for the Solutrean in Southern Iberia. We use a Bayesian statistical approach on data from 13 stratified sequences to compare the duration, start, and end of each classic Solutrean phase across sites. We conclude that, based on the current data, the traditional organization of the Solutrean cannot be unquestionably confirmed for Southern Iberia, calling into doubt the status of the classically defined type-fossils as precise temporal markers. PMID:26355459

  16. Quantum Foundations of Quantum Information

    NASA Astrophysics Data System (ADS)

    Griffiths, Robert

    2009-03-01

    The main foundational issue for quantum information is: What is quantum information about? What does it refer to? Classical information typically refers to physical properties, and since classical is a subset of quantum information (assuming the world is quantum mechanical), quantum information should--and, it will be argued, does--refer to quantum physical properties represented by projectors on appropriate subspaces of a quantum Hilbert space. All sorts of microscopic and macroscopic properties, not just measurement outcomes, can be represented in this way, and are thus a proper subject of quantum information. The Stern-Gerlach experiment illustrates this. When properties are compatible, which is to say their projectors commute, Shannon's classical information theory based on statistical correlations extends without difficulty or change to the quantum case. When projectors do not commute, giving rise to characteristic quantum effects, a foundation for the subject can still be constructed by replacing the ``measurement and wave-function collapse'' found in textbooks--an efficient calculational tool, but one giving rise to numerous conceptual difficulties--with a fully consistent and paradox free stochastic formulation of standard quantum mechanics. This formulation is particularly helpful in that it contains no nonlocal superluminal influences; the reason the latter carry no information is that they do not exist.

  17. Periodic orbit spectrum in terms of Ruelle-Pollicott resonances

    NASA Astrophysics Data System (ADS)

    Leboeuf, P.

    2004-02-01

    Fully chaotic Hamiltonian systems possess an infinite number of classical solutions which are periodic, e.g., a trajectory “p” returns to its initial conditions after some fixed time τp. Our aim is to investigate the spectrum {τ1,τ2,…} of periods of the periodic orbits. An explicit formula for the density ρ(τ)=∑pδ(τ-τp) is derived in terms of the eigenvalues of the classical evolution operator. The density is naturally decomposed into a smooth part plus an interferent sum over oscillatory terms. The frequencies of the oscillatory terms are given by the imaginary part of the complex eigenvalues (Ruelle-Pollicott resonances). For large periods, corrections to the well-known exponential growth of the smooth part of the density are obtained. An alternative formula for ρ(τ) in terms of the zeros and poles of the Ruelle ζ function is also discussed. The results are illustrated with the geodesic motion in billiards of constant negative curvature. Connections with the statistical properties of the corresponding quantum eigenvalues, random-matrix theory, and discrete maps are also considered. In particular, a random-matrix conjecture is proposed for the eigenvalues of the classical evolution operator of chaotic billiards.

  18. The locking-decoding frontier for generic dynamics.

    PubMed

    Dupuis, Frédéric; Florjanczyk, Jan; Hayden, Patrick; Leung, Debbie

    2013-11-08

    It is known that the maximum classical mutual information, which can be achieved between measurements on pairs of quantum systems, can drastically underestimate the quantum mutual information between them. In this article, we quantify this distinction between classical and quantum information by demonstrating that after removing a logarithmic-sized quantum system from one half of a pair of perfectly correlated bitstrings, even the most sensitive pair of measurements might yield only outcomes essentially independent of each other. This effect is a form of information locking but the definition we use is strictly stronger than those used previously. Moreover, we find that this property is generic, in the sense that it occurs when removing a random subsystem. As such, the effect might be relevant to statistical mechanics or black hole physics. While previous works had always assumed a uniform message, we assume only a min-entropy bound and also explore the effect of entanglement. We find that classical information is strongly locked almost until it can be completely decoded. Finally, we exhibit a quantum key distribution protocol that is 'secure' in the sense of accessible information but in which leakage of even a logarithmic number of bits compromises the secrecy of all others.
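The classical mutual information the article starts from can be computed directly from a joint outcome distribution. A minimal sketch (the toy distributions below are illustrative, not the article's construction):

```python
import math

# Shannon mutual information I(X;Y) in bits from a joint table joint[x][y] = p(x, y).
def mutual_information(joint):
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(p * math.log2(p / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, p in enumerate(row) if p > 0)

# perfectly correlated bits carry 1 bit of classical mutual information ...
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # → 1.0
# ... while independent outcomes (the "locked" extreme) carry none
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # → 0.0
```

Information locking is the statement that removing a small quantum subsystem can drive the best achievable measurement statistics from the first case nearly to the second.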

  19. Determination of free fatty acids in pharmaceutical lipids by ¹H NMR and comparison with the classical acid value.

    PubMed

    Skiera, Christina; Steliopoulos, Panagiotis; Kuballa, Thomas; Diehl, Bernd; Holzgrabe, Ulrike

    2014-05-01

    Indices like the acid value, peroxide value, and saponification value play an important role in quality control and identification of lipids. Requirements for these parameters are given by the monographs of the European Pharmacopoeia. ¹H NMR spectroscopy provides a fast and simple alternative to these classical approaches. In the present work a new ¹H NMR approach to determine the acid value is described. The method was validated using a statistical approach based on a variance components model. The performance under repeatability and in-house reproducibility conditions was assessed. We applied this ¹H NMR assay to a wide range of different fatty oils. A total of 305 oil and fat samples were examined by both the classical and the NMR method. Except for hard fat, the data obtained by the two methods were in good agreement. The ¹H NMR method was adapted to analyse waxes and oleyl oleate. Furthermore, the effect of solvent and, in the case of castor oil, the effect of the oil matrix on line broadening and chemical shift of the carboxyl group signal are discussed.

  20. Competing quantum effects in the free energy profiles and diffusion rates of hydrogen and deuterium molecules through clathrate hydrates.

    PubMed

    Cendagorta, Joseph R; Powers, Anna; Hele, Timothy J H; Marsalek, Ondrej; Bačić, Zlatko; Tuckerman, Mark E

    2016-11-30

    Clathrate hydrates hold considerable promise as safe and economical materials for hydrogen storage. Here we present a quantum mechanical study of H₂ and D₂ diffusion through a hexagonal face shared by two large cages of clathrate hydrates over a wide range of temperatures. Path integral molecular dynamics simulations are used to compute the free-energy profiles for the diffusion of H₂ and D₂ as a function of temperature. Ring polymer molecular dynamics rate theory, incorporating both exact quantum statistics and approximate quantum dynamical effects, is utilized in the calculations of the H₂ and D₂ diffusion rates in a broad temperature interval. We find that the shape of the quantum free-energy profiles and their height relative to the classical free-energy barriers at a given temperature, as well as the rate of diffusion, are strongly affected by competing quantum effects: above 25 K, zero-point energy (ZPE) perpendicular to the reaction path for diffusion between cavities decreases the quantum rate compared to the classical rate, whereas at lower temperatures tunneling outcompetes the ZPE and as a result the quantum rate is greater than the classical rate.
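The competition described above can be caricatured with simple Arrhenius-type rates, k ∝ exp(−ΔF/k_B T): ZPE raises the effective quantum barrier at higher temperatures, while tunneling lowers it at low temperatures. A toy sketch with invented barrier heights (not values from the paper):

```python
import math

KB = 0.0083145  # Boltzmann constant in kJ/(mol K)

def rate_ratio(dF_quantum, dF_classical, T):
    """k_quantum / k_classical from Arrhenius-type barrier heights (kJ/mol)."""
    return math.exp(-(dF_quantum - dF_classical) / (KB * T))

# higher T: ZPE makes the effective quantum barrier higher -> quantum rate slower
assert rate_ratio(5.2, 5.0, T=50.0) < 1.0
# low T: tunneling makes the effective quantum barrier lower -> quantum rate faster
assert rate_ratio(4.8, 5.0, T=15.0) > 1.0
```

The actual study computes these free-energy profiles from path integral simulations rather than assuming barrier heights, but the exponential sensitivity to ΔF is the same.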

  1. Elementary test for nonclassicality based on measurements of position and momentum

    NASA Astrophysics Data System (ADS)

    Fresta, Luca; Borregaard, Johannes; Sørensen, Anders S.

    2015-12-01

    We generalize a nonclassicality test described by Kot et al. [Phys. Rev. Lett. 108, 233601 (2012), 10.1103/PhysRevLett.108.233601], which can be used to rule out any classical description of a physical system. The test is based on measurements of quadrature operators and works by proving a contradiction with the classical description in terms of a probability distribution in phase space. As opposed to the previous work, we generalize the test to include states without rotational symmetry in phase space. Furthermore, we compare the performance of the nonclassicality test with classical tomography methods based on the inverse Radon transform, which can also be used to establish the quantum nature of a physical system. In particular, we consider a nonclassicality test based on the so-called filtered back-projection formula. We show that the general nonclassicality test is conceptually simpler, requires fewer assumptions about the system, and is statistically more reliable than the tests based on the filtered back-projection formula. As a specific example, we derive the optimal test for quadrature squeezed single-photon states and show that the efficiency of the test does not change with the degree of squeezing.

  2. The locking-decoding frontier for generic dynamics

    PubMed Central

    Dupuis, Frédéric; Florjanczyk, Jan; Hayden, Patrick; Leung, Debbie

    2013-01-01

    It is known that the maximum classical mutual information, which can be achieved between measurements on pairs of quantum systems, can drastically underestimate the quantum mutual information between them. In this article, we quantify this distinction between classical and quantum information by demonstrating that after removing a logarithmic-sized quantum system from one half of a pair of perfectly correlated bitstrings, even the most sensitive pair of measurements might yield only outcomes essentially independent of each other. This effect is a form of information locking but the definition we use is strictly stronger than those used previously. Moreover, we find that this property is generic, in the sense that it occurs when removing a random subsystem. As such, the effect might be relevant to statistical mechanics or black hole physics. While previous works had always assumed a uniform message, we assume only a min-entropy bound and also explore the effect of entanglement. We find that classical information is strongly locked almost until it can be completely decoded. Finally, we exhibit a quantum key distribution protocol that is ‘secure’ in the sense of accessible information but in which leakage of even a logarithmic number of bits compromises the secrecy of all others. PMID:24204183

  3. Wetting of heterogeneous substrates. A classical density-functional-theory approach

    NASA Astrophysics Data System (ADS)

    Yatsyshin, Peter; Parry, Andrew O.; Rascón, Carlos; Duran-Olivencia, Miguel A.; Kalliadasis, Serafim

    2017-11-01

    Wetting is the nucleation of a third phase (liquid) at the interface between two other phases (solid and gas). In many experimentally accessible cases of wetting, the interplay between the substrate structure and the fluid-fluid and fluid-substrate intermolecular interactions leads to the appearance of a whole ``zoo'' of exciting interface phase transitions, associated with the formation of nano-droplets/bubbles and thin films. Practical applications of wetting at small scales are numerous and include the design of lab-on-a-chip devices and superhydrophobic surfaces. In this talk, we will use a fully microscopic approach to explore the phase space of a planar wall, decorated with patches of different hydrophobicity, and demonstrate the highly non-trivial behaviour of the liquid-gas interface near the substrate. We will present fluid density profiles, adsorption isotherms and wetting phase diagrams. Our analysis is based on a formulation of statistical mechanics, commonly known as classical density-functional theory. It provides a computationally-friendly and rigorous framework, suitable for probing small-scale physics of classical fluids and other soft-matter systems. EPSRC Grants No. EP/L027186, EP/K503733; ERC Advanced Grant No. 247031.

  4. The Global Error Assessment (GEA) model for the selection of differentially expressed genes in microarray data.

    PubMed

    Mansourian, Robert; Mutch, David M; Antille, Nicolas; Aubert, Jerome; Fogel, Paul; Le Goff, Jean-Marc; Moulin, Julie; Petrov, Anton; Rytz, Andreas; Voegel, Johannes J; Roberts, Matthew-Alan

    2004-11-01

    Microarray technology has become a powerful research tool in many fields of study; however, the cost of microarrays often results in the use of a low number of replicates (k). Under circumstances where k is low, it becomes difficult to perform standard statistical tests to extract the most biologically significant experimental results. Other more advanced statistical tests have been developed; however, their use and interpretation often remain difficult to implement in routine biological research. The present work outlines a method that achieves sufficient statistical power for selecting differentially expressed genes under conditions of low k, while remaining an intuitive and computationally efficient procedure. The present study describes a Global Error Assessment (GEA) methodology to select differentially expressed genes in microarray datasets, developed using an in vitro experiment that compared control and interferon-gamma-treated skin cells. In this experiment, up to nine replicates were used to confidently estimate error, thereby enabling methods of different statistical power to be compared. Genes of similar absolute expression are binned, so as to enable a highly accurate local estimate of the mean squared error within conditions. The model then relates the variability of gene expression in each bin to absolute expression levels and uses this in a test derived from the classical ANOVA. The GEA selection method is compared with both the classical and permutational ANOVA tests, and demonstrates increased stability, robustness and confidence in gene selection. A subset of the selected genes was validated by real-time reverse transcription-polymerase chain reaction (RT-PCR). All these results suggest that the GEA methodology is (i) suitable for selection of differentially expressed genes in microarray data, (ii) intuitive and computationally efficient and (iii) especially advantageous under conditions of low k.
The GEA code for R software is freely available upon request to authors.
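
    The binned local-error idea at the heart of GEA can be sketched as follows (an illustrative reimplementation, not the authors' R code): genes are ranked by mean absolute expression, squared within-condition residuals are pooled inside each bin to give a stable local error estimate, and that pooled estimate replaces the per-gene error term in an ANOVA-like F statistic.

```python
import numpy as np

def gea_scores(control, treated, n_bins=5):
    """GEA-style statistic (sketch): rows = genes, columns = replicates."""
    control, treated = np.asarray(control), np.asarray(treated)
    n = control.shape[1]
    mean_expr = np.concatenate([control, treated], axis=1).mean(axis=1)
    order = np.argsort(mean_expr)
    # Within-condition residuals pooled across both conditions
    resid = np.concatenate([control - control.mean(axis=1, keepdims=True),
                            treated - treated.mean(axis=1, keepdims=True)], axis=1)
    per_gene_mse = (resid ** 2).mean(axis=1)
    local_mse = np.empty_like(per_gene_mse)
    for idx in np.array_split(order, n_bins):   # pooled local error per bin
        local_mse[idx] = per_gene_mse[idx].mean()
    diff = treated.mean(axis=1) - control.mean(axis=1)
    return (diff ** 2) * (n / 2) / local_mse    # ANOVA-like F statistic

rng = np.random.default_rng(0)
control = rng.normal(10.0, 0.1, size=(100, 3))
treated = rng.normal(10.0, 0.1, size=(100, 3))
treated[0] += 5.0                               # one strongly regulated gene
f = gea_scores(control, treated)
print("top-ranked gene:", int(np.argmax(f)))
```

    Pooling the error within bins is what buys statistical power at low replicate numbers: the error estimate for each gene borrows degrees of freedom from its expression-level neighbours.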

  5. Systematic review of statistical approaches to quantify, or correct for, measurement error in a continuous exposure in nutritional epidemiology.

    PubMed

    Bennett, Derrick A; Landry, Denise; Little, Julian; Minelli, Cosetta

    2017-09-19

    Several statistical approaches have been proposed to assess and correct for exposure measurement error. We aimed to provide a critical overview of the most common approaches used in nutritional epidemiology. MEDLINE, EMBASE, BIOSIS and CINAHL were searched for reports published in English up to May 2016 in order to ascertain studies that described methods aimed to quantify and/or correct for measurement error for a continuous exposure in nutritional epidemiology using a calibration study. We identified 126 studies, 43 of which described statistical methods and 83 that applied any of these methods to a real dataset. The statistical approaches in the eligible studies were grouped into: a) approaches to quantify the relationship between different dietary assessment instruments and "true intake", which were mostly based on correlation analysis and the method of triads; b) approaches to adjust point and interval estimates of diet-disease associations for measurement error, mostly based on regression calibration analysis and its extensions. Two approaches (multiple imputation and moment reconstruction) were identified that can deal with differential measurement error. For regression calibration, the most common approach to correct for measurement error used in nutritional epidemiology, it is crucial to ensure that its assumptions and requirements are fully met. Analyses that investigate the impact of departures from the classical measurement error model on regression calibration estimates can be helpful to researchers in interpreting their findings. With regard to the possible use of alternative methods when regression calibration is not appropriate, the choice of method should depend on the measurement error model assumed, the availability of suitable calibration study data and the potential for bias due to violation of the classical measurement error model assumptions. 
On the basis of this review, we provide some practical advice for the use of methods to assess and adjust for measurement error in nutritional epidemiology.
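
    The core of regression calibration under the classical measurement error model can be sketched in a few lines. Assuming two replicate measurements of the exposure (w1, w2 = X + independent error U) and a continuous outcome y (all data below are simulated and illustrative), the attenuation coefficient is estimated from the replicates and used to de-attenuate the naive slope:

```python
import numpy as np

def regression_calibration(w1, w2, y):
    """Return the naive and the calibration-corrected regression slope
    under the classical measurement error model W = X + U."""
    w1, w2, y = map(np.asarray, (w1, w2, y))
    w_bar = (w1 + w2) / 2
    var_u = np.var(w1 - w2, ddof=1) / 2        # var(w1 - w2) = 2 var(U)
    var_w = np.var(w_bar, ddof=1)
    lam = (var_w - var_u / 2) / var_w          # attenuation for the mean of 2 replicates
    beta_naive = np.cov(w_bar, y, ddof=1)[0, 1] / var_w
    return beta_naive, beta_naive / lam

rng = np.random.default_rng(42)
x = rng.normal(0.0, 1.0, 5000)                 # true exposure
w1 = x + rng.normal(0.0, 1.0, 5000)            # error-prone measurements
w2 = x + rng.normal(0.0, 1.0, 5000)
y = 2.0 * x + rng.normal(0.0, 0.5, 5000)       # true slope = 2
naive, corrected = regression_calibration(w1, w2, y)
print(f"naive slope {naive:.2f}, corrected slope {corrected:.2f}")
```

    Note that this de-attenuates the point estimate only; interval estimates must additionally account for the uncertainty in the estimated attenuation factor, as the review discusses.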

  6. Comparison of Three Different New Bipolar Energy Modalities and Classic Bipolar in Vivo for Tissue Thermal Spread.

    PubMed

    Çaltekin, Melike Demir; Aydoğmuş, Serpil; Yalçin, Serenat Eriş; Demirel, Emine; Unay, Fulya Cakalağaoğlu; Özbay, Pelin Özün; Özdemir, Aslı; Yalçin, Yakup; Kelekçi, Sefa

    2017-01-01

    The aim of this study was to compare three different new bipolar energy modalities and classic bipolar in vivo for tissue thermal spread. This prospective, randomized, single-blind study was conducted between September 2012 and July 2013. Eighteen patients aged 40-65 years undergoing hysterectomy and bilateral salpingectomy for benign etiology were included in the study. Before the hysterectomy began, both fallopian tubes were visualized and marked approximately 3 cm distal from the uterine cornu and approximately 3 cm proximal from the fimbrial end. The surgery was performed using one 5 mm applicator of PlasmaKinetics™, EnSeal®, LigaSure™ or classic bipolar energy. The application time of each device was standardized as the minimum time to the device's audible tissue-impedance warning, and as the time to tissue vaporization for classic bipolar. Tissues were stained with both H&E and Masson's trichrome in the pathology laboratory, and thermal spread was compared. Evaluation of the damage to the uterine tubes by each device revealed that LigaSure™ was associated with increased thermal injury compared with PlasmaKinetics™ (p=0.007). Apart from PlasmaKinetics™ (p=0.022), there was no statistically significant difference between the three devices in terms of thermal damage spread in the distal and proximal fallopian tubes. To reduce lateral thermal damage, PlasmaKinetics™ may be preferable to LigaSure™ among the three different new bipolar energy modalities.

  7. Liver Transplantation for Classical Maple Syrup Urine Disease: Long-Term Follow-Up in 37 Patients and Comparative United Network for Organ Sharing Experience

    PubMed Central

    Mazariegos, George V.; Morton, D. Holmes; Sindhi, Rakesh; Soltys, Kyle; Nayyar, Navdeep; Bond, Geoffrey; Shellmer, Diana; Shneider, Benjamin; Vockley, Jerry; Strauss, Kevin A.

    2012-01-01

    Objective To assess clinical and neurocognitive function in children who have undergone liver transplantation for classical maple syrup urine disease (MSUD). Study design A total of 35 patients with classical MSUD (age 9.9 ± 7.9 years) underwent liver transplantation between 2004 and 2009. Six patients donated their liver to recipients without MSUD (“domino” transplant). We analyzed clinical outcomes for our cohort and 17 additional cases from the national United Network for Organ Sharing registry; 33 patients completed IQ and adaptive testing before transplantation, and 14 completed testing 1 year later. Results Patient and graft survival were 100% at 4.5 ± 2.2 years of follow-up. Liver function was normal in all patients. Branched-chain amino acid levels were corrected within hours after surgery and remained stable, with leucine tolerance increasing more than 10-fold. All domino transplant recipients were alive and well with normal branched-chain amino acid homeostasis at the time of this report. Patient and graft survival for all 54 patients with MSUD undergoing liver transplantation in the United States during this period were 98% and 96%, respectively. One-third of our patients were mentally impaired (IQ ≤ 70) before transplantation, with no statistically significant change 1 year later. Conclusion Liver transplantation is an effective long-term treatment for classical MSUD and may arrest brain damage, but will not reverse it. PMID:21839471

  8. Comparison of massage based on the tensegrity principle and classic massage in treating chronic shoulder pain.

    PubMed

    Kassolik, Krzysztof; Andrzejewski, Waldemar; Brzozowski, Marcin; Wilk, Iwona; Górecka-Midura, Lucyna; Ostrowska, Bożena; Krzyżanowski, Dominik; Kurpas, Donata

    2013-09-01

    The purpose of this study was to compare the clinical outcomes of classic massage to massage based on the tensegrity principle for patients with chronic idiopathic shoulder pain. Thirty subjects with chronic shoulder pain symptoms were divided into 2 groups: 15 subjects received classic (Swedish) massage to tissues surrounding the glenohumeral joint and 15 subjects received massage using techniques based on the tensegrity principle. The tensegrity principle is based on directing treatment to the painful area and the tissues (muscles, fascia, and ligaments) that structurally support the painful area, thus treating tissues that have direct and indirect influence on the motion segment. Both treatment groups received 10 sessions over 2 weeks; each session lasted 20 minutes. The McGill Pain Questionnaire and glenohumeral ranges of motion were measured immediately before the first massage session, on the day the therapy ended 2 weeks after therapy started, and 1 month after the last massage. Subjects receiving massage based on the tensegrity principle demonstrated statistically significant improvement in the passive and active ranges of flexion and abduction of the glenohumeral joint. Pain decreased in both massage groups. This study showed increases in passive and active ranges of motion for flexion and abduction in patients who had massage based on the tensegrity principle. For pain outcomes, both the classic and tensegrity massage groups demonstrated improvement. Copyright © 2013 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shore, B.W.; Knight, P.L.

    The Jaynes-Cummings Model (JCM), a soluble fully quantum mechanical model of an atom in a field, was first used (in 1963) to examine the classical aspects of spontaneous emission and to reveal the existence of Rabi oscillations in atomic excitation probability for fields with sharply defined energy (or photon number). For fields having a statistical distribution of photon numbers the oscillations collapse to an expected steady value. In 1980 it was discovered that with appropriate initial conditions (e.g. a near-classical field), the Rabi oscillations would eventually revive -- only to collapse and revive repeatedly in a complicated pattern. The existence of these revivals, present in the analytic solutions of the JCM, provided direct evidence for discreteness of field excitation (photons) and hence for the truly quantum nature of radiation. Subsequent study revealed further nonclassical properties of the JCM field, such as a tendency of the photons to antibunch. Within the last two years it has been found that during the quiescent intervals of collapsed Rabi oscillations the atom and field exist in a macroscopic superposition state (a Schroedinger cat). This discovery offers the opportunity to use the JCM to elucidate the basic properties of quantum correlation (entanglement) and to explore still further the relationship between classical and quantum physics. In tribute to E. T. Jaynes, who first recognized the importance of the JCM for clarifying the differences and similarities between quantum and classical physics, we here present an overview of the theory of the JCM and some of the many remarkable discoveries about it.

  10. Legitimate Techniques for Improving the R-Square and Related Statistics of a Multiple Regression Model

    DTIC Science & Technology

    1981-01-01

    explanatory variable has been omitted. Ramsey (1974) has developed a rather interesting test for detecting specification errors using estimates of the...Peter. (1979) A Guide to Econometrics, Cambridge, MA: The MIT Press. Ramsey, J.B. (1974), "Classical Model Selection Through Specification Error Tests," in P. Zarembka, Ed., Frontiers in Econometrics, New York: Academic Press. Theil, Henri. (1971), Principles of Econometrics, New York: John Wiley

  11. Characterization of the Pathological and Biochemical Markers that Correlate to the Clinical Features of Autism

    DTIC Science & Technology

    2012-08-01

    system in relation to its evolution, structure and function. New York: CRC Press 1997 American Psychiatric Association. Diagnostic and statistical...Hesslow G, Yeo CH. The functional anatomy of skeletal conditioning. In: Moore JW, Editor. A neuroscientist’s guide to classical conditioning. New...1995; Selkoe, 2001). Aβ is generated and detected in the endoplasmic reticulum/Golgi apparatus and endosomal-lysosomal pathway (Cook D.G. et al., 1997

  12. Matrix Concentration Inequalities via the Method of Exchangeable Pairs

    DTIC Science & Technology

    2012-01-27

    viewed as an exchangeable pairs version of the Burkholder–Davis–Gundy (BDG) inequality from classical martingale theory [Bur73]. Matrix extensions of...non-commutative probability. Math. Ann., 319:1–16, 2001. [Bur73] D. L. Burkholder. Distribution function inequalities for martingales. Ann. Probab., 1...Statist. Assoc., 58(301):13–30, 1963. [JX03] M. Junge and Q. Xu. Noncommutative Burkholder/Rosenthal inequalities. Ann. Probab., 31(2):948–995, 2003

  13. Φ⁴ kinks: Statistical mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habib, S.

    1995-12-31

    Some recent investigations of the thermal equilibrium properties of kinks in a 1+1-dimensional, classical φ⁴ field theory are reviewed. The distribution function, kink density, correlation function, and certain thermodynamic quantities were studied both theoretically and via large scale simulations. A simple double Gaussian variational approach within the transfer operator formalism was shown to give good results in the intermediate temperature range where the dilute gas theory is known to fail.

  14. Shapes of rotating superfluid helium nanodroplets

    DOE PAGES

    Bernando, Charles; Tanyag, Rico Mayro P.; Jones, Curtis; ...

    2017-02-16

    Rotating superfluid He droplets of approximately 1 μm in diameter were obtained in a free nozzle beam expansion of liquid He in vacuum and were studied by single-shot coherent diffractive imaging using an x-ray free electron laser. The formation of strongly deformed droplets is evidenced by large anisotropies and intensity anomalies (streaks) in the obtained diffraction images. The analysis of the images shows that in addition to previously described axially symmetric oblate shapes, some droplets exhibit prolate shapes. Forward modeling of the diffraction images indicates that the shapes of rotating superfluid droplets are very similar to their classical counterparts, giving direct access to the droplet angular momenta and angular velocities. Here, the analyses of the radial intensity distribution and appearance statistics of the anisotropic images confirm the existence of oblate metastable superfluid droplets with large angular momenta beyond the classical bifurcation threshold.

  15. Belief propagation decoding of quantum channels by passing quantum messages

    NASA Astrophysics Data System (ADS)

    Renes, Joseph M.

    2017-07-01

    The belief propagation (BP) algorithm is a powerful tool in a wide range of disciplines from statistical physics to machine learning to computational biology, and is ubiquitous in decoding classical error-correcting codes. The algorithm works by passing messages between nodes of the factor graph associated with the code and enables efficient decoding of the channel, in some cases even up to the Shannon capacity. Here we construct the first BP algorithm which passes quantum messages on the factor graph and is capable of decoding the classical-quantum channel with pure state outputs. This gives explicit decoding circuits whose number of gates is quadratic in the code length. We also show that this decoder can be modified to work with polar codes for the pure state channel and as part of a decoder for transmitting quantum information over the amplitude damping channel. These represent the first explicit capacity-achieving decoders for non-Pauli channels.
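
    For readers unfamiliar with classical BP, its sum-product update can be illustrated on the simplest possible factor graph: an n-bit repetition code over a binary symmetric channel, where the graph is a tree and BP is exact (a hypothetical toy example, not the paper's quantum construction).

```python
import math

def bsc_llr(bit, p):
    """Channel log-likelihood ratio log P(y|x=0) / P(y|x=1) for a
    binary symmetric channel with flip probability p."""
    llr = math.log((1 - p) / p)
    return llr if bit == 0 else -llr

def decode_repetition(received, p):
    """Sum-product decoding of a repetition code: the equality-constraint
    factor node simply sums the incoming channel messages (LLRs)."""
    total_llr = sum(bsc_llr(y, p) for y in received)
    return 0 if total_llr > 0 else 1

print(decode_repetition([0, 1, 0], 0.1))  # majority of the received bits wins
```

    On graphs with cycles the same message-passing rule is applied iteratively and is only approximate, which is the regime in which BP decoding of practical LDPC-style codes operates.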

  16. Shapes of rotating superfluid helium nanodroplets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernando, Charles; Tanyag, Rico Mayro P.; Jones, Curtis

    Rotating superfluid He droplets of approximately 1 μm in diameter were obtained in a free nozzle beam expansion of liquid He in vacuum and were studied by single-shot coherent diffractive imaging using an x-ray free electron laser. The formation of strongly deformed droplets is evidenced by large anisotropies and intensity anomalies (streaks) in the obtained diffraction images. The analysis of the images shows that in addition to previously described axially symmetric oblate shapes, some droplets exhibit prolate shapes. Forward modeling of the diffraction images indicates that the shapes of rotating superfluid droplets are very similar to their classical counterparts, giving direct access to the droplet angular momenta and angular velocities. Here, the analyses of the radial intensity distribution and appearance statistics of the anisotropic images confirm the existence of oblate metastable superfluid droplets with large angular momenta beyond the classical bifurcation threshold.

  17. Different amplitude and time distribution of the sound of light and classical music

    NASA Astrophysics Data System (ADS)

    Diodati, P.; Piazza, S.

    2000-08-01

    Several pieces of different musical genres were studied by measuring A(t), the output amplitude of a peak detector driven by the electric signal arriving at the loudspeaker. Having fixed a suitable threshold Ā, we considered N(A), the number of times that A(t) > Ā (each such crossing we named an event), and N(t), the distribution of times t between two consecutive events. Some N(A) and N(t) distributions are displayed in the reported logarithmic plots, showing that jazz, pop, rock and other popular rhythms have noise-like distributions, while classical pieces of music are characterized by more complex statistics. We point out the extraordinary case of the aria "La calunnia è un venticello", where the words describe an avalanche-like or seismic process, calumny, and Rossini's music shows N(A) and N(t) distributions typical of earthquakes.
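
    The event-counting analysis described above can be sketched for a sampled amplitude signal: fix a threshold, detect upward crossings (events), and collect the event amplitudes and the inter-event waiting times (illustrative code; the data and threshold below are hypothetical):

```python
import numpy as np

def event_statistics(signal, threshold):
    """Return the amplitudes at which the signal first exceeds the
    threshold (events) and the waiting times, in samples, between
    consecutive events."""
    signal = np.asarray(signal, dtype=float)
    above = signal > threshold
    prev = np.concatenate(([False], above[:-1]))
    starts = np.flatnonzero(above & ~prev)     # upward threshold crossings
    return signal[starts], np.diff(starts)

amps, waits = event_statistics([0, 2, 0, 0, 3, 3, 0, 1, 5, 0], 1.5)
print("event amplitudes:", amps, "waiting times:", waits)
```

    Histogramming many such amplitudes and waiting times on logarithmic axes yields the N(A) and N(t) distributions discussed in the abstract.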

  18. Experimental scattershot boson sampling

    PubMed Central

    Bentivegna, Marco; Spagnolo, Nicolò; Vitelli, Chiara; Flamini, Fulvio; Viggianiello, Niko; Latmiral, Ludovico; Mataloni, Paolo; Brod, Daniel J.; Galvão, Ernesto F.; Crespi, Andrea; Ramponi, Roberta; Osellame, Roberto; Sciarrino, Fabio

    2015-01-01

    Boson sampling is a computational task strongly believed to be hard for classical computers, but efficiently solvable by orchestrated bosonic interference in a specialized quantum computer. Current experimental schemes, however, are still insufficient for a convincing demonstration of the advantage of quantum over classical computation. A new variation of this task, scattershot boson sampling, leads to an exponential increase in speed of the quantum device, using a larger number of photon sources based on parametric down-conversion. This is achieved by having multiple heralded single photons being sent, shot by shot, into different random input ports of the interferometer. We report the first scattershot boson sampling experiments, where six different photon-pair sources are coupled to integrated photonic circuits. We use recently proposed statistical tools to analyze our experimental data, providing strong evidence that our photonic quantum simulator works as expected. This approach represents an important leap toward a convincing experimental demonstration of quantum computational supremacy. PMID:26601164

  19. Experimental scattershot boson sampling.

    PubMed

    Bentivegna, Marco; Spagnolo, Nicolò; Vitelli, Chiara; Flamini, Fulvio; Viggianiello, Niko; Latmiral, Ludovico; Mataloni, Paolo; Brod, Daniel J; Galvão, Ernesto F; Crespi, Andrea; Ramponi, Roberta; Osellame, Roberto; Sciarrino, Fabio

    2015-04-01

    Boson sampling is a computational task strongly believed to be hard for classical computers, but efficiently solvable by orchestrated bosonic interference in a specialized quantum computer. Current experimental schemes, however, are still insufficient for a convincing demonstration of the advantage of quantum over classical computation. A new variation of this task, scattershot boson sampling, leads to an exponential increase in speed of the quantum device, using a larger number of photon sources based on parametric down-conversion. This is achieved by having multiple heralded single photons being sent, shot by shot, into different random input ports of the interferometer. We report the first scattershot boson sampling experiments, where six different photon-pair sources are coupled to integrated photonic circuits. We use recently proposed statistical tools to analyze our experimental data, providing strong evidence that our photonic quantum simulator works as expected. This approach represents an important leap toward a convincing experimental demonstration of quantum computational supremacy.

  20. Thermal stability of charged rotating quantum black holes

    NASA Astrophysics Data System (ADS)

    Sinha, Aloke Kumar; Majumdar, Parthasarathi

    2017-12-01

    Criteria for thermal stability of charged rotating black holes of any dimension are derived for horizon areas that are large relative to the Planck area (in these dimensions). The derivation is based on generic assumptions of quantum geometry, supported by some results of loop quantum gravity, and equilibrium statistical mechanics of the Grand Canonical ensemble. There is no explicit use of classical spacetime geometry in this analysis. The only assumption is that the mass of the black hole is a function of its horizon area, charge and angular momentum. Our stability criteria are then tested in detail against specific classical black holes in spacetime dimensions 4 and 5, whose metrics provide us with explicit relations for the dependence of the mass on the charge and angular momentum of the black holes. This enables us to predict which of these black holes are expected to be thermally unstable under Hawking radiation.

  1. The swan-song phenomenon: last-works effects for 172 classical composers.

    PubMed

    Simonton, D K

    1989-03-01

    Creative individuals approaching their final years of life may undergo a transformation in outlook that is reflected in their last works. This hypothesized effect was quantitatively assessed for an extensive sample of 1,919 works by 172 classical composers. The works were independently gauged on seven aesthetic attributes (melodic originality, melodic variation, repertoire popularity, aesthetic significance, listener accessibility, performance duration, and thematic size), and potential last-works effects were operationally defined two separate ways (linearly and exponentially). Statistical controls were introduced for both longitudinal changes (linear, quadratic, and cubic age functions) and individual differences (eminence and lifetime productivity). Hierarchical regression analyses indicated that composers' swan songs tend to score lower in melodic originality and performance duration but higher in repertoire popularity and aesthetic significance. These last-works effects survive control for total compositional output, eminence, and most significantly, the composer's age when the last works were created.

  2. Generation of a tunable environment for electrical oscillator systems.

    PubMed

    León-Montiel, R de J; Svozilík, J; Torres, Juan P

    2014-07-01

    Many physical, chemical, and biological systems can be modeled by means of random-frequency harmonic oscillator systems. Even though the noise-free evolution of harmonic oscillator systems can be easily implemented, the way to experimentally introduce, and control, noise effects due to a surrounding environment remains a subject of lively interest. Here, we experimentally demonstrate a setup that provides a unique tool to generate a fully tunable environment for classical electrical oscillator systems. We illustrate the operation of the setup by implementing the case of a damped random-frequency harmonic oscillator. The high degree of tunability and control of our scheme is demonstrated by gradually modifying the statistics of the oscillator's frequency fluctuations. This tunable system can readily be used to experimentally study interesting noise effects, such as noise-induced transitions in systems driven by multiplicative noise, and noise-induced transport, a phenomenon that takes place in quantum and classical coupled oscillator networks.
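
    A damped random-frequency oscillator of the kind implemented in this setup can be simulated directly; the following sketch (all parameter values are illustrative) integrates the equation of motion with semi-implicit Euler, perturbing the squared frequency with multiplicative white noise:

```python
import numpy as np

def simulate(omega0=2 * np.pi, gamma=0.5, sigma=0.1, dt=1e-3,
             steps=20000, seed=1):
    """x'' = -omega0^2 (1 + noise) x - gamma x', integrated with
    semi-implicit Euler; sigma sets the frequency-noise strength."""
    rng = np.random.default_rng(seed)
    x, v = 1.0, 0.0
    xs = np.empty(steps)
    for i in range(steps):
        v += (-(omega0 ** 2) * x - gamma * v) * dt \
             - (omega0 ** 2) * sigma * x * np.sqrt(dt) * rng.normal()
        x += v * dt
        xs[i] = x
    return xs

xs = simulate()
print(f"early peak {np.abs(xs[:5000]).max():.3f}, late peak {np.abs(xs[-5000:]).max():.3f}")
```

    Increasing sigma relative to gamma moves the system toward the noise-induced transitions mentioned in the abstract, where parametric noise pumps in energy faster than damping removes it.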

  3. Superradiant Quantum Heat Engine.

    PubMed

    Hardal, Ali Ü C; Müstecaplıoğlu, Özgür E

    2015-08-11

    Quantum physics revolutionized the classical disciplines of mechanics, statistical physics, and electrodynamics. One branch of scientific knowledge, however, seems untouched: thermodynamics. A major motivation behind thermodynamics is to develop efficient heat engines. Technology has a trend toward miniaturizing engines, reaching quantum regimes. The development of quantum heat engines (QHEs) requires the emerging field of quantum thermodynamics. Studies of QHEs debate whether quantum coherence can be used as a resource. We explore an alternative in which it can function as an effective catalyst. We propose a QHE which consists of a photon gas inside an optical cavity as the working fluid and quantum coherent atomic clusters as the fuel. Utilizing superradiance, whereby a cluster can radiate quadratically faster than a single atom, we show that the work output becomes proportional to the square of the number of atoms. In addition to the practical value of cranking up a QHE, our result exposes a fundamental difference between a quantum fuel and its classical counterpart.

  4. Hidden Markov models incorporating fuzzy measures and integrals for protein sequence identification and alignment.

    PubMed

    Bidargaddi, Niranjan P; Chetty, Madhu; Kamruzzaman, Joarder

    2008-06-01

    Profile hidden Markov models (HMMs) based on classical HMMs have been widely applied for protein sequence identification. The formulation of the forward and backward variables in profile HMMs is made under the statistical independence assumption of probability theory. We propose a fuzzy profile HMM to overcome the limitations of that assumption and to achieve an improved alignment for protein sequences belonging to a given family. The proposed model fuzzifies the forward and backward variables by incorporating Sugeno fuzzy measures and Choquet integrals, thus further extending the generalized HMM. Based on the fuzzified forward and backward variables, we propose a fuzzy Baum-Welch parameter estimation algorithm for profiles. The strong correlations and the sequence preferences involved in protein structures make this fuzzy-architecture-based model a suitable candidate for building profiles of a given family, since fuzzy sets can handle uncertainties better than classical methods.
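
    The classical forward variable that the paper fuzzifies can be written down in a few lines; a minimal (non-fuzzy, non-profile) HMM baseline with hypothetical numbers looks like this:

```python
import numpy as np

def forward(obs, pi, A, B):
    """Classical forward algorithm: alpha[t, i] = P(o_1..o_t, q_t = i).
    The fuzzy profile HMM replaces these sums and products with Choquet
    integrals over Sugeno measures."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    return alpha, alpha[-1].sum()              # joint likelihood of the sequence

pi = np.array([0.6, 0.4])                      # initial state distribution
A = np.array([[0.7, 0.3], [0.4, 0.6]])         # transition probabilities
B = np.array([[0.9, 0.1], [0.2, 0.8]])         # emission probabilities
alpha, likelihood = forward([0, 1, 0], pi, A, B)
print(f"P(observations) = {likelihood:.4f}")
```

    The statistical independence assumption the authors relax is what licenses the simple product of transition and emission terms in each update.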

  5. Thermal quantum time-correlation functions from classical-like dynamics

    NASA Astrophysics Data System (ADS)

    Hele, Timothy J. H.

    2017-07-01

    Thermal quantum time-correlation functions are of fundamental importance in quantum dynamics, allowing experimentally measurable properties such as reaction rates, diffusion constants and vibrational spectra to be computed from first principles. Since the exact quantum solution scales exponentially with system size, there has been considerable effort in formulating reliable linear-scaling methods involving exact quantum statistics and approximate quantum dynamics modelled with classical-like trajectories. Here, we review recent progress in the field with the development of methods including centroid molecular dynamics, ring polymer molecular dynamics (RPMD) and thermostatted RPMD (TRPMD). We show how these methods have recently been obtained from 'Matsubara dynamics', a form of semiclassical dynamics which conserves the quantum Boltzmann distribution. We also apply the Matsubara formalism to reaction rate theory, rederiving t → 0+ quantum transition-state theory (QTST) and showing that Matsubara-TST, like RPMD-TST, is equivalent to QTST. We end by surveying areas for future progress.

  6. Enterocolic lymphocytic phlebitis: statistical analysis of histology features in viable and ischemic bowel.

    PubMed

    Medlicott, Shaun A C; Guggisberg, Kelly A; DesCôteaux, Jean-Gaston; Beck, Paul

    2006-07-01

    Enterocolic lymphocytic phlebitis is a rare cause of segmental ischemic enterocolitis. This artery-sparing transmural vasculitis is classically a circumferential phlebitis with perivenular lymphocyte cuffing and thrombi in the absence of systemic manifestations. Myointimal hyperplasia may represent a chronic phase of enterocolic lymphocytic phlebitis. Subclinical or early stage enterocolic lymphocytic phlebitis is not well delineated. We analyzed 600 submucosal and subserosal veins from both ischemic and intact bowel segments to discern whether vascular morphology varied between sites. Crescentic and circumferential lymphocytic phlebitis is more common in viable bowel than in the ischemic segment. A nonsignificant trend was found for increased crescentic morphology in intact bowel remote from the ischemic focus compared with that adjacent to the ischemic focus. Hallmarks of ischemic bowel are necrotizing phlebitis and thrombi formation. Thrombophlebitis morphology is distinctly different in viable and ischemic bowel, changing from the classic lymphocytic lesions in viable bowel to necrotizing lesions in the ischemic segment.

  7. Quantum weak turbulence with applications to semiconductor lasers

    NASA Astrophysics Data System (ADS)

    Lvov, Y. V.; Binder, R.; Newell, A. C.

    1998-10-01

    Based on a model Hamiltonian appropriate for the description of fermionic systems such as semiconductor lasers, we describe a natural asymptotic closure of the BBGKY hierarchy in complete analogy with that derived for classical weak turbulence. The main features of the interaction Hamiltonian are the inclusion of full Fermi statistics containing Pauli blocking and a simple, phenomenological, uniformly weak two-particle interaction potential equivalent to the static screening approximation. We find a new class of solutions to the quantum kinetic equation which are analogous to the Kolmogorov spectra of hydrodynamics and classical weak turbulence. They involve finite fluxes of particles and energy in momentum space and are particularly relevant for describing the behavior of systems containing sources and sinks. We make a prima facie case that these finite flux solutions can be important in the context of semiconductor lasers and show how they might be used to enhance laser performance.

  8. Brane Physics in M-theory

    NASA Astrophysics Data System (ADS)

    Argurio, Riccardo

    1998-07-01

    The thesis begins with an introduction to M-theory (at a graduate student's level), starting from perturbative string theory and proceeding to dualities, D-branes and finally Matrix theory. The following chapter gives a self-contained treatment of general classical p-brane solutions. Black and extremal branes are reviewed, along with their semi-classical thermodynamics. We then focus on intersecting extremal branes, the intersection rules being derived both with and without the explicit use of supersymmetry. The last three chapters cover more advanced aspects of brane physics, such as the dynamics of open branes, the little theories on the world-volume of branes and how the four-dimensional Schwarzschild black hole can be mapped to an extremal configuration of branes, thus allowing for a statistical interpretation of its entropy. The original results were already reported in hep-th/9701042, hep-th/9704190, hep-th/9710027 and hep-th/9801053.

  9. Experimentally modeling stochastic processes with less memory by the use of a quantum processor

    PubMed Central

    Palsson, Matthew S.; Gu, Mile; Ho, Joseph; Wiseman, Howard M.; Pryde, Geoff J.

    2017-01-01

    Computer simulation of observable phenomena is an indispensable tool for engineering new technology, understanding the natural world, and studying human society. However, the most interesting systems are often so complex that simulating their future behavior demands storing immense amounts of information regarding how they have behaved in the past. For increasingly complex systems, simulation becomes increasingly difficult and is ultimately constrained by resources such as computer memory. Recent theoretical work shows that quantum theory can reduce this memory requirement beyond ultimate classical limits, as measured by a process’ statistical complexity, C. We experimentally demonstrate this quantum advantage in simulating stochastic processes. Our quantum implementation observes a memory requirement of Cq = 0.05 ± 0.01, far below the ultimate classical limit of C = 1. Scaling up this technique would substantially reduce the memory required in simulations of more complex systems. PMID:28168218

  10. Experiments with central-limit properties of spatial samples from locally covariant random fields

    USGS Publications Warehouse

    Barringer, T.H.; Smith, T.E.

    1992-01-01

    When spatial samples are statistically dependent, the classical estimator of sample-mean standard deviation is well known to be inconsistent. For locally dependent samples, however, consistent estimators of sample-mean standard deviation can be constructed. The present paper investigates the sampling properties of one such estimator, designated as the tau estimator of sample-mean standard deviation. In particular, the asymptotic normality properties of standardized sample means based on tau estimators are studied in terms of computer experiments with simulated sample-mean distributions. The effects of both sample size and dependency levels among samples are examined for various values of tau (denoting the size of the spatial kernel for the estimator). The results suggest that even for small degrees of spatial dependency, the tau estimator exhibits significantly stronger normality properties than does the classical estimator of standardized sample means.
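    The abstract does not give the tau estimator's formula. As an illustrative one-dimensional analogue, a Bartlett-kernel (lag-window) estimator of the sample-mean variance down-weights sample autocovariances beyond a kernel size tau; this is a sketch of the general idea, not the paper's exact spatial estimator:

```python
def kernel_var_of_mean(xs, tau):
    """Bartlett-kernel estimate of Var(sample mean) for a locally
    dependent 1-D sample; an illustrative analogue of a spatial kernel
    estimator, not the paper's exact tau estimator."""
    n = len(xs)
    m = sum(xs) / n

    def acov(k):  # biased sample autocovariance at lag k
        return sum((xs[i] - m) * (xs[i + k] - m) for i in range(n - k)) / n

    v = acov(0)
    for k in range(1, tau + 1):
        v += 2 * (1 - k / (tau + 1)) * acov(k)
    return v / n
```

    With tau = 0 this reduces to the classical (independence-based) estimate; positive tau inflates the estimate when nearby samples are positively correlated, which is exactly the correction the classical estimator misses.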

  11. Classical-processing and quantum-processing signal separation methods for qubit uncoupling

    NASA Astrophysics Data System (ADS)

    Deville, Yannick; Deville, Alain

    2012-12-01

    The Blind Source Separation problem consists in estimating a set of unknown source signals from their measured combinations. Until now, it had been investigated only in a non-quantum framework. We propose its first quantum extensions. We thus introduce the Quantum Source Separation field, investigating both its blind and non-blind configurations. More precisely, we show how to retrieve individual quantum bits (qubits) only from the global state resulting from their undesired coupling. We consider cylindrical-symmetry Heisenberg coupling, which occurs, e.g., when two electron spins interact through exchange. We first propose several qubit uncoupling methods which typically measure repeatedly the coupled quantum states resulting from individual qubit preparations, and which then statistically process the classical data provided by these measurements. Numerical tests prove the effectiveness of these methods. We then derive a combination of quantum gates for performing qubit uncoupling, thus avoiding repeated qubit preparations and irreversible measurements.

  12. Spectral likelihood expansions for Bayesian inference

    NASA Astrophysics Data System (ADS)

    Nagel, Joseph B.; Sudret, Bruno

    2016-03-01

    A spectral approach to Bayesian inference is presented. It pursues the emulation of the posterior probability density. The starting point is a series expansion of the likelihood function in terms of orthogonal polynomials. From this spectral likelihood expansion all statistical quantities of interest can be calculated semi-analytically. The posterior is formally represented as the product of a reference density and a linear combination of polynomial basis functions. Both the model evidence and the posterior moments are related to the expansion coefficients. This formulation avoids Markov chain Monte Carlo simulation and allows one to make use of linear least squares instead. The pros and cons of spectral Bayesian inference are discussed and demonstrated on the basis of simple applications from classical statistics and inverse modeling.
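    The key identity can be shown in a toy setting: with a uniform prior on [-1, 1] and Legendre polynomials orthonormal with respect to that prior, the model evidence is the 0th expansion coefficient of the likelihood, and the posterior mean follows from the 1st. A numerical sketch using midpoint-rule projections (function names are ours, and this is a one-dimensional illustration, not the paper's general machinery):

```python
import math

def legendre01(theta):
    """First two Legendre polynomials, orthonormal w.r.t. the uniform
    prior density 1/2 on [-1, 1]: psi0 = 1, psi1 = sqrt(3) * theta."""
    return 1.0, math.sqrt(3.0) * theta

def spectral_posterior_mean(loglike, n_grid=2001):
    """Project the likelihood onto psi0 and psi1 by midpoint quadrature.
    Evidence Z = a0; posterior mean = a1 / (sqrt(3) * a0)."""
    a0 = a1 = 0.0
    for i in range(n_grid):
        theta = -1.0 + (2 * i + 1) / n_grid  # cell midpoints on [-1, 1]
        like = math.exp(loglike(theta))
        p0, p1 = legendre01(theta)
        a0 += like * p0 / n_grid  # expectations under the uniform prior
        a1 += like * p1 / n_grid
    return a0, a1 / (math.sqrt(3.0) * a0)

# Gaussian likelihood centred at 0.2 with sigma = 0.1 (illustrative):
Z, post_mean = spectral_posterior_mean(lambda t: -0.5 * ((t - 0.2) / 0.1) ** 2)
```

    No Markov chain sampling is involved: both quantities come out of linear projections of the likelihood, which is the point of the spectral formulation.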

  13. A statistical theory for sound radiation and reflection from a duct

    NASA Technical Reports Server (NTRS)

    Cho, Y. C.

    1979-01-01

    A new analytical method is introduced for the study of the sound radiation and reflection from the open end of a duct. The sound is thought of as an aggregation of quasiparticles (phonons). The motion of the latter is described in terms of the statistical distribution, which is derived from the classical wave theory. The results are in good agreement with the solutions obtained using the Wiener-Hopf technique when the latter is applicable, but the new method is simple and provides straightforward physical interpretation of the problem. Furthermore, it is applicable to a problem involving a duct in which modes are difficult to determine or cannot be defined at all, whereas the Wiener-Hopf technique is not.

  14. Responding to the Medical Malpractice Insurance Crisis: A National Risk Management Information System

    PubMed Central

    Wess, Bernard P.; Jacobson, Gary

    1987-01-01

    In the process of forming a new medical malpractice reinsurance company, the authors analyzed thousands of medical malpractice cases, settlements, and verdicts. The evidence of those analyses indicated that the medical malpractice crisis is (1) emerging nation- and world-wide, (2) exacerbated by but not primarily a result of “predatory” legal action, (3) statistically determined by a small percentage of physicians and procedures, (4) overburdened with data but poor on information, (5) subject to classic forms of quality control and automation. The management information system developed to address this problem features a tiered data base architecture to accommodate medical, administrative, procedural, statistical, and actuarial analyses necessary to predict claims from untoward events, not merely to report them.

  15. A normative inference approach for optimal sample sizes in decisions from experience

    PubMed Central

    Ostwald, Dirk; Starke, Ludger; Hertwig, Ralph

    2015-01-01

    “Decisions from experience” (DFE) refers to a body of work that emerged in research on behavioral decision making over the last decade. One of the major experimental paradigms employed to study experience-based choice is the “sampling paradigm,” which serves as a model of decision making under limited knowledge about the statistical structure of the world. In this paradigm respondents are presented with two payoff distributions, which, in contrast to standard approaches in behavioral economics, are specified not in terms of explicit outcome-probability information, but by the opportunity to sample outcomes from each distribution without economic consequences. Participants are encouraged to explore the distributions until they feel confident enough to decide from which distribution they would prefer to draw in a final trial involving real monetary payoffs. One commonly employed measure to characterize the behavior of participants in the sampling paradigm is the sample size, that is, the number of outcome draws which participants choose to obtain from each distribution prior to terminating sampling. A natural question that arises in this context concerns the “optimal” sample size, which could be used as a normative benchmark to evaluate human sampling behavior in DFE. In this theoretical study, we relate the DFE sampling paradigm to the classical statistical decision theoretic literature and, under a probabilistic inference assumption, evaluate optimal sample sizes for DFE. In our treatment we go beyond analytically established results by showing how the classical statistical decision theoretic framework can be used to derive optimal sample sizes under arbitrary, but numerically evaluable, constraints. Finally, we critically evaluate the value of deriving optimal sample sizes under this framework as testable predictions for the experimental study of sampling behavior in DFE. PMID:26441720
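    The "optimal sample size" question can be made concrete in a toy version: sample n draws from each of two Bernoulli payoff distributions, choose the option with the larger sample mean, and trade off choice accuracy against a per-draw sampling cost. This is a deliberately simplified Monte Carlo sketch with illustrative parameters, not the paper's decision-theoretic treatment:

```python
import random

def expected_choice_value(p_a, p_b, n, cost_per_draw=0.0, sims=4000, seed=1):
    """Monte Carlo estimate of the expected true payoff of the chosen
    option, minus the total sampling cost, when each of two Bernoulli
    options is sampled n times. Ties are broken at random."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(sims):
        mean_a = sum(rng.random() < p_a for _ in range(n)) / n
        mean_b = sum(rng.random() < p_b for _ in range(n)) / n
        if mean_a > mean_b:
            chosen = p_a
        elif mean_b > mean_a:
            chosen = p_b
        else:
            chosen = p_a if rng.random() < 0.5 else p_b
        total += chosen - 2 * n * cost_per_draw  # n draws from each option
    return total / sims
```

    With free sampling, larger n always helps; a positive per-draw cost creates an interior optimum over n, which is the normative benchmark the paper formalizes.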

  16. Statistical Mechanics of Disordered Systems - Series: Cambridge Series in Statistical and Probabilistic Mathematics (No. 18)

    NASA Astrophysics Data System (ADS)

    Bovier, Anton

    2006-06-01

    Our mathematical understanding of the statistical mechanics of disordered systems is going through a period of stunning progress. This self-contained book is a graduate-level introduction for mathematicians and for physicists interested in the mathematical foundations of the field, and can be used as a textbook for a two-semester course on mathematical statistical mechanics. It assumes only basic knowledge of classical physics and, on the mathematics side, a good working knowledge of graduate-level probability theory. The book starts with a concise introduction to statistical mechanics, proceeds to disordered lattice spin systems, and concludes with a presentation of the latest developments in the mathematical understanding of mean-field spin glass models. In particular, recent progress towards a rigorous understanding of the replica symmetry-breaking solutions of the Sherrington-Kirkpatrick spin glass models, due to Guerra, Aizenman-Sims-Starr and Talagrand, is reviewed in some detail. The book offers a comprehensive introduction to an active and fascinating area of research, a clear exposition that builds to the state of the art in the mathematics of spin glasses, and is written by a well-known and active researcher in the field.

  17. Ecological statistics of Gestalt laws for the perceptual organization of contours.

    PubMed

    Elder, James H; Goldberg, Richard M

    2002-01-01

    Although numerous studies have measured the strength of visual grouping cues for controlled psychophysical stimuli, little is known about the statistical utility of these various cues for natural images. In this study, we conducted experiments in which human participants trace perceived contours in natural images. These contours are automatically mapped to sequences of discrete tangent elements detected in the image. By examining relational properties between pairs of successive tangents on these traced curves, and between randomly selected pairs of tangents, we are able to estimate the likelihood distributions required to construct an optimal Bayesian model for contour grouping. We employed this novel methodology to investigate the inferential power of three classical Gestalt cues for contour grouping: proximity, good continuation, and luminance similarity. The study yielded a number of important results: (1) these cues, when appropriately defined, are approximately uncorrelated, suggesting a simple factorial model for statistical inference; (2) moderate image-to-image variation of the statistics indicates the utility of general probabilistic models for perceptual organization; (3) these cues differ greatly in their inferential power, proximity being by far the most powerful; and (4) statistical modeling of the proximity cue indicates a scale-invariant power law in close agreement with prior psychophysics.
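    The factorial model suggested by result (1) is naive-Bayes cue combination: multiply the prior odds of "same contour" by one likelihood ratio per cue. A minimal sketch with illustrative numbers (the study estimates the actual likelihood distributions from traced natural-image contours):

```python
def combine_cues(prior_odds, likelihood_ratios):
    """Factorial (independent-cue) Bayesian combination: posterior odds
    are the prior odds times the product of per-cue likelihood ratios;
    the result is converted to a posterior probability."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Proximity strongly favors grouping, good continuation mildly favors it,
# luminance similarity mildly opposes it (illustrative ratios):
p_same = combine_cues(prior_odds=0.1, likelihood_ratios=[40.0, 2.0, 0.7])
```

    The approximate independence of the cues is what licenses multiplying the likelihood ratios; without it, the joint likelihood would have to be modeled directly.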

  18. The image recognition based on neural network and Bayesian decision

    NASA Astrophysics Data System (ADS)

    Wang, Chugege

    2018-04-01

    Artificial neural networks originated in the 1940s and are an important part of artificial intelligence. At present, they are a hot topic in neuroscience, computer science, brain science, mathematics, and psychology. Thomas Bayes first reported what became Bayesian theory in 1763. After development through the twentieth century, it has become widespread in all areas of statistics. In recent years, owing to solutions of the problem of high-dimensional integral calculation, Bayesian statistics has improved theoretically, solving many problems that classical statistics cannot, and it is also applied in interdisciplinary fields. In this paper, the related concepts and principles of artificial neural networks are introduced. The paper also summarizes the basic content and principles of Bayesian statistics, and combines artificial neural network technology with Bayesian decision theory in several aspects of image recognition, such as an enhanced face detection method based on a neural network and Bayesian decision, as well as image classification based on Bayesian decision. It can be seen that the combination of artificial intelligence and statistical algorithms remains a hot research topic.

  19. Reversed inverse regression for the univariate linear calibration and its statistical properties derived using a new methodology

    NASA Astrophysics Data System (ADS)

    Kang, Pilsang; Koo, Changhoi; Roh, Hokyu

    2017-11-01

    Since simple linear regression theory was established at the beginning of the 1900s, it has been used in a variety of fields. Unfortunately, it cannot be used directly for calibration. In practical calibrations, the observed measurements (the inputs) are subject to errors, and hence they vary, thus violating the assumption that the inputs are fixed. Therefore, in the case of calibration, the regression line fitted using the method of least squares is not consistent with the statistical properties of simple linear regression as already established based on this assumption. To resolve this problem, "classical regression" and "inverse regression" have been proposed. However, they do not completely resolve the problem. As a fundamental solution, we introduce "reversed inverse regression" along with a new methodology for deriving its statistical properties. In this study, the statistical properties of this regression are derived using the "error propagation rule" and the "method of simultaneous error equations" and are compared with those of the existing regression approaches. The accuracy of the statistical properties thus derived is investigated in a simulation study. We conclude that the newly proposed regression and methodology constitute the complete regression approach for univariate linear calibrations.
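    The two competing estimators named in the abstract are easy to state. In "classical regression" one fits y = a + bx by least squares and inverts the fitted line at a new observed response y0; in "inverse regression" one regresses x on y directly and predicts at y0. A minimal sketch (function names are ours; the paper's "reversed inverse regression" is a further variant not implemented here):

```python
def fit_line(xs, ys):
    """Ordinary least squares fit ys ≈ a + b * xs; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def classical_estimate(xs, ys, y0):
    """Classical calibration: fit y on x, then invert the line at y0."""
    a, b = fit_line(xs, ys)
    return (y0 - a) / b

def inverse_estimate(xs, ys, y0):
    """Inverse calibration: regress x on y and predict directly at y0."""
    c, d = fit_line(ys, xs)
    return c + d * y0
```

    On noise-free data the two coincide; with errors in the observed responses they differ, which is the gap the paper's reversed inverse regression aims to close.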

  20. Quantifying Safety Margin Using the Risk-Informed Safety Margin Characterization (RISMC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia

    2015-04-26

    The Risk-Informed Safety Margin Characterization (RISMC), developed by Idaho National Laboratory as part of the Light-Water Reactor Sustainability Project, utilizes a probabilistic safety margin comparison between a load and capacity distribution, rather than a deterministic comparison between two values, as is usually done in best-estimate plus uncertainty analyses. The goal is to determine the failure probability, or in other words, the probability of the system load equaling or exceeding the system capacity. While this method has been used in pilot studies, there has been little work conducted investigating the statistical significance of the resulting failure probability. In particular, it is difficult to determine how many simulations are necessary to properly characterize the failure probability. This work uses classical (frequentist) statistics and confidence intervals to examine the impact in statistical accuracy when the number of simulations is varied. Two methods are proposed to establish confidence intervals related to the failure probability established using a RISMC analysis. The confidence interval provides information about the statistical accuracy of the method utilized to explore the uncertainty space, and offers a quantitative method to gauge the increase in statistical accuracy due to performing additional simulations.
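    A frequentist interval of the kind discussed can be sketched with the Wilson score interval for the Monte Carlo failure fraction. This is an illustrative choice; the report's two proposed methods may differ:

```python
from statistics import NormalDist

def failure_prob_ci(failures, n_sims, conf=0.95):
    """Wilson score interval for a failure probability estimated as the
    fraction of load >= capacity outcomes in n_sims simulations."""
    z = NormalDist().inv_cdf(0.5 + conf / 2)
    p = failures / n_sims
    denom = 1 + z * z / n_sims
    center = (p + z * z / (2 * n_sims)) / denom
    half = z * ((p * (1 - p) / n_sims + z * z / (4 * n_sims ** 2)) ** 0.5) / denom
    return center - half, center + half

lo, hi = failure_prob_ci(10, 1000)       # 1% observed failure rate
lo2, hi2 = failure_prob_ci(100, 10000)   # same rate, 10x the simulations
print(hi - lo, hi2 - lo2)  # the interval tightens with more runs
```

    Comparing interval widths across simulation counts quantifies, as the abstract describes, how much statistical accuracy is bought by performing additional simulations.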

  1. HYPOTHESIS SETTING AND ORDER STATISTIC FOR ROBUST GENOMIC META-ANALYSIS.

    PubMed

    Song, Chi; Tseng, George C

    2014-01-01

    Meta-analysis techniques have been widely developed and applied in genomic applications, especially for combining multiple transcriptomic studies. In this paper, we propose an order statistic of p-values (rth ordered p-value, rOP) across combined studies as the test statistic. We illustrate different hypothesis settings that detect gene markers differentially expressed (DE) "in all studies", "in the majority of studies", or "in one or more studies", and specify rOP as a suitable method for detecting DE genes "in the majority of studies". We develop methods to estimate the parameter r in rOP for real applications. Statistical properties such as its asymptotic behavior and a one-sided testing correction for detecting markers of concordant expression changes are explored. Power calculation and simulation show better performance of rOP compared to classical Fisher's method, Stouffer's method, minimum p-value method and maximum p-value method under the focused hypothesis setting. Theoretically, rOP is found connected to the naïve vote counting method and can be viewed as a generalized form of vote counting with better statistical properties. The method is applied to three microarray meta-analysis examples including major depressive disorder, brain cancer and diabetes. The results demonstrate rOP as a more generalizable, robust and sensitive statistical framework to detect disease-related markers.
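    The rOP statistic can be sketched in a few lines: under the null, p-values are Uniform(0, 1), so the rth smallest of n p-values follows a Beta(r, n-r+1) law whose CDF is a binomial tail sum. A minimal Python sketch (function names are ours, not the authors' implementation), with Fisher's method for comparison, using the exact chi-squared tail available in closed form for even degrees of freedom:

```python
import math

def rop_pvalue(pvals, r):
    """rOP test: the statistic is the r-th smallest p-value; under H0 it
    is Beta(r, n-r+1) distributed, whose CDF equals a binomial tail sum."""
    n = len(pvals)
    x = sorted(pvals)[r - 1]
    # P(r-th order statistic of n Uniform(0,1) variables <= x)
    return sum(math.comb(n, k) * x**k * (1 - x)**(n - k) for k in range(r, n + 1))

def fisher_pvalue(pvals):
    """Fisher's method: -2 * sum(log p) ~ chi-squared with 2n d.o.f.;
    the survival function is exact in closed form for even d.o.f."""
    n = len(pvals)
    t = -2.0 * sum(math.log(p) for p in pvals)
    return math.exp(-t / 2) * sum((t / 2) ** k / math.factorial(k) for k in range(n))

# One gene measured in five studies: significant in the majority,
# not in all -- the setting where rOP is designed to have power.
p = [0.001, 0.004, 0.01, 0.40, 0.85]
print(rop_pvalue(p, r=3))
print(fisher_pvalue(p))
```

    With r = n this reduces to the maximum p-value method and with r = 1 to the minimum p-value method, which is why rOP interpolates between the hypothesis settings described above.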

  2. Rescaled earthquake recurrence time statistics: application to microrepeaters

    NASA Astrophysics Data System (ADS)

    Goltz, Christian; Turcotte, Donald L.; Abaimov, Sergey G.; Nadeau, Robert M.; Uchida, Naoki; Matsuzawa, Toru

    2009-01-01

    Slip on major faults primarily occurs during 'characteristic' earthquakes. The recurrence statistics of characteristic earthquakes play an important role in seismic hazard assessment. A major problem in determining applicable statistics is the short sequences of characteristic earthquakes that are available worldwide. In this paper, we introduce a rescaling technique in which sequences can be superimposed to establish larger numbers of data points. We consider the Weibull and log-normal distributions; in both cases we rescale the data using means and standard deviations. We test our approach utilizing sequences of microrepeaters, micro-earthquakes which recur in the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Microrepeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. In this paper, we present results for the analysis of recurrence times for several microrepeater sequences from Parkfield, CA as well as NE Japan. We find that, once the respective sequence can be considered to be of sufficient stationarity, the statistics can be well fitted by either a Weibull or a log-normal distribution. We clearly demonstrate this fact by our technique of rescaled combination. We conclude that the recurrence statistics of the microrepeater sequences we consider are similar to the recurrence statistics of characteristic earthquakes on major faults.
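    The rescaling step can be sketched directly: standardize each recurrence-time sequence by its own sample mean and standard deviation, then pool the standardized values. This is a minimal sketch of the combination idea only; the paper additionally fits Weibull and log-normal forms to the pooled data:

```python
import statistics

def rescale(times):
    """Standardize one recurrence-time sequence by its sample mean and
    standard deviation so that short sequences from different faults
    can be superimposed into one larger data set."""
    m = statistics.mean(times)
    s = statistics.stdev(times)
    return [(t - m) / s for t in times]

# Two short 'microrepeater' sequences (illustrative numbers, not data
# from Parkfield or NE Japan):
seq_a = [11.2, 13.5, 9.8, 12.1, 10.4]
seq_b = [25.0, 31.3, 22.7, 28.9]
pooled = rescale(seq_a) + rescale(seq_b)  # 9 points instead of 5 and 4
```

    Each rescaled sequence has mean 0 and unit standard deviation, so sequences with very different absolute recurrence intervals become directly comparable.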

  3. Randomized central limit theorems: A unified theory.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic-scaling all ensemble components by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic-scaling the ensemble components by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)-in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes-and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  4. Randomized central limit theorems: A unified theory

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles’ aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles’ extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic—scaling all ensemble components by a common deterministic scale. However, there are “random environment” settings in which the underlying scaling schemes are stochastic—scaling the ensemble components by different random scales. Examples of such settings include Holtsmark’s law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)—in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes—and present “randomized counterparts” to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  5. Non-statistical effects in bond fission reactions of 1,2-difluoroethane

    NASA Astrophysics Data System (ADS)

    Schranz, Harold W.; Raff, Lionel M.; Thompson, Donald L.

    1991-08-01

    A microcanonical, classical variational transition-state theory based on the use of the efficient microcanonical sampling (EMS) procedure is applied to simple bond fission in 1,2-difluoroethane. Comparison is made with results of trajectory calculations performed on the same global potential-energy surface. Agreement between the statistical theory and trajectory results for C-C, C-F, and C-H bond fissions is poor, with differences as large as a factor of 125. Most importantly, at the lower energy studied, 6.0 eV, the statistical calculations predict considerably slower rates than those computed from trajectories. We conclude from these results that the statistical assumptions inherent in the transition-state theory method are not valid for 1,2-difluoroethane in spite of the fact that the total intramolecular energy transfer rate out of C-H and C-C normal and local modes is large relative to the bond fission rates. The IVR rate is not globally rapid and the trajectories do not access all of the energetically available phase space uniformly on the timescale of the reactions.

  6. Fisher's method of combining dependent statistics using generalizations of the gamma distribution with applications to genetic pleiotropic associations.

    PubMed

    Li, Qizhai; Hu, Jiyuan; Ding, Juan; Zheng, Gang

    2014-04-01

    A classical approach to combining independent test statistics is Fisher's combination of p-values, which follows the chi-squared distribution. When the test statistics are dependent, the gamma distribution (GD) is commonly used for the Fisher's combination test (FCT). We propose to use two generalizations of the GD: the generalized and the exponentiated GDs. We study some properties of mis-using the GD for the FCT to combine dependent statistics when one of the two proposed distributions is true. Our results show that both generalizations have better control of type I error rates than the GD, which tends to have inflated type I error rates at more extreme tails. In practice, common model selection criteria (e.g. Akaike information criterion/Bayesian information criterion) can be used to help select a better distribution to use for the FCT. A simple strategy for applying the two generalizations of the GD in genome-wide association studies is discussed. Applications of the results to genetic pleiotropic associations are described, where multiple traits are tested for association with a single marker.
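    To see why dependence matters, one can compare Fisher's nominal chi-squared null against a Monte Carlo null generated from correlated test statistics. The sketch below uses two equicorrelated two-sided z-tests; all names are illustrative, and the paper's gamma-based corrections are not implemented here:

```python
import math
import random

def fisher_stat(pvals):
    """Fisher's combination statistic T = -2 * sum(log p)."""
    return -2.0 * sum(math.log(p) for p in pvals)

def two_sided_p(z):
    """p-value of a two-sided z-test: 2 * (1 - Phi(|z|))."""
    return math.erfc(abs(z) / math.sqrt(2))

def mc_fisher_pvalue(observed_pvals, rho, sims=20000, seed=0):
    """Monte Carlo null of Fisher's statistic for two z-statistics with
    correlation rho; with rho != 0 the chi-squared null of the
    independent case no longer applies."""
    rng = random.Random(seed)
    t_obs = fisher_stat(observed_pvals)
    hits = 0
    for _ in range(sims):
        z1 = rng.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho * rho) * rng.gauss(0, 1)
        if fisher_stat([two_sided_p(z1), two_sided_p(z2)]) >= t_obs:
            hits += 1
    return hits / sims
```

    For rho = 0 the Monte Carlo p-value matches the chi-squared result; for strong positive rho the null tail is heavier, so the naive chi-squared p-value is anti-conservative, which is the inflation the proposed generalized distributions correct.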

  7. GASP cloud- and particle-encounter statistics and their application to LPC aircraft studies. Volume 1: Analysis and conclusions

    NASA Technical Reports Server (NTRS)

    Jasperson, W. H.; Nastrom, G. D.; Davis, R. E.; Holdeman, J. D.

    1984-01-01

    Summary studies are presented for the entire cloud observation archive from the NASA Global Atmospheric Sampling Program (GASP). Studies are also presented for GASP particle concentration data gathered concurrently with the cloud observations. Cloud encounters occurred on about 15 percent of the data samples overall, but the probability of cloud encounter is shown to vary significantly with altitude, latitude, and distance from the tropopause. Several meteorological circulation features are apparent in the latitudinal distribution of cloud cover, and the cloud encounter statistics are shown to be consistent with the classical mid-latitude cyclone model. Observations of clouds spaced more closely than 90 minutes are shown to be statistically dependent. The statistics for cloud and particle encounter are utilized to estimate the frequency of cloud encounter on long range airline routes, and to assess the probability and extent of laminar flow loss due to cloud or particle encounter by aircraft utilizing laminar flow control (LFC). It is shown that the probability of extended cloud encounter is too low, by itself, to make LFC impractical.

  8. δ13C and δ18O isotopic composition of CaCO3 measured by continuous flow isotope ratio mass spectrometry: statistical evaluation and verification by application to Devils Hole core DH-11 calcite

    USGS Publications Warehouse

    Revesz, Kinga M.; Landwehr, Jurate M.

    2002-01-01

    A new method was developed to analyze the stable carbon and oxygen isotope ratios of small samples (400 ± 20 µg) of calcium carbonate. This new method streamlines the classical phosphoric acid/calcium carbonate (H3PO4/CaCO3) reaction method by making use of a recently available Thermoquest-Finnigan GasBench II preparation device and a Delta Plus XL continuous flow isotope ratio mass spectrometer. Conditions for which the H3PO4/CaCO3 reaction produced reproducible and accurate results with minimal error had to be determined. When the acid/carbonate reaction temperature was kept at 26 °C and the reaction time was between 24 and 54 h, the precision of the carbon and oxygen isotope ratios for pooled samples from three reference standard materials was ≤0.1 and ≤0.2 per mill or ‰, respectively, although later analysis showed that materials from one specific standard required reaction time between 34 and 54 h for δ18O to achieve this level of precision. Aliquot screening methods were shown to further minimize the total error. The accuracy and precision of the new method were analyzed and confirmed by statistical analysis. The utility of the method was verified by analyzing calcite from Devils Hole, Nevada, for which isotope-ratio values had previously been obtained by the classical method. Devils Hole core DH-11 recently had been re-cut and re-sampled, and isotope-ratio values were obtained using the new method. The results were comparable with those obtained by the classical method with correlation = +0.96 for both isotope ratios. The consistency of the isotopic results is such that an alignment offset could be identified in the re-sampled core material, and two cutting errors that occurred during re-sampling then were confirmed independently. This result indicates that the new method is a viable alternative to the classical reaction method. 
    In particular, the new method requires less sample material, permitting finer resolution, and allows automation of some processes, resulting in considerable time savings.
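
    A minimal sketch of the agreement statistic used above: the correlation between isotope-ratio values obtained by the classical method and by the new method on the same core material. The numbers below are hypothetical illustration values, not the paper's data.

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical delta-13C values (per mill) at the same core depths,
# measured by the classical method and by the new GasBench method.
classical = [1.82, 1.75, 1.90, 2.01, 1.68, 1.79]
gasbench  = [1.80, 1.77, 1.93, 1.99, 1.70, 1.81]

r = pearson(classical, gasbench)
print(f"correlation = {r:+.2f}")  # values near +1 indicate method agreement
```

    A correlation of about +0.96, as reported, indicates that the two methods track each other closely across the sampled depths.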

  9. Poincaré resonances and the limits of trajectory dynamics.

    PubMed Central

    Petrosky, T; Prigogine, I

    1993-01-01

    In previous papers we have shown that the elimination of the resonance divergences in large Poincaré systems leads to complex irreducible spectral representations for the Liouville-von Neumann operator. "Complex" means that time symmetry is broken, and "irreducible" means that this representation is implementable only by statistical ensembles and not by trajectories. In this paper we consider classical potential scattering. Our theory applies to persistent scattering. Numerical simulations show quantitative agreement with our predictions. PMID:11607428

  10. Long-time predictions in nonlinear dynamics

    NASA Technical Reports Server (NTRS)

    Szebehely, V.

    1980-01-01

    It is known that nonintegrable dynamical systems do not allow precise predictions of their behavior over arbitrarily long times: the available series solutions are not uniformly convergent, according to Poincaré's theorem, and numerical integrations lose their meaningfulness after arbitrarily long times. Two approaches are available: the use of existing global integrals, and statistical methods. This paper presents a generalized method along the lines of the first approach. As examples, long-time predictions in the classical gravitational satellite and planetary problems are treated.
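
    The global-integral idea can be sketched numerically (my own illustration, not the paper's method): a conserved integral such as the energy provides a running check on how trustworthy a long integration remains. Here, the planar Kepler problem in units with GM = 1:

```python
# Use the energy integral to monitor a long leapfrog integration of the
# planar Kepler problem; a symplectic scheme keeps the drift bounded.

def energy(x, y, vx, vy):
    r = (x * x + y * y) ** 0.5
    return 0.5 * (vx * vx + vy * vy) - 1.0 / r

def leapfrog(x, y, vx, vy, dt, steps):
    """Kick-drift-kick update for acceleration a = -r / |r|^3."""
    for _ in range(steps):
        r3 = (x * x + y * y) ** 1.5
        vx -= 0.5 * dt * x / r3; vy -= 0.5 * dt * y / r3
        x += dt * vx; y += dt * vy
        r3 = (x * x + y * y) ** 1.5
        vx -= 0.5 * dt * x / r3; vy -= 0.5 * dt * y / r3
    return x, y, vx, vy

# circular orbit: r = 1, v = 1, so E = 0.5 - 1 = -0.5
state0 = (1.0, 0.0, 0.0, 1.0)
e0 = energy(*state0)
state = leapfrog(*state0, dt=0.01, steps=100_000)
drift = abs(energy(*state) - e0)
print(f"relative energy drift after 1000 time units: {drift / abs(e0):.2e}")
```

    A growing drift in such an integral is a warning that the numerical trajectory has lost its meaning, independent of any comparison with series solutions.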

  11. Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.

    2013-05-01

    Seismic hazard analysis has become a very important issue in the last few decades. New technologies and improved data availability have helped many scientists understand where and why earthquakes happen, the physics of earthquakes, and so on, and the role of uncertainty in seismic hazard analysis has begun to be appreciated. However, a significant problem remains: how to handle the existing uncertainty, since the same lack of information makes uncertainty difficult to quantify accurately. Attenuation curves are usually obtained statistically, by regression analysis. Statistical and probabilistic analyses show overlapping results for the site coefficients; this overlap occurs not only at the border between two neighboring classes, but also among more than three classes. Although the analysis starts by classifying sites in geological terms, the resulting site coefficients are not cleanly separated by class. In the present study, this problem is addressed using fuzzy set theory: with membership functions, the ambiguities at the border between neighboring classes can be avoided. Fuzzy set theory is applied to southern California data in the conventional way. The standard deviations that measure the variation within each site class, obtained by fuzzy set theory and by the classical approach, are then compared. The results show that, when data are insufficient for hazard assessment, site classification based on fuzzy set theory yields smaller standard deviations than the classical approach, which is direct evidence of reduced uncertainty.
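
    The membership-function idea can be sketched as follows. This is a hypothetical illustration with made-up class boundaries over an assumed site parameter (average shear-wave velocity, m/s), not the study's actual classification: a site near a boundary belongs partially to both neighboring classes instead of being forced into one.

```python
# Overlapping triangular membership functions for fuzzy site classification.

def triangular(x, a, b, c):
    """Membership in [0, 1] for a triangle with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical class definitions (feet/peak values chosen for illustration).
classes = {
    "soft soil":  (100.0, 180.0, 360.0),
    "stiff soil": (180.0, 360.0, 760.0),
    "rock":       (360.0, 760.0, 1500.0),
}

vs30 = 340.0  # a site close to the soft/stiff boundary
memberships = {name: triangular(vs30, *abc) for name, abc in classes.items()}
for name, mu in memberships.items():
    print(f"{name:10s} mu = {mu:.2f}")
```

    Site coefficients can then be weighted by these memberships rather than assigned to a single crisp class, which is what removes the ambiguity at class borders.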

  12. Simulations of relativistic quantum plasmas using real-time lattice scalar QED

    NASA Astrophysics Data System (ADS)

    Shi, Yuan; Xiao, Jianyuan; Qin, Hong; Fisch, Nathaniel J.

    2018-05-01

    Real-time lattice quantum electrodynamics (QED) provides a unique tool for simulating plasmas in the strong-field regime, where collective plasma scales are not well separated from relativistic-quantum scales. As a toy model, we study scalar QED, which describes self-consistent interactions between charged bosons and electromagnetic fields. To solve this model on a computer, we first discretize the scalar-QED action on a lattice, in a way that respects geometric structures of exterior calculus and U(1)-gauge symmetry. The lattice scalar QED can then be solved, in the classical-statistics regime, by advancing an ensemble of statistically equivalent initial conditions in time, using classical field equations obtained by extremizing the discrete action. To demonstrate the capability of our numerical scheme, we apply it to two example problems. The first example is the propagation of linear waves, where we recover analytic wave dispersion relations using numerical spectra. The second example is an intense laser interacting with a one-dimensional plasma slab, where we demonstrate the natural transition from wakefield acceleration to pair production when the wave amplitude exceeds the Schwinger threshold. Our real-time lattice scheme is fully explicit and respects local conservation laws, making it reliable for long-time dynamics. The algorithm is readily parallelized using domain decomposition, and the ensemble may be computed using quantum parallelism in the future.
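
    The classical-statistics procedure — advance an ensemble of statistically equivalent initial conditions with the classical field equations and read off observables as ensemble averages — can be sketched for a single harmonic field mode. This is a toy reduction of the idea, not the lattice-QED code itself:

```python
import random

# Draw a Gaussian ensemble of initial conditions for one field mode,
# evolve each member with the classical equation of motion, and compute
# observables as averages over the ensemble.

random.seed(1)
OMEGA, DT, STEPS, N_ENSEMBLE = 1.0, 0.01, 628, 2000

def advance(phi, pi):
    """Leapfrog update for a harmonic mode: d^2(phi)/dt^2 = -omega^2 phi."""
    for _ in range(STEPS):
        pi -= 0.5 * DT * OMEGA ** 2 * phi
        phi += DT * pi
        pi -= 0.5 * DT * OMEGA ** 2 * phi
    return phi, pi

# statistically equivalent initial conditions: a Gaussian phase-space cloud
members = [(random.gauss(0.0, 1.0), random.gauss(0.0, 1.0))
           for _ in range(N_ENSEMBLE)]
final = [advance(phi, pi) for phi, pi in members]

mean_phi = sum(phi for phi, _ in final) / N_ENSEMBLE
var_phi = sum(phi ** 2 for phi, _ in final) / N_ENSEMBLE - mean_phi ** 2
print(f"<phi> = {mean_phi:+.3f}, var(phi) = {var_phi:.3f}")
```

    For harmonic evolution the Gaussian cloud merely rotates in phase space, so the variance stays near its initial value; each ensemble member is independent, which is why the scheme parallelizes trivially.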

  13. Comparison of two modalities: a novel technique, 'chromohysteroscopy', and blind endometrial sampling for the evaluation of abnormal uterine bleeding.

    PubMed

    Alay, Asli; Usta, Taner A; Ozay, Pinar; Karadugan, Ozgur; Ates, Ugur

    2014-05-01

    The objective of this study was to compare classical blind endometrial tissue sampling with hysteroscopic biopsy sampling following methylene blue dyeing in premenopausal and postmenopausal patients with abnormal uterine bleeding. A prospective case-control study was carried out in the Office Hysteroscopy Unit. Fifty-four patients with complaints of abnormal uterine bleeding were evaluated, and data from 38 patients were included in the statistical analysis. Three groups were compared, based on samples obtained through hysteroscopic biopsy before and after methylene blue dyeing and through classical blind endometrial tissue sampling. First, the uterine cavity was evaluated with office hysteroscopy. Methylene blue dye was then administered through the hysteroscopic inlet, and tissue samples were obtained from stained and non-stained areas. Blind endometrial sampling was performed in the same patients immediately after the hysteroscopy procedure. The results of hysteroscopic biopsy from methylene blue-stained and non-stained areas and of blind biopsy were compared; no statistically significant differences were found among the three (P > 0.05). We suggest that chromohysteroscopy is not superior to endometrial sampling in cases of abnormal uterine bleeding. Further studies with greater sample sizes should be performed to assess the validity of routine use of endometrial dyeing. © 2014 The Authors. Journal of Obstetrics and Gynaecology Research © 2014 Japan Society of Obstetrics and Gynecology.

  14. Quantum mean-field approximation for lattice quantum models: Truncating quantum correlations and retaining classical ones

    NASA Astrophysics Data System (ADS)

    Malpetti, Daniele; Roscilde, Tommaso

    2017-02-01

    The mean-field approximation is at the heart of our understanding of complex systems, despite its fundamental limitation of completely neglecting correlations between the elementary constituents. In a recent work [Phys. Rev. Lett. 117, 130401 (2016), 10.1103/PhysRevLett.117.130401], we have shown that in quantum many-body systems at finite temperature, two-point correlations can be formally separated into a thermal part and a quantum part and that quantum correlations are generically found to decay exponentially at finite temperature, with a characteristic, temperature-dependent quantum coherence length. The existence of these two different forms of correlation in quantum many-body systems suggests the possibility of formulating an approximation, which affects quantum correlations only, without preventing the correct description of classical fluctuations at all length scales. Focusing on lattice boson and quantum Ising models, we make use of the path-integral formulation of quantum statistical mechanics to introduce such an approximation, which we dub quantum mean-field (QMF) approach, and which can be readily generalized to a cluster form (cluster QMF or cQMF). The cQMF approximation reduces to cluster mean-field theory at T =0 , while at any finite temperature it produces a family of systematically improved, semi-classical approximations to the quantum statistical mechanics of the lattice theory at hand. Contrary to standard MF approximations, the correct nature of thermal critical phenomena is captured by any cluster size. In the two exemplary cases of the two-dimensional quantum Ising model and of two-dimensional quantum rotors, we study systematically the convergence of the cQMF approximation towards the exact result, and show that the convergence is typically linear or sublinear in the boundary-to-bulk ratio of the clusters as T →0 , while it becomes faster than linear as T grows. 
These results pave the way towards the development of semiclassical numerical approaches based on an approximate, yet systematically improved account of quantum correlations.

  15. Colours of minor bodies in the outer solar system. II. A statistical analysis revisited

    NASA Astrophysics Data System (ADS)

    Hainaut, O. R.; Boehnhardt, H.; Protopapa, S.

    2012-10-01

    We present an update of the visible and near-infrared colour database of Minor Bodies in the Outer Solar System (MBOSSes), which now includes over 2000 measurement epochs of 555 objects, extracted from over 100 articles. The list is fairly complete as of December 2011. The database is now large enough that any dataset with a large dispersion can be safely identified and rejected from the analysis. The selection method used is quite insensitive to individual outliers. Most of the rejected datasets were observed during the early days of MBOSS photometry. The individual measurements are combined in a way that avoids possible rotational artifacts. The spectral gradient over the visible range is derived from the colours, as well as the R absolute magnitude M(1,1). The average colours, absolute magnitude, and spectral gradient are listed for each object, as well as the physico-dynamical classes, using a classification adapted from Gladman and collaborators. Colour-colour diagrams, histograms, and various other plots are presented to illustrate and investigate class characteristics and trends with other parameters, whose significances are evaluated using standard statistical tests. Except for a small discrepancy in the J-H colour, the largest objects, with M(1,1) < 5, are indistinguishable from the smaller ones; the larger objects are slightly bluer than the smaller ones in J-H. Short-period comets, Plutinos and other resonant objects, hot classical disk objects, scattered disk objects, and detached disk objects have similar properties in the visible, while the cold classical disk objects and the Jupiter Trojans form two separate groups in terms of their spectral properties in the visible wavelength range. The well-known colour bimodality of Centaurs is confirmed. The hot classical disk objects with large inclinations or large orbital excitations are found to be bluer than the others, confirming a previously known result.
    Additionally, the hot classical disk objects with a smaller perihelion distance are bluer than those that do not come as close to the Sun. The bluer hot classical disk objects and resonant objects have fainter absolute magnitudes than the redder ones of the same class. Finally, we discuss possible scenarios for the origin of the colour diversity observed in MBOSSes, i.e., colouration caused by evolutionary or formation processes. The colour tables and all plots are also available on the MBOSS colour web page, which will be updated when new measurements are published. Full Tables 2 and 3 are only available at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/546/A115
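
    The spectral gradient derived from the colours can be sketched from a single colour index (my own illustration following the standard definition used in MBOSS-type studies; the effective filter wavelengths and the solar colour below are assumed approximate values):

```python
# Convert a V-R colour index into a spectral gradient S [% per 100 nm]:
# the reflectance ratio between the two filters follows from the object's
# colour minus the Sun's colour.

LAMBDA_V, LAMBDA_R = 550.0, 658.0  # assumed effective wavelengths (nm)
VR_SUN = 0.36                      # approximate solar V-R colour

def spectral_gradient(vr_obj):
    """Spectral gradient S in % per 100 nm from a V-R colour index."""
    ratio = 10 ** (0.4 * (vr_obj - VR_SUN))  # R(658)/R(550), Sun-normalized
    return (ratio - 1.0) / ((LAMBDA_R - LAMBDA_V) / 100.0) * 100.0

print(f"solar-coloured object:  S = {spectral_gradient(0.36):+.1f} %/100nm")
print(f"red TNO with V-R = 0.6: S = {spectral_gradient(0.60):+.1f} %/100nm")
```

    A neutral (solar-coloured) surface gives S = 0; the red objects discussed above have strongly positive gradients.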

  16. Vaccine Efficacy Against a New Avian Influenza (H9N2) Field Isolate from the Middle East (Serology and Challenge Studies).

    PubMed

    Gharaibeh, Saad; Amareen, Shadi

    2016-05-01

    Avian influenza subtype H9N2 is endemic in many countries in the Middle East. The reported prevalence of infection varies between countries, ranging from 28.7% in Tunisia to 71% in Jordan. Several commercial killed whole-virus vaccine products are used, either as monovalent preparations or as bivalent preparations mixed with Newcastle disease virus. Recently, we noticed that many vaccinated broiler flocks did not show a production advantage over nonvaccinated flocks in the field. A new avian influenza field virus (H9N2) was isolated from these vaccinated and infected broiler flocks in 2013. The hemagglutinin (HA) gene of this virus had 89.1% similarity to that of the classical virus used for manufacturing the classical vaccine. An inactivated autogenous vaccine was manufactured from this new field isolate to investigate its serological response and protection in specific-pathogen-free (SPF) and breeder-male chickens compared to the classical vaccine. Oropharyngeal virus shedding of vaccinated breeder-male chickens was evaluated at 3, 7, 10, and 14 days postchallenge (DPC). The percentage of chickens shedding the virus at 3 DPC was 64%, 50%, and 64% in the classical vaccine group, autogenous vaccine group, and the control challenged group, respectively. At 7 DPC, the percentage of virus shedding was 42%, 7%, and 64% in the classical vaccine group, autogenous vaccine group, and the control challenged group, respectively. At 10 DPC, only 9% of the classical vaccine group was shedding the virus, and there was no virus shedding in any of the groups at 14 DPC. There was a statistically significant difference (P < 0.05) in shedding only at 7 DPC, between the autogenous vaccine group and the other two groups. At 42 days of age (14 DPC), average body weight was 2.720, 2.745, 2.290, and 2.760 kg for the classical vaccine group, autogenous vaccine group, control challenged group, and control unchallenged group, respectively. Only the control challenged group had a significantly (P < 0.05) lower average body weight.
    In another experiment, vaccinated SPF chicks had hemagglutination inhibition (HI) geometric mean titers (GMTs), measured with the classical antigen, of 8.7 and 3.1 log2 for the classical and autogenous vaccine groups, respectively. When the autogenous antigen was used for HI, the GMTs were 6.0 and 8.1 log2, respectively. Both vaccines protected against body-weight suppression after challenge; however, the autogenous vaccine elicited a significantly higher HI titer against its homologous antigen and reduced viral shedding at 7 DPC. In conclusion, it is important to revise the vaccine virus strains used in each region to protect against and control infection by new field strains. Further field experiments are needed to demonstrate the efficacy of new vaccines under field conditions.
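
    A minimal sketch of how HI geometric mean titers are reported on a log2 scale (the flock values below are hypothetical, not the paper's raw data): HI titers come from serial two-fold dilutions, so the GMT is simply the mean of the log2-transformed titers.

```python
from math import log2

def gmt_log2(titers):
    """Geometric mean titer of reciprocal HI titers, in log2 units."""
    return sum(log2(t) for t in titers) / len(titers)

# hypothetical reciprocal HI titers for ten vaccinated birds
flock = [256, 512, 512, 1024, 256, 512, 128, 512, 256, 512]
print(f"GMT = {gmt_log2(flock):.1f} log2")  # -> 8.6 log2 for this flock
```

    Reporting in log2 units keeps the GMT on the same scale as the dilution series itself (a titer of 1:512 is 9 log2).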

  17. q-bosons and the q-analogue quantized field

    NASA Technical Reports Server (NTRS)

    Nelson, Charles A.

    1995-01-01

    The q-analogue coherent states are used to identify physical signatures for the presence of a q-analogue quantized radiation field in the q-CS classical limits where |z| is large. In this quantum-optics-like limit, the fractional uncertainties of most physical quantities (momentum, position, amplitude, phase) which characterize the quantum field are O(1); they vanish as O(1/|z|) only when q = 1. However, for the number operator N and the N-Hamiltonian for a free q-boson gas, H_N = ħω(N + 1/2), the fractional uncertainties do still approach zero. A signature of q-boson counting statistics is that (ΔN)²/⟨N⟩ → 0 as |z| → ∞. Except for its O(1) fractional uncertainty, the q-generalization of the Hermitian phase operator of Pegg and Barnett, φ_q, still exhibits normal classical behavior. The standard number-phase uncertainty relation, ΔN Δφ_q = 1/2, and the approximate commutation relation, [N, φ_q] = i, still hold for the single-mode q-analogue quantized field. So N and φ_q are almost canonically conjugate operators in the q-CS classical limit. The q-analogue coherent states minimize this uncertainty relation for moderate |z|².
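
    A quick numerical check of the counting statistics being contrasted (standard quantum optics for the ordinary q = 1 case, my own illustration): an ordinary coherent state has Poissonian number statistics with mean |z|², so (ΔN)²/⟨N⟩ = 1, while the fractional uncertainty ΔN/⟨N⟩ = 1/|z| vanishes as |z| → ∞. The q-boson signature above is the departure of (ΔN)²/⟨N⟩ from this value.

```python
from math import exp, sqrt

def poisson_moments(mean, nmax):
    """<N> and <N^2> for a Poisson distribution, truncated at nmax."""
    p = exp(-mean)  # P(0); build P(n) iteratively to avoid overflow
    m1 = m2 = 0.0
    for n in range(1, nmax):
        p *= mean / n  # P(n) = P(n-1) * mean / n
        m1 += n * p
        m2 += n * n * p
    return m1, m2

z_sq = 100.0  # |z|^2, the mean photon number
m1, m2 = poisson_moments(z_sq, 400)
var = m2 - m1 ** 2
print(f"(Delta N)^2 / <N> = {var / m1:.3f}")        # ~1 (Poissonian)
print(f"Delta N / <N>     = {sqrt(var) / m1:.3f}")  # ~1/|z| = 0.1
```
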

  18. Sequencing of bimaxillary surgery in the correction of vertical maxillary excess: retrospective study.

    PubMed

    Salmen, F S; de Oliveira, T F M; Gabrielli, M A C; Pereira Filho, V A; Real Gabrielli, M F

    2018-06-01

    The aim of this study was to evaluate the precision of bimaxillary surgery performed to correct vertical maxillary excess, when the procedure is sequenced with mandibular surgery first or maxillary surgery first. Thirty-two patients, divided into two groups, were included in this retrospective study. Group 1 comprised patients who received bimaxillary surgery following the classical sequence with repositioning of the maxilla first. Patients in group 2 received bimaxillary surgery, but the mandible was operated on first. The precision of the maxillomandibular repositioning was determined by comparison of the digital prediction and postoperative tracings superimposed on the cranial base. The data were tabulated and analyzed statistically. In this sample, both surgical sequences provided adequate clinical accuracy. The classical sequence, repositioning the maxilla first, resulted in greater accuracy for A-point and the upper incisor edge vertical position. Repositioning the mandible first allowed greater precision in the vertical position of pogonion. In conclusion, although both surgical sequences may be used, repositioning the mandible first will result in greater imprecision in relation to the predictive tracing than repositioning the maxilla first. The classical sequence resulted in greater accuracy in the vertical position of the maxilla, which is key for aesthetics. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  19. The potential effects of pH and buffering capacity on dental erosion.

    PubMed

    Owens, Barry M

    2007-01-01

    Soft drink pH (initial pH) has been shown to be a causative factor--but not necessarily the primary initiating factor--of dental erosion. The titratable acidity, or buffering capacity, has been acknowledged as playing a significant role in the etiology of these lesions. This in vitro study sought to evaluate five different soft drinks (Coca-Cola Classic, Diet Coke, Gatorade sports drink, Red Bull high-energy drink, Starbucks Frappuccino coffee drink) and tap water (control) in terms of initial pH and buffering capacity. Initial pH was measured in triplicate for the six beverages. The buffering capacity of each beverage was assessed by measuring the weight (in grams) of 0.10 M sodium hydroxide necessary for titration to pH levels of 5.0, 6.0, 7.0, and 8.3. Coca-Cola Classic produced the lowest mean pH, while Starbucks Frappuccino produced the highest pH of any of the drinks except for tap water. Based on statistical analysis using ANOVA and Fisher's post hoc tests at a P < 0.05 level of significance, Red Bull had the highest mean buffering capacity (indicating the strongest potential for erosion of enamel), followed by Gatorade, Coca-Cola Classic, Diet Coke, and Starbucks Frappuccino.
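
    The one-way ANOVA used to compare mean buffering capacities can be sketched as follows (the triplicate titration masses are hypothetical illustration values, not the study's data):

```python
# One-way ANOVA F statistic for k groups of replicate measurements --
# here, grams of 0.10 M NaOH needed to raise each beverage to pH 5.0.

def one_way_anova(groups):
    """Return the F statistic for a list of groups of replicates."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

titration_g = {  # hypothetical NaOH masses (g) to reach pH 5.0, in triplicate
    "Red Bull":          [9.1, 9.3, 9.2],
    "Gatorade":          [5.9, 6.1, 6.0],
    "Coca-Cola Classic": [3.0, 3.1, 2.9],
}
f_stat = one_way_anova(list(titration_g.values()))
print(f"F = {f_stat:.1f}")  # a large F means the group means differ
```

    A significant F is then followed by pairwise post hoc tests (Fisher's LSD in the study) to rank the beverages.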

  20. Least-Squares Regression and Spectral Residual Augmented Classical Least-Squares Chemometric Models for Stability-Indicating Analysis of Agomelatine and Its Degradation Products: A Comparative Study.

    PubMed

    Naguib, Ibrahim A; Abdelrahman, Maha M; El Ghobashy, Mohamed R; Ali, Nesma A

    2016-01-01

    Two accurate, sensitive, and selective stability-indicating methods are developed and validated for the simultaneous quantitative determination of agomelatine (AGM) and its forced degradation products (Deg I and Deg II), whether in pure form or in pharmaceutical formulations. Partial least-squares regression (PLSR) and spectral residual augmented classical least-squares (SRACLS) are the two chemometric models subjected to a comparative study, handling UV spectral data in the range 215-350 nm. For proper analysis, a three-factor, four-level experimental design was established, resulting in a training set of 16 mixtures containing different ratios of the interfering species. An independent test set of eight mixtures was used to validate the prediction ability of the suggested models. The results indicate the ability of the multivariate calibration models to analyze AGM, Deg I, and Deg II with high selectivity and accuracy. The analysis results for the pharmaceutical formulations were statistically compared to those of a reference HPLC method, with no significant differences observed in accuracy and precision. The SRACLS model gives results comparable to the PLSR model; however, it retains the qualitative spectral information of the classical least-squares algorithm for the analyzed components.
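
    A minimal sketch of the classical least-squares step underlying both models (synthetic spectra, my own illustration): a mixture spectrum is modeled as a concentration-weighted sum of pure-component spectra, and the concentrations are recovered by solving the normal equations. PLSR and SRACLS add factor extraction and residual augmentation, respectively, on top of this idea.

```python
# Two-component classical least squares on synthetic absorbance spectra.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cls_two_component(mix, k1, k2):
    """Solve [c1, c2] minimizing ||mix - c1*k1 - c2*k2|| via 2x2 normal eqs."""
    a11, a12, a22 = dot(k1, k1), dot(k1, k2), dot(k2, k2)
    b1, b2 = dot(k1, mix), dot(k2, mix)
    det = a11 * a22 - a12 * a12
    return (b1 * a22 - b2 * a12) / det, (b2 * a11 - b1 * a12) / det

# synthetic pure spectra (absorbance at a few wavelengths) and a 0.7/0.3 mix;
# component names are illustrative stand-ins for AGM and a degradation product
k_agm = [0.10, 0.40, 0.80, 0.30]
k_deg = [0.50, 0.20, 0.10, 0.60]
mixture = [0.7 * a + 0.3 * b for a, b in zip(k_agm, k_deg)]

c1, c2 = cls_two_component(mixture, k_agm, k_deg)
print(f"recovered concentrations: {c1:.2f}, {c2:.2f}")  # -> 0.70, 0.30
```

    With noiseless, linearly independent pure spectra the recovery is exact; real spectra add noise and overlap, which is where the multivariate calibration machinery earns its keep.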

Top