Sample records for classical statistical theory

  1. The Development of Bayesian Theory and Its Applications in Business and Bioinformatics

    NASA Astrophysics Data System (ADS)

    Zhang, Yifei

    2018-03-01

    Bayesian theory originated in an essay of the British mathematician Thomas Bayes, published in 1763, and after its development in the 20th century, Bayesian statistics has come to play a significant part in statistical study across all fields. Owing to recent breakthroughs in high-dimensional integration, Bayesian statistics has been improved and refined, and it can now be used to solve problems that classical statistics failed to solve. This paper summarizes the history, concepts and applications of Bayesian statistics in five parts: the history of Bayesian statistics, the weaknesses of classical statistics, Bayesian theory, its development, and its applications. The first two parts compare Bayesian statistics and classical statistics at a macroscopic level, while the last three parts focus on Bayesian theory specifically, from introducing particular Bayesian concepts to tracing their development and, finally, their applications.
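
    The pivot of the comparison drawn here is Bayes' theorem itself; as a reminder (standard statement, not taken from the paper), the posterior over a parameter θ given data x is

    ```latex
    p(\theta \mid x) \;=\; \frac{p(x \mid \theta)\, p(\theta)}{\int p(x \mid \theta')\, p(\theta')\, \mathrm{d}\theta'}
    ```

    The "breakthroughs in high-dimensional integration" the abstract refers to concern the denominator, which is generally intractable in closed form and is nowadays handled numerically, e.g. by Markov chain Monte Carlo.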

  2. Statistical mechanics in the context of special relativity. II.

    PubMed

    Kaniadakis, G

    2005-09-01

    The special relativity laws emerge as one-parameter (light speed) generalizations of the corresponding laws of classical physics. These generalizations, imposed by the Lorentz transformations, affect both the definition of the various physical observables (e.g., momentum, energy, etc.), as well as the mathematical apparatus of the theory. Here, following the general lines of [Phys. Rev. E 66, 056125 (2002)], we show that the Lorentz transformations impose also a proper one-parameter generalization of the classical Boltzmann-Gibbs-Shannon entropy. The obtained relativistic entropy permits us to construct a coherent and self-consistent relativistic statistical theory, preserving the main features of the ordinary statistical theory, which is recovered in the classical limit. The predicted distribution function is a one-parameter continuous deformation of the classical Maxwell-Boltzmann distribution and has a simple analytic form, showing power law tails in accordance with the experimental evidence. Furthermore, this statistical mechanics can be obtained as the stationary case of a generalized kinetic theory governed by an evolution equation obeying the H theorem and reproducing the Boltzmann equation of the ordinary kinetics in the classical limit.
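
    For orientation, the one-parameter deformation discussed here is usually written in terms of the κ-exponential and the associated κ-entropy (our summary of the standard forms in this literature, not necessarily the paper's exact notation):

    ```latex
    \exp_\kappa(x) = \left(\sqrt{1+\kappa^2 x^2} + \kappa x\right)^{1/\kappa},
    \qquad
    S_\kappa = -k_B \sum_i \frac{p_i^{1+\kappa} - p_i^{1-\kappa}}{2\kappa},
    ```

    both of which reduce to the ordinary exponential and the Boltzmann-Gibbs-Shannon entropy as κ → 0, while exp_κ(−x) decays as a power law x^(−1/κ) at large x, matching the power-law tails mentioned in the abstract.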

  3. Lenard-Balescu calculations and classical molecular dynamics simulations of electrical and thermal conductivities of hydrogen plasmas

    DOE PAGES

    Whitley, Heather D.; Scullard, Christian R.; Benedict, Lorin X.; ...

    2014-12-04

    Here, we present a discussion of kinetic theory treatments of linear electrical and thermal transport in hydrogen plasmas, for a regime of interest to inertial confinement fusion applications. In order to assess the accuracy of one of the more involved of these approaches, classical Lenard-Balescu theory, we perform classical molecular dynamics simulations of hydrogen plasmas using 2-body quantum statistical potentials and compute both electrical and thermal conductivity from our particle trajectories using the Kubo approach. Our classical Lenard-Balescu results employing the identical statistical potentials agree well with the simulations.
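
    A minimal sketch of the Kubo (Green-Kubo) route from particle trajectories to a transport coefficient, assuming a precomputed total charge-current time series J(t) = Σ q_i v_i(t); array shapes, names, and unit conventions here are illustrative, not taken from the paper:

    ```python
    import numpy as np

    def green_kubo_conductivity(J, dt, volume, kT):
        """Electrical conductivity from the Green-Kubo relation
        sigma = 1/(3 V kT) * integral_0^inf <J(0) . J(t)> dt.
        J: (n_steps, 3) total charge current along the trajectory."""
        n = len(J)
        nlag = n // 2  # estimate the autocorrelation over half the run
        acf = np.array([np.mean(np.sum(J[: n - lag] * J[lag:], axis=1))
                        for lag in range(nlag)])
        # trapezoidal integration of the ACF (the Green-Kubo integral)
        integral = dt * (0.5 * acf[0] + acf[1:-1].sum() + 0.5 * acf[-1])
        return integral / (3.0 * volume * kT)
    ```

    The thermal conductivity follows the same pattern with the heat current in place of J; in practice the integral is truncated where the autocorrelation decays into noise.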

  4. Quantum mechanics as classical statistical mechanics with an ontic extension and an epistemic restriction.

    PubMed

    Budiyono, Agung; Rohrlich, Daniel

    2017-11-03

    Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.

  5. JOURNAL SCOPE GUIDELINES: Paper classification scheme

    NASA Astrophysics Data System (ADS)

    2005-06-01

    This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org).

    1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory.

    2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models.

    3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods.

    4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics.

    5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials.

    6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas.

  6. Relationships among Classical Test Theory and Item Response Theory Frameworks via Factor Analytic Models

    ERIC Educational Resources Information Center

    Kohli, Nidhi; Koran, Jennifer; Henn, Lisa

    2015-01-01

    There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…
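
    A side-by-side statement of the two frameworks may help. In CTT an observed score decomposes additively, while in IRT the item parameters are modeled separately from the person parameter; the two-parameter logistic (2PL) model below is one standard IRT form, not necessarily the one treated in the article:

    ```latex
    X = T + E \quad\text{(CTT)},
    \qquad
    P_i(\theta) = \frac{1}{1 + e^{-a_i(\theta - b_i)}} \quad\text{(2PL IRT)}
    ```

    In CTT, item difficulty (the proportion correct) depends on the ability of the sample at hand; in the 2PL, discrimination a_i and difficulty b_i are, in principle, invariant across samples once the θ scale is fixed, which is the sense in which IRT is described as theoretically superior.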

  7. Quantum correlations and dynamics from classical random fields valued in complex Hilbert spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khrennikov, Andrei

    2010-08-15

    One of the crucial differences between mathematical models of classical and quantum mechanics (QM) is the use of the tensor product of the state spaces of subsystems as the state space of the corresponding composite system. (To describe an ensemble of classical composite systems, one uses random variables taking values in the Cartesian product of the state spaces of subsystems.) We show that, nevertheless, it is possible to establish a natural correspondence between the classical and the quantum probabilistic descriptions of composite systems. Quantum averages for composite systems (including entangled ones) can be represented as averages with respect to classical random fields. It is essentially what Albert Einstein dreamed of. QM is represented as classical statistical mechanics with infinite-dimensional phase space. While the mathematical construction is completely rigorous, its physical interpretation is a complicated problem. We present the basic physical interpretation of prequantum classical statistical field theory in Sec. II. However, this is only the first step toward a real physical theory.

  8. Quantum and classical behavior in interacting bosonic systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertzberg, Mark P.

    It is understood that in free bosonic theories, the classical field theory accurately describes the full quantum theory when the occupancy numbers of systems are very large. However, the situation is less understood in interacting theories, especially on time scales longer than the dynamical relaxation time. Recently there have been claims that the quantum theory deviates spectacularly from the classical theory on this time scale, even if the occupancy numbers are extremely large. Furthermore, it is claimed that the quantum theory quickly thermalizes while the classical theory does not. The evidence for these claims comes from noticing a spectacular difference in the time evolution of expectation values of quantum operators compared to the classical micro-state evolution. If true, this would have dramatic consequences for many important phenomena, including laboratory studies of interacting BECs, dark matter axions, preheating after inflation, etc. In this work we critically examine these claims. We show that in fact the classical theory can describe the quantum behavior in the high occupancy regime, even when interactions are large. The connection is that the expectation values of quantum operators in a single quantum micro-state are approximated by a corresponding classical ensemble average over many classical micro-states. Furthermore, by the ergodic theorem, a classical ensemble average of local fields with statistical translation invariance is the spatial average of a single micro-state. So the correlation functions of the quantum and classical field theories of a single micro-state approximately agree at high occupancy, even in interacting systems. Furthermore, both quantum and classical field theories can thermalize, when appropriate coarse graining is introduced, with the classical case requiring a cutoff on low occupancy UV modes. We discuss applications of our results.

  9. On Probability Domains IV

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerate (pure) state to a non-degenerate one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuinely fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  10. Tsallis non-extensive statistics and solar wind plasma complexity

    NASA Astrophysics Data System (ADS)

    Pavlos, G. P.; Iliopoulos, A. C.; Zastenker, G. N.; Zelenyi, L. M.; Karakatsanis, L. P.; Riazantseva, M. O.; Xenakis, M. N.; Pavlos, E. G.

    2015-03-01

    This article presents novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event, which took place on 26 September 2011. Solar wind plasma is a typical case of a stochastic spatiotemporal distribution of physical state variables such as force fields (B, E) and matter fields (particle and current densities or bulk plasma distributions). This study clearly shows the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level. It also underlines the inadequacy of classical magnetohydrodynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), to explain the complexity of solar wind dynamics, since these theories assume smooth and differentiable spatio-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian, non-extensive statistics with heavy-tailed probability distribution functions, which are related to the q-extension of the CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992).
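
    For reference, the non-extensive entropy and the heavy-tailed distributions referred to here take the standard Tsallis forms (our summary, not the article's notation):

    ```latex
    S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
    \qquad
    p(x) \propto \left[1 - (1-q)\,\beta x^2\right]^{\frac{1}{1-q}},
    ```

    which recover the Boltzmann-Gibbs entropy and the Gaussian, respectively, in the limit q → 1; for q > 1 the distribution has power-law tails, the signature the authors extract from solar wind time series.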

  11. ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prigogine, I.; Balescu, R.; Henin, F.

    1960-12-01

    Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)

  12. Phase-Sensitive Coherence and the Classical-Quantum Boundary in Ghost Imaging

    NASA Technical Reports Server (NTRS)

    Erkmen, Baris I.; Hardy, Nicholas D.; Venkatraman, Dheera; Wong, Franco N. C.; Shapiro, Jeffrey H.

    2011-01-01

    The theory of partial coherence has a long and storied history in classical statistical optics. The vast majority of this work addresses fields that are statistically stationary in time, hence their complex envelopes only have phase-insensitive correlations. The quantum optics of squeezed-state generation, however, depends on nonlinear interactions producing baseband field operators with phase-insensitive and phase-sensitive correlations. Utilizing quantum light to enhance imaging has been a topic of considerable current interest, much of it involving biphotons, i.e., streams of entangled-photon pairs. Biphotons have been employed for quantum versions of optical coherence tomography, ghost imaging, holography, and lithography. However, their seemingly quantum features have been mimicked with classical-state light, raising the question of where the classical-quantum boundary lies. We have shown, for the case of Gaussian-state light, that this boundary is intimately connected to the theory of phase-sensitive partial coherence. Here we present that theory, contrasting it with the familiar case of phase-insensitive partial coherence, and use it to elucidate the classical-quantum boundary of ghost imaging. We show, both theoretically and experimentally, that classical phase-sensitive light produces ghost images that most closely mimic those obtained with biphotons, and we derive the spatial resolution, image contrast, and signal-to-noise ratio of a standoff-sensing ghost imager, taking into account target-induced speckle.
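
    The distinction driving the whole analysis is between the two second-order correlation functions of a baseband field E(t); in the standard notation of statistical optics (not copied from the paper),

    ```latex
    \Gamma^{(\mathrm{pi})}(t_1, t_2) = \langle E^*(t_1)\, E(t_2) \rangle,
    \qquad
    \Gamma^{(\mathrm{ps})}(t_1, t_2) = \langle E(t_1)\, E(t_2) \rangle .
    ```

    For fields that are statistically stationary in time the phase-sensitive correlation vanishes, which is why classical coherence theory could largely ignore it; squeezed-state and biphoton sources make it nonzero.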

  13. [The new methods in gerontology for life expectancy prediction of the indigenous population of Yugra].

    PubMed

    Gavrilenko, T V; Es'kov, V M; Khadartsev, A A; Khimikova, O I; Sokolova, A A

    2014-01-01

    The behavior of the state vector of the human cardiovascular system in different age groups was investigated using methods from the theory of chaos and self-organization and methods of classical statistics. Observations were made on the indigenous people of the north of the Russian Federation. Using methods of the theory of chaos and self-organization, differences in the parameters of quasi-attractors of the state vector of the cardiovascular system were shown for these populations. A comparison with the results obtained by classical statistics was made.

  14. Quantum theory of multiscale coarse-graining.

    PubMed

    Han, Yining; Jin, Jaehyeok; Wagner, Jacob W; Voth, Gregory A

    2018-03-14

    Coarse-grained (CG) models serve as a powerful tool to simulate molecular systems at much longer temporal and spatial scales. Previously, CG models and methods have been built upon classical statistical mechanics. The present paper develops a theory and numerical methodology for coarse-graining in quantum statistical mechanics, by generalizing the multiscale coarse-graining (MS-CG) method to quantum Boltzmann statistics. A rigorous derivation of the sufficient thermodynamic consistency condition is first presented via imaginary time Feynman path integrals. It identifies the optimal choice of CG action functional and effective quantum CG (qCG) force field to generate a quantum MS-CG (qMS-CG) description of the equilibrium system that is consistent with the quantum fine-grained model projected onto the CG variables. A variational principle then provides a class of algorithms for optimally approximating the qMS-CG force fields. Specifically, a variational method based on force matching, which was also adopted in the classical MS-CG theory, is generalized to quantum Boltzmann statistics. The qMS-CG numerical algorithms and practical issues in implementing this variational minimization procedure are also discussed. Then, two numerical examples are presented to demonstrate the method. Finally, as an alternative strategy, a quasi-classical approximation for the thermal density matrix expressed in the CG variables is derived. This approach provides an interesting physical picture for coarse-graining in quantum Boltzmann statistical mechanics in which the consistency with the quantum particle delocalization is obviously manifest, and it opens up an avenue for using path integral centroid-based effective classical force fields in a coarse-graining methodology.
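
    The classical force-matching step that the paper generalizes to quantum Boltzmann statistics is, at its core, a linear least-squares problem when the CG force field is expanded in a fixed basis. A minimal sketch of that classical step (illustrative names and shapes; the quantum generalization in the paper replaces the reference forces with path-integral quantities):

    ```python
    import numpy as np

    def force_match(design, f_ref):
        """Classical MS-CG force matching: find basis coefficients c that
        minimize || design @ c - f_ref ||^2, where `design` holds the CG
        force basis functions evaluated at sampled CG configurations and
        `f_ref` holds the mapped fine-grained reference forces."""
        c, residual, rank, _ = np.linalg.lstsq(design, f_ref, rcond=None)
        return c
    ```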

  15. Much Polyphony but Little Harmony: Otto Sackur's Groping for a Quantum Theory of Gases

    NASA Astrophysics Data System (ADS)

    Badino, Massimiliano; Friedrich, Bretislav

    2013-09-01

    The endeavor of Otto Sackur (1880-1914) was driven, on the one hand, by his interest in Nernst's heat theorem, statistical mechanics, and the problem of chemical equilibrium and, on the other hand, by his goal to shed light on classical mechanics from the quantum vantage point. Inspired by the interplay between classical physics and quantum theory, Sackur chanced to expound his personal take on the role of the quantum in the changing landscape of physics in the turbulent 1910s. We tell the story of this enthusiastic practitioner of the old quantum theory and early contributor to quantum statistical mechanics, whose scientific ontogenesis provides a telling clue about the phylogeny of his contemporaries.

  16. Chance, determinism and the classical theory of probability.

    PubMed

    Vasudevan, Anubav

    2018-02-01

    This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Strong correlations between the exponent α and the particle number for a Renyi monoatomic gas in Gibbs' statistical mechanics.

    PubMed

    Plastino, A; Rocca, M C

    2017-06-01

    Appealing to the 1902 Gibbs formalism for classical statistical mechanics (SM), the first axiomatic SM theory that successfully explained equilibrium thermodynamics, we show that already at the classical level there is a strong correlation between Renyi's exponent α and the number of particles for very simple systems. No reference to heat baths is needed for such a purpose.
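
    For reference, the Renyi entropy with exponent α (standard definition) is

    ```latex
    S_\alpha = \frac{1}{1-\alpha} \ln \sum_i p_i^{\alpha},
    ```

    which reduces to the Gibbs-Shannon entropy as α → 1; the paper's claim is that, already in Gibbs' classical formalism, consistency ties the admissible α to the particle number.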

  18. Comparing the Effectiveness of SPSS and EduG Using Different Designs for Generalizability Theory

    ERIC Educational Resources Information Center

    Teker, Gulsen Tasdelen; Guler, Nese; Uyanik, Gulden Kaya

    2015-01-01

    Generalizability theory (G theory) provides a broad conceptual framework for social sciences such as psychology and education, and a comprehensive construct for numerous measurement events by using analysis of variance, a strong statistical method. G theory, as an extension of both classical test theory and analysis of variance, is a model which…

  19. A classical density-functional theory for describing water interfaces.

    PubMed

    Hughes, Jessica; Krebs, Eric J; Roundy, David

    2013-01-14

    We develop a classical density functional for water which combines the White Bear fundamental-measure theory (FMT) functional for the hard sphere fluid with attractive interactions based on the statistical associating fluid theory variable range (SAFT-VR). This functional reproduces the properties of water at both long and short length scales over a wide range of temperatures and is computationally efficient, comparable to the cost of FMT itself. We demonstrate our functional by applying it to systems composed of two hard rods, four hard rods arranged in a square, and hard spheres in water.

  20. Properties of the Boltzmann equation in the classical approximation

    DOE PAGES

    Epelbaum, Thomas; Gelis, François; Tanji, Naoto; ...

    2014-12-30

    We examine the Boltzmann equation with elastic point-like scalar interactions in two different versions of the classical approximation. Since numerically solving the Boltzmann equation with the unapproximated collision term poses no problem, this setup allows one to study the effect of the ultraviolet cutoff in these approximations. This cutoff dependence in the classical approximations of the Boltzmann equation is closely related to the non-renormalizability of the classical statistical approximation of the underlying quantum field theory. The kinetic theory setup that we consider here allows one to study the dependence on the ultraviolet cutoff in a much simpler way, since one also has access to the non-approximated result for comparison.
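
    The structure behind the cutoff dependence can be seen in the 2 ↔ 2 collision term for Bose statistics, whose gain-minus-loss form is, schematically (suppressing the momentum integrals and the matrix element),

    ```latex
    C[f] \;\propto\; f_1 f_2 (1+f_3)(1+f_4) - f_3 f_4 (1+f_1)(1+f_2)
    \;=\; \underbrace{f_1 f_2 (f_3+f_4) - f_3 f_4 (f_1+f_2)}_{\text{cubic in } f}
    \;+\; \underbrace{f_1 f_2 - f_3 f_4}_{\text{quadratic in } f} .
    ```

    As we understand the approximations studied in this line of work, the classical versions retain the cubic piece (possibly after shifting f → f + 1/2 to mimic vacuum fluctuations); the dropped or modified quadratic piece is what makes the result sensitive to the ultraviolet cutoff.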

  1. Robust Statistics: What They Are, and Why They Are So Important

    ERIC Educational Resources Information Center

    Corlu, Sencer M.

    2009-01-01

    The problem with "classical" statistics all invoking the mean is that these estimates are notoriously influenced by atypical scores (outliers), partly because the mean itself is differentially influenced by outliers. In theory, "modern" statistics may generate more replicable characterizations of data, because at least in some…
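
    A toy illustration of the point, with hypothetical scores (not from the article):

    ```python
    import numpy as np
    from scipy import stats

    scores = np.array([12, 13, 13, 14, 15, 15, 16, 98])  # one atypical score
    print(np.mean(scores))               # 24.5  -- pulled far from the bulk
    print(np.median(scores))             # 14.5  -- barely affected
    print(stats.trim_mean(scores, 0.2))  # ~14.33 -- a "modern" robust estimate
    ```

    Any classical statistic built on the mean (t tests, ANOVA, Pearson r) inherits this sensitivity to outliers.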

  2. Epistemic View of Quantum States and Communication Complexity of Quantum Channels

    NASA Astrophysics Data System (ADS)

    Montina, Alberto

    2012-09-01

    The communication complexity of a quantum channel is the minimal amount of classical communication required for classically simulating a process of state preparation, transmission through the channel and subsequent measurement. It establishes a limit on the power of quantum communication in terms of classical resources. We show that classical simulations employing a finite amount of communication can be derived from a special class of hidden variable theories where quantum states represent statistical knowledge about the classical state and not an element of reality. This special class has attracted strong interest very recently. The communication cost of each derived simulation is given by the mutual information between the quantum state and the classical state of the parent hidden variable theory. Finally, we find that the communication complexity for single qubits is smaller than 1.28 bits. The previously known upper bound was 1.85 bits.

  3. Microgravity experiments on vibrated granular gases in a dilute regime: non-classical statistics

    NASA Astrophysics Data System (ADS)

    Leconte, M.; Garrabos, Y.; Falcon, E.; Lecoutre-Chabot, C.; Palencia, F.; Évesque, P.; Beysens, D.

    2006-07-01

    We report on an experimental study of a dilute gas of steel spheres colliding inelastically and excited by a piston performing sinusoidal vibration, in low gravity. Using improved experimental apparatus, here we present some results concerning the collision statistics of particles on a wall of the container. We also propose a simple model where the non-classical statistics obtained from our data are attributed to the boundary condition playing the role of a 'velostat' instead of a thermostat. The significant differences from the kinetic theory of ordinary gases are related to the inelasticity of collisions.

  4. The development of ensemble theory. A new glimpse at the history of statistical mechanics

    NASA Astrophysics Data System (ADS)

    Inaba, Hajime

    2015-12-01

    This paper investigates the history of statistical mechanics from the viewpoint of the development of the ensemble theory from 1871 to 1902. In 1871, Ludwig Boltzmann introduced a prototype model of an ensemble that represents a polyatomic gas. In 1879, James Clerk Maxwell defined an ensemble as copies of systems of the same energy. Inspired by H.W. Watson, he called his approach "statistical". Boltzmann and Maxwell regarded the ensemble theory as a much more general approach than the kinetic theory. In the 1880s, influenced by Hermann von Helmholtz, Boltzmann made use of ensembles to establish thermodynamic relations. In Elementary Principles in Statistical Mechanics of 1902, Josiah Willard Gibbs tried to get his ensemble theory to mirror thermodynamics, including thermodynamic operations in its scope. Thermodynamics played the role of a "blind guide". His theory of ensembles can be characterized as more mathematically oriented than Einstein's theory proposed in the same year. Mechanical, empirical, and statistical approaches to foundations of statistical mechanics are presented. Although it was formulated in classical terms, the ensemble theory provided an infrastructure still valuable in quantum statistics because of its generality.

  5. The modification of generalized uncertainty principle applied in the detection technique of femtosecond laser

    NASA Astrophysics Data System (ADS)

    Li, Ziyi

    2017-12-01

    Generalized uncertainty principle (GUP), also known as the generalized uncertainty relationship, is the modified form of the classical Heisenberg's uncertainty principle in special cases. When quantum gravity theories such as string theory are applied, theoretical results suggest that there should be a "minimum length of observation", which is about the size of the Planck scale (10^-35 m). Taking this basic scale of existence into account, we need to fix a new common form of Heisenberg's uncertainty principle in the thermodynamic system and make effective corrections to statistical physics questions concerning the quantum density of states. Especially for conditions of high temperature and high energy levels, generalized uncertainty calculations have a disruptive impact on classical statistical physics theories, but the present theory of the femtosecond laser is still built on the classical Heisenberg's uncertainty principle. In order to improve the detection accuracy and temporal resolution of the femtosecond laser, we applied the modified form of the generalized uncertainty principle to the wavelength, energy and pulse time of the femtosecond laser in this work. We also designed three typical systems, from micro to macro size, to assess the feasibility of our theoretical model and method, respectively under chemical solution, crystal lattice and nuclear fission reactor conditions.
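
    The modified relation referred to here is usually written in the following string-theory-motivated form (one common parametrization in the literature; the paper's exact form may differ):

    ```latex
    \Delta x\, \Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\, (\Delta p)^2\right),
    \qquad
    \Delta x_{\min} = \hbar\sqrt{\beta},
    ```

    so that no measurement can localize a system below Δx_min, the "minimum length of observation" of order the Planck length.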

  6. A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sinitskiy, Anton V.; Voth, Gregory A., E-mail: gavoth@uchicago.edu

    2015-09-07

    Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman's imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representations of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments.
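
    The isomorphism invoked here is the standard discretized imaginary-time path integral, in which one quantum particle in a potential V maps onto a ring of P harmonically coupled classical beads (textbook form in one dimension, not the paper's two-quasiparticle reduction):

    ```latex
    Z = \lim_{P\to\infty} \left(\frac{mP}{2\pi\beta\hbar^2}\right)^{P/2}
    \int dx_1 \cdots dx_P\,
    \exp\!\left[-\beta \sum_{i=1}^{P}\left(\frac{mP}{2\beta^2\hbar^2}\,(x_{i+1}-x_i)^2 + \frac{V(x_i)}{P}\right)\right],
    \qquad x_{P+1} = x_1 .
    ```

    The paper's contribution is to coarse-grain this P-bead representation down to two coupled quasiparticles, one of which is the centroid (1/P) Σ_i x_i.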

  7. A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals.

    PubMed

    Sinitskiy, Anton V; Voth, Gregory A

    2015-09-07

    Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman's imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representations of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments.

  8. Urns and Chameleons: two metaphors for two different types of measurements

    NASA Astrophysics Data System (ADS)

    Accardi, Luigi

    2013-09-01

    The awareness of the physical possibility of models of space alternative to the Euclidean one began to emerge towards the end of the 19th century. At the end of the 20th century a similar awareness emerged concerning the physical possibility of models of the laws of chance alternative to the classical probabilistic model (the Kolmogorov model). In geometry, the mathematical construction of several non-Euclidean models of space preceded their application in physics, which came with the theory of relativity, by about a century. In physics the opposite situation took place. In fact, while the first examples of non-Kolmogorov probabilistic models emerged in quantum physics approximately one century ago, at the beginning of the 1900s, the awareness that this new mathematical formalism reflected a new mathematical model of the laws of chance had to wait until the early 1980s. In this long interval the classical and the new probabilistic models were both used in the description and interpretation of quantum phenomena, and they interfered negatively with each other because of the absence (for many decades) of a mathematical theory that clearly delimited their respective domains of application. The result of this interference was the emergence of the so-called "paradoxes of quantum theory". For several decades there have been many different attempts to solve these paradoxes, giving rise to what K. Popper baptized "the great quantum muddle": a debate which has been at the core of the philosophy of science for more than 50 years. These attempts, however, have led to contradictions between the two fundamental theories of contemporary physics: quantum theory and the theory of relativity. Quantum probability locates the reason for the emergence of non-Kolmogorov models, and therefore of the so-called paradoxes of quantum theory, in the difference between passive measurements that "read pre-existent properties" (the urn metaphor) and measurements that read "a response to an interaction" (the chameleon metaphor). The non-trivial point is that one can prove that, while the urn scheme cannot lead to empirical data outside of classical probability, response-based measurements can give rise to non-classical statistics. The talk will include entirely classical examples of non-classical statistics and potential applications to economic, sociological or biomedical phenomena.

  9. Entropy in sound and vibration: towards a new paradigm.

    PubMed

    Le Bot, A

    2017-01-01

    This paper describes a discussion on the method and the status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of exchange of energy analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart.
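
    The Clausius-like exchange law at the heart of SEA states that the power flowing between two coupled subsystems is proportional to the difference of their modal energies (the standard SEA form; notation ours, not the paper's):

    ```latex
    P_{ij} = \omega\, \eta_{ij}\, n_i \left(\frac{E_i}{n_i} - \frac{E_j}{n_j}\right),
    ```

    where ω is the band center frequency, η_ij the coupling loss factor, n_i the modal density and E_i the subsystem energy; energy thus flows from the vibrationally "hotter" subsystem to the "colder" one.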

  10. Turbulent Chemically Reacting Flows According to a Kinetic Theory. Ph.D. Thesis; [statistical analysis/gas flow]

    NASA Technical Reports Server (NTRS)

    Hong, Z. C.

    1975-01-01

    A review of various methods of calculating turbulent chemically reacting flow such as the Green Function, Navier-Stokes equation, and others is presented. Nonequilibrium degrees of freedom were employed to study the mixing behavior of a multiscale turbulence field. Classical and modern theories are discussed.

  11. Assessing the Accuracy and Consistency of Language Proficiency Classification under Competing Measurement Models

    ERIC Educational Resources Information Center

    Zhang, Bo

    2010-01-01

    This article investigates how measurement models and statistical procedures can be applied to estimate the accuracy of proficiency classification in language testing. The paper starts with a concise introduction of four measurement models: the classical test theory (CTT) model, the dichotomous item response theory (IRT) model, the testlet response…

  12. Some Research Orientations for Research in Social Studies Education. [Draft].

    ERIC Educational Resources Information Center

    van Manen, M. J. Max

    The need for a different conception of research from the classical statistical approach to theory development in social studies teaching is addressed in this paper. In a schema of dominant orientations of social theory, the outstanding epistemological features of the three main schools of contemporary metascience are outlined. Three systems of…

  13. Modeling Conditional Probabilities in Complex Educational Assessments. CSE Technical Report.

    ERIC Educational Resources Information Center

    Mislevy, Robert J.; Almond, Russell; Dibello, Lou; Jenkins, Frank; Steinberg, Linda; Yan, Duanli; Senturk, Deniz

    An active area in psychometric research is coordinated task design and statistical analysis built around cognitive models. Compared with classical test theory and item response theory, there is often less information from observed data about the measurement-model parameters. On the other hand, there is more information from the grounding…

  14. Academician Nikolai Nikolaevich Bogolyubov (for the 100th anniversary of his birth)

    NASA Astrophysics Data System (ADS)

    Martynyuk, A. A.; Mishchenko, E. F.; Samoilenko, A. M.; Sukhanov, A. D.

    2009-07-01

    This paper is dedicated to the memory of N. N. Bogolyubov in recognition of his towering stature in nonlinear mechanics and theoretical physics, his remarkable many-sided genius, and the originality and depth of his contribution to the world's science. The paper briefly describes Bogolyubov's achievements in nonlinear mechanics, classical statistical physics, the theory of superconductivity, quantum field theory, and strong interaction theory.

  15. Statistical Optics

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson, The Statistical Analysis of Time Series; T. S. Arthanari & Yadolah Dodge, Mathematical Programming in Statistics; Emil Artin, Geometric Algebra; Norman T. J. Bailey, The Elements of Stochastic Processes with Applications to the Natural Sciences; Robert G. Bartle, The Elements of Integration and Lebesgue Measure; George E. P. Box & Norman R. Draper, Evolutionary Operation: A Statistical Method for Process Improvement; George E. P. Box & George C. Tiao, Bayesian Inference in Statistical Analysis; R. W. Carter, Finite Groups of Lie Type: Conjugacy Classes and Complex Characters; R. W. Carter, Simple Groups of Lie Type; William G. Cochran & Gertrude M. Cox, Experimental Designs, Second Edition; Richard Courant, Differential and Integral Calculus, Volume I; Richard Courant, Differential and Integral Calculus, Volume II; Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume I; Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume II; D. R. Cox, Planning of Experiments; Harold S. M. Coxeter, Introduction to Geometry, Second Edition; Charles W. Curtis & Irving Reiner, Representation Theory of Finite Groups and Associative Algebras; Charles W. Curtis & Irving Reiner, Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I; Charles W. Curtis & Irving Reiner, Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II; Cuthbert Daniel, Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition; Bruno de Finetti, Theory of Probability, Volume 1; Bruno de Finetti, Theory of Probability, Volume 2; W. Edwards Deming, Sample Design in Business Research

  16. Statistical mechanical foundation of the peridynamic nonlocal continuum theory: energy and momentum conservation laws.

    PubMed

    Lehoucq, R B; Sears, Mark P

    2011-09-01

    The purpose of this paper is to derive the energy and momentum conservation laws of the peridynamic nonlocal continuum theory using the principles of classical statistical mechanics. The peridynamic laws allow the consideration of discontinuous motion, or deformation, by relying on integral operators. These operators sum forces and power expenditures separated by a finite distance and so represent nonlocal interaction. The integral operators replace the differential divergence operators conventionally used, thereby obviating special treatment at points of discontinuity. The derivation presented employs a general multibody interatomic potential, avoiding the standard assumption of a pairwise decomposition. The integral operators are also expressed in terms of a stress tensor and heat flux vector under the assumption that these fields are differentiable, demonstrating that the classical continuum energy and momentum conservation laws are consequences of the more general peridynamic laws. An important conclusion is that nonlocal interaction is intrinsic to continuum conservation laws when derived using the principles of statistical mechanics.
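
    For context, the peridynamic equation of motion whose conservation structure is being derived replaces the divergence of the stress with an integral over a finite horizon H_x (the standard form due to Silling):

    ```latex
    \rho(\mathbf{x})\, \ddot{\mathbf{u}}(\mathbf{x},t)
    = \int_{H_{\mathbf{x}}} \mathbf{f}\big(\mathbf{u}(\mathbf{x}',t) - \mathbf{u}(\mathbf{x},t),\, \mathbf{x}' - \mathbf{x}\big)\, dV_{\mathbf{x}'}
    + \mathbf{b}(\mathbf{x},t),
    ```

    which remains well defined across cracks and other discontinuities where the classical divergence of stress does not exist.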

  17. Entropy in sound and vibration: towards a new paradigm

    PubMed Central

    2017-01-01

    This paper describes a discussion on the method and the status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of exchange of energy analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart. PMID:28265190

  18. Bayesian theories of conditioning in a changing world.

    PubMed

    Courville, Aaron C; Daw, Nathaniel D; Touretzky, David S

    2006-07-01

    The recent flowering of Bayesian approaches invites the re-examination of classic issues in behavior, even in areas as venerable as Pavlovian conditioning. A statistical account can offer a new, principled interpretation of behavior, and previous experiments and theories can inform many unexplored aspects of the Bayesian enterprise. Here we consider one such issue: the finding that surprising events provoke animals to learn faster. We suggest that, in a statistical account of conditioning, surprise signals change and therefore uncertainty and the need for new learning. We discuss inference in a world that changes and show how experimental results involving surprise can be interpreted from this perspective, and also how, thus understood, these phenomena help constrain statistical theories of animal and human learning.
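
    One standard way to formalize "surprise signals change, and change demands faster learning" is a scalar Kalman filter over the associative weight; a minimal sketch under that assumption (illustrative, not the authors' specific model):

    ```python
    def kalman_update(mu, var, outcome, obs_var, drift_var):
        """One conditioning trial as Bayesian tracking of a drifting weight.
        A larger drift_var (more assumed change in the world) keeps the
        posterior variance high, which keeps the gain, i.e. the learning
        rate, high."""
        var = var + drift_var            # diffuse: the world may have changed
        gain = var / (var + obs_var)     # learning rate emerges from uncertainty
        mu = mu + gain * (outcome - mu)  # prediction-error update
        var = (1.0 - gain) * var
        return mu, var
    ```

    A surprising outcome is evidence that the assumed drift was underestimated; models that infer it online therefore raise the learning rate after surprise, matching the behavioral finding discussed above.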

  19. Prequantum classical statistical field theory: background field as a source of everything?

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2011-07-01

    Prequantum classical statistical field theory (PCSFT) is a new attempt to consider quantum mechanics (QM) as an emergent phenomenon, cf. De Broglie's "double solution" approach, Bohmian mechanics, stochastic electrodynamics (SED), Nelson's stochastic QM and its generalization by Davidson, 't Hooft's models and their development by Elze. PCSFT is a comeback to a purely wave viewpoint on QM, cf. the early Schrödinger. There are no quantum particles at all, only waves. In particular, photons are simply wave-pulses of the classical electromagnetic field, cf. SED. Moreover, even massive particles are special "prequantum fields": the electron field, the neutron field, and so on. PCSFT claims that (sooner or later) people will be able to measure components of these fields: components of the "photonic field" (the classical electromagnetic field of low intensity), the electronic field, the neutronic field, and so on. At the moment we are able to produce quantum correlations as correlations of classical Gaussian random fields. In this paper we are interested in the mathematical and physical reasons for the use of Gaussian fields. We consider prequantum signals (corresponding to quantum systems) as composed of a huge number of wave-pulses (on a very fine prequantum time scale). We speculate that the prequantum background field (the field of "vacuum fluctuations") might play the role of a source of such pulses, i.e., the source of everything.

  20. Developing a Test for Assessing Elementary Students' Comprehension of Science Texts

    ERIC Educational Resources Information Center

    Wang, Jing-Ru; Chen, Shin-Feng; Tsay, Reuy-Fen; Chou, Ching-Ting; Lin, Sheau-Wen; Kao, Huey-Lien

    2012-01-01

    This study reports on the process of developing a test to assess students' reading comprehension of scientific materials and on the statistical results of the verification study. A combination of classic test theory and item response theory approaches was used to analyze the assessment data from a verification study. Data analysis indicates the…

  1. Null but not void: considerations for hypothesis testing.

    PubMed

    Shaw, Pamela A; Proschan, Michael A

    2013-01-30

    Standard statistical theory teaches us that once the null and alternative hypotheses have been defined for a parameter, the choice of the statistical test is clear. Standard theory does not teach us how to choose the null or alternative hypothesis appropriate to the scientific question of interest. Neither does it tell us that in some cases, depending on which alternatives are realistic, we may want to define our null hypothesis differently. Problems in statistical practice are frequently not as pristinely summarized as the classic theory in our textbooks. In this article, we present examples in statistical hypothesis testing in which seemingly simple choices are in fact rich with nuance that, when given full consideration, make the choice of the right hypothesis test much less straightforward. Published 2012. This article is a US Government work and is in the public domain in the USA.

  2. Two-dimensional collective electron magnetotransport, oscillations, and chaos in a semiconductor superlattice

    NASA Astrophysics Data System (ADS)

    Bonilla, L. L.; Carretero, M.; Segura, A.

    2017-12-01

    When quantized, traces of classically chaotic single-particle systems include eigenvalue statistics and scars in eigenfunctions. Since 2001, many theoretical and experimental works have argued that classically chaotic single-electron dynamics influences and controls collective electron transport. For transport in semiconductor superlattices under tilted magnetic and electric fields, these theories rely on a reduction to a one-dimensional self-consistent drift model. A two-dimensional theory based on self-consistent Boltzmann transport does not support that single-electron chaos influences collective transport. This theory agrees with existing experimental evidence of current self-oscillations, predicts spontaneous collective chaos via a period doubling scenario, and could be tested unambiguously by measuring the electric potential inside the superlattice under a tilted magnetic field.

  3. Two-dimensional collective electron magnetotransport, oscillations, and chaos in a semiconductor superlattice.

    PubMed

    Bonilla, L L; Carretero, M; Segura, A

    2017-12-01

    When quantized, traces of classically chaotic single-particle systems include eigenvalue statistics and scars in eigenfunctions. Since 2001, many theoretical and experimental works have argued that classically chaotic single-electron dynamics influences and controls collective electron transport. For transport in semiconductor superlattices under tilted magnetic and electric fields, these theories rely on a reduction to a one-dimensional self-consistent drift model. A two-dimensional theory based on self-consistent Boltzmann transport does not support that single-electron chaos influences collective transport. This theory agrees with existing experimental evidence of current self-oscillations, predicts spontaneous collective chaos via a period doubling scenario, and could be tested unambiguously by measuring the electric potential inside the superlattice under a tilted magnetic field.

  4. Brane Physics in M-theory

    NASA Astrophysics Data System (ADS)

    Argurio, Riccardo

    1998-07-01

    The thesis begins with an introduction to M-theory (at a graduate student's level), starting from perturbative string theory and proceeding to dualities, D-branes and finally Matrix theory. The following chapter treats, in a self-contained way, general classical p-brane solutions. Black and extremal branes are reviewed, along with their semi-classical thermodynamics. We then focus on intersecting extremal branes, the intersection rules being derived both with and without the explicit use of supersymmetry. The last three chapters comprise more advanced aspects of brane physics, such as the dynamics of open branes, the little theories on the world-volume of branes and how the four-dimensional Schwarzschild black hole can be mapped to an extremal configuration of branes, thus allowing for a statistical interpretation of its entropy. The original results were already reported in hep-th/9701042, hep-th/9704190, hep-th/9710027 and hep-th/9801053.

  5. Reliability of a Measure of Institutional Discrimination against Minorities

    DTIC Science & Technology

    1979-12-01

    ...statistical measure of the ...e of institutional discrimination are discussed. Two methods of dealing with the problem of reliability of the measure in small samples are presented. The first is based upon classical statistical theory and the second derives from a series of computer-generated Monte Carlo...

  6. Statistical correlation analysis for comparing vibration data from test and analysis

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.; Purves, L. R.; Hershfeld, D. J.

    1986-01-01

    A theory was developed to compare vibration modes obtained by NASTRAN analysis with those obtained experimentally. Because many more analytical modes can be obtained than experimental modes, the analytical set was treated as expansion functions for putting both sources in comparative form. The dimensional symmetry was developed for three general cases: a nonsymmetric whole model compared with a nonsymmetric whole structural test, a symmetric analytical portion compared with a symmetric experimental portion, and a symmetric analytical portion compared with a whole experimental test. The theory was coded and a statistical correlation program was installed as a utility. The theory is demonstrated with small classical structures.
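
    The report develops its own statistical correlation measure; for orientation, a widely used baseline for comparing analytical and experimental mode shapes is the modal assurance criterion (MAC), sketched here for real-valued mode shape matrices (a general tool, not necessarily the report's statistic):

    ```python
    import numpy as np

    def mac(phi_a, phi_e):
        """MAC matrix between analytical modes phi_a (n_dof, n_a) and
        experimental modes phi_e (n_dof, n_e); entries near 1 indicate
        well-correlated mode pairs."""
        num = (phi_a.T @ phi_e) ** 2
        den = np.outer((phi_a**2).sum(axis=0), (phi_e**2).sum(axis=0))
        return num / den
    ```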

  7. A Transferrable Belief Model Representation for Physical Security of Nuclear Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Gerts

    This work analyzed various probabilistic methods such as classic statistics, Bayesian inference, possibilistic theory, and Dempster-Shafer theory of belief functions for the potential insight offered into the physical security of nuclear materials, as well as broader application to automated decision-making theory in nuclear non-proliferation. A review of the fundamental heuristics and basic limitations of each of these methods suggested that the Dempster-Shafer theory of belief functions may offer significant capability. Further examination of the various interpretations of Dempster-Shafer theory, such as random set, generalized Bayesian, and upper/lower probability, demonstrates some limitations. Compared to the other heuristics, the transferrable belief model (TBM), one of the leading interpretations of Dempster-Shafer theory, can improve the automated detection of violations of physical security using sensors and human judgment. The improvement is shown to give a significant heuristic advantage over other probabilistic options by demonstrating significant successes for several classic gedanken experiments.
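
    The combination step that distinguishes Dempster-Shafer belief functions from Bayesian updating is Dempster's rule; a minimal sketch with hypothetical physical-security hypotheses (note that the TBM interpretation discussed in the report would leave the conflicting mass on the empty set rather than renormalizing, as done here):

    ```python
    def dempster_combine(m1, m2):
        """Dempster's rule: multiply masses of intersecting focal elements
        and renormalize away the conflicting (empty-intersection) mass."""
        combined, conflict = {}, 0.0
        for a, wa in m1.items():
            for b, wb in m2.items():
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + wa * wb
                else:
                    conflict += wa * wb
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    # Hypothetical sensor and guard reports on a {secure, breach} frame:
    m_sensor = {frozenset({"secure"}): 0.7, frozenset({"secure", "breach"}): 0.3}
    m_guard = {frozenset({"breach"}): 0.4, frozenset({"secure", "breach"}): 0.6}
    print(dempster_combine(m_sensor, m_guard))
    # -> mass ~0.58 on {secure}, ~0.17 on {breach}, ~0.25 left undecided
    ```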

  8. Dynamics of Markets

    NASA Astrophysics Data System (ADS)

    McCauley, Joseph L.

    2009-09-01

    Preface; 1. Econophysics: why and what; 2. Neo-classical economic theory; 3. Probability and stochastic processes; 4. Introduction to financial economics; 5. Introduction to portfolio selection theory; 6. Scaling, pair correlations, and conditional densities; 7. Statistical ensembles: deducing dynamics from time series; 8. Martingale option pricing; 9. FX market globalization: evolution of the dollar to worldwide reserve currency; 10. Macroeconomics and econometrics: regression models vs. empirically based modeling; 11. Complexity; Index.

  9. A pedestrian approach to the measurement problem in quantum mechanics

    NASA Astrophysics Data System (ADS)

    Boughn, Stephen; Reginatto, Marcel

    2013-09-01

    The quantum theory of measurement has been a matter of debate for over eighty years. Most of the discussion has focused on theoretical issues with the consequence that other aspects (such as the operational prescriptions that are an integral part of experimental physics) have been largely ignored. This has undoubtedly exacerbated attempts to find a solution to the "measurement problem". How the measurement problem is defined depends to some extent on how the theoretical concepts introduced by the theory are interpreted. In this paper, we fully embrace the minimalist statistical (ensemble) interpretation of quantum mechanics espoused by Einstein, Ballentine, and others. According to this interpretation, the quantum state description applies only to a statistical ensemble of similarly prepared systems rather than representing an individual system. Thus, the statistical interpretation obviates the need to entertain reduction of the state vector, one of the primary dilemmas of the measurement problem. The other major aspect of the measurement problem, the necessity of describing measurements in terms of classical concepts that lay outside of quantum theory, remains. A consistent formalism for interacting quantum and classical systems, like the one based on ensembles on configuration space that we refer to in this paper, might seem to eliminate this facet of the measurement problem; however, we argue that the ultimate interface with experiments is described by operational prescriptions and not in terms of the concepts of classical theory. There is no doubt that attempts to address the measurement problem have yielded important advances in fundamental physics; however, it is also very clear that the measurement problem is still far from being resolved. The pedestrian approach presented here suggests that this state of affairs is in part the result of searching for a theoretical/mathematical solution to what is fundamentally an experimental/observational question. It suggests also that the measurement problem is, in some sense, ill-posed and might never be resolved. This point of view is tenable so long as one is willing to view physical theories as providing models of nature rather than complete descriptions of reality. Among other things, these considerations lead us to suggest that the Copenhagen interpretation's insistence on the classicality of the measurement apparatus should be replaced by the requirement that a measurement, which is specified operationally, should simply be of sufficient precision.

  10. Quantum probability, choice in large worlds, and the statistical structure of reality.

    PubMed

    Ross, Don; Ladyman, James

    2013-06-01

    Classical probability models of incentive response are inadequate in "large worlds," where the dimensions of relative risk and the dimensions of similarity in outcome comparisons typically differ. Quantum probability models for choice in large worlds may be motivated pragmatically - there is no third theory - or metaphysically: statistical processing in the brain adapts to the true scale-relative structure of the universe.

  11. Quantum Field Theory Approach to Condensed Matter Physics

    NASA Astrophysics Data System (ADS)

    Marino, Eduardo C.

    2017-09-01

    Preface; Part I. Condensed Matter Physics: 1. Independent electrons and static crystals; 2. Vibrating crystals; 3. Interacting electrons; 4. Interactions in action; Part II. Quantum Field Theory: 5. Functional formulation of quantum field theory; 6. Quantum fields in action; 7. Symmetries: explicit or secret; 8. Classical topological excitations; 9. Quantum topological excitations; 10. Duality, bosonization and generalized statistics; 11. Statistical transmutation; 12. Pseudo quantum electrodynamics; Part III. Quantum Field Theory Approach to Condensed Matter Systems: 13. Quantum field theory methods in condensed matter; 14. Metals, Fermi liquids, Mott and Anderson insulators; 15. The dynamics of polarons; 16. Polyacetylene; 17. The Kondo effect; 18. Quantum magnets in 1D: Fermionization, bosonization, Coulomb gases and 'all that'; 19. Quantum magnets in 2D: nonlinear sigma model, CP1 and 'all that'; 20. The spin-fermion system: a quantum field theory approach; 21. The spin glass; 22. Quantum field theory approach to superfluidity; 23. Quantum field theory approach to superconductivity; 24. The cuprate high-temperature superconductors; 25. The pnictides: iron based superconductors; 26. The quantum Hall effect; 27. Graphene; 28. Silicene and transition metal dichalcogenides; 29. Topological insulators; 30. Non-abelian statistics and quantum computation; References; Index.

  12. Space-time models based on random fields with local interactions

    NASA Astrophysics Data System (ADS)

    Hristopulos, Dionissios T.; Tsantili, Ivi C.

    2016-08-01

    The analysis of space-time data from complex, real-life phenomena requires the use of flexible and physically motivated covariance functions. In most cases, it is not possible to explicitly solve the equations of motion for the fields or the respective covariance functions. In the statistical literature, covariance functions are often based on mathematical constructions. In this paper, we propose deriving space-time covariance functions by solving “effective equations of motion”, which can be used as statistical representations of systems with diffusive behavior. In particular, we propose to formulate space-time covariance functions based on an equilibrium effective Hamiltonian using the linear response theory. The effective space-time dynamics is then generated by a stochastic perturbation around the equilibrium point of the classical field Hamiltonian leading to an associated Langevin equation. We employ a Hamiltonian which extends the classical Gaussian field theory by including a curvature term and leads to a diffusive Langevin equation. Finally, we derive new forms of space-time covariance functions.
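
    As a schematic of the construction described above (a sketch only; the paper's exact Hamiltonian and normalization may differ), a Gaussian field Hamiltonian with gradient and curvature terms drives an overdamped Langevin equation:

```latex
H[\phi] = \tfrac{1}{2}\int d\mathbf{x}\,\Big[\phi^{2} + \xi^{2}(\nabla\phi)^{2}
          + \xi^{4}(\nabla^{2}\phi)^{2}\Big],
\qquad
\frac{\partial\phi}{\partial t} = -\Gamma\,\frac{\delta H}{\delta\phi}
          + \eta(\mathbf{x},t),
```

    with Γ a kinetic coefficient and η a white-noise perturbation; the stationary spectral density then takes the form S(k) ∝ 1/(1 + ξ²k² + ξ⁴k⁴), and its Fourier transform supplies the space-time covariance function.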

  13. On information, negentropy and H-theorem

    NASA Astrophysics Data System (ADS)

    Chakrabarti, C. G.; Sarker, N. G.

    1983-09-01

    The paper deals with the importance of the Kullback discrimination information in the statistical characterization of the negentropy of a non-equilibrium state and the irreversibility of a classical dynamical system. The theory based on the Kullback discrimination information as the H-function gives new insight into the interrelation between the concepts of coarse-graining and the principle of sufficiency, leading to an important statistical characterization of the thermal equilibrium of a closed system.

  14. Molecular dynamics studies of electron-ion temperature equilibration in hydrogen plasmas within the coupled-mode regime

    DOE PAGES

    Benedict, Lorin X.; Surh, Michael P.; Stanton, Liam G.; ...

    2017-04-10

    Here, we use classical molecular dynamics (MD) to study electron-ion temperature equilibration in two-component plasmas in regimes for which the presence of coupled collective modes has been predicted to substantively reduce the equilibration rate. Guided by previous kinetic theory work, we examine hydrogen plasmas at a density of n = 10^26 cm^-3, T_i = 10^5 K, and 10^7 K < T_e < 10^9 K. The nonequilibrium classical MD simulations are performed with interparticle interactions modeled by quantum statistical potentials (QSPs). Our MD results indicate (i) a large effect from time-varying potential energy, which we quantify by appealing to an adiabatic two-temperature equation of state, and (ii) a notable deviation in the energy equilibration rate when compared to calculations from classical Lenard-Balescu theory including the QSPs. In particular, it is shown that the energy equilibration rates from MD are more similar to those of the theory when coupled modes are neglected. We suggest possible reasons for this surprising result and propose directions of further research along these lines.
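
    For orientation, MD equilibration curves of this kind are conventionally compared against a two-temperature rate-equation model. The sketch below integrates that model with a constant coupling rate nu_ei and equal heat capacities; both are simplifying assumptions made here for illustration (in practice the rate depends on T_e and T_i, e.g., as computed from Lenard-Balescu theory).

```python
import numpy as np

def equilibrate(Te, Ti, nu_ei=1.0, dt=1e-3, steps=5000):
    """Integrate dTe/dt = -nu_ei (Te - Ti) with equal heat capacities,
    so the ion equation mirrors the electron one with opposite sign."""
    Te_hist = [Te]
    for _ in range(steps):
        dT = nu_ei * (Te - Ti)
        Te -= dT * dt          # electrons cool...
        Ti += dT * dt          # ...ions heat; Te + Ti is conserved
        Te_hist.append(Te)
    return np.array(Te_hist)

# Toy units: both temperatures relax toward the common value 50.5.
print(equilibrate(100.0, 1.0)[-1])
```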

  15. A Critical Review and Appropriation of Pierre Bourdieu's Analysis of Social and Cultural Reproduction

    ERIC Educational Resources Information Center

    Shirley, Dennis

    1986-01-01

    Makes accessible Bourdieu's comprehensive and systematic sociology of French education, which integrates classical sociological theory and statistical analysis. Isolates and explicates key terminology, links these concepts together, and critiques the work from the perspective of the philosophy of praxis. (LHW)

  16. Theory-Based Causal Induction

    ERIC Educational Resources Information Center

    Griffiths, Thomas L.; Tenenbaum, Joshua B.

    2009-01-01

    Inducing causal relationships from observations is a classic problem in scientific inference, statistics, and machine learning. It is also a central part of human learning, and a task that people perform remarkably well given its notorious difficulties. People can learn causal structure in various settings, from diverse forms of data: observations…

  17. φ⁴ kinks: Statistical mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habib, S.

    1995-12-31

    Some recent investigations of the thermal equilibrium properties of kinks in a 1+1-dimensional, classical φ⁴ field theory are reviewed. The distribution function, kink density, correlation function, and certain thermodynamic quantities were studied both theoretically and via large scale simulations. A simple double Gaussian variational approach within the transfer operator formalism was shown to give good results in the intermediate temperature range where the dilute gas theory is known to fail.
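
    Schematically, the dilute-gas regime mentioned above treats kinks as a weakly interacting gas whose equilibrium density is controlled by the kink rest energy E_k; up to prefactors and with β = 1/kT,

```latex
n_{\text{kink}} \;\propto\; \sqrt{\beta E_k}\; e^{-\beta E_k}.
```

    It is the failure of this exponential dilute-gas estimate at intermediate temperatures that the double Gaussian variational treatment addresses.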

  18. Modeling micelle formation and interfacial properties with iSAFT classical density functional theory

    NASA Astrophysics Data System (ADS)

    Wang, Le; Haghmoradi, Amin; Liu, Jinlu; Xi, Shun; Hirasaki, George J.; Miller, Clarence A.; Chapman, Walter G.

    2017-03-01

    Surfactants reduce the interfacial tension between phases, making them an important additive in a number of industrial and commercial applications from enhanced oil recovery to personal care products (e.g., shampoo and detergents). To help obtain a better understanding of the dependence of surfactant properties on molecular structure, a classical density functional theory, also known as interfacial statistical associating fluid theory, has been applied to study the effects of surfactant architecture on micelle formation and interfacial properties for model nonionic surfactant/water/oil systems. In this approach, hydrogen bonding is explicitly included. To minimize the free energy, the system minimizes interactions between hydrophobic components and hydrophilic components with water molecules hydrating the surfactant head group. The theory predicts micellar structure, effects of surfactant architecture on critical micelle concentration, aggregation number, and interfacial tension isotherm of surfactant/water systems in qualitative agreement with experimental data. Furthermore, this model is applied to study swollen micelles and reverse swollen micelles that are necessary to understand the formation of a middle-phase microemulsion.

  19. Statistical analysis of 4 types of neck whiplash injuries based on classical meridian theory.

    PubMed

    Chen, Yemeng; Zhao, Yan; Xue, Xiaolin; Li, Hui; Wu, Xiuyan; Zhang, Qunce; Zheng, Xin; Wang, Tianfang

    2015-01-01

    As one component of the Chinese medicine meridian system, the meridian sinew (Jingjin, (see text), tendino-musculo) is specially described as being for acupuncture treatment of the musculoskeletal system because of its dynamic attributes and tender point correlations. In recent decades, the therapeutic importance of the sinew meridian has become revalued in clinical application. Based on this theory, the authors have established therapeutic strategies of acupuncture treatment in Whiplash-Associated Disorders (WAD) by categorizing four types of neck symptom presentations. The advantage of this new system is to make it much easier for the clinician to find effective acupuncture points. This study attempts to prove the significance of the proposed therapeutic strategies by analyzing data collected from a clinical survey of various WAD using non-supervised statistical methods, such as correlation analysis, factor analysis, and cluster analysis. The clinical survey data have successfully verified discrete characteristics of four neck syndromes, based upon the range of motion (ROM) and tender point location findings. A summary of the relationships among the symptoms of the four neck syndromes has shown the correlation coefficient as having a statistical significance (P < 0.01 or P < 0.05), especially with regard to ROM. Furthermore, factor and cluster analyses resulted in a total of 11 categories of general symptoms, which implies syndrome factors are more related to the Liver, as originally described in classical theory. The hypothesis of meridian sinew syndromes in WAD is clearly supported by the statistical analysis of the clinical trials. This new discovery should be beneficial in improving therapeutic outcomes.
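
    The non-supervised workflow described above can be sketched in a few lines; the range-of-motion (ROM) data below are synthetic stand-ins generated around four hypothetical syndrome profiles, not the clinical survey data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Rows are patients, columns are ROM measurements
# (flexion, extension, left rotation, right rotation), in degrees.
rng = np.random.default_rng(0)
rom = np.vstack([rng.normal(loc=c, scale=5.0, size=(25, 4))
                 for c in ([40, 50, 60, 60], [55, 35, 60, 60],
                           [55, 50, 40, 65], [55, 50, 65, 40])])

corr = np.corrcoef(rom, rowvar=False)      # correlation analysis
labels = fcluster(linkage(rom, method="ward"), t=4, criterion="maxclust")
print(np.round(corr, 2))
print(np.bincount(labels)[1:])             # roughly four 25-patient clusters
```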

  20. On the emergence of classical gravity

    NASA Astrophysics Data System (ADS)

    Larjo, Klaus

    In this thesis I will discuss how certain black holes arise as an effective, thermodynamical description from non-singular microstates in string theory. This provides a possible solution to the information paradox, and strengthens the case for treating black holes as thermodynamical objects. I will characterize the data defining a microstate of a black hole in several settings, and demonstrate that most of the data is unmeasurable for a classical observer. I will further show that the data that is measurable is universal for nearly all microstates, making it impossible for a classical observer to distinguish between microstates, thus giving rise to an effective statistical description for the black hole. In the first half of the thesis I will work with two specific systems: the half-BPS sector of [Special characters omitted.] = 4 super Yang-Mills the and the conformal field theory corresponding to the D1/D5 system; in both cases the high degree of symmetry present provides great control over potentially intractable computations. For these systems, I will further specify the conditions a quantum mechanical microstate must satisfy in order to have a classical description in terms of a unique metric, and define a 'metric operator' whose eigenstates correspond to classical geometries. In the second half of the thesis I will consider a much broader setting, general [Special characters omitted.] = I superconformal quiver gauge the= ories and their dual gravity theories, and demonstrate that a similar effective description arises also in this setting.

  1. Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.

    2013-05-01

    Seismic hazard analysis has become a very important issue in the last few decades. New technologies and newly available data have helped many scientists to understand where and why earthquakes happen, the physics of earthquakes, etc., and researchers have begun to understand the role of uncertainty in seismic hazard analysis. However, a significant problem remains: how to handle the existing uncertainty. The same lack of information makes it difficult to quantify uncertainty accurately. Attenuation curves are usually obtained statistically, by regression analysis. Statistical and probabilistic analyses show overlapping results for the site coefficients. This overlap occurs not only at the border between two neighboring classes, but also among more than three classes. Although the analysis starts by classifying sites in geological terms, the resulting site coefficients are not cleanly separated by class. In the present study, this problem is addressed using fuzzy set theory. Using membership functions, the ambiguities at the border between neighboring classes can be avoided. Fuzzy set theory is applied to southern California alongside the conventional approach. In this study, the standard deviations that show the variation within each site class, obtained by fuzzy set theory and by the classical method, are compared. The results show that when data are insufficient for hazard assessment, site classification based on fuzzy set theory yields smaller standard deviations than the classical approach, which is direct evidence of reduced uncertainty.
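
    A minimal sketch of the membership-function idea: with overlapping trapezoidal classes, a site near a class border carries partial membership in both classes instead of being forced into one. The class boundaries and the Vs30 parameterization below are hypothetical illustrations.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function: 0 outside [a, d], 1 on [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Hypothetical site classes by average shear-wave velocity Vs30 (m/s);
# the overlap regions let a site belong partially to two classes.
classes = {
    "soft soil":  (0, 0, 150, 220),
    "stiff soil": (150, 220, 340, 420),
    "rock":       (340, 420, 2000, 2000),
}
vs30 = 380.0
memberships = {name: trapezoid(vs30, *p) for name, p in classes.items()}
print(memberships)  # partial membership in both 'stiff soil' and 'rock'
```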

  2. SPSS and SAS programs for generalizability theory analyses.

    PubMed

    Mushquash, Christopher; O'Connor, Brian P

    2006-08-01

    The identification and reduction of measurement errors is a major challenge in psychological testing. Most investigators rely solely on classical test theory for assessing reliability, whereas most experts have long recommended using generalizability theory instead. One reason for the common neglect of generalizability theory is the absence of analytic facilities for this purpose in popular statistical software packages. This article provides a brief introduction to generalizability theory, describes easy to use SPSS, SAS, and MATLAB programs for conducting the recommended analyses, and provides an illustrative example, using data (N = 329) for the Rosenberg Self-Esteem Scale. Program output includes variance components, relative and absolute errors and generalizability coefficients, coefficients for D studies, and graphs of D study results.
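
    For readers without access to those programs, here is a minimal sketch of the core computation for a persons x raters crossed design: variance components from the two-way ANOVA mean squares, followed by a relative generalizability (G) coefficient. The scores are made up for illustration.

```python
import numpy as np

# Hypothetical scores: 6 persons rated by 3 raters (one score per cell,
# random two-way crossed design; interaction is confounded with error).
X = np.array([[4, 5, 4],
              [2, 3, 3],
              [5, 5, 4],
              [3, 4, 3],
              [1, 2, 2],
              [4, 4, 5]], dtype=float)
n_p, n_r = X.shape
g = X.mean()
pm, rm = X.mean(axis=1), X.mean(axis=0)

ms_p = n_r * np.sum((pm - g) ** 2) / (n_p - 1)        # persons mean square
ms_r = n_p * np.sum((rm - g) ** 2) / (n_r - 1)        # raters mean square
resid = X - pm[:, None] - rm[None, :] + g
ms_pr = np.sum(resid ** 2) / ((n_p - 1) * (n_r - 1))  # interaction/error

var_p = max((ms_p - ms_pr) / n_r, 0.0)  # person variance component
var_pr = ms_pr                          # interaction/error component
n_prime = 3                             # D-study: number of raters
g_coef = var_p / (var_p + var_pr / n_prime)
print(round(var_p, 3), round(var_pr, 3), round(g_coef, 3))
```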

  3. Rage against the Machine: Evaluation Metrics in the 21st Century

    ERIC Educational Resources Information Center

    Yang, Charles

    2017-01-01

    I review the classic literature in generative grammar and Marr's three-level program for cognitive science to defend the Evaluation Metric as a psychological theory of language learning. Focusing on well-established facts of language variation, change, and use, I argue that optimal statistical principles embodied in Bayesian inference models are…

  4. The Information Function for the One-Parameter Logistic Model: Is it Reliability?

    ERIC Educational Resources Information Center

    Doran, Harold C.

    2005-01-01

    The information function is an important statistic in item response theory (IRT) applications. Although the information function is often described as the IRT version of reliability, it differs from the classical notion of reliability from a critical perspective: replication. This article first explores the information function for the…

  5. Fixing the Big Bang Theory's Lithium Problem

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2017-02-01

    How did our universe come into being? The Big Bang theory is a widely accepted and highly successful cosmological model of the universe, but it does introduce one puzzle: the cosmological lithium problem. Have scientists now found a solution? Too Much Lithium: In the Big Bang theory, the universe expanded rapidly from a very high-density and high-temperature state dominated by radiation. This theory has been validated again and again: the discovery of the cosmic microwave background radiation and observations of the large-scale structure of the universe both beautifully support the Big Bang theory, for instance. But one pesky trouble-spot remains: the abundance of lithium. [Figure: The arrows show the primary reactions involved in Big Bang nucleosynthesis, and their flux ratios, as predicted by the authors' model, are given on the right. Synthesizing primordial elements is complicated! Hou et al. 2017] According to Big Bang nucleosynthesis theory, primordial nucleosynthesis ran wild during the first half hour of the universe's existence. This produced most of the universe's helium and small amounts of other light nuclides, including deuterium and lithium. But while predictions match the observed primordial deuterium and helium abundances, Big Bang nucleosynthesis theory overpredicts the abundance of primordial lithium by about a factor of three. This inconsistency is known as the cosmological lithium problem, and attempts to resolve it using conventional astrophysics and nuclear physics over the past few decades have not been successful. In a recent publication led by Suqing Hou (Institute of Modern Physics, Chinese Academy of Sciences) and advisor Jianjun He (Institute of Modern Physics / National Astronomical Observatories, Chinese Academy of Sciences), however, a team of scientists has proposed an elegant solution to this problem. [Figure: Time and temperature evolution of the abundances of primordial light elements during the beginning of the universe. The authors' model (dotted lines) successfully predicts a lower abundance of the beryllium isotope which eventually decays into lithium, relative to the classical Maxwell-Boltzmann distribution (solid lines), without changing the predicted abundances of deuterium or helium. Hou et al. 2017] Questioning Statistics: Hou and collaborators questioned a key assumption in Big Bang nucleosynthesis theory: that the nuclei involved in the process are all in thermodynamic equilibrium, and that their velocities, which determine the thermonuclear reaction rates, are described by the classical Maxwell-Boltzmann distribution. But do nuclei still obey this classical distribution in the extremely complex, fast-expanding Big Bang hot plasma? Hou and collaborators propose that the lithium nuclei don't, and that they must instead be described by a slightly modified version of the classical distribution, accounted for using what's known as non-extensive statistics. The authors show that using the modified velocity distributions described by these statistics, they can successfully predict the observed primordial abundances of deuterium, helium, and lithium simultaneously. If this solution to the cosmological lithium problem is correct, the Big Bang theory is now one step closer to fully describing the formation of our universe. Citation: S. Q. Hou et al 2017 ApJ 834 165. doi:10.3847/1538-4357/834/2/165
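
    A small sketch of the distributional change at stake: the Maxwell-Boltzmann energy factor versus a Tsallis-type (non-extensive) generalization that reduces to it as q → 1. The q value below is purely illustrative, not the value fitted by Hou et al.

```python
import numpy as np

def boltzmann(E, kT):
    """Classical Maxwell-Boltzmann energy factor."""
    return np.exp(-E / kT)

def tsallis(E, kT, q):
    """Non-extensive generalization; q < 1 truncates the high-energy
    tail, q > 1 fattens it, and q -> 1 recovers the exponential."""
    base = 1.0 - (1.0 - q) * E / kT
    return np.where(base > 0, base, 0.0) ** (1.0 / (1.0 - q))

E = np.linspace(0, 5, 6)        # energies in units of kT
print(boltzmann(E, 1.0))
print(tsallis(E, 1.0, q=0.95))  # slightly depleted high-energy tail
```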

  6. Non-statistical effects in bond fission reactions of 1,2-difluoroethane

    NASA Astrophysics Data System (ADS)

    Schranz, Harold W.; Raff, Lionel M.; Thompson, Donald L.

    1991-08-01

    A microcanonical, classical variational transition-state theory based on the use of the efficient microcanonical sampling (EMS) procedure is applied to simple bond fission in 1,2-difluoroethane. Comparison is made with results of trajectory calculations performed on the same global potential-energy surface. Agreement between the statistical theory and trajectory results for C-C, C-F, and C-H bond fissions is poor, with differences as large as a factor of 125. Most importantly, at the lower energy studied, 6.0 eV, the statistical calculations predict considerably slower rates than those computed from trajectories. We conclude from these results that the statistical assumptions inherent in the transition-state theory method are not valid for 1,2-difluoroethane in spite of the fact that the total intramolecular energy transfer rate out of C-H and C-C normal and local modes is large relative to the bond fission rates. The IVR rate is not globally rapid and the trajectories do not access all of the energetically available phase space uniformly on the timescale of the reactions.

  7. Nonadditive entropy Sq and nonextensive statistical mechanics: Applications in geophysics and elsewhere

    NASA Astrophysics Data System (ADS)

    Tsallis, Constantino

    2012-06-01

    The celebrated Boltzmann-Gibbs (BG) entropy, S_BG = -k Σ_i p_i ln p_i, and associated statistical mechanics are essentially based on hypotheses such as ergodicity, i.e., when ensemble averages coincide with time averages. This dynamical simplification occurs in classical systems (and quantum counterparts) whose microscopic evolution is governed by a positive largest Lyapunov exponent (LLE). Under such circumstances, relevant microscopic variables behave, from the probabilistic viewpoint, as (nearly) independent. Many phenomena exist, however, in natural, artificial and social systems (geophysics, astrophysics, biophysics, economics, and others) that violate ergodicity. To cover a (possibly) wide class of such systems, a generalization (nonextensive statistical mechanics) of the BG theory was proposed in 1988. This theory is based on nonadditive entropies such as S_q = k (1 - Σ_i p_i^q)/(q - 1), with S_1 = S_BG. Here we comment on some central aspects of this theory, and briefly review typical predictions, verifications and applications in geophysics and elsewhere, as illustrated through theoretical, experimental, observational, and computational results.
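
    A one-line check that the nonadditive entropy reduces to the BG form in the q → 1 limit:

```latex
S_q = k\,\frac{1-\sum_i p_i^{\,q}}{q-1},\qquad
p_i^{\,q} = p_i\,e^{(q-1)\ln p_i}
\;\xrightarrow{\;q\to 1\;}\; p_i\big[1+(q-1)\ln p_i\big],
\quad\text{so}\quad
S_q \;\xrightarrow{\;q\to 1\;}\; -k\sum_i p_i \ln p_i = S_{BG}.
```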

  8. Quantum-mechanical analysis of low-gain free-electron laser oscillators

    NASA Astrophysics Data System (ADS)

    Fares, H.; Yamada, M.; Chiadroni, E.; Ferrario, M.

    2018-05-01

    In the previous classical theory of low-gain free-electron laser (FEL) oscillators, the electron is described as a point-like particle, a delta function in position space. On the other hand, in the previous quantum treatments, the electron is described as a plane wave with a single momentum state, a delta function in momentum space. In reality, an electron must have statistical uncertainties in the position and momentum domains. The electron is therefore neither a point-like charge nor a plane wave of a single momentum. In this paper, we rephrase the theory of the low-gain FEL with the interacting electron represented quantum mechanically by a plane wave with a finite spreading length (i.e., a wave packet). Using the concepts of the transformation of reference frames and statistical quantum mechanics, an expression for the single-pass radiation gain is derived. The spectral broadening of the radiation is expressed in terms of the spreading length of an electron, the relaxation time characterizing the energy spread of electrons, and the interaction time. We present a comparison between our results and those obtained in the known classical analyses, showing good agreement between the two, and offer novel insights into the electron dynamics and the interaction mechanism.

  9. The ambiguity of simplicity in quantum and classical simulation

    NASA Astrophysics Data System (ADS)

    Aghamohammadi, Cina; Mahoney, John R.; Crutchfield, James P.

    2017-04-01

    A system's perceived simplicity depends on whether it is represented classically or quantally. This is not so surprising, as classical and quantum physics are descriptive frameworks built on different assumptions that capture, emphasize, and express different properties and mechanisms. What is surprising is that, as we demonstrate, simplicity is ambiguous: the relative simplicity between two systems can change sign when moving between classical and quantum descriptions. Here, we associate simplicity with small model-memory. We see that the notions of absolute physical simplicity at best form a partial, not a total, order. This suggests that appeals to principles of physical simplicity, via Ockham's Razor or to the 'elegance' of competing theories, may be fundamentally subjective. Recent rapid progress in quantum computation and quantum simulation suggests that the ambiguity of simplicity will strongly impact statistical inference and, in particular, model selection.

  10. Fundamental physical theories: Mathematical structures grounded on a primitive ontology

    NASA Astrophysics Data System (ADS)

    Allori, Valia

    In my dissertation I analyze the structure of fundamental physical theories. I start with an analysis of what an adequate primitive ontology is, discussing the measurement problem in quantum mechanics and its solutions. It is commonly said that these theories have little in common. I argue instead that the moral of the measurement problem is that the wave function cannot represent physical objects, and that a common structure between these solutions can be recognized: each of them is about a clear three-dimensional primitive ontology that evolves according to a law determined by the wave function. The primitive ontology is what matter is made of, while the wave function tells the matter how to move. One might think that what is important in the notion of primitive ontology is its three-dimensionality. If so, in a theory like classical electrodynamics electromagnetic fields would be part of the primitive ontology. I argue that, reflecting on what the purpose of a fundamental physical theory is, namely to explain the behavior of objects in three-dimensional space, one can recognize that a fundamental physical theory has a particular architecture. If so, electromagnetic fields play a different role in the theory than the particles and therefore should be considered, like the wave function, as part of the law. Therefore, we can characterize the general structure of a fundamental physical theory as a mathematical structure grounded on a primitive ontology. I explore this idea to better understand theories like classical mechanics and relativity, emphasizing that primitive ontology is crucial in the process of building new theories, being fundamental in identifying the symmetries. Finally, I analyze what it means to explain the world around us in terms of the notion of primitive ontology in the case of regularities of statistical character. Here is where the notion of typicality comes into play: we have explained a phenomenon if the typical histories of the primitive ontology give rise to the statistical regularities we observe.

  11. Nanophotonic light-trapping theory for solar cells

    NASA Astrophysics Data System (ADS)

    Yu, Zongfu; Raman, Aaswath; Fan, Shanhui

    2011-11-01

    Conventional light-trapping theory, based on a ray-optics approach, was developed for standard thick photovoltaic cells. The classical theory established an upper limit for possible absorption enhancement in this context and provided a design strategy for reaching this limit. This theory has become the foundation for light management in bulk silicon PV cells, and has had enormous influence on the optical design of solar cells in general. This theory, however, is not applicable in the nanophotonic regime. Here we develop a statistical temporal coupled-mode theory of light trapping based on a rigorous electromagnetic approach. Our theory reveals that the standard limit can be substantially surpassed when optical modes in the active layer are confined to deep-subwavelength scale, opening new avenues for highly efficient next-generation solar cells.
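
    For reference, the classical ray-optics bound referred to above is commonly quoted as the Yablonovitch limit on the absorption-path enhancement in a medium of refractive index n; with emission restricted to an acceptance half-angle θ it reads (a standard result, stated here for context rather than taken from the paper):

```latex
F = \frac{4 n^{2}}{\sin^{2}\theta}
\;\;\longrightarrow\;\;
F_{\max} = 4 n^{2}
\quad (\theta = 90^{\circ},\ \text{full acceptance}).
```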

  12. The Examination of Reliability According to Classical Test and Generalizability on a Job Performance Scale

    ERIC Educational Resources Information Center

    Yelboga, Atilla; Tavsancil, Ezel

    2010-01-01

    In this research, the classical test theory and generalizability theory analyses were carried out with the data obtained by a job performance scale for the years 2005 and 2006. The reliability coefficients obtained (estimated) from the classical test theory and generalizability theory analyses were compared. In classical test theory, test retest…

  13. Rydberg Atoms in Strong Fields: a Testing Ground for Quantum Chaos.

    NASA Astrophysics Data System (ADS)

    Courtney, Michael

    1995-01-01

    Rydberg atoms in strong static electric and magnetic fields provide experimentally accessible systems for studying the connections between classical chaos and quantum mechanics in the semiclassical limit. This experimental accessibility has motivated the development of reliable quantum mechanical solutions. This thesis uses both experimental and computed quantum spectra to test the central approaches to quantum chaos. These central approaches consist mainly of developing methods to compute the spectra of quantum systems in non-perturbative regimes, correlating statistical descriptions of eigenvalues with the classical behavior of the same Hamiltonian, and the development of semiclassical methods such as periodic-orbit theory. Particular emphasis is given to identifying the spectral signature of recurrences -- quantum wave packets which follow classical orbits. The new findings include: the breakdown of the connection between energy-level statistics and classical chaos in odd-parity diamagnetic lithium, the discovery of the signature of very long period orbits in atomic spectra, quantitative evidence for the scattering of recurrences by the alkali-metal core, quantitative description of the behavior of recurrences near bifurcations, and a semiclassical interpretation of the evolution of continuum Stark spectra. (Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.)

  14. Universal self-similar dynamics of relativistic and nonrelativistic field theories near nonthermal fixed points

    NASA Astrophysics Data System (ADS)

    Piñeiro Orioli, Asier; Boguslavski, Kirill; Berges, Jürgen

    2015-07-01

    We investigate universal behavior of isolated many-body systems far from equilibrium, which is relevant for a wide range of applications from ultracold quantum gases to high-energy particle physics. The universality is based on the existence of nonthermal fixed points, which represent nonequilibrium attractor solutions with self-similar scaling behavior. The corresponding dynamic universality classes turn out to be remarkably large, encompassing both relativistic as well as nonrelativistic quantum and classical systems. For the examples of nonrelativistic (Gross-Pitaevskii) and relativistic scalar field theory with quartic self-interactions, we demonstrate that infrared scaling exponents as well as scaling functions agree. We perform two independent nonperturbative calculations, first by using classical-statistical lattice simulation techniques and second by applying a vertex-resummed kinetic theory. The latter extends kinetic descriptions to the nonperturbative regime of overoccupied modes. Our results open new perspectives to learn from experiments with cold atoms aspects about the dynamics during the early stages of our universe.
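
    The self-similar behavior invoked above is conventionally parameterized by a scaling ansatz for the momentum-space occupation number, written schematically as

```latex
n(t,\mathbf{p}) = t^{\alpha}\, f_S\!\left(t^{\beta} p\right),
```

    with universal exponents α and β and a universal scaling function f_S; the specific infrared values of the exponents are those determined in the paper and are not reproduced here.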

  15. Ghirardi-Rimini-Weber model with massive flashes

    NASA Astrophysics Data System (ADS)

    Tilloy, Antoine

    2018-01-01

    I introduce a modification of the Ghirardi-Rimini-Weber (GRW) model in which the flashes (or space-time collapse events) source a classical gravitational field. The resulting semiclassical theory of Newtonian gravity preserves the statistical interpretation of quantum states of matter, in contrast with mean-field approaches. It can be seen as a discrete version of recent proposals of consistent hybrid quantum-classical theories. The model is in agreement with known experimental data and introduces new falsifiable predictions: (1) single particles do not self-interact, (2) the 1/r gravitational potential of Newtonian gravity is cut off at short (≲ 10^-7 m) distances, and (3) gravity makes spatial superpositions decohere at a rate inversely proportional to that coming from the vanilla GRW model. Together, the last two predictions make the model experimentally falsifiable for all values of its parameters.

  16. Introduction

    NASA Astrophysics Data System (ADS)

    Cohen, E. G. D.

    Lecture notes are organized around the key word dissipation, while focusing on a presentation of modern theoretical developments in the study of irreversible phenomena. A broad cross-disciplinary perspective towards non-equilibrium statistical mechanics is backed by the general theory of nonlinear and complex dynamical systems. The classical-quantum intertwine and the semiclassical dissipative borderline issue (decoherence, "classical out of quantum") are included here. Special emphasis is put on links between the theory of classical and quantum dynamical systems (temporal disorder, dynamical chaos and transport processes) and central problems of non-equilibrium statistical mechanics, e.g. the connection between dynamics and thermodynamics, relaxation towards equilibrium states, and mechanisms capable of driving and then maintaining a physical system far from equilibrium, in a non-equilibrium steady (stationary) state. The notion of an equilibrium state - towards which a system naturally evolves if left undisturbed - is a fundamental concept of equilibrium statistical mechanics. It is taken as a primitive point of reference that allows one to give an unambiguous status to near-equilibrium and far-from-equilibrium systems, together with the dynamical notion of relaxation (decay) towards a prescribed asymptotic invariant measure or probability distribution (properties of ergodicity and mixing are implicit). A related issue is to keep under control the process of driving a physical system away from an initial state of equilibrium and either keeping it in another (non-equilibrium) steady state or allowing it to relax back to the initial data. To this end, various models of the environment (heat bath, reservoir, thermostat, measuring instrument, etc.) and of the environment-system coupling are analyzed. The central theme of the book is the dynamics of dissipation and the various mechanisms responsible for the irreversible behaviour (transport properties) of open systems on the classical and quantum levels of description. A distinguishing feature of these lecture notes is that the microscopic foundations of irreversibility are investigated mainly in terms of "small" systems, where the "system" and/or "environment" may have a finite (and small) number of degrees of freedom and may be bounded. This is to be contrasted with the casual understanding of statistical mechanics, which is regarded as referring to systems with a very large number of degrees of freedom. In fact, it is commonly accepted that the accumulation of effects due to many particles (on the order of the Avogadro number) is required for statistical mechanics reasoning, although such large numbers are not at all sufficient for transport properties. A helpful hint towards this conceptual turnover comes from the observation that for chaotic dynamical systems the random time evolution proves to be compatible with the underlying purely deterministic laws of motion. Chaotic features of classical dynamics already appear in systems with two degrees of freedom, and such systems need to be described in statistical terms if we wish to quantify the dynamics of relaxation towards an invariant ergodic measure. The relaxation towards equilibrium finds a statistical description through an analysis of statistical ensembles. This entails an extension of the range of validity of statistical mechanics to small classical systems.
On the other hand, the dynamics of fluctuations in macroscopic dissipative systems (due to their molecular composition and thermal mobility) may allow such systems to be characterized as chaotic. That motivates attempts to understand the role, in non-equilibrium transport phenomena, of microscopic chaos and various "chaotic hypotheses" - the dynamical systems approach is pushed down to the level of atoms, molecules and complex matter constituents, whose natural substitutes are low-dimensional model subsystems (encompassing as well the mesoscopic "quantum chaos"). Along the way a number of questions are addressed, e.g.: is there a connection - and if so, of what nature - between chaos (the modern theory of dynamical systems) and irreversible thermodynamics; can quantum chaos really explain some peculiar features of quantum transport? The answer in both cases is positive, modulo a careful discrimination between viewing dynamical chaos as a necessary or as a sufficient basis for irreversibility. In those dynamical contexts, another key term, dynamical semigroups, refers to the major technical tools of "dissipative mathematics", modelling irreversible behaviour on the classical and quantum levels of description. Dynamical systems theory and "quantum chaos" research involve both a high level of mathematical sophistication and heavy computer "experimentation". One of the present volume's specific flavors is tutorial access to quite advanced mathematical tools. They gradually penetrate the classical and quantum dynamical semigroup description, culminating in the noncommutative Brillouin zone construction as a prerequisite for understanding transport in aperiodic solids. The lecture notes are structured into chapters to give better insight into the major conceptual streamlines. Chapter I is devoted to a discussion of non-equilibrium steady states and, through the so-called chaotic hypothesis combined with suitable fluctuation theorems, elucidates the role of the Sinai-Ruelle-Bowen distribution in both equilibrium and non-equilibrium statistical physics frameworks (E. G. D. Cohen). Links between dynamics and statistics (Boltzmann versus Tsallis) are also discussed. Fluctuation relations and a survey of deterministic thermostats are given in the context of non-equilibrium steady states of fluids (L. Rondoni). The response of systems driven far from equilibrium is analyzed on the basis of a central assertion about the existence of a statistical representation in terms of an ensemble of dynamical realizations of the driving process. A non-equilibrium work relation is deduced for irreversible processes (C. Jarzynski). The survey of non-equilibrium steady states in the statistical mechanics of classical and quantum systems employs heat-bath models and input from random matrix theory. The quantum heat-bath analysis and the derivation of fluctuation-dissipation theorems are performed by means of the influence-functional technique adapted to solve quantum master equations (D. Kusnezov). Chapter II deals with the issue of relaxation and its dynamical theory in both classical and quantum contexts. The Pollicott-Ruelle resonance background for the exponential decay scenario is discussed for irreversible processes of diffusion in the Lorentz gas and multibaker models (P. Gaspard).
The Pollicott-Ruelle theory reappears as a major inspiration in the survey of the behaviour of ensembles of chaotic systems, with a focus on model systems for which no rigorous results concerning the exponential decay of correlations in time are available (S. Fishman). The observation that non-equilibrium transport processes in simple classical chaotic systems can be described in terms of fractal structures developing in the system phase space links their formation and properties with the entropy production in the course of diffusion processes displaying a low-dimensional deterministic (chaotic) origin (J. R. Dorfman). Chapter III offers an introduction to the theory of dynamical semigroups. Asymptotic properties of Markov operators and Markov semigroups acting on the set of probability densities (the statistical ensemble notion is implicit) are analyzed. Ergodicity, mixing, strong (complete) mixing and sweeping are discussed in the familiar setting of "noise, chaos and fractals" (R. Rudnicki). The next step comprises a passage to quantum dynamical semigroups and completely positive dynamical maps, with the ultimate goal of introducing a consistent framework for the analysis of irreversible phenomena in open quantum systems, where dissipation and decoherence are crucial concepts (R. Alicki). Friction and damping in the classical and quantum mechanics of finite dissipative systems are analyzed by means of Markovian quantum semigroups, with special emphasis on the issue of complete positivity (M. Fannes). Specific two-level model systems of elementary particle physics (kaons) and rudiments of neutron interferometry are employed to elucidate the distinction between positivity and complete positivity (F. Benatti). Quantization of the dynamics of stochastic models related to equilibrium Gibbs states results in dynamical maps which form quantum stochastic dynamical semigroups (W. A. Majewski). Chapter IV addresses diverse but deeply interrelated features of driven chaotic (mesoscopic) classical and quantum systems, their dissipative properties, and notions of quantum irreversibility, entanglement, dephasing and decoherence. A survey of non-perturbative quantum effects for open quantum systems is concluded by outlining the discrepancies between random matrix theory and non-perturbative semiclassical predictions (D. Cohen). As a useful supplement to the subject of bounded open systems, methods of quantum state control in a cavity (coherent versus incoherent dynamics and dissipation) are described for low-dimensional quantum systems (A. Buchleitner). The dynamics of open quantum systems can alternatively be described by means of a non-Markovian stochastic Schrödinger equation, jointly for an open system and its environment, which moves us beyond the Lindblad evolution scenario of Markovian dynamical semigroups. Quantum Brownian motion is considered (W. Strunz). Chapter V enforces a conceptual transition from "small" to "large" systems, with emphasis on the irreversible thermodynamics of quantum transport. Typical features of the statistical mechanics of infinitely extended systems and the dynamical (small) systems approach are described by means of representative examples of relaxation towards asymptotic steady states: a quantum one-dimensional lattice conductor and an open multibaker map (S. Tasaki). Dissipative transport in aperiodic solids is reviewed by invoking methods of noncommutative geometry. The anomalous Drude formula is derived. The occurrence of quantum chaos is discussed together with its main consequences (J. Bellissard). The chapter is concluded by a survey of scaling limits of the N-body Schrödinger quantum dynamics, where classical evolution equations of irreversible statistical mechanics (linear Boltzmann, Hartree, Vlasov) emerge "out of quantum". In particular, a scaling limit of one-body quantum dynamics with impurities (static random potential) and that of quantum dynamics with weakly coupled phonons are shown to yield the linear Boltzmann equation (L. Erdös). Various interrelations between chapters and individual lectures, plus detailed information about the subject-matter coverage of the volume, can be recovered by examining an extensive index.

  17. A statistical theory for sound radiation and reflection from a duct

    NASA Technical Reports Server (NTRS)

    Cho, Y. C.

    1979-01-01

    A new analytical method is introduced for the study of the sound radiation and reflection from the open end of a duct. The sound is thought of as an aggregation of the quasiparticles-phonons. The motion of the latter is described in terms of the statistical distribution, which is derived from the classical wave theory. The results are in good agreement with the solutions obtained using the Wiener-Hopf technique when the latter is applicable, but the new method is simple and provides straightforward physical interpretation of the problem. Furthermore, it is applicable to a problem involving a duct in which modes are difficult to determine or cannot be defined at all, whereas the Wiener-Hopf technique is not.

  18. Spectra of turbulently advected scalars that have small Schmidt number

    NASA Astrophysics Data System (ADS)

    Hill, Reginald J.

    2017-09-01

    Exact statistical equations are derived for turbulent advection of a passive scalar having diffusivity much larger than the kinematic viscosity, i.e., small Schmidt number. The equations contain all terms needed for precise direct numerical simulation (DNS) quantification. In the appropriate limit, the equations reduce to the classical theory for which the scalar spectrum is proportional to the energy spectrum multiplied by k^-4, which, in turn, results in the inertial-diffusive range power law, k^-17/3. The classical theory was derived for the case of isotropic velocity and scalar fields. The exact equations are simplified for less restrictive cases: (1) locally isotropic scalar fluctuations at dissipation scales with no restriction on symmetry of the velocity field, (2) isotropic velocity field with averaging over all wave-vector directions with no restriction on the symmetry of the scalar, motivated by that average being used for DNS, and (3) isotropic velocity field with axisymmetric scalar fluctuations, motivated by the mean-scalar-gradient-source case. The equations are applied to recently published DNSs of passive scalars for the cases of a freely decaying scalar and a mean-scalar-gradient source. New terms in the exact equations are estimated for those cases and are found to be significant; those terms cause the deviations from the classical theory found by the DNS studies. A new formula for the mean-scalar-gradient case explains the variation of the scalar spectra for the DNS of the smallest Schmidt-number cases. Expansion in Legendre polynomials reveals the effect of axisymmetry. Inertial-diffusive-range formulas for both the zero- and second-order Legendre contributions are given. Exact statistical equations reveal what must be quantified using DNS to determine what causes deviations from asymptotic relationships.
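
    The power-law bookkeeping behind the classical result is worth making explicit (χ is the scalar dissipation rate, ε the energy dissipation rate, κ the diffusivity; dimensionless constants omitted):

```latex
E_\theta(k)\;\propto\;\frac{\chi}{\kappa^{3}}\,E(k)\,k^{-4},
\qquad
E(k)\propto \varepsilon^{2/3}k^{-5/3}
\;\;\Longrightarrow\;\;
E_\theta(k)\;\propto\;\frac{\chi\,\varepsilon^{2/3}}{\kappa^{3}}\;k^{-17/3}.
```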

  19. Comment on Pisarenko et al., "Characterization of the Tail of the Distribution of Earthquake Magnitudes by Combining the GEV and GPD Descriptions of Extreme Value Theory"

    NASA Astrophysics Data System (ADS)

    Raschke, Mathias

    2016-02-01

    In this short note, I comment on the research of Pisarenko et al. (Pure Appl. Geophys 171:1599-1624, 2014) regarding extreme value theory and statistics in the case of earthquake magnitudes. The link between the generalized extreme value distribution (GEVD) as an asymptotic model for the block maxima of a random variable and the generalized Pareto distribution (GPD) as a model for the peaks over threshold (POT) of the same random variable is presented more clearly. Inappropriately, Pisarenko et al. (Pure Appl. Geophys 171:1599-1624, 2014) have neglected to note that the approximations by GEVD and GPD work only asymptotically in most cases. This is particularly the case with the truncated exponential distribution (TED), a popular distribution model for earthquake magnitudes. I explain why the classical models and methods of extreme value theory and statistics do not work well for truncated exponential distributions. Consequently, these classical methods should not be used naively for the estimation of the upper bound magnitude and corresponding parameters. Furthermore, I comment on various issues of statistical inference in Pisarenko et al. and propose alternatives. I argue why GPD and GEVD would work for various types of stochastic earthquake processes in time, and not only for the homogeneous (stationary) Poisson process as assumed by Pisarenko et al. (Pure Appl. Geophys 171:1599-1624, 2014). The crucial point for earthquake magnitudes is the poor convergence of their tail distribution to the GPD, and not the earthquake process over time.
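
    A minimal peaks-over-threshold sketch of the GPD fit under discussion, run on a synthetic exponential-tailed catalog (the threshold, sample size, and seed are arbitrary illustrations):

```python
import numpy as np
from scipy.stats import genpareto

# Toy "magnitudes" standing in for a real catalog.
rng = np.random.default_rng(0)
mags = 4.0 + rng.exponential(scale=0.45, size=2000)

u = 5.0                                  # threshold choice is critical
exc = mags[mags > u] - u                 # exceedances over the threshold
shape, loc, scale = genpareto.fit(exc, floc=0.0)
print(f"GPD shape={shape:.3f}, scale={scale:.3f}")
# For an exponential tail the fitted shape should be near 0; a truncated
# exponential tail converges to the GPD only slowly, as argued above.
```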

  20. RANDOMNESS of Numbers DEFINITION(QUERY:WHAT? V HOW?) ONLY Via MAXWELL-BOLTZMANN CLASSICAL-Statistics(MBCS) Hot-Plasma VS. Digits-Clumping Log-Law NON-Randomness Inversion ONLY BOSE-EINSTEIN QUANTUM-Statistics(BEQS) .

    NASA Astrophysics Data System (ADS)

    Siegel, Z.; Siegel, Edward Carl-Ludwig

    2011-03-01

    RANDOMNESS of Numbers cognitive-semantics DEFINITION VIA Cognition QUERY: WHAT???, NOT HOW?) VS. computer-``science" mindLESS number-crunching (Harrel-Sipser-...) algorithmics Goldreich "PSEUDO-randomness"[Not.AMS(02)] mea-culpa is ONLY via MAXWELL-BOLTZMANN CLASSICAL-STATISTICS(NOT FDQS!!!) "hot-plasma" REPULSION VERSUS Newcomb(1881)-Weyl(1914;1916)-Benford(1938) "NeWBe" logarithmic-law digit-CLUMPING/ CLUSTERING NON-Randomness simple Siegel[AMS Joint.Mtg.(02)-Abs. # 973-60-124] algebraic-inversion to THE QUANTUM and ONLY BEQS preferentially SEQUENTIALLY lower-DIGITS CLUMPING/CLUSTERING with d = 0 BEC, is ONLY VIA Siegel-Baez FUZZYICS=CATEGORYICS (SON OF TRIZ)/"Category-Semantics"(C-S), latter intersection/union of Lawvere(1964)-Siegel(1964)] category-theory (matrix: MORPHISMS V FUNCTORS) "+" cognitive-semantics'' (matrix: ANTONYMS V SYNONYMS) yields Siegel-Baez FUZZYICS=CATEGORYICS/C-S tabular list-format matrix truth-table analytics: MBCS RANDOMNESS TRUTH/EMET!!!

  1. Diagrammar in classical scalar field theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cattaruzza, E., E-mail: Enrico.Cattaruzza@gmail.com; Gozzi, E., E-mail: gozzi@ts.infn.it; INFN, Sezione di Trieste

    2011-09-15

    In this paper we analyze perturbatively a gφ⁴ classical field theory with and without temperature. In order to do that, we make use of a path-integral approach developed some time ago for classical theories. It turns out that the diagrams appearing at the classical level are many more than at the quantum level, due to the presence of extra auxiliary fields in the classical formalism. We shall show that a universal supersymmetry present in the classical path-integral mentioned above is responsible for the cancellation of various diagrams. The same supersymmetry allows the introduction of super-fields and super-diagrams which considerably simplify the calculations and make the classical perturbative calculations almost 'identical' formally to the quantum ones. Using the super-diagrams technique, we develop the classical perturbation theory up to third order. We conclude the paper with a perturbative check of the fluctuation-dissipation theorem. Highlights: (1) We provide the Feynman diagrams of perturbation theory for a classical field theory. (2) We give a super-formalism which links the quantum diagrams to the classical ones. (3) We check perturbatively the fluctuation-dissipation theorem.

  2. PDF-based heterogeneous multiscale filtration model.

    PubMed

    Gong, Jian; Rutland, Christopher J

    2015-04-21

    Motivated by modeling of gasoline particulate filters (GPFs), a probability density function (PDF) based heterogeneous multiscale filtration (HMF) model is developed to calculate filtration efficiency of clean particulate filters. A new methodology based on statistical theory and classic filtration theory is developed in the HMF model. Based on the analysis of experimental porosimetry data, a pore size probability density function is introduced to represent heterogeneity and multiscale characteristics of the porous wall. The filtration efficiency of a filter can be calculated as the sum of the contributions of individual collectors. The resulting HMF model overcomes the limitations of classic mean filtration models which rely on tuning of the mean collector size. Sensitivity analysis shows that the HMF model recovers the classical mean model when the pore size variance is very small. The HMF model is validated by fundamental filtration experimental data from different scales of filter samples. The model shows a good agreement with experimental data at various operating conditions. The effects of the microstructure of filters on filtration efficiency as well as the most penetrating particle size are correctly predicted by the model.
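
    The PDF-weighting idea can be sketched as follows; the lognormal pore-size PDF and the single-collector efficiency form below are placeholders chosen for illustration, not the paper's actual correlations.

```python
import numpy as np

def collector_efficiency(d_pore_um, d_particle_um):
    # Hypothetical monotone form: smaller pores capture particles better.
    return 1.0 - np.exp(-5.0 * d_particle_um / d_pore_um)

def overall_efficiency(d_particle_um, mu=np.log(15.0), sigma=0.5, n=10000):
    """Pore-size-PDF-weighted mean of single-collector efficiencies,
    estimated here by Monte Carlo sampling of a lognormal pore PDF."""
    rng = np.random.default_rng(1)
    pores = rng.lognormal(mean=mu, sigma=sigma, size=n)  # pore sizes, um
    return collector_efficiency(pores, d_particle_um).mean()

for dp in (0.05, 0.1, 0.5):          # particle diameters in um
    print(dp, round(overall_efficiency(dp), 3))
```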

  3. Universality in chaos: Lyapunov spectrum and random matrix theory.

    PubMed

    Hanada, Masanori; Shimada, Hidehiko; Tezuka, Masaki

    2018-02-01

    We propose the existence of a new universality in classical chaotic systems when the number of degrees of freedom is large: the statistical property of the Lyapunov spectrum is described by random matrix theory. We demonstrate it by studying the finite-time Lyapunov exponents of the matrix model of a stringy black hole and the mass-deformed models. The massless limit, which has a dual string theory interpretation, is special in that the universal behavior can be seen already at t=0, while in other cases it sets in at late time. The same pattern is demonstrated also in the product of random matrices.
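
    The "product of random matrices" check lends itself to a compact sketch: accumulate the finite-time Lyapunov spectrum by repeated QR re-orthonormalization, then compute nearest-neighbor spacing ratios, the unfolding-free RMT diagnostic. The ensemble, matrix size, and time horizon below are arbitrary choices; the GOE reference value is quoted only for comparison.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 20, 400
Q = np.linalg.qr(rng.standard_normal((N, N)))[0]  # orthonormal frame
log_r = np.zeros(N)
for _ in range(T):
    M = rng.standard_normal((N, N)) / np.sqrt(N)  # random step
    Q, R = np.linalg.qr(M @ Q)
    log_r += np.log(np.abs(np.diag(R)))           # stretching factors

lyap = np.sort(log_r / T)            # finite-time Lyapunov exponents
s = np.diff(lyap)                    # nearest-neighbor spacings
ratios = np.minimum(s[1:] / s[:-1], s[:-1] / s[1:])
print(f"mean spacing ratio: {ratios.mean():.3f}")  # GOE reference ~0.536
```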

  4. Universality in chaos: Lyapunov spectrum and random matrix theory

    NASA Astrophysics Data System (ADS)

    Hanada, Masanori; Shimada, Hidehiko; Tezuka, Masaki

    2018-02-01

    We propose the existence of a new universality in classical chaotic systems when the number of degrees of freedom is large: the statistical property of the Lyapunov spectrum is described by random matrix theory. We demonstrate it by studying the finite-time Lyapunov exponents of the matrix model of a stringy black hole and the mass-deformed models. The massless limit, which has a dual string theory interpretation, is special in that the universal behavior can be seen already at t = 0, while in other cases it sets in at late time. The same pattern is demonstrated also in the product of random matrices.

  5. On the relativistic micro-canonical ensemble and relativistic kinetic theory for N relativistic particles in inertial and non-inertial rest frames

    NASA Astrophysics Data System (ADS)

    Alba, David; Crater, Horace W.; Lusanna, Luca

    2015-03-01

    A new formulation of relativistic classical mechanics allows a reconsideration of old unsolved problems in relativistic kinetic theory and in relativistic statistical mechanics. In particular a definition of the relativistic micro-canonical partition function is given strictly in terms of the Poincaré generators of an interacting N-particle system both in the inertial and non-inertial rest frames. The non-relativistic limit allows a definition of both the inertial and non-inertial micro-canonical ensemble in terms of the Galilei generators.

  6. Concepts and their dynamics: a quantum-theoretic modeling of human thought.

    PubMed

    Aerts, Diederik; Gabora, Liane; Sozzo, Sandro

    2013-10-01

    We analyze different aspects of our quantum modeling approach of human concepts and, more specifically, focus on the quantum effects of contextuality, interference, entanglement, and emergence, illustrating how each of them makes its appearance in specific situations of the dynamics of human concepts and their combinations. We point out the relation of our approach, which is based on an ontology of a concept as an entity in a state changing under the influence of a context, with the main traditional concept theories, that is, prototype theory, exemplar theory, and theory theory. We ponder the question of why quantum theory performs so well in its modeling of human concepts, and we shed light on this question by analyzing the role of complex amplitudes, showing how they allow one to describe interference in the statistics of measurement outcomes, while in the traditional theories statistics of outcomes originates in classical probability weights, without the possibility of interference. The relevance of complex numbers, the appearance of entanglement, and the role of Fock space in explaining contextual emergence, all as unique features of the quantum modeling, are explicitly revealed in this article by analyzing human concepts and their dynamics. © 2013 Cognitive Science Society, Inc.

  7. Quantum-Like Representation of Non-Bayesian Inference

    NASA Astrophysics Data System (ADS)

    Asano, M.; Basieva, I.; Khrennikov, A.; Ohya, M.; Tanaka, Y.

    2013-01-01

    This research is related to the problem of "irrational decision making or inference" that has been discussed in cognitive psychology. There are some experimental studies whose statistical data cannot be described by classical probability theory. The process of decision making generating these data cannot be reduced to classical Bayesian inference. For this problem, a number of quantum-like cognitive models of decision making have been proposed. Our previous work represented in a natural way the classical Bayesian inference in the framework of quantum mechanics. By using this representation, in this paper, we try to discuss the non-Bayesian (irrational) inference that is biased by effects like quantum interference. Further, we describe a "psychological factor" disturbing "rationality" as an "environment" correlating with the "main system" of usual Bayesian inference.

  8. Nonclassical acoustics

    NASA Technical Reports Server (NTRS)

    Kentzer, C. P.

    1976-01-01

    A statistical approach to sound propagation is considered in situations where, due to the presence of large gradients of properties of the medium, the classical (deterministic) treatment of wave motion is inadequate. Mathematical methods for wave motions not restricted to small wavelengths (analogous to known methods of quantum mechanics) are used to formulate a wave theory of sound in nonuniform flows. Nonlinear transport equations for field probabilities are derived for the limiting case of noninteracting sound waves and it is postulated that such transport equations, appropriately generalized, may be used to predict the statistical behavior of sound in arbitrary flows.

  9. Interference in the classical probabilistic model and its representation in complex Hilbert space

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei Yu.

    2005-10-01

    The notion of a context (complex of physical conditions, that is to say, specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are present already in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: “first context and only then probability”. We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as Hilbert space projections of a realistic dynamics in a “prespace”. The basic condition for representing the prespace dynamics is the law of statistical conservation of energy (conservation of probabilities). In general the Hilbert space projection of the “prespace” dynamics can be nonlinear and even irreversible (but it is always unitary). Methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems. In particular, it was recently found in some psychological experiments.

  10. Thermal quantum time-correlation functions from classical-like dynamics

    NASA Astrophysics Data System (ADS)

    Hele, Timothy J. H.

    2017-07-01

    Thermal quantum time-correlation functions are of fundamental importance in quantum dynamics, allowing experimentally measurable properties such as reaction rates, diffusion constants and vibrational spectra to be computed from first principles. Since the exact quantum solution scales exponentially with system size, there has been considerable effort in formulating reliable linear-scaling methods involving exact quantum statistics and approximate quantum dynamics modelled with classical-like trajectories. Here, we review recent progress in the field with the development of methods including centroid molecular dynamics, ring polymer molecular dynamics (RPMD) and thermostatted RPMD (TRPMD). We show how these methods have recently been obtained from 'Matsubara dynamics', a form of semiclassical dynamics which conserves the quantum Boltzmann distribution. We also apply the Matsubara formalism to reaction rate theory, rederiving t → 0+ quantum transition-state theory (QTST) and showing that Matsubara-TST, like RPMD-TST, is equivalent to QTST. We end by surveying areas for future progress.
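
    For reference, the exact quantity these classical-like methods approximate is the standard thermal quantum time-correlation function,

        c_{AB}(t) = \frac{1}{Z}\,\mathrm{Tr}\!\left[e^{-\beta \hat{H}} \hat{A}\, e^{i\hat{H}t/\hbar} \hat{B}\, e^{-i\hat{H}t/\hbar}\right], \qquad Z = \mathrm{Tr}\, e^{-\beta \hat{H}},

    with the trajectory-based methods retaining the exact quantum Boltzmann statistics while approximating the real-time evolution classically.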

  11. On the Relationship Between Classical Test Theory and Item Response Theory: From One to the Other and Back.

    PubMed

    Raykov, Tenko; Marcoulides, George A

    2016-04-01

    The frequently neglected and often misunderstood relationship between classical test theory and item response theory is discussed for the unidimensional case with binary measures and no guessing. It is pointed out that popular item response models can be directly obtained from classical test theory-based models by accounting for the discrete nature of the observed items. Two distinct observational equivalence approaches are outlined that render the item response models from corresponding classical test theory-based models, and can each be used to obtain the former from the latter models. Similarly, classical test theory models can be furnished using the reverse application of either of those approaches from corresponding item response models.
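
    For concreteness, the binary, no-guessing setting discussed here is typically the two-parameter logistic item response model,

        P(Y_{ij} = 1 \mid \theta_i) = \frac{1}{1 + \exp[-a_j(\theta_i - b_j)]},

    where \theta_i is the latent trait and a_j, b_j are the discrimination and difficulty of item j; the classical test theory counterpart posits a continuous latent response that is dichotomized at a threshold.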

  12. Development and validation of the irritable bowel syndrome scale under the system of quality of life instruments for chronic diseases QLICD-IBS: combinations of classical test theory and generalizability theory.

    PubMed

    Lei, Pingguang; Lei, Guanghe; Tian, Jianjun; Zhou, Zengfen; Zhao, Miao; Wan, Chonghua

    2014-10-01

    This paper aims to develop the irritable bowel syndrome (IBS) scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-IBS) by the modular approach and to validate it by both classical test theory and generalizability theory. The QLICD-IBS was developed based on programmed decision procedures with multiple nominal and focus group discussions, in-depth interviews, and quantitative statistical procedures. One hundred twelve inpatients with IBS provided the data, with QOL measured three times before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability, and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t tests, and the G studies and D studies of generalizability theory analysis. Multi-trait scaling analysis, correlation, and factor analyses confirmed good construct validity and criterion-related validity when using SF-36 as a criterion. Test-retest reliability coefficients (Pearson r and intra-class correlation (ICC)) for the overall score and all domains were higher than 0.80; the internal consistency α for all domains at the two measurements was higher than 0.70 except for the social domain (0.55 and 0.67, respectively). The overall score and the scores for all domains/facets showed statistically significant changes after treatment, with moderate or higher effect sizes (standardized response mean, SRM) ranging from 0.72 to 1.02 at the domain level. G coefficients and the index of dependability (Φ coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-IBS has good validity, reliability and responsiveness, together with several distinctive features, and can be used as a quality of life instrument for patients with IBS.
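
    Two of the reported psychometric indices are easy to state computationally. The sketch below (toy data and variable names are assumptions, not the QLICD-IBS data) computes Cronbach's α (internal consistency) and the standardized response mean (responsiveness):

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        def standardized_response_mean(pre, post):
            """SRM = mean change / SD of change, a distribution-based effect size."""
            change = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
            return change.mean() / change.std(ddof=1)

        # Toy illustration with random data (purely hypothetical).
        rng = np.random.default_rng(1)
        scores = rng.integers(1, 6, size=(112, 10))   # 112 patients, 10 items
        pre = scores.sum(axis=1)
        post = pre + rng.normal(3, 4, 112)            # simulated post-treatment change
        print(cronbach_alpha(scores), standardized_response_mean(pre, post))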

  13. Measuring uncertainty by extracting fuzzy rules using rough sets and extracting fuzzy rules under uncertainty and measuring definability using rough sets

    NASA Technical Reports Server (NTRS)

    Worm, Jeffrey A.; Culas, Donald E.

    1991-01-01

    Computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. This paper examines the concepts of statistical analysis, Dempster-Shafer theory, rough set theory, and fuzzy set theory to address this problem. The fundamentals of these theories are combined to provide a possible optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how strongly we believe each rule is constructed. From this, the idea of how far a fuzzy diagnosis is definable in terms of its fuzzy attributes is studied.
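
    A minimal sketch of the rough-set step described here, under assumed toy data: objects indiscernible on the chosen attributes form equivalence classes; a target set's lower approximation supports certain rules, and its upper approximation supports possible rules:

        from collections import defaultdict

        def approximations(objects, attributes, target):
            """Lower/upper approximations of `target` under indiscernibility
            on `attributes`. `objects` maps object id -> attribute dict."""
            classes = defaultdict(set)
            for obj, attrs in objects.items():
                key = tuple(attrs[a] for a in attributes)
                classes[key].add(obj)
            lower, upper = set(), set()
            for eq in classes.values():
                if eq <= target:      # entirely inside: supports a certain rule
                    lower |= eq
                if eq & target:       # overlaps: supports a possible rule
                    upper |= eq
            return lower, upper

        # Hypothetical decision table: symptoms -> diagnosis.
        objs = {1: {"fever": "high", "cough": "yes"},
                2: {"fever": "high", "cough": "yes"},
                3: {"fever": "low",  "cough": "yes"},
                4: {"fever": "low",  "cough": "yes"}}
        flu = {1, 2, 3}
        lower, upper = approximations(objs, ["fever", "cough"], flu)
        print(lower, upper)   # lower = {1, 2}; upper = {1, 2, 3, 4}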

  14. [Speculations regarding electric conductivity, the development of an electron theory of metals and the beginning of solid body physics].

    PubMed

    Wiederkehr, Karl Heinrich

    2010-01-01

    The development of an electron theory of metals is closely connected with early speculation, in the period before Maxwell (W. Weber and others), regarding electrical conduction in metals. These speculations contrasted with Faraday's view of an all-embracing molecular dielectric polarisation and a subsequent passage of charges in metallic conductors. According to the empirical Wiedemann-Franz-Lorenz law, the conduction of electricity and of heat had to be treated on a common footing. The classical electron theory of metals (Riecke, Drude, H. A. Lorentz) reached a dead end on account of problems concerning the specific heat capacity. Sommerfeld, by means of quantum theory and Fermi statistics, found the solution.
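
    The empirical law referred to states that the ratio of thermal to electrical conductivity is the same for all metals at a given temperature; in the form later explained by Sommerfeld's quantum statistical treatment,

        \frac{\kappa}{\sigma T} = L = \frac{\pi^2}{3}\left(\frac{k_B}{e}\right)^2 \approx 2.44 \times 10^{-8}\ \mathrm{W\,\Omega\,K^{-2}}.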

  15. Dynamically biased statistical model for the ortho/para conversion in the H2 + H3+ → H3+ + H2 reaction.

    PubMed

    Gómez-Carrasco, Susana; González-Sánchez, Lola; Aguado, Alfredo; Sanz-Sanz, Cristina; Zanchet, Alexandre; Roncero, Octavio

    2012-09-07

    In this work we present a dynamically biased statistical model to describe the evolution of the title reaction from a statistical to a more direct mechanism, using quasi-classical trajectories (QCT). The method is based on the one previously proposed by Park and Light [J. Chem. Phys. 126, 044305 (2007)]. A recent global potential energy surface is used here to calculate the capture probabilities, instead of the long-range ion-induced dipole interactions. The dynamical constraints are introduced by considering a scrambling matrix which depends on energy and determines the probabilities of the identity/hop/exchange mechanisms. These probabilities are calculated using QCT. It is found that the high zero-point energy of the fragments is transferred to the rest of the degrees of freedom, which shortens the lifetime of the H5+ complexes and, as a consequence, reduces the proportion of the exchange mechanism. The zero-point energy (ZPE) is not properly described in quasi-classical trajectory calculations, and an approximation is made in which the initial ZPE of the reactants is reduced in the QCT calculations to obtain a new ZPE-biased scrambling matrix. This reduction of the ZPE is explained by the need to correct the purely classical level number of the H5+ complex, as done in classical simulations of unimolecular processes and to obtain equivalent quantum and classical rate constants in Rice-Ramsperger-Kassel-Marcus theory. This matrix allows us to obtain a ratio of hop/exchange mechanisms, α(T), in rather good agreement with recent experimental results by Crabtree et al. [J. Chem. Phys. 134, 194311 (2011)] at room temperature. At lower temperatures, however, the present simulations predict too high ratios because the biased scrambling matrix is not statistical enough. This demonstrates the importance of applying quantum methods to simulate this reaction at the low temperatures of astrophysical interest.

  16. Dynamically biased statistical model for the ortho/para conversion in the H2+H3+ --> H3++ H2 reaction

    NASA Astrophysics Data System (ADS)

    Gómez-Carrasco, Susana; González-Sánchez, Lola; Aguado, Alfredo; Sanz-Sanz, Cristina; Zanchet, Alexandre; Roncero, Octavio

    2012-09-01

    In this work we present a dynamically biased statistical model to describe the evolution of the title reaction from a statistical to a more direct mechanism, using quasi-classical trajectories (QCT). The method is based on the one previously proposed by Park and Light [J. Chem. Phys. 126, 044305 (2007), 10.1063/1.2430711]. A recent global potential energy surface is used here to calculate the capture probabilities, instead of the long-range ion-induced dipole interactions. The dynamical constraints are introduced by considering a scrambling matrix which depends on energy and determines the probabilities of the identity/hop/exchange mechanisms. These probabilities are calculated using QCT. It is found that the high zero-point energy of the fragments is transferred to the rest of the degrees of freedom, which shortens the lifetime of the H_5^+ complexes and, as a consequence, reduces the proportion of the exchange mechanism. The zero-point energy (ZPE) is not properly described in quasi-classical trajectory calculations, and an approximation is made in which the initial ZPE of the reactants is reduced in the QCT calculations to obtain a new ZPE-biased scrambling matrix. This reduction of the ZPE is explained by the need to correct the purely classical level number of the H_5^+ complex, as done in classical simulations of unimolecular processes and to obtain equivalent quantum and classical rate constants in Rice-Ramsperger-Kassel-Marcus theory. This matrix allows us to obtain a ratio of hop/exchange mechanisms, α(T), in rather good agreement with recent experimental results by Crabtree et al. [J. Chem. Phys. 134, 194311 (2011), 10.1063/1.3587246] at room temperature. At lower temperatures, however, the present simulations predict too high ratios because the biased scrambling matrix is not statistical enough. This demonstrates the importance of applying quantum methods to simulate this reaction at the low temperatures of astrophysical interest.

  17. The Effect of Substituting p for alpha on the Unconditional and Conditional Powers of a Null Hypothesis Test.

    ERIC Educational Resources Information Center

    Martuza, Victor R.; Engel, John D.

    Results from classical power analysis (Brewer, 1972) suggest that a researcher should not set α = p (when p is less than α) in a posteriori fashion when a study yields statistically significant results, because of a resulting decrease in power. The purpose of the present report is to use Bayesian theory in examining the validity of this…

  18. Transport processes in magnetically confined plasmas in the nonlinear regime.

    PubMed

    Sonnino, Giorgio

    2006-06-01

    A field theory approach to transport phenomena in magnetically confined plasmas is presented. The thermodynamic field theory (TFT), previously developed for treating generic thermodynamic systems out of equilibrium, is applied here to plasma physics. Transport phenomena are treated as the effect of the field linking the thermodynamic forces with their conjugate flows, combined with statistical mechanics. In particular, the Classical and the Pfirsch-Schlüter regimes are analyzed by solving the thermodynamic field equations of the TFT in the weak-field approximation. We find that the TFT does not correct the expressions of the ionic heat fluxes evaluated by the neoclassical theory in these two regimes. On the other hand, the fluxes of matter and of electronic energy (heat flow) are further enhanced in the nonlinear Classical and Pfirsch-Schlüter regimes. These results seem to be in line with the experimental observations. The complete set of electronic and ionic transport equations in the nonlinear Banana regime is also reported. A paper comparing our theoretical results with the experimental observations in the JET machine is currently in preparation.

  19. Energy flow in non-equilibrium conformal field theory

    NASA Astrophysics Data System (ADS)

    Bernard, Denis; Doyon, Benjamin

    2012-09-01

    We study the energy current and its fluctuations in quantum gapless 1d systems far from equilibrium modeled by conformal field theory, where two separated halves are prepared at distinct temperatures and glued together at a point contact. We prove that these systems converge towards steady states, and give a general description of such non-equilibrium steady states in terms of quantum field theory data. We compute the large deviation function, also called the full counting statistics, of energy transfer through the contact. These are universal and satisfy fluctuation relations. We provide a simple representation of these quantum fluctuations in terms of classical Poisson processes whose intensities are proportional to Boltzmann weights.
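
    For a conformal field theory of central charge c, the steady-state energy current between reservoirs at temperatures T_l and T_r obtained in this line of work takes the universal form

        \langle J \rangle = \frac{\pi c\, k_B^2}{12 \hbar}\left(T_l^2 - T_r^2\right).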

  20. Scaling properties of the two-dimensional randomly stirred Navier-Stokes equation.

    PubMed

    Mazzino, Andrea; Muratore-Ginanneschi, Paolo; Musacchio, Stefano

    2007-10-05

    We inquire into the scaling properties of the 2D Navier-Stokes equation sustained by a force field with Gaussian statistics, white noise in time, and with a power-law correlation in momentum space of degree 2 − 2ε. This is at variance with the setting usually assumed to derive Kraichnan's classical theory. We contrast accurate numerical experiments with the different predictions provided for the small-ε regime by Kraichnan's double cascade theory and by renormalization group analysis. We give clear evidence that for all ε, Kraichnan's theory is consistent with the observed phenomenology. Our results call for a revision in the renormalization group analysis of (2D) fully developed turbulence.

  1. Calculations of the surface tensions of liquid metals

    NASA Technical Reports Server (NTRS)

    Stroud, D. G.

    1981-01-01

    The understanding of the surface tension of liquid metals and alloys from as close to first principles as possible is discussed. Two ingredients are combined in these calculations: the electron theory of metals, and the classical theory of liquids as worked out within the framework of statistical mechanics. The result is a new theory of surface tensions and surface density profiles based purely on knowledge of the bulk properties of the coexisting liquid and vapor phases. The method is found to work well for the pure liquid metals on which it was tested; the work is extended to mixtures of liquid metals, to interfaces between immiscible liquid metals, and to the temperature derivative of the surface tension.

  2. On the Relationship between Classical Test Theory and Item Response Theory: From One to the Other and Back

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2016-01-01

    The frequently neglected and often misunderstood relationship between classical test theory and item response theory is discussed for the unidimensional case with binary measures and no guessing. It is pointed out that popular item response models can be directly obtained from classical test theory-based models by accounting for the discrete…

  3. Fundamental theories of waves and particles formulated without classical mass

    NASA Astrophysics Data System (ADS)

    Fry, J. L.; Musielak, Z. E.

    2010-12-01

    Quantum and classical mechanics are two conceptually and mathematically different theories of physics, and yet they use the same concept of classical mass that was originally introduced by Newton in his formulation of the laws of dynamics. In this paper, physical consequences of using the classical mass in both theories are explored, and a novel approach that allows formulating fundamental (Galilean invariant) theories of waves and particles without formally introducing the classical mass is presented. In this new formulation, the theories depend only on one common parameter called 'wave mass', which is deduced from experiments for selected elementary particles and for the classical mass of one kilogram. It is shown that quantum theory with the wave mass is independent of the Planck constant and that higher accuracy in calculations can be attained with such a theory. Natural units in connection with the presented approach are also discussed, and justification beyond dimensional analysis is given for the particular choice of such units.

  4. The contrasting roles of Planck's constant in classical and quantum theories

    NASA Astrophysics Data System (ADS)

    Boyer, Timothy H.

    2018-04-01

    We trace the historical appearance of Planck's constant in physics, and we note that initially the constant did not appear in connection with quanta. Furthermore, we emphasize that Planck's constant can appear in both classical and quantum theories. In both theories, Planck's constant sets the scale of atomic phenomena. However, the roles played in the foundations of the theories are sharply different. In quantum theory, Planck's constant is crucial to the structure of the theory. On the other hand, in classical electrodynamics, Planck's constant is optional, since it appears only as the scale factor for the (homogeneous) source-free contribution to the general solution of Maxwell's equations. Since classical electrodynamics can be solved while taking the homogeneous source-free contribution in the solution as zero or non-zero, there are naturally two different theories of classical electrodynamics, one in which Planck's constant is taken as zero and one where it is taken as non-zero. The textbooks of classical electromagnetism present only the version in which Planck's constant is taken to vanish.

  5. Taking-On: A Grounded Theory of Addressing Barriers in Task Completion

    ERIC Educational Resources Information Center

    Austinson, Julie Ann

    2011-01-01

    This study of taking-on was conducted using classical grounded theory methodology (Glaser, 1978, 1992, 1998, 2001, 2005; Glaser & Strauss, 1967). Classical grounded theory is inductive, empirical, and naturalistic; it does not utilize manipulation or constrained time frames. Classical grounded theory is a systemic research method used to generate…

  6. Measuring uncertainty by extracting fuzzy rules using rough sets

    NASA Technical Reports Server (NTRS)

    Worm, Jeffrey A.

    1991-01-01

    Despite the advancements in the computer industry in the past 30 years, there is still one major deficiency: computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. The methods of statistical analysis, Dempster-Shafer theory, rough set theory, and fuzzy set theory are examined for solving this problem. The fundamentals of these theories are combined to provide a possible optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how strongly these rules are believed is constructed. From this, the idea of how far a fuzzy diagnosis is definable in terms of a set of fuzzy attributes is studied.

  7. On the classic and modern theories of matching.

    PubMed

    McDowell, J J

    2005-07-01

    Classic matching theory, which is based on Herrnstein's (1961) original matching equation and includes the well-known quantitative law of effect, is almost certainly false. The theory is logically inconsistent with known experimental findings, and experiments have shown that its central constant-k assumption is not tenable. Modern matching theory, which is based on the power function version of the original matching equation, remains tenable, although it has not been discussed or studied extensively. The modern theory is logically consistent with known experimental findings, it predicts the fact and details of the violation of the classic theory's constant-k assumption, and it accurately describes at least some data that are inconsistent with the classic theory.
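
    For reference, the equations at issue are Herrnstein's original matching relation, its single-alternative form (the quantitative law of effect, with the constant-k assumption), and the power-function version underlying the modern theory:

        \frac{B_1}{B_1 + B_2} = \frac{r_1}{r_1 + r_2}, \qquad B = \frac{k r}{r + r_e}, \qquad \frac{B_1}{B_2} = b \left(\frac{r_1}{r_2}\right)^{a},

    where B denotes response rate, r reinforcement rate, and b and a are bias and sensitivity parameters.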

  8. PADÉ APPROXIMANTS FOR THE EQUATION OF STATE FOR RELATIVISTIC HYDRODYNAMICS BY KINETIC THEORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsai, Shang-Hsi; Yang, Jaw-Yen, E-mail: shanghsi@gmail.com

    2015-07-20

    A two-point Padé approximant (TPPA) algorithm is developed for the equation of state (EOS) for relativistic hydrodynamic systems, which are described by the classical Maxwell–Boltzmann statistics and the semiclassical Fermi–Dirac statistics with complete degeneracy. The underlying rational function is determined by the ratios of the macroscopic state variables with various orders of accuracy taken at the extreme relativistic limits. The nonunique TPPAs are validated by Taub's inequality for the consistency of the kinetic theory and the special theory of relativity. The proposed TPPA is utilized in deriving the EOS of the dilute gas and in calculating the specific heat capacity, the adiabatic index function, and the isentropic sound speed of the ideal gas. Some general guidelines are provided for the application of an arbitrary accuracy requirement. The superiority of the proposed TPPA is manifested in manipulating the constituent polynomials of the approximants, which avoids the arithmetic complexity of struggling with the modified Bessel functions and the hyperbolic trigonometric functions arising from the relativistic kinetic theory.
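
    Schematically (a generic definition, not the paper's specific construction), a two-point Padé approximant is a rational function

        R_{[L/M]}(x) = \frac{\sum_{i=0}^{L} p_i x^i}{1 + \sum_{j=1}^{M} q_j x^j}

    whose L + M + 1 coefficients are fixed by matching a prescribed number of series terms of the target function at two expansion points, here the non-relativistic and ultra-relativistic limits.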

  9. Entropy, a Unifying Concept: from Physics to Cognitive Psychology

    NASA Astrophysics Data System (ADS)

    Tsallis, Constantino; Tsallis, Alexandra C.

    Together with classical, relativistic and quantum mechanics, as well as Maxwell electromagnetism, Boltzmann-Gibbs (BG) statistical mechanics constitutes one of the main theories of contemporary physics. This theory primarily concerns inanimate matter, and at its generic foundation we find nonlinear dynamical systems satisfying the ergodic hypothesis. This hypothesis is typically guaranteed for systems whose maximal Lyapunov exponent is positive. What happens when this crucial quantity is zero instead? We suggest here that, in what concerns thermostatistical properties, we typically enter what in some sense may be considered a new world, the world of living systems. The need emerges, at least for many systems, to generalize the basis of BG statistical mechanics, namely the Boltzmann-Gibbs-von Neumann-Shannon entropic functional form, which connects the macroscopic thermodynamic quantity, entropy, with the occurrence probabilities of microscopic configurations. This unifying approach is briefly reviewed here, and its widespread applications, from physics to cognitive psychology, are surveyed. Special attention is dedicated to the learning/memorizing process in humans and computers. The present observations might be related to the gestalt theory of visual perception and actor-network theory.
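
    The functional form in question, and the one-parameter generalization pursued in this program (which recovers it as q → 1), are

        S_{BG} = -k \sum_i p_i \ln p_i, \qquad S_q = k\, \frac{1 - \sum_i p_i^{\,q}}{q - 1}.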

  10. Analyzing force concept inventory with item response theory

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Bao, Lei

    2010-10-01

    Item response theory is a popular assessment method used in education. It rests on the assumption of a probability framework that relates students' innate ability and their performance on test questions. Item response theory transforms students' raw test scores into a scaled proficiency score, which can be used to compare results obtained with different test questions. The scaled score also addresses the issues of ceiling effects and guessing, which commonly exist in quantitative assessment. We used item response theory to analyze the force concept inventory (FCI). Our results show that item response theory can be useful for analyzing physics concept surveys such as the FCI and produces results about the individual questions and student performance that are beyond the capability of classical statistics. The theory yields detailed measurement parameters regarding the difficulty, discrimination features, and probability of correct guess for each of the FCI questions.
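
    The three measurement parameters listed at the end correspond to the three-parameter logistic model commonly used in such analyses,

        P_j(\theta) = c_j + \frac{1 - c_j}{1 + \exp[-a_j(\theta - b_j)]},

    where, for question j, b_j is the difficulty, a_j the discrimination, and c_j the probability of a correct guess.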

  11. Classical Field Theory and the Stress-Energy Tensor

    NASA Astrophysics Data System (ADS)

    Swanson, Mark S.

    2015-09-01

    This book is a concise introduction to the key concepts of classical field theory for beginning graduate students and advanced undergraduate students who wish to study the unifying structures and physical insights provided by classical field theory without dealing with the additional complication of quantization. In that regard, there are many important aspects of field theory that can be understood without quantizing the fields. These include the action formulation, Galilean and relativistic invariance, traveling and standing waves, spin angular momentum, gauge invariance, subsidiary conditions, fluctuations, spinor and vector fields, conservation laws and symmetries, and the Higgs mechanism, all of which are often treated briefly in a course on quantum field theory. The variational form of classical mechanics and continuum field theory are both developed in the time-honored graduate level text by Goldstein et al (2001). An introduction to classical field theory from a somewhat different perspective is available in Soper (2008). Basic classical field theory is often treated in books on quantum field theory. Two excellent texts where this is done are Greiner and Reinhardt (1996) and Peskin and Schroeder (1995). Green's function techniques are presented in Arfken et al (2013).

  12. Wetting of heterogeneous substrates. A classical density-functional-theory approach

    NASA Astrophysics Data System (ADS)

    Yatsyshin, Peter; Parry, Andrew O.; Rascón, Carlos; Duran-Olivencia, Miguel A.; Kalliadasis, Serafim

    2017-11-01

    Wetting is the nucleation of a third phase (liquid) on the interface between two different phases (solid and gas). In many experimentally accessible cases of wetting, the interplay between the substrate structure and the fluid-fluid and fluid-substrate intermolecular interactions leads to the appearance of a whole "zoo" of exciting interface phase transitions, associated with the formation of nano-droplets/bubbles and thin films. Practical applications of wetting at small scales are numerous and include the design of lab-on-a-chip devices and superhydrophobic surfaces. In this talk, we will use a fully microscopic approach to explore the phase space of a planar wall decorated with patches of different hydrophobicity, and demonstrate the highly non-trivial behaviour of the liquid-gas interface near the substrate. We will present fluid density profiles, adsorption isotherms and wetting phase diagrams. Our analysis is based on a formulation of statistical mechanics commonly known as classical density-functional theory. It provides a computationally friendly and rigorous framework, suitable for probing the small-scale physics of classical fluids and other soft-matter systems. EPSRC Grants No. EP/L027186, EP/K503733; ERC Advanced Grant No. 247031.
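
    The formal core of the classical density-functional approach is the minimization of a grand-potential functional over density profiles; in its generic form (the study's specific free-energy functional is not reproduced here),

        \Omega[\rho] = F[\rho] + \int d\mathbf{r}\, \rho(\mathbf{r}) \left[V_{\mathrm{ext}}(\mathbf{r}) - \mu\right], \qquad \frac{\delta \Omega[\rho]}{\delta \rho(\mathbf{r})}\bigg|_{\rho = \rho_{\mathrm{eq}}} = 0.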

  13. Is quantum theory a form of statistical mechanics?

    NASA Astrophysics Data System (ADS)

    Adler, S. L.

    2007-05-01

    We give a review of the basic themes of my recent book: Adler S L 2004 Quantum Theory as an Emergent Phenomenon (Cambridge: Cambridge University Press). We first give motivations for considering the possibility that quantum mechanics is not exact, but is instead an accurate asymptotic approximation to a deeper level theory. For this deeper level, we propose a non-commutative generalization of classical mechanics, that we call "trace dynamics", and we give a brief survey of how it works, considering for simplicity only the bosonic case. We then discuss the statistical mechanics of trace dynamics and give our argument that with suitable approximations, the Ward identities for trace dynamics imply that ensemble averages in the canonical ensemble correspond to Wightman functions in quantum field theory. Thus, quantum theory emerges as the statistical thermodynamics of trace dynamics. Finally, we argue that Brownian motion corrections to this thermodynamics lead to stochastic corrections to the Schrödinger equation, of the type that have been much studied in the "continuous spontaneous localization" model of objective state vector reduction. In appendices to the talk, we give details of the existence of a conserved operator in trace dynamics that encodes the structure of the canonical algebra, of the derivation of the Ward identities, and of the proof that the stochastically-modified Schrödinger equation leads to state vector reduction with Born rule probabilities.

  14. Single-snapshot DOA estimation by using Compressed Sensing

    NASA Astrophysics Data System (ADS)

    Fortunati, Stefano; Grasso, Raffaele; Gini, Fulvio; Greco, Maria S.; LePage, Kevin

    2014-12-01

    This paper deals with the problem of estimating the directions of arrival (DOA) of multiple source signals from a single observation vector of array data. In particular, four estimation algorithms based on the theory of compressed sensing (CS) are analyzed: classical ℓ1 minimization (the Least Absolute Shrinkage and Selection Operator, LASSO), fast smooth ℓ0 minimization, the Sparse Iterative Covariance-Based Estimator (SPICE), and the Iterative Adaptive Approach for Amplitude and Phase Estimation (IAA-APES). Their statistical properties are investigated and compared with the classical Fourier beamformer (FB) in different simulated scenarios. We show that, unlike the classical FB, a CS-based beamformer (CSB) has some desirable properties typical of adaptive algorithms (e.g., Capon and MUSIC) even in the single-snapshot case. Particular attention is devoted to the super-resolution property. Theoretical arguments and simulation analysis provide evidence that a CS-based beamformer can achieve resolution beyond the classical Rayleigh limit. Finally, the theoretical findings are validated by processing a real sonar dataset.
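
    A minimal single-snapshot sketch of the first of these estimators: ℓ1 minimization over a grid of steering vectors, solved here with plain ISTA. The array geometry, angular grid, regularization weight, and source positions are illustrative assumptions, not those of the paper:

        import numpy as np

        def steering_matrix(n_sensors, angles_deg, spacing=0.5):
            """ULA steering vectors (element spacing in wavelengths) on an angle grid."""
            k = np.arange(n_sensors)[:, None]
            phases = 2j * np.pi * spacing * k * np.sin(np.deg2rad(angles_deg))[None, :]
            return np.exp(phases)

        def lasso_doa(y, A, lam=0.5, n_iter=500):
            """Single-snapshot LASSO via ISTA with complex soft-thresholding."""
            step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L, L = Lipschitz constant
            x = np.zeros(A.shape[1], dtype=complex)
            for _ in range(n_iter):
                g = x - step * A.conj().T @ (A @ x - y)   # gradient step
                mag = np.abs(g)
                direction = np.where(mag > 0, g / np.maximum(mag, 1e-12), 0)
                x = direction * np.maximum(mag - step * lam, 0)  # shrink magnitudes
            return x

        grid = np.arange(-90, 90.5, 0.5)
        A = steering_matrix(16, grid)
        # Two hypothetical sources at -20 and 23 degrees, one noisy snapshot.
        y = (steering_matrix(16, np.array([-20.0, 23.0])) @ np.array([1.0, 0.8 + 0.3j])
             + 0.05 * (np.random.randn(16) + 1j * np.random.randn(16)))
        x_hat = lasso_doa(y, A)
        print(grid[np.argsort(np.abs(x_hat))[-2:]])       # estimated DOAs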

  15. Canonical partition functions: ideal quantum gases, interacting classical gases, and interacting quantum gases

    NASA Astrophysics Data System (ADS)

    Zhou, Chi-Chun; Dai, Wu-Sheng

    2018-02-01

    In statistical mechanics, for a system with a fixed number of particles, e.g. a finite-size system, the thermodynamic quantities, strictly speaking, need to be calculated in the canonical ensemble. Nevertheless, the calculation of the canonical partition function is difficult. In this paper, based on the mathematical theory of symmetric functions, we suggest a method for the calculation of the canonical partition function of ideal quantum gases, including ideal Bose, Fermi, and Gentile gases. Moreover, we express the canonical partition functions of interacting classical and quantum gases given by the classical and quantum cluster expansion methods in terms of the Bell polynomial in mathematics. The virial coefficients of ideal Bose, Fermi, and Gentile gases are calculated from the exact canonical partition function. The virial coefficients of interacting classical and quantum gases are calculated from the canonical partition function by using the expansion of the Bell polynomial, rather than from the grand canonical potential.
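
    The paper's symmetric-function and Bell-polynomial machinery is not reproduced here, but the exact canonical partition functions of ideal Bose and Fermi gases that it targets obey a standard recursion in the single-particle partition function, sketched below for an assumed toy spectrum (a 1D harmonic trap):

        import numpy as np

        def z1(beta, omega=1.0, hbar=1.0):
            """Single-particle partition function of a 1D harmonic oscillator."""
            return 0.5 / np.sinh(0.5 * beta * hbar * omega)

        def canonical_Z(N, beta, sign):
            """Exact canonical partition function for N ideal bosons (sign=+1)
            or fermions (sign=-1) from the standard recursion
            Z_N = (1/N) * sum_k sign^(k+1) * Z_1(k*beta) * Z_{N-k}."""
            Z = [1.0]                                   # Z_0 = 1
            for n in range(1, N + 1):
                s = sum(sign ** (k + 1) * z1(k * beta) * Z[n - k]
                        for k in range(1, n + 1))
                Z.append(s / n)
            return Z[N]

        beta = 0.5
        print(canonical_Z(5, beta, +1), canonical_Z(5, beta, -1))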

  16. A quantum-classical theory with nonlinear and stochastic dynamics

    NASA Astrophysics Data System (ADS)

    Burić, N.; Popović, D. B.; Radonjić, M.; Prvanović, S.

    2014-12-01

    The method of constrained dynamical systems on the quantum-classical phase space is utilized to develop a theory of quantum-classical hybrid systems. Effects of the classical degrees of freedom on the quantum part are modeled using an appropriate constraint, and the interaction also includes the effects of neglected degrees of freedom. The dynamical law of the theory is given in terms of nonlinear stochastic differential equations with Hamiltonian and gradient terms. The theory provides a successful dynamical description of the collapse during quantum measurement.

  17. Quantum Social Science

    NASA Astrophysics Data System (ADS)

    Haven, Emmanuel; Khrennikov, Andrei

    2013-01-01

    Preface; Part I. Physics Concepts in Social Science? A Discussion: 1. Classical, statistical and quantum mechanics: all in one; 2. Econophysics: statistical physics and social science; 3. Quantum social science: a non-mathematical motivation; Part II. Mathematics and Physics Preliminaries: 4. Vector calculus and other mathematical preliminaries; 5. Basic elements of quantum mechanics; 6. Basic elements of Bohmian mechanics; Part III. Quantum Probabilistic Effects in Psychology: Basic Questions and Answers: 7. A brief overview; 8. Interference effects in psychology - an introduction; 9. A quantum-like model of decision making; Part IV. Other Quantum Probabilistic Effects in Economics, Finance and Brain Sciences: 10. Financial/economic theory in crisis; 11. Bohmian mechanics in finance and economics; 12. The Bohm-Vigier Model and path simulation; 13. Other applications to economic/financial theory; 14. The neurophysiological sources of quantum-like processing in the brain; Conclusion; Glossary; Index.

  18. Statistical Mechanics of Coherent Ising Machine — The Case of Ferromagnetic and Finite-Loading Hopfield Models —

    NASA Astrophysics Data System (ADS)

    Aonishi, Toru; Mimura, Kazushi; Utsunomiya, Shoko; Okada, Masato; Yamamoto, Yoshihisa

    2017-10-01

    The coherent Ising machine (CIM) has attracted attention as one of the most effective Ising computing architectures for solving large scale optimization problems because of its scalability and high-speed computational ability. However, it is difficult to implement the Ising computation in the CIM because the theories and techniques of classical thermodynamic equilibrium Ising spin systems cannot be directly applied to the CIM. This means we have to adapt these theories and techniques to the CIM. Here we focus on a ferromagnetic model and a finite loading Hopfield model, which are canonical models sharing a common mathematical structure with almost all other Ising models. We derive macroscopic equations to capture nonequilibrium phase transitions in these models. The statistical mechanical methods developed here constitute a basis for constructing evaluation methods for other Ising computation models.

  19. Statistical Analyses of Hydrophobic Interactions: A Mini-Review

    DOE PAGES

    Pratt, Lawrence R.; Chaudhari, Mangesh I.; Rempe, Susan B.

    2016-07-14

    Here this review focuses on the striking recent progress in solving for hydrophobic interactions between small inert molecules. We discuss several new understandings. First, the inverse temperature phenomenology of hydrophobic interactions, i.e., strengthening of hydrophobic bonds with increasing temperature, is decisively exhibited by hydrophobic interactions between atomic-scale hard sphere solutes in water. Second, inclusion of attractive interactions associated with atomic-size hydrophobic reference cases leads to substantial, nontrivial corrections to reference results for purely repulsive solutes. Hydrophobic bonds are weakened by adding solute dispersion forces to the treatment of reference cases. The classic statistical mechanical theory for those corrections is not accurate in this application, but molecular quasi-chemical theory shows promise. Lastly, because of the masking roles of excluded volume and attractive interactions, comparisons that do not discriminate the different possibilities face an interpretive danger.

  20. Randomized central limit theorems: A unified theory.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic: all ensemble components are scaled by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic: the ensemble components are scaled by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs), in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes, and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.
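
    Schematically (notation assumed for illustration), the deterministic scheme of the classical CLT and the stochastic scheme studied here differ as

        \text{classical: } \frac{1}{s_n} \sum_{i=1}^{n} X_i \Rightarrow \mathcal{L}, \qquad \text{randomized: } \sum_{i=1}^{n} S_i X_i \Rightarrow \mathcal{L}',

    where s_n is a common deterministic scale (e.g., s_n = \sigma\sqrt{n} in the centered Gaussian case) and the S_i are random scales governed, per the result above, by Poisson processes with power-law statistics.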

  1. Randomized central limit theorems: A unified theory

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles’ aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles’ extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic—scaling all ensemble components by a common deterministic scale. However, there are “random environment” settings in which the underlying scaling schemes are stochastic—scaling the ensemble components by different random scales. Examples of such settings include Holtsmark’s law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)—in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes—and present “randomized counterparts” to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  2. From Wald to Savage: homo economicus becomes a Bayesian statistician.

    PubMed

    Giocoli, Nicola

    2013-01-01

    Bayesian rationality is the paradigm of rational behavior in neoclassical economics. An economic agent is deemed rational when she maximizes her subjective expected utility and consistently revises her beliefs according to Bayes's rule. The paper raises the question of how, when and why this characterization of rationality came to be endorsed by mainstream economists. Though no definitive answer is provided, it is argued that the question is of great historiographic importance. The story begins with Abraham Wald's behaviorist approach to statistics and culminates with Leonard J. Savage's elaboration of subjective expected utility theory in his 1954 classic The Foundations of Statistics. The latter's acknowledged failure to achieve a reinterpretation of traditional inference techniques along subjectivist and behaviorist lines raises the puzzle of how a failed project in statistics could turn into such a big success in economics. Possible answers call into play the emphasis on consistency requirements in neoclassical theory and the impact of the postwar transformation of U.S. business schools. © 2012 Wiley Periodicals, Inc.

  3. From classical to quantum mechanics: ``How to translate physical ideas into mathematical language''

    NASA Astrophysics Data System (ADS)

    Bergeron, H.

    2001-09-01

    Following previous works by E. Prugovečki [Physica A 91A, 202 (1978) and Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)] on common features of classical and quantum mechanics, we develop a unified mathematical framework for classical and quantum mechanics (based on L²-spaces over classical phase space), in order to investigate to what extent quantum mechanics can be obtained as a simple modification of classical mechanics (on both the logical and analytical levels). To obtain this unified framework, we split quantum theory in two parts: (i) general quantum axiomatics (a system is described by a state in a Hilbert space, observables are self-adjoint operators, and so on) and (ii) quantum mechanics proper, which specifies the Hilbert space as L²(Rⁿ), the Heisenberg rule [p_i, q_j] = -iℏδ_ij with p = -iℏ∇, the free Hamiltonian H = -ℏ²Δ/2m, and so on. We show that general quantum axiomatics (up to a supplementary "axiom of classicity") can be used as a nonstandard mathematical ground to formulate the physical ideas and equations of ordinary classical statistical mechanics. So the question of a "true quantization" with "ℏ" must be seen as an independent physical problem not directly related to the quantum formalism. At this stage, we show that this nonstandard formulation of classical mechanics exhibits a new kind of operation that has no classical counterpart: this operation is related to the "quantization process", and we show why quantization physically depends on group theory (the Galilei group). This analytical procedure of quantization replaces the "correspondence principle" (or canonical quantization) and allows us to map classical mechanics into quantum mechanics, giving all operators of quantum dynamics and the Schrödinger equation. The great advantage of this point of view is that quantization is based on concrete physical arguments and not derived from some "pure algebraic rule" (we also exhibit some limits of the correspondence principle). Moreover, spins for particles are naturally generated, including an approximation of their interaction with magnetic fields. We also recover by this approach the semi-classical formalism developed by E. Prugovečki [Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)].

  4. The image recognition based on neural network and Bayesian decision

    NASA Astrophysics Data System (ADS)

    Wang, Chugege

    2018-04-01

    The artificial neural network began in 1940 and is an important part of artificial intelligence. At present, it has become a hot topic in the fields of neuroscience, computer science, brain science, mathematics, and psychology. Thomas Bayes first reported Bayesian theory in 1763. After its development in the twentieth century, it has become widespread in all areas of statistics. In recent years, due to the solution of the problem of high-dimensional integral calculation, Bayesian statistics has been improved theoretically, solving many problems that cannot be solved by classical statistics, and it is also applied in interdisciplinary fields. In this paper, the related concepts and principles of the artificial neural network are introduced. The paper also summarizes the basic content and principles of Bayesian statistics, combines artificial neural network technology with Bayesian decision theory, and implements them in several aspects of image recognition, such as an enhanced face detection method based on a neural network and Bayesian decision, as well as image classification based on the Bayesian decision. It can be seen that the combination of artificial intelligence and statistical algorithms has always been a hot research topic.
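
    A minimal sketch of the Bayesian decision rule underlying such classifiers: assign a feature vector to the class with maximal posterior probability. Gaussian class-conditional densities are an assumed model here (in the combined approach, a neural network would instead supply the features or likelihoods):

        import numpy as np
        from scipy.stats import multivariate_normal

        def bayes_decide(x, priors, means, covs):
            """Minimum-error-rate Bayesian decision: argmax_c P(c) p(x | c)."""
            posteriors = [p * multivariate_normal.pdf(x, mean=m, cov=c)
                          for p, m, c in zip(priors, means, covs)]
            return int(np.argmax(posteriors))

        # Two hypothetical classes in a 2-D feature space.
        priors = [0.6, 0.4]
        means = [np.array([0.0, 0.0]), np.array([2.0, 2.0])]
        covs = [np.eye(2), 1.5 * np.eye(2)]
        print(bayes_decide(np.array([1.8, 1.5]), priors, means, covs))  # -> 1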

  5. Principle of minimal work fluctuations.

    PubMed

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^(-βW)⟩ = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)].

  6. The Gibbs paradox and the physical criteria for indistinguishability of identical particles

    NASA Astrophysics Data System (ADS)

    Unnikrishnan, C. S.

    2016-08-01

    The Gibbs paradox in the context of statistical mechanics addresses the issue of the additivity of the entropy of mixing of gases. The usual discussion attributes the paradoxical situation to the classical distinguishability of identical particles and credits quantum theory with enabling the indistinguishability of identical particles that solves the problem. We argue that indistinguishability of identical particles is already a feature in classical mechanics, and this is clearly brought out when the problem is treated in the language of information and the associated entropy. We pinpoint the physical criteria for indistinguishability that are crucial for the treatment of the Gibbs problem and the consistency of its solution with conventional thermodynamics. Quantum mechanics provides a quantitative criterion, not possible in the classical picture, for the degree of indistinguishability in terms of the visibility of quantum interference, or the overlap of the states as pointed out by von Neumann, thereby endowing the entropy expression with mathematical continuity and physical reasonableness.
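
    Quantitatively, for two samples of N particles each, initially at equal temperature and pressure in volumes V and mixed into 2V, the entropy of mixing is

        \Delta S_{\mathrm{mix}} = 2 N k_B \ln 2 \quad (\text{different gases}), \qquad \Delta S_{\mathrm{mix}} = 0 \quad (\text{identical gases}),

    the second result following once the partition function carries the indistinguishability factor, Z_N = Z_1^N / N!.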

  7. Generalized classical and quantum signal theories

    NASA Astrophysics Data System (ADS)

    Rundblad, E.; Labunets, V.; Novak, P.

    2005-05-01

    In this paper we develop two topics and show their interrelations. The first centers on general notions of generalized classical signal theory on finite Abelian hypergroups. The second concerns generalized quantum hyperharmonic analysis of quantum signals (Hermitian operators associated with classical signals). We study classical and quantum generalized convolution hypergroup algebras of classical and quantum signals.

  8. Philosophical perspectives on quantum chaos: Models and interpretations

    NASA Astrophysics Data System (ADS)

    Bokulich, Alisa Nicole

    2001-09-01

    The problem of quantum chaos is a special case of the larger problem of understanding how the classical world emerges from quantum mechanics. While we have learned that chaos is pervasive in classical systems, it appears to be almost entirely absent in quantum systems. The aim of this dissertation is to determine what implications the interpretation of quantum mechanics has for attempts to explain the emergence of classical chaos. There are three interpretations of quantum mechanics that have set out programs for solving the problem of quantum chaos: the standard interpretation, the statistical interpretation, and the deBroglie-Bohm causal interpretation. One of the main conclusions of this dissertation is that an interpretation alone is insufficient for solving the problem of quantum chaos and that the phenomenon of decoherence must be taken into account. Although a completely satisfactory solution of the problem of quantum chaos is still outstanding, I argue that the deBroglie-Bohm interpretation with the help of decoherence outlines the most promising research program to pursue. In addition to making a contribution to the debate in the philosophy of physics concerning the interpretation of quantum mechanics, this dissertation reveals two important methodological lessons for the philosophy of science. First, issues of reductionism and intertheoretic relations cannot be divorced from questions concerning the interpretation of the theories involved. Not only is the exploration of intertheoretic relations a central part of the articulation and interpretation of an individual theory, but the very terms used to discuss intertheoretic relations, such as `state' and `classical limit', are themselves defined by particular interpretations of the theory. The second lesson that emerges is that, when it comes to characterizing the relationship between classical chaos and quantum mechanics, the traditional approaches to intertheoretic relations, namely reductionism and theoretical pluralism, are inadequate. The fruitful ways in which models have been used in quantum chaos research point to the need for a new framework for addressing intertheoretic relations that focuses on models rather than laws.

  9. Matrix Concentration Inequalities via the Method of Exchangeable Pairs

    DTIC Science & Technology

    2012-01-27

    …viewed as an exchangeable pairs version of the Burkholder-Davis-Gundy (BDG) inequality from classical martingale theory [Bur73]. Matrix extensions of…

  10. Decoherence and thermalization of a pure quantum state in quantum field theory.

    PubMed

    Giraud, Alexandre; Serreau, Julien

    2010-06-11

    We study the real-time evolution of a self-interacting O(N) scalar field initially prepared in a pure, coherent quantum state. We present a complete solution of the nonequilibrium quantum dynamics from a 1/N expansion of the two-particle-irreducible effective action at next-to-leading order, which includes scattering and memory effects. We demonstrate that, restricting one's attention (or ability to measure) to a subset of the infinite hierarchy of correlation functions, one observes an effective loss of purity or coherence and, on longer time scales, thermalization. We point out that the physics of decoherence is well described by classical statistical field theory.

  11. Nodal portraits of quantum billiards: Domains, lines, and statistics

    NASA Astrophysics Data System (ADS)

    Jain, Sudhir Ranjan; Samajdar, Rhine

    2017-10-01

    This is a comprehensive review of the nodal domains and lines of quantum billiards, emphasizing a quantitative comparison of theoretical findings to experiments. The nodal statistics are shown to distinguish not only between regular and chaotic classical dynamics but also between different geometric shapes of the billiard system itself. How a random superposition of plane waves can model chaotic eigenfunctions is discussed and the connections of the complex morphology of the nodal lines thereof to percolation theory and Schramm-Loewner evolution are highlighted. Various approaches to counting the nodal domains—using trace formulas, graph theory, and difference equations—are also illustrated with examples. The nodal patterns addressed pertain to waves on vibrating plates and membranes, acoustic and electromagnetic modes, wave functions of a "particle in a box" as well as to percolating clusters, and domains in ferromagnets, thus underlining the diversity and far-reaching implications of the problem.

  12. Asteroid orbital error analysis: Theory and application

    NASA Technical Reports Server (NTRS)

    Muinonen, K.; Bowell, Edward

    1992-01-01

    We present a rigorous Bayesian theory for asteroid orbital error estimation in which the probability density of the orbital elements is derived from the noise statistics of the observations. For Gaussian noise in a linearized approximation the probability density is also Gaussian, and the errors of the orbital elements at a given epoch are fully described by the covariance matrix. The law of error propagation can then be applied to calculate past and future positional uncertainty ellipsoids (Cappellari et al. 1976, Yeomans et al. 1987, Whipple et al. 1991). To our knowledge, this is the first time a Bayesian approach has been formulated for orbital element estimation. In contrast to the classical Fisherian school of statistics, the Bayesian school allows a priori information to be formally present in the final estimation. However, Bayesian estimation gives the same results as Fisherian estimation when no a priori information is assumed (Lehtinen 1988, and references therein).
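
    In the linearized Gaussian case described, the posterior density of the orbital elements P and the propagated covariance at epoch t (with Φ the state transition matrix; notation assumed for illustration) take the standard forms

        p(\mathbf{P}) \propto \exp\!\left[-\tfrac{1}{2} (\mathbf{P} - \hat{\mathbf{P}})^{T} \Sigma^{-1} (\mathbf{P} - \hat{\mathbf{P}})\right], \qquad \Sigma(t) = \Phi(t)\, \Sigma\, \Phi(t)^{T}.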

  13. Geometric entropy and edge modes of the electromagnetic field

    NASA Astrophysics Data System (ADS)

    Donnelly, William; Wall, Aron C.

    2016-11-01

    We calculate the vacuum entanglement entropy of Maxwell theory in a class of curved spacetimes by Kaluza-Klein reduction of the theory onto a two-dimensional base manifold. Using two-dimensional duality, we express the geometric entropy of the electromagnetic field as the entropy of a tower of scalar fields, constant electric and magnetic fluxes, and a contact term, whose leading-order divergence was discovered by Kabat. The complete contact term takes the form of one negative scalar degree of freedom confined to the entangling surface. We show that the geometric entropy agrees with a statistical definition of entanglement entropy that includes edge modes: classical solutions determined by their boundary values on the entangling surface. This resolves a long-standing puzzle about the statistical interpretation of the contact term in the entanglement entropy. We discuss the implications of this negative term for black hole thermodynamics and the renormalization of Newton's constant.

  14. Development and validation of the coronary heart disease scale under the system of quality of life instruments for chronic diseases QLICD-CHD: combinations of classical test theory and Generalizability Theory.

    PubMed

    Wan, Chonghua; Li, Hezhan; Fan, Xuejin; Yang, Ruixue; Pan, Jiahua; Chen, Wenru; Zhao, Rong

    2014-06-04

    Quality of life (QOL) for patients with coronary heart disease (CHD) is now a worldwide concern, yet specific instruments are scarce and none has been developed by the modular approach. This paper aims to develop the CHD scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-CHD) by the modular approach and to validate it by both classical test theory and Generalizability Theory. The QLICD-CHD was developed based on programmed decision procedures with nominal group and focus group discussions, in-depth interviews, pre-testing and quantitative statistical procedures. Data were collected from 146 inpatients with CHD, whose QOL was measured three times before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t-tests, and the G studies and D studies of Generalizability Theory. Multi-trait scaling analysis, correlation and factor analyses confirmed good construct validity and criterion-related validity when using the SF-36 as a criterion. The internal consistency α and test-retest reliability coefficients (Pearson r and intra-class correlations, ICC) for the overall instrument and all domains were higher than 0.70 and 0.80, respectively. The overall scale and all domains except the social domain showed statistically significant changes after treatment, with moderate effect sizes (standardized response mean, SRM) ranging from 0.32 to 0.67. G-coefficients and indexes of dependability (Φ coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-CHD has good validity and reliability, moderate responsiveness, and some distinctive features, and can be used as a quality of life instrument for patients with CHD. However, in order to obtain better reliability, the number of items in the social domain should be increased or the quality, not the quantity, of the items should be improved.

  15. Deconfinement in Yang-Mills Theory through Toroidal Compactification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simic, Dusan; Unsal, Mithat; /Stanford U., Phys. Dept. /SLAC

    2011-08-12

    We introduce field theory techniques through which the deconfinement transition of four-dimensional Yang-Mills theory can be moved to a semi-classical domain where it becomes calculable using two-dimensional field theory. We achieve this through a double-trace deformation of toroidally compactified Yang-Mills theory on R^2 × S^1_L × S^1_β. At large N, fixed L, and arbitrary β, the thermodynamics of the deformed theory is equivalent to that of ordinary Yang-Mills theory at leading order in the large N expansion. At fixed N, small L and a range of β, the deformed theory maps to a two-dimensional theory with electric and magnetic (order and disorder) perturbations, analogs of which appear in planar spin systems and statistical physics. We show that in this regime the deconfinement transition is driven by the competition between electric and magnetic perturbations in this two-dimensional theory. This appears to support the scenario proposed by Liao and Shuryak regarding the magnetic component of the quark-gluon plasma at RHIC.

  16. Introduction to Classical Density Functional Theory by a Computational Experiment

    ERIC Educational Resources Information Center

    Jeanmairet, Guillaume; Levy, Nicolas; Levesque, Maximilien; Borgis, Daniel

    2014-01-01

    We propose an in silico experiment to introduce the classical density functional theory (cDFT). Density functional theories, whether quantum or classical, rely on abstract concepts that are nonintuitive; however, they are at the heart of powerful tools and active fields of research in both physics and chemistry. They led to the 1998 Nobel Prize in…

  17. Design Equations and Criteria of Orthotropic Composite Panels

    DTIC Science & Technology

    2013-05-01

    Appendix A, Classical Laminate Theory (CLT): In Section 6 of this report, preliminary design stiffness characteristics were determined using Classical Laminate Theory (CLT) to predict equivalent stiffness characteristics and first-ply strength. Note: CLT is valid for…

  18. Single, Complete, Probability Spaces Consistent With EPR-Bohm-Bell Experimental Data

    NASA Astrophysics Data System (ADS)

    Avis, David; Fischer, Paul; Hilbert, Astrid; Khrennikov, Andrei

    2009-03-01

    We show that paradoxical consequences of violations of Bell's inequality are induced by the use of an unsuitable probabilistic description for the EPR-Bohm-Bell experiment. The conventional description (due to Bell) is based on a combination of statistical data collected for different settings of polarization beam splitters (PBSs). In fact, such data consist of conditional probabilities which only partially define a probability space. Ignoring this conditioning leads to apparent contradictions in the classical probabilistic model (due to Kolmogorov). We show how to make a completely consistent probabilistic model by taking into account the probabilities of selecting the settings of the PBSs. Our model both matches the experimental data and is consistent with classical probability theory.

  19. Using Bayes' theorem for free energy calculations

    NASA Astrophysics Data System (ADS)

    Rogers, David M.

    Statistical mechanics is fundamentally based on calculating the probabilities of molecular-scale events. Although Bayes' theorem has generally been recognized as providing key guiding principles for the setup and analysis of statistical experiments [83], classical frequentist models still predominate in the world of computational experimentation. As a starting point for widespread application of Bayesian methods in statistical mechanics, we investigate the central quantity of free energies from this perspective. This dissertation thus reviews the basics of Bayes' view of probability theory and the maximum entropy formulation of statistical mechanics before providing examples of its application to several advanced research areas. We first apply Bayes' theorem to a multinomial counting problem in order to determine inner shell and hard sphere solvation free energy components of Quasi-Chemical Theory [140]. We proceed to consider the general problem of free energy calculations from samples of interaction energy distributions. From there, we turn to spline-based estimation of the potential of mean force [142], and empirical modeling of observed dynamics using integrator matching. The results of this research are expected to advance the state of the art in coarse-graining methods, as they allow a systematic connection from high-resolution (atomic) to low-resolution (coarse) structure and dynamics. In total, our work on these problems constitutes a critical starting point for further application of Bayes' theorem in all areas of statistical mechanics. It is hoped that the understanding so gained will allow for improvements in comparisons between theory and experiment.
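
    As a hedged illustration of the multinomial counting step, the sketch below applies Bayes' theorem with a uniform (Dirichlet) prior to invented inner-shell occupancy counts and converts the n = 0 occupancy probability into a free energy component, in the spirit of quasi-chemical theory; the counts and temperature are not from the dissertation.

        import numpy as np

        rng = np.random.default_rng(0)
        counts = np.array([412, 310, 180, 70, 28])  # hypothetical occupancies n = 0..4
        kT = 0.596                                  # kcal/mol near 300 K

        # Dirichlet posterior over occupancy probabilities given multinomial counts
        posterior = rng.dirichlet(counts + 1.0, size=20000)
        dG = -kT * np.log(posterior[:, 0])          # inner-shell component ~ -kT ln p0
        print(f"{dG.mean():.2f} +/- {dG.std():.2f} kcal/mol")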

  20. k-Cosymplectic Classical Field Theories: Tulczyjew and Skinner-Rusk Formulations

    NASA Astrophysics Data System (ADS)

    Rey, Angel M.; Román-Roy, Narciso; Salgado, Modesto; Vilariño, Silvia

    2012-06-01

    The k-cosymplectic Lagrangian and Hamiltonian formalisms of first-order classical field theories are reviewed and completed. In particular, they are stated for singular and almost-regular systems. Subsequently, several alternative formulations for k-cosymplectic first-order field theories are developed: First, generalizing the construction of Tulczyjew for mechanics, we give a new interpretation of the classical field equations. Second, the Lagrangian and Hamiltonian formalisms are unified by giving an extension of the Skinner-Rusk formulation of classical mechanics.

  1. Methodological issues regarding power of classical test theory (CTT) and item response theory (IRT)-based approaches for the comparison of patient-reported outcomes in two groups of patients - a simulation study

    PubMed Central

    2010-01-01

    Background Patient-Reported Outcomes (PRO) are increasingly used in clinical and epidemiological research. Two main types of analytical strategies can be found for these data: classical test theory (CTT), based on the observed scores, and models coming from Item Response Theory (IRT). However, whether IRT or CTT is the more appropriate method to analyse PRO data remains unknown. The statistical properties of CTT and IRT, regarding power and corresponding effect sizes, were compared. Methods Two-group cross-sectional studies were simulated for the comparison of PRO data using IRT or CTT-based analysis. For IRT, different scenarios were investigated according to whether item or person parameters were assumed to be known, known to a certain extent for item parameters (from good to poor precision), or unknown and therefore had to be estimated. The powers obtained with IRT or CTT were compared and the parameters having the strongest impact on them were identified. Results When person parameters were assumed to be unknown and item parameters to be either known or not, the powers achieved using IRT or CTT were similar and always lower than the expected power using the well-known sample size formula for normally distributed endpoints. The number of items had a substantial impact on power for both methods. Conclusion Without any missing data, IRT and CTT seem to provide comparable power. The classical sample size formula for CTT seems to be adequate under some conditions but is not appropriate for IRT. In IRT, it seems important to take into account the number of items to obtain an accurate formula. PMID:20338031
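
    For concreteness, a sketch of the CTT arm of such a simulation: two groups of binary item responses are generated from a Rasch-type model and power is estimated as the rejection rate of a t-test on observed sum scores. All design values are illustrative, not those used in the study.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        def ctt_power(n=100, n_items=10, effect=0.4, n_sim=2000, alpha=0.05):
            b = rng.normal(0.0, 1.0, n_items)        # item difficulties
            rejections = 0
            for _ in range(n_sim):
                th0 = rng.normal(0.0, 1.0, n)        # latent traits, group 0
                th1 = rng.normal(effect, 1.0, n)     # group 1, shifted latent mean
                p0 = 1.0 / (1.0 + np.exp(-(th0[:, None] - b)))
                p1 = 1.0 / (1.0 + np.exp(-(th1[:, None] - b)))
                s0 = (rng.random((n, n_items)) < p0).sum(axis=1)  # sum scores
                s1 = (rng.random((n, n_items)) < p1).sum(axis=1)
                rejections += stats.ttest_ind(s0, s1).pvalue < alpha
            return rejections / n_sim

        print(ctt_power())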

  2. [Scale Relativity Theory in the morphogenesis of living beings: fractal, determinism and chance].

    PubMed

    Chaline, J

    2012-10-01

    The Scale Relativity Theory has many biological applications, from linear to non-linear and from classical mechanics to quantum mechanics. Self-similar laws have been used as models for the description of a huge number of biological systems. These laws may explain the origin of basal life structures. Log-periodic behaviors of acceleration or deceleration can be applied to branching macroevolution and to the time sequences of major evolutionary leaps. The existence of such a law does not mean that the role of chance in evolution is reduced, but instead that randomness and contingency may occur within a framework which may itself be structured in a partly statistical way. The Scale Relativity Theory can open new perspectives in evolution. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  3. How do Durkheimian variables impact variation in national suicide rates when proxies for depression and alcoholism are controlled?

    PubMed

    Fernquist, Robert M

    2007-01-01

    Sociological research on Durkheim's theories of egoistic and anomic suicide has given Durkheim continued support more than a century after he published his work. Recent criticism by Breault (1994), though, argues that Durkheim's theories of suicide have not actually been empirically supported, given the lack of psychological variables included in sociological research on suicide rates. Using proxy measures of depression and alcoholism, two psychological variables known to impact suicide, as well as classic Durkheimian variables, suicide rates in eight European countries from 1973-1997 were examined. Results indicate that Durkheim's theories of egoism and anomie, while not completely supported in statistical analysis of suicide rates, received moderate support. Results suggest the continued usefulness of Durkheim's work in aggregate analyses of suicide.

  4. Poincaré resonances and the limits of trajectory dynamics.

    PubMed Central

    Petrosky, T; Prigogine, I

    1993-01-01

    In previous papers we have shown that the elimination of the resonance divergences in large Poincaré systems leads to complex irreducible spectral representations for the Liouville-von Neumann operator. Complex means that time symmetry is broken and irreducibility means that this representation is implementable only by statistical ensembles and not by trajectories. We consider in this paper classical potential scattering. Our theory applies to persistent scattering. Numerical simulations show quantitative agreement with our predictions. PMID:11607428

  5. Non-equilibrium statistical mechanics theory for the large scales of geophysical flows

    NASA Astrophysics Data System (ADS)

    Eric, S.; Bouchet, F.

    2010-12-01

    The aim of any theory of turbulence is to understand the statistical properties of the velocity field. As a huge number of degrees of freedom is involved, statistical mechanics is a natural approach. The self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. We discuss classical and recent works on this subject, from the statistical mechanics basis of the theory up to applications to Jupiter's troposphere and ocean vortices and jets. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations, mean field approach) and thermodynamic concepts (ensemble inequivalence, negative heat capacity) are briefly explained and used to predict statistical equilibria for turbulent flows. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations will be discussed. We also present recent results for non-equilibrium situations, for which forces and dissipation are in a statistical balance. As an example, the concept of phase transition allows us to describe drastic changes of the whole system when a few external parameters are changed. References: F. Bouchet and E. Simonnet, Random Changes of Flow Topology in Two-Dimensional and Geophysical Turbulence, Physical Review Letters 102 (2009), no. 9, 094504. F. Bouchet and J. Sommeria, Emergence of intense jets and Jupiter's Great Red Spot as maximum-entropy structures, Journal of Fluid Mechanics 464 (2002), 165-207. A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO. F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, submitted to Physics Reports. [Figure: non-equilibrium phase transitions for the 2D Navier-Stokes equations with stochastic forces; time series and probability density functions (PDFs) of the modulus of the largest-scale Fourier component, showing bistability between dipole and unidirectional flows. This bistability is predicted by statistical mechanics.]

  6. Collective stimulated Brillouin backscatter

    NASA Astrophysics Data System (ADS)

    Lushnikov, Pavel; Rose, Harvey

    2007-11-01

    We develop the statistical theory of linear collective stimulated Brillouin backscatter (CBSBS) in a spatially and temporally incoherent laser beam. The instability is collective because it does not depend on the dynamics of isolated hot spots (speckles) of laser intensity, but rather on the averaged laser beam intensity, the optic f/#, and the laser coherence time, Tc. CBSBS has a much larger threshold than that of a classical coherent beam in long-scale-length, high-temperature plasma. It is a novel regime in which Tc is too large for the applicability of well-known statistical theories (RPA), but Tc must be small enough to suppress single-speckle processes such as self-focusing. Even if the laser Tc is too large for a priori applicability of our theory, collective forward SBS^1, perhaps enhanced by a high-Z dopant, and its resultant self-induced Tc reduction, may regain the CBSBS regime. We identified convective and absolute CBSBS regimes. The threshold of the convective instability is inside the typical parameter region of NIF designs. Well above the incoherent threshold, the coherent instability growth rate is recovered. ^1 P.M. Lushnikov and H.A. Rose, Plasma Physics and Controlled Fusion, 48, 1501 (2006).

  7. Automatic data-processing equipment of moon mark of nail for verifying some experiential theory of Traditional Chinese Medicine.

    PubMed

    Niu, Renjie; Fu, Chenyu; Xu, Zhiyong; Huang, Jianyuan

    2016-04-29

    Doctors who practice Traditional Chinese Medicine (TCM) diagnose using four methods: inspection, auscultation and olfaction, interrogation, and pulse feeling/palpation. The shape of the moon marks on the nails, and changes in that shape, are an important indication when judging the patient's health. There is a series of classical and experiential theories about moon marks in TCM which do not have support from statistical data. The aim is to verify some of these experiential theories by means of automatic data-processing equipment. This paper proposes equipment that utilizes image processing technology to collect moon mark data from different target groups conveniently and quickly, building a database that combines this information with that gathered from a health and mental status questionnaire in each test. The equipment has a simple design, a low cost, and an optimized algorithm, and in practice it has been shown to complete automatic acquisition and preservation of the key moon mark data quickly. In the future, conclusions will likely be obtained from these data, and changes of moon marks related to specific pathological changes may be established with statistical methods.

  8. Nonparametric functional data estimation applied to ozone data: prediction and extreme value analysis.

    PubMed

    Quintela-del-Río, Alejandro; Francisco-Fernández, Mario

    2011-02-01

    The study of extreme values and prediction of ozone data is an important topic of research when dealing with environmental problems. Classical extreme value theory is usually used in air-pollution studies. It consists in fitting a parametric generalised extreme value (GEV) distribution to a data set of extreme values, and using the estimated distribution to compute return levels and other quantities of interest. Here, we propose to estimate these values using nonparametric functional data methods. Functional data analysis is a relatively new statistical methodology that generally deals with data consisting of curves or multi-dimensional variables. In this paper, we use this technique, jointly with nonparametric curve estimation, to provide alternatives to the usual parametric statistical tools. The nonparametric estimators are applied to real samples of maximum ozone values obtained from several monitoring stations belonging to the Automatic Urban and Rural Network (AURN) in the UK. The results show that nonparametric estimators work satisfactorily, outperforming the behaviour of classical parametric estimators. Functional data analysis is also used to predict stratospheric ozone concentrations. We show an application, using the data set of mean monthly ozone concentrations in Arosa, Switzerland, and the results are compared with those obtained by classical time series (ARIMA) analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
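
    The parametric baseline against which the nonparametric estimators are compared can be sketched in a few lines: fit a GEV distribution to a sample of maxima and read off a T-year return level; an empirical quantile serves as a crude nonparametric stand-in (the kernel and functional machinery of the paper is not reproduced, and the data are simulated).

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(2)
        annual_max = genextreme.rvs(c=-0.1, loc=80, scale=12, size=40,
                                    random_state=rng)  # mock ozone maxima

        # Parametric route: maximum-likelihood GEV fit and T-year return level
        c, loc, scale = genextreme.fit(annual_max)
        T = 50
        level_gev = genextreme.ppf(1 - 1/T, c, loc=loc, scale=scale)

        # Crude nonparametric alternative: an empirical high quantile,
        # unreliable when 1 - 1/T exceeds the sample coverage
        level_emp = np.quantile(annual_max, 1 - 1/T)
        print(level_gev, level_emp)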

  9. Generalized probability theories: what determines the structure of quantum theory?

    NASA Astrophysics Data System (ADS)

    Janotta, Peter; Hinrichsen, Haye

    2014-08-01

    The framework of generalized probabilistic theories is a powerful tool for studying the foundations of quantum physics. It provides the basis for a variety of recent findings that significantly improve our understanding of the rich physical structure of quantum theory. This review paper tries to present the framework and recent results to a broader readership in an accessible manner. To achieve this, we follow a constructive approach. Starting from a few basic physically motivated assumptions we show how a given set of observations can be manifested in an operational theory. Furthermore, we characterize consistency conditions limiting the range of possible extensions. In this framework classical and quantum theory appear as special cases, and the aim is to understand what distinguishes quantum mechanics as the fundamental theory realized in nature. It turns out that non-classical features of single systems can equivalently result from higher-dimensional classical theories that have been restricted. Entanglement and non-locality, however, are shown to be genuine non-classical features.

  10. Effective field theory of dissipative fluids

    DOE PAGES

    Crossley, Michael; Glorioso, Paolo; Liu, Hong

    2017-09-20

    We develop an effective field theory for dissipative fluids which governs the dynamics of long-lived gapless modes associated with conserved quantities. The resulting theory gives a path integral formulation of fluctuating hydrodynamics which systematically incorporates nonlinear interactions of noises. The dynamical variables are mappings between a "fluid spacetime" and the physical spacetime, and an essential aspect of our formulation is to identify the appropriate symmetries in the fluid spacetime. The theory applies to nonlinear disturbances around a general density matrix. For a thermal density matrix, we require an additional Z2 symmetry, to which we refer as the local KMS condition. This leads to the standard constraints of hydrodynamics, as well as a nonlinear generalization of the Onsager relations. It also leads to an emergent supersymmetry in the classical statistical regime, and a higher derivative deformation of supersymmetry in the full quantum regime.

  12. The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes

    ERIC Educational Resources Information Center

    Cartier, Stephen F.

    2011-01-01

    A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…

  13. Interactions Dominate the Dynamics of Visual Cognition

    PubMed Central

    Stephen, Damian G.; Mirman, Daniel

    2010-01-01

    Many cognitive theories have described behavior as the summation of independent contributions from separate components. Contrasting views have emphasized the importance of multiplicative interactions and emergent structure. We describe a statistical approach to distinguishing additive and multiplicative processes and apply it to the dynamics of eye movements during classic visual cognitive tasks. The results reveal interaction-dominant dynamics in eye movements in each of the three tasks, and that fine-grained eye movements are modulated by task constraints. These findings reveal the interactive nature of cognitive processing and are consistent with theories that view cognition as an emergent property of processes that are broadly distributed over many scales of space and time rather than a componential assembly line. PMID:20070957

  14. A model of gene expression based on random dynamical systems reveals modularity properties of gene regulatory networks.

    PubMed

    Antoneli, Fernando; Ferreira, Renata C; Briones, Marcelo R S

    2016-06-01

    Here we propose a new approach to modeling gene expression based on the theory of random dynamical systems (RDS) that provides a general coupling prescription between the nodes of any given regulatory network, given that the dynamics of each node is modeled by a RDS. The main virtues of this approach are the following: (i) it provides a natural way to obtain arbitrarily large networks by coupling together simple basic pieces, thus revealing the modularity of regulatory networks; (ii) the assumptions about the stochastic processes used in the modeling are fairly general, in the sense that the only requirement is stationarity; (iii) there is a well developed mathematical theory, which is a blend of smooth dynamical systems theory, ergodic theory and stochastic analysis, that allows one to extract relevant dynamical and statistical information without solving the system; (iv) one may obtain the classical rate equations from the corresponding stochastic version by averaging the dynamic random variables (small noise limit). It is important to emphasize that unlike the deterministic case, where coupling two equations is a trivial matter, coupling two RDS is non-trivial, especially in our case, where the coupling is performed between a state variable of one gene and the switching stochastic process of another gene and, hence, it is not a priori true that the resulting coupled system will satisfy the definition of a random dynamical system. We shall provide the necessary arguments that ensure that our coupling prescription does indeed furnish a coupled regulatory network of random dynamical systems. Finally, the fact that classical rate equations are the small noise limit of our stochastic model ensures that any validation or prediction made on the basis of the classical theory is also a validation or prediction of our model. We illustrate our framework with some simple examples of single-gene systems and network motifs. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Rasch-family models are more valuable than score-based approaches for analysing longitudinal patient-reported outcomes with missing data.

    PubMed

    de Bock, Élodie; Hardouin, Jean-Benoit; Blanchin, Myriam; Le Neel, Tanguy; Kubis, Gildas; Bonnaud-Antignac, Angélique; Dantan, Étienne; Sébille, Véronique

    2016-10-01

    The objective was to compare classical test theory and Rasch-family models derived from item response theory for the analysis of longitudinal patient-reported outcomes data with possibly informative intermittent missing items. A simulation study was performed in order to assess and compare the performance of classical test theory and the Rasch model in terms of bias, control of the type I error, and power of the test of the time effect. The type I error was controlled for both classical test theory and the Rasch model, whether data were complete or some items were missing. Both methods were unbiased and displayed similar power with complete data. When items were missing, the Rasch model remained unbiased and displayed higher power than classical test theory. The Rasch model performed better than the classical test theory approach for the analysis of longitudinal patient-reported outcomes with possibly informative intermittent missing items, mainly in terms of power. This study highlights the interest of Rasch-based models in clinical research and epidemiology for the analysis of incomplete patient-reported outcomes data. © The Author(s) 2013.

  16. Telling and Not-Telling: A Classic Grounded Theory of Sharing Life-Stories

    ERIC Educational Resources Information Center

    Powers, Trudy Lee

    2013-01-01

    This study of "Telling and Not-Telling" was conducted using the classic grounded theory methodology (Glaser 1978, 1992, 1998; Glaser & Strauss, 1967). This unique methodology systematically and inductively generates conceptual theories from data. The goal is to discover theory that explains, predicts, and provides practical…

  17. Buoyant Turbulence Kinetic Energy (TKE) Production in Katabatic Flow Despite Stable Thermal Stratification

    NASA Astrophysics Data System (ADS)

    Oldroyd, H. J.; Pardyjak, E.; Higgins, C. W.; Parlange, M. B.

    2015-12-01

    As micrometeorological research shifts to increasingly non-idealized environments, the lens through which we view classical atmospheric boundary layer theory must also shift to accommodate unfamiliar behavior. We present observations of katabatic flow over a steep (35.5 degree), alpine slope and draw comparisons with classical theory for nocturnal boundary layers (NBL) over flat terrain to delineate key physical differences and similarities. In both cases, the NBL is characterized by a strong, terrain-aligned thermal stratification. Over flat terrain, this temperature inversion tends to stabilize perturbations and suppresses vertical motions. Hence, the buoyancy term in the TKE budget equation acts as a sink. In contrast, the steep-slope katabatic flow regime is characterized by buoyant TKE production despite NBL thermal stratification. This buoyant TKE production occurs because streamwise (upslope) heat fluxes, which are typically treated as unimportant over flat terrain, contribute to the total vertical buoyancy flux since the gravity vector is not terrain-normal. Due to a relatively small number of observations over steep terrain, the turbulence structure of such flows and the implications of buoyant TKE production in the NBL have gone largely unexplored. As an important consequence of this characteristic, we show that conventional stability characterizations require careful coordinate system alignment and interpretation for katabatic flows. The streamwise heat fluxes play an integral role in characterizing stability and turbulent transport, more broadly, in katabatic flows. Therefore, multi-scale statistics and budget analyses describing physical interactions between turbulent fluxes at various scales are presented to interpret similarities and differences between the observations and classical theories regarding streamwise heat fluxes.

  18. Integral equations in the study of polar and ionic interaction site fluids

    PubMed Central

    Howard, Jesse J.

    2011-01-01

    In this review article we consider some of the current integral equation approaches and their application to model polar liquid mixtures. We consider the use of multidimensional integral equations and, in particular, progress on the theory and applications of three-dimensional integral equations. The IEs we consider may be derived from equilibrium statistical mechanical expressions incorporating a classical Hamiltonian description of the system. We give examples including salt solutions, inhomogeneous solutions, and systems including proteins and nucleic acids. PMID:22383857

  19. Competitive-Cooperative Automated Reasoning from Distributed and Multiple Source of Data

    NASA Astrophysics Data System (ADS)

    Fard, Amin Milani

    Knowledge extraction from distributed database systems has been investigated during the past decade in order to analyze billions of information records. In this work a competitive deduction approach for a heterogeneous data grid environment is proposed using classic data mining and statistical methods. By applying a game theory concept in a multi-agent model, we design a policy for hierarchical knowledge discovery and inference fusion. To demonstrate the system in operation, a sample multi-expert system has also been developed.

  20. Accessible Information Without Disturbing Partially Known Quantum States on a von Neumann Algebra

    NASA Astrophysics Data System (ADS)

    Kuramochi, Yui

    2018-04-01

    This paper addresses the problem of how much information we can extract without disturbing a statistical experiment, which is a family of partially known normal states on a von Neumann algebra. We define the classical part of a statistical experiment as the restriction of the equivalent minimal sufficient statistical experiment to the center of the outcome space, which, in the case of density operators on a Hilbert space, corresponds to the classical probability distributions appearing in the maximal decomposition by Koashi and Imoto (Phys. Rev. A 66, 022318, 2002). We show that we can access by a Schwarz or completely positive channel at most the classical part of a statistical experiment if we do not disturb the states. We apply this result to the broadcasting problem of a statistical experiment. We also show that the classical part of the direct product of statistical experiments is the direct product of the classical parts of the statistical experiments. The proof of the latter result is based on the theorem that the direct product of minimal sufficient statistical experiments is also minimal sufficient.

  1. Quantum statistics and squeezing for a microwave-driven interacting magnon system.

    PubMed

    Haghshenasfard, Zahra; Cottam, Michael G

    2017-02-01

    Theoretical studies are reported for the statistical properties of a microwave-driven interacting magnon system. Both the magnetic dipole-dipole and the exchange interactions are included and the theory is developed for the case of parallel pumping allowing for the inclusion of the nonlinear processes due to the four-magnon interactions. The method of second quantization is used to transform the total Hamiltonian from spin operators to boson creation and annihilation operators. By using the coherent magnon state representation we have studied the magnon occupation number and the statistical behavior of the system. In particular, it is shown that the nonlinearities introduced by the parallel pumping field and the four-magnon interactions lead to non-classical quantum statistical properties of the system, such as magnon squeezing. Also control of the collapse-and-revival phenomena for the time evolution of the average magnon number is demonstrated by varying the parallel pumping amplitude and the four-magnon coupling.

  2. Parametric resonance in tunable superconducting cavities

    NASA Astrophysics Data System (ADS)

    Wustmann, Waltraut; Shumeiko, Vitaly

    2013-05-01

    We develop a theory of parametric resonance in tunable superconducting cavities. The nonlinearity introduced by the superconducting quantum interference device (SQUID) attached to the cavity and damping due to connection of the cavity to a transmission line are taken into consideration. We study in detail the nonlinear classical dynamics of the cavity field below and above the parametric threshold for the degenerate parametric resonance, featuring regimes of multistability and parametric radiation. We investigate the phase-sensitive amplification of external signals on resonance, as well as amplification of detuned signals, and relate the amplifier performance to that of linear parametric amplifiers. We also discuss applications of the device for dispersive qubit readout. Beyond the classical response of the cavity, we investigate small quantum fluctuations around the amplified classical signals. We evaluate the noise power spectrum both for the internal field in the cavity and the output field. Other quantum-statistical properties of the noise are addressed such as squeezing spectra, second-order coherence, and two-mode entanglement.

  3. Statistical mechanics based on fractional classical and quantum mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korichi, Z.; Meftah, M. T., E-mail: mewalid@yahoo.com

    2014-03-15

    The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. At the first stage we present the thermodynamical properties of the classical ideal gas and of a system of N classical oscillators. In both cases, the Hamiltonian contains fractional exponents of the phase space variables (position and momentum). At the second stage, in the context of fractional quantum mechanics, we calculate the thermodynamical properties of black-body radiation, study Bose-Einstein statistics with the related problem of condensation, and study Fermi-Dirac statistics.
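
    A worked check of the classical piece, under an assumed single-particle 1D Hamiltonian with fractional exponents, H = |p|**alpha + |q|**nu (not necessarily the paper's exact form): the partition function factorizes into two one-dimensional integrals with a Gamma-function closed form.

        import numpy as np
        from math import gamma
        from scipy.integrate import quad

        alpha, nu, beta = 1.5, 3.0, 1.0   # exponents and inverse temperature

        def line_integral(s):
            # integral over the real line of exp(-beta * |x|**s)
            val, _ = quad(lambda x: np.exp(-beta * x**s), 0.0, np.inf)
            return 2.0 * val

        Z_numeric = line_integral(alpha) * line_integral(nu)
        # Closed form: (2/s) * beta**(-1/s) * Gamma(1/s) for each factor
        Z_exact = ((2/alpha) * beta**(-1/alpha) * gamma(1/alpha)
                   * (2/nu) * beta**(-1/nu) * gamma(1/nu))
        print(Z_numeric, Z_exact)  # thermodynamics follows from -d(ln Z)/d(beta)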

  4. Finite-block-length analysis in classical and quantum information theory.

    PubMed

    Hayashi, Masahito

    2017-01-01

    Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite size effects. The present paper reviews finite size effects in classical and quantum information theory with respect to various topics, including applied aspects.
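
    One concrete finite-blocklength estimate from this literature is the Gaussian (second-order) approximation R ≈ C - sqrt(V/n) Q^{-1}(ε), sketched below for a binary symmetric channel; the logarithmic correction term is omitted and the parameter values are illustrative.

        import numpy as np
        from scipy.stats import norm

        def bsc_rate(n, p=0.11, eps=1e-3):
            # Capacity and channel dispersion of the binary symmetric channel
            h = -p*np.log2(p) - (1-p)*np.log2(1-p)   # binary entropy
            C = 1.0 - h
            V = p*(1-p) * np.log2((1-p)/p)**2
            return C - np.sqrt(V/n) * norm.isf(eps)  # Q^{-1}(eps) = norm.isf(eps)

        for n in (100, 1000, 10000):
            print(n, bsc_rate(n))  # rate approaches capacity as n grows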

  6. Power spectra as a diagnostic tool in probing statistical/nonstatistical behavior in unimolecular reactions

    NASA Astrophysics Data System (ADS)

    Chang, Xiaoyen Y.; Sewell, Thomas D.; Raff, Lionel M.; Thompson, Donald L.

    1992-11-01

    The possibility of utilizing different types of power spectra obtained from classical trajectories as a diagnostic tool to identify the presence of nonstatistical dynamics is explored by using the unimolecular bond-fission reactions of 1,2-difluoroethane and the 2-chloroethyl radical as test cases. In previous studies, the reaction rates for these systems were calculated by using a variational transition-state theory and classical trajectory methods. A comparison of the results showed that 1,2-difluoroethane is a nonstatistical system, while the 2-chloroethyl radical behaves statistically. Power spectra for these two systems have been generated under various conditions. The characteristics of these spectra are as follows: (1) The spectra for the 2-chloroethyl radical are always broader and more coupled to other modes than is the case for 1,2-difluoroethane. This is true even at very low levels of excitation. (2) When an internal energy near or above the dissociation threshold is initially partitioned into a local C-H stretching mode, the power spectra for 1,2-difluoroethane broaden somewhat, but discrete and somewhat isolated bands are still clearly evident. In contrast, the analogous power spectra for the 2-chloroethyl radical exhibit a near complete absence of isolated bands. The general appearance of the spectrum suggests a very high level of mode-to-mode coupling, large intramolecular vibrational energy redistribution (IVR) rates, and global statistical behavior. (3) The appearance of the power spectrum for the 2-chloroethyl radical is unaltered regardless of whether the initial C-H excitation is in the CH2 or the CH2Cl group. This result also suggests statistical behavior. These results are interpreted to mean that power spectra may be used as a diagnostic tool to assess the statistical character of a system. The presence of a diffuse spectrum exhibiting a nearly complete loss of isolated structures indicates that the dissociation dynamics of the molecule will be well described by statistical theories. If, however, the power spectrum maintains its discrete, isolated character, as is the case for 1,2-difluoroethane, the opposite conclusion is suggested. Since power spectra are very easily computed, this diagnostic method may prove to be useful.
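
    Since the diagnostic is straightforward to implement, a minimal FFT-based sketch follows: a windowed power spectrum of a trajectory time series, with a mock two-mode signal standing in for a real bond-length history. Discrete, isolated bands would suggest nonstatistical dynamics; a diffuse, strongly coupled spectrum suggests statistical behavior.

        import numpy as np

        def power_spectrum(x, dt):
            # One-sided power spectrum of a mean-removed, Hann-windowed series
            x = (x - x.mean()) * np.hanning(len(x))
            X = np.fft.rfft(x)
            return np.fft.rfftfreq(len(x), d=dt), np.abs(X)**2

        dt = 0.5e-15                              # 0.5 fs timestep
        t = np.arange(2**14) * dt
        x = np.sin(2*np.pi*9.0e13*t) + 0.3*np.sin(2*np.pi*4.5e13*t)
        freqs, P = power_spectrum(x, dt)          # two sharp, isolated bands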

  7. Koopman-von Neumann formulation of classical Yang-Mills theories: I

    NASA Astrophysics Data System (ADS)

    Carta, P.; Gozzi, E.; Mauro, D.

    2006-03-01

    In this paper we present the Koopman-von Neumann (KvN) formulation of classical non-Abelian gauge field theories. In particular we shall explore the functional (or classical path integral) counterpart of the KvN method. In the quantum path integral quantization of Yang-Mills theories concepts like gauge-fixing and Faddeev-Popov determinant appear in a quite natural way. We will prove that these same objects are needed also in this classical path integral formulation for Yang-Mills theories. We shall also explore the classical path integral counterpart of the BFV formalism and build all the associated universal and gauge charges. These last are quite different from the analog quantum ones and we shall show the relation between the two. This paper lays the foundation of this formalism which, due to the many auxiliary fields present, is rather heavy. Applications to specific topics outlined in the paper will appear in later publications.

  8. a Classical Isodual Theory of Antimatter and its Prediction of Antigravity

    NASA Astrophysics Data System (ADS)

    Santilli, Ruggero Maria

    An inspection of the contemporary physics literature reveals that, while matter is treated at all levels of study, from Newtonian mechanics to quantum field theory, antimatter is solely treated at the level of second quantization. For the purpose of initiating the restoration of full equivalence in the treatment of matter and antimatter in due time, and as the classical foundations of an axiomatically consistent inclusion of gravitation in unified gauge theories recently appeared elsewhere, in this paper we present a classical representation of antimatter which begins at the primitive Newtonian level with corresponding formulations at all subsequent levels. By recalling that charge conjugation of particles into antiparticles is antiautomorphic, the proposed theory of antimatter is based on a new map, called isoduality, which is also antiautomorphic (and more generally, antiisomorphic), yet it is applicable beginning at the classical level and then persists at the quantum level where it becomes equivalent to charge conjugation. We therefore present, apparently for the first time, the classical isodual theory of antimatter, we identify the physical foundations of the theory as being the novel isodual Galilean, special and general relativities, and we show the compatibility of the theory with all available classical experimental data on antimatter. We identify the classical foundations of the prediction of antigravity for antimatter in the field of matter (or vice-versa) without any claim on its validity, and defer its resolution to specifically identified experiments. We identify the novel, classical, isodual electromagnetic waves which are predicted to be emitted by antimatter, the so-called space-time machine based on a novel non-Newtonian geometric propulsion, and other implications of the theory. We also introduce, apparently for the first time, the isodual space and time inversions and show that they are nontrivially different than the conventional ones, thus offering a possibility for the future resolution whether far away galaxies and quasars are made up of matter or of antimatter. The paper ends with the indication that the studies are at their first infancy, and indicates some of the open problems. To avoid a prohibitive length, the paper is restricted to the classical treatment, while studies on operator profiles are treated elsewhere.

  9. A psychometric evaluation of the digital logic concept inventory

    NASA Astrophysics Data System (ADS)

    Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.

    2014-10-01

    Concept inventories hold tremendous promise for promoting the rigorous evaluation of teaching methods that might remedy common student misconceptions and promote deep learning. The measurements from concept inventories can be trusted only if the concept inventories are evaluated both by expert feedback and statistical scrutiny (psychometric evaluation). Classical Test Theory and Item Response Theory provide two psychometric frameworks for evaluating the quality of assessment tools. We discuss how these theories can be applied to assessment tools generally and then apply them to the Digital Logic Concept Inventory (DLCI). We demonstrate that the DLCI is sufficiently reliable for research purposes when used in its entirety and as a post-course assessment of students' conceptual understanding of digital logic. The DLCI can also discriminate between students across a wide range of ability levels, providing the most information about weaker students' ability levels.
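
    Two of the Classical Test Theory quantities used in such psychometric evaluations are simple to compute; the sketch below gives Cronbach's alpha and corrected item-total discriminations for a generic 0/1 score matrix (these are textbook formulas, not the DLCI analysis itself).

        import numpy as np

        def cronbach_alpha(X):
            # X: examinees x items matrix of 0/1 scores
            k = X.shape[1]
            item_var = X.var(axis=0, ddof=1).sum()
            total_var = X.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_var / total_var)

        def item_discrimination(X):
            # Corrected item-total (point-biserial) correlation per item
            total = X.sum(axis=1)
            return np.array([np.corrcoef(X[:, j], total - X[:, j])[0, 1]
                             for j in range(X.shape[1])])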

  10. SurfKin: an ab initio kinetic code for modeling surface reactions.

    PubMed

    Le, Thong Nguyen-Minh; Liu, Bin; Huynh, Lam K

    2014-10-05

    In this article, we describe a C/C++ program called SurfKin (Surface Kinetics) to construct microkinetic mechanisms for modeling gas-surface reactions. Thermodynamic properties of reaction species are estimated based on density functional theory calculations and statistical mechanics. Rate constants for elementary steps (including adsorption, desorption, and chemical reactions on surfaces) are calculated using the classical collision theory and transition state theory. Methane decomposition and water-gas shift reaction on Ni(111) surface were chosen as test cases to validate the code implementations. The good agreement with literature data suggests this is a powerful tool to facilitate the analysis of complex reactions on surfaces, and thus it helps to effectively construct detailed microkinetic mechanisms for such surface reactions. SurfKin also opens a possibility for designing nanoscale model catalysts. Copyright © 2014 Wiley Periodicals, Inc.
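
    The transition-state-theory step can be illustrated with the standard Eyring-type expression; the barrier and partition-function ratio below are placeholders, not SurfKin inputs or outputs.

        import numpy as np

        KB = 1.380649e-23    # J/K
        H  = 6.62607015e-34  # J*s
        R  = 8.314462618     # J/(mol*K)

        def tst_rate(T, dE0, q_ratio):
            # k = (kB*T/h) * (Q_TS/Q_reactant) * exp(-dE0/(R*T)),
            # with dE0 in J/mol and q_ratio a precomputed partition-function ratio
            return (KB * T / H) * q_ratio * np.exp(-dE0 / (R * T))

        print(tst_rate(600.0, 90e3, 1e-2))  # s^-1, illustrative numbers only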

  11. Classical Statistics and Statistical Learning in Imaging Neuroscience

    PubMed Central

    Bzdok, Danilo

    2017-01-01

    Brain-imaging research has predominantly generated insight by means of classical statistics, including regression-type analyses and null-hypothesis testing using t-test and ANOVA. Throughout recent years, statistical learning methods enjoy increasing popularity especially for applications in rich and complex data, including cross-validated out-of-sample prediction using pattern classification and sparsity-inducing regression. This concept paper discusses the implications of inferential justifications and algorithmic methodologies in common data analysis scenarios in neuroimaging. It is retraced how classical statistics and statistical learning originated from different historical contexts, build on different theoretical foundations, make different assumptions, and evaluate different outcome metrics to permit differently nuanced conclusions. The present considerations should help reduce current confusion between model-driven classical hypothesis testing and data-driven learning algorithms for investigating the brain with imaging techniques. PMID:29056896

  12. Statistical Learning Theory for High Dimensional Prediction: Application to Criterion-Keyed Scale Development

    PubMed Central

    Chapman, Benjamin P.; Weiss, Alexander; Duberstein, Paul

    2016-01-01

    Statistical learning theory (SLT) is the statistical formulation of machine learning theory, a body of analytic methods common in “big data” problems. Regression-based SLT algorithms seek to maximize predictive accuracy for some outcome, given a large pool of potential predictors, without overfitting the sample. Research goals in psychology may sometimes call for high dimensional regression. One example is criterion-keyed scale construction, where a scale with maximal predictive validity must be built from a large item pool. Using this as a working example, we first introduce a core principle of SLT methods: minimization of expected prediction error (EPE). Minimizing EPE is fundamentally different than maximizing the within-sample likelihood, and hinges on building a predictive model of sufficient complexity to predict the outcome well, without undue complexity leading to overfitting. We describe how such models are built and refined via cross-validation. We then illustrate how three common SLT algorithms–Supervised Principal Components, Regularization, and Boosting—can be used to construct a criterion-keyed scale predicting all-cause mortality, using a large personality item pool within a population cohort. Each algorithm illustrates a different approach to minimizing EPE. Finally, we consider broader applications of SLT predictive algorithms, both as supportive analytic tools for conventional methods, and as primary analytic tools in discovery phase research. We conclude that despite their differences from the classic null-hypothesis testing approach—or perhaps because of them–SLT methods may hold value as a statistically rigorous approach to exploratory regression. PMID:27454257
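
    A minimal sketch of the regularization route on mock data: an L1-penalized logistic regression, tuned by cross-validation to control expected prediction error, retains a sparse subset of a large item pool. The data-generating choices are invented for illustration.

        import numpy as np
        from sklearn.linear_model import LogisticRegressionCV

        rng = np.random.default_rng(3)
        X = rng.integers(1, 6, size=(500, 100)).astype(float)  # mock 100-item pool
        w = np.zeros(100); w[:8] = 0.5                         # few predictive items
        y = (X @ w + rng.normal(0, 4, 500)) > np.median(X @ w) # mock outcome

        # Cross-validated L1 penalty: model complexity is chosen to minimize EPE
        clf = LogisticRegressionCV(Cs=10, penalty="l1", solver="liblinear", cv=5)
        clf.fit(X, y)
        kept = np.flatnonzero(clf.coef_[0])                    # items for the scale
        print(len(kept), "items retained")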

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blume-Kohout, Robin J; Scholten, Travis L.

    Quantum state tomography on a d-dimensional system demands resources that grow rapidly with d. They may be reduced by using model selection to tailor the number of parameters in the model (i.e., the size of the density matrix). Most model selection methods typically rely on a test statistic and a null theory that describes its behavior when two models are equally good. Here, we consider the loglikelihood ratio. Because of the positivity constraint ρ ≥ 0, quantum state space does not generally satisfy local asymptotic normality (LAN), meaning the classical null theory for the loglikelihood ratio (the Wilks theorem) should not be used. Thus, understanding and quantifying how positivity affects the null behavior of this test statistic is necessary for its use in model selection for state tomography. We define a new generalization of LAN, metric-projected LAN, show that quantum state space satisfies it, and derive a replacement for the Wilks theorem. In addition to enabling reliable model selection, our results shed more light on the qualitative effects of the positivity constraint on state tomography.
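
    For reference, the classical null that this record argues breaks down can be stated directly in code: under the smaller of two nested models, twice the loglikelihood ratio is compared to a chi-squared quantile (the Wilks theorem). The numbers are illustrative.

        from scipy.stats import chi2

        def wilks_test(loglik_big, loglik_small, extra_params, alpha=0.05):
            # lambda = 2*(loglik_big - loglik_small) ~ chi2(extra_params)
            # under the smaller model -- invalid when the rho >= 0 boundary
            # breaks local asymptotic normality, as discussed above.
            lam = 2.0 * (loglik_big - loglik_small)
            return lam, lam > chi2.ppf(1 - alpha, df=extra_params)

        print(wilks_test(-1210.3, -1214.9, extra_params=3))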

  14. The Prediction of Item Parameters Based on Classical Test Theory and Latent Trait Theory

    ERIC Educational Resources Information Center

    Anil, Duygu

    2008-01-01

    In this study, the predictive power of experts' judgments of item characteristics, for conditions under which try-out practices cannot be applied, was examined against item characteristics computed under classical test theory and the two-parameter logistic model of latent trait theory. The study was carried out on 9914 randomly selected students…

  15. Any Ontological Model of the Single Qubit Stabilizer Formalism must be Contextual

    NASA Astrophysics Data System (ADS)

    Lillystone, Piers; Wallman, Joel J.

    Quantum computers allow us to easily solve some problems classical computers find hard. Non-classical improvements in computational power should be due to some non-classical property of quantum theory. Contextuality, a more general notion of non-locality, is a necessary, but not sufficient, resource for quantum speed-up. Proofs of contextuality can be constructed for the classically simulable stabilizer formalism. Previous proofs of stabilizer contextuality are known for 2 or more qubits, for example the Mermin-Peres magic square. In the work presented we extend these results and prove that any ontological model of the single qubit stabilizer theory must be contextual, as defined by R. Spekkens, and give a relation between our result and the Mermin-Peres square. By demonstrating that contextuality is present in the qubit stabilizer formalism we provide further insight into the contextuality present in quantum theory. Understanding the contextuality of classical sub-theories will allow us to better identify the physical properties of quantum theory required for computational speed up. This research was supported by CIFAR, the Government of Ontario, and the Government of Canada through NSERC and Industry Canada.

  16. Evaluation of the mathematical and economic basis for conversion processes in the LEAP energy-economy model

    NASA Astrophysics Data System (ADS)

    Oblow, E. M.

    1982-10-01

    An evaluation was made of the mathematical and economic basis for conversion processes in the Long-term Energy Analysis Program (LEAP) energy economy model. Conversion processes are the main modeling subunit in LEAP used to represent energy conversion industries and are supposedly based on the classical economic theory of the firm. Questions about uniqueness and existence of LEAP solutions and their relation to classical equilibrium economic theory prompted the study. An analysis of classical theory and LEAP model equations was made to determine their exact relationship. The conclusions drawn from this analysis were that LEAP theory is not consistent with the classical theory of the firm. Specifically, the capacity factor formalism used by LEAP does not support a classical interpretation in terms of a technological production function for energy conversion processes. The economic implications of this inconsistency are suboptimal process operation and short term negative profits in years where plant operation should be terminated. A new capacity factor formalism, which retains the behavioral features of the original model, is proposed to resolve these discrepancies.

  17. (Re)igniting a Sociological Imagination in Adult Education: The Continuing Relevance of Classical Theory

    ERIC Educational Resources Information Center

    Lange, Elizabeth

    2015-01-01

    This article argues that sociology has been a foundational discipline for the field of adult education, but it has been largely implicit, until recently. This article contextualizes classical theories of sociology within contemporary critiques, reviews the historical roots of sociology and then briefly introduces the classical theories…

  18. Contemporary understanding of riots: Classical crowd psychology, ideology and the social identity approach.

    PubMed

    Stott, Clifford; Drury, John

    2016-04-01

    This article explores the origins and ideology of classical crowd psychology, a body of theory reflected in contemporary popularised understandings such as of the 2011 English 'riots'. This article argues that during the nineteenth century, the crowd came to symbolise a fear of 'mass society' and that 'classical' crowd psychology was a product of these fears. Classical crowd psychology pathologised, reified and decontextualised the crowd, offering the ruling elites a perceived opportunity to control it. We contend that classical theory misrepresents crowd psychology and survives in contemporary understanding because it is ideological. We conclude by discussing how classical theory has been supplanted in academic contexts by an identity-based crowd psychology that restores the meaning to crowd action, replaces it in its social context and in so doing transforms theoretical understanding of 'riots' and the nature of the self. © The Author(s) 2016.

  19. On the transition from the quantum to the classical regime for massive scalar particles: A spatiotemporal approach

    NASA Astrophysics Data System (ADS)

    Lusanna, Luca; Pauri, Massimo

    2014-08-01

    If the classical structure of space-time is assumed to define an a priori scenario for the formulation of quantum theory (QT), the coordinate representation of the solutions of the Schrödinger equation of a quantum system containing one (or N) massive scalar particles has a preferred status. Let us consider all of the solutions admitting a multipolar expansion of the probability density function (and more generally of the Wigner function) around a space-time trajectory to be properly selected. For every normalized solution there is a privileged trajectory implying the vanishing of the dipole moment of the multipolar expansion: it is given by the expectation value of the position operator. Then, the special subset of solutions which satisfy Ehrenfest's Theorem (named thereby Ehrenfest monopole wave functions (EMWF)) has the important property that this privileged classical trajectory is determined by a closed Newtonian equation of motion where the effective force is the Newtonian force plus non-Newtonian terms (of order ħ² or higher) depending on the higher multipoles of the probability distribution ρ. Note that the superposition of two EMWFs is not an EMWF, a result to be strongly hoped for, given the possible unwanted implications concerning classical spatial perception. These results can be extended to N-particle systems in such a way that, when N classical trajectories with all the dipole moments vanishing and satisfying Ehrenfest's theorem are associated with the normalized wave functions of the N-body system, we get a natural transition from the 3N-dimensional configuration space to the space-time. Moreover, these results can be extended to relativistic quantum mechanics. Consequently, in suitable states of N quantum particles which are EMWF, we get the "emergence" of corresponding "classical particles" following Newton-like trajectories in space-time. Note that all this holds true in the standard framework of quantum mechanics, i.e. assuming, in particular, the validity of Born's rule and the individual system interpretation of the wave function (no ensemble interpretation). These results are valid without any approximation (like ħ → 0, big quantum numbers, etc.). Moreover, we do not commit ourselves to any specific ontological interpretation of quantum theory (such as, e.g., the Bohmian one). We will argue that, in substantial agreement with Bohr's viewpoint, the macroscopic description of the preparation, certain intermediate steps and the detection of the final outcome of experiments involving massive particles are dominated by these classical "effective" trajectories. This approach can be applied to the point of view of de-coherence in the case of a diagonal reduced density matrix ρ_red (an improper mixture) depending on the position variables of a massive particle and of a pointer. When both the particle and the pointer wave functions appearing in ρ_red are EMWF, the expectation value of the particle and pointer position variables becomes a statistical average over a classical ensemble. In these cases an improper quantum mixture becomes a classical statistical one, thus providing a particular answer to an open problem of de-coherence about the emergence of classicality.

  20. Influence of an asymmetric ring on the modeling of an orthogonally stiffened cylindrical shell

    NASA Technical Reports Server (NTRS)

    Rastogi, Naveen; Johnson, Eric R.

    1994-01-01

    Structural models are examined for the influence of a ring with an asymmetrical cross section on the linear elastic response of an orthogonally stiffened cylindrical shell subjected to internal pressure. The first structural model employs classical theory for the shell and stiffeners. The second model employs transverse shear deformation theories for the shell and stringer and classical theory for the ring. Closed-end pressure vessel effects are included. Interacting line load intensities are computed in the stiffener-to-skin joints for an example problem having the dimensions of the fuselage of a large transport aircraft. Classical structural theory is found to exaggerate the asymmetric response compared to the transverse shear deformation theory.

  1. Design and control strategies for CELSS - Integrating mechanistic paradigms and biological complexities

    NASA Technical Reports Server (NTRS)

    Moore, B., III; Kaufmann, R.; Reinhold, C.

    1981-01-01

    Systems analysis and control theory considerations are given to simulations of both individual components and total systems, in order to develop a reliable control strategy for a Controlled Ecological Life Support System (CELSS) which includes complex biological components. Because of the numerous nonlinearities and tight coupling within the biological component, classical control theory may be inadequate and the statistical analysis of factorial experiments more useful. The range in control characteristics of particular species may simplify the overall task by providing an appropriate balance of stability and controllability to match species function in the overall design. The ultimate goal of this research is the coordination of biological and mechanical subsystems in order to achieve a self-supporting environment.

  2. Space-Group Symmetries Generate Chaotic Fluid Advection in Crystalline Granular Media

    NASA Astrophysics Data System (ADS)

    Turuban, R.; Lester, D. R.; Le Borgne, T.; Méheust, Y.

    2018-01-01

    The classical connection between symmetry breaking and the onset of chaos in dynamical systems harks back to the seminal theory of Noether [Transp. Theory Statist. Phys. 1, 186 (1971), 10.1080/00411457108231446]. We study the Lagrangian kinematics of steady 3D Stokes flow through simple cubic and body-centered cubic (bcc) crystalline lattices of close-packed spheres, and uncover an important exception. While breaking of point-group symmetries is a necessary condition for chaotic mixing in both lattices, a further space-group (glide) symmetry of the bcc lattice generates a transition from globally regular to globally chaotic dynamics. This finding provides new insights into chaotic mixing in porous media and has significant implications for understanding the impact of symmetries upon generic dynamical systems.

  3. Interactions dominate the dynamics of visual cognition.

    PubMed

    Stephen, Damian G; Mirman, Daniel

    2010-04-01

    Many cognitive theories have described behavior as the summation of independent contributions from separate components. Contrasting views have emphasized the importance of multiplicative interactions and emergent structure. We describe a statistical approach to distinguishing additive and multiplicative processes and apply it to the dynamics of eye movements during classic visual cognitive tasks. The results reveal interaction-dominant dynamics in eye movements in each of the three tasks and show that fine-grained eye movements are modulated by task constraints. These findings reveal the interactive nature of cognitive processing and are consistent with theories that view cognition as an emergent property of processes that are broadly distributed over many scales of space and time rather than a componential assembly line. Copyright 2009 Elsevier B.V. All rights reserved.
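
    One common way to operationalize the additive/multiplicative distinction (a hedged sketch, not necessarily the authors' exact procedure) is to compare normal and lognormal fits to a positive-valued measure, since sums of independent components tend toward normality while products tend toward lognormality:

        import numpy as np
        from scipy import stats

        def additive_or_multiplicative(samples):
            # samples must be strictly positive for the lognormal fit
            x = np.asarray(samples, dtype=float)
            ll_norm = stats.norm.logpdf(x, x.mean(), x.std(ddof=1)).sum()
            shape, loc, scale = stats.lognorm.fit(x, floc=0.0)  # loc pinned at 0
            ll_lognorm = stats.lognorm.logpdf(x, shape, loc, scale).sum()
            # AIC = 2k - 2*loglik, with k = 2 free parameters in each model
            aic_norm, aic_lognorm = 4 - 2 * ll_norm, 4 - 2 * ll_lognorm
            return "multiplicative" if aic_lognorm < aic_norm else "additive"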

  4. Quantum formalism for classical statistics

    NASA Astrophysics Data System (ADS)

    Wetterich, C.

    2018-06-01

    In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.
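
    A toy numerical illustration of the transfer-matrix ("Heisenberg picture") side of this correspondence, for the 1D Ising chain with periodic boundaries; the model choice is ours, not the paper's:

        import numpy as np

        def ising_partition_function(J, h, beta, N):
            # Probabilistic information is propagated site by site by a 2x2
            # transfer matrix T(s, s') = exp(beta*(J*s*s' + h*(s+s')/2)).
            T = np.array([[np.exp(beta * (J + h)), np.exp(-beta * J)],
                          [np.exp(-beta * J),      np.exp(beta * (J - h))]])
            # Periodic boundaries: Z = Tr(T^N)
            return np.trace(np.linalg.matrix_power(T, N))

    The repeated application of T is the discrete analogue of the evolution between neighboring hypersurfaces described in the abstract.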

  5. String Theory Methods for Condensed Matter Physics

    NASA Astrophysics Data System (ADS)

    Nastase, Horatiu

    2017-09-01

    Preface; Acknowledgments; Introduction; Part I. Condensed Matter Models and Problems: 1. Lightning review of statistical mechanics, thermodynamics, phases and phase transitions; 2. Magnetism in solids; 3. Electrons in solids: Fermi gas vs. Fermi liquid; 4. Bosonic quasi-particles: phonons and plasmons; 5. Spin-charge separation in 1+1 dimensional solids: spinons and holons; 6. The Ising model and the Heisenberg spin chain; 7. Spin chains and integrable systems; 8. The thermodynamic Bethe ansatz; 9. Conformal field theories and quantum phase transitions; 10. Classical vs. quantum Hall effect; 11. Superconductivity: Landau-Ginzburg, London and BCS; 12. Topology and statistics: Berry and Chern-Simons, anyons and nonabelions; 13. Insulators; 14. The Kondo effect and the Kondo problem; 15. Hydrodynamics and transport properties: from Boltzmann to Navier-Stokes; Part II. Elements of General Relativity and String Theory: 16. The Einstein equation and the Schwarzschild solution; 17. The Reissner-Nordstrom and Kerr-Newman solutions and thermodynamic properties of black holes; 18. Extra dimensions and Kaluza-Klein; 19. Electromagnetism and gravity in various dimensions. Consistent truncations; 20. Gravity plus matter: black holes and p-branes in various dimensions; 21. Weak/strong coupling dualities in 1+1, 2+1, 3+1 and d+1 dimensions; 22. The relativistic point particle and the relativistic string; 23. Lightcone strings and quantization; 24. D-branes and gauge fields; 25. Electromagnetic fields on D-branes. Supersymmetry and N = 4 SYM. T-duality of closed strings; 26. Dualities and M theory; 27. The AdS/CFT correspondence: definition and motivation; Part III. Applying String Theory to Condensed Matter Problems: 28. The pp wave correspondence: string Hamiltonian from N = 4 SYM; 29. Spin chains from N = 4 SYM; 30. The Bethe ansatz: Bethe strings from classical strings in AdS; 31. Integrability and AdS/CFT; 32. AdS/CFT phenomenology: Lifshitz, Galilean and Schrodinger symmetries and their gravity duals; 33. Finite temperature and black holes; 34. Hot plasma equilibrium thermodynamics: entropy, charge density and chemical potential of strongly coupled theories; 35. Spectral functions and transport properties; 36. Dynamic and nonequilibrium properties of plasmas: electric transport, Langevin diffusion and thermalization via black hole quasi-normal modes; 37. The holographic superconductor; 38. The fluid-gravity correspondence: conformal relativistic fluids from black hole horizons; 39. Nonrelativistic fluids: from Einstein to Navier-Stokes and back; Part IV. Advanced Applications: 40. Fermi gas and liquid in AdS/CFT; 41. Quantum Hall effect from string theory; 42. Quantum critical systems and AdS/CFT; 43. Particle-vortex duality and ABJM vs. AdS4 X CP3 duality; 44. Topology and non-standard statistics from AdS/CFT; 45. DBI scalar model for QGP/black hole hydro- and thermo-dynamics; 46. Holographic entanglement entropy in condensed matter; 47. Holographic insulators; 48. Holographic strange metals and the Kondo problem; References; Index.

  6. Demonstrating the Difference between Classical Test Theory and Item Response Theory Using Derived Test Data

    ERIC Educational Resources Information Center

    Magno, Carlo

    2009-01-01

    The present report demonstrates the difference between the classical test theory (CTT) and item response theory (IRT) approaches using actual test data for junior high school chemistry students. The CTT and IRT were compared across two samples and two forms of the test on their item difficulty, internal consistency, and measurement errors. The specific…

  7. Studying Reliability of Open Ended Mathematics Items According to the Classical Test Theory and Generalizability Theory

    ERIC Educational Resources Information Center

    Guler, Nese; Gelbal, Selahattin

    2010-01-01

    In this study, classical test theory and generalizability theory were used to determine the reliability of scores obtained from a measurement tool of mathematics achievement. Twenty-four open-ended mathematics questions from TIMSS-1999 were administered to 203 students in the 2007 spring semester. Internal consistency of the scores was found to be 0.92. For…

  8. Overview of classical test theory and item response theory for the quantitative assessment of items in developing patient-reported outcomes measures.

    PubMed

    Cappelleri, Joseph C; Jason Lundy, J; Hays, Ron D

    2014-05-01

    The US Food and Drug Administration's guidance for industry document on patient-reported outcomes (PRO) defines content validity as "the extent to which the instrument measures the concept of interest" (FDA, 2009, p. 12). According to Strauss and Smith (2009), construct validity "is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity" (p. 7). Hence, both qualitative and quantitative information are essential in evaluating the validity of measures. We review classical test theory and item response theory (IRT) approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized "difficulty" (severity) order of items is represented by observed responses. If a researcher has few qualitative data and wants to get preliminary information about the content validity of the instrument, then descriptive assessments using classical test theory should be the first step. As the sample size grows during subsequent stages of instrument development, confidence in the numerical estimates from Rasch and other IRT models (as well as those of classical test theory) would also grow. Classical test theory and IRT can be useful in providing a quantitative assessment of items and scales during the content-validity phase of PRO-measure development. Depending on the particular type of measure and the specific circumstances, the classical test theory and/or the IRT should be considered to help maximize the content validity of PRO measures. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.
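
    For the classical-test-theory descriptives the review lists (item difficulty and the relationship between items and the total score), a minimal sketch with invented names; a real analysis would add response-category frequencies and floor/ceiling checks:

        import numpy as np

        def ctt_item_stats(X):
            # X: (respondents x items) matrix of 0/1 item scores
            X = np.asarray(X, dtype=float)
            difficulty = X.mean(axis=0)          # proportion correct/endorsed per item
            total = X.sum(axis=1)
            item_total = np.array([
                # corrected item-total correlation: item removed from the total
                np.corrcoef(X[:, j], total - X[:, j])[0, 1]
                for j in range(X.shape[1])
            ])
            return difficulty, item_total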

  9. Constrained variational calculus for higher order classical field theories

    NASA Astrophysics Data System (ADS)

    Campos, Cédric M.; de León, Manuel; Martín de Diego, David

    2010-11-01

    We develop an intrinsic geometrical setting for higher order constrained field theories. As a main tool we use an appropriate generalization of the classical Skinner-Rusk formalism. Some examples of applications are studied, in particular to the geometrical description of optimal control theory for partial differential equations.

  10. Quantum optical signatures in strong-field laser physics: Infrared photon counting in high-order-harmonic generation.

    PubMed

    Gonoskov, I A; Tsatrafyllis, N; Kominis, I K; Tzallas, P

    2016-09-07

    We analytically describe the strong-field light-electron interaction using a quantized coherent laser state with arbitrary photon number. We obtain a light-electron wave function which is a closed-form solution of the time-dependent Schrödinger equation (TDSE). This wave function provides information about the quantum optical features of the interaction not accessible by semi-classical theories. With this approach we can reveal the quantum optical properties of high harmonic generation (HHG) process in gases by measuring the photon statistics of the transmitted infrared (IR) laser radiation. This work can lead to novel experiments in high-resolution spectroscopy in extreme-ultraviolet (XUV) and attosecond science without the need to measure the XUV light, while it can pave the way for the development of intense non-classical light sources.

  11. Counting statistics of chaotic resonances at optical frequencies: Theory and experiments

    NASA Astrophysics Data System (ADS)

    Lippolis, Domenico; Wang, Li; Xiao, Yun-Feng

    2017-07-01

    A deformed dielectric microcavity is used as an experimental platform for the analysis of the statistics of chaotic resonances, in the perspective of testing fractal Weyl laws at optical frequencies. In order to surmount the difficulties that arise from reading strongly overlapping spectra, we exploit the mixed nature of the phase space at hand, and only count the high-Q whispering-gallery modes (WGMs) directly. That enables us to draw statistical information on the more lossy chaotic resonances, coupled to the high-Q regular modes via dynamical tunneling. Three different models [classical, Random-Matrix-Theory (RMT) based, semiclassical] to interpret the experimental data are discussed. On the basis of least-squares analysis, theoretical estimates of Ehrenfest time, and independent measurements, we find that a semiclassically modified RMT-based expression best describes the experiment in all its realizations, particularly when the resonator is coupled to visible light, while RMT alone still works quite well in the infrared. In this work we reexamine and substantially extend the results of a short paper published earlier [L. Wang et al., Phys. Rev. E 93, 040201(R) (2016), 10.1103/PhysRevE.93.040201].

  12. OPEN PROBLEM: Orbits' statistics in chaotic dynamical systems

    NASA Astrophysics Data System (ADS)

    Arnold, V.

    2008-07-01

    This paper shows how the measurement of the stochasticity degree of a finite sequence of real numbers, published by Kolmogorov in Italian in a journal of insurance statistics, can be usefully applied to measure the objective stochasticity degree of sequences originating from dynamical systems theory and from number theory. Namely, whenever the value of Kolmogorov's stochasticity parameter of a given sequence of numbers is too small (or too big), one may conclude that the conjecture describing this sequence as a sample of independent values of a random variable is highly improbable. Kolmogorov used this strategy (in a 1940 'Doklady' paper) in his fight against Lysenko, who had tried to disprove Mendel's classical law of genetics experimentally. Calculating the value of his stochasticity parameter for the numbers in Lysenko's experimental reports, Kolmogorov deduced that, while these numbers deviated from the exact fulfilment of Mendel's 3 : 1 law, any smaller deviation would have been a manifestation of falsified numbers in the reports. The calculation of the values of the stochasticity parameter would be useful for many other generators of pseudorandom numbers and for many other chaotic-looking statistics, including even the distribution of the prime numbers (discussed in this paper as an example).
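
    A minimal sketch of Kolmogorov's stochasticity parameter as described above, λ_n = √n · sup|F_n − F|, comparing the empirical CDF F_n of the sequence against a hypothesized CDF F; the implementation details are ours:

        import numpy as np

        def kolmogorov_lambda(samples, cdf):
            # Values of lambda that are too small or too large both argue
            # against the sequence being independent draws from cdf.
            x = np.sort(np.asarray(samples, dtype=float))
            n = len(x)
            F = cdf(x)
            # the empirical CDF jumps from (i-1)/n to i/n at each sorted point
            d_plus = np.max(np.arange(1, n + 1) / n - F)
            d_minus = np.max(F - np.arange(0, n) / n)
            return np.sqrt(n) * max(d_plus, d_minus)

        # e.g. against a uniform null on [0, 1]:
        # lam = kolmogorov_lambda(seq, lambda t: t)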

  13. Application of quantum master equation for long-term prognosis of asset-prices

    NASA Astrophysics Data System (ADS)

    Khrennikova, Polina

    2016-05-01

    This study combines the disciplines of behavioral finance and an extension of econophysics, namely the concepts and mathematical structure of quantum physics. We apply the formalism of quantum theory to model the dynamics of some correlated financial assets, where the proposed model can potentially be applied to developing a long-term prognosis of asset price formation. At the informational level, the asset price states interact with each other by means of a 'financial bath'. The latter is composed of agents' expectations about the future development of asset prices on the finance market, as well as financially important information from mass media, society, and politicians. One of the essential behavioral factors leading to the quantum-like dynamics of asset prices is the irrationality of agents' expectations operating on the finance market. These expectations lead to a deeper type of uncertainty concerning the future price dynamics of the assets than that given by classical probability theory, e.g., in the framework of classical financial mathematics, which is based on the theory of stochastic processes. The quantum dimension of the uncertainty in price dynamics is expressed in the form of price-state superposition and entanglement between the prices of different financial assets. In our model, the resolution of this deep quantum uncertainty is mathematically captured with the aid of the quantum master equation (its quantum Markov approximation). We illustrate our model of preparation of a future asset price prognosis by a numerical simulation involving two correlated assets. Their returns interact more intensively than classical statistical correlation can capture. The model predictions can be extended to more complex models to obtain price configurations for multiple assets and portfolios.
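
    As an illustrative sketch of the mathematical object involved (not the authors' calibrated model), here is a plain-numpy Euler integration of a GKSL (Lindblad) quantum Markov master equation for a toy two-state "asset" density matrix; the Hamiltonian and damping operator are invented for illustration:

        import numpy as np

        def lindblad_step(rho, H, Ls, dt):
            # One explicit-Euler step of
            # drho/dt = -i[H, rho] + sum_k (L rho L+ - 1/2 {L+L, rho})
            drho = -1j * (H @ rho - rho @ H)
            for L in Ls:
                LdL = L.conj().T @ L
                drho += L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL)
            return rho + dt * drho

        # hypothetical two-price-state example: relaxation toward the lower state
        H = np.array([[1.0, 0.2], [0.2, -1.0]], dtype=complex)
        L0 = np.sqrt(0.1) * np.array([[0, 1], [0, 0]], dtype=complex)
        rho = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # superposed start
        for _ in range(1000):
            rho = lindblad_step(rho, H, [L0], dt=0.01)

    The off-diagonal decay of rho is the "resolution of the deep quantum uncertainty" in miniature; Euler stepping is used only to keep the sketch short.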

  14. Dressing the post-Newtonian two-body problem and classical effective field theory

    NASA Astrophysics Data System (ADS)

    Kol, Barak; Smolkin, Michael

    2009-12-01

    We apply a dressed perturbation theory to better organize and economize the computation of high orders of the 2-body effective action of an inspiralling post-Newtonian (PN) gravitating binary. We use the effective field theory approach with the nonrelativistic field decomposition (NRG fields). For that purpose we develop quite generally the dressing theory of a nonlinear classical field theory coupled to pointlike sources. We introduce dressed charges and propagators, but unlike the quantum theory there are no dressed bulk vertices. The dressed quantities are found to obey recursive integral equations which succinctly encode parts of the diagrammatic expansion, and are the classical version of the Schwinger-Dyson equations. Actually, the classical equations are somewhat stronger since they involve only finitely many quantities, unlike the quantum theory. Classical diagrams are shown to factorize exactly when they contain nonlinear worldline vertices, and we classify all the possible topologies of irreducible diagrams for low loop numbers. We apply the dressing program to our post-Newtonian case of interest. The dressed charges consist of the dressed energy-momentum tensor after a nonrelativistic decomposition, and we compute all dressed charges (in the harmonic gauge) appearing up to 2PN in the 2-body effective action (and more). We determine the irreducible skeleton diagrams up to 3PN and we employ the dressed charges to compute several terms beyond 2PN.

  15. Quantum-Like Model for Decision Making Process in Two Players Game. A Non-Kolmogorovian Model

    NASA Astrophysics Data System (ADS)

    Asano, Masanari; Ohya, Masanori; Khrennikov, Andrei

    2011-03-01

    In game experiments, players frequently make choices which are regarded as irrational in game theory. In papers of Khrennikov (Information Dynamics in Cognitive, Psychological and Anomalous Phenomena. Fundamental Theories of Physics, Kluwer Academic, Norwell, 2004; Fuzzy Sets Syst. 155:4-17, 2005; Biosystems 84:225-241, 2006; Found. Phys. 35(10):1655-1693, 2005; in QP-PQ Quantum Probability and White Noise Analysis, vol. XXIV, pp. 105-117, 2009), it was pointed out that statistics collected in such experiments have "quantum-like" properties, which cannot be explained in classical probability theory. In this paper, we design a simple quantum-like model describing a decision-making process in a two-player game and try to explain a mechanism of the irrational behavior of players. Finally we discuss a mathematical framework for non-Kolmogorovian systems in terms of liftings (Accardi and Ohya, in Appl. Math. Optim. 39:33-59, 1999).

  16. Density-functional theory for fluid-solid and solid-solid phase transitions.

    PubMed

    Bharadwaj, Atul S; Singh, Yashwant

    2017-03-01

    We develop a theory to describe solid-solid phase transitions. The density functional formalism of classical statistical mechanics is used to find an exact expression for the difference in the grand thermodynamic potentials of the two coexisting phases. The expression involves both the symmetry-conserving and the symmetry-broken parts of the direct pair correlation function. The theory is used to calculate the phase diagram of systems of soft spheres interacting via inverse power potentials u(r)=ε(σ/r)^{n}, where the parameter n measures the softness of the potential. We find that for 1/n<0.154 systems freeze into the face-centered-cubic (fcc) structure, while for 1/n≥0.154 the body-centered-cubic (bcc) structure is preferred. The bcc structure transforms into the fcc structure upon increasing the density. The calculated phase diagram is in good agreement with the one found from molecular simulations.
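
    For concreteness, a small sketch restating the paper's quantitative result in code; the threshold 1/n = 0.154 is taken from the abstract, and the function names are ours:

        def soft_sphere_potential(r, eps, sigma, n):
            # inverse power ("soft sphere") pair potential u(r) = eps*(sigma/r)**n
            return eps * (sigma / r) ** n

        def preferred_freezing_structure(n):
            # per the reported softness threshold: 1/n < 0.154 -> fcc, else bcc
            return "fcc" if 1.0 / n < 0.154 else "bcc"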

  17. Kinetic field theory: exact free evolution of Gaussian phase-space correlations

    NASA Astrophysics Data System (ADS)

    Fabis, Felix; Kozlikin, Elena; Lilow, Robert; Bartelmann, Matthias

    2018-04-01

    In recent work we developed a description of cosmic large-scale structure formation in terms of non-equilibrium ensembles of classical particles, with time evolution obtained in the framework of a statistical field theory. In these works, the initial correlations between particles sampled from random Gaussian density and velocity fields have so far been treated perturbatively or restricted to pure momentum correlations. Here we treat the correlations between all phase-space coordinates exactly by adopting a diagrammatic language for the different forms of correlations, directly inspired by the Mayer cluster expansion. We will demonstrate that explicit expressions for phase-space density cumulants of arbitrary n-point order, which fully capture the non-linear coupling of free streaming kinematics due to initial correlations, can be obtained from a simple set of Feynman rules. These cumulants will be the foundation for future investigations of perturbation theory in particle interactions.

  18. Bosonic Loop Diagrams as Perturbative Solutions of the Classical Field Equations in ϕ4-Theory

    NASA Astrophysics Data System (ADS)

    Finster, Felix; Tolksdorf, Jürgen

    2012-05-01

    Solutions of the classical ϕ4-theory in Minkowski space-time are analyzed in a perturbation expansion in the nonlinearity. Using the language of Feynman diagrams, the solution of the Cauchy problem is expressed in terms of tree diagrams which involve the retarded Green's function and have one outgoing leg. In order to obtain general tree diagrams, we set up a "classical measurement process" in which a virtual observer of a scattering experiment modifies the field and detects suitable energy differences. By adding a classical stochastic background field, we even obtain all loop diagrams. The expansions are compared with the standard Feynman diagrams of the corresponding quantum field theory.

  19. Quantum mean-field approximation for lattice quantum models: Truncating quantum correlations and retaining classical ones

    NASA Astrophysics Data System (ADS)

    Malpetti, Daniele; Roscilde, Tommaso

    2017-02-01

    The mean-field approximation is at the heart of our understanding of complex systems, despite its fundamental limitation of completely neglecting correlations between the elementary constituents. In a recent work [Phys. Rev. Lett. 117, 130401 (2016), 10.1103/PhysRevLett.117.130401], we have shown that in quantum many-body systems at finite temperature, two-point correlations can be formally separated into a thermal part and a quantum part and that quantum correlations are generically found to decay exponentially at finite temperature, with a characteristic, temperature-dependent quantum coherence length. The existence of these two different forms of correlation in quantum many-body systems suggests the possibility of formulating an approximation, which affects quantum correlations only, without preventing the correct description of classical fluctuations at all length scales. Focusing on lattice boson and quantum Ising models, we make use of the path-integral formulation of quantum statistical mechanics to introduce such an approximation, which we dub the quantum mean-field (QMF) approach, and which can be readily generalized to a cluster form (cluster QMF or cQMF). The cQMF approximation reduces to cluster mean-field theory at T = 0, while at any finite temperature it produces a family of systematically improved, semi-classical approximations to the quantum statistical mechanics of the lattice theory at hand. Contrary to standard MF approximations, the correct nature of thermal critical phenomena is captured by any cluster size. In the two exemplary cases of the two-dimensional quantum Ising model and of two-dimensional quantum rotors, we study systematically the convergence of the cQMF approximation towards the exact result, and show that the convergence is typically linear or sublinear in the boundary-to-bulk ratio of the clusters as T → 0, while it becomes faster than linear as T grows. These results pave the way towards the development of semiclassical numerical approaches based on an approximate, yet systematically improved account of quantum correlations.

  20. Entanglement entropy of electromagnetic edge modes.

    PubMed

    Donnelly, William; Wall, Aron C

    2015-03-20

    The vacuum entanglement entropy of Maxwell theory, when evaluated by standard methods, contains an unexpected term with no known statistical interpretation. We resolve this two-decades old puzzle by showing that this term is the entanglement entropy of edge modes: classical solutions determined by the electric field normal to the entangling surface. We explain how the heat kernel regularization applied to this term leads to the negative divergent expression found by Kabat. This calculation also resolves a recent puzzle concerning the logarithmic divergences of gauge fields in 3+1 dimensions.

  1. Some loopholes to save quantum nonlocality

    NASA Astrophysics Data System (ADS)

    Accardi, Luigi

    2005-02-01

    The EPR-chameleon experiment has closed a long-standing debate between the supporters of quantum nonlocality and the thesis of quantum probability according to which the essence of the quantum peculiarity is non-Kolmogorovianity rather than nonlocality. The theory of adaptive systems (symbolized by the chameleon effect) provides a natural intuition for the emergence of non-Kolmogorovian statistics from classical deterministic dynamical systems. These developments are quickly reviewed and in conclusion some comments are introduced on recent attempts to "reconstruct history" along the lines described by Orwell in "1984".

  2. How Settings Change People: Applying Behavior Setting Theory to Consumer-Run Organizations

    ERIC Educational Resources Information Center

    Brown, Louis D.; Shepherd, Matthew D.; Wituk, Scott A.; Meissen, Greg

    2007-01-01

    Self-help initiatives stand as a classic context for organizational studies in community psychology. Behavior setting theory stands as a classic conception of organizations and the environment. This study explores both, applying behavior setting theory to consumer-run organizations (CROs). Analysis of multiple data sets from all CROs in Kansas…

  3. Assessment of Work Climates: The Appropriateness of Classical-Management Theory and Human-Relations Theory under Various Contingencies. Final Report.

    ERIC Educational Resources Information Center

    Langdale, John A.

    The construct of "organizational climate" was explicated and various ways of operationalizing it were reviewed. A survey was made of the literature pertinent to the classical-human relations dimension of environmental quality. As a result, it was hypothesized that the appropriateness of the classical and human-relations master plans is moderated…

  4. Quarks, Symmetries and Strings - a Symposium in Honor of Bunji Sakita's 60th Birthday

    NASA Astrophysics Data System (ADS)

    Kaku, M.; Jevicki, A.; Kikkawa, K.

    1991-04-01

    The Table of Contents for the full book PDF is as follows: * Preface * Evening Banquet Speech * I. Quarks and Phenomenology * From the SU(6) Model to Uniqueness in the Standard Model * A Model for Higgs Mechanism in the Standard Model * Quark Mass Generation in QCD * Neutrino Masses in the Standard Model * Solar Neutrino Puzzle, Horizontal Symmetry of Electroweak Interactions and Fermion Mass Hierarchies * State of Chiral Symmetry Breaking at High Temperatures * Approximate |ΔI| = 1/2 Rule from a Perspective of Light-Cone Frame Physics * Positronium (and Some Other Systems) in a Strong Magnetic Field * Bosonic Technicolor and the Flavor Problem * II. Strings * Supersymmetry in String Theory * Collective Field Theory and Schwinger-Dyson Equations in Matrix Models * Non-Perturbative String Theory * The Structure of Non-Perturbative Quantum Gravity in One and Two Dimensions * Noncritical Virasoro Algebra of d < 1 Matrix Model and Quantized String Field * Chaos in Matrix Models? * On the Non-Commutative Symmetry of Quantum Gravity in Two Dimensions * Matrix Model Formulation of String Field Theory in One Dimension * Geometry of the N = 2 String Theory * Modular Invariance from Gauge Invariance in the Non-Polynomial String Field Theory * Stringy Symmetry and Off-Shell Ward Identities * q-Virasoro Algebra and q-Strings * Self-Tuning Fields and Resonant Correlations in 2d-Gravity * III. Field Theory Methods * Linear Momentum and Angular Momentum in Quaternionic Quantum Mechanics * Some Comments on Real Clifford Algebras * On the Quantum Group p-adics Connection * Gravitational Instantons Revisited * A Generalized BBGKY Hierarchy from the Classical Path-Integral * A Quantum Generated Symmetry: Group-Level Duality in Conformal and Topological Field Theory * Gauge Symmetries in Extended Objects * Hidden BRST Symmetry and Collective Coordinates * Towards Stochastically Quantizing Topological Actions * IV. Statistical Methods * A Brief Summary of the s-Channel Theory of Superconductivity * Neural Networks and Models for the Brain * Relativistic One-Body Equations for Planar Particles with Arbitrary Spin * Chiral Property of Quarks and Hadron Spectrum in Lattice QCD * Scalar Lattice QCD * Semi-Superconductivity of a Charged Anyon Gas * Two-Fermion Theory of Strongly Correlated Electrons and Charge-Spin Separation * Statistical Mechanics and Error-Correcting Codes * Quantum Statistics

  5. Maximum entropy models of ecosystem functioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertram, Jason, E-mail: jason.bertram@anu.edu.au

    2014-12-05

    Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes’ broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense using a savanna plant ecology model as an example.
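
    A minimal sketch of Jaynes' MaxEnt algorithm in its simplest discrete form (our toy example, not the paper's savanna model): maximize entropy subject to a prescribed mean by solving for a single Lagrange multiplier.

        import numpy as np
        from scipy.optimize import brentq

        def maxent_distribution(values, mean_target):
            # p_i proportional to exp(-lam * x_i), with lam chosen so that
            # the constraint <x> = mean_target holds (Jaynes' procedure).
            x = np.asarray(values, dtype=float)

            def mean_gap(lam):
                w = np.exp(-lam * x)
                return np.dot(w, x) / w.sum() - mean_target

            # bracket assumed wide enough for targets strictly inside (min, max)
            lam = brentq(mean_gap, -50.0, 50.0)
            p = np.exp(-lam * x)
            return p / p.sum()

        # classic example: a die whose average roll is 4.5 instead of 3.5
        p = maxent_distribution([1, 2, 3, 4, 5, 6], 4.5)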

  6. Walking through the statistical black boxes of plant breeding.

    PubMed

    Xavier, Alencar; Muir, William M; Craig, Bruce; Rainey, Katy Martin

    2016-10-01

    The main statistical procedures in plant breeding are based on Gaussian processes and can be computed through mixed linear models. Intelligent decision making relies on our ability to extract useful information from data to help us achieve our goals more efficiently. Many plant breeders and geneticists perform statistical analyses without understanding the underlying assumptions of the methods or their strengths and pitfalls. In other words, they treat these statistical methods (software and programs) like black boxes. Black boxes represent complex pieces of machinery with contents that are not fully understood by the user. The user sees the inputs and outputs without knowing how the outputs are generated. By providing a general background on statistical methodologies, this review aims (1) to introduce basic concepts of machine learning and its applications to plant breeding; (2) to link classical selection theory to current statistical approaches; (3) to show how to solve mixed models and extend their application to pedigree-based and genomic-based prediction; and (4) to clarify how the algorithms of genome-wide association studies work, including their assumptions and limitations.
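
    As a sketch of the mixed-linear-model machinery the review describes, here are Henderson's mixed model equations solved with numpy; the variance ratio lam is assumed known, whereas a real analysis would estimate it (e.g., by REML):

        import numpy as np

        def henderson_mme(y, X, Z, lam):
            # Solve Henderson's equations for y = X b + Z u + e, where
            # lam = sigma_e^2 / sigma_u^2 acts as a ridge on the random
            # effects u; returns the BLUE of b and the BLUP of u.
            lhs = np.block([[X.T @ X,  X.T @ Z],
                            [Z.T @ X,  Z.T @ Z + lam * np.eye(Z.shape[1])]])
            rhs = np.concatenate([X.T @ y, Z.T @ y])
            sol = np.linalg.solve(lhs, rhs)
            nb = X.shape[1]
            return sol[:nb], sol[nb:]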

  7. The evolving Planck mass in classically scale-invariant theories

    NASA Astrophysics Data System (ADS)

    Kannike, K.; Raidal, M.; Spethmann, C.; Veermäe, H.

    2017-04-01

    We consider classically scale-invariant theories with non-minimally coupled scalar fields, where the Planck mass and the hierarchy of physical scales are dynamically generated. The classical theories possess a fixed point, where scale invariance is spontaneously broken. In these theories, however, the Planck mass becomes unstable in the presence of explicit sources of scale invariance breaking, such as non-relativistic matter and cosmological constant terms. We quantify the constraints on such classical models from Big Bang Nucleosynthesis that lead to an upper bound on the non-minimal coupling and require trans-Planckian field values. We show that quantum corrections to the scalar potential can stabilise the fixed point close to the minimum of the Coleman-Weinberg potential. The time-averaged motion of the evolving fixed point is strongly suppressed, thus the limits on the evolving gravitational constant from Big Bang Nucleosynthesis and other measurements do not presently constrain this class of theories. Field oscillations around the fixed point, if not damped, contribute to the dark matter density of the Universe.

  8. Representing the thermal state in time-dependent density functional theory

    DOE PAGES

    Modine, N. A.; Hatcher, R. M.

    2015-05-28

    Classical molecular dynamics (MD) provides a powerful and widely used approach to determining thermodynamic properties by integrating the classical equations of motion of a system of atoms. Time-Dependent Density Functional Theory (TDDFT) provides a powerful and increasingly useful approach to integrating the quantum equations of motion for a system of electrons. TDDFT efficiently captures the unitary evolution of a many-electron state by mapping the system into a fictitious non-interacting system. In analogy to MD, one could imagine obtaining the thermodynamic properties of an electronic system from a TDDFT simulation in which the electrons are excited from their ground state by a time-dependent potential and then allowed to evolve freely in time while statistical data are captured from periodic snapshots of the system. For a variety of systems (e.g., many metals), the electrons reach an effective state of internal equilibrium due to electron-electron interactions on a time scale that is short compared to electron-phonon equilibration. During the initial time-evolution of such systems following electronic excitation, electron-phonon interactions should be negligible, and therefore, TDDFT should successfully capture the internal thermalization of the electrons. However, it is unclear how TDDFT represents the resulting thermal state. In particular, the thermal state is usually represented in quantum statistical mechanics as a mixed state, while the occupations of the TDDFT wave functions are fixed by the initial state in TDDFT. Two key questions involve (1) reformulating quantum statistical mechanics so that thermodynamic expectations can be obtained as an unweighted average over a set of many-body pure states and (2) constructing a family of non-interacting (single determinant) TDDFT states that approximate the required many-body states for the canonical ensemble. In Section II, we will address these questions by first demonstrating that thermodynamic expectations can be evaluated by averaging over certain many-body pure states, which we will call thermal states, and then constructing TDDFT states that approximate these thermal states. In Section III, we will present some numerical tests of the resulting theory, and in Section IV, we will summarize our main results and discuss some possible future directions for this work.

  9. Statistical learning theory for high dimensional prediction: Application to criterion-keyed scale development.

    PubMed

    Chapman, Benjamin P; Weiss, Alexander; Duberstein, Paul R

    2016-12-01

    Statistical learning theory (SLT) is the statistical formulation of machine learning theory, a body of analytic methods common in "big data" problems. Regression-based SLT algorithms seek to maximize predictive accuracy for some outcome, given a large pool of potential predictors, without overfitting the sample. Research goals in psychology may sometimes call for high dimensional regression. One example is criterion-keyed scale construction, where a scale with maximal predictive validity must be built from a large item pool. Using this as a working example, we first introduce a core principle of SLT methods: minimization of expected prediction error (EPE). Minimizing EPE is fundamentally different from maximizing the within-sample likelihood, and hinges on building a predictive model of sufficient complexity to predict the outcome well, without undue complexity leading to overfitting. We describe how such models are built and refined via cross-validation. We then illustrate how three common SLT algorithms (supervised principal components, regularization, and boosting) can be used to construct a criterion-keyed scale predicting all-cause mortality, using a large personality item pool within a population cohort. Each algorithm illustrates a different approach to minimizing EPE. Finally, we consider broader applications of SLT predictive algorithms, both as supportive analytic tools for conventional methods, and as primary analytic tools in discovery phase research. We conclude that despite their differences from the classic null-hypothesis testing approach (or perhaps because of them), SLT methods may hold value as a statistically rigorous approach to exploratory regression. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
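
    A hedged sketch of one of the named approaches (regularization), using scikit-learn's cross-validated ridge regression; the item pool, outcome, and sizes are synthetic stand-ins, not the study's data:

        import numpy as np
        from sklearn.linear_model import RidgeCV

        rng = np.random.default_rng(0)
        item_pool = rng.normal(size=(500, 200))   # hypothetical 200-item pool
        outcome = item_pool[:, :5].sum(axis=1) + rng.normal(size=500)

        # RidgeCV picks the penalty by cross-validation, i.e. by minimizing
        # estimated expected prediction error rather than within-sample fit.
        model = RidgeCV(alphas=np.logspace(-2, 4, 25)).fit(item_pool, outcome)
        scale_weights = model.coef_   # per-item scoring weights for the keyed scale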

  10. Theory of Neutron Chain Reactions: Extracts from Volume I, Diffusion and Slowing Down of Neutrons: Chapter I. Elementary Theory of Neutron Diffusion. Chapter II. Second Order Diffusion Theory. Chapter III. Slowing Down of Neutrons

    DOE R&D Accomplishments Database

    Weinberg, Alvin M.; Noderer, L. C.

    1951-05-15

    The large scale release of nuclear energy in a uranium fission chain reaction involves two essentially distinct physical phenomena. On the one hand there are the individual nuclear processes such as fission, neutron capture, and neutron scattering. These are essentially quantum mechanical in character, and their theory is non-classical. On the other hand, there is the process of diffusion -- in particular, diffusion of neutrons, which is of fundamental importance in a nuclear chain reaction. This process is classical; insofar as the theory of the nuclear chain reaction depends on the theory of neutron diffusion, the mathematical study of chain reactions is an application of classical, not quantum mechanical, techniques.

  11. ERRATUM: Papers published in incorrect sections

    NASA Astrophysics Data System (ADS)

    2004-04-01

    A number of J. Phys. A: Math. Gen. articles have mistakenly been placed in the wrong subject section in recent issues of the journal. We would like to apologize to the authors of these articles for publishing their papers in the Fluid and Plasma Theory section. The correct section for each article is given below. Statistical Physics Issue 4: Microcanonical entropy for small magnetizations Behringer H 2004 J. Phys. A: Math. Gen. 37 1443 Mathematical Physics Issue 9: On the solution of fractional evolution equations Kilbas A A, Pierantozzi T, Trujillo J J and Vázquez L 2004 J. Phys. A: Math. Gen. 37 3271 Quantum Mechanics and Quantum Information Theory Issue 6: New exactly solvable isospectral partners for PT-symmetric potentials Sinha A and Roy P 2004 J. Phys. A: Math. Gen. 37 2509 Issue 9: Symplectically entangled states and their applications to coding Vourdas A 2004 J. Phys. A: Math. Gen. 37 3305 Classical and Quantum Field Theory Issue 6: Pairing of parafermions of order 2: seniority model Nelson C A 2004 J. Phys. A: Math. Gen. 37 2497 Issue 7: Jordan-Schwinger map, 3D harmonic oscillator constants of motion, and classical and quantum parameters characterizing electromagnetic wave polarization Mota R D, Xicoténcatl M A and Granados V D 2004 J. Phys. A: Math. Gen. 37 2835 Issue 9: Could only fermions be elementary? Lev F M 2004 J. Phys. A: Math. Gen. 37 3285

  12. Classical conformality in the Standard Model from Coleman’s theory

    NASA Astrophysics Data System (ADS)

    Kawana, Kiyoharu

    2016-09-01

    Classical conformality (CC) is one of the possible candidates for explaining the gauge hierarchy of the Standard Model (SM). We show that it is naturally obtained from Coleman's theory of the baby universe.

  13. Maximal incompatibility of locally classical behavior and global causal order in multiparty scenarios

    NASA Astrophysics Data System (ADS)

    Baumeler, Ämin; Feix, Adrien; Wolf, Stefan

    2014-10-01

    Quantum theory in a global spacetime gives rise to nonlocal correlations, which cannot be explained causally in a satisfactory way; this motivates the study of theories with reduced global assumptions. Oreshkov, Costa, and Brukner [Nat. Commun. 3, 1092 (2012), 10.1038/ncomms2076] proposed a framework in which quantum theory is valid locally but where, at the same time, no global spacetime, i.e., predefined causal order, is assumed beyond the absence of logical paradoxes. It was shown for the two-party case, however, that a global causal order always emerges in the classical limit. Quite naturally, it has been conjectured that the same also holds in the multiparty setting. We show that, counter to this belief, classical correlations locally compatible with classical probability theory exist that allow for deterministic signaling between three or more parties incompatible with any predefined causal order.

  14. A post-classical theory of enamel biomineralization… and why we need one.

    PubMed

    Simmer, James P; Richardson, Amelia S; Hu, Yuan-Yuan; Smith, Charles E; Ching-Chun Hu, Jan

    2012-09-01

    Enamel crystals are unique in shape, orientation and organization. They are hundreds of thousands of times longer than they are wide, run parallel to each other, are oriented with respect to the ameloblast membrane at the mineralization front and are organized into rod or interrod enamel. The classical theory of amelogenesis postulates that extracellular matrix proteins shape crystallites by specifically inhibiting ion deposition on the crystal sides, orient them by binding multiple crystallites and establish higher levels of crystal organization. Elements of the classical theory are supported in principle by in vitro studies; however, the classical theory does not explain how enamel forms in vivo. In this review, we describe how amelogenesis is highly integrated with ameloblast cell activities and how the shape, orientation and organization of enamel mineral ribbons are established by a mineralization front apparatus along the secretory surface of the ameloblast cell membrane.

  15. Quantum-like model of unconscious–conscious dynamics

    PubMed Central

    Khrennikov, Andrei

    2015-01-01

    We present a quantum-like model of sensation–perception dynamics (originating in Helmholtz's theory of unconscious inference) based on the theory of quantum apparatuses and instruments. We illustrate our approach with a model of bistable perception of a particular ambiguous figure, the Schröder stair. This is a concrete model for unconscious and conscious processing of information and their interaction. The starting point of our quantum-like journey was the observation that perception dynamics is essentially contextual, which implies the impossibility of (straightforward) embedding of experimental statistical data in the classical (Kolmogorov, 1933) framework of probability theory. This motivates the application of nonclassical probabilistic schemes. And the quantum formalism provides a variety of well-approved and mathematically elegant probabilistic schemes for handling the results of measurements. The theory of quantum apparatuses and instruments is the most general quantum scheme describing measurements, and it is natural to explore it to model the sensation–perception dynamics. In particular, this theory provides the scheme of indirect quantum measurements, which we apply to model unconscious inference leading to the transition from sensations to perceptions. PMID:26283979

  16. On Ruch's Principle of Decreasing Mixing Distance in classical statistical physics

    NASA Astrophysics Data System (ADS)

    Busch, Paul; Quadt, Ralf

    1990-10-01

    Ruch's Principle of Decreasing Mixing Distance is reviewed as a statistical physical principle, and its basic support and geometric interpretation, the Ruch-Schranner-Seligman theorem, is generalized to be applicable to a large representative class of classical statistical systems.

  17. Statistical mechanics explanation for the structure of ocean eddies and currents

    NASA Astrophysics Data System (ADS)

    Venaille, A.; Bouchet, F.

    2010-12-01

    The equilibrium statistical mechanics of two-dimensional and geostrophic flows predicts the outcome for the large scales of the flow resulting from turbulent mixing. This theory has been successfully applied to describe detailed properties of Jupiter's Great Red Spot. We discuss the range of applicability of this theory to ocean dynamics. It is able to reproduce mesoscale structures like ocean rings. It explains, from statistical mechanics, the westward drift of rings at the speed of non-dispersive baroclinic waves, and the recently observed (Chelton et al.) slower northward drift of cyclonic eddies and southward drift of anticyclonic eddies. We also uncover relations between strong eastward mid-basin inertial jets, like the Kuroshio extension and the Gulf Stream, and statistical equilibria, and we explain under which conditions such strong mid-basin jets can be understood as statistical equilibria. We claim that these results are complementary to the classical Sverdrup-Munk theory: they explain the inertial part of the basin dynamics, and the structure and location of the jets, using very simple theoretical arguments. References: A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO; F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, arXiv ...., submitted to Physics Reports; P. Berloff, A. M. Hogg and W. Dewar, The Turbulent Oscillator: A Mechanism of Low-Frequency Variability of the Wind-Driven Ocean Gyres, Journal of Physical Oceanography 37 (2007) 2363; D. B. Chelton, M. G. Schlax, R. M. Samelson and R. A. de Szoeke, Global observations of large oceanic eddies, Geophys. Res. Lett. 34 (2007) 15606.
    [Figure captions: (a) streamfunction predicted by statistical mechanics; (b), (c) snapshots of streamfunction and potential vorticity (red: positive values; blue: negative values) in the upper layer of a three-layer quasi-geostrophic model of a mid-latitude ocean basin (from Berloff et al.); even in this out-of-equilibrium situation, equilibrium statistical mechanics predicts the overall qualitative flow structure remarkably well. A further figure illustrates the observed westward drift of ocean eddies and the slower northward drift of cyclones and southward drift of anticyclones (Chelton et al.), which we explain from statistical mechanics.]

  18. A scrutiny of the premise of the Rice-Ramsperger-Kassel-Marcus theory in isomerization reaction of an Ar7-type molecule

    NASA Astrophysics Data System (ADS)

    Takatsuka, Kazuo; Seko, Chihiro

    1996-12-01

    The validity of the physical premise of the Rice-Ramsperger-Kassel-Marcus (RRKM) theory is investigated in terms of the classical dynamics of the isomerization reaction in Ar7-like molecules (clusters). The passage times of classical trajectories through the potential basins of isomers in the structural transitions are examined. In the high energy region corresponding to the so-called liquidlike phase, remarkable uniformity of the average passage times has been found. That is, the average passage time is characterized only by the basin through which a trajectory is currently passing and, hence, does not depend on the next visited basins. This behavior is out of accord with the ordinary chemical law in that the "reaction rates" do not seem to depend on the height of the individual potential barriers. We ascribe this seemingly strange uniformity to the strong mixing (chaos) lying behind the rate process. That is, as soon as a classical path enters a basin, it becomes involved in a chaotic zone in which many paths having different channels are entangled with one another, and effectively (in the statistical sense) loses its memory of which basin it came from and which basin it will visit next. This model is verified by confirming that the lifetime distributions for transitions from one basin to the others are exponential, with exponents that are very similar to each other within each passing-through basin. The inverse of the exponent is essentially proportional to the average passage time, and consequently brings about the uniformity. These distributions set a foundation for the multichannel generalization of the RRKM theory. Two cases of non-RRKM behavior have been studied. One is a nonstatistical behavior in the low energy region such as the so-called coexistence phase. The other is the short-time behavior. It is well established [M. Berblinger and C. Schlier, J. Chem. Phys. 101, 4750 (1994)] that in a relatively simple and small system such as H3+, the so-called direct paths, which lead to dissociation before the phase-space mixing is completed, increase the probability of short-time passage. In contrast, we have found in our Ar7-like molecules that trajectories of short passage time are fewer than expected by the statistical theory. It is conceived that a somewhat long time in the initial stage of the isomerization is spent by a trajectory in finding its way out to the next basins.
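
    A minimal numerical sketch of the exponential-lifetime check described above (the variable names and diagnostic are ours, not the paper's): fit the rate by maximum likelihood and compare the empirical survival curve with the fitted exponential.

        import numpy as np

        def passage_time_check(passage_times):
            # Exponential MLE: rate = 1/<t>. Under strong mixing, the fitted
            # rate should depend only on the current basin, not the exit channel.
            t = np.asarray(passage_times, dtype=float)
            rate = 1.0 / t.mean()
            # diagnostic: max deviation between empirical and model survival
            ts = np.sort(t)
            survival = 1.0 - np.arange(1, len(ts) + 1) / len(ts)
            model = np.exp(-rate * ts)
            return rate, np.max(np.abs(survival - model))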

  19. Pathways to dewetting in hydrophobic confinement.

    PubMed

    Remsing, Richard C; Xi, Erte; Vembanur, Srivathsan; Sharma, Sumit; Debenedetti, Pablo G; Garde, Shekhar; Patel, Amish J

    2015-07-07

    Liquid water can become metastable with respect to its vapor in hydrophobic confinement. The resulting dewetting transitions are often impeded by large kinetic barriers. According to macroscopic theory, such barriers arise from the free energy required to nucleate a critical vapor tube that spans the region between two hydrophobic surfaces--tubes with smaller radii collapse, whereas larger ones grow to dry the entire confined region. Using extensive molecular simulations of water between two nanoscopic hydrophobic surfaces, in conjunction with advanced sampling techniques, here we show that for intersurface separations that thermodynamically favor dewetting, the barrier to dewetting does not correspond to the formation of a (classical) critical vapor tube. Instead, it corresponds to an abrupt transition from an isolated cavity adjacent to one of the confining surfaces to a gap-spanning vapor tube that is already larger than the critical vapor tube anticipated by macroscopic theory. Correspondingly, the barrier to dewetting is also smaller than the classical expectation. We show that the peculiar nature of water density fluctuations adjacent to extended hydrophobic surfaces--namely, the enhanced likelihood of observing low-density fluctuations relative to Gaussian statistics--facilitates this nonclassical behavior. By stabilizing isolated cavities relative to vapor tubes, enhanced water density fluctuations thus stabilize novel pathways, which circumvent the classical barriers and offer diminished resistance to dewetting. Our results thus suggest a key role for fluctuations in speeding up the kinetics of numerous phenomena ranging from Cassie-Wenzel transitions on superhydrophobic surfaces, to hydrophobically driven biomolecular folding and assembly.
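
    The macroscopic estimate that these simulations overturn is easy to reproduce. Assuming the standard form for the surface free energy of a cylindrical vapor tube of radius r spanning a gap of width D between hydrophobic plates near coexistence, ΔΩ(r) = 2πrDγ + 2πr²γcosθ with cosθ < 0, the classical critical tube and barrier follow directly (a hedged sketch; the parameter values are illustrative assumptions, not taken from the paper):

        # Classical (macroscopic) vapor-tube free energy between two hydrophobic plates.
        import numpy as np

        gamma = 0.072              # liquid-vapor surface tension, N/m (water, approximate)
        theta = np.deg2rad(120.0)  # hypothetical contact angle of the confining surfaces
        D = 1.0e-9                 # plate separation, m (illustrative)

        def tube_free_energy(r):
            # side wall of the tube + replacement of wetted plate area by vapor
            return 2*np.pi*r*D*gamma + 2*np.pi*r**2*gamma*np.cos(theta)

        r_star = -D / (2*np.cos(theta))      # stationary point: classical critical radius
        barrier = tube_free_energy(r_star)   # classical barrier to dewetting, J
        print(r_star, barrier)               # tubes with r > r_star grow and dry the gap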

  20. Microscopic molecular dynamics characterization of the second-order non-Navier-Fourier constitutive laws in the Poiseuille gas flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rana, A.; Ravichandran, R.; Park, J. H.

    The second-order non-Navier-Fourier constitutive laws, expressed in a compact algebraic mathematical form, were validated for the force-driven Poiseuille gas flow by deterministic atomic-level microscopic molecular dynamics (MD). Emphasis is placed on how completely different methods (a second-order continuum macroscopic theory based on the kinetic Boltzmann equation, the probabilistic mesoscopic direct simulation Monte Carlo, and, in particular, the deterministic microscopic MD) describe the non-classical physics, and on whether the second-order non-Navier-Fourier constitutive laws derived from the continuum theory can be validated using MD solutions for the viscous stress and heat flux calculated directly from the molecular data using the statistical method. Peculiar behaviors (non-uniform tangential pressure profile and exotic instantaneous heat conduction from cold to hot [R. S. Myong, "A full analytical solution for the force-driven compressible Poiseuille gas flow based on a nonlinear coupled constitutive relation," Phys. Fluids 23(1), 012002 (2011)]) were re-examined using atomic-level MD results. It was shown that all three results were in strong qualitative agreement with each other, implying that the second-order non-Navier-Fourier laws are indeed physically legitimate in the transition regime. Furthermore, it was shown that the non-Navier-Fourier constitutive laws are essential for describing non-zero normal stress and tangential heat flux, while the classical and non-classical laws remain similar for shear stress and normal heat flux.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pratt, Lawrence R.; Chaudhari, Mangesh I.; Rempe, Susan B.

    Here this review focuses on the striking recent progress in solving for hydrophobic interactions between small inert molecules. We discuss several new understandings. First, the inverse temperature phenomenology of hydrophobic interactions, i.e., strengthening of hydrophobic bonds with increasing temperature, is decisively exhibited by hydrophobic interactions between atomic-scale hard sphere solutes in water. Second, inclusion of attractive interactions associated with atomic-size hydrophobic reference cases leads to substantial, nontrivial corrections to reference results for purely repulsive solutes. Hydrophobic bonds are weakened by adding solute dispersion forces to treatment of reference cases. The classic statistical mechanical theory for those corrections is not accurate in this application, but molecular quasi-chemical theory shows promise. Lastly, because of the masking roles of excluded volume and attractive interactions, comparisons that do not discriminate the different possibilities face an interpretive danger.

  2. Higher Spin Fields in Three-Dimensional Gravity

    NASA Astrophysics Data System (ADS)

    Lepage-Jutier, Arnaud

    In this thesis, we study the effects of massless higher spin fields in three-dimensional gravity with a negative cosmological constant. First, we introduce gravity in Anti-de Sitter (AdS) space without the higher spin gauge symmetry. We recapitulate the semi-classical analysis that outlines the duality between quantum gravity in three dimensions with a negative cosmological constant and a conformal field theory on the asymptotic boundary of AdS3. We review the statistical interpretation of the black hole entropy via the AdS/CFT correspondence and the modular invariance of the partition function of a CFT on a torus. For the case of higher spin theories in AdS3, we use those modular properties to bound the amount of gauge symmetry present. We then discuss briefly cases that can evade this bound.

  3. Hidden Markov models incorporating fuzzy measures and integrals for protein sequence identification and alignment.

    PubMed

    Bidargaddi, Niranjan P; Chetty, Madhu; Kamruzzaman, Joarder

    2008-06-01

    Profile hidden Markov models (HMMs) based on classical HMMs have been widely applied for protein sequence identification. The formulation of the forward and backward variables in profile HMMs is made under the statistical independence assumption of probability theory. We propose a fuzzy profile HMM to overcome the limitations of that assumption and to achieve an improved alignment for protein sequences belonging to a given family. The proposed model fuzzifies the forward and backward variables by incorporating Sugeno fuzzy measures and Choquet integrals, thus further extending the generalized HMM. Based on the fuzzified forward and backward variables, we propose a fuzzy Baum-Welch parameter estimation algorithm for profiles. The strong correlations and the sequence preference involved in protein structures make this fuzzy-architecture-based model a suitable candidate for building profiles of a given family, since fuzzy sets can handle uncertainties better than classical methods.
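
    The key nonclassical ingredient is the discrete Choquet integral taken with respect to a Sugeno λ-measure, which replaces the weighted sums in the forward/backward recursions. A self-contained sketch of that ingredient (illustrative only; the paper's actual fuzzification of profile-HMM variables is more involved):

        # Discrete Choquet integral w.r.t. a Sugeno lambda-measure.
        import numpy as np
        from scipy.optimize import brentq

        def sugeno_lambda(densities):
            """Root lam > -1 of prod(1 + lam*g_i) = 1 + lam; lam = 0 iff densities sum to 1."""
            g = np.asarray(densities, float)
            if np.isclose(g.sum(), 1.0):
                return 0.0  # measure reduces to an ordinary probability measure
            f = lambda lam: np.prod(1.0 + lam*g) - (1.0 + lam)
            return brentq(f, -1 + 1e-12, -1e-12) if g.sum() > 1 else brentq(f, 1e-12, 1e6)

        def choquet(values, densities):
            """Choquet integral of `values` w.r.t. the Sugeno measure built from `densities`."""
            values = np.asarray(values, float)
            lam = sugeno_lambda(densities)
            order = np.argsort(values)[::-1]            # indices, largest value first
            sorted_vals = np.append(values[order], 0.0)
            g_cum, integral = 0.0, 0.0
            for k, idx in enumerate(order):
                g_cum += densities[idx] + lam*g_cum*densities[idx]  # grow the measure
                integral += (sorted_vals[k] - sorted_vals[k+1]) * g_cum
            return integral

        # With densities summing to 1 (lam = 0) the Choquet integral reduces to the
        # ordinary expectation of the classical forward recursion: here 0.37.
        print(choquet([0.2, 0.7, 0.1], [0.3, 0.4, 0.3]))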

  4. Experimentally modeling stochastic processes with less memory by the use of a quantum processor

    PubMed Central

    Palsson, Matthew S.; Gu, Mile; Ho, Joseph; Wiseman, Howard M.; Pryde, Geoff J.

    2017-01-01

    Computer simulation of observable phenomena is an indispensable tool for engineering new technology, understanding the natural world, and studying human society. However, the most interesting systems are often so complex that simulating their future behavior demands storing immense amounts of information regarding how they have behaved in the past. For increasingly complex systems, simulation becomes increasingly difficult and is ultimately constrained by resources such as computer memory. Recent theoretical work shows that quantum theory can reduce this memory requirement beyond ultimate classical limits, as measured by a process’ statistical complexity, C. We experimentally demonstrate this quantum advantage in simulating stochastic processes. Our quantum implementation observes a memory requirement of Cq = 0.05 ± 0.01, far below the ultimate classical limit of C = 1. Scaling up this technique would substantially reduce the memory required in simulations of more complex systems. PMID:28168218
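
    For reference, the classical benchmark quoted above is simple to state: the statistical complexity C is the Shannon entropy of the stationary distribution over the causal states of the process' minimal classical model (its ε-machine). A hedged sketch (the two-state example is an illustrative assumption; the paper's specific process is not reproduced here):

        # Statistical complexity C = Shannon entropy over causal-state occupation.
        import numpy as np

        def statistical_complexity(stationary_probs):
            p = np.asarray(stationary_probs, float)
            p = p[p > 0]
            return -np.sum(p * np.log2(p))   # in bits

        # A two-state process whose causal states are visited equally often has
        # C = 1 bit -- the classical limit that the quantum memory Cq ~ 0.05 beats.
        print(statistical_complexity([0.5, 0.5]))   # -> 1.0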

  5. Leading-order classical Lagrangians for the nonminimal standard-model extension

    NASA Astrophysics Data System (ADS)

    Reis, J. A. A. S.; Schreck, M.

    2018-03-01

    In this paper, we derive the general leading-order classical Lagrangian covering all fermion operators of the nonminimal standard-model extension (SME). Such a Lagrangian is considered to be the point-particle analog of the effective field theory description of Lorentz violation that is provided by the SME. At leading order in Lorentz violation, the Lagrangian obtained satisfies the set of five nonlinear equations that govern the map from the field theory to the classical description. This result can be of use for phenomenological studies of classical bodies in gravitational fields.

  6. Understanding quantum measurement from the solution of dynamical models

    NASA Astrophysics Data System (ADS)

    Allahverdyan, Armen E.; Balian, Roger; Nieuwenhuizen, Theo M.

    2013-04-01

    The quantum measurement problem, to wit, understanding why a unique outcome is obtained in each individual experiment, is currently tackled by solving models. After an introduction we review the many dynamical models proposed over the years for elucidating quantum measurements. The approaches range from standard quantum theory, relying for instance on quantum statistical mechanics or on decoherence, to quantum-classical methods, to consistent histories and to modifications of the theory. Next, a flexible and rather realistic quantum model is introduced, describing the measurement of the z-component of a spin through interaction with a magnetic memory simulated by a Curie-Weiss magnet, including N≫1 spins weakly coupled to a phonon bath. Initially prepared in a metastable paramagnetic state, it may transit to its up or down ferromagnetic state, triggered by its coupling with the tested spin, so that its magnetization acts as a pointer. A detailed solution of the dynamical equations is worked out, exhibiting several time scales. Conditions on the parameters of the model are found, which ensure that the process satisfies all the features of ideal measurements. Various imperfections of the measurement are discussed, as well as attempts at incompatible measurements. The first steps consist in the solution of the Hamiltonian dynamics for the spin-apparatus density matrix D̂(t). Its off-diagonal blocks in a basis selected by the spin-pointer coupling rapidly decay owing to the many degrees of freedom of the pointer. Recurrences are ruled out either by some randomness of that coupling, or by the interaction with the bath. On a longer time scale, the trend towards equilibrium of the magnet produces a final state D̂(t) that involves correlations between the system and the indications of the pointer, thus ensuring registration. Although D̂(t) has the form expected for ideal measurements, it only describes a large set of runs. Individual runs are approached by analyzing the final states associated with all possible subensembles of runs, within a specified version of the statistical interpretation. There the difficulty lies in a quantum ambiguity: there exist many incompatible decompositions of the density matrix D̂(t) into a sum of sub-matrices, so that one cannot infer from its sole determination the states that would describe small subsets of runs. This difficulty is overcome by dynamics due to suitable interactions within the apparatus, which produce a special combination of relaxation and decoherence associated with the broken invariance of the pointer. Any subset of runs thus reaches, after a brief delay, a stable state which satisfies the same hierarchic property as in classical probability theory; the reduction of the state for each individual run follows. Standard quantum statistical mechanics alone appears sufficient to explain the occurrence of a unique answer in each run and the emergence of classicality in a measurement process. Finally, pedagogical exercises are proposed and lessons for future works on models are suggested, while the statistical interpretation is promoted for teaching.

  7. Quasi-Static Analysis of Round LaRC THUNDER Actuators

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.

    2007-01-01

    An analytic approach is developed to predict the shape and displacement with voltage in the quasi-static limit of round LaRC THUNDER actuators. The problem is treated with classical lamination theory and von Kármán non-linear analysis. In the case of classical lamination theory, exact analytic solutions are found. It is shown that classical lamination theory is insufficient to describe the physical situation for large actuators but is sufficient for very small actuators. Numerical results are presented for the non-linear analysis and compared with experimental measurements. Snap-through behavior, bifurcation, and stability are presented and discussed.

  8. Quasi-Static Analysis of LaRC THUNDER Actuators

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.

    2007-01-01

    An analytic approach is developed to predict the shape and displacement with voltage in the quasi-static limit of LaRC THUNDER actuators. The problem is treated with classical lamination theory and von Kármán non-linear analysis. In the case of classical lamination theory, exact analytic solutions are found. It is shown that classical lamination theory is insufficient to describe the physical situation for large actuators but is sufficient for very small actuators. Numerical results are presented for the non-linear analysis and compared with experimental measurements. Snap-through behavior, bifurcation, and stability are presented and discussed.

  9. Navigating the grounded theory terrain. Part 2.

    PubMed

    Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John

    2011-01-01

    In this paper, the choice of classic grounded theory is discussed and justified in the context of the first author's PhD research. The methodological discussion takes place within the context of PhD research entitled: Development of a stakeholder-led framework for a structured education programme that will prepare nurses and healthcare assistants to deliver a psychosocial intervention for people with dementia. There is a lack of research and limited understanding of the effect of psychosocial interventions on people with dementia. The first author considered classic grounded theory a suitable research methodology to investigate this area, as it is held to be ideal for areas of research where there is little understanding of the social processes at work. The literature relating to the practical application of classic grounded theory is illustrated using examples relating to four key grounded theory components: theory development using constant comparison and memoing; methodological rigour; emergence of a core category; and inclusion of self and engagement with participants. Following discussion of the choice and application of classic grounded theory, this paper explores the need for researchers to visit and understand the various grounded theory options. This paper argues that researchers new to grounded theory must be familiar with and understand the various options. The researchers will then be able to apply the methodologies they choose consistently and critically. Doing so will allow them to develop theory rigorously, and they will ultimately be able to better defend their final methodological destinations.

  10. Mathematical model of the SH-3G helicopter

    NASA Technical Reports Server (NTRS)

    Phillips, J. D.

    1982-01-01

    A mathematical model of the Sikorsky SH-3G helicopter based on classical nonlinear, quasi-steady rotor theory was developed. The model was validated statically and dynamically by comparison with Navy flight-test data. The model incorporates ad hoc revisions which address the ideal assumptions of classical rotor theory and improve the static trim characteristics to provide a more realistic simulation, while retaining the simplicity of the classical model.

  11. Geometric Algebra for Physicists

    NASA Astrophysics Data System (ADS)

    Doran, Chris; Lasenby, Anthony

    2007-11-01

    Preface; Notation; 1. Introduction; 2. Geometric algebra in two and three dimensions; 3. Classical mechanics; 4. Foundations of geometric algebra; 5. Relativity and spacetime; 6. Geometric calculus; 7. Classical electrodynamics; 8. Quantum theory and spinors; 9. Multiparticle states and quantum entanglement; 10. Geometry; 11. Further topics in calculus and group theory; 12. Lagrangian and Hamiltonian techniques; 13. Symmetry and gauge theory; 14. Gravitation; Bibliography; Index.

  12. Noninvasive fetal QRS detection using an echo state network and dynamic programming.

    PubMed

    Lukoševičius, Mantas; Marozas, Vaidotas

    2014-08-01

    We address a classical fetal QRS detection problem from abdominal ECG recordings with a data-driven statistical machine learning approach. Our goal is to have a powerful, yet conceptually clean, solution. There are two novel key components at the heart of our approach: an echo state recurrent neural network that is trained to indicate fetal QRS complexes, and several increasingly sophisticated versions of statistics-based dynamic programming algorithms, which are derived from and rooted in probability theory. We also employ a standard technique for preprocessing and removing maternal ECG complexes from the signals, but do not take this as the main focus of this work. The proposed approach is quite generic and can be extended to other types of signals and annotations. Open-source code is provided.
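
    The echo state idea itself is compact: a fixed random recurrent reservoir is driven by the input, and only a linear readout is trained. A minimal sketch under assumed dimensions (not the authors' implementation; the ECG input and QRS-indicator target here are stand-ins):

        # Minimal echo state network: fixed random reservoir + ridge-regression readout.
        import numpy as np

        rng = np.random.default_rng(1)
        n_in, n_res = 1, 100
        W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.normal(0, 1, (n_res, n_res))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep spectral radius below 1

        def run_reservoir(u):
            """Drive the reservoir with an input sequence u of shape (T, n_in)."""
            x, states = np.zeros(n_res), np.empty((len(u), n_res))
            for t, ut in enumerate(u):
                x = np.tanh(W @ x + W_in @ ut)
                states[t] = x
            return states

        def fit_readout(states, y, reg=1e-4):
            """Ridge regression of target y on reservoir states."""
            A = states.T @ states + reg*np.eye(n_res)
            return np.linalg.solve(A, states.T @ y)

        u = rng.normal(0, 1, (500, n_in))        # stand-in for a preprocessed ECG channel
        y = (u[:, 0] > 1.5).astype(float)        # toy event-indicator target
        w_out = fit_readout(run_reservoir(u), y)
        print(w_out.shape)                       # trained linear readout weights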

  13. An Investigation of the Impact of Guessing on Coefficient α and Reliability

    PubMed Central

    2014-01-01

    Guessing is known to influence the test reliability of multiple-choice tests. Although there are many studies that have examined the impact of guessing, they used rather restrictive assumptions (e.g., parallel test assumptions, homogeneous inter-item correlations, homogeneous item difficulty, and homogeneous guessing levels across items) to evaluate the relation between guessing and test reliability. Based on the item response theory (IRT) framework, this study investigated the extent of the impact of guessing on reliability under more realistic conditions, where item difficulty, item discrimination, and guessing levels vary across items, with three different test lengths (TL). By accommodating multiple item characteristics simultaneously, this study also examined interaction effects between guessing and the other variables entered in the simulation, making the design more realistic. The simulation of these more realistic conditions and the calculation of reliability and classical test theory (CTT) item statistics were facilitated by expressing CTT item statistics, coefficient α, and reliability in terms of IRT model parameters. In addition to the general negative impact of guessing on reliability, results showed interaction effects between TL and guessing and between guessing and test difficulty.
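
    The simulation logic is straightforward to sketch: generate item responses from a three-parameter logistic (3PL) model in which difficulty, discrimination, and guessing all vary across items, then compute coefficient α from the simulated responses (a hedged sketch with assumed parameter ranges, not the study's actual design):

        # 3PL simulation: P(correct) = c + (1 - c) / (1 + exp(-1.7 a (theta - b))).
        import numpy as np

        rng = np.random.default_rng(42)
        n_persons, n_items = 2000, 40
        theta = rng.normal(0, 1, n_persons)    # latent abilities
        a = rng.uniform(0.5, 2.0, n_items)     # discriminations (vary across items)
        b = rng.normal(0, 1, n_items)          # difficulties
        c = rng.uniform(0.0, 0.3, n_items)     # guessing lower asymptotes

        p = c + (1 - c) / (1 + np.exp(-1.7 * a * (theta[:, None] - b)))
        X = (rng.random((n_persons, n_items)) < p).astype(float)  # 0/1 responses

        def coefficient_alpha(X):
            k = X.shape[1]
            return k/(k - 1) * (1 - X.var(axis=0, ddof=1).sum() / X.sum(axis=1).var(ddof=1))

        print(coefficient_alpha(X))   # alpha degrades as the guessing levels c grow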

  14. Behavior of the maximum likelihood in quantum state tomography

    NASA Astrophysics Data System (ADS)

    Scholten, Travis L.; Blume-Kohout, Robin

    2018-02-01

    Quantum state tomography on a d-dimensional system demands resources that grow rapidly with d. They may be reduced by using model selection to tailor the number of parameters in the model (i.e., the size of the density matrix). Most model selection methods typically rely on a test statistic and a null theory that describes its behavior when two models are equally good. Here, we consider the loglikelihood ratio. Because of the positivity constraint ρ ≥ 0, quantum state space does not generally satisfy local asymptotic normality (LAN), meaning the classical null theory for the loglikelihood ratio (the Wilks theorem) should not be used. Thus, understanding and quantifying how positivity affects the null behavior of this test statistic is necessary for its use in model selection for state tomography. We define a new generalization of LAN, metric-projected LAN, show that quantum state space satisfies it, and derive a replacement for the Wilks theorem. In addition to enabling reliable model selection, our results shed more light on the qualitative effects of the positivity constraint on state tomography.
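
    The classical null theory being replaced is easy to exhibit numerically: in a model with no boundary (unlike ρ ≥ 0), Wilks' theorem says twice the loglikelihood ratio between nested models is χ²-distributed, with degrees of freedom equal to the number of extra parameters. A hedged illustration in a toy Gaussian setting (not the quantum setting of the paper):

        # Wilks' theorem check: H0 mu = 0 vs free mu, unit-variance Gaussian data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        n, trials = 50, 20000
        llrs = np.empty(trials)
        for i in range(trials):
            x = rng.normal(0.0, 1.0, n)   # data generated under the null
            # 2*[logL(mu_hat) - logL(0)] reduces to n * xbar^2 for this model
            llrs[i] = n * x.mean()**2

        qs = [0.5, 0.9, 0.99]
        print(np.quantile(llrs, qs))      # empirical quantiles ...
        print(stats.chi2(df=1).ppf(qs))   # ... match the chi^2_1 prediction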

  15. Behavior of the maximum likelihood in quantum state tomography

    DOE PAGES

    Blume-Kohout, Robin J; Scholten, Travis L.

    2018-02-22

    Quantum state tomography on a d-dimensional system demands resources that grow rapidly with d. They may be reduced by using model selection to tailor the number of parameters in the model (i.e., the size of the density matrix). Most model selection methods typically rely on a test statistic and a null theory that describes its behavior when two models are equally good. Here, we consider the loglikelihood ratio. Because of the positivity constraint ρ ≥ 0, quantum state space does not generally satisfy local asymptotic normality (LAN), meaning the classical null theory for the loglikelihood ratio (the Wilks theorem) should not be used. Thus, understanding and quantifying how positivity affects the null behavior of this test statistic is necessary for its use in model selection for state tomography. We define a new generalization of LAN, metric-projected LAN, show that quantum state space satisfies it, and derive a replacement for the Wilks theorem. In addition to enabling reliable model selection, our results shed more light on the qualitative effects of the positivity constraint on state tomography.

  16. Behavior of the maximum likelihood in quantum state tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blume-Kohout, Robin J; Scholten, Travis L.

    Quantum state tomography on a d-dimensional system demands resources that grow rapidly with d. They may be reduced by using model selection to tailor the number of parameters in the model (i.e., the size of the density matrix). Most model selection methods typically rely on a test statistic and a null theory that describes its behavior when two models are equally good. Here, we consider the loglikelihood ratio. Because of the positivity constraint ρ ≥ 0, quantum state space does not generally satisfy local asymptotic normality (LAN), meaning the classical null theory for the loglikelihood ratio (the Wilks theorem) should not be used. Thus, understanding and quantifying how positivity affects the null behavior of this test statistic is necessary for its use in model selection for state tomography. We define a new generalization of LAN, metric-projected LAN, show that quantum state space satisfies it, and derive a replacement for the Wilks theorem. In addition to enabling reliable model selection, our results shed more light on the qualitative effects of the positivity constraint on state tomography.

  17. Cognitive biases, linguistic universals, and constraint-based grammar learning.

    PubMed

    Culbertson, Jennifer; Smolensky, Paul; Wilson, Colin

    2013-07-01

    According to classical arguments, language learning is both facilitated and constrained by cognitive biases. These biases are reflected in linguistic typology -- the distribution of linguistic patterns across the world's languages -- and can be probed with artificial grammar experiments on child and adult learners. Beginning with a widely successful approach to typology (Optimality Theory), and adapting techniques from computational approaches to statistical learning, we develop a Bayesian model of cognitive biases and show that it accounts for the detailed pattern of results of artificial grammar experiments on noun-phrase word order (Culbertson, Smolensky, & Legendre, 2012). Our proposal has several novel properties that distinguish it from prior work in the domains of linguistic theory, computational cognitive science, and machine learning. This study illustrates how ideas from these domains can be synthesized into a model of language learning in which biases range in strength from hard (absolute) to soft (statistical), and in which language-specific and domain-general biases combine to account for data from the macro-level scale of typological distribution to the micro-level scale of learning by individuals. Copyright © 2013 Cognitive Science Society, Inc.

  18. The Nature of Quantum Truth: Logic, Set Theory, & Mathematics in the Context of Quantum Theory

    NASA Astrophysics Data System (ADS)

    Frey, Kimberly

    The purpose of this dissertation is to construct a radically new type of mathematics whose underlying logic differs from the ordinary classical logic used in standard mathematics, and which we feel may be more natural for applications in quantum mechanics. Specifically, we begin by constructing a first order quantum logic, the development of which closely parallels that of ordinary (classical) first order logic --- the essential differences are in the nature of the logical axioms, which, in our construction, are motivated by quantum theory. After showing that the axiomatic first order logic we develop is sound and complete (with respect to a particular class of models), this logic is then used as a foundation on which to build (axiomatic) mathematical systems --- and we refer to the resulting new mathematics as "quantum mathematics." As noted above, the hope is that this form of mathematics is more natural than classical mathematics for the description of quantum systems, and will enable us to address some foundational aspects of quantum theory which are still troublesome --- e.g. the measurement problem --- as well as possibly even inform our thinking about quantum gravity. After constructing the underlying logic, we investigate properties of several mathematical systems --- e.g. axiom systems for abstract algebras, group theory, linear algebra, etc. --- in the presence of this quantum logic. In the process, we demonstrate that the resulting quantum mathematical systems have some strange, but very interesting features, which indicates a richness in the structure of mathematics that is classically inaccessible. Moreover, some of these features do indeed suggest possible applications to foundational questions in quantum theory. We continue our investigation of quantum mathematics by constructing an axiomatic quantum set theory, which we show satisfies certain desirable criteria. Ultimately, we hope that such a set theory will lead to a foundation for quantum mathematics in a sense which parallels the foundational role of classical set theory in classical mathematics. One immediate application of the quantum set theory we develop is to provide a foundation on which to construct quantum natural numbers, which are the quantum analog of the classical counting numbers. It turns out that in a special class of models, there exists a 1-1 correspondence between the quantum natural numbers and bounded observables in quantum theory whose eigenvalues are (ordinary) natural numbers. This 1-1 correspondence is remarkably satisfying, and not only gives us great confidence in our quantum set theory, but indicates the naturalness of such models for quantum theory itself. We go on to develop a Peano-like arithmetic for these new "numbers," as well as consider some of its consequences. Finally, we conclude by summarizing our results, and discussing directions for future work.

  19. Nucleation theory - Is replacement free energy needed? [error analysis of capillary approximation]

    NASA Technical Reports Server (NTRS)

    Doremus, R. H.

    1982-01-01

    It has been suggested that the classical theory of nucleation of liquid from its vapor, as developed by Volmer and Weber (1926), needs modification with a factor referred to as the replacement free energy, and that the capillary approximation underlying the classical theory is in error. Here, the classical nucleation equation is derived from fluctuation theory, Gibbs' result for the reversible work to form a critical nucleus, and the rate of collision of gas molecules with a surface. The capillary approximation is not used in the derivation. The chemical potential of small drops is then considered, and it is shown that the capillary approximation can be derived from thermodynamic equations. The results show that no corrections to Volmer's equation are needed.
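
    The quantities entering Volmer's equation follow from Gibbs' expression for the reversible work: with Δμ = kT ln S the supersaturation driving force per molecule and v the molecular volume, the critical radius is r* = 2σv/Δμ and the work of formation is W* = (16π/3)σ³v²/Δμ². A hedged numerical sketch (the parameter values are rough, water-like assumptions):

        # Classical nucleation: critical radius and barrier for drop condensation.
        import numpy as np

        kB, T = 1.380649e-23, 300.0   # J/K, K
        sigma = 0.072                 # surface tension, N/m (water, approximate)
        v = 3.0e-29                   # molecular volume of the liquid, m^3 (approximate)
        S = 4.0                       # hypothetical supersaturation ratio p/p_eq

        dmu = kB * T * np.log(S)                              # per-molecule driving force
        r_star = 2.0 * sigma * v / dmu                        # Gibbs critical radius
        W_star = (16.0*np.pi/3.0) * sigma**3 * v**2 / dmu**2  # reversible work of formation
        print(r_star, W_star / (kB*T))   # barrier in kT units controls exp(-W*/kT)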

  20. Effective model hierarchies for dynamic and static classical density functional theories

    NASA Astrophysics Data System (ADS)

    Majaniemi, S.; Provatas, N.; Nonomura, M.

    2010-09-01

    The origin and methodology of deriving effective model hierarchies are presented with applications to solidification of crystalline solids. In particular, it is discussed how the form of the equations of motion and the effective parameters on larger scales can be obtained from the more microscopic models. It will be shown that tying together the dynamic structure of the projection operator formalism with static classical density functional theories can lead to incomplete (mass) transport properties even though the linearized hydrodynamics on large scales is correctly reproduced. To facilitate a more natural way of binding together the dynamics of the macrovariables and classical density functional theory, a dynamic generalization of density functional theory based on the nonequilibrium generating functional is suggested.

  1. Teaching Statistics Using Classic Psychology Research: An Activities-Based Approach

    ERIC Educational Resources Information Center

    Holmes, Karen Y.; Dodd, Brett A.

    2012-01-01

    In this article, we discuss a collection of active learning activities derived from classic psychology studies that illustrate the appropriate use of descriptive and inferential statistics. (Contains 2 tables.)

  2. Using extant literature in a grounded theory study: a personal account.

    PubMed

    Yarwood-Ross, Lee; Jack, Kirsten

    2015-03-01

    To provide a personal account of the factors in a doctoral study that led to the adoption of classic grounded theory principles relating to the use of literature. Novice researchers considering grounded theory methodology will become aware of the contentious issue of how and when extant literature should be incorporated into a study. The three main grounded theory approaches are classic, Straussian and constructivist, and the seminal texts provide conflicting beliefs surrounding the use of literature. A classic approach avoids a pre-study literature review to minimise preconceptions and emphasises the constant comparison method, while the Straussian and constructivist approaches focus more on the beneficial aspects of an initial literature review and researcher reflexivity. The debate also extends into the wider academic community, where no consensus exists. This is a methodological paper detailing the authors' engagement in the debate surrounding the role of the literature in a grounded theory study. In the authors' experience, researchers can best understand the use of literature in grounded theory through immersion in the seminal texts, engaging with wider academic literature, and examining their preconceptions of the substantive area. The authors concluded that classic grounded theory principles were appropriate in the context of their doctoral study. Novice researchers will have their own sets of circumstances when preparing their studies and should become aware of the different perspectives to make decisions that they can ultimately justify. This paper can be used by other novice researchers as an example of the decision-making process that led to delaying a pre-study literature review and identifies the resources used to write a research proposal when using a classic grounded theory approach.

  3. Multipactor threshold calculation of coaxial transmission lines in microwave applications with nonstationary statistical theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, S.; Li, Y.; Liu, C.

    2015-08-15

    This paper presents a statistical theory for the initial onset of multipactor breakdown in coaxial transmission lines, taking both the nonuniform electric field and random electron emission velocity into account. A general numerical method is first developed to construct the joint probability density function based on the approximate equation of the electron trajectory. The nonstationary dynamics of the multipactor process on both surfaces of coaxial lines are modelled based on the probability of various impacts and their corresponding secondary emission. The resonant assumption of the classical theory on the independent double-sided and single-sided impacts is replaced by the consideration of their interaction. As a result, the time evolutions of the electron population for exponential growth and absorption on both inner and outer conductor, in response to the applied voltage above and below the multipactor breakdown level, are obtained to investigate the exact mechanism of multipactor discharge in coaxial lines. Furthermore, the multipactor threshold predictions of the presented model are compared with experimental results using measured secondary emission yield of the tested samples, which shows reasonable agreement. Finally, the detailed impact scenario reveals that single-surface multipactor is more likely to occur with a higher outer to inner conductor radius ratio.

  4. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling.

    PubMed

    Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall

    2016-01-01

    Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to the analysis of medical time series data: (1) classical statistical approaches, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are the cervical cancer risk assessments produced by the three models (Kaplan-Meier, Cox regression, and the dynamic Bayesian network). Our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Our study shows that the Bayesian approach (1) is much more flexible in terms of modeling effort, and (2) offers an individualized risk assessment, which is more cumbersome for classical statistical approaches.
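
    The classical side of this comparison starts from the Kaplan-Meier estimator, which multiplies, over the observed event times, the fractions of at-risk subjects surviving each event. A minimal sketch on toy right-censored data (illustrative only; it ignores the usual tie-handling conventions and is not the study's code):

        # Kaplan-Meier estimator of the survival function S(t).
        import numpy as np

        def kaplan_meier(times, events):
            """times: follow-up times; events: 1 = event occurred, 0 = censored."""
            times, events = np.asarray(times, float), np.asarray(events, int)
            order = np.argsort(times)
            times, events = times[order], events[order]
            n_at_risk, S, curve = len(times), 1.0, []
            for t, d in zip(times, events):
                if d == 1:
                    S *= 1.0 - 1.0/n_at_risk   # one event among n_at_risk subjects
                    curve.append((t, S))
                n_at_risk -= 1                 # events and censorings both leave the risk set
            return curve

        print(kaplan_meier([2, 3, 3, 5, 8, 9], [1, 1, 0, 1, 0, 1]))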

  5. Classical BV Theories on Manifolds with Boundary

    NASA Astrophysics Data System (ADS)

    Cattaneo, Alberto S.; Mnev, Pavel; Reshetikhin, Nicolai

    2014-12-01

    In this paper we extend the classical BV framework to gauge theories on spacetime manifolds with boundary. In particular, we connect the BV construction in the bulk with the BFV construction on the boundary and we develop its extension to strata of higher codimension in the case of manifolds with corners. We present several examples including electrodynamics, Yang-Mills theory and topological field theories coming from the AKSZ construction, in particular, the Chern-Simons theory, the BF theory, and the Poisson sigma model. This paper is the first step towards developing the perturbative quantization of such theories on manifolds with boundary in a way consistent with gluing.

  6. An Examination of the Flynn Effect in the National Intelligence Test in Estonia

    ERIC Educational Resources Information Center

    Shiu, William

    2012-01-01

    This study examined the Flynn Effect (FE; i.e., the rise in IQ scores over time) in Estonia from Scale B of the National Intelligence Test using both classical test theory (CTT) and item response theory (IRT) methods. Secondary data from two cohorts (1934, n = 890 and 2006, n = 913) of students were analyzed, using both classical test theory (CTT)…

  7. On the streaming model for redshift-space distortions

    NASA Astrophysics Data System (ADS)

    Kuruvilla, Joseph; Porciani, Cristiano

    2018-06-01

    The streaming model describes the mapping between real and redshift space for two-point clustering statistics. Its key element is the probability density function (PDF) of line-of-sight pairwise peculiar velocities. Following a kinetic-theory approach, we derive the fundamental equations of the streaming model for ordered and unordered pairs. In the first case we recover the classic equation, while in the second we demonstrate that modifications are necessary. We then discuss several statistical properties of the pairwise velocities for dark matter particles and haloes by using a suite of high-resolution N-body simulations. We test the often-used Gaussian ansatz for the PDF of pairwise velocities and discuss its limitations. Finally, we introduce a mixture of Gaussians, known in statistics as the generalised hyperbolic distribution, and show that it provides an accurate fit to the PDF. Once inserted into the streaming equation, the fit yields an excellent description of redshift-space correlations at all scales that vastly outperforms the Gaussian and exponential approximations. Using a principal-component analysis, we reduce the complexity of our model for large redshift-space separations. Our results increase the robustness of studies of anisotropic galaxy clustering and are useful for extending them towards smaller scales in order to test theories of gravity and interacting dark-energy models.
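
    The streaming equation itself can be written, for a redshift-space pair separation (s_perp, s_par), as 1 + ξ_s(s_perp, s_par) = ∫ dy [1 + ξ(r)] P(v_los = s_par - y | r) with r² = s_perp² + y². A hedged numerical sketch with a toy power-law ξ(r) and the (limited) Gaussian ansatz for P; all parameter choices are illustrative assumptions:

        # Streaming-model integral with a toy correlation function and Gaussian PDF.
        import numpy as np

        def xi_real(r):
            return (r / 5.0) ** -1.8   # toy real-space two-point correlation function

        def pdf_los_velocity(v, r, sigma=3.0):
            # Gaussian ansatz for the line-of-sight pairwise-velocity PDF
            # (velocities expressed in comoving-distance units for simplicity)
            return np.exp(-v**2 / (2*sigma**2)) / np.sqrt(2*np.pi*sigma**2)

        def xi_redshift(s_perp, s_par, y_max=60.0, n=4001):
            y = np.linspace(-y_max, y_max, n)   # real-space line-of-sight separation
            r = np.sqrt(s_perp**2 + y**2)
            integrand = (1.0 + xi_real(r)) * pdf_los_velocity(s_par - y, r)
            return np.sum(integrand) * (y[1] - y[0]) - 1.0

        print(xi_redshift(s_perp=5.0, s_par=5.0))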

  8. A graph theory approach to identify resonant and non-resonant transmission paths in statistical modal energy distribution analysis

    NASA Astrophysics Data System (ADS)

    Aragonès, Àngels; Maxit, Laurent; Guasch, Oriol

    2015-08-01

    Statistical modal energy distribution analysis (SmEdA) extends classical statistical energy analysis (SEA) to the mid-frequency range by establishing power balance equations between modes in different subsystems. This circumvents the SEA requirement of modal energy equipartition and enables applying SmEdA to cases of low modal overlap and locally excited subsystems, and to deal with complex heterogeneous subsystems as well. Yet widening the range of application of SEA comes at the price of large models, because the number of modes per subsystem can become considerable as the frequency increases. It would therefore be worthwhile to have at one's disposal tools for quick identification and ranking of the resonant and non-resonant paths involved in modal energy transmission between subsystems. It will be shown that previously developed graph theory algorithms for transmission path analysis (TPA) in SEA can be adapted to SmEdA and prove useful for that purpose. The case of airborne transmission between two cavities separated by homogeneous and ribbed plates will first be addressed to illustrate the potential of the graph approach. A more complex case representing transmission between non-contiguous cavities in a shipbuilding structure will also be presented.

  9. Independence polynomial and matching polynomial of the Koch network

    NASA Astrophysics Data System (ADS)

    Liao, Yunhua; Xie, Xiaoliang

    2015-11-01

    The lattice gas model and the monomer-dimer model are two classical models in statistical mechanics. It is well known that the partition functions of these two models are associated with the independence polynomial and the matching polynomial in graph theory, respectively. Both polynomials have been shown to belong to the "#P-complete" class, which indicates that the problems are computationally "intractable". We consider these two polynomials for the Koch networks, which are scale-free with small-world effects. Explicit recurrences are derived, and explicit formulae are presented for the number of independent sets of a certain type.
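
    For context, the brute-force route that the "#P-complete" result makes hopeless at scale is the standard vertex recurrence I(G; x) = I(G - v; x) + x*I(G - N[v]; x), where N[v] is the closed neighbourhood of v. A small sketch of that recurrence (exponential in general; the paper instead derives closed recurrences specific to Koch networks):

        # Independence polynomial by the deletion recurrence (small graphs only).
        def independence_poly(vertices, adj, x):
            """Evaluate I(G; x) for the graph on `vertices` with adjacency dict `adj`."""
            vertices = frozenset(vertices)
            if not vertices:
                return 1
            v = next(iter(vertices))
            closed_nbhd = ({v} | adj[v]) & vertices        # N[v]: v and its neighbours
            without_v = independence_poly(vertices - {v}, adj, x)
            with_v = independence_poly(vertices - closed_nbhd, adj, x)
            return without_v + x * with_v

        # Triangle K3: independent sets are {}, {1}, {2}, {3}, so I(K3; x) = 1 + 3x.
        adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2}}
        print(independence_poly({1, 2, 3}, adj, x=2))      # 1 + 3*2 = 7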

  10. Nucleon matter equation of state, particle number fluctuations, and shear viscosity within UrQMD box calculations

    NASA Astrophysics Data System (ADS)

    Motornenko, A.; Bravina, L.; Gorenstein, M. I.; Magner, A. G.; Zabrodin, E.

    2018-03-01

    Properties of an equilibrated nucleon system are studied within the ultra-relativistic quantum molecular dynamics (UrQMD) transport model. The UrQMD calculations are done within a finite box with periodic boundary conditions. The system achieves thermal equilibrium due to nucleon-nucleon elastic scattering. For the UrQMD equilibrium state, nucleon energy spectra, the equation of state, particle number fluctuations, and the shear viscosity η are calculated. The UrQMD results are compared with both statistical mechanics and Chapman-Enskog kinetic theory for a classical system of nucleons with hard-core repulsion.

  11. Bukhvostov-Lipatov model and quantum-classical duality

    NASA Astrophysics Data System (ADS)

    Bazhanov, Vladimir V.; Lukyanov, Sergei L.; Runov, Boris A.

    2018-02-01

    The Bukhvostov-Lipatov model is an exactly soluble model of two interacting Dirac fermions in 1 + 1 dimensions. The model describes weakly interacting instantons and anti-instantons in the O(3) non-linear sigma model. In our previous work [arXiv:1607.04839] we have proposed an exact formula for the vacuum energy of the Bukhvostov-Lipatov model in terms of special solutions of the classical sinh-Gordon equation, which can be viewed as an example of a remarkable duality between integrable quantum field theories and integrable classical field theories in two dimensions. Here we present a complete derivation of this duality based on the classical inverse scattering transform method, traditional Bethe ansatz techniques and the analytic theory of ordinary differential equations. In particular, we show that the Bethe ansatz equations defining the vacuum state of the quantum theory also define connection coefficients of an auxiliary linear problem for the classical sinh-Gordon equation. Moreover, we also present details of the derivation of the non-linear integral equations determining the vacuum energy and other spectral characteristics of the model in the case when the vacuum state is filled by 2-string solutions of the Bethe ansatz equations.

  12. Further Development of an Optimal Design Approach Applied to Axial Magnetic Bearings

    NASA Technical Reports Server (NTRS)

    Bloodgood, V. Dale, Jr.; Groom, Nelson J.; Britcher, Colin P.

    2000-01-01

    Classical design methods involved in magnetic bearings and magnetic suspension systems have always had their limitations. Because of this, the overall effectiveness of a design has always relied heavily on the skill and experience of the individual designer. This paper combines two approaches that have been developed to aid the accuracy and efficiency of magnetostatic design. The first approach integrates classical magnetic circuit theory with modern optimization theory to increase design efficiency. The second approach uses loss factors to increase the accuracy of classical magnetic circuit theory. As an example, an axial magnetic thrust bearing is designed for minimum power.

  13. Ethical and Stylistic Implications in Delivering Conference Papers.

    ERIC Educational Resources Information Center

    Enos, Theresa

    1986-01-01

    Analyzes shortcomings of conference papers intended for the eye rather than the ear. Referring to classical oratory, speech act theory, and cognitive theory, recommends revising papers for oral presentation by using classical disposition; deductive rather than inductive argument; formulaic repetition of words and phrases; non-inverted clause…

  14. Quantum theory for 1D X-ray free electron laser

    NASA Astrophysics Data System (ADS)

    Anisimov, Petr M.

    2018-06-01

    Classical 1D X-ray Free Electron Laser (X-ray FEL) theory has stood the test of time by guiding FEL design and development prior to any full-scale analysis. Future X-ray FELs and inverse-Compton sources, where the photon recoil approaches the electron energy spread, push the classical theory to its limits of applicability. Despite substantial efforts by the community to determine what those limits are, there is no universally agreed-upon quantum approach to the design and development of future X-ray sources. We offer a new approach to formulating the quantum theory of 1D X-ray FELs that has an obvious connection to the classical theory, which allows for immediate transfer of knowledge between the two regimes. We exploit this connection in order to draw quantum mechanical conclusions about the quantum nature of electrons and generated radiation in terms of FEL variables.

  15. Plasmon mass scale and quantum fluctuations of classical fields on a real time lattice

    NASA Astrophysics Data System (ADS)

    Kurkela, Aleksi; Lappi, Tuomas; Peuron, Jarkko

    2018-03-01

    Classical real-time lattice simulations play an important role in understanding non-equilibrium phenomena in gauge theories and are used in particular to model the prethermal evolution of heavy-ion collisions. Above the Debye scale the classical Yang-Mills (CYM) theory can be matched smoothly to kinetic theory. First we study the limits of the quasiparticle picture of the CYM fields by determining the plasmon mass of the system using three different methods. Then we argue that one needs a numerical calculation of a system of classical gauge fields and small linearized fluctuations, which correspond to quantum fluctuations, in a way that keeps the separation between the two manifest. We demonstrate and test an implementation of an algorithm with the linearized fluctuations, showing that the linearization indeed works and that Gauss's law is conserved.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nomura, Yasunori; Salzetta, Nico; Sanches, Fabio

    We study the Hilbert space structure of classical spacetimes under the assumption that entanglement in holographic theories determines semiclassical geometry. We show that this simple assumption has profound implications; for example, a superposition of classical spacetimes may lead to another classical spacetime. Despite its unconventional nature, this picture admits the standard interpretation of superpositions of well-defined semiclassical spacetimes in the limit that the number of holographic degrees of freedom becomes large. We illustrate these ideas using a model for the holographic theory of cosmological spacetimes.

  17. Classical theory of radiating strings

    NASA Technical Reports Server (NTRS)

    Copeland, Edmund J.; Haws, D.; Hindmarsh, M.

    1990-01-01

    The divergent part of the self-force of a radiating string coupled to gravity, an antisymmetric tensor, and a dilaton in four dimensions is calculated to first order in classical perturbation theory. While this divergence can be absorbed into a renormalization of the string tension, demanding that both it and the divergence in the energy-momentum tensor vanish forces the string to have the couplings of compactified N = 1, D = 10 supergravity. In effect, supersymmetry cures the classical infinities.

  18. Emergence of a classical Universe from quantum gravity and cosmology.

    PubMed

    Kiefer, Claus

    2012-09-28

    I describe how we can understand the classical appearance of our world from a universal quantum theory. The essential ingredient is the process of decoherence. I start with a general discussion in ordinary quantum theory and then turn to quantum gravity and quantum cosmology. There is a whole hierarchy of classicality from the global gravitational field to the fluctuations in the cosmic microwave background, which serve as the seeds for the structure in the Universe.

  19. Chiral anomaly, Berry phase, and chiral kinetic theory from worldlines in quantum field theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Niklas; Venugopalan, Raju

    Here, we outline a novel chiral kinetic theory framework for systematic computations of the Chiral Magnetic Effect (CME) in ultrarelativistic heavy-ion collisions. The real part of the fermion determinant in the QCD effective action is expressed as a supersymmetric world-line action of spinning, colored, Grassmannian point particles in background gauge fields, with equations of motion that are covariant generalizations of the Bargmann-Michel-Telegdi and Wong equations. Berry's phase is obtained in a consistent non-relativistic adiabatic limit. The chiral anomaly, in contrast, arises from the phase of the fermion determinant; its topological properties are therefore distinct from those of the Berry phase. We show that the imaginary contribution to the fermion determinant too can be expressed as a point particle world-line path integral and derive the corresponding anomalous axial vector current. Our results can be used to derive a covariant relativistic chiral kinetic theory including the effects of topological fluctuations that has overlap with classical-statistical simulations of the CME at early times and anomalous hydrodynamics at late times.

  20. Chiral anomaly, Berry phase, and chiral kinetic theory from worldlines in quantum field theory

    DOE PAGES

    Mueller, Niklas; Venugopalan, Raju

    2018-03-21

    Here, we outline a novel chiral kinetic theory framework for systematic computations of the Chiral Magnetic Effect (CME) in ultrarelativistic heavy-ion collisions. The real part of the fermion determinant in the QCD effective action is expressed as a supersymmetric world-line action of spinning, colored, Grassmannian point particles in background gauge fields, with equations of motion that are covariant generalizations of the Bargmann-Michel-Telegdi and Wong equations. Berry's phase is obtained in a consistent non-relativistic adiabatic limit. The chiral anomaly, in contrast, arises from the phase of the fermion determinant; its topological properties are therefore distinct from those of the Berry phase. We show that the imaginary contribution to the fermion determinant too can be expressed as a point particle world-line path integral and derive the corresponding anomalous axial vector current. Our results can be used to derive a covariant relativistic chiral kinetic theory including the effects of topological fluctuations that has overlap with classical-statistical simulations of the CME at early times and anomalous hydrodynamics at late times.

  1. Classical gluon and graviton radiation from the bi-adjoint scalar double copy

    NASA Astrophysics Data System (ADS)

    Goldberger, Walter D.; Prabhu, Siddharth G.; Thompson, Jedidiah O.

    2017-09-01

    We find double-copy relations between classical radiating solutions in Yang-Mills theory coupled to dynamical color charges and their counterparts in a cubic bi-adjoint scalar field theory which interacts linearly with particles carrying bi-adjoint charge. The particular color-to-kinematics replacements we employ are motivated by the Bern-Carrasco-Johansson double-copy correspondence for on-shell amplitudes in gauge and gravity theories. They are identical to those recently used to establish relations between classical radiating solutions in gauge theory and in dilaton gravity. Our explicit bi-adjoint solutions are constructed to second order in a perturbative expansion, and map under the double copy onto gauge theory solutions which involve at most cubic gluon self-interactions. If the correspondence is found to persist to higher orders in perturbation theory, our results suggest the possibility of calculating gravitational radiation from colliding compact objects, directly from a scalar field with vastly simpler (purely cubic) Feynman vertices.

  2. Combinatorial Market Processing for Multilateral Coordination

    DTIC Science & Technology

    2005-09-01

    In the classical auction theory literature, most of the attention is focused on one-sided, single-item auctions [86]. There is now a growing body of... Programming in Infinite-dimensional Spaces: Theory and Applications, Wiley, 1987. [3] K. J. Arrow, “An extension of the basic theorems of classical...” Commodities, Princeton University Press, 1969. [43] D. Friedman and J. Rust, The Double Auction Market: Institutions, Theories, and Evidence, Addison

  3. Group entropies, correlation laws, and zeta functions.

    PubMed

    Tempesta, Piergiulio

    2011-08-01

    The notion of group entropy is proposed. It enables the unification and generalization of many different definitions of entropy known in the literature, such as those of Boltzmann-Gibbs, Tsallis, Abe, and Kaniadakis. Other entropic functionals are introduced, related to nontrivial correlation laws characterizing universality classes of systems out of equilibrium when the dynamics is weakly chaotic. The associated thermostatistics are discussed. The mathematical structure underlying our construction is that of formal group theory, which provides the general structure of the correlations among particles and dictates the associated entropic functionals. As an example of application, the role of group entropies in information theory is illustrated and generalizations of the Kullback-Leibler divergence are proposed. A new connection between statistical mechanics and zeta functions is established. In particular, Tsallis entropy is related to the classical Riemann zeta function.
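
    One member of the family being unified is easy to state concretely: the Tsallis entropy S_q = (1 - Σ_i p_i^q)/(q - 1), which recovers the Boltzmann-Gibbs-Shannon entropy -Σ_i p_i ln p_i as q → 1. A short sketch of that limit (illustrative only):

        # Tsallis entropy and its q -> 1 Boltzmann-Gibbs-Shannon limit.
        import numpy as np

        def tsallis_entropy(p, q):
            p = np.asarray(p, float)
            p = p[p > 0]
            if np.isclose(q, 1.0):
                return -np.sum(p * np.log(p))          # Shannon limit, in nats
            return (1.0 - np.sum(p**q)) / (q - 1.0)

        p = [0.5, 0.25, 0.25]
        # q = 2 gives 1 - sum(p_i^2); q near 1 approaches the Shannon value ~1.0397.
        print(tsallis_entropy(p, 2.0), tsallis_entropy(p, 1.0 + 1e-8), tsallis_entropy(p, 1.0))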

  4. Magnetic disorder in superconductors: Enhancement by mesoscopic fluctuations

    NASA Astrophysics Data System (ADS)

    Burmistrov, I. S.; Skvortsov, M. A.

    2018-01-01

    We study the density of states (DOS) and the transition temperature Tc in a dirty superconducting film with rare classical magnetic impurities of an arbitrary strength described by the Poissonian statistics. We take into account that the potential disorder is a source of mesoscopic fluctuations of the local DOS, and, consequently, of the effective strength of magnetic impurities. We find that these mesoscopic fluctuations result in a nonzero DOS for all energies in the region of the phase diagram where without this effect the DOS is zero within the standard mean-field theory. This mechanism can be more efficient in filling the mean-field superconducting gap than rare fluctuations of the potential disorder (instantons). Depending on the magnetic impurity strength, the suppression of Tc by spin-flip scattering can be faster or slower than in the standard mean-field theory.

  5. Large-scale fluctuations in the diffusive decomposition of solid solutions

    NASA Astrophysics Data System (ADS)

    Karpov, V. G.; Grimsditch, M.

    1995-04-01

    The concept of an instability in the classic Ostwald ripening theory with respect to compositional fluctuations is suggested. We show that small statistical fluctuations in the precipitate phase lead to gigantic Coulomb-like fluctuations in the solute concentration, which in turn affect the ripening. As a result, large-scale fluctuations in both the precipitate and solute concentrations appear. These fluctuations are characterized by amplitudes of the order of the average values of the corresponding quantities and by a spatial scale L ~ (na)^(-1/2), which is considerably greater than both the average nuclear radius and the internuclear distance. The Lifshitz-Slyozov theory of ripening is shown to remain locally applicable over length scales much less than L. The implications of these findings for elastic light scattering in solid solutions that have undergone Ostwald ripening are considered.

  6. The Classical Vacuum.

    ERIC Educational Resources Information Center

    Boyer, Timothy H.

    1985-01-01

    The classical vacuum of physics is not empty, but contains a distinctive pattern of electromagnetic fields. Discovery of the vacuum, thermal spectrum, classical electron theory, zero-point spectrum, and effects of acceleration are discussed. Connection between thermal radiation and the classical vacuum reveals unexpected unity in the laws of…

  7. An improved exceedance theory for combined random stresses

    NASA Technical Reports Server (NTRS)

    Lester, H. C.

    1974-01-01

    An extension is presented of Rice's classic solution for the exceedances of a constant level by a single random process to its counterpart for an n-dimensional vector process. An interaction boundary, analogous to the constant level considered by Rice for the one-dimensional case, is assumed in the form of a hypersurface. The theory for the numbers of boundary exceedances is developed by using a joint statistical approach which fully accounts for all cross-correlation effects. An exact expression is derived for the n-dimensional exceedance density function, which is valid for an arbitrary interaction boundary. For application to biaxial states of combined random stress, the general theory is reduced to the two-dimensional case. An elliptical stress interaction boundary is assumed and the exact expression for the density function is presented. The equations are expressed in a format which facilitates calculating the exceedances by numerically evaluating a line integral. The behavior of the density function for the two-dimensional case is briefly discussed.
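
    For reference, the one-dimensional result being generalized is Rice's upcrossing rate for a stationary process X(t), written in terms of the joint density of the process and its derivative; the second expression is the standard zero-mean Gaussian special case (our notation, not the report's):

        \nu(u) = \int_0^\infty v \, p_{X\dot{X}}(u, v) \, dv, \qquad
        \nu_{\mathrm{Gauss}}(u) = \frac{1}{2\pi} \frac{\sigma_{\dot{X}}}{\sigma_X}
        \exp\!\left( -\frac{u^2}{2\sigma_X^2} \right).

    The n-dimensional theory replaces the constant level u by an interaction hypersurface and the joint density by that of the full vector process and its derivative.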

  8. The Six Core Theories of Modern Physics

    NASA Astrophysics Data System (ADS)

    Stevens, Charles F.

    1996-09-01

    Charles Stevens, a prominent neurobiologist who originally trained as a biophysicist (with George Uhlenbeck and Mark Kac), wrote this book almost by accident. Each summer he found himself reviewing key areas of physics that he had once known and understood well, for use in his present biological research. Since there was no book, he created his own set of notes, which formed the basis for this brief, clear, and self-contained summary of the basic theoretical structures of classical mechanics, electricity and magnetism, quantum mechanics, statistical physics, special relativity, and quantum field theory. The Six Core Theories of Modern Physics can be used by advanced undergraduates or beginning graduate students as a supplement to the standard texts or for an uncluttered, succinct review of the key areas. Professionals in such quantitative sciences as chemistry, engineering, computer science, applied mathematics, and biophysics who need to brush up on the essentials of a particular area will find most of the required background material, including the mathematics.

  9. The Effect of Mental Rotation on Surgical Pathological Diagnosis.

    PubMed

    Park, Heejung; Kim, Hyun Soo; Cha, Yoon Jin; Choi, Junjeong; Minn, Yangki; Kim, Kyung Sik; Kim, Se Hoon

    2018-05-01

    Pathological diagnosis involves very delicate and complex sequential processing conducted by a pathologist. The recognition of false patterns might be an important cause of misdiagnosis in the field of surgical pathology. In this study, we evaluated the influence of visual and cognitive bias in surgical pathologic diagnosis, focusing on the influence of "mental rotation." We designed three sets of the same images of biopsied uterine cervix specimens (original, left-to-right mirror images, and 180-degree rotated images), and recruited 32 pathologists to diagnose the three image sets individually. First, the items were found to be adequate for analysis by classical test theory, generalizability theory, and item response theory. The results showed no statistically significant differences in difficulty, discrimination indices, or response duration time between the image sets. Mental rotation did not influence the pathologists' diagnoses in practice. Interestingly, outliers were more frequent in the rotated image sets, suggesting that the mental rotation process may influence the pathological diagnoses of a few individual pathologists. © Copyright: Yonsei University College of Medicine 2018.

  10. Random walk in generalized quantum theory

    NASA Astrophysics Data System (ADS)

    Martin, Xavier; O'Connor, Denjoe; Sorkin, Rafael D.

    2005-01-01

    One can view quantum mechanics as a generalization of classical probability theory that provides for pairwise interference among alternatives. Adopting this perspective, we “quantize” the classical random walk by finding, subject to a certain condition of “strong positivity”, the most general Markovian, translationally invariant “decoherence functional” with nearest neighbor transitions.

  11. Free Fermions and the Classical Compact Groups

    NASA Astrophysics Data System (ADS)

    Cunden, Fabio Deelan; Mezzadri, Francesco; O'Connell, Neil

    2018-06-01

    There is a close connection between the ground state of non-interacting fermions in a box with classical (absorbing, reflecting, and periodic) boundary conditions and the eigenvalue statistics of the classical compact groups. The associated determinantal point processes can be extended in two natural directions: (i) we consider the full family of admissible quantum boundary conditions (i.e., self-adjoint extensions) for the Laplacian on a bounded interval, and the corresponding projection correlation kernels; (ii) we construct the grand canonical extensions at finite temperature of the projection kernels, interpolating from Poisson to random matrix eigenvalue statistics. The scaling limits in the bulk and at the edges are studied in a unified framework, and the question of universality is addressed. Whether the finite temperature determinantal processes correspond to the eigenvalue statistics of some matrix models is, a priori, not obvious. We complete the picture by constructing a finite temperature extension of the Haar measure on the classical compact groups. The eigenvalue statistics of the resulting grand canonical matrix models (of random size) corresponds exactly to the grand canonical measure of free fermions with classical boundary conditions.
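
    As a point of reference (a standard random-matrix fact, not a result specific to this paper), the bulk scaling limit of such projection kernels is the sine kernel,

        K(x, y) = \frac{\sin \pi (x - y)}{\pi (x - y)},

    and the grand canonical, finite-temperature kernels discussed here interpolate between this random-matrix statistics and the Poisson case.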

  12. Neo-classical theory of competition or Adam Smith's hand as mathematized ideology

    NASA Astrophysics Data System (ADS)

    McCauley, Joseph L.

    2001-10-01

    Orthodox economic theory (utility maximization, rational agents, efficient markets in equilibrium) is based on arbitrarily postulated, nonempirical notions. The disagreement between economic reality and a key feature of neo-classical economic theory was criticized empirically by Osborne. I show that the orthodox theory is internally self-inconsistent for the very reason suggested by Osborne: lack of invertibility of demand and supply as functions of price to obtain price as functions of supply and demand. The reason for the noninvertibility arises from nonintegrable excess demand dynamics, a feature of their theory completely ignored by economists.

  13. Measurement incompatibility and Schrödinger-Einstein-Podolsky-Rosen steering in a class of probabilistic theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banik, Manik, E-mail: manik11ju@gmail.com

    Steering is one of the most counterintuitive non-classical features of bipartite quantum systems, first noticed by Schrödinger in the early days of quantum theory. On the other hand, measurement incompatibility is another non-classical feature of quantum theory, initially pointed out by Bohr. Recently, Quintino et al. [Phys. Rev. Lett. 113, 160402 (2014)] and Uola et al. [Phys. Rev. Lett. 113, 160403 (2014)] have investigated the relation between these two distinct non-classical features. They have shown that a set of measurements is not jointly measurable (i.e., incompatible) if and only if they can be used for demonstrating Schrödinger-Einstein-Podolsky-Rosen steering. The concept of steering has been generalized for more general abstract tensor product theories rather than just Hilbert space quantum mechanics. In this article, we discuss how the notion of measurement incompatibility can be extended to general probability theories. Further, we show that the connection between steering and measurement incompatibility holds in a broader class of tensor product theories rather than just quantum theory.

  14. What is Quantum Mechanics? A Minimal Formulation

    NASA Astrophysics Data System (ADS)

    Friedberg, R.; Hohenberg, P. C.

    2018-03-01

    This paper presents a minimal formulation of nonrelativistic quantum mechanics, by which is meant a formulation which describes the theory in a succinct, self-contained, clear, unambiguous and of course correct manner. The bulk of the presentation is the so-called "microscopic theory", applicable to any closed system S of arbitrary size N, using concepts referring to S alone, without resort to external apparatus or external agents. An example of a similar minimal microscopic theory is the standard formulation of classical mechanics, which serves as the template for a minimal quantum theory. The only substantive assumption required is the replacement of the classical Euclidean phase space by Hilbert space in the quantum case, with the attendant all-important phenomenon of quantum incompatibility. Two fundamental theorems of Hilbert space, the Kochen-Specker-Bell theorem and Gleason's theorem, then lead inevitably to the well-known Born probability rule. For both classical and quantum mechanics, questions of physical implementation and experimental verification of the predictions of the theories are the domain of the macroscopic theory, which is argued to be a special case or application of the more general microscopic theory.

  15. Theory of mind deficit in adult patients with congenital heart disease.

    PubMed

    Chiavarino, Claudia; Bianchino, Claudia; Brach-Prever, Silvia; Riggi, Chiara; Palumbo, Luigi; Bara, Bruno G; Bosco, Francesca M

    2015-10-01

    This article provides the first assessment of theory of mind, that is, the ability to reason about mental states, in adult patients with congenital heart disease. Patients with congenital heart disease and matched healthy controls were administered classical theory of mind tasks and a semi-structured interview which provides a multidimensional evaluation of theory of mind (Theory of Mind Assessment Scale). The patients with congenital heart disease performed worse than the controls on the Theory of Mind Assessment Scale, whereas they did as well as the control group on the classical theory-of-mind tasks. These findings provide the first evidence that adults with congenital heart disease may display specific impairments in theory of mind. © The Author(s) 2013.

  16. Nonequilibrium fixed points in longitudinally expanding scalar theories: Infrared cascade, Bose condensation and a challenge for kinetic theory

    DOE PAGES

    Berges, J.; Schlichting, S.; Boguslavski, K.; ...

    2015-11-05

    In [Phys. Rev. Lett. 114, 061601 (2015)], we reported on a new universality class for longitudinally expanding systems, encompassing strongly correlated non-Abelian plasmas and N-component self-interacting scalar field theories. Using classical-statistical methods, we showed that these systems share the same self-similar scaling properties for a wide range of momenta in a limit where particles are weakly coupled but their occupancy is high. Here we significantly expand on our previous work and delineate two further self-similar regimes. One of these occurs in the deep infrared (IR) regime of very high occupancies, where the nonequilibrium dynamics leads to the formation of a Bose-Einstein condensate. The universal IR scaling exponents and the spectral index characterizing the isotropic IR distributions are described by an effective theory derived from a systematic large-N expansion at next-to-leading order. Remarkably, this effective theory can be cast as a vertex-resummed kinetic theory. The other novel self-similar regime occurs close to the hard physical scale of the theory, and sets in only at later times. In this study, we argue that the important role of the infrared dynamics ensures that key features of our results for scalar and gauge theories cannot be reproduced consistently in conventional kinetic theory frameworks.

  18. Rigorous theory of molecular orientational nonlinear optics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kwak, Chong Hoon, E-mail: chkwak@ynu.ac.kr; Kim, Gun Yeup

    2015-01-15

    Classical statistical mechanics of the molecular optics theory proposed by Buckingham [A. D. Buckingham and J. A. Pople, Proc. Phys. Soc. A 68, 905 (1955)] has been extended to describe the field-induced molecular orientational polarization effects on nonlinear optics. In this paper, we present the generalized molecular orientational nonlinear optical processes (MONLO) through the calculation of the classical orientational averaging using the Boltzmann-type time-averaged orientational interaction energy in the randomly oriented molecular system under the influence of applied electric fields. The focal points of the calculation are (1) the derivation of rigorous tensorial components of the effective molecular hyperpolarizabilities, (2) the molecular orientational polarizations and the electronic polarizations including the well-known third-order dc polarization, dc electric field induced Kerr effect (dc Kerr effect), optical Kerr effect (OKE), dc electric field induced second harmonic generation (EFISH), degenerate four wave mixing (DFWM) and third harmonic generation (THG). We also present some of the new predictive MONLO processes. For second-order MONLO, second-order optical rectification (SOR), Pockels effect and difference frequency generation (DFG) are described in terms of the anisotropic coefficients of the first hyperpolarizability. And, for third-order MONLO, third-order optical rectification (TOR), dc electric field induced difference frequency generation (EFIDFG) and pump-probe transmission are presented.

  19. Quantum Foundations of Quantum Information

    NASA Astrophysics Data System (ADS)

    Griffiths, Robert

    2009-03-01

    The main foundational issue for quantum information is: What is quantum information about? What does it refer to? Classical information typically refers to physical properties, and since classical is a subset of quantum information (assuming the world is quantum mechanical), quantum information should--and, it will be argued, does--refer to quantum physical properties represented by projectors on appropriate subspaces of a quantum Hilbert space. All sorts of microscopic and macroscopic properties, not just measurement outcomes, can be represented in this way, and are thus a proper subject of quantum information. The Stern-Gerlach experiment illustrates this. When properties are compatible, which is to say their projectors commute, Shannon's classical information theory based on statistical correlations extends without difficulty or change to the quantum case. When projectors do not commute, giving rise to characteristic quantum effects, a foundation for the subject can still be constructed by replacing the ``measurement and wave-function collapse'' found in textbooks--an efficient calculational tool, but one giving rise to numerous conceptual difficulties--with a fully consistent and paradox free stochastic formulation of standard quantum mechanics. This formulation is particularly helpful in that it contains no nonlocal superluminal influences; the reason the latter carry no information is that they do not exist.

  20. Periodic orbit spectrum in terms of Ruelle-Pollicott resonances

    NASA Astrophysics Data System (ADS)

    Leboeuf, P.

    2004-02-01

    Fully chaotic Hamiltonian systems possess an infinite number of classical solutions which are periodic, e.g., a trajectory "p" returns to its initial conditions after some fixed time τ_p. Our aim is to investigate the spectrum {τ_1, τ_2, …} of periods of the periodic orbits. An explicit formula for the density ρ(τ) = Σ_p δ(τ − τ_p) is derived in terms of the eigenvalues of the classical evolution operator. The density is naturally decomposed into a smooth part plus an interferent sum over oscillatory terms. The frequencies of the oscillatory terms are given by the imaginary part of the complex eigenvalues (Ruelle-Pollicott resonances). For large periods, corrections to the well-known exponential growth of the smooth part of the density are obtained. An alternative formula for ρ(τ) in terms of the zeros and poles of the Ruelle ζ function is also discussed. The results are illustrated with the geodesic motion in billiards of constant negative curvature. Connections with the statistical properties of the corresponding quantum eigenvalues, random-matrix theory, and discrete maps are also considered. In particular, a random-matrix conjecture is proposed for the eigenvalues of the classical evolution operator of chaotic billiards.
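
    Schematically, and only as a hedged sketch (the precise amplitudes and prefactors are derived in the paper itself; h denotes the topological entropy and s_n = γ_n + iω_n the Ruelle-Pollicott resonances), the decomposition described above reads:

        \rho(\tau) \approx \frac{e^{h\tau}}{\tau}
        + \sum_n A_n \, e^{\gamma_n \tau} \cos(\omega_n \tau + \varphi_n).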

  1. Competing quantum effects in the free energy profiles and diffusion rates of hydrogen and deuterium molecules through clathrate hydrates.

    PubMed

    Cendagorta, Joseph R; Powers, Anna; Hele, Timothy J H; Marsalek, Ondrej; Bačić, Zlatko; Tuckerman, Mark E

    2016-11-30

    Clathrate hydrates hold considerable promise as safe and economical materials for hydrogen storage. Here we present a quantum mechanical study of H2 and D2 diffusion through a hexagonal face shared by two large cages of clathrate hydrates over a wide range of temperatures. Path integral molecular dynamics simulations are used to compute the free-energy profiles for the diffusion of H2 and D2 as a function of temperature. Ring polymer molecular dynamics rate theory, incorporating both exact quantum statistics and approximate quantum dynamical effects, is utilized in the calculations of the H2 and D2 diffusion rates in a broad temperature interval. We find that the shape of the quantum free-energy profiles and their height relative to the classical free energy barriers at a given temperature, as well as the rate of diffusion, are strongly affected by competing quantum effects: above 25 K, zero-point energy (ZPE) perpendicular to the reaction path for diffusion between cavities decreases the quantum rate compared to the classical rate, whereas at lower temperatures tunneling outcompetes the ZPE and as a result the quantum rate is greater than the classical rate.

  2. Classical Physics and the Bounds of Quantum Correlations.

    PubMed

    Frustaglia, Diego; Baltanás, José P; Velázquez-Ahumada, María C; Fernández-Prieto, Armando; Lujambio, Aintzane; Losada, Vicente; Freire, Manuel J; Cabello, Adán

    2016-06-24

    A unifying principle explaining the numerical bounds of quantum correlations remains elusive, despite the efforts devoted to identifying it. Here, we show that these bounds are indeed not exclusive to quantum theory: for any abstract correlation scenario with compatible measurements, models based on classical waves produce probability distributions indistinguishable from those of quantum theory and, therefore, share the same bounds. We demonstrate this finding by implementing classical microwaves that propagate along meter-size transmission-line circuits and reproduce the probabilities of three emblematic quantum experiments. Our results show that the "quantum" bounds would also occur in a classical universe without quanta. The implications of this observation are discussed.
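
    For concreteness, the best-known such bound is the CHSH one (standard definitions, not notation specific to this experiment): local hidden-variable models obey |S| ≤ 2, while quantum theory, and, per this work, also classical wave models with compatible measurements, reaches the Tsirelson bound:

        S = E(a, b) + E(a, b') + E(a', b) - E(a', b'), \qquad
        |S|_{\mathrm{LHV}} \le 2, \qquad |S|_{\mathrm{QM}} \le 2\sqrt{2}.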

  3. Open or closed? Dirac, Heisenberg, and the relation between classical and quantum mechanics

    NASA Astrophysics Data System (ADS)

    Bokulich, Alisa

    2004-09-01

    This paper describes a long-standing, though little known, debate between Dirac and Heisenberg over the nature of scientific methodology, theory change, and intertheoretic relations. Following Heisenberg's terminology, their disagreements can be summarized as a debate over whether the classical and quantum theories are "open" or "closed." A close examination of this debate sheds new light on the philosophical views of two of the great founders of quantum theory.

  4. The role of a posteriori mathematics in physics

    NASA Astrophysics Data System (ADS)

    MacKinnon, Edward

    2018-05-01

    The calculus that co-evolved with classical mechanics relied on definitions of functions and differentials that accommodated physical intuitions. In the early nineteenth century mathematicians began the rigorous reformulation of calculus and eventually succeeded in putting almost all of mathematics on a set-theoretic foundation. Physicists traditionally ignore this rigorous mathematics. Physicists often rely on a posteriori math, a practice of using physical considerations to determine mathematical formulations. This is illustrated by examples from classical and quantum physics. A justification of such practice stems from a consideration of the role of phenomenological theories in classical physics and effective theories in contemporary physics. This relates to the larger question of how physical theories should be interpreted.

  5. The Lack of Chemical Equilibrium does not Preclude the Use of the Classical Nucleation Theory in Circumstellar Outflows

    NASA Technical Reports Server (NTRS)

    Paquette, John A.; Nuth, Joseph A., III

    2011-01-01

    Classical nucleation theory has been used in models of dust nucleation in circumstellar outflows around oxygen-rich asymptotic giant branch stars. One objection to the application of classical nucleation theory (CNT) to astrophysical systems of this sort is that an equilibrium distribution of clusters (assumed by CNT) is unlikely to exist in such conditions due to a low collision rate of condensable species. A model of silicate grain nucleation and growth was modified to evaluate the effect of a nucleation flux orders of magnitude below the equilibrium value. The results show that a lack of chemical equilibrium has only a small effect on the ultimate grain distribution.
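
    A minimal sketch of the classical nucleation theory expressions at issue, in their standard textbook form (σ the surface tension, v the monomer volume, S the supersaturation ratio, J_0 a kinetic prefactor; this is not the paper's specific silicate parameterization):

        \Delta G^* = \frac{16 \pi \sigma^3 v^2}{3 (k_B T \ln S)^2}, \qquad
        J = J_0 \exp\!\left( -\frac{\Delta G^*}{k_B T} \right).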

  6. S-Duality, Deconstruction and Confinement for a Marginal Deformation of N=4 SUSY Yang-Mills

    NASA Astrophysics Data System (ADS)

    Dorey, Nick

    2004-08-01

    We study an exactly marginal deformation of N = 4 SUSY Yang-Mills with gauge group U(N) using field theory and string theory methods. The classical theory has a Higgs branch for rational values of the deformation parameter. We argue that the quantum theory also has an S-dual confining branch which cannot be seen classically. The low-energy effective theory on these branches is a six-dimensional non-commutative gauge theory with sixteen supercharges. Confinement of magnetic and electric charges, on the Higgs and confining branches respectively, occurs due to the formation of BPS-saturated strings in the low energy theory. The results also suggest a new way of deconstructing Little String Theory as a large-N limit of a confining gauge theory in four dimensions.

  7. High-pressure phase transitions - Examples of classical predictability

    NASA Astrophysics Data System (ADS)

    Celebonovic, Vladan

    1992-09-01

    The applicability of the Savic and Kasanin (1962-1967) classical theory of dense matter to laboratory experiments requiring estimates of high-pressure phase transitions was examined by determining phase transition pressures for a set of 19 chemical substances (including elements, hydrocarbons, metal oxides, and salts) for which experimental data were available. A comparison between experimental transition points and those predicted by the Savic-Kasanin theory showed that the theory can be used for estimating values of transition pressures. The results also support conclusions obtained in previous astronomical applications of the Savic-Kasanin theory.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khrennikov, Andrei

    We present fundamentals of a prequantum model with hidden variables of the classical field type. In some sense this is a comeback of classical wave mechanics. Our approach can also be considered as an incorporation of quantum mechanics into classical signal theory. All quantum averages (including correlations of entangled systems) can be represented as classical signal averages and correlations.

  9. Classicality condition on a system observable in a quantum measurement and a relative-entropy conservation law

    NASA Astrophysics Data System (ADS)

    Kuramochi, Yui; Ueda, Masahito

    2015-03-01

    We consider the information flow on a system observable X corresponding to a positive-operator-valued measure under a quantum measurement process Y described by a completely positive instrument from the viewpoint of the relative entropy. We establish a sufficient condition for the relative-entropy conservation law which states that the average decrease in the relative entropy of the system observable X equals the relative entropy of the measurement outcome of Y, i.e., the information gain due to measurement. This sufficient condition is interpreted as an assumption of classicality in the sense that there exists a sufficient statistic in a joint successive measurement of Y followed by X such that the probability distribution of the statistic coincides with that of a single measurement of X for the premeasurement state. We show that in the case when X is a discrete projection-valued measure and Y is discrete, the classicality condition is equivalent to the relative-entropy conservation for arbitrary states. The general theory on the relative-entropy conservation is applied to typical quantum measurement models, namely, quantum nondemolition measurement, destructive sharp measurements on two-level systems, photon counting, quantum counting, and homodyne and heterodyne measurements. These examples, except for the nondemolition and photon-counting measurements, do not satisfy the known Shannon-entropy conservation law proposed by Ban [M. Ban, J. Phys. A: Math. Gen. 32, 1643 (1999), 10.1088/0305-4470/32/9/012], implying that our approach based on the relative entropy is applicable to a wider class of quantum measurements.

  10. Predictions of homogeneous nucleation rates for n-alkanes accounting for the diffuse phase interface and capillary waves.

    PubMed

    Planková, Barbora; Vinš, Václav; Hrubý, Jan

    2017-10-28

    Homogeneous droplet nucleation has been studied for almost a century but has not yet been fully understood. In this work, we used the density gradient theory (DGT) and considered the influence of capillary waves (CWs) on the predicted size-dependent surface tensions and nucleation rates for selected n-alkanes. The DGT model was completed by an equation of state (EoS) based on the perturbed-chain statistical associating fluid theory and compared to the classical nucleation theory and the Peng-Robinson EoS. It was found that the critical clusters are practically free of CWs because they are so small that even the smallest wavelengths of CWs do not fit into their finite dimensions. The CWs contribute to the entropy of the system and thus decrease the surface tension. A correction for the effect of CWs on the surface tension is presented. The effect of the different EoSs is relatively small because by a fortuitous coincidence their predictions are similar in the relevant range of critical cluster sizes. The difference of the DGT predictions to the classical nucleation theory computations is important but not decisive. Of the effects investigated, the most pronounced is the suppression of CWs which causes a sizable decrease of the predicted nucleation rates. The major difference between experimental nucleation rate data and theoretical predictions remains in the temperature dependence. For normal alkanes, this discrepancy is much stronger than observed, e.g., for water. Theoretical corrections developed here have a minor influence on the temperature dependency. We provide empirical equations correcting the predicted nucleation rates to values comparable with experiments.

  12. Statistics of resonances for a class of billiards on the Poincaré half-plane

    NASA Astrophysics Data System (ADS)

    Howard, P. J.; Mota-Furtado, F.; O'Mahony, P. F.; Uski, V.

    2005-12-01

    The lower boundary of Artin's billiard on the Poincaré half-plane is continuously deformed to generate a class of billiards with classical dynamics varying from fully integrable to completely chaotic. The quantum scattering problem in these open billiards is described and the statistics of both real and imaginary parts of the resonant momenta are investigated. The evolution of the resonance positions is followed as the boundary is varied, which leads to large changes in their distribution. The transition to arithmetic chaos in Artin's billiard, which is responsible for the Poissonian level-spacing statistics of the bound states in the continuum (cusp forms) at the same time as the formation of a set of resonances all with width 1/4 and real parts determined by the zeros of Riemann's zeta function, is closely examined. Regimes are found which obey the universal predictions of random matrix theory (RMT) as well as exhibiting non-universal long-range correlations. The Brody parameter is used to describe the transitions between different regimes.

  13. Experience and Explanation: Using Videogames to Prepare Students for Formal Instruction in Statistics

    NASA Astrophysics Data System (ADS)

    Arena, Dylan A.; Schwartz, Daniel L.

    2014-08-01

    Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics, where people's everyday experiences often conflict with normative statistical theories and a videogame might provide an alternate set of experiences for students to draw upon. The research used a game called Stats Invaders!, a variant of the classic videogame Space Invaders. In Stats Invaders!, the locations of descending alien invaders follow probability distributions, and players need to infer the shape of the distributions to play well. The experiment tested whether the game developed participants' intuitions about the structure of random events and thereby prepared them for future learning from a subsequent written passage on probability distributions. Community-college students who played the game and then read the passage learned more than participants who only read the passage.

  14. On Fluctuations of Eigenvalues of Random Band Matrices

    NASA Astrophysics Data System (ADS)

    Shcherbina, M.

    2015-10-01

    We consider the fluctuations of linear eigenvalue statistics of random band matrices with i.i.d. entries possessing a sufficiently high finite moment, where the band profile function u has finite support, so that M has nonzero entries only on a number of diagonals of order b. The parameter b (called the bandwidth) is assumed to grow with n. Without any additional assumptions on the growth of b we prove the CLT for linear eigenvalue statistics for a rather wide class of test functions. Thus we improve and generalize the results of the previous papers (Jana et al., arXiv:1412.2445; Li et al., Random Matrices 2:04, 2013), where the CLT was proven under a stronger growth assumption on b. Moreover, we develop a method which allows one to prove automatically the CLT for linear eigenvalue statistics of smooth test functions for almost all classical models of random matrix theory: deformed Wigner and sample covariance matrices, sparse matrices, diluted random matrices, matrices with heavy tails, etc.
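
    In standard notation (ours, not the paper's), the object of study is the linear eigenvalue statistic of a test function f over the eigenvalues λ_1, …, λ_n of M, and the CLT asserts Gaussian fluctuations around its mean:

        \mathcal{N}_n[f] = \sum_{j=1}^{n} f(\lambda_j), \qquad
        \mathcal{N}_n[f] - \mathbb{E}\,\mathcal{N}_n[f] \xrightarrow{\;d\;} \mathcal{N}(0, V[f]).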

  15. 3D polarisation speckle as a demonstration of tensor version of the van Cittert-Zernike theorem for stochastic electromagnetic beams

    NASA Astrophysics Data System (ADS)

    Ma, Ning; Zhao, Juan; Hanson, Steen G.; Takeda, Mitsuo; Wang, Wei

    2016-10-01

    Laser speckle has been the subject of extensive study of its basic properties and associated applications. In the majority of research on speckle phenomena, the random optical field has been treated as a scalar field, and the main interest has been concentrated on the statistical properties and applications of its intensity distribution. Recently, statistical properties of random electric vector fields, referred to as polarization speckle, have come to attract new interest because of their importance in a variety of areas with practical applications such as biomedical optics and optical metrology. Statistical phenomena of random electric vector fields have close relevance to the theories of speckle, polarization, and coherence. In this paper, we investigate the correlation tensor for stochastic electromagnetic fields modulated by a depolarizer consisting of a rough-surfaced retardation plate. Under the assumption that the microstructure of the scattering surface on the depolarizer is so fine as to be unresolvable in our observation region, we have derived a relationship between the polarization matrix/coherency matrix for the modulated electric fields behind the rough-surfaced retardation plate and the coherence matrix under the free-space geometry. This relation is regarded as entirely analogous to the van Cittert-Zernike theorem of classical coherence theory. Within the paraxial approximation as represented by the ABCD-matrix formalism, the three-dimensional structure of the generated polarization speckle is investigated based on the correlation tensor, indicating a typical carrot structure with a much longer axial dimension than its transverse extent.

  16. Quantum-Like Bayesian Networks for Modeling Decision Making

    PubMed Central

    Moreira, Catarina; Wichert, Andreas

    2016-01-01

    In this work, we explore an alternative quantum structure to perform quantum probabilistic inferences to accommodate the paradoxical findings of the Sure Thing Principle. We propose a Quantum-Like Bayesian Network, which consists of replacing classical probabilities by quantum probability amplitudes. However, since this approach suffers from the problem of exponential growth of quantum parameters, we also propose a similarity heuristic that automatically fits quantum parameters through vector similarities. This makes the proposed model general and predictive in contrast to the current state of the art models, which cannot be generalized for more complex decision scenarios and which provide only explanatory accounts of the observed paradoxes. In the end, the model that we propose consists of a nonparametric method for estimating inference effects from a statistical point of view. It is a statistical model that is simpler than the previous quantum dynamic and quantum-like models proposed in the literature. We tested the proposed network with several empirical data from the literature, mainly from the Prisoner's Dilemma game and the Two Stage Gambling game. The results obtained show that the proposed quantum Bayesian Network is a general method that can accommodate violations of the laws of classical probability theory and make accurate predictions regarding human decision-making in these scenarios. PMID:26858669
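
    A minimal numerical sketch of the core idea, on a hypothetical two-node network with numbers of our choosing (this is not the authors' code; in the paper the phases are set by the similarity heuristic and outcome probabilities are renormalized):

    import numpy as np

    # Hypothetical network A -> B with binary A (all numbers are illustrative).
    p_a = np.array([0.5, 0.5])           # P(A = a)
    p_b_given_a = np.array([0.7, 0.4])   # P(B = 1 | A = a)

    # Classical law of total probability: marginalizing over A, no interference.
    p_classical = np.sum(p_a * p_b_given_a)

    # Quantum-like version: amplitudes sqrt(P) with free phases theta_a; the
    # marginal comes from |sum of amplitudes|^2 and acquires an interference
    # term 2*sqrt(p0*p1)*cos(theta_0 - theta_1).
    theta = np.array([0.0, 1.2])         # illustrative phases
    amps = np.sqrt(p_a * p_b_given_a) * np.exp(1j * theta)
    p_quantum_like = np.abs(amps.sum()) ** 2

    print(p_classical, p_quantum_like)   # 0.55 vs. ~0.74: the interference shift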

  17. Uniting the Spheres: Modern Feminist Theory and Classic Texts in AP English

    ERIC Educational Resources Information Center

    Drew, Simao J. A.; Bosnic, Brenda G.

    2008-01-01

    High school teachers Simao J. A. Drew and Brenda G. Bosnic help familiarize students with gender role analysis and feminist theory. Students examine classic literature and contemporary texts, considering characters' historical, literary, and social contexts while expanding their understanding of how patterns of identity and gender norms exist and…

  18. Aesthetic Creativity: Insights from Classical Literary Theory on Creative Learning

    ERIC Educational Resources Information Center

    Hellstrom, Tomas Georg

    2011-01-01

    This paper addresses the subject of textual creativity by drawing on work done in classical literary theory and criticism, specifically new criticism, structuralism and early poststructuralism. The question of how readers and writers engage creatively with the text is closely related to educational concerns, though they are often thought of as…

  19. Assessing the Performance of Classical Test Theory Item Discrimination Estimators in Monte Carlo Simulations

    ERIC Educational Resources Information Center

    Bazaldua, Diego A. Luna; Lee, Young-Sun; Keller, Bryan; Fellers, Lauren

    2017-01-01

    The performance of various classical test theory (CTT) item discrimination estimators has been compared in the literature using both empirical and simulated data, resulting in mixed results regarding the preference of some discrimination estimators over others. This study analyzes the performance of various item discrimination estimators in CTT:…

  20. Louis Guttman's Contributions to Classical Test Theory

    ERIC Educational Resources Information Center

    Zimmerman, Donald W.; Williams, Richard H.; Zumbo, Bruno D.; Ross, Donald

    2005-01-01

    This article focuses on Louis Guttman's contributions to the classical theory of educational and psychological tests, one of the lesser known of his many contributions to quantitative methods in the social sciences. Guttman's work in this field provided a rigorous mathematical basis for ideas that, for many decades after Spearman's initial work,…

  1. Generalization of the Activated Complex Theory of Reaction Rates. II. Classical Mechanical Treatment

    DOE R&D Accomplishments Database

    Marcus, R. A.

    1964-01-01

    In its usual classical form activated complex theory assumes a particular expression for the kinetic energy of the reacting system -- one associated with a rectilinear motion along the reaction coordinate. The derivation of the rate expression given in the present paper is based on the general kinetic energy expression.

  2. Analogy between electromagnetic potentials and wave-like dynamic variables with connections to quantum theory

    NASA Astrophysics Data System (ADS)

    Yang, Chen

    2018-05-01

    The transitions from classical theories to quantum theories have attracted much interest. This paper demonstrates the analogy between the electromagnetic potentials and wave-like dynamic variables, with their connections to quantum theory, for audiences at the advanced undergraduate level and above. In the first part, the counterpart relations in classical electrodynamics (e.g., gauge transform and Lorenz condition) and classical mechanics (e.g., Legendre transform and free particle condition) are presented. These relations lead to similar governing equations for the field variables and dynamic variables. The Lorenz gauge, scalar potential and vector potential manifest a one-to-one similarity to the action, Hamiltonian and momentum, respectively. In the second part, the connections between the classical pictures of the electromagnetic field and the particle and the quantum picture are presented. By characterising the states of the electromagnetic field and the particle via their corresponding variables, the two evolution pictures manifest the same algebraic structure (they are isomorphic). Subsequently, the pictures of the electromagnetic field and the particle are compared to the quantum picture and their interconnections are given. A brief summary of the obtained results is presented at the end of the paper.

  3. Quantum-correlation breaking channels, quantum conditional probability and Perron-Frobenius theory

    NASA Astrophysics Data System (ADS)

    Chruściński, Dariusz

    2013-03-01

    Using the quantum analog of conditional probability and classical Bayes theorem we discuss some aspects of particular entanglement breaking channels: quantum-classical and classical-classical channels. Applying the quantum analog of Perron-Frobenius theorem we generalize the recent result of Korbicz et al. (2012) [8] on full and spectrum broadcasting from quantum-classical channels to arbitrary quantum channels.

  4. Operator Formulation of Classical Mechanics.

    ERIC Educational Resources Information Center

    Cohn, Jack

    1980-01-01

    Discusses the construction of an operator formulation of classical mechanics which is directly concerned with wave packets in configuration space and is more similar to that of conventional quantum theory than other extant operator formulations of classical mechanics. (Author/HM)

  5. Active control on high-order coherence and statistic characterization on random phase fluctuation of two classical point sources.

    PubMed

    Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan

    2016-03-29

    Young's double-slit or two-beam interference is of fundamental importance for understanding various interference effects, in which the stationary phase difference between two beams plays the key role in first-order coherence. Different from the case of first-order coherence, in high-order optical coherence the statistical behavior of the optical phase plays the key role. In this article, by employing a fundamental interfering configuration with two classical point sources, we show that the high-order optical coherence between two classical point sources can be actively designed by controlling the statistical behavior of the relative phase difference between the two point sources. Synchronous-position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, in which λ is the wavelength of the point sources and M is an integer not larger than N. Interestingly, we found that the synchronous-position Nth-order interference fringe fingerprints the statistical trace of the random phase fluctuation of the two classical point sources and therefore provides an effective way to characterize the statistical properties of phase fluctuation for incoherent light sources.
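
    A toy Monte Carlo illustration of the mechanism, under our own simplified assumptions (two unit-amplitude point sources, relative phase Δφ restricted to {0, π}; this is not the authors' setup): first-order fringes average out, while the synchronous-position second-order correlation retains fringes of period λ/2.

    import numpy as np

    rng = np.random.default_rng(0)
    alpha = np.linspace(0.0, 4.0 * np.pi, 400)  # alpha = k*d*x/z, fringe phase across the screen
    dphi = rng.choice([0.0, np.pi], size=5000)  # controlled relative-phase statistics

    # Shot-by-shot intensity of two interfering unit-amplitude point sources.
    I = 2.0 * (1.0 + np.cos(alpha[None, :] + dphi[:, None]))

    g1 = I.mean(axis=0)          # first-order pattern: ~2 everywhere (fringes washed out)
    g2 = (I ** 2).mean(axis=0)   # synchronous second order: ~6 + 2*cos(2*alpha)

    print(g1.std())              # ~0: no first-order fringes
    print(g2.max(), g2.min())    # ~8 and ~4: lambda/2 fringes survive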

  7. Extended theory of harmonic maps connects general relativity to chaos and quantum mechanism

    DOE PAGES

    Ren, Gang; Duan, Yi-Shi

    2017-07-20

    General relativity and quantum mechanics are two separate pillars of modern physics explaining how nature works. Both theories are accurate, but a direct connection between them has not yet been clarified. Recently, researchers have blurred the line between classical and quantum physics by connecting chaos and entanglement. Here we show that Duan's extended harmonic map theory, which admits the solutions of general relativity, also admits the solutions of classic chaos equations and even the solution of the Schrödinger equation in quantum physics, suggesting that the extended theory of harmonic maps may act as a universal theory of physics.

  8. Computation in generalised probabilistic theories

    NASA Astrophysics Data System (ADS)

    Lee, Ciarán M.; Barrett, Jonathan

    2015-08-01

    From the general difficulty of simulating quantum systems using classical systems, and in particular the existence of an efficient quantum algorithm for factoring, it is likely that quantum computation is intrinsically more powerful than classical computation. At present, the best upper bound known for the power of quantum computation is that BQP ⊆ AWPP, where AWPP is a classical complexity class (known to be included in PP, hence PSPACE). This work investigates limits on computational power that are imposed by simple physical, or information theoretic, principles. To this end, we define a circuit-based model of computation in a class of operationally-defined theories more general than quantum theory, and ask: what is the minimal set of physical assumptions under which the above inclusions still hold? We show that given only an assumption of tomographic locality (roughly, that multipartite states and transformations can be characterized by local measurements), efficient computations are contained in AWPP. This inclusion still holds even without assuming a basic notion of causality (where the notion is, roughly, that probabilities for outcomes cannot depend on future measurement choices). Following Aaronson, we extend the computational model by allowing post-selection on measurement outcomes. Aaronson showed that the corresponding quantum complexity class, PostBQP, is equal to PP. Given only the assumption of tomographic locality, the inclusion in PP still holds for post-selected computation in general theories. Hence in a world with post-selection, quantum theory is optimal for computation in the space of all operational theories. We then consider whether one can obtain relativized complexity results for general theories. It is not obvious how to define a sensible notion of a computational oracle in the general framework that reduces to the standard notion in the quantum case. Nevertheless, it is possible to define computation relative to a 'classical oracle'. Then, we show there exists a classical oracle relative to which efficient computation in any theory satisfying the causality assumption does not include NP.

  9. Statistical inference and Aristotle's Rhetoric.

    PubMed

    Macdonald, Ranald R

    2004-11-01

    Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.

  10. Opening Switch Research on a Plasma Focus VI.

    DTIC Science & Technology

    1988-02-26

    Sausage Instability in the Plasma Focus. In this section the classical Kruskal-Schwarzschild theory for the sausage mode is applied to the pinch phase... on (1) the shape of the pinch, (2) axial flow of plasma, and (3) self-generated magnetic fields are also presented. The Kruskal-Schwarzschild Theory. The... classical MHD theory for the m=0 mode in a plasma supported by a magnetic field against gravity; this is the well-known Kruskal-Schwarzschild

  11. Nanoscale Capillary Flows in Alumina: Testing the Limits of Classical Theory.

    PubMed

    Lei, Wenwen; McKenzie, David R

    2016-07-21

    Anodic aluminum oxide (AAO) membranes have well-formed cylindrical channels, as small as 10 nm in diameter, in a close packed hexagonal array. The channels in AAO membranes simulate very small leaks that may be present for example in an aluminum oxide device encapsulation. The 10 nm alumina channel is the smallest that has been studied to date for its moisture flow properties and provides a stringent test of classical capillary theory. We measure the rate at which moisture penetrates channels with diameters in the range of 10 to 120 nm with moist air present at 1 atm on one side and dry air at the same total pressure on the other. We extend classical theory for water leak rates at high humidities by allowing for variable meniscus curvature at the entrance and show that the extended theory explains why the flow increases greatly when capillary filling occurs and enables the contact angle to be determined. At low humidities our measurements for air-filled channels agree well with theory for the interdiffusive flow of water vapor in air. The flow rate of water-filled channels is one order of magnitude less than expected from classical capillary filling theory and is coincidentally equal to the helium flow rate, validating the use of helium leak testing for evaluating moisture flows in aluminum oxide leaks.
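
    As a rough orientation (our notation, not the paper's), the low-humidity interdiffusive regime that the measurements confirm is Fickian: the steady-state flow through a channel of cross-section A and length L under a vapor concentration difference Δc is

        Q = \frac{D \, A \, \Delta c}{L},

    with D the diffusivity of water vapor in air; in channels comparable to the molecular mean free path, D is replaced by an effective Knudsen diffusivity.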

  12. Statistical Mechanics of Disordered Systems - Series: Cambridge Series in Statistical and Probabilistic Mathematics (No. 18)

    NASA Astrophysics Data System (ADS)

    Bovier, Anton

    2006-06-01

    Our mathematical understanding of the statistical mechanics of disordered systems is going through a period of stunning progress. This self-contained book is a graduate-level introduction for mathematicians and for physicists interested in the mathematical foundations of the field, and can be used as a textbook for a two-semester course on mathematical statistical mechanics. It assumes only basic knowledge of classical physics and, on the mathematics side, a good working knowledge of graduate-level probability theory. The book starts with a concise introduction to statistical mechanics, proceeds to disordered lattice spin systems, and concludes with a presentation of the latest developments in the mathematical understanding of mean-field spin glass models. In particular, recent progress towards a rigorous understanding of the replica symmetry-breaking solutions of the Sherrington-Kirkpatrick spin glass models, due to Guerra, Aizenman-Sims-Starr and Talagrand, is reviewed in some detail. The book offers a comprehensive introduction to this active and fascinating area of research, with a clear exposition that builds to the state of the art in the mathematics of spin glasses, written by a well-known and active researcher in the field.

  13. Reversed inverse regression for the univariate linear calibration and its statistical properties derived using a new methodology

    NASA Astrophysics Data System (ADS)

    Kang, Pilsang; Koo, Changhoi; Roh, Hokyu

    2017-11-01

    Since simple linear regression theory was established at the beginning of the 1900s, it has been used in a variety of fields. Unfortunately, it cannot be used directly for calibration. In practical calibrations, the observed measurements (the inputs) are subject to errors, and hence they vary, thus violating the assumption that the inputs are fixed. Therefore, in the case of calibration, the regression line fitted using the method of least squares is not consistent with the statistical properties of simple linear regression as already established based on this assumption. To resolve this problem, "classical regression" and "inverse regression" have been proposed. However, they do not completely resolve the problem. As a fundamental solution, we introduce "reversed inverse regression" along with a new methodology for deriving its statistical properties. In this study, the statistical properties of this regression are derived using the "error propagation rule" and the "method of simultaneous error equations" and are compared with those of the existing regression approaches. The accuracy of the statistical properties thus derived is investigated in a simulation study. We conclude that the newly proposed regression and methodology constitute the complete regression approach for univariate linear calibrations.
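
    For orientation, a schematic comparison (assumptions and numbers ours) of the two textbook estimators discussed above; the paper's reversed inverse regression is a further refinement not reproduced here:

    import numpy as np

    rng = np.random.default_rng(1)
    x_true = np.linspace(0.0, 10.0, 25)                              # reference standards
    y_obs = 2.0 * x_true + 1.0 + rng.normal(0.0, 0.5, x_true.size)   # noisy instrument response

    # Classical estimator: fit y = a + b*x on the standards, then invert.
    b, a = np.polyfit(x_true, y_obs, 1)
    # Inverse estimator: fit x = c + d*y directly, then read off.
    d, c = np.polyfit(y_obs, x_true, 1)

    y0 = 11.0                 # a new instrument reading to be calibrated
    print((y0 - a) / b)       # classical calibration estimate of x0
    print(c + d * y0)         # inverse calibration estimate of x0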

  14. Scaling and stochastic cascade properties of NEMO oceanic simulations and their potential value for GCM evaluation and downscaling

    NASA Astrophysics Data System (ADS)

    Verrier, Sébastien; Crépon, Michel; Thiria, Sylvie

    2014-09-01

    Spectral scaling properties have already been evidenced in oceanic numerical simulations and have been subject to several interpretations. They can be used to evaluate classical turbulence theories that predict scaling with specific exponents and to evaluate the quality of GCM outputs from a statistical and multiscale point of view. However, a more complete framework based on multifractal cascades is able to generalize the classical but restrictive second-order spectral framework to other moment orders, providing an accurate description of the probability distributions of the fields at multiple scales. The predictions of this formalism still needed systematic verification in oceanic GCMs, while their atmospheric counterparts have recently been confirmed by several papers. The present paper is devoted to a systematic analysis of several oceanic fields produced by the NEMO oceanic GCM. Attention is focused on regional, idealized configurations that permit evaluation of the NEMO engine core from a scaling point of view without the limitations imposed by land masks. Based on classical multifractal analysis tools, multifractal properties were evidenced for several oceanic state variables (sea surface temperature and salinity, velocity components, etc.). While first-order structure functions estimated a different nonconservativity parameter H in two scaling ranges, the multiorder statistics of turbulent fluxes were scaling over almost the whole available scaling range. This multifractal scaling was then parameterized with the help of the universal multifractal framework, providing parameters that are consistent with the existing empirical literature. Finally, we argue that knowledge of these properties may be useful for oceanographers. The framework seems very well suited for the statistical evaluation of OGCM outputs. Moreover, it also provides practical solutions for simulating subpixel variability stochastically for GCM downscaling purposes. As an independent perspective, the existence of multifractal properties in oceanic flows also seems interesting for investigating scale dependencies in remote sensing inversion algorithms.
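
    A minimal sketch of the kind of diagnostic described above (our own illustration on synthetic data, not the NEMO analysis): empirical structure functions S_q(l) = <|x(t+l) - x(t)|^q>, whose log-log slopes ζ(q) are linear in q for monofractal fields and concave for multifractal ones.

    import numpy as np

    def structure_functions(x, lags, qs):
        """Return S_q(l) for a 1-D field sampled on a uniform grid."""
        S = np.empty((len(qs), len(lags)))
        for j, lag in enumerate(lags):
            incr = np.abs(x[lag:] - x[:-lag])
            for i, q in enumerate(qs):
                S[i, j] = np.mean(incr ** q)
        return S

    # Demo on a fractional-Brownian-like surrogate (a stand-in for an SST transect).
    rng = np.random.default_rng(2)
    x = np.cumsum(rng.normal(size=2**14))   # random walk, Hurst exponent H = 0.5
    lags = np.array([2, 4, 8, 16, 32, 64, 128])
    qs = np.array([1.0, 2.0, 3.0])
    S = structure_functions(x, lags, qs)
    zetas = [np.polyfit(np.log(lags), np.log(S[i]), 1)[0] for i in range(len(qs))]
    print(zetas)   # for this monofractal surrogate, zeta(q) is close to q*H = q/2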

  15. Hamilton-Jacobi theory in multisymplectic classical field theories

    NASA Astrophysics Data System (ADS)

    de León, Manuel; Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso; Vilariño, Silvia

    2017-09-01

    The geometric framework for the Hamilton-Jacobi theory developed in the studies of Cariñena et al. [Int. J. Geom. Methods Mod. Phys. 3(7), 1417-1458 (2006)], Cariñena et al. [Int. J. Geom. Methods Mod. Phys. 13(2), 1650017 (2015)], and de León et al. [Variations, Geometry and Physics (Nova Science Publishers, New York, 2009)] is extended to multisymplectic first-order classical field theories. The Hamilton-Jacobi problem is stated for the Lagrangian and the Hamiltonian formalisms of these theories as a particular case of a more general problem, and the classical Hamilton-Jacobi equation for field theories is recovered from this geometrical setting. Particular and complete solutions to these problems are defined and characterized in several equivalent ways in both formalisms, and the equivalence between them is proved. The use of distributions in jet bundles that represent the solutions to the field equations is the fundamental tool in this formulation. Some examples are analyzed and, in particular, the Hamilton-Jacobi equation for non-autonomous mechanical systems is obtained as a special case of our results.
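
    For orientation, the non-autonomous mechanical special case mentioned at the end takes the familiar textbook form (standard notation, our addition, not the paper's intrinsic geometric formulation):

        \frac{\partial S}{\partial t} + H\!\left(q^i, \frac{\partial S}{\partial q^i}, t\right) = 0,

    whose solutions S(q^i, t) generate the dynamics of the system.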

  16. Quantum theory of the classical: quantum jumps, Born's Rule and objective classical reality via quantum Darwinism.

    PubMed

    Zurek, Wojciech Hubert

    2018-07-13

    The emergence of the classical world from the quantum substrate of our Universe is a long-standing conundrum. In this paper, I describe three insights into the transition from quantum to classical that are based on the recognition of the role of the environment. I begin with the derivation of preferred sets of states that help to define what exists: our everyday classical reality. They emerge as a result of the breaking of the unitary symmetry of the Hilbert space, which happens when the unitarity of quantum evolutions encounters the nonlinearities inherent in the process of amplification, that is, of replicating information. This derivation is accomplished without the usual tools of decoherence, and accounts for the appearance of quantum jumps and the emergence of preferred pointer states consistent with those obtained via environment-induced superselection, or einselection. The pointer states obtained in this way determine what can happen, defining events, without appealing to Born's Rule for probabilities. Therefore, p_k = |ψ_k|^2 can now be deduced from the entanglement-assisted invariance, or envariance, a symmetry of entangled quantum states. With probabilities at hand, one also gains new insights into the foundations of quantum statistical physics. Moreover, one can now analyse the information flows responsible for decoherence. These information flows explain how the perception of objective classical reality arises from the quantum substrate: the effective amplification that they represent accounts for the objective existence of the einselected states of macroscopic quantum systems through the redundancy of pointer state records in their environment, that is, through quantum Darwinism. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).

  17. Does Quantum Mechanics Force a Drastic Change in Our View of the World? Thoughts and Experiments after Einstein, Podolsky and Rosen

    NASA Astrophysics Data System (ADS)

    Frodl, Peter

    From the beginnings of quantum mechanics to the present day, there have been attempts to interpret it as a statistical theory of ensembles of individual classical systems. The conditions under which hidden-variable theories giving deterministic descriptions of these individual systems can be regarded as classical were formulated by Einstein, Podolsky and Rosen in 1935: 1. Physical systems are in principle separable. 2. If it is possible to predict with certainty the value of a physical quantity without disturbing the system under consideration, then there exists an element of physical reality corresponding to that quantity. Taken together, as Bell showed in 1964, these conditions are incompatible in principle with quantum mechanics and no longer tenable in view of recent experiments. These experiments once more corroborate quantum mechanics as the correct theory. In order to understand their results, we must either abandon the assumption, taken for granted in classical physics, that physical systems are separable, or revise our concept of physical reality. An examination of the notion of separability, together with some considerations on the problem of measuring observables and the EPR correlations, shows that a revision of the concept of physical reality is unavoidable. The revised concept should be compatible with both classical physics and quantum mechanics, so as to allow a unified view of the physical world.

  18. Psychometric considerations in the measurement of event-related brain potentials: Guidelines for measurement and reporting.

    PubMed

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Failing to consider psychometric issues related to reliability and validity, differential deficits, and statistical power potentially undermines the conclusions of a study. In research using event-related brain potentials (ERPs), numerous contextual factors (population sampled, task, data recording, analysis pipeline, etc.) can impact the reliability of ERP scores. The present review considers the contextual factors that influence ERP score reliability and the downstream effects that reliability has on statistical analyses. Given the context-dependent nature of ERPs, it is recommended that ERP score reliability be formally assessed on a study-by-study basis. Recommended guidelines for ERP studies include 1) reporting the threshold of acceptable reliability and reliability estimates for observed scores, 2) specifying the approach used to estimate reliability, and 3) justifying how trial-count minima were chosen. A reliability threshold for internal consistency of at least 0.70 is recommended, and a threshold of 0.80 is preferred. The review also advocates the use of generalizability theory for estimating score dependability (the generalizability theory analog to reliability) as an improvement on classical test theory reliability estimates, suggesting that the latter is less well suited to ERP research. To facilitate the calculation and reporting of dependability estimates, an open-source Matlab program, the ERP Reliability Analysis Toolbox, is presented. Copyright © 2016 Elsevier B.V. All rights reserved.
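
    As a minimal illustration of the kind of internal-consistency estimate discussed above (a sketch under simplified assumptions, not the ERP Reliability Analysis Toolbox itself), one can compute a split-half reliability with the Spearman-Brown correction:

        # Minimal sketch: split-half reliability of trial-averaged ERP scores.
        import numpy as np

        def split_half_reliability(trials):
            """trials: (n_subjects, n_trials) array of single-trial scores."""
            half_a = trials[:, 0::2].mean(axis=1)   # even-indexed trials
            half_b = trials[:, 1::2].mean(axis=1)   # odd-indexed trials
            r = np.corrcoef(half_a, half_b)[0, 1]   # half-test correlation
            return 2 * r / (1 + r)                  # Spearman-Brown step-up

        rng = np.random.default_rng(2)
        true_scores = rng.normal(0.0, 1.0, size=(50, 1))
        data = true_scores + rng.normal(0.0, 2.0, size=(50, 60))  # noisy trials
        print(split_half_reliability(data))  # compare with the 0.70/0.80 thresholds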

  19. Thermostatistical description of gas mixtures from space partitions

    NASA Astrophysics Data System (ADS)

    Rohrmann, R. D.; Zorec, J.

    2006-10-01

    The new mathematical framework based on the free energy of pure classical fluids presented by Rohrmann [Physica A 347, 221 (2005)] is extended to multicomponent systems to determine thermodynamic and structural properties of chemically complex fluids. Presently, the theory focuses on D-dimensional mixtures in the low-density limit (packing factor η < 0.01). The formalism combines the free-energy minimization technique with space partitions that assign an available volume v to each particle. v is related to the closeness of the nearest neighbor and provides a useful tool for evaluating the perturbations experienced by particles in a fluid. The theory shows a close relationship between statistical geometry and statistical mechanics. New, unconventional thermodynamic variables and mathematical identities are derived as a result of the space division. Thermodynamic potentials μ_il, the conjugate variables of the populations N_il of particles of class i whose nearest neighbor is of class l, are defined, and their relationships with the usual chemical potentials μ_i are established. Systems of hard spheres are treated as illustrative examples and their thermodynamic functions are derived analytically. The low-density expressions obtained agree nicely with those of scaled-particle theory and the Percus-Yevick approximation. Several pair distribution functions are introduced and evaluated. Analytical expressions are also presented for hard spheres with attractive forces due to Kac tails and square-well potentials. Finally, we derive general chemical equilibrium conditions.

  20. Beyond Classical Information Theory: Advancing the Fundamentals for Improved Geophysical Prediction

    NASA Astrophysics Data System (ADS)

    Perdigão, R. A. P.; Pires, C. L.; Hall, J.; Bloeschl, G.

    2016-12-01

    Information Theory, in its original and quantum forms, has gradually made its way into various fields of science and engineering. From the very basic concepts of Information Entropy and Mutual Information to Transit Information, Interaction Information and their partitioning into statistical synergy, redundancy and exclusivity, the theoretical foundations had matured by the mid-20th century. In the Earth Sciences, various interesting applications have been devised over the last few decades, such as the design of complex process networks of descriptive and/or inferential nature, wherein earth system processes are treated as "nodes" and the statistical relationships between them as information-theoretical "interactions". However, most applications still rely on the very early concepts along with their many caveats, especially in heavily non-Normal, nonlinear and structurally changing scenarios. In order to overcome the traditional limitations of information theory and tackle elusive Earth System phenomena, we introduce a new suite of information-dynamic methodologies towards a more physically consistent and informationally comprehensive framework. The methodological developments are then illustrated on a set of practical examples from geophysical fluid dynamics, where high-order nonlinear relationships elusive to the current nonlinear information measures are aptly captured. In doing so, these advances increase the predictability of critical events such as the emergence of hyper-chaotic regimes in ocean-atmospheric dynamics and the occurrence of hydro-meteorological extremes.
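
    For readers new to the "very basic concepts" named above, a minimal estimator of mutual information from a two-dimensional histogram looks as follows (our own sketch; bin count and data are arbitrary):

        # Minimal sketch: mutual information I(X;Y) in nats from binned data.
        import numpy as np

        def mutual_information(x, y, bins=16):
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = pxy / pxy.sum()                      # joint distribution
            px = pxy.sum(axis=1, keepdims=True)        # marginal of X
            py = pxy.sum(axis=0, keepdims=True)        # marginal of Y
            nz = pxy > 0                               # avoid log(0)
            return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

        rng = np.random.default_rng(3)
        x = rng.normal(size=10_000)
        y = x + rng.normal(scale=0.5, size=10_000)     # coupled toy variables
        print(mutual_information(x, y))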

  1. High-order harmonics measured by the photon statistics of the infrared driving-field exiting the atomic medium.

    PubMed

    Tsatrafyllis, N; Kominis, I K; Gonoskov, I A; Tzallas, P

    2017-04-27

    High-order harmonics in the extreme-ultraviolet spectral range, resulting from the strong-field laser-atom interaction, have been used in a broad range of fascinating applications in all states of matter. In the majority of these studies the harmonic generation process is described using semi-classical theories which treat the electromagnetic field of the driving laser pulse classically, without taking into account its quantum nature. In addition, for the measurement of the generated harmonics, all the experiments require diagnostics in the extreme-ultraviolet spectral region. Here, by treating the driving laser field quantum mechanically, we reveal the quantum-optical nature of the high-order harmonic generation process by measuring the photon number distribution of the infrared light exiting the harmonic generation medium. It is found that the high-order harmonics are imprinted in the photon number distribution of the infrared light and can be recorded without the need for a spectrometer in the extreme-ultraviolet.

  2. Fokker-Planck equation of the reduced Wigner function associated to an Ohmic quantum Langevin dynamics

    NASA Astrophysics Data System (ADS)

    Colmenares, Pedro J.

    2018-05-01

    This article concerns the derivation and solution of the Fokker-Planck equation associated with the momentum-integrated Wigner function of a particle subjected to a harmonic external field in contact with an Ohmic thermal bath of quantum harmonic oscillators. The strategy employed is a simplified version of the phenomenological approach of Schramm, Jung, and Grabert of interpreting the operators as c-numbers to derive the quantum master equation arising from a twofold transformation of the Wigner function of the entire phase space. The statistical properties of the random noise come from the integral functional theory of Grabert, Schramm, and Ingold. By means of a single Wigner transformation, a simpler equation than the one mentioned above is found. The Wigner function reproduces the known results of the classical limit. This allows us to rewrite the underdamped classical Langevin equation as a first-order stochastic differential equation with time-dependent drift and diffusion terms.
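
    The first-order stochastic equation described in the last sentence can be integrated numerically in a few lines; the sketch below is a generic Euler-Maruyama scheme with placeholder (hypothetical) drift and diffusion functions, not the paper's specific coefficients.

        # Minimal sketch: dx = a(x, t) dt + b(t) dW via Euler-Maruyama.
        import numpy as np

        def euler_maruyama(a, b, x0, t_grid, rng):
            x = np.empty_like(t_grid)
            x[0] = x0
            for k in range(len(t_grid) - 1):
                dt = t_grid[k + 1] - t_grid[k]
                dw = rng.normal(0.0, np.sqrt(dt))      # Wiener increment
                x[k + 1] = x[k] + a(x[k], t_grid[k]) * dt + b(t_grid[k]) * dw
            return x

        rng = np.random.default_rng(4)
        t = np.linspace(0.0, 10.0, 2001)
        path = euler_maruyama(lambda x, t: -x,                   # hypothetical drift
                              lambda t: 0.5 * (1 + np.exp(-t)),  # hypothetical diffusion
                              x0=1.0, t_grid=t, rng=rng)
        print(path[-1])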

  3. Statistical nature of infrared dynamics on de Sitter background

    NASA Astrophysics Data System (ADS)

    Tokuda, Junsei; Tanaka, Takahiro

    2018-02-01

    In this study, we formulate a systematic way of deriving an effective equation of motion (EoM) for long-wavelength modes of a massless scalar field with a general potential V(φ) on a de Sitter background, and investigate whether or not the effective EoM can be described as a classical stochastic process. Our formulation extends the usual stochastic formalism to include the sub-leading secular growth coming from the nonlinearity of short-wavelength modes. Applying our formalism to λφ^4 theory, we explicitly derive an effective EoM which correctly recovers the next-to-leading secularly growing part at late times, and show that this effective EoM can be seen as a classical stochastic process. Our extended stochastic formalism can describe all secularly growing terms which appear in all correlation functions with a specific operator ordering. The restriction on the operator ordering is not a significant drawback, because the commutator of a light scalar field becomes negligible at large scales owing to the squeezing.
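
    For context, the leading-order Langevin equation of the standard stochastic formalism that this work extends reads (standard textbook form; the sub-leading corrections are the paper's contribution):

        \frac{d\bar\phi}{dN} = -\frac{V'(\bar\phi)}{3H^2} + \frac{H}{2\pi}\,\xi(N),
        \qquad \langle \xi(N)\,\xi(N') \rangle = \delta(N - N'),

    where N is the number of e-folds and ξ is white noise sourced by short-wavelength modes crossing the horizon.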

  5. Chandrasekhar's dynamical friction and non-extensive statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva, J.M.; Lima, J.A.S.; De Souza, R.E.

    2016-05-01

    The motion of a point-like object of mass M passing through the background potential of massive collisionless particles (m << M) suffers a steady deceleration named dynamical friction. In his classical work, Chandrasekhar assumed a Maxwellian velocity distribution in the halo and neglected the self-gravity of the wake induced by the gravitational focusing of the mass M. In this paper, by relaxing the validity of the Maxwellian distribution due to the presence of long-range forces, we derive an analytical formula for the dynamical friction in the context of the q-nonextensive kinetic theory. In the extensive limiting case (q = 1), the classical Gaussian Chandrasekhar result is recovered. As an application, the dynamical friction timescale for globular clusters spiraling to the galactic center is explicitly obtained. Our results suggest that the problem concerning the large timescale as derived by numerical N-body simulations or semi-analytical models can be understood as a departure from the standard extensive Maxwellian regime as measured by the Tsallis nonextensive q-parameter.
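
    Chandrasekhar's classical Maxwellian result, recovered here in the q → 1 limit, has the standard form (textbook notation, our addition):

        \frac{d\mathbf{v}_M}{dt} = -\frac{4\pi G^2 M \rho \ln\Lambda}{v_M^3}
        \left[ \operatorname{erf}(X) - \frac{2X}{\sqrt{\pi}} e^{-X^2} \right] \mathbf{v}_M,
        \qquad X = \frac{v_M}{\sqrt{2}\,\sigma},

    where ρ is the background density, σ the velocity dispersion, and ln Λ the Coulomb logarithm.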

  6. Experimental Observation of Two Features Unexpected from the Classical Theories of Rubber Elasticity

    NASA Astrophysics Data System (ADS)

    Nishi, Kengo; Fujii, Kenta; Chung, Ung-il; Shibayama, Mitsuhiro; Sakai, Takamasa

    2017-12-01

    Although the elastic modulus of a Gaussian chain network is thought to be successfully described by classical theories of rubber elasticity, such as the affine and phantom models, verification experiments are largely lacking owing to the difficulty of precisely controlling the network structure. We prepared well-defined model polymer networks experimentally, and measured the elastic modulus G for a broad range of polymer concentrations and connectivity probabilities p. In our experiment, we observed two features that were distinct from those predicted by the classical theories. First, we observed the critical behavior G ~ |p - p_c|^{1.95} near the sol-gel transition, where p_c is the sol-gel transition point. This scaling law differs from the prediction of the classical theories, but can be explained by the analogy between the electrical conductivity of resistor networks and the elasticity of polymer networks. Furthermore, we found that the experimental G-p relations in the region above C* did not follow the affine or phantom theories. Instead, all the G/G_0 versus p curves fell onto a single master curve when G was normalized by G_0, the elastic modulus at p = 1. We show that the effective medium approximation for Gaussian chain networks explains this master curve.
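
    For reference, the classical predictions against which the measurements are compared take the standard forms (textbook notation, our addition; ν is the number density of elastically effective strands and f the cross-link functionality):

        G_{\mathrm{affine}} = \nu k_B T,
        \qquad
        G_{\mathrm{phantom}} = \left(1 - \frac{2}{f}\right) \nu k_B T .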

  7. The use of decision analysis to evaluate the economic effects of heat mount detectors in two dairy herds.

    PubMed

    Williamson, N B

    1975-03-01

    This paper reports a decrease in the interval from calving to conception in two commercial dairy herds, associated with the use of KaMaR Heat Mount Detectors. An economic analysis of the results uses a neoclassical decision theory approach to demonstrate that the use of heat mount detectors is likely to be profitable, with an expected net return of $154.18 per 100 calvings. The analysis demonstrates the suitability of a decision-theoretic approach to the analysis of applied research, and illustrates some of the weaknesses of "Classical" statistical analysis in such circumstances.

  8. Predicting the stability of nanodevices

    NASA Astrophysics Data System (ADS)

    Lin, Z. Z.; Yu, W. F.; Wang, Y.; Ning, X. J.

    2011-05-01

    A simple model based on the statistics of single atoms is developed to predict the stability or lifetime of nanodevices without empirical parameters. Under certain conditions, the model reproduces the Arrhenius law and the Meyer-Neldel compensation rule. Compared against classical molecular-dynamics simulations predicting the stability of a monatomic carbon chain at high temperature, the model proves much more accurate than transition-state theory. Based on ab initio calculations of the static potential, the model gives corrected lifetimes of monatomic carbon and gold chains at higher temperature, and predicts that the monatomic chains are very stable at room temperature.
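
    Schematically, the two results the model reproduces are (standard expressions, our notation): an Arrhenius lifetime and the Meyer-Neldel compensation rule tying the attempt prefactor to the barrier height,

        \tau^{-1} = \nu_0 \exp\!\left(-\frac{E_b}{k_B T}\right),
        \qquad
        \nu_0 \propto \exp\!\left(\frac{E_b}{E_{\mathrm{MN}}}\right).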

  9. Renormalization group theory outperforms other approaches in statistical comparison between upscaling techniques for porous media

    NASA Astrophysics Data System (ADS)

    Hanasoge, Shravan; Agarwal, Umang; Tandon, Kunj; Koelman, J. M. Vianney A.

    2017-09-01

    Determining the pressure differential required to achieve a desired flow rate in a porous medium requires solving Darcy's law, a Laplace-like equation with a spatially varying tensor permeability. In various scenarios, the permeability coefficient is sampled at high spatial resolution, which makes solving Darcy's equation numerically prohibitively expensive. As a consequence, much effort has gone into creating upscaled or low-resolution effective models of the coefficient while ensuring that the estimated flow rate is well reproduced, bringing to the fore the classic tradeoff between computational cost and numerical accuracy. Here we perform a statistical study to characterize the relative success of upscaling methods on a large sample of permeability coefficients that are above the percolation threshold. We introduce a technique based on mode-elimination renormalization group theory (MG) to build coarse-scale permeability coefficients. Comparing the results with coefficients upscaled using other methods, we find that MG is consistently more accurate, particularly owing to its ability to address the tensorial nature of the coefficients. As implemented here, MG places low computational demands, and accurate flow-rate estimates are obtained for MG-upscaled permeabilities that approach or lie beyond the percolation threshold.
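
    The equation being upscaled is Darcy's law together with mass conservation (standard form; K is the spatially varying, possibly tensorial, permeability and μ the fluid viscosity):

        \mathbf{q} = -\frac{\mathsf{K}(\mathbf{x})}{\mu} \nabla p,
        \qquad
        \nabla \cdot \mathbf{q} = 0 .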

  10. Classical, Generalizability, and Multifaceted Rasch Detection of Interrater Variability in Large, Sparse Data Sets.

    ERIC Educational Resources Information Center

    MacMillan, Peter D.

    2000-01-01

    Compared classical test theory (CTT), generalizability theory (GT), and multifaceted Rasch model (MFRM) approaches to detecting and correcting for rater variability using responses of 4,930 high school students graded by 3 raters on 9 scales. The MFRM approach identified far more raters as different than did the CTT analysis. GT and Rasch…

  11. Marshaling Resources: A Classic Grounded Theory Study of Online Learners

    ERIC Educational Resources Information Center

    Yalof, Barbara

    2012-01-01

    Students who enroll in online courses comprise one quarter of an increasingly diverse student body in higher education today. Yet, it is not uncommon for an online program to lose over 50% of its enrolled students prior to graduation. This study used a classic grounded theory qualitative methodology to investigate the persistent problem of…

  12. The Integration of Gender into the Teaching of Classical Social Theory: Help from "The Handmaid's Tale."

    ERIC Educational Resources Information Center

    Gotsch-Thomson, Susan

    1990-01-01

    Describes how gender is integrated into a classical social theory course by including a female theorist in the reading assignments and using "The Handmaid's Tale" by Margaret Atwood as the basis for class discussion. Reviews the course objectives and readings; describes the process of the class discussions; and provides student…

  13. Self-Consistent Field Lattice Model for Polymer Networks.

    PubMed

    Tito, Nicholas B; Storm, Cornelis; Ellenbroek, Wouter G

    2017-12-26

    A lattice model based on polymer self-consistent field theory is developed to predict the equilibrium statistics of arbitrary polymer networks. For a given network topology, our approach uses moment propagators on a lattice to self-consistently construct the ensemble of polymer conformations and cross-link spatial probability distributions. Remarkably, the calculation can be performed "in the dark", without any prior knowledge of preferred chain conformations or cross-link positions. Numerical results from the model for a test network exhibit close agreement with molecular dynamics simulations, including when the network is strongly sheared. Our model captures nonaffine deformation, mean-field monomer interactions, cross-link fluctuations, and finite extensibility of chains, yielding predictions that differ markedly from classical rubber elasticity theory for polymer networks. By examining polymer networks with different degrees of interconnectivity, we gain insight into cross-link entropy, an important quantity in the macroscopic behavior of gels and self-healing materials as they are deformed.

  14. Testing for detailed balance in a financial market

    NASA Astrophysics Data System (ADS)

    Fiebig, H. R.; Musgrove, D. P.

    2015-06-01

    We test a historical price-time series from a financial market (the NASDAQ 100 index) for a statistical property known as detailed balance. The presence of detailed balance would imply that the market can be modeled by a stochastic process based on a Markov chain, thus leading to equilibrium. In economic terms, a positive outcome of the test would support the efficient market hypothesis, a cornerstone of neo-classical economic theory. In contrast to its usage in prevalent economic theory, the term equilibrium here is tied to the returns rather than to the price-time series. The test is based on an action functional S constructed from the elements of the detailed balance condition and the historical data set, which is then analyzed by means of simulated annealing. Checks are performed to verify the validity of the analysis method. We discuss the outcome of this analysis.
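
    A bare-bones version of such a test (our own sketch, not the paper's action-functional method) estimates a transition matrix between discretized return states and checks whether the probability flux i → j balances the flux j → i:

        # Minimal sketch: empirical detailed-balance residual for binned returns.
        import numpy as np

        def detailed_balance_residual(states, n_states):
            counts = np.zeros((n_states, n_states))
            for i, j in zip(states[:-1], states[1:]):
                counts[i, j] += 1
            p = counts / counts.sum(axis=1, keepdims=True)  # transition matrix
            pi = counts.sum(axis=1) / counts.sum()          # occupation frequencies
            flux = pi[:, None] * p                          # pi_i * P(i -> j)
            return np.abs(flux - flux.T).max()              # ~0 under detailed balance

        rng = np.random.default_rng(5)
        returns = rng.normal(size=10_000)                   # toy return series
        bins = np.digitize(returns, np.quantile(returns, [0.25, 0.5, 0.75]))
        print(detailed_balance_residual(bins, n_states=4))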

  15. Real time forecasting of near-future evolution.

    PubMed

    Gerrish, Philip J; Sniegowski, Paul D

    2012-09-07

    A metaphor for adaptation that informs much evolutionary thinking today is that of mountain climbing, where horizontal displacement represents change in genotype, and vertical displacement represents change in fitness. If it were known a priori what the 'fitness landscape' looked like, that is, how the myriad possible genotypes mapped onto fitness, then the possible paths up the fitness mountain could each be assigned a probability, thus providing a dynamical theory with long-term predictive power. Such detailed genotype-fitness data, however, are rarely available and are subject to change with each change in the organism or in the environment. Here, we take a very different approach that depends only on fitness or phenotype-fitness data obtained in real time and requires no a priori information about the fitness landscape. Our general statistical model of adaptive evolution builds on classical theory and gives reasonable predictions of fitness and phenotype evolution many generations into the future.

  16. Advanced quantitative measurement methodology in physics education research

    NASA Astrophysics Data System (ADS)

    Wang, Jing

    The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation, and statistics are frequently used in such studies. The selection of statistical methods, and the interpretation of the results they yield, should be connected to the educational background. In this connecting process, issues of educational models are often raised. Many widely used statistical methods make no assumptions about the mental structure of subjects, nor do they provide explanations tailored to an educational audience. Other methods do consider mental structure and are tailored to provide strong connections between statistics and education; these methods often involve model assumptions and parameter estimation, and are mathematically involved. The dissertation provides a practical view of several advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimal mathematical model. The purpose of the study is to compare these advanced methods with purely mathematical methods, based on their performance in physics-education settings; in particular, the comparison uses both physics content assessments and scientific ability assessments. The dissertation includes three parts. The first part compares item response theory (IRT) and classical test theory (CTT). Both theories provide test item statistics for educational inferences and decisions, and both are applied to Force Concept Inventory data obtained from students enrolled at The Ohio State University. Effort was made to examine the similarities and differences between the two theories, and the possible explanations for the differences. The study suggests that item response theory is more sensitive than classical test theory to the context and conceptual features of the test items, and that the IRT parameters provide a better measure than the CTT parameters for the educational audience to investigate item features. The second part of the dissertation concerns measures of association for binary data. In quantitative assessment, binary data are often encountered because of their simplicity, yet the currently popular measures of association fail under some extremely unbalanced conditions, and such conditions are not rare in educational data. Two popular association measures, the Pearson correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages, and that special attention to test and data conditions is necessary. The last part of the dissertation focuses on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed, and typical misunderstandings and misuses of EFA are explored. EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment of scientific reasoning skills, and the analysis yields the reasoning ability structures for U.S. and Chinese students at different educational levels. A final discussion contrasts the advanced quantitative assessment methodology with the purely mathematical methodology.
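
    As a small concrete companion to the first part, the classical-test-theory item statistics mentioned there can be computed directly (our own sketch with a toy response matrix, not the dissertation's data):

        # Minimal sketch: CTT item difficulty and corrected item-total
        # (point-biserial) discrimination from a binary response matrix.
        import numpy as np

        def ctt_item_stats(responses):
            """responses: (n_students, n_items) matrix of 0/1 scores."""
            total = responses.sum(axis=1)
            difficulty = responses.mean(axis=0)             # proportion correct
            discrimination = np.array([
                np.corrcoef(responses[:, i], total - responses[:, i])[0, 1]
                for i in range(responses.shape[1])
            ])
            return difficulty, discrimination

        rng = np.random.default_rng(6)
        scores = (rng.random((200, 30)) < 0.6).astype(int)  # toy 0/1 responses
        print(ctt_item_stats(scores))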

  17. Thermal Stress Analysis of a Continuous and Pulsed End-Pumped Nd:YAG Rod Crystal Using Non-Classic Conduction Heat Transfer Theory

    NASA Astrophysics Data System (ADS)

    Mojahedi, Mahdi; Shekoohinejad, Hamidreza

    2018-02-01

    In this paper, the temperature distribution in a continuous and pulsed end-pumped Nd:YAG rod crystal is determined using nonclassical and classical heat conduction theories. To find the temperature distribution in the crystal, the heat transfer differential equations of the crystal, with the boundary conditions taken into account, are derived from the non-Fourier model, and the temperature distribution is obtained by an analytical method. Then, by transferring the non-Fourier differential equations to matrix equations using the finite element method, the temperature and stress at every point of the crystal are calculated in the time domain. Based on the results, a comparison between the classical and nonclassical theories is presented in order to investigate rupture power values. In continuous end pumping with equal input powers, non-Fourier theory predicts greater temperature and stress than Fourier theory; it also shows that crystal rupture power decreases as the relaxation time increases. In contrast, under single rectangular-pulse end pumping with equal input power, Fourier theory indicates higher temperature and stress than non-Fourier theory, and the maximum temperature and stress decrease as the relaxation time increases.
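
    The "nonclassical" ingredient is the non-Fourier (Cattaneo-Vernotte type) constitutive law, which in its standard form reads (our addition; τ is the relaxation time, α the thermal diffusivity, source terms omitted):

        \tau \frac{\partial \mathbf{q}}{\partial t} + \mathbf{q} = -k \nabla T
        \quad\Longrightarrow\quad
        \tau \frac{\partial^2 T}{\partial t^2} + \frac{\partial T}{\partial t} = \alpha \nabla^2 T .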

  18. Western classical music development: a statistical analysis of composers similarity, differentiation and evolution.

    PubMed

    Georges, Patrick

    2017-01-01

    This paper proposes a statistical analysis that captures similarities and differences between classical music composers, with the eventual aim of understanding why particular composers 'sound' different even if their 'lineages' (influence networks) are similar, or why they 'sound' alike if their 'lineages' are different. To do this, we use statistical methods and measures of association or similarity (based on the presence/absence of traits such as specific 'ecological' characteristics and personal musical influences) that have been developed in biosystematics, scientometrics, and bibliographic coupling. The paper also represents a first step towards the more ambitious goal of developing an evolutionary model of Western classical music.
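
    The simplest of the presence/absence similarity measures alluded to is the Jaccard index over trait sets; the sketch below is ours, and the trait labels are hypothetical.

        # Minimal sketch: Jaccard similarity between two composers' trait sets.
        def jaccard(traits_a: set, traits_b: set) -> float:
            if not traits_a and not traits_b:
                return 0.0
            return len(traits_a & traits_b) / len(traits_a | traits_b)

        composer_a = {"counterpoint", "influence:Bach", "era:Baroque"}    # hypothetical
        composer_b = {"counterpoint", "influence:Bach", "era:Classical"}  # trait sets
        print(jaccard(composer_a, composer_b))   # 0.5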

  19. Making classical and quantum canonical general relativity computable through a power series expansion in the inverse cosmological constant.

    PubMed

    Gambini, R; Pullin, J

    2000-12-18

    We consider general relativity with a cosmological constant as a perturbative expansion around a completely solvable diffeomorphism-invariant field theory. This theory is the λ → ∞ limit of general relativity. This allows an explicit perturbative computational setup in which the quantum states of the theory and the classical observables can be explicitly computed. An unexpected relationship arises at the quantum level between the discrete spectrum of the volume operator and the allowed values of the cosmological constant.

  20. On the Anticipatory Aspects of the Four Interactions: what the Known Classical and Semi-Classical Solutions Teach us

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lusanna, Luca

    2004-08-19

    The four (electromagnetic, weak, strong and gravitational) interactions are described by singular Lagrangians and by the Dirac-Bergmann theory of Hamiltonian constraints. As a consequence, a subset of the original configuration variables are gauge variables, not determined by the equations of motion. Only at the Hamiltonian level is it possible to separate the gauge variables from the deterministic physical degrees of freedom, the Dirac observables, and to formulate a well-posed Cauchy problem for them, both in special and general relativity. The requirement of causality then dictates the choice of retarded solutions at the classical level. However, both the problems of the classical theory of the electron, leading to the choice of (1/2)(retarded + advanced) solutions, and the regularization of quantum field theory, leading to the Feynman propagator, introduce anticipatory aspects. The determination of the relativistic Darwin potential as a semi-classical approximation to the Lienard-Wiechert solution for particles with Grassmann-valued electric charges, regularizing the Coulomb self-energies, shows that these anticipatory effects survive beyond the semi-classical approximation (tree level) in the form of radiative corrections, at least for the electromagnetic interaction. Talk and 'best contribution' at The Sixth International Conference on Computing Anticipatory Systems CASYS'03, Liege, August 11-16, 2003.

  1. The dynamical mass of a classical Cepheid variable star in an eclipsing binary system.

    PubMed

    Pietrzyński, G; Thompson, I B; Gieren, W; Graczyk, D; Bono, G; Udalski, A; Soszyński, I; Minniti, D; Pilecki, B

    2010-11-25

    Stellar pulsation theory provides a means of determining the masses of pulsating classical Cepheid supergiants-it is the pulsation that causes their luminosity to vary. Such pulsational masses are found to be smaller than the masses derived from stellar evolution theory: this is the Cepheid mass discrepancy problem, for which a solution is missing. An independent, accurate dynamical mass determination for a classical Cepheid variable star (as opposed to type-II Cepheids, low-mass stars with a very different evolutionary history) in a binary system is needed in order to determine which is correct. The accuracy of previous efforts to establish a dynamical Cepheid mass from Galactic single-lined non-eclipsing binaries was typically about 15-30% (refs 6, 7), which is not good enough to resolve the mass discrepancy problem. In spite of many observational efforts, no firm detection of a classical Cepheid in an eclipsing double-lined binary has hitherto been reported. Here we report the discovery of a classical Cepheid in a well detached, double-lined eclipsing binary in the Large Magellanic Cloud. We determine the mass to a precision of 1% and show that it agrees with its pulsation mass, providing strong evidence that pulsation theory correctly and precisely predicts the masses of classical Cepheids.

  2. Bertrand's theorem and virial theorem in fractional classical mechanics

    NASA Astrophysics Data System (ADS)

    Yu, Rui-Yan; Wang, Towe

    2017-09-01

    Fractional classical mechanics is the classical counterpart of fractional quantum mechanics. The central force problem in this theory is investigated. Bertrand's theorem is generalized, and the virial theorem is revisited, both in three spatial dimensions. In order to produce stable, closed, non-circular orbits, the inverse-square law and Hooke's law should be modified in fractional classical mechanics.
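
    Bertrand's classical theorem, the statement being generalized, says that the only central potentials for which all bounded orbits are closed are

        V(r) = -\frac{k}{r}
        \qquad\text{and}\qquad
        V(r) = \frac{1}{2} k r^2 ,

    i.e., the inverse-square force law and Hooke's law named above.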

  3. Dissipative Effects on Inertial-Range Statistics at High Reynolds Numbers.

    PubMed

    Sinhuber, Michael; Bewley, Gregory P; Bodenschatz, Eberhard

    2017-09-29

    Using the unique capabilities of the Variable Density Turbulence Tunnel at the Max Planck Institute for Dynamics and Self-Organization, Göttingen, we report experimental measurements in classical grid turbulence that uncover oscillations of the velocity structure functions in the inertial range. This was made possible by measuring extremely long time series of up to 10^{10} samples of the turbulent fluctuating velocity, corresponding to O(10^{7}) integral length scales. The measurements were conducted in a well-controlled environment at a wide range of high Reynolds numbers, from R_{λ}=110 up to R_{λ}=1600, using both traditional hot-wire probes and the nanoscale thermal anemometry probe developed at Princeton University. An implication of the observed oscillations is that dissipation influences the inertial-range statistics of turbulent flows at scales significantly larger than predicted by current models and theories.
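
    The inertial-range statistics in question are the velocity structure functions, with the classical Kolmogorov (K41) prediction for the exponents (standard definitions, our addition):

        S_p(r) = \langle [\, u(x + r) - u(x) \,]^p \rangle \sim r^{\zeta_p},
        \qquad
        \zeta_p^{\mathrm{K41}} = \frac{p}{3} .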

  4. Physics of Electronic Materials

    NASA Astrophysics Data System (ADS)

    Rammer, Jørgen

    2017-03-01

    1. Quantum mechanics; 2. Quantum tunneling; 3. Standard metal model; 4. Standard conductor model; 5. Electric circuit theory; 6. Quantum wells; 7. Particle in a periodic potential; 8. Bloch currents; 9. Crystalline solids; 10. Semiconductor doping; 11. Transistors; 12. Heterostructures; 13. Mesoscopic physics; 14. Arithmetic, logic and machines; Appendix A. Principles of quantum mechanics; Appendix B. Dirac's delta function; Appendix C. Fourier analysis; Appendix D. Classical mechanics; Appendix E. Wave function properties; Appendix F. Transfer matrix properties; Appendix G. Momentum; Appendix H. Confined particles; Appendix I. Spin and quantum statistics; Appendix J. Statistical mechanics; Appendix K. The Fermi-Dirac distribution; Appendix L. Thermal current fluctuations; Appendix M. Gaussian wave packets; Appendix N. Wave packet dynamics; Appendix O. Screening by symmetry method; Appendix P. Commutation and common eigenfunctions; Appendix Q. Interband coupling; Appendix R. Common crystal structures; Appendix S. Effective mass approximation; Appendix T. Integral doubling formula; Bibliography; Index.

  5. Universality classes of fluctuation dynamics in hierarchical complex systems

    NASA Astrophysics Data System (ADS)

    Macêdo, A. M. S.; González, Iván R. Roa; Salazar, D. S. P.; Vasconcelos, G. L.

    2017-03-01

    A unified approach is proposed to describe the statistics of the short-time dynamics of multiscale complex systems. The probability density function of the relevant time series (signal) is represented as a statistical superposition of a large time-scale distribution weighted by the distribution of certain internal variables that characterize the slowly changing background. The dynamics of the background is formulated as a hierarchical stochastic model whose form is derived from simple physical constraints, which in turn restrict the dynamics to only two possible classes. The probability distributions of both the signal and the background have simple representations in terms of Meijer G functions. The two universality classes for the background dynamics manifest themselves in the signal distribution as two types of tails: power law and stretched exponential, respectively. A detailed analysis of empirical data from classical turbulence and financial markets shows excellent agreement with the theory.
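
    Schematically, the statistical superposition at the core of the approach has the superstatistics-like form (our notation, not necessarily the authors' exact construction):

        P(x) = \int_0^\infty f(x \mid \beta)\, g(\beta)\, d\beta ,

    with the hierarchical background model fixing g(β) and hence the power-law or stretched-exponential tails.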

  6. The Tensile Strength of Liquid Nitrogen

    NASA Astrophysics Data System (ADS)

    Huang, Jian

    1992-01-01

    The tensile strength of liquids has been a puzzling subject. On the one hand, classical nucleation theory has met great success in predicting the nucleation rates of superheated liquids. On the other hand, most reported experimental values of the tensile strength of different liquids fall far below the prediction of classical nucleation theory. In this study, homogeneous nucleation in liquid nitrogen and its tensile strength have been investigated. Different approaches for determining the pressure amplitude were studied carefully. It is shown that Raman-Nath theory, as modified by the introduction of an effective interaction length, can be used to determine the pressure amplitude in the focal plane of a focusing ultrasonic transducer. The results obtained from different diffraction orders are consistent and in good agreement with other approaches, including Debye's theory and solving the KZK equation. The measurement of the tensile strength was carried out in a high-pressure stainless steel dewar. A high-intensity ultrasonic wave was focused into a small volume of liquid nitrogen over a short time period. A probe laser beam passes through the focal region of a concave spherical transducer with a small aperture angle, and the transmitted light is detected with a photodiode. The pressure amplitude at the focus is calculated from the acoustic power radiated into the liquid. In the experiment, the electrical signal on the transducer is gated at its resonance frequency with gate widths of 20 μs to 0.2 ms and temperatures from 77 K to near 100 K. The calculated pressure amplitude agrees with the prediction of classical nucleation theory for nucleation rates from 10^6 to 10^{11} bubbles cm^-3 s^-1. This work provides experimental evidence that the validity of classical nucleation theory can be extended to negative pressures as large as -90 atm; liquid nitrogen is only the second cryogenic liquid observed to reach the tensile strength predicted by classical nucleation theory.
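
    The classical homogeneous-nucleation expressions being tested are, in standard form (our addition; σ is the surface tension and Δp the pressure deficit across the critical bubble):

        J = J_0 \exp\!\left(-\frac{\Delta G^{*}}{k_B T}\right),
        \qquad
        \Delta G^{*} = \frac{16\pi \sigma^3}{3\,(\Delta p)^2} .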

  7. Nonequilibrium dynamics of the O(N) model on dS3 and AdS crunches

    NASA Astrophysics Data System (ADS)

    Kumar, S. Prem; Vaganov, Vladislav

    2018-03-01

    We study the nonperturbative quantum evolution of the interacting O(N) vector model at large N, formulated on a spatial two-sphere, with time-dependent couplings which diverge at finite time. This model, the so-called "E-frame" theory, is related via a conformal transformation to the interacting O(N) model in three-dimensional global de Sitter spacetime with time-independent couplings. We show that with a purely quartic, relevant deformation the quantum evolution of the E-frame model is regular even when the classical theory is rendered singular at the end of time by the diverging coupling. Time evolution drives the E-frame theory to the large-N Wilson-Fisher fixed point when the classical coupling diverges. We study the quantum evolution numerically for a variety of initial conditions and demonstrate the finiteness of the energy at the classical "end of time". With an additional (time-dependent) mass deformation, quantum backreaction lowers the mass, with a putative smooth time evolution only possible in the limit of infinite quartic coupling. We discuss the relevance of these results for the resolution of crunch singularities in AdS geometries dual to E-frame theories with a classical gravity dual.

  8. Mixed Quantum/Classical Theory for Molecule-Molecule Inelastic Scattering: Derivations of Equations and Application to N2 + H2 System.

    PubMed

    Semenov, Alexander; Babikov, Dmitri

    2015-12-17

    The mixed quantum/classical theory (MQCT) for the inelastic scattering of two molecules is developed, in which the internal (rotational, vibrational) motion of both collision partners is treated quantum mechanically, while the molecule-molecule scattering (translational motion) is described by classical trajectories. The resulting MQCT formalism comprises a system of coupled differential equations for the quantum probability amplitudes, together with classical equations of motion in the mean-field potential. Numerical tests of this theory are carried out for several of the most important rotational state-to-state transitions in the N2 + H2 system, over a broad range of collision energies. Apart from scattering resonances (at low collision energies), excellent agreement with full-quantum results is obtained, including the excitation thresholds, the maxima of the cross sections, and even some smaller features, such as slight oscillations of the energy dependencies. Most importantly, at higher energies the results of MQCT are nearly identical to the full-quantum results, which makes this approach a good alternative to full-quantum calculations, which become computationally expensive at higher collision energies and for heavier collision partners. Extensions of this theory to include vibrational transitions or general asymmetric-top (polyatomic) molecules are relatively straightforward.
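
    Schematically, the coupled system has the generic mean-field form (our paraphrase of the structure described above, not the authors' exact equations):

        i\hbar\,\dot{c}_n = \sum_m V_{nm}\big(\mathbf{Q}(t)\big)\, e^{i(E_n - E_m)t/\hbar}\, c_m,
        \qquad
        \mu\,\ddot{\mathbf{Q}} = -\nabla_{\mathbf{Q}} \langle \Psi(t) \,|\, V(\mathbf{Q}) \,|\, \Psi(t) \rangle ,

    where the c_n are amplitudes of the internal (rotational, vibrational) channels and Q(t) is the classical scattering coordinate.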

  9. Classical theory of atom-surface scattering: The rainbow effect

    NASA Astrophysics Data System (ADS)

    Miret-Artés, Salvador; Pollak, Eli

    2012-07-01

    The scattering of heavy atoms and molecules from surfaces is oftentimes dominated by classical mechanics. A large body of experiments has gathered data on the angular distributions of the scattered species, their energy loss distributions, sticking probabilities, dependence on surface temperature, and more. For many years these phenomena were considered theoretically in the framework of the "washboard model", in which the interaction of the incident particle with the surface is described in terms of hard-wall potentials. Although this class of models has helped elucidate some of the features, it leaves open many questions: true potentials are clearly not hard walls; it does not provide a realistic framework for phonon scattering; and it cannot explain the incident-angle and incident-energy dependence of rainbow scattering, nor provide a consistent theory of sticking. In recent years we have been developing a classical perturbation theory approach which has provided new insight into the dynamics of atom-surface scattering. The theory includes both surface corrugation and interaction with surface phonons, in terms of harmonic baths which are linearly coupled to the system coordinates. This model has been successful in elucidating many new features of rainbow scattering in terms of frictions and bath fluctuations or noise. It has also given new insight into the origins of asymmetry in atomic scattering from surfaces. New phenomena deduced from the theory include friction-induced rainbows, energy-loss rainbows, a theory of super-rainbows, and more. In this review we present the classical theory of atom-surface scattering as well as extensions and implications for semiclassical scattering and the further development of a quantum theory of surface scattering. Special emphasis is given to the inversion of scattering data into information on the particle-surface interactions.

  11. Gaussian orthogonal ensemble statistics in graphene billiards with the shape of classically integrable billiards.

    PubMed

    Yu, Pei; Li, Zi-Yuan; Xu, Hong-Ya; Huang, Liang; Dietz, Barbara; Grebogi, Celso; Lai, Ying-Cheng

    2016-12-01

    A crucial result in quantum chaos, which has been established for a long time, is that the spectral properties of classically integrable systems generically are described by Poisson statistics, whereas those of time-reversal symmetric, classically chaotic systems coincide with those of random matrices from the Gaussian orthogonal ensemble (GOE). Does this result hold for two-dimensional Dirac material systems? To address this fundamental question, we investigate the spectral properties in a representative class of graphene billiards with shapes of classically integrable circular-sector billiards. Naively one may expect to observe Poisson statistics, which is indeed true for energies close to the band edges where the quasiparticle obeys the Schrödinger equation. However, for energies near the Dirac point, where the quasiparticles behave like massless Dirac fermions, Poisson statistics is extremely rare in the sense that it emerges only under quite strict symmetry constraints on the straight boundary parts of the sector. An arbitrarily small amount of imperfection of the boundary results in GOE statistics. This implies that, for circular-sector confinements with arbitrary angle, the spectral properties will generically be GOE. These results are corroborated by extensive numerical computation. Furthermore, we provide a physical understanding for our results.
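
    The two reference spacing distributions behind this dichotomy are, at unit mean spacing (standard forms, our addition):

        P_{\mathrm{Poisson}}(s) = e^{-s},
        \qquad
        P_{\mathrm{GOE}}(s) \approx \frac{\pi s}{2}\, e^{-\pi s^2 / 4}
        \quad \text{(Wigner surmise)} .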

  13. Loop Quantum Cosmology.

    PubMed

    Bojowald, Martin

    2008-01-01

    Quantum gravity is expected to be necessary in order to understand situations in which classical general relativity breaks down. In particular in cosmology one has to deal with initial singularities, i.e., the fact that the backward evolution of a classical spacetime inevitably comes to an end after a finite amount of proper time. This presents a breakdown of the classical picture and requires an extended theory for a meaningful description. Since small length scales and high curvatures are involved, quantum effects must play a role. Not only the singularity itself but also the surrounding spacetime is then modified. One particular theory is loop quantum cosmology, an application of loop quantum gravity to homogeneous systems, which removes classical singularities. Its implications can be studied at different levels. The main effects are introduced into effective classical equations, which allow one to avoid the interpretational problems of quantum theory. They give rise to new kinds of early-universe phenomenology with applications to inflation and cyclic models. To resolve classical singularities and to understand the structure of geometry around them, the quantum description is necessary. Classical evolution is then replaced by a difference equation for a wave function, which allows an extension of quantum spacetime beyond classical singularities. One main question is how these homogeneous scenarios are related to full loop quantum gravity, which can be dealt with at the level of distributional symmetric states. Finally, the new structure of spacetime arising in loop quantum gravity and its application to cosmology sheds light on more general issues, such as the nature of time. Supplementary material is available for this article at 10.12942/lrr-2008-4.

  14. Data Analysis Techniques for Physical Scientists

    NASA Astrophysics Data System (ADS)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  15. Information theory lateral density distribution for Earth inferred from global gravity field

    NASA Technical Reports Server (NTRS)

    Rubincam, D. P.

    1981-01-01

    Information Theory Inference, better known as the Maximum Entropy Method, was used to infer the lateral density distribution inside the Earth. The approach assumed that the Earth consists of indistinguishable Maxwell-Boltzmann particles populating infinitesimal volume elements, and followed the standard methods of statistical mechanics (maximizing the entropy function). The GEM 10B spherical harmonic gravity field coefficients, complete to degree and order 36, were used as constraints on the lateral density distribution. The spherically symmetric part of the density distribution was assumed to be known. The lateral density variation was assumed to be small compared to the spherically symmetric part. The resulting information theory density distributions for the cases of no crust removed, 30 km of compensated crust removed, and 30 km of uncompensated crust removed all gave broad density anomalies extending deep into the mantle, but with the density contrasts being greatest towards the surface (typically ±0.004 g cm⁻³ in the first two cases and ±0.04 g cm⁻³ in the third). None of the density distributions resemble classical organized convection cells. The information theory approach may have use in choosing Standard Earth Models, but the inclusion of seismic data into the approach appears difficult.
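    As a toy illustration of the maximum-entropy machinery described above (not the GEM 10B inversion itself, which constrains a full spherical-harmonic gravity field), the Python sketch below infers a discrete distribution that maximizes the entropy subject to a single linear constraint; the grid, the observable f, and the target mean are invented for illustration.

        import numpy as np
        from scipy.optimize import brentq

        # Hypothetical "observable" values on a coarse grid of volume elements.
        f = np.linspace(-1.0, 1.0, 50)
        F_target = 0.3  # assumed constraint <f> = 0.3 (illustrative)

        def mean_f(lam):
            # The maximum-entropy solution has the exponential-family form
            # p_i proportional to exp(lam * f_i); return its mean <f>(lam).
            w = np.exp(lam * f)
            p = w / w.sum()
            return p @ f

        # Solve for the Lagrange multiplier that matches the constraint,
        # then recover the distribution and its entropy.
        lam = brentq(lambda x: mean_f(x) - F_target, -50.0, 50.0)
        p = np.exp(lam * f)
        p /= p.sum()
        S = -(p * np.log(p)).sum()
        print(f"lambda = {lam:.4f}, <f> = {p @ f:.4f}, S = {S:.4f}")

    In the actual study the constraints are the gravity coefficients rather than a single mean, but the Lagrange-multiplier structure is the same.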

  16. A Comparison between Discrimination Indices and Item-Response Theory Using the Rasch Model in a Clinical Course Written Examination of a Medical School.

    PubMed

    Park, Jong Cook; Kim, Kwang Sig

    2012-03-01

    The reliability of a test is determined by the characteristics of its items. Item analysis is carried out using classical test theory and item response theory. The purpose of the study was to compare the discrimination indices with item response theory using the Rasch model. Thirty-one 4th-year medical school students participated in the clinical course written examination, which included 22 A-type items and 3 R-type items. The point biserial correlation coefficient (C(pbs)) was compared to the method of extreme groups (D), the biserial correlation coefficient (C(bs)), the item-total correlation coefficient (C(it)), and the corrected item-total correlation coefficient (C(cit)). The Rasch model was applied to estimate item difficulty and examinee ability and to calculate item fit statistics using joint maximum likelihood. The explanatory power (r²) of C(pbs) decreased in the following order: C(cit) (1.00), C(it) (0.99), C(bs) (0.94), and D (0.45). The ranges of difficulty logit and standard error, and of ability logit and standard error, were -0.82 to 0.80 and 0.37 to 0.76, and -3.69 to 3.19 and 0.45 to 1.03, respectively. Items 9 and 23 had outfit ≥ 1.3. Students 1, 5, 7, 18, 26, 30, and 32 had fit ≥ 1.3. C(pbs), C(cit), and C(it) are good discrimination parameters. The Rasch model can estimate the item difficulty parameter and the examinee ability parameter with standard errors. The fit statistics can identify bad items and unpredictable examinee responses.
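    For readers unfamiliar with the classical-test-theory side of this comparison, here is a minimal sketch of two of the discrimination indices named above, the extreme-group index D and the point biserial correlation C(pbs), computed on synthetic 0/1 response data (the data and the 27% grouping fraction are assumptions, not the study's):

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.integers(0, 2, size=(31, 25))  # 31 examinees x 25 items (synthetic)
        total = X.sum(axis=1)

        def point_biserial(item):
            # Correlation between a dichotomous item score and the total score.
            # (The corrected index C(cit) would exclude the item from the total.)
            return np.corrcoef(X[:, item].astype(float), total)[0, 1]

        def extreme_group_D(item, frac=0.27):
            # Difference in proportion correct between top and bottom groups.
            n = max(1, int(round(frac * len(total))))
            order = np.argsort(total)
            low, high = order[:n], order[-n:]
            return X[high, item].mean() - X[low, item].mean()

        print(point_biserial(0), extreme_group_D(0))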

  17. Aging Theories for Establishing Safe Life Spans of Airborne Critical Structural Components

    NASA Technical Reports Server (NTRS)

    Ko, William L.

    2003-01-01

    New aging theories have been developed to establish the safe life span of airborne critical structural components such as B-52B aircraft pylon hooks for carrying air-launch drop-test vehicles. The new aging theories use the equivalent-constant-amplitude loading spectrum to represent the actual random loading spectrum with the same damaging effect. The crack growth due to random loading cycling of the first flight is calculated using the half-cycle theory, and then extrapolated to all the crack growths of the subsequent flights. The predictions of the new aging theories (finite difference aging theory and closed-form aging theory) are compared with the classical flight-test life theory and the previously developed Ko first- and Ko second-order aging theories. The new aging theories predict the number of safe flights as considerably lower than that predicted by the classical aging theory, and slightly lower than those predicted by the Ko first- and Ko second-order aging theories due to the inclusion of all the higher order terms.
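    The half-cycle bookkeeping of the Ko aging theories is not reproduced here, but the underlying idea, growing a crack under an equivalent constant-amplitude loading spectrum and counting flights until a critical crack length, can be sketched with a simple Paris-law model; every constant below (material constants, stress range, geometry factor, cycles per flight) is an illustrative assumption, not a B-52B pylon-hook value.

        import math

        # Illustrative Paris-law constants (NOT fitted to any real component).
        C, m = 1e-11, 3.0            # da/dN = C * (dK)^m, dK in MPa*sqrt(m)
        a, a_crit = 0.001, 0.02      # initial and critical crack lengths [m]
        dsigma = 80.0                # equivalent constant-amplitude stress range [MPa]
        Y = 1.12                     # geometry factor (edge-crack assumption)
        cycles_per_flight = 500      # assumed load cycles per flight

        flights = 0
        while a < a_crit:
            for _ in range(cycles_per_flight):
                dK = Y * dsigma * math.sqrt(math.pi * a)
                a += C * dK**m       # crack increment for one load cycle
                if a >= a_crit:
                    break
            flights += 1
        print("predicted safe flights (toy model):", flights)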

  18. On the co-creation of classical and modern physics.

    PubMed

    Staley, Richard

    2005-12-01

    While the concept of "classical physics" has long framed our understanding of the environment from which modern physics emerged, it has consistently been read back into a period in which the physicists concerned initially considered their work in quite other terms. This essay explores the shifting currency of the rich cultural image of the classical/modern divide by tracing empirically different uses of "classical" within the physics community from the 1890s to 1911. A study of fin-de-siècle addresses shows that the earliest general uses of the concept proved controversial. Our present understanding of the term was in large part shaped by its incorporation (in different ways) within the emerging theories of relativity and quantum theory, where the content of "classical" physics was defined by proponents of the new. Studying the diverse ways in which Boltzmann, Larmor, Poincaré, Einstein, Minkowski, and Planck invoked the term "classical" will help clarify the critical relations between physicists' research programs and their use of worldview arguments in fashioning modern physics.

  19. Contact stresses in gear teeth: A new method of analysis

    NASA Technical Reports Server (NTRS)

    Somprakit, Paisan; Huston, Ronald L.; Oswald, Fred B.

    1991-01-01

    A new, innovative procedure called point load superposition is presented for determining the contact stresses in mating gear teeth. It is believed that this procedure will greatly extend both the range of applicability and the accuracy of gear contact stress analysis. Point load superposition is based upon fundamental solutions from the theory of elasticity. It is an iterative numerical procedure which has distinct advantages over the classical Hertz method, the finite element method, and existing applications of the boundary element method. Specifically, friction and sliding effects, which are either excluded from or difficult to study with the classical methods, are routinely handled with the new procedure. Presented here are the basic theory and the algorithms. Several examples are given. Results are consistent with those of the classical theories. Applications to spur gears are discussed.

  20. Finite element modelling versus classic beam theory: comparing methods for stress estimation in a morphologically diverse sample of vertebrate long bones

    PubMed Central

    Brassey, Charlotte A.; Margetts, Lee; Kitchener, Andrew C.; Withers, Philip J.; Manning, Phillip L.; Sellers, William I.

    2013-01-01

    Classic beam theory is frequently used in biomechanics to model the stress behaviour of vertebrate long bones, particularly when creating intraspecific scaling models. Although methodologically straightforward, classic beam theory requires complex irregular bones to be approximated as slender beams, and the errors associated with simplifying complex organic structures to such an extent are unknown. Alternative approaches, such as finite element analysis (FEA), while much more time-consuming to perform, require no such assumptions. This study compares the results obtained using classic beam theory with those from FEA to quantify the beam theory errors and to provide recommendations about when a full FEA is essential for reasonable biomechanical predictions. High-resolution computed tomographic scans of eight vertebrate long bones were used to calculate diaphyseal stress owing to various loading regimes. Under compression, FEA values of minimum principal stress (σmin) were on average 142 per cent (±28% s.e.) larger than those predicted by beam theory, with deviation between the two models correlated to shaft curvature (two-tailed p = 0.03, r2 = 0.56). Under bending, FEA values of maximum principal stress (σmax) and beam theory values differed on average by 12 per cent (±4% s.e.), with deviation between the models significantly correlated to cross-sectional asymmetry at midshaft (two-tailed p = 0.02, r2 = 0.62). In torsion, assuming maximum stress values occurred at the location of minimum cortical thickness brought beam theory and FEA values closest in line, and in this case FEA values of τtorsion were on average 14 per cent (±5% s.e.) higher than beam theory. Therefore, FEA is the preferred modelling solution when estimates of absolute diaphyseal stress are required, although values calculated by beam theory for bending may be acceptable in some situations. PMID:23173199
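    For reference, the beam-theory side of such a comparison reduces to the classic flexure formula sigma = M*y/I; below is a minimal sketch for an idealized hollow-elliptical diaphysis. All dimensions and the applied moment are invented, and real long bones violate exactly the symmetry assumptions this calculation makes, which is the paper's point.

        import math

        # Idealized hollow-ellipse cross-section (illustrative dimensions, metres).
        a_out, b_out = 0.015, 0.012   # outer semi-axes
        a_in,  b_in  = 0.010, 0.008   # inner (medullary) semi-axes
        M = 50.0                      # bending moment about the horizontal axis [N*m]

        # Second moment of area of an ellipse about the axis along semi-axis a:
        # I = pi/4 * a * b^3; subtract the medullary cavity.
        I = math.pi / 4 * (a_out * b_out**3 - a_in * b_in**3)
        y = b_out                     # distance from neutral axis to outer fibre
        sigma_max = M * y / I         # classic beam-theory bending stress [Pa]
        print(f"I = {I:.3e} m^4, sigma_max = {sigma_max / 1e6:.1f} MPa")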

  1. Generalized Quantum Theory of Bianchi IX Cosmologies

    NASA Astrophysics Data System (ADS)

    Craig, David; Hartle, James

    2003-04-01

    We apply sum-over-histories generalized quantum theory to the closed homogeneous minisuperspace Bianchi IX cosmological model. We sketch how the probabilities in decoherent sets of alternative, coarse-grained histories of this model universe are calculated. We consider, in particular, the probabilities for classical evolution in a suitable coarse-graining. For a restricted class of initial conditions and coarse-grainings we exhibit the approximate decoherence of alternative histories in which the universe behaves classically and those in which it does not, illustrating the prediction that these universes will evolve in an approximately classical manner with a probability near unity.

  2. Generalized mutual information and Tsirelson's bound

    NASA Astrophysics Data System (ADS)

    Wakakuwa, Eyuri; Murao, Mio

    2014-12-01

    We introduce a generalization of the quantum mutual information between a classical system and a quantum system into the mutual information between a classical system and a system described by general probabilistic theories. We apply this generalized mutual information (GMI) to a derivation of Tsirelson's bound from information causality, and prove that Tsirelson's bound can be derived from the chain rule of the GMI. By using the GMI, we formulate the "no-supersignalling condition" (NSS), that the assistance of correlations does not enhance the capability of classical communication. We prove that NSS is never violated in any no-signalling theory.

  3. Generalized mutual information and Tsirelson's bound

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wakakuwa, Eyuri; Murao, Mio

    2014-12-04

    We introduce a generalization of the quantum mutual information between a classical system and a quantum system into the mutual information between a classical system and a system described by general probabilistic theories. We apply this generalized mutual information (GMI) to a derivation of Tsirelson's bound from information causality, and prove that Tsirelson's bound can be derived from the chain rule of the GMI. By using the GMI, we formulate the 'no-supersignalling condition' (NSS), that the assistance of correlations does not enhance the capability of classical communication. We prove that NSS is never violated in any no-signalling theory.

  4. Electrode redox reactions with polarizable molecules.

    PubMed

    Matyushov, Dmitry V

    2018-04-21

    A theory of redox reactions involving electron transfer between a metal electrode and a polarizable molecule in solution is formulated. Both the existence of molecular polarizability and its ability to change due to electron transfer distinguish this problem from classical theories of interfacial electrochemistry. When the polarizability is different between the oxidized and reduced states, the statistics of thermal fluctuations driving the reactant over the activation barrier becomes non-Gaussian. The problem of electron transfer is formulated as crossing of two non-parabolic free energy surfaces. An analytical solution for these free energy surfaces is provided and the activation barrier of electrode electron transfer is given in terms of two reorganization energies corresponding to the oxidized and reduced states of the molecule in solution. The new non-Gaussian theory is, therefore, based on two theory parameters in contrast to one-parameter Marcus formulation for electrode reactions. The theory, which is consistent with the Nernst equation, predicts asymmetry between the cathodic and anodic branches of the electrode current. They show different slopes at small electrode overpotentials and become curved at larger overpotentials. However, the curvature of the Tafel plot is reduced compared to the Marcus-Hush model and approaches the empirical Butler-Volmer form with different transfer coefficients for the anodic and cathodic currents.

  5. Electrode redox reactions with polarizable molecules

    NASA Astrophysics Data System (ADS)

    Matyushov, Dmitry V.

    2018-04-01

    A theory of redox reactions involving electron transfer between a metal electrode and a polarizable molecule in solution is formulated. Both the existence of molecular polarizability and its ability to change due to electron transfer distinguish this problem from classical theories of interfacial electrochemistry. When the polarizability is different between the oxidized and reduced states, the statistics of thermal fluctuations driving the reactant over the activation barrier becomes non-Gaussian. The problem of electron transfer is formulated as crossing of two non-parabolic free energy surfaces. An analytical solution for these free energy surfaces is provided and the activation barrier of electrode electron transfer is given in terms of two reorganization energies corresponding to the oxidized and reduced states of the molecule in solution. The new non-Gaussian theory is, therefore, based on two theory parameters in contrast to one-parameter Marcus formulation for electrode reactions. The theory, which is consistent with the Nernst equation, predicts asymmetry between the cathodic and anodic branches of the electrode current. They show different slopes at small electrode overpotentials and become curved at larger overpotentials. However, the curvature of the Tafel plot is reduced compared to the Marcus-Hush model and approaches the empirical Butler-Volmer form with different transfer coefficients for the anodic and cathodic currents.
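    For contrast with the two-parameter non-Gaussian theory summarized above, the sketch below evaluates the classic one-parameter Marcus(-Hush-Chidsey) electrode rate whose symmetric Tafel behaviour the paper generalizes; the reorganization energy, temperature, and integration cutoff are illustrative assumptions.

        import numpy as np
        from scipy.integrate import quad

        kT = 0.025   # thermal energy at room temperature [eV]
        lam = 0.8    # single Marcus reorganization energy [eV] (assumed)

        def mhc_rate(eta, anodic):
            # Marcus-Hush-Chidsey: Marcus activation factor integrated over
            # the electrode's Fermi sea; x = (eps - eps_F)/kT.
            s = -1.0 if anodic else 1.0
            def integrand(x):
                fermi = 1.0 / (1.0 + np.exp(x))
                act = np.exp(-(lam + s * eta - x * kT) ** 2 / (4.0 * lam * kT))
                return fermi * act
            val, _ = quad(integrand, -80.0, 80.0)
            return val

        # Within one-parameter Marcus theory the two branches are mirror
        # images; the paper's two-parameter theory makes them asymmetric.
        for eta in (0.05, 0.1, 0.2, 0.4):
            print(eta, mhc_rate(eta, anodic=True), mhc_rate(eta, anodic=False))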

  6. "Fathers" and "sons" of theories in cell physiology: the membrane theory.

    PubMed

    Matveev, V V; Wheatley, D N

    2005-12-16

    The last 50 years in the history of the life sciences are remarkable for a new and important feature that looks like a great threat to their future. The profound specialization that dominates quickly developing fields of science is causing a crisis of the scientific method. The essence of the method is the unity of two elements: the experimental data and the theory that explains them. Classically, the "fathers" of science were the creators of new ideas and theories. They were the true experts on their own theories. Only they had the right to say: "I am the theory". In other words, they were the carriers of theories, of theoretical knowledge. The fathers provided the necessary logical integrity to their theories, since theories in biology have yet to be based on strict mathematical proofs. The same is not true of the sons. As a result of massive specialization, modern experts operate in very confined spaces. They formulate particular rules far from the level of theory. The main theories of science are known to them only at the textbook level. Nowadays, nobody can say: "I am the theory". With whom, then, is it possible to discuss on a broader theoretical level today? How can a classical theory (for example, the membrane theory) be changed or even disproved under these conditions? How can the "sons", with their narrow education, catch sight of the membrane theory's defects? As a result, "global" theories have few critics and little control. Due to specialization, we have lost the ability to work at the experimental level of biology within the correct or appropriate theoretical context. The scientific method in its classic form is now being rapidly eroded. A good case can be made for the "Membrane Theory", to which we will largely refer throughout this article.

  7. A discussion of differences in preparation, performance and postreflections in participant observations within two grounded theory approaches.

    PubMed

    Berthelsen, Connie Bøttcher; Lindhardt, Tove; Frederiksen, Kirsten

    2017-06-01

    This paper presents a discussion of the differences in using participant observation as a data collection method, comparing the classic grounded theory methodology of Barney Glaser with the constructivist grounded theory methodology of Kathy Charmaz. Participant observations allow nursing researchers to experience activities and interactions directly in situ. However, participant observation as a data collection method can be carried out in many ways, depending on the chosen grounded theory methodology, and may produce different results. This discussion shows that the differences between using participant observations in classic and constructivist grounded theory can be considerable, and that grounded theory researchers should adhere to the method descriptions for performing participant observations in the selected grounded theory methodology to enhance the quality of research. © 2016 Nordic College of Caring Science.

  8. On the quantum Landau collision operator and electron collisions in dense plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daligault, Jérôme, E-mail: daligaul@lanl.gov

    2016-03-15

    The quantum Landau collision operator, which extends the widely used Landau/Fokker-Planck collision operator to include quantum statistical effects, is discussed. The quantum extension can serve as a reference model for including electron collisions in non-equilibrium dense plasmas, in which the quantum nature of electrons cannot be neglected. In this paper, the properties of the Landau collision operator that have been useful in traditional plasma kinetic theory and plasma transport theory are extended to the quantum case. We outline basic properties in connection with the conservation laws, the H-theorem, and the global and local equilibrium distributions. We discuss the Fokker-Planck form of the operator in terms of three potentials that extend the usual two Rosenbluth potentials. We establish practical closed-form expressions for these potentials under local thermal equilibrium conditions in terms of Fermi-Dirac and Bose-Einstein integrals. We study the properties of the linearized quantum Landau operator, and extend two popular approximations used in plasma physics to include collisions in kinetic simulations. We apply the quantum Landau operator to the classic test-particle problem to illustrate the physical effects embodied in the quantum extension. We present useful closed-form expressions for the electron-ion momentum and energy transfer rates. Throughout the paper, similarities and differences between the quantum and classical Landau collision operators are emphasized.

  9. On the quantum Landau collision operator and electron collisions in dense plasmas

    NASA Astrophysics Data System (ADS)

    Daligault, Jérôme

    2016-03-01

    The quantum Landau collision operator, which extends the widely used Landau/Fokker-Planck collision operator to include quantum statistical effects, is discussed. The quantum extension can serve as a reference model for including electron collisions in non-equilibrium dense plasmas, in which the quantum nature of electrons cannot be neglected. In this paper, the properties of the Landau collision operator that have been useful in traditional plasma kinetic theory and plasma transport theory are extended to the quantum case. We outline basic properties in connection with the conservation laws, the H-theorem, and the global and local equilibrium distributions. We discuss the Fokker-Planck form of the operator in terms of three potentials that extend the usual two Rosenbluth potentials. We establish practical closed-form expressions for these potentials under local thermal equilibrium conditions in terms of Fermi-Dirac and Bose-Einstein integrals. We study the properties of linearized quantum Landau operator, and extend two popular approximations used in plasma physics to include collisions in kinetic simulations. We apply the quantum Landau operator to the classic test-particle problem to illustrate the physical effects embodied in the quantum extension. We present useful closed-form expressions for the electron-ion momentum and energy transfer rates. Throughout the paper, similarities and differences between the quantum and classical Landau collision operators are emphasized.
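    The closed-form expressions mentioned above involve Fermi-Dirac integrals; as a small self-contained illustration (not the paper's formulas), the standard integral F_j(eta) = (1/Gamma(j+1)) * int_0^inf x^j / (1 + exp(x - eta)) dx can be evaluated numerically:

        import math
        from scipy.integrate import quad

        def fermi_dirac(j, eta):
            # F_j(eta) = 1/Gamma(j+1) * int_0^inf x^j / (1 + exp(x - eta)) dx
            integrand = lambda x: x**j / (1.0 + math.exp(x - eta))
            val, _ = quad(integrand, 0.0, eta + 60.0)  # tail beyond eta+60 is negligible
            return val / math.gamma(j + 1)

        # Non-degenerate limit check: F_j(eta) -> exp(eta) as eta -> -infinity.
        print(fermi_dirac(0.5, -5.0), math.exp(-5.0))
        print(fermi_dirac(0.5, 10.0))  # strongly degenerate regime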

  10. Quantum-like Modeling of Cognition

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2015-09-01

    This paper begins with a historical review of the mutual influence of physics and psychology: from Freud's invention of psychic energy, inspired by Boltzmann's thermodynamics, to the enrichment quantum physics gained from psychology through the notion of complementarity (the invention of Niels Bohr, who was inspired by William James); we also consider the resonance of the correspondence between Wolfgang Pauli and Carl Jung in both physics and psychology. We then turn to the problem of developing mathematical models for the laws of thought, starting with Boolean logic and progressing towards the foundations of classical probability theory. Interestingly, the laws of classical logic and probability are routinely violated not only by quantum statistical phenomena but by cognitive phenomena as well. This is yet another common feature between quantum physics and psychology. In particular, cognitive data can exhibit a kind of probabilistic interference effect. This similarity with quantum physics convinced a multi-disciplinary group of scientists (physicists, psychologists, economists, sociologists) to apply the mathematical apparatus of quantum mechanics to the modeling of cognition. We illustrate this activity by considering a few concrete phenomena: the order and disjunction effects, recognition of ambiguous figures, and categorization-decision making. In Appendix 1 we briefly present the essentials of the theory of contextual probability and a method of representing contextual probabilities by complex probability amplitudes (solution of the "inverse Born's problem") based on a quantum-like representation algorithm (QLRA).
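    The probabilistic interference effect mentioned above has a compact form: in quantum-like models the law of total probability acquires an interference term, p(A) = sum_i p(C_i) p(A|C_i) + 2 sqrt(p(C1) p(A|C1) p(C2) p(A|C2)) cos(theta). A sketch comparing the classical and quantum-like predictions (all numbers invented):

        import math

        # Two contexts (e.g. answers to a first question) and conditionals.
        pC = (0.5, 0.5)            # context probabilities (illustrative)
        pA_given_C = (0.7, 0.3)    # probability of outcome A in each context

        classical = sum(pc * pa for pc, pa in zip(pC, pA_given_C))
        interference = 2 * math.sqrt(pC[0] * pA_given_C[0] * pC[1] * pA_given_C[1])

        # theta = pi/2 recovers the classical law of total probability.
        for theta in (math.pi / 2, math.pi / 3, 2 * math.pi / 3):
            p_quantum_like = classical + interference * math.cos(theta)
            print(f"theta = {theta:.2f}: p(A) = {p_quantum_like:.3f} (classical {classical:.3f})")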

  11. The Institution of Sociological Theory in Canada.

    PubMed

    Guzman, Cinthya; Silver, Daniel

    2018-02-01

    Using theory syllabi and departmental data collected for three academic years, this paper investigates the institutional practice of theory in sociology departments across Canada. In particular, it examines the position of theory within the sociological curriculum, and how this varies among universities. Taken together, our analyses indicate that theory remains deeply institutionalized at the core of sociological education and Canadian sociologists' self-understanding; that theorists as a whole show some coherence in how they define themselves, but differ in various ways, especially along lines of region, intellectual background, and gender; that despite these differences, the classical versus contemporary heuristic largely cuts across these divides, as does the strongly ingrained position of a small group of European authors as classics of the discipline as a whole. Nevertheless, who is a classic remains an unsettled question, alternatives to the "classical versus contemporary" heuristic do exist, and theorists' syllabi reveal diverse "others" as potential candidates. Our findings show that the field of sociology is neither marked by universal agreement nor by absolute division when it comes to its theoretical underpinnings. To the extent that they reveal a unified field, the findings suggest that unity lies more in a distinctive form than in a distinctive content, which defines the space and structure of the field of sociology. © 2018 Canadian Sociological Association/La Société canadienne de sociologie.

  12. Predicting Rotator Cuff Tears Using Data Mining and Bayesian Likelihood Ratios

    PubMed Central

    Lu, Hsueh-Yi; Huang, Chen-Yuan; Su, Chwen-Tzeng; Lin, Chen-Chiang

    2014-01-01

    Objectives Rotator cuff tear is a common cause of shoulder diseases. Correct diagnosis of rotator cuff tears can save patients from further invasive, costly and painful tests. This study used predictive data mining and Bayesian theory to improve the accuracy of diagnosing rotator cuff tears by clinical examination alone. Methods In this retrospective study, 169 patients who had a preliminary diagnosis of rotator cuff tear on the basis of clinical evaluation followed by confirmatory MRI between 2007 and 2011 were identified. MRI was used as a reference standard to classify rotator cuff tears. The predictor variable was the clinical assessment results, which consisted of 16 attributes. This study employed two data mining methods (ANN and the decision tree) and a statistical method (logistic regression) to classify the rotator cuff diagnosis into "tear" and "no tear" groups. Likelihood ratios and Bayesian theory were applied to estimate the probability of rotator cuff tears based on the results of the prediction models. Results Our proposed data mining procedures outperformed the classic statistical method. The correction rate, sensitivity, specificity and area under the ROC curve for predicting a rotator cuff tear were statistically better in the ANN and decision tree models than in logistic regression. Based on likelihood ratios derived from our prediction models, Fagan's nomogram could be constructed to assess the probability that a patient has a rotator cuff tear using a pretest probability and a prediction result (tear or no tear). Conclusions Our predictive data mining models, combined with likelihood ratios and Bayesian theory, appear to be good tools to classify rotator cuff tears as well as to determine the probability of the presence of the disease, enhancing diagnostic decision making for rotator cuff tears. PMID:24733553
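    The Bayesian step reported above, Fagan's nomogram, is just odds arithmetic: post-test odds = pre-test odds × likelihood ratio. A sketch with invented numbers (the study's fitted likelihood ratios are not reproduced here):

        def post_test_probability(pretest_p, likelihood_ratio):
            # Fagan's nomogram in code: convert to odds, multiply, convert back.
            pre_odds = pretest_p / (1.0 - pretest_p)
            post_odds = pre_odds * likelihood_ratio
            return post_odds / (1.0 + post_odds)

        # Illustrative numbers only: a positive prediction with LR+ = 6.0
        # and a negative prediction with LR- = 0.2, at 40% pretest probability.
        print(post_test_probability(0.40, 6.0))   # model predicts "tear"
        print(post_test_probability(0.40, 0.2))   # model predicts "no tear"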

  13. Crystal Melting and Wall Crossing Phenomena

    NASA Astrophysics Data System (ADS)

    Yamazaki, Masahito

    This paper summarizes recent developments in the theory of Bogomol'nyi-Prasad-Sommerfield (BPS) state counting and of wall crossing phenomena, emphasizing in particular the role of the statistical mechanical model of crystal melting. The paper is divided into two closely related parts. In the first part, we discuss the statistical mechanical model of crystal melting that counts BPS states. Each of the BPS states contributing to the BPS index is in one-to-one correspondence with a configuration of a molten crystal, and the statistical partition function of the melting crystal gives the BPS partition function. We also show that the smooth geometry of the Calabi-Yau manifold emerges in the thermodynamic limit of the crystal. This suggests a remarkable interpretation: an atom in the crystal is a discretization of the classical geometry, giving an important clue to the geometry at the Planck scale. In the second part, we discuss the wall crossing phenomena: the BPS index depends on the value of the moduli of the Calabi-Yau manifold, and jumps along real codimension-one subspaces of the moduli space. We show that by using type IIA/M-theory duality we can provide a simple and intuitive derivation of the wall crossing phenomena, furthermore clarifying the connection with topological string theory. This derivation is consistent with another derivation based on the wall crossing formula, motivated by multicentered BPS extremal black holes. We also explain the representation of the wall crossing phenomena in terms of crystal melting, and the generalization of the counting problem and of wall crossing to the open BPS invariants.

  14. On the effective field theory of intersecting D3-branes

    NASA Astrophysics Data System (ADS)

    Abbaspur, Reza

    2018-05-01

    We study the effective field theory of two intersecting D3-branes with one common dimension along the lines recently proposed in ref. [1]. We introduce a systematic way of deriving the classical effective action to arbitrary orders in perturbation theory. Using a proper renormalization prescription to handle logarithmic divergencies arising at all orders in the perturbation series, we recover the first order renormalization group equation of ref. [1] plus an infinite set of higher order equations. We show the consistency of the higher order equations with the first order one and hence interpret the first order result as an exact RG flow equation in the classical theory.

  15. Ginzburg-Landau theory for the solid-liquid interface of bcc elements. II - Application to the classical one-component plasma, the Wigner crystal, and He-4

    NASA Technical Reports Server (NTRS)

    Zeng, X. C.; Stroud, D.

    1989-01-01

    The previously developed Ginzburg-Landau theory for calculating the crystal-melt interfacial tension of bcc elements is applied to treat the classical one-component plasma (OCP), the charged fermion system, and the Bose crystal. For the OCP, a direct application of the theory of Shih et al. (1987) yields a surface tension of 0.0012 Z²e²/a³, where Ze is the ionic charge and a is the radius of the ionic sphere. The Bose crystal-melt interface is treated by a quantum extension of the classical density-functional theory, using the Feynman formalism to estimate the relevant correlation functions. The theory is applied to the metastable He-4 solid-superfluid interface at T = 0, with a resulting surface tension of 0.085 erg/sq cm, in reasonable agreement with the value extrapolated from the measured surface tension of the bcc solid in the range 1.46-1.76 K. These results suggest that the density-functional approach is a satisfactory mean-field theory for estimating the equilibrium properties of liquid-solid interfaces, given knowledge of the uniform phases.

  16. Antigravity Acts on Photons

    NASA Astrophysics Data System (ADS)

    Brynjolfsson, Ari

    2002-04-01

    Einstein's general theory of relativity assumes that photons do not change frequency as they move from the Sun to the Earth. This assumption is correct in classical physics. All experiments proving general relativity are in the domain of classical physics. These include the tests by Pound et al. of the gravitational redshift of 14.4 keV photons; the rocket experiments by Vessot et al.; the Galileo solar redshift experiments by Krisher et al.; the gravitational deflection of light experiments by Riveros and Vucetich; and the delay of echoes of radar signals passing close to the Sun as observed by Shapiro et al. Bohr's correspondence principle assures that the quantum mechanical theory of general relativity agrees with Einstein's classical theory when the frequency and the gravitational field gradient approach zero, or when photons cannot interact with the gravitational field. When we treat photons as quantum mechanical particles, we find that the gravitational force on photons is reversed (antigravity). This modified theory contradicts the equivalence principle, but is consistent with all experiments. Solar lines and distant stars are redshifted in accordance with the author's plasma redshift theory. These changes result in a beautiful, consistent cosmology.

  17. Social cycling and conditional responses in the Rock-Paper-Scissors game

    PubMed Central

    Wang, Zhijian; Xu, Bin; Zhou, Hai-Jun

    2014-01-01

    How humans make decisions in non-cooperative strategic interactions is a big question. For the fundamental Rock-Paper-Scissors (RPS) model game system, classic Nash equilibrium (NE) theory predicts that players randomize completely their action choices to avoid being exploited, while evolutionary game theory of bounded rationality in general predicts persistent cyclic motions, especially in finite populations. However as empirical studies have been relatively sparse, it is still a controversial issue as to which theoretical framework is more appropriate to describe decision-making of human subjects. Here we observe population-level persistent cyclic motions in a laboratory experiment of the discrete-time iterated RPS game under the traditional random pairwise-matching protocol. This collective behavior contradicts with the NE theory but is quantitatively explained, without any adjustable parameter, by a microscopic model of win-lose-tie conditional response. Theoretical calculations suggest that if all players adopt the same optimized conditional response strategy, their accumulated payoff will be much higher than the reference value of the NE mixed strategy. Our work demonstrates the feasibility of understanding human competition behaviors from the angle of non-equilibrium statistical physics. PMID:25060115
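    Here is a minimal sketch of the win-lose-tie conditional-response mechanism described above, as a population simulation with random pairwise matching; the population size, number of rounds, and conditional-response probabilities are invented, not the values fitted to the experiment.

        import numpy as np

        rng = np.random.default_rng(1)
        N, T = 6, 20000                   # even population size, rounds (assumed)
        actions = rng.integers(0, 3, N)   # 0 = Rock, 1 = Paper, 2 = Scissors

        # P(repeat), P(shift +1), P(shift +2 mod 3), given the last outcome.
        cond = {"win": (0.5, 0.3, 0.2), "tie": (0.3, 0.4, 0.3), "lose": (0.2, 0.3, 0.5)}
        outcome_name = {0: "tie", 1: "win", 2: "lose"}

        freq = np.zeros(3)
        for _ in range(T):
            pairs = rng.permutation(N).reshape(-1, 2)
            new = actions.copy()
            for i, j in pairs:
                d = (actions[i] - actions[j]) % 3   # 1: i wins, 2: i loses, 0: tie
                for k, out in ((i, d), (j, (-d) % 3)):
                    shift = rng.choice(3, p=cond[outcome_name[out]])
                    new[k] = (actions[k] + shift) % 3
            actions = new
            freq += np.bincount(actions, minlength=3)

        print("long-run action frequencies:", freq / freq.sum())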

  18. Improving Measurement in Health Education and Health Behavior Research Using Item Response Modeling: Comparison with the Classical Test Theory Approach

    ERIC Educational Resources Information Center

    Wilson, Mark; Allen, Diane D.; Li, Jun Corser

    2006-01-01

    This paper compares the approach and resultant outcomes of item response models (IRMs) and classical test theory (CTT). First, it reviews basic ideas of CTT, and compares them to the ideas about using IRMs introduced in an earlier paper. It then applies a comparison scheme based on the AERA/APA/NCME "Standards for Educational and…

  19. The Reliability and Precision of Total Scores and IRT Estimates as a Function of Polytomous IRT Parameters and Latent Trait Distribution

    ERIC Educational Resources Information Center

    Culpepper, Steven Andrew

    2013-01-01

    A classic topic in the fields of psychometrics and measurement has been the impact of the number of scale categories on test score reliability. This study builds on previous research by further articulating the relationship between item response theory (IRT) and classical test theory (CTT). Equations are presented for comparing the reliability and…

  20. The Effects of Academic and Interpersonal Stress on Dating Violence among College Students: A Test of Classical Strain Theory

    ERIC Educational Resources Information Center

    Mason, Brandon; Smithey, Martha

    2012-01-01

    This study examines Merton's Classical Strain Theory (1938) as a causative factor in intimate partner violence among college students. We theorize that college students experience general life strain and cumulative strain as they pursue the goal of a college degree. We test this strain on the likelihood of using intimate partner violence. Strain…

  1. A Classical Test Theory Analysis of the Light and Spectroscopy Concept Inventory National Study Data Set

    ERIC Educational Resources Information Center

    Schlingman, Wayne M.; Prather, Edward E.; Wallace, Colin S.; Brissenden, Gina; Rudolph, Alexander L.

    2012-01-01

    This paper is the first in a series of investigations into the data from the recent national study using the Light and Spectroscopy Concept Inventory (LSCI). In this paper, we use classical test theory to form a framework of results that will be used to evaluate individual item difficulties, item discriminations, and the overall reliability of the…

  2. Classical closure theory and Lam's interpretation of epsilon-RNG

    NASA Technical Reports Server (NTRS)

    Zhou, YE

    1995-01-01

    Lam's phenomenological epsilon-renormalization group (RNG) model is quite different from the other members of that group. It does not make use of the correspondence principle and the epsilon-expansion procedure. We demonstrate that Lam's epsilon-RNG model is essentially the physical space version of the classical closure theory in spectral space and consider the corresponding treatment of the eddy viscosity and energy backscatter.

  3. New variables for classical and quantum gravity

    NASA Technical Reports Server (NTRS)

    Ashtekar, Abhay

    1986-01-01

    A Hamiltonian formulation of general relativity based on certain spinorial variables is introduced. These variables simplify the constraints of general relativity considerably and enable one to imbed the constraint surface in the phase space of Einstein's theory into that of Yang-Mills theory. The imbedding suggests new ways of attacking a number of problems in both classical and quantum gravity. Some illustrative applications are discussed.

  4. An Analysis of Cross Racial Identity Scale Scores Using Classical Test Theory and Rasch Item Response Models

    ERIC Educational Resources Information Center

    Sussman, Joshua; Beaujean, A. Alexander; Worrell, Frank C.; Watson, Stevie

    2013-01-01

    Item response models (IRMs) were used to analyze Cross Racial Identity Scale (CRIS) scores. Rasch analysis scores were compared with classical test theory (CTT) scores. The partial credit model demonstrated a high goodness of fit and correlations between Rasch and CTT scores ranged from 0.91 to 0.99. CRIS scores are supported by both methods.…

  5. Conveying the Complex: Updating U.S. Joint Systems Analysis Doctrine with Complexity Theory

    DTIC Science & Technology

    2013-12-10


  6. Quantum kinetic theory of the filamentation instability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bret, A.; Haas, F.

    2011-07-15

    The quantum electromagnetic dielectric tensor for a multi-species plasma is re-derived from the gauge-invariant Wigner-Maxwell system and presented under a form very similar to the classical one. The resulting expression is then applied to a quantum kinetic theory of the electromagnetic filamentation instability. Comparison is made with the quantum fluid theory including a Bohm pressure term and with the cold classical plasma result. A number of analytical expressions are derived for the cutoff wave vector, the largest growth rate, and the most unstable wave vector.

  7. Geometry, topology, and string theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Varadarajan, Uday

    A variety of scenarios are considered which shed light upon the uses and limitations of classical geometric and topological notions in string theory. The primary focus is on situations in which D-brane or string probes of a given classical space-time see the geometry quite differently than one might naively expect. In particular, situations in which extra dimensions, non-commutative geometries as well as other non-local structures emerge are explored in detail. Further, a preliminary exploration of such issues in Lorentzian space-times with non-trivial causal structures within string theory is initiated.

  8. Pauli structures arising from confined particles interacting via a statistical potential

    NASA Astrophysics Data System (ADS)

    Batle, Josep; Ciftja, Orion; Farouk, Ahmed; Alkhambashi, Majid; Abdalla, Soliman

    2017-09-01

    There have been suggestions that the Pauli exclusion principle alone can lead a non-interacting (free) system of identical fermions to form crystalline structures dubbed Pauli crystals. Single-shot imaging experiments for the case of ultra-cold systems of free spin-polarized fermionic atoms in a two-dimensional harmonic trap appear to show geometric arrangements that cannot be characterized as Wigner crystals. This work explores this idea and considers a well-known approach that enables one to treat a quantum system of free fermions as a system of classical particles interacting with a statistical interaction potential. The model under consideration, though classical in nature, incorporates the quantum statistics by endowing the classical particles with an effective interaction potential. The reasonable expectation is that possible Pauli crystal features seen in experiments may manifest in this model that captures the correct quantum statistics as a first order correction. We use the Monte Carlo simulated annealing method to obtain the most stable configurations of finite two-dimensional systems of confined particles that interact with an appropriate statistical repulsion potential. We consider both an isotropic harmonic and a hard-wall confinement potential. Despite minor differences, the most stable configurations observed in our model correspond to the reported Pauli crystals in single-shot imaging experiments of free spin-polarized fermions in a harmonic trap. The crystalline configurations observed appear to be different from the expected classical Wigner crystal structures that would emerge should the confined classical particles had interacted with a pair-wise Coulomb repulsion.
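    A rough sketch of the approach described above: Metropolis simulated annealing of a handful of 2D particles in a harmonic trap, interacting through an Uhlenbeck-Gropper-style statistical repulsion v(r)/kT = -ln(1 - exp(-r²/λ²)), a common form of the fermionic statistical potential. The particle number, cooling schedule, and units are illustrative assumptions, not the paper's settings.

        import numpy as np

        rng = np.random.default_rng(2)
        N = 6          # number of "fermions" (assumed)
        lam2 = 1.0     # squared thermal de Broglie length, sets the unit

        def energy(pos):
            # Harmonic confinement plus pairwise statistical repulsion,
            # everything measured in units of kT.
            e = 0.5 * (pos ** 2).sum()
            iu, ju = np.triu_indices(N, k=1)
            r2 = ((pos[iu] - pos[ju]) ** 2).sum(axis=1)
            return e - np.log(1.0 - np.exp(-r2 / lam2) + 1e-300).sum()

        pos = rng.normal(scale=2.0, size=(N, 2))
        T = 2.0
        for step in range(60000):
            T = max(1e-3, T * 0.9999)     # slow exponential cooling
            i = rng.integers(N)
            trial = pos.copy()
            trial[i] += rng.normal(scale=0.1, size=2)
            dE = energy(trial) - energy(pos)
            if dE < 0 or rng.random() < np.exp(-dE / T):
                pos = trial
        print("annealed configuration:\n", np.round(pos, 3))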

  9. The Jaynes-Cummings model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shore, B.W.; Knight, P.L.

    The Jaynes-Cummings Model (JCM), a soluble fully quantum mechanical model of an atom in a field, was first used (in 1963) to examine the classical aspects of spontaneous emission and to reveal the existence of Rabi oscillations in the atomic excitation probability for fields with sharply defined energy (or photon number). For fields having a statistical distribution of photon numbers the oscillations collapse to an expected steady value. In 1980 it was discovered that with appropriate initial conditions (e.g. a near-classical field), the Rabi oscillations would eventually revive, only to collapse and revive repeatedly in a complicated pattern. The existence of these revivals, present in the analytic solutions of the JCM, provided direct evidence for the discreteness of field excitation (photons) and hence for the truly quantum nature of radiation. Subsequent study revealed further nonclassical properties of the JCM field, such as a tendency of the photons to antibunch. Within the last two years it has been found that during the quiescent intervals of collapsed Rabi oscillations the atom and field exist in a macroscopic superposition state (a Schroedinger cat). This discovery offers the opportunity to use the JCM to elucidate the basic properties of quantum correlation (entanglement) and to explore still further the relationship between classical and quantum physics. In tribute to E. T. Jaynes, who first recognized the importance of the JCM for clarifying the differences and similarities between quantum and classical physics, we here present an overview of the theory of the JCM and some of the many remarkable discoveries about it.
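    The collapse-and-revival behaviour described above can be reproduced in a few lines: on resonance, an initially excited atom coupled to a coherent field has excited-state probability P_e(t) = 1/2 + (1/2) Σ_n p_n cos(2g t √(n+1)), with Poissonian photon weights p_n. The coupling and mean photon number below are arbitrary illustrative choices.

        import numpy as np
        from scipy.special import gammaln

        g, nbar, nmax = 1.0, 25.0, 200   # coupling, mean photon number (assumed)
        n = np.arange(nmax)
        p = np.exp(n * np.log(nbar) - nbar - gammaln(n + 1))  # Poisson weights

        t = np.linspace(0.0, 40.0, 4000)  # time in units of 1/g
        Pe = 0.5 + 0.5 * (p[:, None] * np.cos(2.0 * g * np.sqrt(n + 1.0)[:, None] * t)).sum(axis=0)

        # Collapse: oscillations damp to 1/2; revival near t ~ 2*pi*sqrt(nbar)/g.
        print("mean P_e in collapse window (5 < gt < 10):", Pe[(t > 5) & (t < 10)].mean())
        print("max P_e in revival window (25 < gt < 35):", Pe[(t > 25) & (t < 35)].max())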

  10. Semiclassical theory of electronically nonadiabatic transitions in molecular collision processes

    NASA Technical Reports Server (NTRS)

    Lam, K. S.; George, T. F.

    1979-01-01

    An introductory account of the semiclassical theory of the S-matrix for molecular collision processes is presented, with special emphasis on electronically nonadiabatic transitions. This theory is based on combining classical mechanics with quantum superposition, and in practice makes use of the analytic continuation of classical mechanics into the complex space and time domains. The relevant concepts of molecular scattering theory and related dynamical models are described, and the formalism is developed and illustrated with simple examples involving collinear collisions of the A+BC type. The theory is then extended to include the effects of laser-induced nonadiabatic transitions. Two bound-continuum processes, collisional ionization and collision-induced emission, which are also amenable to the same general semiclassical treatment, are discussed.

  11. Thermostatistically approaching living systems: Boltzmann Gibbs or nonextensive statistical mechanics?

    NASA Astrophysics Data System (ADS)

    Tsallis, Constantino

    2006-03-01

    Boltzmann-Gibbs (BG) statistical mechanics has, for well over a century, been successfully used for many nonlinear dynamical systems which, in one way or another, exhibit strong chaos. A typical case is a classical many-body short-range-interacting Hamiltonian system (e.g., the Lennard-Jones model for a real gas at moderately high temperature). Its Lyapunov spectrum (which characterizes the sensitivity to initial conditions) includes positive values. This leads to ergodicity, the stationary state being thermal equilibrium, hence standard applicability of the BG theory is verified. The situation appears to be of a different nature for various phenomena occurring in living organisms. Indeed, such systems exhibit a complexity which does not really accommodate this standard dynamical behavior. Life appears to emerge and evolve in a kind of delicate situation, at the frontier between large order (low adaptability and long memory; typically characterized by regular dynamics, hence only nonpositive Lyapunov exponents) and large disorder (high adaptability and short memory; typically characterized by strong chaos, hence at least one positive Lyapunov exponent). Along this frontier, the maximal relevant Lyapunov exponents are either zero or close to it, characterizing what is currently referred to as weak chaos. This type of situation is shared by a great variety of similar complex phenomena in economics and linguistics, to cite but a few. BG statistical mechanics is built upon the entropy S_BG = -k Σ_i p_i ln p_i. A generalization of this form, S_q = k (1 - Σ_i p_i^q)/(q - 1) (with S_1 = S_BG), was proposed in 1988 as a basis for formulating what is nowadays called nonextensive statistical mechanics. This theory appears to be particularly adapted to nonlinear dynamical systems exhibiting, precisely, weak chaos. Here, we briefly review the theory, its dynamical foundation, its applications in a variety of disciplines (with special emphasis on living systems), and its connections with the ubiquitous scale-free networks.
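    A quick numerical rendering of the generalized entropy quoted above, checking that it recovers the BG form in the q → 1 limit (the probability vector is arbitrary):

        import numpy as np

        def tsallis_entropy(p, q, k=1.0):
            # S_q = k (1 - sum_i p_i^q) / (q - 1); tends to -k sum_i p_i ln p_i as q -> 1.
            p = np.asarray(p, dtype=float)
            if abs(q - 1.0) < 1e-12:
                return -k * np.sum(p * np.log(p))
            return k * (1.0 - np.sum(p ** q)) / (q - 1.0)

        p = np.array([0.5, 0.25, 0.125, 0.125])
        for q in (0.5, 0.999, 1.0, 1.001, 2.0):
            print(q, tsallis_entropy(p, q))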

  12. Classical theory of atomic collisions - The first hundred years

    NASA Astrophysics Data System (ADS)

    Grujić, Petar V.

    2012-05-01

    Classical calculations of atomic processes started in 1911 with Rutherford's famous evaluation of the differential cross section for α particles scattered on foil atoms [1]. The success of these calculations was soon overshadowed by the rise of quantum mechanics in 1925 and its triumphant success in describing processes at the atomic and subatomic levels. It was generally recognized that the classical approach should be inadequate, and it was neglected until 1953, when the famous paper by Gregory Wannier appeared, in which the threshold law for the behaviour of the single ionization cross section under electron impact was derived. All later calculations and experimental studies confirmed the law derived by purely classical theory. The next step was taken by Ian Percival and collaborators in the 1960s, who developed a general classical three-body computer code, which was used by many researchers in evaluating various atomic processes such as ionization, excitation, detachment, dissociation, etc. Another approach was pursued by Michal Gryzinski from Warsaw, who started a far-reaching programme for treating atomic particles and processes as purely classical objects [2]. Though often criticized for overestimating the domain of the classical theory, his group's results were able to match many experimental data. The Belgrade group pursued the classical approach using both analytical and numerical calculations, studying a number of atomic collisions, in particular near-threshold processes. The Riga group, led by Modris Gailitis [3], contributed considerably to the field, as did Valentin Ostrovsky and coworkers from Saint Petersburg, who developed powerful analytical methods within purely classical mechanics [4]. We give an overview of these approaches and show some of the remarkable results, which were subsequently confirmed by semiclassical and quantum mechanical calculations, as well as by experimental evidence. Finally, we discuss the theoretical and epistemological background of the classical calculations and explain why they turned out so successful, despite the essentially quantum nature of atomic and subatomic systems.

  13. Gravitational tides in the outer planets. I - Implications of classical tidal theory. II - Interior calculations and estimation of the tidal dissipation factor

    NASA Technical Reports Server (NTRS)

    Ioannou, Petros J.; Lindzen, Richard S.

    1993-01-01

    Classical tidal theory is applied to the atmospheres of the outer planets. The tidal geopotential due to the satellites of the outer planets is discussed, and the solution of Laplace's tidal equation for the Hough modes appropriate to tides on the outer planets is examined. The vertical structure of the tidal modes is described, noting that only relatively high-order meridional modes can propagate vertically with growing amplitude. Expected magnitudes for tides in the visible atmosphere of Jupiter are discussed. The classical theory is extended to planetary interiors, taking the effects of sphericity and self-gravity into account. The thermodynamic structure of Jupiter is described and the WKB theory of the vertical structure equation is presented. The regions in which inertial, gravity, and acoustic oscillations are possible are delineated. The case of a planet with a neutral interior is treated, discussing the various atmospheric boundary conditions and showing that the tidal response is small.

  14. Physics of automated driving in framework of three-phase traffic theory.

    PubMed

    Kerner, Boris S

    2018-04-01

    We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.

  15. Physics of automated driving in framework of three-phase traffic theory

    NASA Astrophysics Data System (ADS)

    Kerner, Boris S.

    2018-04-01

    We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.
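    The classical model approach referred to above, a controller regulating toward a fixed time headway, can be sketched as a constant-time-gap platoon simulation; the gains, time gap, and braking profile below are invented for illustration, and the three-phase (variable-headway) controller itself is not reproduced here.

        import numpy as np

        dt, Tgap = 0.1, 1.5            # time step [s], desired time gap [s] (assumed)
        k1, k2 = 0.3, 0.6              # ACC feedback gains (assumed)
        n, steps = 8, 1500             # platoon size, simulation steps

        x = -np.arange(n) * 30.0       # initial positions [m], 30 m spacing
        v = np.full(n, 20.0)           # initial speeds [m/s]

        for s in range(steps):
            t = s * dt
            a = np.zeros(n)
            a[0] = -2.0 if 10 < t < 13 else 0.0   # leader brakes briefly
            gap = x[:-1] - x[1:]                  # spacing between point vehicles
            # Classical ACC law: regulate gap toward v*Tgap, match leader speed.
            a[1:] = k1 * (gap - v[1:] * Tgap) + k2 * (v[:-1] - v[1:])
            v = np.maximum(0.0, v + a * dt)
            x = x + v * dt

        print("final speeds:", np.round(v, 2))
        print("final gaps:", np.round(x[:-1] - x[1:], 1))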

  16. The sociobiology of genes: the gene's eye view as a unifying behavioural-ecological framework for biological evolution.

    PubMed

    De Tiège, Alexis; Van de Peer, Yves; Braeckman, Johan; Tanghe, Koen B

    2017-11-22

    Although classical evolutionary theory, i.e., population genetics and the Modern Synthesis, was already implicitly 'gene-centred', the organism was, in practice, still generally regarded as the individual unit of which a population is composed. The gene-centred approach to evolution only reached a logical conclusion with the advent of the gene-selectionist or gene's eye view in the 1960s and 1970s. Whereas classical evolutionary theory can only work with (genotypically represented) fitness differences between individual organisms, gene-selectionism is capable of working with fitness differences among genes within the same organism and genome. Here, we explore the explanatory potential of 'intra-organismic' and 'intra-genomic' gene-selectionism, i.e., of a behavioural-ecological 'gene's eye view' on genetic, genomic and organismal evolution. First, we give a general outline of the framework and how it complements the still, to some extent, 'organism-centred' approach of classical evolutionary theory. Secondly, we give a more in-depth assessment of its explanatory potential for biological evolution, i.e., for Darwin's 'common descent with modification' or, more specifically, for 'historical continuity or homology with modular evolutionary change' as it has been studied by evolutionary developmental biology (evo-devo) during the last few decades. In contrast with classical evolutionary theory, evo-devo focuses on 'within-organism' developmental processes. Given the capacity of gene-selectionism to adopt an intra-organismal gene's eye view, we outline the relevance of the latter model for evo-devo.

  17. Nonlinear effects in evolution - an ab initio study: A model in which the classical theory of evolution occurs as a special case.

    PubMed

    Clerc, Daryl G

    2016-07-21

    An ab initio approach was used to study the molecular-level interactions that connect gene mutation to changes in an organism's phenotype. The study provides new insights into the evolutionary process and presents a simplification whereby changes in phenotypic properties may be studied in terms of the binding affinities of the chemical interactions affected by mutation, rather than by correlation to the genes. The study also reports the role that nonlinear effects play in the progression of organs, and how those effects relate to the classical theory of evolution. Results indicate that the classical theory of evolution occurs as a special case within the ab initio model - a case having two attributes. The first attribute: proteins and promoter regions are not shared among organs. The second attribute: continuous limiting behavior exists in the physical properties of organs as well as in the binding affinity of the associated chemical interactions, with respect to displacements in the chemical properties of proteins and promoter regions induced by mutation. Outside of the special case, second-order coupling contributions are significant and nonlinear effects play an important role, a result corroborated by analyses of published activity levels in binding and transactivation assays. Further, gradations in the state of perfection of an organ may be small or large depending on the type of mutation, and not necessarily closely separated as maintained by the classical theory. Results also indicate that organs progress with varying degrees of interdependence, the likelihood of successful mutation decreases with increasing complexity of the affected chemical system, and differences between the ab initio model and the classical theory increase with increasing complexity of the organism. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.

  18. The Basics: What's Essential about Theory for Community Development Practice?

    ERIC Educational Resources Information Center

    Hustedde, Ronald J.; Ganowicz, Jacek

    2002-01-01

    Relates three classical theories (structural functionalism, conflict theory, symbolic interactionism) to fundamental concerns of community development (structure, power, and shared meaning). Links these theories to Giddens' structuration theory, which connects macro and micro structures and community influence on change through cultural norms.…

  19. On classical and quantum dynamics of tachyon-like fields and their cosmological implications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dimitrijević, Dragoljub D., E-mail: ddrag@pmf.ni.ac.rs; Djordjević, Goran S., E-mail: ddrag@pmf.ni.ac.rs; Milošević, Milan, E-mail: ddrag@pmf.ni.ac.rs

    2014-11-24

    We consider a class of tachyon-like potentials, motivated by string theory, D-brane dynamics and inflation theory, in the context of classical and quantum mechanics. A formalism for describing the dynamics of tachyon fields in the spatially homogeneous, one-dimensional case, in both the classical and the quantum mechanical limit, is proposed. A few models with concrete potentials are considered. Additionally, possibilities for p-adic and adelic generalization of these models are discussed. Classical actions and the corresponding quantum propagators, in the Feynman path-integral approach, are calculated in a form invariant under a change of the background number fields, i.e., on both archimedean and nonarchimedean spaces. Looking for a quantum origin of inflation, the relevance of p-adic and adelic generalizations is briefly discussed.

  20. The Classical Theory of Light Colors: a Paradigm for Description of Particle Interactions

    NASA Astrophysics Data System (ADS)

    Mazilu, Nicolae; Agop, Maricel; Gatu, Irina; Iacob, Dan Dezideriu; Butuc, Irina; Ghizdovat, Vlad

    2016-06-01

    Color is an interaction property: it arises from the interaction of light with matter. Classically speaking, it is therefore akin to the forces. But while forces engendered the mechanical view of the world, colors generated the optical view. One of the modern concepts of interaction between the fundamental particles of matter - quantum chromodynamics - aims to fill the gap between mechanics and optics in a specific description of strong interactions. We show here that this modern description of particle interactions has ties with both the classical and the quantum theory of light, regardless of the connection between forces and colors. In a word, light is a universal model in the description of matter. The description involves classical Yang-Mills fields related to color.

  1. Pressure broadening of the electric dipole and Raman lines of CO2 by argon: Stringent test of the classical impact theory at different temperatures on a benchmark system

    NASA Astrophysics Data System (ADS)

    Ivanov, Sergey V.; Buzykin, Oleg G.

    2016-12-01

    A classical approach is applied to calculate pressure-broadening coefficients of CO2 vibration-rotational spectral lines perturbed by Ar. Three types of spectra are examined: electric dipole (infrared) absorption and isotropic and anisotropic Raman Q branches. Simple and explicit formulae of the classical impact theory are used along with exact 3D Hamilton equations for the CO2-Ar molecular motion. The calculations utilize the most accurate vibrationally independent ab initio potential energy surface (PES) of Hutson et al., expanded in a Legendre polynomial series up to lmax = 24. A new, improved algorithm for classical rotational frequency selection is applied. The dependences of CO2 half-widths on rotational quantum number J up to J = 100 are computed for temperatures between 77 and 765 K and compared with available experimental data as well as with the results of fully quantum dynamical calculations performed on the same PES. To make the picture complete, the predictions of two independent variants of the semi-classical Robert-Bonamy formalism for dipole absorption lines are included. This method, however, demonstrates poor accuracy at almost all temperatures. On the contrary, the classical broadening coefficients are in excellent agreement with both measurements and quantum results at all temperatures. The classical impact theory in its present variant is capable of quickly and accurately producing the pressure-broadening coefficients of spectral lines of linear molecules for any J value (including high J) using a full-dimensional ab initio-based PES, in cases where other computational methods are either extremely time consuming (like the quantum close-coupling method) or give erroneous results (like semi-classical methods).

  2. The polymer physics of single DNA confined in nanochannels.

    PubMed

    Dai, Liang; Renner, C Benjamin; Doyle, Patrick S

    2016-06-01

    In recent years, applications and experimental studies of DNA in nanochannels have stimulated investigation of the polymer physics of DNA in confinement. Recent advances in the physics of confined polymers, using DNA as a model polymer, have moved beyond the classic Odijk theory for strong confinement and the classic blob theory for weak confinement. In this review, we present the current understanding of the behaviors of confined polymers while briefly reviewing the classic theories. Three aspects of confined DNA are presented: static, dynamic, and topological properties. The relevant simulation methods are also summarized. In addition, comparisons of confined DNA with DNA under tension and DNA in semidilute solution are made to emphasize universal behaviors. Finally, an outlook on possible future research for confined DNA is given. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Classical and non-classical effective medium theories: New perspectives

    NASA Astrophysics Data System (ADS)

    Tsukerman, Igor

    2017-05-01

    Future research in the electrodynamics of periodic electromagnetic composites (metamaterials) can be expected to produce sophisticated homogenization theories valid for any composition and size of the lattice cell. The paper outlines a promising path in that direction, leading to non-asymptotic and nonlocal homogenization models, and highlights aspects of homogenization that are often overlooked: the finite size of the sample and the role of interface boundaries. Classical theories (e.g. Clausius-Mossotti, Maxwell Garnett), while originally derived from a very different set of ideas, fit well into the proposed framework. Nonlocal effects can be included in the model, making order-of-magnitude accuracy improvements possible. One future challenge is to determine what effective parameters can or cannot be obtained for a given set of constituents of a metamaterial lattice cell, thereby delineating the possible from the impossible in metamaterial design.
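    Since the abstract cites Maxwell Garnett mixing as a classical baseline, a minimal sketch of that rule may help fix ideas; the permittivities and volume fraction below are illustrative placeholders, not values from the paper.

```python
# Minimal sketch of the classical Maxwell Garnett mixing rule for the
# effective permittivity of spherical inclusions (eps_i, volume fraction f)
# embedded in a host medium (eps_m). Illustrative only; the paper's
# non-asymptotic, nonlocal models go beyond this classical formula.

def maxwell_garnett(eps_m: complex, eps_i: complex, f: float) -> complex:
    """Effective permittivity for dilute spherical inclusions."""
    num = eps_i + 2 * eps_m + 2 * f * (eps_i - eps_m)
    den = eps_i + 2 * eps_m - f * (eps_i - eps_m)
    return eps_m * num / den

# Example: 10% metallic-like inclusions (eps_i = -10+1j) in glass (eps_m = 2.25)
print(maxwell_garnett(2.25, -10 + 1j, 0.10))
```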

  4. Horizon Entropy from Quantum Gravity Condensates.

    PubMed

    Oriti, Daniele; Pranzetti, Daniele; Sindoni, Lorenzo

    2016-05-27

    We construct condensate states encoding the continuum spherically symmetric quantum geometry of a horizon in full quantum gravity, i.e., without any classical symmetry reduction, in the group field theory formalism. Tracing over the bulk degrees of freedom, we show how the resulting reduced density matrix manifestly exhibits a holographic behavior. We derive a complete orthonormal basis of eigenstates for the reduced density matrix of the horizon and use it to compute the horizon entanglement entropy. By imposing consistency with the horizon boundary conditions and semiclassical thermodynamical properties, we recover the Bekenstein-Hawking entropy formula for any value of the Immirzi parameter. Our analysis supports the equivalence between the von Neumann (entanglement) entropy interpretation and the Boltzmann (statistical) one.

  5. Study of optimum methods of optical communication

    NASA Technical Reports Server (NTRS)

    Harger, R. O.

    1972-01-01

    Optimum methods of optical communication that account for the effects of the turbulent atmosphere and of quantum mechanics, treated both by the semi-classical method and by the full quantum-theoretical model, are described. A concerted effort is discussed: applying the techniques of communication theory to the novel problems of optical communication through a careful study of realistic models and their statistical descriptions, finding appropriate optimum structures, calculating their performance and, insofar as possible, comparing them to conventional and other suboptimal systems. In this unified way, the bounds on performance and the structure of optimum communication systems for the transmission of information, imaging, tracking, and estimation can be determined for optical channels.

  6. On determining absolute entropy without quantum theory or the third law of thermodynamics

    NASA Astrophysics Data System (ADS)

    Steane, Andrew M.

    2016-04-01

    We employ classical thermodynamics to gain information about absolute entropy, without recourse to statistical methods, quantum mechanics or the third law of thermodynamics. The Gibbs-Duhem equation yields various simple methods to determine the absolute entropy of a fluid. We also study the entropy of an ideal gas and the ionization of a plasma in thermal equilibrium. A single measurement of the degree of ionization can be used to determine an unknown constant in the entropy equation, and thus determine the absolute entropy of a gas. It follows from all these examples that the value of entropy at absolute zero temperature does not need to be assigned by postulate, but can be deduced empirically.

  7. An analysis of a large dataset on immigrant integration in Spain. The Statistical Mechanics perspective on Social Action

    NASA Astrophysics Data System (ADS)

    Barra, Adriano; Contucci, Pierluigi; Sandell, Rickard; Vernia, Cecilia

    2014-02-01

    How does immigrant integration in a country change with immigration density? Guided by a statistical mechanics perspective, we propose a novel approach to this problem. The analysis focuses on classical integration quantifiers such as the percentage of jobs (temporary and permanent) given to immigrants, mixed marriages, and newborns with parents of mixed origin. We find that the average values of different quantifiers may exhibit either linear or non-linear growth with immigrant density, and we suggest that social action, a concept identified by Max Weber, causes the observed non-linearity. Using the statistical mechanics notion of interaction to quantitatively emulate social action, a unified mathematical model for integration is proposed and shown to explain both observed growth behaviors. A linear theory that ignores the possibility of interaction effects would instead underestimate the quantifiers by up to 30% when immigrant densities are low, and overestimate them by as much when densities are high. The capacity to quantitatively isolate different types of integration mechanisms makes our framework a suitable tool in the quest for more efficient integration policies.

  8. Application of a renormalization-group treatment to the statistical associating fluid theory for potentials of variable range (SAFT-VR).

    PubMed

    Forte, Esther; Llovell, Felix; Vega, Lourdes F; Trusler, J P Martin; Galindo, Amparo

    2011-04-21

    An accurate prediction of phase behavior at conditions far from and close to criticality cannot be accomplished by mean-field-based theories that do not incorporate long-range density fluctuations. A treatment based on renormalization-group (RG) theory as developed by White and co-workers has proven very successful in improving the predictions of the critical region with different equations of state. The basis of the method is an iterative procedure that accounts for contributions to the free energy of density fluctuations of increasing wavelengths. The RG method has been combined with a number of versions of the statistical associating fluid theory (SAFT) by implementing White's earliest ideas with the improvements of Prausnitz and co-workers. Typically, this treatment involves two adjustable parameters: a cutoff wavelength L for density fluctuations and an average gradient of the wavelet function Φ. In this work, the SAFT-VR (variable range) equation of state is extended with a similar crossover treatment which, however, follows closely the most recent improvements introduced by White. The interpretation of White's later developments allows us to establish a straightforward method by which Φ can be evaluated, so that only the cutoff wavelength L needs to be adjusted. The approach begins with an initial free energy incorporating only contributions from short-wavelength fluctuations, which are treated locally. The contribution from long-wavelength fluctuations is incorporated through an iterative procedure based on attractive interactions which incorporate the structure of the fluid, following the ideas of perturbation theories and using a mapping that allows integration of the radial distribution function. Good agreement both close to and far from the critical region is obtained using a single fitted parameter L that can be easily related to the range of the potential. In this way the thermodynamic properties of a square-well (SW) fluid are given by the same number of independent intermolecular model parameters as in the classical equation. Far from the critical region the approach provides the correct limiting behavior, reducing to the classical equation (SAFT-VR). In the critical region the β critical exponent is calculated and found to take values close to the universal value. In SAFT-VR the free energy of an associating chain fluid is obtained, following the thermodynamic perturbation theory of Wertheim, from knowledge of the free energy and radial distribution function of a reference monomer fluid. By determining L for SW fluids of varying well width, a unique equation of state is obtained for chain and associating systems without further adjustment of critical parameters. We use computer simulation data for the phase behavior of chain and associating SW fluids to test the accuracy of the new equation.

  9. Molecular dynamics studies of the thermal decomposition of 2,3-diazabicyclo(2.2.1)hept-2-ene

    NASA Astrophysics Data System (ADS)

    Sorescu, Dan C.; Thompson, Donald L.; Raff, Lionel M.

    1995-05-01

    The reaction dynamics of the thermal gas-phase decomposition of 2,3-diazabicyclo(2.2.1)hept-2-ene-exo,exo-5,6-d2 have been investigated using classical trajectory methods on a semiempirical potential-energy surface. The global potential is written as a superposition of different reaction-channel potentials containing bond stretching, bending and torsional terms, connected by parametrized switching functions. Reaction channels for stepwise and concerted cleavage of the two C-N bonds of the reactant have both been considered in construction of the potential. The geometries of 2,3-diazabicyclo(2.2.1)hept-2-ene, the diazenyl biradical and the transition state corresponding to breaking of the remaining C-N bond of the diazenyl biradical have been determined at the second-order Møller-Plesset perturbation theory (MP2/6-31G*) and Hartree-Fock (HF/6-31G*) levels, respectively. The bond dissociation energies have been estimated using the available thermochemical data and previously reported results for bicyclo(2.1.0)pentane [J. Chem. Phys. 101, 3729 (1994)]. The equilibrium geometries predicted by the semiempirical potential for reactants and products, the barrier height for thermal nitrogen extrusion from 2,3-diazabicyclo(2.2.1)hept-2-ene, and the fundamental vibrational frequencies are in good to excellent agreement with the measured or ab initio calculated values. Using a projection of the instantaneous Cartesian velocities onto the normal-mode vectors and classical trajectory calculations, the dissociation dynamics of 2,3-diazabicyclo(2.2.1)hept-2-ene-exo,exo-5,6-d2 are investigated at several excitation energies in the range 60-175 kcal/mol. The results show the following: (1) The thermal reaction takes place with a preference for inversion of configuration in the reaction products, the exo-labeled bicyclo(2.1.0)pentane being the major product. The exo/endo ratio of bicyclo(2.1.0)pentane isomers is found to vary between 1.8 and 2.2 over the energy range considered. (2) For random energization of the vibrational modes, the energy dependence of the rate coefficients can be described by an RRK expression. (3) The significant broadening and overlapping of the power spectral bands, together with the disappearance of characteristic features in the power spectra of the internal coordinates calculated at different energies, indicate high intramolecular vibrational redistribution rates and globally statistical behavior. (4) The energy partitioning among products shows that the internal energy is preferentially distributed into the vibrational degrees of freedom of BCP, while N2 is formed with small amounts of rotational and vibrational energy. Overall, the distribution of energy among the product degrees of freedom follows statistical predictions over the internal energy range investigated. (5) Stepwise dissociation of the C-N bonds is the predominant mechanism of N2 elimination from the parent molecule. (6) Although statistical theories of reaction rates, such as Rice-Ramsperger-Kassel-Marcus (RRKM) theory, are unable to predict the product exo/endo ratio, this is not a result of the breakdown of the statistical assumption inherent in these theories, but rather of the fact that statistical theory does not address mechanistic questions related to post-transition-state events. Although the results show that there is a near-microcanonical distribution of energy in the 1,3-cyclopentanediyl radical, the system does not have sufficient time to explore all of the energetically accessible configuration space prior to closure of the 1-3 bridgehead bond. The result is a nonstatistical exo/endo product ratio that deviates from the statistically expected result of unity.
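    Point (2) above invokes the classical RRK form for the energy dependence of the rate coefficient, k(E) = ν(1 - E0/E)^(s-1). A minimal sketch with placeholder values of the frequency factor ν, threshold E0 and oscillator count s (not the paper's fitted parameters):

```python
import numpy as np

# Classical RRK microcanonical rate k(E) = nu * (1 - E0/E)**(s - 1).
# nu (1/s), E0 (kcal/mol) and s below are illustrative placeholders.

def rrk_rate(E, nu=1.0e13, E0=45.0, s=33):
    """Rate coefficient (1/s) at total vibrational energy E (kcal/mol)."""
    E = np.asarray(E, dtype=float)
    return np.where(E > E0, nu * (1.0 - E0 / E) ** (s - 1), 0.0)

# Energies spanning the excitation range studied (60-175 kcal/mol)
print(rrk_rate([60.0, 100.0, 175.0]))
```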

  10. Accuracy of a Classical Test Theory-Based Procedure for Estimating the Reliability of a Multistage Test. Research Report. ETS RR-17-02

    ERIC Educational Resources Information Center

    Kim, Sooyeon; Livingston, Samuel A.

    2017-01-01

    The purpose of this simulation study was to assess the accuracy of a classical test theory (CTT)-based procedure for estimating the alternate-forms reliability of scores on a multistage test (MST) having 3 stages. We generated item difficulty and discrimination parameters for 10 parallel, nonoverlapping forms of the complete 3-stage test and…
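    The CTT logic behind an alternate-forms reliability estimate can be sketched in a few lines: two strictly parallel forms measure the same true score with independent, equal-variance errors, and their correlation estimates the reliability. The simulation below is a toy illustration under those assumptions, not the report's MST design.

```python
import numpy as np

# Toy CTT illustration: the correlation of two parallel forms estimates
# reliability = var(T) / (var(T) + var(E)). All values are placeholders.

rng = np.random.default_rng(0)
n_examinees = 5000
true_score = rng.normal(0.0, 1.0, n_examinees)
err_sd = 0.5                      # common error SD for both parallel forms

form_a = true_score + rng.normal(0.0, err_sd, n_examinees)
form_b = true_score + rng.normal(0.0, err_sd, n_examinees)

observed = np.corrcoef(form_a, form_b)[0, 1]
theoretical = 1.0 / (1.0 + err_sd**2)
print(f"observed r = {observed:.3f}, CTT reliability = {theoretical:.3f}")
```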

  11. Linear and angular coherence momenta in the classical second-order coherence theory of vector electromagnetic fields.

    PubMed

    Wang, Wei; Takeda, Mitsuo

    2006-09-01

    A new concept of vector and tensor densities is introduced into the general coherence theory of vector electromagnetic fields that is based on energy and energy-flow coherence tensors. Related coherence conservation laws are presented in the form of continuity equations that provide new insights into the propagation of second-order correlation tensors associated with stationary random classical electromagnetic fields.

  12. Application of ply level analysis to flexural wave propagation

    NASA Astrophysics Data System (ADS)

    Valisetty, R. R.; Rehfield, L. W.

    1988-10-01

    A brief survey is presented of the shear deformation theories of laminated plates. It indicates that there are certain non-classical influences that affect bending-related behavior in the same way as the transverse shear stresses do. They include bending- and stretching-related section warping, the concomitant non-classical surface-parallel stress contributions, and the transverse normal stress. A bending theory gives significantly improved performance if these non-classical effects are incorporated. The heterogeneous shear deformations that are characteristic of laminates with highly dissimilar materials, however, require that attention be paid to the modeling of local rotations. In this paper, it is shown that a ply-level analysis can be used to model such disparate shear deformations. Here, the equilibrium of each layer is analyzed separately. Earlier applications of this analysis include free-edge laminate stresses. It is now extended to the study of flexural wave propagation in laminates. A recently developed homogeneous plate theory is used as the ply-level model. Due consideration is given to the non-classical influences, and no shear correction factors are introduced extraneously in this theory. The results for the lowest flexural mode of travelling planar harmonic waves indicate that this approach is competitive and yields better results for certain laminates.

  13. The relationship between executive functions and fluid intelligence in Parkinson's disease

    PubMed Central

    Roca, M.; Manes, F.; Chade, A.; Gleichgerrcht, E.; Gershanik, O.; Arévalo, G. G.; Torralva, T.; Duncan, J.

    2012-01-01

    Background We recently demonstrated that decline in fluid intelligence is a substantial contributor to frontal deficits. For some classical ‘executive’ tasks, such as the Wisconsin Card Sorting Test (WCST) and Verbal Fluency, frontal deficits were entirely explained by fluid intelligence. However, on a second set of frontal tasks, deficits remained even after statistically controlling for this factor. These tasks included tests of theory of mind and multitasking. As frontal dysfunction is the most frequent cognitive deficit observed in early Parkinson's disease (PD), the present study aimed to determine the role of fluid intelligence in such deficits. Method We assessed patients with PD (n=32) and control subjects (n=22) with the aforementioned frontal tests and with a test of fluid intelligence. Group performance was compared and fluid intelligence was introduced as a covariate to determine its role in frontal deficits shown by PD patients. Results In line with our previous results, scores on the WCST and Verbal Fluency were closely linked to fluid intelligence. Significant patient–control differences were eliminated or at least substantially reduced once fluid intelligence was introduced as a covariate. However, for tasks of theory of mind and multitasking, deficits remained even after fluid intelligence was statistically controlled. Conclusions The present results suggest that clinical assessment of neuropsychological deficits in PD should include tests of fluid intelligence, together with one or more specific tasks that allow for the assessment of residual frontal deficits associated with theory of mind and multitasking. PMID:22440401
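    The covariate logic described above is a standard analysis of covariance: test the patient-control difference on a frontal task before and after fluid intelligence enters the model. The sketch below uses simulated data (group sizes follow the abstract; variable names and effect sizes are illustrative assumptions), constructed so that the group effect shrinks once the covariate is controlled, as reported for the WCST and Verbal Fluency.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated ANCOVA-style check: does a group difference on a frontal task
# survive after controlling for fluid intelligence? All numbers are toy values.

rng = np.random.default_rng(1)
group = np.repeat([1, 0], [32, 22])              # 1 = PD patient, 0 = control
fluid_iq = rng.normal(100 - 10 * group, 10)      # patients lower on average
frontal = 0.4 * fluid_iq + rng.normal(0, 2, 54)  # task driven by fluid intelligence

df = pd.DataFrame({"group": group, "fluid_iq": fluid_iq, "frontal": frontal})
print(smf.ols("frontal ~ group", df).fit().pvalues["group"])             # unadjusted
print(smf.ols("frontal ~ group + fluid_iq", df).fit().pvalues["group"])  # adjusted
```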

  14. Geometric Theory of Reduction of Nonlinear Control Systems

    NASA Astrophysics Data System (ADS)

    Elkin, V. I.

    2018-02-01

    The foundations of a differential geometric theory of nonlinear control systems are described on the basis of categorical concepts (isomorphism, factorization, restrictions) by analogy with classical mathematical theories (of linear spaces, groups, etc.).

  15. Comment on Gallistel: behavior theory and information theory: some parallels.

    PubMed

    Nevin, John A

    2012-05-01

    In this article, Gallistel proposes information theory as an approach to some enduring problems in the study of operant and classical conditioning. Copyright © 2012 Elsevier B.V. All rights reserved.

  16. Representational Realism, Closed Theories and the Quantum to Classical Limit

    NASA Astrophysics Data System (ADS)

    de Ronde, Christian

    In this chapter, we discuss the representational realist stance as a pluralist ontic approach to inter-theoretic relationships. Our stance stresses the fact that physical theories require the necessary consideration of a conceptual level of discourse which determines and configures the specific field of phenomena discussed by each particular theory. We criticize the orthodox line of research which has grounded the analysis of QM in two (Bohrian) metaphysical presuppositions - accepted in the present as dogmas that all interpretations must follow. We also examine how the orthodox project of "bridging the gap" between the quantum and the classical domains has constrained the possibilities of research, producing only a limited set of interpretational problems which focus exclusively on the justification of "classical reality" and exclude the possibility of analyzing non-classical conceptual representations of QM. The representational realist stance introduces two new problems, namely the superposition problem and the contextuality problem, which explicitly consider the conceptual representation of orthodox QM beyond the mere reference to mathematical structures and measurement outcomes. In the final part of the chapter, we revisit, from the representational realist perspective, the quantum-to-classical limit and the orthodox claim that this inter-theoretic relation can be explained through the principle of decoherence.

  17. [Discussion on six errors of formulas corresponding to syndromes in using the classic formulas].

    PubMed

    Bao, Yan-ju; Hua, Bao-jin

    2012-12-01

    The theory of formulas corresponding to syndromes is one of the characteristics of the Treatise on Cold Damage and Miscellaneous Diseases (Shanghan Zabing Lun) and one of the main principles in applying classic prescriptions. Following the principle of formulas corresponding to syndromes is important for achieving a therapeutic effect. However, some medical practitioners find that the actual clinical effect falls far short of expectations. Six errors in the use of classic prescriptions and of the theory of formulas corresponding to syndromes are the most important causes to consider: paying attention only to local syndromes while neglecting the whole; only to formulas corresponding to syndromes while neglecting the pathogenesis; only to syndromes while neglecting pulse diagnosis; only to the unilateral prescription while neglecting combined prescriptions; only to classic prescriptions while neglecting modern formulas; and only to the formulas while neglecting drug dosage. Therefore, the clinical application of classic prescriptions and of the theory of formulas corresponding to syndromes requires attention not only to the patient's clinical syndromes but also to the combination of the main syndrome and its pathogenesis. In addition, comprehensive syndrome differentiation, modern formulas, current prescriptions, combined prescriptions, and drug dosage all contribute to avoiding clinical errors and improving clinical effects.

  18. Nonclassical light revealed by the joint statistics of simultaneous measurements.

    PubMed

    Luis, Alfredo

    2016-04-15

    Nonclassicality cannot be a single-observable property, since the statistics of any single quantum observable is compatible with classical physics. We develop a general procedure to reveal nonclassical behavior of light states from the joint statistics arising in the practical measurement of multiple observables. Besides embracing previous approaches, this protocol can disclose nonclassical features in standard examples of classical-like behavior, such as SU(2) and Glauber coherent states. When combined with other criteria, this would imply that every light state is nonclassical.

  19. Influences on and Limitations of Classical Test Theory Reliability Estimates.

    ERIC Educational Resources Information Center

    Arnold, Margery E.

    It is incorrect to say "the test is reliable" because reliability is a function not only of the test itself, but of many factors. The present paper explains how different factors affect classical reliability estimates such as test-retest, interrater, internal consistency, and equivalent forms coefficients. Furthermore, the limits of classical test…
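    Internal consistency, one of the coefficients named above, is easy to illustrate. A minimal sketch of Cronbach's alpha on simulated item scores (dimensions and data are placeholders):

```python
import numpy as np

# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances)/var(total)).

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: 2-D array, rows = examinees, columns = items."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(2)
ability = rng.normal(size=(200, 1))                      # shared true score
items = ability + rng.normal(scale=0.8, size=(200, 10))  # 10 noisy items
print(f"alpha = {cronbach_alpha(items):.3f}")
```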

  20. A Comparison of Kinetic Energy and Momentum in Special Relativity and Classical Mechanics

    ERIC Educational Resources Information Center

    Riggs, Peter J.

    2016-01-01

    Kinetic energy and momentum are indispensable dynamical quantities in both the special theory of relativity and in classical mechanics. Although momentum and kinetic energy are central to understanding dynamics, the differences between their relativistic and classical notions have not always received adequate treatment in undergraduate teaching.…

  1. A Comparative Analysis of Three Unique Theories of Organizational Learning

    ERIC Educational Resources Information Center

    Leavitt, Carol C.

    2011-01-01

    The purpose of this paper is to present three classical theories on organizational learning and conduct a comparative analysis that highlights their strengths, similarities, and differences. Two of the theories -- experiential learning theory and adaptive-generative learning theory -- represent the thinking of the cognitive perspective, while…

  2. Classical evolution and quantum generation in generalized gravity theories including string corrections and tachyons: Unified analyses

    NASA Astrophysics Data System (ADS)

    Hwang, Jai-Chan; Noh, Hyerim

    2005-03-01

    We present cosmological perturbation theory based on generalized gravity theories including string theory correction terms and a tachyonic complication. The classical evolution as well as the quantum generation processes in these varieties of gravity theories are presented in unified forms. These apply both to the scalar- and tensor-type perturbations. Analyses are made based on the curvature variable in two different gauge conditions often used in the literature in Einstein’s gravity; these are the curvature variables in the comoving (or uniform-field) gauge and the zero-shear gauge. Applications to generalized slow-roll inflation and its consequent power spectra are derived in unified forms which include a wide range of inflationary scenarios based on Einstein’s gravity and others.

  3. Infinite derivative gravity: non-singular cosmology & blackhole solutions

    NASA Astrophysics Data System (ADS)

    Mazumdar, A.

    Both Einstein's theory of General Relativity and Newton's theory of gravity possess a short-distance and small-time-scale catastrophe. The black-hole singularity and the cosmological Big Bang singularity highlight that current theories of gravity are incomplete descriptions at early times and small distances. I will discuss how one can potentially resolve these fundamental problems at the classical and quantum levels. In particular, I will discuss infinite derivative theories of gravity, where gravitational interactions become weaker in the ultraviolet, thereby resolving some of the classical singularities, such as the Big Bang and the Schwarzschild singularity, for compact non-singular objects with mass up to 10^25 grams. In this lecture, I will discuss quantum aspects of infinite derivative gravity and a few aspects which can make the theory asymptotically free in the UV.

  4. (Never) Mind your p's and q's: Von Neumann versus Jordan on the foundations of quantum theory

    NASA Astrophysics Data System (ADS)

    Duncan, A.; Janssen, M.

    2013-03-01

    In 1927, in two papers entitled "On a new foundation [Neue Begründung] of quantum mechanics," Pascual Jordan presented his version of what came to be known as the Dirac-Jordan statistical transformation theory. Jordan and Paul Dirac arrived at essentially the same theory independently of one another at around the same time. Later in 1927, partly in response to Jordan and Dirac and avoiding the mathematical difficulties facing their approach, John von Neumann developed the modern Hilbert space formalism of quantum mechanics. We focus on Jordan and von Neumann. Central to the formalisms of both are expressions for conditional probabilities of finding some value for one quantity given the value of another. Beyond that Jordan and von Neumann had very different views about the appropriate formulation of problems in quantum mechanics. For Jordan, unable to let go of the analogy to classical mechanics, the solution of such problems required the identification of sets of canonically conjugate variables, i.e., p's and q's. For von Neumann, not constrained by the analogy to classical mechanics, it required only the identification of a maximal set of commuting operators with simultaneous eigenstates. He had no need for p's and q's. Jordan and von Neumann also stated the characteristic new rules for probabilities in quantum mechanics somewhat differently. Jordan and Dirac were the first to state those rules in full generality. Von Neumann rephrased them and, in a paper published a few months later, sought to derive them from more basic considerations. In this paper we reconstruct the central arguments of these 1927 papers by Jordan and von Neumann and of a paper on Jordan's approach by Hilbert, von Neumann, and Nordheim. We highlight those elements in these papers that bring out the gradual loosening of the ties between the new quantum formalism and classical mechanics. This paper was written as part of a joint project in the history of quantum physics of the Max Planck Institut für Wissenschaftsgeschichte and the Fritz-Haber-Institut in Berlin.

  5. Psychodrama: group psychotherapy through role playing.

    PubMed

    Kipper, D A

    1992-10-01

    The theory and the therapeutic procedure of classical psychodrama are described along with brief illustrations. Classical psychodrama and sociodrama stemmed from role theory, enactments, "tele," the reciprocity of choices, and the theory of spontaneity-robopathy and creativity. The discussion focuses on key concepts such as the therapeutic team, the structure of the session, transference and reality, countertransference, the here-and-now and the encounter, the group-as-a-whole, resistance and difficult clients, and affect and cognition. Also described are the neoclassical approaches of psychodrama, action methods, and clinical role playing, and the significance of the concept of behavioral simulation in group psychotherapy.

  6. Overview of Classical Test Theory and Item Response Theory for Quantitative Assessment of Items in Developing Patient-Reported Outcome Measures

    PubMed Central

    Cappelleri, Joseph C.; Lundy, J. Jason; Hays, Ron D.

    2014-01-01

    Introduction The U.S. Food and Drug Administration’s patient-reported outcome (PRO) guidance document defines content validity as “the extent to which the instrument measures the concept of interest” (FDA, 2009, p. 12). “Construct validity is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity” (Strauss & Smith, 2009, p. 7). Hence both qualitative and quantitative information are essential in evaluating the validity of measures. Methods We review classical test theory and item response theory approaches to evaluating PRO measures including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized “difficulty” (severity) order of items is represented by observed responses. Conclusion Classical test theory and item response theory can be useful in providing a quantitative assessment of items and scales during the content validity phase of patient-reported outcome measures. Depending on the particular type of measure and the specific circumstances, either one or both approaches should be considered to help maximize the content validity of PRO measures. PMID:24811753
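    Two of the CTT item checks listed above, the item-total relationship and floor/ceiling effects, can be sketched directly. The simulated 5-category responses below are illustrative, not data from any PRO instrument.

```python
import numpy as np

# Corrected item-total correlations and floor/ceiling percentages for a
# simulated 6-item scale with 5 response categories (items scored 0..4).

rng = np.random.default_rng(3)
latent = rng.normal(size=(300, 1))
noise = rng.normal(scale=0.9, size=(300, 6))
responses = np.clip(np.round(2 + latent + noise), 0, 4).astype(int)

total = responses.sum(axis=1)
for j in range(responses.shape[1]):
    rest = total - responses[:, j]          # "corrected": item excluded
    r = np.corrcoef(responses[:, j], rest)[0, 1]
    print(f"item {j}: item-rest r = {r:.2f}")

print(f"floor: {np.mean(total == 0):.1%}, ceiling: {np.mean(total == 24):.1%}")
```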

  7. Spinning particles, axion radiation, and the classical double copy

    NASA Astrophysics Data System (ADS)

    Goldberger, Walter D.; Li, Jingping; Prabhu, Siddharth G.

    2018-05-01

    We extend the perturbative double copy between radiating classical sources in gauge theory and gravity to the case of spinning particles. We construct, to linear order in spins, perturbative radiating solutions to the classical Yang-Mills equations sourced by a set of interacting color charges with chromomagnetic dipole spin couplings. Using a color-to-kinematics replacement rule proposed earlier by one of the authors, these solutions map onto radiation in a theory of interacting particles coupled to massless fields that include the graviton, a scalar (dilaton) φ and the Kalb-Ramond axion field B_{μν}. Consistency of the double copy imposes constraints on the parameters of the theory on both the gauge and gravity sides of the correspondence. In particular, the color charges carry a chromomagnetic interaction which, in d = 4, corresponds to a gyromagnetic ratio equal to Dirac's value g = 2. The color-to-kinematics map implies that on the gravity side, the bulk theory of the fields (φ, g_{μν}, B_{μν}) has interactions which match those of d-dimensional "string gravity," as is the case both in the BCJ double copy of pure gauge theory scattering amplitudes and the KLT relations between the tree-level S-matrix elements of open and closed string theory.

  8. Lamb wave extraction of dispersion curves in micro/nano-plates using couple stress theories

    NASA Astrophysics Data System (ADS)

    Ghodrati, Behnam; Yaghootian, Amin; Ghanbar Zadeh, Afshin; Mohammad-Sedighi, Hamid

    2018-01-01

    In this paper, Lamb wave propagation in homogeneous, isotropic non-classical micro/nano-plates is investigated. To consider the effect of material microstructure on wave propagation, three size-dependent models, namely the indeterminate, modified and consistent couple stress theories, are used to derive the dispersion equations. In these theories, a parameter called the 'characteristic length' captures the size of the material microstructure in the governing equations. To generalize the parametric studies and examine the effect of thickness, propagation wavelength, and characteristic length on the behavior of miniature plate structures, the governing equations are nondimensionalized by defining appropriate dimensionless parameters. The dispersion curves for phase and group velocities are then plotted over a wide frequency-thickness range to study Lamb wave propagation with microstructure effects at very high frequencies. The results show that the couple stress theories, for a Cosserat-type material, predict more rigidity than the classical theory: in a plate of constant thickness, as the thickness-to-characteristic-length ratio increases the results approach those of the classical theory, and as this ratio decreases the wave propagation speed in the plate increases significantly. In addition, it is demonstrated that for high-frequency Lamb waves the velocity converges to the dispersive Rayleigh wave velocity.

  9. Causality re-established.

    PubMed

    D'Ariano, Giacomo Mauro

    2018-07-13

    Causality has never gained the status of a 'law' or 'principle' in physics. Some recent literature has even popularized the false idea that causality is a notion that should be banned from theory. Such a misconception relies on an alleged universality of the reversibility of the laws of physics, based either on the determinism of classical theory or on the multiverse interpretation of quantum theory, in both cases motivated by mere interpretational requirements for realism of the theory. Here, I will show that a properly defined, unambiguous notion of causality is a theorem of quantum theory, which is also a falsifiable proposition of the theory. Such a notion of causality appeared in the literature within the framework of operational probabilistic theories. It is a genuinely theoretical notion, corresponding to establishing a definite partial order among events, in the same way as we do by using the future causal cone on Minkowski space. The notion of causality is logically completely independent of the misidentified concept of 'determinism', and, being a consequence of quantum theory, is ubiquitous in physics. In addition, as classical theory can be regarded as a restriction of quantum theory, causality holds also in the classical case, although the determinism of the theory trivializes it. I conclude by arguing that causality naturally establishes an arrow of time. This implies that the scenario of the 'block Universe' and the connected 'past hypothesis' are incompatible with causality, and thus with quantum theory: they are both doomed to remain mere interpretations and, as such, are not falsifiable, similar to the hypothesis of 'super-determinism'. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).

  10. Soliton Gases and Generalized Hydrodynamics

    NASA Astrophysics Data System (ADS)

    Doyon, Benjamin; Yoshimura, Takato; Caux, Jean-Sébastien

    2018-01-01

    We show that the equations of generalized hydrodynamics (GHD), a hydrodynamic theory for integrable quantum systems at the Euler scale, emerge in full generality in a family of classical gases, which generalize the gas of hard rods. In this family, the particles, upon colliding, jump forward or backward by a distance that depends on their velocities, reminiscent of classical soliton scattering. This provides a "molecular dynamics" for GHD: a numerical solver which is efficient, flexible, and which applies to the presence of external force fields. GHD also describes the hydrodynamics of classical soliton gases. We identify the GHD of any quantum model with that of the gas of its solitonlike wave packets, thus providing a remarkable quantum-classical equivalence. The theory is directly applicable, for instance, to integrable quantum chains and to the Lieb-Liniger model realized in cold-atom experiments.

  11. Topological and Orthomodular Modeling of Context in Behavioral Science

    NASA Astrophysics Data System (ADS)

    Narens, Louis

    2017-02-01

    Two non-Boolean methods are discussed for modeling context in behavioral data and theory. The first is based on intuitionistic logic, which is similar to classical logic except that not every event has a complement. Its probability theory is also similar to classical probability theory, except that the definition of a probability function needs to be generalized to unions of events instead of applying only to unions of disjoint events. The generalization is needed because intuitionistic event spaces may not contain enough disjoint events for the classical definition to be effective. The second method develops a version of quantum logic for its underlying probability theory. It differs in a variety of ways from the Hilbert space logic used in quantum mechanics as a foundation for quantum probability theory. John von Neumann and others have commented on the lack of a relative-frequency approach and a rational foundation for that probability theory. This article argues that its version of quantum probability theory does not have such issues. The method based on intuitionistic logic is useful for modeling cognitive interpretations that vary with context, for example, the mood of the decision maker, the context produced by the influence of other items in a choice experiment, etc. The method based on this article's quantum logic is useful for modeling probabilities across contexts, for example, how probabilities of events from different experiments are related.

  12. A classical density functional theory of ionic liquids.

    PubMed

    Forsman, Jan; Woodward, Clifford E; Trulsson, Martin

    2011-04-28

    We present a simple, classical density functional approach to the study of simple models of room temperature ionic liquids. Dispersion attractions as well as ion correlation effects and excluded volume packing are taken into account. The oligomeric structure, common to many ionic liquid molecules, is handled by a polymer density functional treatment. The theory is evaluated by comparisons with simulations, with an emphasis on the differential capacitance, an experimentally measurable quantity of significant practical interest.

  13. Generalized quantum theory of recollapsing homogeneous cosmologies

    NASA Astrophysics Data System (ADS)

    Craig, David; Hartle, James B.

    2004-06-01

    A sum-over-histories generalized quantum theory is developed for homogeneous minisuperspace type A Bianchi cosmological models, focusing on the particular example of the classically recollapsing Bianchi type-IX universe. The decoherence functional for such universes is exhibited. We show how the probabilities of decoherent sets of alternative, coarse-grained histories of these model universes can be calculated. We consider in particular the probabilities for classical evolution defined by a suitable coarse graining. For a restricted class of initial conditions and coarse grainings we exhibit the approximate decoherence of alternative histories in which the universe behaves classically and those in which it does not. For these situations we show that the probability is near unity for the universe to recontract classically if it expands classically. We also determine the relative probabilities of quasiclassical trajectories for initial states of WKB form, recovering for such states a precise form of the familiar heuristic "J·dΣ" rule of quantum cosmology, as well as a generalization of this rule to generic initial states.

  14. Ultrasensitivity and sharp threshold theorems for multisite systems

    NASA Astrophysics Data System (ADS)

    Dougoud, M.; Mazza, C.; Vinckenbosch, L.

    2017-02-01

    This work studies the ultrasensitivity of multisite binding processes in which ligand molecules can bind to several binding sites. It considers in particular recent models involving complex chemical reactions in allosteric phosphorylation processes and for transcription factors and nucleosomes competing for binding on DNA. New statistics-based formulas for the Hill coefficient and the effective Hill coefficient are provided, and necessary conditions for a system to be ultrasensitive are exhibited. It is first shown that the ultrasensitivity of binding processes can be approached using sharp-threshold theorems developed in applied probability theory and statistical mechanics for studying sharp threshold phenomena in reliability theory, random graph theory and percolation theory. Special classes of binding process are then introduced and described as density-dependent birth and death processes. New precise large-deviation results for the steady-state distribution of the process are obtained, which permit showing that switch-like ultrasensitive responses are strongly related to the multi-modality of the steady-state distribution. Ultrasensitivity occurs if and only if the entropy of the dynamical system has more than one global minimum for some critical ligand concentration. In this case, the Hill coefficient is proportional to the number of binding sites, and the system is highly ultrasensitive. The classical effective Hill coefficient I is extended to a new cooperativity index I_q, for which we recommend computing a broad range of values of q instead of just the standard I = I_{0.9} corresponding to the 10%-90% variation in the dose-response. It is shown that this single choice can sometimes mislead the conclusion by failing to detect ultrasensitivity. This new approach allows a better understanding of multisite ultrasensitive systems and provides new tools for the design of such systems.
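    For orientation, the standard index I_{0.9} mentioned above is computed from the ligand levels giving 10% and 90% of maximal response, I = log 81 / log(x_{90}/x_{10}); for an exact Hill curve it recovers the Hill exponent. A minimal sketch (the Hill response and constants are illustrative, not the article's binding models):

```python
import numpy as np
from scipy.optimize import brentq

# Effective Hill coefficient I = log(81) / log(x90 / x10), where x10 and
# x90 give 10% and 90% of maximal response. For a pure Hill function the
# index returns the Hill exponent n exactly.

def hill(x, n, K=1.0):
    return x**n / (K**n + x**n)

def effective_hill(n):
    x10 = brentq(lambda x: hill(x, n) - 0.1, 1e-9, 1e9)
    x90 = brentq(lambda x: hill(x, n) - 0.9, 1e-9, 1e9)
    return np.log(81.0) / np.log(x90 / x10)

for n in (1, 2, 4):
    print(f"n = {n}: I = {effective_hill(n):.3f}")
```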

  15. Balancing the books - a statistical theory of prospective budgets in Earth System science

    NASA Astrophysics Data System (ADS)

    O'Kane, J. Philip

    An honest declaration of the error in a mass, momentum or energy balance, ɛ, simply raises the question of its acceptability: "At what value of ɛ is the attempted balance to be rejected?" Answering this question requires a reference quantity against which to compare ɛ. This quantity must be a mathematical function of all the data used in making the balance. To deliver this function, a theory grounded in a workable definition of acceptability is essential. A distinction must be drawn between a retrospective balance and a prospective budget in relation to any natural space-filling body. Balances look to the past; budgets look to the future. The theory is built on the application of classical sampling theory to the measurement and closure of a prospective budget. It satisfies R.A. Fisher's "vital requirement that the actual and physical conduct of experiments should govern the statistical procedure of their interpretation". It provides a test, which rejects, or fails to reject, the hypothesis that the closing error on the budget, when realised, was due to sampling error only. By increasing the number of measurements, the discrimination of the test can be improved, controlling both the precision and accuracy of the budget and its components. The cost-effective design of such measurement campaigns is discussed briefly. This analysis may also show when campaigns to close a budget on a particular space-filling body are not worth the effort for either scientific or economic reasons. Other approaches, such as those based on stochastic processes, lack this finality, because they fail to distinguish between different types of error in the mismatch between a set of realisations of the process and the measured data.
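    The test sketched below illustrates the idea in its simplest form: under the null hypothesis that the budget closes, the realised closing error should be explicable by sampling error alone. It assumes independent, unbiased component estimates with known sampling variances; all numbers are illustrative, and the paper's campaign-based design goes further.

```python
import numpy as np
from scipy import stats

# Toy closure test for a water budget: eps = inflow - outflow - storage
# change. Reject closure if eps is too large relative to sampling error.

inflow, var_in = 105.0, 9.0          # component estimate, sampling variance
outflow, var_out = 88.0, 16.0
storage_change, var_st = 10.0, 4.0

eps = inflow - outflow - storage_change
se = np.sqrt(var_in + var_out + var_st)   # independent-errors assumption
z = eps / se
p = 2 * stats.norm.sf(abs(z))             # two-sided p-value
print(f"eps = {eps:.1f}, z = {z:.2f}, p = {p:.3f}")  # here: fail to reject
```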

  16. Applying the Longitudinal Model from Item Response Theory to Assess Health-Related Quality of Life in the PRODIGE 4/ACCORD 11 Randomized Trial.

    PubMed

    Barbieri, Antoine; Anota, Amélie; Conroy, Thierry; Gourgou-Bourgade, Sophie; Juzyna, Beata; Bonnetain, Franck; Lavergne, Christian; Bascoul-Mollevi, Caroline

    2016-07-01

    A new longitudinal statistical approach was compared to the classical methods currently used to analyze health-related quality-of-life (HRQoL) data. The comparison was made using data from patients with metastatic pancreatic cancer. Three hundred forty-two patients from the PRODIGE4/ACCORD 11 study were randomly assigned to FOLFIRINOX versus gemcitabine regimens. HRQoL was evaluated using the European Organization for Research and Treatment of Cancer (EORTC) QLQ-C30. The classical analysis uses a linear mixed model (LMM), considering an HRQoL score as a good representation of the true value of HRQoL, following EORTC recommendations. In contrast, built on item response theory (IRT), our approach considers HRQoL as a latent variable estimated directly from the raw data. For polytomous items, we extended the partial credit model to a longitudinal analysis (longitudinal partial credit model, LPCM), thereby modeling the latent trait as a function of time and other covariates. Both models gave the same conclusions on 11 of 15 HRQoL dimensions. HRQoL evolution was similar between the 2 treatment arms, except for the symptoms of pain. Indeed, under the LPCM, pain perception was significantly less important in the FOLFIRINOX arm than in the gemcitabine arm. For most of the scales, HRQoL changes over time, and no difference was found between treatments in terms of HRQoL. The use of an LMM to study the HRQoL score does not seem appropriate: it is an easy-to-use model, but its basic statistical assumptions are not satisfied. Our IRT model may be more complex, but it shows the same qualities, gives similar results, and has the additional advantage of being more precise and more suitable because it uses the raw data directly. © The Author(s) 2015.
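    The partial credit model underlying the longitudinal approach assigns, for each polytomous item, category probabilities driven by the latent HRQoL level. A minimal sketch of the PCM category probabilities (thresholds are illustrative, not estimates from the trial):

```python
import numpy as np

# Partial credit model: P(X = k | theta) is proportional to
# exp(sum_{j<=k} (theta - delta_j)), with an empty sum for k = 0.

def pcm_probs(theta: float, deltas: np.ndarray) -> np.ndarray:
    """Category probabilities P(X = k), k = 0..m, for step difficulties deltas."""
    cum = np.concatenate(([0.0], np.cumsum(theta - deltas)))
    expcum = np.exp(cum - cum.max())      # stabilised softmax
    return expcum / expcum.sum()

deltas = np.array([-1.0, 0.0, 1.5])       # 4 response categories
for theta in (-2.0, 0.0, 2.0):
    print(theta, np.round(pcm_probs(theta, deltas), 3))
```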

  17. Aspects of Geodesical Motion with Fisher-Rao Metric: Classical and Quantum

    NASA Astrophysics Data System (ADS)

    Ciaglia, Florio M.; Cosmo, Fabio Di; Felice, Domenico; Mancini, Stefano; Marmo, Giuseppe; Pérez-Pardo, Juan M.

    The purpose of this paper is to exploit the geometric structure of quantum mechanics and of statistical manifolds to study the qualitative effect that the quantum properties have in the statistical description of a system. We show that the end points of geodesics in the classical setting coincide with the probability distributions that minimise Shannon’s entropy, i.e. with distributions of zero dispersion. In the quantum setting this happens only for particular initial conditions, which in turn correspond to classical submanifolds. This result can be interpreted as a geometric manifestation of the uncertainty principle.

  18. Thermodynamics of H-bonding in alcohols and water. The mobile order theory as opposed to the classical multicomponent order theories

    NASA Astrophysics Data System (ADS)

    Huyskens, P.; Kapuku, F.; Colemonts-Vandevyvere, C.

    1990-09-01

    In liquids the partners of H bonds constantly change. As a consequence, the entities observed by IR spectroscopy are not the same as those considered for thermodynamic properties. For the latter, the H bonds are shared by all the molecules. The thermodynamic "monomeric fraction", γ, the time fraction during which an alcohol molecule is vaporizable, is the square root of the spectroscopic monomeric fraction, and is the fraction of molecules which, during a time interval of 10^-14 s, have their hydroxylic proton and their lone pairs free. The classical thermodynamic treatments of Mecke and Prigogine consider the spectroscopic entities as real thermodynamic entities. Opposed to this, the mobile order theory considers all the formal molecules as equal but with a reduction of the entropy due to the fact that during a fraction 1-γ of the time, the OH proton follows a neighbouring oxygen atom on its journey through the liquid. Mobile order theory and the classic multicomponent treatment lead, in binary mixtures of the associated substance A with the inert substance S, to expressions of the chemical potentials μ_A and μ_S that are fundamentally different. However, the differences become very important only when the molar volumes V̄_S and V̄_A differ by a factor larger than 2. As a consequence, the equations of the classic theory can still fit the experimental vapour pressure data of mixtures of liquid alcohols and liquid alkanes. However, the solubilities of solid alkanes in water, for which V̄_S > 3 V̄_A, are only correctly predicted by the mobile order theory.

  19. From Foucault to Freire through Facebook: Toward an Integrated Theory of mHealth

    ERIC Educational Resources Information Center

    Bull, Sheana; Ezeanochie, Nnamdi

    2016-01-01

    Objective: To document the integration of social science theory in literature on mHealth (mobile health) and consider opportunities for integration of classic theory, health communication theory, and social networking to generate a relevant theory for mHealth program design. Method: A secondary review of research syntheses and meta-analyses…

  20. Cognitive methodology for forecasting oil and gas industry using pattern-based neural information technologies

    NASA Astrophysics Data System (ADS)

    Gafurov, O.; Gafurov, D.; Syryamkin, V.

    2018-05-01

    The paper analyses a field of computer science formed at the intersection of such areas of natural science as artificial intelligence, mathematical statistics, and database theory, referred to as "Data Mining" (the discovery of knowledge in data). The theory of neural networks is applied along with classical methods of mathematical analysis and numerical simulation. The paper describes the technique protected by a patent of the Russian Federation for the invention "A Method for Determining Location of Production Wells during the Development of Hydrocarbon Fields" [1-3] and implemented using the geoinformation system NeuroInformGeo. There are no analogues in domestic or international practice. The paper gives an example comparing the forecast of oil reservoir quality made by a geophysicist interpreter using standard methods with the forecast made using this technology. The technical result demonstrates increased efficiency, effectiveness, and ecological compatibility in the development of mineral deposits, as well as the discovery of a new oil deposit.

  1. Evolutionary games on graphs

    NASA Astrophysics Data System (ADS)

    Szabó, György; Fáth, Gábor

    2007-07-01

    Game theory is one of the key paradigms behind many scientific disciplines from biology to behavioral sciences to economics. In its evolutionary form and especially when the interacting agents are linked in a specific social network the underlying solution concepts and methods are very similar to those applied in non-equilibrium statistical physics. This review gives a tutorial-type overview of the field for physicists. The first four sections introduce the necessary background in classical and evolutionary game theory from the basic definitions to the most important results. The fifth section surveys the topological complications implied by non-mean-field-type social network structures in general. The next three sections discuss in detail the dynamic behavior of three prominent classes of models: the Prisoner's Dilemma, the Rock-Scissors-Paper game, and Competing Associations. The major theme of the review is in what sense and how the graph structure of interactions can modify and enrich the picture of long term behavioral patterns emerging in evolutionary games.
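    As a concrete taste of the model classes reviewed, the sketch below runs the spatial Prisoner's Dilemma on a square lattice with synchronous imitation of the best-scoring neighbour, in the style of the Nowak-May payoffs (T = b, R = 1, P = S = 0). The temptation b and lattice size are placeholder choices, not parameters from the review.

```python
import numpy as np

# Spatial Prisoner's Dilemma with synchronous best-neighbour imitation.

rng = np.random.default_rng(4)
size, b, steps = 50, 1.65, 100
coop = rng.random((size, size)) < 0.5            # True = cooperator

shifts = [(0, 1), (0, -1), (1, 1), (1, -1)]      # (axis, shift): 4 neighbours

def payoffs(c):
    """Total payoff of each site against its four nearest neighbours."""
    p = np.zeros(c.shape)
    for ax, sh in shifts:
        n = np.roll(c, sh, axis=ax)
        p += np.where(c, n.astype(float),        # cooperator: 1 vs C, 0 vs D
                      b * n)                     # defector:   b vs C, 0 vs D
    return p

for _ in range(steps):
    p = payoffs(coop)
    best_strat, best_pay = coop.copy(), p.copy()
    for ax, sh in shifts:                        # imitate best-scoring neighbour
        n_pay, n_strat = np.roll(p, sh, axis=ax), np.roll(coop, sh, axis=ax)
        better = n_pay > best_pay
        best_pay = np.where(better, n_pay, best_pay)
        best_strat = np.where(better, n_strat, best_strat)
    coop = best_strat.astype(bool)

print(f"cooperator fraction after {steps} steps: {coop.mean():.2f}")
```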

  2. The foundation: Mechanism, prediction, and falsification in Bayesian enactivism. Comment on "Answering Schrödinger's question: A free-energy formulation" by Maxwell James Désormeau Ramstead et al.

    NASA Astrophysics Data System (ADS)

    Allen, Micah

    2018-03-01

    In Isaac Asimov's science fiction classic, Foundation, fictional mathematician Hari Seldon applies his theory of psychohistory, a synthesis of psychology, history, and statistical physics, to predict that humanity will suffer a dark age lasting thirty millennia [1]. Although Seldon's psychohistory successfully predicts the future of human society, its basis in the physical law of mass action carries a limitation - it can only do so for sufficiently massive populations (i.e., billions of individuals), rendering it inert at an individual level. This limitation is of course a key source of dramatic tension in the series, in which the individual characters of Asimov's universe grapple with the challenges inherent to applying a lawlike theory of collective action to the constitutive individuals. To avert crisis, Seldon ultimately assembles the namesake Foundation, an interdisciplinary, intergalactic research centre bringing together various biological, physical, and social scientists who ultimately attempt to alter the predicted course of history.

  3. Bell's Theorem, Many Worlds and Backwards-Time Physics: Not Just a Matter of Interpretation

    NASA Astrophysics Data System (ADS)

    Werbos, Paul J.

    2008-11-01

    The classic “Bell’s Theorem” of Clauser, Holt, Shimony and Horne tells us that we must give up at least one of: (1) objective reality (aka “hidden variables”); (2) locality; or (3) time-forwards macroscopic statistics (aka “causality”). The orthodox Copenhagen version of physics gives up the first. The many-worlds theory of Everett and Wheeler gives up the second. The backwards-time theory of physics (BTP) gives up the third. Contrary to conventional wisdom, empirical evidence strongly favors Everett-Wheeler over orthodox Copenhagen. BTP allows two major variations—a many-worlds version and a neoclassical version based on Partial Differential Equations (PDE), in the spirit of Einstein. Section 2 of this paper discusses the origins of quantum measurement according to BTP, focusing on the issue of how we represent condensed matter objects like polarizers in a model “Bell’s Theorem” experiment. The backwards time telegraph (BTT) is not ruled out in BTP, but is highly speculative for now, as will be discussed.

  4. Attractive versus repulsive interactions in the Bose-Einstein condensation dynamics of relativistic field theories

    NASA Astrophysics Data System (ADS)

    Berges, J.; Boguslavski, K.; Chatrchyan, A.; Jaeckel, J.

    2017-10-01

    We study the impact of attractive self-interactions on the nonequilibrium dynamics of relativistic quantum fields with large occupancies at low momenta. Our primary focus is on Bose-Einstein condensation and nonthermal fixed points in such systems. For a model system, we consider O (N ) -symmetric scalar field theories. We use classical-statistical real-time simulations as well as a systematic 1 /N expansion of the quantum (two-particle-irreducible) effective action to next-to-leading order. When the mean self-interactions are repulsive, condensation occurs as a consequence of a universal inverse particle cascade to the zero-momentum mode with self-similar scaling behavior. For attractive mean self-interactions, the inverse cascade is absent, and the particle annihilation rate is enhanced compared to the repulsive case, which counteracts the formation of coherent field configurations. For N ≥2 , the presence of a nonvanishing conserved charge can suppress number-changing processes and lead to the formation of stable localized charge clumps, i.e., Q balls.

  5. Electric field induced sheeting and breakup of dielectric liquid jets

    NASA Astrophysics Data System (ADS)

    Khoshnevis, Ahmad; Tsai, Scott S. H.; Esmaeilzadeh, Esmaeil

    2014-01-01

    We report experimental observations of the controlled deformation of a dielectric liquid jet subjected to a local high-voltage electrostatic field in the direction normal to the jet. The jet deforms to the shape of an elliptic cylinder upon application of a normal electrostatic field. As the applied electric field strength is increased, the elliptic cylindrical jet deforms permanently into a flat sheet, and eventually breaks up into droplets. We interpret this observation—the stretch of the jet is in the normal direction to the applied electric field—qualitatively using the Taylor-Melcher leaky dielectric theory, and develop a simple scaling model that predicts the critical electric field strength for the jet-to-sheet transition. Our model shows good agreement with experimental results, and has a form that is consistent with the classical drop deformation criterion in the Taylor-Melcher theory. Finally, we statistically analyze the resultant droplets from sheet breakup, and find that increasing the applied electric field strength improves droplet uniformity and reduces droplet size.

  6. Generalizability Theory and Classical Test Theory

    ERIC Educational Resources Information Center

    Brennan, Robert L.

    2011-01-01

    Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…

  7. Theories of the Alcoholic Personality.

    ERIC Educational Resources Information Center

    Cox, W. Miles

    Several theories of the alcoholic personality have been devised to determine the relationship between the clusters of personality characteristics of alcoholics and their abuse of alcohol. The oldest and probably best known theory is the dependency theory, formulated in the tradition of classical psychoanalysis, which associates the alcoholic's…

  8. The Giffen Effect: A Note on Economic Purposes.

    ERIC Educational Resources Information Center

    Williams, William D.

    1990-01-01

    Describes the Giffen effect: demand for a commodity increases as price increases. Explains how applying control theory eliminates the paradox that the Giffen effect presents to classic economics supply and demand theory. Notes the differences in how conventional demand theory and control theory treat consumer behavior. (CH)

  9. Personality Theories for the 21st Century

    ERIC Educational Resources Information Center

    McCrae, Robert R.

    2011-01-01

    Classic personality theories, although intriguing, are outdated. The five-factor model of personality traits reinvigorated personality research, and the resulting findings spurred a new generation of personality theories. These theories assign a central place to traits and acknowledge the crucial role of evolved biology in shaping human…

  10. Continuous Time in Consistent Histories

    NASA Astrophysics Data System (ADS)

    Savvidou, Konstantina

    1999-12-01

    We discuss the case of histories labelled by a continuous time parameter in the History Projection Operator consistent-histories quantum theory. We describe how the appropriate representation of the history algebra may be chosen by requiring the existence of projection operators that represent propositions about time averages of the energy. We define the action operator for the consistent histories formalism, as the quantum analogue of the classical action functional, for the simple harmonic oscillator case. We show that the action operator is the generator of two types of time transformations that may be related to the two laws of time-evolution of the standard quantum theory: the `state-vector reduction' and the unitary time-evolution. We construct the corresponding classical histories and demonstrate their relevance to the quantum histories; we demonstrate how the requirement of the temporal logic structure of the theory is sufficient for the definition of classical histories. Furthermore, we show the relation of the action operator to the decoherence functional which describes the dynamics of the system. Finally, the discussion is extended to give a preliminary account of quantum field theory in this approach to the consistent histories formalism.

  11. Effects of Extrinsic Mortality on the Evolution of Aging: A Stochastic Modeling Approach

    PubMed Central

    Shokhirev, Maxim Nikolaievich; Johnson, Adiv Adam

    2014-01-01

    The evolutionary theories of aging are useful for gaining insights into the complex mechanisms underlying senescence. Classical theories argue that high levels of extrinsic mortality should select for the evolution of shorter lifespans and earlier peak fertility. Non-classical theories, in contrast, posit that an increase in extrinsic mortality could select for the evolution of longer lifespans. Although numerous studies support the classical paradigm, recent data challenge classical predictions, finding that high extrinsic mortality can select for the evolution of longer lifespans. To further elucidate the role of extrinsic mortality in the evolution of aging, we implemented a stochastic, agent-based, computational model. We used a simulated annealing optimization approach to predict which model parameters predispose populations to evolve longer or shorter lifespans in response to increased levels of predation. We report that longer lifespans evolved in the presence of rising predation if the cost of mating is relatively high and if energy is available in excess. Conversely, we found that dramatically shorter lifespans evolved when mating costs were relatively low and food was relatively scarce. We also analyzed the effects of increased predation on various parameters related to density dependence and energy allocation. Longer and shorter lifespans were accompanied by increased and decreased investments of energy into somatic maintenance, respectively. Similarly, earlier and later maturation ages were accompanied by increased and decreased energetic investments into early fecundity, respectively. Higher predation significantly decreased the total population size, enlarged the shared resource pool, and redistributed energy reserves for mature individuals. These results both corroborate and refine classical predictions, demonstrating a population-level trade-off between longevity and fecundity and identifying conditions that produce both classical and non-classical lifespan effects. PMID:24466165

  12. Assessing the quantum physics impacts on future x-ray free-electron lasers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmitt, Mark J.; Anisimov, Petr Mikhaylovich

    A new quantum mechanical theory of x-ray free electron lasers (XFELs) has been successfully developed that has placed LANL at the forefront of the understanding of quantum effects in XFELs. Our quantum theory describes the interaction of relativistic electrons with x-ray radiation in the periodic magnetic field of an undulator using the same mathematical formalism as classical XFEL theory. This places classical and quantum treatments on the same footing and allows for a continuous transition from one regime to the other eliminating the disparate analytical approaches previously used. Moreover, Dr. Anisimov, the architect of this new theory, is now considered a resource in the international FEL community for assessing quantum effects in XFELs.

  13. The applications of Complexity Theory and Tsallis Non-extensive Statistics at Solar Plasma Dynamics

    NASA Astrophysics Data System (ADS)

    Pavlos, George

    2015-04-01

    As the solar plasma lives far from equilibrium it is an excellent laboratory for testing complexity theory and non-equilibrium statistical mechanics. In this study, we present the highlights of complexity theory and Tsallis non extensive statistical mechanics as concerns their applications to solar plasma dynamics, especially sunspot, solar flare and solar wind phenomena. When a physical system is driven far from equilibrium states, some novel characteristics can be observed related to the nonlinear character of dynamics. Generally, the nonlinearity in space plasma dynamics can generate intermittent turbulence with the typical characteristics of the anomalous diffusion process and strange topologies of stochastic space plasma fields (velocity and magnetic fields) caused by the strange dynamics and strange kinetics (Zaslavsky, 2002). In addition, according to Zelenyi and Milovanov (2004) the complex character of the space plasma system includes the existence of non-equilibrium (quasi-)stationary states (NESS) having the topology of a percolating fractal set. The stabilization of a system near the NESS is perceived as a transition into a turbulent state determined by self-organization processes. The long-range correlation effects manifest themselves as a strange non-Gaussian behavior of kinetic processes near the NESS plasma state. The complex character of space plasma can also be described by the non-extensive statistical thermodynamics pioneered by Tsallis, which offers a consistent and effective theoretical framework, based on a generalization of Boltzmann - Gibbs (BG) entropy, to describe far from equilibrium nonlinear complex dynamics (Tsallis, 2009). In a series of recent papers, the hypothesis of Tsallis non-extensive statistics in magnetosphere, sunspot dynamics, solar flares, solar wind and space plasma in general was tested and verified (Karakatsanis et al., 2013; Pavlos et al., 2014; 2015). Our study includes the analysis of solar plasma time series for three cases: sunspot index, solar flare and solar wind data. The non-linear analysis of the sunspot index is embedded in the non-extensive statistical theory of Tsallis (1988; 2004; 2009). The q-triplet of Tsallis, as well as the correlation dimension and the Lyapunov exponent spectrum, were estimated for the SVD components of the sunspot index time series. Also the multifractal scaling exponent spectrum f(a), the generalized Renyi dimension spectrum D(q) and the spectrum J(p) of the structure function exponents were estimated experimentally and theoretically by using the q-entropy principle included in Tsallis non-extensive statistical theory, following Arimitsu and Arimitsu (2000, 2001). 
Our analysis showed clearly the following: (a) a phase transition process in the solar dynamics from a high dimensional non-Gaussian SOC state to a low dimensional non-Gaussian chaotic state, (b) strong intermittent solar turbulence and an anomalous (multifractal) diffusion solar process, which is strengthened as the solar dynamics makes a phase transition to low dimensional chaos, in accordance with the studies of Ruzmaikin, Zelenyi and Milovanov (Zelenyi and Milovanov, 1991; Milovanov and Zelenyi, 1993; Ruzmaikin et al., 1996), (c) faithful agreement of Tsallis non-equilibrium statistical theory with the experimental estimations of: (i) the non-Gaussian probability distribution function P(x), (ii) the multifractal scaling exponent spectrum f(a) and generalized Renyi dimension spectrum Dq, (iii) the exponent spectrum J(p) of the structure functions estimated for the sunspot index and its underlying non-equilibrium solar dynamics. Also, the q-triplet of Tsallis as well as the correlation dimension and the Lyapunov exponent spectrum were estimated for the singular value decomposition (SVD) components of the solar flares time series. Also the multifractal scaling exponent spectrum f(a), the generalized Renyi dimension spectrum D(q) and the spectrum J(p) of the structure function exponents were estimated experimentally and theoretically by using the q-entropy principle included in Tsallis non-extensive statistical theory, following Arimitsu and Arimitsu (2000). Our analysis showed clearly the following: (a) a phase transition process in the solar flare dynamics from a high dimensional non-Gaussian self-organized critical (SOC) state to a low dimensional, also non-Gaussian, chaotic state, (b) strong intermittent solar corona turbulence and an anomalous (multifractal) diffusion solar corona process, which is strengthened as the solar corona dynamics makes a phase transition to low dimensional chaos, (c) faithful agreement of Tsallis non-equilibrium statistical theory with the experimental estimations of the functions: (i) the non-Gaussian probability distribution function P(x), (ii) f(a) and D(q), and (iii) J(p) for the solar flares time series and its underlying non-equilibrium solar dynamics, and (d) the solar flare dynamical profile is revealed to be similar to the dynamical profile of the solar corona zone as far as the phase transition process from self-organized criticality (SOC) to the chaotic state is concerned. However, the low solar corona (solar flare) dynamical characteristics can be clearly discriminated from the dynamical characteristics of the solar convection zone. Finally, we present novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event. The solar wind plasma, as well as the entire solar plasma system, is a typical case of stochastic spatiotemporal distribution of physical state variables such as force fields and matter fields (particle and current densities or bulk plasma distributions). This study shows clearly the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level. 
It also underlines the inefficiency of classical magneto-hydro-dynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), in explaining the complexity of the solar wind dynamics, since these theories include smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian non-extensive statistics with heavy-tailed probability distribution functions, which are related to the q-extension of the CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992).
    References
    1. T. Arimitsu, N. Arimitsu, Tsallis statistics and fully developed turbulence, J. Phys. A: Math. Gen. 33 (2000) L235.
    2. T. Arimitsu, N. Arimitsu, Analysis of turbulence by statistics based on generalized entropies, Physica A 295 (2001) 177-194.
    3. T. Chang, Low-dimensional behavior and symmetry breaking of stochastic systems near criticality: can these effects be observed in space and in the laboratory?, IEEE 20 (6) (1992) 691-694.
    4. U. Frisch, Turbulence, Cambridge University Press, Cambridge, UK, 1996, p. 310.
    5. L.P. Karakatsanis, G.P. Pavlos, M.N. Xenakis, Tsallis non-extensive statistics, intermittent turbulence, SOC and chaos in the solar plasma. Part two: Solar flares dynamics, Physica A 392 (2013) 3920-3944.
    6. A.V. Milovanov, Topological proof for the Alexander-Orbach conjecture, Phys. Rev. E 56 (3) (1997) 2437-2446.
    7. A.V. Milovanov, L.M. Zelenyi, Fracton excitations as a driving mechanism for the self-organized dynamical structuring in the solar wind, Astrophys. Space Sci. 264 (1-4) (1999) 317-345.
    8. A.V. Milovanov, Stochastic dynamics from the fractional Fokker-Planck-Kolmogorov equation: large-scale behavior of the turbulent transport coefficient, Phys. Rev. E 63 (2001) 047301.
    9. G.P. Pavlos, et al., Universality of non-extensive Tsallis statistics and time series analysis: Theory and applications, Physica A 395 (2014) 58-95.
    10. G.P. Pavlos, et al., Tsallis non-extensive statistics and solar wind plasma complexity, Physica A 422 (2015) 113-135.
    11. A.A. Ruzmaikin, et al., Spectral properties of solar convection and diffusion, ApJ 471 (1996) 1022.
    12. V.E. Tarasov, Review of some promising fractional physical models, Internat. J. Modern Phys. B 27 (9) (2013) 1330005.
    13. C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52 (1-2) (1988) 479-487.
    14. C. Tsallis, Nonextensive statistical mechanics: construction and physical interpretation, in: M. Gell-Mann, C. Tsallis (Eds.), Nonextensive Entropy: Interdisciplinary Applications, Oxford Univ. Press, 2004, pp. 1-53.
    15. C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, 2009.
    16. G.M. Zaslavsky, Chaos, fractional kinetics, and anomalous transport, Physics Reports 371 (2002) 461-580.
    17. L.M. Zelenyi, A.V. Milovanov, Fractal properties of sunspots, Sov. Astron. Lett. 17 (6) (1991) 425.
    18. L.M. Zelenyi, A.V. Milovanov, Fractal topology and strange kinetics: from percolation theory to problems in cosmic electrodynamics, Phys.-Usp. 47 (8) (2004) 749-788.
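
    A small numerical illustration of the q-entropy at the core of the Tsallis framework used above (our sketch; the uniform 8-state distribution is an arbitrary example):

```python
# Tsallis q-entropy S_q = (1 - sum_i p_i^q) / (q - 1), which recovers the
# Boltzmann-Gibbs-Shannon entropy -sum_i p_i ln p_i in the limit q -> 1.

import numpy as np

def tsallis_entropy(p, q):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                             # 0 * log 0 convention
    if abs(q - 1.0) < 1e-12:                 # BG limit
        return float(-np.sum(p * np.log(p)))
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

p = np.full(8, 1 / 8)                        # uniform distribution over 8 states
for q in (0.5, 0.99, 1.0, 1.5, 2.0):
    print(f"q = {q:4}: S_q = {tsallis_entropy(p, q):.4f}")
# As q -> 1 the value approaches ln 8 ~ 2.0794.
```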

  14. Classical test theory and Rasch analysis validation of the Upper Limb Functional Index in subjects with upper limb musculoskeletal disorders.

    PubMed

    Bravini, Elisabetta; Franchignoni, Franco; Giordano, Andrea; Sartorio, Francesco; Ferriero, Giorgio; Vercelli, Stefano; Foti, Calogero

    2015-01-01

    Objective: To perform a comprehensive analysis of the psychometric properties and dimensionality of the Upper Limb Functional Index (ULFI) using both classical test theory and Rasch analysis (RA). Design: Prospective, single-group observational design. Setting: Freestanding rehabilitation center. Participants: Convenience sample of Italian-speaking subjects with upper limb musculoskeletal disorders (N=174). Interventions: Not applicable. Main outcome measures: The Italian version of the ULFI. Data were analyzed using parallel analysis, exploratory factor analysis, and RA for evaluating dimensionality, functioning of rating scale categories, item fit, hierarchy of item difficulties, and reliability indices. Results: Parallel analysis revealed 2 factors explaining 32.5% and 10.7% of the response variance. RA confirmed the failure of the unidimensionality assumption, and 6 items out of the 25 misfitted the Rasch model. When the analysis was rerun excluding the misfitting items, the scale showed acceptable fit values, loading meaningfully on a single factor. Item separation reliability and person separation reliability were .98 and .89, respectively. Cronbach alpha was .92. Conclusions: RA revealed weaknesses of the scale concerning dimensionality and internal construct validity. However, a set of 19 ULFI items defined through the statistical process demonstrated a unidimensional structure, good psychometric properties, and clinical meaningfulness. These findings represent a useful starting point for further analyses of the tool (based on modern psychometric approaches and confirmatory factor analysis) in larger samples, including different patient populations and nationalities. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
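
    On the classical-test-theory side of such analyses, Cronbach's alpha is computed as below (our sketch; the 6-person, 5-item score matrix is fabricated for illustration and is not the ULFI data):

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / variance of
# total scores), a standard classical-test-theory reliability index.

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_persons, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

scores = np.array([[3, 4, 3, 4, 4],    # fabricated 5-item responses
                   [2, 2, 3, 2, 2],
                   [4, 4, 4, 5, 4],
                   [1, 2, 1, 1, 2],
                   [3, 3, 4, 3, 3],
                   [5, 4, 5, 5, 5]])
print(f"alpha = {cronbach_alpha(scores):.3f}")
```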

  15. Perturbative thermodynamic geometry of nonextensive ideal classical, Bose, and Fermi gases.

    PubMed

    Mohammadzadeh, Hosein; Adli, Fereshteh; Nouri, Sahereh

    2016-12-01

    We investigate the perturbative thermodynamic geometry of nonextensive ideal classical, Bose, and Fermi gases. We show that the intrinsic statistical interaction of a nonextensive Bose (Fermi) gas is attractive (repulsive), as in the extensive case, but the value of the thermodynamic curvature is changed by the nonextensive parameter. In contrast to the extensive ideal classical gas, the nonextensive one may be divided into two different regimes. Depending on the parameter that measures the system's deviation from the extensive case, one can find a special value of the fugacity, z*, at which the sign of the thermodynamic curvature changes. Therefore, we argue that the nonextensive parameter induces an attractive (repulsive) statistical interaction for z < z* (z > z*) in an ideal classical gas. Also, from the singular point of the thermodynamic curvature, we consider the condensation of the nonextensive Bose gas.

  16. Perceptual basis of evolving Western musical styles

    PubMed Central

    Rodriguez Zivic, Pablo H.; Shifres, Favio; Cecchi, Guillermo A.

    2013-01-01

    The brain processes temporal statistics to predict future events and to categorize perceptual objects. These statistics, called expectancies, are found in music perception, and they span a variety of different features and time scales. Specifically, there is evidence that music perception involves strong expectancies regarding the distribution of a melodic interval, namely, the distance between two consecutive notes within the context of another. The recent availability of a large Western music dataset, consisting of the historical record condensed as melodic interval counts, has opened new possibilities for data-driven analysis of musical perception. In this context, we present an analytical approach that, based on cognitive theories of music expectation and machine learning techniques, recovers a set of factors that accurately identifies historical trends and stylistic transitions between the Baroque, Classical, Romantic, and Post-Romantic periods. We also offer a plausible musicological and cognitive interpretation of these factors, allowing us to propose them as data-driven principles of melodic expectation. PMID:23716669

  17. Quantum signature of chaos and thermalization in the kicked Dicke model

    NASA Astrophysics Data System (ADS)

    Ray, S.; Ghosh, A.; Sinha, S.

    2016-09-01

    We study the quantum dynamics of the kicked Dicke model (KDM) in terms of the Floquet operator, and we analyze the connection between chaos and thermalization in this context. The Hamiltonian map is constructed by suitably taking the classical limit of the Heisenberg equation of motion to study the corresponding phase-space dynamics, which shows a crossover from regular to chaotic motion by tuning the kicking strength. The fixed-point analysis and calculation of the Lyapunov exponent (LE) provide us with a complete picture of the onset of chaos in phase-space dynamics. We carry out a spectral analysis of the Floquet operator, which includes a calculation of the quasienergy spacing distribution and structural entropy to show the correspondence to the random matrix theory in the chaotic regime. Finally, we analyze the thermodynamics and statistical properties of the bosonic sector as well as the spin sector, and we discuss how such a periodically kicked system relaxes to a thermalized state in accordance with the laws of statistical mechanics.
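
    The classical side of this chaos diagnosis rests on the largest Lyapunov exponent of a kicked map. As a generic stand-in (not the KDM Hamiltonian map itself), the sketch below estimates it for the Chirikov standard map by tangent-space evolution:

```python
# Largest Lyapunov exponent of the kicked-rotor (Chirikov standard) map
#   p' = p + K sin(x),  x' = x + p'  (mod 2*pi),
# estimated by propagating a tangent vector with the map's Jacobian.

import numpy as np

def lyapunov_standard_map(K, n_steps=20_000, seed=1):
    rng = np.random.default_rng(seed)
    x, p = rng.uniform(0, 2 * np.pi, size=2)
    v = np.array([1.0, 0.0])                     # tangent vector
    total = 0.0
    for _ in range(n_steps):
        J = np.array([[1 + K * np.cos(x), 1.0],  # Jacobian at current (x, p)
                      [K * np.cos(x), 1.0]])
        p = (p + K * np.sin(x)) % (2 * np.pi)
        x = (x + p) % (2 * np.pi)
        v = J @ v
        norm = np.linalg.norm(v)
        total += np.log(norm)
        v /= norm                                # renormalize to avoid overflow
    return total / n_steps

for K in (0.5, 1.0, 5.0):                        # regular -> mixed -> strongly chaotic
    print(f"K = {K}: lambda ~ {lyapunov_standard_map(K):.3f}")
```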

  18. Quantum signature of chaos and thermalization in the kicked Dicke model.

    PubMed

    Ray, S; Ghosh, A; Sinha, S

    2016-09-01

    We study the quantum dynamics of the kicked Dicke model (KDM) in terms of the Floquet operator, and we analyze the connection between chaos and thermalization in this context. The Hamiltonian map is constructed by suitably taking the classical limit of the Heisenberg equation of motion to study the corresponding phase-space dynamics, which shows a crossover from regular to chaotic motion by tuning the kicking strength. The fixed-point analysis and calculation of the Lyapunov exponent (LE) provide us with a complete picture of the onset of chaos in phase-space dynamics. We carry out a spectral analysis of the Floquet operator, which includes a calculation of the quasienergy spacing distribution and structural entropy to show the correspondence to the random matrix theory in the chaotic regime. Finally, we analyze the thermodynamics and statistical properties of the bosonic sector as well as the spin sector, and we discuss how such a periodically kicked system relaxes to a thermalized state in accordance with the laws of statistical mechanics.

  19. NP-hardness of decoding quantum error-correction codes

    NASA Astrophysics Data System (ADS)

    Hsieh, Min-Hsiu; Le Gall, François

    2011-05-01

    Although the theory of quantum error correction is intimately related to classical coding theory and, in particular, one can construct quantum error-correction codes (QECCs) from classical codes with the dual-containing property, this does not necessarily imply that the computational complexity of decoding QECCs is the same as for their classical counterparts. Instead, decoding QECCs can be very different from decoding classical codes due to the degeneracy property. Intuitively, one expects degeneracy to simplify decoding, since two different errors might not, and need not, be distinguished in order to correct them. However, we show that the general quantum decoding problem is NP-hard regardless of whether the quantum codes are degenerate or nondegenerate. This finding implies that no considerably fast decoding algorithm exists for the general quantum decoding problem, and it suggests the existence of a quantum cryptosystem based on the hardness of decoding QECCs.

  20. On quantum effects in a theory of biological evolution.

    PubMed

    Martin-Delgado, M A

    2012-01-01

    We construct a descriptive toy model that considers quantum effects on biological evolution starting from Chaitin's classical framework. There are smart evolution scenarios in which a quantum world is as favorable as classical worlds for evolution to take place. However, in more natural scenarios, the rate of evolution depends on the degree of entanglement present in quantum organisms with respect to classical organisms. If the entanglement is maximal, classical evolution turns out to be more favorable.

  1. On Quantum Effects in a Theory of Biological Evolution

    PubMed Central

    Martin-Delgado, M. A.

    2012-01-01

    We construct a descriptive toy model that considers quantum effects on biological evolution starting from Chaitin's classical framework. There are smart evolution scenarios in which a quantum world is as favorable as classical worlds for evolution to take place. However, in more natural scenarios, the rate of evolution depends on the degree of entanglement present in quantum organisms with respect to classical organisms. If the entanglement is maximal, classical evolution turns out to be more favorable. PMID:22413059

  2. The Value of Item Response Theory in Clinical Assessment: A Review

    ERIC Educational Resources Information Center

    Thomas, Michael L.

    2011-01-01

    Item response theory (IRT) and related latent variable models represent modern psychometric theory, the successor to classical test theory in psychological assessment. Although IRT has become prevalent in the measurement of ability and achievement, its contributions to clinical domains have been less extensive. Applications of IRT to clinical…

  3. Quasinormal modes of scale dependent black holes in (1 +2 )-dimensional Einstein-power-Maxwell theory

    NASA Astrophysics Data System (ADS)

    Rincón, Ángel; Panotopoulos, Grigoris

    2018-01-01

    We study for the first time the stability against scalar perturbations, and we compute the spectrum of quasinormal modes, of three-dimensional charged black holes in Einstein-power-Maxwell nonlinear electrodynamics assuming running couplings. Adopting the sixth-order Wentzel-Kramers-Brillouin (WKB) approximation, we investigate how the running of the couplings changes the spectrum of the classical theory. Our results show that all modes corresponding to nonvanishing angular momentum are unstable both in the classical theory and with the running of the couplings, while the fundamental mode can be stable or unstable depending on the running parameter and the electric charge.

  4. Constraints on primordial magnetic fields from inflation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, Daniel; Kobayashi, Takeshi, E-mail: drgreen@cita.utoronto.ca, E-mail: takeshi.kobayashi@sissa.it

    2016-03-01

    We present generic bounds on magnetic fields produced from cosmic inflation. By investigating field bounds on the vector potential, we constrain both the quantum mechanical production of magnetic fields and their classical growth in a model independent way. For classical growth, we show that only if the reheating temperature is as low as T_reh ≲ 10² MeV can magnetic fields of 10⁻¹⁵ G be produced on Mpc scales in the present universe. For purely quantum mechanical scenarios, even stronger constraints are derived. Our bounds on classical and quantum mechanical scenarios apply to generic theories of inflationary magnetogenesis with a two-derivative time kinetic term for the vector potential. In both cases, the magnetic field strength is limited by the gravitational back-reaction of the electric fields that are produced simultaneously. As an example of quantum mechanical scenarios, we construct vector field theories whose time diffeomorphisms are spontaneously broken, and explore magnetic field generation in theories with a variable speed of light. Transitions of quantum vector field fluctuations into classical fluctuations are also analyzed in the examples.

  5. Quantum Matching Theory (with new complexity-theoretic, combinatorial and topological insights on the nature of the quantum entanglement)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gurvits, L.

    2002-01-01

    Classical matching theory can be defined in terms of matrices with nonnegative entries. The notion of a positive operator, central in quantum theory, is a natural generalization of matrices with nonnegative entries. Based on this point of view, we introduce a definition of perfect quantum (operator) matching. We show that the new notion inherits many 'classical' properties, but not all of them. This new notion goes somewhere beyond matroids. For separable bipartite quantum states this new notion coincides with the full rank property of the intersection of two corresponding geometric matroids. In the classical situation, permanents are naturally associated with perfect matchings. We introduce an analog of permanents for positive operators, called the Quantum Permanent, and show how this generalization of the permanent is related to quantum entanglement. Besides many other things, Quantum Permanents provide new rational inequalities necessary for the separability of bipartite quantum states. Using Quantum Permanents, we give a deterministic poly-time algorithm to solve the Hidden Matroids Intersection Problem and indicate some 'classical' complexity difficulties associated with quantum entanglement. Finally, we prove that the weak membership problem for the convex set of separable bipartite density matrices is NP-HARD.
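
    For the classical side of this correspondence, the permanent of a 0/1 matrix counts the perfect matchings of the bipartite graph it encodes; a standard Ryser-formula implementation (our illustration, not from the report) is:

```python
# Ryser's formula: per(A) = (-1)^n * sum over nonempty column subsets S of
# (-1)^|S| * prod_i sum_{j in S} a_ij. Exponential time, fine for small n.

from itertools import combinations

def permanent(M):
    n = len(M)
    total = 0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1
            for row in M:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

# 3x3 all-ones matrix: per = 3! = 6, the perfect matchings of K_{3,3}
print(permanent([[1, 1, 1], [1, 1, 1], [1, 1, 1]]))
```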

  6. Ghost imaging of phase objects with classical incoherent light

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shirai, Tomohiro; Setälä, Tero; Friberg, Ari T.

    2011-10-15

    We describe an optical setup for performing spatial Fourier filtering in ghost imaging with classical incoherent light. This is achieved by a modification of the conventional geometry for lensless ghost imaging. It is shown on the basis of classical coherence theory that with this technique one can realize what we call phase-contrast ghost imaging to visualize pure phase objects.

  7. Quantum-mechanical machinery for rational decision-making in classical guessing game

    NASA Astrophysics Data System (ADS)

    Bang, Jeongho; Ryu, Junghee; Pawłowski, Marcin; Ham, Byoung S.; Lee, Jinhyoung

    2016-02-01

    In quantum game theory, one of the most intriguing and important questions is, “Is it possible to get quantum advantages without any modification of the classical game?” The answer to this question so far has largely been negative: it has usually been thought that a change of the classical game setting is unavoidable for obtaining quantum advantages. However, we give an affirmative answer here, focusing on the decision-making process (which we call ‘reasoning’) used to generate the best strategy, which may occur internally, e.g., in the player’s brain. To show this, we consider a classical guessing game. We then define a one-player reasoning problem in the context of decision-making theory, where the machinery processes are designed to simulate classical and quantum reasoning. In such settings, we present a scenario where a rational player is able to make better use of his/her weak preferences due to quantum reasoning, without any altering or resetting of the classically defined game. We also argue in further analysis that the quantum reasoning may make the player fail, and even make the situation worse, due to inappropriate preferences.

  8. Quantum-mechanical machinery for rational decision-making in classical guessing game

    PubMed Central

    Bang, Jeongho; Ryu, Junghee; Pawłowski, Marcin; Ham, Byoung S.; Lee, Jinhyoung

    2016-01-01

    In quantum game theory, one of the most intriguing and important questions is, “Is it possible to get quantum advantages without any modification of the classical game?” The answer to this question so far has largely been negative: it has usually been thought that a change of the classical game setting is unavoidable for obtaining quantum advantages. However, we give an affirmative answer here, focusing on the decision-making process (which we call ‘reasoning’) used to generate the best strategy, which may occur internally, e.g., in the player’s brain. To show this, we consider a classical guessing game. We then define a one-player reasoning problem in the context of decision-making theory, where the machinery processes are designed to simulate classical and quantum reasoning. In such settings, we present a scenario where a rational player is able to make better use of his/her weak preferences due to quantum reasoning, without any altering or resetting of the classically defined game. We also argue in further analysis that the quantum reasoning may make the player fail, and even make the situation worse, due to inappropriate preferences. PMID:26875685

  9. Quantum-mechanical machinery for rational decision-making in classical guessing game.

    PubMed

    Bang, Jeongho; Ryu, Junghee; Pawłowski, Marcin; Ham, Byoung S; Lee, Jinhyoung

    2016-02-15

    In quantum game theory, one of the most intriguing and important questions is, "Is it possible to get quantum advantages without any modification of the classical game?" The answer to this question so far has largely been negative: it has usually been thought that a change of the classical game setting is unavoidable for obtaining quantum advantages. However, we give an affirmative answer here, focusing on the decision-making process (which we call 'reasoning') used to generate the best strategy, which may occur internally, e.g., in the player's brain. To show this, we consider a classical guessing game. We then define a one-player reasoning problem in the context of decision-making theory, where the machinery processes are designed to simulate classical and quantum reasoning. In such settings, we present a scenario where a rational player is able to make better use of his/her weak preferences due to quantum reasoning, without any altering or resetting of the classically defined game. We also argue in further analysis that the quantum reasoning may make the player fail, and even make the situation worse, due to inappropriate preferences.

  10. Electoral surveys’ influence on the voting processes: a cellular automata model

    NASA Astrophysics Data System (ADS)

    Alves, S. G.; Oliveira Neto, N. M.; Martins, M. L.

    2002-12-01

    Nowadays, in societies threatened by atomization, selfishness, short-term thinking, and alienation from political life, there is a renewed debate about classical questions concerning the quality of democratic decision making. In this work a cellular automata model for the dynamics of free elections, based on social impact theory, is proposed. By using computer simulations, power-law distributions for the size of electoral clusters and for decision times have been obtained. The major role of broadcast electoral surveys in guiding opinion formation and stabilizing the “status quo” was demonstrated. Furthermore, it was shown that in societies where these surveys are manipulated within the universally accepted statistical error bars, even a majority opposition could be hindered from reaching power through the electoral path.
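
    A minimal sketch in the spirit of such a model (our simplification; the majority-rule update and the survey bias weight are illustrative assumptions, not the authors' exact rules):

```python
# Voters on a periodic lattice lean toward the local majority of their four
# neighbours; a broadcast 'survey' adds a small uniform bias toward the
# currently reported global majority, acting mainly as a tie-breaker.

import numpy as np

rng = np.random.default_rng(2)
L, steps, survey_weight = 64, 100, 0.3
opinion = rng.choice([-1, 1], size=(L, L))

for _ in range(steps):
    reported = np.sign(opinion.sum()) or 1        # published survey result
    field = sum(np.roll(opinion, s, axis=(0, 1))
                for s in ((1, 0), (-1, 0), (0, 1), (0, -1)))
    field = field + survey_weight * reported      # neighbours + survey bias
    flip = rng.random((L, L)) < 0.5               # update half the sites per step
    opinion = np.where(flip,
                       np.where(field != 0, np.sign(field), opinion),
                       opinion)

print("final majority share:", (opinion == np.sign(opinion.sum())).mean())
```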

  11. Detailed Balance of Thermalization Dynamics in Rydberg-Atom Quantum Simulators.

    PubMed

    Kim, Hyosub; Park, YeJe; Kim, Kyungtae; Sim, H-S; Ahn, Jaewook

    2018-05-04

    Dynamics of large complex systems, such as relaxation towards equilibrium in classical statistical mechanics, often obeys a master equation that captures essential information from the complexities. Here, we find that thermalization of an isolated many-body quantum state can be described by a master equation. We observe sudden quench dynamics of quantum Ising-like models implemented in our quantum simulator, defect-free single-atom tweezers in conjunction with Rydberg-atom interaction. Saturation of their local observables, a thermalization signature, obeys a master equation experimentally constructed by monitoring the occupation probabilities of prequench states and imposing the principle of the detailed balance. Our experiment agrees with theories and demonstrates the detailed balance in a thermalization dynamics that does not require coupling to baths or postulated randomness.
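
    The construction described above can be illustrated generically (our sketch; the rates below are random placeholders, not the experimental ones): build a master equation dP/dt = W P whose rates obey detailed balance with respect to a chosen equilibrium distribution, and evolve it until the populations saturate.

```python
# Detailed balance w_ij * p_eq_j = w_ji * p_eq_i is enforced by setting
# w_ij = S_ij * p_eq_i with a symmetric kernel S (w_ij is the rate j -> i).

import numpy as np

rng = np.random.default_rng(3)
n = 4
p_eq = rng.random(n); p_eq /= p_eq.sum()       # target equilibrium populations

S = rng.random((n, n)); S = (S + S.T) / 2      # symmetric kernel
W = S * p_eq[:, None]                          # off-diagonal rates j -> i
np.fill_diagonal(W, 0.0)
np.fill_diagonal(W, -W.sum(axis=0))            # columns sum to zero: probability conserved

p = np.zeros(n); p[0] = 1.0                    # start in a single 'prequench' state
dt = 0.01
for _ in range(20_000):
    p = p + dt * (W @ p)                       # explicit Euler step of dP/dt = W P

print("relaxed:", np.round(p, 4))
print("target :", np.round(p_eq, 4))           # should match the relaxed populations
```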

  12. Effective convergence of the two-particle irreducible 1/N expansion for nonequilibrium quantum fields

    NASA Astrophysics Data System (ADS)

    Aarts, Gert; Laurie, Nathan; Tranberg, Anders

    2008-12-01

    The 1/N expansion of the two-particle irreducible effective action offers a powerful approach to study quantum field dynamics far from equilibrium. We investigate the effective convergence of the 1/N expansion in the O(N) model by comparing results obtained numerically in 1+1 dimensions at leading, next-to-leading and next-to-next-to-leading order in 1/N as well as in the weak coupling limit. A comparison in classical statistical field theory, where exact numerical results are available, is made as well. We focus on early-time dynamics and quasiparticle properties far from equilibrium and observe rapid effective convergence already for moderate values of 1/N or the coupling.

  13. Detailed Balance of Thermalization Dynamics in Rydberg-Atom Quantum Simulators

    NASA Astrophysics Data System (ADS)

    Kim, Hyosub; Park, YeJe; Kim, Kyungtae; Sim, H.-S.; Ahn, Jaewook

    2018-05-01

    Dynamics of large complex systems, such as relaxation towards equilibrium in classical statistical mechanics, often obeys a master equation that captures essential information from the complexities. Here, we find that thermalization of an isolated many-body quantum state can be described by a master equation. We observe sudden quench dynamics of quantum Ising-like models implemented in our quantum simulator, defect-free single-atom tweezers in conjunction with Rydberg-atom interaction. Saturation of their local observables, a thermalization signature, obeys a master equation experimentally constructed by monitoring the occupation probabilities of prequench states and imposing the principle of the detailed balance. Our experiment agrees with theories and demonstrates the detailed balance in a thermalization dynamics that does not require coupling to baths or postulated randomness.

  14. A model of the human observer and decision maker

    NASA Technical Reports Server (NTRS)

    Wewerinke, P. H.

    1981-01-01

    The decision process is described in terms of classical sequential decision theory by considering the hypothesis that an abnormal condition has occurred by means of a generalized likelihood ratio test. For this, a sufficient statistic is provided by the innovation sequence, which is the result of the perception and information processing submodel of the human observer. On the basis of only two model parameters, the model predicts the decision speed/accuracy trade-off and various attentional characteristics. A preliminary test of the model for single-variable failure detection tasks resulted in a very good fit to the experimental data. In a formal validation program, a variety of multivariable failure detection tasks was investigated and the predictive capability of the model was demonstrated.
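
    A toy version of the detection mechanism described above (our sketch; the mean shift, noise level, threshold, and the CUSUM-style reset are illustrative simplifications of the generalized likelihood ratio test):

```python
# Sequential log-likelihood ratio test on an innovation sequence that is
# zero-mean under normal operation and acquires a mean shift mu after a
# failure at sample 100. All parameter values are illustrative.

import numpy as np

rng = np.random.default_rng(4)
mu, sigma = 0.8, 1.0                    # assumed post-failure shift, noise std
A = np.log(19)                          # declare-failure threshold (~95% odds)

innov = np.concatenate([rng.normal(0, sigma, 100),      # normal operation
                        rng.normal(mu, sigma, 100)])    # failure at k = 100

llr = 0.0
for k, nu in enumerate(innov):
    # log-likelihood ratio increment of N(mu, sigma^2) vs N(0, sigma^2)
    llr += (mu * nu - mu ** 2 / 2) / sigma ** 2
    llr = max(llr, 0.0)                 # CUSUM reset: ignore pre-failure drift
    if llr > A:
        print(f"failure declared at sample {k}")
        break
```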

  15. Detection of low-contrast images in film-grain noise.

    PubMed

    Naderi, F; Sawchuk, A A

    1978-09-15

    When low-contrast photographic images are digitized by a very small aperture, extreme film-grain noise almost completely obliterates the image information. Using a large aperture to average out the noise destroys the fine details of the image. In these situations conventional statistical restoration techniques have little effect, and well-chosen heuristic algorithms have yielded better results. In this paper we analyze the noise-cheating algorithm of Zweig et al. [J. Opt. Soc. Am. 65, 1347 (1975)] and show that it can be justified by classical maximum-likelihood detection theory. A more general algorithm applicable to a broader class of images is then developed by considering the signal-dependent nature of film-grain noise. Finally, a Bayesian detection algorithm with improved performance is presented.

  16. Imperial College near infrared spectroscopy neuroimaging analysis framework.

    PubMed

    Orihuela-Espina, Felipe; Leff, Daniel R; James, David R C; Darzi, Ara W; Yang, Guang-Zhong

    2018-01-01

    This paper describes the Imperial College near infrared spectroscopy neuroimaging analysis (ICNNA) software tool for functional near infrared spectroscopy neuroimaging data. ICNNA is a MATLAB-based object-oriented framework encompassing an application programming interface and a graphical user interface. ICNNA incorporates reconstruction based on the modified Beer-Lambert law and basic processing and data validation capabilities. Emphasis is placed on the full experiment rather than individual neuroimages as the central element of analysis. The software offers three types of analyses including classical statistical methods based on comparison of changes in relative concentrations of hemoglobin between the task and baseline periods, graph theory-based metrics of connectivity and, distinctively, an analysis approach based on manifold embedding. This paper presents the different capabilities of ICNNA in its current version.
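
    The modified Beer-Lambert reconstruction step mentioned above can be sketched as follows (generic formulas; the extinction coefficients, pathlength factors, and attenuation changes are placeholder numbers, not ICNNA's):

```python
# Modified Beer-Lambert law: delta_OD(lambda) =
#   sum_c epsilon(lambda, c) * delta_conc(c) * d * DPF(lambda).
# With two wavelengths and two chromophores this is a 2x2 linear system.

import numpy as np

# Extinction coefficients [1/(mM*cm)] at two wavelengths (rows) for
# oxy- and deoxy-hemoglobin (columns) -- illustrative magnitudes only.
E = np.array([[0.30, 1.05],    # ~690 nm: [HbO2, HbR]
              [1.20, 0.78]])   # ~830 nm

d = 3.0                        # source-detector separation [cm]
dpf = np.array([6.0, 5.5])     # differential pathlength factor per wavelength

delta_od = np.array([0.010, 0.018])   # measured attenuation changes (hypothetical)

A_mat = E * (d * dpf)[:, None]
delta_conc = np.linalg.solve(A_mat, delta_od)   # [mM] changes in [HbO2], [HbR]
print("delta[HbO2], delta[HbR] =", delta_conc)
```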

  17. Toda theories as contractions of affine Toda theories

    NASA Astrophysics Data System (ADS)

    Aghamohammadi, A.; Khorrami, M.; Shariati, A.

    1996-02-01

    Using a contraction procedure, we obtain Toda theories and their structures, from affine Toda theories and their corresponding structures. By structures, we mean the equation of motion, the classical Lax pair, the boundary term for half line theories, and the quantum transfer matrix. The Lax pair and the transfer matrix so obtained, depend nontrivially on the spectral parameter.

  18. An Approach to Biased Item Identification Using Latent Trait Measurement Theory.

    ERIC Educational Resources Information Center

    Rudner, Lawrence M.

    Because it is a true score model employing item parameters which are independent of the examined sample, item characteristic curve theory (ICC) offers several advantages over classical measurement theory. In this paper an approach to biased item identification using ICC theory is described and applied. The ICC theory approach is attractive in that…

  19. Quantum-like model of processing of information in the brain based on classical electromagnetic field.

    PubMed

    Khrennikov, Andrei

    2011-09-01

    We propose a model of quantum-like (QL) processing of mental information. This model is based on quantum information theory. However, in contrast to models of the "quantum physical brain" reducing mental activity (at least at the highest level) to quantum physical phenomena in the brain, our model matches well with the basic neuronal paradigm of cognitive science. QL information processing is based (surprisingly) on classical electromagnetic signals induced by the joint activity of neurons. This novel approach to quantum information rests on the representation of quantum mechanics as a version of classical signal theory, recently elaborated by the author. The brain uses the QL representation (QLR) for working with abstract concepts; concrete images are described by classical information theory. The two processes, classical and QL, are performed in parallel. Moreover, information is actively transmitted from one representation to the other. A QL concept, given in our model by a density operator, can generate a variety of concrete images given by temporal realizations of the corresponding (Gaussian) random signal. This signal has a covariance operator coinciding with the density operator encoding the abstract concept under consideration. The presence of various temporal scales in the brain plays a crucial role in the creation of the QLR in the brain. Moreover, in our model electromagnetic noise produced by neurons is a source of superstrong QL correlations between processes in different spatial domains of the brain; the binding problem is solved on the QL level, but with the aid of the classical background fluctuations. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
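
    The correspondence between a density operator and the covariance of a Gaussian random signal can be checked numerically (our sketch; the 4x4 matrix is an arbitrary example):

```python
# A positive, unit-trace matrix (a density operator) reused as the covariance
# of a real Gaussian random signal, sampled via Cholesky factorization; the
# empirical covariance of the samples reproduces the density matrix.

import numpy as np

rng = np.random.default_rng(5)
B = rng.normal(size=(4, 4))
rho = B @ B.T                            # positive semidefinite
rho /= np.trace(rho)                     # density-operator normalization

Lc = np.linalg.cholesky(rho + 1e-12 * np.eye(4))   # jitter for numerical safety
samples = rng.normal(size=(200_000, 4)) @ Lc.T     # each row: one 'concrete image'

print("empirical covariance:\n", np.round(np.cov(samples.T, bias=True), 4))
print("density matrix:\n", np.round(rho, 4))
```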

  20. Origins of Inner Solar Systems

    NASA Astrophysics Data System (ADS)

    Dawson, Rebekah Ilene

    2017-06-01

    Over the past couple of decades, thousands of extra-solar planets have been discovered orbiting other stars. The exoplanets discovered to date exhibit a wide variety of orbital and compositional properties; most are dramatically different from the planets in our own Solar System. Our classical theories for the origins of planetary systems were crafted to account for the Solar System and fail to account for the diversity of planets now known. We are working to establish a new blueprint for the origin of planetary systems and to identify the key parameters of planet formation and evolution that establish the distribution of planetary properties observed today. The new blueprint must account for the properties of planets in inner solar systems, regions of planetary systems closer to their star than Earth’s separation from the Sun and home to most exoplanets detected to date. I present work combining simulations and theory with data analysis and statistics of observed planets to test theories of the origins of inner solar systems, including hot Jupiters, warm Jupiters, and tightly-packed systems of super-Earths. Ultimately a comprehensive blueprint for planetary systems will allow us to better situate discovered planets in the context of their system’s formation and evolution, important factors in whether the planets may harbor life.

  1. Transfer function modeling of damping mechanisms in viscoelastic plates

    NASA Technical Reports Server (NTRS)

    Slater, J. C.; Inman, D. J.

    1991-01-01

    This work formulates a method for the modeling of material damping characteristics in plates. The Sophie Germain equation of classical plate theory is modified to incorporate hysteresis effects represented by complex stiffness using the transfer function approach proposed by Golla and Hughes (1985); however, the procedure is not limited to this representation. The governing characteristic equation is decoupled through separation of variables, yielding a solution similar to that of undamped classical plate theory and allowing solution of the steady-state as well as the transient response problem.

  2. Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.

    PubMed

    Verde, Pablo E

    2010-12-30

    In recent decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity, and they can be regarded as so far removed from the classical domain of meta-analysis that they provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. The meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach also allows substantial model checking, model diagnostics, and model selection. Statistical computations are implemented in public domain statistical software (WinBUGS and R) and illustrated with real data examples. Copyright © 2010 John Wiley & Sons, Ltd.
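
    A generative sketch of the bivariate random-effects structure discussed above (our forward simulation; the paper's actual analyses are Bayesian and implemented in WinBUGS and R, and all parameter values below are invented):

```python
# Study-level logit-sensitivity and logit-specificity drawn from a correlated
# bivariate normal (the random effects), then observed counts drawn binomially.

import numpy as np

rng = np.random.default_rng(6)
n_studies = 20
mu = np.array([1.5, 2.0])             # mean logit(sens), logit(spec)
sd = np.array([0.5, 0.6])
corr = -0.4                           # sens/spec trade-off across studies
cov = np.array([[sd[0]**2, corr*sd[0]*sd[1]],
                [corr*sd[0]*sd[1], sd[1]**2]])

logits = rng.multivariate_normal(mu, cov, size=n_studies)
sens, spec = 1 / (1 + np.exp(-logits.T))          # inverse-logit per study

n_diseased = rng.integers(30, 200, n_studies)
n_healthy = rng.integers(30, 200, n_studies)
tp = rng.binomial(n_diseased, sens)               # true positives per study
tn = rng.binomial(n_healthy, spec)                # true negatives per study

print("pooled observed sens:", (tp.sum() / n_diseased.sum()).round(3))
print("pooled observed spec:", (tn.sum() / n_healthy.sum()).round(3))
```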

  3. Statistical ultrasonics: the influence of Robert F. Wagner

    NASA Astrophysics Data System (ADS)

    Insana, Michael F.

    2009-02-01

    An important ongoing question for higher education is how to successfully mentor the next generation of scientists and engineers. It has been my privilege to have been mentored by one of the best, Dr Robert F. Wagner and his colleagues at the CDRH/FDA during the mid 1980s. Bob introduced many of us in medical ultrasonics to statistical imaging techniques. These ideas continue to broadly influence studies on adaptive aperture management (beamforming, speckle suppression, compounding), tissue characterization (texture features, Rayleigh/Rician statistics, scatterer size and number density estimators), and fundamental questions about how limitations of the human eye-brain system for extracting information from textured images can motivate image processing. He adapted the classical techniques of signal detection theory to coherent imaging systems that, for the first time in ultrasonics, related common engineering metrics for image quality to task-based clinical performance. This talk summarizes my wonderfully-exciting three years with Bob as I watched him explore topics in statistical image analysis that formed a rational basis for many of the signal processing techniques used in commercial systems today. It is a story of an exciting time in medical ultrasonics, and of how a sparkling personality guided and motivated the development of junior scientists who flocked around him in admiration and amazement.

  4. Field Extension of Real Values of Physical Observables in Classical Theory can Help Attain Quantum Results

    NASA Astrophysics Data System (ADS)

    Wang, Hai; Kumar, Asutosh; Cho, Minhyung; Wu, Junde

    2018-04-01

    Physical quantities are assumed to take real values, which stems from the fact that a usual measuring instrument measuring a physical observable always yields a real number. Here we consider the question of what would happen if physical observables were allowed to assume complex values. In this paper, we show that by allowing observables in the Bell inequality to take complex values, a classical physical theory can actually attain the same upper bound on the Bell expression as quantum theory. Also, by extending the real field to the quaternionic field, we can resolve the GHZ problem using a local hidden variable model. Furthermore, we try to build a new type of hidden-variable theory of a single qubit based on this result.
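
    The two bounds contrasted above can be checked numerically (our sketch): the CHSH expression reaches 2 over deterministic real-valued local strategies and 2√2 for singlet correlations E(a,b) = -cos(a-b) at the standard optimal angles.

```python
# CHSH value S = E(a0,b0) + E(a0,b1) + E(a1,b0) - E(a1,b1):
# classical (local deterministic) bound 2 vs quantum (Tsirelson) bound 2*sqrt(2).

import numpy as np
from itertools import product

# Classical: exhaustive search over deterministic +/-1 assignments
best_classical = max(abs(a0*b0 + a0*b1 + a1*b0 - a1*b1)
                     for a0, a1, b0, b1 in product([-1, 1], repeat=4))

# Quantum singlet correlations at the Tsirelson-optimal measurement angles
a0, a1, b0, b1 = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
E = lambda a, b: -np.cos(a - b)
chsh_quantum = abs(E(a0, b0) + E(a0, b1) + E(a1, b0) - E(a1, b1))

print("classical bound:", best_classical)           # 2
print("quantum value  :", round(chsh_quantum, 4))   # 2.8284 ~ 2*sqrt(2)
```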

  5. From integrability to conformal symmetry: Bosonic superconformal Toda theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bo-Yu Hou; Liu Chao

    In this paper the authors study the conformal integrable models obtained from conformal reductions of WZNW theory associated with second order constraints. These models are called bosonic superconformal Toda models due to their conformal spectra and their resemblance to the usual Toda theories. From the reduction procedure they get the equations of motion and the linearized Lax equations in a generic Z gradation of the underlying Lie algebra. Then, in the special case of principal gradation, they derive the classical r matrix, fundamental Poisson relation, exchange algebra of chiral operators and find out the classical vertex operators. The result shows that their model is very similar to the ordinary Toda theories in that one can obtain various conformal properties of the model from its integrability.

  6. Sound and Vision: Using Progressive Rock To Teach Social Theory.

    ERIC Educational Resources Information Center

    Ahlkvist, Jarl A.

    2001-01-01

    Describes a teaching technique that utilizes progressive rock music to educate students about sociological theories in introductory sociology courses. Discusses the use of music when teaching about classical social theory and offers an evaluation of this teaching strategy. Includes references. (CMK)

  7. Niels Bohr as philosopher of experiment: Does decoherence theory challenge Bohr's doctrine of classical concepts?

    NASA Astrophysics Data System (ADS)

    Camilleri, Kristian; Schlosshauer, Maximilian

    2015-02-01

    Niels Bohr's doctrine of the primacy of "classical concepts" is arguably his most criticized and misunderstood view. We present a new, careful historical analysis that makes clear that Bohr's doctrine was primarily an epistemological thesis, derived from his understanding of the functional role of experiment. A hitherto largely overlooked disagreement between Bohr and Heisenberg about the movability of the "cut" between measuring apparatus and observed quantum system supports the view that, for Bohr, such a cut did not originate in dynamical (ontological) considerations, but rather in functional (epistemological) considerations. As such, both the motivation and the target of Bohr's doctrine of classical concepts are of a fundamentally different nature than what is understood as the dynamical problem of the quantum-to-classical transition. Our analysis suggests that, contrary to claims often found in the literature, Bohr's doctrine is not, and cannot be, at odds with proposed solutions to the dynamical problem of the quantum-classical transition that were pursued by several of Bohr's followers and culminated in the development of decoherence theory.

  8. Force-field functor theory: classical force-fields which reproduce equilibrium quantum distributions

    PubMed Central

    Babbush, Ryan; Parkhill, John; Aspuru-Guzik, Alán

    2013-01-01

    Feynman and Hibbs were the first to variationally determine an effective potential whose associated classical canonical ensemble approximates the exact quantum partition function. We examine the existence of a map between the local potential and an effective classical potential which matches the exact quantum equilibrium density and partition function. The usefulness of such a mapping rests in its ability to readily improve Born-Oppenheimer potentials for use with classical sampling. We show that such a map must exist and is unique. To explore the feasibility of using this result to improve classical molecular mechanics, we numerically produce a map from a library of randomly generated one-dimensional potential/effective-potential pairs and then evaluate its performance on independent test problems. We also apply the map to simulate liquid para-hydrogen, finding that the resulting radial pair distribution functions agree well with path integral Monte Carlo simulations. The surprising accessibility and transferability of the technique suggest a quantitative route to adapting Born-Oppenheimer potentials, with a motivation similar in spirit to the powerful ideas and approximations of density functional theory. PMID:24790954
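    A hedged one-dimensional illustration of the kind of map discussed: for a harmonic oscillator the exact quantum thermal density is Gaussian, and the unique classical potential that reproduces a given density rho(x) in the canonical ensemble is U_eff(x) = -kT ln rho(x) up to a constant. Sketch, with illustrative parameters and units (not the paper's numerical mapping procedure):

    ```python
    import numpy as np

    hbar = m = kB = 1.0          # convenient units (assumption)
    omega, T = 1.0, 0.5
    beta = 1.0 / (kB * T)

    x = np.linspace(-4.0, 4.0, 401)

    # Quantum thermal density: Gaussian with variance
    # (hbar / (2 m omega)) * coth(beta hbar omega / 2).
    var_q = (hbar / (2.0 * m * omega)) / np.tanh(beta * hbar * omega / 2.0)
    rho_q = np.exp(-x**2 / (2.0 * var_q)) / np.sqrt(2.0 * np.pi * var_q)

    # Effective classical potential that reproduces rho_q exactly.
    U_eff = -kB * T * np.log(rho_q)
    U_eff -= U_eff.min()

    # The map softens the bare spring constant m*omega^2 to kT/var_q,
    # reflecting the quantum-broadened density.
    print(f"bare spring constant:      {m * omega**2:.3f}")
    print(f"effective spring constant: {kB * T / var_q:.3f}")
    ```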

  9. Quantum theory for 1D X-ray free electron laser

    DOE PAGES

    Anisimov, Petr Mikhaylovich

    2017-09-19

    Classical 1D X-ray Free Electron Laser (X-ray FEL) theory has stood the test of time by guiding FEL design and development prior to any full-scale analysis. Future X-ray FELs and inverse-Compton sources, where the photon recoil approaches the electron energy spread, push the classical theory to its limits of applicability. Despite substantial efforts by the community to establish what those limits are, there is no universally agreed-upon quantum approach to the design and development of future X-ray sources. We offer a new approach to formulating the quantum theory of 1D X-ray FELs that has an obvious connection to the classical theory, which allows for immediate transfer of knowledge between the two regimes. Finally, we exploit this connection to draw conclusions about the quantum nature of the electrons and the generated radiation in terms of FEL variables.
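    For orientation, the classical 1D FEL theory referred to here is often written in universally scaled form; one common convention (assumed here, details vary between authors) for the electron phases θ_j, scaled energy detunings p_j, and scaled field amplitude A is

    ```latex
    \frac{d\theta_j}{d\bar{z}} = p_j, \qquad
    \frac{dp_j}{d\bar{z}} = -\left(A\,e^{i\theta_j} + A^{*}e^{-i\theta_j}\right), \qquad
    \frac{dA}{d\bar{z}} = \left\langle e^{-i\theta} \right\rangle,
    ```

    where z̄ measures distance along the undulator in gain lengths. The quantum regime discussed in the paper is reached when the single-photon recoil becomes comparable to the spread of the p_j.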

  10. Bifurcation Analysis of an Electrostatically Actuated Nano-Beam Based on Modified Couple Stress Theory

    NASA Astrophysics Data System (ADS)

    Rezaei Kivi, Araz; Azizi, Saber; Norouzi, Peyman

    2017-12-01

    In this paper, the nonlinear size-dependent static and dynamic behavior of an electrostatically actuated nano-beam is investigated. A fully clamped nano-beam is considered as the model of the deformable electrode of the NEMS device. The governing differential equation of motion is derived using Hamilton's principle based on the modified couple stress theory (MCST), a non-classical theory that accounts for length-scale effects. The nonlinear partial differential equation of motion is discretized into a nonlinear Duffing-type ODE using the Galerkin method. Static and dynamic pull-in instabilities obtained from both the classical theory and MCST are compared. In the second stage of the analysis, a shooting technique is utilized to obtain the frequency-response curve and to capture the periodic solutions of the motion; the stability of the periodic solutions is assessed by Floquet theory. The nonlinear dynamic behavior of the deformable electrode under AC harmonic actuation, including size-dependence effects, is investigated.
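    To make the last step concrete, here is a minimal sketch of a shooting computation of a periodic orbit for a Duffing-type equation; the coefficients are invented for illustration and are not those of the nano-beam model:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Hypothetical Duffing-type oscillator standing in for the Galerkin-
    # reduced beam equation: x'' + c x' + k x + a x^3 = F cos(W t).
    c, k, a, F, W = 0.1, 1.0, 0.5, 0.3, 1.2
    T = 2.0 * np.pi / W                    # forcing period

    def rhs(t, y):
        x, v = y
        return [v, -c * v - k * x - a * x**3 + F * np.cos(W * t)]

    def period_map(y0):
        """Advance a state through exactly one forcing period."""
        sol = solve_ivp(rhs, (0.0, T), y0, rtol=1e-10, atol=1e-12)
        return sol.y[:, -1]

    # Warm start: integrate through transients to land near the attractor.
    y = solve_ivp(rhs, (0.0, 40 * T), [0.0, 0.0],
                  rtol=1e-10, atol=1e-12).y[:, -1]

    # Shooting: Newton iteration on y so that period_map(y) = y.
    for _ in range(20):
        r = period_map(y) - y
        J = np.empty((2, 2))               # finite-difference Jacobian
        eps = 1e-7
        for j in range(2):
            dy = np.zeros(2); dy[j] = eps
            J[:, j] = (period_map(y + dy) - period_map(y)) / eps
        y += np.linalg.solve(J - np.eye(2), -r)
        if np.linalg.norm(r) < 1e-9:
            break

    print("periodic-orbit initial condition:", y)
    # Floquet multipliers = eigenvalues of the period map's Jacobian;
    # the orbit is stable when all multipliers lie inside the unit circle.
    print("Floquet multipliers:", np.linalg.eigvals(J))
    ```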

  11. Finite conformal quantum gravity and spacetime singularities

    NASA Astrophysics Data System (ADS)

    Modesto, Leonardo; Rachwał, Lesław

    2017-12-01

    We show that a class of finite quantum non-local gravitational theories is conformally invariant at the classical as well as the quantum level. This is actually a range of conformal anomaly-free theories in the spontaneously broken phase of the Weyl symmetry. At the classical level we show how Weyl conformal invariance is able to tame all the spacetime singularities that plague not only Einstein gravity, but also local and weakly non-local higher derivative theories. The latter statement is proved by a singularity theorem that applies to a large class of weakly non-local theories. Therefore, we are entitled to look for a solution of the spacetime singularity puzzle in a missed symmetry of nature, namely the Weyl conformal symmetry. Following the seminal paper by Narlikar and Kembhavi, we provide an explicit construction of singularity-free exact black hole solutions in a class of conformally invariant theories.

  12. Calculating the spontaneous magnetization and defining the Curie temperature using a positive-feedback model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, R. G., E-mail: rgh@doe.carleton.ca

    2014-01-21

    A positive-feedback mean-field modification of the classical Brillouin magnetization theory provides an explanation of the apparent persistence of the spontaneous magnetization beyond the conventional Curie temperature: the little-understood "tail" phenomenon that occurs in many ferromagnetic materials. The classical theory is unable to resolve this apparent anomaly. The modified theory incorporates the temperature-dependent quantum-scale hysteretic and mesoscopic domain-scale anhysteretic magnetization processes and includes the effects of demagnetizing and exchange fields. It is found that the thermal behavior of the reversible and irreversible segments of the hysteresis loops, as predicted by the theory, is a key to the presence or absence of the "tails." The theory, which permits arbitrary values of the quantum spin number J, generally provides quantitative agreement with the thermal variations of both the spontaneous magnetization and the shape of the hysteresis loop.
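    For reference, the classical (unmodified) Brillouin mean-field baseline that the paper builds on can be solved self-consistently; a minimal sketch with illustrative parameters, showing the classical behavior that has no "tail" above the Curie temperature:

    ```python
    import numpy as np

    def brillouin(J, x):
        """Brillouin function B_J(x), assuming x != 0."""
        a = (2.0 * J + 1.0) / (2.0 * J)
        b = 1.0 / (2.0 * J)
        return a / np.tanh(a * x) - b / np.tanh(b * x)

    def spontaneous_m(J, t, tol=1e-12, max_iter=50_000):
        """Reduced magnetization at reduced temperature t = T/Tc from
        the self-consistency condition m = B_J(3J/(J+1) * m/t)."""
        if t >= 1.0:
            return 0.0           # classical theory: no moment above Tc
        m = 1.0                  # iterate down from saturation
        for _ in range(max_iter):
            m_new = brillouin(J, 3.0 * J / (J + 1.0) * m / t)
            if abs(m_new - m) < tol:
                break
            m = m_new
        return m

    J = 0.5                      # illustrative spin value
    for t in (0.2, 0.5, 0.8, 0.95, 0.99):
        print(f"T/Tc = {t:4.2f}  ->  m = {spontaneous_m(J, t):.4f}")
    # The classical curve drops to zero exactly at Tc, with no 'tail'.
    ```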

  13. Topics in quantum chaos

    NASA Astrophysics Data System (ADS)

    Jordan, Andrew Noble

    2002-09-01

    In this dissertation, we study the quantum mechanics of classically chaotic dynamical systems. We begin by considering the decoherence effects a quantum chaotic system has on a simple quantum few-state system. Typical time evolution of a quantum system whose classical limit is chaotic generates structures in phase space whose size is much smaller than Planck's constant. A naive application of Heisenberg's uncertainty principle indicates that these structures are not physically relevant. However, if we take the quantum chaotic system in question to be an environment which interacts with a simple two-state quantum system (qubit), we show that these small phase-space structures cause the qubit to generically lose quantum coherence if and only if the environment has many degrees of freedom, such as a dilute gas. This implies that many-body environments may be crucial for the phenomenon of quantum decoherence. Next, we turn to an analysis of statistical properties of time correlation functions and matrix elements of quantum chaotic systems. A semiclassical evaluation of matrix elements of an operator indicates that the dominant contribution will be related to a classical time correlation function over the energy surface. For a highly chaotic class of dynamics, these correlation functions may be decomposed into sums of Ruelle resonances, which control exponential decay to the ergodic distribution. The theory is illustrated both numerically and theoretically on the Baker map. For this system, we are able to isolate individual Ruelle modes. We further consider dynamical systems whose approach to ergodicity is given by a power law rather than an exponential in time. We propose a billiard with diffusive boundary conditions, whose classical solution may be calculated analytically. We go on to compare the exact solution with an approximation scheme, as well as to calculate asymptotic corrections. Quantum spectral statistics are calculated assuming the validity of the Agam, Altshuler, and Andreev ansatz. We find singular behavior of the two-point spectral correlator in the limit of small spacing. Finally, we analyze the effect that slow decay to ergodicity has on the structure of the quantum propagator, as well as on wavefunction localization. We introduce a statistical quantum description of systems that are composed of both an orderly region and a random region. By averaging over the random region only, we find that measures of localization in momentum space semiclassically diverge with the dimension of the Hilbert space. We illustrate this numerically with quantum maps and suggest various other systems where this behavior should be important.
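    As a hedged illustration of the classical correlation decay underlying the Ruelle-resonance analysis (the dissertation's treatment is far more complete), here is a minimal Monte Carlo sketch on the expanding direction of the Baker map; the observable and sample size are illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N, nmax = 4_000_000, 6

    # The expanding x-dynamics of the Baker map is the doubling map
    # x -> 2x mod 1. For f(x) = x - 1/2 the autocorrelation decays as
    # C(n) = 2^{-n}/12: geometric decay set by the leading resonance 1/2.
    x0 = rng.random(N)
    f0 = x0 - 0.5
    x = x0.copy()

    print(" n      C(n)       C(n) * 2^n * 12")
    for n in range(nmax + 1):
        C = np.mean((x - 0.5) * f0)      # Monte Carlo correlation estimate
        print(f"{n:2d}  {C:+.6f}   {C * 2**n * 12:+.3f}")  # last column ~ 1
        x = (2.0 * x) % 1.0
    ```

    (The iteration count is kept small deliberately: in floating point the doubling map exhausts its mantissa bits after a few dozen steps.)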

  14. Item Response Modeling with Sum Scores

    ERIC Educational Resources Information Center

    Johnson, Timothy R.

    2013-01-01

    One of the distinctions between classical test theory and item response theory is that the former focuses on sum scores and their relationship to true scores, whereas the latter concerns item responses and their relationship to latent scores. Although item response theory is often viewed as the richer of the two theories, sum scores are still…
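    One concrete point of contact between the two traditions, sketched here under an assumed Rasch model (not necessarily the article's example): the sum score is a sufficient statistic for the latent trait, so response patterns with equal sum scores yield identical maximum-likelihood ability estimates. Illustrative check:

    ```python
    import numpy as np
    from scipy.optimize import brentq

    # Rasch model: P(correct | theta, b) = 1 / (1 + exp(-(theta - b))).
    b = np.array([-1.0, -0.3, 0.2, 0.8, 1.5])  # illustrative difficulties

    def mle_theta(resp):
        """ML ability estimate; solves sum_i P_i(theta) = sum score."""
        s = resp.sum()
        f = lambda th: (1.0 / (1.0 + np.exp(-(th - b)))).sum() - s
        return brentq(f, -10.0, 10.0)

    # Two different response patterns sharing the same sum score of 3
    # yield the same ability estimate: the sum score is sufficient.
    print(mle_theta(np.array([1, 1, 1, 0, 0])))
    print(mle_theta(np.array([0, 1, 1, 0, 1])))
    ```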

  15. Test Theories, Educational Priorities and Reliability of Public Examinations in England

    ERIC Educational Resources Information Center

    Baird, Jo-Anne; Black, Paul

    2013-01-01

    Much has already been written on the controversies surrounding the use of different test theories in educational assessment. Other authors have noted the prevalence of classical test theory over item response theory in practice. This Special Issue draws together articles based upon work conducted on the Reliability Programme for England's…

  16. Recent developments in bimetric theory

    NASA Astrophysics Data System (ADS)

    Schmidt-May, Angnis; von Strauss, Mikael

    2016-05-01

    This review is dedicated to recent progress in the field of classical, interacting, massive spin-2 theories, with a focus on ghost-free bimetric theory. We will outline its history and its development as a nontrivial extension and generalisation of nonlinear massive gravity. We present a detailed discussion of the consistency proofs of both theories, before we review Einstein solutions to the bimetric equations of motion in vacuum as well as the resulting mass spectrum. We introduce couplings to matter and then discuss the general relativity and massive gravity limits of bimetric theory, which correspond to decoupling the massive or the massless spin-2 field from the matter sector, respectively. More general classical solutions are reviewed and the present status of bimetric cosmology is summarised. An interesting corner in the bimetric parameter space which could potentially give rise to a nonlinear theory for partially massless spin-2 fields is also discussed. Relations to higher-curvature theories of gravity are explained and finally we give an overview of possible extensions of the theory and review its formulation in terms of vielbeins.

  17. Assessing the Kansas water-level monitoring program: An example of the application of classical statistics to a geological problem

    USGS Publications Warehouse

    Davis, J.C.

    2000-01-01

    Geologists may feel that geological data are not amenable to statistical analysis, or at best require specialized approaches such as nonparametric statistics and geostatistics. However, there are many circumstances, particularly in systematic studies conducted for environmental or regulatory purposes, where traditional parametric statistical procedures can be beneficial. An example is the application of analysis of variance to data collected in an annual program of measuring groundwater levels in Kansas. Influences such as well conditions, operator effects, and use of the water can be assessed and wells that yield less reliable measurements can be identified. Such statistical studies have resulted in yearly improvements in the quality and reliability of the collected hydrologic data. Similar benefits may be achieved in other geological studies by the appropriate use of classical statistical tools.
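    A minimal sketch of the kind of analysis of variance described, on synthetic depth-to-water data; the operator grouping, values, and sample sizes are invented for illustration:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # Hypothetical depth-to-water measurements (feet) of comparable wells
    # taken by three operators; operator_c carries a systematic bias.
    operator_a = 120.0 + rng.normal(0.0, 0.5, size=30)
    operator_b = 120.2 + rng.normal(0.0, 0.5, size=30)
    operator_c = 121.5 + rng.normal(0.0, 0.5, size=30)

    # One-way ANOVA: do the group means differ by more than the
    # within-group scatter can explain?
    F, p = stats.f_oneway(operator_a, operator_b, operator_c)
    print(f"F = {F:.2f}, p = {p:.2g}")
    # A small p-value flags an operator effect of the kind the Kansas
    # program used to identify less reliable measurements.
    ```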

  18. Spectral dimension controlling the decay of the quantum first-detection probability

    NASA Astrophysics Data System (ADS)

    Thiel, Felix; Kessler, David A.; Barkai, Eli

    2018-06-01

    We consider a quantum system that is initially localized at x_in and that is repeatedly projectively probed with a fixed period τ at position x_d. We ask for the probability F_n that the system is detected at x_d for the very first time, where n is the number of detection attempts. We relate the asymptotic decay and oscillations of F_n to the system's energy spectrum, which is assumed to be absolutely continuous. In particular, F_n is determined by the Hamiltonian's measurement spectral density of states (MSDOS) f(E), which is closely related to the density of energy states (DOS). We find that F_n decays like a power law whose exponent is determined by the power-law exponent d_S of f(E) around its singularities E*. Our findings are analogous to the classical first passage theory of random walks. In contrast to the classical case, the decay of F_n is accompanied by oscillations with frequencies that are determined by the singularities E*. This gives rise to critical detection periods τ_c at which the oscillations disappear. In the ordinary case d_S can be identified with the spectral dimension associated with the DOS. Furthermore, the singularities E* are the van Hove singularities of the DOS in this case. We find that the asymptotic statistics of F_n depend crucially on the initial and detection state and can be wildly different for out-of-the-ordinary states, which is in sharp contrast to the classical theory. The properties of the first-detection probabilities can alternatively be derived from the transition amplitudes. All our results are confirmed by numerical simulations of the tight-binding model, and of a free particle in continuous space both with a normal and with an anomalous dispersion relation. We provide explicit asymptotic formulas for the first-detection probability in these models.
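    The stroboscopic detection protocol described here is straightforward to simulate; a minimal sketch for a tight-binding ring (system size, sites, and period are illustrative, not the paper's parameters):

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Tight-binding ring of L sites, H = -sum_x (|x><x+1| + h.c.),
    # probed at x_d every tau.
    L, tau, x_in, x_d = 64, 1.0, 0, 10
    H = np.zeros((L, L))
    for s in range(L):
        H[s, (s + 1) % L] = H[(s + 1) % L, s] = -1.0
    U = expm(-1j * tau * H)          # free evolution between attempts

    psi = np.zeros(L, dtype=complex)
    psi[x_in] = 1.0                  # start localized at x_in

    F = []
    for n in range(1, 201):
        psi = U @ psi                # unitary stretch of duration tau
        F.append(abs(psi[x_d])**2)   # first-detection probability F_n
        psi[x_d] = 0.0               # failed detection removes amplitude

    print("F_n, n = 1..5:", [f"{f:.3e}" for f in F[:5]])
    print(f"total detection probability after 200 attempts: {sum(F):.4f}")
    ```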

  19. Classical Aerodynamic Theory

    NASA Technical Reports Server (NTRS)

    Jones, R. T. (Compiler)

    1979-01-01

    A collection of papers on modern theoretical aerodynamics is presented. Included are theories of incompressible potential flow and research on the aerodynamic forces on wing and wing sections of aircraft and on airship hulls.

  20. Contextual Advantage for State Discrimination

    NASA Astrophysics Data System (ADS)

    Schmid, David; Spekkens, Robert W.

    2018-02-01

    Finding quantitative aspects of quantum phenomena which cannot be explained by any classical model has foundational importance for understanding the boundary between classical and quantum theory. It also has practical significance for identifying information processing tasks for which those phenomena provide a quantum advantage. Using the framework of generalized noncontextuality as our notion of classicality, we find one such nonclassical feature within the phenomenology of quantum minimum-error state discrimination. Namely, we identify quantitative limits on the success probability for minimum-error state discrimination in any experiment described by a noncontextual ontological model. These constraints constitute noncontextuality inequalities that are violated by quantum theory, and this violation implies a quantum advantage for state discrimination relative to noncontextual models. Furthermore, our noncontextuality inequalities are robust to noise and are operationally formulated, so that any experimental violation of the inequalities is a witness of contextuality, independently of the validity of quantum theory. Along the way, we introduce new methods for analyzing noncontextuality scenarios and demonstrate a tight connection between our minimum-error state discrimination scenario and a Bell scenario.
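    For reference, the quantum benchmark in minimum-error state discrimination is the Helstrom bound; a small sketch for two qubit states (the states and priors are illustrative, and this is the textbook bound rather than the paper's noncontextual analysis):

    ```python
    import numpy as np

    def helstrom(rho0, rho1, p0=0.5):
        """Minimum-error success probability for discriminating rho0 and
        rho1 with priors (p0, 1-p0): P = (1 + ||p0 rho0 - p1 rho1||_1)/2."""
        gamma = p0 * rho0 - (1.0 - p0) * rho1
        trace_norm = np.abs(np.linalg.eigvalsh(gamma)).sum()
        return 0.5 * (1.0 + trace_norm)

    def pure(theta):
        """Density matrix of cos(theta)|0> + sin(theta)|1>."""
        v = np.array([np.cos(theta), np.sin(theta)])
        return np.outer(v, v)

    # Two equiprobable pure states with overlap <a|b> = cos(2 theta).
    theta = np.pi / 8
    P = helstrom(pure(0.0), pure(2.0 * theta))
    P_check = 0.5 * (1.0 + np.sqrt(1.0 - np.cos(2.0 * theta) ** 2))
    print(f"Helstrom bound:   {P:.4f}")
    print(f"pure-state check: {P_check:.4f}")  # (1 + sqrt(1-|<a|b>|^2))/2
    ```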
