Finite-block-length analysis in classical and quantum information theory
Hayashi, Masahito
2017-01-01
Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite size effects. The present paper reviews finite size effects in classical and quantum information theory with respect to various topics, including applied aspects. PMID:28302962
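The canonical finite-size refinement reviewed in this line of work is the Gaussian (second-order) approximation of the maximal code size M*(n, ε) for n channel uses and error probability ε, where C is the capacity, V the channel dispersion, and Q⁻¹ the inverse complementary Gaussian CDF. This is a standard result of the finite-block-length literature, stated here for orientation rather than quoted from the abstract:

```latex
% Second-order (normal) approximation to the maximal code size
\log M^{*}(n, \varepsilon) = nC - \sqrt{nV}\, Q^{-1}(\varepsilon) + O(\log n)
```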
Generalized mutual information and Tsirelson's bound
NASA Astrophysics Data System (ADS)
Wakakuwa, Eyuri; Murao, Mio
2014-12-01
We introduce a generalization of the quantum mutual information between a classical system and a quantum system into the mutual information between a classical system and a system described by general probabilistic theories. We apply this generalized mutual information (GMI) to a derivation of Tsirelson's bound from information causality, and prove that Tsirelson's bound can be derived from the chain rule of the GMI. By using the GMI, we formulate the "no-supersignalling condition" (NSS), that the assistance of correlations does not enhance the capability of classical communication. We prove that NSS is never violated in any no-signalling theory.
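For orientation, the bound in question concerns the CHSH correlator; its classical, quantum (Tsirelson), and general no-signalling values are standard and are stated here for reference, not reproduced from this abstract:

```latex
S = E(a,b) + E(a,b') + E(a',b) - E(a',b')
% local hidden variables: |S| <= 2
% quantum (Tsirelson):    |S| <= 2\sqrt{2}
% no-signalling (PR box): |S| <= 4
```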
de Bock, Élodie; Hardouin, Jean-Benoit; Blanchin, Myriam; Le Neel, Tanguy; Kubis, Gildas; Bonnaud-Antignac, Angélique; Dantan, Étienne; Sébille, Véronique
2016-10-01
The objective was to compare classical test theory and Rasch-family models derived from item response theory for the analysis of longitudinal patient-reported outcomes data with possibly informative intermittent missing items. A simulation study was performed in order to assess and compare the performance of classical test theory and the Rasch model in terms of bias, type I error control, and power of the test of the time effect. The type I error was controlled for classical test theory and the Rasch model whether data were complete or some items were missing. Both methods were unbiased and displayed similar power with complete data. When items were missing, the Rasch model remained unbiased and displayed higher power than classical test theory. The Rasch model performed better than the classical test theory approach for the analysis of longitudinal patient-reported outcomes with possibly informative intermittent missing items, mainly in terms of power. This study highlights the interest of Rasch-based models in clinical research and epidemiology for the analysis of incomplete patient-reported outcomes data. © The Author(s) 2013.
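For readers comparing the two approaches, the dichotomous Rasch model evaluated in such simulations has the standard form below, with person parameter θ_i and item difficulty b_j (a textbook statement of the model, not an equation taken from this abstract):

```latex
% Dichotomous Rasch model: probability that person i endorses item j
P(X_{ij} = 1 \mid \theta_i, b_j) = \frac{\exp(\theta_i - b_j)}{1 + \exp(\theta_i - b_j)}
```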
Comment on Gallistel: behavior theory and information theory: some parallels.
Nevin, John A
2012-05-01
In this article, Gallistel proposes information theory as an approach to some enduring problems in the study of operant and classical conditioning. Copyright © 2012 Elsevier B.V. All rights reserved.
Khrennikov, Andrei
2011-09-01
We propose a model of quantum-like (QL) processing of mental information. This model is based on quantum information theory. However, in contrast to models of "quantum physical brain" reducing mental activity (at least at the highest level) to quantum physical phenomena in the brain, our model matches well with the basic neuronal paradigm of cognitive science. QL information processing is based (surprisingly) on classical electromagnetic signals induced by the joint activity of neurons. This novel approach to quantum information is based on the representation of quantum mechanics as a version of classical signal theory, which was recently elaborated by the author. The brain uses the QL representation (QLR) for working with abstract concepts; concrete images are described by classical information theory. The two processes, classical and QL, are performed in parallel. Moreover, information is actively transmitted from one representation to another. A QL concept given in our model by a density operator can generate a variety of concrete images given by temporal realizations of the corresponding (Gaussian) random signal. This signal has a covariance operator coinciding with the density operator encoding the abstract concept under consideration. The presence of various temporal scales in the brain plays a crucial role in the creation of the QLR in the brain. Moreover, in our model electromagnetic noise produced by neurons is a source of superstrong QL correlations between processes in different spatial domains in the brain; the binding problem is solved on the QL level, but with the aid of the classical background fluctuations. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Cappelleri, Joseph C; Jason Lundy, J; Hays, Ron D
2014-05-01
The US Food and Drug Administration's guidance for industry document on patient-reported outcomes (PRO) defines content validity as "the extent to which the instrument measures the concept of interest" (FDA, 2009, p. 12). According to Strauss and Smith (2009), construct validity "is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity" (p. 7). Hence, both qualitative and quantitative information are essential in evaluating the validity of measures. We review classical test theory and item response theory (IRT) approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which the hypothesized "difficulty" (severity) order of items is represented by observed responses. If a researcher has little qualitative data and wants preliminary information about the content validity of the instrument, then descriptive assessments using classical test theory should be the first step. As the sample size grows during subsequent stages of instrument development, confidence in the numerical estimates from Rasch and other IRT models (as well as those of classical test theory) would also grow. Classical test theory and IRT can be useful in providing a quantitative assessment of items and scales during the content-validity phase of PRO-measure development. Depending on the particular type of measure and the specific circumstances, classical test theory and/or IRT should be considered to help maximize the content validity of PRO measures. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.
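As an illustration of the descriptive checks listed above (category frequencies, floor and ceiling effects, item-total relationships), here is a minimal sketch in Python. The function name, the simulated data, and the respondents-by-items layout are illustrative assumptions, not anything from the source:

```python
import numpy as np

def ctt_descriptives(responses):
    """Classical-test-theory descriptives for an (n_persons, n_items) score matrix."""
    n_items = responses.shape[1]
    totals = responses.sum(axis=1)
    # Frequency of responses to each category of each item.
    categories = np.unique(responses)
    freq = {int(c): (responses == c).mean(axis=0) for c in categories}
    # Floor/ceiling effects: share of respondents at the scale extremes
    # (assumes observed min/max categories coincide with the scale min/max).
    floor = np.mean(totals == categories.min() * n_items)
    ceiling = np.mean(totals == categories.max() * n_items)
    # Corrected item-total correlation: item vs. sum of the remaining items.
    item_total = np.array([
        np.corrcoef(responses[:, j], totals - responses[:, j])[0, 1]
        for j in range(n_items)
    ])
    return freq, floor, ceiling, item_total

# Example with simulated data: 200 respondents, five items scored 0-4.
rng = np.random.default_rng(0)
sim = rng.integers(0, 5, size=(200, 5))
freq, floor, ceiling, itc = ctt_descriptives(sim)
print(f"floor: {floor:.3f}  ceiling: {ceiling:.3f}")
print("corrected item-total correlations:", np.round(itc, 2))
```

The correlation is computed against the total of the remaining items so that an item is not spuriously correlated with a sum that contains itself.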
Testing the Moral Algebra of Two Kohlbergian Informers
ERIC Educational Resources Information Center
Hommers, Wilfried; Lewand, Martin; Ehrmann, Dominic
2012-01-01
This paper seeks to unify two major theories of moral judgment: Kohlberg's stage theory and Anderson's moral information integration theory. Subjects were told about thoughts of actors in Kohlberg's classic altruistic Heinz dilemma and in a new egoistical dilemma. These actors' thoughts represented Kohlberg's stages I (Personal Risk) and IV…
Affine Isoperimetry and Information Theoretic Inequalities
ERIC Educational Resources Information Center
Lv, Songjun
2012-01-01
There are essential connections between the isoperimetric theory and information theoretic inequalities. In general, the Brunn-Minkowski inequality and the entropy power inequality, as well as the classical isoperimetric inequality and the classical entropy-moment inequality, turn out to be equivalent in a certain sense, respectively. Based on…
NASA Astrophysics Data System (ADS)
Lombardi, Olimpia; Fortin, Sebastian; Holik, Federico; López, Cristian
2017-04-01
Preface; Introduction; Part I. About the Concept of Information: 1. About the concept of information Sebastian Fortin and Olimpia Lombardi; 2. Representation, information, and theories of information Armond Duwell; 3. Information, communication, and manipulability Olimpia Lombardi and Cristian López; Part II. Information and quantum mechanics: 4. Quantum versus classical information Jeffrey Bub; 5. Quantum information and locality Dennis Dieks; 6. Pragmatic information in quantum mechanics Juan Roederer; 7. Interpretations of quantum theory: a map of madness Adán Cabello; Part III. Probability, Correlations, and Information: 8. On the tension between ontology and epistemology in quantum probabilities Amit Hagar; 9. Inferential versus dynamical conceptions of physics David Wallace; 10. Classical models for quantum information Federico Holik and Gustavo Martin Bosyk; 11. On the relative character of quantum correlations Guido Bellomo and Ángel Ricardo Plastino; Index.
Superadditivity of two quantum information resources
Nawareg, Mohamed; Muhammad, Sadiq; Horodecki, Pawel; Bourennane, Mohamed
2017-01-01
Entanglement is one of the most puzzling features of quantum theory and a principal resource for quantum information processing. It is well known that in classical information theory, the addition of two classical information resources will not lead to any extra advantages. On the contrary, in quantum information, a spectacular phenomenon of the superadditivity of two quantum information resources emerges. It shows that quantum entanglement, which was completely absent in either of the two resources separately, emerges as a result of combining them together. We present the first experimental demonstration of this quantum phenomenon with two photonic three-partite nondistillable entangled states shared among three parties Alice, Bob, and Charlie, where entanglement was completely absent between Bob and Charlie. PMID:28951886
Fundamental finite key limits for one-way information reconciliation in quantum key distribution
NASA Astrophysics Data System (ADS)
Tomamichel, Marco; Martinez-Mateo, Jesus; Pacher, Christoph; Elkouss, David
2017-11-01
The security of quantum key distribution protocols is guaranteed by the laws of quantum mechanics. However, a precise analysis of the security properties requires tools from both classical cryptography and information theory. Here, we employ recent results in non-asymptotic classical information theory to show that one-way information reconciliation imposes fundamental limitations on the amount of secret key that can be extracted in the finite key regime. In particular, we find that an often used approximation for the information leakage during information reconciliation is not generally valid. We propose an improved approximation that takes into account finite key effects, and numerically test it against codes for two probability distributions, which we call binary-binary and binary-Gaussian, that typically appear in quantum key distribution protocols.
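For orientation, the approximation at issue is the standard leading-order estimate of the reconciliation leakage for n sifted bits with quantum bit error rate Q, with f_EC ≥ 1 an efficiency factor and h the binary entropy function; the paper's point is that a sublinear finite-size correction must be added on top of it. The schematic correction below indicates the form only and is not the paper's exact expression:

```latex
% Commonly used asymptotic approximation of the leakage
\mathrm{leak}_{\mathrm{EC}} \approx f_{\mathrm{EC}}\, n\, h(Q), \qquad
h(Q) = -Q \log_2 Q - (1 - Q)\log_2(1 - Q)
% Finite-key behaviour adds a correction growing like sqrt(n),
% schematically: leak_EC ~ n h(Q) + c(Q, eps) sqrt(n)
```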
An application of information theory to stochastic classical gravitational fields
NASA Astrophysics Data System (ADS)
Angulo, J.; Angulo, J. C.; Angulo, J. M.
2018-06-01
The objective of this study lies in incorporating concepts developed in information theory (entropy, complexity, etc.) with the aim of quantifying the variation of the uncertainty associated with a stochastic physical system resident in a spatiotemporal region. As an example of application, a relativistic classical gravitational field has been considered, with a stochastic behavior resulting from the effect induced by one or several external perturbation sources. One of the key concepts of the study is the covariance kernel between two points within the chosen region. Using this concept and appropriate criteria, a methodology is proposed to evaluate the change of uncertainty at a given spatiotemporal point, based on available information and efficiently applying the diverse methods that information theory provides. For illustration, a stochastic version of the Einstein equation with an added Gaussian Langevin term is analyzed.
A quantum Rosetta Stone for the information paradox
NASA Astrophysics Data System (ADS)
Pando Zayas, Leopoldo A.
2014-11-01
The black hole information loss paradox epitomizes the contradictions between general relativity and quantum field theory. The AdS/conformal field theory (CFT) correspondence provides an implicit answer for the information loss paradox in black hole physics by equating a gravity theory with an explicitly unitary field theory. Gravitational collapse in asymptotically AdS spacetimes is generically turbulent. Given that the mechanism to read out the information about correlation functions on the field theory side is plagued by deterministic classical chaos, we argue that quantum chaos might provide the true Rosetta Stone for answering the information paradox in the context of the AdS/CFT correspondence.
Using Rasch Analysis to Inform Rating Scale Development
ERIC Educational Resources Information Center
Van Zile-Tamsen, Carol
2017-01-01
The use of surveys, questionnaires, and rating scales to measure important outcomes in higher education is pervasive, but reliability and validity information is often based on problematic Classical Test Theory approaches. Rasch Analysis, based on Item Response Theory, provides a better alternative for examining the psychometric quality of rating…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takeoka, Masahiro; Fujiwara, Mikio; Mizuno, Jun
2004-05-01
Quantum-information theory predicts that when the transmission resource is doubled in quantum channels, the amount of information transmitted can be increased more than twice by quantum-channel coding technique, whereas the increase is at most twice in classical information theory. This remarkable feature, the superadditive quantum-coding gain, can be implemented by appropriate choices of code words and corresponding quantum decoding which requires a collective quantum measurement. Recently, an experimental demonstration was reported [M. Fujiwara et al., Phys. Rev. Lett. 90, 167906 (2003)]. The purpose of this paper is to describe our experiment in detail. Particularly, a design strategy of quantum-collective decoding in physical quantum circuits is emphasized. We also address the practical implication of the gain on communication performance by introducing the quantum-classical hybrid coding scheme. We show how the superadditive quantum-coding gain, even in a small code length, can boost the communication performance of conventional coding techniques.
Li, Tao; Zhang, Xiong; Zeng, Qiang; Wang, Bo; Zhang, Xiangdong
2018-04-30
The Clauser-Horne-Shimony-Holt (CHSH) inequality and the Klyachko-Can-Binicioglu-Shumovski (KCBS) inequality exhibit a tradeoff under the no-disturbance (ND) principle. Recently, the fundamental monogamy relation between contextuality and nonlocality in quantum theory has been demonstrated experimentally. Here we show that such a relation and tradeoff can also be simulated in classical optical systems. Using the polarization, path, and orbital angular momentum of a classical optical beam, we have observed in a classical optical experiment the stringent monogamy relation between the two inequalities by implementing projection measurements. Our results show the application prospects of the concepts developed recently in quantum information science for classical optical systems and optical information processing.
Coherent-state constellations and polar codes for thermal Gaussian channels
NASA Astrophysics Data System (ADS)
Lacerda, Felipe; Renes, Joseph M.; Scholz, Volkher B.
2017-06-01
Optical communication channels are ultimately quantum mechanical in nature, and we must therefore look beyond classical information theory to determine their communication capacity as well as to find efficient encoding and decoding schemes of the highest rates. Thermal channels, which arise from linear coupling of the field to a thermal environment, are of particular practical relevance; their classical capacity has been recently established, but their quantum capacity remains unknown. While the capacity sets the ultimate limit on reliable communication rates, it does not promise that such rates are achievable by practical means. Here we construct efficiently encodable codes for thermal channels which achieve the classical capacity and the so-called Gaussian coherent information for transmission of classical and quantum information, respectively. Our codes are based on combining polar codes with a discretization of the channel input into a finite "constellation" of coherent states. Encoding of classical information can be done using linear optics.
Epistemic View of Quantum States and Communication Complexity of Quantum Channels
NASA Astrophysics Data System (ADS)
Montina, Alberto
2012-09-01
The communication complexity of a quantum channel is the minimal amount of classical communication required for classically simulating a process of state preparation, transmission through the channel, and subsequent measurement. It establishes a limit on the power of quantum communication in terms of classical resources. We show that classical simulations employing a finite amount of communication can be derived from a special class of hidden variable theories where quantum states represent statistical knowledge about the classical state and not an element of reality. This special class has attracted strong interest very recently. The communication cost of each derived simulation is given by the mutual information between the quantum state and the classical state of the parent hidden variable theory. Finally, we find that the communication complexity for single qubits is smaller than 1.28 bits. The previously known upper bound was 1.85 bits.
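In symbols, the cost statement reads as follows, with Ψ the (random) quantum state being simulated and Λ the classical state of the parent hidden-variable model; the notation is chosen here for illustration rather than taken from the paper:

```latex
\mathcal{C} = I(\Psi;\Lambda) = H(\Lambda) - H(\Lambda \mid \Psi)
```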
Modeling Conditional Probabilities in Complex Educational Assessments. CSE Technical Report.
ERIC Educational Resources Information Center
Mislevy, Robert J.; Almond, Russell; Dibello, Lou; Jenkins, Frank; Steinberg, Linda; Yan, Duanli; Senturk, Deniz
An active area in psychometric research is coordinated task design and statistical analysis built around cognitive models. Compared with classical test theory and item response theory, there is often less information from observed data about the measurement-model parameters. On the other hand, there is more information from the grounding…
NASA Astrophysics Data System (ADS)
Basiladze, S. G.
2017-05-01
The paper describes the general physical theory of signals, the carriers of information, which supplements Shannon's abstract classical theory and is applicable in much broader fields, including nuclear physics. It is shown that in the absence of classical noise its place should be taken by the physical threshold of signal perception, for objects of both the macrocosm and the microcosm. The signal perception threshold allows the presence of subthreshold (virtual) signal states. For these states, the Boolean algebra of logic (A = 0/1) is transformed into the "algebraic logic" of probabilities (0 ≤ a ≤ 1). The similarity and difference of virtual states of macro- and microsignals are elucidated. "Real" and "quantum" information for computers is considered briefly. The maximum information transmission rate is estimated based on physical constants.
Information Theoretic Characterization of Physical Theories with Projective State Space
NASA Astrophysics Data System (ADS)
Zaopo, Marco
2015-08-01
Probabilistic theories are a natural framework to investigate the foundations of quantum theory and possible alternative or deeper theories. In a generic probabilistic theory, states of a physical system are represented as vectors of outcome probabilities and state spaces are convex cones. In this picture the physics of a given theory is related to the geometric shape of the cone of states. In quantum theory, for instance, the shape of the cone of states corresponds to a projective space over complex numbers. In this paper we investigate geometric constraints on the state space of a generic theory imposed by the following information theoretic requirements: every non-completely-mixed state of a system is perfectly distinguishable from some other state in a single-shot measurement; information capacity of physical systems is conserved under making mixtures of states. These assumptions guarantee that a generic physical system satisfies a natural principle asserting that the more a state of the system is mixed, the less information can be stored in the system using that state as logical value. We show that all theories satisfying the above assumptions are such that the shape of their cones of states is that of a projective space over a generic field of numbers. Remarkably, these theories constitute generalizations of quantum theory where the superposition principle holds with coefficients pertaining to a generic field of numbers in place of complex numbers. If the field of numbers is trivial and contains only one element, we obtain classical theory. This result tells us that the superposition principle is quite common among probabilistic theories, while its absence gives evidence of either classical theory or an implausible theory.
Quantum Information as a Non-Kolmogorovian Generalization of Shannon's Theory
NASA Astrophysics Data System (ADS)
Holik, Federico; Bosyk, Gustavo; Bellomo, Guido
2015-10-01
In this article we discuss the formal structure of a generalized information theory based on the extension of the probability calculus of Kolmogorov to a (possibly) non-commutative setting. By studying this framework, we argue that quantum information can be considered as a particular case of a huge family of non-commutative extensions of its classical counterpart. In any conceivable information theory, the possibility of dealing with different kinds of information measures plays a key role. Here, we generalize a notion of state spectrum, allowing us to introduce a majorization relation and a new family of generalized entropic measures.
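For reference, the majorization relation invoked here is the standard one on probability vectors: with x↓ denoting x sorted in nonincreasing order,

```latex
x \prec y \;\iff\; \sum_{i=1}^{k} x_i^{\downarrow} \le \sum_{i=1}^{k} y_i^{\downarrow}
\ \text{for all } k, \qquad \text{with } \sum_i x_i = \sum_i y_i = 1
```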
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradler, Kamil; Hayden, Patrick; Touchette, Dave
Coding theorems in quantum Shannon theory express the ultimate rates at which a sender can transmit information over a noisy quantum channel. More often than not, the known formulas expressing these transmission rates are intractable, requiring an optimization over an infinite number of uses of the channel. Researchers have rarely found quantum channels with a tractable classical or quantum capacity, but when such a finding occurs, it demonstrates a complete understanding of that channel's capabilities for transmitting classical or quantum information. Here we show that the three-dimensional capacity region for entanglement-assisted transmission of classical and quantum information is tractable for the Hadamard class of channels. Examples of Hadamard channels include generalized dephasing channels, cloning channels, and the Unruh channel. The generalized dephasing channels and the cloning channels are natural processes that occur in quantum systems through the loss of quantum coherence or stimulated emission, respectively. The Unruh channel is a noisy process that occurs in relativistic quantum information theory as a result of the Unruh effect and bears a strong relationship to the cloning channels. We give exact formulas for the entanglement-assisted classical and quantum communication capacity regions of these channels. The coding strategy for each of these examples is superior to a naive time-sharing strategy, and we introduce a measure to determine this improvement.
Weiss, Fran
2018-05-07
This article examines psychological sequelae underlying dysregulated eating in the overweight and obese patient and proposes a psychotherapy approach informed by classical and modern attachment theory, developmental trauma, and neuroscience to address these structural deficits.
Quantum Approach to Informatics
NASA Astrophysics Data System (ADS)
Stenholm, Stig; Suominen, Kalle-Antti
2005-08-01
An essential overview of quantum information. Information, whether inscribed as a mark on a stone tablet or encoded as a magnetic domain on a hard drive, must be stored in a physical object and thus made subject to the laws of physics. Traditionally, information processing such as computation occurred in a framework governed by laws of classical physics. However, information can also be stored and processed using the states of matter described by non-classical quantum theory. Understanding this quantum information, a fundamentally different type of information, has been a major project of physicists and information theorists in recent years, and recent experimental research has started to yield promising results. Quantum Approach to Informatics fills the need for a concise introduction to this burgeoning new field, offering an intuitive approach for readers in both the physics and information science communities, as well as in related fields. Only a basic background in quantum theory is required, and the text keeps the focus on bringing this theory to bear on contemporary informatics. Instead of proofs and other highly formal structures, detailed examples present the material, making this a uniquely accessible introduction to quantum informatics. Topics covered include: an introduction to quantum information and the qubit; concepts and methods of quantum theory important for informatics; the application of information concepts to quantum physics; quantum information processing and computing; quantum gates; error correction using quantum-based methods; and physical realizations of quantum computing circuits. A helpful and economical resource for understanding this exciting new application of quantum theory to informatics, Quantum Approach to Informatics provides students and researchers in physics and information science, as well as other interested readers with some scientific background, with an essential overview of the field.
Amplification in Technical Manuals: Theory and Practice.
ERIC Educational Resources Information Center
Killingsworth, M. Jimmie; And Others
1989-01-01
Examines how amplification (rhetorical techniques by which discourse is extended to enhance its appeal and information value) tends to increase and improve the coverage, rationale, warnings, behavioral alternatives, examples, previews, and general emphasis of technical manuals. Shows how classical and modern rhetorical theories can be applied to…
NASA Astrophysics Data System (ADS)
Blume-Kohout, Robin; Zurek, Wojciech H.
2006-06-01
We lay a comprehensive foundation for the study of redundant information storage in decoherence processes. Redundancy has been proposed as a prerequisite for objectivity, the defining property of classical objects. We consider two ensembles of states for a model universe consisting of one system and many environments: the first consisting of arbitrary states, and the second consisting of “singly branching” states consistent with a simple decoherence model. Typical states from the random ensemble do not store information about the system redundantly, but information stored in branching states has a redundancy proportional to the environment’s size. We compute the specific redundancy for a wide range of model universes, and fit the results to a simple first-principles theory. Our results show that the presence of redundancy divides information about the system into three parts: classical (redundant); purely quantum; and the borderline, undifferentiated or “nonredundant,” information.
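A standard way to make the redundancy discussed here quantitative, stated schematically (the details may differ from the paper's exact definitions): let N_δ be the size of the smallest environment fragment F whose mutual information with the system S reaches all but a fraction δ of the total information available in the whole environment ℰ of N subsystems; the redundancy is then the number of disjoint fragments of that size.

```latex
R_\delta = \frac{N}{N_\delta}, \qquad
N_\delta = \min\bigl\{\, |F| \;:\; I(S\!:\!F) \ge (1-\delta)\, I(S\!:\!\mathcal{E}) \,\bigr\}
```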
The Information Function for the One-Parameter Logistic Model: Is it Reliability?
ERIC Educational Resources Information Center
Doran, Harold C.
2005-01-01
The information function is an important statistic in item response theory (IRT) applications. Although the information function is often described as the IRT version of reliability, it differs from the classical notion of reliability from a critical perspective: replication. This article first explores the information function for the…
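Concretely, for the one-parameter logistic (Rasch) model discussed in the article, the item and test information functions take the standard forms below (textbook expressions, not taken from the abstract):

```latex
P_j(\theta) = \frac{e^{\theta - b_j}}{1 + e^{\theta - b_j}}, \qquad
I_j(\theta) = P_j(\theta)\bigl(1 - P_j(\theta)\bigr), \qquad
I(\theta) = \sum_{j=1}^{J} I_j(\theta)
```

Unlike a single reliability coefficient, the test information I(θ) varies across the ability range, which is one reason the two notions are not interchangeable.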
ERIC Educational Resources Information Center
Yelboga, Atilla; Tavsancil, Ezel
2010-01-01
In this research, classical test theory and generalizability theory analyses were carried out on data obtained with a job performance scale for the years 2005 and 2006. The reliability coefficients estimated from the classical test theory and generalizability theory analyses were compared. In classical test theory, test-retest…
Baladrón, Carlos; Khrennikov, Andrei
2016-12-01
The similarities between biological and physical systems as respectively defined in quantum information biology (QIB) and in a Darwinian approach to quantum mechanics (DAQM) have been analysed. In both theories the processing of information is a central feature characterising the systems. The analysis highlights the mutual support the two theories lend to each other's theses. On the one hand, DAQM provides a physical basis that might explain the key role played by quantum information at the macroscopic level for bio-systems in QIB. On the other hand, QIB offers the possibility, acting as a macroscopic testing ground, to analyse the emergence of quantumness from classicality in the terms held by DAQM. As an added result of the comparison, a tentative definition of quantum information in terms of classical information flows has been proposed. From this comparative analysis between QIB and DAQM, the quantum formalism appears as an optimal information scheme that would maximise the stability of biological and physical systems at any scale. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Quantum and classical behavior in interacting bosonic systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hertzberg, Mark P.
It is understood that in free bosonic theories, the classical field theory accurately describes the full quantum theory when the occupancy numbers of systems are very large. However, the situation is less understood in interacting theories, especially on time scales longer than the dynamical relaxation time. Recently there have been claims that the quantum theory deviates spectacularly from the classical theory on this time scale, even if the occupancy numbers are extremely large. Furthermore, it is claimed that the quantum theory quickly thermalizes while the classical theory does not. The evidence for these claims comes from noticing a spectacular difference in the time evolution of expectation values of quantum operators compared to the classical micro-state evolution. If true, this would have dramatic consequences for many important phenomena, including laboratory studies of interacting BECs, dark matter axions, preheating after inflation, etc. In this work we critically examine these claims. We show that in fact the classical theory can describe the quantum behavior in the high occupancy regime, even when interactions are large. The connection is that the expectation values of quantum operators in a single quantum micro-state are approximated by a corresponding classical ensemble average over many classical micro-states. Furthermore, by the ergodic theorem, a classical ensemble average of local fields with statistical translation invariance is the spatial average of a single micro-state. So the correlation functions of the quantum and classical field theories of a single micro-state approximately agree at high occupancy, even in interacting systems. Furthermore, both quantum and classical field theories can thermalize, when appropriate coarse graining is introduced, with the classical case requiring a cutoff on low occupancy UV modes. We discuss applications of our results.
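The micro-state versus ensemble point can be made concrete with a toy simulation: average a classical observable over an ensemble of trajectories whose initial conditions are Gaussian-distributed, as a stand-in for the phase-space spread of a high-occupancy quantum state. The quartic oscillator, parameter values, and sampling widths below are illustrative assumptions, not the field-theory systems studied in the paper:

```python
import numpy as np

def evolve(x, p, dt, steps, lam):
    """Leapfrog (kick-drift-kick) integration of H = p^2/2 + x^2/2 + lam*x^4/4."""
    for _ in range(steps):
        p = p - 0.5 * dt * (x + lam * x**3)  # half kick
        x = x + dt * p                       # drift
        p = p - 0.5 * dt * (x + lam * x**3)  # half kick
    return x, p

rng = np.random.default_rng(1)
n_traj, lam, dt, steps = 10_000, 0.1, 0.01, 500

# Gaussian cloud of classical micro-states around a large-amplitude
# initial condition: a stand-in for the quantum state's phase-space spread.
x0 = rng.normal(5.0, 0.1, n_traj)
p0 = rng.normal(0.0, 0.1, n_traj)

xT, _ = evolve(x0.copy(), p0.copy(), dt, steps, lam)

# The ensemble average approximates a quantum expectation value at high
# occupancy; any single micro-state follows its own classical trajectory.
print("ensemble average <x(T)> :", xT.mean())
print("one micro-state  x_1(T) :", xT[0])
```

The interaction spreads the phases of nearby trajectories, so the ensemble-averaged observable decays even though each individual micro-state keeps oscillating, which is the qualitative behavior the expectation-value comparison in the paper turns on.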
Computation in generalised probabilistic theories
NASA Astrophysics Data System (ADS)
Lee, Ciarán M.; Barrett, Jonathan
2015-08-01
From the general difficulty of simulating quantum systems using classical systems, and in particular the existence of an efficient quantum algorithm for factoring, it is likely that quantum computation is intrinsically more powerful than classical computation. At present, the best upper bound known for the power of quantum computation is that BQP ⊆ AWPP, where AWPP is a classical complexity class (known to be included in PP, hence PSPACE). This work investigates limits on computational power that are imposed by simple physical, or information theoretic, principles. To this end, we define a circuit-based model of computation in a class of operationally-defined theories more general than quantum theory, and ask: what is the minimal set of physical assumptions under which the above inclusions still hold? We show that given only an assumption of tomographic locality (roughly, that multipartite states and transformations can be characterized by local measurements), efficient computations are contained in AWPP. This inclusion still holds even without assuming a basic notion of causality (where the notion is, roughly, that probabilities for outcomes cannot depend on future measurement choices). Following Aaronson, we extend the computational model by allowing post-selection on measurement outcomes. Aaronson showed that the corresponding quantum complexity class, PostBQP, is equal to PP. Given only the assumption of tomographic locality, the inclusion in PP still holds for post-selected computation in general theories. Hence in a world with post-selection, quantum theory is optimal for computation in the space of all operational theories. We then consider whether one can obtain relativized complexity results for general theories. It is not obvious how to define a sensible notion of a computational oracle in the general framework that reduces to the standard notion in the quantum case. Nevertheless, it is possible to define computation relative to a 'classical oracle'. Then, we show there exists a classical oracle relative to which efficient computation in any theory satisfying the causality assumption does not include NP.
The Nature of Quantum Truth: Logic, Set Theory, & Mathematics in the Context of Quantum Theory
NASA Astrophysics Data System (ADS)
Frey, Kimberly
The purpose of this dissertation is to construct a radically new type of mathematics whose underlying logic differs from the ordinary classical logic used in standard mathematics, and which we feel may be more natural for applications in quantum mechanics. Specifically, we begin by constructing a first order quantum logic, the development of which closely parallels that of ordinary (classical) first order logic --- the essential differences are in the nature of the logical axioms, which, in our construction, are motivated by quantum theory. After showing that the axiomatic first order logic we develop is sound and complete (with respect to a particular class of models), this logic is then used as a foundation on which to build (axiomatic) mathematical systems --- and we refer to the resulting new mathematics as "quantum mathematics." As noted above, the hope is that this form of mathematics is more natural than classical mathematics for the description of quantum systems, and will enable us to address some foundational aspects of quantum theory which are still troublesome --- e.g. the measurement problem --- as well as possibly even inform our thinking about quantum gravity. After constructing the underlying logic, we investigate properties of several mathematical systems --- e.g. axiom systems for abstract algebras, group theory, linear algebra, etc. --- in the presence of this quantum logic. In the process, we demonstrate that the resulting quantum mathematical systems have some strange, but very interesting features, which indicates a richness in the structure of mathematics that is classically inaccessible. Moreover, some of these features do indeed suggest possible applications to foundational questions in quantum theory. We continue our investigation of quantum mathematics by constructing an axiomatic quantum set theory, which we show satisfies certain desirable criteria. Ultimately, we hope that such a set theory will lead to a foundation for quantum mathematics in a sense which parallels the foundational role of classical set theory in classical mathematics. One immediate application of the quantum set theory we develop is to provide a foundation on which to construct quantum natural numbers, which are the quantum analog of the classical counting numbers. It turns out that in a special class of models, there exists a 1-1 correspondence between the quantum natural numbers and bounded observables in quantum theory whose eigenvalues are (ordinary) natural numbers. This 1-1 correspondence is remarkably satisfying, and not only gives us great confidence in our quantum set theory, but indicates the naturalness of such models for quantum theory itself. We go on to develop a Peano-like arithmetic for these new "numbers," as well as consider some of its consequences. Finally, we conclude by summarizing our results, and discussing directions for future work.
Classical and Contemporary Approaches for Moral Development
ERIC Educational Resources Information Center
Cam, Zekeriya; Seydoogullari, Sedef; Cavdar, Duygu; Cok, Figen
2012-01-01
Most of the information in the moral development literature depends on the theories of Piaget and Kohlberg. The theoretical contributions of Gilligan and Turiel are not widely known, and few resources about them are available in Turkish. For this reason, introducing and discussing the theories of Gilligan and Turiel and a more comprehensive perspective for moral…
ERIC Educational Resources Information Center
Ismail, Yilmaz
2017-01-01
This study reveals the transformation of prospective science teachers into knowledgeable individuals through classical, combination, and information theories. It distinguishes between knowledge and success, and between knowledge levels and success levels, each calculated through the three theories. The relation between the knowledge of prospective…
Classical theory of atom-surface scattering: The rainbow effect
NASA Astrophysics Data System (ADS)
Miret-Artés, Salvador; Pollak, Eli
2012-07-01
The scattering of heavy atoms and molecules from surfaces is oftentimes dominated by classical mechanics. A large body of experiments have gathered data on the angular distributions of the scattered species, their energy loss distribution, sticking probability, dependence on surface temperature and more. For many years these phenomena have been considered theoretically in the framework of the “washboard model” in which the interaction of the incident particle with the surface is described in terms of hard wall potentials. Although this class of models has helped in elucidating some of the features it left open many questions such as: true potentials are clearly not hard wall potentials, it does not provide a realistic framework for phonon scattering, and it cannot explain the incident angle and incident energy dependence of rainbow scattering, nor can it provide a consistent theory for sticking. In recent years we have been developing a classical perturbation theory approach which has provided new insight into the dynamics of atom-surface scattering. The theory includes both surface corrugation as well as interaction with surface phonons in terms of harmonic baths which are linearly coupled to the system coordinates. This model has been successful in elucidating many new features of rainbow scattering in terms of frictions and bath fluctuations or noise. It has also given new insight into the origins of asymmetry in atomic scattering from surfaces. New phenomena deduced from the theory include friction induced rainbows, energy loss rainbows, a theory of super-rainbows, and more. In this review we present the classical theory of atom-surface scattering as well as extensions and implications for semiclassical scattering and the further development of a quantum theory of surface scattering. Special emphasis is given to the inversion of scattering data into information on the particle-surface interactions.
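The rainbow effect named in the title has a compact classical statement: if Θ(b) is the final deflection angle as a function of the impact point b within a unit cell of the corrugated surface, rainbow angles occur at the extrema of Θ, where the classical angular distribution diverges. This standard relation is given here for orientation, not quoted from the review:

```latex
\left.\frac{d\Theta(b)}{db}\right|_{b=b_r} = 0
\quad\Longrightarrow\quad
P(\Theta) \propto \left|\frac{d\Theta}{db}\right|^{-1} \to \infty
\ \ \text{at } \Theta_r = \Theta(b_r)
```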
Contextual Advantage for State Discrimination
NASA Astrophysics Data System (ADS)
Schmid, David; Spekkens, Robert W.
2018-02-01
Finding quantitative aspects of quantum phenomena which cannot be explained by any classical model has foundational importance for understanding the boundary between classical and quantum theory. It also has practical significance for identifying information processing tasks for which those phenomena provide a quantum advantage. Using the framework of generalized noncontextuality as our notion of classicality, we find one such nonclassical feature within the phenomenology of quantum minimum-error state discrimination. Namely, we identify quantitative limits on the success probability for minimum-error state discrimination in any experiment described by a noncontextual ontological model. These constraints constitute noncontextuality inequalities that are violated by quantum theory, and this violation implies a quantum advantage for state discrimination relative to noncontextual models. Furthermore, our noncontextuality inequalities are robust to noise and are operationally formulated, so that any experimental violation of the inequalities is a witness of contextuality, independently of the validity of quantum theory. Along the way, we introduce new methods for analyzing noncontextuality scenarios and demonstrate a tight connection between our minimum-error state discrimination scenario and a Bell scenario.
Diagrammar in classical scalar field theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cattaruzza, E., E-mail: Enrico.Cattaruzza@gmail.com; Gozzi, E., E-mail: gozzi@ts.infn.it; INFN, Sezione di Trieste
2011-09-15
In this paper we analyze perturbatively a gφ⁴ classical field theory with and without temperature. In order to do that, we make use of a path-integral approach developed some time ago for classical theories. It turns out that the diagrams appearing at the classical level are many more than at the quantum level due to the presence of extra auxiliary fields in the classical formalism. We shall show that a universal supersymmetry present in the classical path-integral mentioned above is responsible for the cancelation of various diagrams. The same supersymmetry allows the introduction of super-fields and super-diagrams which considerably simplify the calculations and make the classical perturbative calculations almost 'identical' formally to the quantum ones. Using the super-diagrams technique, we develop the classical perturbation theory up to third order. We conclude the paper with a perturbative check of the fluctuation-dissipation theorem. Highlights: We provide the Feynman diagrams of perturbation theory for a classical field theory. We give a super-formalism which links the quantum diagrams to the classical ones. We check perturbatively the fluctuation-dissipation theorem.
Horizons of description: Black holes and complementarity
NASA Astrophysics Data System (ADS)
Bokulich, Peter Joshua Martin
Niels Bohr famously argued that a consistent understanding of quantum mechanics requires a new epistemic framework, which he named complementarity. This position asserts that even in the context of quantum theory, classical concepts must be used to understand and communicate measurement results. The apparent conflict between certain classical descriptions is avoided by recognizing that their application now crucially depends on the measurement context. Recently it has been argued that a new form of complementarity can provide a solution to the so-called information loss paradox. Stephen Hawking argues that the evolution of black holes cannot be described by standard unitary quantum evolution, because such evolution always preserves information, while the evaporation of a black hole will imply that any information that fell into it is irrevocably lost---hence a "paradox." Some researchers in quantum gravity have argued that this paradox can be resolved if one interprets certain seemingly incompatible descriptions of events around black holes as instead being complementary. In this dissertation I assess the extent to which this black hole complementarity can be undergirded by Bohr's account of the limitations of classical concepts. I begin by offering an interpretation of Bohr's complementarity and the role that it plays in his philosophy of quantum theory. After clarifying the nature of classical concepts, I offer an account of the limitations these concepts face, and argue that Bohr's appeal to disturbance is best understood as referring to these conceptual limits. Following preparatory chapters on issues in quantum field theory and black hole mechanics, I offer an analysis of the information loss paradox and various responses to it. I consider the three most prominent accounts of black hole complementarity and argue that they fail to offer sufficient justification for the proposed incompatibility between descriptions. The lesson that emerges from this dissertation is that we have as much to learn from the limitations facing our scientific descriptions as we do from the successes they enjoy. Because all of our scientific theories offer at best limited, effective accounts of the world, an important part of our interpretive efforts will be assessing the borders of these domains of description.
Quantum information theory of the Bell-state quantum eraser
NASA Astrophysics Data System (ADS)
Glick, Jennifer R.; Adami, Christoph
2017-01-01
Quantum systems can display particle- or wavelike properties, depending on the type of measurement that is performed on them. The Bell-state quantum eraser is an experiment that brings the duality to the forefront, as a single measurement can retroactively be made to measure particlelike or wavelike properties (or anything in between). Here we develop a unitary information-theoretic description of this and several related quantum measurement situations that sheds light on the trade-off between the quantum and classical features of the measurement. In particular, we show that both the coherence of the quantum state and the classical information obtained from it can be described using only quantum-information-theoretic tools and that those two measures satisfy an equality on account of the chain rule for entropies. The coherence information and the which-path information have simple interpretations in terms of state preparation and state determination and suggest ways to account for the relationship between the classical and the quantum world.
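The equality mentioned at the end is an instance of the entropic chain rule: writing S(A) for the system's entropy, I(A:D) for the information shared with the detector D, and S(A|D) for the conditional entropy, the classical (which-path) and coherent pieces always add up to the same total. This is the schematic form; the paper's precise definitions may differ:

```latex
S(A) = I(A\!:\!D) + S(A \mid D)
```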
Generation of a non-zero discord bipartite state with classical second-order interference.
Choi, Yujun; Hong, Kang-Hee; Lim, Hyang-Tag; Yune, Jiwon; Kwon, Osung; Han, Sang-Wook; Oh, Kyunghwan; Kim, Yoon-Ho; Kim, Yong-Su; Moon, Sung
2017-02-06
We report an investigation of quantum discord in classical second-order interference. In particular, we theoretically show that a bipartite state with discord D = 0.311 can be generated via classical second-order interference. We also experimentally verify the theory by obtaining a non-zero discord state with D = 0.197 ± 0.060. Together with the fact that the nonclassicalities originating from physical constraints and from information-theoretic perspectives are not equivalent, this result provides an insight into the nature of quantum discord.
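For reference, quantum discord measures the gap between two expressions for the mutual information that coincide classically but differ quantum mechanically (standard definition; {Π_b} ranges over measurements on subsystem B):

```latex
D(A \mid B) = I(A\!:\!B) - J(A \mid B), \qquad
J(A \mid B) = S(A) - \min_{\{\Pi_b\}} \sum_b p_b\, S(\rho_{A|b})
```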
Relevance of a Managerial Decision-Model to Educational Administration.
ERIC Educational Resources Information Center
Lundin, Edward.; Welty, Gordon
The rational model of classical economic theory assumes that the decision maker has complete information on alternatives and consequences, and that he chooses the alternative that maximizes expected utility. This model does not allow for constraints placed on the decision maker resulting from lack of information, organizational pressures,…
Raykov, Tenko; Marcoulides, George A
2016-04-01
The frequently neglected and often misunderstood relationship between classical test theory and item response theory is discussed for the unidimensional case with binary measures and no guessing. It is pointed out that popular item response models can be directly obtained from classical test theory-based models by accounting for the discrete nature of the observed items. Two distinct observational equivalence approaches are outlined that render the item response models from corresponding classical test theory-based models, and can each be used to obtain the former from the latter models. Similarly, classical test theory models can be furnished using the reverse application of either of those approaches from corresponding item response models.
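One standard way to see the connection described here, given as an illustration rather than necessarily the authors' exact construction: dichotomizing a classical latent-response model at a threshold yields a normal-ogive item response model.

```latex
Y_j^{*} = \lambda_j \eta + \varepsilon_j,\quad \varepsilon_j \sim N(0,\sigma_j^2),
\qquad X_j = \mathbf{1}\{Y_j^{*} \ge \tau_j\}
\;\Longrightarrow\;
P(X_j = 1 \mid \eta) = \Phi\!\left(\frac{\lambda_j \eta - \tau_j}{\sigma_j}\right)
```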
INFORMATION-THEORETIC INEQUALITIES ON UNIMODULAR LIE GROUPS
Chirikjian, Gregory S.
2010-01-01
Classical inequalities used in information theory such as those of de Bruijn, Fisher, Cramér, Rao, and Kullback carry over in a natural way from Euclidean space to unimodular Lie groups. These are groups that possess an integration measure that is simultaneously invariant under left and right shifts. All commutative groups are unimodular. And even in noncommutative cases unimodular Lie groups share many of the useful features of Euclidean space. The rotation and Euclidean motion groups, which are perhaps the most relevant Lie groups to problems in geometric mechanics, are unimodular, as are the unitary groups that play important roles in quantum computing. The extension of core information theoretic inequalities defined in the setting of Euclidean space to this broad class of Lie groups is potentially relevant to a number of problems relating to information gathering in mobile robotics, satellite attitude control, tomographic image reconstruction, biomolecular structure determination, and quantum information theory. In this paper, several definitions are extended from the Euclidean setting to that of Lie groups (including entropy and the Fisher information matrix), and inequalities analogous to those in classical information theory are derived and stated in the form of fifteen small theorems. In all such inequalities, addition of random variables is replaced with the group product, and the appropriate generalization of convolution of probability densities is employed. An example from the field of robotics demonstrates how several of these results can be applied to quantify the amount of information gained by pooling different sensory inputs. PMID:21113416
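A minimal numerical sketch of the flavor of these results, using the finite cyclic group Z_n as a toy stand-in for a compact unimodular group (my example, not the paper's): convolution, the group analogue of adding independent random variables, cannot decrease entropy.

    import numpy as np

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def cyclic_convolve(f, g):
        # group convolution on Z_n: (f*g)[k] = sum_j f[j] g[(k-j) mod n]
        n = len(f)
        return np.array([sum(f[j] * g[(k - j) % n] for j in range(n))
                         for k in range(n)])

    rng = np.random.default_rng(0)
    n = 12
    f = rng.random(n); f /= f.sum()
    g = rng.random(n); g /= g.sum()

    h = cyclic_convolve(f, g)
    print(entropy(f), entropy(g), entropy(h))
    # entropy of the convolution dominates both inputs
    assert entropy(h) >= max(entropy(f), entropy(g)) - 1e-12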
Enhanced Communication with the Assistance of Indefinite Causal Order
NASA Astrophysics Data System (ADS)
Ebler, Daniel; Salek, Sina; Chiribella, Giulio
2018-03-01
In quantum Shannon theory, the way information is encoded and decoded takes advantage of the laws of quantum mechanics, while the way communication channels are interlinked is assumed to be classical. In this Letter, we relax the assumption that quantum channels are combined classically, showing that a quantum communication network where quantum channels are combined in a superposition of different orders can achieve tasks that are impossible in conventional quantum Shannon theory. In particular, we show that two identical copies of a completely depolarizing channel become able to transmit information when they are combined in a quantum superposition of two alternative orders. This finding runs counter to the intuition that if two communication channels are identical, using them in different orders should not make any difference. The failure of such intuition stems from the fact that a single noisy channel can be a random mixture of elementary, noncommuting processes, whose order (or lack thereof) can affect the ability to transmit information.
Enhanced Communication with the Assistance of Indefinite Causal Order.
Ebler, Daniel; Salek, Sina; Chiribella, Giulio
2018-03-23
In quantum Shannon theory, the way information is encoded and decoded takes advantage of the laws of quantum mechanics, while the way communication channels are interlinked is assumed to be classical. In this Letter, we relax the assumption that quantum channels are combined classically, showing that a quantum communication network where quantum channels are combined in a superposition of different orders can achieve tasks that are impossible in conventional quantum Shannon theory. In particular, we show that two identical copies of a completely depolarizing channel become able to transmit information when they are combined in a quantum superposition of two alternative orders. This finding runs counter to the intuition that if two communication channels are identical, using them in different orders should not make any difference. The failure of such intuition stems from the fact that a single noisy channel can be a random mixture of elementary, noncommuting processes, whose order (or lack thereof) can affect the ability to transmit information.
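A minimal numpy sketch of the central example above (my own construction from the standard Kraus representations; not the authors' code): two completely depolarizing channels, each useless on its own, are combined in a superposition of orders controlled by a qubit prepared in |+>, and the output still depends on the input state.

    import numpy as np

    # Pauli matrices; K_i = sigma_i / 2 are Kraus operators of the
    # completely depolarizing qubit channel.
    I2 = np.eye(2, dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)
    K = [s / 2 for s in (I2, X, Y, Z)]

    def switch_output(rho):
        """Quantum switch of two depolarizing channels, control in |+>."""
        plus = np.array([[1], [1]], dtype=complex) / np.sqrt(2)
        omega = plus @ plus.conj().T                    # control |+><+|
        P0 = np.diag([1.0, 0.0]).astype(complex)        # |0><0| on control
        P1 = np.diag([0.0, 1.0]).astype(complex)        # |1><1| on control
        out = np.zeros((4, 4), dtype=complex)
        for Ki in K:
            for Kj in K:
                # order Ki∘Kj if control is 0, Kj∘Ki if control is 1
                S = np.kron(Ki @ Kj, P0) + np.kron(Kj @ Ki, P1)
                out += S @ np.kron(rho, omega) @ S.conj().T
        return out

    rho0 = np.diag([1.0, 0.0]).astype(complex)   # |0><0|
    rho1 = np.diag([0.0, 1.0]).astype(complex)   # |1><1|
    print(np.allclose(switch_output(rho0), switch_output(rho1)))  # False

Each channel alone maps every input to the maximally mixed state, so the dependence of the switch output on the input is precisely the communication advantage the abstract describes.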
ERIC Educational Resources Information Center
Algesheimer, René; Bagozzi, Richard P.; Dholakia, Utpal M.
2018-01-01
We offer a new conceptualization and measurement models for constructs at the group-level of analysis in small group research. The conceptualization starts with classical notions of group behavior proposed by Tönnies, Simmel, and Weber and then draws upon plural subject theory by philosophers Gilbert and Tuomela to frame a new perspective…
ERIC Educational Resources Information Center
Raykov, Tenko; Marcoulides, George A.
2016-01-01
The frequently neglected and often misunderstood relationship between classical test theory and item response theory is discussed for the unidimensional case with binary measures and no guessing. It is pointed out that popular item response models can be directly obtained from classical test theory-based models by accounting for the discrete…
ERIC Educational Resources Information Center
Peeraer, Jef; Van Petegem, Peter
2012-01-01
This research describes the development and validation of an instrument to measure integration of Information and Communication Technology (ICT) in education. After literature research on definitions of integration of ICT in education, a comparison is made between the classical test theory and the item response modeling approach for the…
Fundamental theories of waves and particles formulated without classical mass
NASA Astrophysics Data System (ADS)
Fry, J. L.; Musielak, Z. E.
2010-12-01
Quantum and classical mechanics are two conceptually and mathematically different theories of physics, and yet they do use the same concept of classical mass that was originally introduced by Newton in his formulation of the laws of dynamics. In this paper, physical consequences of using the classical mass by both theories are explored, and a novel approach that allows formulating fundamental (Galilean invariant) theories of waves and particles without formally introducing the classical mass is presented. In this new formulation, the theories depend only on one common parameter called 'wave mass', which is deduced from experiments for selected elementary particles and for the classical mass of one kilogram. It is shown that quantum theory with the wave mass is independent of the Planck constant and that higher accuracy can be attained in calculations with such a theory. Natural units in connection with the presented approach are also discussed, and justification beyond dimensional analysis is given for the particular choice of such units.
The contrasting roles of Planck's constant in classical and quantum theories
NASA Astrophysics Data System (ADS)
Boyer, Timothy H.
2018-04-01
We trace the historical appearance of Planck's constant in physics, and we note that initially the constant did not appear in connection with quanta. Furthermore, we emphasize that Planck's constant can appear in both classical and quantum theories. In both theories, Planck's constant sets the scale of atomic phenomena. However, the roles played in the foundations of the theories are sharply different. In quantum theory, Planck's constant is crucial to the structure of the theory. On the other hand, in classical electrodynamics, Planck's constant is optional, since it appears only as the scale factor for the (homogeneous) source-free contribution to the general solution of Maxwell's equations. Since classical electrodynamics can be solved while taking the homogeneous source-free contribution in the solution as zero or non-zero, there are naturally two different theories of classical electrodynamics, one in which Planck's constant is taken as zero and one where it is taken as non-zero. The textbooks of classical electromagnetism present only the version in which Planck's constant is taken to vanish.
Classical geometry to quantum behavior correspondence in a virtual extra dimension
NASA Astrophysics Data System (ADS)
Dolce, Donatello
2012-09-01
In the Lorentz invariant formalism of compact space-time dimensions the assumption of periodic boundary conditions represents a consistent semi-classical quantization condition for relativistic fields. In Dolce (2011) [18] we have shown, for instance, that the ordinary Feynman path integral is obtained from the interference between the classical paths with different winding numbers associated with the cyclic dynamics of the field solutions. By means of the boundary conditions, the kinematical information of interactions can be encoded on the relativistic geometrodynamics of the boundary, see Dolce (2012) [8]. Furthermore, such a purely four-dimensional theory is manifestly dual to an extra-dimensional field theory. The resulting correspondence between extra-dimensional geometrodynamics and ordinary quantum behavior can be interpreted in terms of AdS/CFT correspondence. By applying this approach to a simple Quark-Gluon-Plasma freeze-out model we obtain fundamental analogies with basic aspects of AdS/QCD phenomenology.
No extension of quantum theory can have improved predictive power.
Colbeck, Roger; Renner, Renato
2011-08-02
According to quantum theory, measurements generate random outcomes, in stark contrast with classical mechanics. This raises the question of whether there could exist an extension of the theory that removes this indeterminism, as suspected by Einstein, Podolsky and Rosen. Although this has been shown to be impossible, existing results do not imply that the current theory is maximally informative. Here we ask the more general question of whether any improved predictions can be achieved by any extension of quantum theory. Under the assumption that measurements can be chosen freely, we answer this question in the negative: no extension of quantum theory can give more information about the outcomes of future measurements than quantum theory itself. Our result has significance for the foundations of quantum mechanics, as well as applications to tasks that exploit the inherent randomness in quantum theory, such as quantum cryptography.
No extension of quantum theory can have improved predictive power
Colbeck, Roger; Renner, Renato
2011-01-01
According to quantum theory, measurements generate random outcomes, in stark contrast with classical mechanics. This raises the question of whether there could exist an extension of the theory that removes this indeterminism, as suspected by Einstein, Podolsky and Rosen. Although this has been shown to be impossible, existing results do not imply that the current theory is maximally informative. Here we ask the more general question of whether any improved predictions can be achieved by any extension of quantum theory. Under the assumption that measurements can be chosen freely, we answer this question in the negative: no extension of quantum theory can give more information about the outcomes of future measurements than quantum theory itself. Our result has significance for the foundations of quantum mechanics, as well as applications to tasks that exploit the inherent randomness in quantum theory, such as quantum cryptography. PMID:21811240
Taking-On: A Grounded Theory of Addressing Barriers in Task Completion
ERIC Educational Resources Information Center
Austinson, Julie Ann
2011-01-01
This study of taking-on was conducted using classical grounded theory methodology (Glaser, 1978, 1992, 1998, 2001, 2005; Glaser & Strauss, 1967). Classical grounded theory is inductive, empirical, and naturalistic; it does not utilize manipulation or constrained time frames. Classical grounded theory is a systemic research method used to generate…
Non-Kolmogorovian Approach to the Context-Dependent Systems Breaking the Classical Probability Law
NASA Astrophysics Data System (ADS)
Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Yamato, Ichiro
2013-07-01
There exist several phenomena breaking the classical probability laws. The systems related to such phenomena are context-dependent, so that they are adaptive to other systems. In this paper, we present a new mathematical formalism to compute the joint probability distribution for two event-systems by using concepts of the adaptive dynamics and quantum information theory, e.g., quantum channels and liftings. In physics the basic example of the context-dependent phenomena is the famous double-slit experiment. Recently similar examples have been found in biological and psychological sciences. Our approach is an extension of traditional quantum probability theory, and it is general enough to describe aforementioned contextual phenomena outside of quantum physics.
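The classical law violated in such contexts is the formula of total probability; in the quantum-like formalism of Khrennikov and co-workers it acquires an interference term (written here for a dichotomous conditioning variable A; notation mine):

    P(B{=}b) = \sum_{a=1,2} P(A{=}a)\, P(B{=}b \mid A{=}a) + 2 \cos\theta_b \sqrt{ \prod_{a=1,2} P(A{=}a)\, P(B{=}b \mid A{=}a) },

with cos θ_b = 0 recovering the classical Kolmogorovian case.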
Existence of an information unit as a postulate of quantum theory.
Masanes, Lluís; Müller, Markus P; Augusiak, Remigiusz; Pérez-García, David
2013-10-08
Does information play a significant role in the foundations of physics? Information is the abstraction that allows us to refer to the states of systems when we choose to ignore the systems themselves. This is only possible in very particular frameworks, like in classical or quantum theory, or more generally, whenever there exists an information unit such that the state of any system can be reversibly encoded in a sufficient number of such units. In this work, we show how the abstract formalism of quantum theory can be deduced solely from the existence of an information unit with suitable properties, together with two further natural assumptions: the continuity and reversibility of dynamics, and the possibility of characterizing the state of a composite system by local measurements. This constitutes a set of postulates for quantum theory with a simple and direct physical meaning, like the ones of special relativity or thermodynamics, and it articulates a strong connection between physics and information.
Existence of an information unit as a postulate of quantum theory
Masanes, Lluís; Müller, Markus P.; Augusiak, Remigiusz; Pérez-García, David
2013-01-01
Does information play a significant role in the foundations of physics? Information is the abstraction that allows us to refer to the states of systems when we choose to ignore the systems themselves. This is only possible in very particular frameworks, like in classical or quantum theory, or more generally, whenever there exists an information unit such that the state of any system can be reversibly encoded in a sufficient number of such units. In this work, we show how the abstract formalism of quantum theory can be deduced solely from the existence of an information unit with suitable properties, together with two further natural assumptions: the continuity and reversibility of dynamics, and the possibility of characterizing the state of a composite system by local measurements. This constitutes a set of postulates for quantum theory with a simple and direct physical meaning, like the ones of special relativity or thermodynamics, and it articulates a strong connection between physics and information. PMID:24062431
Quantum stochastic walks on networks for decision-making.
Martínez-Martínez, Ismael; Sánchez-Burillo, Eduardo
2016-03-31
Recent experiments report violations of the classical law of total probability and incompatibility of certain mental representations when humans process and react to information. Evidence shows promise of a more general quantum theory providing a better explanation of the dynamics and structure of real decision-making processes than classical probability theory. Inspired by this, we show how the behavioral choice-probabilities can arise as the unique stationary distribution of quantum stochastic walkers on the classical network defined from Luce's response probabilities. This work is relevant because (i) we provide a very general framework integrating the positive characteristics of both quantum and classical approaches previously in confrontation, and (ii) we define a cognitive network which can be used to bring other connectivist approaches to decision-making into the quantum stochastic realm. We model the decision-maker as an open system in contact with her surrounding environment, and the time-length of the decision-making process turns out to be also a measure of the process's degree of interplay between the unitary and irreversible dynamics. Implementing quantum coherence on classical networks may be a door to better integrate human-like reasoning biases in stochastic models for decision-making.
Quantum stochastic walks on networks for decision-making
NASA Astrophysics Data System (ADS)
Martínez-Martínez, Ismael; Sánchez-Burillo, Eduardo
2016-03-01
Recent experiments report violations of the classical law of total probability and incompatibility of certain mental representations when humans process and react to information. Evidence shows promise of a more general quantum theory providing a better explanation of the dynamics and structure of real decision-making processes than classical probability theory. Inspired by this, we show how the behavioral choice-probabilities can arise as the unique stationary distribution of quantum stochastic walkers on the classical network defined from Luce's response probabilities. This work is relevant because (i) we provide a very general framework integrating the positive characteristics of both quantum and classical approaches previously in confrontation, and (ii) we define a cognitive network which can be used to bring other connectivist approaches to decision-making into the quantum stochastic realm. We model the decision-maker as an open system in contact with her surrounding environment, and the time-length of the decision-making process turns out to be also a measure of the process's degree of interplay between the unitary and irreversible dynamics. Implementing quantum coherence on classical networks may be a door to better integrate human-like reasoning biases in stochastic models for decision-making.
Quantum stochastic walks on networks for decision-making
Martínez-Martínez, Ismael; Sánchez-Burillo, Eduardo
2016-01-01
Recent experiments report violations of the classical law of total probability and incompatibility of certain mental representations when humans process and react to information. Evidence shows promise of a more general quantum theory providing a better explanation of the dynamics and structure of real decision-making processes than classical probability theory. Inspired by this, we show how the behavioral choice-probabilities can arise as the unique stationary distribution of quantum stochastic walkers on the classical network defined from Luce's response probabilities. This work is relevant because (i) we provide a very general framework integrating the positive characteristics of both quantum and classical approaches previously in confrontation, and (ii) we define a cognitive network which can be used to bring other connectivist approaches to decision-making into the quantum stochastic realm. We model the decision-maker as an open system in contact with her surrounding environment, and the time-length of the decision-making process turns out to be also a measure of the process's degree of interplay between the unitary and irreversible dynamics. Implementing quantum coherence on classical networks may be a door to better integrate human-like reasoning biases in stochastic models for decision-making. PMID:27030372
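A small Python sketch of a quantum stochastic walk relaxing to its stationary distribution, in the standard Lindblad-type form for such walks (the three-alternative network and rates below are illustrative choices of mine, not the authors' model):

    import numpy as np

    # drho/dt = -(1-w) i [H, rho] + w * sum_k (L_k rho L_k+ - 1/2 {L_k+ L_k, rho})
    n = 3
    H = np.array([[0, 1, 1],
                  [1, 0, 1],
                  [1, 1, 0]], dtype=complex)       # adjacency -> coherent part
    T = np.array([[0.0, 0.6, 0.3],
                  [0.5, 0.0, 0.7],
                  [0.5, 0.4, 0.0]])                # classical hopping rates

    Ls = []
    for i in range(n):
        for j in range(n):
            if T[i, j] > 0:
                L = np.zeros((n, n), dtype=complex)
                L[i, j] = np.sqrt(T[i, j])         # incoherent jump j -> i
                Ls.append(L)

    def rhs(rho, w=0.5):
        comm = H @ rho - rho @ H
        diss = sum(L @ rho @ L.conj().T
                   - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L)
                   for L in Ls)
        return -(1 - w) * 1j * comm + w * diss

    rho = np.eye(n, dtype=complex) / n             # start from the uniform state
    dt = 0.01
    for _ in range(20000):                         # crude Euler integration
        rho = rho + dt * rhs(rho)

    print(np.real(np.diag(rho)))                   # stationary choice probabilities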
On the classic and modern theories of matching.
McDowell, J J
2005-07-01
Classic matching theory, which is based on Herrnstein's (1961) original matching equation and includes the well-known quantitative law of effect, is almost certainly false. The theory is logically inconsistent with known experimental findings, and experiments have shown that its central constant-k assumption is not tenable. Modern matching theory, which is based on the power function version of the original matching equation, remains tenable, although it has not been discussed or studied extensively. The modern theory is logically consistent with known experimental findings, it predicts the fact and details of the violation of the classic theory's constant-k assumption, and it accurately describes at least some data that are inconsistent with the classic theory.
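For reference, the two model families contrasted above take the following standard forms (Herrnstein's classic matching with its constant-k hyperbola, and the modern power-function or generalized matching law):

    \frac{B_1}{B_1 + B_2} = \frac{r_1}{r_1 + r_2}, \qquad B = \frac{k\, r}{r + r_e} \quad (\text{classic, constant } k),

    \frac{B_1}{B_2} = b \left( \frac{r_1}{r_2} \right)^a \quad (\text{modern, generalized matching}).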
Classical Field Theory and the Stress-Energy Tensor
NASA Astrophysics Data System (ADS)
Swanson, Mark S.
2015-09-01
This book is a concise introduction to the key concepts of classical field theory for beginning graduate students and advanced undergraduate students who wish to study the unifying structures and physical insights provided by classical field theory without dealing with the additional complication of quantization. In that regard, there are many important aspects of field theory that can be understood without quantizing the fields. These include the action formulation, Galilean and relativistic invariance, traveling and standing waves, spin angular momentum, gauge invariance, subsidiary conditions, fluctuations, spinor and vector fields, conservation laws and symmetries, and the Higgs mechanism, all of which are often treated briefly in a course on quantum field theory. The variational form of classical mechanics and continuum field theory are both developed in the time-honored graduate level text by Goldstein et al (2001). An introduction to classical field theory from a somewhat different perspective is available in Soper (2008). Basic classical field theory is often treated in books on quantum field theory. Two excellent texts where this is done are Greiner and Reinhardt (1996) and Peskin and Schroeder (1995). Green's function techniques are presented in Arfken et al (2013).
An Introduction to the Problem of the Existence of Classical and Quantum Information
NASA Astrophysics Data System (ADS)
Rocchi, Paolo; Gianfagna, Leonida
2006-01-01
Quantum computing has prompted new reflection on the nature of information; in particular, a number of theorists have revisited the critical elements of Shannon's work, which currently emerges as the most popular reference in the quantum territory. The present paper follows this vein and highlights how the prerequisites of information theory, which should spell out the precise hypotheses of this theory, remain rather obscure, so that the problem of the existence of information is still open. This work puts forward a theoretical scheme that establishes the existence of elementary information items. These results clarify basic assumptions in information engineering. We then present evidence that information is not an absolute quantity and close with a discussion of the relativity of information.
Constructor theory of information
Deutsch, David; Marletto, Chiara
2015-01-01
We propose a theory of information expressed solely in terms of which transformations of physical systems are possible and which are impossible—i.e. in constructor-theoretic terms. It includes conjectured, exact laws of physics expressing the regularities that allow information to be physically instantiated. Although these laws are directly about information, independently of the details of particular physical instantiations, information is not regarded as an a priori mathematical or logical concept, but as something whose nature and properties are determined by the laws of physics alone. This theory solves a problem at the foundations of existing information theory, namely that information and distinguishability are each defined in terms of the other. It also explains the relationship between classical and quantum information, and reveals the single, constructor-theoretic property underlying the most distinctive phenomena associated with the latter, including the lack of in-principle distinguishability of some states, the impossibility of cloning, the existence of pairs of variables that cannot simultaneously have sharp values, the fact that measurement processes can be both deterministic and unpredictable, the irreducible perturbation caused by measurement, and locally inaccessible information (as in entangled systems). PMID:25663803
Quantum Foundations of Quantum Information
NASA Astrophysics Data System (ADS)
Griffiths, Robert
2009-03-01
The main foundational issue for quantum information is: What is quantum information about? What does it refer to? Classical information typically refers to physical properties, and since classical is a subset of quantum information (assuming the world is quantum mechanical), quantum information should--and, it will be argued, does--refer to quantum physical properties represented by projectors on appropriate subspaces of a quantum Hilbert space. All sorts of microscopic and macroscopic properties, not just measurement outcomes, can be represented in this way, and are thus a proper subject of quantum information. The Stern-Gerlach experiment illustrates this. When properties are compatible, which is to say their projectors commute, Shannon's classical information theory based on statistical correlations extends without difficulty or change to the quantum case. When projectors do not commute, giving rise to characteristic quantum effects, a foundation for the subject can still be constructed by replacing the ``measurement and wave-function collapse'' found in textbooks--an efficient calculational tool, but one giving rise to numerous conceptual difficulties--with a fully consistent and paradox free stochastic formulation of standard quantum mechanics. This formulation is particularly helpful in that it contains no nonlocal superluminal influences; the reason the latter carry no information is that they do not exist.
A quantum-classical theory with nonlinear and stochastic dynamics
NASA Astrophysics Data System (ADS)
Burić, N.; Popović, D. B.; Radonjić, M.; Prvanović, S.
2014-12-01
The method of constrained dynamical systems on the quantum-classical phase space is utilized to develop a theory of quantum-classical hybrid systems. Effects of the classical degrees of freedom on the quantum part are modeled using an appropriate constraint, and the interaction also includes the effects of neglected degrees of freedom. The dynamical law of the theory is given in terms of nonlinear stochastic differential equations with Hamiltonian and gradient terms. The theory provides a successful dynamical description of the collapse during quantum measurement.
Classical probes of string/gauge theory duality
NASA Astrophysics Data System (ADS)
Ishizeki, Riei
The AdS/CFT correspondence has played an important role in the recent development of string theory, because it proposes a description of certain gauge theories in terms of string theory: simple string theory computations give information about the strong coupling regime of the gauge theory and, vice versa, gauge theory computations give information about string theory and quantum gravity. Although much is known about AdS/CFT, the precise map between the two sides of the correspondence is not completely understood, and in the unraveling of such a map classical string solutions play a vital role. In this thesis, several classical string solutions are proposed to help understand the AdS/CFT duality. First, rigidly rotating strings on a two-sphere are studied. Taking special limits of such solutions leads to two cases: the already known giant magnon solution, and a new solution which we call the single spike solution. Next, we compute the scattering phase shift of the single spike solutions and compare the result with the giant magnon solutions. Intriguingly, the results are the same up to non-logarithmic terms, indicating that the single spike solution should have the same rich spin chain structure as the giant magnon solution. Afterward, we consider open string solutions ending on the boundary of AdS5. The lines traced by the ends of such open strings can be viewed as Wilson loops in N = 4 SYM theory. After applying an inversion transformation, the open Wilson loops become closed Wilson loops whose expectation value is consistent with previously conjectured results. Next, several Wilson loops for N = 4 SYM in an AdS5 pp-wave background are considered and translated to the pure AdS5 background, and their interpretation as forward quark-gluon scattering is suggested. In the last part of this thesis, a class of classical solutions for closed strings moving in AdS3 x S1 ⊂ AdS5 x S5 with energy E and spin S in AdS3 and angular momentum J and winding m in S1 is explained. The relation of different limits of the spiky string solution to the Landau-Lifshitz model is of particular interest. The presented solutions provide new classes of string motion that can be used to better understand the AdS/CFT correspondence, including the single spike solution and previously unknown examples of supersymmetric Wilson loops.
Linear Quantum Systems: Non-Classical States and Robust Stability
2016-06-29
... has a history going back some 50 years, to the birth of modern control theory with Kalman's foundational work on filtering and LQG optimal control ... analysis and control of quantum linear systems and their interactions with non-classical quantum fields by developing control-theoretic concepts exploiting ...
On the emergence of classical gravity
NASA Astrophysics Data System (ADS)
Larjo, Klaus
In this thesis I will discuss how certain black holes arise as an effective, thermodynamical description from non-singular microstates in string theory. This provides a possible solution to the information paradox, and strengthens the case for treating black holes as thermodynamical objects. I will characterize the data defining a microstate of a black hole in several settings, and demonstrate that most of the data is unmeasurable for a classical observer. I will further show that the data that is measurable is universal for nearly all microstates, making it impossible for a classical observer to distinguish between microstates, thus giving rise to an effective statistical description for the black hole. In the first half of the thesis I will work with two specific systems: the half-BPS sector of N = 4 super Yang-Mills and the conformal field theory corresponding to the D1/D5 system; in both cases the high degree of symmetry present provides great control over potentially intractable computations. For these systems, I will further specify the conditions a quantum mechanical microstate must satisfy in order to have a classical description in terms of a unique metric, and define a 'metric operator' whose eigenstates correspond to classical geometries. In the second half of the thesis I will consider a much broader setting, general N = 1 superconformal quiver gauge theories and their dual gravity theories, and demonstrate that a similar effective description arises also in this setting.
Bagarello, F; Haven, E; Khrennikov, A
2017-11-13
We present the mathematical model of decision-making (DM) of agents acting in a complex and uncertain environment (combining a huge variety of economic, financial, behavioural and geopolitical factors). To describe the interaction of agents with it, we apply the formalism of quantum field theory (QFT). Quantum fields are of a purely informational nature. The QFT model can be treated as a distant relative of expected utility theory, where the role of utility is played by adaptivity to an environment (bath). However, this sort of utility-adaptivity cannot be represented simply as a numerical function. The operator representation in Hilbert space is used and adaptivity is described as in quantum dynamics. We are especially interested in the stabilization of solutions for sufficiently large times. The outputs of this stabilization process, probabilities for possible choices, are treated in the framework of classical DM. To connect classical and quantum DM, we appeal to Quantum Bayesianism. We demonstrate the quantum-like interference effect in DM, which is exhibited as a violation of the formula of total probability, and hence of the classical Bayesian inference scheme. This article is part of the themed issue 'Second quantum revolution: foundational questions'.
A model of adaptive decision-making from representation of information environment by quantum fields
NASA Astrophysics Data System (ADS)
Bagarello, F.; Haven, E.; Khrennikov, A.
2017-10-01
We present the mathematical model of decision-making (DM) of agents acting in a complex and uncertain environment (combining a huge variety of economic, financial, behavioural and geopolitical factors). To describe the interaction of agents with it, we apply the formalism of quantum field theory (QFT). Quantum fields are of a purely informational nature. The QFT model can be treated as a distant relative of expected utility theory, where the role of utility is played by adaptivity to an environment (bath). However, this sort of utility-adaptivity cannot be represented simply as a numerical function. The operator representation in Hilbert space is used and adaptivity is described as in quantum dynamics. We are especially interested in the stabilization of solutions for sufficiently large times. The outputs of this stabilization process, probabilities for possible choices, are treated in the framework of classical DM. To connect classical and quantum DM, we appeal to Quantum Bayesianism. We demonstrate the quantum-like interference effect in DM, which is exhibited as a violation of the formula of total probability, and hence of the classical Bayesian inference scheme. This article is part of the themed issue 'Second quantum revolution: foundational questions'.
Entanglement-assisted quantum convolutional coding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilde, Mark M.; Brun, Todd A.
2010-04-15
We show how to protect a stream of quantum information from decoherence induced by a noisy quantum communication channel. We exploit preshared entanglement and a convolutional coding structure to develop a theory of entanglement-assisted quantum convolutional coding. Our construction produces a Calderbank-Shor-Steane (CSS) entanglement-assisted quantum convolutional code from two arbitrary classical binary convolutional codes. The rate and error-correcting properties of the classical convolutional codes directly determine the corresponding properties of the resulting entanglement-assisted quantum convolutional code. We explain how to encode our CSS entanglement-assisted quantum convolutional codes starting from a stream of information qubits, ancilla qubits, and shared entangled bits.
The phonon theory of liquid thermodynamics
Bolmatov, D.; Brazhkin, V. V.; Trachenko, K.
2012-01-01
Heat capacity of matter is considered to be its most important property because it holds information about a system's degrees of freedom as well as the regime in which the system operates, classical or quantum. Heat capacity is well understood in gases and solids but not in the third main state of matter, liquids, and as a result it is not discussed in physics textbooks. The perceived difficulty is that interactions in a liquid are both strong and system-specific, implying that the energy strongly depends on the liquid type and that, therefore, liquid energy cannot be calculated in general form. Here, we develop a phonon theory of liquids where this problem is avoided. The theory covers both classical and quantum regimes. We demonstrate good agreement of calculated and experimental heat capacity of 21 liquids, including noble, metallic, molecular and hydrogen-bonded network liquids in a wide range of temperature and pressure. PMID:22639729
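As a hedged illustration of the classical and quantum regimes of phonon heat capacity (the textbook Debye solid used as a baseline, not the paper's liquid theory), the crossover from quantum suppression to the classical Dulong-Petit value of 3 k_B per atom can be computed directly:

    import numpy as np
    from scipy.integrate import quad

    def debye_cv(T, theta_D):
        """Phonon heat capacity per atom (in units of k_B), Debye model."""
        x_D = theta_D / T
        integral, _ = quad(lambda x: x**4 * np.exp(x) / (np.exp(x) - 1)**2,
                           0, x_D)
        return 9.0 * (T / theta_D)**3 * integral

    for T in (10, 50, 100, 300, 1000):
        # approaches 3 k_B (classical) for T >> theta_D, vanishes as T^3 below
        print(T, debye_cv(T, theta_D=300))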
Communication Strength of Correlations Violating Monogamy Relations
NASA Astrophysics Data System (ADS)
Kłobus, Waldemar; Oszmaniec, Michał; Augusiak, Remigiusz; Grudka, Andrzej
2016-05-01
In any theory satisfying the no-signaling principle correlations generated among spatially separated parties in a Bell-type experiment are subject to certain constraints known as monogamy relations. Recently, in the context of the black hole information loss problem it was suggested that these monogamy relations might be violated. This in turn implies that correlations arising in such a scenario must violate the no-signaling principle and hence can be used to send classical information between parties. Here, we study the amount of information that can be sent using such correlations. To this aim, we first provide a framework associating them with classical channels whose capacities are then used to quantify the usefulness of these correlations in sending information. Finally, we determine the minimal amount of information that can be sent using signaling correlations violating the monogamy relation associated to the chained Bell inequalities.
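For orientation, the capacity benchmark for the binary channels arising in such constructions is the standard Shannon formula for a binary symmetric channel with flip probability p (a general fact, not specific to this paper):

    C = 1 - h(p), \qquad h(p) = -p \log_2 p - (1-p) \log_2 (1-p).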
Using quantum theory to simplify input-output processes
NASA Astrophysics Data System (ADS)
Thompson, Jayne; Garner, Andrew J. P.; Vedral, Vlatko; Gu, Mile
2017-02-01
All natural things process and transform information. They receive environmental information as input, and transform it into appropriate output responses. Much of science is dedicated to building models of such systems: algorithmic abstractions of their input-output behavior that allow us to simulate how such systems can behave in the future, conditioned on what has transpired in the past. Here, we show that classical models cannot avoid inefficiency: they store past information that is unnecessary for correct future simulation. We construct quantum models that mitigate this waste, whenever it is physically possible to do so. This suggests that the complexity of general input-output processes depends fundamentally on what sort of information theory we use to describe them.
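In the computational-mechanics language underlying such results (standard definitions, not restated in the abstract), the classical memory cost is the statistical complexity of the causal states, and quantum models can compress below it:

    C_\mu = -\sum_i \pi_i \log \pi_i, \qquad C_q = S\!\left( \sum_i \pi_i |\sigma_i\rangle\langle\sigma_i| \right) \le C_\mu,

where π_i is the stationary distribution over causal states and |σ_i⟩ are their (generally non-orthogonal) quantum encodings.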
Jaeger, Johannes; Irons, David; Monk, Nick
2008-10-01
Positional specification by morphogen gradients is traditionally viewed as a two-step process. A gradient is formed and then interpreted, providing a spatial metric independent of the target tissue, similar to the concept of space in classical mechanics. However, the formation and interpretation of gradients are coupled, dynamic processes. We introduce a conceptual framework for positional specification in which cellular activity feeds back on positional information encoded by gradients, analogous to the feedback between mass-energy distribution and the geometry of space-time in Einstein's general theory of relativity. We discuss how such general relativistic positional information (GRPI) can guide systems-level approaches to pattern formation.
Generalized classical and quantum signal theories
NASA Astrophysics Data System (ADS)
Rundblad, E.; Labunets, V.; Novak, P.
2005-05-01
In this paper we develop two topics and show their interrelation. The first centers on general notions of generalized classical signal theory on finite Abelian hypergroups. The second concerns generalized quantum hyperharmonic analysis of quantum signals (Hermitian operators associated with classical signals). We study classical and quantum generalized convolution hypergroup algebras of classical and quantum signals.
Machine learning phases of matter
NASA Astrophysics Data System (ADS)
Carrasquilla, Juan; Stoudenmire, Miles; Melko, Roger
We show how the technology that allows automatic teller machines to read hand-written digits on cheques can be used to encode and recognize phases of matter and phase transitions in many-body systems. In particular, we analyze the (quasi-)order-disorder transitions in the classical Ising and XY models. Furthermore, we successfully use machine learning to study classical Z2 gauge theories that have important technological applications in the coming wave of quantum information technologies and whose phase transitions have no conventional order parameter.
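A compact sketch of the idea in Python (a scikit-learn logistic regression on crudely sampled 10x10 Ising configurations; lattice size, temperatures, and sample counts are illustrative choices of mine, not the paper's setup):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    L = 10  # lattice size

    def ising_sample(T, sweeps=50):
        """Short Metropolis run for a 2D Ising configuration at temperature T."""
        s = rng.choice([-1, 1], size=(L, L))
        for _ in range(sweeps * L * L):
            i, j = rng.integers(L, size=2)
            nb = (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                  + s[i, (j + 1) % L] + s[i, (j - 1) % L])
            dE = 2 * s[i, j] * nb
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[i, j] *= -1
        return s.ravel()

    # Label configurations by phase: T=1.5 (ordered) vs T=3.5 (disordered);
    # the exact critical point sits at T_c ~ 2.27.
    X = np.array([ising_sample(T) for T in [1.5] * 40 + [3.5] * 40])
    y = np.array([0] * 40 + [1] * 40)
    idx = rng.permutation(80)
    X, y = X[idx], y[idx]

    clf = LogisticRegression(max_iter=2000).fit(X[:60], y[:60])
    print("held-out accuracy:", clf.score(X[60:], y[60:]))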
On information, negentropy and H-theorem
NASA Astrophysics Data System (ADS)
Chakrabarti, C. G.; Sarker, N. G.
1983-09-01
The paper deals with the importance of the Kullback discrimination information in the statistical characterization of the negentropy of a non-equilibrium state and the irreversibility of a classical dynamical system. The theory, based on the Kullback discrimination information as the H-function, gives new insight into the interrelation between the concepts of coarse-graining and the principle of sufficiency, leading to an important statistical characterization of the thermal equilibrium of a closed system.
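For reference, the Kullback discrimination information (relative entropy) between a distribution p and a reference q, with the negentropy of a non-equilibrium state identified with its discrimination information relative to the equilibrium distribution, reads:

    D(p \| q) = \sum_k p_k \ln \frac{p_k}{q_k} \ge 0, \qquad N[p] = D(p \| p^{\mathrm{eq}}).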
Horodecki, Michał; Oppenheim, Jonathan; Winter, Andreas
2005-08-04
Information--be it classical or quantum--is measured by the amount of communication needed to convey it. In the classical case, if the receiver has some prior information about the messages being conveyed, less communication is needed. Here we explore the concept of prior quantum information: given an unknown quantum state distributed over two systems, we determine how much quantum communication is needed to transfer the full state to one system. This communication measures the partial information one system needs, conditioned on its prior information. We find that it is given by the conditional entropy--a quantity that was known previously, but lacked an operational meaning. In the classical case, partial information must always be positive, but we find that in the quantum world this physical quantity can be negative. If the partial information is positive, its sender needs to communicate this number of quantum bits to the receiver; if it is negative, then sender and receiver instead gain the corresponding potential for future quantum communication. We introduce a protocol that we term 'quantum state merging' which optimally transfers partial information. We show how it enables a systematic understanding of quantum network theory, and discuss several important applications including distributed compression, noiseless coding with side information, multiple access channels and assisted entanglement distillation.
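A minimal numpy check of the sign behavior described above (example states mine): the conditional entropy S(A|B) = S(AB) - S(B) is -1 for a Bell pair and +1 for two independent maximally mixed qubits.

    import numpy as np

    def S(rho):
        """von Neumann entropy in bits."""
        ev = np.linalg.eigvalsh(rho)
        ev = ev[ev > 1e-12]
        return float(-np.sum(ev * np.log2(ev)))

    def conditional_entropy(rho_AB):
        """S(A|B) = S(AB) - S(B) for two qubits, A being the first factor."""
        R = rho_AB.reshape(2, 2, 2, 2)
        rho_B = np.einsum('abac->bc', R)    # partial trace over A
        return S(rho_AB) - S(rho_B)

    bell = np.zeros(4); bell[0] = bell[3] = 1 / np.sqrt(2)
    rho_bell = np.outer(bell, bell)          # |Phi+><Phi+|
    rho_mixed = np.eye(4) / 4                # two maximally mixed qubits

    print(conditional_entropy(rho_bell))     # -1.0: negative partial information
    print(conditional_entropy(rho_mixed))    # +1.0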
The capacity to transmit classical information via black holes
NASA Astrophysics Data System (ADS)
Adami, Christoph; Ver Steeg, Greg
2013-03-01
One of the most vexing problems in theoretical physics is the relationship between quantum mechanics and gravity. According to an argument originally by Hawking, a black hole must destroy any information that is incident on it because the only radiation that a black hole releases during its evaporation (the Hawking radiation) is precisely thermal. Surprisingly, this claim has never been investigated within a quantum information-theoretic framework, where the black hole is treated as a quantum channel to transmit classical information. We calculate the capacity of the quantum black hole channel to transmit classical information (the Holevo capacity) within curved-space quantum field theory, and show that the information carried by late-time particles sent into a black hole can be recovered with arbitrary accuracy, from the signature left behind by the stimulated emission of radiation that must accompany any absorption event. We also show that this stimulated emission turns the black hole into an almost-optimal quantum cloning machine, where the violation of the no-cloning theorem is ensured by the noise provided by the Hawking radiation. Thus, rather than threatening the consistency of theoretical physics, Hawking radiation manages to save it instead.
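The Holevo capacity computed in the paper is, in standard notation (not restated in the abstract), the maximum over input ensembles of the Holevo quantity of the channel outputs:

    \chi(\{p_i, \rho_i\}) = S\Big( \sum_i p_i \rho_i \Big) - \sum_i p_i S(\rho_i), \qquad C_{\mathrm{Hol}} = \max_{\{p_i, \rho_i\}} \chi.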
Introduction to Classical Density Functional Theory by a Computational Experiment
ERIC Educational Resources Information Center
Jeanmairet, Guillaume; Levy, Nicolas; Levesque, Maximilien; Borgis, Daniel
2014-01-01
We propose an in silico experiment to introduce the classical density functional theory (cDFT). Density functional theories, whether quantum or classical, rely on abstract concepts that are nonintuitive; however, they are at the heart of powerful tools and active fields of research in both physics and chemistry. They led to the 1998 Nobel Prize in…
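For orientation, the variational structure at the heart of cDFT (in its standard form; the article's specific functional is not reproduced here) minimizes a grand-potential functional of the one-body density:

    \Omega[\rho] = F_{\mathrm{id}}[\rho] + F_{\mathrm{exc}}[\rho] + \int d\mathbf{r}\, \rho(\mathbf{r}) \left[ V_{\mathrm{ext}}(\mathbf{r}) - \mu \right],

    F_{\mathrm{id}}[\rho] = k_B T \int d\mathbf{r}\, \rho(\mathbf{r}) \left[ \ln\!\big(\Lambda^3 \rho(\mathbf{r})\big) - 1 \right],

with the equilibrium density solving δΩ/δρ = 0.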
Design Equations and Criteria of Orthotropic Composite Panels
2013-05-01
Appendix A, Classical Laminate Theory (CLT): In Section 6 of this report, preliminary design ... determined using Classical Laminate Theory (CLT) to predict equivalent stiffness characteristics and first-ply strength. Note: CLT is valid for ...
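For reference, the stiffness prediction referred to above rests on the standard constitutive relation of Classical Laminate Theory, relating in-plane force and moment resultants to mid-plane strains and curvatures through the A, B, D matrices (standard form, not the report's specific values):

    \begin{pmatrix} \mathbf{N} \\ \mathbf{M} \end{pmatrix} = \begin{pmatrix} \mathbf{A} & \mathbf{B} \\ \mathbf{B} & \mathbf{D} \end{pmatrix} \begin{pmatrix} \boldsymbol{\varepsilon}^0 \\ \boldsymbol{\kappa} \end{pmatrix},

    A_{ij} = \sum_k \bar{Q}_{ij}^{(k)} (z_k - z_{k-1}), \quad B_{ij} = \tfrac{1}{2} \sum_k \bar{Q}_{ij}^{(k)} (z_k^2 - z_{k-1}^2), \quad D_{ij} = \tfrac{1}{3} \sum_k \bar{Q}_{ij}^{(k)} (z_k^3 - z_{k-1}^3).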
Coherence and measurement in quantum thermodynamics
Kammerlander, P.; Anders, J.
2016-01-01
Thermodynamics is a highly successful macroscopic theory widely used across the natural sciences and for the construction of everyday devices, from car engines to solar cells. With thermodynamics predating quantum theory, research now aims to uncover the thermodynamic laws that govern finite size systems which may in addition host quantum effects. Recent theoretical breakthroughs include the characterisation of the efficiency of quantum thermal engines, the extension of classical non-equilibrium fluctuation theorems to the quantum regime, and a new thermodynamic resource theory that has led to the discovery of a set of second laws for finite size systems. These results have substantially advanced our understanding of nanoscale thermodynamics; however, putting a finger on what is genuinely quantum in quantum thermodynamics has remained a challenge. Here we identify information processing tasks, the so-called projections, that can only be formulated within the framework of quantum mechanics. We show that the physical realisation of such projections can come with a non-trivial thermodynamic work only for quantum states with coherences. This contrasts with information erasure, first investigated by Landauer, for which a thermodynamic work cost applies for classical and quantum erasure alike. Repercussions on quantum work fluctuation relations and thermodynamic single-shot approaches are also discussed. PMID:26916503
Coherence and measurement in quantum thermodynamics.
Kammerlander, P; Anders, J
2016-02-26
Thermodynamics is a highly successful macroscopic theory widely used across the natural sciences and for the construction of everyday devices, from car engines to solar cells. With thermodynamics predating quantum theory, research now aims to uncover the thermodynamic laws that govern finite size systems which may in addition host quantum effects. Recent theoretical breakthroughs include the characterisation of the efficiency of quantum thermal engines, the extension of classical non-equilibrium fluctuation theorems to the quantum regime, and a new thermodynamic resource theory that has led to the discovery of a set of second laws for finite size systems. These results have substantially advanced our understanding of nanoscale thermodynamics; however, putting a finger on what is genuinely quantum in quantum thermodynamics has remained a challenge. Here we identify information processing tasks, the so-called projections, that can only be formulated within the framework of quantum mechanics. We show that the physical realisation of such projections can come with a non-trivial thermodynamic work only for quantum states with coherences. This contrasts with information erasure, first investigated by Landauer, for which a thermodynamic work cost applies for classical and quantum erasure alike. Repercussions on quantum work fluctuation relations and thermodynamic single-shot approaches are also discussed.
Coherence and measurement in quantum thermodynamics
NASA Astrophysics Data System (ADS)
Kammerlander, P.; Anders, J.
2016-02-01
Thermodynamics is a highly successful macroscopic theory widely used across the natural sciences and for the construction of everyday devices, from car engines to solar cells. With thermodynamics predating quantum theory, research now aims to uncover the thermodynamic laws that govern finite size systems which may in addition host quantum effects. Recent theoretical breakthroughs include the characterisation of the efficiency of quantum thermal engines, the extension of classical non-equilibrium fluctuation theorems to the quantum regime, and a new thermodynamic resource theory that has led to the discovery of a set of second laws for finite size systems. These results have substantially advanced our understanding of nanoscale thermodynamics; however, putting a finger on what is genuinely quantum in quantum thermodynamics has remained a challenge. Here we identify information processing tasks, the so-called projections, that can only be formulated within the framework of quantum mechanics. We show that the physical realisation of such projections can come with a non-trivial thermodynamic work only for quantum states with coherences. This contrasts with information erasure, first investigated by Landauer, for which a thermodynamic work cost applies for classical and quantum erasure alike. Repercussions on quantum work fluctuation relations and thermodynamic single-shot approaches are also discussed.
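The classical-and-quantum-alike erasure cost mentioned above is Landauer's bound (standard statement): erasing one bit in contact with a bath at temperature T costs at least

    W_{\mathrm{erase}} \ge k_B T \ln 2

per bit, whereas the work associated with the projections studied here is non-trivial only in the presence of coherences.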
Quantum Image Processing and Its Application to Edge Detection: Theory and Experiment
NASA Astrophysics Data System (ADS)
Yao, Xi-Wei; Wang, Hengyan; Liao, Zeyang; Chen, Ming-Cheng; Pan, Jian; Li, Jun; Zhang, Kechao; Lin, Xingcheng; Wang, Zhehui; Luo, Zhihuang; Zheng, Wenqiang; Li, Jianzhong; Zhao, Meisheng; Peng, Xinhua; Suter, Dieter
2017-07-01
Processing of digital images is continuously gaining in volume and relevance, with concomitant demands on data storage, transmission, and processing power. Encoding the image information in quantum-mechanical systems instead of classical ones and replacing classical with quantum information processing may alleviate some of these challenges. By encoding and processing the image information in quantum-mechanical systems, we here demonstrate the framework of quantum image processing, where a pure quantum state encodes the image information: we encode the pixel values in the probability amplitudes and the pixel positions in the computational basis states. Our quantum image representation reduces the required number of qubits compared to existing implementations, and we present image processing algorithms that provide exponential speed-up over their classical counterparts. For the commonly used task of detecting the edge of an image, we propose and implement a quantum algorithm that completes the task with only one single-qubit operation, independent of the size of the image. This demonstrates the potential of quantum image processing for highly efficient image and video processing in the big data era.
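A toy numpy sketch of the single-Hadamard edge-detection step on an amplitude-encoded image row (my own illustration of the encoding described above; the published algorithm adds a step to catch edges that straddle pair boundaries):

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

    # Hypothetical 8-pixel row with one sharp edge inside a neighbor pair.
    pixels = np.array([0., 0., 0., 1., 1., 1., 1., 1.])
    state = pixels / np.linalg.norm(pixels)   # amplitude encoding (3 qubits)

    # Hadamard on the least-significant qubit: I (x) I (x) H
    U = np.kron(np.eye(4), H)
    out = U @ state

    # Odd-indexed amplitudes hold differences of neighboring pixels,
    # out[2k+1] = (state[2k] - state[2k+1]) / sqrt(2): nonzero only at edges.
    print(out[1::2])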
Application of Dirac's Generalized Hamiltonian Dynamics to Atomic and Molecular Systems
NASA Astrophysics Data System (ADS)
Uzer, Turgay
2002-10-01
Incorporating electronic degrees of freedom into classical treatments of atoms and molecules is a challenging problem from both the practical and fundamental points of view. Because it goes to the heart of the classical-quantal correspondence, there are now a number of prescriptions which differ by the extent of quantal information that they include. For inspiration we reach back to Dirac, who half a century ago designed a so-called Generalized Hamiltonian Dynamics (GHD) with applications to field theory in mind. Physically, the GHD is a purely classical formalism for systems with constraints; it incorporates the constraints into the Hamiltonian. We apply the GHD to atomic and molecular physics by choosing integrals of motion as the constraints. We show that this purely classical formalism allows the derivation of energies of non-radiating states.
Quantum Information Theory - an Invitation
NASA Astrophysics Data System (ADS)
Werner, Reinhard F.
Quantum information and quantum computers have received a lot of public attention recently. Quantum computers have been advertised as a kind of warp drive for computing, and indeed the promise of the algorithms of Shor and Grover is to perform computations which are extremely hard or even provably impossible on any merely "classical" computer. In this article I give an account of the basic concepts of quantum information theory, staying as much as possible in the area of general agreement. The article is divided into two parts. The first (up to the end of Sect. 2.5) is mostly in plain English, centered around the exploration of what can or cannot be done with quantum systems as information carriers. The second part, Sect. 2.6, then gives a description of the mathematical structures and of some of the tools needed to develop the theory.
k-Cosymplectic Classical Field Theories: Tulczyjew and Skinner-Rusk Formulations
NASA Astrophysics Data System (ADS)
Rey, Angel M.; Román-Roy, Narciso; Salgado, Modesto; Vilariño, Silvia
2012-06-01
The k-cosymplectic Lagrangian and Hamiltonian formalisms of first-order classical field theories are reviewed and completed. In particular, they are stated for singular and almost-regular systems. Subsequently, several alternative formulations for k-cosymplectic first-order field theories are developed: First, generalizing the construction of Tulczyjew for mechanics, we give a new interpretation of the classical field equations. Second, the Lagrangian and Hamiltonian formalisms are unified by giving an extension of the Skinner-Rusk formulation on classical mechanics.
Theory and applications survey of decentralized control methods
NASA Technical Reports Server (NTRS)
Athans, M.
1975-01-01
A nonmathematical overview is presented of trends in the general area of decentralized control strategies which are suitable for hierarchical systems. Advances in decentralized system theory are closely related to advances in the so-called stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools pertaining to the classical stochastic control problem are outlined. Particular attention is devoted to pitfalls in the mathematical problem formulation for decentralized control. Major conclusions are that any purely deterministic approach to multilevel hierarchical dynamic systems is unlikely to lead to realistic theories or designs, that the flow of measurements and decisions in a decentralized system should not be instantaneous and error-free, and that delays in information exchange in a decentralized system lead to reasonable approaches to decentralized control. A mathematically precise notion of aggregating information is not yet available.
Enhanced delegated computing using coherence
NASA Astrophysics Data System (ADS)
Barz, Stefanie; Dunjko, Vedran; Schlederer, Florian; Moore, Merritt; Kashefi, Elham; Walmsley, Ian A.
2016-03-01
A longstanding question is whether it is possible to delegate computational tasks securely—such that neither the computation nor the data is revealed to the server. Recently, both a classical and a quantum solution to this problem were found [C. Gentry, in Proceedings of the 41st Annual ACM Symposium on the Theory of Computing (Association for Computing Machinery, New York, 2009), pp. 167-178; A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science (IEEE Computer Society, Los Alamitos, CA, 2009), pp. 517-526]. Here, we study the first step towards the interplay between classical and quantum approaches and show how coherence can be used as a tool for secure delegated classical computation. We show that a client with limited computational capacity—restricted to an XOR gate—can perform universal classical computation by manipulating information carriers that may occupy superpositions of two states. Using single photonic qubits or coherent light, we experimentally implement secure delegated classical computations between an independent client and a server, which are installed in two different laboratories and separated by 50 m. The server has access to the light sources and measurement devices, whereas the client may use only a restricted set of passive optical devices to manipulate the information-carrying light beams. Thus, our work highlights how minimal quantum and classical resources can be combined and exploited for classical computing.
Generalized probability theories: what determines the structure of quantum theory?
NASA Astrophysics Data System (ADS)
Janotta, Peter; Hinrichsen, Haye
2014-08-01
The framework of generalized probabilistic theories is a powerful tool for studying the foundations of quantum physics. It provides the basis for a variety of recent findings that significantly improve our understanding of the rich physical structure of quantum theory. This review paper tries to present the framework and recent results to a broader readership in an accessible manner. To achieve this, we follow a constructive approach. Starting from a few basic physically motivated assumptions we show how a given set of observations can be manifested in an operational theory. Furthermore, we characterize consistency conditions limiting the range of possible extensions. In this framework classical and quantum theory appear as special cases, and the aim is to understand what distinguishes quantum mechanics as the fundamental theory realized in nature. It turns out that non-classical features of single systems can equivalently result from higher-dimensional classical theories that have been restricted. Entanglement and non-locality, however, are shown to be genuine non-classical features.
Introducing the Qplex: a novel arena for quantum theory
NASA Astrophysics Data System (ADS)
Appleby, Marcus; Fuchs, Christopher A.; Stacey, Blake C.; Zhu, Huangjun
2017-07-01
We reconstruct quantum theory starting from the premise that, as Asher Peres remarked, "Unperformed experiments have no results." The tools of quantum information theory, and in particular the symmetric informationally complete (SIC) measurements, provide a concise expression of how exactly Peres's dictum holds true. That expression is a constraint on how the probability distributions for outcomes of different, hypothetical and mutually exclusive experiments ought to mesh together, a type of constraint not foreseen in classical thinking. Taking this as our foundational principle, we show how to reconstruct the formalism of quantum theory in finite-dimensional Hilbert spaces. The central variety of mathematical entity in our reconstruction is the qplex, a very particular type of subset of a probability simplex. Along the way, by closely studying the symmetry properties of qplexes, we derive a condition for the existence of a d-dimensional SIC.
Kiefer, Markus
2012-01-01
Unconscious priming is a prototypical example of an automatic process, which is initiated without deliberate intention. Classical theories of automaticity assume that such unconscious automatic processes occur in a purely bottom-up driven fashion independent of executive control mechanisms. In contrast to these classical theories, our attentional sensitization model of unconscious information processing proposes that unconscious processing is susceptible to executive control and is only elicited if the cognitive system is configured accordingly. It is assumed that unconscious processing depends on attentional amplification of task-congruent processing pathways as a function of task sets. This article provides an overview of the latest research on executive control influences on unconscious information processing. I introduce refined theories of automaticity with a particular focus on the attentional sensitization model of unconscious cognition which is specifically developed to account for various attentional influences on different types of unconscious information processing. In support of the attentional sensitization model, empirical evidence is reviewed demonstrating executive control influences on unconscious cognition in the domains of visuo-motor and semantic processing: subliminal priming depends on attentional resources, is susceptible to stimulus expectations and is influenced by action intentions and task sets. This suggests that even unconscious processing is flexible and context-dependent as a function of higher-level executive control settings. I discuss that the assumption of attentional sensitization of unconscious information processing can accommodate conflicting findings regarding the automaticity of processes in many areas of cognition and emotion. This theoretical view has the potential to stimulate future research on executive control of unconscious processing in healthy and clinical populations. PMID:22470329
Entropy in sound and vibration: towards a new paradigm.
Le Bot, A
2017-01-01
This paper presents a discussion of the method and status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of exchange of energy analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of the information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart.
NASA Astrophysics Data System (ADS)
Tacnet, Jean-Marc; Dupouy, Guillaume; Carladous, Simon; Dezert, Jean; Batton-Hubert, Mireille
2017-04-01
In mountain areas, natural phenomena such as snow avalanches, debris flows and rock falls put people and objects at risk, sometimes with dramatic consequences. Risk is classically considered as a combination of hazard, itself a combination of the intensity and frequency of the phenomenon, and vulnerability, which corresponds to the consequences of the phenomenon for exposed people and material assets. Risk management consists in identifying the risk level as well as choosing the best strategies for risk prevention, i.e. mitigation. In the context of natural phenomena in mountainous areas, technical and scientific knowledge is often lacking. Risk management decisions are therefore based on imperfect information. This information comes from more or less reliable sources ranging from historical data to expert assessments, numerical simulations, etc. Finally, risk management decisions are the result of complex knowledge management and reasoning processes. Tracing the information and propagating information quality from data acquisition to decisions are therefore important steps in the decision-making process. One major goal today is therefore to assist decision-making while considering the availability, quality and reliability of information content and sources. A global integrated framework is proposed to improve the risk management process in a context of imperfect information provided by more or less reliable sources: uncertainty as well as imprecision, inconsistency and incompleteness are considered. Several methods are used and associated in an original way: sequential decision context description, development of specific multi-criteria decision-making methods, imperfection propagation in numerical modeling, and information fusion. This framework not only assists in decision-making but also traces the process and evaluates the impact of information quality on decision-making. We focus on two main developments. The first relates to the propagation of uncertainty and imprecision in numerical modeling, using both the classical Monte Carlo probabilistic approach and the so-called hybrid approach based on possibility theory; a sketch of the hybrid idea follows below. The second deals with new multi-criteria decision-making methods that account for information imperfection, source reliability, importance and conflict, using fuzzy sets as well as possibility and belief function theories. The implemented methods consider imperfection propagation and information fusion in total aggregation methods such as AHP (Saaty, 1980), in partial aggregation methods such as the Electre outranking method (see Soft Electre Tri), and in decisions in certain but also risky or uncertain contexts (see the new COWA-ER and FOWA-ER: Cautious and Fuzzy Ordered Weighted Averaging with Evidential Reasoning). For example, the ER-MCDA methodology treats expert assessment as a multi-criteria decision process based on imperfect information provided by more or less heterogeneous, reliable and conflicting sources: it mixes AHP, fuzzy set theory, possibility theory and belief function theory within the DSmT (Dezert-Smarandache Theory) framework, which provides powerful fusion rules.
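The hybrid propagation idea can be conveyed in a few lines. The sketch below is not the authors' implementation: the intensity function, the lognormal volume distribution and the triangular possibility distribution for slope are all invented for illustration. It combines Monte Carlo sampling for the probabilistic input with alpha-cut interval propagation for the possibilistic one, exploiting the monotonicity of the toy model in the fuzzy variable.

```python
import numpy as np

rng = np.random.default_rng(0)

def intensity(volume, slope_deg):
    """Hypothetical phenomenon model: intensity grows with event volume and slope."""
    return 0.5 * volume**0.6 * np.tan(np.radians(slope_deg))

# Probabilistic input: event volume (m^3), lognormal from historical data (assumed).
volumes = rng.lognormal(mean=np.log(5000.0), sigma=0.4, size=20000)

# Possibilistic input: slope (degrees) as a triangular possibility distribution
# with support [20, 35] and mode 28 (expert-elicited, assumed).
lo, mode, hi = 20.0, 28.0, 35.0

# Hybrid propagation: Monte Carlo over the random input; alpha-cut intervals
# over the fuzzy input (the model is monotone in slope, so cut endpoints map
# to output endpoints).
for alpha in np.linspace(0.0, 1.0, 6):
    s_lo = lo + alpha * (mode - lo)
    s_hi = hi - alpha * (hi - mode)
    low = np.percentile(intensity(volumes, s_lo), 5)
    high = np.percentile(intensity(volumes, s_hi), 95)
    print(f"alpha={alpha:.1f}: 5th-95th percentile envelope [{low:.1f}, {high:.1f}]")
```

At each possibility level alpha, the output is an interval of Monte Carlo percentiles, so the result is a family of nested envelopes rather than a single distribution, which is the kind of object the hybrid approach is meant to deliver to decision makers.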
del Moral, F; Vázquez, J A; Ferrero, J J; Willisch, P; Ramírez, R D; Teijeiro, A; López Medina, A; Andrade, B; Vázquez, J; Salvador, F; Medal, D; Salgado, M; Muñoz, V
2009-09-01
Modern radiotherapy uses complex treatments that necessitate more complex quality assurance procedures. As a continuous medium, GafChromic EBT films offer suitable features for such verification. However, their sensitometric curve is not fully understood in terms of classical theoretical models. In fact, measured optical densities and those predicted by the classical models differ significantly. This difference increases systematically with wider dose ranges. Thus, achieving the accuracy required for intensity-modulated radiotherapy (IMRT) by classical methods is not possible, precluding their use. As a result, experimental parametrizations, such as polynomial fits, are replacing phenomenological expressions in modern investigations. This article focuses on identifying new theoretical ways to describe sensitometric curves and on evaluating the quality of fit for experimental data based on four proposed models. A whole mathematical formalism starting with a geometrical version of the classical theory is used to develop new expressions for the sensitometric curves. General results from percolation theory are also used. A flat-bed-scanner-based method was chosen for the film analysis. Different tests were performed, such as consistency of the numeric results for the proposed model and double examination using data from independent researchers. Results show that the percolation-theory-based model provides the best theoretical explanation for the sensitometric behavior of GafChromic films. The different sizes of active centers or monomer crystals of the film are the basis of this model, allowing acquisition of information about the internal structure of the films. Values for the mean size of the active centers were obtained in accordance with technical specifications. In this model, the dynamics of the interaction between the active centers of GafChromic film and radiation is also characterized by means of its interaction cross-section value. The percolation model fulfills the accuracy requirements for quality-control procedures when large ranges of doses are used and offers a physical explanation for the film response.
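To make the contrast between empirical polynomial fits and saturating phenomenological models concrete, here is a minimal curve-fitting sketch. The data points and the "single-hit" exponential form are illustrative assumptions, not EBT measurements and not the paper's percolation model.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic net optical density vs dose points (illustrative, not EBT data).
dose = np.array([0.0, 25.0, 50.0, 100.0, 200.0, 400.0, 800.0])   # cGy
net_od = np.array([0.0, 0.04, 0.08, 0.15, 0.27, 0.47, 0.72])

def single_hit(D, od_max, k):
    """Saturating phenomenological response: OD = OD_max (1 - exp(-k D))."""
    return od_max * (1.0 - np.exp(-k * D))

def cubic(D, a, b, c):
    """Empirical polynomial-through-origin parametrization."""
    return a * D + b * D**2 + c * D**3

for model, p0, name in [(single_hit, (1.0, 1e-3), "single-hit"),
                        (cubic, (1e-3, 0.0, 0.0), "cubic polynomial")]:
    popt, _ = curve_fit(model, dose, net_od, p0=p0, maxfev=10000)
    rms = np.sqrt(np.mean((net_od - model(dose, *popt)) ** 2))
    print(f"{name}: fitted parameters {np.round(popt, 6)}, RMS residual {rms:.4f}")
```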
History, Philosophy, and Public Opinion Research.
ERIC Educational Resources Information Center
Herbst, Susan
1993-01-01
Argues for the importance of the classical tradition (broad, speculative, and historically informed writing and research) in public opinion research. Argues that asking large, normative questions about public opinion processes, trying to build grand theory, and taking history seriously will enrich the field and command the attention of scholars in…
NASA Astrophysics Data System (ADS)
Wu, Sheng-Jhih; Chu, Moody T.
2017-08-01
An inverse eigenvalue problem usually entails two constraints, one conditioned upon the spectrum and the other on the structure. This paper investigates the problem where triple constraints of eigenvalues, singular values, and diagonal entries are imposed simultaneously. An approach combining an eclectic mix of skills from differential geometry, optimization theory, and analytic gradient flow is employed to prove the solvability of such a problem. The result generalizes the classical Mirsky, Sing-Thompson, and Weyl-Horn theorems concerning the respective majorization relationships between any two of the arrays of main diagonal entries, eigenvalues, and singular values. The existence theory fills a gap in the classical matrix theory. The problem might find applications in wireless communication and quantum information science. The technique employed can be implemented as a first-step numerical method for constructing the matrix. With slight modification, the approach might be used to explore similar types of inverse problems where the prescribed entries are at general locations.
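The Weyl-Horn side of this majorization triangle is easy to check numerically: Weyl's inequalities bound the products of the k largest eigenvalue moduli by the products of the k largest singular values, with equality at k = n, and Horn proved the converse. A small numpy sketch on a random matrix (chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

eig_mod = np.sort(np.abs(np.linalg.eigvals(A)))[::-1]   # |lambda_1| >= ... >= |lambda_n|
sigma = np.linalg.svd(A, compute_uv=False)              # sigma_1 >= ... >= sigma_n

# Weyl: prod_{i<=k} |lambda_i| <= prod_{i<=k} sigma_i, with equality at k = n,
# since both products then equal |det A|.
for k in range(1, n + 1):
    lhs, rhs = np.prod(eig_mod[:k]), np.prod(sigma[:k])
    print(f"k={k}: prod |lambda| = {lhs:9.4f} <= prod sigma = {rhs:9.4f}")
```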
Telling and Not-Telling: A Classic Grounded Theory of Sharing Life-Stories
ERIC Educational Resources Information Center
Powers, Trudy Lee
2013-01-01
This study of "Telling and Not-Telling" was conducted using the classic grounded theory methodology (Glaser 1978, 1992, 1998; Glaser & Strauss, 1967). This unique methodology systematically and inductively generates conceptual theories from data. The goal is to discover theory that explains, predicts, and provides practical…
Information-theoretic metamodel of organizational evolution
NASA Astrophysics Data System (ADS)
Sepulveda, Alfredo
2011-12-01
Social organizations are abstractly modeled by holarchies---self-similar connected networks---and intelligent complex adaptive multiagent systems---large networks of autonomous reasoning agents interacting via scaled processes. However, little is known of how information shapes evolution in such organizations, a gap that can lead to misleading analytics. The research problem addressed in this study was the ineffective manner in which classical model-predict-control methods used in business analytics attempt to define organization evolution. The purpose of the study was to construct an effective metamodel for organization evolution based on a proposed complex adaptive structure---the info-holarchy. Theoretical foundations of this study were holarchies, complex adaptive systems, evolutionary theory, and quantum mechanics, among other recently developed physical and information theories. Research questions addressed how information evolution patterns gleaned from the study's inductive metamodel more aptly explained volatility in organizations. In this study, a hybrid grounded theory based on abstract inductive extensions of information theories was utilized as the research methodology. An overarching heuristic metamodel was framed from the theoretical analysis of the properties of these extension theories and applied to business, neural, and computational entities. This metamodel resulted in the synthesis of a metaphor for, and generalization of, organization evolution, serving as the recommended and appropriate analytical tool to view business dynamics for future applications. This study may manifest positive social change through a fundamental understanding of complexity in business from general information theories, resulting in more effective management.
An Introduction to Quantum Theory
NASA Astrophysics Data System (ADS)
Greensite, Jeff
2017-02-01
Written in a lucid and engaging style, the author takes readers from an overview of classical mechanics and the historical development of quantum theory through to advanced topics. The mathematical aspects of quantum theory necessary for a firm grasp of the subject are developed in the early chapters, but an effort is made to motivate that formalism on physical grounds. Including animated figures and their respective Mathematica® codes, this book provides a complete and comprehensive text for students in physics, maths, chemistry and engineering needing an accessible introduction to quantum mechanics. Supplementary Mathematica codes are available with the book's online information.
Koopman-von Neumann formulation of classical Yang-Mills theories: I
NASA Astrophysics Data System (ADS)
Carta, P.; Gozzi, E.; Mauro, D.
2006-03-01
In this paper we present the Koopman-von Neumann (KvN) formulation of classical non-Abelian gauge field theories. In particular we shall explore the functional (or classical path integral) counterpart of the KvN method. In the quantum path integral quantization of Yang-Mills theories concepts like gauge-fixing and Faddeev-Popov determinant appear in a quite natural way. We will prove that these same objects are needed also in this classical path integral formulation for Yang-Mills theories. We shall also explore the classical path integral counterpart of the BFV formalism and build all the associated universal and gauge charges. These last are quite different from the analog quantum ones and we shall show the relation between the two. This paper lays the foundation of this formalism which, due to the many auxiliary fields present, is rather heavy. Applications to specific topics outlined in the paper will appear in later publications.
a Classical Isodual Theory of Antimatter and its Prediction of Antigravity
NASA Astrophysics Data System (ADS)
Santilli, Ruggero Maria
An inspection of the contemporary physics literature reveals that, while matter is treated at all levels of study, from Newtonian mechanics to quantum field theory, antimatter is solely treated at the level of second quantization. For the purpose of initiating the restoration of full equivalence in the treatment of matter and antimatter in due time, and as the classical foundations of an axiomatically consistent inclusion of gravitation in unified gauge theories recently appeared elsewhere, in this paper we present a classical representation of antimatter which begins at the primitive Newtonian level with corresponding formulations at all subsequent levels. By recalling that charge conjugation of particles into antiparticles is antiautomorphic, the proposed theory of antimatter is based on a new map, called isoduality, which is also antiautomorphic (and more generally, antiisomorphic), yet it is applicable beginning at the classical level and then persists at the quantum level where it becomes equivalent to charge conjugation. We therefore present, apparently for the first time, the classical isodual theory of antimatter, we identify the physical foundations of the theory as being the novel isodual Galilean, special and general relativities, and we show the compatibility of the theory with all available classical experimental data on antimatter. We identify the classical foundations of the prediction of antigravity for antimatter in the field of matter (or vice-versa) without any claim on its validity, and defer its resolution to specifically identified experiments. We identify the novel, classical, isodual electromagnetic waves which are predicted to be emitted by antimatter, the so-called space-time machine based on a novel non-Newtonian geometric propulsion, and other implications of the theory. We also introduce, apparently for the first time, the isodual space and time inversions and show that they are nontrivially different than the conventional ones, thus offering a possibility for the future resolution whether far away galaxies and quasars are made up of matter or of antimatter. The paper ends with the indication that the studies are at their first infancy, and indicates some of the open problems. To avoid a prohibitive length, the paper is restricted to the classical treatment, while studies on operator profiles are treated elsewhere.
Daly, Louise; McCarron, Mary; Higgins, Agnes; McCallion, Philip
2013-02-01
This paper presents a theory explaining the processes used by informal carers of people with dementia to manage alterations to their own, and people with dementia's, relationships with and places within their social worlds. Informal carers provide the majority of care to people with dementia. A great deal of international informal dementia care research is available, much of which elucidates the content, impacts and consequences of the informal caring role and the coping mechanisms that carers use. However, the socially situated experiences and processes integral to informal caring in dementia have not yet been robustly accounted for. A classic grounded theory approach was used as it is designed for research enquiries that aim to generate theory illustrating social patterns of action used to address an identified problem. Thirty interviews were conducted with 31 participants between 2006 and 2008. The theory was conceptualised from the data using the concurrent methods of theoretical sampling, constant comparative analysis, memo writing and theoretical sensitivity. Informal carers' main concern was identified as 'Living on the fringes', which was stimulated by dementia-related stigma and living a different life. The theory of 'Sustaining Place' explains the social pattern of actions employed by informal carers to manage this problem on behalf of themselves and the person with dementia. The theory of 'Sustaining Place' identifies an imperative for nurses, other formal carers and society to engage in actions to support and enable social connectedness, social inclusion and citizenship for informal carers and people with dementia. 'Sustaining Place' facilitates enhanced understanding of the complex and socially situated nature of informal dementia care through its portrayal of informal carers as social agents and can be used to guide nurses to better support those who live with dementia. © 2012 Blackwell Publishing Ltd.
Metal Ion Modeling Using Classical Mechanics
2017-01-01
Metal ions play significant roles in numerous fields including chemistry, geochemistry, biochemistry, and materials science. With computational tools increasingly becoming important in chemical research, methods have emerged to effectively face the challenge of modeling metal ions in the gas, aqueous, and solid phases. Herein, we review both quantum and classical modeling strategies for metal ion-containing systems that have been developed over the past few decades. This Review focuses on classical metal ion modeling based on unpolarized models (including the nonbonded, bonded, cationic dummy atom, and combined models), polarizable models (e.g., the fluctuating charge, Drude oscillator, and the induced dipole models), the angular overlap model, and valence bond-based models. Quantum mechanical studies of metal ion-containing systems at the semiempirical, ab initio, and density functional levels of theory are reviewed as well with a particular focus on how these methods inform classical modeling efforts. Finally, conclusions and future prospects and directions are offered that will further enhance the classical modeling of metal ion-containing systems. PMID:28045509
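As a concrete flavor of the unpolarized nonbonded models surveyed above, here is a sketch of a 12-6-4-type pair potential, in which an attractive r^-4 term mimics charge-induced-dipole effects on top of the usual Lennard-Jones and Coulomb terms. All parameter values are invented for illustration and are not fitted force-field parameters.

```python
import numpy as np

COULOMB = 332.0637   # kcal mol^-1 Angstrom e^-2

def u_12_6_4(r, c12, c6, c4, q_ion, q_atom):
    """12-6-4-type pair potential (kcal/mol): Lennard-Jones 12-6 terms, an
    r^-4 term mimicking charge-induced-dipole attraction, and Coulomb."""
    return (c12 / r**12 - c6 / r**6 - c4 / r**4
            + COULOMB * q_ion * q_atom / r)

# Made-up parameters for a divalent ion against a water-like oxygen site.
r = np.linspace(1.5, 6.0, 451)
u = u_12_6_4(r, c12=2.5e4, c6=1.2e2, c4=5.0e1, q_ion=2.0, q_atom=-0.83)
i = np.argmin(u)
print(f"toy ion-oxygen minimum: r = {r[i]:.2f} Angstrom, U = {u[i]:.1f} kcal/mol")
```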
Quantum Bohmian model for financial market
NASA Astrophysics Data System (ADS)
Choustova, Olga Al.
2007-01-01
We apply methods of quantum mechanics to the mathematical modeling of price dynamics in the financial market. The Hamiltonian formalism on the price/price-change phase space describes the classical-like evolution of prices. This classical dynamics of prices is determined by “hard” conditions (natural resources, industrial production, services and so on). These conditions are mathematically described by the classical financial potential V(q), where q=(q1,…,qn) is the vector of prices of various shares. But information exchange and market psychology play an important (and sometimes determining) role in price dynamics. We propose to describe such behavioral financial factors by using the pilot wave (Bohmian) model of quantum mechanics. The theory of financial behavioral waves takes into account the market psychology. The real trajectories of prices are determined (through the financial analogue of the second Newton law) by two financial potentials: the classical-like V(q) (“hard” market conditions) and the quantum-like U(q) (behavioral market conditions).
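A minimal numerical sketch of such pilot-wave price dynamics is given below, assuming a harmonic "hard" potential and a Gaussian initial pilot wave (both choices are illustrative; hbar and the effective "mass" are set to 1, and none of this is the paper's calibration). The wavefunction is evolved by a split-step Fourier method, and a single Bohmian price trajectory is integrated along the guidance velocity field, through which the quantum-like potential U(q) acts implicitly.

```python
import numpy as np

# Grid and initial pilot wave for a single toy price variable q (hbar = m = 1).
n, L = 1024, 40.0
q = np.linspace(-L / 2, L / 2, n, endpoint=False)
dq = q[1] - q[0]
k = 2 * np.pi * np.fft.fftfreq(n, d=dq)

psi = np.exp(-(q + 5.0) ** 2 / 2.0 + 1j * 0.5 * q)   # Gaussian packet with drift
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dq)

V = 0.05 * q**2                     # "hard" classical potential (illustrative)
dt, steps = 0.01, 2000
expV = np.exp(-1j * V * dt / 2)     # half-step potential factor (Strang splitting)
expK = np.exp(-1j * k**2 * dt / 2)  # full kinetic step in Fourier space

price = -5.0                        # initial Bohmian price trajectory
for _ in range(steps):
    psi = expV * np.fft.ifft(expK * np.fft.fft(expV * psi))
    dpsi = np.fft.ifft(1j * k * np.fft.fft(psi))      # spectral derivative
    v = np.imag(dpsi / (psi + 1e-300))                # guidance velocity Im(psi'/psi)
    price += dt * np.interp(price, q, v)
print(f"Bohmian price trajectory: started at -5.00, ended at {price:.2f}")
```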
The Prediction of Item Parameters Based on Classical Test Theory and Latent Trait Theory
ERIC Educational Resources Information Center
Anil, Duygu
2008-01-01
In this study, the power of experts' predictions of item characteristics, for conditions in which try-out practices cannot be applied, was examined against item characteristics computed under classical test theory and the two-parameter logistic model of latent trait theory. The study was carried out on 9914 randomly selected students…
Any Ontological Model of the Single Qubit Stabilizer Formalism must be Contextual
NASA Astrophysics Data System (ADS)
Lillystone, Piers; Wallman, Joel J.
Quantum computers allow us to easily solve some problems classical computers find hard. Non-classical improvements in computational power should be due to some non-classical property of quantum theory. Contextuality, a more general notion of non-locality, is a necessary, but not sufficient, resource for quantum speed-up. Proofs of contextuality can be constructed for the classically simulable stabilizer formalism. Previous proofs of stabilizer contextuality are known for 2 or more qubits, for example the Mermin-Peres magic square. In the work presented we extend these results and prove that any ontological model of the single qubit stabilizer theory must be contextual, as defined by R. Spekkens, and give a relation between our result and the Mermin-Peres square. By demonstrating that contextuality is present in the qubit stabilizer formalism we provide further insight into the contextuality present in quantum theory. Understanding the contextuality of classical sub-theories will allow us to better identify the physical properties of quantum theory required for computational speed up. This research was supported by CIFAR, the Government of Ontario, and the Government of Canada through NSERC and Industry Canada.
NASA Astrophysics Data System (ADS)
Oblow, E. M.
1982-10-01
An evaluation was made of the mathematical and economic basis for conversion processes in the Long-term Energy Analysis Program (LEAP) energy economy model. Conversion processes are the main modeling subunit in LEAP used to represent energy conversion industries and are supposedly based on the classical economic theory of the firm. Questions about uniqueness and existence of LEAP solutions and their relation to classical equilibrium economic theory prompted the study. An analysis of classical theory and LEAP model equations was made to determine their exact relationship. The conclusions drawn from this analysis were that LEAP theory is not consistent with the classical theory of the firm. Specifically, the capacity factor formalism used by LEAP does not support a classical interpretation in terms of a technological production function for energy conversion processes. The economic implications of this inconsistency are suboptimal process operation and short term negative profits in years where plant operation should be terminated. A new capacity factor formalism, which retains the behavioral features of the original model, is proposed to resolve these discrepancies.
ERIC Educational Resources Information Center
Lange, Elizabeth
2015-01-01
This article argues that sociology has been a foundational discipline for the field of adult education, but it has been largely implicit, until recently. This article contextualizes classical theories of sociology within contemporary critiques, reviews the historical roots of sociology and then briefly introduces the classical theories…
How is quantum information localized in gravity?
NASA Astrophysics Data System (ADS)
Donnelly, William; Giddings, Steven B.
2017-10-01
A notion of localization of information within quantum subsystems plays a key role in describing the physics of quantum systems, and in particular is a prerequisite for discussing important concepts such as entanglement and information transfer. While subsystems can be readily defined for finite quantum systems and in local quantum field theory, a corresponding definition for gravitational systems is significantly complicated by the apparent nonlocality arising due to gauge invariance, enforced by the constraints. A related question is whether "soft hair" encodes otherwise localized information, and the question of such localization also remains an important puzzle for proposals that gravity emerges from another structure such as a boundary field theory as in AdS/CFT. This paper describes different approaches to defining local subsystem structure, and shows that at least classically, perturbative gravity has localized subsystems based on a split structure, generalizing the split property of quantum field theory. This, and related arguments for QED, give simple explanations that in these theories there is localized information that is independent of fields outside a region, in particular so that there is no role for "soft hair" in encoding such information. Additional subtleties appear in quantum gravity. We argue that localized information exists in perturbative quantum gravity in the presence of global symmetries, but that nonperturbative dynamics is likely tied to a modification of such structure.
Stott, Clifford; Drury, John
2016-04-01
This article explores the origins and ideology of classical crowd psychology, a body of theory reflected in contemporary popularised understandings such as of the 2011 English 'riots'. This article argues that during the nineteenth century, the crowd came to symbolise a fear of 'mass society' and that 'classical' crowd psychology was a product of these fears. Classical crowd psychology pathologised, reified and decontextualised the crowd, offering the ruling elites a perceived opportunity to control it. We contend that classical theory misrepresents crowd psychology and survives in contemporary understanding because it is ideological. We conclude by discussing how classical theory has been supplanted in academic contexts by an identity-based crowd psychology that restores the meaning to crowd action, replaces it in its social context and in so doing transforms theoretical understanding of 'riots' and the nature of the self. © The Author(s) 2016.
Influence of an asymmetric ring on the modeling of an orthogonally stiffened cylindrical shell
NASA Technical Reports Server (NTRS)
Rastogi, Naveen; Johnson, Eric R.
1994-01-01
Structural models are examined for the influence of a ring with an asymmetrical cross section on the linear elastic response of an orthogonally stiffened cylindrical shell subjected to internal pressure. The first structural model employs classical theory for the shell and stiffeners. The second model employs transverse shear deformation theories for the shell and stringer and classical theory for the ring. Closed-end pressure vessel effects are included. Interacting line load intensities are computed in the stiffener-to-skin joints for an example problem having the dimensions of the fuselage of a large transport aircraft. Classical structural theory is found to exaggerate the asymmetric response compared to the transverse shear deformation theory.
Application of quantum master equation for long-term prognosis of asset-prices
NASA Astrophysics Data System (ADS)
Khrennikova, Polina
2016-05-01
This study combines the disciplines of behavioral finance and an extension of econophysics, namely the concepts and mathematical structure of quantum physics. We apply the formalism of quantum theory to model the dynamics of some correlated financial assets, where the proposed model can potentially be applied for developing a long-term prognosis of asset price formation. At the informational level, the asset price states interact with each other by means of a "financial bath". The latter is composed of agents' expectations about the future development of asset prices on the finance market, as well as financially important information from mass media, society, and politicians. One of the essential behavioral factors leading to the quantum-like dynamics of asset prices is the irrationality of agents' expectations operating on the finance market. These expectations lead to a deeper type of uncertainty concerning the future price dynamics of the assets than that given by classical probability theory, e.g., in the framework of classical financial mathematics, which is based on the theory of stochastic processes. The quantum dimension of the uncertainty in price dynamics is expressed in the form of the price-state superposition and entanglement between the prices of the different financial assets. In our model, the resolution of this deep quantum uncertainty is mathematically captured with the aid of the quantum master equation (in its quantum Markov approximation); a toy integration of such an equation is sketched below. We illustrate our model of preparing a future asset price prognosis by a numerical simulation involving two correlated assets, whose returns interact more intensively than a classical statistical correlation would capture. The model predictions can be extended to more complex models to obtain price configurations for multiple assets and portfolios.
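Below is a toy numpy integration of a GKSL (Lindblad) master equation for two two-state "assets" coupled by an exchange term and dephased by a bath standing in for the agents' expectations. The Hamiltonian, rates, and initial state are invented for illustration and are not the model of the paper.

```python
import numpy as np

def lindblad_step(rho, H, Ls, dt):
    """One Euler step of the GKSL master equation
    drho/dt = -i[H, rho] + sum_k (L rho L^+ - (1/2){L^+ L, rho})."""
    drho = -1j * (H @ rho - rho @ H)
    for L in Ls:
        LdL = L.conj().T @ L
        drho += L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL)
    return rho + dt * drho

# Two two-state "assets"; all couplings, rates, and states are assumed values.
sz = np.diag([1.0, -1.0]).astype(complex)
sp = np.array([[0, 1], [0, 0]], dtype=complex)        # raising operator
I2 = np.eye(2, dtype=complex)

H = 0.3 * (np.kron(sp, sp.conj().T) + np.kron(sp.conj().T, sp))  # exchange term
Ls = [np.sqrt(0.1) * np.kron(sz, I2),                 # dephasing by the
      np.sqrt(0.1) * np.kron(I2, sz)]                 # "financial bath"

# Asset 1 definitely "up"; asset 2 in an equal superposition of up and down.
v = np.kron(np.array([1, 0], dtype=complex),
            np.array([1, 1], dtype=complex) / np.sqrt(2))
rho = np.outer(v, v.conj())

for _ in range(2000):
    rho = lindblad_step(rho, H, Ls, dt=0.005)

P_up1 = np.kron(np.diag([1.0, 0.0]).astype(complex), I2)
print(f"P(asset 1 up) after relaxation: {np.real(np.trace(rho @ P_up1)):.3f}")
```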
Experimental quantum data locking
NASA Astrophysics Data System (ADS)
Liu, Yang; Cao, Zhu; Wu, Cheng; Fukuda, Daiji; You, Lixing; Zhong, Jiaqiang; Numata, Takayuki; Chen, Sijing; Zhang, Weijun; Shi, Sheng-Cai; Lu, Chao-Yang; Wang, Zhen; Ma, Xiongfeng; Fan, Jingyun; Zhang, Qiang; Pan, Jian-Wei
2016-08-01
Classical correlation can be locked via quantum means: quantum data locking. With a short secret key, one can lock an exponentially large amount of information in order to make it inaccessible to unauthorized users without the key. Quantum data locking presents a resource-efficient alternative to one-time pad encryption which requires a key no shorter than the message. We report experimental demonstrations of a quantum data locking scheme originally proposed by D. P. DiVincenzo et al. [Phys. Rev. Lett. 92, 067902 (2004), 10.1103/PhysRevLett.92.067902] and a loss-tolerant scheme developed by O. Fawzi et al. [J. ACM 60, 44 (2013), 10.1145/2518131]. We observe that the unlocked amount of information is larger than the key size in both experiments, exhibiting strong violation of the incremental proportionality property of classical information theory. As an application example, we show the successful transmission of a photo over a lossy channel with quantum data (un)locking and error correction.
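The flavor of the DiVincenzo-style scheme can be conveyed by a classical simulation of its simplest variant, in which a single shared key bit selects one of two conjugate bases (computational or Hadamard) for an entire n-bit string; without the key, an eavesdropper measuring in a fixed basis gets roughly half the bits wrong. This sketch is a toy, not the loss-tolerant protocol or the experiment itself.

```python
import numpy as np

rng = np.random.default_rng(7)
Hd = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)   # Hadamard
I2 = np.eye(2)

def encode(bits, key_bit):
    """Encode each message bit as a qubit; a single shared key bit selects
    the basis (computational vs. Hadamard) for the whole string."""
    U = Hd if key_bit else I2
    return [U @ I2[:, b] for b in bits]

def measure(state, guessed_key_bit, rng):
    """Projective measurement in the basis selected by the guessed key bit."""
    U = Hd if guessed_key_bit else I2
    p0 = abs(U[:, 0] @ state) ** 2
    return 0 if rng.random() < p0 else 1

n = 16
message = rng.integers(0, 2, n)
key = int(rng.integers(0, 2))            # one key bit locks all n message bits

qubits = encode(message, key)
with_key = np.array([measure(s, key, rng) for s in qubits])
wrong_basis = np.array([measure(s, 1 - key, rng) for s in qubits])

print("errors with the key:   ", int(np.sum(with_key != message)))
print("errors without the key:", int(np.sum(wrong_basis != message)))
```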
ERIC Educational Resources Information Center
Magno, Carlo
2009-01-01
The present report demonstrates the difference between classical test theory (CTT) and item response theory (IRT) approaches using actual test data for junior high school chemistry students. The CTT and IRT were compared across two samples and two forms of test on their item difficulty, internal consistency, and measurement errors. The specific…
ERIC Educational Resources Information Center
Guler, Nese; Gelbal, Selahattin
2010-01-01
In this study, classical test theory and generalizability theory were used to determine the reliability of scores obtained from a measurement tool of mathematics achievement. Twenty-four open-ended mathematics questions from TIMSS-1999 were administered to 203 students in the spring semester of 2007. The internal consistency of the scores was found to be 0.92. For…
ERIC Educational Resources Information Center
Kohli, Nidhi; Koran, Jennifer; Henn, Lisa
2015-01-01
There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…
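The sample dependence of CTT statistics is easy to demonstrate by simulation: generate responses from a 2PL model with fixed item parameters, then compute classical difficulty (proportion correct) and point-biserial discrimination, which shift whenever the sample's ability distribution shifts. A minimal sketch, with all parameter values assumed:

```python
import numpy as np

rng = np.random.default_rng(3)

# 2PL item parameters (assumed) and a sample of examinee abilities.
a = np.array([0.8, 1.0, 1.2, 1.5, 2.0])      # discrimination
b = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])    # difficulty
theta = rng.standard_normal((500, 1))        # abilities of this sample

def icc(theta, a, b):
    """2PL item characteristic curve: P(correct | theta)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

responses = (rng.random((500, 5)) < icc(theta, a, b)).astype(int)

# Classical (CTT) statistics: these shift with the sample's ability distribution.
p_values = responses.mean(axis=0)            # CTT item difficulty
total = responses.sum(axis=1)
r_pb = [np.corrcoef(responses[:, j], total)[0, 1] for j in range(5)]

for j in range(5):
    print(f"item {j}: 2PL a={a[j]:.1f}, b={b[j]:+.1f} | "
          f"CTT p={p_values[j]:.2f}, point-biserial={r_pb[j]:.2f}")
```

Rerunning with theta drawn from, say, a shifted normal changes the CTT columns while the 2PL parameters stay fixed, which is the theoretical contrast the record describes.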
Constrained variational calculus for higher order classical field theories
NASA Astrophysics Data System (ADS)
Campos, Cédric M.; de León, Manuel; Martín de Diego, David
2010-11-01
We develop an intrinsic geometrical setting for higher order constrained field theories. As a main tool we use an appropriate generalization of the classical Skinner-Rusk formalism. Some examples of applications are studied, in particular to the geometrical description of optimal control theory for partial differential equations.
Chance, determinism and the classical theory of probability.
Vasudevan, Anubav
2018-02-01
This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning. Copyright © 2017 Elsevier Ltd. All rights reserved.
A knowledge-based system for prototypical reasoning
NASA Astrophysics Data System (ADS)
Lieto, Antonio; Minieri, Andrea; Piana, Alberto; Radicioni, Daniele P.
2015-04-01
In this work we present a knowledge-based system equipped with a hybrid, cognitively inspired architecture for the representation of conceptual information. The proposed system aims at extending the classical representational and reasoning capabilities of the ontology-based frameworks towards the realm of the prototype theory. It is based on a hybrid knowledge base, composed of a classical symbolic component (grounded on a formal ontology) with a typicality based one (grounded on the conceptual spaces framework). The resulting system attempts to reconcile the heterogeneous approach to the concepts in Cognitive Science with the dual process theories of reasoning and rationality. The system has been experimentally assessed in a conceptual categorisation task where common sense linguistic descriptions were given in input, and the corresponding target concepts had to be identified. The results show that the proposed solution substantially extends the representational and reasoning 'conceptual' capabilities of standard ontology-based systems.
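The typicality-based component can be illustrated with a toy nearest-prototype classifier over a conceptual space. The dimensions, prototypes, and attentional weights below are invented for illustration and are not the system's actual knowledge base.

```python
import numpy as np

# Toy conceptual space for fruit (hypothetical quality dimensions scaled to
# [0, 1]): hue, sweetness, size. Prototypes are typical members, not definitions.
prototypes = {
    "apple":  np.array([0.55, 0.60, 0.40]),
    "lemon":  np.array([0.15, 0.10, 0.30]),
    "cherry": np.array([0.95, 0.70, 0.10]),
}
weights = np.array([0.5, 0.3, 0.2])   # attentional weights (assumed)

def categorize(observation):
    """Nearest-prototype classification under a weighted Euclidean metric."""
    dist = {name: np.sqrt(np.sum(weights * (observation - p) ** 2))
            for name, p in prototypes.items()}
    return min(dist, key=dist.get), dist

label, dist = categorize(np.array([0.50, 0.55, 0.35]))
print(f"categorized as '{label}'; " +
      ", ".join(f"d({k}) = {v:.3f}" for k, v in dist.items()))
```

Unlike a strict ontological definition, this assigns a graded degree of membership (distance to prototype), which is the behavior the hybrid system uses to handle common-sense descriptions.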
Neuromodulated Spike-Timing-Dependent Plasticity, and Theory of Three-Factor Learning Rules.
Frémaux, Nicolas; Gerstner, Wulfram
2015-01-01
Classical Hebbian learning puts the emphasis on joint pre- and postsynaptic activity, but neglects the potential role of neuromodulators. Since neuromodulators convey information about novelty or reward, the influence of neuromodulators on synaptic plasticity is useful not just for action learning in classical conditioning, but also to decide "when" to create new memories in response to a flow of sensory stimuli. In this review, we focus on timing requirements for pre- and postsynaptic activity in conjunction with one or several phasic neuromodulatory signals. While the emphasis of the text is on conceptual models and mathematical theories, we also discuss some experimental evidence for neuromodulation of Spike-Timing-Dependent Plasticity. We highlight the importance of synaptic mechanisms in bridging the temporal gap between sensory stimulation and neuromodulatory signals, and develop a framework for a class of neo-Hebbian three-factor learning rules that depend on presynaptic activity, postsynaptic variables as well as the influence of neuromodulators.
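A minimal simulation of such a neo-Hebbian three-factor rule is sketched below: a coincidence-driven eligibility trace e(t) decays with its own time constant, and the weight is updated only when a sparse neuromodulatory signal arrives. Spike statistics, time constants, and the reward schedule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

dt, T = 1.0, 1000          # ms per step, number of steps
tau_e = 200.0              # eligibility trace time constant, ms (assumed)
eta = 0.05                 # learning rate (assumed)
w, e = 0.5, 0.0
for t in range(T):
    pre = rng.random() < 0.05                   # presynaptic spike this step
    post = rng.random() < 0.05                  # postsynaptic spike this step
    hebb = 1.0 if (pre and post) else 0.0       # crude STDP-like coincidence
    e += dt * (-e / tau_e) + hebb               # factors 1 and 2 set the trace
    modulator = 1.0 if t % 250 == 249 else 0.0  # factor 3: sparse phasic signal
    w += eta * modulator * e                    # plasticity gated by the modulator
print(f"synaptic weight drifted from 0.500 to {w:.3f} under reward-gated STDP")
```

The eligibility trace is what bridges the temporal gap the review highlights: coincidences leave a decaying tag, and only those tags still alive when the modulator arrives are converted into lasting weight change.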
Dressing the post-Newtonian two-body problem and classical effective field theory
NASA Astrophysics Data System (ADS)
Kol, Barak; Smolkin, Michael
2009-12-01
We apply a dressed perturbation theory to better organize and economize the computation of high orders of the 2-body effective action of an inspiralling post-Newtonian (PN) gravitating binary. We use the effective field theory approach with the nonrelativistic field decomposition (NRG fields). For that purpose we develop quite generally the dressing theory of a nonlinear classical field theory coupled to pointlike sources. We introduce dressed charges and propagators, but unlike the quantum theory there are no dressed bulk vertices. The dressed quantities are found to obey recursive integral equations which succinctly encode parts of the diagrammatic expansion, and are the classical version of the Schwinger-Dyson equations. Actually, the classical equations are somewhat stronger since they involve only finitely many quantities, unlike the quantum theory. Classical diagrams are shown to factorize exactly when they contain nonlinear worldline vertices, and we classify all the possible topologies of irreducible diagrams for low loop numbers. We apply the dressing program to our post-Newtonian case of interest. The dressed charges consist of the dressed energy-momentum tensor after a nonrelativistic decomposition, and we compute all dressed charges (in the harmonic gauge) appearing up to 2PN in the 2-body effective action (and more). We determine the irreducible skeleton diagrams up to 3PN and we employ the dressed charges to compute several terms beyond 2PN.
Quantum many-body theory for electron spin decoherence in nanoscale nuclear spin baths.
Yang, Wen; Ma, Wen-Long; Liu, Ren-Bao
2017-01-01
Decoherence of electron spins in nanoscale systems is important to quantum technologies such as quantum information processing and magnetometry. It is also an ideal model problem for studying the crossover between quantum and classical phenomena. At low temperatures or in light-element materials where the spin-orbit coupling is weak, the phonon scattering in nanostructures is less important and the fluctuations of nuclear spins become the dominant decoherence mechanism for electron spins. Since the 1950s, semi-classical noise theories have been developed for understanding electron spin decoherence. In spin-based solid-state quantum technologies, the relevant systems are in the nanometer scale and nuclear spin baths are quantum objects which require a quantum description. Recently, quantum pictures have been established to understand the decoherence and quantum many-body theories have been developed to quantitatively describe this phenomenon. Anomalous quantum effects have been predicted and some have been experimentally confirmed. A systematically truncated cluster-correlation expansion theory has been developed to account for the many-body correlations in nanoscale nuclear spin baths that are built up during electron spin decoherence. The theory has successfully predicted and explained a number of experimental results in a wide range of physical systems. In this review, we will cover this recent progress. The limitations of the present quantum many-body theories and possible directions for future development will also be discussed.
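For a toy version of the lowest order of such a cluster expansion, consider free-induction decay of a central spin coupled to non-interacting bath spins through Ising-type hyperfine terms; the coherence then factorizes exactly into single-spin cluster contributions. The couplings below are randomly drawn, illustrative values, not a model of any specific material.

```python
import numpy as np

rng = np.random.default_rng(11)

# Central spin coupled to N non-interacting bath spins via H = sum_k A_k Sz Iz_k.
# For an unpolarized bath the free-induction coherence factorizes into
# single-spin "clusters", the lowest order of a cluster-correlation expansion:
# L(t) = prod_k cos(A_k t / 2).
N = 200
A = rng.exponential(scale=0.1, size=N)   # hyperfine couplings (rad/us, assumed)
t = np.linspace(0.0, 20.0, 400)          # microseconds

L = np.prod(np.cos(np.outer(t, A) / 2.0), axis=1)
t_half = t[np.argmax(np.abs(L) < 0.5)]
print(f"|L(t)| first drops below 0.5 near t = {t_half:.2f} us")
```

Higher orders of the expansion add pair, triple, and larger cluster factors to capture intra-bath dynamics; this sketch only shows the factorized structure the truncation exploits.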
A Demographic Perspective on Family Change
Bianchi, Suzanne M.
2014-01-01
Demographic analysis seeks to understand how individual microlevel decisions about child-bearing, marriage and partnering, geographic mobility, and behaviors that influence health and longevity aggregate to macrolevel population trends and differentials in fertility, mortality and migration. In this review, I first discuss theoretical perspectives—classic demographic transition theory, the perspective of the “second demographic transition,” the spread of developmental idealism—that inform demographers’ understanding of macrolevel population change. Then, I turn to a discussion of the role that demographically informed data collection has played in illuminating family change since the mid-20th century in the United States. Finally, I discuss ways in which demographic theory and data collection might inform future areas of family research, particularly in the area of intergenerational family relationships and new and emerging family forms. PMID:26078785
The Double-Well Potential in Quantum Mechanics: A Simple, Numerically Exact Formulation
ERIC Educational Resources Information Center
Jelic, V.; Marsiglio, F.
2012-01-01
The double-well potential is arguably one of the most important potentials in quantum mechanics, because the solution contains the notion of a state as a linear superposition of "classical" states, a concept which has become very important in quantum information theory. It is therefore desirable to have solutions to simple double-well potentials…
ERIC Educational Resources Information Center
Mair, Christine A.; Thivierge-Rikard, R. V.
2010-01-01
Classic and contemporary sociological theories suggest that social interaction differs in rural and urban areas. Intimate, informal interactions (strong ties) are theorized to characterize rural areas while urban areas may possess more formal and rationalized interactions (weak ties). Aging and social support literature stresses social interaction…
Emergent Geometry from Entropy and Causality
NASA Astrophysics Data System (ADS)
Engelhardt, Netta
In this thesis, we investigate the connections between the geometry of spacetime and aspects of quantum field theory such as entanglement entropy and causality. This work is motivated by the idea that spacetime geometry is an emergent phenomenon in quantum gravity, and that the physics responsible for this emergence is fundamental to quantum field theory. Part I of this thesis is focused on the interplay between spacetime and entropy, with a special emphasis on entropy due to entanglement. In general spacetimes, there exist locally-defined surfaces sensitive to the geometry that may act as local black hole boundaries or cosmological horizons; these surfaces, known as holographic screens, are argued to have a connection with the second law of thermodynamics. Holographic screens obey an area law, suggestive of an association with entropy; they are also distinguished surfaces from the perspective of the covariant entropy bound, a bound on the total entropy of a slice of the spacetime. This construction is shown to be quite general, and is formulated in both classical and perturbatively quantum theories of gravity. The remainder of Part I uses the Anti-de Sitter/ Conformal Field Theory (AdS/CFT) correspondence to both expand and constrain the connection between entanglement entropy and geometry. The AdS/CFT correspondence posits an equivalence between string theory in the "bulk" with AdS boundary conditions and certain quantum field theories. In the limit where the string theory is simply classical General Relativity, the Ryu-Takayanagi and more generally, the Hubeny-Rangamani-Takayanagi (HRT) formulae provide a way of relating the geometry of surfaces to entanglement entropy. A first-order bulk quantum correction to HRT was derived by Faulkner, Lewkowycz and Maldacena. This formula is generalized to include perturbative quantum corrections in the bulk at any (finite) order. Hurdles to spacetime emergence from entanglement entropy as described by HRT and its quantum generalizations are discussed, both at the classical and perturbatively quantum limits. In particular, several No Go Theorems are proven, indicative of a conclusion that supplementary approaches or information may be necessary to recover the full spacetime geometry. Part II of this thesis involves the relation between geometry and causality, the property that information cannot travel faster than light. Requiring this of any quantum field theory results in constraints on string theory setups that are dual to quantum field theories via the AdS/CFT correspondence. At the level of perturbative quantum gravity, it is shown that causality in the field theory constraints the causal structure in the bulk. At the level of nonperturbative quantum string theory, we find that constraints on causal signals restrict the possible ways in which curvature singularities can be resolved in string theory. Finally, a new program of research is proposed for the construction of bulk geometry from the divergences of correlation functions in the dual field theory. This divergence structure is linked to the causal structure of the bulk and of the field theory.
Bosonic Loop Diagrams as Perturbative Solutions of the Classical Field Equations in ϕ4-Theory
NASA Astrophysics Data System (ADS)
Finster, Felix; Tolksdorf, Jürgen
2012-05-01
Solutions of the classical ϕ4-theory in Minkowski space-time are analyzed in a perturbation expansion in the nonlinearity. Using the language of Feynman diagrams, the solution of the Cauchy problem is expressed in terms of tree diagrams which involve the retarded Green's function and have one outgoing leg. In order to obtain general tree diagrams, we set up a "classical measurement process" in which a virtual observer of a scattering experiment modifies the field and detects suitable energy differences. By adding a classical stochastic background field, we even obtain all loop diagrams. The expansions are compared with the standard Feynman diagrams of the corresponding quantum field theory.
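The tree-diagram structure has a transparent 0+1-dimensional analogue: iterating the retarded Green's function of the linearized equation on the nonlinearity resums tree diagrams order by order, and should agree with direct integration for weak coupling and early times (before secular terms matter). The sketch below uses an anharmonic oscillator as a stand-in for the field equation; all parameters are illustrative.

```python
import numpy as np

m, lam = 1.0, 0.05                    # mass and weak quartic coupling (assumed)
dt, T = 0.002, 10.0
t = np.arange(0.0, T, dt)
u0 = np.cos(m * t)                    # free solution for u(0) = 1, u'(0) = 0
G = np.sin(m * t) / m                 # retarded Green's function on t >= 0

def convolve_retarded(G, f, dt):
    """(G * f)(t) = int_0^t G(t - s) f(s) ds, by discrete convolution."""
    return np.convolve(G, f)[: len(f)] * dt

# Iterate u -> u0 + G * (-lam u^3): each pass adds tree diagrams with one
# more vertex, mirroring the diagrammatic expansion of the classical solution.
u = u0.copy()
for _ in range(6):
    u = u0 + convolve_retarded(G, -lam * u**3, dt)

# Cross-check against direct (semi-implicit Euler) integration.
pos, vel, direct = 1.0, 0.0, []
for _ in t:
    direct.append(pos)
    vel += dt * (-(m**2) * pos - lam * pos**3)
    pos += dt * vel
err = np.max(np.abs(u - np.array(direct)))
print(f"max deviation, tree-iterated vs direct: {err:.3e}")
```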
Zurek, Wojciech Hubert
2018-07-13
The emergence of the classical world from the quantum substrate of our Universe is a long-standing conundrum. In this paper, I describe three insights into the transition from quantum to classical that are based on the recognition of the role of the environment. I begin with the derivation of preferred sets of states that help to define what exists, our everyday classical reality. They emerge as a result of the breaking of the unitary symmetry of the Hilbert space, which happens when the unitarity of quantum evolutions encounters nonlinearities inherent in the process of amplification, of replicating information. This derivation is accomplished without the usual tools of decoherence, and accounts for the appearance of quantum jumps and the emergence of preferred pointer states consistent with those obtained via environment-induced superselection, or einselection. The pointer states obtained in this way determine what can happen (define events) without appealing to Born's Rule for probabilities. Therefore, p_k = |ψ_k|^2 can now be deduced from the entanglement-assisted invariance, or envariance, a symmetry of entangled quantum states. With probabilities at hand, one also gains new insights into the foundations of quantum statistical physics. Moreover, one can now analyse the information flows responsible for decoherence. These information flows explain how the perception of objective classical reality arises from the quantum substrate: the effective amplification that they represent accounts for the objective existence of the einselected states of macroscopic quantum systems through the redundancy of pointer state records in their environment, through quantum Darwinism. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).
How Settings Change People: Applying Behavior Setting Theory to Consumer-Run Organizations
ERIC Educational Resources Information Center
Brown, Louis D.; Shepherd, Matthew D.; Wituk, Scott A.; Meissen, Greg
2007-01-01
Self-help initiatives stand as a classic context for organizational studies in community psychology. Behavior setting theory stands as a classic conception of organizations and the environment. This study explores both, applying behavior setting theory to consumer-run organizations (CROs). Analysis of multiple data sets from all CROs in Kansas…
ERIC Educational Resources Information Center
Langdale, John A.
The construct of "organizational climate" was explicated and various ways of operationalizing it were reviewed. A survey was made of the literature pertinent to the classical-human relations dimension of environmental quality. As a result, it was hypothesized that the appropriateness of the classical and human-relations master plans is moderated…
The evolving Planck mass in classically scale-invariant theories
NASA Astrophysics Data System (ADS)
Kannike, K.; Raidal, M.; Spethmann, C.; Veermäe, H.
2017-04-01
We consider classically scale-invariant theories with non-minimally coupled scalar fields, where the Planck mass and the hierarchy of physical scales are dynamically generated. The classical theories possess a fixed point, where scale invariance is spontaneously broken. In these theories, however, the Planck mass becomes unstable in the presence of explicit sources of scale invariance breaking, such as non-relativistic matter and cosmological constant terms. We quantify the constraints on such classical models from Big Bang Nucleosynthesis that lead to an upper bound on the non-minimal coupling and require trans-Planckian field values. We show that quantum corrections to the scalar potential can stabilise the fixed point close to the minimum of the Coleman-Weinberg potential. The time-averaged motion of the evolving fixed point is strongly suppressed, thus the limits on the evolving gravitational constant from Big Bang Nucleosynthesis and other measurements do not presently constrain this class of theories. Field oscillations around the fixed point, if not damped, contribute to the dark matter density of the Universe.
Information theory lateral density distribution for Earth inferred from global gravity field
NASA Technical Reports Server (NTRS)
Rubincam, D. P.
1981-01-01
Information Theory Inference, better known as the Maximum Entropy Method, was used to infer the lateral density distribution inside the Earth. The approach assumed that the Earth consists of indistinguishable Maxwell-Boltzmann particles populating infinitesimal volume elements, and followed the standard methods of statistical mechanics (maximizing the entropy function). The GEM 10B spherical harmonic gravity field coefficients, complete to degree and order 36, were used as constraints on the lateral density distribution. The spherically symmetric part of the density distribution was assumed to be known. The lateral density variation was assumed to be small compared to the spherically symmetric part. The resulting information theory density distributions for the cases of no crust removed, 30 km of compensated crust removed, and 30 km of uncompensated crust removed all gave broad density anomalies extending deep into the mantle, but with the density contrasts being greatest towards the surface (typically ±0.004 g cm^-3 in the first two cases and ±0.04 g cm^-3 in the third). None of the density distributions resemble classical organized convection cells. The information theory approach may have use in choosing Standard Earth Models, but the inclusion of seismic data into the approach appears difficult.
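The core of the maximum-entropy step can be shown in a toy discrete version: maximize entropy over cell occupation probabilities subject to one linear "gravity" constraint, which forces the Boltzmann-like solution p_i proportional to exp(-λ f_i), with λ fixed by the data. The response coefficients and the observed value below are invented for illustration, not GEM 10B quantities.

```python
import numpy as np
from scipy.optimize import brentq

# Maximum-entropy inference on a toy "Earth": find cell probabilities p_i
# maximizing entropy subject to sum_i p_i f_i = F. The solution has the
# Boltzmann form p_i = exp(-lam f_i) / Z(lam).
rng = np.random.default_rng(2)
f = rng.random(50)          # response of the observable to mass in cell i (assumed)
F = 0.4                     # observed constraint value (assumed)

def constraint_gap(lam):
    w = np.exp(-lam * f)
    return np.sum(w * f) / np.sum(w) - F

lam = brentq(constraint_gap, -50.0, 50.0)   # sign change is guaranteed when
p = np.exp(-lam * f)                        # min(f) < F < max(f)
p /= p.sum()
entropy = -np.sum(p * np.log(p))
print(f"lambda = {lam:.3f}, entropy = {entropy:.3f} (uniform max = {np.log(50):.3f})")
```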
Amplification of Information by Photons and the Quantum Chernoff Bound
NASA Astrophysics Data System (ADS)
Zwolak, Michael; Riedel, C. Jess; Zurek, Wojciech H.
2014-03-01
Amplification was regarded, since the early days of quantum theory, as a mysterious ingredient that endows quantum microstates with macroscopic consequences, key to the "collapse of the wavepacket," and a way to avoid embarrassing problems exemplified by Schrödinger's cat. This bridge between the quantum microworld and the classical world of our experience was postulated ad hoc in the Copenhagen Interpretation. Quantum Darwinism views amplification as replication, in many copies, of information about quantum states. We show that such amplification is a natural consequence of a broad class of models of decoherence, including the photon environment we use to obtain most of our information. The resultant amplification is huge, proportional to # ξ_QCB. Here, # is the environment size and ξ_QCB is the "typical" Quantum Chernoff Information, which quantifies the efficiency of the amplification. The information communicated through the environment is imprinted in the states of individual environment subsystems, e.g., in single photons, which document the transfer of information into the environment and result in the emergence of the classical world. See http://mike.zwolak.org
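The quantity ξ_QCB has a short numerical definition: ξ_QCB = -log min over s in [0, 1] of Tr[ρ^s σ^(1-s)], where ρ and σ are the two states an environment subsystem can be left in by the competing pointer states. A sketch for a pair of assumed qubit states:

```python
import numpy as np

def mpow(rho, s):
    """Fractional power of a positive semidefinite matrix by eigendecomposition."""
    vals, vecs = np.linalg.eigh(rho)
    return (vecs * np.clip(vals, 0.0, None) ** s) @ vecs.conj().T

def chernoff_information(rho, sigma, grid=1001):
    """xi_QCB = -log min_{0 <= s <= 1} Tr[rho^s sigma^(1-s)]."""
    qs = [np.real(np.trace(mpow(rho, s) @ mpow(sigma, 1.0 - s)))
          for s in np.linspace(0.0, 1.0, grid)]
    return -np.log(min(qs))

# Two qubit states a single environment photon might carry after scattering
# off a system in one of two pointer states (assumed example states).
rho = np.diag([0.9, 0.1]).astype(complex)
th = 0.3
U = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]], dtype=complex)
sigma = U @ rho @ U.conj().T

print(f"xi_QCB = {chernoff_information(rho, sigma):.4f} nats per subsystem")
```

The larger ξ_QCB, the fewer environment subsystems an observer must intercept to distinguish the pointer states, which is why the amplification grows with the product of the environment size and ξ_QCB.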
Holography as a highly efficient renormalization group flow. I. Rephrasing gravity
NASA Astrophysics Data System (ADS)
Behr, Nicolas; Kuperstein, Stanislav; Mukhopadhyay, Ayan
2016-07-01
We investigate how the holographic correspondence can be reformulated as a generalization of Wilsonian renormalization group (RG) flow in a strongly interacting large-N quantum field theory. We first define a highly efficient RG flow as one in which the Ward identities related to local conservation of energy, momentum and charges preserve the same form at each scale. To achieve this, it is necessary to redefine the background metric and external sources at each scale as functionals of the effective single-trace operators. These redefinitions also absorb the contributions of the multitrace operators to these effective Ward identities. Thus, the background metric and external sources become effectively dynamical, reproducing the dual classical gravity equations in one higher dimension. Here, we focus on reconstructing the pure gravity sector as a highly efficient RG flow of the energy-momentum tensor operator, leaving the explicit constructive field theory approach for generating such RG flows to the second part of the work. We show that special symmetries of the highly efficient RG flows carry information through which we can decode the gauge fixing of bulk diffeomorphisms in the corresponding gravity equations. We also show that the highly efficient RG flow which reproduces a given classical gravity theory in a given gauge is unique provided the endpoint can be transformed to a nonrelativistic fixed point with a finite number of parameters under a universal rescaling. The results obtained here are used in the second part of this work, where we do an explicit field-theoretic construction of the RG flow and obtain the dual classical gravity theory.
Hadronic density of states from string theory.
Pando Zayas, Leopoldo A; Vaman, Diana
2003-09-12
We present an exact calculation of the finite temperature partition function for the hadronic states corresponding to a Penrose-Güven limit of the Maldacena-Núñez embedding of N=1 super Yang-Mills (SYM) theory into string theory. It is established that the theory exhibits a Hagedorn density of states. We propose a semiclassical string approximation to the finite temperature partition function for confining gauge theories admitting a supergravity dual, by performing an expansion around classical solutions characterized by temporal windings. This semiclassical approximation reveals a hadronic energy density of states of a Hagedorn type, with the coefficient determined by the gauge theory string tension as expected for confining theories. We argue that our proposal captures primarily information about states of pure N=1 SYM theory, given that this semiclassical approximation does not entail a projection onto states of large U(1) charge.
DOE R&D Accomplishments Database
Weinberg, Alvin M.; Noderer, L. C.
1951-05-15
The large scale release of nuclear energy in a uranium fission chain reaction involves two essentially distinct physical phenomena. On the one hand there are the individual nuclear processes such as fission, neutron capture, and neutron scattering. These are essentially quantum mechanical in character, and their theory is non-classical. On the other hand, there is the process of diffusion -- in particular, diffusion of neutrons, which is of fundamental importance in a nuclear chain reaction. This process is classical; insofar as the theory of the nuclear chain reaction depends on the theory of neutron diffusion, the mathematical study of chain reactions is an application of classical, not quantum mechanical, techniques.
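The diffusion process referred to is classically described by one-group neutron diffusion theory; a standard form of the balance equation (generic textbook notation, not necessarily the report's) is

    \frac{1}{v}\frac{\partial \phi}{\partial t} = D\,\nabla^{2}\phi - \Sigma_{a}\,\phi + \nu\Sigma_{f}\,\phi,

where φ is the neutron flux, D the diffusion coefficient, Σ_a the absorption cross section and νΣ_f the fission source term.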
Classical conformality in the Standard Model from Coleman’s theory
NASA Astrophysics Data System (ADS)
Kawana, Kiyoharu
2016-09-01
The classical conformality (CC) is one of the possible candidates for explaining the gauge hierarchy of the Standard Model (SM). We show that it is naturally obtained from Coleman's theory of the baby universe.
NASA Astrophysics Data System (ADS)
Baumeler, Ämin; Feix, Adrien; Wolf, Stefan
2014-10-01
Quantum theory in a global spacetime gives rise to nonlocal correlations, which cannot be explained causally in a satisfactory way; this motivates the study of theories with reduced global assumptions. Oreshkov, Costa, and Brukner [Nat. Commun. 3, 1092 (2012), 10.1038/ncomms2076] proposed a framework in which quantum theory is valid locally but where, at the same time, no global spacetime, i.e., predefined causal order, is assumed beyond the absence of logical paradoxes. It was shown for the two-party case, however, that a global causal order always emerges in the classical limit. Quite naturally, it has been conjectured that the same also holds in the multiparty setting. We show that, counter to this belief, classical correlations locally compatible with classical probability theory exist that allow for deterministic signaling between three or more parties incompatible with any predefined causal order.
A post-classical theory of enamel biomineralization… and why we need one.
Simmer, James P; Richardson, Amelia S; Hu, Yuan-Yuan; Smith, Charles E; Ching-Chun Hu, Jan
2012-09-01
Enamel crystals are unique in shape, orientation and organization. They are hundreds of thousands of times longer than they are wide, run parallel to each other, are oriented with respect to the ameloblast membrane at the mineralization front and are organized into rod or interrod enamel. The classical theory of amelogenesis postulates that extracellular matrix proteins shape crystallites by specifically inhibiting ion deposition on the crystal sides, orient them by binding multiple crystallites and establish higher levels of crystal organization. Elements of the classical theory are supported in principle by in vitro studies; however, the classical theory does not explain how enamel forms in vivo. In this review, we describe how amelogenesis is highly integrated with ameloblast cell activities and how the shape, orientation and organization of enamel mineral ribbons are established by a mineralization front apparatus along the secretory surface of the ameloblast cell membrane.
Quantum algorithm for energy matching in hard optimization problems
NASA Astrophysics Data System (ADS)
Baldwin, C. L.; Laumann, C. R.
2018-06-01
We consider the ability of local quantum dynamics to solve the "energy-matching" problem: given an instance of a classical optimization problem and a low-energy state, find another macroscopically distinct low-energy state. Energy matching is difficult in rugged optimization landscapes, as the given state provides little information about the distant topography. Here, we show that the introduction of quantum dynamics can provide a speedup over classical algorithms in a large class of hard optimization problems. Tunneling allows the system to explore the optimization landscape while approximately conserving the classical energy, even in the presence of large barriers. Specifically, we study energy matching in the random p -spin model of spin-glass theory. Using perturbation theory and exact diagonalization, we show that introducing a transverse field leads to three sharp dynamical phases, only one of which solves the matching problem: (1) a small-field "trapped" phase, in which tunneling is too weak for the system to escape the vicinity of the initial state; (2) a large-field "excited" phase, in which the field excites the system into high-energy states, effectively forgetting the initial energy; and (3) the intermediate "tunneling" phase, in which the system succeeds at energy matching. The rate at which distant states are found in the tunneling phase, although exponentially slow in system size, is exponentially faster than classical search algorithms.
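For context, the transverse-field random p-spin Hamiltonian studied in such settings is conventionally written as

    H = -\sum_{i_1 < \cdots < i_p} J_{i_1 \cdots i_p}\, \sigma^{z}_{i_1} \cdots \sigma^{z}_{i_p} \;-\; \Gamma \sum_{i} \sigma^{x}_{i},

with Gaussian random couplings J; the three phases described above correspond to small, large, and intermediate values of the field Γ.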
Autosophy information theory provides lossless data and video compression based on the data content
NASA Astrophysics Data System (ADS)
Holtz, Klaus E.; Holtz, Eric S.; Holtz, Diana
1996-09-01
A new autosophy information theory provides an alternative to the classical Shannon information theory. Using the new theory in communication networks provides both a high degree of lossless compression and virtually unbreakable encryption codes for network security. The bandwidth in a conventional Shannon communication is determined only by the data volume and the hardware parameters, such as image size, resolution, or frame rates in television. The data content, or what is shown on the screen, is irrelevant. In contrast, the bandwidth in autosophy communication is determined only by the data content, such as novelty and movement in television images. It is the data volume and hardware parameters that become irrelevant. Basically, the new communication methods use prior 'knowledge' of the data, stored in a library, to encode subsequent transmissions. The more 'knowledge' stored in the libraries, the higher the potential compression ratio. 'Information' is redefined as that which is not already known by the receiver. Everything already known is redundant and need not be re-transmitted. In a perfect communication each transmission code, called a 'tip', creates a new 'engram' of knowledge in the library, in which each tip transmission can represent any amount of data. Autosophy theories provide six separate learning modes, or omni-dimensional networks, all of which can be used for data compression. The new information theory reveals the theoretical flaws of other data compression methods, including the Huffman, Lempel-Ziv, and LZW codes and commercial compression codes such as V.42bis and MPEG-2.
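To make the 'library' idea concrete, here is a minimal sketch of dictionary-based coding in the LZW style; it illustrates the transmit-only-what-is-new principle but is not the proprietary autosophy algorithm, and the names are ours.

    # Minimal LZW-style dictionary coder (illustrative sketch only):
    # the growing dictionary plays the role of the shared "library of knowledge",
    # and each emitted code (a "tip") stands for an ever-longer known phrase.
    def lzw_encode(data: str) -> list:
        library = {chr(i): i for i in range(256)}   # shared prior knowledge
        phrase, codes = "", []
        for symbol in data:
            candidate = phrase + symbol
            if candidate in library:
                phrase = candidate                  # still known: send nothing yet
            else:
                codes.append(library[phrase])       # send code for longest known phrase
                library[candidate] = len(library)   # learn a new "engram"
                phrase = symbol
        if phrase:
            codes.append(library[phrase])
        return codes

    print(lzw_encode("abababab"))  # [97, 98, 256, 258, 98]: repetition compresses as the library grows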
Statistical mechanics in the context of special relativity. II.
Kaniadakis, G
2005-09-01
The special relativity laws emerge as one-parameter (light speed) generalizations of the corresponding laws of classical physics. These generalizations, imposed by the Lorentz transformations, affect both the definition of the various physical observables (e.g., momentum, energy, etc.), as well as the mathematical apparatus of the theory. Here, following the general lines of [Phys. Rev. E 66, 056125 (2002)], we show that the Lorentz transformations impose also a proper one-parameter generalization of the classical Boltzmann-Gibbs-Shannon entropy. The obtained relativistic entropy permits us to construct a coherent and self-consistent relativistic statistical theory, preserving the main features of the ordinary statistical theory, which is recovered in the classical limit. The predicted distribution function is a one-parameter continuous deformation of the classical Maxwell-Boltzmann distribution and has a simple analytic form, showing power law tails in accordance with the experimental evidence. Furthermore, this statistical mechanics can be obtained as the stationary case of a generalized kinetic theory governed by an evolution equation obeying the H theorem and reproducing the Boltzmann equation of the ordinary kinetics in the classical limit.
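The deformation in question has a simple closed form: the κ-logarithm and the associated entropy (standard Kaniadakis notation) are

    \ln_{\kappa}(x) = \frac{x^{\kappa} - x^{-\kappa}}{2\kappa}, \qquad S_{\kappa} = -\sum_i p_i \ln_{\kappa} p_i = -\sum_i \frac{p_i^{1+\kappa} - p_i^{1-\kappa}}{2\kappa},

which recover ln x and the Boltzmann-Gibbs-Shannon entropy in the κ → 0 limit.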
Topics in quantum cryptography, quantum error correction, and channel simulation
NASA Astrophysics Data System (ADS)
Luo, Zhicheng
In this thesis, we mainly investigate four different topics: efficiently implementable codes for quantum key expansion [51], quantum error-correcting codes based on privacy amplification [48], private classical capacity of quantum channels [44], and classical channel simulation with quantum side information [49, 50]. For the first topic, we propose an efficiently implementable quantum key expansion protocol, capable of increasing the size of a pre-shared secret key by a constant factor. Previously, the Shor-Preskill proof [64] of the security of the Bennett-Brassard 1984 (BB84) [6] quantum key distribution protocol relied on the theoretical existence of good classical error-correcting codes with the "dual-containing" property. But the explicit and efficiently decodable construction of such codes is unknown. We show that we can lift the dual-containing constraint by employing non-dual-containing codes with excellent performance and efficient decoding algorithms. For the second topic, we propose a construction of Calderbank-Shor-Steane (CSS) [19, 68] quantum error-correcting codes, which are originally based on pairs of mutually dual-containing classical codes, by combining a classical code with a two-universal hash function. We show, using the results of Renner and Koenig [57], that the communication rates of such codes approach the hashing bound on tensor powers of Pauli channels in the limit of large block length. For the third topic, we prove a regularized formula for the secret-key-assisted capacity region of a quantum channel for transmitting private classical information. This result parallels the work of Devetak on entanglement-assisted quantum communication capacity. Within the resource inequality framework, this formula yields a new family protocol, the private father protocol, which includes private classical communication without assisting secret keys as a child protocol. For the fourth topic, we study and solve the problem of classical channel simulation with quantum side information at the receiver. Our main theorem has two important corollaries: rate-distortion theory with quantum side information and common randomness distillation. Simple proofs of achievability of classical multi-terminal source coding problems can be made via a unified approach using the channel simulation theorem as a building block. The fully quantum generalization of the problem is also conjectured, with outer and inner bounds on the achievable rate pairs.
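As a flavour of the hashing ingredient, here is a sketch of one standard two-universal family, random Toeplitz matrices over GF(2); this illustrates the concept only, it is not the construction used in the thesis, and all names are ours.

    # Two-universal hashing via a random Toeplitz matrix over GF(2) (sketch).
    import numpy as np

    def toeplitz_hash(x, seed_bits, m):
        """Hash the n-bit vector x down to m bits; seed_bits holds n + m - 1 random bits."""
        n = x.size
        T = np.empty((m, n), dtype=int)
        for i in range(m):
            for j in range(n):
                T[i, j] = seed_bits[i - j + n - 1]   # constant along each diagonal
        return (T @ x.astype(int)) % 2               # matrix-vector product mod 2

    rng = np.random.default_rng(0)
    n, m = 16, 4
    x = rng.integers(0, 2, n)
    seed = rng.integers(0, 2, n + m - 1)
    print(toeplitz_hash(x, seed, m))                 # m-bit digest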
Signatures of bifurcation on quantum correlations: Case of the quantum kicked top
NASA Astrophysics Data System (ADS)
Bhosale, Udaysinh T.; Santhanam, M. S.
2017-01-01
Quantum correlations reflect the quantumness of a system and are useful resources for quantum information and computational processes. Measures of quantum correlations do not have a classical analog and yet are influenced by classical dynamics. In this work, by modeling the quantum kicked top as a multiqubit system, the effect of classical bifurcations on measures of quantum correlations such as the quantum discord, geometric discord, and Meyer and Wallach Q measure is studied. The quantum correlation measures change rapidly in the vicinity of a classical bifurcation point. If the classical system is largely chaotic, time averages of the correlation measures are in good agreement with the values obtained by considering the appropriate random matrix ensembles. The quantum correlations scale with the total spin of the system, representing its semiclassical limit. In the vicinity of trivial fixed points of the kicked top, the scaling function decays as a power law. In the chaotic limit, for large total spin, quantum correlations saturate to a constant, which we obtain analytically, based on random matrix theory, for the Q measure. We also suggest that it can have experimental consequences.
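For reference, the kicked top studied here is conventionally defined by the Floquet operator (standard form; conventions for the parameters vary)

    U = \exp\!\Big(-i\,\frac{\kappa}{2j}\,J_{z}^{2}\Big)\,\exp\!\big(-i\,p\,J_{y}\big),

where j is the total spin, p the rotation angle per kick and κ the chaoticity parameter driving the bifurcations and chaos referred to above.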
Classical Causal Models for Bell and Kochen-Specker Inequality Violations Require Fine-Tuning
NASA Astrophysics Data System (ADS)
Cavalcanti, Eric G.
2018-04-01
Nonlocality and contextuality are at the root of conceptual puzzles in quantum mechanics, and they are key resources for quantum advantage in information-processing tasks. Bell nonlocality is best understood as the incompatibility between quantum correlations and the classical theory of causality, applied to relativistic causal structure. Contextuality, on the other hand, is on a more controversial foundation. In this work, I provide a common conceptual ground between nonlocality and contextuality as violations of classical causality. First, I show that Bell inequalities can be derived solely from the assumptions of no signaling and no fine-tuning of the causal model. This removes two extra assumptions from a recent result from Wood and Spekkens and, remarkably, does not require any assumption related to independence of measurement settings—unlike all other derivations of Bell inequalities. I then introduce a formalism to represent contextuality scenarios within causal models and show that all classical causal models for violations of a Kochen-Specker inequality require fine-tuning. Thus, the quantum violation of classical causality goes beyond the case of spacelike-separated systems and already manifests in scenarios involving single systems.
ERIC Educational Resources Information Center
Pfeifer, Jennifer H.; Masten, Carrie L.; Borofsky, Larissa A.; Dapretto, Mirella; Fuligni, Andrew J.; Lieberman, Matthew D.
2009-01-01
Classic theories of self-development suggest people define themselves in part through internalized perceptions of other people's beliefs about them, known as reflected self-appraisals. This study uses functional magnetic resonance imaging to compare the neural correlates of direct and reflected self-appraisals in adolescence (N = 12, ages 11-14…
Deterministic Chaos: Proposal of an Informal Educational Activity Aimed at High School Students
ERIC Educational Resources Information Center
Greco, Valeria; Spagnolo, Salvatore
2016-01-01
Chaos theory is not present in the Italian school curricula and textbooks, in spite of being at work in many topics of classical physics and in everyday life. Chaotic dynamics, in fact, are involved in phenomena easily accessible to everyone or in events experienced by most people in their lives (the dripping of a faucet which keeps people awake…
Informational analysis for compressive sampling in radar imaging.
Zhang, Jingxiong; Yang, Ke
2015-03-24
Compressive sampling or compressed sensing (CS) works on the assumption of the sparsity or compressibility of the underlying signal, and relies on the trans-informational capability of the measurement matrix employed and the resultant measurements. Operating with optimization-based algorithms for signal reconstruction, CS is thus able to compress data while acquiring them, leading to sub-Nyquist sampling strategies that promote efficiency in data acquisition while ensuring certain accuracy criteria. Information theory provides a framework complementary to classic CS theory for analyzing information mechanisms and for determining the necessary number of measurements in a CS environment, such as CS-radar, a radar sensor conceptualized or designed with CS principles and techniques. Despite increasing awareness of information-theoretic perspectives on CS-radar, reported research has been rare. This paper seeks to bridge the gap in the interdisciplinary area of CS, radar and information theory by analyzing information flows in CS-radar from sparse scenes to measurements and determining the sub-Nyquist sampling rates necessary for scene reconstruction within certain distortion thresholds, given differing scene sparsity and average per-sample signal-to-noise ratios (SNRs). Simulated studies were performed to complement and validate the information-theoretic analysis. The combined strategy proposed in this paper is valuable for information-theoretic oriented CS-radar system analysis and performance evaluation.
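The information-theoretic bookkeeping rests on the standard CS measurement model; schematically (generic notation, not the paper's),

    y = \Phi x + w, \qquad \Phi \in \mathbb{R}^{m \times n},\; m \ll n, \qquad m \gtrsim C\,k\,\log(n/k),

where x is k-sparse and the last relation gives the order of magnitude of measurements needed for stable reconstruction, with the constant C degrading as the per-sample SNR drops.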
Emergent dark energy via decoherence in quantum interactions
NASA Astrophysics Data System (ADS)
Altamirano, Natacha; Corona-Ugalde, Paulina; Khosla, Kiran E.; Milburn, Gerard J.; Mann, Robert B.
2017-06-01
In this work we consider a recent proposal that gravitational interactions are mediated via classical information and apply it to a relativistic context. We study a toy model of a quantized Friedmann-Robertson-Walker (FRW) universe with the assumption that any test particles must feel a classical metric. We show that such a model results in decoherence in the FRW state that manifests itself as a dark energy fluid that fills the spacetime. Analysis of the resulting fluid shows that the equation of state asymptotically oscillates around the value w = -1/3, regardless of the spatial curvature, which marks the boundary between accelerating and decelerating expanding FRW cosmologies. Motivated by quantum-classical interactions, this model is yet another example of a theory with violation of energy-momentum conservation whose signature could have significant consequences for the observable universe.
Characterization of classical static noise via qubit as probe
NASA Astrophysics Data System (ADS)
Javed, Muhammad; Khan, Salman; Ullah, Sayed Arif
2018-03-01
The dynamics of the quantum Fisher information (QFI) of a single qubit coupled to classical static noise is investigated. The analytical relation for the QFI fixes the optimal initial state of the qubit that maximizes it. An approximate limit for the coupling time that leads to physically useful results is identified. Moreover, using the approach of quantum estimation theory and the analytical relation for the QFI, the qubit is used as a probe to precisely estimate the disorder parameter of the environment. A relation for the optimal interaction time with the environment is obtained, and a condition for the optimal measurement of the noise parameter of the environment is given. It is shown that all values of the noise parameter in the mentioned range are estimable with equal precision. A comparison of our results with previous studies in different classical environments is made.
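For a pure-state family |ψ_θ⟩, the QFI and the resulting precision limit take the standard forms

    F_{Q}(\theta) = 4\left(\langle \partial_{\theta}\psi \,|\, \partial_{\theta}\psi \rangle - \big|\langle \psi \,|\, \partial_{\theta}\psi \rangle\big|^{2}\right), \qquad \mathrm{Var}(\hat{\theta}) \ge \frac{1}{\nu\, F_{Q}(\theta)},

the quantum Cramér-Rao bound for ν repetitions; maximizing F_Q over initial states and interaction times is what fixes the optimal probe above.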
Integrative mental health care: from theory to practice, Part 2.
Lake, James
2008-01-01
Integrative approaches will lead to more accurate and different understandings of mental illness. Beneficial responses to complementary and alternative therapies provide important clues about the phenomenal nature of the human body in space-time and disparate biological, informational, and energetic factors associated with normal and abnormal psychological functioning. The conceptual framework of contemporary Western psychiatry includes multiple theoretical viewpoints, and there is no single best explanatory model of mental illness. Future theories of mental illness causation will not depend exclusively on empirical verification of strictly biological processes but will take into account both classically described biological processes and non-classical models, including complexity theory, resulting in more complete explanations of the characteristics and causes of symptoms and mechanisms of action that result in beneficial responses to treatments. Part 1 of this article examined the limitations of the theory and contemporary clinical methods employed in Western psychiatry and discussed implications of emerging paradigms in physics and the biological sciences for the future of psychiatry. In part 2, a practical methodology for planning integrative assessment and treatment strategies in mental health care is proposed. Using this methodology, the integrative management of moderate and severe psychiatric symptoms is reviewed in detail. As the conceptual framework of Western medicine evolves toward an increasingly integrative perspective, novel understanding of complex relationships between biological, informational, and energetic processes associated with normal psychological functioning and mental illness will lead to more effective integrative assessment and treatment strategies addressing the causes or meanings of symptoms at multiple hierarchic levels of body-brain-mind.
Integrative mental health care: from theory to practice, part 1.
Lake, James
2007-01-01
Integrative approaches will lead to more accurate and different understandings of mental illness. Beneficial responses to complementary and alternative therapies provide important clues about the phenomenal nature of the human body in space-time and disparate biological, informational, and energetic factors associated with normal and abnormal psychological functioning. The conceptual framework of contemporary Western psychiatry includes multiple theoretical viewpoints, and there is no single best explanatory model of mental illness. Future theories of mental illness causation will not depend exclusively on empirical verification of strictly biological processes but will take into account both classically described biological processes and non-classical models, including complexity theory, resulting in more complete explanations of the characteristics and causes of symptoms and mechanisms of action that result in beneficial responses to treatments. Part 1 of this article examines the limitations of the theory and contemporary clinical methods employed in Western psychiatry and discusses implications of emerging paradigms in physics and the biological sciences for the future of psychiatry. In part 2, a practical methodology for planning integrative assessment and treatment strategies in mental health care is proposed. Using this methodology the integrative management of moderate and severe psychiatric symptoms is reviewed in detail. As the conceptual framework of Western medicine evolves toward an increasingly integrative perspective, novel understandings of complex relationships between biological, informational, and energetic processes associated with normal psychological functioning and mental illness will lead to more effective integrative assessment and treatment strategies addressing the causes or meanings of symptoms at multiple hierarchic levels of body-brain-mind.
A self-interfering clock as a “which path” witness
NASA Astrophysics Data System (ADS)
Margalit, Yair; Zhou, Zhifan; Machluf, Shimon; Rohrlich, Daniel; Japha, Yonathan; Folman, Ron
2015-09-01
In Einstein’s general theory of relativity, time depends locally on gravity; in standard quantum theory, time is global—all clocks “tick” uniformly. We demonstrate a new tool for investigating time in the overlap of these two theories: a self-interfering clock, comprising two atomic spin states. We prepare the clock in a spatial superposition of quantum wave packets, which evolve coherently along two paths into a stable interference pattern. If we make the clock wave packets “tick” at different rates, to simulate a gravitational time lag, the clock time along each path yields “which path” information, degrading the pattern’s visibility. In contrast, in standard interferometry, time cannot yield “which path” information. This proof-of-principle experiment may have implications for the study of time and general relativity and their impact on fundamental effects such as decoherence and the emergence of a classical world.
A self-interfering clock as a "which path" witness.
Margalit, Yair; Zhou, Zhifan; Machluf, Shimon; Rohrlich, Daniel; Japha, Yonathan; Folman, Ron
2015-09-11
In Einstein's general theory of relativity, time depends locally on gravity; in standard quantum theory, time is global; all clocks "tick" uniformly. We demonstrate a new tool for investigating time in the overlap of these two theories: a self-interfering clock, comprising two atomic spin states. We prepare the clock in a spatial superposition of quantum wave packets, which evolve coherently along two paths into a stable interference pattern. If we make the clock wave packets "tick" at different rates, to simulate a gravitational time lag, the clock time along each path yields "which path" information, degrading the pattern's visibility. In contrast, in standard interferometry, time cannot yield "which path" information. This proof-of-principle experiment may have implications for the study of time and general relativity and their impact on fundamental effects such as decoherence and the emergence of a classical world.
Leading-order classical Lagrangians for the nonminimal standard-model extension
NASA Astrophysics Data System (ADS)
Reis, J. A. A. S.; Schreck, M.
2018-03-01
In this paper, we derive the general leading-order classical Lagrangian covering all fermion operators of the nonminimal standard-model extension (SME). Such a Lagrangian is considered to be the point-particle analog of the effective field theory description of Lorentz violation that is provided by the SME. At leading order in Lorentz violation, the Lagrangian obtained satisfies the set of five nonlinear equations that govern the map from the field theory to the classical description. This result can be of use for phenomenological studies of classical bodies in gravitational fields.
Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes
NASA Astrophysics Data System (ADS)
Harrington, James William
Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g. , factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models. I also present a local classical processing scheme for correcting errors on toric codes, which demonstrates that quantum information can be maintained in two dimensions by purely local (quantum and classical) resources.
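To make the local check structure concrete, here is a minimal sketch of star-operator syndrome extraction for Z errors on an L × L toric code; it illustrates the stabilizer geometry only, it is not the decoding scheme of the thesis, and the conventions are ours.

    # Toric code sketch: qubits sit on the 2*L*L edges of an L x L periodic
    # lattice; the star (vertex) check flags the parity of Z errors on the
    # four incident edges. Errors are detected as pairs of flagged vertices.
    import numpy as np

    def star_syndrome(z_err_h, z_err_v):
        """z_err_h[i, j]: Z error on the horizontal edge east of vertex (i, j);
        z_err_v[i, j]: Z error on the vertical edge south of vertex (i, j)."""
        east = z_err_h
        west = np.roll(z_err_h, 1, axis=1)    # horizontal edge west of (i, j)
        south = z_err_v
        north = np.roll(z_err_v, 1, axis=0)   # vertical edge north of (i, j)
        return (east + west + south + north) % 2

    L = 4
    z_h = np.zeros((L, L), dtype=int)
    z_v = np.zeros((L, L), dtype=int)
    z_h[1, 1] = 1                             # one Z error on a horizontal edge
    print(star_syndrome(z_h, z_v))            # flags the two adjacent vertices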
Quasi-Static Analysis of Round LaRC THUNDER Actuators
NASA Technical Reports Server (NTRS)
Campbell, Joel F.
2007-01-01
An analytic approach is developed to predict the shape and displacement with voltage in the quasi-static limit of round LaRC Thunder Actuators. The problem is treated with classical lamination theory and Von Karman non-linear analysis. In the case of classical lamination theory exact analytic solutions are found. It is shown that classical lamination theory is insufficient to describe the physical situation for large actuators but is sufficient for very small actuators. Numerical results are presented for the non-linear analysis and compared with experimental measurements. Snap-through behavior, bifurcation, and stability are presented and discussed.
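For reference, classical lamination theory relates the force and moment resultants to mid-plane strains and curvatures through the standard ABD relation

    \begin{pmatrix} N \\ M \end{pmatrix} = \begin{pmatrix} A & B \\ B & D \end{pmatrix} \begin{pmatrix} \varepsilon^{0} \\ \kappa \end{pmatrix}, \qquad (A, B, D) = \int_{-h/2}^{h/2} \bar{Q}\,(1, z, z^{2})\, dz,

where the bending-stretching coupling matrix B vanishes for symmetric layups; in the linear regime it is through this relation that the voltage-induced strains set the actuator shape.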
Quasi-Static Analysis of LaRC THUNDER Actuators
NASA Technical Reports Server (NTRS)
Campbell, Joel F.
2007-01-01
An analytic approach is developed to predict the shape and displacement with voltage in the quasi-static limit of LaRC Thunder Actuators. The problem is treated with classical lamination theory and Von Karman non-linear analysis. In the case of classical lamination theory exact analytic solutions are found. It is shown that classical lamination theory is insufficient to describe the physical situation for large actuators but is sufficient for very small actuators. Numerical results are presented for the non-linear analysis and compared with experimental measurements. Snap-through behavior, bifurcation, and stability are presented and discussed.
Scalf, Paige E; Torralbo, Ana; Tapia, Evelina; Beck, Diane M
2013-01-01
Both perceptual load theory and dilution theory purport to explain when and why task-irrelevant information, so-called distractors, is processed. Central to both explanations is the notion of limited resources, although the theories differ in the precise way in which those limitations affect distractor processing. We have recently proposed a neurally plausible explanation of limited resources in which neural competition among stimuli hinders their representation in the brain. This view of limited capacity can also explain distractor processing, whereby the competitive interactions and the bias imposed to resolve the competition determine the extent to which a distractor is processed. This idea is compatible with aspects of both perceptual load and dilution models of distractor processing, but also serves to highlight their differences. Here we review the evidence in favor of a biased competition view of limited resources and relate these ideas to both classic perceptual load theory and dilution theory.
Navigating the grounded theory terrain. Part 2.
Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John
2011-01-01
In this paper, the choice of classic grounded theory will be discussed and justified in the context of the first author's PhD research. The methodological discussion takes place within the context of PhD research entitled: Development of a stakeholder-led framework for a structured education programme that will prepare nurses and healthcare assistants to deliver a psychosocial intervention for people with dementia. There is a lack of research and limited understanding of the effect of psychosocial interventions on people with dementia. The first author thought classic grounded theory a suitable research methodology to investigate, as it is held to be ideal for areas of research where there is little understanding of the social processes at work. The literature relating to the practical application of classic grounded theory is illustrated using examples relating to four key grounded theory components: theory development (using constant comparison and memoing), methodological rigour, emergence of a core category, and inclusion of self and engagement with participants. Following discussion of the choice and application of classic grounded theory, this paper argues that researchers new to grounded theory must visit, understand, and become familiar with the various grounded theory options. The researchers will then be able to apply the methodologies they choose consistently and critically. Doing so will allow them to develop theory rigorously, and they will ultimately be able to better defend their final methodological destinations.
Mathematical model of the SH-3G helicopter
NASA Technical Reports Server (NTRS)
Phillips, J. D.
1982-01-01
A mathematical model of the Sikorsky SH-3G helicopter based on classical nonlinear, quasi-steady rotor theory was developed. The model was validated statically and dynamically by comparison with Navy flight-test data. The model incorporates ad hoc revisions which address the ideal assumptions of classical rotor theory and improve the static trim characteristics to provide a more realistic simulation, while retaining the simplicity of the classical model.
Geometric Algebra for Physicists
NASA Astrophysics Data System (ADS)
Doran, Chris; Lasenby, Anthony
2007-11-01
Preface; Notation; 1. Introduction; 2. Geometric algebra in two and three dimensions; 3. Classical mechanics; 4. Foundations of geometric algebra; 5. Relativity and spacetime; 6. Geometric calculus; 7. Classical electrodynamics; 8. Quantum theory and spinors; 9. Multiparticle states and quantum entanglement; 10. Geometry; 11. Further topics in calculus and group theory; 12. Lagrangian and Hamiltonian techniques; 13. Symmetry and gauge theory; 14. Gravitation; Bibliography; Index.
Nucleation theory - Is replacement free energy needed? [Error analysis of the capillary approximation]
NASA Technical Reports Server (NTRS)
Doremus, R. H.
1982-01-01
It has been suggested that the classical theory of nucleation of a liquid from its vapor, as developed by Volmer and Weber (1926), needs modification with a factor referred to as the replacement free energy, and that the capillary approximation underlying the classical theory is in error. Here, the classical nucleation equation is derived from fluctuation theory, Gibbs's result for the reversible work to form a critical nucleus, and the rate of collision of gas molecules with a surface. The capillary approximation is not used in the derivation. The chemical potential of small drops is then considered, and it is shown that the capillary approximation can be derived from thermodynamic equations. The results show that no corrections to Volmer's equation are needed.
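The classical (capillarity-based) expressions at issue are, in standard notation,

    \Delta G(r) = 4\pi r^{2}\sigma - \frac{4}{3}\pi r^{3}\,\Delta G_{v}, \qquad r^{*} = \frac{2\sigma}{\Delta G_{v}}, \qquad \Delta G^{*} = \frac{16\pi\sigma^{3}}{3\,\Delta G_{v}^{2}}, \qquad J = J_{0}\, e^{-\Delta G^{*}/k_{B}T},

with σ the surface free energy and ΔG_v the free-energy gain per unit volume of the new phase; the argument above is that Volmer's rate J needs no replacement-free-energy correction.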
Effective model hierarchies for dynamic and static classical density functional theories
NASA Astrophysics Data System (ADS)
Majaniemi, S.; Provatas, N.; Nonomura, M.
2010-09-01
The origin and methodology of deriving effective model hierarchies are presented with applications to solidification of crystalline solids. In particular, it is discussed how the form of the equations of motion and the effective parameters on larger scales can be obtained from the more microscopic models. It will be shown that tying together the dynamic structure of the projection operator formalism with static classical density functional theories can lead to incomplete (mass) transport properties even though the linearized hydrodynamics on large scales is correctly reproduced. To facilitate a more natural way of binding together the dynamics of the macrovariables and classical density functional theory, a dynamic generalization of density functional theory based on the nonequilibrium generating functional is suggested.
Calculation of the room-temperature shapes of unsymmetric laminates
NASA Technical Reports Server (NTRS)
Hyer, M. W.
1981-01-01
A theory explaining the characteristics of the cured shapes of unsymmetric laminates is presented. The theory is based on an extension of classical lamination theory which accounts for geometric nonlinearities. A Rayleigh-Ritz approach to minimizing the total potential energy is used to obtain quantitative information regarding the room-temperature shapes of square T300/5208 [0₂/90₂]_T and [0₄/90₄]_T graphite-epoxy laminates. It is shown that, depending on the thickness of the laminate and the length of the side of the square, the saddle shape configuration is actually unstable. For values of length and thickness that render the saddle shape unstable, it is shown that two stable cylindrical shapes exist. The predictions of the theory are compared with existing experimental data.
Quantum to Classical Transitions via Weak Measurements and Post-Selection
NASA Astrophysics Data System (ADS)
Cohen, Eliahu; Aharonov, Yakir
Alongside its immense empirical success, the quantum mechanical account of physical systems imposes a myriad of divergences from our thoroughly ingrained classical ways of thinking. These divergences, while striking, would have been acceptable if only a continuous transition to the classical domain was at hand. Strangely, this is not quite the case. The difficulties involved in reconciling the quantum with the classical have given rise to different interpretations, each with its own shortcomings. Traditionally, the two domains are sewed together by invoking an ad hoc theory of measurement, which has been incorporated in the axiomatic foundations of quantum theory. This work will incorporate a few related tools for addressing the above conceptual difficulties: deterministic operators, weak measurements, and post-selection. Weak measurement, based on a very weak von Neumann coupling, is a unique kind of quantum measurement with numerous theoretical and practical applications. In contrast to other measurement techniques, it allows one to gather a small amount of information regarding the quantum system, with only a negligible probability of collapsing it onto an eigenstate of the measured observable. A single weak measurement yields an almost random outcome, but when performed repeatedly over a large ensemble, the averaged outcome becomes increasingly robust and accurate. Importantly, a long sequence of weak measurements can be thought of as a single projective measurement. We claim in this work that classical variables appearing in the macro-world, such as center of mass, moment of inertia, pressure, and average forces, result from a multitude of quantum weak measurements performed in the micro-world. Here again, the quantum outcomes are highly uncertain, but the law of large numbers obliges their convergence to the definite quantities we know from our everyday lives. By augmenting this description with a final boundary condition and employing the notion of "classical robustness under time-reversal", we will draw a quantitative borderline between the classical and quantum regimes. We will conclude by analyzing the role of macroscopic systems in amplifying and recording quantum outcomes.
Using extant literature in a grounded theory study: a personal account.
Yarwood-Ross, Lee; Jack, Kirsten
2015-03-01
To provide a personal account of the factors in a doctoral study that led to the adoption of classic grounded theory principles relating to the use of literature. Novice researchers considering grounded theory methodology will become aware of the contentious issue of how and when extant literature should be incorporated into a study. The three main grounded theory approaches are classic, Straussian and constructivist, and the seminal texts provide conflicting beliefs surrounding the use of literature. A classic approach avoids a pre-study literature review to minimise preconceptions and emphasises the constant comparison method, while the Straussian and constructivist approaches focus more on the beneficial aspects of an initial literature review and researcher reflexivity. The debate also extends into the wider academic community, where no consensus exists. This is a methodological paper detailing the authors' engagement in the debate surrounding the role of the literature in a grounded theory study. In the authors' experience, researchers can best understand the use of literature in grounded theory through immersion in the seminal texts, engaging with wider academic literature, and examining their preconceptions of the substantive area. The authors concluded that classic grounded theory principles were appropriate in the context of their doctoral study. Novice researchers will have their own sets of circumstances when preparing their studies and should become aware of the different perspectives to make decisions that they can ultimately justify. This paper can be used by other novice researchers as an example of the decision-making process that led to delaying a pre-study literature review and identifies the resources used to write a research proposal when using a classic grounded theory approach.
Classical BV Theories on Manifolds with Boundary
NASA Astrophysics Data System (ADS)
Cattaneo, Alberto S.; Mnev, Pavel; Reshetikhin, Nicolai
2014-12-01
In this paper we extend the classical BV framework to gauge theories on spacetime manifolds with boundary. In particular, we connect the BV construction in the bulk with the BFV construction on the boundary and we develop its extension to strata of higher codimension in the case of manifolds with corners. We present several examples including electrodynamics, Yang-Mills theory and topological field theories coming from the AKSZ construction, in particular, the Chern-Simons theory, the BF theory, and the Poisson sigma model. This paper is the first step towards developing the perturbative quantization of such theories on manifolds with boundary in a way consistent with gluing.
An Examination of the Flynn Effect in the National Intelligence Test in Estonia
ERIC Educational Resources Information Center
Shiu, William
2012-01-01
This study examined the Flynn Effect (FE; i.e., the rise in IQ scores over time) in Estonia from Scale B of the National Intelligence Test, using both classical test theory (CTT) and item response theory (IRT) methods. Secondary data from two cohorts (1934, n = 890 and 2006, n = 913) of students were analyzed…
Beyond Classical Information Theory: Advancing the Fundamentals for Improved Geophysical Prediction
NASA Astrophysics Data System (ADS)
Perdigão, R. A. P.; Pires, C. L.; Hall, J.; Bloeschl, G.
2016-12-01
Information Theory, in its original and quantum forms, has gradually made its way into various fields of science and engineering. From the very basic concepts of Information Entropy and Mutual Information to Transit Information, Interaction Information and their respective partitioning into statistical synergy, redundancy and exclusivity, the overall theoretical foundations had matured as early as the mid twentieth century. In the Earth Sciences, various interesting applications have been devised over the last few decades, such as the design of complex process networks of descriptive and/or inferential nature, wherein earth system processes are "nodes" and statistical relationships between them are designed as information-theoretical "interactions". However, most applications still take up the very early concepts along with their many caveats, especially in heavily non-Normal, non-linear and structurally changing scenarios. In order to overcome the traditional limitations of information theory and tackle elusive Earth System phenomena, we introduce a new suite of information dynamic methodologies towards a more physically consistent and informationally comprehensive framework. The methodological developments are then illustrated on a set of practical examples from geophysical fluid dynamics, where high-order nonlinear relationships elusive to the current non-linear information measures are aptly captured. In doing so, these advances increase the predictability of critical events such as the emergence of hyper-chaotic regimes in ocean-atmospheric dynamics and the occurrence of hydro-meteorological extremes.
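As a baseline for the measures listed above, here is a sketch of the elementary plug-in (histogram) estimate of mutual information between two series; real geophysical studies need careful bin selection and bias correction, and the function names are ours.

    # Plug-in estimate of mutual information (nats) between two data series,
    # the elementary building block of information-theoretic process networks.
    import numpy as np

    def mutual_information(x, y, bins=16):
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = pxy / pxy.sum()                    # joint probability table
        px = pxy.sum(axis=1, keepdims=True)      # marginal of x
        py = pxy.sum(axis=0, keepdims=True)      # marginal of y
        mask = pxy > 0
        return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

    rng = np.random.default_rng(1)
    x = rng.normal(size=10_000)
    y = x + 0.5 * rng.normal(size=10_000)        # dependent pair
    print(mutual_information(x, y))              # clearly positive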
Quantum information, cognition, and music.
Dalla Chiara, Maria L; Giuntini, Roberto; Leporini, Roberto; Negri, Eleonora; Sergioli, Giuseppe
2015-01-01
Parallelism represents an essential aspect of human mind/brain activities. One can recognize some common features between psychological parallelism and the characteristic parallel structures that arise in quantum theory and in quantum computation. The article is devoted to a discussion of the following questions: a comparison between classical probabilistic Turing machines and quantum Turing machines; possible applications of the quantum computational semantics to cognitive problems; and parallelism in music.
Issues of Dynamic Coalition Formation Among Rational Agents
2002-04-01
Approaches to forming stable coalitions among rational agents are surveyed. Issues and problems of dynamic coalition environments are discussed, along with a coalition algorithm, a coalition formation environment and model for rational agents involved in a co-operative game (A, v), and a publicly available simulation environment for coalition formation among rational information agents based on selected classic coalition theories.
Quantum information, cognition, and music
Dalla Chiara, Maria L.; Giuntini, Roberto; Leporini, Roberto; Negri, Eleonora; Sergioli, Giuseppe
2015-01-01
Parallelism represents an essential aspect of human mind/brain activities. One can recognize some common features between psychological parallelism and the characteristic parallel structures that arise in quantum theory and in quantum computation. The article is devoted to a discussion of the following questions: a comparison between classical probabilistic Turing machines and quantum Turing machines; possible applications of the quantum computational semantics to cognitive problems; and parallelism in music. PMID:26539139
Bukhvostov-Lipatov model and quantum-classical duality
NASA Astrophysics Data System (ADS)
Bazhanov, Vladimir V.; Lukyanov, Sergei L.; Runov, Boris A.
2018-02-01
The Bukhvostov-Lipatov model is an exactly soluble model of two interacting Dirac fermions in 1 + 1 dimensions. The model describes weakly interacting instantons and anti-instantons in the O(3) non-linear sigma model. In our previous work [arXiv:1607.04839] we have proposed an exact formula for the vacuum energy of the Bukhvostov-Lipatov model in terms of special solutions of the classical sinh-Gordon equation, which can be viewed as an example of a remarkable duality between integrable quantum field theories and integrable classical field theories in two dimensions. Here we present a complete derivation of this duality based on the classical inverse scattering transform method, traditional Bethe ansatz techniques and the analytic theory of ordinary differential equations. In particular, we show that the Bethe ansatz equations defining the vacuum state of the quantum theory also define connection coefficients of an auxiliary linear problem for the classical sinh-Gordon equation. Moreover, we also present details of the derivation of the non-linear integral equations determining the vacuum energy and other spectral characteristics of the model in the case when the vacuum state is filled by 2-string solutions of the Bethe ansatz equations.
Whitley, Heather D.; Scullard, Christian R.; Benedict, Lorin X.; ...
2014-12-04
Here, we present a discussion of kinetic theory treatments of linear electrical and thermal transport in hydrogen plasmas, for a regime of interest to inertial confinement fusion applications. In order to assess the accuracy of one of the more involved of these approaches, classical Lenard-Balescu theory, we perform classical molecular dynamics simulations of hydrogen plasmas using 2-body quantum statistical potentials and compute both electrical and thermal conductivity from our particle trajectories using the Kubo approach. Our classical Lenard-Balescu results employing the identical statistical potentials agree well with the simulations.
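The Kubo approach referred to extracts transport coefficients from equilibrium current fluctuations; for the electrical conductivity the standard Green-Kubo form is

    \sigma = \frac{1}{3\,V\,k_{B}T} \int_{0}^{\infty} \langle \mathbf{J}(0)\cdot\mathbf{J}(t) \rangle \, dt,

with J the total charge current computed from the particle trajectories, and an analogous integral over heat-current correlations gives the thermal conductivity.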
Information processing, computation, and cognition.
Piccinini, Gualtiero; Scarantino, Andrea
2011-01-01
Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.
Neuromodulated Spike-Timing-Dependent Plasticity, and Theory of Three-Factor Learning Rules
Frémaux, Nicolas; Gerstner, Wulfram
2016-01-01
Classical Hebbian learning puts the emphasis on joint pre- and postsynaptic activity, but neglects the potential role of neuromodulators. Since neuromodulators convey information about novelty or reward, the influence of neuromodulators on synaptic plasticity is useful not just for action learning in classical conditioning, but also to decide “when” to create new memories in response to a flow of sensory stimuli. In this review, we focus on timing requirements for pre- and postsynaptic activity in conjunction with one or several phasic neuromodulatory signals. While the emphasis of the text is on conceptual models and mathematical theories, we also discuss some experimental evidence for neuromodulation of Spike-Timing-Dependent Plasticity. We highlight the importance of synaptic mechanisms in bridging the temporal gap between sensory stimulation and neuromodulatory signals, and develop a framework for a class of neo-Hebbian three-factor learning rules that depend on presynaptic activity, postsynaptic variables as well as the influence of neuromodulators. PMID:26834568
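Generically, the neo-Hebbian three-factor rules reviewed there combine a synaptic eligibility trace with a global neuromodulatory factor; a schematic form (our notation) is

    \frac{de_{ij}}{dt} = -\frac{e_{ij}}{\tau_{e}} + H\big(\mathrm{pre}_{j}, \mathrm{post}_{i}\big), \qquad \frac{dw_{ij}}{dt} = \eta\, M(t)\, e_{ij}(t),

where H is a Hebbian/STDP coincidence term, e_{ij} the eligibility trace that bridges the temporal gap to the neuromodulatory signal M(t), and η a learning rate.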
Symmetry aspects in emergent quantum mechanics
NASA Astrophysics Data System (ADS)
Elze, Hans-Thomas
2009-06-01
We discuss an explicit realization of the dissipative dynamics anticipated in the proof of 't Hooft's existence theorem, which states that 'for any quantum system there exists at least one deterministic model that reproduces all its dynamics after prequantization'. There is an energy-parity symmetry hidden in the Liouville equation, which mimics the Kaplan-Sundrum protective symmetry for the cosmological constant. This symmetry may be broken by the coarse-graining inherent in physics at scales much larger than the Planck length. We correspondingly modify classical ensemble theory by incorporating dissipative fluctuations (information loss) caused by discrete spacetime continually 'measuring' matter. In this way, aspects of quantum mechanics, such as the von Neumann equation, including a Lindblad term, arise dynamically, and expectations of observables agree with the Born rule. However, the resulting quantum coherence is accompanied by an intrinsic decoherence and continuous localization mechanism. Our proposal leads towards a theory that is linear and local at the quantum mechanical level, but the relation to the underlying classical degrees of freedom is nonlocal.
Further Development of an Optimal Design Approach Applied to Axial Magnetic Bearings
NASA Technical Reports Server (NTRS)
Bloodgood, V. Dale, Jr.; Groom, Nelson J.; Britcher, Colin P.
2000-01-01
Classical design methods involved in magnetic bearings and magnetic suspension systems have always had their limitations. Because of this, the overall effectiveness of a design has always relied heavily on the skill and experience of the individual designer. This paper combines two approaches that have been developed to aid the accuracy and efficiency of magnetostatic design. The first approach integrates classical magnetic circuit theory with modern optimization theory to increase design efficiency. The second approach uses loss factors to increase the accuracy of classical magnetic circuit theory. As an example, an axial magnetic thrust bearing is designed for minimum power.
Quantum Bayesian perspective for intelligence reservoir characterization, monitoring and management
NASA Astrophysics Data System (ADS)
Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia; de Jesús Correa, María
2017-10-01
The paper starts with a brief review of the literature about uncertainty in geological, geophysical and petrophysical data. In particular, we present the viewpoints of experts in geophysics on the application of Bayesian inference and subjective probability. Then we present arguments that the use of classical probability theory (CP) does not match completely the structure of geophysical data. We emphasize that such data are characterized by contextuality and non-Kolmogorovness (the impossibility to use the CP model), incompleteness, as well as incompatibility of some geophysical measurements. These characteristics of geophysical data are similar to the characteristics of quantum physical data. Notwithstanding all this, contextuality can be seen as a major deviation of quantum theory from classical physics. In particular, the contextual probability viewpoint is the essence of the Växjö interpretation of quantum mechanics. We propose to use quantum probability (QP) for decision-making during the characterization, modelling, exploring and management of the intelligent hydrocarbon reservoir. Quantum Bayesianism (QBism), one of the recently developed information interpretations of quantum theory, can be used as the interpretational basis for such QP decision-making in geology, geophysics and petroleum projects design and management. This article is part of the themed issue 'Second quantum revolution: foundational questions'.
Measures and applications of quantum correlations
NASA Astrophysics Data System (ADS)
Adesso, Gerardo; Bromley, Thomas R.; Cianciaruso, Marco
2016-11-01
Quantum information theory is built upon the realisation that quantum resources like coherence and entanglement can be exploited for novel or enhanced ways of transmitting and manipulating information, such as quantum cryptography, teleportation, and quantum computing. We now know that there is potentially much more than entanglement behind the power of quantum information processing. There exist more general forms of non-classical correlations, stemming from fundamental principles such as the necessary disturbance induced by a local measurement, or the persistence of quantum coherence in all possible local bases. These signatures can be identified and are resilient in almost all quantum states, and have been linked to the enhanced performance of certain quantum protocols over classical ones in noisy conditions. Their presence represents, among other things, one of the most essential manifestations of quantumness in cooperative systems, from the subatomic to the macroscopic domain. In this work we give an overview of the current quest for a proper understanding and characterisation of the frontier between classical and quantum correlations (QCs) in composite states. We focus on various approaches to define and quantify general QCs, based on different yet interlinked physical perspectives, and comment on the operational significance of the ensuing measures for quantum technology tasks such as information encoding, distribution, discrimination and metrology. We then provide a broader outlook of a few applications in which quantumness beyond entanglement looks fit to play a key role.
Anomalous Quantum Correlations of Squeezed Light
NASA Astrophysics Data System (ADS)
Kühn, B.; Vogel, W.; Mraz, M.; Köhnke, S.; Hage, B.
2017-04-01
Three different noise moments of field strength, intensity, and their correlations are simultaneously measured. For this purpose a homodyne cross-correlation measurement [1] is implemented by superimposing the signal field and a weak local oscillator on an unbalanced beam splitter. The relevant information is obtained via the intensity noise correlation of the output modes. Detection details such as quantum efficiencies or uncorrelated dark noise are irrelevant to our technique. Previously unknown insight into the quantumness of a squeezed signal field is retrieved from the anomalous moment, which correlates field strength with intensity noise. A classical inequality including this moment is violated for almost all signal phases. No prior knowledge of quantum theory is needed, as our analysis is solely based on classical physics.
The Gibbs paradox and the physical criteria for indistinguishability of identical particles
NASA Astrophysics Data System (ADS)
Unnikrishnan, C. S.
2016-08-01
The Gibbs paradox in the context of statistical mechanics addresses the issue of the additivity of the entropy of mixing of gases. The usual discussion attributes the paradoxical situation to the classical distinguishability of identical particles and credits quantum theory for enabling indistinguishability of identical particles to solve the problem. We argue that indistinguishability of identical particles is already a feature in classical mechanics, and this is clearly brought out when the problem is treated in the language of information and associated entropy. We pinpoint the physical criteria for indistinguishability that are crucial for the treatment of Gibbs' problem and the consistency of its solution with conventional thermodynamics. Quantum mechanics provides a quantitative criterion, not possible in the classical picture, for the degree of indistinguishability in terms of the visibility of quantum interference, or the overlap of the states as pointed out by von Neumann, thereby endowing the entropy expression with mathematical continuity and physical reasonableness.
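The information-theoretic reading of the paradox is easy to reproduce in a toy calculation. A minimal sketch, assuming natural units and keeping only the configurational part of the entropy (the parameters are my own illustrative choices): dividing the phase-space volume by N! is exactly what removes the spurious entropy of mixing two samples of the same gas.

```python
import numpy as np

k = 1.0                      # Boltzmann constant in natural units
N, V = 1e23, 1.0

def S_distinguishable(N, V):
    # configurational entropy ~ k ln V^N, no N! correction
    return N * k * np.log(V)

def S_indistinguishable(N, V):
    # k ln (V^N / N!) with Stirling's approximation: N ln(V/N) + N
    return N * k * (np.log(V / N) + 1.0)

for S in (S_distinguishable, S_indistinguishable):
    # merge two equal samples of the SAME gas and look at the entropy change
    dS = S(2 * N, 2 * V) - 2 * S(N, V)
    print(S.__name__, dS / (N * k))    # 2 ln 2 (paradox) vs 0 (additive)
```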
On classical de Sitter and Minkowski solutions with intersecting branes
NASA Astrophysics Data System (ADS)
Andriot, David
2018-03-01
Motivated by the connection of string theory to cosmology or particle physics, we study solutions of type II supergravities having a four-dimensional de Sitter or Minkowski space-time, with intersecting Dp-branes and orientifold Op-planes. Only a few such solutions are known, and we aim at a better characterisation. Modulo a few restrictions, we prove that there exists no classical de Sitter solution for any combination of D3/O3 and D7/O7, while we derive interesting constraints for intersecting D5/O5 or D6/O6, or combinations of D4/O4 and D8/O8. Concerning classical Minkowski solutions, we understand some typical features, and propose a solution ansatz. Overall, a central piece of information appears to be the way intersecting Dp-branes and Op-planes overlap each other, a point we focus on.
Ethical and Stylistic Implications in Delivering Conference Papers.
ERIC Educational Resources Information Center
Enos, Theresa
1986-01-01
Analyzes shortcomings of conference papers intended for the eye rather than the ear. Referring to classical oratory, speech act theory, and cognitive theory, recommends revising papers for oral presentation by using classical disposition; deductive rather than inductive argument; formulaic repetition of words and phrases; non-inverted clause…
Quantum theory for 1D X-ray free electron laser
NASA Astrophysics Data System (ADS)
Anisimov, Petr M.
2018-06-01
Classical 1D X-ray Free Electron Laser (X-ray FEL) theory has stood the test of time by guiding FEL design and development prior to any full-scale analysis. Future X-ray FELs and inverse-Compton sources, where photon recoil approaches an electron energy spread value, push the classical theory to its limits of applicability. After substantial efforts by the community to find what those limits are, there is no universally agreed upon quantum approach to design and development of future X-ray sources. We offer a new approach to formulate the quantum theory for 1D X-ray FELs that has an obvious connection to the classical theory, which allows for immediate transfer of knowledge between the two regimes. We exploit this connection in order to draw quantum mechanical conclusions about the quantum nature of electrons and generated radiation in terms of FEL variables.
Quantum Locality in Game Strategy
NASA Astrophysics Data System (ADS)
Melo-Luna, Carlos A.; Susa, Cristian E.; Ducuara, Andrés F.; Barreiro, Astrid; Reina, John H.
2017-03-01
Game theory is a well established branch of mathematics whose formalism has a vast range of applications from the social sciences, biology, to economics. Motivated by quantum information science, there has been a leap in the formulation of novel game strategies that lead to new (quantum Nash) equilibrium points whereby players in some classical games are always outperformed if sharing and processing joint information ruled by the laws of quantum physics is allowed. We show that, for a bipartite non-zero-sum game, input local quantum correlations, and separable states in particular, suffice to achieve an advantage over any strategy that uses classical resources, thus dispensing with quantum nonlocality, entanglement, or even discord between the players' input states. This highlights the remarkable key role played by pure quantum coherence in powering some protocols. Finally, we propose an experiment that uses separable states and basic photon interferometry to demonstrate the locally-correlated quantum advantage.
Quantum Locality in Game Strategy
Melo-Luna, Carlos A.; Susa, Cristian E.; Ducuara, Andrés F.; Barreiro, Astrid; Reina, John H.
2017-01-01
Game theory is a well established branch of mathematics whose formalism has a vast range of applications from the social sciences, biology, to economics. Motivated by quantum information science, there has been a leap in the formulation of novel game strategies that lead to new (quantum Nash) equilibrium points whereby players in some classical games are always outperformed if sharing and processing joint information ruled by the laws of quantum physics is allowed. We show that, for a bipartite non-zero-sum game, input local quantum correlations, and separable states in particular, suffice to achieve an advantage over any strategy that uses classical resources, thus dispensing with quantum nonlocality, entanglement, or even discord between the players' input states. This highlights the remarkable key role played by pure quantum coherence in powering some protocols. Finally, we propose an experiment that uses separable states and basic photon interferometry to demonstrate the locally-correlated quantum advantage. PMID:28327567
Plasmon mass scale and quantum fluctuations of classical fields on a real time lattice
NASA Astrophysics Data System (ADS)
Kurkela, Aleksi; Lappi, Tuomas; Peuron, Jarkko
2018-03-01
Classical real-time lattice simulations play an important role in understanding non-equilibrium phenomena in gauge theories and are used in particular to model the prethermal evolution of heavy-ion collisions. Above the Debye scale the classical Yang-Mills (CYM) theory can be matched smoothly to kinetic theory. First we study the limits of the quasiparticle picture of the CYM fields by determining the plasmon mass of the system using three different methods. Then we argue that one needs a numerical calculation of a system of classical gauge fields and small linearized fluctuations, which correspond to quantum fluctuations, in a way that keeps the separation between the two manifest. We demonstrate and test an implementation of an algorithm with linearized fluctuations, showing that the linearization indeed works and that Gauss's law is conserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nomura, Yasunori; Salzetta, Nico; Sanches, Fabio
We study the Hilbert space structure of classical spacetimes under the assumption that entanglement in holographic theories determines semiclassical geometry. We show that this simple assumption has profound implications; for example, a superposition of classical spacetimes may lead to another classical spacetime. Despite its unconventional nature, this picture admits the standard interpretation of superpositions of well-defined semiclassical spacetimes in the limit that the number of holographic degrees of freedom becomes large. We illustrate these ideas using a model for the holographic theory of cosmological spacetimes.
Classical theory of radiating strings
NASA Technical Reports Server (NTRS)
Copeland, Edmund J.; Haws, D.; Hindmarsh, M.
1990-01-01
The divergent part of the self-force of a radiating string coupled to gravity, an antisymmetric tensor and a dilaton in four dimensions is calculated to first order in classical perturbation theory. While this divergence can be absorbed into a renormalization of the string tension, demanding that both it and the divergence in the energy momentum tensor vanish forces the string to have the couplings of compactified N = 1 D = 10 supergravity. In effect, supersymmetry cures the classical infinities.
Emergence of a classical Universe from quantum gravity and cosmology.
Kiefer, Claus
2012-09-28
I describe how we can understand the classical appearance of our world from a universal quantum theory. The essential ingredient is the process of decoherence. I start with a general discussion in ordinary quantum theory and then turn to quantum gravity and quantum cosmology. There is a whole hierarchy of classicality from the global gravitational field to the fluctuations in the cosmic microwave background, which serve as the seeds for the structure in the Universe.
Robust bidirectional links for photonic quantum networks
Xu, Jin-Shi; Yung, Man-Hong; Xu, Xiao-Ye; Tang, Jian-Shun; Li, Chuan-Feng; Guo, Guang-Can
2016-01-01
Optical fibers are widely used as one of the main tools for transmitting not only classical but also quantum information. We propose and report an experimental realization of a promising method for creating robust bidirectional quantum communication links through paired optical polarization-maintaining fibers. Many limitations of existing protocols can be avoided with the proposed method. In particular, the path and polarization degrees of freedom are combined to deterministically create a photonic decoherence-free subspace without the need for any ancillary photon. This method is input state–independent, robust against dephasing noise, postselection-free, and applicable bidirectionally. To rigorously quantify the amount of quantum information transferred, the optical fibers are analyzed with the tools developed in quantum communication theory. These results not only suggest a practical means for protecting quantum information sent through optical quantum networks but also potentially provide a new physical platform for enriching the structure of the quantum communication theory. PMID:26824069
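The decoherence-free subspace at the heart of this scheme is simple to demonstrate numerically. The toy model below is my own illustration of the general principle, not a simulation of the authors' fiber setup: a logical state encoded in {|01>, |10>} only acquires a global phase under collective dephasing, while a state outside the subspace loses coherence.

```python
import numpy as np

def collective_dephasing(phi):
    # each qubit's |1> picks up exp(i*phi); diagonal over |00>,|01>,|10>,|11>
    return np.diag(np.exp(1j * phi * np.array([0, 1, 1, 2])))

dfs = np.array([0, 1, 1, 0]) / np.sqrt(2)    # (|01> + |10>)/sqrt(2), in the DFS
bad = np.array([1, 0, 0, 1]) / np.sqrt(2)    # (|00> + |11>)/sqrt(2), outside it

rng = np.random.default_rng(0)
for psi in (dfs, bad):
    # average fidelity after a random collective phase kick on both qubits
    fid = np.mean([abs(np.vdot(psi, collective_dephasing(p) @ psi)) ** 2
                   for p in rng.uniform(0, 2 * np.pi, 2000)])
    print(round(fid, 3))                     # ~1.0 for the DFS state, ~0.5 outside
```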
Data Structures in Natural Computing: Databases as Weak or Strong Anticipatory Systems
NASA Astrophysics Data System (ADS)
Rossiter, B. N.; Heather, M. A.
2004-08-01
Information systems anticipate the real world. Classical databases store, organise and search collections of data of that real world, but only as weak anticipatory information systems. This is because of the reductionism and normalisation needed to map the structuralism of natural data on to idealised machines with von Neumann architectures consisting of fixed instructions. Category theory, developed as a formalism to explore the theoretical concept of naturality, shows that methods like sketches, arising from graph theory as only non-natural models of naturality, cannot capture real-world structures for strong anticipatory information systems. Databases need a schema of the natural world. Natural computing databases need the schema itself to be also natural. Natural computing methods, including neural computers, evolutionary automata, molecular and nanocomputing and quantum computation, have the potential to be strong. At present they are mainly at the stage of weak anticipatory systems.
Classical gluon and graviton radiation from the bi-adjoint scalar double copy
NASA Astrophysics Data System (ADS)
Goldberger, Walter D.; Prabhu, Siddharth G.; Thompson, Jedidiah O.
2017-09-01
We find double-copy relations between classical radiating solutions in Yang-Mills theory coupled to dynamical color charges and their counterparts in a cubic bi-adjoint scalar field theory which interacts linearly with particles carrying bi-adjoint charge. The particular color-to-kinematics replacements we employ are motivated by the Bern-Carrasco-Johansson double-copy correspondence for on-shell amplitudes in gauge and gravity theories. They are identical to those recently used to establish relations between classical radiating solutions in gauge theory and in dilaton gravity. Our explicit bi-adjoint solutions are constructed to second order in a perturbative expansion, and map under the double copy onto gauge theory solutions which involve at most cubic gluon self-interactions. If the correspondence is found to persist to higher orders in perturbation theory, our results suggest the possibility of calculating gravitational radiation from colliding compact objects, directly from a scalar field with vastly simpler (purely cubic) Feynman vertices.
Combinatorial Market Processing for Multilateral Coordination
2005-09-01
In the classical auction theory literature, most of the attention is focused on one-sided, single-item auctions [86]. There is now a growing body of...
From Data to Semantic Information
NASA Astrophysics Data System (ADS)
Floridi, Luciano
2003-06-01
There is no consensus yet on the definition of semantic information. This paper contributes to the current debate by criticising and revising the Standard Definition of semantic Information (SDI) as meaningful data, in favour of the Dretske-Grice approach: meaningful and well-formed data constitute semantic information only if they also qualify as contingently truthful. After a brief introduction, SDI is criticised for providing necessary but insufficient conditions for the definition of semantic information. SDI is incorrect because truth-values do not supervene on semantic information, and misinformation (that is, false semantic information) is not a type of semantic information, but pseudo-information, that is not semantic information at all. This is shown by arguing that none of the reasons for interpreting misinformation as a type of semantic information is convincing, whilst there are compelling reasons to treat it as pseudo-information. As a consequence, SDI is revised to include a necessary truth-condition. The last section summarises the main results of the paper and indicates the important implications of the revised definition for the analysis of the deflationary theories of truth, the standard definition of knowledge and the classic, quantitative theory of semantic information.
ERIC Educational Resources Information Center
Boyer, Timothy H.
1985-01-01
The classical vacuum of physics is not empty, but contains a distinctive pattern of electromagnetic fields. Discovery of the vacuum, thermal spectrum, classical electron theory, zero-point spectrum, and effects of acceleration are discussed. Connection between thermal radiation and the classical vacuum reveals unexpected unity in the laws of…
Weak Measurement and Quantum Smoothing of a Superconducting Qubit
NASA Astrophysics Data System (ADS)
Tan, Dian
In quantum mechanics, the measurement outcome of an observable in a quantum system is intrinsically random, yielding a probability distribution. The state of the quantum system can be described by a density matrix ρ(t), which depends on the information accumulated until time t, and represents our knowledge about the system. The density matrix ρ(t) gives probabilities for the outcomes of measurements at time t. Further probing of the quantum system allows us to refine our prediction in hindsight. In this thesis, we experimentally examine a quantum smoothing theory in a superconducting qubit by introducing an auxiliary matrix E(t) which is conditioned on information obtained from time t to a final time T. With the complete information before and after time t, the pair of matrices [ρ(t), E(t)] can be used to make smoothed predictions for the measurement outcome at time t. We apply the quantum smoothing theory in the case of continuous weak measurement, unveiling the retrodicted quantum trajectories and weak values. In the case of strong projective measurement, while the density matrix ρ(t) with only diagonal elements in a given basis |n〉 may be treated as a classical mixture, we demonstrate a failure of this classical mixture description in determining the smoothed probabilities for the measurement outcome at time t with both diagonal ρ(t) and diagonal E(t). We study the correlations between quantum states and weak measurement signals and examine aspects of the time symmetry of continuous quantum measurement. We also extend our study of quantum smoothing theory to the case of resonance fluorescence of a superconducting qubit with homodyne measurement and observe some interesting effects such as the modification of the excited state probabilities, weak values, and evolution of the predicted and retrodicted trajectories.
An Overview of Judgment and Decision Making Research Through the Lens of Fuzzy Trace Theory.
Setton, Roni; Wilhelms, Evan; Weldon, Becky; Chick, Christina; Reyna, Valerie
2014-12-01
We present the basic tenets of fuzzy trace theory, a comprehensive theory of memory, judgment, and decision making that is grounded in research on how information is stored as knowledge, mentally represented, retrieved from storage, and processed. In doing so, we highlight how it is distinguished from traditional models of decision making in that gist reasoning plays a central role. The theory also distinguishes advanced intuition from primitive impulsivity. It predicts that different sorts of errors occur with respect to each component of judgment and decision making: background knowledge, representation, retrieval, and processing. Classic errors in the judgment and decision making literature, such as risky-choice framing and the conjunction fallacy, are accounted for by fuzzy trace theory and new results generated by the theory contradict traditional approaches. We also describe how developmental changes in brain and behavior offer crucial insight into adult cognitive processing. Research investigating brain and behavior in developing and special populations supports fuzzy trace theory's predictions about reliance on gist processing.
Buckling of pressure-loaded, long, shear deformable, cylindrical laminated shells
NASA Astrophysics Data System (ADS)
Anastasiadis, John S.; Simitses, George J.
A higher-order shell theory was developed (kinematic relations, constitutive relations, equilibrium equations and boundary conditions), which includes initial geometric imperfections and transverse shear effects for a laminated cylindrical shell under the action of pressure, axial compression and in-plane shear. Through the perturbation technique, buckling equations are derived for the corresponding 'perfect geometry' symmetric laminated configuration. Critical pressures are computed for very long cylinders for several stacking sequences, several radius-to-total-thickness ratios, three lamina materials (boron/epoxy, graphite/epoxy, and Kevlar/epoxy), and three shell theories: classical, first-order shear deformable and higher- (third-)order shear deformable. The results provide valuable information concerning the applicability (accurate prediction of buckling pressures) of the various shell theories.
Bayesian theories of conditioning in a changing world.
Courville, Aaron C; Daw, Nathaniel D; Touretzky, David S
2006-07-01
The recent flowering of Bayesian approaches invites the re-examination of classic issues in behavior, even in areas as venerable as Pavlovian conditioning. A statistical account can offer a new, principled interpretation of behavior, and previous experiments and theories can inform many unexplored aspects of the Bayesian enterprise. Here we consider one such issue: the finding that surprising events provoke animals to learn faster. We suggest that, in a statistical account of conditioning, surprise signals change, and therefore uncertainty and the need for new learning. We discuss inference in a world that changes and show how experimental results involving surprise can be interpreted from this perspective, and also how, thus understood, these phenomena help constrain statistical theories of animal and human learning.
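The core idea, that surprise signals change and should therefore raise the learning rate, fits in a few lines of a Kalman-style tracker. This sketch is my own construction under arbitrary illustrative parameters (noise variances, change size, surprise threshold), not a model taken from the review.

```python
import numpy as np

rng = np.random.default_rng(1)

# latent reward rate that jumps halfway through; unit observation noise
true_mu = np.concatenate([np.zeros(50), 3.0 * np.ones(50)])
obs = true_mu + rng.normal(0.0, 1.0, size=true_mu.size)

mu, var = 0.0, 1.0
sigma_o2, q = 1.0, 0.01            # observation noise; slow drift variance
for y in obs:
    var += q                       # the world may have changed a little
    surprise = (y - mu) ** 2 / (var + sigma_o2)
    if surprise > 4.0:             # a very surprising outcome ...
        var += 5.0                 # ... signals change, inflating uncertainty
    gain = var / (var + sigma_o2)  # learning rate grows with uncertainty
    mu += gain * (y - mu)
    var *= 1.0 - gain
print(round(mu, 2))                # tracker has re-learned the new rate (~3)
```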
Random walk in generalized quantum theory
NASA Astrophysics Data System (ADS)
Martin, Xavier; O'Connor, Denjoe; Sorkin, Rafael D.
2005-01-01
One can view quantum mechanics as a generalization of classical probability theory that provides for pairwise interference among alternatives. Adopting this perspective, we “quantize” the classical random walk by finding, subject to a certain condition of “strong positivity”, the most general Markovian, translationally invariant “decoherence functional” with nearest neighbor transitions.
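For contrast, the standard coined (Hadamard) quantum walk can be set next to its classical counterpart in a few lines. This is a generic illustration of pairwise interference among alternatives, not the decoherence-functional construction of the paper; the ballistic versus diffusive spreading shows up directly in the standard deviations.

```python
import numpy as np

steps, size = 50, 201
mid = size // 2
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)           # Hadamard coin

psi = np.zeros((size, 2), dtype=complex)               # psi[x, c]: amplitude at x, coin c
psi[mid, 0] = 1.0
for _ in range(steps):
    psi = psi @ H.T                                    # toss the quantum coin
    new = np.zeros_like(psi)
    new[1:, 0], new[:-1, 1] = psi[:-1, 0], psi[1:, 1]  # shift right/left by coin state
    psi = new
quantum = (np.abs(psi) ** 2).sum(axis=1)

p = np.zeros(size)                                     # classical unbiased walk
p[mid] = 1.0
for _ in range(steps):
    p = 0.5 * (np.roll(p, 1) + np.roll(p, -1))

x = np.arange(size) - mid
for label, dist in (("quantum", quantum), ("classical", p)):
    print(label, round(float(np.sqrt(np.sum(dist * x ** 2))), 1))
# the quantum walk spreads linearly in time, the classical one only as sqrt(t)
```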
Antoneli, Fernando; Ferreira, Renata C; Briones, Marcelo R S
2016-06-01
Here we propose a new approach to modeling gene expression based on the theory of random dynamical systems (RDS) that provides a general coupling prescription between the nodes of any given regulatory network given the dynamics of each node is modeled by a RDS. The main virtues of this approach are the following: (i) it provides a natural way to obtain arbitrarily large networks by coupling together simple basic pieces, thus revealing the modularity of regulatory networks; (ii) the assumptions about the stochastic processes used in the modeling are fairly general, in the sense that the only requirement is stationarity; (iii) there is a well-developed mathematical theory, which is a blend of smooth dynamical systems theory, ergodic theory and stochastic analysis, that allows one to extract relevant dynamical and statistical information without solving the system; (iv) one may obtain the classical rate equations from the corresponding stochastic version by averaging the dynamic random variables (small noise limit). It is important to emphasize that unlike the deterministic case, where coupling two equations is a trivial matter, coupling two RDS is non-trivial, especially in our case, where the coupling is performed between a state variable of one gene and the switching stochastic process of another gene and, hence, it is not a priori true that the resulting coupled system will satisfy the definition of a random dynamical system. We shall provide the necessary arguments that ensure that our coupling prescription does indeed furnish a coupled regulatory network of random dynamical systems. Finally, the fact that classical rate equations are the small noise limit of our stochastic model ensures that any validation or prediction made on the basis of the classical theory is also a validation or prediction of our model. We illustrate our framework with some simple examples of single-gene systems and network motifs. Copyright © 2016 Elsevier Inc. All rights reserved.
Neo-classical theory of competition or Adam Smith's hand as mathematized ideology
NASA Astrophysics Data System (ADS)
McCauley, Joseph L.
2001-10-01
Orthodox economic theory (utility maximization, rational agents, efficient markets in equilibrium) is based on arbitrarily postulated, nonempiric notions. The disagreement between economic reality and a key feature of neo-classical economic theory was criticized empirically by Osborne. I show that the orthodox theory is internally self-inconsistent for the very reason suggested by Osborne: lack of invertibility of demand and supply as functions of price to obtain price as functions of supply and demand. The reason for the noninvertibility arises from nonintegrable excess demand dynamics, a feature of their theory completely ignored by economists.
Information flow dynamics in the brain
NASA Astrophysics Data System (ADS)
Rabinovich, Mikhail I.; Afraimovich, Valentin S.; Bick, Christian; Varona, Pablo
2012-03-01
Timing and dynamics of information in the brain is a hot field in modern neuroscience. The analysis of the temporal evolution of brain information is crucially important for the understanding of higher cognitive mechanisms in normal and pathological states. From the perspective of information dynamics, in this review we discuss working memory capacity, language dynamics, goal-dependent behavior programming and other functions of brain activity. In contrast with the classical description of information theory, which is mostly algebraic, brain flow information dynamics deals with problems such as the stability/instability of information flows, their quality, the timing of sequential processing, the top-down cognitive control of perceptual information, and information creation. In this framework, different types of information flow instabilities correspond to different cognitive disorders. On the other hand, the robustness of cognitive activity is related to the control of the information flow stability. We discuss these problems using both experimental and theoretical approaches, and we argue that brain activity is better understood considering information flows in the phase space of the corresponding dynamical model. In particular, we show how theory helps to understand intriguing experimental results in this matter, and how recent knowledge inspires new theoretical formalisms that can be tested with modern experimental techniques.
A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sinitskiy, Anton V.; Voth, Gregory A., E-mail: gavoth@uchicago.edu
2015-09-07
Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman's imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representations of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments.
A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals.
Sinitskiy, Anton V; Voth, Gregory A
2015-09-07
Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman's imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representations of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Banik, Manik, E-mail: manik11ju@gmail.com
Steering is one of the most counterintuitive non-classical features of bipartite quantum systems, first noticed by Schrödinger in the early days of quantum theory. On the other hand, measurement incompatibility is another non-classical feature of quantum theory, initially pointed out by Bohr. Recently, Quintino et al. [Phys. Rev. Lett. 113, 160402 (2014)] and Uola et al. [Phys. Rev. Lett. 113, 160403 (2014)] have investigated the relation between these two distinct non-classical features. They have shown that a set of measurements is not jointly measurable (i.e., incompatible) if and only if they can be used for demonstrating Schrödinger-Einstein-Podolsky-Rosen steering. The concept of steering has been generalized for more general abstract tensor product theories rather than just Hilbert space quantum mechanics. In this article, we discuss that the notion of measurement incompatibility can be extended for general probability theories. Further, we show that the connection between steering and measurement incompatibility holds in a broader class of tensor product theories rather than just quantum theory.
What is Quantum Mechanics? A Minimal Formulation
NASA Astrophysics Data System (ADS)
Friedberg, R.; Hohenberg, P. C.
2018-03-01
This paper presents a minimal formulation of nonrelativistic quantum mechanics, by which is meant a formulation which describes the theory in a succinct, self-contained, clear, unambiguous and of course correct manner. The bulk of the presentation is the so-called "microscopic theory", applicable to any closed system S of arbitrary size N, using concepts referring to S alone, without resort to external apparatus or external agents. An example of a similar minimal microscopic theory is the standard formulation of classical mechanics, which serves as the template for a minimal quantum theory. The only substantive assumption required is the replacement of the classical Euclidean phase space by Hilbert space in the quantum case, with the attendant all-important phenomenon of quantum incompatibility. Two fundamental theorems of Hilbert space, the Kochen-Specker-Bell theorem and Gleason's theorem, then lead inevitably to the well-known Born probability rule. For both classical and quantum mechanics, questions of physical implementation and experimental verification of the predictions of the theories are the domain of the macroscopic theory, which is argued to be a special case or application of the more general microscopic theory.
Theory of mind deficit in adult patients with congenital heart disease.
Chiavarino, Claudia; Bianchino, Claudia; Brach-Prever, Silvia; Riggi, Chiara; Palumbo, Luigi; Bara, Bruno G; Bosco, Francesca M
2015-10-01
This article provides the first assessment of theory of mind, that is, the ability to reason about mental states, in adult patients with congenital heart disease. Patients with congenital heart disease and matched healthy controls were administered classical theory of mind tasks and a semi-structured interview which provides a multidimensional evaluation of theory of mind (Theory of Mind Assessment Scale). The patients with congenital heart disease performed worse than the controls on the Theory of Mind Assessment Scale, whereas they did as well as the control group on the classical theory-of-mind tasks. These findings provide the first evidence that adults with congenital heart disease may display specific impairments in theory of mind. © The Author(s) 2013.
Classical Physics and the Bounds of Quantum Correlations.
Frustaglia, Diego; Baltanás, José P; Velázquez-Ahumada, María C; Fernández-Prieto, Armando; Lujambio, Aintzane; Losada, Vicente; Freire, Manuel J; Cabello, Adán
2016-06-24
A unifying principle explaining the numerical bounds of quantum correlations remains elusive, despite the efforts devoted to identifying it. Here, we show that these bounds are indeed not exclusive to quantum theory: for any abstract correlation scenario with compatible measurements, models based on classical waves produce probability distributions indistinguishable from those of quantum theory and, therefore, share the same bounds. We demonstrate this finding by implementing classical microwaves that propagate along meter-size transmission-line circuits and reproduce the probabilities of three emblematic quantum experiments. Our results show that the "quantum" bounds would also occur in a classical universe without quanta. The implications of this observation are discussed.
Controlling the Transport of an Ion: Classical and Quantum Mechanical Solutions
2014-07-09
...quantum systems: tools, achievements, and limitations (Christiane P Koch)... Shortcuts to adiabaticity for an ion in a rotating radially-tight trap (M Palmero)... Keywords: coherent control, ion traps, quantum information, optimal control theory... Control methods are key enabling techniques in many... Feasibility analysis of quantum optimal control: numerical optimization of the wavepacket motion is expected to become necessary once
Open or closed? Dirac, Heisenberg, and the relation between classical and quantum mechanics
NASA Astrophysics Data System (ADS)
Bokulich, Alisa
2004-09-01
This paper describes a long-standing, though little known, debate between Dirac and Heisenberg over the nature of scientific methodology, theory change, and intertheoretic relations. Following Heisenberg's terminology, their disagreements can be summarized as a debate over whether the classical and quantum theories are "open" or "closed." A close examination of this debate sheds new light on the philosophical views of two of the great founders of quantum theory.
The role of a posteriori mathematics in physics
NASA Astrophysics Data System (ADS)
MacKinnon, Edward
2018-05-01
The calculus that co-evolved with classical mechanics relied on definitions of functions and differentials that accommodated physical intuitions. In the early nineteenth century mathematicians began the rigorous reformulation of calculus and eventually succeeded in putting almost all of mathematics on a set-theoretic foundation. Physicists traditionally ignore this rigorous mathematics. Physicists often rely on a posteriori math, a practice of using physical considerations to determine mathematical formulations. This is illustrated by examples from classical and quantum physics. A justification of such practice stems from a consideration of the role of phenomenological theories in classical physics and effective theories in contemporary physics. This relates to the larger question of how physical theories should be interpreted.
NASA Technical Reports Server (NTRS)
Paquette, John A.; Nuth, Joseph A., III
2011-01-01
Classical nucleation theory has been used in models of dust nucleation in circumstellar outflows around oxygen-rich asymptotic giant branch stars. One objection to the application of classical nucleation theory (CNT) to astrophysical systems of this sort is that an equilibrium distribution of clusters (assumed by CNT) is unlikely to exist in such conditions due to a low collision rate of condensable species. A model of silicate grain nucleation and growth was modified to evaluate the effect of a nucleation flux orders of magnitude below the equilibrium value. The results show that a lack of chemical equilibrium has only a small effect on the ultimate grain distribution.
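For reference, the classical-nucleation-theory quantities that such models start from take only a few lines to evaluate. The parameter values below are hypothetical placeholders of mine, not the silicate parameters of the paper; the point is how steeply the steady-state rate is suppressed by the barrier.

```python
import numpy as np

k = 1.380649e-23      # Boltzmann constant, J/K

# hypothetical condensate parameters, for illustration only
sigma = 0.5           # surface tension, J/m^2
v = 3.0e-29           # monomer volume, m^3
T = 1000.0            # gas temperature, K
S = 10.0              # supersaturation ratio

lnS = np.log(S)
r_crit = 2.0 * sigma * v / (k * T * lnS)       # critical cluster radius
dG_crit = 16.0 * np.pi * sigma**3 * v**2 \
          / (3.0 * (k * T * lnS)**2)           # nucleation barrier

print(r_crit)                      # ~1 nm critical radius
print(dG_crit / (k * T))           # barrier height in units of kT
print(np.exp(-dG_crit / (k * T)))  # CNT rate relative to the kinetic prefactor
```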
NASA Astrophysics Data System (ADS)
Tang, Jian-Shun; Wang, Yi-Tao; Yu, Shang; He, De-Yong; Xu, Jin-Shi; Liu, Bi-Heng; Chen, Geng; Sun, Yong-Nan; Sun, Kai; Han, Yong-Jian; Li, Chuan-Feng; Guo, Guang-Can
2016-10-01
The experimental progress achieved in parity-time (PT) symmetry in classical optics is the most important accomplishment in the past decade and stimulates many new applications, such as unidirectional light transport and single-mode lasers. However, in the quantum regime, some controversial effects are proposed for PT-symmetric theory, for example, the potential violation of the no-signalling principle. It is therefore important to understand whether PT-symmetric theory is consistent with well-established principles. Here, we experimentally study this no-signalling problem related to the PT-symmetric theory using two space-like separated entangled photons, with one of them passing through a post-selected quantum gate, which effectively simulates a PT-symmetric evolution. Our results suggest that superluminal information transmission can be simulated when the successfully PT-symmetrically evolved subspace is solely considered. However, since this subspace is only a part of the full Hermitian system, additional information regarding whether the PT-symmetric evolution is successful is necessary; this information travels to the receiver at most at the speed of light, maintaining the no-signalling principle.
NASA Astrophysics Data System (ADS)
Tang, Jian-Shun; Wang, Yi-Tao; Han, Yong-Jian; Li, Chuan-Feng; Guo, Guang-Can
The experimental progress achieved in parity-time (PT) symmetry in classical optics is the most important accomplishment in the past decade and stimulates many new applications, such as unidirectional light transport and single-mode lasers. However, in the quantum regime, some controversial effects are proposed for PT-symmetric theory, for example, the potential violation of the no-signalling principle. It is therefore important to understand whether PT-symmetric theory is consistent with well-established principles. Here, we experimentally study this no-signalling problem related to the PT-symmetric theory using two space-like separated entangled photons, with one of them passing through a post-selected quantum gate, which effectively simulates a PT-symmetric evolution. Our results suggest that superluminal information transmission can be simulated when the successfully PT-symmetrically evolved subspace is solely considered. However, since this subspace is only a part of the full Hermitian system, additional information regarding whether the PT-symmetric evolution is successful is necessary; this information travels to the receiver at most at the speed of light, maintaining the no-signalling principle.
Research on the Fusion of Dependent Evidence Based on Rank Correlation Coefficient.
Shi, Fengjian; Su, Xiaoyan; Qian, Hong; Yang, Ning; Han, Wenhua
2017-10-16
In order to meet the higher accuracy and system reliability requirements, the information fusion for multi-sensor systems is an increasing concern. Dempster-Shafer evidence theory (D-S theory) has been investigated for many applications in multi-sensor information fusion due to its flexibility in uncertainty modeling. However, classical evidence theory assumes that the evidence is independent of each other, which is often unrealistic. Ignoring the relationship between the evidence may lead to unreasonable fusion results, and even lead to wrong decisions. This assumption severely prevents D-S evidence theory from practical application and further development. In this paper, an innovative evidence fusion model to deal with dependent evidence based on rank correlation coefficient is proposed. The model first uses rank correlation coefficient to measure the dependence degree between different evidence. Then, total discount coefficient is obtained based on the dependence degree, which also considers the impact of the reliability of evidence. Finally, the discount evidence fusion model is presented. An example is illustrated to show the use and effectiveness of the proposed method.
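A minimal sketch of the machinery involved, assuming a fixed discount factor alpha in place of the paper's rank-correlation-derived total discount coefficient (whose formula is not reproduced here): Shafer's classical discounting followed by Dempster's rule of combination.

```python
from itertools import product

THETA = frozenset({"A", "B", "C"})    # frame of discernment

def discount(m, alpha):
    """Shafer discounting: scale masses by alpha, move the rest to Theta."""
    out = {s: alpha * v for s, v in m.items()}
    out[THETA] = out.get(THETA, 0.0) + (1.0 - alpha)
    return out

def dempster(m1, m2):
    """Dempster's rule of combination with conflict renormalisation."""
    combined, conflict = {}, 0.0
    for (s1, v1), (s2, v2) in product(m1.items(), m2.items()):
        inter = s1 & s2
        if inter:
            combined[inter] = combined.get(inter, 0.0) + v1 * v2
        else:
            conflict += v1 * v2
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

m1 = {frozenset({"A"}): 0.7, frozenset({"A", "B"}): 0.3}
m2 = {frozenset({"A"}): 0.6, frozenset({"B"}): 0.4}

# discount the second body of evidence, e.g. because it depends on the first
fused = dempster(m1, discount(m2, alpha=0.5))
print(fused)
```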
Testing the Quantum-Classical Boundary and Dimensionality of Quantum Systems
NASA Astrophysics Data System (ADS)
Shun, Poh Hou
Quantum theory introduces a cut between the observer and the observed system [1], but does not provide a definition of what is an observer [2]. Based on an informational definition of the observer, Grinbaum has recently [3] predicted an upper bound on bipartite correlations in the Clauser-Horne-Shimony-Holt (CHSH) Bell scenario equal to 2.82537, which is slightly smaller than the Tsirelson bound [4] of standard quantum theory, but is consistent with all the available experimental results [5-17]. Not being able to exceed Grinbaum's limit would support the idea that quantum theory is only an effective description of a more fundamental theory and would have a deep impact in physics and quantum information processing. In this thesis, we present a test of the CHSH inequality on photon pairs in maximally entangled states of polarization in which a value of 2.8276 +/- 0.00082 is observed, violating Grinbaum's bound by 2.72 standard deviations and providing the smallest distance with respect to Tsirelson's bound ever reported, namely, 0.0008 +/- 0.00082. (Abstract shortened by UMI.)
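The quantum prediction under test is straightforward to reproduce. A short sketch computing the CHSH value of a maximally entangled state at the standard optimal angles, which lands on the Tsirelson bound that the measured 2.8276 +/- 0.00082 is compared against:

```python
import numpy as np

Z = np.array([[1, 0], [0, -1]])
X = np.array([[0, 1], [1, 0]])
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)       # (|00> + |11>)/sqrt(2)

def A(theta):
    # spin observable in the Z-X plane at angle theta
    return np.cos(theta) * Z + np.sin(theta) * X

def E(a, b):
    # correlation <A(a) x A(b)> in the entangled state
    return float(np.real(phi_plus @ np.kron(A(a), A(b)) @ phi_plus))

a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
S = E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp)
print(S, 2 * np.sqrt(2))   # 2.8284... = Tsirelson bound > Grinbaum's 2.82537
```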
Research on the Fusion of Dependent Evidence Based on Rank Correlation Coefficient
Su, Xiaoyan; Qian, Hong; Yang, Ning; Han, Wenhua
2017-01-01
In order to meet the higher accuracy and system reliability requirements, the information fusion for multi-sensor systems is an increasing concern. Dempster–Shafer evidence theory (D–S theory) has been investigated for many applications in multi-sensor information fusion due to its flexibility in uncertainty modeling. However, classical evidence theory assumes that the evidence is independent of each other, which is often unrealistic. Ignoring the relationship between the evidence may lead to unreasonable fusion results, and even lead to wrong decisions. This assumption severely prevents D–S evidence theory from practical application and further development. In this paper, an innovative evidence fusion model to deal with dependent evidence based on rank correlation coefficient is proposed. The model first uses rank correlation coefficient to measure the dependence degree between different evidence. Then, total discount coefficient is obtained based on the dependence degree, which also considers the impact of the reliability of evidence. Finally, the discount evidence fusion model is presented. An example is illustrated to show the use and effectiveness of the proposed method. PMID:29035341
S-Duality, Deconstruction and Confinement for a Marginal Deformation of N=4 SUSY Yang-Mills
NASA Astrophysics Data System (ADS)
Dorey, Nick
2004-08-01
We study an exactly marginal deformation of N = 4 SUSY Yang-Mills with gauge group U(N) using field theory and string theory methods. The classical theory has a Higgs branch for rational values of the deformation parameter. We argue that the quantum theory also has an S-dual confining branch which cannot be seen classically. The low-energy effective theory on these branches is a six-dimensional non-commutative gauge theory with sixteen supercharges. Confinement of magnetic and electric charges, on the Higgs and confining branches respectively, occurs due to the formation of BPS-saturated strings in the low energy theory. The results also suggest a new way of deconstructing Little String Theory as a large-N limit of a confining gauge theory in four dimensions.
High-pressure phase transitions - Examples of classical predictability
NASA Astrophysics Data System (ADS)
Celebonovic, Vladan
1992-09-01
The applicability of the Savic and Kasanin (1962-1967) classical theory of dense matter to laboratory experiments requiring estimates of high-pressure phase transitions was examined by determining phase-transition pressures for a set of 19 chemical substances (including elements, hydrocarbons, metal oxides, and salts) for which experimental data were available. A comparison between the experimental transition points and those predicted by the Savic-Kasanin theory showed that the theory can be used for estimating values of transition pressures. The results also support conclusions obtained in previous astronomical applications of the Savic-Kasanin theory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khrennikov, Andrei
We present fundamentals of a prequantum model with hidden variables of the classical field type. In some sense this is the comeback of classical wave mechanics. Our approach also can be considered as incorporation of quantum mechanics into classical signal theory. All quantum averages (including correlations of entangled systems) can be represented as classical signal averages and correlations.
Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis
NASA Astrophysics Data System (ADS)
Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.
2013-05-01
Seismic hazard analysis has become a very important issue in the last few decades. Recently, new technologies and newly available data have helped many scientists to understand where and why earthquakes happen, the physics of earthquakes, etc., and they have begun to understand the role of uncertainty in seismic hazard analysis. However, there is still the significant problem of how to handle existing uncertainty. The same lack of information causes difficulties in quantifying uncertainty accurately. Usually attenuation curves are obtained statistically, by regression analysis. Statistical and probabilistic analyses show overlapped results for the site coefficients. This overlapping takes place not only at the border between two neighboring classes, but also among more than three classes. Although the analysis starts from classifying sites using geological terms, these site coefficients are not classified at all. In the present study, this problem is solved using fuzzy set theory. Using membership functions, the ambiguities at the border between neighboring classes can be avoided. Fuzzy set theory is applied to southern California data in the conventional way. In this study, the standard deviations that show the variation within each site class, obtained by fuzzy set theory and by the classical approach, are compared. The results of this analysis show that when we have insufficient data for hazard assessment, site classification based on fuzzy set theory gives standard deviations smaller than those obtained by the classical approach, which is direct evidence of less uncertainty.
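A minimal sketch of how membership functions soften the class borders; the class names, the Vs30 site parameter, and the breakpoints are hypothetical illustrations of mine, not values from the study. A site near a border belongs partially to both neighboring classes instead of being forced into one.

```python
def triangular(x, a, b, c):
    """Triangular membership function: rises on [a, b], falls on [b, c]."""
    return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

# hypothetical site classes by average shear-wave velocity Vs30 (m/s);
# the deliberate overlap encodes the ambiguity at class borders
classes = {
    "soft soil":  (100.0, 200.0, 360.0),
    "stiff soil": (200.0, 360.0, 760.0),
    "rock":       (360.0, 760.0, 1500.0),
}

vs30 = 400.0
for name, (a, b, c) in classes.items():
    print(name, round(triangular(vs30, a, b, c), 2))
# a 400 m/s site is mostly "stiff soil" but also partly "rock"
```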
Ashby, Nathaniel J S; Glöckner, Andreas; Dickert, Stephan
2011-01-01
Daily we make decisions ranging from the mundane to the seemingly pivotal that shape our lives. Assuming rationality, all relevant information about one's options should be thoroughly examined in order to make the best choice. However, some findings suggest that under specific circumstances thinking too much has disadvantageous effects on decision quality and that it might be best to let the unconscious do the busy work. In three studies we test the capacity assumption and the appropriate weighting principle of Unconscious Thought Theory using a classic risky choice paradigm and including a "deliberation with information" condition. Although we replicate an advantage for unconscious thought (UT) over "deliberation without information," we find that "deliberation with information" equals or outperforms UT in risky choices. These results speak against the generality of the assumption that UT has a higher capacity for information integration and show that this capacity assumption does not hold in all domains. Furthermore, we show that "deliberate thought with information" leads to more differentiated knowledge compared to UT which speaks against the generality of the appropriate weighting assumption.
Uniting the Spheres: Modern Feminist Theory and Classic Texts in AP English
ERIC Educational Resources Information Center
Drew, Simao J. A.; Bosnic, Brenda G.
2008-01-01
High school teachers Simao J. A. Drew and Brenda G. Bosnic help familiarize students with gender role analysis and feminist theory. Students examine classic literature and contemporary texts, considering characters' historical, literary, and social contexts while expanding their understanding of how patterns of identity and gender norms exist and…
Aesthetic Creativity: Insights from Classical Literary Theory on Creative Learning
ERIC Educational Resources Information Center
Hellstrom, Tomas Georg
2011-01-01
This paper addresses the subject of textual creativity by drawing on work done in classical literary theory and criticism, specifically new criticism, structuralism and early poststructuralism. The question of how readers and writers engage creatively with the text is closely related to educational concerns, though they are often thought of as…
ERIC Educational Resources Information Center
Bazaldua, Diego A. Luna; Lee, Young-Sun; Keller, Bryan; Fellers, Lauren
2017-01-01
The performance of various classical test theory (CTT) item discrimination estimators has been compared in the literature using both empirical and simulated data, resulting in mixed results regarding the preference of some discrimination estimators over others. This study analyzes the performance of various item discrimination estimators in CTT:…
Louis Guttman's Contributions to Classical Test Theory
ERIC Educational Resources Information Center
Zimmerman, Donald W.; Williams, Richard H.; Zumbo, Bruno D.; Ross, Donald
2005-01-01
This article focuses on Louis Guttman's contributions to the classical theory of educational and psychological tests, one of the lesser known of his many contributions to quantitative methods in the social sciences. Guttman's work in this field provided a rigorous mathematical basis for ideas that, for many decades after Spearman's initial work,…
Generalization of the Activated Complex Theory of Reaction Rates. II. Classical Mechanical Treatment
DOE R&D Accomplishments Database
Marcus, R. A.
1964-01-01
In its usual classical form activated complex theory assumes a particular expression for the kinetic energy of the reacting system -- one associated with a rectilinear motion along the reaction coordinate. The derivation of the rate expression given in the present paper is based on the general kinetic energy expression.
NASA Astrophysics Data System (ADS)
Yang, Chen
2018-05-01
The transitions from classical theories to quantum theories have attracted much interest. This paper demonstrates the analogy between the electromagnetic potentials and wave-like dynamic variables, with their connections to quantum theory, for audiences at the advanced undergraduate level and above. In the first part, the counterpart relations in classical electrodynamics (e.g. the gauge transform and Lorenz condition) and classical mechanics (e.g. the Legendre transform and free-particle condition) are presented. These relations lead to similar governing equations for the field variables and dynamic variables. The Lorenz gauge, scalar potential and vector potential manifest a one-to-one similarity to the action, Hamiltonian and momentum, respectively. In the second part, the connections between the classical pictures of the electromagnetic field and the particle and the quantum picture are presented. By characterising the states of the electromagnetic field and the particle via their (corresponding) variables, their evolution pictures manifest the same (isomorphic) algebraic structure. Subsequently, the pictures of the electromagnetic field and the particle are compared to the quantum picture and their interconnections are given. A brief summary of the obtained results is presented at the end of the paper.
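One way to display the claimed counterpart relations side by side, in my own notation rather than the paper's: reading the gauge function as the counterpart of the action makes the scalar potential pair with the Hamiltonian and the vector potential with the momentum, while the Lorenz condition parallels the free-particle Hamilton-Jacobi equation. A hedged sketch of that correspondence:

```latex
% gauge function \chi <-> action S;  \phi <-> H;  \mathbf{A} <-> \mathbf{p}
\[
  \mathbf{A} \to \mathbf{A} + \nabla\chi,
  \qquad
  \phi \to \phi - \frac{\partial\chi}{\partial t}
  \qquad\longleftrightarrow\qquad
  \mathbf{p} = \nabla S,
  \qquad
  H = -\frac{\partial S}{\partial t}
\]
% Lorenz condition <-> free-particle Hamilton--Jacobi equation
\[
  \nabla\!\cdot\!\mathbf{A} + \frac{1}{c^{2}}\frac{\partial\phi}{\partial t} = 0
  \qquad\longleftrightarrow\qquad
  \frac{\partial S}{\partial t} + \frac{(\nabla S)^{2}}{2m} = 0
\]
```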
Quantum-correlation breaking channels, quantum conditional probability and Perron-Frobenius theory
NASA Astrophysics Data System (ADS)
Chruściński, Dariusz
2013-03-01
Using the quantum analog of conditional probability and classical Bayes theorem we discuss some aspects of particular entanglement breaking channels: quantum-classical and classical-classical channels. Applying the quantum analog of Perron-Frobenius theorem we generalize the recent result of Korbicz et al. (2012) [8] on full and spectrum broadcasting from quantum-classical channels to arbitrary quantum channels.
Operator Formulation of Classical Mechanics.
ERIC Educational Resources Information Center
Cohn, Jack
1980-01-01
Discusses the construction of an operator formulation of classical mechanics which is directly concerned with wave packets in configuration space and is more similar to that of conventional quantum theory than other extant operator formulations of classical mechanics. (Author/HM)
Studies in the Theory of Quantum Games
NASA Astrophysics Data System (ADS)
Iqbal, Azhar
2005-03-01
Theory of quantum games is a new area of investigation that has gone through rapid development during the last few years. Initial motivation for playing games, in the quantum world, comes from the possibility of re-formulating quantum communication protocols, and algorithms, in terms of games between quantum and classical players. The possibility led to the view that quantum games have a potential to provide helpful insight into working of quantum algorithms, and even in finding new ones. This thesis analyzes and compares some interesting games when played classically and quantum mechanically. A large part of the thesis concerns investigations into a refinement notion of the Nash equilibrium concept. The refinement, called an evolutionarily stable strategy (ESS), was originally introduced in 1970s by mathematical biologists to model an evolving population using techniques borrowed from game theory. Analysis is developed around a situation when quantization changes ESSs without affecting corresponding Nash equilibria. Effects of quantization on solution-concepts other than Nash equilibrium are presented and discussed. For this purpose the notions of value of coalition, backwards-induction outcome, and subgame-perfect outcome are selected. Repeated games are known to have different information structure than one-shot games. An investigation is presented into a possible way in which quantization changes the outcome of a repeated game. Lastly, two new suggestions are put forward to play quantum versions of classical matrix games. The first one uses the association of De Broglie's waves, with travelling material objects, as a resource for playing a quantum game. The second suggestion concerns an EPR type setting exploiting directly the correlations in Bell's inequalities to play a bi-matrix game.
Extended theory of harmonic maps connects general relativity to chaos and quantum mechanism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Gang; Duan, Yi-Shi
2017-07-20
General relativity and quantum mechanics are two separate pillars of modern physics explaining how nature works. Both theories are accurate, but a direct connection between them has not yet been clarified. Recently, researchers have blurred the line between classical and quantum physics by connecting chaos and entanglement. Here we show that Duan's extended harmonic-map (HM) theory, which admits the solutions of general relativity, can also admit the solutions of the classic chaos equations and even the solution of the Schrödinger equation in quantum physics, suggesting that the extended theory of harmonic maps may act as a universal theory of physics.
Measuring Quality and Outcomes in Sports Medicine.
Ruzbarsky, Joseph J; Marom, Niv; Marx, Robert G
2018-07-01
Patient-reported outcome measures (PROMs) are objective metrics critical to evaluating outcomes throughout orthopedic surgery. New instruments continue to emerge, increasing the breadth of information required of those intending to use these measures for research or clinical care. Although earlier metrics were developed using the principles of classical test theory, newer instruments constructed using item response theory are amenable to computer-adaptive testing and may change the way these instruments are administered. This article aims to define the psychometric properties that are important to understand when using all PROMs and to review the most widely used instruments in sports medicine. Copyright © 2018 Elsevier Inc. All rights reserved.
Maximum Mass-Particle Velocities in Kantor's Information Mechanics
NASA Astrophysics Data System (ADS)
Sverdlik, Daniel I.
1989-02-01
Kantor's information mechanics links phenomena previously regarded as not treatable by a single theory. It is used here to calculate the maximum velocities ν_m of single particles. For the electron, ν_m/c ≈ 1 - 1.253814×10^-77. The maximum ν_m corresponds to ν_m/c ≈ 1 - 1.097864×10^-122 for a single mass particle with a rest mass of 3.078496×10^-5 g. This is the fastest that matter can move. Either information mechanics or classical mechanics can be used to show that ν_m is less for heavier particles. That ν_m is less for lighter particles can be deduced from an information mechanics argument alone.
Quantum correlations and dynamics from classical random fields valued in complex Hilbert spaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khrennikov, Andrei
2010-08-15
One of the crucial differences between mathematical models of classical and quantum mechanics (QM) is the use of the tensor product of the state spaces of subsystems as the state space of the corresponding composite system. (To describe an ensemble of classical composite systems, one uses random variables taking values in the Cartesian product of the state spaces of the subsystems.) We show that, nevertheless, it is possible to establish a natural correspondence between the classical and the quantum probabilistic descriptions of composite systems. Quantum averages for composite systems (including entangled ones) can be represented as averages with respect to classical random fields. It is essentially what Albert Einstein dreamed of. QM is represented as classical statistical mechanics with infinite-dimensional phase space. While the mathematical construction is completely rigorous, its physical interpretation is a complicated problem. We present the basic physical interpretation of prequantum classical statistical field theory in Sec. II. However, this is only the first step toward a real physical theory.
An Overview of Judgment and Decision Making Research Through the Lens of Fuzzy Trace Theory
Setton, Roni; Wilhelms, Evan; Weldon, Becky; Chick, Christina; Reyna, Valerie
2017-01-01
We present the basic tenets of fuzzy trace theory, a comprehensive theory of memory, judgment, and decision making that is grounded in research on how information is stored as knowledge, mentally represented, retrieved from storage, and processed. In doing so, we highlight how it is distinguished from traditional models of decision making in that gist reasoning plays a central role. The theory also distinguishes advanced intuition from primitive impulsivity. It predicts that different sorts of errors occur with respect to each component of judgment and decision making: background knowledge, representation, retrieval, and processing. Classic errors in the judgment and decision making literature, such as risky-choice framing and the conjunction fallacy, are accounted for by fuzzy trace theory and new results generated by the theory contradict traditional approaches. We also describe how developmental changes in brain and behavior offer crucial insight into adult cognitive processing. Research investigating brain and behavior in developing and special populations supports fuzzy trace theory’s predictions about reliance on gist processing. PMID:28725239
Negative energy, superluminosity, and holography
NASA Astrophysics Data System (ADS)
Polchinski, Joseph; Susskind, Leonard; Toumbas, Nicolaos
1999-10-01
The holographic connection between large N super Yang-Mills (SYM) theory and gravity in anti-de Sitter (AdS) space requires unfamiliar behavior of the SYM theory in the limit that the curvature of the AdS geometry becomes small. The paradoxical behavior includes superluminal oscillations and negative energy density. These effects typically occur in the SYM description of events which take place far from the boundary of AdS when the signal from the event arrives at the boundary. The paradoxes can be resolved by assuming a very rich collection of hidden degrees of freedom of the SYM theory which store information but give rise to no local energy density. These degrees of freedom, called precursors, are needed to make possible sudden apparently acausal energy momentum flows. Such behavior would be impossible in classical field theory as a consequence of the positivity of the energy density. However we show that these effects are not only allowed in quantum field theory but that we can model them in free quantum field theory.
Opening Switch Research on a Plasma Focus VI.
1988-02-26
Sausage Instability in the Plasma Focus: In this section the classical Kruskal-Schwarzschild theory for the sausage mode is applied to the pinch phase... on 1) the shape of the pinch, 2) axial flow of plasma, and 3) self-generated magnetic fields are also presented. The Kruskal-Schwarzschild Theory: The... classical MHD theory for the m=0 mode in a plasma supported by a magnetic field against gravity; this is the well-known Kruskal-Schwarzschild
Nanoscale Capillary Flows in Alumina: Testing the Limits of Classical Theory.
Lei, Wenwen; McKenzie, David R
2016-07-21
Anodic aluminum oxide (AAO) membranes have well-formed cylindrical channels, as small as 10 nm in diameter, in a close packed hexagonal array. The channels in AAO membranes simulate very small leaks that may be present for example in an aluminum oxide device encapsulation. The 10 nm alumina channel is the smallest that has been studied to date for its moisture flow properties and provides a stringent test of classical capillary theory. We measure the rate at which moisture penetrates channels with diameters in the range of 10 to 120 nm with moist air present at 1 atm on one side and dry air at the same total pressure on the other. We extend classical theory for water leak rates at high humidities by allowing for variable meniscus curvature at the entrance and show that the extended theory explains why the flow increases greatly when capillary filling occurs and enables the contact angle to be determined. At low humidities our measurements for air-filled channels agree well with theory for the interdiffusive flow of water vapor in air. The flow rate of water-filled channels is one order of magnitude less than expected from classical capillary filling theory and is coincidentally equal to the helium flow rate, validating the use of helium leak testing for evaluating moisture flows in aluminum oxide leaks.
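A minimal numerical sketch of the classical estimate this kind of experiment tests: the Laplace pressure driving capillary flow through one cylindrical channel, combined with Hagen-Poiseuille flow. All parameter values (contact angle, membrane thickness, channel radius) are illustrative assumptions, not the paper's measured values.

    import math

    # Classical capillary-filling estimate for one water-filled nanochannel.
    gamma = 0.072                 # surface tension of water, N/m
    theta = math.radians(30.0)    # assumed contact angle
    mu = 1.0e-3                   # viscosity of water, Pa*s
    r = 5.0e-9                    # channel radius (10 nm diameter), m
    L = 60.0e-6                   # assumed membrane thickness, m

    dP = 2 * gamma * math.cos(theta) / r       # Laplace driving pressure, Pa
    Q = math.pi * r**4 * dP / (8 * mu * L)     # Hagen-Poiseuille flow, m^3/s
    print(f"Laplace pressure ~ {dP:.2e} Pa, flow per channel ~ {Q:.2e} m^3/s")

The paper's observation that measured water-filled flows fall an order of magnitude below such estimates is precisely a statement about the limits of this calculation at the 10 nm scale.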
Linking environmental variability to population and community dynamics: Chapter 7
Pantel, Jelena H.; Pendleton, Daniel E.; Walters, Annika W.; Rogers, Lauren A.
2014-01-01
Linking population and community responses to environmental variability lies at the heart of ecology, yet methodological approaches vary and the existence of broad patterns spanning taxonomic groups remains unclear. We review the characteristics of environmental and biological variability. Classic approaches to linking environmental variability to population and community variability are discussed, as is the importance of biotic factors such as life history and community interactions. In addition to classic approaches, newer techniques such as information theory and artificial neural networks are reviewed. The establishment and expansion of observing networks will provide new long-term ecological time-series data and, with it, opportunities to incorporate environmental variability into research. This review can help guide future research in the field of ecological and environmental variability.
NASA Astrophysics Data System (ADS)
Baccetti, Valentina; Mann, Robert B.; Terno, Daniel R.
Event horizons are the defining feature of classical black holes. They are the key ingredient of the information loss paradox which, like paradoxes in quantum foundations, is built on a combination of predictions of quantum theory and counterfactual classical features: neither horizon formation nor its crossing by a test body can be detected by a distant observer. Furthermore, horizons are unnecessary for the production of Hawking-like radiation. We demonstrate that when this radiation is taken into account, it can prevent horizon crossing/formation in a large class of models. We conjecture that horizon avoidance is a general feature of collapse. The nonexistence of event horizons dispels the paradox, but opens up important questions about the thermodynamic properties of the resulting objects and the correlations between different degrees of freedom.
Hamilton-Jacobi theory in multisymplectic classical field theories
NASA Astrophysics Data System (ADS)
de León, Manuel; Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso; Vilariño, Silvia
2017-09-01
The geometric framework for the Hamilton-Jacobi theory developed in the studies of Cariñena et al. [Int. J. Geom. Methods Mod. Phys. 3(7), 1417-1458 (2006)], Cariñena et al. [Int. J. Geom. Methods Mod. Phys. 13(2), 1650017 (2015)], and de León et al. [Variations, Geometry and Physics (Nova Science Publishers, New York, 2009)] is extended for multisymplectic first-order classical field theories. The Hamilton-Jacobi problem is stated for the Lagrangian and the Hamiltonian formalisms of these theories as a particular case of a more general problem, and the classical Hamilton-Jacobi equation for field theories is recovered from this geometrical setting. Particular and complete solutions to these problems are defined and characterized in several equivalent ways in both formalisms, and the equivalence between them is proved. The use of distributions in jet bundles that represent the solutions to the field equations is the fundamental tool in this formulation. Some examples are analyzed and, in particular, the Hamilton-Jacobi equation for non-autonomous mechanical systems is obtained as a special case of our results.
Properties of the Boltzmann equation in the classical approximation
Epelbaum, Thomas; Gelis, François; Tanji, Naoto; ...
2014-12-30
We examine the Boltzmann equation with elastic point-like scalar interactions in two different versions of the classical approximation. Since numerically solving the Boltzmann equation with the unapproximated collision term poses no problem, this allows one to study the effect of the ultraviolet cutoff in these approximations. This cutoff dependence in the classical approximations of the Boltzmann equation is closely related to the non-renormalizability of the classical statistical approximation of the underlying quantum field theory. The kinetic theory setup that we consider here allows one to study the dependence on the ultraviolet cutoff in a much simpler way, since one also has access to the non-approximated result for comparison.
A New QKD Protocol Based upon Authentication by EPR Entanglement State
NASA Astrophysics Data System (ADS)
Abushgra, Abdulbast A.
The cryptographic world has faced multiple challenges in encoding and decoding information transmitted over a secure communication channel. Quantum cryptography may be the next generation of cryptography, since it is based on the laws of physics. After decades of using classical cryptography, there is an essential need to move a step forward to more trusted systems, especially as enormous amounts of data flow through billions of communication channels (e.g. the internet), and keeping this transmitted information away from eavesdropping is obligatory. Moreover, quantum cryptography has proved its standing against many weaknesses in classical cryptography. One of these weaknesses is the ability to copy information using a passive attack without interruption, which is impossible in a quantum system. Theoretically, several quantum observables are utilized to diagnose the action of one particle, including measurements of mass, movement, speed, etc. The polarization of a photon occurs naturally and randomly in space. Any interruption that happens while light is being sent will destroy its polarization. Therefore, the movement of particles in three-dimensional space is protected by the no-cloning theorem, which makes eavesdroppers unable to interrupt a communication system. If an eavesdropper tries to intercept a photon, the photon will be destroyed upon passing into a quantum detector or any measurement device. In recent decades, many Quantum Key Distribution (QKD) protocols have been created to establish a secret key for encoding and decoding transmitted data. Some of these protocols were proven insecure based on quantum attacks that were published early on. Even though the power of physics is still in force and the no-cloning theorem is unbroken, some QKD protocols failed the security analysis. The main reason for the failure is the inability to provide authentication between the end users over the quantum and classical channels. The proposed QKD protocol was designed to utilize advantages of quantum physics as well as solid functions used in classical cryptography. Authentication is required over the different communication channels, where both legitimate parties must confirm their identities before starting to submit data (plaintext). Moreover, the protocol uses well-established scenarios to complete the communication without leaking important data. These scenarios have been validated in existing QKD protocols, in either classical or quantum systems. Matrix techniques are also used as part of the preparation of the authentication key, where the end users communicate over an EPR channel (after Einstein, Podolsky, and Rosen, 1935), supported by particle entanglement. If the EPR communication succeeds, the converted plaintext is transferred. Finally, both end users will have an authenticated secret key, and the transmission will be completed without any interruption.
Quantum particles in general spacetimes: A tangent bundle formalism
NASA Astrophysics Data System (ADS)
Wohlfarth, Mattias N. R.
2018-06-01
Using tangent bundle geometry we construct an equivalent reformulation of classical field theory on flat spacetimes which simultaneously encodes the perspectives of multiple observers. Its generalization to curved spacetimes realizes a new type of nonminimal coupling of the fields and is shown to admit a canonical quantization procedure. For the resulting quantum theory we demonstrate the emergence of a particle interpretation, fully consistent with general relativistic geometry. The path dependency of parallel transport forces each observer to carry their own quantum state; we find that the communication of the corresponding quantum information may generate extra particles on curved spacetimes. A speculative link between quantum information and spacetime curvature is discussed which might lead to novel explanations for quantum decoherence and vanishing interference in double-slit or interaction-free measurement scenarios, in the mere presence of additional observers.
Experimental Observation of Two Features Unexpected from the Classical Theories of Rubber Elasticity
NASA Astrophysics Data System (ADS)
Nishi, Kengo; Fujii, Kenta; Chung, Ung-il; Shibayama, Mitsuhiro; Sakai, Takamasa
2017-12-01
Although the elastic modulus of a Gaussian chain network is thought to be successfully described by classical theories of rubber elasticity, such as the affine and phantom models, verification experiments are largely lacking owing to difficulties in precisely controlling the network structure. We prepared well-defined model polymer networks experimentally and measured the elastic modulus G for a broad range of polymer concentrations and connectivity probabilities p. In our experiment, we observed two features distinct from those predicted by the classical theories. First, we observed the critical behavior G ~ |p - p_c|^1.95 near the sol-gel transition, where p_c is the sol-gel transition point. This scaling law differs from the prediction of the classical theories, but can be explained by the analogy between the electric conductivity of resistor networks and the elasticity of polymer networks. Furthermore, we found that the experimental G-p relations in the region above C* did not follow the affine or phantom theories. Instead, all the G/G_0 versus p curves fell onto a single master curve when G was normalized by the elastic modulus at p = 1, G_0. We show that the effective medium approximation for Gaussian chain networks explains this master curve.
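As an illustration of how such a critical exponent is extracted, the sketch below fits the slope of log G against log(p - p_c) on synthetic data; p_c, the noise level and the data are invented for the example, not the paper's measurements.

    import numpy as np

    # Recover the exponent t in G ~ |p - pc|^t from noisy synthetic moduli.
    pc, t_true = 0.5, 1.95
    p = np.linspace(0.55, 0.9, 20)
    rng = np.random.default_rng(0)
    G = (p - pc) ** t_true * (1 + 0.02 * rng.standard_normal(p.size))

    # Power law becomes a straight line in log-log coordinates.
    slope, _ = np.polyfit(np.log(p - pc), np.log(G), 1)
    print(f"fitted exponent t ~ {slope:.2f}")   # close to 1.95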
Physical models of biological information and adaptation.
Stuart, C I
1985-04-07
The bio-informational equivalence asserts that biological processes reduce to processes of information transfer. In this paper, that equivalence is treated as a metaphor with deeply anthropomorphic content of a sort that resists constitutive-analytical definition, including formulation within mathematical theories of information. It is argued that continuance of the metaphor, as a quasi-theoretical perspective in biology, must entail a methodological dislocation between biological and physical science. It is proposed that a general class of functions, drawn from classical physics, can serve to eliminate the anthropomorphism. Further considerations indicate that the concept of biological adaptation is central to the general applicability of the informational idea in biology; a non-anthropomorphic treatment of adaptive phenomena is suggested in terms of variational principles.
ERIC Educational Resources Information Center
MacMillan, Peter D.
2000-01-01
Compared classical test theory (CTT), generalizability theory (GT), and multifaceted Rasch model (MFRM) approaches to detecting and correcting for rater variability using responses of 4,930 high school students graded by 3 raters on 9 scales. The MFRM approach identified far more raters as different than did the CTT analysis. GT and Rasch…
Marshaling Resources: A Classic Grounded Theory Study of Online Learners
ERIC Educational Resources Information Center
Yalof, Barbara
2012-01-01
Students who enroll in online courses comprise one quarter of an increasingly diverse student body in higher education today. Yet, it is not uncommon for an online program to lose over 50% of its enrolled students prior to graduation. This study used a classic grounded theory qualitative methodology to investigate the persistent problem of…
ERIC Educational Resources Information Center
Gotsch-Thomson, Susan
1990-01-01
Describes how gender is integrated into a classical social theory course by including a female theorist in the reading assignments and using "The Handmaid's Tale" by Margaret Atwood as the basis for class discussion. Reviews the course objectives and readings; describes the process of the class discussions; and provides student…
The Development of Bayesian Theory and Its Applications in Business and Bioinformatics
NASA Astrophysics Data System (ADS)
Zhang, Yifei
2018-03-01
Bayesian theory originated from an essay of the British mathematician Thomas Bayes in 1763, and after its development in the 20th century, Bayesian statistics has come to play a significant part in statistical study across all fields. Due to recent breakthroughs in high-dimensional integration, Bayesian statistics has been improved and perfected, and it can now be used to solve problems that classical statistics failed to solve. This paper summarizes the history, concepts and applications of Bayesian statistics, illustrated in five parts: the history of Bayesian statistics, the weaknesses of classical statistics, Bayesian theory, its development, and its applications. The first two parts make a comparison between Bayesian statistics and classical statistics from a macroscopic perspective, and the last three parts focus on Bayesian theory specifically, from introducing particular concepts of Bayesian statistics to outlining their development and, finally, their applications.
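A minimal worked example of the theorem at the heart of the review; the numbers are invented for illustration.

    # Bayes' theorem: posterior = likelihood * prior / evidence.
    p_disease = 0.01              # prior probability
    p_pos_given_disease = 0.95    # test sensitivity
    p_pos_given_healthy = 0.05    # false-positive rate

    p_pos = (p_pos_given_disease * p_disease
             + p_pos_given_healthy * (1 - p_disease))
    posterior = p_pos_given_disease * p_disease / p_pos
    print(f"P(disease | positive test) = {posterior:.3f}")   # ~0.161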
NASA Astrophysics Data System (ADS)
Mojahedi, Mahdi; Shekoohinejad, Hamidreza
2018-02-01
In this paper, the temperature distribution in a continuous and pulsed end-pumped Nd:YAG rod crystal is determined using nonclassical and classical heat conduction theories. In order to find the temperature distribution in the crystal, heat transfer differential equations with the appropriate boundary conditions are derived based on the non-Fourier model, and the temperature distribution of the crystal is obtained by an analytical method. Then, by transferring the non-Fourier differential equations to matrix equations and using the finite element method, the temperature and stress at every point of the crystal are calculated in the time domain. Based on the results, a comparison between the classical and nonclassical theories is presented to investigate rupture power values. In continuous end pumping with equal input powers, non-Fourier theory predicts greater temperature and stress than Fourier theory. It also shows that with an increase in relaxation time, the crystal rupture power decreases. In contrast, under single rectangular pulsed end pumping with equal input power, Fourier theory indicates higher temperature and stress than non-Fourier theory. It is also observed that, as the relaxation time increases, the maximum temperature and stress decrease.
Information processing, computation, and cognition
Scarantino, Andrea
2010-01-01
Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both – although others disagree vehemently. Yet different cognitive scientists use ‘computation’ and ‘information processing’ to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates’ empirical aspects. PMID:22210958
Gambini, R; Pullin, J
2000-12-18
We consider general relativity with a cosmological constant as a perturbative expansion around a completely solvable diffeomorphism-invariant field theory. This theory is the λ → ∞ limit of general relativity. This allows an explicit perturbative computational setup in which the quantum states of the theory and the classical observables can be explicitly computed. An unexpected relationship arises at a quantum level between the discrete spectrum of the volume operator and the allowed values of the cosmological constant.
Using generalizability theory to develop clinical assessment protocols.
Preuss, Richard A
2013-04-01
Clinical assessment protocols must produce data that are reliable, with a clinically attainable minimal detectable change (MDC). In a reliability study, generalizability theory has two advantages over classical test theory. These advantages provide information that allows assessment protocols to be adjusted to match individual patient profiles. First, generalizability theory allows the user to simultaneously consider multiple sources of measurement error variance (facets). Second, it allows the user to generalize the findings of the main study across the different study facets and to recalculate the reliability and MDC based on different combinations of facet conditions. In doing so, clinical assessment protocols can be chosen based on minimizing the number of measures that must be taken to achieve a realistic MDC, on using repeated measures to minimize the MDC, or simply on the combination that best allows the clinician to monitor an individual patient's progress over a specified period of time.
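A sketch of the two advantages in code: the absolute error variance is assembled from several facets, then the standard error of measurement (SEM) and MDC are recomputed for different facet combinations. The variance components below are invented for illustration; MDC95 = 1.96 * SEM * sqrt(2) is the usual definition.

    # Generalizability-theory style recalculation of SEM and MDC.
    var_rater, var_occasion, var_residual = 0.6, 0.4, 1.0

    def sem_and_mdc(n_raters, n_occasions):
        # Absolute error variance shrinks as facet conditions are averaged.
        var_abs = (var_rater / n_raters + var_occasion / n_occasions
                   + var_residual / (n_raters * n_occasions))
        sem = var_abs ** 0.5
        return sem, 1.96 * sem * 2 ** 0.5   # (SEM, MDC95)

    for n_raters, n_occasions in [(1, 1), (2, 2), (3, 2)]:
        sem, mdc = sem_and_mdc(n_raters, n_occasions)
        print(f"{n_raters} raters x {n_occasions} occasions: "
              f"SEM={sem:.2f}, MDC95={mdc:.2f}")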
Evolving political science. Biological adaptation, rational action, and symbolism.
Tingley, Dustin
2006-01-01
Political science, as a discipline, has been reluctant to adopt theories and methodologies developed in fields studying human behavior from an evolutionary standpoint. I ask whether evolutionary concepts are reconcilable with standard political-science theories and whether those concepts help solve puzzles to which these theories classically are applied. I find that evolutionary concepts readily and simultaneously accommodate theories of rational choice, symbolism, interpretation, and acculturation. Moreover, phenomena perennially hard to explain in standard political science become clearer when human interactions are understood in light of natural selection and evolutionary psychology. These phenomena include the political and economic effects of emotion, status, personal attractiveness, and variations in information-processing and decision-making under uncertainty; exemplary is the use of "focal points" in multiple-equilibrium games. I conclude with an overview of recent research by, and ongoing debates among, scholars analyzing politics in evolutionarily sophisticated terms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lusanna, Luca
2004-08-19
The four (electromagnetic, weak, strong and gravitational) interactions are described by singular Lagrangians and by the Dirac-Bergmann theory of Hamiltonian constraints. As a consequence, a subset of the original configuration variables are gauge variables, not determined by the equations of motion. Only at the Hamiltonian level is it possible to separate the gauge variables from the deterministic physical degrees of freedom, the Dirac observables, and to formulate a well-posed Cauchy problem for them, both in special and general relativity. Then the requirement of causality dictates the choice of retarded solutions at the classical level. However, both the problems of the classical theory of the electron, leading to the choice of (1/2)(retarded + advanced) solutions, and the regularization of quantum field theory, leading to the Feynman propagator, introduce anticipatory aspects. The determination of the relativistic Darwin potential as a semi-classical approximation to the Lienard-Wiechert solution for particles with Grassmann-valued electric charges, regularizing the Coulomb self-energies, shows that these anticipatory effects live beyond the semi-classical approximation (tree level) in the form of radiative corrections, at least for the electromagnetic interaction. Talk and 'best contribution' at The Sixth International Conference on Computing Anticipatory Systems CASYS'03, Liege, August 11-16, 2003.
The dynamical mass of a classical Cepheid variable star in an eclipsing binary system.
Pietrzyński, G; Thompson, I B; Gieren, W; Graczyk, D; Bono, G; Udalski, A; Soszyński, I; Minniti, D; Pilecki, B
2010-11-25
Stellar pulsation theory provides a means of determining the masses of pulsating classical Cepheid supergiants; it is the pulsation that causes their luminosity to vary. Such pulsational masses are found to be smaller than the masses derived from stellar evolution theory: this is the Cepheid mass discrepancy problem, for which a solution is missing. An independent, accurate dynamical mass determination for a classical Cepheid variable star (as opposed to type-II Cepheids, low-mass stars with a very different evolutionary history) in a binary system is needed in order to determine which is correct. The accuracy of previous efforts to establish a dynamical Cepheid mass from Galactic single-lined non-eclipsing binaries was typically about 15-30% (refs 6, 7), which is not good enough to resolve the mass discrepancy problem. In spite of many observational efforts, no firm detection of a classical Cepheid in an eclipsing double-lined binary has hitherto been reported. Here we report the discovery of a classical Cepheid in a well-detached, double-lined eclipsing binary in the Large Magellanic Cloud. We determine the mass to a precision of 1% and show that it agrees with its pulsation mass, providing strong evidence that pulsation theory correctly and precisely predicts the masses of classical Cepheids.
Phase-Sensitive Coherence and the Classical-Quantum Boundary in Ghost Imaging
NASA Technical Reports Server (NTRS)
Erkmen, Baris I.; Hardy, Nicholas D.; Venkatraman, Dheera; Wong, Franco N. C.; Shapiro, Jeffrey H.
2011-01-01
The theory of partial coherence has a long and storied history in classical statistical optics. The vast majority of this work addresses fields that are statistically stationary in time, hence their complex envelopes only have phase-insensitive correlations. The quantum optics of squeezed-state generation, however, depends on nonlinear interactions producing baseband field operators with both phase-insensitive and phase-sensitive correlations. Utilizing quantum light to enhance imaging has been a topic of considerable current interest, much of it involving biphotons, i.e., streams of entangled-photon pairs. Biphotons have been employed for quantum versions of optical coherence tomography, ghost imaging, holography, and lithography. However, their seemingly quantum features have been mimicked with classical-state light, raising the question of where the classical-quantum boundary lies. We have shown, for the case of Gaussian-state light, that this boundary is intimately connected to the theory of phase-sensitive partial coherence. Here we present that theory, contrasting it with the familiar case of phase-insensitive partial coherence, and use it to elucidate the classical-quantum boundary of ghost imaging. We show, both theoretically and experimentally, that classical phase-sensitive light produces ghost images most closely mimicking those obtained with biphotons, and we derive the spatial resolution, image contrast, and signal-to-noise ratio of a standoff-sensing ghost imager, taking into account target-induced speckle.
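To fix notation for the distinction the abstract draws (standard definitions rather than a quotation from this paper), the two second-order correlations of a baseband field operator are

    K^{(\mathrm{pi})}(t_1,t_2) = \langle \hat{E}^{\dagger}(t_1)\,\hat{E}(t_2)\rangle \qquad \text{(phase-insensitive)},
    K^{(\mathrm{ps})}(t_1,t_2) = \langle \hat{E}(t_1)\,\hat{E}(t_2)\rangle \qquad \text{(phase-sensitive)}.

A statistically stationary field whose absolute phase is uniformly random has K^(ps) = 0, which is why phase-sensitive coherence only enters with nonlinear sources such as squeezed-state generators.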
Quantum Communication Using Coherent Rejection Sampling
NASA Astrophysics Data System (ADS)
Anshu, Anurag; Devabathini, Vamsi Krishna; Jain, Rahul
2017-09-01
Compression of a message up to the information it carries is key to many tasks in classical and quantum information theory. Schumacher [B. Schumacher, Phys. Rev. A 51, 2738 (1995), 10.1103/PhysRevA.51.2738] provided one of the first quantum compression schemes, and several more general schemes have been developed since [M. Horodecki, J. Oppenheim, and A. Winter, Commun. Math. Phys. 269, 107 (2007), 10.1007/s00220-006-0118-x; I. Devetak and J. Yard, Phys. Rev. Lett. 100, 230501 (2008), 10.1103/PhysRevLett.100.230501; A. Abeyesinghe, I. Devetak, P. Hayden, and A. Winter, Proc. R. Soc. A 465, 2537 (2009), 10.1098/rspa.2009.0202]. However, the one-shot characterization of these quantum tasks is still under development and often lacks a direct connection with analogous classical tasks. Here we show a new technique for the compression of quantum messages with the aid of entanglement. We devise a new tool that we call the convex split lemma, which is a coherent quantum analogue of the widely used rejection sampling procedure in classical communication protocols. As a consequence, we exhibit new explicit protocols with tight communication cost for quantum state merging, quantum state splitting, and quantum state redistribution (up to a certain optimization in the latter case). We also present a port-based teleportation scheme which uses fewer ports in the presence of information about the input.
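The classical procedure whose coherent analogue the convex split lemma provides is ordinary rejection sampling. The sketch below is a generic illustration of that classical primitive; the target, proposal, and bound M are arbitrary choices, not taken from the paper.

    import random

    def rejection_sample(target_pdf, proposal_sample, proposal_pdf, M):
        """Draw one sample from target_pdf; M bounds target_pdf/proposal_pdf."""
        while True:
            x = proposal_sample()
            if random.random() < target_pdf(x) / (M * proposal_pdf(x)):
                return x

    # Example: sample Beta(2,2) on [0,1] from uniform proposals.
    # M = 1.5 works because the maximum of 6x(1-x) is 1.5.
    target = lambda x: 6 * x * (1 - x)
    samples = [rejection_sample(target, random.random, lambda x: 1.0, M=1.5)
               for _ in range(5)]
    print(samples)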
Quantum Bayesian perspective for intelligence reservoir characterization, monitoring and management.
Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia; de Jesús Correa, María
2017-11-13
The paper starts with a brief review of the literature about uncertainty in geological, geophysical and petrophysical data. In particular, we present the viewpoints of experts in geophysics on the application of Bayesian inference and subjective probability. Then we present arguments that the use of classical probability theory (CP) does not completely match the structure of geophysical data. We emphasize that such data are characterized by contextuality and non-Kolmogorovness (the impossibility of using the CP model), incompleteness, as well as incompatibility of some geophysical measurements. These characteristics of geophysical data are similar to the characteristics of quantum physical data. Notwithstanding all this, contextuality can be seen as a major deviation of quantum theory from classical physics. In particular, the contextual probability viewpoint is the essence of the Växjö interpretation of quantum mechanics. We propose to use quantum probability (QP) for decision-making during the characterization, modelling, exploration and management of the intelligent hydrocarbon reservoir. Quantum Bayesianism (QBism), one of the recently developed information interpretations of quantum theory, can be used as the interpretational basis for such QP decision-making in geology, geophysics and petroleum project design and management. This article is part of the themed issue 'Second quantum revolution: foundational questions'. © 2017 The Author(s).
Bertrand's theorem and virial theorem in fractional classical mechanics
NASA Astrophysics Data System (ADS)
Yu, Rui-Yan; Wang, Towe
2017-09-01
Fractional classical mechanics is the classical counterpart of fractional quantum mechanics. The central force problem in this theory is investigated. Bertrand's theorem is generalized and the virial theorem is revisited, both in three spatial dimensions. In order to produce stable, closed, non-circular orbits, the inverse-square law and Hooke's law should be modified in fractional classical mechanics.
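For reference, the classical (non-fractional) statement being generalized: Bertrand's theorem singles out exactly two central potentials for which every bounded orbit is closed,

    V(r) = -\frac{k}{r} \quad\text{(Kepler)}
    \qquad\text{and}\qquad
    V(r) = \tfrac{1}{2}\,k\,r^{2} \quad\text{(Hooke)} .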
The Tensile Strength of Liquid Nitrogen
NASA Astrophysics Data System (ADS)
Huang, Jian
1992-01-01
The tensile strength of liquids has been a puzzling subject. On the one hand, classical nucleation theory has met great success in predicting the nucleation rates of superheated liquids. On the other hand, most reported experimental values of the tensile strength for different liquids are far below the prediction of classical nucleation theory. In this study, homogeneous nucleation in liquid nitrogen and its tensile strength have been investigated. Different approaches for determining the pressure amplitude were studied carefully. It is shown that Raman-Nath theory, as modified by the introduction of an effective interaction length, can be used to determine the pressure amplitude in the focal plane of a focusing ultrasonic transducer. The results obtained from different diffraction orders are consistent and in good agreement with other approaches, including Debye's theory and solving the KZK equation. The measurement of the tensile strength was carried out in a high-pressure stainless steel dewar. A high-intensity ultrasonic wave was focused into a small volume of liquid nitrogen over a short time period. A probe laser beam passes through the focal region of a concave spherical transducer with a small aperture angle, and the transmitted light is detected with a photodiode. The pressure amplitude at the focus is calculated from the acoustic power radiated into the liquid. In the experiment, the electrical signal on the transducer is gated at its resonance frequency with gate widths of 20 μs to 0.2 ms over a temperature range from 77 K to near 100 K. The calculated pressure amplitude is in agreement with the prediction of classical nucleation theory for nucleation rates from 10^6 to 10^11 bubbles/(cm^3 s). This work provides experimental evidence that the validity of classical nucleation theory can be extended into the negative-pressure region down to -90 atm. Liquid nitrogen is only the second cryogenic liquid to reach the tensile strength predicted by classical nucleation theory.
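A rough order-of-magnitude sketch of the classical-nucleation-theory prediction being tested, inverting J = J0 exp[-16*pi*sigma^3/(3*kB*T*dP^2)] for the tension dP that yields a given bubble nucleation rate. The prefactor and surface tension below are coarse assumptions, so the output is illustrative only and will not reproduce the thesis's fitted value exactly.

    import math

    kB = 1.380649e-23   # Boltzmann constant, J/K
    sigma = 8.85e-3     # surface tension of liquid nitrogen at 77 K, N/m (approx.)
    T = 77.0            # temperature, K
    J0 = 1e38           # assumed kinetic prefactor, bubbles/(cm^3 s)

    def tension_for_rate(J):
        # Invert the CNT rate expression for the required tension (Pa).
        barrier = math.log(J0 / J)   # energy barrier in units of kB*T
        return math.sqrt(16 * math.pi * sigma**3 / (3 * kB * T * barrier))

    for J in (1e6, 1e11):
        print(f"J = {J:.0e} bubbles/(cm^3 s): tension ~ "
              f"{tension_for_rate(J) / 101325:.0f} atm")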
Nonequilibrium dynamics of the O(N) model on dS3 and AdS crunches
NASA Astrophysics Data System (ADS)
Kumar, S. Prem; Vaganov, Vladislav
2018-03-01
We study the nonperturbative quantum evolution of the interacting O(N) vector model at large N, formulated on a spatial two-sphere, with time-dependent couplings which diverge at finite time. This model, the so-called "E-frame" theory, is related via a conformal transformation to the interacting O(N) model in three-dimensional global de Sitter spacetime with time-independent couplings. We show that with a purely quartic, relevant deformation the quantum evolution of the E-frame model is regular even when the classical theory is rendered singular at the end of time by the diverging coupling. Time evolution drives the E-frame theory to the large-N Wilson-Fisher fixed point when the classical coupling diverges. We study the quantum evolution numerically for a variety of initial conditions and demonstrate the finiteness of the energy at the classical "end of time". With an additional (time-dependent) mass deformation, quantum backreaction lowers the mass, with a putative smooth time evolution only possible in the limit of infinite quartic coupling. We discuss the relevance of these results for the resolution of crunch singularities in AdS geometries dual to E-frame theories with a classical gravity dual.
Semenov, Alexander; Babikov, Dmitri
2015-12-17
The mixed quantum/classical theory, MQCT, for the inelastic scattering of two molecules is developed, in which the internal (rotational, vibrational) motion of both collision partners is treated with quantum mechanics, and the molecule-molecule scattering (translational motion) is described by classical trajectories. The resultant MQCT formalism includes a system of coupled differential equations for quantum probability amplitudes and the classical equations of motion in the mean-field potential. Numerical tests of this theory are carried out for several of the most important rotational state-to-state transitions in the N2 + H2 system, over a broad range of collision energies. Apart from scattering resonances (at low collision energies), excellent agreement with full-quantum results is obtained, including the excitation thresholds, the maxima of cross sections, and even some smaller features, such as slight oscillations of the energy dependencies. Most importantly, at higher energies the results of MQCT are nearly identical to the full-quantum results, which makes this approach a good alternative to full-quantum calculations that become computationally expensive at higher collision energies and for heavier collision partners. Extensions of this theory to include vibrational transitions or general asymmetric-top rotor (polyatomic) molecules are relatively straightforward.
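A schematic of the mean-field idea in MQCT, in the spirit of an Ehrenfest propagation: quantum amplitudes evolve under H(R(t)) while the classical coordinate moves on the mean-field (Hellmann-Feynman) force. The two-channel Hamiltonian, mass and step sizes below are toy assumptions, not the paper's N2 + H2 system.

    import numpy as np

    hbar, mass, dt = 1.0, 100.0, 0.1

    def H(R):
        # Toy diabatic 2x2 Hamiltonian: two channels coupled near R = 0.
        v = 0.01 * np.exp(-R**2)
        return np.array([[0.00, v], [v, 0.02]])

    def dH_dR(R, eps=1e-6):
        return (H(R + eps) - H(R - eps)) / (2 * eps)

    c = np.array([1.0 + 0j, 0.0 + 0j])   # quantum amplitudes
    R, P = -5.0, 50.0                    # classical position and momentum

    for _ in range(400):
        c = c - 1j * dt / hbar * H(R) @ c        # Euler step; renormalized below
        c /= np.linalg.norm(c)
        F = -np.real(np.conj(c) @ dH_dR(R) @ c)  # mean-field force on R
        P += F * dt
        R += P / mass * dt

    print("final channel populations:", np.abs(c) ** 2)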
Bojowald, Martin
2008-01-01
Quantum gravity is expected to be necessary in order to understand situations in which classical general relativity breaks down. In particular in cosmology one has to deal with initial singularities, i.e., the fact that the backward evolution of a classical spacetime inevitably comes to an end after a finite amount of proper time. This presents a breakdown of the classical picture and requires an extended theory for a meaningful description. Since small length scales and high curvatures are involved, quantum effects must play a role. Not only the singularity itself but also the surrounding spacetime is then modified. One particular theory is loop quantum cosmology, an application of loop quantum gravity to homogeneous systems, which removes classical singularities. Its implications can be studied at different levels. The main effects are introduced into effective classical equations, which allow one to avoid the interpretational problems of quantum theory. They give rise to new kinds of early-universe phenomenology with applications to inflation and cyclic models. To resolve classical singularities and to understand the structure of geometry around them, the quantum description is necessary. Classical evolution is then replaced by a difference equation for a wave function, which allows an extension of quantum spacetime beyond classical singularities. One main question is how these homogeneous scenarios are related to full loop quantum gravity, which can be dealt with at the level of distributional symmetric states. Finally, the new structure of spacetime arising in loop quantum gravity and its application to cosmology sheds light on more general issues, such as the nature of time. Supplementary material is available for this article at 10.12942/lrr-2008-4.
McCarthy, Bridie; Andrews, Tom; Hegarty, Josephine
2015-04-01
To explore family members' experiences when their loved one is undergoing chemotherapy treatment as an outpatient for newly diagnosed colorectal cancer, and to develop an explanatory theory of how they process their main concern. Most individuals with cancer are now treated as outpatients and cared for by family members. International research highlights the many side effects of chemotherapy, which in the absence of specific information and/or experience can be difficult for family members to deal with. Unmet needs can have an impact on the health of both patients and family members. Classic grounded theory methodology was used for this study: family members (n = 35) of patients undergoing chemotherapy treatment for cancer were interviewed (June 2010-July 2011). Data were analysed using the concurrent processes of constant comparative analysis, data collection, theoretical sampling and memo writing. The main concern that emerged for participants was fear of emotional collapse. This fear was dealt with through a process conceptualized as 'Emotional Resistance Building'. This is a basic social process with three phases: 'Figuring out', 'Getting on with it' and 'Uncertainty adjustment'. The phases are not linear but interrelated, as participants can be in any one or more of the phases at any one time. This theory has the potential to be used by healthcare professionals working in oncology to support family members of patients undergoing chemotherapy. New ways of supporting family members through this most difficult and challenging period are articulated within this theory. © 2014 John Wiley & Sons Ltd.
Ozierański, Piotr; King, Lawrence
2016-06-01
This article explores a key question in political sociology: Can post-communist policy-making be described with classical theories of the Western state or do we need a theory of the specificity of the post-communist state? In so doing, we consider Janine Wedel's clique theory, concerned with informal social actors and processes in post-communist transition. We conducted a case study of drug reimbursement policy in Poland, using 109 stakeholder interviews, official documents and media coverage. Drawing on 'sensitizing concepts' from Wedel's theory, especially the notion of 'deniability', we developed an explanation of why Poland's reimbursement policy combined suboptimal outcomes, procedural irregularities with limited accountability of key stakeholders. We argue that deniability was created through four main mechanisms: (1) blurred boundaries between different types of state authority allowing for the dispersion of blame for controversial policy decisions; (2) bridging different sectors by 'institutional nomads', who often escaped existing conflicts of interest regulations; (3) institutional nomads' 'flexible' methods of influence premised on managing roles and representations; and (4) coordination of resources and influence by elite cliques monopolizing exclusive policy expertise. Overall, the greatest power over drug reimbursement was often associated with lowest accountability. We suggest, therefore, that the clique theory can be generalized from its home domain of explanation in foreign aid and privatizations to more technologically advanced policies in Poland and other post-communist countries. This conclusion is not identical, however, with arguing the uniqueness of the post-communist state. Rather, we show potential for using Wedel's account to analyse policy-making in Western democracies and indicate scope for its possible integration with the classical theories of the state. © London School of Economics and Political Science 2016.
Petrillo, Jennifer; Cano, Stefan J; McLeod, Lori D; Coon, Cheryl D
2015-01-01
To provide comparisons and a worked example of item- and scale-level evaluations based on three psychometric methods used in patient-reported outcome development (classical test theory [CTT], item response theory [IRT], and Rasch measurement theory [RMT]) in an analysis of the National Eye Institute Visual Functioning Questionnaire (VFQ-25). Baseline VFQ-25 data from 240 participants with diabetic macular edema from a randomized, double-masked, multicenter clinical trial were used to evaluate the VFQ at the total score level. CTT, RMT, and IRT evaluations were conducted, and results were assessed in a head-to-head comparison. Results were similar across the three methods, with IRT and RMT providing more detailed diagnostic information on how to improve the scale. CTT led to the identification of two problematic items that threaten the validity of the overall scale score, sets of redundant items, and skewed response categories. IRT and RMT additionally identified poor fit for one item, many locally dependent items, poor targeting, and disordering of over half the response categories. Selection of a psychometric approach depends on many factors. Researchers should justify their evaluation method and consider the intended audience. If the instrument is being developed for descriptive purposes and on a restricted budget, a cursory examination of the CTT-based psychometric properties may be all that is possible. In a high-stakes situation, such as the development of a patient-reported outcome instrument for consideration in pharmaceutical labeling, however, a thorough psychometric evaluation including IRT or RMT should be considered, with final item-level decisions made on the basis of both quantitative and qualitative results. Copyright © 2015. Published by Elsevier Inc.
Aging Theories for Establishing Safe Life Spans of Airborne Critical Structural Components
NASA Technical Reports Server (NTRS)
Ko, William L.
2003-01-01
New aging theories have been developed to establish the safe life span of airborne critical structural components, such as the B-52B aircraft pylon hooks for carrying air-launch drop-test vehicles. The new aging theories use an equivalent constant-amplitude loading spectrum to represent the actual random loading spectrum with the same damaging effect. The crack growth due to the random loading cycling of the first flight is calculated using the half-cycle theory and then extrapolated to the crack growths of all subsequent flights. The predictions of the new aging theories (the finite-difference aging theory and the closed-form aging theory) are compared with the classical flight-test life theory and the previously developed Ko first- and second-order aging theories. The new aging theories predict a number of safe flights considerably lower than that predicted by the classical aging theory, and slightly lower than those predicted by the Ko first- and second-order aging theories, owing to the inclusion of all the higher-order terms.
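The report's half-cycle theory is not reproduced here; as a stand-in, the sketch below runs the standard Paris-law crack-growth recursion with an equivalent constant-amplitude stress range, which illustrates how a safe flight count is extrapolated. Every numeric value is invented for the example.

    import math

    C, m_exp = 1e-11, 3.0      # Paris-law constants (m/cycle, MPa*sqrt(m) units)
    a, a_crit = 1e-3, 5e-3     # initial and critical crack lengths, m
    dS, Y = 80.0, 1.12         # equivalent stress range (MPa), geometry factor
    cycles_per_flight = 200

    n_flights = 0
    while a < a_crit:
        dK = Y * dS * math.sqrt(math.pi * a)       # stress-intensity range
        a += cycles_per_flight * C * dK ** m_exp   # growth lumped per flight
        n_flights += 1

    print(f"extrapolated safe life ~ {n_flights} flights (toy numbers)")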
On the co-creation of classical and modern physics.
Staley, Richard
2005-12-01
While the concept of "classical physics" has long framed our understanding of the environment from which modern physics emerged, it has consistently been read back into a period in which the physicists concerned initially considered their work in quite other terms. This essay explores the shifting currency of the rich cultural image of the classical/ modern divide by tracing empirically different uses of "classical" within the physics community from the 1890s to 1911. A study of fin-de-siècle addresses shows that the earliest general uses of the concept proved controversial. Our present understanding of the term was in large part shaped by its incorporation (in different ways) within the emerging theories of relativity and quantum theory--where the content of "classical" physics was defined by proponents of the new. Studying the diverse ways in which Boltzmann, Larmor, Poincaré, Einstein, Minkowski, and Planck invoked the term "classical" will help clarify the critical relations between physicists' research programs and their use of worldview arguments in fashioning modern physics.
Gauge interaction as periodicity modulation
NASA Astrophysics Data System (ADS)
Dolce, Donatello
2012-06-01
The paper is devoted to a geometrical interpretation of gauge invariance in terms of the formalism of field theory in compact space-time dimensions (Dolce, 2011) [8]. In this formalism, the kinematic information of an interacting elementary particle is encoded on the relativistic geometrodynamics of the boundary of the theory through local transformations of the underlying space-time coordinates. Therefore gauge interactions are described as invariance of the theory under local deformations of the boundary. The resulting local variations of the field solution are interpreted as internal transformations. The internal symmetries of the gauge theory turn out to be related to corresponding space-time local symmetries. In the approximation of local infinitesimal isometric transformations, Maxwell's kinematics and gauge invariance are inferred directly from the variational principle. Furthermore we explicitly impose periodic conditions at the boundary of the theory as semi-classical quantization condition in order to investigate the quantum behavior of gauge interaction. In the abelian case the result is a remarkable formal correspondence with scalar QED.
Contact stresses in gear teeth: A new method of analysis
NASA Technical Reports Server (NTRS)
Somprakit, Paisan; Huston, Ronald L.; Oswald, Fred B.
1991-01-01
A new, innovative procedure called point load superposition is presented for determining the contact stresses in mating gear teeth. It is believed that this procedure will greatly extend both the range of applicability and the accuracy of gear contact stress analysis. Point load superposition is based upon fundamental solutions from the theory of elasticity. It is an iterative numerical procedure which has distinct advantages over the classical Hertz method, the finite element method, and existing applications of the boundary element method. Specifically, friction and sliding effects, which are either excluded from or difficult to study with the classical methods, are routinely handled with the new procedure. The basic theory and the algorithms are presented here, and several examples are given. Results are consistent with those of the classical theories. Applications to spur gears are discussed.
Brassey, Charlotte A.; Margetts, Lee; Kitchener, Andrew C.; Withers, Philip J.; Manning, Phillip L.; Sellers, William I.
2013-01-01
Classic beam theory is frequently used in biomechanics to model the stress behaviour of vertebrate long bones, particularly when creating intraspecific scaling models. Although methodologically straightforward, classic beam theory requires complex irregular bones to be approximated as slender beams, and the errors associated with simplifying complex organic structures to such an extent are unknown. Alternative approaches, such as finite element analysis (FEA), while much more time-consuming to perform, require no such assumptions. This study compares the results obtained using classic beam theory with those from FEA to quantify the beam theory errors and to provide recommendations about when a full FEA is essential for reasonable biomechanical predictions. High-resolution computed tomographic scans of eight vertebrate long bones were used to calculate diaphyseal stress owing to various loading regimes. Under compression, FEA values of minimum principal stress (σ_min) were on average 142 per cent (±28% s.e.) larger than those predicted by beam theory, with deviation between the two models correlated to shaft curvature (two-tailed p = 0.03, r^2 = 0.56). Under bending, FEA values of maximum principal stress (σ_max) and beam theory values differed on average by 12 per cent (±4% s.e.), with deviation between the models significantly correlated to cross-sectional asymmetry at midshaft (two-tailed p = 0.02, r^2 = 0.62). In torsion, assuming maximum stress values occurred at the location of minimum cortical thickness brought beam theory and FEA values closest in line, and in this case FEA values of τ_torsion were on average 14 per cent (±5% s.e.) higher than beam theory. Therefore, FEA is the preferred modelling solution when estimates of absolute diaphyseal stress are required, although values calculated by beam theory for bending may be acceptable in some situations. PMID:23173199
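For concreteness, the beam-theory side of such a comparison reduces to sigma = M*c/I at midshaft. The sketch below evaluates it for an idealized hollow-ellipse cross-section; the geometry and load are invented, not taken from the scanned bones.

    import math

    M = 50.0                       # applied bending moment, N*m
    a_out, b_out = 0.015, 0.012    # outer semi-axes of the ellipse, m
    a_in, b_in = 0.009, 0.007      # inner (medullary) semi-axes, m

    # Second moment of area of a hollow ellipse about its major axis.
    I = math.pi / 4 * (a_out * b_out**3 - a_in * b_in**3)
    c = b_out                      # distance from neutral axis to outer fibre
    print(f"beam-theory sigma_max ~ {M * c / I / 1e6:.1f} MPa")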
Generalized Quantum Theory of Bianchi IX Cosmologies
NASA Astrophysics Data System (ADS)
Craig, David; Hartle, James
2003-04-01
We apply sum-over-histories generalized quantum theory to the closed homogeneous minisuperspace Bianchi IX cosmological model. We sketch how the probabilities in decoherent sets of alternative, coarse-grained histories of this model universe are calculated. We consider, in particular, the probabilities for classical evolution in a suitable coarse graining. For a restricted class of initial conditions and coarse grainings, we exhibit the approximate decoherence of alternative histories in which the universe behaves classically and those in which it does not, illustrating the prediction that these universes will evolve in an approximately classical manner with probability near unity.
Budiyono, Agung; Rohrlich, Daniel
2017-11-03
Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.
"Fathers" and "sons" of theories in cell physiology: the membrane theory.
Matveev, V V; Wheatley, D N
2005-12-16
The last 50 years in the history of the life sciences are remarkable for an important new feature that looks like a great threat to their future. The profound specialization that dominates the quickly developing fields of science is causing a crisis of the scientific method. The essence of the method is a unity of two elements: the experimental data and the theory that explains them. Classically, the "fathers" of science were the creators of new ideas and theories. They were the true experts of their own theories. It was only they who had the right to say: "I am the theory". In other words, they were the carriers of theories, of theoretical knowledge. The fathers provided the necessary logical integrity to their theories, since theories in biology have still to be based on strict mathematical proofs. The same is not true of the sons. As a result of massive specialization, modern experts operate in very confined, closed spaces. They formulate particular rules far from the level of theory. The main theories of science are known to them only at the textbook level. Nowadays, nobody can say: "I am the theory". With whom, then, is it possible to have a discussion today on a broader theoretical level? How can a classical theory--for example, the membrane one--be changed or even disproved under these conditions? How can the "sons", with their narrow education, catch sight of the membrane theory's defects? As a result, "global" theories have few critics and little control. Due to specialization, we have lost the ability to work at the experimental level of biology within the correct or appropriate theoretical context. The scientific method in its classic form is now being rapidly eroded. A good case can be made for the "Membrane Theory", to which we will largely refer throughout this article.
NASA Astrophysics Data System (ADS)
Dür, Wolfgang; Lamprecht, Raphael; Heusler, Stefan
2017-07-01
A long-range quantum communication network is among the most promising applications of emerging quantum technologies. We discuss the potential of such a quantum internet for the secure transmission of classical and quantum information, as well as theoretical and experimental approaches and recent advances to realize them. We illustrate the involved concepts such as error correction, teleportation or quantum repeaters and consider an approach to this topic based on catchy visualizations as a context-based, modern treatment of quantum theory at high school.
NASA Astrophysics Data System (ADS)
Girolami, Davide; Schmidt, Rebecca; Adesso, Gerardo
2015-10-01
Classical cybernetics is a successful meta-theory to model the regulation of complex systems from an abstract information-theoretic viewpoint, regardless of the properties of the system under scrutiny. Fundamental limits to the controllability of an open system can be formalized in terms of the law of requisite variety, which is derived from the second law of thermodynamics. These concepts are briefly reviewed, and the chances, challenges and potential gains arising from the generalisation of such a framework to the quantum domain are discussed.
A unifying model for adsorption and nucleation of vapors on solid surfaces.
Laaksonen, Ari
2015-04-23
Vapor interaction with solid surfaces is traditionally described with adsorption isotherms in the undersaturated regime and with heterogeneous nucleation theory in the supersaturated regime. A class of adsorption isotherms is based on the idea of vapor molecule clustering around so-called active sites. However, as the isotherms do not account for the surface curvature effects of the clusters, they predict an infinitely thick adsorption layer at saturation and do not recognize the existence of the supersaturated regime. The classical heterogeneous nucleation theory also builds on the idea of cluster formation, but describes the interactions between the surface and the cluster with a single parameter, the contact angle, which provides limited information compared with adsorption isotherms. Here, a new model of vapor adsorption on nonporous solid surfaces is derived. The basic assumption is that adsorption proceeds via formation of molecular clusters, modeled as liquid caps. The equilibrium of the individual clusters with the vapor phase is described with the Frenkel-Halsey-Hill (FHH) adsorption theory modified with the Kelvin equation that corrects for the curvature effect on vapor pressure. The new model extends the FHH adsorption isotherm to be applicable both at submonolayer surface coverages and at supersaturated conditions. It shows good agreement with experimental adsorption data from 12 different adsorbent-adsorbate systems. The model predictions are also compared against heterogeneous nucleation data, and they show much better agreement than predictions of the classical heterogeneous nucleation theory.
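The combination described can be summarized in one relation. A sketch under the stated assumptions (an adsorbed cluster of FHH coverage θ treated as a liquid cap of radius r; A and B the FHH parameters, γ the surface tension, V_m the molecular volume, R the gas constant, T the temperature; the paper's exact parametrization may differ):

```latex
% FHH surface-force term plus Kelvin curvature correction for the
% equilibrium saturation ratio S of a cap-shaped cluster (sketch).
\ln S \;=\; -\frac{A}{\theta^{B}} \;+\; \frac{2\gamma V_m}{r\,R\,T}
```

The flat-film FHH isotherm is recovered as r tends to infinity, while the Kelvin term produces a nucleation barrier in the supersaturated regime (S > 1), which is how the model bridges adsorption and heterogeneous nucleation.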
Survey on nonlocal games and operator space theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palazuelos, Carlos, E-mail: cpalazue@mat.ucm.es; Vidick, Thomas, E-mail: vidick@cms.caltech.edu
This review article is concerned with a recently uncovered connection between operator spaces, a noncommutative extension of Banach spaces, and quantum nonlocality, a striking phenomenon which underlies many of the applications of quantum mechanics to information theory, cryptography, and algorithms. Using the framework of nonlocal games, we relate measures of the nonlocality of quantum mechanics to certain norms in the Banach and operator space categories. We survey recent results that exploit this connection to derive large violations of Bell inequalities, study the complexity of the classical and quantum values of games and their relation to Grothendieck inequalities, and quantify the nonlocality of different classes of entangled states.
Berthelsen, Connie Bøttcher; Lindhardt, Tove; Frederiksen, Kirsten
2017-06-01
This paper presents a discussion of the differences in using participant observation as a data collection method by comparing the classic grounded theory methodology of Barney Glaser with the constructivist grounded theory methodology of Kathy Charmaz. Participant observations allow nursing researchers to experience activities and interactions directly in situ. However, participant observation as a data collection method can be carried out in many ways, depending on the chosen grounded theory methodology, and may produce different results. This discussion shows that the differences between using participant observations in classic and constructivist grounded theory can be considerable, and that grounded theory researchers should adhere to the method descriptions of performing participant observations according to the selected grounded theory methodology to enhance the quality of research. © 2016 Nordic College of Caring Science.
The Institution of Sociological Theory in Canada.
Guzman, Cinthya; Silver, Daniel
2018-02-01
Using theory syllabi and departmental data collected for three academic years, this paper investigates the institutional practice of theory in sociology departments across Canada. In particular, it examines the position of theory within the sociological curriculum, and how this varies among universities. Taken together, our analyses indicate that theory remains deeply institutionalized at the core of sociological education and Canadian sociologists' self-understanding; that theorists as a whole show some coherence in how they define themselves, but differ in various ways, especially along lines of region, intellectual background, and gender; that despite these differences, the classical versus contemporary heuristic largely cuts across these divides, as does the strongly ingrained position of a small group of European authors as classics of the discipline as a whole. Nevertheless, who is a classic remains an unsettled question, alternatives to the "classical versus contemporary" heuristic do exist, and theorists' syllabi reveal diverse "others" as potential candidates. Our findings show that the field of sociology is neither marked by universal agreement nor by absolute division when it comes to its theoretical underpinnings. To the extent that they reveal a unified field, the findings suggest that unity lies more in a distinctive form than in a distinctive content, which defines the space and structure of the field of sociology. © 2018 Canadian Sociological Association/La Société canadienne de sociologie.
On the effective field theory of intersecting D3-branes
NASA Astrophysics Data System (ADS)
Abbaspur, Reza
2018-05-01
We study the effective field theory of two intersecting D3-branes with one common dimension along the lines recently proposed in ref. [1]. We introduce a systematic way of deriving the classical effective action to arbitrary orders in perturbation theory. Using a proper renormalization prescription to handle logarithmic divergencies arising at all orders in the perturbation series, we recover the first order renormalization group equation of ref. [1] plus an infinite set of higher order equations. We show the consistency of the higher order equations with the first order one and hence interpret the first order result as an exact RG flow equation in the classical theory.
NASA Technical Reports Server (NTRS)
Zeng, X. C.; Stroud, D.
1989-01-01
The previously developed Ginzburg-Landau theory for calculating the crystal-melt interfacial tension of bcc elements is extended to treat the classical one-component plasma (OCP), the charged fermion system, and the Bose crystal. For the OCP, a direct application of the theory of Shih et al. (1987) yields for the surface tension 0.0012(Z-squared e-squared/a-cubed), where Ze is the ionic charge and a is the radius of the ionic sphere. The Bose crystal-melt interface is treated by a quantum extension of the classical density-functional theory, using the Feynman formalism to estimate the relevant correlation functions. The theory is applied to the metastable He-4 solid-superfluid interface at T = 0, with a resulting surface tension of 0.085 erg/sq cm, in reasonable agreement with the value extrapolated from the measured surface tension of the bcc solid in the range 1.46-1.76 K. These results suggest that the density-functional approach is a satisfactory mean-field theory for estimating the equilibrium properties of liquid-solid interfaces, given knowledge of the uniform phases.
NASA Astrophysics Data System (ADS)
Brynjolfsson, Ari
2002-04-01
Einstein's general theory of relativity assumes that photons don't change frequency as they move from the Sun to the Earth. This assumption is correct in classical physics. All experiments proving general relativity are in the domain of classical physics. These include the tests by Pound et al. of the gravitational redshift of 14.4 keV photons; the rocket experiments by Vessot et al.; the Galileo solar redshift experiments by Krisher et al.; the gravitational deflection of light experiments by Riveros and Vucetich; and the delay of echoes of radar signals passing close to the Sun as observed by Shapiro et al. Bohr's correspondence principle assures that the quantum mechanical theory of general relativity agrees with Einstein's classical theory when frequency and gravitational field gradient approach zero, or when photons cannot interact with the gravitational field. When we treat photons as quantum mechanical particles, we find that the gravitational force on photons is reversed (antigravity). This modified theory contradicts the equivalence principle, but is consistent with all experiments. Solar lines and distant stars are redshifted in accordance with the author's plasma redshift theory. These changes result in a beautiful, consistent cosmology.
The Split-Brain Phenomenon Revisited: A Single Conscious Agent with Split Perception.
Pinto, Yair; de Haan, Edward H F; Lamme, Victor A F
2017-11-01
The split-brain phenomenon is caused by the surgical severing of the corpus callosum, the main route of communication between the cerebral hemispheres. The classical view of this syndrome asserts that conscious unity is abolished. The left hemisphere consciously experiences and functions independently of the right hemisphere. This view is a cornerstone of current consciousness research. In this review, we first discuss the evidence for the classical view. We then propose an alternative, the 'conscious unity, split perception' model. This model asserts that a split brain produces one conscious agent who experiences two parallel, unintegrated streams of information. In addition to changing our view of the split-brain phenomenon, this new model also poses a serious challenge for current dominant theories of consciousness. Copyright © 2017 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Wilson, Mark; Allen, Diane D.; Li, Jun Corser
2006-01-01
This paper compares the approach and resultant outcomes of item response models (IRMs) and classical test theory (CTT). First, it reviews basic ideas of CTT, and compares them to the ideas about using IRMs introduced in an earlier paper. It then applies a comparison scheme based on the AERA/APA/NCME "Standards for Educational and…
ERIC Educational Resources Information Center
Culpepper, Steven Andrew
2013-01-01
A classic topic in the fields of psychometrics and measurement has been the impact of the number of scale categories on test score reliability. This study builds on previous research by further articulating the relationship between item response theory (IRT) and classical test theory (CTT). Equations are presented for comparing the reliability and…
ERIC Educational Resources Information Center
Mason, Brandon; Smithey, Martha
2012-01-01
This study examines Merton's Classical Strain Theory (1938) as a causative factor in intimate partner violence among college students. We theorize that college students experience general life strain and cumulative strain as they pursue the goal of a college degree. We test this strain on the likelihood of using intimate partner violence. Strain…
ERIC Educational Resources Information Center
Schlingman, Wayne M.; Prather, Edward E.; Wallace, Colin S.; Brissenden, Gina; Rudolph, Alexander L.
2012-01-01
This paper is the first in a series of investigations into the data from the recent national study using the Light and Spectroscopy Concept Inventory (LSCI). In this paper, we use classical test theory to form a framework of results that will be used to evaluate individual item difficulties, item discriminations, and the overall reliability of the…
Classical closure theory and Lam's interpretation of epsilon-RNG
NASA Technical Reports Server (NTRS)
Zhou, YE
1995-01-01
Lam's phenomenological epsilon-renormalization group (RNG) model is quite different from the other members of that group. It does not make use of the correspondence principle and the epsilon-expansion procedure. We demonstrate that Lam's epsilon-RNG model is essentially the physical space version of the classical closure theory in spectral space and consider the corresponding treatment of the eddy viscosity and energy backscatter.
New variables for classical and quantum gravity
NASA Technical Reports Server (NTRS)
Ashtekar, Abhay
1986-01-01
A Hamiltonian formulation of general relativity based on certain spinorial variables is introduced. These variables simplify the constraints of general relativity considerably and enable one to imbed the constraint surface in the phase space of Einstein's theory into that of Yang-Mills theory. The imbedding suggests new ways of attacking a number of problems in both classical and quantum gravity. Some illustrative applications are discussed.
ERIC Educational Resources Information Center
Sussman, Joshua; Beaujean, A. Alexander; Worrell, Frank C.; Watson, Stevie
2013-01-01
Item response models (IRMs) were used to analyze Cross Racial Identity Scale (CRIS) scores. Rasch analysis scores were compared with classical test theory (CTT) scores. The partial credit model demonstrated a high goodness of fit and correlations between Rasch and CTT scores ranged from 0.91 to 0.99. CRIS scores are supported by both methods.…
Conveying the Complex: Updating U.S. Joint Systems Analysis Doctrine with Complexity Theory
2013-12-10
…screech during a public address, or sustain and amplify it during a guitar solo. Since the systems are nonlinear, understanding cause and effect… A technique to cope with the potentially confusing… Reynolds, Paul Davidson. A Primer in Theory Construction. Boston: Allyn and Bacon Classics, 2007. Riolo, Rick L. "The Effects and Evolution of Tag…"
Asteroid orbital error analysis: Theory and application
NASA Technical Reports Server (NTRS)
Muinonen, K.; Bowell, Edward
1992-01-01
We present a rigorous Bayesian theory for asteroid orbital error estimation in which the probability density of the orbital elements is derived from the noise statistics of the observations. For Gaussian noise in a linearized approximation the probability density is also Gaussian, and the errors of the orbital elements at a given epoch are fully described by the covariance matrix. The law of error propagation can then be applied to calculate past and future positional uncertainty ellipsoids (Cappellari et al. 1976, Yeomans et al. 1987, Whipple et al. 1991). To our knowledge, this is the first time a Bayesian approach has been formulated for orbital element estimation. In contrast to the classical Fisherian school of statistics, the Bayesian school allows a priori information to be formally present in the final estimation. However, Bayesian estimation gives the same results as Fisherian estimation when no a priori information is assumed (Lehtinen 1988, and references therein).
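In the linearized Gaussian setting described here, the law of error propagation is a Jacobian sandwich, Cov(x) ≈ J P Jᵀ. A minimal Python sketch, with an invented two-parameter toy map standing in for the six orbital elements:

```python
import numpy as np

# Law of error propagation under a linearized model: if elements e have
# covariance P at epoch t0 and x = F(e) is a predicted position, then
# Cov(x) ~= J P J^T, with J the Jacobian of F. Toy 2-parameter example.

P = np.array([[1e-6, 2e-7],     # assumed covariance of two elements
              [2e-7, 5e-7]])

def F(e):
    # hypothetical nonlinear map from elements to a sky-plane position
    return np.array([e[0] + 0.5 * e[1]**2, np.sin(e[0]) * e[1]])

def jacobian(F, e, h=1e-8):
    # forward-difference Jacobian
    f0 = F(e)
    J = np.zeros((f0.size, e.size))
    for j in range(e.size):
        de = e.copy()
        de[j] += h
        J[:, j] = (F(de) - f0) / h
    return J

e0 = np.array([0.3, 0.1])
J = jacobian(F, e0)
P_pos = J @ P @ J.T             # positional uncertainty (ellipsoid)
print("positional covariance:\n", P_pos)
```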
Real time forecasting of near-future evolution.
Gerrish, Philip J; Sniegowski, Paul D
2012-09-07
A metaphor for adaptation that informs much evolutionary thinking today is that of mountain climbing, where horizontal displacement represents change in genotype, and vertical displacement represents change in fitness. If it were known a priori what the 'fitness landscape' looked like, that is, how the myriad possible genotypes mapped onto fitness, then the possible paths up the fitness mountain could each be assigned a probability, thus providing a dynamical theory with long-term predictive power. Such detailed genotype-fitness data, however, are rarely available and are subject to change with each change in the organism or in the environment. Here, we take a very different approach that depends only on fitness or phenotype-fitness data obtained in real time and requires no a priori information about the fitness landscape. Our general statistical model of adaptive evolution builds on classical theory and gives reasonable predictions of fitness and phenotype evolution many generations into the future.
Quantum kinetic theory of the filamentation instability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bret, A.; Haas, F.
2011-07-15
The quantum electromagnetic dielectric tensor for a multi-species plasma is re-derived from the gauge-invariant Wigner-Maxwell system and presented in a form very similar to the classical one. The resulting expression is then applied to a quantum kinetic theory of the electromagnetic filamentation instability. Comparison is made with the quantum fluid theory including a Bohm pressure term and with the cold classical plasma result. A number of analytical expressions are derived for the cutoff wave vector, the largest growth rate, and the most unstable wave vector.
A classical density-functional theory for describing water interfaces.
Hughes, Jessica; Krebs, Eric J; Roundy, David
2013-01-14
We develop a classical density functional for water which combines the White Bear fundamental-measure theory (FMT) functional for the hard sphere fluid with attractive interactions based on the statistical associating fluid theory variable range (SAFT-VR). This functional reproduces the properties of water at both long and short length scales over a wide range of temperatures and is computationally efficient, comparable to the cost of FMT itself. We demonstrate our functional by applying it to systems composed of two hard rods, four hard rods arranged in a square, and hard spheres in water.
Geometry, topology, and string theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Varadarajan, Uday
A variety of scenarios are considered which shed light upon the uses and limitations of classical geometric and topological notions in string theory. The primary focus is on situations in which D-brane or string probes of a given classical space-time see the geometry quite differently than one might naively expect. In particular, situations in which extra dimensions, non-commutative geometries as well as other non-local structures emerge are explored in detail. Further, a preliminary exploration of such issues in Lorentzian space-times with non-trivial causal structures within string theory is initiated.
Extracting Information about the Initial State from the Black Hole Radiation.
Lochan, Kinjalk; Padmanabhan, T
2016-02-05
The crux of the black hole information paradox is related to the fact that the complete information about the initial state of a quantum field in a collapsing spacetime is not available to future asymptotic observers, belying the expectations from a unitary quantum theory. We study the imprints of the initial quantum state contained in a specific class of distortions of the black hole radiation and identify the classes of in states that can be partially or fully reconstructed from the information contained within. Even for the general in state, we can uncover some specific information. These results suggest that a classical collapse scenario ignores this richness of information in the resulting spectrum and a consistent quantum treatment of the entire collapse process might allow us to retrieve much more information from the spectrum of the final radiation.
Essays on inference in economics, competition, and the rate of profit
NASA Astrophysics Data System (ADS)
Scharfenaker, Ellis S.
This dissertation comprises three papers that demonstrate the role of Bayesian methods of inference and Shannon's information theory in classical political economy. The first chapter explores the empirical distribution of profit rate data from North American firms from 1962-2012. This chapter addresses the fact that existing methods for sample selection from noisy profit rate data in the industrial organization field of economics tend to condition on a covariate's value, which risks discarding information. Conditioning sample selection instead on the structure of the profit rate data, by means of a two-component (signal and noise) Bayesian mixture model, we find the profit rate sample to be time-stationary Laplace distributed, corroborating earlier estimates of cross-section distributions. The second chapter compares alternative probabilistic approaches to discrete (quantal) choice analysis and examines the various ways in which they overlap. In particular, the work on individual choice behavior by Duncan Luce, and the extension of this work to quantal response problems by game theoreticians, is shown to be related both to the rational inattention work of Christopher Sims through Shannon's information theory and to the maximum entropy principle of inference proposed by the physicist Edwin T. Jaynes. In the third chapter I propose a model of "classically" competitive firms facing informational entropy constraints in their decisions to potentially enter or exit markets based on profit rate differentials. The result is a three-parameter logit quantal response distribution for firm entry and exit decisions. Bayesian methods are used for inference into the distribution of entry and exit decisions conditional on profit rate deviations, and firm-level data from Compustat are used to test these predictions.
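As a rough illustration of the third chapter's object, a logit quantal response for entry as a function of the profit-rate deviation might look as follows; the parameter names (mu, T_b, kappa) are invented stand-ins for the dissertation's three-parameter form:

```python
import numpy as np

# Sketch of a logit quantal response for firm entry as a function of the
# profit-rate deviation. The three parameters (location mu, behavioural
# temperature T_b, and ceiling kappa in [0, 1]) are illustrative only.

def p_enter(dev, mu=0.0, T_b=0.05, kappa=1.0):
    return kappa / (1.0 + np.exp(-(dev - mu) / T_b))

for dev in (-0.10, -0.02, 0.0, 0.02, 0.10):
    print(f"profit-rate deviation {dev:+.2f} -> entry prob {p_enter(dev):.3f}")
```

The behavioural temperature plays the role of the informational entropy constraint: as T_b shrinks, the response approaches the sharp classical entry/exit rule.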
Kaurin, Aleksandra; Egloff, Boris; Stringaris, Argyris; Wessa, Michèle
2016-08-01
Multi-informant approaches are thought to be key to clinical assessment. Classical theories of psychological measurement assume that only convergence among different informants' reports allows for an estimate of the true nature and causes of clinical presentations. However, the integration of multiple accounts is fraught with problems because findings in child and adolescent psychiatry do not conform to the fundamental expectation of convergence. Indeed, reports provided by different sources (self, parents, teachers, peers) share little variance. Moreover, in some cases informant divergence may be meaningful rather than error variance. In this review, we give an overview of the conceptual and theoretical foundations of valid multi-informant assessment and discuss why our common concepts of validity need re-evaluation.
Tensor products of process matrices with indefinite causal structure
NASA Astrophysics Data System (ADS)
Jia, Ding; Sakharwade, Nitica
2018-03-01
Theories with indefinite causal structure have been studied from both the fundamental perspective of quantum gravity and the practical perspective of information processing. In this paper we point out a restriction in forming tensor products of objects with indefinite causal structure in certain models: there exist both classical and quantum objects the tensor products of which violate the normalization condition of probabilities, if all local operations are allowed. We obtain a necessary and sufficient condition for when such unrestricted tensor products of multipartite objects are (in)valid. This poses a challenge to extending communication theory to indefinite causal structures, as the tensor product is the fundamental ingredient in the asymptotic setting of communication theory. We discuss a few options to evade this issue. In particular, we show that the sequential asymptotic setting does not suffer the violation of normalization.
A psychometric evaluation of the digital logic concept inventory
NASA Astrophysics Data System (ADS)
Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.
2014-10-01
Concept inventories hold tremendous promise for promoting the rigorous evaluation of teaching methods that might remedy common student misconceptions and promote deep learning. The measurements from concept inventories can be trusted only if the concept inventories are evaluated both by expert feedback and statistical scrutiny (psychometric evaluation). Classical Test Theory and Item Response Theory provide two psychometric frameworks for evaluating the quality of assessment tools. We discuss how these theories can be applied to assessment tools generally and then apply them to the Digital Logic Concept Inventory (DLCI). We demonstrate that the DLCI is sufficiently reliable for research purposes when used in its entirety and as a post-course assessment of students' conceptual understanding of digital logic. The DLCI can also discriminate between students across a wide range of ability levels, providing the most information about weaker students' ability levels.
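The CTT quantities such an evaluation rests on (item difficulty, item discrimination, internal-consistency reliability) are straightforward to compute. A minimal sketch on a randomly generated 0/1 response matrix, not DLCI data:

```python
import numpy as np

# Classical test theory item statistics on a toy 0/1 response matrix
# (rows = students, columns = items). Data are random for illustration.

rng = np.random.default_rng(0)
X = (rng.random((500, 12)) < np.linspace(0.3, 0.8, 12)).astype(float)

total = X.sum(axis=1)
difficulty = X.mean(axis=0)          # proportion correct per item

# Item discrimination: correlation of each item with the rest-score
disc = np.array([np.corrcoef(X[:, j], total - X[:, j])[0, 1]
                 for j in range(X.shape[1])])

# Cronbach's alpha as an internal-consistency (reliability) estimate
k = X.shape[1]
alpha = k / (k - 1) * (1 - X.var(axis=0, ddof=1).sum() / total.var(ddof=1))

print("difficulties:", np.round(difficulty, 2))
print("discriminations:", np.round(disc, 2))
print("Cronbach's alpha:", round(alpha, 3))
```

An IRT evaluation of the same data would instead fit item characteristic curves, which is what lets the inventory's information be located along the ability scale, as the abstract notes for weaker students.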
Semiclassical theory of electronically nonadiabatic transitions in molecular collision processes
NASA Technical Reports Server (NTRS)
Lam, K. S.; George, T. F.
1979-01-01
An introductory account of the semiclassical theory of the S-matrix for molecular collision processes is presented, with special emphasis on electronically nonadiabatic transitions. This theory is based on the incorporation of classical mechanics with quantum superposition, and in practice makes use of the analytic continuation of classical mechanics into the complex time domain. The relevant concepts of molecular scattering theory and related dynamical models are described, and the formalism is developed and illustrated with simple examples - collinear collisions of the A+BC type. The theory is then extended to include the effects of laser-induced nonadiabatic transitions. Two bound-continuum processes, collisional ionization and collision-induced emission, also amenable to the same general semiclassical treatment, are discussed.
Classical theory of atomic collisions - The first hundred years
NASA Astrophysics Data System (ADS)
Grujić, Petar V.
2012-05-01
Classical calculations of atomic processes started in 1911 with Rutherford's famous evaluation of the differential cross section for α particles scattered on foil atoms [1]. The success of these calculations was soon overshadowed by the rise of Quantum Mechanics in 1925 and its triumphant success in describing processes at the atomic and subatomic levels. It was generally recognized that the classical approach should be inadequate, and it was neglected until 1953, when the famous paper by Gregory Wannier appeared, in which the threshold law for the single-ionization cross section behaviour under electron impact was derived. All later calculations and experimental studies confirmed the law derived by purely classical theory. The next step was taken by Ian Percival and collaborators in the 1960s, who developed a general classical three-body computer code, which was used by many researchers in evaluating various atomic processes like ionization, excitation, detachment, dissociation, etc. Another approach was pursued by Michal Gryzinski from Warsaw, who started a far-reaching programme for treating atomic particles and processes as purely classical objects [2]. Though often criticized for overestimating the domain of the classical theory, his group's results were able to match many experimental data. The Belgrade group pursued the classical approach using both analytical and numerical calculations, studying a number of atomic collisions, in particular near-threshold processes. The Riga group, led by Modris Gailitis [3], contributed considerably to the field, as did Valentin Ostrovsky and coworkers from Saint Petersburg, who developed powerful analytical methods within purely classical mechanics [4]. We give an overview of these approaches and show some of the remarkable results, which were subsequently confirmed by semiclassical and quantum mechanical calculations, as well as by experimental evidence. Finally, we discuss the theoretical and epistemological background of the classical calculations and explain why they turned out so successful, despite the essentially quantum nature of atomic and subatomic systems.
NASA Technical Reports Server (NTRS)
Ioannou, Petros J.; Lindzen, Richard S.
1993-01-01
Classical tidal theory is applied to the atmospheres of the outer planets. The tidal geopotential due to satellites of the outer planets is discussed, and the solution of Laplace's tidal equation for Hough modes appropriate to tides on the outer planets is examined. The vertical structure of tidal modes is described, noting that only relatively high-order meridional mode numbers can propagate vertically with growing amplitude. Expected magnitudes for tides in the visible atmosphere of Jupiter are discussed. The classical theory is extended to planetary interiors, taking the effects of sphericity and self-gravity into account. The thermodynamic structure of Jupiter is described, and the WKB theory of the vertical structure equation is presented. The regions in which inertial, gravity, and acoustic oscillations are possible are delineated. The case of a planet with a neutral interior is treated, discussing the various atmospheric boundary conditions and showing that the tidal response is small.
Physics of automated driving in framework of three-phase traffic theory.
Kerner, Boris S
2018-04-01
We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.
Physics of automated driving in framework of three-phase traffic theory
NASA Astrophysics Data System (ADS)
Kerner, Boris S.
2018-04-01
We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.
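For orientation, the "classical model approach" contrasted in the two records above is, in its simplest linear form, a constant-time-headway controller. A minimal sketch with illustrative gains (not the paper's calibration):

```python
# Minimal sketch of the classical adaptive-cruise-control (ACC) rule the
# abstracts contrast with three-phase theory: regulate toward a fixed
# desired time headway T to the preceding vehicle. Gains are illustrative.

def acc_accel(gap, v, v_lead, T=1.5, k1=0.3, k2=0.6):
    """Classical linear ACC: a = k1*(gap - v*T) + k2*(v_lead - v)."""
    return k1 * (gap - v * T) + k2 * (v_lead - v)

# One update step of a following vehicle (Euler integration, dt = 0.1 s)
dt, gap, v, v_lead = 0.1, 30.0, 25.0, 22.0
a = acc_accel(gap, v, v_lead)
v += a * dt
gap += (v_lead - v) * dt
print(f"accel {a:.2f} m/s^2, new speed {v:.2f} m/s, new gap {gap:.2f} m")
```

Roughly speaking, an ACC based on the three-phase theory replaces the fixed desired headway v·T with a range of acceptable gaps within which no particular headway is enforced, which is what the abstracts mean by "no fixed time headway".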
Urns and Chameleons: two metaphors for two different types of measurements
NASA Astrophysics Data System (ADS)
Accardi, Luigi
2013-09-01
The awareness of the physical possibility of models of space alternative to the Euclidean one began to emerge towards the end of the 19th century. At the end of the 20th century a similar awareness emerged concerning the physical possibility of models of the laws of chance alternative to the classical probabilistic model (the Kolmogorov model). In geometry, the mathematical construction of several non-Euclidean models of space preceded their application in physics, which came with the theory of relativity, by about a century. In physics the opposite situation took place. While the first examples of non-Kolmogorov probabilistic models emerged in quantum physics approximately one century ago, at the beginning of the 1900s, the awareness that this new mathematical formalism reflected a new mathematical model of the laws of chance had to wait until the early 1980s. In this long interval the classical and the new probabilistic models were both used in the description and interpretation of quantum phenomena, and they interfered negatively with each other because of the absence, for many decades, of a mathematical theory that clearly delimited their respective domains of application. The result of this interference was the emergence of the so-called "paradoxes of quantum theory". For several decades there have been many attempts to solve these paradoxes, giving rise to what K. Popper baptized "the great quantum muddle": a debate which has been at the core of the philosophy of science for more than 50 years. These attempts have, however, led to contradictions between the two fundamental theories of contemporary physics: quantum theory and the theory of relativity. Quantum probability identifies the reason for the emergence of non-Kolmogorov models, and therefore of the so-called paradoxes of quantum theory, in the difference between passive measurements that read pre-existing properties (the urn metaphor) and measurements that read "a response to an interaction" (the chameleon metaphor). The non-trivial point is that one can prove that, while the urn scheme cannot lead to empirical data outside of classical probability, response-based measurements can give rise to non-classical statistics. The talk will include entirely classical examples of non-classical statistics and potential applications to economic, sociological or biomedical phenomena.
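The urn/chameleon distinction admits a fully classical toy demonstration. In the sketch below (entirely invented for illustration), the "urn" observable is a pre-existing property, so its statistics do not depend on whether another quantity was measured first; the "chameleon" observable is a response to an interaction, so the two measurement contexts yield different statistics and no single Kolmogorov model covers both:

```python
import random

# Toy contrast between 'urn' (passive readout) and 'chameleon'
# (response-to-interaction) measurements. Entirely illustrative.

random.seed(1)
N = 100_000

def urn_trial(measure_B_first):
    # Pre-existing properties: A and B are fixed by the drawn ball,
    # so measuring B first cannot change A.
    a, _ = random.choice([(0, 0), (0, 1), (1, 0), (1, 1)])
    return a

def chameleon_trial(measure_B_first):
    s = random.random()            # hidden state of the system
    if measure_B_first:
        s = min(1.0, s + 0.3)      # measuring B is an interaction:
                                   # it disturbs the state
    return 1 if s > 0.5 else 0     # A is a response to the measurement

for trial in (urn_trial, chameleon_trial):
    pA_direct = sum(trial(False) for _ in range(N)) / N
    pA_after_B = sum(trial(True) for _ in range(N)) / N
    print(trial.__name__, round(pA_direct, 3), round(pA_after_B, 3))
# The urn gives the same P(A) in both contexts; the chameleon does not,
# so no single joint distribution reproduces all of its contexts at once.
```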
De Tiège, Alexis; Van de Peer, Yves; Braeckman, Johan; Tanghe, Koen B
2017-11-22
Although classical evolutionary theory, i.e., population genetics and the Modern Synthesis, was already implicitly 'gene-centred', the organism was, in practice, still generally regarded as the individual unit of which a population is composed. The gene-centred approach to evolution only reached a logical conclusion with the advent of the gene-selectionist or gene's eye view in the 1960s and 1970s. Whereas classical evolutionary theory can only work with (genotypically represented) fitness differences between individual organisms, gene-selectionism is capable of working with fitness differences among genes within the same organism and genome. Here, we explore the explanatory potential of 'intra-organismic' and 'intra-genomic' gene-selectionism, i.e., of a behavioural-ecological 'gene's eye view' on genetic, genomic and organismal evolution. First, we give a general outline of the framework and how it complements the, to some extent still 'organism-centred', approach of classical evolutionary theory. Secondly, we give a more in-depth assessment of its explanatory potential for biological evolution, i.e., for Darwin's 'common descent with modification' or, more specifically, for 'historical continuity or homology with modular evolutionary change' as it has been studied by evolutionary developmental biology (evo-devo) during the last few decades. In contrast with classical evolutionary theory, evo-devo focuses on 'within-organism' developmental processes. Given the capacity of gene-selectionism to adopt an intra-organismal gene's eye view, we outline the relevance of the latter model for evo-devo. Overall, we aim for conceptual integration between the gene's eye view on the one hand, and more organism-centred evolutionary models (both classical evolutionary theory and evo-devo) on the other.
Clerc, Daryl G
2016-07-21
An ab initio approach was used to study the molecular-level interactions that connect gene mutation to changes in an organism's phenotype. The study provides new insights into the evolutionary process and presents a simplification whereby changes in phenotypic properties may be studied in terms of the binding affinities of the chemical interactions affected by mutation, rather than by correlation to the genes. The study also reports the role that nonlinear effects play in the progression of organs, and how those effects relate to the classical theory of evolution. Results indicate that the classical theory of evolution occurs as a special case within the ab initio model - a case having two attributes. The first attribute: proteins and promoter regions are not shared among organs. The second attribute: continuous limiting behavior exists in the physical properties of organs as well as in the binding affinity of the associated chemical interactions, with respect to displacements in the chemical properties of proteins and promoter regions induced by mutation. Outside of the special case, second-order coupling contributions are significant and nonlinear effects play an important role, a result corroborated by analyses of published activity levels in binding and transactivation assays. Further, gradations in the state of perfection of an organ may be small or large depending on the type of mutation, and are not necessarily closely separated as maintained by the classical theory. Results also indicate that organs progress with varying degrees of interdependence, that the likelihood of successful mutation decreases with increasing complexity of the affected chemical system, and that differences between the ab initio model and the classical theory increase with increasing complexity of the organism. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.
The Basics: What's Essential about Theory for Community Development Practice?
ERIC Educational Resources Information Center
Hustedde, Ronald J.; Ganowicz, Jacek
2002-01-01
Relates three classical theories (structural functionalism, conflict theory, symbolic interactionism) to fundamental concerns of community development (structure, power, and shared meaning). Links these theories to Giddens' structuration theory, which connects macro and micro structures and community influence on change through cultural norms.…
On classical and quantum dynamics of tachyon-like fields and their cosmological implications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dimitrijević, Dragoljub D., E-mail: ddrag@pmf.ni.ac.rs; Djordjević, Goran S., E-mail: ddrag@pmf.ni.ac.rs; Milošević, Milan, E-mail: ddrag@pmf.ni.ac.rs
2014-11-24
We consider a class of tachyon-like potentials, motivated by string theory, D-brane dynamics and inflation theory, in the context of classical and quantum mechanics. A formalism for describing the dynamics of tachyon fields in the spatially homogeneous, one-dimensional classical and quantum mechanical limits is proposed. A few models with concrete potentials are considered. Additionally, possibilities for p-adic and adelic generalization of these models are discussed. Classical actions and corresponding quantum propagators, in the Feynman path integral approach, are calculated in a form invariant under a change of the background number fields, i.e. on both archimedean and nonarchimedean spaces. Looking for a quantum origin of inflation, the relevance of p-adic and adelic generalizations is briefly discussed.
The Classical Theory of Light Colors: a Paradigm for Description of Particle Interactions
NASA Astrophysics Data System (ADS)
Mazilu, Nicolae; Agop, Maricel; Gatu, Irina; Iacob, Dan Dezideriu; Butuc, Irina; Ghizdovat, Vlad
2016-06-01
Color is an interaction property: a property of the interaction of light with matter. Classically speaking, it is therefore akin to forces. But while forces engendered the mechanical view of the world, colors generated the optical view. One of the modern concepts of the interaction between the fundamental particles of matter - quantum chromodynamics - aims to fill the gap between mechanics and optics, in a specific description of the strong interactions. We show here that this modern description of particle interactions has ties with both the classical and quantum theories of light, regardless of the connection between forces and colors. In a word, light is a universal model in the description of matter. The description involves classical Yang-Mills fields related to color.
NASA Astrophysics Data System (ADS)
Ivanov, Sergey V.; Buzykin, Oleg G.
2016-12-01
A classical approach is applied to calculate pressure broadening coefficients of CO2 vibration-rotational spectral lines perturbed by Ar. Three types of spectra are examined: electric dipole (infrared) absorption, and isotropic and anisotropic Raman Q branches. Simple and explicit formulae of the classical impact theory are used along with exact 3D Hamilton equations for CO2-Ar molecular motion. The calculations utilize the vibrationally independent, most accurate ab initio potential energy surface (PES) of Hutson et al., expanded in a Legendre polynomial series up to lmax = 24. A new, improved algorithm of classical rotational frequency selection is applied. The dependences of CO2 half-widths on rotational quantum number J up to J = 100 are computed for temperatures between 77 and 765 K and compared with available experimental data as well as with the results of fully quantum dynamical calculations performed on the same PES. To make the picture complete, the predictions of two independent variants of the semi-classical Robert-Bonamy formalism for dipole absorption lines are included. This method, however, has demonstrated poor accuracy for almost all temperatures. On the contrary, classical broadening coefficients are in excellent agreement both with measurements and with quantum results at all temperatures. The classical impact theory in its present variant is capable of quickly and accurately producing the pressure broadening coefficients of spectral lines of linear molecules for any J value (including high Js) using a full-dimensional ab initio based PES, in cases where other computational methods are either extremely time consuming (like the quantum close coupling method) or give erroneous results (like semi-classical methods).
The polymer physics of single DNA confined in nanochannels.
Dai, Liang; Renner, C Benjamin; Doyle, Patrick S
2016-06-01
In recent years, applications and experimental studies of DNA in nanochannels have stimulated the investigation of the polymer physics of DNA in confinement. Recent advances in the physics of confined polymers, using DNA as a model polymer, have moved beyond the classic Odijk theory for the strong confinement, and the classic blob theory for the weak confinement. In this review, we present the current understanding of the behaviors of confined polymers while briefly reviewing classic theories. Three aspects of confined DNA are presented: static, dynamic, and topological properties. The relevant simulation methods are also summarized. In addition, comparisons of confined DNA with DNA under tension and DNA in semidilute solution are made to emphasize universal behaviors. Finally, an outlook of the possible future research for confined DNA is given. Copyright © 2015 Elsevier B.V. All rights reserved.
Classical and non-classical effective medium theories: New perspectives
NASA Astrophysics Data System (ADS)
Tsukerman, Igor
2017-05-01
Future research in electrodynamics of periodic electromagnetic composites (metamaterials) can be expected to produce sophisticated homogenization theories valid for any composition and size of the lattice cell. The paper outlines a promising path in that direction, leading to non-asymptotic and nonlocal homogenization models, and highlights aspects of homogenization that are often overlooked: the finite size of the sample and the role of interface boundaries. Classical theories (e.g. Clausius-Mossotti, Maxwell Garnett), while originally derived from a very different set of ideas, fit well into the proposed framework. Nonlocal effects can be included in the model, making order-of-magnitude accuracy improvements possible. One future challenge is to determine which effective parameters can or cannot be obtained for a given set of constituents of a metamaterial lattice cell, thereby delineating the possible from the impossible in metamaterial design.
Robust Measurement via A Fused Latent and Graphical Item Response Theory Model.
Chen, Yunxiao; Li, Xiaoou; Liu, Jingchen; Ying, Zhiliang
2018-03-12
Item response theory (IRT) plays an important role in psychological and educational measurement. Unlike classical test theory, IRT models aggregate item-level information, yielding more accurate measurements. Most IRT models assume local independence, an assumption not likely to be satisfied in practice, especially when the number of items is large. Results in the literature and simulation studies in this paper reveal that misspecifying the local independence assumption may result in inaccurate measurements and differential item functioning. To provide more robust measurements, we propose an integrated approach that adds a graphical component to a multidimensional IRT model to offset the effect of unknown local dependence. The new model contains a confirmatory latent variable component, which measures the targeted latent traits, and a graphical component, which captures the local dependence. An efficient proximal algorithm is proposed for the parameter estimation and structure learning of the local dependence. This approach can substantially improve the measurement, given no prior information on the local dependence structure. The model can be applied to measure both unidimensional and multidimensional latent traits.
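A sketch of the kind of model described, with the caveat that the paper's exact parametrization may differ: an Ising-type graphical term added to a multidimensional two-parameter IRT kernel,

```latex
% Sketch: binary responses x in {0,1}^J, latent traits theta in R^K.
% The first sum is the confirmatory multidimensional IRT part; the
% second (graphical) sum absorbs local dependence between items.
P(\mathbf{x}\mid\boldsymbol\theta) \;\propto\;
\exp\!\Big( \sum_{j=1}^{J} x_j\,(\mathbf{a}_j^{\top}\boldsymbol\theta + b_j)
\;+\; \sum_{j<k} S_{jk}\, x_j x_k \Big)
```

with sparsity on the graphical matrix S encouraged by a penalty handled inside the proximal algorithm, so that only genuinely locally dependent item pairs receive nonzero entries.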
ERIC Educational Resources Information Center
Kim, Sooyeon; Livingston, Samuel A.
2017-01-01
The purpose of this simulation study was to assess the accuracy of a classical test theory (CTT)-based procedure for estimating the alternate-forms reliability of scores on a multistage test (MST) having 3 stages. We generated item difficulty and discrimination parameters for 10 parallel, nonoverlapping forms of the complete 3-stage test and…
Wang, Wei; Takeda, Mitsuo
2006-09-01
A new concept of vector and tensor densities is introduced into the general coherence theory of vector electromagnetic fields that is based on energy and energy-flow coherence tensors. Related coherence conservation laws are presented in the form of continuity equations that provide new insights into the propagation of second-order correlation tensors associated with stationary random classical electromagnetic fields.
Application of ply level analysis to flexural wave propagation
NASA Astrophysics Data System (ADS)
Valisetty, R. R.; Rehfield, L. W.
1988-10-01
A brief survey is presented of the shear deformation theories of laminated plates. It indicates that there are certain non-classical influences that affect bending-related behavior in the same way as do the transverse shear stresses. They include bending- and stretching-related section warping, the concomitant non-classical surface-parallel stress contributions, and the transverse normal stress. A bending theory gives significantly improved performance if these non-classical effects are incorporated. The heterogeneous shear deformations that are characteristic of laminates with highly dissimilar materials, however, require that attention be paid to the modeling of local rotations. In this paper, it is shown that a ply level analysis can be used to model such disparate shear deformations. Here, the equilibrium of each layer is analyzed separately. Earlier applications of this analysis include free-edge laminate stresses. It is now extended to the study of flexural wave propagation in laminates. A recently developed homogeneous plate theory is used as the ply level model. Due consideration is given to the non-classical influences, and no shear correction factors are introduced extraneously in this theory. The results for the lowest flexural mode of travelling planar harmonic waves indicate that this approach is competitive and yields better results for certain laminates.
Geometric Theory of Reduction of Nonlinear Control Systems
NASA Astrophysics Data System (ADS)
Elkin, V. I.
2018-02-01
The foundations of a differential geometric theory of nonlinear control systems are described on the basis of categorical concepts (isomorphism, factorization, restrictions) by analogy with classical mathematical theories (of linear spaces, groups, etc.).
Quantum Communication Using Coherent Rejection Sampling.
Anshu, Anurag; Devabathini, Vamsi Krishna; Jain, Rahul
2017-09-22
Compression of a message up to the information it carries is key to many tasks involved in classical and quantum information theory. Schumacher [B. Schumacher, Phys. Rev. A 51, 2738 (1995)] provided one of the first quantum compression schemes, and several more general schemes have been developed ever since [M. Horodecki, J. Oppenheim, and A. Winter, Commun. Math. Phys. 269, 107 (2007); I. Devetak and J. Yard, Phys. Rev. Lett. 100, 230501 (2008); A. Abeyesinghe, I. Devetak, P. Hayden, and A. Winter, Proc. R. Soc. A 465, 2537 (2009)]. However, the one-shot characterization of these quantum tasks is still under development, and often lacks a direct connection with analogous classical tasks. Here we show a new technique for the compression of quantum messages with the aid of entanglement. We devise a new tool that we call the convex split lemma, which is a coherent quantum analogue of the widely used rejection sampling procedure in classical communication protocols. As a consequence, we exhibit new explicit protocols with tight communication cost for quantum state merging, quantum state splitting, and quantum state redistribution (up to a certain optimization in the latter case). We also present a port-based teleportation scheme which uses fewer ports in the presence of information about the input.
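Since the convex split lemma is pitched as a coherent analogue of classical rejection sampling, the classical primitive is worth recalling. A minimal sketch with an invented three-symbol target distribution:

```python
import random

# Classical rejection sampling: draw from target p using proposals from q,
# accepting x with probability p(x) / (M * q(x)), where p <= M * q.
# In communication protocols this lets a sender who shares randomness
# with a receiver steer a common sample toward p while sending only the
# accept/reject information.

random.seed(0)
p = {0: 0.5, 1: 0.3, 2: 0.2}      # target distribution (illustrative)
q = {0: 1/3, 1: 1/3, 2: 1/3}      # proposal distribution
M = max(p[x] / q[x] for x in p)   # here M = 1.5

def sample_p():
    while True:
        x = random.choice(list(q))            # uniform proposal
        if random.random() < p[x] / (M * q[x]):
            return x

draws = [sample_p() for _ in range(100_000)]
print({x: round(draws.count(x) / len(draws), 3) for x in p})
```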
NASA Technical Reports Server (NTRS)
Kelly, Bernard J.
2010-01-01
Einstein's General Theory of Relativity is our best classical description of gravity, and informs modern astronomy and astrophysics at all scales: stellar, galactic, and cosmological. Among its surprising predictions is the existence of gravitational waves -- ripples in space-time that carry energy and momentum away from strongly interacting gravitating sources. In my talk, I will give an overview of the properties of this radiation, recent breakthroughs in computational physics allowing us to calculate the waveforms from galactic mergers, and the prospect of direct observation with interferometric detectors such as LIGO and LISA.
NASA Astrophysics Data System (ADS)
Chen, De-You; Jiang, Qing-Quan; Yang, Shu-Zheng
2007-12-01
Applying Parikh's semi-classical quantum tunneling method, the tunneling radiation characteristic of the charged particle from the event horizon of the Reissner-Nordström anti-de Sitter black hole is investigated. The result shows that the derived spectrum is not a purely thermal one, but is consistent with the underlying unitary theory, which offers a possible explanation of the information loss paradox and is a correct amendment to the Hawking radiation.
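For context, the hallmark of the tunneling method referenced here is that the emission rate follows from the imaginary part of the particle's action across the horizon and, once energy conservation is enforced, is governed by the change in black hole entropy. Schematically (a standard Parikh-Wilczek relation, sketched here rather than taken from this paper):

```latex
% Tunneling rate from the imaginary part of the action I; the entropy
% change Delta S_BH is what makes the spectrum not purely thermal.
\Gamma \;\sim\; e^{-2\,\mathrm{Im}\, I} \;=\; e^{\Delta S_{\mathrm{BH}}}
```

For small emitted energy ω this reduces to the thermal Boltzmann factor e^{-ω/T_H}; the higher-order corrections in ΔS_BH are the departures from thermality that carry information.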
Competitive-Cooperative Automated Reasoning from Distributed and Multiple Source of Data
NASA Astrophysics Data System (ADS)
Fard, Amin Milani
Knowledge extraction from distributed database systems has been investigated during the past decade in order to analyze billions of information records. In this work a competitive deduction approach in a heterogeneous data grid environment is proposed using classic data mining and statistical methods. By applying a game theory concept in a multi-agent model, we design a policy for hierarchical knowledge discovery and inference fusion. To demonstrate the system in operation, a sample multi-expert system has also been developed.
NASA Technical Reports Server (NTRS)
Smith, Jeffrey H.
2006-01-01
The need for sufficient quantities of oxygen, water, and fuel resources to support a crew on the surface of Mars presents a critical logistical issue of whether to transport such resources from Earth or manufacture them on Mars. An approach based on the classical Wildcat Drilling Problem of Bayesian decision theory was applied to the problem of finding water in order to compute the expected value of precursor mission sample information. An implicit (required) probability of finding water on Mars was derived from the value of sample information using the expected mass savings of alternative precursor missions.
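The wildcat-style calculation reduces to a standard expected-value-of-sample-information computation: compare the best decision under the prior with the expected best decision after Bayesian updating on the (imperfect) precursor result. A minimal sketch with invented probabilities and mass figures, not mission values:

```python
# Bayesian expected value of sample information (EVSI), wildcat-style.
# All numbers are illustrative stand-ins, not mission values.

p_water = 0.5            # prior probability water is accessible on Mars
mass_isru = 10.0         # landed mass (tons) for ISRU gear needing water
mass_carry = 25.0        # landed mass (tons) to carry everything from Earth
penalty = 40.0           # extra mass if ISRU is flown but water is absent

def expected_mass(p):
    # choose the strategy with the smaller expected landed mass
    m_isru = p * mass_isru + (1 - p) * (mass_isru + penalty)
    return min(m_isru, mass_carry)

# Imperfect precursor test: sensitivity and specificity are assumptions
sens, spec = 0.9, 0.8
p_pos = sens * p_water + (1 - spec) * (1 - p_water)
post_pos = sens * p_water / p_pos                 # Bayes update, positive
post_neg = (1 - sens) * p_water / (1 - p_pos)     # Bayes update, negative

evsi = expected_mass(p_water) - (p_pos * expected_mass(post_pos)
                                 + (1 - p_pos) * expected_mass(post_neg))
print(f"expected mass savings from the precursor sample: {evsi:.2f} tons")
```

The implicit (required) probability of finding water mentioned in the abstract is then the prior at which the decision flips between the two strategies.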
Representational Realism, Closed Theories and the Quantum to Classical Limit
NASA Astrophysics Data System (ADS)
de Ronde, Christian
In this chapter, we discuss the representational realist stance as a pluralist ontic approach to inter-theoretic relationships. Our stance stresses the fact that physical theories require the consideration of a conceptual level of discourse which determines and configures the specific field of phenomena discussed by each particular theory. We criticize the orthodox line of research which has grounded the analysis of QM in two (Bohrian) metaphysical presuppositions - accepted in the present as dogmas that all interpretations must follow. We also examine how the orthodox project of "bridging the gap" between the quantum and the classical domains has constrained the possibilities of research, producing only a limited set of interpretational problems which focus solely on the justification of "classical reality" and exclude the possibility of analyzing non-classical conceptual representations of QM. The representational realist stance introduces two new problems, namely, the superposition problem and the contextuality problem, which explicitly consider the conceptual representation of orthodox QM beyond the mere reference to mathematical structures and measurement outcomes. In the final part of the chapter, we revisit, from the representational realist perspective, the quantum to classical limit and the orthodox claim that this inter-theoretic relation can be explained through the principle of decoherence.
[Discussion on six errors of formulas corresponding to syndromes in using the classic formulas].
Bao, Yan-ju; Hua, Bao-jin
2012-12-01
The theory of formulas corresponding to syndromes is one of the characteristics of the Treatise on Cold Damage and Miscellaneous Diseases (Shanghan Zabing Lun) and one of the main principles in applying classic prescriptions. It is important for achieving a therapeutic effect to follow the principle of formulas corresponding to syndromes. However, some medical practitioners find that the actual clinical effect is far less than expected. Six errors in the use of classic prescriptions under the theory of formulas corresponding to syndromes are the most important causes to be considered: paying attention only to the local syndromes while neglecting the whole; paying attention only to formulas corresponding to syndromes while neglecting the pathogenesis; paying attention only to syndromes while neglecting the pulse diagnosis; paying attention only to a unilateral prescription while neglecting combined prescriptions; paying attention only to classic prescriptions while neglecting modern formulas; and paying attention only to the formulas while neglecting the drug dosage. Therefore, in the clinical application of classic prescriptions and the theory of formulas corresponding to syndromes, it is necessary to consider not only the patients' clinical syndromes but also the combination of the main syndrome and the pathogenesis. In addition, comprehensive syndrome differentiation, modern formulas, current prescriptions, combined prescriptions, and appropriate drug dosage all help to avoid clinical errors and improve clinical effects.
Influences on and Limitations of Classical Test Theory Reliability Estimates.
ERIC Educational Resources Information Center
Arnold, Margery E.
It is incorrect to say "the test is reliable" because reliability is a function not only of the test itself, but of many factors. The present paper explains how different factors affect classical reliability estimates such as test-retest, interrater, internal consistency, and equivalent forms coefficients. Furthermore, the limits of classical test…
A Comparison of Kinetic Energy and Momentum in Special Relativity and Classical Mechanics
ERIC Educational Resources Information Center
Riggs, Peter J.
2016-01-01
Kinetic energy and momentum are indispensable dynamical quantities in both the special theory of relativity and in classical mechanics. Although momentum and kinetic energy are central to understanding dynamics, the differences between their relativistic and classical notions have not always received adequate treatment in undergraduate teaching.…
Designing quantum information processing via structural physical approximation.
Bae, Joonwoo
2017-10-01
In quantum information processing it may be possible to have efficient computation and secure communication beyond the limitations of classical systems. From a fundamental point of view, however, the evolution of quantum systems under the laws of quantum mechanics is more restrictive than that of classical systems, being confined to a specific form of dynamics: unitary transformations and, consequently, positive and completely positive maps on subsystems. This also characterizes classes of disallowed transformations on quantum systems, among which positive but not completely positive maps are of particular interest, as they characterize entangled states, a general resource in quantum information processing. Structural physical approximation offers a systematic way of approximating those non-physical maps, positive but not completely positive maps, with quantum channels. Since it was proposed as a method of detecting entangled states, it has stimulated fundamental problems on the classification of positive maps and the structure of Hermitian operators and quantum states, as well as on quantum measurement, such as quantum designs, in quantum information theory. It has led to efficient and feasible methods of directly detecting entangled states in practice, for which proof-of-principle experimental demonstrations have also been performed with photonic qubit states. Here, we present a comprehensive review of quantum information processing with structural physical approximations and the related progress. The review mainly focuses on properties of structural physical approximations and their applications to practical information processing.
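A hedged sketch of the central construction, in notation of my own choosing rather than the review's: given a positive but not completely positive map Λ on a d-dimensional system, its structural physical approximation mixes Λ with the completely depolarizing channel D(ρ) = Tr[ρ] 1/d,

    \widetilde{\Lambda}_p = (1-p)\,\Lambda + p\,D,

with p taken as the smallest value for which the Choi matrix (id ⊗ Λ̃_p)(|Ω⟩⟨Ω|) is positive semidefinite, so that Λ̃_p becomes a legitimate quantum channel that can be implemented and measured in the laboratory.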
Designing quantum information processing via structural physical approximation
NASA Astrophysics Data System (ADS)
Bae, Joonwoo
2017-10-01
In quantum information processing it may be possible to have efficient computation and secure communication beyond the limitations of classical systems. From a fundamental point of view, however, the evolution of quantum systems under the laws of quantum mechanics is more restrictive than that of classical systems, being confined to a specific form of dynamics: unitary transformations and, consequently, positive and completely positive maps on subsystems. This also characterizes classes of disallowed transformations on quantum systems, among which positive but not completely positive maps are of particular interest, as they characterize entangled states, a general resource in quantum information processing. Structural physical approximation offers a systematic way of approximating those non-physical maps, positive but not completely positive maps, with quantum channels. Since it was proposed as a method of detecting entangled states, it has stimulated fundamental problems on the classification of positive maps and the structure of Hermitian operators and quantum states, as well as on quantum measurement, such as quantum designs, in quantum information theory. It has led to efficient and feasible methods of directly detecting entangled states in practice, for which proof-of-principle experimental demonstrations have also been performed with photonic qubit states. Here, we present a comprehensive review of quantum information processing with structural physical approximations and the related progress. The review mainly focuses on properties of structural physical approximations and their applications to practical information processing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nizami, Lance
2010-03-01
Norwich's Entropy Theory of Perception (1975-present) is a general theory of perception, based on Shannon's Information Theory. Among many bold claims, the Entropy Theory presents a truly astounding result: that Stevens' Law with an Index of 1, an empirical power relation of direct proportionality between perceived taste intensity and stimulus concentration, arises from theory alone. Norwich's theorizing starts with several extraordinary hypotheses. First, 'multiple, parallel receptor-neuron units' without collaterals 'carry essentially the same message to the brain', i.e. the rate-level curves are identical. Second, sensation is proportional to firing rate. Third, firing rate is proportional to the taste receptor's 'resolvable uncertainty'. Fourth, the 'resolvable uncertainty' is obtained from Shannon's Information Theory. Finally, 'resolvable uncertainty' also depends upon the microscopic thermodynamic density fluctuation of the tasted solute. Norwich proves that density fluctuation is density variance, which is proportional to solute concentration, all based on the theory of fluctuations in fluid composition from Tolman's classic physics text, 'The Principles of Statistical Mechanics'. Altogether, according to Norwich, perceived taste intensity is theoretically proportional to solute concentration. Such a universal rule for taste, one that is independent of solute identity, personal physiological differences, and psychophysical task, is truly remarkable and is well-deserving of scrutiny. Norwich's crucial step was the derivation of density variance. That step was meticulously reconstructed here. It transpires that the appropriate fluctuation is Tolman's mean-square fractional density fluctuation, not density variance as used by Norwich. Tolman's algebra yields a 'Stevens Index' of -1 rather than 1. As 'Stevens Index' empirically always exceeds zero, the Index of -1 suggests that it is risky to infer psychophysical laws of sensory response from information theory and stimulus physics while ignoring empirical biological transformations, such as sensory transduction. Indeed, it raises doubts as to whether the Entropy Theory actually describes psychophysical laws at all.
A Comparative Analysis of Three Unique Theories of Organizational Learning
ERIC Educational Resources Information Center
Leavitt, Carol C.
2011-01-01
The purpose of this paper is to present three classical theories on organizational learning and conduct a comparative analysis that highlights their strengths, similarities, and differences. Two of the theories -- experiential learning theory and adaptive-generative learning theory -- represent the thinking of the cognitive perspective, while…
Retinal Connectomics: Towards Complete, Accurate Networks
Marc, Robert E.; Jones, Bryan W.; Watt, Carl B.; Anderson, James R.; Sigulinsky, Crystal; Lauritzen, Scott
2013-01-01
Connectomics is a strategy for mapping complex neural networks based on high-speed automated electron optical imaging, computational assembly of neural data volumes, web-based navigational tools to explore 10^12-10^15 byte (terabyte to petabyte) image volumes, and annotation and markup tools to convert images into rich networks with cellular metadata. These collections of network data and associated metadata, analyzed using tools from graph theory and classification theory, can be merged with classical systems theory, giving a more completely parameterized view of how biologic information processing systems are implemented in retina and brain. Networks have two separable features: topology and connection attributes. The first findings from connectomics strongly validate the idea that the topologies of complete retinal networks are far more complex than the simple schematics that emerged from classical anatomy. In particular, connectomics has permitted an aggressive refactoring of the retinal inner plexiform layer, demonstrating that network function cannot be simply inferred from stratification; exposing the complex geometric rules for inserting different cells into a shared network; revealing unexpected bidirectional signaling pathways between mammalian rod and cone systems; documenting selective feedforward systems, novel candidate signaling architectures, new coupling motifs, and the highly complex architecture of the mammalian AII amacrine cell. This is but the beginning, as the underlying principles of connectomics are readily transferable to non-neural cell complexes and provide new contexts for assessing intercellular communication. PMID:24016532
ERRATUM: Papers published in incorrect sections
NASA Astrophysics Data System (ADS)
2004-04-01
A number of J. Phys. A: Math. Gen. articles have mistakenly been placed in the wrong subject section in recent issues of the journal. We would like to apologize to the authors of these articles for publishing their papers in the Fluid and Plasma Theory section. The correct section for each article is given below.

Statistical Physics
Issue 4: Microcanonical entropy for small magnetizations. Behringer H 2004 J. Phys. A: Math. Gen. 37 1443

Mathematical Physics
Issue 9: On the solution of fractional evolution equations. Kilbas A A, Pierantozzi T, Trujillo J J and Vázquez L 2004 J. Phys. A: Math. Gen. 37 3271

Quantum Mechanics and Quantum Information Theory
Issue 6: New exactly solvable isospectral partners for PT-symmetric potentials. Sinha A and Roy P 2004 J. Phys. A: Math. Gen. 37 2509
Issue 9: Symplectically entangled states and their applications to coding. Vourdas A 2004 J. Phys. A: Math. Gen. 37 3305

Classical and Quantum Field Theory
Issue 6: Pairing of parafermions of order 2: seniority model. Nelson C A 2004 J. Phys. A: Math. Gen. 37 2497
Issue 7: Jordan-Schwinger map, 3D harmonic oscillator constants of motion, and classical and quantum parameters characterizing electromagnetic wave polarization. Mota R D, Xicoténcatl M A and Granados V D 2004 J. Phys. A: Math. Gen. 37 2835
Issue 9: Could only fermions be elementary? Lev F M 2004 J. Phys. A: Math. Gen. 37 3285
NASA Astrophysics Data System (ADS)
Hwang, Jai-Chan; Noh, Hyerim
2005-03-01
We present cosmological perturbation theory based on generalized gravity theories including string theory correction terms and a tachyonic complication. The classical evolution as well as the quantum generation processes in these varieties of gravity theories are presented in unified forms. These apply both to the scalar- and tensor-type perturbations. Analyses are made based on the curvature variable in two different gauge conditions often used in the literature in Einstein’s gravity; these are the curvature variables in the comoving (or uniform-field) gauge and the zero-shear gauge. Applications to generalized slow-roll inflation and its consequent power spectra are derived in unified forms which include a wide range of inflationary scenarios based on Einstein’s gravity and others.
Infinite derivative gravity: non-singular cosmology & blackhole solutions
NASA Astrophysics Data System (ADS)
Mazumdar, A.
Both Einstein's theory of General Relativity and Newton's theory of gravity possess a short-distance and small-time-scale catastrophe. The black hole singularity and the cosmological Big Bang singularity highlight that current theories of gravity are an incomplete description at early times and small distances. I will discuss how one can potentially resolve these fundamental problems at the classical and quantum levels. In particular, I will discuss infinite derivative theories of gravity, where gravitational interactions become weaker in the ultraviolet, thereby resolving some of the classical singularities, such as the Big Bang and the Schwarzschild singularity, for compact non-singular objects with masses up to 10^25 grams. In this lecture, I will discuss quantum aspects of infinite derivative gravity and a few aspects which can make the theory asymptotically free in the UV.
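As a hedged illustration of how such theories evade the short-distance catastrophe (a commonly quoted result for the exponential form factor e^{-□/M²}, assumed here rather than taken from the lecture): the linearized potential of a point mass m becomes

    \Phi(r) = -\frac{Gm}{r}\,\mathrm{erf}\!\left(\frac{Mr}{2}\right)
    \;\xrightarrow{\;r \to 0\;}\; -\frac{GmM}{\sqrt{\pi}},

so the 1/r divergence of Newton's potential is smoothed out below the nonlocality scale 1/M, while the usual behavior is recovered at large r.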
Psychodrama: group psychotherapy through role playing.
Kipper, D A
1992-10-01
The theory and the therapeutic procedure of classical psychodrama are described along with brief illustrations. Classical psychodrama and sociodrama stemmed from role theory, enactments, "tele," the reciprocity of choices, and the theory of spontaneity-robopathy and creativity. The discussion focuses on key concepts such as the therapeutic team, the structure of the session, transference and reality, countertransference, the here-and-now and the encounter, the group-as-a-whole, resistance and difficult clients, and affect and cognition. Also described are the neoclassical approaches of psychodrama, action methods, and clinical role playing, and the significance of the concept of behavioral simulation in group psychotherapy.
Spinning particles, axion radiation, and the classical double copy
NASA Astrophysics Data System (ADS)
Goldberger, Walter D.; Li, Jingping; Prabhu, Siddharth G.
2018-05-01
We extend the perturbative double copy between radiating classical sources in gauge theory and gravity to the case of spinning particles. We construct, to linear order in spins, perturbative radiating solutions to the classical Yang-Mills equations sourced by a set of interacting color charges with chromomagnetic dipole spin couplings. Using a color-to-kinematics replacement rule proposed earlier by one of the authors, these solutions map onto radiation in a theory of interacting particles coupled to massless fields that include the graviton, a scalar (dilaton) ϕ, and the Kalb-Ramond axion field B_{μν}. Consistency of the double copy imposes constraints on the parameters of the theory on both the gauge and gravity sides of the correspondence. In particular, the color charges carry a chromomagnetic interaction which, in d = 4, corresponds to a gyromagnetic ratio equal to Dirac's value g = 2. The color-to-kinematics map implies that on the gravity side, the bulk theory of the fields (ϕ, g_{μν}, B_{μν}) has interactions which match those of d-dimensional "string gravity," as is the case both in the BCJ double copy of pure gauge theory scattering amplitudes and the KLT relations between the tree-level S-matrix elements of open and closed string theory.
Lamb wave extraction of dispersion curves in micro/nano-plates using couple stress theories
NASA Astrophysics Data System (ADS)
Ghodrati, Behnam; Yaghootian, Amin; Ghanbar Zadeh, Afshin; Mohammad-Sedighi, Hamid
2018-01-01
In this paper, Lamb wave propagation in homogeneous and isotropic non-classical micro/nano-plates is investigated. To consider the effect of material microstructure on the wave propagation, three size-dependent models, namely the indeterminate, modified, and consistent couple stress theories, are used to extract the dispersion equations. In the mentioned theories, a parameter called the 'characteristic length' is used to account for the size of the material microstructure in the governing equations. To generalize the parametric studies and examine the effect of thickness, propagation wavelength, and characteristic length on the behavior of miniature plate structures, the governing equations are nondimensionalized by defining appropriate dimensionless parameters. The dispersion curves for phase and group velocities are then plotted over a wide frequency-thickness range to study Lamb wave propagation with microstructure effects at very high frequencies. The results show that the couple stress theories predict more rigidity for Cosserat-type materials than the classical theory does: in a plate of constant thickness, as the ratio of thickness to characteristic length increases, the results approach those of the classical theory, and as this ratio decreases, the wave propagation speed in the plate increases significantly. In addition, it is demonstrated that the velocity of high-frequency Lamb waves converges to the Rayleigh wave velocity.
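For orientation, the classical (size-independent) limit that these couple stress dispersion curves approach is given by the standard Rayleigh-Lamb equations for a plate of half-thickness h (a textbook result, not the paper's generalized relations; exponent +1 for symmetric and -1 for antisymmetric modes):

    \frac{\tan(qh)}{\tan(ph)} = -\left[\frac{4k^{2}pq}{(q^{2}-k^{2})^{2}}\right]^{\pm 1},
    \qquad
    p^{2} = \frac{\omega^{2}}{c_{L}^{2}} - k^{2}, \quad
    q^{2} = \frac{\omega^{2}}{c_{T}^{2}} - k^{2},

where k is the wavenumber and c_L, c_T are the bulk longitudinal and transverse wave speeds.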
D'Ariano, Giacomo Mauro
2018-07-13
Causality has never gained the status of a 'law' or 'principle' in physics. Some recent literature has even popularized the false idea that causality is a notion that should be banned from theory. This misconception relies on an alleged universality of the reversibility of the laws of physics, based either on the determinism of classical theory or on the multiverse interpretation of quantum theory, in both cases motivated by mere interpretational requirements for realism of the theory. Here, I will show that a properly defined, unambiguous notion of causality is a theorem of quantum theory, which is also a falsifiable proposition of the theory. Such a notion of causality appeared in the literature within the framework of operational probabilistic theories. It is a genuinely theoretical notion, corresponding to establishing a definite partial order among events, in the same way as we do by using the future causal cone on Minkowski space. The notion of causality is logically completely independent of the misidentified concept of 'determinism', and, being a consequence of quantum theory, is ubiquitous in physics. In addition, as classical theory can be regarded as a restriction of quantum theory, causality holds also in the classical case, although the determinism of the theory trivializes it. I then conclude by arguing that causality naturally establishes an arrow of time. This implies that the scenario of the 'block Universe' and the connected 'past hypothesis' are incompatible with causality, and thus with quantum theory: they are both doomed to remain mere interpretations and, as such, are not falsifiable, similar to the hypothesis of 'super-determinism'. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).
Quantum Counterfactual Information Transmission Without a Weak Trace
NASA Astrophysics Data System (ADS)
Arvidsson Shukur, David; Barnes, Crispin
The classical theories of communication rely on the assumption that there has to be a flow of particles from Bob to Alice in order for him to send a message to her. We have developed a quantum protocol that allows Alice to perceive Bob's message "counterfactually"; that is, without Alice receiving any particles that have interacted with Bob. By utilising a setup built on results from interaction-free measurements and the quantum Zeno effect, we outline a communication protocol in which the information travels in the opposite direction of the emitted particles. In comparison to previous attempts at such protocols, this one is such that a weak measurement at the message source would not leave a weak trace detectable by Alice's receiver. Whilst some interaction-free schemes require a large number of carefully aligned beam-splitters, our protocol is realisable with two or more beam-splitters. Furthermore, we show that the classical Fisher information Alice obtains about a weak variable in Bob's laboratory is negligible in our scheme. We demonstrate this protocol by numerically solving the time-dependent Schrödinger equation (TDSE) for a Hamiltonian that implements this quantum counterfactual phenomenon.
Soliton Gases and Generalized Hydrodynamics
NASA Astrophysics Data System (ADS)
Doyon, Benjamin; Yoshimura, Takato; Caux, Jean-Sébastien
2018-01-01
We show that the equations of generalized hydrodynamics (GHD), a hydrodynamic theory for integrable quantum systems at the Euler scale, emerge in full generality in a family of classical gases, which generalize the gas of hard rods. In this family, the particles, upon colliding, jump forward or backward by a distance that depends on their velocities, reminiscent of classical soliton scattering. This provides a "molecular dynamics" for GHD: a numerical solver which is efficient, flexible, and which applies to the presence of external force fields. GHD also describes the hydrodynamics of classical soliton gases. We identify the GHD of any quantum model with that of the gas of its solitonlike wave packets, thus providing a remarkable quantum-classical equivalence. The theory is directly applicable, for instance, to integrable quantum chains and to the Lieb-Liniger model realized in cold-atom experiments.
Topological and Orthomodular Modeling of Context in Behavioral Science
NASA Astrophysics Data System (ADS)
Narens, Louis
2017-02-01
Two non-Boolean methods are discussed for modeling context in behavioral data and theory. The first is based on intuitionistic logic, which is similar to classical logic except that not every event has a complement. Its probability theory is also similar to classical probability theory, except that the definition of a probability function needs to be generalized to unions of events instead of applying only to unions of disjoint events. The generalization is needed because intuitionistic event spaces may not contain enough disjoint events for the classical definition to be effective. The second method develops a version of quantum logic for its underlying probability theory. It differs from the Hilbert space logic used in quantum mechanics as a foundation for quantum probability theory in a variety of ways. John von Neumann and others have commented on the lack of a relative frequency approach and a rational foundation for this probability theory. This article argues that its version of quantum probability theory does not have such issues. The method based on intuitionistic logic is useful for modeling cognitive interpretations that vary with context, for example, the mood of the decision maker, the context produced by the influence of other items in a choice experiment, etc. The method based on this article's quantum logic is useful for modeling probabilities across contexts, for example, how probabilities of events from different experiments are related.
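As a hedged one-line rendering of the generalization mentioned above (my notation): instead of demanding additivity only on disjoint events, the intuitionistic probability function is constrained on arbitrary unions through the modular law,

    P(A \cup B) = P(A) + P(B) - P(A \cap B) \quad \text{for all events } A, B,

which reduces to the familiar additivity axiom whenever A ∩ B = ∅ and remains usable in event spaces that are short of disjoint pairs.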
A classical density functional theory of ionic liquids.
Forsman, Jan; Woodward, Clifford E; Trulsson, Martin
2011-04-28
We present a simple, classical density functional approach to the study of simple models of room temperature ionic liquids. Dispersion attractions as well as ion correlation effects and excluded volume packing are taken into account. The oligomeric structure, common to many ionic liquid molecules, is handled by a polymer density functional treatment. The theory is evaluated by comparisons with simulations, with an emphasis on the differential capacitance, an experimentally measurable quantity of significant practical interest.
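A hedged sketch of the generic density functional step underlying such a study (the specific free energy functional, with its dispersion, correlation, packing, and polymer terms, is the paper's own and is not reproduced here): the equilibrium ion densities minimize the grand potential,

    \Omega[\{\rho_i\}] = F[\{\rho_i\}]
      + \sum_i \int \rho_i(\mathbf{r})\left[V_i^{\mathrm{ext}}(\mathbf{r}) - \mu_i\right] d\mathbf{r},
    \qquad
    \frac{\delta\Omega}{\delta\rho_i(\mathbf{r})} = 0,

and quantities such as the differential capacitance follow from how the resulting charge profiles respond to the electrode potential.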
Generalized quantum theory of recollapsing homogeneous cosmologies
NASA Astrophysics Data System (ADS)
Craig, David; Hartle, James B.
2004-06-01
A sum-over-histories generalized quantum theory is developed for homogeneous minisuperspace type A Bianchi cosmological models, focusing on the particular example of the classically recollapsing Bianchi type-IX universe. The decoherence functional for such universes is exhibited. We show how the probabilities of decoherent sets of alternative, coarse-grained histories of these model universes can be calculated. We consider in particular the probabilities for classical evolution defined by a suitable coarse graining. For a restricted class of initial conditions and coarse grainings we exhibit the approximate decoherence of alternative histories in which the universe behaves classically and those in which it does not. For these situations we show that the probability is near unity for the universe to recontract classically if it expands classically. We also determine the relative probabilities of quasiclassical trajectories for initial states of WKB form, recovering for such states a precise form of the familiar heuristic "∫J·dΣ" rule of quantum cosmology, as well as a generalization of this rule to generic initial states.
DNA Methylation and Sex Allocation in the Parasitoid Wasp Nasonia vitripennis.
Cook, Nicola; Pannebakker, Bart A; Tauber, Eran; Shuker, David M
2015-10-01
The role of epigenetics in the control and evolution of behavior is being increasingly recognized. Here we test whether DNA methylation influences patterns of adaptive sex allocation in the parasitoid wasp Nasonia vitripennis. Female N. vitripennis allocate offspring sex broadly in line with local mate competition (LMC) theory. However, recent theory has highlighted how genomic conflict may influence sex allocation under LMC, conflict that requires parent-of-origin information to be retained by alleles through some form of epigenetic signal. We manipulated whole-genome DNA methylation in N. vitripennis females using the hypomethylating agent 5-aza-2'-deoxycytidine. Across two replicated experiments, we show that disruption of DNA methylation does not ablate the facultative sex allocation response of females, as sex ratios still vary with cofoundress number as in the classical theory. However, sex ratios are generally shifted upward when DNA methylation is disrupted. Our data are consistent with predictions from genomic conflict over sex allocation theory and suggest that sex ratios may be closer to the optimum for maternally inherited alleles.
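For context, the classical local mate competition prediction that such sex ratios track is Hamilton's 1967 result for n foundresses (the diploid form; haplodiploidy, as in Nasonia, shifts it slightly):

    s^{*} = \frac{n-1}{2n},

where s* is the fraction of sons, so a single foundress (n = 1) should produce an almost entirely female brood, with the sex ratio rising toward 1/2 as cofoundress number grows.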
The problems in quantum foundations in the light of gauge theories
NASA Astrophysics Data System (ADS)
Ne'Eman, Yuval
1986-04-01
We review the issues of nonseparability and seemingly acausal propagation of information in EPR, as displayed by experiments and the failure of Bell's inequalities. We show that global effects are in the very nature of the geometric structure of modern physical theories, occurring even at the classical level. The Aharonov-Bohm effect, magnetic monopoles, instantons, etc. result from the topology and homotopy features of the fiber bundle manifolds of gauge theories. The conservation of probabilities, a supposedly highly quantum effect, is also achieved through global geometry equations. The EPR observables all fit in such geometries, and space-time is a truncated representation and is not the correct arena for their understanding. Relativistic quantum field theory represents the global action of the measurement operators as the zero-momentum (and therefore spatially infinitely spread) limit of their wave functions (form factors). We also analyze the collapse of the state vector as a case of spontaneous symmetry breakdown in the apparatus-observed state interaction.
Exploring the joint measurability using an information-theoretic approach
NASA Astrophysics Data System (ADS)
Hsu, Li-Yi
2016-12-01
We explore the allowed ("legal") purity parameters for joint measurements. Instead of directly unsharpening the measurements, we perform quantum cloning before the sharp measurements. The necessary fuzziness in the unsharp measurements is equivalently introduced in the imperfect cloning process. Based on information causality and the consequent noisy nonlocal computation, one can derive information-theoretic quadratic inequalities that must be satisfied by any physical theory. On the other hand, to guarantee classicality, the linear Bell-type inequalities deduced from these quadratic ones must be obeyed. As for joint measurability, the purity parameters must be chosen to obey both types of inequalities. Finally, the quadratic inequalities for purity parameters in the joint measurability region are derived.
Gonoskov, I A; Tsatrafyllis, N; Kominis, I K; Tzallas, P
2016-09-07
We analytically describe the strong-field light-electron interaction using a quantized coherent laser state with arbitrary photon number. We obtain a light-electron wave function which is a closed-form solution of the time-dependent Schrödinger equation (TDSE). This wave function provides information about the quantum optical features of the interaction not accessible by semi-classical theories. With this approach we can reveal the quantum optical properties of high harmonic generation (HHG) process in gases by measuring the photon statistics of the transmitted infrared (IR) laser radiation. This work can lead to novel experiments in high-resolution spectroscopy in extreme-ultraviolet (XUV) and attosecond science without the need to measure the XUV light, while it can pave the way for the development of intense non-classical light sources.
NASA Astrophysics Data System (ADS)
Tanona, Scott Daniel
I develop a new analysis of Niels Bohr's Copenhagen interpretation of quantum mechanics by examining the development of his views from his earlier use of the correspondence principle in the so-called 'old quantum theory' to his articulation of the idea of complementarity in the context of the novel mathematical formalism of quantum mechanics. I argue that Bohr was motivated not by controversial and perhaps dispensable epistemological ideas---positivism or neo-Kantianism, for example---but by his own unique perspective on the difficulties of creating a new working physics of the internal structure of the atom. Bohr's use of the correspondence principle in the old quantum theory was associated with an empirical methodology that used this principle as an epistemological bridge to connect empirical phenomena with quantum models. The application of the correspondence principle required that one determine the validity of the idealizations and approximations necessary for the judicious use of classical physics within quantum theory. Bohr's interpretation of the new quantum mechanics then focused on the largely unexamined ways in which the developing abstract mathematical formalism is given empirical content by precisely this process of approximation. Significant consistency between his later interpretive framework and his forms of argument with the correspondence principle indicates that complementarity is best understood as a relationship among the various approximations and idealizations that must be made when one connects otherwise meaningless quantum mechanical symbols to empirical situations or 'experimental arrangements' described using concepts from classical physics. We discover that this relationship is unavoidable not through any sort of a priori analysis of the priority of classical concepts, but because quantum mechanics incorporates the correspondence approach in the way in which it represents quantum properties with matrices of transition probabilities, the empirical meaning of which depends on the situation but in general is tied to the correspondence connection to the spectra. For Bohr, it is then the commutation relations, which arise from the formalism, that inform us of the complementary nature of this approximate representation of quantum properties via the classical equations through which we connect them to experiments.
Adaptive neural coding: from biological to behavioral decision-making
Louie, Kenway; Glimcher, Paul W.; Webb, Ryan
2015-01-01
Empirical decision-making in diverse species deviates from the predictions of normative choice theory, but why such suboptimal behavior occurs is unknown. Here, we propose that deviations from optimality arise from biological decision mechanisms that have evolved to maximize choice performance within intrinsic biophysical constraints. Sensory processing utilizes specific computations such as divisive normalization to maximize information coding in constrained neural circuits, and recent evidence suggests that analogous computations operate in decision-related brain areas. These adaptive computations implement a relative value code that may explain the characteristic context-dependent nature of behavioral violations of classical normative theory. Examining decision-making at the computational level thus provides a crucial link between the architecture of biological decision circuits and the form of empirical choice behavior. PMID:26722666
Bounds on quantum communication via Newtonian gravity
NASA Astrophysics Data System (ADS)
Kafri, D.; Milburn, G. J.; Taylor, J. M.
2015-01-01
Newtonian gravity yields specific observable consequences, the most striking of which is the emergence of a 1/r^2 force. In so far as communication can arise via such interactions between distant particles, we can ask what would be expected for a theory of gravity that only allows classical communication. Many heuristic suggestions for gravity-induced decoherence have this restriction implicitly or explicitly in their construction. Here we show that communication via a 1/r^2 force has a minimum noise induced in the system when the communication cannot convey quantum information, in a continuous-time analogue to Bell's inequalities. Our derived noise bounds provide tight constraints from current experimental results on any theory of gravity that does not allow quantum communication.
Quantum-Like Model for Decision Making Process in Two Players Game. A Non-Kolmogorovian Model
NASA Astrophysics Data System (ADS)
Asano, Masanari; Ohya, Masanori; Khrennikov, Andrei
2011-03-01
In game experiments, players frequently make choices which are regarded as irrational in game theory. In papers of Khrennikov (Information Dynamics in Cognitive, Psychological and Anomalous Phenomena. Fundamental Theories of Physics, Kluwer Academic, Norwell, 2004; Fuzzy Sets Syst. 155:4-17, 2005; Biosystems 84:225-241, 2006; Found. Phys. 35(10):1655-1693, 2005; in QP-PQ Quantum Probability and White Noise Analysis, vol. XXIV, pp. 105-117, 2009), it was pointed out that statistics collected in such experiments have "quantum-like" properties, which cannot be explained by classical probability theory. In this paper, we design a simple quantum-like model describing the decision-making process in a two-player game and try to explain a mechanism behind the irrational behavior of players. Finally, we discuss a mathematical framework for non-Kolmogorovian systems in terms of liftings (Accardi and Ohya, in Appl. Math. Optim. 39:33-59, 1999).
Profile-likelihood Confidence Intervals in Item Response Theory Models.
Chalmers, R Philip; Pek, Jolynn; Liu, Yang
2017-01-01
Confidence intervals (CIs) are fundamental inferential devices which quantify the sampling variability of parameter estimates. In item response theory, CIs have been primarily obtained from large-sample Wald-type approaches based on standard error estimates, derived from the observed or expected information matrix, after parameters have been estimated via maximum likelihood. An alternative approach to constructing CIs is to quantify sampling variability directly from the likelihood function with a technique known as profile-likelihood confidence intervals (PL CIs). In this article, we introduce PL CIs for item response theory models, compare PL CIs to classical large-sample Wald-type CIs, and demonstrate important distinctions among these CIs. CIs are then constructed for parameters directly estimated in the specified model and for transformed parameters which are often obtained post-estimation. Monte Carlo simulation results suggest that PL CIs perform consistently better than Wald-type CIs for both non-transformed and transformed parameters.
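A minimal sketch of the profile likelihood CI recipe for a scalar parameter (generic, not the article's IRT-specific implementation; all names are illustrative):

    import numpy as np
    from scipy.stats import chi2
    from scipy.optimize import brentq

    def profile_likelihood_ci(loglik, theta_hat, level=0.95, width=5.0):
        """PL CI: the set of theta not rejected by a likelihood-ratio test.

        loglik    : theta -> log-likelihood, already profiled (maximized)
                    over any nuisance parameters.
        theta_hat : maximum-likelihood estimate of theta.
        """
        cutoff = chi2.ppf(level, df=1) / 2.0       # LR cutoff on the log scale
        ll_max = loglik(theta_hat)
        g = lambda t: ll_max - loglik(t) - cutoff  # sign change at the CI ends
        lower = brentq(g, theta_hat - width, theta_hat)
        upper = brentq(g, theta_hat, theta_hat + width)
        return lower, upper

    # Example: normal mean with known unit variance, n = 20 observations.
    x = np.random.default_rng(0).normal(1.0, 1.0, size=20)
    loglik = lambda mu: -0.5 * np.sum((x - mu) ** 2)
    print(profile_likelihood_ci(loglik, x.mean()))

Unlike a Wald interval, the endpoints need not be symmetric about the estimate, which is part of what makes the approach attractive for transformed parameters.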
Trumpp, Natalie M; Traub, Felix; Pulvermüller, Friedemann; Kiefer, Markus
2014-02-01
Classical theories of semantic memory assume that concepts are represented in a unitary amodal memory system. In challenging this classical view, pure or hybrid modality-specific theories propose that conceptual representations are grounded in the sensory-motor brain areas, which typically process sensory and action-related information. Although neuroimaging studies provided evidence for a functional-anatomical link between conceptual processing of sensory or action-related features and the sensory-motor brain systems, it has been argued that aspects of such sensory-motor activation may not directly reflect conceptual processing but rather strategic imagery or postconceptual elaboration. In the present ERP study, we investigated masked effects of acoustic and action-related conceptual features to probe unconscious automatic conceptual processing in isolation. Subliminal feature-specific ERP effects at frontocentral electrodes were observed, which differed with regard to polarity, topography, and underlying brain electrical sources in congruency with earlier findings under conscious viewing conditions. These findings suggest that conceptual acoustic and action representations can also be unconsciously accessed, thereby excluding any postconceptual strategic processes. This study therefore further substantiates a grounding of conceptual and semantic processing in action and perception.
NASA Astrophysics Data System (ADS)
Huyskens, P.; Kapuku, F.; Colemonts-Vandevyvere, C.
1990-09-01
In liquids the partners of H bonds constantly change. As a consequence, the entities observed by IR spectroscopy are not the same as those considered for thermodynamic properties. For the latter, the H-bonds are shared by all the molecules. The thermodynamic "monomeric fraction", γ, the time fraction during which an alcohol molecule is vaporizable, is the square root of the spectroscopic monomeric fraction, and is the fraction of molecules which, during a time interval of 10^-14 s, have their hydroxylic proton and their lone pairs free. The classical thermodynamic treatments of Mecke and Prigogine consider the spectroscopic entities as real thermodynamic entities. Opposed to this, the mobile order theory considers all the formal molecules as equal but with a reduction of the entropy due to the fact that during a fraction 1-γ of the time, the OH proton follows a neighbouring oxygen atom on its journey through the liquid. Mobile order theory and the classic multicomponent treatment lead, in binary mixtures of the associated substance A with the inert substance S, to expressions for the chemical potentials μ_A and μ_S that are fundamentally different. However, the differences become very important only when the molar volumes V̄_S and V̄_A differ by a factor larger than 2. As a consequence, the equations of the classic theory can still fit the experimental vapour pressure data of mixtures of liquid alcohols and liquid alkanes. However, the solubilities of solid alkanes in water, for which V̄_S > 3 V̄_A, are only correctly predicted by the mobile order theory.
From Foucault to Freire through Facebook: Toward an Integrated Theory of mHealth
ERIC Educational Resources Information Center
Bull, Sheana; Ezeanochie, Nnamdi
2016-01-01
Objective: To document the integration of social science theory in literature on mHealth (mobile health) and consider opportunities for integration of classic theory, health communication theory, and social networking to generate a relevant theory for mHealth program design. Method: A secondary review of research syntheses and meta-analyses…
Štys, Dalibor; Urban, Jan; Vaněk, Jan; Císař, Petr
2011-06-01
We report objective analysis of information in the microscopic image of the cell monolayer. The process of transfer of information about the cell by the microscope is analyzed in terms of the classical Shannon information transfer scheme. The information source is the biological object; the information transfer channel is the whole microscope including the camera chip. The destination is the model of the biological system. The information contribution is analyzed as the information carried by a point relative to the overall information in the image. Subsequently we obtain an information reflection of the biological object. This is transformed in the biological model which, in information terminology, is the destination. This, we propose, should be constructed as state transitions in individual cells modulated by information bonds between the cells. We show examples of detected cell states in multidimensional state space. This space is reflected as a colour channel intensity phenomenological state space. We have also observed information bonds and show examples of them.
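A minimal sketch of the "information carried by a point" idea in plain Shannon terms (an interpretive illustration under assumed details, not the authors' actual pipeline):

    import numpy as np

    def point_information(image, bins=256):
        """Per-pixel surprisal, -log2 p(intensity), as a crude estimate of
        the information each point contributes to the image."""
        hist, _ = np.histogram(image, bins=bins, range=(0, bins))
        p = hist / hist.sum()                  # empirical intensity distribution
        idx = np.clip(image.astype(int), 0, bins - 1)
        with np.errstate(divide='ignore'):
            return -np.log2(p[idx])            # bits carried by each pixel

    # Example on a random 8-bit "micrograph".
    img = np.random.default_rng(1).integers(0, 256, size=(64, 64))
    print(point_information(img).mean(), "bits per pixel on average")

Rare intensities score high under this measure, so pixels that depart from the background dominate the image's information content.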
Stys, Dalibor; Urban, Jan; Vanek, Jan; Císar, Petr
2010-07-01
We report objective analysis of information in the microscopic image of the cell monolayer. The process of transfer of information about the cell by the microscope is analyzed in terms of the classical Shannon information transfer scheme. The information source is the biological object; the information transfer channel is the whole microscope including the camera chip. The destination is the model of the biological system. The information contribution is analyzed as the information carried by a point relative to the overall information in the image. Subsequently we obtain an information reflection of the biological object. This is transformed in the biological model which, in information terminology, is the destination. This, we propose, should be constructed as state transitions in individual cells modulated by information bonds between the cells. We show examples of detected cell states in multidimensional state space, reflected as a colour channel intensity phenomenological state space. We have also observed information bonds and show examples of them. Copyright 2010 Elsevier Ltd. All rights reserved.
Generalizability Theory and Classical Test Theory
ERIC Educational Resources Information Center
Brennan, Robert L.
2011-01-01
Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…
Theories of the Alcoholic Personality.
ERIC Educational Resources Information Center
Cox, W. Miles
Several theories of the alcoholic personality have been devised to determine the relationship between the clusters of personality characteristics of alcoholics and their abuse of alcohol. The oldest and probably best known theory is the dependency theory, formulated in the tradition of classical psychoanalysis, which associates the alcoholic's…
The Giffen Effect: A Note on Economic Purposes.
ERIC Educational Resources Information Center
Williams, William D.
1990-01-01
Describes the Giffen effect: demand for a commodity increases as price increases. Explains how applying control theory eliminates the paradox that the Giffen effect presents to classic economics supply and demand theory. Notes the differences in how conventional demand theory and control theory treat consumer behavior. (CH)
Personality Theories for the 21st Century
ERIC Educational Resources Information Center
McCrae, Robert R.
2011-01-01
Classic personality theories, although intriguing, are outdated. The five-factor model of personality traits reinvigorated personality research, and the resulting findings spurred a new generation of personality theories. These theories assign a central place to traits and acknowledge the crucial role of evolved biology in shaping human…
Continuous Time in Consistent Histories
NASA Astrophysics Data System (ADS)
Savvidou, Konstantina
1999-12-01
We discuss the case of histories labelled by a continuous time parameter in the History Projection Operator consistent-histories quantum theory. We describe how the appropriate representation of the history algebra may be chosen by requiring the existence of projection operators that represent propositions about time averages of the energy. We define the action operator for the consistent histories formalism, as the quantum analogue of the classical action functional, for the simple harmonic oscillator case. We show that the action operator is the generator of two types of time transformations that may be related to the two laws of time evolution of standard quantum theory: the 'state-vector reduction' and the unitary time evolution. We construct the corresponding classical histories and demonstrate their relevance to the quantum histories; we also demonstrate how the requirement of a temporal logic structure for the theory suffices for the definition of classical histories. Furthermore, we show the relation of the action operator to the decoherence functional which describes the dynamics of the system. Finally, the discussion is extended to give a preliminary account of quantum field theory in this approach to the consistent histories formalism.
Effects of Extrinsic Mortality on the Evolution of Aging: A Stochastic Modeling Approach
Shokhirev, Maxim Nikolaievich; Johnson, Adiv Adam
2014-01-01
The evolutionary theories of aging are useful for gaining insights into the complex mechanisms underlying senescence. Classical theories argue that high levels of extrinsic mortality should select for the evolution of shorter lifespans and earlier peak fertility. Non-classical theories, in contrast, posit that an increase in extrinsic mortality could select for the evolution of longer lifespans. Although numerous studies support the classical paradigm, recent data challenge classical predictions, finding that high extrinsic mortality can select for the evolution of longer lifespans. To further elucidate the role of extrinsic mortality in the evolution of aging, we implemented a stochastic, agent-based, computational model. We used a simulated annealing optimization approach to predict which model parameters predispose populations to evolve longer or shorter lifespans in response to increased levels of predation. We report that longer lifespans evolved in the presence of rising predation if the cost of mating is relatively high and if energy is available in excess. Conversely, we found that dramatically shorter lifespans evolved when mating costs were relatively low and food was relatively scarce. We also analyzed the effects of increased predation on various parameters related to density dependence and energy allocation. Longer and shorter lifespans were accompanied by increased and decreased investments of energy into somatic maintenance, respectively. Similarly, earlier and later maturation ages were accompanied by increased and decreased energetic investments into early fecundity, respectively. Higher predation significantly decreased the total population size, enlarged the shared resource pool, and redistributed energy reserves for mature individuals. These results both corroborate and refine classical predictions, demonstrating a population-level trade-off between longevity and fecundity and identifying conditions that produce both classical and non-classical lifespan effects. PMID:24466165
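A heavily simplified sketch of the kind of agent-based loop such models run (all parameters, the mutation scheme, and the hazard form are illustrative assumptions, not the study's implementation):

    import random

    def simulate(generations=200, pop_size=500, predation=0.05, mut_rate=0.05):
        """Toy evolution of a heritable 'lifespan' gene under extrinsic
        (predation) and intrinsic (age-dependent) mortality."""
        pop = [{'age': 0, 'lifespan': random.randint(5, 50)} for _ in range(pop_size)]
        for _ in range(generations):
            survivors = []
            for agent in pop:
                agent['age'] += 1
                hazard = agent['age'] / agent['lifespan']   # senescence hazard
                if random.random() > predation and random.random() > hazard:
                    survivors.append(agent)
            # Refill the population by cloning survivors with mutation.
            pop = survivors[:]
            while survivors and len(pop) < pop_size:
                parent = random.choice(survivors)
                span = parent['lifespan']
                if random.random() < mut_rate:
                    span = max(1, span + random.choice([-1, 1]))
                pop.append({'age': 0, 'lifespan': span})
        return sum(a['lifespan'] for a in pop) / len(pop) if pop else float('nan')

    # Mean evolved lifespan under low vs. high extrinsic mortality.
    print(simulate(predation=0.02), simulate(predation=0.20))

This bare version omits the mating costs and energy budgets that the study identifies as the levers separating classical from non-classical outcomes, which is precisely why those terms matter.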
Assessing the quantum physics impacts on future x-ray free-electron lasers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmitt, Mark J.; Anisimov, Petr Mikhaylovich
A new quantum mechanical theory of x-ray free electron lasers (XFELs) has been successfully developed that has placed LANL at the forefront of the understanding of quantum effects in XFELs. Our quantum theory describes the interaction of relativistic electrons with x-ray radiation in the periodic magnetic field of an undulator using the same mathematical formalism as classical XFEL theory. This places classical and quantum treatments on the same footing and allows for a continuous transition from one regime to the other, eliminating the disparate analytical approaches previously used. Moreover, Dr. Anisimov, the architect of this new theory, is now considered a resource in the international FEL community for assessing quantum effects in XFELs.
Quantum-like model of unconscious–conscious dynamics
Khrennikov, Andrei
2015-01-01
We present a quantum-like model of sensation–perception dynamics (originating in Helmholtz's theory of unconscious inference) based on the theory of quantum apparatuses and instruments. We illustrate our approach with the model of bistable perception of a particular ambiguous figure, the Schröder stair. This is a concrete model for unconscious and conscious processing of information and their interaction. The starting point of our quantum-like journey was the observation that perception dynamics is essentially contextual, which implies the impossibility of (straightforward) embedding of experimental statistical data in the classical (Kolmogorov, 1933) framework of probability theory. This motivates the application of nonclassical probabilistic schemes. The quantum formalism provides a variety of well-approved and mathematically elegant probabilistic schemes for handling the results of measurements. The theory of quantum apparatuses and instruments is the most general quantum scheme describing measurements, and it is natural to explore it to model the sensation–perception dynamics. In particular, this theory provides the scheme of indirect quantum measurements, which we apply to model unconscious inference leading to the transition from sensations to perceptions. PMID:26283979
Non-Equilibrium Turbulence and Two-Equation Modeling
NASA Technical Reports Server (NTRS)
Rubinstein, Robert
2011-01-01
Two-equation turbulence models are analyzed from the perspective of spectral closure theories. Kolmogorov theory provides useful information for models, but it is limited to equilibrium conditions in which the energy spectrum has relaxed to a steady state consistent with the forcing at large scales; it does not describe transient evolution between such states. Transient evolution is necessarily through nonequilibrium states, which can only be found from a theory of turbulence evolution, such as one provided by a spectral closure. When the departure from equilibrium is small, perturbation theory can be used to approximate the evolution by a two-equation model. The perturbation theory also gives explicit conditions under which this model can be valid, and when it will fail. Implications of the non-equilibrium corrections for the classic Tennekes-Lumley balance in the dissipation rate equation are drawn: it is possible to establish both the cancellation of the leading-order Re^{1/2}-divergent contributions to vortex stretching and enstrophy destruction, and the existence of a nonzero difference which is finite in the limit of infinite Reynolds number.
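For reference, the equilibrium two-equation model whose non-equilibrium corrections are at issue is the standard k-epsilon pair, shown here in its homogeneous form (the conventional textbook equations, not the corrected model derived in the paper):

    \frac{dk}{dt} = P - \varepsilon,
    \qquad
    \frac{d\varepsilon}{dt} = \frac{\varepsilon}{k}\left(C_{\varepsilon 1} P - C_{\varepsilon 2}\,\varepsilon\right),

where k is the turbulent kinetic energy, ε its dissipation rate, P the production, and C_{ε1}, C_{ε2} model constants calibrated under the equilibrium spectral assumptions discussed above.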
NASA Astrophysics Data System (ADS)
McCarthy, Kimberly Ann
1990-01-01
Divisions in definitions of creativity have centered primarily on the working definition of discontinuity and the inclusion of intrinsic features such as unconscious processing and intrinsic motivation and reinforcement. These differences generally result from Cohen's two world views underlying theories of creativity: organismic, oriented toward holism; or mechanistic, oriented toward cause-effect reductionism. A quantum world view is proposed which theoretically and empirically unifies the organismic and mechanistic elements of creativity. Based on Goswami's Idealistic Interpretation of quantum physics, the quantum view postulates the mind-brain as consisting of both classical and quantum structures and functions. The quantum domain accesses the transcendent order through coherent superpositions (a state of potentialities), while the classical domain performs the function of a measuring apparatus by amplifying and recording the result of the collapse of the pure mental state. A theoretical experiment, based on the 1980 Marcel study of conscious and unconscious word-sense disambiguation, is conducted which compares the predictions of the quantum model with those of the 1975 Posner and Snyder Facilitation and Inhibition model. Each model agrees that while conscious access to information is limited, unconscious access is unlimited. However, each model differently defines the connection between these states: the Posner model postulates a central processing mechanism while the quantum model postulates a self-referential consciousness. Consequently, the two models predict differently. The strength of the quantum model lies in its ability to distinguish between classical and quantum definitions of discontinuity, as well as clarifying the function of consciousness, without added assumptions or ad-hoc analysis: consciousness is an essential, valid feature of quantum mechanisms independent of the field of cognitive psychology. According to the quantum model, through a cycle of conscious and unconscious processing, various contexts are accessed, specifically, coherent superposition states and the removal of the subject-object dichotomy in unconscious processing. Coupled with a high tolerance for ambiguity, the individual has access not only to an increased quantity of information, but is exposed to this information in the absence of a self-referential or biased context, the result of which is an increase in creative behavior.
An Innovative Thinking-Based Intelligent Information Fusion Algorithm
Hu, Liang; Liu, Gang; Zhou, Jin
2013-01-01
This study proposes an intelligent algorithm that can realize information fusion with reference to research achievements in brain cognitive theory and innovative computation. This algorithm treats knowledge as the core and information fusion as a knowledge-based innovative thinking process. Furthermore, the five key parts of this algorithm, including information sense and perception, memory storage, divergent thinking, convergent thinking, and an evaluation system, are simulated and modeled. This algorithm fully develops the innovative thinking capability of knowledge in information fusion and attempts to convert the abstract concepts of brain cognitive science into specific and operable research routes and strategies. Furthermore, the influences of each parameter of this algorithm on algorithm performance are analyzed and compared with those of classical intelligent algorithms through tests. Test results suggest that the algorithm proposed in this study can obtain the optimum problem solution with fewer target evaluations, improve optimization effectiveness, and achieve the effective fusion of information. PMID:23956699
An innovative thinking-based intelligent information fusion algorithm.
Lu, Huimin; Hu, Liang; Liu, Gang; Zhou, Jin
2013-01-01
This study proposes an intelligent algorithm that can realize information fusion with reference to research achievements in brain cognitive theory and innovative computation. This algorithm treats knowledge as the core and information fusion as a knowledge-based innovative thinking process. Furthermore, the five key parts of this algorithm, including information sense and perception, memory storage, divergent thinking, convergent thinking, and an evaluation system, are simulated and modeled. This algorithm fully develops the innovative thinking capability of knowledge in information fusion and attempts to convert the abstract concepts of brain cognitive science into specific and operable research routes and strategies. Furthermore, the influences of each parameter of this algorithm on algorithm performance are analyzed and compared with those of classical intelligent algorithms through tests. Test results suggest that the algorithm proposed in this study can obtain the optimum problem solution with fewer target evaluations, improve optimization effectiveness, and achieve the effective fusion of information.
Brier, Søren
2017-12-01
Charles S. Peirce developed a process philosophy featuring a non-theistic agapistic evolution from nothingness. It is an Eastern-inspired alternative to the Western mechanical ontology of classical science, also inspired by the American transcendentalists. Advaitism and Buddhism are the two most important Eastern philosophical traditions that encompass scientific knowledge and the idea of spontaneous evolutionary development. This article attempts to show how Peirce's non-mechanistic triadic semiotic process theory is better suited to embrace the quantum field view than mechanistic and information-based views are with regard to a theory of the emergence of consciousness. Peirce views the universe as a reasoning process developing from pure potentiality to the fully ordered rational Summum Bonum. The paper compares this with John Archibald Wheeler's "It from bit" cosmogony based on quantum information science, which leads to the info-computational view of nature, mind and culture. However, this theory lacks a phenomenological foundation. David Chalmers' double-aspect interpretation of information attempts to overcome the limitations of the info-computational view. Chalmers supplements Batesonian and Wheelerian info-computationalism, both of which lack a phenomenological aspect, with a dimension that corresponds to the phenomenological aspect of reality. However, he does not manage to produce an integrated theory of the development of meaning and rationality. Alex Hankey's further work goes some way towards establishing a theory that can satisfy Husserl's criteria for consciousness, such as a sense of being and time, but Hankey's dependence on Chalmers' theory is still not able to account for the connection between core consciousness and the physical world. Copyright © 2017 Elsevier Ltd. All rights reserved.
Pahlavan, Farzaneh
2008-01-01
In recent decades, researchers in various areas of psychology have challenged the claims of a single mode of information processing and developed dual-process models of social behaviors. Although these theories differ on a number of dimensions, they all share the basic assumption that two different modes of information processing operate in decision making and coping behavior. In essence, the common distinction in these perspectives is between controlled vs. automatic, conscious vs. unconscious, and affective vs. cognitive modes of processing. The purpose of Berkowitz's article is to go beyond the notion of automatic processes in order to use classic notions of conditioning and displacement to explain aggressive behavior. I assert that an explanatory framework for the psychology of aggression must be anchored not only in new but also in classic theoretical paradigms. However, progress in psychology does not rest solely on the accumulation of theoretical insights. It demands a large body of empirical facts, with attention to incongruities, discordances, and conceptual clarifications. Copyright 2008 Wiley-Liss, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Somnath, Suhas; Collins, Liam; Matheson, Michael A.
We develop and implement a multifrequency spectroscopy and spectroscopic imaging mode, referred to as general dynamic mode (GDM), that captures the complete spatially and stimulus-dependent information on nonlinear cantilever dynamics in scanning probe microscopy (SPM). GDM acquires the cantilever response including harmonics and mode-mixing products across the entire broadband cantilever spectrum as a function of excitation frequency. GDM spectra substitute the classical measurements in SPM, e.g. amplitude and phase in lock-in detection. Here, GDM is used to investigate the response of a purely capacitively driven cantilever. We use information theory techniques to mine the data and verify the findings with governing equations and classical lock-in based approaches. We explore the dependence of the cantilever dynamics on the tip–sample distance and the AC and DC driving bias. This approach can be applied to investigate the dynamic behavior of other systems within and beyond dynamic SPM. In conclusion, GDM is expected to be useful for separating the contributions of different physical phenomena in the cantilever response and understanding the role of cantilever dynamics in dynamic AFM techniques.
A diffusion of innovations model of physician order entry.
Ash, J S; Lyman, J; Carpenter, J; Fournier, L
2001-01-01
To interpret the results of a cross-site study of physician order entry (POE) in hospitals using a diffusion of innovations theory framework. Qualitative study using observation, focus groups, and interviews. Data were analyzed by an interdisciplinary team of researchers using a grounded approach to identify themes. Themes were then interpreted using classical Diffusion of Innovations (DOI) theory as described by Rogers [1]. Four high level themes were identified: organizational issues; clinical and professional issues; technology implementation issues; and issues related to the organization of information and knowledge. Further analysis using the DOI framework indicated that POE is an especially complex information technology innovation when one considers communication, time, and social system issues in addition to attributes of the innovation itself. Implementation strategies for POE should be designed to account for its complex nature. The ideal would be a system that is both customizable and integrated with other parts of the information system, is implemented with maximum involvement of users and high levels of support, and is surrounded by an atmosphere of trust and collaboration.
NP-hardness of decoding quantum error-correction codes
NASA Astrophysics Data System (ADS)
Hsieh, Min-Hsiu; Le Gall, François
2011-05-01
Although the theory of quantum error correction is intimately related to classical coding theory and, in particular, one can construct quantum error-correction codes (QECCs) from classical codes with the dual-containing property, this does not necessarily imply that the computational complexity of decoding QECCs is the same as for their classical counterparts. Instead, decoding QECCs can differ greatly from decoding classical codes due to the degeneracy property. Intuitively, one expects that degeneracy would simplify the decoding, since two different errors might not, and need not, be distinguished in order to correct them. However, we show that the general quantum decoding problem is NP-hard regardless of whether the quantum codes are degenerate or nondegenerate. This finding implies that no efficient decoding algorithm exists for the general quantum decoding problem (unless P = NP), and it suggests the existence of a quantum cryptosystem based on the hardness of decoding QECCs.
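For reference, the decoding task whose hardness is at issue can be stated in a standard (degeneracy-aware) maximum-likelihood form; this formulation is the textbook one rather than a quotation from the paper. Given an error model p(E) and a measured syndrome s, the decoder seeks the logical equivalence class of errors that is most probable in total,

\hat{\bar{L}} = \arg\max_{\bar{L}} \sum_{E \in \bar{L},\ \mathrm{syn}(E) = s} p(E),

so that two errors in the same class need not be distinguished; the result above says the problem remains NP-hard even when such degeneracy is present.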
On quantum effects in a theory of biological evolution.
Martin-Delgado, M A
2012-01-01
We construct a descriptive toy model that considers quantum effects on biological evolution starting from Chaitin's classical framework. There are smart evolution scenarios in which a quantum world is as favorable as classical worlds for evolution to take place. However, in more natural scenarios, the rate of evolution depends on the degree of entanglement present in quantum organisms with respect to classical organisms. If the entanglement is maximal, classical evolution turns out to be more favorable.
On Quantum Effects in a Theory of Biological Evolution
Martin-Delgado, M. A.
2012-01-01
We construct a descriptive toy model that considers quantum effects on biological evolution starting from Chaitin's classical framework. There are smart evolution scenarios in which a quantum world is as favorable as classical worlds for evolution to take place. However, in more natural scenarios, the rate of evolution depends on the degree of entanglement present in quantum organisms with respect to classical organisms. If the entanglement is maximal, classical evolution turns out to be more favorable. PMID:22413059
Quantum channels and memory effects
NASA Astrophysics Data System (ADS)
Caruso, Filippo; Giovannetti, Vittorio; Lupo, Cosmo; Mancini, Stefano
2014-10-01
Any physical process can be represented as a quantum channel mapping an initial state to a final state. Hence it can be characterized from the point of view of communication theory, i.e., in terms of its ability to transfer information. Quantum information provides a theoretical framework and the proper mathematical tools to accomplish this. In this context the notions of codes and communication capacities have been introduced by generalizing them from the classical Shannon theory of information transmission and error correction. The underlying assumption of this approach is to consider the channel not as acting on a single system, but on sequences of systems which, when properly initialized, allow one to overcome the noisy effects induced by the physical process under consideration. While most of the work produced so far has been focused on the case in which a given channel transformation acts identically and independently on the various elements of the sequence (the memoryless configuration, in jargon), correlated error models appear to be a more realistic way to approach the problem. A slightly different, yet conceptually related, notion of correlated errors applies to a single quantum system which evolves continuously in time under the influence of an external disturbance that acts on it in a non-Markovian fashion. This leads to the study of memory effects in quantum channels: a fertile ground where interesting novel phenomena emerge at the intersection of quantum information theory and other branches of physics. We survey the field of quantum channel theory while also embracing these specific and complex settings.
The Value of Item Response Theory in Clinical Assessment: A Review
ERIC Educational Resources Information Center
Thomas, Michael L.
2011-01-01
Item response theory (IRT) and related latent variable models represent modern psychometric theory, the successor to classical test theory in psychological assessment. Although IRT has become prevalent in the measurement of ability and achievement, its contributions to clinical domains have been less extensive. Applications of IRT to clinical…
NASA Astrophysics Data System (ADS)
Rincón, Ángel; Panotopoulos, Grigoris
2018-01-01
We study for the first time the stability against scalar perturbations, and we compute the spectrum of quasinormal modes, of three-dimensional charged black holes in Einstein-power-Maxwell nonlinear electrodynamics assuming running couplings. Adopting the sixth-order Wentzel-Kramers-Brillouin (WKB) approximation, we investigate how the running of the couplings changes the spectrum of the classical theory. Our results show that all modes corresponding to nonvanishing angular momentum are unstable both in the classical theory and with the running of the couplings, while the fundamental mode can be stable or unstable depending on the running parameter and the electric charge.
Uncertainty and denial: a resource-rational model of the value of information.
Pierson, Emma; Goodman, Noah
2014-01-01
Classical decision theory predicts that people should be indifferent to information that is not useful for making decisions, but this model often fails to describe human behavior. Here we investigate one such scenario, where people desire information about whether an event (the gain/loss of money) will occur even though there is no obvious decision to be made on the basis of this information. We find a curious dual trend: if information is costless, as the probability of the event increases people want the information more; if information is not costless, people's desire for the information peaks at an intermediate probability. People also want information more as the importance of the event increases, and less as the cost of the information increases. We propose a model that explains these results, based on the assumption that people have limited cognitive resources and obtain information about which events will occur so they can determine whether to expend effort planning for them.
Uncertainty and Denial: A Resource-Rational Model of the Value of Information
Pierson, Emma; Goodman, Noah
2014-01-01
Classical decision theory predicts that people should be indifferent to information that is not useful for making decisions, but this model often fails to describe human behavior. Here we investigate one such scenario, where people desire information about whether an event (the gain/loss of money) will occur even though there is no obvious decision to be made on the basis of this information. We find a curious dual trend: if information is costless, as the probability of the event increases people want the information more; if information is not costless, people's desire for the information peaks at an intermediate probability. People also want information more as the importance of the event increases, and less as the cost of the information increases. We propose a model that explains these results, based on the assumption that people have limited cognitive resources and obtain information about which events will occur so they can determine whether to expend effort planning for them. PMID:25426631
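A minimal numerical sketch of the costly-information case may help fix ideas; it is our illustration rather than the authors' exact model, and the payoff G (loss averted by planning), planning cost K, and information price c are assumed values.

import numpy as np

def value_of_information(p, G=10.0, K=4.0, c=1.0):
    # Without information: plan only if the expected benefit exceeds the effort.
    eu_no_info = max(p * G - K, 0.0)
    # With information: pay c, then plan only when the event will actually occur.
    eu_info = p * (G - K) - c
    return eu_info - eu_no_info

ps = np.linspace(0.01, 0.99, 99)
voi = [value_of_information(p) for p in ps]
print("desire for information peaks near p =", ps[int(np.argmax(voi))])

Under these assumptions the value of information rises as p(G - K) - c for small p and falls as K(1 - p) - c for large p, peaking at the intermediate probability p = K/G, which reproduces the dual trend described above.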
An information theory account of late frontoparietal ERP positivities in cognitive control.
Barceló, Francisco; Cooper, Patrick S
2018-03-01
ERP research on task switching has revealed distinct transient and sustained positive waveforms (latency circa 300-900 ms) while shifting task rules or stimulus-response (S-R) mappings. However, it remains unclear whether such switch-related positivities show similar scalp topography and index context-updating mechanisms akin to those posited for domain-general (i.e., classic P300) positivities in many task domains. To examine this question, ERPs were recorded from 31 young adults (18-30 years) while they were intermittently cued to switch or repeat their perceptual categorization of Gabor gratings varying in color and thickness (switch task), or else they performed two visually identical control tasks (go/no-go and oddball). Our task-cueing paradigm examined two temporally distinct stages of proactive rule updating and reactive rule execution. A simple information theory model helped us gauge cognitive demands under distinct temporal and task contexts in terms of low-level S-R pathways and higher-order rule updating operations. Task demands modulated domain-general positivities (indexed by the classic oddball P3) and switch positivities (indexed by both a cue-locked late positive complex and a sustained positivity following task transitions). Topographic scalp analyses confirmed subtle yet significant split-second changes in the configuration of neural sources for both domain-general P3s and switch positivities as a function of both the temporal and task context. These findings partly meet predictions from information estimates, and are compatible with a family of P3-like potentials indexing functionally distinct neural operations within a common frontoparietal "multiple demand" system during the preparation and execution of simple task rules. © 2016 Society for Psychophysiological Research.
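To give the flavor of such an information estimate, the sketch below computes the surprise (in bits) carried by switch versus repeat cues; the cue frequencies are illustrative assumptions, not the study's design values.

import math

p_cue = {"switch": 0.33, "repeat": 0.67}   # assumed cue frequencies
for name, p in p_cue.items():
    print(f"{name} cue carries {-math.log2(p):.2f} bits")

On this toy estimate a rarer switch cue carries more information, and hence predicts larger rule-updating demands, than a frequent repeat cue.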
Constraints on primordial magnetic fields from inflation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Daniel; Kobayashi, Takeshi, E-mail: drgreen@cita.utoronto.ca, E-mail: takeshi.kobayashi@sissa.it
2016-03-01
We present generic bounds on magnetic fields produced from cosmic inflation. By investigating field bounds on the vector potential, we constrain both the quantum mechanical production of magnetic fields and their classical growth in a model-independent way. For classical growth, we show that only if the reheating temperature is as low as T_reh ≲ 10^2 MeV can magnetic fields of 10^-15 G be produced on Mpc scales in the present universe. For purely quantum mechanical scenarios, even stronger constraints are derived. Our bounds on classical and quantum mechanical scenarios apply to generic theories of inflationary magnetogenesis with a two-derivative time kinetic term for the vector potential. In both cases, the magnetic field strength is limited by the gravitational back-reaction of the electric fields that are produced simultaneously. As an example of quantum mechanical scenarios, we construct vector field theories whose time diffeomorphisms are spontaneously broken, and explore magnetic field generation in theories with a variable speed of light. Transitions of quantum vector field fluctuations into classical fluctuations are also analyzed in the examples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gurvits, L.
2002-01-01
Classical matching theory can be defined in terms of matrices with nonnegative entries. The notion of a positive operator, central in Quantum Theory, is a natural generalization of matrices with nonnegative entries. Based on this point of view, we introduce a definition of perfect Quantum (operator) matching. We show that the new notion inherits many 'classical' properties, but not all of them. This new notion goes beyond matroids. For separable bipartite quantum states this new notion coincides with the full rank property of the intersection of two corresponding geometric matroids. In the classical situation, permanents are naturally associated with perfect matchings. We introduce an analog of permanents for positive operators, called the Quantum Permanent, and show how this generalization of the permanent is related to Quantum Entanglement. Besides many other things, Quantum Permanents provide new rational inequalities necessary for the separability of bipartite quantum states. Using Quantum Permanents, we give a deterministic poly-time algorithm to solve the Hidden Matroids Intersection Problem and indicate some 'classical' complexity difficulties associated with Quantum Entanglement. Finally, we prove that the weak membership problem for the convex set of separable bipartite density matrices is NP-hard.
Ghost imaging of phase objects with classical incoherent light
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shirai, Tomohiro; Setaelae, Tero; Friberg, Ari T.
2011-10-15
We describe an optical setup for performing spatial Fourier filtering in ghost imaging with classical incoherent light. This is achieved by a modification of the conventional geometry for lensless ghost imaging. It is shown on the basis of classical coherence theory that with this technique one can realize what we call phase-contrast ghost imaging to visualize pure phase objects.
Quantum-mechanical machinery for rational decision-making in classical guessing game
NASA Astrophysics Data System (ADS)
Bang, Jeongho; Ryu, Junghee; Pawłowski, Marcin; Ham, Byoung S.; Lee, Jinhyoung
2016-02-01
In quantum game theory, one of the most intriguing and important questions is, “Is it possible to get quantum advantages without any modification of the classical game?” The answer to this question has so far largely been negative: it has usually been thought that a change of the classical game setting is unavoidable for obtaining quantum advantages. However, we give an affirmative answer here, focusing on the decision-making process (which we call ‘reasoning’) used to generate the best strategy, which may occur internally, e.g., in the player’s brain. To show this, we consider a classical guessing game. We then define a one-player reasoning problem in the context of decision-making theory, where the machinery processes are designed to simulate classical and quantum reasoning. In this setting, we present a scenario where a rational player is able to make better use of his or her weak preferences due to quantum reasoning, without any altering or resetting of the classically defined game. We also argue, in further analysis, that quantum reasoning may make the player fail, and even make the situation worse, due to inappropriate preferences.
Quantum-mechanical machinery for rational decision-making in classical guessing game
Bang, Jeongho; Ryu, Junghee; Pawłowski, Marcin; Ham, Byoung S.; Lee, Jinhyoung
2016-01-01
In quantum game theory, one of the most intriguing and important questions is, “Is it possible to get quantum advantages without any modification of the classical game?” The answer to this question has so far largely been negative: it has usually been thought that a change of the classical game setting is unavoidable for obtaining quantum advantages. However, we give an affirmative answer here, focusing on the decision-making process (which we call ‘reasoning’) used to generate the best strategy, which may occur internally, e.g., in the player’s brain. To show this, we consider a classical guessing game. We then define a one-player reasoning problem in the context of decision-making theory, where the machinery processes are designed to simulate classical and quantum reasoning. In this setting, we present a scenario where a rational player is able to make better use of his or her weak preferences due to quantum reasoning, without any altering or resetting of the classically defined game. We also argue, in further analysis, that quantum reasoning may make the player fail, and even make the situation worse, due to inappropriate preferences. PMID:26875685
Quantum-mechanical machinery for rational decision-making in classical guessing game.
Bang, Jeongho; Ryu, Junghee; Pawłowski, Marcin; Ham, Byoung S; Lee, Jinhyoung
2016-02-15
In quantum game theory, one of the most intriguing and important questions is, "Is it possible to get quantum advantages without any modification of the classical game?" The answer to this question has so far largely been negative: it has usually been thought that a change of the classical game setting is unavoidable for obtaining quantum advantages. However, we give an affirmative answer here, focusing on the decision-making process (which we call 'reasoning') used to generate the best strategy, which may occur internally, e.g., in the player's brain. To show this, we consider a classical guessing game. We then define a one-player reasoning problem in the context of decision-making theory, where the machinery processes are designed to simulate classical and quantum reasoning. In this setting, we present a scenario where a rational player is able to make better use of his or her weak preferences due to quantum reasoning, without any altering or resetting of the classically defined game. We also argue, in further analysis, that quantum reasoning may make the player fail, and even make the situation worse, due to inappropriate preferences.
Toda theories as contractions of affine Toda theories
NASA Astrophysics Data System (ADS)
Aghamohammadi, A.; Khorrami, M.; Shariati, A.
1996-02-01
Using a contraction procedure, we obtain Toda theories and their structures from affine Toda theories and their corresponding structures. By structures, we mean the equation of motion, the classical Lax pair, the boundary term for half-line theories, and the quantum transfer matrix. The Lax pair and the transfer matrix so obtained depend nontrivially on the spectral parameter.
Comparing the Effectiveness of SPSS and EduG Using Different Designs for Generalizability Theory
ERIC Educational Resources Information Center
Teker, Gulsen Tasdelen; Guler, Nese; Uyanik, Gulden Kaya
2015-01-01
Generalizability theory (G theory) provides a broad conceptual framework for social sciences such as psychology and education, and a comprehensive construct for numerous measurement events by using analysis of variance, a strong statistical method. G theory, as an extension of both classical test theory and analysis of variance, is a model which…
An Approach to Biased Item Identification Using Latent Trait Measurement Theory.
ERIC Educational Resources Information Center
Rudner, Lawrence M.
Because it is a true score model employing item parameters which are independent of the examined sample, item characteristic curve theory (ICC) offers several advantages over classical measurement theory. In this paper an approach to biased item identification using ICC theory is described and applied. The ICC theory approach is attractive in that…
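As one standard parameterization of such curves, the three-parameter logistic ICC gives the probability that an examinee of ability \theta answers item i correctly as

P_i(\theta) = c_i + (1 - c_i) \frac{1}{1 + e^{-a_i(\theta - b_i)}},

where the discrimination a_i, difficulty b_i, and pseudo-guessing c_i are the sample-independent item parameters referred to above.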
NASA Astrophysics Data System (ADS)
Stapp, Henry P.
2012-05-01
Robert Griffiths has recently addressed, within the framework of a `consistent quantum theory' that he has developed, the issue of whether, as is often claimed, quantum mechanics entails a need for faster-than-light transfers of information over long distances. He argues that the putative proofs of this property that involve hidden variables include in their premises some essentially classical-physics-type assumptions that are not entailed by the precepts of quantum mechanics. Thus whatever is proved is not a feature of quantum mechanics, but is a property of a theory that tries to combine quantum theory with quasi-classical features that go beyond what is entailed by quantum theory itself. One cannot logically prove properties of a system by establishing, instead, properties of a system modified by adding properties alien to the original system. Hence Griffiths' rejection of hidden-variable-based proofs is logically warranted. Griffiths mentions the existence of a certain alternative proof that does not involve hidden variables, and that uses only macroscopically described observable properties. He notes that he had examined in his book proofs of this general kind, and concluded that they provide no evidence for nonlocal influences. But he did not examine the particular proof that he cites. An examination of that particular proof by the method specified by his `consistent quantum theory' shows that the cited proof is valid within that restrictive version of quantum theory. An added section responds to Griffiths' reply, which cites general possibilities of ambiguities that might make what is to be proved ill-defined, and hence render the pertinent `consistent framework' ill-defined. But the vagaries that he cites do not upset the proof in question, which, both by its physical formulation and by explicit identification, specifies the framework to be used. Griffiths confirms the validity of the proof insofar as that pertinent framework is used. The section also shows, in response to Griffiths' challenge, why a putative proof of locality that he has described is flawed.
Restructuring consciousness -the psychedelic state in light of integrated information theory.
Gallimore, Andrew R
2015-01-01
The psychological state elicited by the classic psychedelic drugs, such as LSD and psilocybin, is one of the most fascinating and yet least understood states of consciousness. However, with the advent of modern functional neuroimaging techniques, the effect of these drugs on neural activity is now being revealed, although many of the varied phenomenological features of the psychedelic state remain challenging to explain. Integrated information theory (IIT) is one of the foremost contemporary theories of consciousness, providing a mathematical formalization of both the quantity and quality of conscious experience. This theory can be applied to all known states of consciousness, including the psychedelic state. Using the results of functional neuroimaging data on the psychedelic state, the effects of psychedelic drugs on both the level and structure of consciousness can be explained in terms of the conceptual framework of IIT. This new IIT-based model of the psychedelic state provides an explanation for many of its phenomenological features, including unconstrained cognition, alterations in the structure and meaning of concepts, and a sense of expanded awareness. The model also suggests that while cognitive flexibility, creativity, and imagination are enhanced during the psychedelic state, this occurs at the expense of cause-effect information and degrades the brain's ability to organize, categorize, and differentiate the constituents of conscious experience. Furthermore, the model generates specific predictions that can be tested using a combination of functional imaging techniques, as has been done in the study of levels of consciousness during anesthesia and following brain injury.
Transfer function modeling of damping mechanisms in viscoelastic plates
NASA Technical Reports Server (NTRS)
Slater, J. C.; Inman, D. J.
1991-01-01
This work formulates a method for modeling material damping characteristics in plates. The Sophie Germain equation of classical plate theory is modified to incorporate hysteresis effects represented by a complex stiffness, using the transfer function approach proposed by Golla and Hughes (1985); however, the procedure is not limited to this representation. The governing characteristic equation is decoupled through separation of variables, yielding a solution similar to that of undamped classical plate theory and allowing solution of the steady-state as well as the transient response problem.
Potential Functions and the Characterization of Economics-Based Information
NASA Astrophysics Data System (ADS)
Haven, Emmanuel
2015-10-01
The formulation of quantum mechanics as a diffusion process by Nelson (Phys Rev 150:1079-1085, 1966) provides for an interesting approach on how we may transit from classical mechanics into quantum mechanics. Besides the presence of the real potential function, another type of potential function (often denoted as `quantum potential') forms an intrinsic part of this theory. In this paper we attempt to show how both types of potential functions can have a use in a resolutely macroscopic context like financial asset pricing. We are particularly interested in uncovering how the `quantum potential' can add to the economics-based relevant information which is already supplied by the real potential function.
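For concreteness, the 'quantum potential' of the Bohm-Nelson type associated with a probability density \rho(x) takes the standard form

Q(x) = -\frac{\hbar^2}{2m} \frac{1}{\sqrt{\rho(x)}} \frac{\partial^2 \sqrt{\rho(x)}}{\partial x^2},

where, in a macroscopic asset-pricing application, \rho would be a density of prices or returns, and the constants \hbar and m are reinterpreted as market-scale parameters rather than Planck's constant and a particle mass.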
NASA Astrophysics Data System (ADS)
Poisson, E.
2006-09-01
The motion of a charged particle interacting with its own electromagnetic field is an area of research that has a long history; this problem has never ceased to fascinate its investigators. On the one hand the theory ought to be straightforward to formulate: one has Maxwell's equations that tell the field how to behave (given the motion of the particle), and one has the Lorentz-force law that tells the particle how to move (given the field). On the other hand the theory is fundamentally ambiguous because of the field singularities that necessarily come with a point particle. While each separate sub-problem can easily be solved, to couple the field to the particle in a self-consistent treatment turns out to be tricky. I believe it is this dilemma (the theory is straightforward but tricky) that has been the main source of the endless fascination. For readers of Classical and Quantum Gravity, the fascination does not end there. For them it is also rooted in the fact that the electromagnetic self-force problem is deeply analogous to the gravitational self-force problem, which is of direct relevance to future gravitational wave observations. The motion of point particles in curved spacetime has been the topic of a recent Topical Review [1], and it was the focus of a recent Special Issue [2]. It is surprising to me that radiation reaction is a subject that continues to be poorly covered in the standard textbooks, including Jackson's bible [3]. Exceptions are Rohrlich's excellent text [4], which makes a very useful introduction to radiation reaction, and the Landau and Lifshitz classic [5], which contains what is probably the most perfect summary of the foundational ideas (presented in characteristic terseness). It is therefore with some trepidation that I received Herbert Spohn's book, which covers both the classical and quantum theories of a charged particle coupled to its own field (the presentation is limited to flat spacetime). Is this the text that graduate students and researchers should turn to in order to get a complete and accessible education in radiation reaction? My answer is that while the book does indeed contain a lot of useful material, it is not a very accessible source of information, and it is certainly not a student-friendly textbook. Instead, the book presents a technical account of the author's personal take on the theory, and represents a culminating summary of the author's research contributions over more than a decade. The book is written in a fairly mathematical style (the author is Professor of Mathematical Physics at the Technische Universität in Munich), and it very much emphasises mathematical rigour. This makes the book less accessible than I would wish it to be, but this is perhaps less a criticism than a statement about my taste, expectation, and attitude. The presentation of the classical theory begins with a point particle, but Spohn immediately smears the charge distribution to eliminate the vexing singularities of the retarded field. He considers both the nonrelativistic Abraham model (in which the extended particle is spherically symmetric in the laboratory frame) and the relativistic Lorentz model (in which the particle is spherical in its rest frame). In Spohn's work, the smearing of the charge distribution is entirely a mathematical procedure, and I would have wished for a more physical discussion.
A physically extended body, held together against electrostatic repulsion by cohesive forces (sometimes called Poincaré stresses), would make a sound starting point for a classical theory of charged particles, and would have nicely (and physically) motivated the smearing operation adopted in the book. Spohn goes on to derive energy-momentum relations for the extended objects, and to obtain their equations of motion. A compelling aspect of his presentation is that he formally introduces the 'adiabatic limit', the idea that the external fields acting on the charged body should have length and time scales that are long compared with the particle's internal scales (respectively the electrostatic classical radius and its associated time scale). As a consequence, the equations of motion do not involve a differentiated acceleration vector (as is the case for the Abraham-Lorentz-Dirac equations) but are proper second-order differential equations for the position vector. In effect, the correct equations of motion are obtained from the Abraham-Lorentz-Dirac equations by a reduction-of-order procedure that was first proposed (as far as I know) by Landau and Lifshitz [5]. In Spohn's work this procedure is not ad hoc, but a natural consequence of the adiabatic approximation. An aspect of the classical portion of the book that got me particularly excited is Spohn's proposal for an experimental test of the predictions of the Landau-Lifshitz equations. His proposed experiment involves a Penning trap, a device that uses a uniform magnetic field and a quadrupole electric field to trap an electron for very long times. Without radiation reaction, the motion of an electron in the trap is an epicycle that consists of a rapid (and small) cyclotron orbit superposed onto a slow (and large) magnetron orbit. Spohn shows that according to the Landau-Lifshitz equations, the radiation reaction produces a damping of the cyclotron motion. For reasonable laboratory situations this damping occurs over a time scale of the order of 0.1 second. This experiment might well be within technological reach. The presentation of the quantum theory is based on the nonrelativistic Abraham model, which upon quantization leads to the well-known Pauli-Fierz Hamiltonian of nonrelativistic quantum electrodynamics. This theory, an approximation to the fully relativistic version of QED, has a wide domain of validity that includes many aspects of quantum optics and laser-matter interactions. As I am not an expert in this field, my ability to review this portion of Spohn's book is limited, and I will indeed restrict myself to a few remarks. I first admit that I found Spohn's presentation to be tough going. Unlike the pair of delightful books by Cohen-Tannoudji, Dupont-Roc, and Grynberg [6, 7], this is not a gentle introduction to the quantum theory of a charged particle coupled to its own electromagnetic field. Instead, Spohn proceeds rather quickly through the formulation of the theory (defining the Hamiltonian and the Hilbert space) and then presents some applications (for example, he constructs the ground states of the theory, he examines radiation processes, and he explores finite-temperature aspects). There is a lot of material in the eight chapters devoted to the quantum theory, but my insufficient preparation and the advanced nature of Spohn's presentation were significant obstacles; I was not able to draw much appreciation for this material.
One of the most useful resources in Spohn's book is the set of historical notes and literature reviews inserted at the end of each chapter. I discovered a wealth of interesting articles by reading these, and I am grateful that the author made the effort to collect this information for the benefit of his readers. References [1] Poisson E 2004 Radiation reaction of point particles in curved spacetime Class. Quantum Grav. 21 R153-R232 [2] Lousto C O 2005 Special issue: Gravitational Radiation from Binary Black Holes: Advances in the Perturbative Approach Class. Quantum Grav. 22 S543-S868 [3] Jackson J D 1999 Classical Electrodynamics Third Edition (New York: Wiley) [4] Rohrlich F 1990 Classical Charged Particles (Redwood City, CA: Addison-Wesley) [5] Landau L D and Lifshitz E M 2000 The Classical Theory of Fields Fourth Edition (Oxford: Butterworth-Heinemann) [6] Cohen-Tannoudji C, Dupont-Roc J and Grynberg G 1997 Photons and Atoms: Introduction to Quantum Electrodynamics (New York: Wiley-Interscience) [7] Cohen-Tannoudji C, Dupont-Roc J and Grynberg G 1998 Atom-Photon Interactions: Basic Processes and Applications (New York: Wiley-Interscience)
NASA Astrophysics Data System (ADS)
Gafurov, O.; Gafurov, D.; Syryamkin, V.
2018-05-01
The paper analyses a field of computer science formed at the intersection of such areas of natural science as artificial intelligence, mathematical statistics, and database theory, which is referred to as "Data Mining" (the discovery of knowledge in data). The theory of neural networks is applied along with classical methods of mathematical analysis and numerical simulation. The paper describes the technique protected by the patent of the Russian Federation for the invention “A Method for Determining Location of Production Wells during the Development of Hydrocarbon Fields” [1–3] and implemented using the geoinformation system NeuroInformGeo. There are no analogues in domestic or international practice. The paper gives an example comparing the forecast of oil reservoir quality made by a geophysicist interpreter using standard methods with the forecast made using this technology. The technique increases the efficiency, effectiveness, and ecological compatibility of mineral deposit development and has led to the discovery of a new oil deposit.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beigi, Salman
Sandwiched (quantum) α-Rényi divergence has been recently defined in the independent works of Wilde et al. [“Strong converse for the classical capacity of entanglement-breaking channels,” preprint http://arxiv.org/abs/arXiv:1306.1586 (2013)] and Müller-Lennert et al. [“On quantum Rényi entropies: a new definition, some properties and several conjectures,” preprint http://arxiv.org/abs/arXiv:1306.3142v1 (2013)]. This new quantum divergence has already found applications in quantum information theory. Here we further investigate properties of this new quantum divergence. In particular, we show that sandwiched α-Rényi divergence satisfies the data processing inequality for all values of α > 1. Moreover we prove that α-Holevo information, a variant of Holevo information defined in terms of sandwiched α-Rényi divergence, is super-additive. Our results are based on Hölder's inequality, the Riesz-Thorin theorem and ideas from the theory of complex interpolation. We also employ Sion's minimax theorem.
A fuzzy-theory-based behavioral model for studying pedestrian evacuation from a single-exit room
NASA Astrophysics Data System (ADS)
Fu, Libi; Song, Weiguo; Lo, Siuming
2016-08-01
Many mass events in recent years have highlighted the importance of research on pedestrian evacuation dynamics. A number of models have been developed to analyze crowd behavior under evacuation situations. However, few focus on pedestrians' decision-making with respect to uncertainty, vagueness and imprecision. In this paper, a discrete evacuation model defined on a cellular space is proposed according to fuzzy theory, which is able to describe imprecise and subjective information. Pedestrians' percept information and various characteristics are regarded as fuzzy input. Fuzzy inference systems with rule bases, which resemble human reasoning, are then established to obtain the fuzzy output that decides pedestrians' movement direction. The model is tested in two scenarios, namely a single-exit room with and without obstacles. Simulation results reproduce some classic dynamical phenomena observed in real building evacuations, and are consistent with those of other models and experiments. It is hoped that this study will enrich the movement rules and approaches used in traditional cellular automaton models of evacuation dynamics.
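A minimal sketch of one fuzzy inference step of this general kind appears below; the membership functions, rule base, and numbers are our illustrative assumptions, not the paper's actual specification.

def tri(x, a, b, c):
    # Triangular membership function on [a, c] peaking at b.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def preference(dist_to_exit, crowd_density):
    # Fuzzify the pedestrian's percept information.
    near, far = tri(dist_to_exit, 0.0, 0.0, 10.0), tri(dist_to_exit, 5.0, 15.0, 15.0)
    sparse, crowded = tri(crowd_density, 0.0, 0.0, 0.6), tri(crowd_density, 0.3, 1.0, 1.0)
    # Rule base resembling human reasoning, e.g. "IF near AND sparse THEN strongly prefer".
    rules = [(min(near, sparse), 1.0), (min(near, crowded), 0.5),
             (min(far, sparse), 0.4), (min(far, crowded), 0.1)]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0   # centroid-style defuzzification

# Score the neighbouring cells (assumed distances and densities) and move to the best.
cells = {"left": (8.0, 0.7), "ahead": (6.0, 0.9), "right": (9.0, 0.2)}
print(max(cells, key=lambda c: preference(*cells[c])))

Each candidate cell's percept input is fuzzified, the rules are fired, and defuzzification yields the movement preference that decides the pedestrian's direction.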
Experimental violation of Bell inequalities for multi-dimensional systems
Lo, Hsin-Pin; Li, Che-Ming; Yabushita, Atsushi; Chen, Yueh-Nan; Luo, Chih-Wei; Kobayashi, Takayoshi
2016-01-01
Quantum correlations between spatially separated parts of a d-dimensional bipartite system (d ≥ 2) have no classical analog. Such correlations, also called entanglement, are not only conceptually important, but also have a profound impact on information science. In theory, the violation of Bell inequalities based on local realistic theories for d-dimensional systems provides evidence of quantum nonlocality. Experimental verification is required to confirm whether a quantum system of extremely large dimension can possess this feature; however, it has never been performed for large dimensions. Here, we report that Bell inequalities are experimentally violated for bipartite quantum systems of dimensionality d = 16 with the usual ensembles of polarization-entangled photon pairs. We also estimate that our entanglement source violates Bell inequalities for extremely high dimensionality of d > 4000. The designed scenario offers a possible new method to investigate the entanglement of multipartite systems of large dimensionality and their application in quantum information processing. PMID:26917246
Naked singularity, firewall, and Hawking radiation.
Zhang, Hongsheng
2017-06-21
Spacetime singularities have been of interest ever since the proof of the Penrose-Hawking singularity theorem. A naked singularity naturally emerges from reasonable initial conditions in the collapsing process. A recent interesting approach to the black hole information problem implies that we need a firewall to break the surplus entanglements among the Hawking photons. Classically, the firewall becomes a naked singularity. We find some vacuum analytical solutions of firewall type in R^n-gravity and use these solutions as concrete models to study naked singularities. Using standard quantum theory, we investigate the Hawking radiation emitted from black holes with naked singularities. Here we show that the singularity itself does not destroy information: a unitary quantum theory works well around a firewall-type singularity. We discuss the validity of our result in general relativity. Furthermore, our result demonstrates that the temperature of the Hawking radiation can still be expressed in the form of the surface gravity divided by 2π. This indicates that a naked singularity may not compromise the Hawking evaporation process.
Sandwiched Rényi divergence satisfies data processing inequality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beigi, Salman
2013-12-15
Sandwiched (quantum) α-Rényi divergence has been recently defined in the independent works of Wilde et al. [“Strong converse for the classical capacity of entanglement-breaking channels,” preprint http://arxiv.org/abs/arXiv:1306.1586 (2013)] and Müller-Lennert et al. [“On quantum Rényi entropies: a new definition, some properties and several conjectures,” preprint http://arxiv.org/abs/arXiv:1306.3142v1 (2013)]. This new quantum divergence has already found applications in quantum information theory. Here we further investigate properties of this new quantum divergence. In particular, we show that sandwiched α-Rényi divergence satisfies the data processing inequality for all values of α > 1. Moreover we prove that α-Holevo information, a variant of Holevo information defined in terms of sandwiched α-Rényi divergence, is super-additive. Our results are based on Hölder's inequality, the Riesz-Thorin theorem and ideas from the theory of complex interpolation. We also employ Sion's minimax theorem.
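For reference, the sandwiched α-Rényi divergence of the two works cited above is defined (for suitably invertible σ) by

\widetilde{D}_\alpha(\rho \| \sigma) = \frac{1}{\alpha - 1} \log \mathrm{Tr}\left[\left(\sigma^{\frac{1-\alpha}{2\alpha}} \rho\, \sigma^{\frac{1-\alpha}{2\alpha}}\right)^{\alpha}\right],

and the data processing inequality proved here reads \widetilde{D}_\alpha(\mathcal{N}(\rho) \| \mathcal{N}(\sigma)) \le \widetilde{D}_\alpha(\rho \| \sigma) for every quantum channel \mathcal{N} and all \alpha > 1.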
NASA Astrophysics Data System (ADS)
Wang, Hai; Kumar, Asutosh; Cho, Minhyung; Wu, Junde
2018-04-01
Physical quantities are assumed to take real values, which stems from the fact that a usual measuring instrument that measures a physical observable always yields a real number. Here we consider the question of what would happen if physical observables were allowed to assume complex values. In this paper, we show that by allowing observables in the Bell inequality to take complex values, a classical physical theory can actually attain the same upper bound on the Bell expression as quantum theory. Also, by extending the real field to the quaternionic field, we can resolve the GHZ problem using a local hidden variable model. Furthermore, we attempt to build a new type of hidden-variable theory of a single qubit based on this result.
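For orientation, the Bell expression at issue is the standard CHSH combination

S = E(a_1, b_1) + E(a_1, b_2) + E(a_2, b_1) - E(a_2, b_2),

with |S| \le 2 for real-valued local hidden variable models and |S| \le 2\sqrt{2} (Tsirelson's bound) in quantum theory; the claim above is that complex-valued observables allow a classical theory to reach the latter bound.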
From integrability to conformal symmetry: Bosonic superconformal Toda theories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bo-Yu Hou; Liu Chao
In this paper the authors study the conformal integrable models obtained from conformal reductions of WZNW theory associated with second-order constraints. These models are called bosonic superconformal Toda models due to their conformal spectra and their resemblance to the usual Toda theories. From the reduction procedure they get the equations of motion and the linearized Lax equations in a generic Z gradation of the underlying Lie algebra. Then, in the special case of principal gradation, they derive the classical r matrix, the fundamental Poisson relation, and the exchange algebra of chiral operators, and find the classical vertex operators. The result shows that their model is very similar to the ordinary Toda theories in that one can obtain various conformal properties of the model from its integrability.
Sound and Vision: Using Progressive Rock To Teach Social Theory.
ERIC Educational Resources Information Center
Ahlkvist, Jarl A.
2001-01-01
Describes a teaching technique that utilizes progressive rock music to educate students about sociological theories in introductory sociology courses. Discusses the use of music when teaching about classical social theory and offers an evaluation of this teaching strategy. Includes references. (CMK)
Amplification, Redundancy, and Quantum Chernoff Information
NASA Astrophysics Data System (ADS)
Zwolak, Michael; Riedel, C. Jess; Zurek, Wojciech H.
2014-04-01
Amplification was regarded, since the early days of quantum theory, as a mysterious ingredient that endows quantum microstates with macroscopic consequences, key to the "collapse of the wave packet," and a way to avoid embarrassing problems exemplified by Schrödinger's cat. Such a bridge between the quantum microworld and the classical world of our experience was postulated ad hoc in the Copenhagen interpretation. Quantum Darwinism views amplification as replication, in many copies, of the information about quantum states. We show that such amplification is a natural consequence of a broad class of models of decoherence, including the photon environment we use to obtain most of our information. This leads to objective reality via the presence of robust and widely accessible records of selected quantum states. The resulting redundancy (the number of copies deposited in the environment) follows from the quantum Chernoff information that quantifies the information transmitted by a typical elementary subsystem of the environment.
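The quantum Chernoff information invoked here is, in its standard form,

\xi_{\mathrm{QCB}}(\rho, \sigma) = -\log \min_{0 \le s \le 1} \mathrm{Tr}\left(\rho^s \sigma^{1-s}\right),

the optimal asymptotic error exponent for discriminating the states \rho and \sigma from many copies, which is what quantifies the information carried by each environment fragment about the amplified state.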
Unbounded number of channel uses may be required to detect quantum capacity.
Cubitt, Toby; Elkouss, David; Matthews, William; Ozols, Maris; Pérez-García, David; Strelchuk, Sergii
2015-03-31
Transmitting data reliably over noisy communication channels is one of the most important applications of information theory, and is well understood for channels modelled by classical physics. However, when quantum effects are involved, we do not know how to compute channel capacities. This is because the formula for the quantum capacity involves maximizing the coherent information over an unbounded number of channel uses. In fact, entanglement across channel uses can even increase the coherent information from zero to non-zero. Here we study the number of channel uses necessary to detect positive coherent information. In all previously known examples, two channel uses already sufficed. It might be that only a finite number of channel uses is always sufficient. We show that this is not the case: for any number of uses, there are channels for which the coherent information is zero, but which nonetheless have capacity.
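For orientation, the coherent information of a state \rho through a channel \mathcal{N} with complementary channel \mathcal{N}^c is

I_c(\rho, \mathcal{N}) = S(\mathcal{N}(\rho)) - S(\mathcal{N}^c(\rho)),

and the quantum capacity is the regularized maximization over an unbounded number of uses referred to above,

Q(\mathcal{N}) = \lim_{n \to \infty} \frac{1}{n} \max_{\rho^{(n)}} I_c(\rho^{(n)}, \mathcal{N}^{\otimes n}).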
Quantum memory for Rindler supertranslations
NASA Astrophysics Data System (ADS)
Kolekar, Sanved; Louko, Jorma
2018-04-01
The Rindler horizon in Minkowski spacetime can be implanted with supertranslation hair by a matter shock wave without planar symmetry, and the hair is observable as a supertranslation memory on the Rindler family of uniformly linearly accelerated observers. We show that this classical memory is accompanied by a supertranslation quantum memory that modulates the entanglement between the opposing Rindler wedges in quantum field theory. A corresponding phenomenon across a black hole horizon may play a role in Hawking, Perry, and Strominger's proposal for supertranslations to provide a solution to the black hole information paradox.
NASA Astrophysics Data System (ADS)
Yan, Han
2012-08-01
Extending Parikh and Wilczek's semi-classical tunneling method, we discuss the Hawking radiation of charged massive particles via tunneling from the cosmological horizon of an (n+2)-dimensional topological Reissner-Nordström-de Sitter black hole. The result shows that, when energy conservation and electric charge conservation are taken into account, the derived spectrum deviates from a pure thermal one but satisfies the unitary theory, which offers a possible route to resolving the information loss paradox.
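The underlying tunneling bookkeeping, shown here for the simplest Schwarzschild case in units G = c = \hbar = 1, gives an emission rate set by the change in horizon entropy,

\Gamma \propto e^{\Delta S_{\mathrm{BH}}} = e^{-8\pi\omega(M - \omega/2)},

whose \omega^2 correction to the Boltzmann factor e^{-8\pi\omega M} is precisely the deviation from pure thermality; the present work carries the same bookkeeping over to charged massive particles and a cosmological horizon.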
Robust Control Analysis of Hydraulic Turbine Speed
NASA Astrophysics Data System (ADS)
Jekan, P.; Subramani, C.
2018-04-01
An effective control strategy for the hydro-turbine governor in real-time scenarios is the objective of this paper. Considering the complex dynamic characteristics and the uncertainty of the hydro-turbine governor model, and taking the static and dynamic performance of the governing system as the ultimate goal, the designed logic combines classical PID control theory with artificial intelligence to obtain the desired output. The controller employs variable control techniques; therefore, its parameters can be adaptively adjusted according to information about the control error signal.
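A rough sketch of such an error-adaptive PID loop appears below; the gain-scheduling rule, the constants, and the toy first-order plant are our assumptions rather than the paper's design.

class AdaptivePID:
    def __init__(self, kp=1.2, ki=0.5, kd=0.01, dt=0.01):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_error = 0.0, 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        # Crude stand-in for the intelligent layer: stiffen the gains for
        # large errors, soften them once the speed is near the setpoint.
        scale = 1.5 if abs(error) > 0.1 else 0.8
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return scale * (self.kp * error + self.ki * self.integral + self.kd * derivative)

pid, speed = AdaptivePID(), 0.0
for _ in range(2000):
    u = pid.step(1.0, speed)      # unit speed setpoint
    speed += (u - speed) * 0.05   # assumed first-order turbine response
print(f"final speed = {speed:.3f}")

The scheduling rule plays the role of the artificial-intelligence layer described above, adapting the effective gains from information about the control error signal.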
Gallistel, C.R.; Craig, Andrew R.; Shahan, Timothy A.
2015-01-01
Contingency, and more particularly temporal contingency, has often figured in thinking about the nature of learning. However, it has never been formally defined in such a way as to make it a measure that can be applied to most animal learning protocols. We use elementary information theory to define contingency in such a way as to make it a measurable property of almost any conditioning protocol. We discuss how making it a measurable construct enables the exploration of the role of different contingencies in the acquisition and performance of classically and operantly conditioned behavior. PMID:23994260
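In the spirit of this definition, contingency can be expressed as mutual information normalized by the entropy available to be explained, for example

C = \frac{I(X; Y)}{H(Y)}, \qquad I(X; Y) = \sum_{x, y} p(x, y) \log_2 \frac{p(x, y)}{p(x)\, p(y)},

with Y the occurrence (or timing) of reinforcement and X the predictive event such as a CS or a response; the precise normalization chosen by the authors may differ from this illustrative form.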
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erol, V.; Netas Telecommunication Inc., Istanbul
Entanglement has been studied extensively for understanding the mysteries of non-classical correlations between quantum systems. In the bipartite case, there are well-known monotones for quantifying entanglement, such as concurrence, relative entropy of entanglement (REE) and negativity, which cannot be increased via local operations. The study of these monotones has been a hot topic in quantum information [1-7], in order to understand the role of entanglement in this discipline. It can be observed that from any arbitrary quantum pure state a mixed state can be obtained. A natural generalization of this observation would be to consider local operations and classical communication (LOCC) transformations between general pure states of two parties. Although this question is a little more difficult, a complete solution has been developed using the mathematical framework of majorization theory [8]. In this work, we analyze the relation between the entanglement monotones concurrence and negativity with respect to majorization for general two-level quantum systems of two particles.
Efficient Quantum Pseudorandomness.
Brandão, Fernando G S L; Harrow, Aram W; Horodecki, Michał
2016-04-29
Randomness is both a useful way to model natural systems and a useful tool for engineered systems, e.g., in computation, communication, and control. Fully random transformations require exponential time for either classical or quantum systems, but in many cases pseudorandom operations can emulate certain properties of truly random ones. Indeed, in the classical realm there is by now a well-developed theory regarding such pseudorandom operations. However, the construction of such objects turns out to be much harder in the quantum case. Here, we show that random quantum unitary time evolutions ("circuits") are a powerful source of quantum pseudorandomness. This gives for the first time a polynomial-time construction of quantum unitary designs, which can replace fully random operations in most applications, and shows that generic quantum dynamics cannot be distinguished from truly random processes. We discuss applications of our result to quantum information science, cryptography, and understanding the self-equilibration of closed quantum dynamics.
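For reference, an (exact) unitary t-design is an ensemble \mathcal{E} of unitaries that reproduces the first t moments of the Haar measure,

\frac{1}{|\mathcal{E}|} \sum_{U \in \mathcal{E}} U^{\otimes t} X \left(U^{\dagger}\right)^{\otimes t} = \int_{U(d)} U^{\otimes t} X \left(U^{\dagger}\right)^{\otimes t} dU \quad \text{for all operators } X,

and the result above is that polynomial-size random circuits already furnish approximate designs of this kind.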
Experimentally modeling stochastic processes with less memory by the use of a quantum processor
Palsson, Matthew S.; Gu, Mile; Ho, Joseph; Wiseman, Howard M.; Pryde, Geoff J.
2017-01-01
Computer simulation of observable phenomena is an indispensable tool for engineering new technology, understanding the natural world, and studying human society. However, the most interesting systems are often so complex that simulating their future behavior demands storing immense amounts of information regarding how they have behaved in the past. For increasingly complex systems, simulation becomes increasingly difficult and is ultimately constrained by resources such as computer memory. Recent theoretical work shows that quantum theory can reduce this memory requirement beyond ultimate classical limits, as measured by a process’ statistical complexity, C. We experimentally demonstrate this quantum advantage in simulating stochastic processes. Our quantum implementation observes a memory requirement of Cq = 0.05 ± 0.01, far below the ultimate classical limit of C = 1. Scaling up this technique would substantially reduce the memory required in simulations of more complex systems. PMID:28168218
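In standard form, the classical statistical complexity is the Shannon entropy of the process's causal states with stationary distribution \pi, and its quantum counterpart is the von Neumann entropy of the quantum memory that encodes them:

C = -\sum_i \pi_i \log_2 \pi_i, \qquad C_q = -\mathrm{Tr}\left(\rho \log_2 \rho\right),

so the reported C_q = 0.05 \pm 0.01 against the ultimate classical limit C = 1 quantifies, in bits, the memory saved by the quantum implementation.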
NASA Astrophysics Data System (ADS)
Camilleri, Kristian; Schlosshauer, Maximilian
2015-02-01
Niels Bohr's doctrine of the primacy of "classical concepts" is arguably his most criticized and misunderstood view. We present a new, careful historical analysis that makes clear that Bohr's doctrine was primarily an epistemological thesis, derived from his understanding of the functional role of experiment. A hitherto largely overlooked disagreement between Bohr and Heisenberg about the movability of the "cut" between measuring apparatus and observed quantum system supports the view that, for Bohr, such a cut did not originate in dynamical (ontological) considerations, but rather in functional (epistemological) considerations. As such, both the motivation and the target of Bohr's doctrine of classical concepts are of a fundamentally different nature than what is understood as the dynamical problem of the quantum-to-classical transition. Our analysis suggests that, contrary to claims often found in the literature, Bohr's doctrine is not, and cannot be, at odds with proposed solutions to the dynamical problem of the quantum-classical transition that were pursued by several of Bohr's followers and culminated in the development of decoherence theory.
Force-field functor theory: classical force-fields which reproduce equilibrium quantum distributions
Babbush, Ryan; Parkhill, John; Aspuru-Guzik, Alán
2013-01-01
Feynman and Hibbs were the first to variationally determine an effective potential whose associated classical canonical ensemble approximates the exact quantum partition function. We examine the existence of a map between the local potential and an effective classical potential which matches the exact quantum equilibrium density and partition function. The usefulness of such a mapping rests in its ability to readily improve Born-Oppenheimer potentials for use with classical sampling. We show that such a map is unique and must exist. To explore the feasibility of using this result to improve classical molecular mechanics, we numerically produce a map from a library of randomly generated one-dimensional potential/effective potential pairs and then evaluate its performance on independent test problems. We also apply the map to simulate liquid para-hydrogen, finding that the resulting radial pair distribution functions agree well with path integral Monte Carlo simulations. The surprising accessibility and transferability of the technique suggest a quantitative route to adapting Born-Oppenheimer potentials, with a motivation similar in spirit to the powerful ideas and approximations of density functional theory. PMID:24790954
Quantum theory for 1D X-ray free electron laser
Anisimov, Petr Mikhaylovich
2017-09-19
Classical 1D X-ray Free Electron Laser (X-ray FEL) theory has stood the test of time by guiding FEL design and development prior to any full-scale analysis. Future X-ray FELs and inverse-Compton sources, where the photon recoil approaches the electron energy spread, push the classical theory to its limits of applicability. After substantial efforts by the community to find what those limits are, there is no universally agreed-upon quantum approach to the design and development of future X-ray sources. We offer a new approach to formulating the quantum theory for 1D X-ray FELs that has an obvious connection to the classical theory, which allows for immediate transfer of knowledge between the two regimes. In conclusion, we exploit this connection in order to draw conclusions about the quantum nature of electrons and generated radiation in terms of FEL variables.
NASA Astrophysics Data System (ADS)
Rezaei Kivi, Araz; Azizi, Saber; Norouzi, Peyman
2017-12-01
In this paper, the nonlinear size-dependent static and dynamic behavior of an electrostatically actuated nano-beam is investigated. A fully clamped nano-beam is considered for modeling the deformable electrode of the NEMS. The governing differential equation of motion is derived using Hamilton's principle based on the modified couple stress theory (MCST), a non-classical theory that accounts for length-scale effects. The nonlinear partial differential equation of motion is discretized into nonlinear Duffing-type ODEs using the Galerkin method. Static and dynamic pull-in instabilities obtained by both the classical theory and MCST are compared. At the second stage of the analysis, a shooting technique is utilized to obtain the frequency-response curve and to capture the periodic solutions of the motion; the stability of the periodic solutions is determined by Floquet theory. The nonlinear dynamic behavior of the deformable electrode under AC harmonic actuation, together with its size dependency, is investigated.
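The Galerkin reduction described above produces, schematically, a single-mode Duffing-type modal equation; a generic form (all coefficients depending on the beam properties, actuation voltage, and length-scale parameter, written here purely for illustration) is

\[
\ddot{q} + c\,\dot{q} + \alpha\, q + \beta\, q^{3} = f_{0} + f_{1}\cos(\Omega t),
\]

whose periodic solutions and their Floquet stability are what a shooting method then tracks; the electrostatic force additionally contributes displacement-dependent terms not shown in this generic sketch.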
Finite conformal quantum gravity and spacetime singularities
NASA Astrophysics Data System (ADS)
Modesto, Leonardo; Rachwał, Lesław
2017-12-01
We show that a class of finite quantum non-local gravitational theories is conformally invariant at classical as well as at quantum level. This is actually a range of conformal anomaly-free theories in the spontaneously broken phase of the Weyl symmetry. At classical level we show how the Weyl conformal invariance is able to tame all the spacetime singularities that plague not only Einstein gravity, but also local and weakly non-local higher derivative theories. The latter statement is proved by a singularity theorem that applies to a large class of weakly non-local theories. Therefore, we are entitled to look for a solution of the spacetime singularity puzzle in a missed symmetry of nature, namely the Weyl conformal symmetry. Following the seminal paper by Narlikar and Kembhavi, we provide an explicit construction of singularity-free black hole exact solutions in a class of conformally invariant theories.
NASA Astrophysics Data System (ADS)
Argurio, Riccardo
1998-07-01
The thesis begins with an introduction to M-theory (at a graduate student's level), starting from perturbative string theory and proceeding to dualities, D-branes and finally Matrix theory. The following chapter gives a self-contained treatment of general classical p-brane solutions. Black and extremal branes are reviewed, along with their semi-classical thermodynamics. We then focus on intersecting extremal branes, the intersection rules being derived both with and without the explicit use of supersymmetry. The last three chapters cover more advanced aspects of brane physics, such as the dynamics of open branes, the little theories on the world-volume of branes and how the four-dimensional Schwarzschild black hole can be mapped to an extremal configuration of branes, thus allowing for a statistical interpretation of its entropy. The original results were already reported in hep-th/9701042, hep-th/9704190, hep-th/9710027 and hep-th/9801053.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, R. G., E-mail: rgh@doe.carleton.ca
2014-01-21
A positive-feedback mean-field modification of the classical Brillouin magnetization theory provides an explanation of the apparent persistence of the spontaneous magnetization beyond the conventional Curie temperature, the little-understood "tail" phenomenon that occurs in many ferromagnetic materials. The classical theory is unable to resolve this apparent anomaly. The modified theory incorporates the temperature-dependent quantum-scale hysteretic and mesoscopic domain-scale anhysteretic magnetization processes and includes the effects of demagnetizing and exchange fields. It is found that the thermal behavior of the reversible and irreversible segments of the hysteresis loops, as predicted by the theory, is a key to the presence or absence of the "tails." The theory, which permits arbitrary values of the quantum spin number J, generally provides quantitative agreement with the thermal variations of both the spontaneous magnetization and the shape of the hysteresis loop.
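For orientation, the classical Brillouin mean-field theory being modified here solves the standard self-consistency condition (textbook form, with λ the mean-field constant; not taken from the abstract):

\[
M = N g_J \mu_B J\, B_J(x), \qquad x = \frac{g_J \mu_B J\,(H + \lambda M)}{k_B T},
\]
\[
B_J(x) = \frac{2J+1}{2J}\coth\!\left(\frac{2J+1}{2J}\,x\right) - \frac{1}{2J}\coth\!\left(\frac{x}{2J}\right),
\]

which yields M = 0 above the Curie temperature, hence the classical theory's inability to produce the observed "tails."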
Item Response Modeling with Sum Scores
ERIC Educational Resources Information Center
Johnson, Timothy R.
2013-01-01
One of the distinctions between classical test theory and item response theory is that the former focuses on sum scores and their relationship to true scores, whereas the latter concerns item responses and their relationship to latent scores. Although item response theory is often viewed as the richer of the two theories, sum scores are still…
Test Theories, Educational Priorities and Reliability of Public Examinations in England
ERIC Educational Resources Information Center
Baird, Jo-Anne; Black, Paul
2013-01-01
Much has already been written on the controversies surrounding the use of different test theories in educational assessment. Other authors have noted the prevalence of classical test theory over item response theory in practice. This Special Issue draws together articles based upon work conducted on the Reliability Programme for England's…
Recent developments in bimetric theory
NASA Astrophysics Data System (ADS)
Schmidt-May, Angnis; von Strauss, Mikael
2016-05-01
This review is dedicated to recent progress in the field of classical, interacting, massive spin-2 theories, with a focus on ghost-free bimetric theory. We will outline its history and its development as a nontrivial extension and generalisation of nonlinear massive gravity. We present a detailed discussion of the consistency proofs of both theories, before we review Einstein solutions to the bimetric equations of motion in vacuum as well as the resulting mass spectrum. We introduce couplings to matter and then discuss the general relativity and massive gravity limits of bimetric theory, which correspond to decoupling the massive or the massless spin-2 field from the matter sector, respectively. More general classical solutions are reviewed and the present status of bimetric cosmology is summarised. An interesting corner in the bimetric parameter space which could potentially give rise to a nonlinear theory for partially massless spin-2 fields is also discussed. Relations to higher-curvature theories of gravity are explained and finally we give an overview of possible extensions of the theory and review its formulation in terms of vielbeins.
NASA Technical Reports Server (NTRS)
Jones, R. T. (Compiler)
1979-01-01
A collection of papers on modern theoretical aerodynamics is presented. Included are theories of incompressible potential flow and research on the aerodynamic forces on wing and wing sections of aircraft and on airship hulls.
Scalar gravitational waves in the effective theory of gravity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mottola, Emil
As a low energy effective field theory, classical General Relativity receives an infrared relevant modification from the conformal trace anomaly of the energy-momentum tensor of massless, or nearly massless, quantum fields. The local form of the effective action associated with the trace anomaly is expressed in terms of a dynamical scalar field that couples to the conformal factor of the spacetime metric, allowing it to propagate over macroscopic distances. Linearized around flat spacetime, this semi-classical EFT admits scalar gravitational wave solutions in addition to the transversely polarized tensor waves of the classical Einstein theory. The amplitude of the scalar wave modes, as well as their energy and energy flux, which are positive and contain a monopole moment, are computed. As a result, astrophysical sources for scalar gravitational waves are considered, with the excited gluonic condensates in the interiors of neutron stars in merger events with other compact objects likely to provide the strongest burst signals.
NASA Astrophysics Data System (ADS)
Zamani Kouhpanji, Mohammad Reza; Behzadirad, Mahmoud; Busani, Tito
2017-12-01
We used the stable strain gradient theory including acceleration gradients to investigate the classical and nonclassical mechanical properties of gallium nitride (GaN) nanowires (NWs). We predicted the static length scales, Young's modulus, and shear modulus of the GaN NWs from the experimental data. Combining these results with atomic simulations, we also found the dynamic length scale of the GaN NWs. Young's modulus, shear modulus, and the static and dynamic length scales were found to be 318 GPa, 131 GPa, 8 nm, and 8.9 nm, respectively, usable for describing the static and dynamic behaviors of GaN NWs having diameters from a few nm to bulk dimensions. Furthermore, the experimental data were analyzed with classical continuum theory (CCT) and compared with the available literature to illustrate the size dependency of the mechanical properties of GaN NWs. This practice resolves previously published discrepancies that arose from the limitations of CCT in determining the mechanical properties of GaN NWs and their size dependency.
Banik, Suman Kumar; Bag, Bidhan Chandra; Ray, Deb Shankar
2002-05-01
Traditionally, quantum Brownian motion is described by Fokker-Planck or diffusion equations in terms of quasiprobability distribution functions, e.g., Wigner functions. These often become singular or negative in the full quantum regime. In this paper a simple approach to the non-Markovian theory of quantum Brownian motion using true probability distribution functions is presented. Based on an initial coherent state representation of the bath oscillators and an equilibrium canonical distribution of the quantum mechanical mean values of their coordinates and momenta, we derive a generalized quantum Langevin equation in c-numbers and show that the latter is amenable to theoretical analysis in terms of the classical theory of non-Markovian dynamics. The corresponding Fokker-Planck, diffusion, and Smoluchowski equations are the exact quantum analogs of their classical counterparts. The present work is independent of path integral techniques. The theory as developed here is a natural extension of its classical version and is valid for arbitrary temperature and friction (the Smoluchowski equation being considered in the overdamped limit).
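For orientation, the classical generalized Langevin equation that the c-number equation parallels has the standard non-Markovian form (textbook expression with the classical fluctuation-dissipation relation; not quoted from the paper):

\[
m\,\ddot{x}(t) = -V'(x) - \int_{0}^{t} \gamma(t-t')\,\dot{x}(t')\,dt' + \xi(t), \qquad \langle \xi(t)\,\xi(t')\rangle = k_B T\,\gamma(t-t'),
\]

where γ is the memory kernel; the Markovian (memoryless) limit γ(t) = 2γ₀δ(t) recovers ordinary Brownian motion.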
Density-functional theory simulation of large quantum dots
NASA Astrophysics Data System (ADS)
Jiang, Hong; Baranger, Harold U.; Yang, Weitao
2003-10-01
Kohn-Sham spin-density functional theory provides an efficient and accurate model to study electron-electron interaction effects in quantum dots, but its application to large systems is a challenge. Here an efficient method for the simulation of quantum dots using density-functional theory is developed; it includes the particle-in-the-box representation of the Kohn-Sham orbitals, an efficient conjugate-gradient method to directly minimize the total energy, a Fourier convolution approach for the calculation of the Hartree potential, and a simplified multigrid technique to accelerate the convergence. We test the methodology in a two-dimensional model system and show that numerical studies of large quantum dots with several hundred electrons become computationally affordable. In the noninteracting limit, the classical dynamics of the system we study can be continuously varied from integrable to fully chaotic. The qualitative difference in the noninteracting classical dynamics has an effect on the quantum properties of the interacting system: integrable classical dynamics leads to higher-spin states and a broader distribution of spacing between Coulomb blockade peaks.
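A minimal sketch of the Fourier-convolution step for the Hartree potential, in the spirit of the method described; the grid size, Gaussian test density, and the softening of the 1/r kernel are illustrative assumptions, not the authors' choices:

    import numpy as np
    from scipy.signal import fftconvolve

    # Illustrative 2D grid and a Gaussian toy density (arbitrary units).
    n, L = 128, 20.0
    x = np.linspace(-L / 2, L / 2, n)
    X, Y = np.meshgrid(x, x, indexing="ij")
    rho = np.exp(-(X**2 + Y**2))  # toy electron density

    # Softened Coulomb kernel 1/r on the same grid; eps avoids the
    # r = 0 singularity and is purely an illustrative regularization.
    eps = x[1] - x[0]
    kernel = 1.0 / np.sqrt(X**2 + Y**2 + eps**2)

    # Hartree potential as the convolution V_H = rho * (1/r), done via FFTs;
    # mode="same" keeps the result on the original grid, dA is the area element.
    dA = (x[1] - x[0])**2
    V_H = fftconvolve(rho, kernel, mode="same") * dA

The FFT-based convolution costs O(n² log n) on an n × n grid rather than O(n⁴) for a direct double sum, which is part of what keeps large dots affordable.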
Scalar gravitational waves in the effective theory of gravity
Mottola, Emil
2017-07-10
As a low energy effective field theory, classical General Relativity receives an infrared relevant modification from the conformal trace anomaly of the energy-momentum tensor of massless, or nearly massless, quantum fields. The local form of the effective action associated with the trace anomaly is expressed in terms of a dynamical scalar field that couples to the conformal factor of the spacetime metric, allowing it to propagate over macroscopic distances. Linearized around flat spacetime, this semi-classical EFT admits scalar gravitational wave solutions in addition to the transversely polarized tensor waves of the classical Einstein theory. The amplitude of the scalar wave modes, as well as their energy and energy flux, which are positive and contain a monopole moment, are computed. As a result, astrophysical sources for scalar gravitational waves are considered, with the excited gluonic condensates in the interiors of neutron stars in merger events with other compact objects likely to provide the strongest burst signals.
An appraisal of the classic forest succession paradigm with the shade tolerance index
Jean Lienard; Ionut Florescu; Nikolay Strigul
2015-01-01
We revisit the classic theory of forest succession that relates shade tolerance and species replacement and assess its validity to understand patch-mosaic patterns of forested ecosystems of the USA. We introduce a macroscopic parameter called the "shade tolerance index" and compare it to the classic continuum index in southern Wisconsin forests. We exemplify shade...
Noncommutative gauge theory for Poisson manifolds
NASA Astrophysics Data System (ADS)
Jurčo, Branislav; Schupp, Peter; Wess, Julius
2000-09-01
A noncommutative gauge theory is associated to every Abelian gauge theory on a Poisson manifold. The semi-classical and full quantum version of the map from the ordinary gauge theory to the noncommutative gauge theory (Seiberg-Witten map) is given explicitly to all orders for any Poisson manifold in the Abelian case. In the quantum case the construction is based on Kontsevich's formality theorem.
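To lowest order in θ the Abelian Seiberg-Witten map has the commonly quoted form (standard first-order expression, shown for orientation; the paper constructs the map to all orders for any Poisson structure):

\[
\hat{A}_{\mu} = A_{\mu} - \tfrac{1}{2}\,\theta^{\kappa\lambda} A_{\kappa}\left(\partial_{\lambda} A_{\mu} + F_{\lambda\mu}\right) + \mathcal{O}(\theta^{2}),
\]

relating the ordinary gauge potential A to its noncommutative counterpart such that gauge orbits are mapped to gauge orbits.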
The Foundations of Einstein's Theory of Gravitation
NASA Astrophysics Data System (ADS)
Freundlich, Erwin; Brose, Translated by Henry L.; Einstein, Preface by Albert; Turner, Introduction by H. H.
2011-06-01
Introduction; 1. The special theory of relativity as a stepping-stone to the general theory of relativity; 2. Two fundamental postulates in the mathematical formulation of physical laws; 3. Concerning the fulfilment of the two postulates; 4. The difficulties in the principles of classical mechanics; 5. Einstein's theory of gravitation; 6. The verification of the new theory by actual experience; Appendix; Index.
ERIC Educational Resources Information Center
Besson, Ugo
2013-01-01
This paper presents a history of research and theories on sliding friction between solids. This history is divided into four phases: from Leonardo da Vinci to Coulomb and the establishment of the classical laws of friction; the theories of lubrication and Tomlinson's theory of friction (1850-1930); the theories of wear, Bowden and Tabor's…
NASA Technical Reports Server (NTRS)
Stein, Manuel; Sydow, P. Daniel; Librescu, Liviu
1990-01-01
Buckling and postbuckling results are presented for compression-loaded simply supported aluminum plates and composite plates with a symmetric lay-up of thin ±45 deg plies composed of many layers. Buckling results for aluminum plates of finite length are given for various length-to-width ratios. Asymptotes to the curves based on buckling results give N_x,cr for plates of infinite length. Postbuckling results for plates with transverse shearing flexibility are compared to results from classical theory for various width-to-thickness ratios. Characteristic curves indicating the average longitudinal direct stress resultant as a function of the applied displacements are calculated based on four different theories: classical von Karman theory using the Kirchhoff assumptions, first-order shear deformation theory, higher-order shear deformation theory, and 3-D flexibility theory. Present results indicate that the 3-D flexibility theory gives the lowest buckling loads. The higher-order shear deformation theory has fewer unknowns than the 3-D flexibility theory but does not take into account through-the-thickness effects. The figures presented show that small differences occur in the average longitudinal direct stress resultants from the four theories as functions of the applied end-shortening displacement.
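For context, the classical Kirchhoff-theory benchmark against which the shear-deformation and 3-D results are usually compared is the plate-buckling formula (standard textbook form, not from the paper):

\[
N_{x,\mathrm{cr}} = k\,\frac{\pi^{2} D}{b^{2}}, \qquad D = \frac{E\,t^{3}}{12\,(1-\nu^{2})},
\]

where b is the plate width, t the thickness, and the coefficient k depends on the aspect ratio and boundary conditions (k = 4 for a long, simply supported plate in uniaxial compression); shear-deformable and 3-D theories reduce N_x,cr as the width-to-thickness ratio decreases.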
Cognitive-Behavioral Therapy. Second Edition. Theories of Psychotherapy Series
ERIC Educational Resources Information Center
Craske, Michelle G.
2017-01-01
In this revised edition of "Cognitive-Behavioral Therapy," Michelle G. Craske discusses the history, theory, and practice of this commonly practiced therapy. Cognitive-behavioral therapy (CBT) originated in the science and theory of classical and instrumental conditioning when cognitive principles were adopted following dissatisfaction…
A Transferrable Belief Model Representation for Physical Security of Nuclear Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Gerts
This work analyzed various probabilistic methods such as classic statistics, Bayesian inference, possibilistic theory, and the Dempster-Shafer theory of belief functions for the potential insight offered into the physical security of nuclear materials, as well as broader application to automated decision-making theory in nuclear non-proliferation. A review of the fundamental heuristics and basic limitations of each of these methods suggested that the Dempster-Shafer theory of belief functions may offer significant capability. Further examination of the various interpretations of Dempster-Shafer theory, such as random set, generalized Bayesian, and upper/lower probability, demonstrates some limitations. Compared to the other heuristics, the transferrable belief model (TBM), one of the leading interpretations of Dempster-Shafer theory, can improve the automated detection of violations of physical security using sensors and human judgment. The improvement is shown to give a significant heuristic advantage over the other probabilistic options by demonstrating significant successes for several classic gedanken experiments.
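A minimal sketch of Dempster's rule of combination, the core operation underlying the belief-function methods compared above; the frame and mass values are toy inputs invented for illustration:

    from itertools import product

    def dempster_combine(m1, m2):
        """Combine two mass functions (dicts mapping frozenset -> mass)."""
        combined, conflict = {}, 0.0
        for (a, x), (b, y) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + x * y
            else:
                conflict += x * y  # mass falling on the empty set
        # Normalize by 1 - K, where K is the total conflicting mass.
        return {s: v / (1.0 - conflict) for s, v in combined.items()}

    # Toy example: two sensors reporting on states {'intrusion', 'nominal'}.
    m1 = {frozenset({'intrusion'}): 0.7,
          frozenset({'intrusion', 'nominal'}): 0.3}
    m2 = {frozenset({'intrusion'}): 0.6,
          frozenset({'nominal'}): 0.1,
          frozenset({'intrusion', 'nominal'}): 0.3}
    print(dempster_combine(m1, m2))

As a design note, the TBM reading discussed in the abstract is usually presented with the unnormalized conjunctive rule, leaving the conflict mass K on the empty set rather than dividing it out.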
NASA Astrophysics Data System (ADS)
Frič, Roman; Papčo, Martin
2017-12-01
Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and that the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and that the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.
There is no fitness but fitness, and the lineage is its bearer
2016-01-01
Inclusive fitness has been the cornerstone of social evolution theory for more than a half-century and has matured as a mathematical theory in the past 20 years. Yet surprisingly for a theory so central to an entire field, some of its connections to evolutionary theory more broadly remain contentious or underappreciated. In this paper, we aim to emphasize the connection between inclusive fitness and modern evolutionary theory through the following fact: inclusive fitness is simply classical Darwinian fitness, averaged over social, environmental and demographic states that members of a gene lineage experience. Therefore, inclusive fitness is neither a generalization of classical fitness, nor does it belong exclusively to the individual. Rather, the lineage perspective emphasizes that evolutionary success is determined by the effect of selection on all biological and environmental contexts that a lineage may experience. We argue that this understanding of inclusive fitness based on gene lineages provides the most illuminating and accurate picture and avoids pitfalls in interpretation and empirical applications of inclusive fitness theory. PMID:26729925
First Test of Long-Range Collisional Drag via Plasma Wave Damping
NASA Astrophysics Data System (ADS)
Affolter, Matthew
2017-10-01
In magnetized plasmas, the rate of particle collisions is enhanced over classical predictions when the cyclotron radius rc is less than the Debye length λD. Classical theories describe local velocity scattering collisions with impact parameters ρ
NASA Astrophysics Data System (ADS)
Surana, K. S.; Joy, A. D.; Reddy, J. N.
2017-03-01
This paper presents a non-classical continuum theory in Lagrangian description for solids in which the conservation and balance laws are derived by incorporating both the internal rotations arising from the Jacobian of deformation and the rotations of Cosserat theories at a material point. In particular, in this non-classical continuum theory we have (i) the usual displacements u, (ii) three internal rotations Θ_i about the axes of a triad whose axes are parallel to the x-frame, arising from the Jacobian of deformation (and completely defined by its skew-symmetric part), and (iii) three additional rotations Θ_e about the axes of the same triad located at each material point, taken as three additional degrees of freedom and referred to as Cosserat rotations. This gives rise to u and Θ_e as six degrees of freedom at a material point. The internal rotations Θ_i, often neglected in classical continuum mechanics, exist in all deforming solid continua, as they are due to the Jacobian of deformation. When the internal rotations Θ_i are resisted by the deforming matter, a conjugate moment tensor arises that, together with Θ_i, may result in energy storage and/or dissipation, which must be accounted for in the conservation and balance laws. The Cosserat rotations Θ_e also result in a conjugate moment tensor which, together with Θ_e, may result in energy storage and/or dissipation. The main focus of the paper is a consistent derivation of conservation and balance laws that incorporate the aforementioned physics and associated constitutive theories for thermoelastic solids. The mathematical model derived here has closure, and the constitutive theories derived using two alternate approaches are in agreement with each other as well as with the condition resulting from the entropy inequality. Material coefficients introduced in the constitutive theories are clearly defined and discussed.
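The statement that the internal rotations are fixed by the skew-symmetric part of the Jacobian of deformation can be made concrete with the standard kinematic decomposition of the displacement gradient (added for orientation):

\[
\nabla \mathbf{u} = \underbrace{\tfrac{1}{2}\left(\nabla\mathbf{u} + (\nabla\mathbf{u})^{T}\right)}_{\text{symmetric: strain}} \; + \; \underbrace{\tfrac{1}{2}\left(\nabla\mathbf{u} - (\nabla\mathbf{u})^{T}\right)}_{\text{skew: internal rotation}},
\]

where the three independent entries of the skew part define the rotation vector Θ_i, while the Cosserat rotations Θ_e remain independent kinematic variables.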
Action and entanglement in gravity and field theory.
Neiman, Yasha
2013-12-27
In nongravitational quantum field theory, the entanglement entropy across a surface depends on the short-distance regularization. Quantum gravity should not require such regularization, and it has been conjectured that the entanglement entropy there is always given by the black hole entropy formula evaluated on the entangling surface. We show that these statements have precise classical counterparts at the level of the action. Specifically, we point out that the action can have a nonadditive imaginary part. In gravity, the latter is fixed by the black hole entropy formula, while in nongravitating theories it is arbitrary. From these classical facts, the entanglement entropy conjecture follows by heuristically applying the relation between actions and wave functions.
Inelastic black hole scattering from charged scalar amplitudes
NASA Astrophysics Data System (ADS)
Luna, Andrés; Nicholson, Isobel; O'Connell, Donal; White, Chris D.
2018-03-01
We explain how the lowest-order classical gravitational radiation produced during the inelastic scattering of two Schwarzschild black holes in General Relativity can be obtained from a tree scattering amplitude in gauge theory coupled to scalar fields. The gauge calculation is related to gravity through the double copy. We remove unwanted scalar forces which can occur in the double copy by introducing a massless scalar in the gauge theory, which is treated as a ghost in the link to gravity. We hope these methods are a step towards a direct application of the double copy at higher orders in classical perturbation theory, with the potential to greatly streamline gravity calculations for phenomenological applications.
1987-09-01
response. An estimate of the buffeting response for the two cases is presented in Figure 4, using the theory of Irwin (Reference 7). Data acquisition was...values were obtained using the log decrement method by exciting the bridge in one mode and observing the decay of the response. Classical theory would...added mass or structural damping level. The addition of inertia to the deck would tend to lower the response according to classical vibration theory
An entropy method for induced drag minimization
NASA Technical Reports Server (NTRS)
Greene, George C.
1989-01-01
A fundamentally new approach to the aircraft minimum induced drag problem is presented. The method, a 'viscous lifting line', is based on the minimum entropy production principle and does not require the planar wake assumption. An approximate, closed-form solution is obtained for several wing configurations, including a comparison of wing extension, winglets, and in-plane wing sweep, with and without a constraint on wing-root bending moment. Like the classical lifting-line theory, this theory predicts that induced drag is proportional to the square of the lift coefficient and inversely proportional to the wing aspect ratio. Unlike the classical theory, it predicts that induced drag is Reynolds number dependent and that the optimum spanwise circulation distribution is non-elliptic.
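The classical lifting-line scaling referred to can be written compactly (standard notation, added for orientation):

\[
C_{D,i} = \frac{C_L^{2}}{\pi\, e\, AR},
\]

where AR is the wing aspect ratio and e the span-efficiency factor (e = 1 for the elliptic circulation distribution); the entropy-based theory adds a Reynolds-number dependence to this picture and shifts the optimum away from the elliptic distribution.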
A novel approach to the theory of homogeneous and heterogeneous nucleation.
Ruckenstein, Eli; Berim, Gersh O; Narsimhan, Ganesan
2015-01-01
A new approach to the theory of nucleation, formulated relatively recently by Ruckenstein, Narsimhan, and Nowakowski (see Refs. [7-16]) and developed further by Ruckenstein and other colleagues, is presented. In contrast to the classical nucleation theory, which is based on calculating the free energy of formation of a cluster of the new phase as a function of its size on the basis of macroscopic thermodynamics, the proposed theory uses the kinetic theory of fluids to calculate the condensation (W(+)) and dissociation (W(-)) rates on and from the surface of the cluster, respectively. The dissociation rate of a monomer from a cluster is evaluated from the average time spent by a surface monomer in the potential well, as obtained from the solution of the Fokker-Planck equation in the phase space of position and momentum for the liquid-to-solid transition and in the phase space of energy for the vapor-to-liquid transition. The condensation rates are calculated using traditional expressions. The knowledge of those two rates allows one to calculate the size of the critical cluster from the equality W(+)=W(-) as well as the rate of nucleation. The developed microscopic approach allows one to avoid the controversial application of classical thermodynamics to the description of nuclei which contain only a few molecules. The new theory was applied to a number of cases, such as the liquid-to-solid and vapor-to-liquid phase transitions, binary nucleation, heterogeneous nucleation, nucleation on soluble particles, and protein folding. The theory predicts higher nucleation rates at high saturation ratios (small critical clusters) than the classical nucleation theory for both liquid-to-solid and vapor-to-liquid transitions. As expected, at low saturation ratios, for which the size of the critical cluster is large, the results of the new theory are consistent with those of the classical one. The present approach was combined with the density functional theory to account for the density profile in the cluster. This approach was also applied to protein folding, viewed as the evolution of a cluster of native residues of spherical shape within a protein molecule, which could explain protein folding/unfolding and their dependence on temperature.
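For contrast, the classical nucleation theory that the kinetic approach replaces rests on the macroscopic free energy of a spherical cluster (standard textbook expressions, not from the paper):

\[
\Delta G(r) = 4\pi r^{2}\sigma - \frac{4}{3}\pi r^{3}\,\frac{\Delta\mu}{v}, \qquad r^{*} = \frac{2\sigma v}{\Delta\mu},
\]

where σ is the surface tension, Δμ the chemical-potential difference per molecule driving the transition, v the molecular volume of the new phase, and r* the critical radius; it is precisely the use of σ and ΔG for few-molecule clusters that the kinetic theory avoids.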
Branes and the Kraft-Procesi transition: classical case
NASA Astrophysics Data System (ADS)
Cabrera, Santiago; Hanany, Amihay
2018-04-01
Moduli spaces of a large set of 3d N = 4 effective gauge theories are known to be closures of nilpotent orbits. This set of theories has recently acquired a special status, due to Namikawa's theorem. As a consequence of this theorem, closures of nilpotent orbits are the simplest non-trivial moduli spaces that can be found in three-dimensional theories with eight supercharges. In the early 1980s the mathematicians Hanspeter Kraft and Claudio Procesi characterized an inclusion relation between nilpotent orbit closures of the same classical Lie algebra. We recently [1] showed a physical realization of their work in terms of the motion of D3-branes in the Type IIB superstring embedding of the effective gauge theories. This analysis is restricted to A-type Lie algebras. The present note expands our previous discussion to the remaining classical cases: orthogonal and symplectic algebras. In order to do so we introduce O3-planes in the superstring description. We also find a brane realization for the mathematical map between two partitions of the same integer number known as collapse. Another result is that basic Kraft-Procesi transitions turn out to be described by the moduli space of orthosymplectic quivers with varying boundary conditions.
Nesterenko, Pavel N; Rybalko, Marina A; Paull, Brett
2005-06-01
Significant deviations from classical van Deemter behaviour, indicative of turbulent-flow liquid chromatography, have been recorded for mobile phases of varying viscosity on porous silica monolithic columns at elevated mobile phase flow rates.
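For reference, the classical van Deemter relation from which the deviations are measured expresses plate height H as a function of linear velocity u (standard form):

\[
H = A + \frac{B}{u} + C\,u,
\]

where A, B and C capture eddy diffusion, longitudinal diffusion and mass-transfer resistance, respectively; the turbulent-flow signature is a flattening of H at high u instead of the linear C·u rise.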
Wibral, Michael; Priesemann, Viola; Kay, Jim W; Lizier, Joseph T; Phillips, William A
2017-03-01
In many neural systems anatomical motifs are present repeatedly, but despite their structural similarity they can serve very different tasks. A prime example of such a motif is the canonical microcircuit of six-layered neo-cortex, which is repeated across cortical areas, and is involved in a number of different tasks (e.g. sensory, cognitive, or motor tasks). This observation has spawned interest in finding a common underlying principle, a 'goal function', of information processing implemented in this structure. By definition such a goal function, if universal, cannot be cast in processing-domain specific language (e.g. 'edge filtering', 'working memory'). Thus, to formulate such a principle, we have to use a domain-independent framework. Information theory offers such a framework. However, while the classical framework of information theory focuses on the relation between one input and one output (Shannon's mutual information), we argue that neural information processing crucially depends on the combination of multiple inputs to create the output of a processor. To account for this, we use a very recent extension of Shannon information theory, called partial information decomposition (PID). PID allows one to quantify the information that several inputs provide individually (unique information), redundantly (shared information) or only jointly (synergistic information) about the output. First, we review the framework of PID. Then we apply it to reevaluate and analyze several earlier proposals of information-theoretic neural goal functions (predictive coding, infomax and coherent infomax, efficient coding). We find that PID allows these goal functions to be compared in a common framework, and also provides a versatile approach to designing new goal functions from first principles. Building on this, we design and analyze a novel goal function, called 'coding with synergy', which builds on combining external input and prior knowledge in a synergistic manner. We suggest that this novel goal function may be highly useful in neural information processing.
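For orientation, the PID bookkeeping referred to above can be summarized by its defining relations for one output S and two inputs X and Y (standard identities of the Williams-Beer framework, not quoted from the paper):

\[
I(S; X, Y) = U_X + U_Y + R + C_{\mathrm{syn}},
\]
\[
I(S; X) = U_X + R, \qquad I(S; Y) = U_Y + R,
\]

where R is the shared (redundant) information, U_X and U_Y the unique contributions, and C_syn the synergy available only from the joint observation of both inputs.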
NASA Astrophysics Data System (ADS)
Palmer, T. N.
2012-12-01
This essay discusses a proposal that draws together the three great revolutionary theories of 20th Century physics: quantum theory, relativity theory and chaos theory. Motivated by the Bohmian notion of implicate order, and what in chaos theory would be described as a strange attractor, the proposal attributes special ontological significance to certain non-computable, dynamically invariant state-space geometries for the universe as a whole. Studying the phenomenon of quantum interference, it is proposed to understand quantum wave-particle duality, and indeed classical electromagnetism, in terms of particles in space time and waves on this state space geometry. Studying the EPR experiment, the acausal constraints that this invariant geometry provides on spatially distant degrees of freedom, provides a way for the underlying dynamics to be consistent with the Bell theorem, yet be relativistically covariant ("nonlocality without nonlocality"). It is suggested that the physical basis for such non-computable geometries lies in properties of gravity with the information irreversibility implied by black hole no-hair theorems being crucial. In conclusion it is proposed that quantum theory may be emergent from an extended theory of gravity which is geometric not only in space time, but also in state space. Such a notion would undermine most current attempts to "quantise gravity".
Peyre, Hugo; Leplège, Alain; Coste, Joël
2011-03-01
Missing items are common in quality of life (QoL) questionnaires and present a challenge for research in this field. It remains unclear which of the various methods proposed to deal with missing data performs best in this context. We compared personal mean score, full information maximum likelihood, multiple imputation, and hot deck techniques using various realistic simulation scenarios of item missingness in QoL questionnaires constructed within the framework of classical test theory. Samples of 300 and 1,000 subjects were randomly drawn from the 2003 INSEE Decennial Health Survey (of 23,018 subjects representative of the French population and having completed the SF-36) and various patterns of missing data were generated according to three different item non-response rates (3, 6, and 9%) and three types of missing data (Little and Rubin's "missing completely at random," "missing at random," and "missing not at random"). The missing data methods were evaluated in terms of accuracy and precision for the analysis of one descriptive and one association parameter for three different scales of the SF-36. For all item non-response rates and types of missing data, multiple imputation and full information maximum likelihood appeared superior to the personal mean score and especially to hot deck in terms of accuracy and precision; however, the use of personal mean score was associated with insignificant bias (relative bias <2%) in all studied situations. Whereas multiple imputation and full information maximum likelihood are confirmed as reference methods, the personal mean score appears nonetheless appropriate for dealing with items missing from completed SF-36 questionnaires in most situations of routine use. These results can reasonably be extended to other questionnaires constructed according to classical test theory.
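A minimal sketch of the personal mean score imputation evaluated above, assuming responses arranged as a respondents-by-items array with NaN marking missing items; the half-completion threshold is a common convention assumed here, not necessarily the authors' rule:

    import numpy as np

    def personal_mean_score(responses, min_frac=0.5):
        """Impute each missing item with the respondent's own mean
        over the items of the same scale that were answered."""
        out = responses.astype(float)
        for row in out:
            answered = ~np.isnan(row)
            # Impute only if the respondent answered at least min_frac of items.
            if answered.any() and answered.mean() >= min_frac:
                row[~answered] = row[answered].mean()
        return out

    # Toy scale with 4 items and 3 respondents (NaN = missing).
    scale = np.array([[3, 4, np.nan, 5],
                      [2, np.nan, np.nan, np.nan],  # too incomplete: left as-is
                      [1, 1, 2, np.nan]])
    print(personal_mean_score(scale))

Unlike multiple imputation, this single-imputation rule uses only the respondent's own answered items, which is what makes it simple but slightly less accurate under "missing not at random" patterns.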
From Foucault to Freire Through Facebook: Toward an Integrated Theory of mHealth.
Bull, Sheana; Ezeanochie, Nnamdi
2016-08-01
To document the integration of social science theory in literature on mHealth (mobile health) and consider opportunities for integration of classic theory, health communication theory, and social networking to generate a relevant theory for mHealth program design. A secondary review of research syntheses and meta-analyses published between 2005 and 2014 related to mHealth, using the AMSTAR (A Measurement Tool to Assess Systematic Reviews) methodology for assessment of the quality of each review. High-quality articles from those reviews using a randomized controlled design and integrating social science theory in program design, implementation, or evaluation were reviewed. There were 1,749 articles among the 170 reviews with a high AMSTAR score (≥30). Only 13 were published from 2005 to 2014, used a randomized controlled design, and made explicit mention of theory in any aspect of their mHealth program. All 13 included theoretical perspectives focused on psychological and/or psychosocial theories and constructs. There is very limited use of social science theory in mHealth despite demonstrated benefits in doing so. We propose an integrated theory of mHealth that incorporates classic theory, health communication theory, and social networking to guide development and evaluation of mHealth programs.
Quantum information to the home
NASA Astrophysics Data System (ADS)
Choi, Iris; Young, Robert J.; Townsend, Paul D.
2011-06-01
Information encoded on individual quanta will play an important role in our future lives, much as classically encoded digital information does today. Combining quantum information carried by single photons with classical signals encoded on strong laser pulses in modern fibre-to-the-home (FTTH) networks is a significant challenge, the solution to which will facilitate the global distribution of quantum information to the home and with it a quantum internet [1]. In real-world networks, spontaneous Raman scattering in the optical fibre would induce crosstalk between the high-power classical channels and a single-photon quantum channel, such that the latter is unable to operate. Here, we show that the integration of quantum and classical information on an FTTH network is possible by performing quantum key distribution (QKD) on a network while simultaneously transferring realistic levels of classical data. Our novel scheme involves synchronously interleaving a channel of quantum data with the Raman scattered photons from a classical channel, exploiting the periodic minima in the instantaneous crosstalk and thereby enabling secure QKD to be performed.
Social Comparison: The End of a Theory and the Emergence of a Field
ERIC Educational Resources Information Center
Buunk, Abraham P.; Gibbons, Frederick X.
2007-01-01
The past and current states of research on social comparison are reviewed with regard to a series of major theoretical developments that have occurred in the past 5 decades. These are, in chronological order: (1) classic social comparison theory, (2) fear-affiliation theory, (3) downward comparison theory, (4) social comparison as social…
ERIC Educational Resources Information Center
Gaziano, Cecilie
This paper seeks to integrate some ideas from family systems theory and attachment theory within a theory of public opinion and social movement. Citing the classic "The Authoritarian Personality," the paper states that the first authorities children know, their parents or other caregivers, shape children's attitudes toward all…
Sheaff, R; Lloyd-Kendall, A
2000-07-01
To investigate how far English National Health Service (NHS) Personal Medical Services (PMS) contracts embody a principal-agent relationship between health authorities (HAs) and primary health care providers, especially, but not exclusively, general practices involved in the first wave (1998) of PMS pilot projects; and to consider the implications for relational and classical theories of contract. Content analysis of 71 first-wave PMS contracts. Most PMS contracts reflect current English NHS policy priorities, but few institute mechanisms to ensure that providers realise these objectives. Although PMS contracts have some classical characteristics, relational characteristics are more evident. Some characteristics match neither the classical nor the relational model. First-wave PMS contracts do not appear to embody a strong principal-agent relationship between HAs and primary health care providers. This finding offers little support for the relevance of classical theories of contract, but also implies that relational theories of contract need to be revised for quasi-market settings. Future PMS contracts will need to focus more on evidence-based processes of primary care, health outputs and patient satisfaction and less upon service inputs. PMS contracts will also need to be longer-term contracts in order to promote the 'institutional embedding' of independent general practice in the wider management systems of the NHS.
Pure sources and efficient detectors for optical quantum information processing
NASA Astrophysics Data System (ADS)
Zielnicki, Kevin
Over the last sixty years, classical information theory has revolutionized the understanding of the nature of information, and how it can be quantified and manipulated. Quantum information processing extends these lessons to quantum systems, where the properties of intrinsic uncertainty and entanglement fundamentally defy classical explanation. This growing field has many potential applications, including computing, cryptography, communication, and metrology. As inherently mobile quantum particles, photons are likely to play an important role in any mature large-scale quantum information processing system. However, the available methods for producing and detecting complex multi-photon states place practical limits on the feasibility of sophisticated optical quantum information processing experiments. In a typical quantum information protocol, a source first produces an interesting or useful quantum state (or set of states), perhaps involving superposition or entanglement. Then, some manipulations are performed on this state, perhaps involving quantum logic gates which further manipulate or entangle the initial state. Finally, the state must be detected, obtaining some desired measurement result, e.g., for secure communication or computationally efficient factoring. The work presented here concerns the first and last stages of this process as they relate to photons: sources and detectors. Our work on sources is based on the need for optimized non-classical states of light delivered at high rates, particularly single photons in a pure quantum state. We seek to better understand the properties of spontaneous parametric downconversion (SPDC) sources of photon pairs, and in doing so, produce such an optimized source. We report an SPDC source which produces pure heralded single photons with little or no spectral filtering, allowing a significant rate enhancement. Our work on detectors is based on the need to reliably measure single-photon states. We have focused on optimizing the detection efficiency of visible light photon counters (VLPCs), a single-photon detection technology that is also capable of resolving photon number states. We report a record-breaking quantum efficiency of 91 ± 3% observed with our detection system. Both sources and detectors are independently interesting physical systems worthy of study, but together they promise to enable entire new classes and applications of information based on quantum mechanics.
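The spectral-purity issue driving the source work can be phrased through the Schmidt decomposition of the photon pair's joint spectral amplitude (standard relations, added for orientation):

\[
f(\omega_s, \omega_i) = \sum_n \sqrt{\lambda_n}\; u_n(\omega_s)\, v_n(\omega_i), \qquad P = \sum_n \lambda_n^{2} = \frac{1}{K},
\]

where P is the purity of the heralded single photon and K the Schmidt number; engineering f to be separable (K → 1) removes the need for lossy spectral filtering, which is the rate enhancement the abstract reports.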
Linear Quantum Systems: Non-Classical States and Robust Stability
2016-06-29
quantum linear systems subject to non-classical quantum fields. The major outcomes of this project are (i) derivation of quantum filtering equations for systems with non-classical input states, including single-photon states, (ii) determination of how linear...history going back some 50 years, to the birth of modern control theory with Kalman's foundational work on filtering and LQG optimal control
An Arbitrary First Order Theory Can Be Represented by a Program: A Theorem
NASA Technical Reports Server (NTRS)
Hosheleva, Olga
1997-01-01
How can we represent knowledge inside a computer? For formalized knowledge, classical logic seems to be the most adequate tool. Classical logic is behind all formalisms of classical mathematics, and behind many formalisms used in Artificial Intelligence. There is only one serious problem with classical logic: due to the famous Gödel's theorem, classical logic is algorithmically undecidable; as a result, when knowledge is represented in the form of logical statements, it is very difficult to check whether, based on these statements, a given query is true or not. To make knowledge representations more algorithmic, a special field of logic programming was invented. An important portion of logic programming is algorithmically decidable. To cover knowledge that cannot be represented in this portion, several extensions of the decidable fragments have been proposed. In the spirit of logic programming, these extensions are usually introduced in such a way that even if a general algorithm is not available, good heuristic methods exist. It is important to check whether the already proposed extensions are sufficient, or whether further extensions are necessary. In the present paper, we show that one particular extension, namely logic programming with classical negation, introduced by M. Gelfond and V. Lifschitz, can represent (in some reasonable sense) an arbitrary first order logical theory.
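The distinction that classical negation adds is captured by Gelfond and Lifschitz's well-known railway example, sketched here in schematic rule form (paraphrased, not taken from the paper):

\[
\mathit{cross} \leftarrow \mathbf{not}\ \mathit{train} \qquad \text{versus} \qquad \mathit{cross} \leftarrow \neg\,\mathit{train}
\]

With negation as failure (left), crossing is sanctioned whenever the train's presence cannot be derived; with classical negation (right), crossing requires an explicit derivation that no train is coming, the safer reading for knowledge representation.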
Derivation of Einstein-Cartan theory from general relativity
NASA Astrophysics Data System (ADS)
Petti, Richard
2015-04-01
General relativity cannot describe exchange of classical intrinsic angular momentum and orbital angular momentum. Einstein-Cartan theory fixes this problem in the least invasive way. In the late 20th century, the consensus view was that Einstein-Cartan theory requires inclusion of torsion without adequate justification, it has no empirical support (though it doesn't conflict with any known evidence), it solves no important problem, and it complicates gravitational theory with no compensating benefit. In 1986 the author published a derivation of Einstein-Cartan theory from general relativity, with no additional assumptions or parameters. Starting without torsion, Poincaré symmetry, classical or quantum spin, or spinors, it derives torsion and its relation to spin from a continuum limit of general relativistic solutions. The present work makes the case that this computation, combined with supporting arguments, constitutes a derivation of Einstein-Cartan theory from general relativity, not just a plausibility argument. This paper adds more and simpler explanations, more computational details, correction of a factor of 2, discussion of limitations of the derivation, and discussion of some areas of gravitational research where Einstein-Cartan theory is relevant.
Pollard, Beth; Dixon, Diane; Dieppe, Paul; Johnston, Marie
2009-01-01
Background: The International Classification of Functioning, Disability and Health (ICF) proposes three main health outcomes, Impairment (I), Activity Limitation (A) and Participation Restriction (P), but good measures of these constructs are needed. The aim of this study was to use both Classical Test Theory (CTT) and Item Response Theory (IRT) methods to carry out an item analysis to improve measurement of these three components in patients having joint replacement surgery, mainly for osteoarthritis (OA). Methods: A geographical cohort of patients about to undergo lower limb joint replacement was invited to participate. Five hundred and twenty-four patients completed ICF items that had been previously identified as measuring only a single ICF construct in patients with osteoarthritis. There were 13 I, 26 A and 20 P items. The SF-36 was used to explore the construct validity of the resultant I, A and P measures. The CTT and IRT analyses were run separately to identify items for inclusion or exclusion in the measurement of each construct. The results from both analyses were compared and contrasted. Results: Overall, the item analysis resulted in the removal of 4 I items, 9 A items and 11 P items. CTT and IRT identified the same 14 items for removal, with CTT additionally excluding 3 items, and IRT a further 7 items. In a preliminary exploration of reliability and validity, the new measures appeared acceptable. Conclusion: New measures were developed that reflect the ICF components of Impairment, Activity Limitation and Participation Restriction for patients with advanced arthritis. The resulting Aberdeen IAP measures (Ab-IAP), comprising I (Ab-I, 9 items), A (Ab-A, 17 items), and P (Ab-P, 9 items), met the criteria of conventional psychometric (CTT) analyses and the additional criteria (information and discrimination) of IRT. The use of both methods was more informative than the use of only one. Thus combining CTT and IRT appears to be a valuable tool in the development of measures. PMID:19422677
Is the firewall consistent? Gedanken experiments on black hole complementarity and firewall proposal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hwang, Dong-il; Lee, Bum-Hoon; Yeom, Dong-han, E-mail: dongil.j.hwang@gmail.com, E-mail: bhl@sogang.ac.kr, E-mail: innocent.yeom@gmail.com
2013-01-01
In this paper, we discuss black hole complementarity and the firewall proposal at length. Black hole complementarity is inevitable if we assume the following five things: unitarity, the entropy-area formula, the existence of an information observer, semi-classical quantum field theory for an asymptotic observer, and general relativity for an in-falling observer. However, large N rescaling and the AMPS argument show that black hole complementarity is inconsistent. To salvage the basic philosophy of black hole complementarity, AMPS introduced a firewall around the horizon. According to large N rescaling, the firewall should be located close to the apparent horizon. We investigate the consistency of the firewall with two critical conditions: the firewall should be near the time-like apparent horizon, and it should not affect the future infinity. Concerning this, we have introduced a gravitational collapse with a false vacuum lump which can generate a spacetime structure with disconnected apparent horizons. This reveals a situation in which there is a firewall outside of the event horizon, while the apparent horizon is absent. Therefore, the firewall, if it exists, not only modifies general relativity for an in-falling observer, but also modifies semi-classical quantum field theory for an asymptotic observer.
Is the firewall consistent? Gedanken experiments on black hole complementarity and firewall proposal
NASA Astrophysics Data System (ADS)
Hwang, Dong-il; Lee, Bum-Hoon; Yeom, Dong-han
2013-01-01
In this paper, we discuss black hole complementarity and the firewall proposal at length. Black hole complementarity is inevitable if we assume the following five things: unitarity, the entropy-area formula, the existence of an information observer, semi-classical quantum field theory for an asymptotic observer, and general relativity for an in-falling observer. However, large N rescaling and the AMPS argument show that black hole complementarity is inconsistent. To salvage the basic philosophy of black hole complementarity, AMPS introduced a firewall around the horizon. According to large N rescaling, the firewall should be located close to the apparent horizon. We investigate the consistency of the firewall with two critical conditions: the firewall should be near the time-like apparent horizon, and it should not affect the future infinity. Concerning this, we have introduced a gravitational collapse with a false vacuum lump which can generate a spacetime structure with disconnected apparent horizons. This reveals a situation in which there is a firewall outside of the event horizon, while the apparent horizon is absent. Therefore, the firewall, if it exists, not only modifies general relativity for an in-falling observer, but also modifies semi-classical quantum field theory for an asymptotic observer.
Towards a General Model of Temporal Discounting
ERIC Educational Resources Information Center
van den Bos, Wouter; McClure, Samuel M.
2013-01-01
Psychological models of temporal discounting have now successfully displaced classical economic theory due to the simple fact that many common behavior patterns, such as impulsivity, were unexplainable with classic models. However, the now dominant hyperbolic model of discounting is itself becoming increasingly strained. Numerous factors have…
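The two discount functions at issue can be placed side by side (standard forms; A is the delayed amount, D the delay, and k an idiosyncratic discount rate):

\[
V_{\mathrm{exp}}(D) = A\, e^{-kD}, \qquad V_{\mathrm{hyp}}(D) = \frac{A}{1 + kD},
\]

where the hyperbolic form, unlike the exponential one, predicts the preference reversals at short delays that are taken as the signature of impulsivity.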
The evolution of consciousness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stapp, H.P.
1996-08-16
It is argued that the principles of classical physics are inimical to the development of an adequate science of consciousness. The problem is that insofar as the classical principles are valid, consciousness can have no effect on the behavior, and hence on the survival prospects, of the organisms in which it inheres. Thus within the classical framework it is not possible to explain in natural terms the development of consciousness to the high-level form found in human beings. In quantum theory, on the other hand, consciousness can be dynamically efficacious: quantum theory does allow consciousness to influence behavior, and thence to evolve in accordance with the principles of natural selection. However, this evolutionary requirement places important constraints upon the details of the formulation of the quantum dynamical principles.
Classical and quantum production of cornucopions at energies below 10^18 GeV
NASA Astrophysics Data System (ADS)
Banks, T.; O'loughlin, M.
1993-01-01
We argue that the paradoxes associated with infinitely degenerate states, which plague relic particle scenarios for the end point of black hole evaporation, may be absent when the relics are horned particles. Most of our arguments are based on simple observations about the classical geometry of extremal dilaton black holes, but at a crucial point we are forced to speculate about classical solutions to string theory in which the infinite coupling singularity of the extremal dilaton solution is shielded by a condensate of massless modes propagating in its infinite horn. We use the nonsingular c=1 solution of (1+1)-dimensional string theory as a crude model for the properties of the condensate. We also present a brief discussion of more general relic scenarios based on large relics of low mass.
Classically and quantum stable emergent universe from conservation laws
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campo, Sergio del; Herrera, Ramón; Guendelman, Eduardo I.
It has been recently pointed out by Mithani-Vilenkin [1-4] that certain emergent universe scenarios which are classically stable are nevertheless semiclassically unstable to collapse. Here, we show that there is a class of emergent universes, derived from scale-invariant two-measures theories with spontaneous symmetry breaking (s.s.b.) of the scale invariance, which are classically stable and do not suffer the instability pointed out by Mithani-Vilenkin towards collapse. We find that this stability is due to the presence of a symmetry in the 'emergent phase', which, together with the nonlinearities of the theory, does not allow the FLRW scale factor to become smaller than a certain minimum value a_0 in a certain protected region.
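For orientation (a standard definition from the emergent-universe literature, not quoted from the paper): an emergent universe is a past-eternal FLRW solution whose scale factor is bounded away from zero,

\[ a(t) \ge a_0 > 0 \quad \text{for all } t, \qquad a(t) \to a_0 \ \text{as} \ t \to -\infty, \]

so the semiclassical stability question is whether quantum effects allow a(t) to tunnel below the protected minimum a_0.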
Visual working memory buffers information retrieved from visual long-term memory.
Fukuda, Keisuke; Woodman, Geoffrey F
2017-05-16
Human memory is thought to consist of long-term storage and short-term storage mechanisms, the latter known as working memory. Although it has long been assumed that information retrieved from long-term memory is represented in working memory, we lack neural evidence for this and need neural measures that allow us to watch this retrieval into working memory unfold with high temporal resolution. Here, we show that human electrophysiology can be used to track information as it is brought back into working memory during retrieval from long-term memory. Specifically, we found that the retrieval of information from long-term memory was limited to just a few simple objects' worth of information at once, and elicited a pattern of neurophysiological activity similar to that observed when people encode new information into working memory. Our findings suggest that working memory is where information is buffered when being retrieved from long-term memory and reconcile current theories of memory retrieval with classic notions about the memory mechanisms involved.
Semi-classical Reissner-Nordström model for the structure of charged leptons
NASA Technical Reports Server (NTRS)
Rosen, G.
1980-01-01
The lepton self-mass problem is examined within the framework of the quantum theory of electromagnetism and gravity. Consideration is given to the Reissner-Nordström solution to the Einstein-Maxwell classical field equations for an electrically charged mass point, and the WKB theory for a semiclassical system with total energy zero is used to obtain an expression for the Einstein-Maxwell action factor. The condition obtained is found to account for the observed mass values of the three charged leptons, and to be in agreement with the correspondence principle.
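For orientation (standard textbook form, not reproduced from the paper): the Reissner-Nordström solution for a point mass m with charge e is, in geometrized units (G = c = 1),

\[ ds^2 = -f(r)\,dt^2 + \frac{dr^2}{f(r)} + r^2\, d\Omega^2, \qquad f(r) = 1 - \frac{2m}{r} + \frac{e^2}{r^2}. \]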
Redundancy of constraints in the classical and quantum theories of gravitation.
NASA Technical Reports Server (NTRS)
Moncrief, V.
1972-01-01
It is shown that in Dirac's version of the quantum theory of gravitation, the Hamiltonian constraints are greatly redundant. If the Hamiltonian constraint condition is satisfied at one point on the underlying, closed three-dimensional manifold, then it is automatically satisfied at every point, provided only that the momentum constraints are everywhere satisfied. This permits one to replace the usual infinity of Hamiltonian constraints by a single condition which may be taken in the form of an integral over the manifold. Analogous theorems are given for the classical Einstein Hamilton-Jacobi equations.
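For orientation (standard ADM notation, not quoted from the paper): at each point x of the closed three-manifold Σ one has the momentum constraints \(\mathcal{H}_i(x) \approx 0\) and the Hamiltonian constraint \(\mathcal{H}(x) \approx 0\); the redundancy theorem allows the latter infinite family to be replaced, once the momentum constraints hold everywhere, by the single integrated condition

\[ \int_\Sigma \mathcal{H}(x)\, d^3x \;\approx\; 0. \]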
Quantum gambling based on Nash-equilibrium
NASA Astrophysics Data System (ADS)
Zhang, Pei; Zhou, Xiao-Qi; Wang, Yun-Long; Liu, Bi-Heng; Shadbolt, Pete; Zhang, Yong-Sheng; Gao, Hong; Li, Fu-Li; O'Brien, Jeremy L.
2017-06-01
The problem of establishing a fair bet between a spatially separated gambler and casino can only be solved in the classical regime by relying on a trusted third party. By combining Nash-equilibrium theory with quantum game theory, we show that a secure, remote, two-party game can be played using a quantum gambling machine which has no classical counterpart. Specifically, by modifying the Nash-equilibrium point we can construct games with an arbitrary amount of bias, including a game that is demonstrably fair to both parties. We also report a proof-of-principle experimental demonstration using linear optics.
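For orientation, a purely classical toy version of the fairness criterion (this is not the paper's quantum protocol; the payoff structure and bias parameter b are hypothetical):

```python
# Toy two-party bet: the gambler stakes 1 unit, wins (1 - b) with
# probability p and loses the stake otherwise; b is a bias knob.
def gambler_expected_value(p, b=0.0):
    return p * (1.0 - b) - (1.0 - p) * 1.0

# A fair game has zero expected value for both sides. With b = 0 the
# break-even point is p = 1/2; any b > 0 tilts the game to the casino.
for b in (0.0, 0.2):
    p_fair = 1.0 / (2.0 - b)   # solves p*(1-b) - (1-p) = 0
    print(f"b={b}: break-even p = {p_fair:.3f}, "
          f"EV at p=0.5 = {gambler_expected_value(0.5, b):+.3f}")
```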
The energy-momentum tensor(s) in classical gauge theories
Blaschke, Daniel N.; Gieres, François; Reboud, Méril; ...
2016-07-12
We give an introduction to, and review of, the energy-momentum tensors in classical gauge field theories in Minkowski space, and to some extent also in curved space-time. For the canonical energy-momentum tensor of non-Abelian gauge fields and of matter fields coupled to such fields, we present a new and simple improvement procedure based on gauge invariance for constructing a gauge invariant, symmetric energy-momentum tensor. In conclusion, the relationship with the Einstein-Hilbert tensor following from the coupling to a gravitational field is also discussed.
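For orientation in the Abelian (Maxwell) case, which the review's non-Abelian procedure generalizes (standard results, stated here with signature (+,-,-,-) and L = -(1/4) F_{μν}F^{μν}): the canonical Noether tensor is neither symmetric nor gauge invariant, while its improved form is both,

\[ T^{\mu\nu}_{\mathrm{can}} = -F^{\mu\lambda}\partial^{\nu}A_{\lambda} + \tfrac{1}{4}\,\eta^{\mu\nu}F_{\rho\sigma}F^{\rho\sigma}, \qquad T^{\mu\nu}_{\mathrm{imp}} = T^{\mu\nu}_{\mathrm{can}} + \partial_{\lambda}\!\left(F^{\mu\lambda}A^{\nu}\right) = -F^{\mu\lambda}F^{\nu}{}_{\lambda} + \tfrac{1}{4}\,\eta^{\mu\nu}F_{\rho\sigma}F^{\rho\sigma}, \]

where the second equality uses the free field equations \(\partial_\lambda F^{\mu\lambda} = 0\).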
Stochastic game theory: for playing games, not just for doing theory.
Goeree, J K; Holt, C A
1999-09-14
Recent theoretical advances have dramatically increased the relevance of game theory for predicting human behavior in interactive situations. By relaxing the classical assumptions of perfect rationality and perfect foresight, we obtain much improved explanations of initial decisions, dynamic patterns of learning and adjustment, and equilibrium steady-state distributions.
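For orientation, a minimal sketch of the kind of relaxed-rationality equilibrium the abstract alludes to, using a logit (quantal-response) choice rule; the game and the rationality parameter lam are illustrative, not taken from the paper:

```python
import numpy as np

# Logit (quantal-response) choice: a player picks action i with
# probability proportional to exp(lam * expected_payoff_i), instead
# of the exact best reply. As lam -> infinity this recovers the
# classical perfectly rational best response.
def logit_response(payoff_matrix, opponent_mix, lam=2.0):
    expected = payoff_matrix @ opponent_mix
    w = np.exp(lam * (expected - expected.max()))  # numerically stable
    return w / w.sum()

# Illustrative zero-sum matching-pennies game.
row_payoffs = np.array([[1.0, -1.0], [-1.0, 1.0]])
col_payoffs = -row_payoffs.T   # column player's payoffs

row_mix = np.array([0.9, 0.1])  # arbitrary starting beliefs
col_mix = np.array([0.5, 0.5])
for _ in range(500):            # damped smoothed-response dynamics
    row_mix = 0.9 * row_mix + 0.1 * logit_response(row_payoffs, col_mix)
    col_mix = 0.9 * col_mix + 0.1 * logit_response(col_payoffs, row_mix)

print(row_mix, col_mix)  # both approach (0.5, 0.5), the logit equilibrium
```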
On coupling NEC-violating matter to gravity
Chatterjee, Saugata; Parikh, Maulik; van der Schaar, Jan Pieter
2015-03-16
We show that effective theories of matter that classically violate the null energy condition cannot be minimally coupled to Einstein gravity without being inconsistent with both string theory and black hole thermodynamics. We argue however that they could still be either non-minimally coupled or coupled to higher-curvature theories of gravity.
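For reference (the standard definition, not specific to this paper): the null energy condition requires

\[ T_{\mu\nu}\, k^{\mu} k^{\nu} \ge 0 \quad \text{for all null vectors } k^{\mu}, \]

so classically NEC-violating matter is matter whose stress tensor fails this bound along some null direction.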