Chance, determinism and the classical theory of probability.
Vasudevan, Anubav
2018-02-01
This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning.
Topological and Orthomodular Modeling of Context in Behavioral Science
NASA Astrophysics Data System (ADS)
Narens, Louis
2017-02-01
Two non-Boolean methods are discussed for modeling context in behavioral data and theory. The first is based on intuitionistic logic, which is similar to classical logic except that not every event has a complement. Its probability theory is also similar to classical probability theory, except that the definition of a probability function needs to be generalized to unions of events instead of applying only to unions of disjoint events. The generalization is needed because intuitionistic event spaces may not contain enough disjoint events for the classical definition to be effective. The second method develops a version of quantum logic for its underlying probability theory. It differs in a variety of ways from the Hilbert space logic used in quantum mechanics as a foundation for quantum probability theory. John von Neumann and others have commented on the lack of a relative frequency approach and a rational foundation for this probability theory. This article argues that its version of quantum probability theory does not have such issues. The method based on intuitionistic logic is useful for modeling cognitive interpretations that vary with context, for example, the mood of the decision maker, the context produced by the influence of other items in a choice experiment, etc. The method based on this article's quantum logic is useful for modeling probabilities across contexts, for example, how probabilities of events from different experiments are related.
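To make the generalized additivity concrete (the abstract does not display the axiom, so the following standard form is stated as an assumption): the classical rule for disjoint events is replaced by a modular law imposed on all pairs of events, which agrees with the classical rule whenever disjoint events are available.

```latex
% Classical finite additivity (disjoint events only):
P(a \vee b) = P(a) + P(b), \qquad a \wedge b = \mathbf{0}.
% Modular generalization (all pairs of events):
P(a \vee b) + P(a \wedge b) = P(a) + P(b).
```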
NASA Astrophysics Data System (ADS)
Frič, Roman; Papčo, Martin
2017-12-01
Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables), and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and that the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerate (pure) state to a non-degenerate one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that operational probability theory and fuzzy probability theory coincide, and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.
Quantum-correlation breaking channels, quantum conditional probability and Perron-Frobenius theory
NASA Astrophysics Data System (ADS)
Chruściński, Dariusz
2013-03-01
Using the quantum analog of conditional probability and classical Bayes theorem we discuss some aspects of particular entanglement breaking channels: quantum-classical and classical-classical channels. Applying the quantum analog of Perron-Frobenius theorem we generalize the recent result of Korbicz et al. (2012) [8] on full and spectrum broadcasting from quantum-classical channels to arbitrary quantum channels.
Generalized Quantum Theory of Bianchi IX Cosmologies
NASA Astrophysics Data System (ADS)
Craig, David; Hartle, James
2003-04-01
We apply sum-over-histories generalized quantum theory to the closed homogeneous minisuperspace Bianchi IX cosmological model. We sketch how the probabilities in decoherent sets of alternative, coarse-grained histories of this model universe are calculated. We consider, in particular, the probabilities for classical evolution in a suitable coarse graining. For a restricted class of initial conditions and coarse grainings we exhibit the approximate decoherence of alternative histories in which the universe behaves classically and those in which it does not, illustrating the prediction that these universes will evolve in an approximately classical manner with a probability near unity.
Generalized quantum theory of recollapsing homogeneous cosmologies
NASA Astrophysics Data System (ADS)
Craig, David; Hartle, James B.
2004-06-01
A sum-over-histories generalized quantum theory is developed for homogeneous minisuperspace type A Bianchi cosmological models, focusing on the particular example of the classically recollapsing Bianchi type-IX universe. The decoherence functional for such universes is exhibited. We show how the probabilities of decoherent sets of alternative, coarse-grained histories of these model universes can be calculated. We consider in particular the probabilities for classical evolution defined by a suitable coarse graining. For a restricted class of initial conditions and coarse grainings we exhibit the approximate decoherence of alternative histories in which the universe behaves classically and those in which it does not. For these situations we show that the probability is near unity for the universe to recontract classically if it expands classically. We also determine the relative probabilities of quasiclassical trajectories for initial states of WKB form, recovering for such states a precise form of the familiar heuristic “J·dΣ” rule of quantum cosmology, as well as a generalization of this rule to generic initial states.
Classical Physics and the Bounds of Quantum Correlations.
Frustaglia, Diego; Baltanás, José P; Velázquez-Ahumada, María C; Fernández-Prieto, Armando; Lujambio, Aintzane; Losada, Vicente; Freire, Manuel J; Cabello, Adán
2016-06-24
A unifying principle explaining the numerical bounds of quantum correlations remains elusive, despite the efforts devoted to identifying it. Here, we show that these bounds are indeed not exclusive to quantum theory: for any abstract correlation scenario with compatible measurements, models based on classical waves produce probability distributions indistinguishable from those of quantum theory and, therefore, share the same bounds. We demonstrate this finding by implementing classical microwaves that propagate along meter-size transmission-line circuits and reproduce the probabilities of three emblematic quantum experiments. Our results show that the "quantum" bounds would also occur in a classical universe without quanta. The implications of this observation are discussed.
Probability Theory, Not the Very Guide of Life
ERIC Educational Resources Information Center
Juslin, Peter; Nilsson, Hakan; Winman, Anders
2009-01-01
Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive…
[Biometric bases: basic concepts of probability calculation].
Dinya, E
1998-04-26
The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.
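As a minimal worked example of the classical probability model mentioned above (an illustration, not taken from the article): for a finite sample space Ω of equally likely outcomes,

```latex
P(A) = \frac{|A|}{|\Omega|}, \qquad \text{e.g.}\quad
P(\text{even roll of a fair die}) = \frac{|\{2,4,6\}|}{6} = \frac{1}{2}.
```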
Teaching Classic Probability Problems With Modern Digital Tools
ERIC Educational Resources Information Center
Abramovich, Sergei; Nikitin, Yakov Yu.
2017-01-01
This article is written to share teaching ideas about using commonly available computer applications--a spreadsheet, "The Geometer's Sketchpad", and "Wolfram Alpha"--to explore three classic and historically significant problems from probability theory. These ideas stem from the authors' work with prospective economists,…
Non-Kolmogorovian Approach to the Context-Dependent Systems Breaking the Classical Probability Law
NASA Astrophysics Data System (ADS)
Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Yamato, Ichiro
2013-07-01
There exist several phenomena breaking the classical probability laws. The systems related to such phenomena are context-dependent, so that they are adaptive to other systems. In this paper, we present a new mathematical formalism to compute the joint probability distribution for two event-systems by using concepts of adaptive dynamics and quantum information theory, e.g., quantum channels and liftings. In physics the basic example of such context-dependent phenomena is the famous double-slit experiment. Recently, similar examples have been found in the biological and psychological sciences. Our approach is an extension of traditional quantum probability theory, and it is general enough to describe the aforementioned contextual phenomena outside of quantum physics.
Quantum stochastic walks on networks for decision-making.
Martínez-Martínez, Ismael; Sánchez-Burillo, Eduardo
2016-03-31
Recent experiments report violations of the classical law of total probability and incompatibility of certain mental representations when humans process and react to information. Evidence shows promise of a more general quantum theory providing a better explanation of the dynamics and structure of real decision-making processes than classical probability theory. Inspired by this, we show how behavioral choice-probabilities can arise as the unique stationary distribution of quantum stochastic walkers on the classical network defined from Luce's response probabilities. This work is relevant because (i) we provide a very general framework integrating the positive characteristics of both quantum and classical approaches previously in confrontation, and (ii) we define a cognitive network which can be used to bring other connectivist approaches to decision-making into the quantum stochastic realm. We model the decision-maker as an open system in contact with her surrounding environment, and the duration of the decision-making process turns out to also be a measure of the degree of interplay between the unitary and irreversible dynamics. Implementing quantum coherence on classical networks may open a door to better integration of human-like reasoning biases in stochastic models for decision-making.
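For context, quantum stochastic walks of this kind are typically defined by a master equation interpolating between unitary and dissipative dynamics. The form below is the standard interpolation from the quantum-stochastic-walk literature, shown for orientation; the specific Hamiltonian H and jump operators L_k built from Luce's response probabilities are given in the paper, not here (ħ = 1):

```latex
\frac{d\rho}{dt} = -(1-\omega)\, i\, [H, \rho]
  + \omega \sum_k \left( L_k \rho L_k^{\dagger}
  - \tfrac{1}{2}\left\{ L_k^{\dagger} L_k,\, \rho \right\} \right),
  \qquad \omega \in [0, 1],
```

so that ω = 0 gives a purely coherent quantum walk and ω = 1 a purely classical stochastic walk.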
Random walk in generalized quantum theory
NASA Astrophysics Data System (ADS)
Martin, Xavier; O'Connor, Denjoe; Sorkin, Rafael D.
2005-01-01
One can view quantum mechanics as a generalization of classical probability theory that provides for pairwise interference among alternatives. Adopting this perspective, we “quantize” the classical random walk by finding, subject to a certain condition of “strong positivity”, the most general Markovian, translationally invariant “decoherence functional” with nearest neighbor transitions.
Economic decision-making compared with an equivalent motor task.
Wu, Shih-Wei; Delgado, Mauricio R; Maloney, Laurence T
2009-04-14
There is considerable evidence that human economic decision-making deviates from the predictions of expected utility theory (EUT) and that human performance conforms to EUT in many perceptual and motor decision tasks. It is possible that these results reflect a real difference in decision-making in the 2 domains, but it is also possible that the observed discrepancy simply reflects typical differences in experimental design. We developed a motor task that is mathematically equivalent to choosing between lotteries and used it to compare how the same subject chose between classical economic lotteries and the same lotteries presented in equivalent motor form. In experiment 1, we found that subjects are more risk-seeking in deciding between motor lotteries. In experiment 2, we used cumulative prospect theory to model choice and separately estimated the probability weighting functions and the value functions for each subject carrying out each task. We found no patterned differences in how subjects represented outcome value in the motor and the classical tasks. However, the probability weighting functions for motor and classical tasks were markedly and significantly different. Those for the classical task showed a typical tendency to overweight small probabilities and underweight large probabilities, and those for the motor task showed the opposite pattern of probability distortion. This outcome also accounts for the increased risk-seeking observed in the motor tasks of experiment 1. We conclude that the same subject distorts probability, but not value, differently when making identical decisions in motor and classical form.
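A common one-parameter family for such probability weighting functions is the Tversky-Kahneman form; the abstract does not name the parametric family the authors fit, so this is shown only to fix ideas:

```latex
w(p) = \frac{p^{\gamma}}{\left( p^{\gamma} + (1 - p)^{\gamma} \right)^{1/\gamma}},
```

where γ < 1 produces the inverse-S pattern reported for the classical task (small probabilities overweighted, large ones underweighted) and γ > 1 produces the opposite, S-shaped distortion consistent with the motor task.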
Origin of probabilities and their application to the multiverse
NASA Astrophysics Data System (ADS)
Albrecht, Andreas; Phillips, Daniel
2014-12-01
We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We comment on the general implications of this view, and specifically question the application of purely classical probabilities to cosmology in cases where key questions are known to have no quantum answer. We argue that the ideas developed here may offer a way out of the notorious measure problems of eternal inflation.
Generalized probability theories: what determines the structure of quantum theory?
NASA Astrophysics Data System (ADS)
Janotta, Peter; Hinrichsen, Haye
2014-08-01
The framework of generalized probabilistic theories is a powerful tool for studying the foundations of quantum physics. It provides the basis for a variety of recent findings that significantly improve our understanding of the rich physical structure of quantum theory. This review paper tries to present the framework and recent results to a broader readership in an accessible manner. To achieve this, we follow a constructive approach. Starting from a few basic physically motivated assumptions we show how a given set of observations can be manifested in an operational theory. Furthermore, we characterize consistency conditions limiting the range of possible extensions. In this framework classical and quantum theory appear as special cases, and the aim is to understand what distinguishes quantum mechanics as the fundamental theory realized in nature. It turns out that non-classical features of single systems can equivalently result from higher-dimensional classical theories that have been restricted. Entanglement and non-locality, however, are shown to be genuine non-classical features.
Theories of the Alcoholic Personality.
ERIC Educational Resources Information Center
Cox, W. Miles
Several theories of the alcoholic personality have been devised to determine the relationship between the clusters of personality characteristics of alcoholics and their abuse of alcohol. The oldest and probably best known theory is the dependency theory, formulated in the tradition of classical psychoanalysis, which associates the alcoholic's…
Probability theory, not the very guide of life.
Juslin, Peter; Nilsson, Håkan; Winman, Anders
2009-10-01
Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive integration, in part, at least, because of well-known capacity constraints on controlled thought. In this article, the authors show with computer simulations that when based on approximate knowledge of probabilities, as is routinely the case in natural environments, linear additive integration can yield as accurate estimates, and as good average decision returns, as estimates based on probability theory. It is proposed that in natural environments people have little opportunity or incentive to induce the normative rules of probability theory and, given their cognitive constraints, linear additive integration may often offer superior bounded rationality.
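The gist of the argument can be reproduced in a toy simulation (a sketch in the spirit of the abstract, not the authors' setup; the noise level, equal weights, and independence assumption are all illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
p_a, p_b = rng.uniform(0, 1, n), rng.uniform(0, 1, n)
target = p_a * p_b  # true conjunction probability, assuming independence

noise = 0.2  # crude stand-in for "approximate knowledge of probabilities"
a_hat = np.clip(p_a + rng.normal(0, noise, n), 0, 1)
b_hat = np.clip(p_b + rng.normal(0, noise, n), 0, 1)

mult = a_hat * b_hat               # normative multiplicative integration
add = 0.5 * a_hat + 0.5 * b_hat    # linear additive integration

for name, est in (("multiplicative", mult), ("linear additive", add)):
    rmse = float(np.sqrt(np.mean((est - target) ** 2)))
    print(f"{name:15s} RMSE: {rmse:.4f}")
```

How the two rules compare depends on the input noise, which is the point at issue: with sufficiently imprecise inputs, the additive heuristic loses little or nothing relative to the normative rule.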
Banik, Suman Kumar; Bag, Bidhan Chandra; Ray, Deb Shankar
2002-05-01
Traditionally, quantum Brownian motion is described by Fokker-Planck or diffusion equations in terms of quasiprobability distribution functions, e.g., Wigner functions. These often become singular or negative in the full quantum regime. In this paper a simple approach to the non-Markovian theory of quantum Brownian motion using true probability distribution functions is presented. Based on an initial coherent state representation of the bath oscillators and an equilibrium canonical distribution of the quantum mechanical mean values of their coordinates and momenta, we derive a generalized quantum Langevin equation in c-numbers and show that the latter is amenable to a theoretical analysis in terms of the classical theory of non-Markovian dynamics. The corresponding Fokker-Planck, diffusion, and Smoluchowski equations are the exact quantum analogs of their classical counterparts. The present work is independent of path integral techniques. The theory as developed here is a natural extension of its classical version and is valid for arbitrary temperature and friction (the Smoluchowski equation being considered in the overdamped limit).
NASA Astrophysics Data System (ADS)
Baumeler, Ämin; Feix, Adrien; Wolf, Stefan
2014-10-01
Quantum theory in a global spacetime gives rise to nonlocal correlations, which cannot be explained causally in a satisfactory way; this motivates the study of theories with reduced global assumptions. Oreshkov, Costa, and Brukner [Nat. Commun. 3, 1092 (2012), 10.1038/ncomms2076] proposed a framework in which quantum theory is valid locally but where, at the same time, no global spacetime, i.e., predefined causal order, is assumed beyond the absence of logical paradoxes. It was shown for the two-party case, however, that a global causal order always emerges in the classical limit. Quite naturally, it has been conjectured that the same also holds in the multiparty setting. We show that, counter to this belief, classical correlations locally compatible with classical probability theory exist that allow for deterministic signaling between three or more parties incompatible with any predefined causal order.
Franceschetti, Donald R; Gire, Elizabeth
2013-06-01
Quantum probability theory offers a viable alternative to classical probability, although there are some ambiguities inherent in transferring the quantum formalism to a less determined realm. A number of physicists are now looking at the applicability of quantum ideas to the assessment of physics learning, an area particularly suited to quantum probability ideas.
Single, Complete, Probability Spaces Consistent With EPR-Bohm-Bell Experimental Data
NASA Astrophysics Data System (ADS)
Avis, David; Fischer, Paul; Hilbert, Astrid; Khrennikov, Andrei
2009-03-01
We show that paradoxical consequences of violations of Bell's inequality are induced by the use of an unsuitable probabilistic description for the EPR-Bohm-Bell experiment. The conventional description (due to Bell) is based on a combination of statistical data collected for different settings of polarization beam splitters (PBSs). In fact, such data consist of conditional probabilities which only partially define a probability space. Ignoring this conditioning leads to apparent contradictions in the classical probabilistic model (due to Kolmogorov). We show how to make a completely consistent probabilistic model by taking into account the probabilities of selecting the settings of the PBSs. Our model both matches the experimental data and is consistent with classical probability theory.
Quantum Bayesian perspective for intelligence reservoir characterization, monitoring and management
NASA Astrophysics Data System (ADS)
Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia; de Jesús Correa, María
2017-10-01
The paper starts with a brief review of the literature about uncertainty in geological, geophysical and petrophysical data. In particular, we present the viewpoints of experts in geophysics on the application of Bayesian inference and subjective probability. Then we present arguments that the use of classical probability theory (CP) does not match completely the structure of geophysical data. We emphasize that such data are characterized by contextuality and non-Kolmogorovness (the impossibility to use the CP model), incompleteness as well as incompatibility of some geophysical measurements. These characteristics of geophysical data are similar to the characteristics of quantum physical data. Notwithstanding all this, contextuality can be seen as a major deviation of quantum theory from classical physics. In particular, the contextual probability viewpoint is the essence of the Växjö interpretation of quantum mechanics. We propose to use quantum probability (QP) for decision-making during the characterization, modelling, exploring and management of the intelligent hydrocarbon reservoir. Quantum Bayesianism (QBism), one of the recently developed information interpretations of quantum theory, can be used as the interpretational basis for such QP decision-making in geology, geophysics and petroleum projects design and management. This article is part of the themed issue 'Second quantum revolution: foundational questions'.
Towards a Theory of Semantic Communication (Extended Technical Report)
2011-03-01
counting models of a sentence, when interpretations have different probabilities, what matters is the total probability of models of the sentence, not...of classic logics still hold in the LP semantics, e.g., De Morgan's laws. However, modus ponens does not hold in the LP semantics.
Generalized Success-Breeds-Success Principle Leading to Time-Dependent Informetric Distributions.
ERIC Educational Resources Information Center
Egghe, Leo; Rousseau, Ronald
1995-01-01
Reformulates the success-breeds-success (SBS) principle in informetrics in order to generate a general theory of source-item relationships. Topics include a time-dependent probability, a new model for the expected probability that is compared with the SBS principle with exact combinatorial calculations, classical frequency distributions, and…
What Can Quantum Optics Say about Computational Complexity Theory?
NASA Astrophysics Data System (ADS)
Rahimi-Keshari, Saleh; Lund, Austin P.; Ralph, Timothy C.
2015-02-01
Considering the problem of sampling from the output photon-counting probability distribution of a linear-optical network for input Gaussian states, we obtain results that are of interest from the point of view of both quantum theory and computational complexity theory. We derive a general formula for calculating the output probabilities, and by considering input thermal states, we show that the output probabilities are proportional to permanents of positive-semidefinite Hermitian matrices. It is believed that approximating permanents of complex matrices in general is a #P-hard problem. However, we show that these permanents can be approximated with an algorithm in the BPP^NP complexity class, as there exists an efficient classical algorithm for sampling from the output probability distribution. We further consider input squeezed-vacuum states and discuss the complexity of sampling from the probability distribution at the output.
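For small matrices the permanents in question can be computed exactly; the sketch below uses Ryser's inclusion-exclusion formula (a generic exact O(2^n) algorithm shown for orientation, not the BPP^NP approximation scheme the abstract refers to):

```python
from itertools import combinations

import numpy as np

def permanent(a):
    """Exact permanent of a square matrix via Ryser's formula."""
    n = a.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            # product over rows of the row sums restricted to columns in `cols`
            total += (-1) ** r * np.prod(a[:, list(cols)].sum(axis=1))
    return (-1) ** n * total

# Sanity check: per([[1, 2], [3, 4]]) = 1*4 + 2*3 = 10
print(permanent(np.array([[1.0, 2.0], [3.0, 4.0]])))
```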
A quantum probability framework for human probabilistic inference.
Trueblood, Jennifer S; Yearsley, James M; Pothos, Emmanuel M
2017-09-01
There is considerable variety in human inference (e.g., a doctor inferring the presence of a disease, a juror inferring the guilt of a defendant, or someone inferring future weight loss based on diet and exercise). As such, people display a wide range of behaviors when making inference judgments. Sometimes, people's judgments appear Bayesian (i.e., normative), but in other cases, judgments deviate from the normative prescription of classical probability theory. How can we combine both Bayesian and non-Bayesian influences in a principled way? We propose a unified explanation of human inference using quantum probability theory. In our approach, we postulate a hierarchy of mental representations, from 'fully' quantum to 'fully' classical, which could be adopted in different situations. In our hierarchy of models, moving from the lowest level to the highest involves changing assumptions about compatibility (i.e., how joint events are represented). Using results from 3 experiments, we show that our modeling approach explains 5 key phenomena in human inference including order effects, reciprocity (i.e., the inverse fallacy), memorylessness, violations of the Markov condition, and antidiscounting. As far as we are aware, no existing theory or model can explain all 5 phenomena. We also explore transitions in our hierarchy, examining how representations change from more quantum to more classical. We show that classical representations provide a better account of data as individuals gain familiarity with a task. We also show that representations vary between individuals, in a way that relates to a simple measure of cognitive style, the Cognitive Reflection Test.
A quantum probability explanation for violations of ‘rational’ decision theory
Pothos, Emmanuel M.; Busemeyer, Jerome R.
2009-01-01
Two experimental tasks in psychology, the two-stage gambling game and the Prisoner's Dilemma game, show that people violate the sure thing principle of decision theory. These paradoxical findings have resisted explanation by classical decision theory for over a decade. A quantum probability model, based on a Hilbert space representation and Schrödinger's equation, provides a simple and elegant explanation for this behaviour. The quantum model is compared with an equivalent Markov model and it is shown that the latter is unable to account for violations of the sure thing principle. Accordingly, it is argued that quantum probability provides a better framework for modelling human decision-making.
Ensembles and Experiments in Classical and Quantum Physics
NASA Astrophysics Data System (ADS)
Neumaier, Arnold
A philosophically consistent axiomatic approach to classical and quantum mechanics is given. The approach realizes a strong formal implementation of Bohr's correspondence principle. In all instances, classical and quantum concepts are fully parallel: the same general theory has a classical realization and a quantum realization. Extending the "probability via expectation" approach of Whittle to noncommuting quantities, this paper defines quantities, ensembles, and experiments as mathematical concepts and shows how to model complementarity, uncertainty, probability, nonlocality and dynamics in these terms. The approach carries no connotation of unlimited repeatability; hence it can be applied to unique systems such as the universe. Consistent experiments provide an elegant solution to the reality problem, confirming the insistence of the orthodox Copenhagen interpretation that there is nothing but ensembles, while avoiding its elusive reality picture. The weak law of large numbers explains the emergence of classical properties for macroscopic systems.
NASA Astrophysics Data System (ADS)
Lombardi, Olimpia; Fortin, Sebastian; Holik, Federico; López, Cristian
2017-04-01
Preface; Introduction; Part I. About the Concept of Information: 1. About the concept of information Sebastian Fortin and Olimpia Lombardi; 2. Representation, information, and theories of information Armond Duwell; 3. Information, communication, and manipulability Olimpia Lombardi and Cristian López; Part II. Information and Quantum Mechanics: 4. Quantum versus classical information Jeffrey Bub; 5. Quantum information and locality Dennis Dieks; 6. Pragmatic information in quantum mechanics Juan Roederer; 7. Interpretations of quantum theory: a map of madness Adán Cabello; Part III. Probability, Correlations, and Information: 8. On the tension between ontology and epistemology in quantum probabilities Amit Hagar; 9. Inferential versus dynamical conceptions of physics David Wallace; 10. Classical models for quantum information Federico Holik and Gustavo Martin Bosyk; 11. On the relative character of quantum correlations Guido Bellomo and Ángel Ricardo Plastino; Index.
Quantum probability and quantum decision-making.
Yukalov, V I; Sornette, D
2016-01-13
A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary.
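For orientation, in the familiar special case of an elementary, operationally testable event represented by a projector, any such definition must reduce to the standard Born rule; the paper's contribution lies in extending this consistently to composite and inconclusive events, which the single formula below does not cover:

```latex
p(A) = \operatorname{Tr}\!\left( \hat{\rho}\, \hat{P}_A \right),
```

where ρ̂ is the state of the system and P̂_A the projector representing event A.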
Quantum probability, choice in large worlds, and the statistical structure of reality.
Ross, Don; Ladyman, James
2013-06-01
Classical probability models of incentive response are inadequate in "large worlds," where the dimensions of relative risk and the dimensions of similarity in outcome comparisons typically differ. Quantum probability models for choice in large worlds may be motivated pragmatically - there is no third theory - or metaphysically: statistical processing in the brain adapts to the true scale-relative structure of the universe.
Cognitive Diagnostic Attribute-Level Discrimination Indices
ERIC Educational Resources Information Center
Henson, Robert; Roussos, Louis; Douglas, Jeff; He, Xuming
2008-01-01
Cognitive diagnostic models (CDMs) model the probability of correctly answering an item as a function of an examinee's attribute mastery pattern. Because estimation of the mastery pattern involves more than a continuous measure of ability, reliability concepts introduced by classical test theory and item response theory do not apply. The cognitive…
Quantum mechanics: The Bayesian theory generalized to the space of Hermitian matrices
NASA Astrophysics Data System (ADS)
Benavoli, Alessio; Facchini, Alessandro; Zaffalon, Marco
2016-10-01
We consider the problem of gambling on a quantum experiment and enforce rational behavior by a few rules. These rules yield, in the classical case, the Bayesian theory of probability via duality theorems. In our quantum setting, they yield the Bayesian theory generalized to the space of Hermitian matrices. This very theory is quantum mechanics: in fact, we derive all its four postulates from the generalized Bayesian theory. This implies that quantum mechanics is self-consistent. It also leads us to reinterpret the main operations in quantum mechanics as probability rules: Bayes' rule (measurement), marginalization (partial tracing), independence (tensor product). To say it with a slogan, we obtain that quantum mechanics is the Bayesian theory in the complex numbers.
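The correspondences listed at the end of the abstract can be written out explicitly. These are the standard quantum-mechanical forms of the three operations (the identification below is an editorial illustration, not a quotation from the paper):

```latex
\text{measurement (Bayes' rule):} \quad \rho \;\mapsto\; \frac{P \rho P}{\operatorname{Tr}(\rho P)}
\qquad
\text{marginalization:} \quad \rho_A = \operatorname{Tr}_B\!\left( \rho_{AB} \right)
\qquad
\text{independence:} \quad \rho_{AB} = \rho_A \otimes \rho_B.
```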
DOE Office of Scientific and Technical Information (OSTI.GOV)
Banik, Manik, E-mail: manik11ju@gmail.com
Steering is one of the most counterintuitive non-classical features of bipartite quantum systems, first noticed by Schrödinger in the early days of quantum theory. On the other hand, measurement incompatibility is another non-classical feature of quantum theory, initially pointed out by Bohr. Recently, Quintino et al. [Phys. Rev. Lett. 113, 160402 (2014)] and Uola et al. [Phys. Rev. Lett. 113, 160403 (2014)] have investigated the relation between these two distinct non-classical features. They have shown that a set of measurements is not jointly measurable (i.e., incompatible) if and only if it can be used for demonstrating Schrödinger-Einstein-Podolsky-Rosen steering. The concept of steering has been generalized to more general abstract tensor product theories rather than just Hilbert space quantum mechanics. In this article, we discuss how the notion of measurement incompatibility can be extended to general probability theories. Further, we show that the connection between steering and measurement incompatibility holds in a broader class of tensor product theories rather than just quantum theory.
What is Quantum Mechanics? A Minimal Formulation
NASA Astrophysics Data System (ADS)
Friedberg, R.; Hohenberg, P. C.
2018-03-01
This paper presents a minimal formulation of nonrelativistic quantum mechanics, by which is meant a formulation which describes the theory in a succinct, self-contained, clear, unambiguous and of course correct manner. The bulk of the presentation is the so-called "microscopic theory", applicable to any closed system S of arbitrary size N, using concepts referring to S alone, without resort to external apparatus or external agents. An example of a similar minimal microscopic theory is the standard formulation of classical mechanics, which serves as the template for a minimal quantum theory. The only substantive assumption required is the replacement of the classical Euclidean phase space by Hilbert space in the quantum case, with the attendant all-important phenomenon of quantum incompatibility. Two fundamental theorems of Hilbert space, the Kochen-Specker-Bell theorem and Gleason's theorem, then lead inevitably to the well-known Born probability rule. For both classical and quantum mechanics, questions of physical implementation and experimental verification of the predictions of the theories are the domain of the macroscopic theory, which is argued to be a special case or application of the more general microscopic theory.
The force distribution probability function for simple fluids by density functional theory.
Rickayzen, G; Heyes, D M
2013-02-28
Classical density functional theory (DFT) is used to derive a formula for the probability density distribution function, P(F), and the probability distribution function, W(F), for simple fluids, where F is the net force on a particle. The final formula is P(F) ∝ exp(-AF^2), where A depends on the fluid density, the temperature, and the Fourier transform of the pair potential. The form of the DFT theory used is only applicable to bounded-potential fluids. When combined with the hypernetted chain closure of the Ornstein-Zernike equation, the DFT theory for W(F) agrees with molecular dynamics computer simulations for the Gaussian and bounded soft sphere at high density. The Gaussian form for P(F) is still accurate at lower densities (but not too low density) for the two potentials, but with a smaller value for the constant, A, than that predicted by the DFT theory.
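One consistent way to relate the two functions, assuming F denotes the magnitude of the three-dimensional net force and A is the constant in the abstract (this normalization is an editorial reconstruction, not quoted from the paper):

```latex
P(\mathbf{F}) = \left( \frac{A}{\pi} \right)^{3/2} e^{-A F^{2}},
\qquad
W(F) = 4\pi F^{2}\, P(\mathbf{F}),
```

so that P is the normalized density over force vectors and W the corresponding distribution of the force magnitude, with \(\int_0^\infty W(F)\, dF = 1\).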
Budiyono, Agung; Rohrlich, Daniel
2017-11-03
Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.
Modeling Conditional Probabilities in Complex Educational Assessments. CSE Technical Report.
ERIC Educational Resources Information Center
Mislevy, Robert J.; Almond, Russell; Dibello, Lou; Jenkins, Frank; Steinberg, Linda; Yan, Duanli; Senturk, Deniz
An active area in psychometric research is coordinated task design and statistical analysis built around cognitive models. Compared with classical test theory and item response theory, there is often less information from observed data about the measurement-model parameters. On the other hand, there is more information from the grounding…
Some Remarks on Knowledge and Probability Arising from Counterfactual Quantum Effects
NASA Astrophysics Data System (ADS)
Lupacchini, Rossella
Can the mere possibility of a physical phenomenon affect the outcome of an experiment? In fact, quantum theory presents us with actual physical effects arising from "counterfactuals", that is, physical effects brought about by things that might have happened, although they did not happen. How can this be? After a short outline of the quantum-mechanical description of physical reality, the occurrence of such counterfactual effects in quantum theory is illustrated by means of a Mach-Zehnder interferometer. Then these paradoxical phenomena, which undermine the very notion of a physical event and raise the question of what knowledge of physical reality can ever be obtained, will be analysed using a classical possible-worlds model of knowledge and probability. Finally, a surprising application of counterfactual quantum effects, producing a new kind of computing with no classical analogue, will be shown.
Urns and Chameleons: two metaphors for two different types of measurements
NASA Astrophysics Data System (ADS)
Accardi, Luigi
2013-09-01
The awareness of the physical possibility of models of space alternative to the Euclidean one began to emerge towards the end of the 19th century. At the end of the 20th century a similar awareness emerged concerning the physical possibility of models of the laws of chance alternative to the classical probabilistic model (the Kolmogorov model). In geometry, the mathematical construction of several non-Euclidean models of space preceded their application in physics, which came with the theory of relativity, by about a century. In physics the opposite situation took place. In fact, while the first examples of non-Kolmogorov probabilistic models emerged in quantum physics approximately one century ago, at the beginning of the 1900s, the awareness that this new mathematical formalism reflected a new mathematical model of the laws of chance had to wait until the early 1980s. In this long interval the classical and the new probabilistic models were both used in the description and interpretation of quantum phenomena, and they negatively interfered with each other because of the absence (for many decades) of a mathematical theory that clearly delimited their respective domains of application. The result of this interference was the emergence of the so-called "paradoxes of quantum theory". For several decades there have been many different attempts to solve these paradoxes, giving rise to what K. Popper baptized "the great quantum muddle": a debate which has been at the core of the philosophy of science for more than 50 years. These attempts, however, have led to contradictions between the two fundamental theories of contemporary physics: quantum theory and the theory of relativity. Quantum probability identifies the reason for the emergence of non-Kolmogorov models, and therefore of the so-called paradoxes of quantum theory, in the difference between passive measurements that read pre-existent properties (the urn metaphor) and measurements that read a response to an interaction (the chameleon metaphor). The non-trivial point is that one can prove that, while the urn scheme cannot lead to empirical data outside of classical probability, response-based measurements can give rise to non-classical statistics. The talk will include entirely classical examples of non-classical statistics and potential applications to economic, sociological or biomedical phenomena.
Experimental non-classicality of an indivisible quantum system.
Lapkiewicz, Radek; Li, Peizhe; Schaeff, Christoph; Langford, Nathan K; Ramelow, Sven; Wieśniak, Marcin; Zeilinger, Anton
2011-06-22
In contrast to classical physics, quantum theory demands that not all properties can be simultaneously well defined; the Heisenberg uncertainty principle is a manifestation of this fact. Alternatives have been explored--notably theories relying on joint probability distributions or non-contextual hidden-variable models, in which the properties of a system are defined independently of their own measurement and any other measurements that are made. Various deep theoretical results imply that such theories are in conflict with quantum mechanics. Simpler cases demonstrating this conflict have been found and tested experimentally with pairs of quantum bits (qubits). Recently, an inequality satisfied by non-contextual hidden-variable models and violated by quantum mechanics for all states of two qubits was introduced and tested experimentally. A single three-state system (a qutrit) is the simplest system in which such a contradiction is possible; moreover, the contradiction cannot result from entanglement between subsystems, because such a three-state system is indivisible. Here we report an experiment with single photonic qutrits which provides evidence that no joint probability distribution describing the outcomes of all possible measurements--and, therefore, no non-contextual theory--can exist. Specifically, we observe a violation of the Bell-type inequality found by Klyachko, Can, Binicioğlu and Shumovsky. Our results illustrate a deep incompatibility between quantum mechanics and classical physics that cannot in any way result from entanglement.
The rational status of quantum cognition.
Pothos, Emmanuel M; Busemeyer, Jerome R; Shiffrin, Richard M; Yearsley, James M
2017-07-01
Classical probability theory (CPT) is generally considered the rational way to make inferences, but there have been empirical findings showing a divergence between reasoning and the principles of CPT, inviting the conclusion that humans are irrational. Perhaps the most famous of these findings is the conjunction fallacy (CF). Recently, the CF has been shown consistent with the principles of an alternative probabilistic framework, quantum probability theory (QPT). Does this imply that QPT is irrational, or does QPT provide an alternative interpretation of rationality? Our presentation consists of 3 parts. First, we examine the putative rational status of QPT using the same argument as used to establish the rationality of CPT, the Dutch Book (DB) argument, according to which reasoners should not commit to bets guaranteeing a loss. We prove the rational status of QPT by formulating it as a particular case of an extended form of CPT, with separate probability spaces produced by changing context. Second, we empirically examine the key requirement for whether a CF can be rational or not; the results show that participants indeed behave rationally, at least relative to the representations they employ. Finally, we consider whether the conditions for the CF to be rational are applicable in the outside (nonmental) world. Our discussion provides a general and alternative perspective for rational probabilistic inference, based on the idea that contextuality requires either reasoning in separate CPT probability spaces or reasoning with QPT principles.
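The usual QPT treatment of the conjunction fallacy, sketched here for concreteness (this construction is standard in the quantum-cognition literature, though the abstract does not display it): a conjunction is evaluated as a sequence of projections, and for incompatible (non-commuting) projectors the sequenced probability can exceed that of the single event,

```latex
\Pr(A \text{ and then } B)
  = \left\lVert \hat{P}_B \hat{P}_A \left| \psi \right\rangle \right\rVert^{2}
  \;>\;
  \left\lVert \hat{P}_B \left| \psi \right\rangle \right\rVert^{2}
  \quad \text{(possible when } [\hat{P}_A, \hat{P}_B] \neq 0\text{)},
```

an ordering that is impossible in a single classical probability space but involves no Dutch-book inconsistency across the separate context-indexed spaces described above.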
PDF-based heterogeneous multiscale filtration model.
Gong, Jian; Rutland, Christopher J
2015-04-21
Motivated by modeling of gasoline particulate filters (GPFs), a probability density function (PDF) based heterogeneous multiscale filtration (HMF) model is developed to calculate filtration efficiency of clean particulate filters. A new methodology based on statistical theory and classic filtration theory is developed in the HMF model. Based on the analysis of experimental porosimetry data, a pore size probability density function is introduced to represent heterogeneity and multiscale characteristics of the porous wall. The filtration efficiency of a filter can be calculated as the sum of the contributions of individual collectors. The resulting HMF model overcomes the limitations of classic mean filtration models which rely on tuning of the mean collector size. Sensitivity analysis shows that the HMF model recovers the classical mean model when the pore size variance is very small. The HMF model is validated by fundamental filtration experimental data from different scales of filter samples. The model shows a good agreement with experimental data at various operating conditions. The effects of the microstructure of filters on filtration efficiency as well as the most penetrating particle size are correctly predicted by the model.
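Schematically, the PDF-weighted sum of collector contributions described above can be written as follows (the notation is editorial; the single-collector efficiency model in the paper comes from classic filtration theory):

```latex
E = \int_0^{\infty} \eta(d_c)\, f(d_c)\, \mathrm{d}d_c,
```

where f(d_c) is the pore-size probability density inferred from porosimetry and η(d_c) is the collection efficiency of a single collector of size d_c. A classic mean model corresponds to collapsing f to a spike at one tuned mean collector size.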
Physics of automated driving in framework of three-phase traffic theory.
Kerner, Boris S
2018-04-01
We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.
Cellular Automata Generalized To An Inferential System
NASA Astrophysics Data System (ADS)
Blower, David J.
2007-11-01
Stephen Wolfram popularized elementary one-dimensional cellular automata in his book, A New Kind of Science. Among many remarkable things, he proved that one of these cellular automata was a Universal Turing Machine. Such cellular automata can be interpreted in a different way by viewing them within the context of the formal manipulation rules from probability theory. Bayes's Theorem is the most famous of such formal rules. As a prelude, we recapitulate Jaynes's presentation of how probability theory generalizes classical logic using modus ponens as the canonical example. We emphasize the important conceptual standing of Boolean Algebra for the formal rules of probability manipulation and give an alternative demonstration augmenting and complementing Jaynes's derivation. We show the complementary roles played in arguments of this kind by Bayes's Theorem and joint probability tables. A good explanation for all of this is afforded by the expansion of any particular logic function via the disjunctive normal form (DNF). The DNF expansion is a useful heuristic emphasized in this exposition because such expansions point out where relevant 0s should be placed in the joint probability tables for logic functions involving any number of variables. It then becomes a straightforward exercise to rely on Boolean Algebra, Bayes's Theorem, and joint probability tables in extrapolating to Wolfram's cellular automata. Cellular automata are seen as purely deductive systems, just like classical logic, which probability theory is then able to generalize. Thus, any uncertainties which we might like to introduce into the discussion about cellular automata are handled with ease via the familiar inferential path. Most importantly, the difficult problem of predicting what cellular automata will do in the far future is treated like any inferential prediction problem.
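A minimal sketch of that viewpoint in code (the rule number, names, and periodic boundary are illustrative choices, not taken from the paper): a deterministic elementary cellular automaton is a conditional probability table whose entries happen to be 0 or 1, and softening those entries into the open interval (0, 1) is precisely the generalization from deduction to probabilistic inference described above.

```python
from itertools import product

RULE = 110  # the elementary CA Wolfram discusses; any rule number works here

# P(next cell = 1 | left, center, right): for a deterministic CA these
# conditional probabilities are all 0 or 1.
cpt = {nbhd: float((RULE >> int("".join(map(str, nbhd)), 2)) & 1)
       for nbhd in product((0, 1), repeat=3)}

def step(cells):
    """One synchronous update with periodic boundary conditions."""
    n = len(cells)
    return [int(cpt[(cells[i - 1], cells[i], cells[(i + 1) % n])])
            for i in range(n)]

row = [0] * 20 + [1] + [0] * 20
for _ in range(8):
    print("".join(".#"[c] for c in row))
    row = step(row)
```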
Interference in the classical probabilistic model and its representation in complex Hilbert space
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei Yu.
2005-10-01
The notion of a context (a complex of physical conditions, that is to say, a specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are already present in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: "first context and only then probability". We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus the dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as a Hilbert space projection of a realistic dynamics in a "prespace". The basic condition for representing the prespace dynamics is the law of statistical conservation of energy, i.e., conservation of probabilities. In general the Hilbert space projection of the "prespace" dynamics can be nonlinear and even irreversible (but it is always unitary). Methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems. In particular, it was recently found in some psychological experiments.
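In Khrennikov's contextual approach the classical formula of total probability acquires an interference term; with λ = cos θ this is the trigonometric, quantum-like case, and λ = 0 recovers the classical law (the two-valued observable a plays the role of the slits in the double-slit analogy):

```latex
P(b \mid C) = \sum_{i=1,2} P(a_i \mid C)\, P(b \mid a_i)
  + 2\lambda \sqrt{ P(a_1 \mid C)\, P(b \mid a_1)\, P(a_2 \mid C)\, P(b \mid a_2) }.
```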
Dawson, Michael R W; Dupuis, Brian; Spetch, Marcia L; Kelly, Debbie M
2009-08-01
The matching law (Herrnstein 1961) states that response rates become proportional to reinforcement rates; this is related to the empirical phenomenon called probability matching (Vulkan 2000). Here, we show that a simple artificial neural network generates responses consistent with probability matching. This behavior was then used to create an operant procedure for network learning. We use the multiarmed bandit (Gittins 1989), a classic problem of choice behavior, to illustrate that operant training balances exploiting the bandit arm expected to pay off most frequently with exploring other arms. Perceptrons provide a medium for relating results from neural networks, genetic algorithms, animal learning, contingency theory, reinforcement learning, and theories of choice.
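To make the matching behavior concrete, here is a minimal delta-rule learner (our illustration, not the paper's perceptron architecture): choices made in proportion to running payoff estimates drift toward matching-law proportions on a two-armed bandit with made-up payoff probabilities.

```python
import random

# Minimal sketch of probability matching on a two-armed bandit: the learner
# tracks each arm's estimated payoff (delta rule) and chooses arms in
# proportion to those estimates, rather than always exploiting the best arm.
p_reward = [0.8, 0.2]          # hypothetical payoff probabilities
value = [0.5, 0.5]             # running payoff estimates
alpha = 0.01                   # learning rate
counts = [0, 0]

random.seed(1)
for trial in range(20000):
    p0 = value[0] / (value[0] + value[1])   # matching-style choice rule
    arm = 0 if random.random() < p0 else 1
    reward = 1.0 if random.random() < p_reward[arm] else 0.0
    value[arm] += alpha * (reward - value[arm])
    counts[arm] += 1

print(counts[0] / sum(counts))  # approaches 0.8, the matching proportion
```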
Semenov, Alexander; Babikov, Dmitri
2015-12-17
The mixed quantum/classical theory (MQCT) for inelastic scattering of two molecules is developed, in which the internal (rotational, vibrational) motion of both collision partners is treated with quantum mechanics, and the molecule-molecule scattering (translational motion) is described by classical trajectories. The resultant MQCT formalism includes a system of coupled differential equations for quantum probability amplitudes, and the classical equations of motion in the mean-field potential. Numerical tests of this theory are carried out for several of the most important rotational state-to-state transitions in the N2 + H2 system, in a broad range of collision energies. Apart from scattering resonances (at low collision energies), excellent agreement with full-quantum results is obtained, including the excitation thresholds, the maxima of cross sections, and even some smaller features, such as slight oscillations of energy dependencies. Most importantly, at higher energies the results of MQCT are nearly identical to the full quantum results, which makes this approach a good alternative to full-quantum calculations that become computationally expensive at higher collision energies and for heavier collision partners. Extensions of this theory to include vibrational transitions or general asymmetric-top rotor (polyatomic) molecules are relatively straightforward.
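In schematic form (our paraphrase, not the authors' exact equations), a mean-field mixed quantum/classical scheme of this kind couples the amplitudes of internal states to a single classical trajectory R(t):

```latex
i\hbar\,\dot{a}_n(t) = \sum_{m} a_m(t)\, V_{nm}\big(\mathbf{R}(t)\big)\, e^{i(E_n - E_m)t/\hbar},
\qquad
\mu\,\ddot{\mathbf{R}} = -\nabla_{\mathbf{R}} \sum_{n,m} a_n^{*} a_m\, V_{nm}(\mathbf{R})\, e^{i(E_n - E_m)t/\hbar}
```

Here V_nm are matrix elements of the interaction potential over the internal (rotational, vibrational) states and μ is the collision reduced mass; the trajectory moves in the field averaged over the quantum state, which is what "mean-field potential" refers to above.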
Experimental Observation of Two Features Unexpected from the Classical Theories of Rubber Elasticity
NASA Astrophysics Data System (ADS)
Nishi, Kengo; Fujii, Kenta; Chung, Ung-il; Shibayama, Mitsuhiro; Sakai, Takamasa
2017-12-01
Although the elastic modulus of a Gaussian chain network is thought to be successfully described by classical theories of rubber elasticity, such as the affine and phantom models, verification experiments are largely lacking owing to difficulties in precisely controlling the network structure. We prepared well-defined model polymer networks experimentally, and measured the elastic modulus G for a broad range of polymer concentrations and connectivity probabilities p. In our experiment, we observed two features that were distinct from those predicted by classical theories. First, we observed the critical behavior G ~ |p - pc|^1.95 near the sol-gel transition, where pc is the sol-gel transition point. This scaling law differs from the prediction of classical theories, but can be explained by an analogy between the electric conductivity of resistor networks and the elasticity of polymer networks. Furthermore, we found that the experimental G-p relations in the region above the overlap concentration C* did not follow the affine or phantom theories. Instead, all the G/G0 versus p curves fell onto a single master curve when G was normalized by the elastic modulus at p = 1, G0. We show that the effective medium approximation for Gaussian chain networks explains this master curve.
Analyzing force concept inventory with item response theory
NASA Astrophysics Data System (ADS)
Wang, Jing; Bao, Lei
2010-10-01
Item response theory is a popular assessment method used in education. It rests on the assumption of a probability framework that relates students' innate ability and their performance on test questions. Item response theory transforms students' raw test scores into a scaled proficiency score, which can be used to compare results obtained with different test questions. The scaled score also addresses the issues of ceiling effects and guessing, which commonly exist in quantitative assessment. We used item response theory to analyze the force concept inventory (FCI). Our results show that item response theory can be useful for analyzing physics concept surveys such as the FCI and produces results about the individual questions and student performance that are beyond the capability of classical statistics. The theory yields detailed measurement parameters regarding the difficulty, discrimination features, and probability of correct guess for each of the FCI questions.
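For concreteness, the three-parameter logistic (3PL) model commonly used in IRT combines exactly the three item parameters the abstract names: discrimination, difficulty, and guessing. A minimal sketch with made-up parameter values:

```python
import numpy as np

# Three-parameter logistic (3PL) IRT model. The parameter values below are
# hypothetical, chosen only to illustrate the shape of the item response curve.
def p_correct(theta, a, b, c):
    """Probability that a student of ability theta answers the item correctly:
    a = discrimination, b = difficulty, c = guessing floor."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

# One hypothetical FCI-like item: moderately discriminating (a=1.2),
# average difficulty (b=0.0), one-in-five guessing floor (c=0.2).
for theta in (-2.0, 0.0, 2.0):
    print(theta, round(p_correct(theta, a=1.2, b=0.0, c=0.2), 3))
```

The guessing floor c is what lets the scaled proficiency score correct for the lucky-guess inflation that raw scores suffer from.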
Quantum-like Modeling of Cognition
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei
2015-09-01
This paper begins with a historical review of the mutual influence of physics and psychology, from Freud's invention of psychic energy, inspired by Boltzmann's thermodynamics, to the enrichment quantum physics gained from psychology through the notion of complementarity (the invention of Niels Bohr, who was inspired by William James); we also consider the resonance of the correspondence between Wolfgang Pauli and Carl Jung in both physics and psychology. Then we turn to the problem of developing mathematical models for laws of thought, starting with Boolean logic and progressing towards the foundations of classical probability theory. Interestingly, the laws of classical logic and probability are routinely violated not only by quantum statistical phenomena but by cognitive phenomena as well. This is yet another common feature between quantum physics and psychology. In particular, cognitive data can exhibit a kind of probabilistic interference effect. This similarity with quantum physics convinced a multi-disciplinary group of scientists (physicists, psychologists, economists, sociologists) to apply the mathematical apparatus of quantum mechanics to the modeling of cognition. We illustrate this activity by considering a few concrete phenomena: the order and disjunction effects, recognition of ambiguous figures, and categorization-decision making. In Appendix 1 we briefly present the essentials of the theory of contextual probability and a method of representing contextual probabilities by complex probability amplitudes (a solution of the "inverse Born's problem") based on a quantum-like representation algorithm (QLRA).
NASA Astrophysics Data System (ADS)
McCauley, Joseph L.
2009-09-01
Preface; 1. Econophysics: why and what; 2. Neo-classical economic theory; 3. Probability and stochastic processes; 4. Introduction to financial economics; 5. Introduction to portfolio selection theory; 6. Scaling, pair correlations, and conditional densities; 7. Statistical ensembles: deducing dynamics from time series; 8. Martingale option pricing; 9. FX market globalization: evolution of the dollar to worldwide reserve currency; 10. Macroeconomics and econometrics: regression models vs. empirically based modeling; 11. Complexity; Index.
A model of adaptive decision-making from representation of information environment by quantum fields
NASA Astrophysics Data System (ADS)
Bagarello, F.; Haven, E.; Khrennikov, A.
2017-10-01
We present a mathematical model of decision-making (DM) by agents acting in a complex and uncertain environment (combining a huge variety of economic, financial, behavioural and geopolitical factors). To describe the interaction of agents with this environment, we apply the formalism of quantum field theory (QFT). The quantum fields are of a purely informational nature. The QFT model can be treated as a far relative of expected utility theory, where the role of utility is played by adaptivity to an environment (bath). However, this sort of utility-adaptivity cannot be represented simply as a numerical function. The operator representation in Hilbert space is used, and adaptivity is described as in quantum dynamics. We are especially interested in the stabilization of solutions for sufficiently large times. The outputs of this stabilization process, probabilities for possible choices, are treated in the framework of classical DM. To connect classical and quantum DM, we appeal to Quantum Bayesianism. We demonstrate the quantum-like interference effect in DM, which is exhibited as a violation of the formula of total probability, and hence of the classical Bayesian inference scheme. This article is part of the themed issue 'Second quantum revolution: foundational questions'.
Contextual Advantage for State Discrimination
NASA Astrophysics Data System (ADS)
Schmid, David; Spekkens, Robert W.
2018-02-01
Finding quantitative aspects of quantum phenomena which cannot be explained by any classical model has foundational importance for understanding the boundary between classical and quantum theory. It also has practical significance for identifying information processing tasks for which those phenomena provide a quantum advantage. Using the framework of generalized noncontextuality as our notion of classicality, we find one such nonclassical feature within the phenomenology of quantum minimum-error state discrimination. Namely, we identify quantitative limits on the success probability for minimum-error state discrimination in any experiment described by a noncontextual ontological model. These constraints constitute noncontextuality inequalities that are violated by quantum theory, and this violation implies a quantum advantage for state discrimination relative to noncontextual models. Furthermore, our noncontextuality inequalities are robust to noise and are operationally formulated, so that any experimental violation of the inequalities is a witness of contextuality, independently of the validity of quantum theory. Along the way, we introduce new methods for analyzing noncontextuality scenarios and demonstrate a tight connection between our minimum-error state discrimination scenario and a Bell scenario.
Quantum Structure in Cognition and the Foundations of Human Reasoning
NASA Astrophysics Data System (ADS)
Aerts, Diederik; Sozzo, Sandro; Veloz, Tomas
2015-12-01
Traditional cognitive science rests on a foundation of classical logic and probability theory. This foundation has been seriously challenged by several findings in experimental psychology on human decision making. Meanwhile, the formalism of quantum theory has provided an efficient resource for modeling these classically problematical situations. In this paper, we start from our successful quantum-theoretic approach to the modeling of concept combinations to formulate a unifying explanatory hypothesis. In it, human reasoning is the superposition of two processes: a conceptual reasoning, whose nature is the emergence of new conceptuality, and a logical reasoning, founded on an algebraic calculus of the logical type. In most cognitive processes, however, the former prevails over the latter. In this perspective, the observed deviations from classical logical reasoning should not be interpreted as biases but, rather, as natural expressions of emergence in its deepest form.
A Transferrable Belief Model Representation for Physical Security of Nuclear Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Gerts
This work analyzed various probabilistic methods such as classic statistics, Bayesian inference, possibilistic theory, and the Dempster-Shafer theory of belief functions for the potential insight offered into the physical security of nuclear materials, as well as for broader application to automated decision-making theory for nuclear non-proliferation. A review of the fundamental heuristics and basic limitations of each of these methods suggested that the Dempster-Shafer theory of belief functions may offer significant capability. Further examination of the various interpretations of Dempster-Shafer theory, such as random set, generalized Bayesian, and upper/lower probability, demonstrates some limitations. Compared to the other heuristics, the transferrable belief model (TBM), one of the leading interpretations of Dempster-Shafer theory, can improve the automated detection of violations of physical security using sensors and human judgment. The improvement is shown to give a significant heuristic advantage over other probabilistic options by demonstrating significant successes for several classic gedanken experiments.
Quantum algorithms for quantum field theories.
Jordan, Stephen P; Lee, Keith S M; Preskill, John
2012-06-01
Quantum field theory reconciles quantum mechanics and special relativity, and plays a central role in many areas of physics. We developed a quantum algorithm to compute relativistic scattering probabilities in a massive quantum field theory with quartic self-interactions (φ⁴ theory) in spacetime of four and fewer dimensions. Its run time is polynomial in the number of particles, their energy, and the desired precision, and the algorithm applies at both weak and strong coupling. In the strong-coupling and high-precision regimes, our quantum algorithm achieves exponential speedup over the fastest known classical algorithm.
Do quantum strategies always win?
NASA Astrophysics Data System (ADS)
Anand, Namit; Benjamin, Colin
2015-11-01
In a seminal paper, Meyer (Phys Rev Lett 82:1052, 1999) described the advantages of quantum game theory by looking at the classical penny flip game. A player using a quantum strategy can win against a classical player almost 100% of the time. Here we make a slight modification to the quantum game, with the two players sharing an entangled state to begin with. We then analyze two different scenarios: in the first, the quantum player makes unitary transformations on his qubit, while the classical player uses a pure strategy of either flipping or not flipping the state of his qubit. In this case, the quantum player always wins against the classical player. In the second scenario, the quantum player makes similar unitary transformations, while the classical player uses a mixed strategy wherein he flips or does not flip with some probability p. We show that in the second scenario the 100% win record of the quantum player is drastically reduced, and for a particular probability p the classical player can even win against the quantum player. This is of possible relevance to the field of quantum computation, as we show that in this quantum game of preserving versus destroying entanglement a particular classical algorithm can beat the quantum algorithm.
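The always-win result for the unmodified game is easy to verify numerically. Below is a minimal sketch (our illustration of Meyer's original, unentangled game, not of the authors' modified version): whatever flip probability p the classical player uses, the quantum player's two Hadamards return the coin to heads.

```python
import numpy as np

# Meyer's penny flip: quantum player applies a Hadamard before and after
# the classical player's move. Any classical mixture of flip/no-flip leaves
# the intermediate state |+> unchanged, so the final state is always |0>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])   # classical flip
I = np.eye(2)

for p in (0.0, 0.3, 0.5, 1.0):   # classical player's flip probability
    rho = np.array([[1, 0], [0, 0]], dtype=complex)   # coin starts heads, |0>
    rho = H @ rho @ H.conj().T
    rho = p * (X @ rho @ X.conj().T) + (1 - p) * (I @ rho @ I.conj().T)
    rho = H @ rho @ H.conj().T
    print(p, np.real(rho[0, 0]))  # P(heads) = 1 for every p
```

The entangled variant studied in the paper breaks exactly this invariance, which is why a mixed classical strategy can then erode the quantum player's advantage.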
Noncommutative Valuation of Options
NASA Astrophysics Data System (ADS)
Herscovich, Estanislao
2016-12-01
The aim of this note is to show that the classical results in finance theory for the pricing of derivatives, obtained by making use of the replication principle, can be extended to the noncommutative world. We believe that this could be of interest in quantum probability. The main result, called the first fundamental theorem of asset pricing, states that a noncommutative stock market admits no arbitrage if and only if it admits a noncommutative equivalent martingale probability.
Introducing the Qplex: a novel arena for quantum theory
NASA Astrophysics Data System (ADS)
Appleby, Marcus; Fuchs, Christopher A.; Stacey, Blake C.; Zhu, Huangjun
2017-07-01
We reconstruct quantum theory starting from the premise that, as Asher Peres remarked, "Unperformed experiments have no results." The tools of quantum information theory, and in particular the symmetric informationally complete (SIC) measurements, provide a concise expression of how exactly Peres's dictum holds true. That expression is a constraint on how the probability distributions for outcomes of different, hypothetical and mutually exclusive experiments ought to mesh together, a type of constraint not foreseen in classical thinking. Taking this as our foundational principle, we show how to reconstruct the formalism of quantum theory in finite-dimensional Hilbert spaces. The central variety of mathematical entity in our reconstruction is the qplex, a very particular type of subset of a probability simplex. Along the way, by closely studying the symmetry properties of qplexes, we derive a condition for the existence of a d-dimensional SIC.
Principles of Quantum Mechanics
NASA Astrophysics Data System (ADS)
Landé, Alfred
2013-10-01
Preface; Introduction: 1. Observation and interpretation; 2. Difficulties of the classical theories; 3. The purpose of quantum theory; Part I. Elementary Theory of Observation (Principle of Complementarity): 4. Refraction in inhomogeneous media (force fields); 5. Scattering of charged rays; 6. Refraction and reflection at a plane; 7. Absolute values of momentum and wave length; 8. Double ray of matter diffracting light waves; 9. Double ray of matter diffracting photons; 10. Microscopic observation of ρ(x) and σ(p); 11. Complementarity; 12. Mathematical relation between ρ(x) and σ(p) for free particles; 13. General relation between ρ(q) and σ(p); 14. Crystals; 15. Transition density and transition probability; 16. Resultant values of physical functions; matrix elements; 17. Pulsating density; 18. General relation between ρ(t) and σ(ε); 19. Transition density; matrix elements; Part II. The Principle of Uncertainty: 20. Optical observation of density in matter packets; 21. Distribution of momenta in matter packets; 22. Mathematical relation between ρ and σ; 23. Causality; 24. Uncertainty; 25. Uncertainty due to optical observation; 26. Dissipation of matter packets; rays in Wilson Chamber; 27. Density maximum in time; 28. Uncertainty of energy and time; 29. Compton effect; 30. Bothe-Geiger and Compton-Simon experiments; 31. Doppler effect; Raman effect; 32. Elementary bundles of rays; 33. Jeans' number of degrees of freedom; 34. Uncertainty of electromagnetic field components; Part III. The Principle of Interference and Schrödinger's equation: 35. Physical functions; 36. Interference of probabilities for p and q; 37. General interference of probabilities; 38. Differential equations for Ψp(q) and Xq(p); 39. Differential equation for φβ(q); 40. The general probability amplitude Φβ'(Q); 41. Point transformations; 42. General theorem of interference; 43. Conjugate variables; 44. Schrödinger's equation for conservative systems; 45. Schrödinger's equation for non-conservative systems; 46. Perturbation theory; 47. Orthogonality, normalization and Hermitian conjugacy; 48. General matrix elements; Part IV. The Principle of Correspondence: 49. Contact transformations in classical mechanics; 50. Point transformations; 51. Contact transformations in quantum mechanics; 52. Constants of motion and angular co-ordinates; 53. Periodic orbits; 54. De Broglie and Schrödinger function; correspondence to classical mechanics; 55. Packets of probability; 56. Correspondence to hydrodynamics; 57. Motion and scattering of wave packets; 58. Formal correspondence between classical and quantum mechanics; Part V. Mathematical Appendix: Principle of Invariance: 59. The general theorem of transformation; 60. Operator calculus; 61. Exchange relations; three criteria for conjugacy; 62. First method of canonical transformation; 63. Second method of canonical transformation; 64. Proof of the transformation theorem; 65. Invariance of the matrix elements against unitary transformations; 66. Matrix mechanics; Index of literature; Index of names and subjects.
ERIC Educational Resources Information Center
Hill, Theodore P.; Morrison, Kent E.
2010-01-01
This paper surveys the fascinating mathematics of fair division, and provides a suite of examples using basic ideas from algebra, calculus, and probability which can be used to examine and test new and sometimes complex mathematical theories and claims involving fair division. Conversely, the classical cut-and-choose and moving-knife algorithms…
Prospect theory on the brain? Toward a cognitive neuroscience of decision under risk.
Trepel, Christopher; Fox, Craig R; Poldrack, Russell A
2005-04-01
Most decisions must be made without advance knowledge of their consequences. Economists and psychologists have devoted much attention to modeling decisions made under conditions of risk, in which options can be characterized by a known probability distribution over possible outcomes. The descriptive shortcomings of classical economic models motivated the development of prospect theory (D. Kahneman, A. Tversky, Prospect theory: An analysis of decision under risk. Econometrica 47 (1979) 263-291; A. Tversky, D. Kahneman, Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty 5 (4) (1992) 297-323), the most successful behavioral model of decision under risk. In prospect theory, subjective value is modeled by a value function that is concave for gains, convex for losses, and steeper for losses than for gains; the impact of probabilities is characterized by a weighting function that overweights low probabilities and underweights moderate to high probabilities. We outline the possible neural bases of the components of prospect theory, surveying evidence from human imaging, lesion, and neuropharmacology studies as well as animal neurophysiology studies. These results provide preliminary suggestions concerning the neural bases of prospect theory, which include a broad set of brain regions and neuromodulatory systems. These data suggest that focused studies of decision making in the context of quantitative models may provide substantial leverage towards a fuller understanding of the cognitive neuroscience of decision making.
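For reference, both components of prospect theory have simple closed forms; the sketch below uses the median parameter estimates reported by Tversky and Kahneman (1992), and the example gamble is our own.

```python
# Prospect theory's two components, with Tversky & Kahneman's (1992) median
# parameter estimates: alpha = 0.88, lambda = 2.25, gamma = 0.61 (gains).
def value(x, alpha=0.88, lam=2.25):
    """Concave for gains, convex and steeper (loss aversion) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def weight(p, gamma=0.61):
    """Overweights small p, underweights moderate-to-large p."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Prospect value of a 10% chance to win $100 versus a sure $10: the
# overweighted small probability makes the gamble look attractive.
print(weight(0.10) * value(100.0), value(10.0))
```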
Theory of Stochastic Laplacian Growth
NASA Astrophysics Data System (ADS)
Alekseev, Oleg; Mineev-Weinstein, Mark
2017-07-01
We generalize diffusion-limited aggregation by issuing many randomly walking particles, which stick to a cluster at discrete time units, providing its growth. Using simple combinatorial arguments we determine the probabilities of different growth scenarios and prove that the most probable evolution is governed by the deterministic Laplacian growth equation. A potential-theoretical analysis of the growth probabilities reveals connections with the tau-function of the integrable dispersionless limit of the two-dimensional Toda hierarchy, normal matrix ensembles, and the two-dimensional Dyson gas confined in a non-uniform magnetic field. We introduce the time-dependent Hamiltonian, which generates transitions between different equivalence classes of closed curves, and prove the Hamiltonian structure of the interface dynamics. Finally, we propose a relation between probabilities of growth scenarios and the semi-classical limit of certain correlation functions of "light" exponential operators in the Liouville conformal field theory on a pseudosphere.
A quantum theory account of order effects and conjunction fallacies in political judgments.
Yearsley, James M; Trueblood, Jennifer S
2017-09-06
Are our everyday judgments about the world around us normative? Decades of research in the judgment and decision-making literature suggest the answer is no. If people's judgments do not follow normative rules, then what rules, if any, do they follow? Quantum probability theory is a promising new approach to modeling human behavior that is at odds with normative, classical rules. One key advantage of using quantum theory is that it explains multiple types of judgment errors using the same basic machinery, unifying what have previously been thought of as disparate phenomena. In this article, we test predictions from quantum theory related to the co-occurrence of two classic judgment phenomena, order effects and conjunction fallacies, using judgments about real-world events (related to the U.S. presidential primaries). We also show that our data obey two a priori, parameter-free constraints derived from quantum theory. Further, we examine two factors that moderate the effects: cognitive thinking style (as measured by the Cognitive Reflection Test) and political ideology.
Classical theory of atom-surface scattering: The rainbow effect
NASA Astrophysics Data System (ADS)
Miret-Artés, Salvador; Pollak, Eli
2012-07-01
The scattering of heavy atoms and molecules from surfaces is oftentimes dominated by classical mechanics. A large body of experiments have gathered data on the angular distributions of the scattered species, their energy loss distribution, sticking probability, dependence on surface temperature and more. For many years these phenomena have been considered theoretically in the framework of the “washboard model” in which the interaction of the incident particle with the surface is described in terms of hard wall potentials. Although this class of models has helped in elucidating some of the features it left open many questions such as: true potentials are clearly not hard wall potentials, it does not provide a realistic framework for phonon scattering, and it cannot explain the incident angle and incident energy dependence of rainbow scattering, nor can it provide a consistent theory for sticking. In recent years we have been developing a classical perturbation theory approach which has provided new insight into the dynamics of atom-surface scattering. The theory includes both surface corrugation as well as interaction with surface phonons in terms of harmonic baths which are linearly coupled to the system coordinates. This model has been successful in elucidating many new features of rainbow scattering in terms of frictions and bath fluctuations or noise. It has also given new insight into the origins of asymmetry in atomic scattering from surfaces. New phenomena deduced from the theory include friction induced rainbows, energy loss rainbows, a theory of super-rainbows, and more. In this review we present the classical theory of atom-surface scattering as well as extensions and implications for semiclassical scattering and the further development of a quantum theory of surface scattering. Special emphasis is given to the inversion of scattering data into information on the particle-surface interactions.
Computation in generalised probabilistic theories
NASA Astrophysics Data System (ADS)
Lee, Ciarán M.; Barrett, Jonathan
2015-08-01
From the general difficulty of simulating quantum systems using classical systems, and in particular the existence of an efficient quantum algorithm for factoring, it is likely that quantum computation is intrinsically more powerful than classical computation. At present, the best upper bound known for the power of quantum computation is that BQP ⊆ AWPP, where AWPP is a classical complexity class (known to be included in PP, hence PSPACE). This work investigates limits on computational power that are imposed by simple physical, or information theoretic, principles. To this end, we define a circuit-based model of computation in a class of operationally-defined theories more general than quantum theory, and ask: what is the minimal set of physical assumptions under which the above inclusions still hold? We show that given only an assumption of tomographic locality (roughly, that multipartite states and transformations can be characterized by local measurements), efficient computations are contained in AWPP. This inclusion still holds even without assuming a basic notion of causality (where the notion is, roughly, that probabilities for outcomes cannot depend on future measurement choices). Following Aaronson, we extend the computational model by allowing post-selection on measurement outcomes. Aaronson showed that the corresponding quantum complexity class, PostBQP, is equal to PP. Given only the assumption of tomographic locality, the inclusion in PP still holds for post-selected computation in general theories. Hence in a world with post-selection, quantum theory is optimal for computation in the space of all operational theories. We then consider whether one can obtain relativized complexity results for general theories. It is not obvious how to define a sensible notion of a computational oracle in the general framework that reduces to the standard notion in the quantum case. Nevertheless, it is possible to define computation relative to a 'classical oracle'. Then, we show there exists a classical oracle relative to which efficient computation in any theory satisfying the causality assumption does not include NP.
NASA Astrophysics Data System (ADS)
Basiladze, S. G.
2017-05-01
The paper describes a general physical theory of signals, the carriers of information, which supplements Shannon's abstract classical theory and is applicable in much broader fields, including nuclear physics. It is shown that in the absence of classical noise its place should be taken by the physical threshold of signal perception, for objects of both the macrocosm and the microcosm. The signal perception threshold allows the presence of subthreshold (virtual) signal states. For these states, the Boolean algebra of logic (A = 0/1) is transformed into the "algebraic logic" of probabilities (0 ≤ a ≤ 1). The similarity and difference of virtual states of macro- and microsignals are elucidated. "Real" and "quantum" information for computers is considered briefly. The maximum information transmission rate is estimated based on physical constants.
Quantum-like dynamics of decision-making in prisoner's dilemma game
NASA Astrophysics Data System (ADS)
Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu
2012-03-01
In cognitive psychology, several experiments on games have been reported [1, 2, 3, 4], and these demonstrated that real players do not use the "rational strategy" provided by classical game theory. To discuss the probabilities of such "irrational" choices, we recently proposed a decision-making model based on the formalism of quantum mechanics [5, 6, 7, 8]. In this paper, we briefly explain the above model and calculate the probability of irrational choice in several prisoner's dilemma (PD) games.
Hypergame theory applied to cyber attack and defense
NASA Astrophysics Data System (ADS)
House, James Thomas; Cybenko, George
2010-04-01
This work concerns cyber attack and defense in the context of game theory, specifically hypergame theory. Hypergame theory extends classical game theory with the ability to deal with differences in players' expertise, differences in their understanding of game rules, misperceptions, and so forth. Each of these different sub-scenarios, or subgames, is associated with a probability representing the likelihood that the given subgame is truly "in play" at a given moment. In order to form an optimal attack or defense policy, these probabilities must be learned if they are not known a priori. We present hidden Markov model and maximum entropy approaches for accurately learning these probabilities through multiple iterations of both normal and modified game play. We also give a widely-applicable approach for the analysis of cases where an opponent is aware that he is being studied, and intentionally plays to spoil the process of learning and thereby obfuscate his attributes. These are considered in the context of a generic, abstract cyber attack example. We demonstrate that machine learning efficacy can be heavily dependent on the goals and styles of participant behavior. To this end, detailed simulation results under various combinations of attacker and defender behaviors are presented and analyzed.
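As a simplified stand-in for the learning step (the paper itself uses hidden Markov model and maximum entropy methods), the sketch below does plain Bayesian updating of subgame probabilities from observed opponent moves; the subgames and likelihoods are invented for illustration.

```python
# Plain Bayesian updating of subgame probabilities from observed play.
# The candidate subgames and their move likelihoods are hypothetical.
likelihood = {                      # P(observed move | subgame in play)
    'novice_attacker':  {'scan': 0.7, 'exploit': 0.2, 'pivot': 0.1},
    'expert_attacker':  {'scan': 0.2, 'exploit': 0.3, 'pivot': 0.5},
}
belief = {'novice_attacker': 0.5, 'expert_attacker': 0.5}   # uniform prior

for move in ['scan', 'pivot', 'pivot', 'exploit']:  # observed opponent moves
    for g in belief:
        belief[g] *= likelihood[g][move]            # multiply in likelihood
    total = sum(belief.values())
    belief = {g: b / total for g, b in belief.items()}  # renormalize

print(belief)  # posterior probability that each subgame is "in play"
```

A spoiling opponent, in the paper's sense, is one who deliberately chooses moves that keep this posterior uninformative.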
NASA Technical Reports Server (NTRS)
Mckenzie, R. L.
1974-01-01
The semiclassical approximation is applied to anharmonic diatomic oscillators in excited initial states. Multistate numerical solutions giving the vibrational transition probabilities for collinear collisions with an inert atom are compared with equivalent, exact quantum-mechanical calculations. Several symmetrization methods are shown to correlate accurately the predictions of both theories for all initial states, transitions, and molecular types tested, but only if coupling of the oscillator motion and the classical trajectory of the incident particle is considered. In anharmonic heteronuclear molecules, the customary semiclassical method of computing the classical trajectory independently leads to transition probabilities with anomalous low-energy resonances. Proper accounting of the effects of oscillator compression and recoil on the incident particle trajectory removes the anomalies and restores the applicability of the semiclassical approximation.
Weak Measurement and Quantum Smoothing of a Superconducting Qubit
NASA Astrophysics Data System (ADS)
Tan, Dian
In quantum mechanics, the measurement outcome of an observable in a quantum system is intrinsically random, yielding a probability distribution. The state of the quantum system can be described by a density matrix ρ(t), which depends on the information accumulated until time t and represents our knowledge about the system. The density matrix ρ(t) gives probabilities for the outcomes of measurements at time t. Further probing of the quantum system allows us to refine our prediction in hindsight. In this thesis, we experimentally examine a quantum smoothing theory in a superconducting qubit by introducing an auxiliary matrix E(t), which is conditioned on information obtained from time t to a final time T. With the complete information before and after time t, the pair of matrices [ρ(t), E(t)] can be used to make smoothed predictions for the measurement outcome at time t. We apply the quantum smoothing theory in the case of continuous weak measurement, unveiling the retrodicted quantum trajectories and weak values. In the case of strong projective measurement, while the density matrix ρ(t) with only diagonal elements in a given basis |n〉 may be treated as a classical mixture, we demonstrate a failure of this classical-mixture description in determining the smoothed probabilities for the measurement outcome at time t with both diagonal ρ(t) and diagonal E(t). We study the correlations between quantum states and weak measurement signals and examine aspects of the time symmetry of continuous quantum measurement. We also extend our study of quantum smoothing theory to the case of resonance fluorescence of a superconducting qubit with homodyne measurement and observe some interesting effects such as the modification of the excited state probabilities, weak values, and evolution of the predicted and retrodicted trajectories.
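For background, in the past-quantum-state formalism of Gammelmark, Julsgaard, and Mølmer, on which such smoothing analyses build, the smoothed probability for outcome n at time t is given by the pair [ρ(t), E(t)] as follows (our transcription; M_n denote the measurement operators at time t):

```latex
P_s(n \mid \rho, E) =
  \frac{\mathrm{Tr}\!\left[ M_n\, \rho(t)\, M_n^{\dagger}\, E(t) \right]}
       {\sum_{m} \mathrm{Tr}\!\left[ M_m\, \rho(t)\, M_m^{\dagger}\, E(t) \right]}
```

Setting E(t) proportional to the identity (no posterior information) recovers the ordinary predictive Born probabilities from ρ(t) alone.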
On the asymptotic states and the quantum S matrix of the η-deformed AdS5 × S5 superstring
Engelund, Oluf Tang; Roiban, Radu
2015-03-31
We investigate the worldsheet S matrix of string theory in η-deformed AdS5 × S5. By computing the six-point tree-level S matrix we explicitly show that there is no particle production at this level, as required by the classical integrability of the theory. At one and two loops we show that integrability requires that the classical two-particle states be redefined in a non-local and η-dependent way. This is a significant departure from the undeformed theory which is probably related to the quantum group symmetry of the worldsheet theory. We use generalized unitarity to carry out the loop calculations and identify a set of integrals that allow us to give a two-loop Feynman integral representation of the logarithmic terms of the two-loop S matrix. We finally also discuss aspects of the calculation of the two-loop rational terms.
Reality, Contextuality, and Probability in Quantum Theory and Beyond
NASA Astrophysics Data System (ADS)
Plotnitsky, Arkady
This chapter explores the relationships among reality, contextuality, and probability, especially in quantum theory and, briefly and by extension, in other fields where these concepts, in their quantum-like versions, may play key roles. The chapter contends, following Derrida's argument, that while no meaning or event could be determined apart from its context, no context ultimately permits saturation, that is, no context could ever be determined with certainty. Any such determination is ultimately provisional. However, because of its mathematical-experimental character, physics allows one, in classical physics and relativity, to disregard the role of the context of observation in describing the physical systems considered, and in quantum mechanics, where the context of observation cannot be so disregarded, to determine such a context sufficiently. While, however, classical physics or relativity and quantum mechanics can do so sufficiently for their disciplinary functioning and practice, they cannot do so entirely. Moreover, a given concept of this functioning, especially as concerns what is considered its proper functioning, still depends on a broader contextual field that defies saturation or guaranteed determination.
Quantum Graphical Models and Belief Propagation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leifer, M. S.; Poulin, D.
Belief Propagation algorithms acting on Graphical Models of classical probability distributions, such as Markov Networks, Factor Graphs and Bayesian Networks, are amongst the most powerful known methods for deriving probabilistic inferences amongst large numbers of random variables. This paper presents a generalization of these concepts and methods to the quantum case, based on the idea that quantum theory can be thought of as a noncommutative, operator-valued, generalization of classical probability theory. Some novel characterizations of quantum conditional independence are derived, and definitions of Quantum n-Bifactor Networks, Markov Networks, Factor Graphs and Bayesian Networks are proposed. The structure of Quantum Markov Networks is investigated and some partial characterization results are obtained, along the lines of the Hammersley-Clifford theorem. A Quantum Belief Propagation algorithm is presented and is shown to converge on 1-Bifactor Networks and Markov Networks when the underlying graph is a tree. The use of Quantum Belief Propagation as a heuristic algorithm in cases where it is not known to converge is discussed. Applications to decoding quantum error correcting codes and to the simulation of many-body quantum systems are described.
Quantum entanglement percolation
NASA Astrophysics Data System (ADS)
Siomau, Michael
2016-09-01
Quantum communication demands efficient distribution of quantum entanglement across a network of connected partners. The search for efficient strategies for the entanglement distribution may be based on percolation theory, which describes evolution of network connectivity with respect to some network parameters. In this framework, the probability to establish perfect entanglement between two remote partners decays exponentially with the distance between them before the percolation transition point, which unambiguously defines percolation properties of any classical network or lattice. Here we introduce quantum networks created with local operations and classical communication, which exhibit non-classical percolation transition points leading to striking communication advantages over those offered by the corresponding classical networks. We show, in particular, how to establish perfect entanglement between any two nodes in the simplest possible network—the 1D chain—using imperfectly entangled pairs of qubits.
Quantum transitions through cosmological singularities
NASA Astrophysics Data System (ADS)
Bramberger, Sebastian F.; Hertog, Thomas; Lehners, Jean-Luc; Vreys, Yannick
2017-07-01
In a quantum theory of cosmology spacetime behaves classically only in limited patches of the configuration space on which the wave function of the universe is defined. Quantum transitions can connect classical evolution in different patches. Working in the saddle point approximation and in minisuperspace we compute quantum transitions connecting inflationary histories across a de Sitter like throat or a singularity. This supplies probabilities for how an inflating universe, when evolved backwards, transitions and branches into an ensemble of histories on the opposite side of a quantum bounce. Generalising our analysis to scalar potentials with negative regions we identify saddle points describing a quantum transition between a classically contracting, crunching ekpyrotic phase and an inflationary universe.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Bin; Guo, Hua
Recently, we reported the first highly accurate nine-dimensional global potential energy surface (PES) for water interacting with a rigid Ni(111) surface, built on a large number of density functional theory points [B. Jiang and H. Guo, Phys. Rev. Lett. 114, 166101 (2015)]. Here, we investigate site-specific reaction probabilities on this PES using a quasi-seven-dimensional quantum dynamical model. It is shown that the site-specific reactivity is largely controlled by the topography of the PES instead of the barrier height alone, underscoring the importance of multidimensional dynamics. In addition, the full-dimensional dissociation probability is estimated by averaging fixed-site reaction probabilities with appropriate weights. To validate this model and gain insights into the dynamics, additional quasi-classical trajectory calculations in both full and reduced dimensions have also been performed, and important dynamical factors such as the steering effect are discussed.
Fundamental finite key limits for one-way information reconciliation in quantum key distribution
NASA Astrophysics Data System (ADS)
Tomamichel, Marco; Martinez-Mateo, Jesus; Pacher, Christoph; Elkouss, David
2017-11-01
The security of quantum key distribution protocols is guaranteed by the laws of quantum mechanics. However, a precise analysis of the security properties requires tools from both classical cryptography and information theory. Here, we employ recent results in non-asymptotic classical information theory to show that one-way information reconciliation imposes fundamental limitations on the amount of secret key that can be extracted in the finite key regime. In particular, we find that an often used approximation for the information leakage during information reconciliation is not generally valid. We propose an improved approximation that takes into account finite key effects and numerically test it against codes for two probability distributions, that we call binary-binary and binary-Gaussian, that typically appear in quantum key distribution protocols.
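The often used approximation in question is, to our understanding, the standard asymptotic leakage estimate; with n the block length, Q the quantum bit error rate, h the binary entropy function, and f ≳ 1 a code inefficiency factor, it reads:

```latex
\mathrm{leak}_{\mathrm{EC}} \approx f \, n \, h(Q),
\qquad
h(Q) = -Q \log_2 Q - (1-Q)\log_2(1-Q)
```

The paper's point is that this linear-in-n estimate can misjudge the leakage at finite block lengths, which is what their corrected approximation accounts for.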
Quantum Information as a Non-Kolmogorovian Generalization of Shannon's Theory
NASA Astrophysics Data System (ADS)
Holik, Federico; Bosyk, Gustavo; Bellomo, Guido
2015-10-01
In this article we discuss the formal structure of a generalized information theory based on the extension of the probability calculus of Kolmogorov to a (possibly) non-commutative setting. By studying this framework, we argue that quantum information can be considered as a particular case of a huge family of non-commutative extensions of its classical counterpart. In any conceivable information theory, the possibility of dealing with different kinds of information measures plays a key role. Here, we generalize a notion of state spectrum, allowing us to introduce a majorization relation and a new family of generalized entropic measures.
Quantum-Like Model for Decision Making Process in Two Players Game. A Non-Kolmogorovian Model
NASA Astrophysics Data System (ADS)
Asano, Masanari; Ohya, Masanori; Khrennikov, Andrei
2011-03-01
In game experiments, players frequently make choices that are regarded as irrational in game theory. In papers of Khrennikov (Information Dynamics in Cognitive, Psychological and Anomalous Phenomena. Fundamental Theories of Physics, Kluwer Academic, Norwell, 2004; Fuzzy Sets Syst. 155:4-17, 2005; Biosystems 84:225-241, 2006; Found. Phys. 35(10):1655-1693, 2005; in QP-PQ Quantum Probability and White Noise Analysis, vol. XXIV, pp. 105-117, 2009), it was pointed out that statistics collected in such experiments have "quantum-like" properties, which cannot be explained by classical probability theory. In this paper, we design a simple quantum-like model describing the decision-making process in a two-player game and try to explain the mechanism of the irrational behavior of players. Finally, we discuss a mathematical frame of non-Kolmogorovian systems in terms of liftings (Accardi and Ohya, in Appl. Math. Optim. 39:33-59, 1999).
Quantum gravity in timeless configuration space
NASA Astrophysics Data System (ADS)
Gomes, Henrique
2017-12-01
On the path towards quantum gravity we find friction between temporal relations in quantum mechanics (QM) (where they are fixed and field-independent), and in general relativity (where they are field-dependent and dynamic). This paper aims to attenuate that friction, by encoding gravity in the timeless configuration space of spatial fields with dynamics given by a path integral. The framework demands that boundary conditions for this path integral be uniquely given, but unlike other approaches where they are prescribed—such as the no-boundary and the tunneling proposals—here I postulate basic principles to identify boundary conditions in a large class of theories. Uniqueness arises only if a reduced configuration space can be defined and if it has a profoundly asymmetric fundamental structure. These requirements place strong restrictions on the field and symmetry content of theories encompassed here; shape dynamics is one such theory. When these constraints are met, any emerging theory will have a Born rule given merely by a particular volume element built from the path integral in (reduced) configuration space. As in other boundary proposals, time, including space-time, emerges as an effective concept, valid for certain curves in configuration space but not assumed from the start. When some such notion of time becomes available, conservation of (positive) probability currents ensues. I show that, in the appropriate limits, a Schrödinger equation dictates the evolution of weakly coupled source fields on a classical gravitational background. Due to the asymmetry of reduced configuration space, these probabilities and currents avoid a known difficulty of standard WKB approximations for the Wheeler-DeWitt equation in minisuperspace: the selection of a unique Hamilton-Jacobi solution to serve as background. I illustrate these constructions with a simple example of a full quantum gravitational theory (i.e. not in minisuperspace) for which the formalism is applicable, and give a formula for calculating gravitational semi-classical relative probabilities in it.
Physics of risk and uncertainty in quantum decision making
NASA Astrophysics Data System (ADS)
Yukalov, V. I.; Sornette, D.
2009-10-01
The Quantum Decision Theory, developed recently by the authors, is applied to clarify the role of risk and uncertainty in decision making and in particular in relation to the phenomenon of dynamic inconsistency. By formulating this notion in precise mathematical terms, we distinguish three types of inconsistency: time inconsistency, planning paradox, and inconsistency occurring in some discounting effects. While time inconsistency is well accounted for in classical decision theory, the planning paradox is in contradiction with classical utility theory. It finds a natural explanation in the frame of the Quantum Decision Theory. Different types of discounting effects are analyzed and shown to enjoy a straightforward explanation within the suggested theory. We also introduce a general methodology based on self-similar approximation theory for deriving the evolution equations for the probabilities of future prospects. This provides a novel classification of possible discount factors, which include the previously known cases (exponential or hyperbolic discounting), but also predicts a novel class of discount factors that decay to a strictly positive constant for very large future time horizons. This class may be useful to deal with very long-term discounting situations associated with intergenerational public policy choices, encompassing issues such as global warming and nuclear waste disposal.
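For orientation (our schematic notation, not the authors' formulas): exponential and hyperbolic discount factors both vanish at long horizons, whereas the novel class predicted here decays to a strictly positive constant:

```latex
D_{\mathrm{exp}}(t) = e^{-\kappa t},
\qquad
D_{\mathrm{hyp}}(t) = \frac{1}{1 + \kappa t},
\qquad
\lim_{t \to \infty} D_{\mathrm{new}}(t) = c > 0
```

A floor c > 0 means that arbitrarily remote outcomes retain nonzero weight, which is what makes this class relevant to intergenerational choices such as climate or nuclear-waste policy.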
Spectral dimension controlling the decay of the quantum first-detection probability
NASA Astrophysics Data System (ADS)
Thiel, Felix; Kessler, David A.; Barkai, Eli
2018-06-01
We consider a quantum system that is initially localized at x_in and that is repeatedly projectively probed with a fixed period τ at position x_d. We ask for the probability F_n that the system is detected at x_d for the very first time, where n is the number of detection attempts. We relate the asymptotic decay and oscillations of F_n to the system's energy spectrum, which is assumed to be absolutely continuous. In particular, F_n is determined by the Hamiltonian's measurement spectral density of states (MSDOS) f(E), which is closely related to the density of energy states (DOS). We find that F_n decays like a power law whose exponent is determined by the power-law exponent d_S of f(E) around its singularities E*. Our findings are analogous to the classical first passage theory of random walks. In contrast to the classical case, the decay of F_n is accompanied by oscillations with frequencies that are determined by the singularities E*. This gives rise to critical detection periods τ_c at which the oscillations disappear. In the ordinary case d_S can be identified with the spectral dimension associated with the DOS. Furthermore, the singularities E* are the van Hove singularities of the DOS in this case. We find that the asymptotic statistics of F_n depend crucially on the initial and detection states and can be wildly different for out-of-the-ordinary states, which is in sharp contrast to the classical theory. The properties of the first-detection probabilities can alternatively be derived from the transition amplitudes. All our results are confirmed by numerical simulations of the tight-binding model, and of a free particle in continuous space both with a normal and with an anomalous dispersion relation. We provide explicit asymptotic formulas for the first-detection probability in these models.
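The stroboscopic protocol in the abstract is straightforward to simulate; the sketch below (ours, not the authors' code) implements it for a 1D tight-binding ring: evolve for a period τ, read off the first-detection probability at the detector site, then project the detected amplitude out.

```python
import numpy as np
from scipy.linalg import expm

# Stroboscopic first-detection protocol on a tight-binding ring: evolve for
# a period tau, record F_n = |<x_d|psi>|^2 (first detection at attempt n),
# then zero the detector-site amplitude (projective non-detection update).
L, x_in, x_d, tau = 64, 0, 10, 0.25
H = -(np.eye(L, k=1) + np.eye(L, k=-1))      # tight-binding Hamiltonian
H[0, -1] = H[-1, 0] = -1.0                   # periodic boundary conditions
U = expm(-1j * H * tau)                      # one-period propagator

psi = np.zeros(L, dtype=complex)
psi[x_in] = 1.0
F = []
for n in range(1, 2001):
    psi = U @ psi
    F.append(abs(psi[x_d]) ** 2)             # first-detection probability F_n
    psi[x_d] = 0.0                           # remove the detected amplitude

print(sum(F))            # total detection probability after 2000 attempts
print(F[:5])             # first few F_n, whose decay/oscillations are studied
```

Plotting F_n against n on a log-log scale exposes the power-law decay and the superimposed oscillations the paper relates to the singularities of f(E).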
New fundamental evidence of non-classical structure in the combination of natural concepts.
Aerts, D; Sozzo, S; Veloz, T
2016-01-13
We recently performed cognitive experiments on conjunctions and negations of two concepts with the aim of investigating the combination problem of concepts. Our experiments confirmed the deviations (conceptual vagueness, underextension, overextension etc.) from the rules of classical (fuzzy) logic and probability theory observed by several scholars in concept theory, while our data were successfully modelled in a quantum-theoretic framework developed by ourselves. In this paper, we isolate a new, very stable and systematic pattern of violation of classicality that occurs in concept combinations. In addition, the strength and regularity of this non-classical effect leads us to believe that it occurs at a more fundamental level than the deviations observed up to now. It is our opinion that we have identified a deep non-classical mechanism determining not only how concepts are combined but, rather, how they are formed. We show that this effect can be faithfully modelled in a two-sector Fock space structure, and that it can be exactly explained by assuming that human thought is the superposition of two processes, a 'logical reasoning', guided by 'logic', and a 'conceptual reasoning', guided by 'emergence', and that the latter generally prevails over the former. All these findings provide new fundamental support to our quantum-theoretic approach to human cognition. © 2015 The Author(s).
NASA Astrophysics Data System (ADS)
Friedberg, R.; Hohenberg, P. C.
2014-09-01
Formulations of quantum mechanics (QM) can be characterized as realistic, operationalist, or a combination of the two. In this paper a realistic theory is defined as describing a closed system entirely by means of entities and concepts pertaining to the system. An operationalist theory, on the other hand, requires in addition entities external to the system. A realistic formulation comprises an ontology, the set of (mathematical) entities that describe the system, and assertions, the set of correct statements (predictions) the theory makes about the objects in the ontology. Classical mechanics is the prime example of a realistic physical theory. A straightforward generalization of classical mechanics to QM is hampered by the inconsistency of quantum properties with classical logic, a circumstance that was noted many years ago by Birkhoff and von Neumann. The present realistic formulation of the histories approach originally introduced by Griffiths, which we call ‘compatible quantum theory (CQT)’, consists of a ‘microscopic’ part (MIQM), which applies to a closed quantum system of any size, and a ‘macroscopic’ part (MAQM), which requires the participation of a large (ideally, an infinite) system. The first (MIQM) can be fully formulated based solely on the assumption of a Hilbert space ontology and the noncontextuality of probability values, relying in an essential way on Gleason's theorem and on an application to dynamics due in large part to Nistico. Thus, the present formulation, in contrast to earlier ones, derives the Born probability formulas and the consistency (decoherence) conditions for frameworks. The microscopic theory does not, however, possess a unique corpus of assertions, but rather a multiplicity of contextual truths (‘c-truths’), each one associated with a different framework. This circumstance leads us to consider the microscopic theory to be physically indeterminate and therefore incomplete, though logically coherent. The completion of the theory requires a macroscopic mechanism for selecting a physical framework, which is part of the macroscopic theory (MAQM). The selection of a physical framework involves the breaking of the microscopic ‘framework symmetry’, which can proceed either phenomenologically as in the standard quantum measurement theory, or more fundamentally by considering the quantum system under study to be a subsystem of a macroscopic quantum system. The decoherent histories formulation of Gell-Mann and Hartle, as well as that of Omnès, are theories of this fundamental type, where the physical framework is selected by a coarse-graining procedure in which the physical phenomenon of decoherence plays an essential role. Various well-known interpretations of QM are described from the perspective of CQT. Detailed definitions and proofs are presented in the appendices.
Are Quantum Models for Order Effects Quantum?
NASA Astrophysics Data System (ADS)
Moreira, Catarina; Wichert, Andreas
2017-12-01
The application of principles of Quantum Mechanics in areas outside of physics has been receiving increasing attention in the scientific community, in an emergent discipline called Quantum Cognition. These principles have been applied to explain paradoxical situations that cannot be easily explained through classical theory. In quantum probability, events are characterised by a superposition state, which is represented by a state vector in an N-dimensional vector space. The probability of an event is given by the squared magnitude of the projection of this superposition state onto the desired subspace. This geometric approach is very useful for explaining paradoxical findings that involve order effects, but do we really need quantum principles for models that only involve projections? This work has two main goals. First, since it is still not clear in the literature whether a quantum projection model has any advantage over a classical one, we compared both models and concluded that the quantum projection model achieves the same results as its classical counterpart, because quantum interference effects play no role in the computation of the probabilities. Second, we propose an alternative relativistic interpretation for the rotation parameters involved in both the classical and quantum models. In the end, instead of interpreting these parameters as a similarity measure between questions, we propose that they emerge due to a lack of knowledge about a personal basis state and due to uncertainties about the state of the world and the context of the questions.
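The projection mechanics at issue can be illustrated in a few lines. In the sketch below, the vectors and angles are our own illustrative assumptions rather than parameters from the paper; it shows how answering two questions in different orders yields different probabilities whenever the corresponding projectors do not commute:

```python
import numpy as np

def projector(theta):
    """Rank-1 projector onto the direction (cos theta, sin theta)."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

psi = np.array([1.0, 0.0])          # initial belief state (unit vector)
PA = projector(np.radians(30))      # "yes" subspace for question A
PB = projector(np.radians(80))      # "yes" subspace for question B

# Probability of answering yes to A then yes to B, and in the reverse order:
p_AB = np.linalg.norm(PB @ PA @ psi) ** 2
p_BA = np.linalg.norm(PA @ PB @ psi) ** 2
print(p_AB, p_BA)  # differ because PA and PB do not commute
```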
Quantifying Bell nonlocality with the trace distance
NASA Astrophysics Data System (ADS)
Brito, S. G. A.; Amaral, B.; Chaves, R.
2018-02-01
Measurements performed on distant parts of an entangled quantum state can generate correlations incompatible with classical theories respecting the assumption of local causality. This is the phenomenon known as quantum nonlocality that, apart from its fundamental role, can also be put to practical use in applications such as cryptography and distributed computing. Clearly, developing ways of quantifying nonlocality is an important primitive in this scenario. Here, we propose to quantify the nonlocality of a given probability distribution via its trace distance to the set of classical correlations. We show that this measure is a monotone under the free operations of a resource theory and, furthermore, that it can be computed efficiently with a linear program. We put our framework to use in a variety of relevant Bell scenarios also comparing the trace distance to other standard measures in the literature.
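The linear-program computation mentioned here is straightforward to sketch. Below, the scenario (CHSH with two binary inputs and outputs), the test behavior (the Tsirelson-bound quantum point), and the plain L1 objective are our own illustrative choices; Brito et al. define their trace-distance measure with specific normalization factors:

```python
import numpy as np
from scipy.optimize import linprog

# Deterministic local strategies: a = f(x), b = g(y), with f, g: {0,1}->{0,1}.
dets = []
for fa in range(4):
    for fb in range(4):
        f = [(fa >> x) & 1 for x in range(2)]
        g = [(fb >> y) & 1 for y in range(2)]
        d = np.zeros(16)
        for x in range(2):
            for y in range(2):
                for a in range(2):
                    for b in range(2):
                        idx = ((x * 2 + y) * 2 + a) * 2 + b
                        d[idx] = 1.0 if (a == f[x] and b == g[y]) else 0.0
        dets.append(d)
D = np.array(dets).T                     # 16 x 16, columns = local strategies

# Quantum behavior saturating the Tsirelson bound:
p = np.zeros(16)
for x in range(2):
    for y in range(2):
        for a in range(2):
            for b in range(2):
                idx = ((x * 2 + y) * 2 + a) * 2 + b
                p[idx] = 0.25 * (1 + ((-1) ** (a ^ b ^ (x & y))) / np.sqrt(2))

# LP: minimize sum(t) subject to |p - D q| <= t, with q a probability vector.
n = D.shape[1]
c = np.concatenate([np.zeros(n), np.ones(16)])
A_ub = np.block([[D, -np.eye(16)], [-D, -np.eye(16)]])
b_ub = np.concatenate([p, -p])
A_eq = np.concatenate([np.ones(n), np.zeros(16)])[None, :]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * (n + 16))
print("L1 distance to the local set:", res.fun)
```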
Pressure wave propagation in fluid-filled co-axial elastic tubes. Part 1: Basic theory.
Berkouk, K; Carpenter, P W; Lucey, A D
2003-12-01
Our work is motivated by ideas about the pathogenesis of syringomyelia. This is a serious disease characterized by the appearance of longitudinal cavities within the spinal cord. Its causes are unknown, but pressure propagation is probably implicated. We have developed an inviscid theory for the propagation of pressure waves in co-axial, fluid-filled, elastic tubes. This is intended as a simple model of the intraspinal cerebrospinal-fluid system. Our approach is based on the classic theory for the propagation of longitudinal waves in single, fluid-filled, elastic tubes. We show that for small-amplitude waves the governing equations reduce to the classic wave equation. The wave speed is found to be a strong function of the ratio of the tubes' cross-sectional areas. It is found that the leading edge of a transmural pressure pulse tends to generate compressive waves with converging wave fronts. Consequently, the leading edge of the pressure pulse steepens to form a shock-like elastic jump. A weakly nonlinear theory is developed for such an elastic jump.
Entanglement-enhanced Neyman-Pearson target detection using quantum illumination
NASA Astrophysics Data System (ADS)
Zhuang, Quntao; Zhang, Zheshen; Shapiro, Jeffrey H.
2017-08-01
Quantum illumination (QI) provides entanglement-based target detection---in an entanglement-breaking environment---whose performance is significantly better than that of optimum classical-illumination target detection. QI's performance advantage was established in a Bayesian setting with the target presumed equally likely to be absent or present and error probability employed as the performance metric. Radar theory, however, eschews that Bayesian approach, preferring the Neyman-Pearson performance criterion to avoid the difficulties of accurately assigning prior probabilities to target absence and presence and appropriate costs to false-alarm and miss errors. We have recently reported an architecture---based on sum-frequency generation (SFG) and feedforward (FF) processing---for minimum error-probability QI target detection with arbitrary prior probabilities for target absence and presence. In this paper, we use our results for FF-SFG reception to determine the receiver operating characteristic---detection probability versus false-alarm probability---for optimum QI target detection under the Neyman-Pearson criterion.
Cumulants, free cumulants and half-shuffles
Ebrahimi-Fard, Kurusch; Patras, Frédéric
2015-01-01
Free cumulants were introduced as the proper analogue of classical cumulants in the theory of free probability. There is a mix of similarities and differences, when one considers the two families of cumulants. Whereas the combinatorics of classical cumulants is well expressed in terms of set partitions, that of free cumulants is described and often introduced in terms of non-crossing set partitions. The formal series approach to classical and free cumulants also largely differs. The purpose of this study is to put forward a different approach to these phenomena. Namely, we show that cumulants, whether classical or free, can be understood in terms of the algebra and combinatorics underlying commutative as well as non-commutative (half-)shuffles and (half-) unshuffles. As a corollary, cumulants and free cumulants can be characterized through linear fixed point equations. We study the exponential solutions of these linear fixed point equations, which display well the commutative, respectively non-commutative, character of classical and free cumulants. PMID:27547078
NASA Astrophysics Data System (ADS)
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume II W. Edwards Deming Sample Design in Business Research
Pérez, Omar D; Aitken, Michael R F; Zhukovsky, Peter; Soto, Fabián A; Urcelay, Gonzalo P; Dickinson, Anthony
2016-12-15
Associative learning theories regard the probability of reinforcement as the critical factor determining responding. However, the role of this factor in instrumental conditioning is not completely clear. In fact, free-operant experiments show that participants respond at a higher rate on variable ratio than on variable interval schedules even though the reinforcement probability is matched between the schedules. This difference has been attributed to the differential reinforcement of long inter-response times (IRTs) by interval schedules, which acts to slow responding. In the present study, we used a novel experimental design to investigate human responding under random ratio (RR) and regulated probability interval (RPI) schedules, a type of interval schedule that sets a reinforcement probability independently of the IRT duration. Participants responded on each type of schedule before a final choice test in which they distributed responding between two schedules similar to those experienced during training. Although response rates did not differ during training, the participants responded at a lower rate on the RPI schedule than on the matched RR schedule during the choice test. This preference cannot be attributed to a higher probability of reinforcement for long IRTs and questions the idea that similar associative processes underlie classical and instrumental conditioning.
NASA Astrophysics Data System (ADS)
Alekseev, Oleg; Mineev-Weinstein, Mark
2016-12-01
A point source on a plane constantly emits particles which rapidly diffuse and then stick to a growing cluster. The growth probability of a cluster is presented as a sum over all possible scenarios leading to the same final shape. The classical point for the action, defined as minus the logarithm of the growth probability, describes the most probable scenario and reproduces the Laplacian growth equation, which embraces numerous fundamental free boundary dynamics in nonequilibrium physics. For nonclassical scenarios we introduce virtual point sources, in whose presence the action becomes the Kullback-Leibler entropy. Strikingly, this entropy is shown to be the sum of electrostatic energies of layers grown per elementary time unit. Hence the growth probability of the presented nonequilibrium process obeys Gibbs-Boltzmann statistics, which, as a rule, do not apply out of equilibrium. Each layer's probability is expressed as a product of simple factors in an auxiliary complex plane after a properly chosen conformal map. The action in this plane is a sum of Robin functions, which solve the Liouville equation. Finally, we establish connections of our theory with the τ function of the integrable Toda hierarchy and with the Liouville theory for noncritical quantum strings.
Quantum-Like Representation of Non-Bayesian Inference
NASA Astrophysics Data System (ADS)
Asano, M.; Basieva, I.; Khrennikov, A.; Ohya, M.; Tanaka, Y.
2013-01-01
This research is related to the problem of "irrational decision making or inference" that has been discussed in cognitive psychology. There are some experimental studies whose statistical data cannot be described by classical probability theory. The process of decision making generating these data cannot be reduced to classical Bayesian inference. For this problem, a number of quantum-like cognitive models of decision making have been proposed. Our previous work represented in a natural way the classical Bayesian inference in the framework of quantum mechanics. By using this representation, in this paper, we try to discuss the non-Bayesian (irrational) inference that is biased by effects like quantum interference. Further, we describe the "psychological factor" disturbing "rationality" as an "environment" correlating with the "main system" of usual Bayesian inference.
The stochastic energy-Casimir method
NASA Astrophysics Data System (ADS)
Arnaudon, Alexis; Ganaba, Nader; Holm, Darryl D.
2018-04-01
In this paper, we extend the energy-Casimir stability method for deterministic Lie-Poisson Hamiltonian systems to provide sufficient conditions for stability in probability of stochastic dynamical systems with symmetries. We illustrate this theory with classical examples of coadjoint motion, including the rigid body, the heavy top, and the compressible Euler equation in two dimensions. The main result is that stable deterministic equilibria remain stable in probability up to a certain stopping time that depends on the amplitude of the noise for finite-dimensional systems and on the amplitude of the spatial derivative of the noise for infinite-dimensional systems.
Unconditional security of quantum key distribution over arbitrarily long distances
Lo; Chau
1999-03-26
Quantum key distribution is widely thought to offer unconditional security in communication between two users. Unfortunately, a widely accepted proof of its security in the presence of source, device, and channel noises has been missing. This long-standing problem is solved here by showing that, given fault-tolerant quantum computers, quantum key distribution over an arbitrarily long distance of a realistic noisy channel can be made unconditionally secure. The proof is reduced from a noisy quantum scheme to a noiseless quantum scheme and then from a noiseless quantum scheme to a noiseless classical scheme, which can then be tackled by classical probability theory.
Efficient quantum walk on a quantum processor
Qiang, Xiaogang; Loke, Thomas; Montanaro, Ashley; Aungskunsiri, Kanin; Zhou, Xiaoqi; O'Brien, Jeremy L.; Wang, Jingbo B.; Matthews, Jonathan C. F.
2016-01-01
The random walk formalism is used across a wide range of applications, from modelling share prices to predicting population genetics. Likewise, quantum walks have shown much potential as a framework for developing new quantum algorithms. Here we present explicit efficient quantum circuits for implementing continuous-time quantum walks on the circulant class of graphs. These circuits allow us to sample from the output probability distributions of quantum walks on circulant graphs efficiently. We also show that solving the same sampling problem for arbitrary circulant quantum circuits is intractable for a classical computer, assuming conjectures from computational complexity theory. This is a new link between continuous-time quantum walks and computational complexity theory and it indicates a family of tasks that could ultimately demonstrate quantum supremacy over classical computers. As a proof of principle, we experimentally implement the proposed quantum circuit on an example circulant graph using a two-qubit photonics quantum processor. PMID:27146471
Boolean approach to dichotomic quantum measurement theories
NASA Astrophysics Data System (ADS)
Nagata, K.; Nakamura, T.; Batle, J.; Abdalla, S.; Farouk, A.
2017-02-01
Recently, a new measurement theory based on truth values was proposed by Nagata and Nakamura [Int. J. Theor. Phys. 55, 3616 (2016)], that is, a theory where the results of measurements are either 0 or 1. The standard measurement theory accepts a hidden variable model for a single Pauli observable. Hence, we can introduce a classical probability space for the measurement theory in this particular case. Additionally, we discuss in the present contribution the fact that projective measurement theories (the results of which are either +1 or -1) imply the Bell, Kochen, and Specker (BKS) paradox for a single Pauli observable. To justify our assertion, we present the BKS theorem for almost all two-dimensional states by using a projective measurement theory. As an example, we present the BKS theorem in two dimensions with white noise. Our discussion provides new insight into the quantum measurement problem by using this measurement theory based on truth values.
About the cumulants of periodic signals
NASA Astrophysics Data System (ADS)
Barrau, Axel; El Badaoui, Mohammed
2018-01-01
This note studies cumulants of time series. Although these functions originate in probability theory, they are commonly used as features of deterministic signals, and we examine their classical properties in this modified framework. We show that the additivity of cumulants, which is ensured in the case of independent random variables, requires a different hypothesis here. Practical applications are proposed, in particular an analysis of the failure of the JADE algorithm to separate some specific periodic signals.
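The moment-to-cumulant relations behind this analysis are easy to state and check numerically. The following sketch, with illustrative signals of our own choosing, estimates the first four cumulants from time averages and shows that additivity can fail for phase-coupled periodic components even though it holds for independent random variables:

```python
import numpy as np

# First four cumulants estimated from sample (time-average) moments,
# using the standard moment-to-cumulant relations.
def cumulants(x):
    m1 = np.mean(x)
    xc = x - m1
    m2, m3, m4 = (np.mean(xc ** k) for k in (2, 3, 4))
    return m1, m2, m3, m4 - 3 * m2 ** 2   # k1, k2, k3, k4

t = np.linspace(0, 1, 10_000, endpoint=False)
s1 = np.sin(2 * np.pi * t)
s2 = np.cos(4 * np.pi * t)        # a phase-coupled harmonic of s1

# For independent random variables k_n(X+Y) = k_n(X) + k_n(Y); for these
# deterministic signals the third cumulant of the sum (-0.75) differs from
# the sum of the third cumulants (0), so additivity fails.
print(cumulants(s1 + s2))
print(tuple(a + b for a, b in zip(cumulants(s1), cumulants(s2))))
```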
Reliability of a Measure of Institutional Discrimination against Minorities
1979-12-01
Issues in constructing a statistical measure of institutional discrimination are discussed. Two methods of dealing with the problem of the reliability of the measure in small samples are presented: the first is based upon classical statistical theory, and the second derives from a series of computer-generated Monte Carlo simulations.
Trapping dynamics of xenon on Pt(111)
NASA Astrophysics Data System (ADS)
Arumainayagam, Christopher R.; Madix, Robert J.; Mcmaster, Mark C.; Suzawa, Valerie M.; Tully, John C.
1990-02-01
The dynamics of Xe trapping on Pt(111) was studied using supersonic atomic beam techniques. Initial trapping probabilities (S_0) were measured directly as a function of incident translational energy (E_T) and angle of incidence (θ_i) at a surface temperature (T_s) of 95 K. The initial trapping probability decreases smoothly with increasing E_T cos θ_i, rather than E_T cos²θ_i, suggesting participation of parallel momentum in the trapping process. Accordingly, the measured initial trapping probability falls off more slowly with increasing incident translational energy than predicted by one-dimensional theories. This finding is in near agreement with previous mean translational energy measurements for Xe desorbing near the Pt(111) surface normal, assuming detailed balance applies. Three-dimensional stochastic classical trajectory calculations presented herein also exhibit the importance of tangential momentum in trapping and satisfactorily reproduce the experimental initial trapping probabilities.
Quantum Probability Cancellation Due to a Single-Photon State
NASA Technical Reports Server (NTRS)
Ou, Z. Y.
1996-01-01
When an N-photon state enters a lossless symmetric beamsplitter from one input port, the photon distribution for the two output ports has the form of a Bernoulli binomial, with the highest probability at equal partition (N/2 at one output port and N/2 at the other). However, injection of a single-photon state at the other input port can dramatically change the photon distribution at the outputs, resulting in zero probability at equal partition. Such a strong deviation from classical particle theory stems from quantum probability amplitude cancellation. The effect persists even if the N-photon state is replaced by an arbitrary state of light. A special case is the coherent state, which corresponds to homodyne detection of a single-photon state and can lead to the measurement of the wave function of a single-photon state.
Predicting the probability of slip in gait: methodology and distribution study.
Gragg, Jared; Yang, James
2016-01-01
The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
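One standard single-integral reduction of the slip probability, evaluated with the trapezoidal rule the study advocates, can be sketched as follows; the distributions and parameters are illustrative assumptions (with a deliberately non-normal required friction), not the paper's exact expression or data:

```python
import numpy as np
from scipy import stats

# P(slip) = P(mu_req > mu_avail) = integral of f_avail(u) * (1 - F_req(u)) du,
# evaluated with the trapezoidal rule over a friction-coefficient grid.
avail = stats.norm(loc=0.55, scale=0.08)    # available friction (normal)
req = stats.lognorm(s=0.4, scale=0.20)      # required friction (lognormal)

u = np.linspace(0.0, 1.2, 2001)
integrand = avail.pdf(u) * (1.0 - req.cdf(u))
p_slip = np.sum((integrand[1:] + integrand[:-1]) * np.diff(u)) / 2.0

# Monte Carlo cross-check of the same probability:
rng = np.random.default_rng(0)
mc = np.mean(req.rvs(10**6, random_state=rng) >
             avail.rvs(10**6, random_state=rng))
print(p_slip, mc)
```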
A quantum measure of the multiverse
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vilenkin, Alexander, E-mail: vilenkin@cosmos.phy.tufts.edu
2014-05-01
It has been recently suggested that probabilities of different events in the multiverse are given by the frequencies at which these events are encountered along the worldline of a geodesic observer (the 'watcher'). Here I discuss an extension of this probability measure to quantum theory. The proposed extension is gauge-invariant, as is the classical version of this measure. Observations of the watcher are described by a reduced density matrix, and the frequencies of events can be found using the decoherent histories formalism of Quantum Mechanics (adapted to open systems). The quantum watcher measure makes predictions in agreement with the standard Born rule of QM.
Fourth-Order Vibrational Transition State Theory and Chemical Kinetics
NASA Astrophysics Data System (ADS)
Stanton, John F.; Matthews, Devin A.; Gong, Justin Z.
2015-06-01
Second-order vibrational perturbation theory (VPT2) is an enormously successful and well-established theory for treating anharmonic effects on the vibrational levels of semi-rigid molecules. Partially as a consequence of the fact that the theory is exact for the Morse potential (which provides an appropriate qualitative model for stretching anharmonicity), VPT2 calculations for such systems with appropriate ab initio potential functions tend to give fundamental and overtone levels that fall within a handful of wavenumbers of experimentally measured positions. As a consequence, the next non-vanishing level of perturbation theory -- VPT4 -- offers only slight improvements over VPT2 and is not practical for most calculations since it requires information about force constants up through sextic. However, VPT4 (as well as VPT2) can be used for other applications such as the next vibrational correction to rotational constants (the ``gammas'') and other spectroscopic parameters. In addition, the marriage of VPT with the semi-classical transition state theory of Miller (SCTST) has recently proven to be a powerful and accurate treatment for chemical kinetics. In this talk, VPT4-based SCTST tunneling probabilities and cumulative reaction probabilities are given for the first time for selected low-dimensional model systems. The prospects for VPT4, both practical and intrinsic, will also be discussed.
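The SCTST transmission formula at the heart of this treatment has a compact form that is easy to sketch. In the sketch below the barrier parameters are toy values and the quadratic "anharmonic" term is a stand-in for the VPT-derived corrections, not an actual VPT2/VPT4 expansion:

```python
import numpy as np

# SCTST-style transmission probability P(E) = 1 / (1 + exp(2*theta(E))).
# For a parabolic barrier theta(E) = pi * (V0 - E) / (hbar * omega); the
# quadratic term below is a toy stand-in for anharmonic corrections.
hbar = 1.0
V0, omega = 10.0, 1.0    # barrier height, magnitude of imaginary frequency

def theta(E, x4=0.0):
    dE = V0 - E
    return np.pi * dE / (hbar * omega) + x4 * dE**2

def transmission(E, x4=0.0):
    return 1.0 / (1.0 + np.exp(2.0 * theta(E, x4)))

for E in (8.0, 9.5, 10.0, 10.5):
    # At E = V0 the parabolic-barrier result is exactly 1/2.
    print(E, transmission(E), transmission(E, x4=0.02))
```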
Asteroid orbital error analysis: Theory and application
NASA Technical Reports Server (NTRS)
Muinonen, K.; Bowell, Edward
1992-01-01
We present a rigorous Bayesian theory for asteroid orbital error estimation in which the probability density of the orbital elements is derived from the noise statistics of the observations. For Gaussian noise in a linearized approximation the probability density is also Gaussian, and the errors of the orbital elements at a given epoch are fully described by the covariance matrix. The law of error propagation can then be applied to calculate past and future positional uncertainty ellipsoids (Cappellari et al. 1976, Yeomans et al. 1987, Whipple et al. 1991). To our knowledge, this is the first time a Bayesian approach has been formulated for orbital element estimation. In contrast to the classical Fisherian school of statistics, the Bayesian school allows a priori information to be formally present in the final estimation. However, Bayesian estimation does give the same results as Fisherian estimation when no a priori information is assumed (Lehtinen 1988, and references therein).
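In the linearized Gaussian case, the error-propagation step reduces to a congruence transform of the covariance matrix. The following sketch uses stand-in numbers (a toy element covariance and a random Jacobian), not a real asteroid solution:

```python
import numpy as np

# Linearized Gaussian error propagation: if the orbital elements have
# covariance C at the solution epoch and Phi is the Jacobian mapping
# element perturbations to a sky-plane position at another epoch, then
# C_pos = Phi @ C @ Phi.T gives the positional uncertainty ellipse.
C = np.diag([1e-8, 4e-9, 2e-9, 1e-9, 1e-9, 5e-10])   # 6x6 element covariance
Phi = np.random.default_rng(1).normal(size=(2, 6))    # toy Jacobian to RA/Dec

C_pos = Phi @ C @ Phi.T
eigval, eigvec = np.linalg.eigh(C_pos)
print("1-sigma error-ellipse semi-axes:", np.sqrt(eigval))
```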
Fixation Probability in a Haploid-Diploid Population
Bessho, Kazuhiro; Otto, Sarah P.
2017-01-01
Classical population genetic theory generally assumes either a fully haploid or fully diploid life cycle. However, many organisms exhibit more complex life cycles, with both free-living haploid and diploid stages. Here we ask what the probability of fixation is for selected alleles in organisms with haploid-diploid life cycles. We develop a genetic model that considers the population dynamics using both the Moran model and Wright–Fisher model. Applying a branching process approximation, we obtain an accurate fixation probability assuming that the population is large and the net effect of the mutation is beneficial. We also find the diffusion approximation for the fixation probability, which is accurate even in small populations and for deleterious alleles, as long as selection is weak. These fixation probabilities from branching process and diffusion approximations are similar when selection is weak for beneficial mutations that are not fully recessive. In many cases, particularly when one phase predominates, the fixation probability differs substantially for haploid-diploid organisms compared to either fully haploid or diploid species. PMID:27866168
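For intuition, the two approximations can be compared in the simplest fully haploid Wright-Fisher setting; this is a simplification of the haploid-diploid model of the paper, using formulas that are standard in population genetics:

```python
import numpy as np

def p_fix_diffusion(N, s):
    """Kimura-style diffusion approximation, initial frequency 1/N."""
    return (1 - np.exp(-2 * s)) / (1 - np.exp(-2 * N * s))

def p_fix_branching(s):
    """Branching-process approximation (large N, beneficial mutant): ~2s."""
    return 2 * s

# The two approximations agree closely when selection is weak:
for s in (0.001, 0.01, 0.05):
    print(s, p_fix_branching(s), p_fix_diffusion(10**4, s))
```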
Tsallis non-extensive statistics and solar wind plasma complexity
NASA Astrophysics Data System (ADS)
Pavlos, G. P.; Iliopoulos, A. C.; Zastenker, G. N.; Zelenyi, L. M.; Karakatsanis, L. P.; Riazantseva, M. O.; Xenakis, M. N.; Pavlos, E. G.
2015-03-01
This article presents novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event, which took place on 26th September 2011. Solar wind plasma is a typical case of stochastic spatiotemporal distribution of physical state variables such as the force fields (the magnetic and electric fields B and E) and matter fields (particle and current densities or bulk plasma distributions). This study shows clearly the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level. It also underlines the inefficiency of classical magneto-hydro-dynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), to explain the complexity of the solar wind dynamics, since these theories include smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian non-extensive statistics with heavy-tailed probability distribution functions, which are related to the q-extension of the CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion theory and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992).
Information Theoretic Characterization of Physical Theories with Projective State Space
NASA Astrophysics Data System (ADS)
Zaopo, Marco
2015-08-01
Probabilistic theories are a natural framework to investigate the foundations of quantum theory and possible alternative or deeper theories. In a generic probabilistic theory, states of a physical system are represented as vectors of outcome probabilities and state spaces are convex cones. In this picture the physics of a given theory is related to the geometric shape of the cone of states. In quantum theory, for instance, the shape of the cone of states corresponds to a projective space over complex numbers. In this paper we investigate geometric constraints on the state space of a generic theory imposed by the following information theoretic requirements: every non completely mixed state of a system is perfectly distinguishable from some other state in a single shot measurement; information capacity of physical systems is conserved under making mixtures of states. These assumptions guarantee that a generic physical system satisfies a natural principle asserting that the more a state of the system is mixed the less information can be stored in the system using that state as logical value. We show that all theories satisfying the above assumptions are such that the shape of their cones of states is that of a projective space over a generic field of numbers. Remarkably, these theories constitute generalizations of quantum theory where the superposition principle holds with coefficients pertaining to a generic field of numbers in place of complex numbers. If the field of numbers is trivial and contains only one element we obtain classical theory. This result tells us that the superposition principle is quite common among probabilistic theories, while its absence is evidence of either classical theory or an implausible theory.
Matrix Concentration Inequalities via the Method of Exchangeable Pairs
2012-01-27
viewed as an exchangeable pairs version of the Burkholder-Davis-Gundy (BDG) inequality from classical martingale theory [Bur73]. Matrix extensions of... non-commutative probability.
[Bur73] D. L. Burkholder. Distribution function inequalities for martingales. Ann. Probab., 1...
[JX03] M. Junge and Q. Xu. Noncommutative Burkholder/Rosenthal inequalities. Ann. Probab., 31(2):948-995, 2003.
Large-deviation probabilities for correlated Gaussian processes and intermittent dynamical systems
NASA Astrophysics Data System (ADS)
Massah, Mozhdeh; Nicol, Matthew; Kantz, Holger
2018-05-01
In its classical version, the theory of large deviations makes quantitative statements about the probability of outliers when estimating time averages, if time series data are identically independently distributed. We study large-deviation probabilities (LDPs) for time averages in short- and long-range correlated Gaussian processes and show that long-range correlations lead to subexponential decay of LDPs. A particular deterministic intermittent map can, depending on a control parameter, also generate long-range correlated time series. We illustrate numerically, in agreement with the mathematical literature, that this type of intermittency leads to a power law decay of LDPs. The power law decay holds irrespective of whether the correlation time is finite or infinite, and hence irrespective of whether the central limit theorem applies or not.
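A minimal numerical illustration of this setting (our own sketch, not the authors' computation) compares the decay of an estimated large-deviation probability for i.i.d. versus short-range-correlated Gaussian data with matched marginal variance:

```python
import numpy as np

# Monte Carlo estimate of the large-deviation probability
# P( (1/n) * sum_i X_i > a ) for i.i.d. vs. AR(1)-correlated Gaussian data
# with unit marginal variance. All parameters are illustrative.
rng = np.random.default_rng(0)
a, trials = 0.5, 100_000

def ldp_iid(n):
    x = rng.normal(size=(trials, n))
    return np.mean(x.mean(axis=1) > a)

def ldp_ar1(n, phi=0.9):
    x = np.empty((trials, n))
    x[:, 0] = rng.normal(size=trials)
    for k in range(1, n):
        x[:, k] = phi * x[:, k - 1] + np.sqrt(1 - phi**2) * rng.normal(size=trials)
    return np.mean(x.mean(axis=1) > a)

# For i.i.d. data the estimate decays exponentially in n;
# positive correlations make the decay markedly slower.
for n in (5, 10, 20, 40):
    print(n, ldp_iid(n), ldp_ar1(n))
```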
[The brain mechanisms of emotions].
Simonov, P V
1997-01-01
At the 23rd International Congress of Physiological Sciences (Tokyo, 1965) the results of experiment led us to the conclusion that emotions are determined by the actual need and the estimated probability (possibility) of its satisfaction. Low probability of need satisfaction leads to negative emotions, which are actively minimized by the subject. Increased probability of satisfaction, as compared to the earlier forecast, generates positive emotions, which the subject tries to maximize, that is, to enhance, to prolong, to repeat. We named our concept the Need-Informational Theory of Emotions. According to this theory, motivation, emotion and estimation of probability have different neuromorphological substrates. Activation of the frontal parts of the neocortex by the motivatiogenic structures of the hypothalamus orients behavior to signals with a high probability of reinforcement. At the same time, the hippocampus is necessary for reactions to signals of low-probability events, which are typical for the emotionally excited brain. By comparing motivational excitation with available stimuli or their engrams, the amygdala selects a dominant motivation, destined to be satisfied in the first instance. In classical conditioning and escape reactions, reinforcement was related to the involvement of the hypothalamic neurons of negative emotion, while in the course of avoidance reactions the neurons of positive emotion were involved. The role of the left and right frontal neocortex in the appearance of positive or negative emotions depends on these informational (cognitive) functions.
Zurek, Wojciech Hubert
2018-07-13
The emergence of the classical world from the quantum substrate of our Universe is a long-standing conundrum. In this paper, I describe three insights into the transition from quantum to classical that are based on the recognition of the role of the environment. I begin with the derivation of preferred sets of states that help to define what exists: our everyday classical reality. They emerge as a result of the breaking of the unitary symmetry of the Hilbert space which happens when the unitarity of quantum evolutions encounters nonlinearities inherent in the process of amplification, that is, of replicating information. This derivation is accomplished without the usual tools of decoherence, and accounts for the appearance of quantum jumps and the emergence of preferred pointer states consistent with those obtained via environment-induced superselection, or einselection. The pointer states obtained in this way determine what can happen, define events, without appealing to Born's Rule for probabilities. Therefore, p_k = |ψ_k|² can now be deduced from the entanglement-assisted invariance, or envariance, a symmetry of entangled quantum states. With probabilities at hand, one also gains new insights into the foundations of quantum statistical physics. Moreover, one can now analyse the information flows responsible for decoherence. These information flows explain how the perception of objective classical reality arises from the quantum substrate: the effective amplification that they represent accounts for the objective existence of the einselected states of macroscopic quantum systems through the redundancy of pointer state records in their environment, through quantum Darwinism. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).
NASA Astrophysics Data System (ADS)
Gómez-Carrasco, Susana; González-Sánchez, Lola; Aguado, Alfredo; Sanz-Sanz, Cristina; Zanchet, Alexandre; Roncero, Octavio
2012-09-01
In this work we present a dynamically biased statistical model to describe the evolution of the title reaction from statistical to a more direct mechanism, using quasi-classical trajectories (QCT). The method is based on the one previously proposed by Park and Light [J. Chem. Phys. 126, 044305 (2007), 10.1063/1.2430711]. A recent global potential energy surface is used here to calculate the capture probabilities, instead of the long-range ion-induced dipole interactions. The dynamical constraints are introduced by considering a scrambling matrix which depends on energy and determines the probability of the identity/hop/exchange mechanisms. These probabilities are calculated using QCT. It is found that the high zero-point energy of the fragments is transferred to the rest of the degrees of freedom, which shortens the lifetime of H_5^+ complexes and, as a consequence, the exchange mechanism occurs in lower proportion. The zero-point energy (ZPE) is not properly described in quasi-classical trajectory calculations, and an approximation is made in which the initial ZPE of the reactants is reduced in the QCT calculations to obtain a new ZPE-biased scrambling matrix. This reduction of the ZPE is explained by the need to correct the pure classical level number of the H_5^+ complex, as done in classical simulations of unimolecular processes and to obtain equivalent quantum and classical rate constants using Rice-Ramsperger-Kassel-Marcus theory. This matrix allows one to obtain a ratio of hop/exchange mechanisms, α(T), in rather good agreement with recent experimental results by Crabtree et al. [J. Chem. Phys. 134, 194311 (2011), 10.1063/1.3587246] at room temperature. At lower temperatures, however, the present simulations predict too high ratios because the biased scrambling matrix is not statistical enough. This demonstrates the importance of applying quantum methods to simulate this reaction at the low temperatures of astrophysical interest.
Randomized central limit theorems: A unified theory.
Eliazar, Iddo; Klafter, Joseph
2010-08-01
The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic: all ensemble components are scaled by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic: the ensemble components are scaled by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs), in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes, and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.
Quantum Probability -- A New Direction for Modeling in Cognitive Science
NASA Astrophysics Data System (ADS)
Roy, Sisir
2014-07-01
Human cognition and its appropriate modeling remain puzzling issues in research. Cognition depends on how the brain behaves at a particular instant and identifies and responds to a signal among the myriad noises that are present in the surroundings (called external noise) as well as in the neurons themselves (called internal noise). Thus it is natural to assume that this functionality involves various uncertainties, possibly a mixture of aleatory and epistemic ones. It is also possible that a complicated pathway consisting of both types of uncertainties in continuum plays a major role in human cognition. For more than 200 years mathematicians and philosophers have been using probability theory to describe human cognition. Recently, in several experiments with human subjects, violations of traditional probability theory have been clearly revealed in plenty of cases. The literature clearly suggests that classical probability theory fails to model human cognition beyond a certain limit. While the Bayesian approach may seem a promising candidate for this problem, the complete success story of Bayesian methodology is yet to be written. The major problem seems to be the presence of epistemic uncertainty and its effect on cognition at any given time. Moreover, the stochasticity in the model arises due to the unknown path or trajectory (the definite state of mind at each time point) a person is following. To this end, a generalized version of probability theory borrowing ideas from quantum mechanics may be a plausible approach. A superposition state in quantum theory permits a person to be in an indefinite state at each point of time. Such an indefinite state allows all the states to have the potential to be expressed at each moment. Thus a superposition state appears better able to represent the uncertainty, ambiguity or conflict experienced by a person at any moment, demonstrating that mental states follow quantum mechanics during perception and cognition of ambiguous figures.
ERIC Educational Resources Information Center
Yelboga, Atilla; Tavsancil, Ezel
2010-01-01
In this research, the classical test theory and generalizability theory analyses were carried out with the data obtained by a job performance scale for the years 2005 and 2006. The reliability coefficients obtained (estimated) from the classical test theory and generalizability theory analyses were compared. In classical test theory, test retest…
A kinetic theory for age-structured stochastic birth-death processes
NASA Astrophysics Data System (ADS)
Chou, Tom; Greenman, Chris
Classical age-structured mass-action models such as the McKendrick-von Foerster equation have been extensively studied but they are structurally unable to describe stochastic fluctuations or population-size-dependent birth and death rates. Conversely, current theories that include size-dependent population dynamics (e.g., carrying capacity) cannot be easily extended to take into account age-dependent birth and death rates. In this paper, we present a systematic derivation of a new fully stochastic kinetic theory for interacting age-structured populations. By defining multiparticle probability density functions, we derive a hierarchy of kinetic equations for the stochastic evolution of an aging population undergoing birth and death. We show that the fully stochastic age-dependent birth-death process precludes factorization of the corresponding probability densities, which then must be solved by using a BBGKY-like hierarchy. Our results generalize both deterministic models and existing master equation approaches by providing an intuitive and efficient way to simultaneously model age- and population-dependent stochastic dynamics applicable to the study of demography, stem cell dynamics, and disease evolution.
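As a toy illustration of the kind of process being modeled (our own sketch, not the authors' kinetic hierarchy), the following simulation advances an age-structured population in small time steps, with age-dependent and population-size-dependent rates that are purely illustrative:

```python
import numpy as np

# Toy age-structured stochastic birth-death simulation, advanced in small
# time steps so that age- and population-size-dependent rates are easy to
# include. All rate functions and parameters are illustrative assumptions.
rng = np.random.default_rng(42)
dt, t_max = 0.01, 50.0

def birth_rate(age, n):
    # Only mature individuals reproduce; births are density-limited.
    return 0.6 * (age > 1.0) * max(0.0, 1.0 - n / 200.0)

def death_rate(age, n):
    # Baseline mortality rising linearly with age.
    return 0.05 + 0.02 * age

ages = list(rng.uniform(0, 2, size=20))   # initial population, random ages
t = 0.0
while t < t_max and ages:
    n = len(ages)
    newborn, survivors = 0, []
    for age in ages:
        if rng.random() < birth_rate(age, n) * dt:
            newborn += 1
        if rng.random() >= death_rate(age, n) * dt:   # survives this step
            survivors.append(age + dt)
    ages = survivors + [0.0] * newborn
    t += dt
print(f"population size at t={t_max}: {len(ages)}")
```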
Quantum and classical behavior in interacting bosonic systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hertzberg, Mark P.
It is understood that in free bosonic theories, the classical field theory accurately describes the full quantum theory when the occupancy numbers of systems are very large. However, the situation is less understood in interacting theories, especially on time scales longer than the dynamical relaxation time. Recently there have been claims that the quantum theory deviates spectacularly from the classical theory on this time scale, even if the occupancy numbers are extremely large. Furthermore, it is claimed that the quantum theory quickly thermalizes while the classical theory does not. The evidence for these claims comes from noticing a spectacular difference in the time evolution of expectation values of quantum operators compared to the classical micro-state evolution. If true, this would have dramatic consequences for many important phenomena, including laboratory studies of interacting BECs, dark matter axions, preheating after inflation, etc. In this work we critically examine these claims. We show that in fact the classical theory can describe the quantum behavior in the high occupancy regime, even when interactions are large. The connection is that the expectation values of quantum operators in a single quantum micro-state are approximated by a corresponding classical ensemble average over many classical micro-states. Furthermore, by the ergodic theorem, a classical ensemble average of local fields with statistical translation invariance is the spatial average of a single micro-state. So the correlation functions of the quantum and classical field theories of a single micro-state approximately agree at high occupancy, even in interacting systems. Furthermore, both quantum and classical field theories can thermalize, when appropriate coarse graining is introduced, with the classical case requiring a cutoff on low occupancy UV modes. We discuss applications of our results.
Continuous-Time Classical and Quantum Random Walk on Direct Product of Cayley Graphs
NASA Astrophysics Data System (ADS)
Salimi, S.; Jafarizadeh, M. A.
2009-06-01
In this paper we define the direct product of graphs and give a recipe for obtaining the probability of observing the particle on the vertices in continuous-time classical and quantum random walks. In the recipe, the probability of observing the particle on the direct product of graphs is obtained by multiplying the probabilities on the corresponding sub-graphs, a method that is useful for determining the probability of a walk on complicated graphs. Using this method, we calculate the probability of continuous-time classical and quantum random walks on many finite direct products of Cayley graphs (complete cycle, complete K_n, charter and n-cube). We also note that in the classical case the stationary uniform distribution is reached as t → ∞, but for the quantum walk this is not always the case.
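The two evolutions being compared have compact matrix forms. The sketch below, an illustration on a small cycle graph of our own choosing rather than the paper's direct-product recipe, shows the classical walk relaxing to the uniform distribution while the quantum walk does not:

```python
import numpy as np
from scipy.linalg import expm

# Continuous-time classical walk: p(t) = exp(-L t) p(0), L the Laplacian.
# Continuous-time quantum walk:   p(t) = |exp(-i H t) psi(0)|^2, H = adjacency.
n, t = 8, 30.0
A = np.zeros((n, n))
for i in range(n):                       # complete cycle C_n
    A[i, (i + 1) % n] = A[i, (i - 1) % n] = 1
L = np.diag(A.sum(axis=1)) - A

p0 = np.zeros(n)
p0[0] = 1.0
p_classical = expm(-L * t) @ p0          # approaches the uniform distribution
p_quantum = np.abs(expm(-1j * A * t) @ p0) ** 2   # need not become uniform
print(np.round(p_classical, 3))
print(np.round(p_quantum, 3))
```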
Diagrammar in classical scalar field theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cattaruzza, E., E-mail: Enrico.Cattaruzza@gmail.com; Gozzi, E., E-mail: gozzi@ts.infn.it; INFN, Sezione di Trieste
2011-09-15
In this paper we analyze perturbatively a gφ^4 classical field theory with and without temperature. In order to do that, we make use of a path-integral approach developed some time ago for classical theories. It turns out that the diagrams appearing at the classical level are many more than at the quantum level due to the presence of extra auxiliary fields in the classical formalism. We shall show that a universal supersymmetry present in the classical path-integral mentioned above is responsible for the cancelation of various diagrams. The same supersymmetry allows the introduction of super-fields and super-diagrams which considerably simplify the calculations and make the classical perturbative calculations almost 'identical' formally to the quantum ones. Using the super-diagrams technique, we develop the classical perturbation theory up to third order. We conclude the paper with a perturbative check of the fluctuation-dissipation theorem. Highlights: We provide the Feynman diagrams of perturbation theory for a classical field theory. We give a super-formalism which links the quantum diagrams to the classical ones. We check perturbatively the fluctuation-dissipation theorem.
Wolf, Paul L
2005-11-01
Many myths, theories, and speculations exist as to the exact etiology of the diseases, drugs, and chemicals that affected the creativity and productivity of famous sculptors, classic painters, classic music composers, and authors. To emphasize the importance of a modern clinical chemistry laboratory and hematology coagulation laboratory in interpreting the basis for the creativity and productivity of various artists. This investigation analyzed the lives of famous artists, including classical sculptor Benvenuto Cellini; classical sculptor and painter Michelangelo Buonarroti; classic painters Ivar Arosenius, Edvard Munch, and Vincent Van Gogh; classic music composer Louis Hector Berlioz; and English essayist Thomas De Quincey. The analysis includes their illnesses, their famous artistic works, and the modern clinical chemistry, toxicology, and hematology coagulation tests that would have been important in the diagnosis and treatment of their diseases. The associations between illness and art may be close and numerous, owing both to the actual physical limitations of the artists and to their mental adaptation to disease. Although they were ill, many continued to be productive. If modern clinical chemistry, toxicology, and hematology coagulation laboratories had existed during the lifetimes of these various well-known individuals, clinical laboratories might have unraveled the mysteries of their afflictions. The illnesses these people endured probably could have been ascertained and perhaps treated. Diseases, drugs, and chemicals may have influenced their creativity and productivity.
NASA Astrophysics Data System (ADS)
Dahms, Rainer N.
2016-04-01
A generalized framework for multi-component liquid injections is presented to understand and predict the breakdown of classic two-phase theory and spray atomization at engine-relevant conditions. The analysis focuses on the thermodynamic structure and the immiscibility state of representative gas-liquid interfaces. The most modern form of Helmholtz energy mixture state equation is utilized which exhibits a unique and physically consistent behavior over the entire two-phase regime of fluid densities. It is combined with generalized models for non-linear gradient theory and for liquid injections to quantify multi-component two-phase interface structures in global thermal equilibrium. Then, the Helmholtz free energy is minimized which determines the interfacial species distribution as a consequence. This minimal free energy state is demonstrated to validate the underlying assumptions of classic two-phase theory and spray atomization. However, under certain engine-relevant conditions for which corroborating experimental data are presented, this requirement for interfacial thermal equilibrium becomes unsustainable. A rigorously derived probability density function quantifies the ability of the interface to develop internal spatial temperature gradients in the presence of significant temperature differences between injected liquid and ambient gas. Then, the interface can no longer be viewed as an isolated system at minimal free energy. Instead, the interfacial dynamics become intimately connected to those of the separated homogeneous phases. Hence, the interface transitions toward a state in local equilibrium whereupon it becomes a dense-fluid mixing layer. A new conceptual view of a transitional liquid injection process emerges from a transition time scale analysis. Close to the nozzle exit, the two-phase interface still remains largely intact and more classic two-phase processes prevail as a consequence. Further downstream, however, the transition to dense-fluid mixing generally occurs before the liquid length is reached. The significance of the presented modeling expressions is established by a direct comparison to a reduced model, which utilizes widely applied approximations but fundamentally fails to capture the physical complexity discussed in this paper.
Is There a Conjunction Fallacy in Legal Probabilistic Decision Making?
Wojciechowski, Bartosz W; Pothos, Emmanuel M
2018-01-01
Classical probability theory (CPT) has represented the rational standard for decision making in human cognition. Even though CPT has provided many descriptively excellent decision models, there have also been some empirical results persistently problematic for CPT accounts. The tension between the normative prescription of CPT and human behavior is particularly acute in cases where we have higher expectations for rational decisions. One such case concerns legal decision making from legal experts, such as attorneys and prosecutors and, more so, judges. In the present research we explore one of the most influential CPT decision fallacies, the conjunction fallacy (CF), in a legal decision making task, involving assessing evidence that the same suspect had committed two separate crimes. The information for the two crimes was presented consecutively. Each participant was asked to provide individual ratings for the two crimes in some cases and conjunctive probability rating for both crimes in other cases, after all information had been presented. Overall, 360 probability ratings for guilt were collected from 120 participants, comprised of 40 judges, 40 attorneys and prosecutors, and 40 individuals without legal education. Our results provide evidence for a double conjunction fallacy (in this case, a higher probability of committing both crimes than the probability of committing either crime individually), in the group of individuals without legal education. These results are discussed in terms of their applied implications and in relation to a recent framework for understanding such results, quantum probability theory (QPT).
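The QPT mechanism referenced here can be made concrete with a two-dimensional projection example. The vectors and angles below are illustrative assumptions; modeling the double fallacy reported in this study would require a richer construction than this minimal sketch:

```python
import numpy as np

# Standard QPT account of a conjunction fallacy: evaluating "A and B" as a
# sequence of noncommuting projections can yield a larger probability than
# evaluating the less likely event alone.
def proj(theta):
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

psi = np.array([1.0, 0.0])            # current belief state
PA = proj(np.radians(25))             # plausible event A
PB = proj(np.radians(85))             # implausible event B

p_B = np.linalg.norm(PB @ psi) ** 2               # ~0.008
p_A_then_B = np.linalg.norm(PB @ PA @ psi) ** 2   # ~0.205, exceeds p_B
print(p_B, p_A_then_B)
```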
Fixation Probability in a Haploid-Diploid Population.
Bessho, Kazuhiro; Otto, Sarah P
2017-01-01
Classical population genetic theory generally assumes either a fully haploid or fully diploid life cycle. However, many organisms exhibit more complex life cycles, with both free-living haploid and diploid stages. Here we ask what the probability of fixation is for selected alleles in organisms with haploid-diploid life cycles. We develop a genetic model that considers the population dynamics using both the Moran model and Wright-Fisher model. Applying a branching process approximation, we obtain an accurate fixation probability assuming that the population is large and the net effect of the mutation is beneficial. We also find the diffusion approximation for the fixation probability, which is accurate even in small populations and for deleterious alleles, as long as selection is weak. These fixation probabilities from branching process and diffusion approximations are similar when selection is weak for beneficial mutations that are not fully recessive. In many cases, particularly when one phase predominates, the fixation probability differs substantially for haploid-diploid organisms compared to either fully haploid or diploid species. Copyright © 2017 by the Genetics Society of America.
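As an illustration of the two approximations named in the abstract, here is a sketch using the standard single-phase haploid formulas (Haldane's branching-process root and Kimura's diffusion result), not the paper's haploid-diploid model; N and s are illustrative:

    import math

    def p_fix_branching(s):
        # Survival probability of a branching process with Poisson(1+s)
        # offspring: the positive root of pi = 1 - exp(-(1+s) * pi).
        # Haldane's classic small-s approximation is pi ~ 2s. Bisection:
        lo, hi = 1e-12, 1.0
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if mid - (1.0 - math.exp(-(1.0 + s) * mid)) < 0.0:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    def p_fix_diffusion(s, N):
        # Kimura's diffusion approximation for a haploid Wright-Fisher
        # population of size N and a single initial copy (p0 = 1/N);
        # accurate for weak selection, including deleterious alleles (s < 0).
        if s == 0.0:
            return 1.0 / N
        return (1.0 - math.exp(-2.0 * s)) / (1.0 - math.exp(-2.0 * N * s))

    N, s = 10_000, 0.01
    print(p_fix_branching(s), p_fix_diffusion(s, N), 2 * s)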
Quantum-Like Bayesian Networks for Modeling Decision Making
Moreira, Catarina; Wichert, Andreas
2016-01-01
In this work, we explore an alternative quantum structure to perform quantum probabilistic inferences to accommodate the paradoxical findings of the Sure Thing Principle. We propose a Quantum-Like Bayesian Network, which consists in replacing classical probabilities by quantum probability amplitudes. However, since this approach suffers from the problem of exponential growth of quantum parameters, we also propose a similarity heuristic that automatically fits quantum parameters through vector similarities. This makes the proposed model general and predictive, in contrast to the current state-of-the-art models, which cannot be generalized to more complex decision scenarios and which only provide explanations for the observed paradoxes. In the end, the model that we propose consists in a nonparametric method for estimating inference effects from a statistical point of view. It is a statistical model that is simpler than the previous quantum dynamic and quantum-like models proposed in the literature. We tested the proposed network with several empirical data from the literature, mainly from the Prisoner's Dilemma game and the Two Stage Gambling game. The results obtained show that the proposed quantum Bayesian Network is a general method that can accommodate violations of the laws of classical probability theory and make accurate predictions regarding human decision-making in these scenarios. PMID:26858669
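The interference mechanism such models exploit can be illustrated with the generic quantum-like law of total probability used in this literature (a textbook form, not necessarily the paper's exact network parameterization):

    import math

    def total_prob_classical(p_a, pb_a, pb_na):
        # Law of total probability: P(B) = P(B|A)P(A) + P(B|~A)P(~A).
        return pb_a * p_a + pb_na * (1.0 - p_a)

    def total_prob_quantum_like(p_a, pb_a, pb_na, theta):
        # Quantum-like version: the two paths carry amplitudes that
        # interfere, adding 2*sqrt(...)*cos(theta) to the classical value.
        interference = 2.0 * math.sqrt(p_a * pb_a * (1.0 - p_a) * pb_na) * math.cos(theta)
        return total_prob_classical(p_a, pb_a, pb_na) + interference

    # theta = pi/2 recovers the classical law; other phases shift P(B),
    # which is how such models absorb Sure Thing Principle violations.
    for theta in (math.pi / 2.0, 2.0, 2.5):
        print(round(total_prob_quantum_like(0.5, 0.6, 0.7, theta), 4))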
[Modern foreign car safety systems and their forensic-medical significance].
Iakunin, S A
2007-01-01
The author characterizes the active and passive safety systems installed in cars of foreign production. These safety systems significantly modify the classic pattern of car trauma, decreasing the frequency of occurrence and the extent of specific and typical injuries. A new approach to estimating these injuries, based on the theory of probability, is required. The most common active and passive safety systems are described in the article; their principles of operation and their influence on the pattern of trauma are assessed.
Diagonal couplings of quantum Markov chains
NASA Astrophysics Data System (ADS)
Kümmerer, Burkhard; Schwieger, Kay
2016-05-01
In this paper we extend the coupling method from classical probability theory to quantum Markov chains on atomic von Neumann algebras. In particular, we establish a coupling inequality, which allows us to estimate convergence rates by analyzing couplings. For a given tensor dilation we construct a self-coupling of a Markov operator. It turns out that the coupling is a dual version of the extended dual transition operator studied by Gohm et al. We deduce that this coupling is successful if and only if the dilation is asymptotically complete.
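For the classical side of the story, a small simulation sketch of the coupling inequality ||P^n(x,.) - P^n(y,.)||_TV <= Pr[T > n], with an assumed toy birth-death chain and an independent coupling (all numbers illustrative):

    import random

    # Two copies of the chain move independently until they meet; the
    # tail probability of the meeting time T bounds the TV distance.
    P = [[0.5, 0.5, 0.0, 0.0, 0.0],
         [0.25, 0.5, 0.25, 0.0, 0.0],
         [0.0, 0.25, 0.5, 0.25, 0.0],
         [0.0, 0.0, 0.25, 0.5, 0.25],
         [0.0, 0.0, 0.0, 0.5, 0.5]]

    def step(state):
        return random.choices(range(5), weights=P[state])[0]

    def meeting_time(x, y, horizon=10_000):
        for t in range(1, horizon + 1):
            x, y = step(x), step(y)
            if x == y:
                return t
        return horizon

    samples = [meeting_time(0, 4) for _ in range(20_000)]
    for n in (5, 10, 20, 40):
        bound = sum(t > n for t in samples) / len(samples)
        print(f"TV distance after {n} steps <= ~{bound:.3f}")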
Raykov, Tenko; Marcoulides, George A
2016-04-01
The frequently neglected and often misunderstood relationship between classical test theory and item response theory is discussed for the unidimensional case with binary measures and no guessing. It is pointed out that popular item response models can be directly obtained from classical test theory-based models by accounting for the discrete nature of the observed items. Two distinct observational equivalence approaches are outlined, each of which can be used to obtain an item response model from a corresponding classical test theory-based model. Similarly, classical test theory models can be furnished from corresponding item response models by the reverse application of either approach.
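One way to see the kind of observational equivalence described here is to dichotomize a classical continuous item response; a sketch of the probit (normal-ogive) case, with illustrative parameters rather than the article's notation:

    from math import erf, sqrt
    import numpy as np

    rng = np.random.default_rng(7)

    # Classical test theory side: continuous response Y = lam*theta + eps,
    # eps ~ N(0, sigma^2). Dichotomizing Y at threshold tau yields the
    # normal-ogive IRT model P(X = 1 | theta) = Phi((lam*theta - tau)/sigma).
    lam, tau, sigma = 1.2, 0.3, 0.8
    Phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))

    theta = 0.5                               # a fixed ability value
    y = lam * theta + rng.normal(0.0, sigma, size=200_000)
    print(np.mean(y > tau))                   # empirical P(X = 1 | theta)
    print(Phi((lam * theta - tau) / sigma))   # model-implied probability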
Improvement of ore recovery efficiency in a flotation column cell using ultra-sonic enhanced bubbles
NASA Astrophysics Data System (ADS)
Filippov, L. O.; Royer, J. J.; Filippova, I. V.
2017-07-01
The ore flotation process is enhanced by using external ultrasonic waves. Compared to the classical flotation method, the application of ultrasound to flotation fluids generates micro-bubbles by hydrodynamic cavitation. The increase in flotation performance was modelled as the result of an increased probability of particle-bubble attachment and a reduced detachment probability under sonication. A simplified analytical Navier-Stokes model is used to predict the effect of ultrasonic waves on bubble behavior. The theory, if verified by experimentation, predicts that the ultrasonic waves create cavitation micro-bubbles smaller than the flotation bubbles added by the gas sparger. This effect increases the number of small bubbles in the liquid, which promotes particle-bubble attachment through coalescence between bubbles and micro-bubbles. The decrease in the radius of the flotation bubbles under external vibration forces has the additional effect of enhancing bubble-particle collision. Preliminary results obtained on a potash ore seem to confirm the theory.
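A common decomposition used in flotation modeling (a generic textbook form with placeholder probabilities, not the study's measurements) expresses the abstract's mechanism directly:

    # Single-event collection probability: P = Pc * Pa * (1 - Pd), where
    # Pc, Pa, Pd are the probabilities of bubble-particle collision,
    # attachment, and detachment. The assumed effect of sonication in the
    # abstract's terms: Pa goes up, Pd goes down.

    def collection_probability(p_collision, p_attachment, p_detachment):
        return p_collision * p_attachment * (1.0 - p_detachment)

    baseline = collection_probability(0.20, 0.50, 0.20)   # illustrative values
    sonicated = collection_probability(0.20, 0.65, 0.10)  # higher Pa, lower Pd
    print(f"relative gain: {sonicated / baseline - 1:.0%}")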
Bidirectional Classical Stochastic Processes with Measurements and Feedback
NASA Technical Reports Server (NTRS)
Hahne, G. E.
2005-01-01
A measurement on a quantum system is said to cause the "collapse" of the quantum state vector or density matrix. An analogous collapse occurs with measurements on a classical stochastic process. This paper addresses the question of describing the response of a classical stochastic process when there is feedback from the output of a measurement to the input, and is intended to give a model for quantum-mechanical processes that occur along a space-like reaction coordinate. The classical system can be thought of in physical terms as two counterflowing probability streams, which stochastically exchange probability currents in such a way that the net probability current, and hence the overall probability, suitably interpreted, is conserved. The proposed formalism extends the mathematics of those stochastic processes describable with linear, single-step, unidirectional transition probabilities, known as Markov chains and stochastic matrices. It is shown that a certain rearrangement and combination of the input and output of two stochastic matrices of the same order yields another matrix of the same type. Each measurement causes the partial collapse of the probability current distribution in the midst of such a process, giving rise to calculable, but non-Markov, values for the ensuing modification of the system's output probability distribution. The paper concludes with an analysis of a classical probabilistic version of the so-called grandfather paradox.
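The "partial collapse" of a classical stochastic process can be sketched as a Bayes update inserted between applications of a stochastic matrix (toy transition matrix and measurement likelihood assumed, not the paper's construction):

    import numpy as np

    # Propagate a distribution, condition on a noisy measurement outcome,
    # renormalize, and continue. After the update the process is no
    # longer Markov with respect to the original matrix alone.
    P = np.array([[0.9, 0.1, 0.0],
                  [0.1, 0.8, 0.1],
                  [0.0, 0.1, 0.9]])          # row-stochastic transitions
    likelihood = np.array([0.7, 0.2, 0.1])   # P(observed outcome | state)

    p = np.array([1/3, 1/3, 1/3])
    p = p @ P                                # one step of free evolution
    p = likelihood * p                       # measurement update ...
    p = p / p.sum()                          # ... i.e. the partial collapse
    p = p @ P                                # evolution resumes from the
    print(p)                                 # collapsed distribution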
Decision theory and information propagation in quantum physics
NASA Astrophysics Data System (ADS)
Forrester, Alan
In recent papers, Zurek [(2005). Probabilities from entanglement, Born's rule p_k = |ψ_k|^2 from envariance. Physical Review A, 71, 052105] has objected to the decision-theoretic approach of Deutsch [(1999) Quantum theory of probability and decisions. Proceedings of the Royal Society of London A, 455, 3129-3137] and Wallace [(2003). Everettian rationality: defending Deutsch's approach to probability in the Everett interpretation. Studies in History and Philosophy of Modern Physics, 34, 415-438] to deriving the Born rule for quantum probabilities on the grounds that it courts circularity. Deutsch and Wallace assume that the many worlds theory is true and that decoherence gives rise to a preferred basis. However, decoherence arguments use the reduced density matrix, which relies upon the partial trace and hence upon the Born rule for its validity. Using the Heisenberg picture and quantum Darwinism - the notion that classical information is quantum information that can proliferate in the environment, pioneered in Ollivier et al. [(2004). Objective properties from subjective quantum states: Environment as a witness. Physical Review Letters, 93, 220401 and (2005). Environment as a witness: Selective proliferation of information and emergence of objectivity in a quantum universe. Physical Review A, 72, 042113] - I show that measurement interactions between two systems only create correlations between a specific set of commuting observables of system 1 and a specific set of commuting observables of system 2. This argument picks out a unique basis in which information flows in the correlations between those sets of commuting observables. I then derive the Born rule for both pure and mixed states and answer some other criticisms of the decision theoretic approach to quantum probability.
Quantum-classical correspondence for the inverted oscillator
NASA Astrophysics Data System (ADS)
Maamache, Mustapha; Ryeol Choi, Jeong
2017-11-01
While quantum-classical correspondence for a system is a very fundamental problem in modern physics, the understanding of its mechanism is often elusive, so the methods used and the results of detailed theoretical analysis have been accompanied by active debate. In this study, the differences and similarities between quantum and classical behavior for an inverted oscillator have been analyzed based on the description of a complete generalized Airy function-type quantum wave solution. The inverted oscillator model plays an important role in several branches of cosmology and particle physics. The quantum wave packet of the system is composed of many sub-packets that are localized at different positions with regular intervals between them. It is shown from illustrations of the probability density that, although the quantum trajectory of the wave propagation is somewhat different from the corresponding classical one, the difference becomes relatively small when the classical excitation is sufficiently high. We have confirmed that a quantum wave packet moving along a positive or negative direction accelerates over time like a classical wave. From these main interpretations and others in the text, we conclude that our theory exquisitely illustrates quantum and classical correspondence for the system, which is a crucial concept in quantum mechanics. Supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2016R1D1A1A09919503)
Quantum to Classical Transitions via Weak Measurements and Post-Selection
NASA Astrophysics Data System (ADS)
Cohen, Eliahu; Aharonov, Yakir
Alongside its immense empirical success, the quantum mechanical account of physical systems imposes a myriad of divergences from our thoroughly ingrained classical ways of thinking. These divergences, while striking, would have been acceptable if only a continuous transition to the classical domain were at hand. Strangely, this is not quite the case. The difficulties involved in reconciling the quantum with the classical have given rise to different interpretations, each with its own shortcomings. Traditionally, the two domains are sewed together by invoking an ad hoc theory of measurement, which has been incorporated in the axiomatic foundations of quantum theory. This work will incorporate a few related tools for addressing the above conceptual difficulties: deterministic operators, weak measurements, and post-selection. Weak measurement, based on a very weak von Neumann coupling, is a unique kind of quantum measurement with numerous theoretical and practical applications. In contrast to other measurement techniques, it allows one to gather a small amount of information regarding the quantum system, with only a negligible probability of collapsing it onto an eigenstate of the measured observable. A single weak measurement yields an almost random outcome, but when performed repeatedly over a large ensemble, the averaged outcome becomes increasingly robust and accurate. Importantly, a long sequence of weak measurements can be thought of as a single projective measurement. We claim in this work that classical variables appearing in the macro-world, such as center of mass, moment of inertia, pressure, and average forces, result from a multitude of quantum weak measurements performed in the micro-world. Here again, the quantum outcomes are highly uncertain, but the law of large numbers obliges their convergence to the definite quantities we know from our everyday lives. By augmenting this description with a final boundary condition and employing the notion of "classical robustness under time-reversal", we will draw a quantitative borderline between the classical and quantum regimes. We will conclude by analyzing the role of macroscopic systems in amplifying and recording quantum outcomes.
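A statistics-level caricature of the ensemble argument (it models only the noisy pointer readout; a real weak measurement also leaves the state almost undisturbed, which this toy ignores):

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy weak measurement of Z on |psi> = cos(a)|0> + sin(a)|1>: each
    # shot samples a projective outcome and adds broad Gaussian pointer
    # noise, so a single readout is nearly random, but the ensemble
    # average converges to <Z> by the law of large numbers.
    a = 0.6
    p0 = np.cos(a) ** 2                      # Born probability of +1
    true_expectation = 2 * p0 - 1            # <Z>

    eigenvalues = rng.choice([1.0, -1.0], p=[p0, 1 - p0], size=100_000)
    readouts = eigenvalues + rng.normal(0.0, 10.0, size=eigenvalues.size)

    for n in (10, 1_000, 100_000):
        print(n, readouts[:n].mean(), "vs <Z> =", round(true_expectation, 4))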
Graph-theoretic approach to quantum correlations.
Cabello, Adán; Severini, Simone; Winter, Andreas
2014-01-31
Correlations in Bell and noncontextuality inequalities can be expressed as a positive linear combination of probabilities of events. Exclusive events can be represented as adjacent vertices of a graph, so correlations can be associated with a subgraph. We show that the maximum value of the correlations for classical, quantum, and more general theories is the independence number, the Lovász number, and the fractional packing number of this subgraph, respectively. We also show that, for any graph, there is always a correlation experiment such that the set of quantum probabilities is exactly the Grötschel-Lovász-Schrijver theta body. This identifies these combinatorial notions as fundamental physical objects and provides a method for singling out experiments with quantum correlations on demand.
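The three graph invariants can be checked on the pentagon C5, the exclusivity graph of the KCBS scenario; the brute force below computes the classical (independence) number, while the quantum and general-theory values are quoted as known results:

    from itertools import combinations

    # Independence number of the 5-cycle C5 by exhaustive search.
    # Known values for comparison: independence number 2 (classical),
    # Lovasz number sqrt(5) ~ 2.236 (quantum), fractional packing 5/2.
    edges = {(0, 1), (1, 2), (2, 3), (3, 4), (0, 4)}

    def is_independent(S):
        return all(tuple(sorted(p)) not in edges for p in combinations(S, 2))

    alpha = max(len(S) for r in range(6)
                for S in combinations(range(5), r) if is_independent(S))
    print("independence number of C5:", alpha)          # -> 2
    print("Lovasz number (known):", 5 ** 0.5)           # ~ 2.236
    print("fractional packing number (known):", 5 / 2)  # 2.5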
NASA Astrophysics Data System (ADS)
Blutner, Reinhard
2009-03-01
Recently, Gerd Niestegge developed a new approach to quantum mechanics via conditional probabilities, developing the well-known proposal to consider the Lüders-von Neumann measurement as a non-classical extension of probability conditionalization. I will apply his powerful and rigorous approach to the treatment of concepts using a geometrical model of meaning. In this model, instances are treated as vectors of a Hilbert space H. In the present approach there are at least two possibilities to form categories. The first possibility sees categories as a mixture of their instances (described by a density matrix). In the simplest case we get the classical probability theory including the Bayesian formula. The second possibility sees categories formed by a distinctive prototype which is the superposition of the (weighted) instances. The construction of prototypes can be seen as transferring a mixed quantum state into a pure quantum state, freezing the probabilistic characteristics of the superposed instances into the structure of the formed prototype. Closely related to the idea of forming concepts by prototypes is the existence of interference effects. Such interference effects are typically found in macroscopic quantum systems, and I will discuss them in connection with several puzzles of bounded rationality. The present approach nicely generalizes earlier proposals made by authors such as Diederik Aerts, Andrei Khrennikov, Ricardo Franco, and Jerome Busemeyer. Concluding, I will suggest that an active dialogue between cognitive approaches to logic and semantics and the modern approach of quantum information science is mandatory.
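The Lüders-von Neumann conditionalization that this approach builds on is compact enough to sketch numerically (the density matrix and projector below are arbitrary choices for illustration):

    import numpy as np

    # Lüders rule: given state rho and event (projector) P, the
    # conditional state is rho|P = P rho P / tr(rho P), and the event
    # probability is tr(rho P). For commuting diagonal rho and P this
    # reduces to ordinary Bayesian conditioning.
    rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)  # a density matrix
    v = np.array([[1.0], [1.0]]) / np.sqrt(2)
    P = v @ v.T.conj()                       # projector onto (|0>+|1>)/sqrt(2)

    prob = np.trace(rho @ P).real            # probability of the event
    rho_cond = (P @ rho @ P) / prob          # conditional (collapsed) state
    print("P(event) =", prob)
    print(rho_cond.round(3))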
Quantum decision-maker theory and simulation
NASA Astrophysics Data System (ADS)
Zak, Michail; Meyers, Ronald E.; Deacon, Keith S.
2000-07-01
A quantum device simulating the human decision making process is introduced. It consists of quantum recurrent nets generating stochastic processes which represent the motor dynamics, and of classical neural nets describing the evolution of probabilities of these processes which represent the mental dynamics. The autonomy of the decision making process is achieved by a feedback from the mental to motor dynamics which changes the stochastic matrix based upon the probability distribution. This feedback replaces unavailable external information by an internal knowledge base stored in the mental model in the form of probability distributions. As a result, the coupled motor-mental dynamics is described by a nonlinear version of Markov chains which can decrease entropy without an external source of information. Applications to common sense based decisions as well as to evolutionary games are discussed. An example exhibiting self-organization is computed using quantum computer simulation. Force-on-force and mutual aircraft engagements using the quantum decision maker dynamics are considered.
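A minimal sketch of such a nonlinear Markov chain: the transition matrix is rebuilt from the current probability vector, and a sharpening feedback (my illustrative choice, not the paper's construction) lets the Shannon entropy decrease without external input:

    import numpy as np

    # For a fixed doubly-stochastic chain, entropy cannot decrease; here
    # the kernel depends on the distribution itself, so it can.
    def feedback_matrix(p):
        # Transition kernel that favors jumping toward already-likely states.
        M = np.tile(p, (p.size, 1)) ** 2     # rows proportional to p_j^2
        return M / M.sum(axis=1, keepdims=True)

    def entropy(p):
        q = p[p > 0]
        return float(-(q * np.log2(q)).sum())

    p = np.array([0.4, 0.35, 0.25])
    for t in range(6):
        print(t, p.round(4), "H =", round(entropy(p), 4))
        p = p @ feedback_matrix(p)           # p_{t+1} = p_t M(p_t)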
(Never) Mind your p's and q's: Von Neumann versus Jordan on the foundations of quantum theory
NASA Astrophysics Data System (ADS)
Duncan, A.; Janssen, M.
2013-03-01
In 1927, in two papers entitled "On a new foundation [Neue Begründung] of quantum mechanics," Pascual Jordan presented his version of what came to be known as the Dirac-Jordan statistical transformation theory. Jordan and Paul Dirac arrived at essentially the same theory independently of one another at around the same time. Later in 1927, partly in response to Jordan and Dirac and avoiding the mathematical difficulties facing their approach, John von Neumann developed the modern Hilbert space formalism of quantum mechanics. We focus on Jordan and von Neumann. Central to the formalisms of both are expressions for conditional probabilities of finding some value for one quantity given the value of another. Beyond that Jordan and von Neumann had very different views about the appropriate formulation of problems in quantum mechanics. For Jordan, unable to let go of the analogy to classical mechanics, the solution of such problems required the identification of sets of canonically conjugate variables, i.e., p's and q's. For von Neumann, not constrained by the analogy to classical mechanics, it required only the identification of a maximal set of commuting operators with simultaneous eigenstates. He had no need for p's and q's. Jordan and von Neumann also stated the characteristic new rules for probabilities in quantum mechanics somewhat differently. Jordan and Dirac were the first to state those rules in full generality. Von Neumann rephrased them and, in a paper published a few months later, sought to derive them from more basic considerations. In this paper we reconstruct the central arguments of these 1927 papers by Jordan and von Neumann and of a paper on Jordan's approach by Hilbert, von Neumann, and Nordheim. We highlight those elements in these papers that bring out the gradual loosening of the ties between the new quantum formalism and classical mechanics. This paper was written as part of a joint project in the history of quantum physics of the Max Planck Institut für Wissenschaftsgeschichte and the Fritz-Haber-Institut in Berlin.
NASA Astrophysics Data System (ADS)
Sahyoun, Maher; Wex, Heike; Gosewinkel, Ulrich; Šantl-Temkiv, Tina; Nielsen, Niels W.; Finster, Kai; Sørensen, Jens H.; Stratmann, Frank; Korsholm, Ulrik S.
2016-08-01
Bacterial ice-nucleating particles (INP) are present in the atmosphere and efficient in heterogeneous ice-nucleation at temperatures up to -2 °C in mixed-phase clouds. However, due to their low emission rates, their climatic impact was considered insignificant in previous modeling studies. In view of uncertainties about the actual atmospheric emission rates and concentrations of bacterial INP, it is important to re-investigate the threshold fraction of cloud droplets containing bacterial INP for a pronounced effect on ice-nucleation, by using a suitable parameterization that describes the ice-nucleation process by bacterial INP properly. Therefore, we compared two heterogeneous ice-nucleation rate parameterizations, denoted CH08 and HOO10 herein, both of which are based on classical-nucleation-theory and measurements, and use similar equations, but different parameters, to an empirical parameterization, denoted HAR13 herein, which considers implicitly the number of bacterial INP. All parameterizations were used to calculate the ice-nucleation probability offline. HAR13 and HOO10 were implemented and tested in a one-dimensional version of a weather-forecast-model in two meteorological cases. Ice-nucleation-probabilities based on HAR13 and CH08 were similar, in spite of their different derivation, and were higher than those based on HOO10. This study shows the importance of the method of parameterization and of the input variable, number of bacterial INP, for accurately assessing their role in meteorological and climatic processes.
Tensor products of process matrices with indefinite causal structure
NASA Astrophysics Data System (ADS)
Jia, Ding; Sakharwade, Nitica
2018-03-01
Theories with indefinite causal structure have been studied from both the fundamental perspective of quantum gravity and the practical perspective of information processing. In this paper we point out a restriction in forming tensor products of objects with indefinite causal structure in certain models: there exist both classical and quantum objects the tensor products of which violate the normalization condition of probabilities, if all local operations are allowed. We obtain a necessary and sufficient condition for when such unrestricted tensor products of multipartite objects are (in)valid. This poses a challenge to extending communication theory to indefinite causal structures, as the tensor product is the fundamental ingredient in the asymptotic setting of communication theory. We discuss a few options to evade this issue. In particular, we show that the sequential asymptotic setting does not suffer the violation of normalization.
Quantum game application to spectrum scarcity problems
NASA Astrophysics Data System (ADS)
Zabaleta, O. G.; Barrangú, J. P.; Arizmendi, C. M.
2017-01-01
Recent spectrum-sharing research has produced a strategy to address spectrum scarcity problems. This novel idea, named cognitive radio, considers that secondary users can opportunistically exploit spectrum holes left temporarily unused by primary users. This presents a competitive scenario among cognitive users, making it suitable for game theory treatment. In this work, we show that the spectrum-sharing benefits of cognitive radio can be increased by designing a medium access control based on quantum game theory. In this context, we propose a model to manage spectrum fairly and effectively, based on a multiple-users multiple-choice quantum minority game. By taking advantage of quantum entanglement and quantum interference, it is possible to reduce the probability of collision problems commonly associated with classic algorithms. Collision avoidance is an essential property for classic and quantum communications systems. In our model, two different scenarios are considered, to meet the requirements of different user strategies. The first considers sensor networks where the rational use of energy is a cornerstone; the second focuses on installations where the quality of service of the entire network is a priority.
On observation of position in quantum theory
NASA Astrophysics Data System (ADS)
Kryukov, A.
2018-05-01
Newtonian and Schrödinger dynamics can be formulated in a physically meaningful way within the same Hilbert space framework. This fact was recently used to discover an unexpected relation between classical and quantum motions that goes beyond the results provided by the Ehrenfest theorem. A formula relating the normal probability distribution and the Born rule was also found. Here the dynamical mechanism responsible for the latter formula is proposed and applied to measurements of macroscopic and microscopic systems. A relationship between the classical Brownian motion and the diffusion of state on the space of states is discovered. The role of measuring devices in quantum theory is investigated in the new framework. It is shown that the so-called collapse of the wave function is not measurement specific and does not require a "concentration" near the eigenstates of the measured observable. Instead, it is explained by the common diffusion of a state over the space of states under interaction with the apparatus and the environment. This in turn provides us with a basic reason for the definite position of macroscopic bodies in space.
NASA Astrophysics Data System (ADS)
Yousefian, Pedram; Tiryakioğlu, Murat
2018-02-01
An in-depth discussion of pore formation is presented in this paper by first reinterpreting in situ observations reported in the literature as well as assumptions commonly made to model pore formation in aluminum castings. The physics of pore formation is reviewed through theoretical fracture pressure calculations based on classical nucleation theory for homogeneous and heterogeneous nucleation, with and without dissolved gas, i.e., hydrogen. Based on the fracture pressure for aluminum, critical pore size and the corresponding probability of vacancies clustering to form that size have been calculated using thermodynamic data reported in the literature. Calculations show that it is impossible for a pore to nucleate either homogeneously or heterogeneously in aluminum, even with dissolved hydrogen. The formation of pores in aluminum castings can only be explained by inflation of entrained surface oxide films (bifilms) under reduced pressure and/or with dissolved gas, which involves only growth, avoiding any nucleation problem. This mechanism is consistent with the reinterpretations of in situ observations as well as the assumptions made in the literature to model pore formation.
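The classical-nucleation-theory quantities behind that argument can be sketched with placeholder values for the surface tension and pressure deficit (not the paper's numbers); the point is the astronomically large barrier dG*/kT:

    import math

    # CNT for a pore under tension: critical radius r* = 2 sigma / dP,
    # barrier dG* = 16 pi sigma^3 / (3 dP^2), and thermal probability
    # factor exp(-dG*/kT). sigma and dP are illustrative placeholders.
    k_B = 1.380649e-23      # J/K
    sigma = 0.9             # surface tension of liquid aluminum, J/m^2 (approx.)
    dP = 1.0e8              # pressure deficit, Pa (placeholder)
    T = 933.0               # K, near the melting point of aluminum

    r_star = 2.0 * sigma / dP
    dG_star = 16.0 * math.pi * sigma ** 3 / (3.0 * dP ** 2)
    print(f"r* = {r_star:.2e} m, dG* = {dG_star:.2e} J,",
          f"dG*/kT = {dG_star / (k_B * T):.2e}")  # hopelessly large barrier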
Forward flux sampling calculation of homogeneous nucleation rates from aqueous NaCl solutions.
Jiang, Hao; Haji-Akbari, Amir; Debenedetti, Pablo G; Panagiotopoulos, Athanassios Z
2018-01-28
We used molecular dynamics simulations and the path sampling technique known as forward flux sampling to study homogeneous nucleation of NaCl crystals from supersaturated aqueous solutions at 298 K and 1 bar. Nucleation rates were obtained for a range of salt concentrations for the Joung-Cheatham NaCl force field combined with the Extended Simple Point Charge (SPC/E) water model. The calculated nucleation rates are significantly lower than the available experimental measurements. The estimates for the nucleation rates in this work do not rely on classical nucleation theory, but the pathways observed in the simulations suggest that the nucleation process is better described by classical nucleation theory than an alternative interpretation based on Ostwald's step rule, in contrast to some prior simulations of related models. In addition to the size of the NaCl nucleus, we find that the crystallinity of a nascent cluster plays an important role in the nucleation process. Nuclei with high crystallinity were found to have higher growth probability and longer lifetimes, possibly because they are less exposed to hydration water.
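The forward-flux-sampling rate estimator itself is compact; a sketch with placeholder flux and crossing probabilities (not the paper's NaCl data):

    import numpy as np

    # FFS estimates a rare-event rate as k = Phi_0 * prod_i P(i+1 | i):
    # the flux through the first interface times the conditional
    # probabilities of reaching each next interface before returning
    # to the initial basin.
    phi_0 = 2.0e-5       # flux through lambda_0 (events per unit time*volume)
    p_next = np.array([0.31, 0.12, 0.05, 0.02, 0.009])  # P(lambda_{i+1}|lambda_i)

    rate = phi_0 * np.prod(p_next)
    print(f"nucleation rate k ~ {rate:.3e} per unit time and volume")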
NASA Astrophysics Data System (ADS)
Ellerman, David
2014-03-01
In models of QM over finite fields (e.g., Schumacher's "modal quantum theory" MQT), one finite field stands out, Z2, since Z2 vectors represent sets. QM (finite-dimensional) mathematics can be transported to sets, resulting in quantum mechanics over sets, or QM/sets. This gives a full probability calculus (unlike MQT with only zero-one modalities) that leads to a fulsome theory of QM/sets including "logical" models of the double-slit experiment, Bell's Theorem, QIT, and QC. In QC over Z2 (where gates are non-singular matrices as in MQT), a simple quantum algorithm (one gate plus one function evaluation) solves the Parity SAT problem (finding the parity of the sum of all values of an n-ary Boolean function). Classically, the Parity SAT problem requires 2^n function evaluations, in contrast to the one function evaluation required in the quantum algorithm. This is quantum speedup, but with all the calculations over Z2, just like classical computing. This shows definitively that the source of quantum speedup is not in the greater power of computing over the complex numbers, and confirms the idea that the source is in superposition.
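The classical cost is easy to exhibit: computing the parity of the sum of all values of an n-ary Boolean function forces all 2^n evaluations, since any single unqueried value could flip the answer. A sketch with an example function:

    from itertools import product

    # Brute-force Parity SAT: sum the function over every input, mod 2.
    def parity_sat(f, n):
        return sum(f(*x) for x in product((0, 1), repeat=n)) % 2

    f = lambda a, b, c: (a & b) ^ c      # an example 3-ary Boolean function
    print(parity_sat(f, 3))              # 2^3 = 8 classical evaluations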
Data Analysis Techniques for Physical Scientists
NASA Astrophysics Data System (ADS)
Pruneau, Claude A.
2017-10-01
Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.
Fundamental theories of waves and particles formulated without classical mass
NASA Astrophysics Data System (ADS)
Fry, J. L.; Musielak, Z. E.
2010-12-01
Quantum and classical mechanics are two conceptually and mathematically different theories of physics, and yet they do use the same concept of classical mass that was originally introduced by Newton in his formulation of the laws of dynamics. In this paper, the physical consequences of using the classical mass in both theories are explored, and a novel approach that allows formulating fundamental (Galilean invariant) theories of waves and particles without formally introducing the classical mass is presented. In this new formulation, the theories depend only on one common parameter called 'wave mass', which is deduced from experiments for selected elementary particles and for the classical mass of one kilogram. It is shown that quantum theory with the wave mass is independent of the Planck constant and that higher accuracy of performing calculations can be attained by such a theory. Natural units in connection with the presented approach are also discussed and justification beyond dimensional analysis is given for the particular choice of such units.
The contrasting roles of Planck's constant in classical and quantum theories
NASA Astrophysics Data System (ADS)
Boyer, Timothy H.
2018-04-01
We trace the historical appearance of Planck's constant in physics, and we note that initially the constant did not appear in connection with quanta. Furthermore, we emphasize that Planck's constant can appear in both classical and quantum theories. In both theories, Planck's constant sets the scale of atomic phenomena. However, the roles played in the foundations of the theories are sharply different. In quantum theory, Planck's constant is crucial to the structure of the theory. On the other hand, in classical electrodynamics, Planck's constant is optional, since it appears only as the scale factor for the (homogeneous) source-free contribution to the general solution of Maxwell's equations. Since classical electrodynamics can be solved while taking the homogeneous source-free contribution in the solution as zero or non-zero, there are naturally two different theories of classical electrodynamics, one in which Planck's constant is taken as zero and one where it is taken as non-zero. The textbooks of classical electromagnetism present only the version in which Planck's constant is taken to vanish.
Taking-On: A Grounded Theory of Addressing Barriers in Task Completion
ERIC Educational Resources Information Center
Austinson, Julie Ann
2011-01-01
This study of taking-on was conducted using classical grounded theory methodology (Glaser, 1978, 1992, 1998, 2001, 2005; Glaser & Strauss, 1967). Classical grounded theory is inductive, empirical, and naturalistic; it does not utilize manipulation or constrained time frames. Classical grounded theory is a systemic research method used to generate…
On the classic and modern theories of matching.
McDowell, J J
2005-07-01
Classic matching theory, which is based on Herrnstein's (1961) original matching equation and includes the well-known quantitative law of effect, is almost certainly false. The theory is logically inconsistent with known experimental findings, and experiments have shown that its central constant-k assumption is not tenable. Modern matching theory, which is based on the power function version of the original matching equation, remains tenable, although it has not been discussed or studied extensively. The modern theory is logically consistent with known experimental findings, it predicts the fact and details of the violation of the classic theory's constant-k assumption, and it accurately describes at least some data that are inconsistent with the classic theory.
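The two forms can be stated compactly; a sketch with illustrative parameter values (k, Re, a, and b are placeholders, not estimates from any data set):

    import numpy as np

    # Classic matching theory: single-schedule hyperbola B = k R / (R + Re)
    # with the constant-k assumption. Modern matching theory uses the
    # power-function (generalized) matching law for choice:
    #   log(B1/B2) = a * log(R1/R2) + log(b).

    def classic_hyperbola(R, k=100.0, Re=20.0):
        return k * R / (R + Re)

    def generalized_matching(R1, R2, a=0.8, b=1.1):
        return b * (R1 / R2) ** a        # predicted behavior ratio B1/B2

    R = np.array([5.0, 20.0, 80.0])
    print(classic_hyperbola(R))              # response rates saturating at k
    print(generalized_matching(60.0, 20.0))  # a < 1 gives undermatching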
Classical Field Theory and the Stress-Energy Tensor
NASA Astrophysics Data System (ADS)
Swanson, Mark S.
2015-09-01
This book is a concise introduction to the key concepts of classical field theory for beginning graduate students and advanced undergraduate students who wish to study the unifying structures and physical insights provided by classical field theory without dealing with the additional complication of quantization. In that regard, there are many important aspects of field theory that can be understood without quantizing the fields. These include the action formulation, Galilean and relativistic invariance, traveling and standing waves, spin angular momentum, gauge invariance, subsidiary conditions, fluctuations, spinor and vector fields, conservation laws and symmetries, and the Higgs mechanism, all of which are often treated briefly in a course on quantum field theory. The variational form of classical mechanics and continuum field theory are both developed in the time-honored graduate level text by Goldstein et al (2001). An introduction to classical field theory from a somewhat different perspective is available in Soper (2008). Basic classical field theory is often treated in books on quantum field theory. Two excellent texts where this is done are Greiner and Reinhardt (1996) and Peskin and Schroeder (1995). Green's function techniques are presented in Arfken et al (2013).
A quantum-classical theory with nonlinear and stochastic dynamics
NASA Astrophysics Data System (ADS)
Burić, N.; Popović, D. B.; Radonjić, M.; Prvanović, S.
2014-12-01
The method of constrained dynamical systems on the quantum-classical phase space is utilized to develop a theory of quantum-classical hybrid systems. Effects of the classical degrees of freedom on the quantum part are modeled using an appropriate constraint, and the interaction also includes the effects of neglected degrees of freedom. Dynamical law of the theory is given in terms of nonlinear stochastic differential equations with Hamiltonian and gradient terms. The theory provides a successful dynamical description of the collapse during quantum measurement.
Parts and Wholes. An Inquiry into Quantum and Classical Correlations
NASA Astrophysics Data System (ADS)
Seevinck, M. P.
2008-10-01
The primary topic of this dissertation is, firstly, the study of the correlations between outcomes of measurements on the subsystems of a composed system as predicted by a particular physical theory; secondly, the study of what this physical theory predicts for the relationships these subsystems can have to the composed system they are a part of; and thirdly, the comparison of different physical theories with respect to these two aspects. The physical theories investigated and compared are generalized probability theories in a quasi-classical physics framework and non-relativistic quantum theory. The motivation for these enquiries is that a comparison of the relationships between parts and whole as described by each theory, and of the correlations predicted by each theory between separated subsystems yields a fruitful method to investigate what these physical theories say about the world. One then finds, independent of any physical model, relationships and constraints that capture the essential physical assumptions and structural aspects of the theory in question. As such one gains a larger and deeper understanding of the different physical theories under investigation and of what they say about the world. A large part of this dissertation is devoted to understanding different aspects of different kinds of correlations that can exist between the outcomes of measurement on subsystems of a larger system. Four different kinds of correlation have been investigated: local, partially-local, no-signaling and quantum mechanical. Novel characteristics of these correlations have been used to study how they are related and how they can be discerned. The main tool of this investigation has been the usage of Bell-type inequalities that give non-trivial bounds on the strength of the correlations. The study of quantum correlations has also prompted us to study the multi-partite qubit state space with respect to its entanglement and separability characteristics, and the differing strength of the correlations in separable and entangled qubit states. Comparing the different types of correlations has provided us with many new results on the various strengths of the different types of correlation. Because of the generality of the investigation -- we have considered abstract general models, not some specific and particular ones -- these results have strong repercussions for different sorts of physical theories. These repercussions have foundational as well as philosophical impact, notably for the viability of hidden variable theories for quantum mechanics, for the possibility of doing experimental metaphysics, for the question of holism in physical theories, and for the classical vs. quantum dichotomy.
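A standard instance of such a Bell-type bound is CHSH: local models give |S| <= 2, while singlet correlations E(a, b) = -cos(a - b) at the usual angles reach Tsirelson's bound 2*sqrt(2). A quick check:

    import numpy as np

    # CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b') with
    # singlet correlations at the standard measurement angles.
    def E(a, b):
        return -np.cos(a - b)

    a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
    S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
    print(abs(S), "vs classical bound 2 and Tsirelson bound", 2 * np.sqrt(2))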
NASA Technical Reports Server (NTRS)
Kentzer, C. P.
1976-01-01
A statistical approach to sound propagation is considered in situations where, due to the presence of large gradients of properties of the medium, the classical (deterministic) treatment of wave motion is inadequate. Mathematical methods for wave motions not restricted to small wavelengths (analogous to known methods of quantum mechanics) are used to formulate a wave theory of sound in nonuniform flows. Nonlinear transport equations for field probabilities are derived for the limiting case of noninteracting sound waves and it is postulated that such transport equations, appropriately generalized, may be used to predict the statistical behavior of sound in arbitrary flows.
NASA Astrophysics Data System (ADS)
Yan, Han
2012-08-01
Extending Parikh-Wilczek's semi-classical tunneling method, we discuss the Hawking radiation of charged massive particles via tunneling from the cosmological horizon of an (n+2)-dimensional Topological Reissner-Nordström-de Sitter black hole. The result shows that, when energy conservation and electric charge conservation are taken into account, the derived spectrum deviates from the pure thermal one, but satisfies the unitary theory, which provides a possibility for the solution of the information loss paradox.
Some loopholes to save quantum nonlocality
NASA Astrophysics Data System (ADS)
Accardi, Luigi
2005-02-01
The EPR-chameleon experiment has closed a long standing debate between the supporters of quantum nonlocality and the thesis of quantum probability, according to which the essence of the quantum peculiarity is non-Kolmogorovianity rather than nonlocality. The theory of adaptive systems (symbolized by the chameleon effect) provides a natural intuition for the emergence of non-Kolmogorovian statistics from classical deterministic dynamical systems. These developments are quickly reviewed and, in conclusion, some comments are introduced on recent attempts to "reconstruct history" along the lines described by Orwell in "1984".
Theory of the stopping power of fast multicharged ions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yudin, G.L.
1991-12-01
The processes of Coulomb excitation and ionization of atoms by a fast charged particle moving along a classical trajectory are studied. The target electrons are described by the Dirac equation, while the field of the incident particle is described by the Lienard-Wiechert potential. The theory is formulated in the form most convenient for investigation of various characteristics of semiclassical atomic collisions. The theory of sudden perturbations, which is valid at high enough velocities for a high projectile charge, is employed to obtain probabilities and cross sections of the Coulomb excitation and ionization of atomic hydrogen by fast multiply charged ions. Based on the semiclassical sudden Born approximation, the ionization cross section and the average electronic energy loss of a fast ion in a single collision with an atom are investigated over a wide specific energy range from 500 keV/amu to 50 MeV/amu.
NASA Astrophysics Data System (ADS)
Lusanna, Luca; Pauri, Massimo
2014-08-01
If the classical structure of space-time is assumed to define an a priori scenario for the formulation of quantum theory (QT), the coordinate representation of the solutions of the Schroedinger equation of a quantum system containing one (N) massive scalar particle has a preferred status. Let us consider all of the solutions admitting a multipolar expansion of the probability density function (and more generally of the Wigner function) around a space-time trajectory to be properly selected. For every normalized solution there is a privileged trajectory implying the vanishing of the dipole moment of the multipolar expansion: it is given by the expectation value of the position operator. Then, the special subset of solutions which satisfy Ehrenfest's theorem (named thereby Ehrenfest monopole wave functions (EMWF)) have the important property that this privileged classical trajectory is determined by a closed Newtonian equation of motion where the effective force is the Newtonian force plus non-Newtonian terms (of order ħ^2 or higher) depending on the higher multipoles of the probability distribution ρ. Note that the superposition of two EMWFs is not an EMWF, a result to be strongly hoped for, given the possible unwanted implications concerning classical spatial perception. These results can be extended to N-particle systems in such a way that, when N classical trajectories with all the dipole moments vanishing and satisfying Ehrenfest's theorem are associated with the normalized wave functions of the N-body system, we get a natural transition from the 3N-dimensional configuration space to space-time. Moreover, these results can be extended to relativistic quantum mechanics. Consequently, in suitable states of N quantum particles which are EMWF, we get the "emergence" of corresponding "classical particles" following Newton-like trajectories in space-time. Note that all this holds true in the standard framework of quantum mechanics, i.e. assuming, in particular, the validity of Born's rule and the individual-system interpretation of the wave function (no ensemble interpretation). These results are valid without any approximation (like ħ → 0, big quantum numbers, etc.). Moreover, we do not commit ourselves to any specific ontological interpretation of quantum theory (such as, e.g., the Bohmian one). We will argue that, in substantial agreement with Bohr's viewpoint, the macroscopic description of the preparation, certain intermediate steps, and the detection of the final outcome of experiments involving massive particles are dominated by these classical "effective" trajectories. This approach can be applied to the point of view of de-coherence in the case of a diagonal reduced density matrix ρ_red (an improper mixture) depending on the position variables of a massive particle and of a pointer. When both the particle and the pointer wave functions appearing in ρ_red are EMWF, the expectation value of the particle and pointer position variables becomes a statistical average on a classical ensemble. In these cases an improper quantum mixture becomes a classical statistical one, thus providing a particular answer to an open problem of de-coherence about the emergence of classicality.
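The gap between the Ehrenfest force and the Newtonian force that the abstract attributes to higher multipoles can be checked numerically for a toy case: a Gaussian probability density and a quartic potential (both my choices for illustration):

    import numpy as np

    # Ehrenfest: m d^2<x>/dt^2 = -<V'(x)>, which equals the Newtonian
    # force -V'(<x>) only when the higher moments of |psi|^2 drop out.
    # For a Gaussian density (mean mu, width sigma) and V = x^4/4,
    # Gaussian moments give <V'(x)> = <x^3> = mu^3 + 3*mu*sigma^2, so the
    # non-Newtonian correction 3*mu*sigma^2 is carried by the spread.
    rng = np.random.default_rng(1)
    mu, sigma = 1.0, 0.4
    x = rng.normal(mu, sigma, size=1_000_000)

    force_ehrenfest = -(x ** 3).mean()   # -<V'(x)>
    force_newton = -(mu ** 3)            # -V'(<x>)
    print(force_ehrenfest, force_newton, "correction ~", -3 * mu * sigma ** 2)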
The problems in quantum foundations in the light of gauge theories
NASA Astrophysics Data System (ADS)
Ne'Eman, Yuval
1986-04-01
We review the issues of nonseparability and seemingly acausal propagation of information in EPR, as displayed by experiments and the failure of Bell's inequalities. We show that global effects are in the very nature of the geometric structure of modern physical theories, occurring even at the classical level. The Aharonov-Bohm effect, magnetic monopoles, instantons, etc. result from the topology and homotopy features of the fiber bundle manifolds of gauge theories. The conservation of probabilities, a supposedly highly quantum effect, is also achieved through global geometry equations. The EPR observables all fit in such geometries, and space-time is a truncated representation and is not the correct arena for their understanding. Relativistic quantum field theory represents the global action of the measurement operators as the zero-momentum (and therefore spatially infinitely spread) limit of their wave functions (form factors). We also analyze the collapse of the state vector as a case of spontaneous symmetry breakdown in the apparatus-observed state interaction.
Predicting Rotator Cuff Tears Using Data Mining and Bayesian Likelihood Ratios
Lu, Hsueh-Yi; Huang, Chen-Yuan; Su, Chwen-Tzeng; Lin, Chen-Chiang
2014-01-01
Objectives Rotator cuff tear is a common cause of shoulder diseases. Correct diagnosis of rotator cuff tears can save patients from further invasive, costly and painful tests. This study used predictive data mining and Bayesian theory to improve the accuracy of diagnosing rotator cuff tears by clinical examination alone. Methods In this retrospective study, 169 patients who had a preliminary diagnosis of rotator cuff tear on the basis of clinical evaluation followed by confirmatory MRI between 2007 and 2011 were identified. MRI was used as a reference standard to classify rotator cuff tears. The predictor variables were the clinical assessment results, which consisted of 16 attributes. This study employed two data mining methods (ANN and the decision tree) and a statistical method (logistic regression) to classify the rotator cuff diagnosis into "tear" and "no tear" groups. Likelihood ratios and Bayesian theory were applied to estimate the probability of rotator cuff tears based on the results of the prediction models. Results Our proposed data mining procedures outperformed the classic statistical method. The correction rate, sensitivity, specificity and area under the ROC curve for predicting a rotator cuff tear were statistically better in the ANN and decision tree models compared to logistic regression. Based on likelihood ratios derived from our prediction models, Fagan's nomogram could be constructed to assess the probability that a patient has a rotator cuff tear, given a pretest probability and a prediction result (tear or no tear). Conclusions Our predictive data mining models, combined with likelihood ratios and Bayesian theory, appear to be good tools to classify rotator cuff tears as well as to determine the probability of the presence of the disease to enhance diagnostic decision making for rotator cuff tears. PMID:24733553
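As a concrete illustration of the Bayesian updating behind Fagan's nomogram, here is a minimal sketch in Python; the pretest probability and likelihood ratio values are invented placeholders, not figures from the study.

```python
def posttest_probability(pretest_prob: float, likelihood_ratio: float) -> float:
    """Convert a pretest probability into a posttest probability via Bayes' theorem.

    Odds form of Bayes' theorem: posttest odds = pretest odds * likelihood ratio.
    """
    pretest_odds = pretest_prob / (1.0 - pretest_prob)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1.0 + posttest_odds)

# Hypothetical example: a 40% pretest probability and a positive-result
# likelihood ratio of 5 give roughly a 77% posttest probability.
print(posttest_probability(0.40, 5.0))
```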
Generalized classical and quantum signal theories
NASA Astrophysics Data System (ADS)
Rundblad, E.; Labunets, V.; Novak, P.
2005-05-01
In this paper we develop two topics and show their inter- and cross-relation. The first centers on general notions of the generalized classical signal theory on finite Abelian hypergroups. The second concerns the generalized quantum hyperharmonic analysis of quantum signals (Hermitean operators associated with classical signals). We study classical and quantum generalized convolution hypergroup algebras of classical and quantum signals.
Is the local linearity of space-time inherited from the linearity of probabilities?
NASA Astrophysics Data System (ADS)
Müller, Markus P.; Carrozza, Sylvain; Höhn, Philipp A.
2017-02-01
The appearance of linear spaces, describing physical quantities by vectors and tensors, is ubiquitous in all of physics, from classical mechanics to the modern notion of local Lorentz invariance. However, as natural as this seems to the physicist, most computer scientists would argue that something like a ‘local linear tangent space’ is not very typical and in fact a quite surprising property of any conceivable world or algorithm. In this paper, we take the perspective of the computer scientist seriously, and ask whether there could be any inherently information-theoretic reason to expect this notion of linearity to appear in physics. We give a series of simple arguments, spanning quantum information theory, group representation theory, and renormalization in quantum gravity, that supports a surprising thesis: namely, that the local linearity of space-time might ultimately be a consequence of the linearity of probabilities. While our arguments involve a fair amount of speculation, they have the virtue of being independent of any detailed assumptions on quantum gravity, and they are in harmony with several independent recent ideas on emergent space-time in high-energy physics.
q-Gaussian distributions of leverage returns, first stopping times, and default risk valuations
NASA Astrophysics Data System (ADS)
Katz, Yuri A.; Tian, Li
2013-10-01
We study the probability distributions of daily leverage returns of 520 North American industrial companies that survive de-listing during the financial crisis, 2006-2012. We provide evidence that distributions of unbiased leverage returns of all individual firms belong to the class of q-Gaussian distributions with the Tsallis entropic parameter within the interval 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, S.; Li, Y.; Liu, C.
2015-08-15
This paper presents a statistical theory for the initial onset of multipactor breakdown in coaxial transmission lines, taking both the nonuniform electric field and random electron emission velocity into account. A general numerical method is first developed to construct the joint probability density function based on the approximate equation of the electron trajectory. The nonstationary dynamics of the multipactor process on both surfaces of coaxial lines are modelled based on the probability of various impacts and their corresponding secondary emission. The resonant assumption of the classical theory on the independent double-sided and single-sided impacts is replaced by the consideration of their interaction. As a result, the time evolutions of the electron population for exponential growth and absorption on both inner and outer conductor, in response to the applied voltage above and below the multipactor breakdown level, are obtained to investigate the exact mechanism of multipactor discharge in coaxial lines. Furthermore, the multipactor threshold predictions of the presented model are compared with experimental results using measured secondary emission yield of the tested samples, which shows reasonable agreement. Finally, the detailed impact scenario reveals that single-surface multipactor is more likely to occur with a higher outer to inner conductor radius ratio.
NASA Astrophysics Data System (ADS)
Nalewajski, Roman F.
Information theory (IT) probe of the molecular electronic structure, within the communication theory of chemical bonds (CTCB), uses the standard entropy/information descriptors of the Shannon theory of communication to characterize a scattering of the electronic probabilities and their information content throughout the system chemical bonds generated by the occupied molecular orbitals (MO). These "communications" between the basis-set orbitals are determined by the two-orbital conditional probabilities: one- and two-electron in character. They define the molecular information system, in which the electron-allocation "signals" are transmitted between various orbital "inputs" and "outputs". It is argued, using the quantum mechanical superposition principle, that the one-electron conditional probabilities are proportional to the squares of the corresponding elements of the charge and bond-order (CBO) matrix of the standard LCAO MO theory. Therefore, the probability of the interorbital connections in the molecular communication system is directly related to Wiberg's quadratic covalency indices of chemical bonds. The conditional-entropy (communication "noise") and mutual-information (information capacity) descriptors of these molecular channels generate the IT-covalent and IT-ionic bond components, respectively. The former reflects the electron delocalization (indeterminacy) due to the orbital mixing, throughout all chemical bonds in the system under consideration. The latter characterizes the localization (determinacy) in the probability scattering in the molecule. These two IT indices, respectively, indicate a fraction of the input information lost in the channel output, due to the communication noise, and its surviving part, due to deterministic elements in probability scattering in the molecular network. Together, these two components generate the system's overall bond index. By a straightforward output reduction (condensation) of the molecular channel, the IT indices of molecular fragments, for example, localized bonds, functional groups, and the forward and back donations accompanying bond formation, can be extracted. The flow of information in such molecular communication networks is investigated in several prototype molecules. These illustrative (model) applications of the orbital communication theory of chemical bonds (CTCB) deal with several classical issues in the electronic structure theory: atom hybridization/promotion, single and multiple chemical bonds, bond conjugation, and so on. The localized bonds in hydrides and delocalized π-bonds in simple hydrocarbons, as well as the multiple bonds in CO and CO2, are diagnosed using the entropy/information descriptors of CTCB. The atom promotion in hydrides and bond conjugation in π-electron systems are investigated in more detail. A major drawback of the previous two-electron approach to molecular channels, namely, a too weak differentiation of bonds in aromatic systems, has been shown to be remedied in the one-electron approach.
Application of quantum master equation for long-term prognosis of asset-prices
NASA Astrophysics Data System (ADS)
Khrennikova, Polina
2016-05-01
This study combines the disciplines of behavioral finance and an extension of econophysics, namely the concepts and mathematical structure of quantum physics. We apply the formalism of quantum theory to model the dynamics of some correlated financial assets, where the proposed model can potentially be applied for developing a long-term prognosis of asset price formation. At the informational level, the asset price states interact with each other by means of a "financial bath". The latter is composed of agents' expectations about the future developments of asset prices on the finance market, as well as financially important information from mass media, society, and politicians. One of the essential behavioral factors leading to the quantum-like dynamics of asset prices is the irrationality of agents' expectations operating on the finance market. These expectations lead to a deeper type of uncertainty concerning the future price dynamics of the assets than is given by classical probability theory, e.g., in the framework of classical financial mathematics, which is based on the theory of stochastic processes. The quantum dimension of the uncertainty in price dynamics is expressed in the form of the price-states superposition and entanglement between the prices of the different financial assets. In our model, the resolution of this deep quantum uncertainty is mathematically captured with the aid of the quantum master equation (its quantum Markov approximation). We illustrate our model of preparation of a future asset price prognosis by a numerical simulation involving two correlated assets. Their returns interact more intensively than is captured by a classical statistical correlation. The model predictions can be extended to more complex models to obtain price configurations for multiple assets and portfolios.
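For orientation, a quantum master equation in the standard Gorini-Kossakowski-Sudarshan-Lindblad (GKSL) form, which is the generic shape of the quantum Markov approximation mentioned above, reads (textbook expression; the specific Hamiltonian and bath operators of the model are not given in the abstract):

$$ \frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho] + \sum_k \left( L_k\,\rho\,L_k^\dagger - \tfrac{1}{2}\,\{ L_k^\dagger L_k,\ \rho \} \right), $$

with ρ the density operator of the asset-price system, H generating the coherent dynamics, and the L_k encoding the coupling to the "financial bath".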
Quantum Image Processing and Its Application to Edge Detection: Theory and Experiment
NASA Astrophysics Data System (ADS)
Yao, Xi-Wei; Wang, Hengyan; Liao, Zeyang; Chen, Ming-Cheng; Pan, Jian; Li, Jun; Zhang, Kechao; Lin, Xingcheng; Wang, Zhehui; Luo, Zhihuang; Zheng, Wenqiang; Li, Jianzhong; Zhao, Meisheng; Peng, Xinhua; Suter, Dieter
2017-07-01
Processing of digital images is continuously gaining in volume and relevance, with concomitant demands on data storage, transmission, and processing power. Encoding the image information in quantum-mechanical systems instead of classical ones and replacing classical with quantum information processing may alleviate some of these challenges. By encoding and processing the image information in quantum-mechanical systems, we here demonstrate the framework of quantum image processing, where a pure quantum state encodes the image information: we encode the pixel values in the probability amplitudes and the pixel positions in the computational basis states. Our quantum image representation reduces the required number of qubits compared to existing implementations, and we present image processing algorithms that provide exponential speed-up over their classical counterparts. For the commonly used task of detecting the edge of an image, we propose and implement a quantum algorithm that completes the task with only one single-qubit operation, independent of the size of the image. This demonstrates the potential of quantum image processing for highly efficient image and video processing in the big data era.
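A small classical simulation of the encoding step described above may help: pixel values go into probability amplitudes and pixel positions into computational basis states. The 2x2 image below is a made-up example; only the normalization is essential.

```python
import numpy as np

# A hypothetical 2x2 grayscale image; pixel positions map to the
# computational basis states |00>, |01>, |10>, |11>.
image = np.array([[0.0, 0.5],
                  [0.5, 1.0]])

# Amplitude encoding: flatten and normalize so the squared amplitudes
# (the measurement probabilities) sum to one.
amplitudes = image.flatten().astype(float)
state = amplitudes / np.linalg.norm(amplitudes)

print(state)              # amplitudes of the 2-qubit state
print(np.sum(state**2))   # 1.0: a valid quantum state
```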
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mastromatteo, Michael; Jackson, Bret, E-mail: jackson@chem.umass.edu
Electronic structure methods based on density functional theory are used to construct a reaction path Hamiltonian for CH4 dissociation on the Ni(100) and Ni(111) surfaces. Both quantum and quasi-classical trajectory approaches are used to compute dissociative sticking probabilities, including all molecular degrees of freedom and the effects of lattice motion. Both approaches show a large enhancement in sticking when the incident molecule is vibrationally excited, and both can reproduce the mode specificity observed in experiments. However, the quasi-classical calculations significantly overestimate the ground state dissociative sticking at all energies, and the magnitude of the enhancement in sticking with vibrational excitation is much smaller than that computed using the quantum approach or observed in the experiments. The origin of this behavior is an unphysical flow of zero point energy from the nine normal vibrational modes into the reaction coordinate, giving large values for reaction at energies below the activation energy. Perturbative assumptions made in the quantum studies are shown to be accurate at all energies studied.
The Aharonov–Bohm effect in scattering theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sitenko, Yu.A., E-mail: yusitenko@bitp.kiev.ua; Vlasii, N.D.
2013-12-15
The Aharonov–Bohm effect is considered as a scattering event with nonrelativistic charged particles whose wavelength is less than the transverse size of an impenetrable magnetic vortex. The quasiclassical WKB method is shown to be efficient in solving this scattering problem. We find that the scattering cross section consists of two terms, one describing the classical phenomenon of elastic reflection and another describing the quantum phenomenon of diffraction; the Aharonov–Bohm effect is manifested as a fringe shift in the diffraction pattern. Both the classical and the quantum phenomena are independent of the choice of a boundary condition at the vortex edge, provided that probability is conserved. We show that the propagation of charged particles can be controlled by altering the flux of a magnetic vortex placed in their way. Highlights: •Aharonov–Bohm effect as a scattering event. •Impenetrable magnetic vortex of nonzero transverse size. •Scattering cross section is independent of the self-adjoint extension employed. •Classical phenomenon of elastic reflection and quantum phenomenon of diffraction. •Aharonov–Bohm effect as a fringe shift in the diffraction pattern.
Quantum-like model of unconscious–conscious dynamics
Khrennikov, Andrei
2015-01-01
We present a quantum-like model of sensation–perception dynamics (originated in Helmholtz theory of unconscious inference) based on the theory of quantum apparatuses and instruments. We illustrate our approach with the model of bistable perception of a particular ambiguous figure, the Schröder stair. This is a concrete model for unconscious and conscious processing of information and their interaction. The starting point of our quantum-like journey was the observation that perception dynamics is essentially contextual which implies impossibility of (straightforward) embedding of experimental statistical data in the classical (Kolmogorov, 1933) framework of probability theory. This motivates application of nonclassical probabilistic schemes. And the quantum formalism provides a variety of the well-approved and mathematically elegant probabilistic schemes to handle results of measurements. The theory of quantum apparatuses and instruments is the most general quantum scheme describing measurements and it is natural to explore it to model the sensation–perception dynamics. In particular, this theory provides the scheme of indirect quantum measurements which we apply to model unconscious inference leading to transition from sensations to perceptions. PMID:26283979
Concepts and their dynamics: a quantum-theoretic modeling of human thought.
Aerts, Diederik; Gabora, Liane; Sozzo, Sandro
2013-10-01
We analyze different aspects of our quantum modeling approach of human concepts and, more specifically, focus on the quantum effects of contextuality, interference, entanglement, and emergence, illustrating how each of them makes its appearance in specific situations of the dynamics of human concepts and their combinations. We point out the relation of our approach, which is based on an ontology of a concept as an entity in a state changing under influence of a context, with the main traditional concept theories, that is, prototype theory, exemplar theory, and theory theory. We ponder the question of why quantum theory performs so well in its modeling of human concepts, and we shed light on this question by analyzing the role of complex amplitudes, showing how they allow one to describe interference in the statistics of measurement outcomes, while in the traditional theories statistics of outcomes originates in classical probability weights, without the possibility of interference. The relevance of complex numbers, the appearance of entanglement, and the role of Fock space in explaining contextual emergence, all as unique features of the quantum modeling, are explicitly revealed in this article by analyzing human concepts and their dynamics. © 2013 Cognitive Science Society, Inc.
Survival probability of a truncated radial oscillator subject to periodic kicks
NASA Astrophysics Data System (ADS)
Tanabe, Seiichi; Watanabe, Shinichi; Saif, Farhan; Matsuzawa, Michio
2002-03-01
Classical and quantum survival probabilities are compared for a truncated radial oscillator undergoing impulsive interactions with periodic laser pulses represented here as kicks. The system is truncated in the sense that the harmonic potential is made valid only within a finite range; the rest of the space is treated as a perfect absorber. Exploring extended values of the parameters of this model [Phys. Rev. A 63, 052721 (2001)], we supplement discussions on classical and quantum features near resonances. The classical system proves to be quasi-integrable and preserves phase-space area despite the momentum transferred by the kicks, exhibiting simple yet rich phase-space features. A geometrical argument reveals quantum-classical correspondence in the locations of minima in the paired survival probabilities, while the "ionization" rates differ due to quantum tunneling.
Probability Distributions for Random Quantum Operations
NASA Astrophysics Data System (ADS)
Schultz, Kevin
Motivated by uncertainty quantification and inference of quantum information systems, in this work we draw connections between the notions of random quantum states and operations in quantum information with probability distributions commonly encountered in the field of orientation statistics. This approach identifies natural sample spaces and probability distributions upon these spaces that can be used in the analysis, simulation, and inference of quantum information systems. The theory of exponential families on Stiefel manifolds provides the appropriate generalization to the classical case. Furthermore, this viewpoint motivates a number of additional questions into the convex geometry of quantum operations relative to both the differential geometry of Stiefel manifolds as well as the information geometry of exponential families defined upon them. In particular, we draw on results from convex geometry to characterize which quantum operations can be represented as the average of a random quantum operation. This project was supported by the Intelligence Advanced Research Projects Activity via Department of Interior National Business Center Contract Number 2012-12050800010.
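As a small, self-contained illustration of sampling random quantum objects in practice, the following sketch draws a Haar-random unitary via QR decomposition of a complex Ginibre matrix; this is a standard construction and not the exponential-family machinery on Stiefel manifolds proposed in the abstract.

```python
import numpy as np

def haar_random_unitary(n, rng=np.random.default_rng()):
    """Draw an n x n unitary from the Haar measure via QR decomposition
    of a complex Ginibre matrix, with the standard phase correction."""
    z = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    # Fix the column phases so the distribution is exactly Haar.
    d = np.diag(r)
    return q * (d / np.abs(d))

u = haar_random_unitary(3)
print(np.allclose(u.conj().T @ u, np.eye(3)))  # True: u is unitary
```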
Introduction to Classical Density Functional Theory by a Computational Experiment
ERIC Educational Resources Information Center
Jeanmairet, Guillaume; Levy, Nicolas; Levesque, Maximilien; Borgis, Daniel
2014-01-01
We propose an in silico experiment to introduce the classical density functional theory (cDFT). Density functional theories, whether quantum or classical, rely on abstract concepts that are nonintuitive; however, they are at the heart of powerful tools and active fields of research in both physics and chemistry. They led to the 1998 Nobel Prize in…
Design Equations and Criteria of Orthotropic Composite Panels
2013-05-01
Appendix A, Classical Laminate Theory (CLT): In Section 6 of this report, preliminary design ... determined using Classical Laminate Theory (CLT) to predict equivalent stiffness characteristics and first-ply strength. Note: CLT is valid for ...
Bidargaddi, Niranjan P; Chetty, Madhu; Kamruzzaman, Joarder
2008-06-01
Profile hidden Markov models (HMMs) based on classical HMMs have been widely applied for protein sequence identification. The formulation of the forward and backward variables in profile HMMs is made under the statistical independence assumption of probability theory. We propose a fuzzy profile HMM to overcome the limitations of that assumption and to achieve an improved alignment for protein sequences belonging to a given family. The proposed model fuzzifies the forward and backward variables by incorporating Sugeno fuzzy measures and Choquet integrals, and thus further extends the generalized HMM. Based on the fuzzified forward and backward variables, we propose a fuzzy Baum-Welch parameter estimation algorithm for profiles. The strong correlations and the sequence preference involved in protein structures make this fuzzy-architecture-based model a suitable candidate for building profiles of a given family, since fuzzy sets can handle uncertainties better than classical methods.
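For readers unfamiliar with the fuzzy-measure machinery, here is a minimal sketch of a discrete Choquet integral; the two-element measure is invented for illustration and is unrelated to the paper's profile HMM parameters.

```python
def choquet_integral(values, measure):
    """Discrete Choquet integral of `values` (dict: element -> score)
    with respect to a fuzzy measure `measure` (dict: frozenset -> weight).

    Sort elements by score; at each level, weight the increment of the
    score by the measure of the set of elements at or above that level.
    """
    items = sorted(values.items(), key=lambda kv: kv[1])
    total, previous = 0.0, 0.0
    for i, (_, score) in enumerate(items):
        upper_set = frozenset(k for k, _ in items[i:])
        total += (score - previous) * measure[upper_set]
        previous = score
    return total

# Invented example: a non-additive measure on {a, b}.
measure = {
    frozenset(): 0.0,
    frozenset({"a"}): 0.3,
    frozenset({"b"}): 0.5,
    frozenset({"a", "b"}): 1.0,  # 1.0 != 0.3 + 0.5: non-additive
}
print(choquet_integral({"a": 0.2, "b": 0.7}, measure))  # 0.45
```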
Consistent resolution of some relativistic quantum paradoxes
NASA Astrophysics Data System (ADS)
Griffiths, Robert B.
2002-12-01
A relativistic version of the (consistent or decoherent) histories approach to quantum theory is developed on the basis of earlier work by Hartle, and used to discuss relativistic forms of the paradoxes of spherical wave packet collapse, Bohm's formulation of the Einstein-Podolsky-Rosen paradox, and Hardy's paradox. It is argued that wave function collapse is not needed for introducing probabilities into relativistic quantum mechanics, and in any case should never be thought of as a physical process. Alternative approaches to stochastic time dependence can be used to construct a physical picture of the measurement process that is less misleading than collapse models. In particular, one can employ a coarse-grained but fully quantum-mechanical description in which particles move along trajectories, with behavior under Lorentz transformations the same as in classical relativistic physics, and detectors are triggered by particles reaching them along such trajectories. States entangled between spacelike separated regions are also legitimate quantum descriptions, and can be consistently handled by the formalism presented here. The paradoxes in question arise because of using modes of reasoning which, while correct for classical physics, are inconsistent with the mathematical structure of quantum theory, and are resolved (or tamed) by using a proper quantum analysis. In particular, there is no need to invoke, nor any evidence for, mysterious long-range superluminal influences, and thus no incompatibility, at least from this source, between relativity theory and quantum mechanics.
k-Cosymplectic Classical Field Theories: Tulczyjew and Skinner-Rusk Formulations
NASA Astrophysics Data System (ADS)
Rey, Angel M.; Román-Roy, Narciso; Salgado, Modesto; Vilariño, Silvia
2012-06-01
The k-cosymplectic Lagrangian and Hamiltonian formalisms of first-order classical field theories are reviewed and completed. In particular, they are stated for singular and almost-regular systems. Subsequently, several alternative formulations for k-cosymplectic first-order field theories are developed: First, generalizing the construction of Tulczyjew for mechanics, we give a new interpretation of the classical field equations. Second, the Lagrangian and Hamiltonian formalisms are unified by giving an extension of the Skinner-Rusk formulation on classical mechanics.
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei
2017-02-01
The scientific methodology based on two descriptive levels, ontic (reality as it is) and epistemic (observational), is briefly presented. Following Schrödinger, we point to the possible gap between these two descriptions. Our main aim is to show that, although ontic entities may be inaccessible to observation, they can be useful for clarifying the physical nature of operational epistemic entities. We illustrate this thesis with a concrete example: starting with a concrete ontic model preceding quantum mechanics (the latter being treated as an epistemic model), namely, prequantum classical statistical field theory (PCSFT), we propose a natural physical interpretation for the basic quantum mechanical entity, the quantum state ("wave function"). The correspondence PCSFT ↦ QM is not straightforward; it couples the covariance operators of classical (prequantum) random fields with the quantum density operators. We use this correspondence to clarify the physical meaning of the pure quantum state and the superposition principle by using the formalism of classical field correlations.

In classical mechanics the phase space description can be considered as the ontic description; here states are given by points λ = (x, p) of phase space. The dynamics of the ontic state is given by the system of Hamiltonian equations. We can also consider probability distributions on the phase space (or, equivalently, random variables valued in it). We call them probabilistic ontic states. Dynamics of probabilistic ontic states is given by the Liouville equation.

In classical physics we can (at least in principle) measure both the coordinate and the momentum, and hence ontic states can be treated as epistemic states as well (or it is better to say that here epistemic states can be treated as ontic states). Probabilistic ontic states represent probabilities for outcomes of joint measurement of position and momentum. However, this is a very special, although very important, example of description of physical phenomena. In general there are no reasons to expect that properties of ontic states are approachable through our measurements. There is a gap between ontic and epistemic descriptions, cf. also 't Hooft [49,50] and G. Groessing et al. [51]. In general the presence of such a gap also implies unapproachability of the probabilistic ontic states, i.e., probability distributions on the space of ontic states. De Broglie [28] called such probability distributions hidden probabilities and distinguished them sharply from probability distributions of measurement outcomes, see also Lochak [29]. (The latter distributions are described by the quantum formalism.)

This ontic-epistemic approach, based on the combination of two descriptive levels for natural phenomena, is closely related to the old Bild conception originating in the works of Hertz. Later it was heavily explored by Schrödinger in the quantum domain, see, e.g., [8,11] for detailed analysis. According to Hertz, one cannot expect to construct a complete theoretical model based explicitly on observable quantities. The complete theoretical model can contain quantities which are unapproachable for external measurement inspection. For example, Hertz, in trying to create a mechanical model for Maxwell's electromagnetism, invented hidden masses. The main distinguishing property of a theoretical model (in contrast to an observational model) is the continuity of description, i.e., the absence of gaps in description.

From this viewpoint, the quantum mechanical description is not continuous: there is a gap between premeasurement dynamics and the measurement outcome. QM cannot say anything about what happens in the process of measurement; this is the well-known measurement problem of QM [32], cf. [52,53]. Continuity of description is closely related to causality. However, here we cannot go into more detail, see [8,11]. The important question is about the interrelation between the two levels of description, ontic-epistemic (or theoretical-observational). In the introduction we have already cited Schrödinger, who emphasized the possible complexity of this interrelation. In particular, in general there is no reason to expect a straightforward coupling of the form, cf. [9,10]:
NASA Astrophysics Data System (ADS)
González-Díaz, Pedro F.
We re-explore the effects of multiply-connected wormholes on ordinary matter at low energies. We find that the path integral that describes these effects is given in terms of a Planckian probability distribution for the Coleman α-parameters, rather than a classical Gaussian distribution law. This implies that the path integral over all low-energy fields with the wormhole effective interactions can no longer vary continuously, and that the quantities α² are interpretable as the momenta of a quantum field. Using the new result that, rather than being given in terms of the Coleman-Hawking probability, the Euclidean action must equal negative entropy, the model predicts a very small but still nonzero cosmological constant and quite reasonable values for the pion and neutrino masses. The divergence problems of Euclidean quantum gravity are also discussed in the light of the above results.
Atmospheric Quantum Channels with Weak and Strong Turbulence.
Vasylyev, D; Semenov, A A; Vogel, W
2016-08-26
The free-space transfer of high-fidelity optical signals between remote locations has many applications, including both classical and quantum communication, precision navigation, clock synchronization, etc. The physical processes that contribute to signal fading and loss need to be carefully analyzed in the theory of light propagation through atmospheric turbulence. Here we derive the probability distribution for the atmospheric transmittance including beam wandering, beam shape deformation, and beam-broadening effects. Our model, referred to as the elliptic beam approximation, applies to weak, weak-to-moderate, and strong turbulence and hence to the most important regimes in atmospheric communication scenarios.
On the possibility of negative activation energies in bimolecular reactions
NASA Technical Reports Server (NTRS)
Jaffe, R. L.
1978-01-01
The temperature dependence of the rate constants for model reacting systems was studied to understand some recent experimental measurements which imply the existence of negative activation energies. A collision theory model and classical trajectory calculations are used to demonstrate that the reaction probability can vary inversely with collision energy for bimolecular reactions occurring on attractive potential energy surfaces. However, this is not a sufficient condition to ensure that the rate constant has a negative temperature dependence. On the basis of these calculations, it seems unlikely that a true bimolecular reaction between neutral molecules will have a negative activation energy.
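For context, the operational definition of activation energy behind such statements is the Arrhenius-type relation (standard kinetics, not specific to this study):

$$ k(T) = A\, e^{-E_a / RT}, \qquad E_a \equiv -R\, \frac{d \ln k}{d(1/T)}, $$

so a rate constant that decreases with increasing temperature corresponds to a negative E_a.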
NASA Astrophysics Data System (ADS)
Born, Max; Wolf, Emil
1999-10-01
Principles of Optics is one of the classic science books of the twentieth century, and probably the most influential book in optics published in the past forty years. This edition has been thoroughly revised and updated, with new material covering the CAT scan, interference with broad-band light and the so-called Rayleigh-Sommerfeld diffraction theory. This edition also details scattering from inhomogeneous media and presents an account of the principles of diffraction tomography to which Emil Wolf has made a basic contribution. Several new appendices are also included. This new edition will be invaluable to advanced undergraduates, graduate students and researchers working in most areas of optics.
Probabilities for time-dependent properties in classical and quantum mechanics
NASA Astrophysics Data System (ADS)
Losada, Marcelo; Vanni, Leonardo; Laura, Roberto
2013-05-01
We present a formalism which allows one to define probabilities for expressions that involve properties at different times for classical and quantum systems and we study its lattice structure. The formalism is based on the notion of time translation of properties. In the quantum case, the properties involved should satisfy compatibility conditions in order to obtain well-defined probabilities. The formalism is applied to describe the double-slit experiment.
Measures, Probability and Holography in Cosmology
NASA Astrophysics Data System (ADS)
Phillips, Daniel
This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Λ using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Λ may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10^-123 and ρ_k = 4.3ρ_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We comment on the general implications of this view, and specifically question the application of classical probability theory to cosmology in cases where key questions are known to have no quantum answer. We argue that the ideas developed here may offer a way out of the notorious measure problems of eternal inflation. The fourth project looks at finite universes as alternatives to multiverse theories of cosmology. We compare two holographic arguments that impose especially strong bounds on the amount of inflation. One comes from the de Sitter Equilibrium cosmology and the other from the work of Banks and Fischler. We find that simple versions of these two approaches yield the same bound on the number of e-foldings. A careful examination reveals that while these pictures are similar in spirit, they are not necessarily identical prescriptions. We apply the two pictures to specific cosmologies which expose potentially important differences and which also demonstrate ways these seemingly simple proposals can be tricky to implement in practice.
Kinetic theory of age-structured stochastic birth-death processes
NASA Astrophysics Data System (ADS)
Greenman, Chris D.; Chou, Tom
2016-01-01
Classical age-structured mass-action models such as the McKendrick-von Foerster equation have been extensively studied but are unable to describe stochastic fluctuations or population-size-dependent birth and death rates. Stochastic theories that treat semi-Markov age-dependent processes using, e.g., the Bellman-Harris equation do not resolve a population's age structure and are unable to quantify population-size dependencies. Conversely, current theories that include size-dependent population dynamics (e.g., mathematical models that include carrying capacity such as the logistic equation) cannot be easily extended to take into account age-dependent birth and death rates. In this paper, we present a systematic derivation of a new, fully stochastic kinetic theory for interacting age-structured populations. By defining multiparticle probability density functions, we derive a hierarchy of kinetic equations for the stochastic evolution of an aging population undergoing birth and death. We show that the fully stochastic age-dependent birth-death process precludes factorization of the corresponding probability densities, which then must be solved by using a Bogoliubov-Born-Green-Kirkwood-Yvon (BBGKY)-like hierarchy. Explicit solutions are derived in three limits: no birth, no death, and steady state. These are then compared with their corresponding mean-field results. Our results generalize both deterministic models and existing master equation approaches by providing an intuitive and efficient way to simultaneously model age- and population-dependent stochastic dynamics applicable to the study of demography, stem cell dynamics, and disease evolution.
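For reference, the deterministic McKendrick-von Foerster equation that this kinetic theory generalizes can be written as follows for the age density ρ(a, t), with age-dependent death rate μ(a) and birth rate β(a) (standard form):

$$ \frac{\partial \rho(a,t)}{\partial t} + \frac{\partial \rho(a,t)}{\partial a} = -\mu(a)\,\rho(a,t), \qquad \rho(0,t) = \int_0^\infty \beta(a)\,\rho(a,t)\,da. $$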
de Bock, Élodie; Hardouin, Jean-Benoit; Blanchin, Myriam; Le Neel, Tanguy; Kubis, Gildas; Bonnaud-Antignac, Angélique; Dantan, Étienne; Sébille, Véronique
2016-10-01
The objective was to compare classical test theory and Rasch-family models derived from item response theory for the analysis of longitudinal patient-reported outcomes data with possibly informative intermittent missing items. A simulation study was performed in order to assess and compare the performance of classical test theory and the Rasch model in terms of bias, control of the type I error, and power of the test of time effect. The type I error was controlled for both classical test theory and the Rasch model, whether data were complete or some items were missing. Both methods were unbiased and displayed similar power with complete data. When items were missing, the Rasch model remained unbiased and displayed higher power than classical test theory. The Rasch model performed better than the classical test theory approach for the analysis of longitudinal patient-reported outcomes with possibly informative intermittent missing items, mainly in terms of power. This study highlights the value of Rasch-based models in clinical research and epidemiology for the analysis of incomplete patient-reported outcomes data. © The Author(s) 2013.
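For readers outside psychometrics, a minimal sketch of the dichotomous Rasch model that underlies such simulation studies follows; the abilities and item difficulties are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rasch_probability(theta, b):
    """Dichotomous Rasch model: probability of a positive response
    for person ability theta and item difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Invented example: 3 persons x 4 items.
abilities = np.array([-1.0, 0.0, 1.5])
difficulties = np.array([-0.5, 0.0, 0.5, 1.0])
p = rasch_probability(abilities[:, None], difficulties[None, :])
responses = rng.random(p.shape) < p  # simulated item responses
print(p.round(2))
print(responses.astype(int))
```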
Born’s rule as signature of a superclassical current algebra
NASA Astrophysics Data System (ADS)
Fussy, S.; Mesa Pascasio, J.; Schwabl, H.; Grössing, G.
2014-04-01
We present a new tool for calculating the interference patterns and particle trajectories of a double-, three- and N-slit system on the basis of an emergent sub-quantum theory developed by our group over the last years. The quantum itself is considered as an emergent system representing an off-equilibrium steady state oscillation maintained by a constant throughput of energy provided by a classical zero-point energy field. We introduce the concept of a "relational causality" which allows for evaluating structural interdependences of different system levels, in our case the relations between partial and total probability density currents. Combined with the application of 21st century classical physics, e.g., modern nonequilibrium thermodynamics, we thus arrive at a "superclassical" theory. Within this framework, the proposed current algebra directly leads to a new formulation of the guiding equation which is equivalent to the original one of the de Broglie-Bohm theory. By proving the absence of third order interferences in three-path systems, it is shown that Born's rule is a natural consequence of our theory. Considering the series of one-, double-, or, generally, N-slit systems, with the first appearance of an interference term in the double slit case, we can explain the violation of Sorkin's first order sum rule, as well as the validity of all higher order sum rules. Moreover, the Talbot patterns and Talbot distance for an arbitrary N-slit device can be reproduced exactly by our model without any quantum physics tool.
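For orientation, Sorkin's interference hierarchy referred to above can be stated as follows, where P_{AB} denotes the detection probability with paths A and B open (standard definitions, notation mine):

$$ I_{AB} = P_{AB} - P_A - P_B, \qquad I_{ABC} = P_{ABC} - P_{AB} - P_{AC} - P_{BC} + P_A + P_B + P_C. $$

Quantum mechanics gives I_{AB} ≠ 0 in general (two-path interference) but I_{ABC} = 0; the claim in the abstract is that the superclassical current algebra reproduces exactly this pattern, which is the content of Born's rule.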
Huang, Guangzao; Yuan, Mingshun; Chen, Moliang; Li, Lei; You, Wenjie; Li, Hanjie; Cai, James J; Ji, Guoli
2017-10-07
The application of machine learning in cancer diagnostics has shown great promise and is of importance in clinical settings. Here we consider applying machine learning methods to transcriptomic data derived from tumor-educated platelets (TEPs) from individuals with different types of cancer. We aim to define a reliability measure for diagnostic purposes to increase the potential for facilitating personalized treatments. To this end, we present a novel classification method called MFRB (for Multiple Fitting Regression and Bayes decision), which integrates the process of multiple fitting regression (MFR) with Bayes decision theory. MFR is first used to map multidimensional features of the transcriptomic data into a one-dimensional feature. The probability density function of each class in the mapped space is then fitted using a Gaussian probability density function. Finally, Bayes decision theory is used to build a probabilistic classifier with the estimated probability density functions. The output of MFRB can be used to determine which class a sample belongs to, as well as to assign a reliability measure for a given class. The classical support vector machine (SVM) and probabilistic SVM (PSVM) are used to evaluate the performance of the proposed method with simulated and real TEP datasets. Our results indicate that the proposed MFRB method achieves the best performance compared to SVM and PSVM, mainly due to its strong generalization ability for limited, imbalanced, and noisy data.
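A minimal sketch of a pipeline of this shape (regression to one dimension, per-class Gaussian densities, Bayes decision) follows; the synthetic data, equal class priors, and use of ordinary least squares stand in for details the abstract does not specify.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-class data standing in for TEP transcriptomic features.
X0 = rng.normal(0.0, 1.0, size=(100, 5))
X1 = rng.normal(1.0, 1.0, size=(100, 5))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

# Step 1: a regression maps multidimensional features to one dimension.
A = np.hstack([X, np.ones((X.shape[0], 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)
z = A @ w

# Step 2: fit a Gaussian density to the mapped scores of each class.
params = [(z[y == c].mean(), z[y == c].std()) for c in (0, 1)]

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Step 3: Bayes decision with equal priors; the posterior doubles as
# a reliability measure for the predicted class.
def classify(x_new):
    z_new = np.append(x_new, 1.0) @ w
    likelihoods = np.array([gauss(z_new, mu, s) for mu, s in params])
    posterior = likelihoods / likelihoods.sum()
    return posterior.argmax(), posterior.max()

print(classify(rng.normal(1.0, 1.0, size=5)))
```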
Telling and Not-Telling: A Classic Grounded Theory of Sharing Life-Stories
ERIC Educational Resources Information Center
Powers, Trudy Lee
2013-01-01
This study of "Telling and Not-Telling" was conducted using the classic grounded theory methodology (Glaser 1978, 1992, 1998; Glaser & Strauss, 1967). This unique methodology systematically and inductively generates conceptual theories from data. The goal is to discover theory that explains, predicts, and provides practical…
A Concise Introduction to Quantum Mechanics
NASA Astrophysics Data System (ADS)
Swanson, Mark S.
2018-02-01
Assuming a background in basic classical physics, multivariable calculus, and differential equations, A Concise Introduction to Quantum Mechanics provides a self-contained presentation of the mathematics and physics of quantum mechanics. The relevant aspects of classical mechanics and electrodynamics are reviewed, and the basic concepts of wave-particle duality are developed as a logical outgrowth of experiments involving blackbody radiation, the photoelectric effect, and electron diffraction. The Copenhagen interpretation of the wave function and its relation to the particle probability density is presented in conjunction with Fourier analysis and its generalization to function spaces. These concepts are combined to analyze the system consisting of a particle confined to a box, developing the probabilistic interpretation of observations and their associated expectation values. The Schrödinger equation is then derived by using these results and demanding both Galilean invariance of the probability density and Newtonian energy-momentum relations. The general properties of the Schrödinger equation and its solutions are analyzed, and the theory of observables is developed along with the associated Heisenberg uncertainty principle. Basic applications of wave mechanics are made to free wave packet spreading, barrier penetration, the simple harmonic oscillator, the Hydrogen atom, and an electric charge in a uniform magnetic field. In addition, Dirac notation, elements of Hilbert space theory, operator techniques, and matrix algebra are presented and used to analyze coherent states, the linear potential, two state oscillations, and electron diffraction. Applications are made to photon and electron spin and the addition of angular momentum, and direct product multiparticle states are used to formulate both the Pauli exclusion principle and quantum decoherence. The book concludes with an introduction to the rotation group and the general properties of angular momentum.
On Replacing "Quantum Thinking" with Counterfactual Reasoning
NASA Astrophysics Data System (ADS)
Narens, Louis
The probability theory used in quantum mechanics is currently being employed by psychologists to model the impact of context on decision. Its event space consists of closed subspaces of a Hilbert space, and its probability functions sometimes violate the law of the finite additivity of probabilities. Results from the quantum mechanics literature indicate that such a "Hilbert space probability theory" cannot be extended in a useful way to standard, finitely additive, probability theory by the addition of new events with specific probabilities. This chapter presents a new kind of probability theory that shares many fundamental algebraic characteristics with Hilbert space probability theory but does extend to standard probability theory by adjoining new events with specific probabilities. The new probability theory arises from considerations about how psychological experiments are related through counterfactual reasoning.
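A concrete instance of the non-additivity mentioned above can be computed classically: in the lattice of closed subspaces, two distinct rays that intersect only at the origin are "disjoint" events, yet the probability of their join (the span of both) need not equal the sum of their probabilities. The vectors below are an invented example.

```python
import numpy as np

psi = np.array([1.0, 0.0])  # state vector, a unit vector in R^2

def prob_on_ray(psi, direction):
    """Probability assigned to the ray spanned by `direction`:
    the squared length of the projection of psi onto it."""
    u = np.asarray(direction, dtype=float)
    u = u / np.linalg.norm(u)
    return float(np.dot(psi, u) ** 2)

# Two distinct rays whose intersection is only the zero vector.
p_a = prob_on_ray(psi, [1.0, 0.0])  # 1.0
p_b = prob_on_ray(psi, [1.0, 1.0])  # 0.5

# Their lattice join (the span of both rays) is all of R^2, which has
# probability 1 for any unit state vector; the sum below exceeds it.
print(p_a + p_b)  # 1.5: finite additivity fails for the lattice join
```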
The sight of the peacock's tail makes me sick: the early arguments on sexual selection.
Hiraiwa-Hasegawa, M
2000-03-01
Why does a peacock have a beautiful train, while a peahen is sober without such flamboyance? Darwin proposed the theory of sexual selection to explain the differences between the sexes of the same species. Recently the study of sexual selection has been one of the most flourishing areas in evolutionary biology. However, the theory met with great resistance from biologists from the time of its publication, and its history includes a great deal of misunderstanding and confusion. There are several reasons for this. First, classical Darwinism failed to recognize social competition as an important selective force. Second, the good-for-the-species argument, which persisted in the days after Darwin, made the sexual selection argument more difficult to understand. Compared to the discussions on animals, Darwin's argument on human sex differences is not satisfactory. The reason probably lies in the debate over human racial differences which prevailed in the 19th century.
Entropy, a Unifying Concept: from Physics to Cognitive Psychology
NASA Astrophysics Data System (ADS)
Tsallis, Constantino; Tsallis, Alexandra C.
Together with classical, relativistic and quantum mechanics, as well as Maxwell electromagnetism, Boltzmann-Gibbs (BG) statistical mechanics constitutes one of the main theories of contemporary physics. This theory primarily concerns inanimate matter, and at its generic foundation we find nonlinear dynamical systems satisfying the ergodic hypothesis. This hypothesis is typically guaranteed for systems whose maximal Lyapunov exponent is positive. What happens when this crucial quantity is zero instead? We suggest here that, in what concerns thermostatistical properties, we typically enter what in some sense may be considered as a new world — the world of living systems. The need emerges, at least for many systems, for generalizing the basis of BG statistical mechanics, namely the Boltzmann-Gibbs-von Neumann-Shannon entropic functional form, which connects the macroscopic thermodynamic quantity with the occurrence probabilities of microscopic configurations. This unifying approach is briefly reviewed here, and its widespread applications — from physics to cognitive psychology — are overviewed. Special attention is dedicated to the learning/memorizing process in humans and computers. The present observations might be related to the gestalt theory of visual perceptions and the actor-network theory.
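The entropic functional in question is the Boltzmann-Gibbs-von Neumann-Shannon form; the best-known generalization in this research program (nonextensive statistical mechanics) is the q-entropy, shown here for orientation:

$$ S_{BG} = -k \sum_i p_i \ln p_i, \qquad S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1}, \qquad \lim_{q \to 1} S_q = S_{BG}. $$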
Kappa and Rater Accuracy: Paradigms and Parameters.
Conger, Anthony J
2017-12-01
Drawing parallels to classical test theory, this article clarifies the difference between rater accuracy and reliability and demonstrates how category marginal frequencies affect rater agreement and Cohen's kappa (κ). Category assignment paradigms are developed: comparing raters to a standard (index) versus comparing two raters to one another (concordance), using both nonstochastic and stochastic category membership. Using a probability model to express category assignments in terms of rater accuracy and random error, it is shown that observed agreement (P_o) depends only on rater accuracy and the number of categories; however, expected agreement (P_e) and κ depend additionally on category frequencies. Moreover, category frequencies affect P_e and κ solely through the variance of the category proportions, regardless of the specific frequencies underlying the variance. Paradoxically, some judgment paradigms involving stochastic categories are shown to yield higher κ values than their nonstochastic counterparts. Using the stated probability model, assignments to categories were generated for 552 combinations of paradigms, rater and category parameters, category frequencies, and number of stimuli. Observed means and standard errors for P_o, P_e, and κ were fully consistent with theory expectations. Guidelines for interpretation of rater accuracy and reliability are offered, along with a discussion of alternatives to the basic model.
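A minimal computation of P_o, P_e, and κ from a 2x2 agreement table may help fix the quantities discussed; the counts are invented.

```python
import numpy as np

# Invented agreement table: rows are rater 1's categories,
# columns are rater 2's categories.
table = np.array([[40, 10],
                  [ 5, 45]], dtype=float)
n = table.sum()

p_o = np.trace(table) / n                    # observed agreement
marginals1 = table.sum(axis=1) / n
marginals2 = table.sum(axis=0) / n
p_e = float(np.dot(marginals1, marginals2))  # chance-expected agreement
kappa = (p_o - p_e) / (1.0 - p_e)            # Cohen's kappa

print(p_o, p_e, kappa)  # 0.85, 0.5, 0.7
```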
Finite-block-length analysis in classical and quantum information theory.
Hayashi, Masahito
2017-01-01
Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite size effects. The present paper reviews finite size effects in classical and quantum information theory with respect to various topics, including applied aspects.
Nanotube Tunneling as a Consequence of Probable Discrete Trajectories
NASA Technical Reports Server (NTRS)
Robinson, Daryl C.
2001-01-01
It has been recently reported that the electrical charge in a semiconductive carbon nanotube is not evenly distributed, but is divided into charge "islands." A clear understanding of tunneling phenomena can be useful to elucidate the mechanism for electrical conduction in nanotubes. This paper represents the first attempt to shed light on the aforementioned phenomenon through viewing tunneling as a natural consequence of "discrete trajectories." The relevance of this analysis is that it may provide further insight into the higher rate of tunneling processes, which makes tunneling devices attractive. In a situation involving particles impinging on a classically impenetrable barrier, the result of quantum mechanics that the probability of detecting transmitted particles falls off exponentially is derived without wave theory. This paper should provide a basis for calculating the charge profile over the length of the tube so that nanoscale devices' conductive properties may be fully exploited.
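The exponential fall-off cited here matches the standard WKB result for a rectangular barrier of height V_0 and width L, for particles of mass m and energy E < V_0 (textbook expression, independent of the trajectory-based derivation in the paper):

$$ T \approx e^{-2\kappa L}, \qquad \kappa = \frac{\sqrt{2m\,(V_0 - E)}}{\hbar}. $$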
On the no-boundary proposal for ekpyrotic and cyclic cosmologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Battarra, Lorenzo; Lehners, Jean-Luc, E-mail: lorenzo.battarra@aei.mpg.de, E-mail: jlehners@aei.mpg.de
2014-12-01
The no-boundary proposal provides a compelling theory for the initial conditions of our universe. We study the implications of such initial conditions for ekpyrotic and cyclic cosmologies. These cosmologies allow for the existence of a new type of "ekpyrotic instanton", which describes the creation of a universe in the ekpyrotic contraction phase. Remarkably, we find that the ekpyrotic attractor can explain how the universe became classical. In a cyclic context, in addition to the ekpyrotic instantons there exist de Sitter-like instantons describing the emergence of the universe in the dark energy phase. Our results show that typically the ekpyrotic instantons yield a higher probability. In fact, in a potential energy landscape allowing both inflationary and cyclic cosmologies, the no-boundary proposal implies that the probability for ekpyrotic and cyclic initial conditions is vastly higher than that for inflationary ones.
Richard, David; Speck, Thomas
2018-03-28
We investigate the kinetics and the free energy landscape of the crystallization of hard spheres from a supersaturated metastable liquid through direct simulations and forward flux sampling. In this first paper, we describe and test two different ways to reconstruct the free energy barriers from the sampled steady state probability distribution of cluster sizes without sampling the equilibrium distribution. The first method is based on mean first passage times, and the second method is based on splitting probabilities. We verify both methods for a single particle moving in a double-well potential. For the nucleation of hard spheres, these methods allow us to probe a wide range of supersaturations and to reconstruct the kinetics and the free energy landscape from the same simulation. Results are consistent with the scaling predicted by classical nucleation theory, although a quantitative fit requires a rather large effective interfacial tension.
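For reference, the classical nucleation theory scaling against which the reconstructed barriers are checked takes the standard form of a bulk gain competing with a surface cost for a cluster of n particles (generic CNT expression; the shape factor s and the effective interfacial tension γ are fit parameters):

$$ \Delta G(n) = -n\,|\Delta\mu| + \gamma\, s\, n^{2/3}, $$

where Δμ is the chemical potential difference between the metastable liquid and the solid.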
Universality classes of fluctuation dynamics in hierarchical complex systems
NASA Astrophysics Data System (ADS)
Macêdo, A. M. S.; González, Iván R. Roa; Salazar, D. S. P.; Vasconcelos, G. L.
2017-03-01
A unified approach is proposed to describe the statistics of the short-time dynamics of multiscale complex systems. The probability density function of the relevant time series (signal) is represented as a statistical superposition of a large time-scale distribution weighted by the distribution of certain internal variables that characterize the slowly changing background. The dynamics of the background is formulated as a hierarchical stochastic model whose form is derived from simple physical constraints, which in turn restrict the dynamics to only two possible classes. The probability distributions of both the signal and the background have simple representations in terms of Meijer G functions. The two universality classes for the background dynamics manifest themselves in the signal distribution as two types of tails: power law and stretched exponential, respectively. A detailed analysis of empirical data from classical turbulence and financial markets shows excellent agreement with the theory.
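The statistical superposition described here has the generic compounding form (notation mine, following standard superstatistics-type constructions):

$$ P(x) = \int_0^\infty f(\varepsilon)\, P(x \mid \varepsilon)\, d\varepsilon, $$

with P(x|ε) the signal distribution conditioned on the slowly varying background variable ε and f(ε) the background distribution generated by the hierarchical stochastic model; the two universality classes correspond to the two admissible forms of f.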
Koopman-von Neumann formulation of classical Yang-Mills theories: I
NASA Astrophysics Data System (ADS)
Carta, P.; Gozzi, E.; Mauro, D.
2006-03-01
In this paper we present the Koopman-von Neumann (KvN) formulation of classical non-Abelian gauge field theories. In particular we shall explore the functional (or classical path integral) counterpart of the KvN method. In the quantum path integral quantization of Yang-Mills theories concepts like gauge-fixing and Faddeev-Popov determinant appear in a quite natural way. We will prove that these same objects are needed also in this classical path integral formulation for Yang-Mills theories. We shall also explore the classical path integral counterpart of the BFV formalism and build all the associated universal and gauge charges. The latter are quite different from the analogous quantum ones, and we shall show the relation between the two. This paper lays the foundation of this formalism which, due to the many auxiliary fields present, is rather heavy. Applications to specific topics outlined in the paper will appear in later publications.
a Classical Isodual Theory of Antimatter and its Prediction of Antigravity
NASA Astrophysics Data System (ADS)
Santilli, Ruggero Maria
An inspection of the contemporary physics literature reveals that, while matter is treated at all levels of study, from Newtonian mechanics to quantum field theory, antimatter is solely treated at the level of second quantization. For the purpose of initiating the restoration of full equivalence in the treatment of matter and antimatter in due time, and as the classical foundations of an axiomatically consistent inclusion of gravitation in unified gauge theories recently appeared elsewhere, in this paper we present a classical representation of antimatter which begins at the primitive Newtonian level with corresponding formulations at all subsequent levels. By recalling that charge conjugation of particles into antiparticles is antiautomorphic, the proposed theory of antimatter is based on a new map, called isoduality, which is also antiautomorphic (and more generally, antiisomorphic), yet it is applicable beginning at the classical level and then persists at the quantum level where it becomes equivalent to charge conjugation. We therefore present, apparently for the first time, the classical isodual theory of antimatter; we identify the physical foundations of the theory as being the novel isodual Galilean, special and general relativities; and we show the compatibility of the theory with all available classical experimental data on antimatter. We identify the classical foundations of the prediction of antigravity for antimatter in the field of matter (or vice-versa) without any claim on its validity, and defer its resolution to specifically identified experiments. We identify the novel, classical, isodual electromagnetic waves which are predicted to be emitted by antimatter, the so-called space-time machine based on a novel non-Newtonian geometric propulsion, and other implications of the theory. We also introduce, apparently for the first time, the isodual space and time inversions and show that they are nontrivially different from the conventional ones, thus offering a possible future resolution of whether faraway galaxies and quasars are made up of matter or of antimatter. The paper ends with the indication that these studies are in their infancy, and indicates some of the open problems. To avoid a prohibitive length, the paper is restricted to the classical treatment, while studies on operator profiles are treated elsewhere.
A theory of quantum dynamics of a nanomagnet under excitation
NASA Astrophysics Data System (ADS)
Sham, L. J.
2013-09-01
A quantum treatment of the magnetization dynamics of a nanomagnet of between a thousand and a million spins may be needed as the magnet interacts with quantum control. The advantage of the all-quantum approach over the classical treatment of magnetization is that it accounts for the correlation between the magnet and the control agent and provides a first-principles source of noise. This supplement to the conference talk will concentrate on an overview of the theory, with a presentation of the basic ideas, which could have wide applications, and illustrations with some results. Details of applications to specific models are or will be published elsewhere. A clear concept of the structure of the ground and excited macrospin states, as magnetization rotation states and magnons in the Bloch/Dyson sense, gives rise to a consistent theory of the magnetization dynamics of a ferromagnet modeled by the Heisenberg Hamiltonian. An example of quantum control is spin torque transfer, treated here as a sequence of scatterings of each current electron with the localized electrons of the ferromagnet; each encounter yields a probability distribution of the magnetization recoil state correlated with each outgoing state of the electron. This picture provides a natural Monte Carlo process for simulating the dynamics, in which the probability is determined by quantum mechanics. The computed results for the mean motion, noise, and damping of the magnetization will be discussed.
NASA Astrophysics Data System (ADS)
Tanona, Scott Daniel
I develop a new analysis of Niels Bohr's Copenhagen interpretation of quantum mechanics by examining the development of his views from his earlier use of the correspondence principle in the so-called 'old quantum theory' to his articulation of the idea of complementarity in the context of the novel mathematical formalism of quantum mechanics. I argue that Bohr was motivated not by controversial and perhaps dispensable epistemological ideas---positivism or neo-Kantianism, for example---but by his own unique perspective on the difficulties of creating a new working physics of the internal structure of the atom. Bohr's use of the correspondence principle in the old quantum theory was associated with an empirical methodology that used this principle as an epistemological bridge to connect empirical phenomena with quantum models. The application of the correspondence principle required that one determine the validity of the idealizations and approximations necessary for the judicious use of classical physics within quantum theory. Bohr's interpretation of the new quantum mechanics then focused on the largely unexamined ways in which the developing abstract mathematical formalism is given empirical content by precisely this process of approximation. Significant consistency between his later interpretive framework and his forms of argument with the correspondence principle indicates that complementarity is best understood as a relationship among the various approximations and idealizations that must be made when one connects otherwise meaningless quantum mechanical symbols to empirical situations or 'experimental arrangements' described using concepts from classical physics. We discover that this relationship is unavoidable not through any sort of a priori analysis of the priority of classical concepts, but because quantum mechanics incorporates the correspondence approach in the way in which it represents quantum properties with matrices of transition probabilities, the empirical meaning of which depends on the situation but in general is tied to the correspondence connection to the spectra. For Bohr, it is then the commutation relations, which arise from the formalism, that inform us of the complementary nature of this approximate representation of quantum properties via the classical equations through which we connect them to experiments.
NASA Astrophysics Data System (ADS)
Stephanik, Brian Michael
This dissertation describes the results of two related investigations into introductory student understanding of ideas from classical physics that are key elements of quantum mechanics. One investigation probes the extent to which students are able to interpret and apply potential energy diagrams (i.e., graphs of potential energy versus position). The other probes the extent to which students are able to reason classically about probability and spatial probability density. The results of these investigations revealed significant conceptual and reasoning difficulties that students encounter with these topics. The findings guided the design of instructional materials to address the major problems. Results from post-instructional assessments are presented that illustrate the impact of the curricula on student learning.
INFORMATION-THEORETIC INEQUALITIES ON UNIMODULAR LIE GROUPS
Chirikjian, Gregory S.
2010-01-01
Classical inequalities used in information theory such as those of de Bruijn, Fisher, Cramér, Rao, and Kullback carry over in a natural way from Euclidean space to unimodular Lie groups. These are groups that possess an integration measure that is simultaneously invariant under left and right shifts. All commutative groups are unimodular. And even in noncommutative cases unimodular Lie groups share many of the useful features of Euclidean space. The rotation and Euclidean motion groups, which are perhaps the most relevant Lie groups to problems in geometric mechanics, are unimodular, as are the unitary groups that play important roles in quantum computing. The extension of core information theoretic inequalities defined in the setting of Euclidean space to this broad class of Lie groups is potentially relevant to a number of problems relating to information gathering in mobile robotics, satellite attitude control, tomographic image reconstruction, biomolecular structure determination, and quantum information theory. In this paper, several definitions are extended from the Euclidean setting to that of Lie groups (including entropy and the Fisher information matrix), and inequalities analogous to those in classical information theory are derived and stated in the form of fifteen small theorems. In all such inequalities, addition of random variables is replaced with the group product, and the appropriate generalization of convolution of probability densities is employed. An example from the field of robotics demonstrates how several of these results can be applied to quantify the amount of information gained by pooling different sensory inputs. PMID:21113416
Dissociation rate of bromine diatomics in an argon heat bath
NASA Technical Reports Server (NTRS)
Razner, R.; Hopkins, D.
1973-01-01
The evolution of a collection of 300 K bromine diatomics embedded in a heat bath of argon atoms at 1800 K was studied by computer, and a dissociation-rate constant for the reaction Br2 + Ar yields Br + Br + Ar was determined. Previously published probability distributions for energy and angular momentum transfers in classical three-dimensional Br2-Ar collisions were used in conjunction with a newly developed Monte Carlo scheme for this purpose. Results are compared with experimental shock-tube data and the predictions of several other theoretical models. A departure from equilibrium is obtained which is significantly greater than that predicted by any of these other theories.
Practical results from a mathematical analysis of guard patrols
DOE Office of Scientific and Technical Information (OSTI.GOV)
Indusi, Joseph P.
1978-12-01
Using guard patrols as a primary detection mechanism is not generally viewed as a highly efficient detection method when compared to electronic means. Many factors such as visibility, alertness, and the space-time coincidence of guard and adversary presence all have an effect on the probability of detection. Mathematical analysis of the guard patrol detection problem is related to that of classical search theory originally developed for naval search operations. The results of this analysis tend to support the current practice of using guard forces to assess and respond to previously detected intrusions and not as the primary detection mechanism. 6 refs.
NASA Technical Reports Server (NTRS)
Smith, Jeffrey H.
2006-01-01
The need for sufficient quantities of oxygen, water, and fuel resources to support a crew on the surface of Mars presents a critical logistical issue of whether to transport such resources from Earth or manufacture them on Mars. An approach based on the classical Wildcat Drilling Problem of Bayesian decision theory was applied to the problem of finding water in order to compute the expected value of precursor mission sample information. An implicit (required) probability of finding water on Mars was derived from the value of sample information using the expected mass savings of alternative precursor missions.
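The value-of-sample-information calculation at the heart of this approach is compact. A sketch for a binary "usable water exists" state observed through an imperfect precursor measurement; all probabilities and payoffs below are illustrative assumptions, not the study's numbers:

```python
# Bayesian expected value of sample information (EVSI) for a binary
# "water is extractable" decision, in the spirit of the Wildcat
# Drilling Problem. All numbers are illustrative assumptions.
p_water = 0.3      # prior probability that usable water exists on site
gain = 100.0       # net mass savings if we rely on local water and win
loss = -100.0      # net penalty if we rely on local water and lose
ship = 0.0         # baseline alternative: ship everything from Earth

def best_ev(p):
    # choose the better of "rely on local water" vs "ship from Earth"
    return max(ship, p * gain + (1.0 - p) * loss)

# Hypothetical precursor measurement with assumed error rates.
sens, spec = 0.9, 0.8
p_pos = sens * p_water + (1.0 - spec) * (1.0 - p_water)
post_pos = sens * p_water / p_pos
post_neg = (1.0 - sens) * p_water / (1.0 - p_pos)

evsi = (p_pos * best_ev(post_pos)
        + (1.0 - p_pos) * best_ev(post_neg)
        - best_ev(p_water))
print(f"EVSI = {evsi:.1f} (same units as the mass savings)")
```

A positive EVSI justifies flying the precursor mission only when it exceeds that mission's own cost, which is how alternative precursors can be ranked.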
Life-history differences favor evolution of male dimorphism in competitive games.
Smallegange, Isabel M; Johansson, Jacob
2014-02-01
Many species exhibit two discrete male morphs: fighters and sneakers. Fighters are large and possess weapons but may mature slowly. Sneakers are small and have no weapons but can sneak matings and may mature quickly to start mating earlier in life than fighters. However, how differences in competitive ability and life history interact to determine male morph coexistence has not yet been investigated within a single framework. Here we integrate demography and game theory into a two-sex population model to study the evolution of strategies that result in the coexistence of fighters and sneakers. We incorporate differences in maturation time between the morphs and use a mating-probability matrix analogous to the classic hawk-dove game. Using adaptive dynamics, we show that male dimorphism evolves more easily in our model than in classic game theory approaches. Our results also revealed an interaction between life-history differences and sneaker competitiveness, which shows that demography and competitive games should be treated as interlinked mechanisms to understand the evolution of male dimorphism. Applying our approach to empirical data on bulb mites (Rhizoglyphus robini), coho salmon (Oncorhynchus kisutch), and bullhorned dung beetles (Onthophagus taurus) indicates that observed occurrences of male dimorphism are in general agreement with model predictions.
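For readers unfamiliar with the underlying game, the hawk-dove equilibrium that the mating-probability matrix generalizes can be reproduced in a few lines; V and C below are illustrative, not the paper's fitted parameters:

```python
import numpy as np

# Replicator dynamics for a hawk-dove-style payoff matrix, the classic
# template behind the fighter/sneaker mating-probability matrix.
# With cost C > value V, the mixed equilibrium has hawk frequency V/C.
V, C = 2.0, 5.0
A = np.array([[(V - C) / 2.0, V],
              [0.0,           V / 2.0]])  # row/col order: hawk, dove

x = np.array([0.1, 0.9])  # initial hawk/dove frequencies
dt = 0.01
for _ in range(50_000):
    f = A @ x                      # payoff of each strategy
    x = x + dt * x * (f - x @ f)   # continuous-time replicator step
print(x)  # approaches [V/C, 1 - V/C] = [0.4, 0.6]
```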
The Prediction of Item Parameters Based on Classical Test Theory and Latent Trait Theory
ERIC Educational Resources Information Center
Anil, Duygu
2008-01-01
In this study, the predictive power of experts' judgments of item characteristics, for conditions in which try-out practices cannot be applied, was examined against item characteristics computed from classical test theory and from the two-parameter logistic model of latent trait theory. The study was carried out on 9914 randomly selected students…
Any Ontological Model of the Single Qubit Stabilizer Formalism must be Contextual
NASA Astrophysics Data System (ADS)
Lillystone, Piers; Wallman, Joel J.
Quantum computers allow us to easily solve some problems classical computers find hard. Non-classical improvements in computational power should be due to some non-classical property of quantum theory. Contextuality, a more general notion of non-locality, is a necessary, but not sufficient, resource for quantum speed-up. Proofs of contextuality can be constructed for the classically simulable stabilizer formalism. Previous proofs of stabilizer contextuality are known for 2 or more qubits, for example the Mermin-Peres magic square. In the work presented we extend these results and prove that any ontological model of the single qubit stabilizer theory must be contextual, as defined by R. Spekkens, and give a relation between our result and the Mermin-Peres square. By demonstrating that contextuality is present in the qubit stabilizer formalism we provide further insight into the contextuality present in quantum theory. Understanding the contextuality of classical sub-theories will allow us to better identify the physical properties of quantum theory required for computational speed up. This research was supported by CIFAR, the Government of Ontario, and the Government of Canada through NSERC and Industry Canada.
NASA Astrophysics Data System (ADS)
Oblow, E. M.
1982-10-01
An evaluation was made of the mathematical and economic basis for conversion processes in the Long-term Energy Analysis Program (LEAP) energy economy model. Conversion processes are the main modeling subunit in LEAP used to represent energy conversion industries and are supposedly based on the classical economic theory of the firm. Questions about uniqueness and existence of LEAP solutions and their relation to classical equilibrium economic theory prompted the study. An analysis of classical theory and LEAP model equations was made to determine their exact relationship. The conclusions drawn from this analysis were that LEAP theory is not consistent with the classical theory of the firm. Specifically, the capacity factor formalism used by LEAP does not support a classical interpretation in terms of a technological production function for energy conversion processes. The economic implications of this inconsistency are suboptimal process operation and short term negative profits in years where plant operation should be terminated. A new capacity factor formalism, which retains the behavioral features of the original model, is proposed to resolve these discrepancies.
ERIC Educational Resources Information Center
Lange, Elizabeth
2015-01-01
This article argues that sociology has been a foundational discipline for the field of adult education, but it has been largely implicit, until recently. This article contextualizes classical theories of sociology within contemporary critiques, reviews the historical roots of sociology and then briefly introduces the classical theories…
A fast elitism Gaussian estimation of distribution algorithm and application for PID optimization.
Xu, Qingyang; Zhang, Chengjin; Zhang, Li
2014-01-01
Estimation of distribution algorithm (EDA) is an intelligent optimization algorithm based on probability statistics theory. A fast elitism Gaussian estimation of distribution algorithm (FEGEDA) is proposed in this paper. A Gaussian probability model is used to model the solution distribution. The parameters of the Gaussian come from the statistical information of the best individuals, obtained by a fast learning rule. The fast learning rule is used to enhance the efficiency of the algorithm, and an elitism strategy is used to maintain convergence performance. The performance of the algorithm is examined on several benchmarks. In the simulations, a one-dimensional benchmark is used to visualize the optimization process and the probability-model learning process during the evolution, and several two-dimensional and higher-dimensional benchmarks are used to test the performance of FEGEDA. The experimental results indicate the capability of FEGEDA, especially on the higher-dimensional problems, where FEGEDA exhibits better performance than some other algorithms and EDAs. Finally, FEGEDA is used for PID controller optimization of a PMSM and compared with classical PID and GA.
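A generic sketch of a Gaussian EDA with truncation selection and elitism, in the spirit of FEGEDA but not the authors' exact learning rule, minimizing the sphere benchmark:

```python
import numpy as np

rng = np.random.default_rng(42)

# Minimal Gaussian EDA: each generation fits a Gaussian to the best
# individuals and samples the next population from it; the best-so-far
# individual is kept in the selection pool (elitism).
def sphere(x):
    return np.sum(x * x, axis=-1)

dim, pop, n_elite = 10, 100, 20
mean, cov = np.zeros(dim), 4.0 * np.eye(dim)
best_x, best_f = None, np.inf
for gen in range(150):
    X = rng.multivariate_normal(mean, cov, size=pop)
    f = sphere(X)
    i = int(np.argmin(f))
    if f[i] < best_f:
        best_x, best_f = X[i].copy(), float(f[i])
    sel = X[np.argsort(f)[:n_elite]]   # truncation selection
    sel[-1] = best_x                   # elitism: keep best-so-far
    mean = sel.mean(axis=0)
    cov = np.cov(sel, rowvar=False) + 1e-8 * np.eye(dim)
print(best_f)
```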
Stott, Clifford; Drury, John
2016-04-01
This article explores the origins and ideology of classical crowd psychology, a body of theory reflected in contemporary popularised understandings such as those surrounding the 2011 English 'riots'. This article argues that during the nineteenth century, the crowd came to symbolise a fear of 'mass society' and that 'classical' crowd psychology was a product of these fears. Classical crowd psychology pathologised, reified and decontextualised the crowd, offering the ruling elites a perceived opportunity to control it. We contend that classical theory misrepresents crowd psychology and survives in contemporary understanding because it is ideological. We conclude by discussing how classical theory has been supplanted in academic contexts by an identity-based crowd psychology that restores the meaning to crowd action, replaces it in its social context and in so doing transforms theoretical understanding of 'riots' and the nature of the self. © The Author(s) 2016.
Influence of an asymmetric ring on the modeling of an orthogonally stiffened cylindrical shell
NASA Technical Reports Server (NTRS)
Rastogi, Naveen; Johnson, Eric R.
1994-01-01
Structural models are examined for the influence of a ring with an asymmetrical cross section on the linear elastic response of an orthogonally stiffened cylindrical shell subjected to internal pressure. The first structural model employs classical theory for the shell and stiffeners. The second model employs transverse shear deformation theories for the shell and stringer and classical theory for the ring. Closed-end pressure vessel effects are included. Interacting line load intensities are computed in the stiffener-to-skin joints for an example problem having the dimensions of the fuselage of a large transport aircraft. Classical structural theory is found to exaggerate the asymmetric response compared to the transverse shear deformation theory.
Self-Consistent Field Lattice Model for Polymer Networks.
Tito, Nicholas B; Storm, Cornelis; Ellenbroek, Wouter G
2017-12-26
A lattice model based on polymer self-consistent field theory is developed to predict the equilibrium statistics of arbitrary polymer networks. For a given network topology, our approach uses moment propagators on a lattice to self-consistently construct the ensemble of polymer conformations and cross-link spatial probability distributions. Remarkably, the calculation can be performed "in the dark", without any prior knowledge on preferred chain conformations or cross-link positions. Numerical results from the model for a test network exhibit close agreement with molecular dynamics simulations, including when the network is strongly sheared. Our model captures nonaffine deformation, mean-field monomer interactions, cross-link fluctuations, and finite extensibility of chains, yielding predictions that differ markedly from classical rubber elasticity theory for polymer networks. By examining polymer networks with different degrees of interconnectivity, we gain insight into cross-link entropy, an important quantity in the macroscopic behavior of gels and self-healing materials as they are deformed.
Nonequilibrium Entropy in a Shock
Margolin, Len G.
2017-07-19
In a classic paper, Morduchow and Libby use an analytic solution for the profile of a Navier–Stokes shock to show that the equilibrium thermodynamic entropy has a maximum inside the shock. There is no general nonequilibrium thermodynamic formulation of entropy; the extension of equilibrium theory to nonequilibrium processes is usually made through the assumption of local thermodynamic equilibrium (LTE). However, gas kinetic theory provides a perfectly general formulation of a nonequilibrium entropy in terms of the probability distribution function (PDF) solutions of the Boltzmann equation. In this paper I will evaluate the Boltzmann entropy for the PDF that underlies the Navier–Stokes equations and also for the PDF of the Mott–Smith shock solution. I will show that both monotonically increase in the shock. As a result, I will propose a new nonequilibrium thermodynamic entropy and show that it is also monotone and closely approximates the Boltzmann entropy.
Point-to-point connectivity prediction in porous media using percolation theory
NASA Astrophysics Data System (ADS)
Tavagh-Mohammadi, Behnam; Masihi, Mohsen; Ganjeh-Ghazvini, Mostafa
2016-10-01
The connectivity between two points in porous media is important for evaluating hydrocarbon recovery in underground reservoirs or toxic migration in waste disposal. For example, the connectivity between a producer and an injector in a hydrocarbon reservoir impacts the fluid dispersion throughout the system. The conventional approach, flow simulation, is computationally very expensive and time consuming. An alternative method employs percolation theory. The classical percolation approach investigates the connectivity between two lines (representing the wells) in 2D cross-sectional models, whereas we look for the connectivity between two points (representing the wells) in 2D aerial models. In this study, site percolation is used to determine the fraction of permeable regions connected between two cells at various occupancy probabilities and system sizes. The master curves of mean connectivity and its uncertainty are then generated by finite-size scaling. The results help to predict well-to-well connectivity without the need for any further simulation.
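A minimal version of the point-to-point experiment is easy to set up: occupy sites with probability p, label clusters, and ask how often the two well cells fall in the same cluster. A sketch in which the grid size, well positions, and p values are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(7)

# Site percolation on an L x L grid: estimate the probability that two
# given cells (the "wells") belong to the same occupied cluster.
def connected(L, p, a, b, trials=500):
    hits = 0
    for _ in range(trials):
        grid = rng.random((L, L)) < p
        labels, _ = label(grid)   # 4-connected cluster labels
        if grid[a] and grid[b] and labels[a] == labels[b]:
            hits += 1
    return hits / trials

L = 50
wells = ((5, 5), (44, 44))
for p in [0.4, 0.55, 0.6, 0.7]:
    print(p, connected(L, p, *wells))
```

Near and above the 2D site-percolation threshold of the square lattice (roughly p ≈ 0.593) the estimated connection probability rises sharply, which is the behavior the master curves summarize.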
Real time forecasting of near-future evolution.
Gerrish, Philip J; Sniegowski, Paul D
2012-09-07
A metaphor for adaptation that informs much evolutionary thinking today is that of mountain climbing, where horizontal displacement represents change in genotype, and vertical displacement represents change in fitness. If it were known a priori what the 'fitness landscape' looked like, that is, how the myriad possible genotypes mapped onto fitness, then the possible paths up the fitness mountain could each be assigned a probability, thus providing a dynamical theory with long-term predictive power. Such detailed genotype-fitness data, however, are rarely available and are subject to change with each change in the organism or in the environment. Here, we take a very different approach that depends only on fitness or phenotype-fitness data obtained in real time and requires no a priori information about the fitness landscape. Our general statistical model of adaptive evolution builds on classical theory and gives reasonable predictions of fitness and phenotype evolution many generations into the future.
NASA Technical Reports Server (NTRS)
Cheeseman, Peter; Stutz, John
2005-01-01
A long-standing mystery in using Maximum Entropy (MaxEnt) is how to deal with constraints whose values are uncertain. This situation arises when constraint values are estimated from data, because of finite sample sizes. One approach to this problem, advocated by E.T. Jaynes [1], is to ignore this uncertainty and treat the empirically observed values as exact. We refer to this as the classic MaxEnt approach. Classic MaxEnt gives point probabilities (subject to the given constraints), rather than probability densities. We develop an alternative approach that assumes that the uncertain constraint values are represented by a probability density (e.g., a Gaussian), and this uncertainty yields a MaxEnt posterior probability density. That is, the classic MaxEnt point probabilities are regarded as a multidimensional function of the given constraint values, and uncertainty on these values is transmitted through the MaxEnt function to give uncertainty over the MaxEnt probabilities. We illustrate this approach by explicitly calculating the generalized MaxEnt density for a simple but common case, then show how this can be extended numerically to the general case. This paper expands the generalized MaxEnt concept introduced in a previous paper [3].
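The two-step logic, solving classic MaxEnt as a function of the constraint value and then propagating the constraint's uncertainty through that map, can be sketched on the textbook die example; the Gaussian width and the die setting are illustrative assumptions, not the paper's worked case:

```python
import numpy as np
from scipy.optimize import brentq

# Classic MaxEnt for a die with a mean-value constraint, then the
# generalized step: treat the observed mean as uncertain (Gaussian)
# and push that uncertainty through the MaxEnt map, giving a
# distribution over MaxEnt probabilities instead of a point answer.
faces = np.arange(1, 7)

def maxent_probs(mean):
    # p_i proportional to exp(-lam * i); solve lam to match the mean
    def gap(lam):
        w = np.exp(-lam * faces)
        return (faces @ w) / w.sum() - mean
    lam = brentq(gap, -10.0, 10.0)
    w = np.exp(-lam * faces)
    return w / w.sum()

rng = np.random.default_rng(0)
means = rng.normal(4.5, 0.1, size=1000)  # uncertain constraint value
samples = np.array([maxent_probs(m) for m in means])
print(maxent_probs(4.5))                            # classic point answer
print(samples.mean(axis=0), samples.std(axis=0))    # induced uncertainty
```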
Patel, Nitin R; Ankolekar, Suresh
2007-11-30
Classical approaches to clinical trial design ignore economic factors that determine economic viability of a new drug. We address the choice of sample size in Phase III trials as a decision theory problem using a hybrid approach that takes a Bayesian view from the perspective of a drug company and a classical Neyman-Pearson view from the perspective of regulatory authorities. We incorporate relevant economic factors in the analysis to determine the optimal sample size to maximize the expected profit for the company. We extend the analysis to account for risk by using a 'satisficing' objective function that maximizes the chance of meeting a management-specified target level of profit. We extend the models for single drugs to a portfolio of clinical trials and optimize the sample sizes to maximize the expected profit subject to budget constraints. Further, we address the portfolio risk and optimize the sample sizes to maximize the probability of achieving a given target of expected profit.
ERIC Educational Resources Information Center
Magno, Carlo
2009-01-01
The present report demonstrates the difference between classical test theory (CTT) and item response theory (IRT) approaches using actual test data for junior high school chemistry students. The CTT and IRT were compared across two samples and two forms of test on their item difficulty, internal consistency, and measurement errors. The specific…
ERIC Educational Resources Information Center
Guler, Nese; Gelbal, Selahattin
2010-01-01
In this study, classical test theory and generalizability theory were used to determine the reliability of scores obtained from a measurement tool of mathematics success. The 24 open-ended mathematics questions of TIMSS-1999 were administered to 203 students in the 2007 spring semester. The internal consistency of scores was found to be 0.92. For…
ERIC Educational Resources Information Center
Kohli, Nidhi; Koran, Jennifer; Henn, Lisa
2015-01-01
There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…
Cappelleri, Joseph C; Jason Lundy, J; Hays, Ron D
2014-05-01
The US Food and Drug Administration's guidance for industry document on patient-reported outcomes (PRO) defines content validity as "the extent to which the instrument measures the concept of interest" (FDA, 2009, p. 12). According to Strauss and Smith (2009), construct validity "is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity" (p. 7). Hence, both qualitative and quantitative information are essential in evaluating the validity of measures. We review classical test theory and item response theory (IRT) approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized "difficulty" (severity) order of items is represented by observed responses. If a researcher has few qualitative data and wants to get preliminary information about the content validity of the instrument, then descriptive assessments using classical test theory should be the first step. As the sample size grows during subsequent stages of instrument development, confidence in the numerical estimates from Rasch and other IRT models (as well as those of classical test theory) would also grow. Classical test theory and IRT can be useful in providing a quantitative assessment of items and scales during the content-validity phase of PRO-measure development. Depending on the particular type of measure and the specific circumstances, the classical test theory and/or the IRT should be considered to help maximize the content validity of PRO measures. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.
Constrained variational calculus for higher order classical field theories
NASA Astrophysics Data System (ADS)
Campos, Cédric M.; de León, Manuel; Martín de Diego, David
2010-11-01
We develop an intrinsic geometrical setting for higher order constrained field theories. As a main tool we use an appropriate generalization of the classical Skinner-Rusk formalism. Some examples of applications are studied, in particular to the geometrical description of optimal control theory for partial differential equations.
Using harmonic oscillators to determine the spot size of Hermite-Gaussian laser beams
NASA Technical Reports Server (NTRS)
Steely, Sidney L.
1993-01-01
The similarity of the functional forms of quantum mechanical harmonic oscillators and the modes of Hermite-Gaussian laser beams is illustrated. This functional similarity provides a direct correlation to investigate the spot size of large-order mode Hermite-Gaussian laser beams. The classical limits of a corresponding two-dimensional harmonic oscillator provide a definition of the spot size of Hermite-Gaussian laser beams. The classical limits of the harmonic oscillator provide integration limits for the photon probability densities of the laser beam modes to determine the fraction of photons detected therein. Mathematica is used to integrate the probability densities for large-order beam modes and to illustrate the functional similarities. The probabilities of detecting photons within the classical limits of Hermite-Gaussian laser beams asymptotically approach unity in the limit of large-order modes, in agreement with the Correspondence Principle. The classical limits for large-order modes include all of the nodes of Hermite-Gaussian laser beams; Sturm's theorem provides a direct proof.
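The same check is easy to reproduce outside Mathematica. A sketch in Python that integrates the unnormalized oscillator probability density |ψ_n|² inside the classical turning points x_c = √(2n + 1), dimensionless coordinates assumed:

```python
import numpy as np
from scipy.special import eval_hermite

# Fraction of a Hermite-Gaussian mode's probability lying inside the
# classical turning points x_c = sqrt(2n + 1) of the corresponding
# harmonic oscillator (dimensionless coordinate). The fraction should
# approach unity for large n, per the Correspondence Principle.
def fraction_inside(n, grid=40_001):
    xc = np.sqrt(2.0 * n + 1.0)
    x = np.linspace(-2.0 * xc, 2.0 * xc, grid)
    dx = x[1] - x[0]
    amp = eval_hermite(n, x) * np.exp(-0.5 * x * x)  # unnormalized psi_n
    psi2 = amp * amp
    total = psi2.sum() * dx
    inside = psi2[np.abs(x) <= xc].sum() * dx
    return inside / total

for n in [0, 1, 5, 20, 100]:
    print(n, round(fraction_inside(n), 4))
```

The n = 0 value reproduces the familiar Gaussian result of about 0.68, and the fraction climbs toward 1 as n grows.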
Quantum illumination for enhanced detection of Rayleigh-fading targets
NASA Astrophysics Data System (ADS)
Zhuang, Quntao; Zhang, Zheshen; Shapiro, Jeffrey H.
2017-08-01
Quantum illumination (QI) is an entanglement-enhanced sensing system whose performance advantage over a comparable classical system survives its usage in an entanglement-breaking scenario plagued by loss and noise. In particular, QI's error-probability exponent for discriminating between equally likely hypotheses of target absence or presence is 6 dB higher than that of the optimum classical system using the same transmitted power. This performance advantage, however, presumes that the target return, when present, has known amplitude and phase, a situation that seldom occurs in light detection and ranging (lidar) applications. At lidar wavelengths, most target surfaces are sufficiently rough that their returns are speckled, i.e., they have Rayleigh-distributed amplitudes and uniformly distributed phases. QI's optical parametric amplifier receiver—which affords a 3 dB better-than-classical error-probability exponent for a return with known amplitude and phase—fails to offer any performance gain for Rayleigh-fading targets. We show that the sum-frequency generation receiver [Zhuang et al., Phys. Rev. Lett. 118, 040801 (2017), 10.1103/PhysRevLett.118.040801]—whose error-probability exponent for a nonfading target achieves QI's full 6 dB advantage over optimum classical operation—outperforms the classical system for Rayleigh-fading targets. In this case, QI's advantage is subexponential: its error probability is lower than the classical system's by a factor of 1/ln(Mκ̄N_S/N_B) when Mκ̄N_S/N_B ≫ 1, with M ≫ 1 being the QI transmitter's time-bandwidth product, N_S ≪ 1 its brightness, κ̄ the target return's average intensity, and N_B the background light's brightness.
Segregating gas from melt: an experimental study of the Ostwald ripening of vapor bubbles in magmas
Lautze, Nicole C.; Sisson, Thomas W.; Mangan, Margaret T.; Grove, Timothy L.
2011-01-01
Diffusive coarsening (Ostwald ripening) of H2O and H2O-CO2 bubbles in rhyolite and basaltic andesite melts was studied with elevated temperature–pressure experiments to investigate the rates and time spans over which vapor bubbles may enlarge and attain sufficient buoyancy to segregate in magmatic systems. Bubble growth and segregation are also considered in terms of classical steady-state and transient (non-steady-state) ripening theory. Experimental results are consistent with diffusive coarsening as the dominant mechanism of bubble growth. Ripening is faster in experiments saturated with pure H2O than in those with a CO2-rich mixed vapor, probably due to faster diffusion of H2O than CO2 through the melt. None of the experimental series followed the time^(1/3) increase in mean bubble radius and time^(-1) decrease in bubble number density predicted by classical steady-state ripening theory. Instead, products are interpreted as resulting from transient-regime ripening. Application of transient-regime theory suggests that bubbly magmas may require from days to 100 years to reach steady-state ripening conditions. Experimental results, as well as theory for steady-state ripening of bubbles that are immobile or undergoing buoyant ascent, indicate that diffusive coarsening efficiently eliminates micron-sized bubbles and would produce mm-sized bubbles in 10^2–10^4 years in crustal magma bodies. Once bubbles attain mm-sizes, their calculated ascent rates are sufficient that they could transit multiple kilometers over hundreds to thousands of years through mafic and silicic melt, respectively. These results show that diffusive coarsening can facilitate transfer of volatiles through, and from, magmatic systems by creating bubbles sufficiently large for rapid ascent.
Dressing the post-Newtonian two-body problem and classical effective field theory
NASA Astrophysics Data System (ADS)
Kol, Barak; Smolkin, Michael
2009-12-01
We apply a dressed perturbation theory to better organize and economize the computation of high orders of the 2-body effective action of an inspiralling post-Newtonian (PN) gravitating binary. We use the effective field theory approach with the nonrelativistic field decomposition (NRG fields). For that purpose we develop quite generally the dressing theory of a nonlinear classical field theory coupled to pointlike sources. We introduce dressed charges and propagators, but unlike the quantum theory there are no dressed bulk vertices. The dressed quantities are found to obey recursive integral equations which succinctly encode parts of the diagrammatic expansion, and are the classical version of the Schwinger-Dyson equations. Actually, the classical equations are somewhat stronger since they involve only finitely many quantities, unlike the quantum theory. Classical diagrams are shown to factorize exactly when they contain nonlinear worldline vertices, and we classify all the possible topologies of irreducible diagrams for low loop numbers. We apply the dressing program to our post-Newtonian case of interest. The dressed charges consist of the dressed energy-momentum tensor after a nonrelativistic decomposition, and we compute all dressed charges (in the harmonic gauge) appearing up to 2PN in the 2-body effective action (and more). We determine the irreducible skeleton diagrams up to 3PN and we employ the dressed charges to compute several terms beyond 2PN.
Born’s rule as signature of a superclassical current algebra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fussy, S.; Mesa Pascasio, J.; Institute for Atomic and Subatomic Physics, Vienna University of Technology, Operng. 9, 1040 Vienna
2014-04-15
We present a new tool for calculating the interference patterns and particle trajectories of a double-, three- and N-slit system on the basis of an emergent sub-quantum theory developed by our group throughout the last years. The quantum itself is considered as an emergent system representing an off-equilibrium steady state oscillation maintained by a constant throughput of energy provided by a classical zero-point energy field. We introduce the concept of a “relational causality” which allows for evaluating structural interdependences of different systems levels, i.e. in our case of the relations between partial and total probability density currents, respectively. Combined with the application of 21st century classical physics like, e.g., modern nonequilibrium thermodynamics, we thus arrive at a “superclassical” theory. Within this framework, the proposed current algebra directly leads to a new formulation of the guiding equation which is equivalent to the original one of the de Broglie–Bohm theory. By proving the absence of third order interferences in three-path systems it is shown that Born’s rule is a natural consequence of our theory. Considering the series of one-, double-, or, generally, of N-slit systems, with the first appearance of an interference term in the double slit case, we can explain the violation of Sorkin’s first order sum rule, just as the validity of all higher order sum rules. Moreover, the Talbot patterns and Talbot distance for an arbitrary N-slit device can be reproduced exactly by our model without any quantum physics tool. Highlights: • Calculating the interference patterns and particle trajectories of a double-, three- and N-slit system. • Deriving a new formulation of the guiding equation equivalent to the de Broglie–Bohm one. • Proving the absence of third order interferences and thus explaining Born’s rule. • Explaining the violation of Sorkin’s order sum rules. • Classical simulation of Talbot patterns and exact reproduction of Talbot distance for N slits.
Using Playing Cards to Differentiate Probability Interpretations
ERIC Educational Resources Information Center
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
Bosonic Loop Diagrams as Perturbative Solutions of the Classical Field Equations in ϕ⁴-Theory
NASA Astrophysics Data System (ADS)
Finster, Felix; Tolksdorf, Jürgen
2012-05-01
Solutions of the classical ϕ⁴-theory in Minkowski space-time are analyzed in a perturbation expansion in the nonlinearity. Using the language of Feynman diagrams, the solution of the Cauchy problem is expressed in terms of tree diagrams which involve the retarded Green's function and have one outgoing leg. In order to obtain general tree diagrams, we set up a "classical measurement process" in which a virtual observer of a scattering experiment modifies the field and detects suitable energy differences. By adding a classical stochastic background field, we even obtain all loop diagrams. The expansions are compared with the standard Feynman diagrams of the corresponding quantum field theory.
How Settings Change People: Applying Behavior Setting Theory to Consumer-Run Organizations
ERIC Educational Resources Information Center
Brown, Louis D.; Shepherd, Matthew D.; Wituk, Scott A.; Meissen, Greg
2007-01-01
Self-help initiatives stand as a classic context for organizational studies in community psychology. Behavior setting theory stands as a classic conception of organizations and the environment. This study explores both, applying behavior setting theory to consumer-run organizations (CROs). Analysis of multiple data sets from all CROs in Kansas…
ERIC Educational Resources Information Center
Langdale, John A.
The construct of "organizational climate" was explicated and various ways of operationalizing it were reviewed. A survey was made of the literature pertinent to the classical-human relations dimension of environmental quality. As a result, it was hypothesized that the appropriateness of the classical and human-relations master plans is moderated…
The evolving Planck mass in classically scale-invariant theories
NASA Astrophysics Data System (ADS)
Kannike, K.; Raidal, M.; Spethmann, C.; Veermäe, H.
2017-04-01
We consider classically scale-invariant theories with non-minimally coupled scalar fields, where the Planck mass and the hierarchy of physical scales are dynamically generated. The classical theories possess a fixed point, where scale invariance is spontaneously broken. In these theories, however, the Planck mass becomes unstable in the presence of explicit sources of scale invariance breaking, such as non-relativistic matter and cosmological constant terms. We quantify the constraints on such classical models from Big Bang Nucleosynthesis that lead to an upper bound on the non-minimal coupling and require trans-Planckian field values. We show that quantum corrections to the scalar potential can stabilise the fixed point close to the minimum of the Coleman-Weinberg potential. The time-averaged motion of the evolving fixed point is strongly suppressed, thus the limits on the evolving gravitational constant from Big Bang Nucleosynthesis and other measurements do not presently constrain this class of theories. Field oscillations around the fixed point, if not damped, contribute to the dark matter density of the Universe.
DOE R&D Accomplishments Database
Weinberg, Alvin M.; Noderer, L. C.
1951-05-15
The large scale release of nuclear energy in a uranium fission chain reaction involves two essentially distinct physical phenomena. On the one hand there are the individual nuclear processes such as fission, neutron capture, and neutron scattering. These are essentially quantum mechanical in character, and their theory is non-classical. On the other hand, there is the process of diffusion -- in particular, diffusion of neutrons, which is of fundamental importance in a nuclear chain reaction. This process is classical; insofar as the theory of the nuclear chain reaction depends on the theory of neutron diffusion, the mathematical study of chain reactions is an application of classical, not quantum mechanical, techniques.
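For reference, the classical diffusion process the passage contrasts with the quantum-mechanical nuclear events is, in its simplest one-group form, governed by the standard neutron diffusion equation; this is the textbook statement, assumed here rather than quoted from the report:

```latex
% One-group neutron diffusion equation (standard textbook form; an
% assumption, not an equation quoted from the report). phi: neutron
% flux, v: neutron speed, D: diffusion coefficient, Sigma_a: absorption
% cross section, S: source term.
\frac{1}{v}\frac{\partial \phi}{\partial t}
  = D\,\nabla^{2}\phi \;-\; \Sigma_{a}\,\phi \;+\; S
```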
Using Bayes' theorem for free energy calculations
NASA Astrophysics Data System (ADS)
Rogers, David M.
Statistical mechanics is fundamentally based on calculating the probabilities of molecular-scale events. Although Bayes' theorem has generally been recognized as providing key guiding principles for the setup and analysis of statistical experiments [83], classical frequentist models still predominate in the world of computational experimentation. As a starting point for widespread application of Bayesian methods in statistical mechanics, we investigate the central quantity of free energies from this perspective. This dissertation thus reviews the basics of Bayes' view of probability theory and the maximum entropy formulation of statistical mechanics before providing examples of its application to several advanced research areas. We first apply Bayes' theorem to a multinomial counting problem in order to determine inner-shell and hard-sphere solvation free energy components of Quasi-Chemical Theory [140]. We proceed to consider the general problem of free energy calculations from samples of interaction energy distributions. From there, we turn to spline-based estimation of the potential of mean force [142], and empirical modeling of observed dynamics using integrator matching. The results of this research are expected to advance the state of the art in coarse-graining methods, as they allow a systematic connection from high-resolution (atomic) to low-resolution (coarse) structure and dynamics. In total, our work on these problems constitutes a critical starting point for further application of Bayes' theorem in all areas of statistical mechanics. It is hoped that the understanding so gained will allow for improvements in comparisons between theory and experiment.
Classical conformality in the Standard Model from Coleman’s theory
NASA Astrophysics Data System (ADS)
Kawana, Kiyoharu
2016-09-01
The classical conformality (CC) is one of the possible candidates for explaining the gauge hierarchy of the Standard Model (SM). We show that it is naturally obtained from Coleman's theory of baby universes.
A post-classical theory of enamel biomineralization… and why we need one.
Simmer, James P; Richardson, Amelia S; Hu, Yuan-Yuan; Smith, Charles E; Ching-Chun Hu, Jan
2012-09-01
Enamel crystals are unique in shape, orientation and organization. They are hundreds of thousands of times longer than they are wide, run parallel to each other, are oriented with respect to the ameloblast membrane at the mineralization front and are organized into rod or interrod enamel. The classical theory of amelogenesis postulates that extracellular matrix proteins shape crystallites by specifically inhibiting ion deposition on the crystal sides, orient them by binding multiple crystallites and establish higher levels of crystal organization. Elements of the classical theory are supported in principle by in vitro studies; however, the classical theory does not explain how enamel forms in vivo. In this review, we describe how amelogenesis is highly integrated with ameloblast cell activities and how the shape, orientation and organization of enamel mineral ribbons are established by a mineralization front apparatus along the secretory surface of the ameloblast cell membrane.
Successive phase transitions and kink solutions in Φ⁸, Φ¹⁰, and Φ¹² field theories
Khare, Avinash; Christov, Ivan C.; Saxena, Avadh
2014-08-27
We obtain exact solutions for kinks in Φ⁸, Φ¹⁰, and Φ¹² field theories with degenerate minima, which can describe a second-order phase transition followed by a first-order one, a succession of two first-order phase transitions and a second-order phase transition followed by two first-order phase transitions, respectively. Such phase transitions are known to occur in ferroelastic and ferroelectric crystals and in meson physics. In particular, we find that the higher-order field theories have kink solutions with algebraically-decaying tails and also asymmetric cases with mixed exponential-algebraic tail decay, unlike the lower-order Φ⁴ and Φ⁶ theories. Additionally, we construct distinct kinks with equal energies in all three field theories considered, and we show the co-existence of up to three distinct kinks (for a Φ¹² potential with six degenerate minima). We also summarize phonon dispersion relations for these systems, showing that the higher-order field theories have specific cases in which only nonlinear phonons are allowed. For the Φ¹⁰ field theory, which is a quasi-exactly solvable (QES) model akin to Φ⁶, we are also able to obtain three analytical solutions for the classical free energy as well as the probability distribution function in the thermodynamic limit.
Not all (possibly) “random” sequences are created equal
Pincus, Steve; Kalman, Rudolf E.
1997-01-01
The need to assess the randomness of a single sequence, especially a finite sequence, is ubiquitous, yet is unaddressed by axiomatic probability theory. Here, we assess randomness via approximate entropy (ApEn), a computable measure of sequential irregularity, applicable to single sequences of both (even very short) finite and infinite length. We indicate the novelty and facility of the multidimensional viewpoint taken by ApEn, in contrast to classical measures. Furthermore and notably, for finite length, finite state sequences, one can identify maximally irregular sequences, and then apply ApEn to quantify the extent to which given sequences differ from maximal irregularity, via a set of deficit (def_m) functions. The utility of these def_m functions, which we show allows one to considerably refine the notions of probabilistic independence and normality, is featured in several studies, including (i) digits of e, π, √2, and √3, both in base 2 and in base 10, and (ii) sequences given by fractional parts of multiples of irrationals. We prove companion analytic results, which also feature in a discussion of the role and validity of the almost sure properties from axiomatic probability theory insofar as they apply to specified sequences and sets of sequences (in the physical world). We conclude by relating the present results and perspective to both previous and subsequent studies. PMID:11038612
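A compact implementation of ApEn in its standard form, the Pincus definition with template length m and tolerance r; the parameter values below are conventional choices, not those used in the paper's studies:

```python
import numpy as np

# Approximate entropy ApEn(m, r): the negative average log conditional
# probability that templates of length m that match within tolerance r
# (Chebyshev distance) still match at length m + 1.
def apen(u, m=2, r=0.2):
    u = np.asarray(u, dtype=float)
    N = len(u)
    def phi(k):
        x = np.array([u[i:i + k] for i in range(N - k + 1)])
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        C = np.mean(d <= r, axis=1)   # includes the self-match, so C > 0
        return np.mean(np.log(C))
    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
print(apen(rng.integers(0, 2, 300)))   # irregular binary: near ln 2
print(apen(np.tile([0, 1], 150)))      # perfectly regular: near 0
```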
Quantum probabilistic logic programming
NASA Astrophysics Data System (ADS)
Balu, Radhakrishnan
2015-05-01
We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite valued, to express probability distributions and statistical correlations, a powerful feature to capture relationship between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case H-interpretations form the sample space and probability measures defined on them lead to consistent definition of probabilities for well formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations facilitating the model generations and verifications via quantum mechanical superpositions and entanglements. We cast the well formed formulae of the language as quantum mechanical observables thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason with situations involving uncertainty.
NASA Astrophysics Data System (ADS)
Vitanov, Nikolay V.
2018-05-01
In the experimental determination of the population transfer efficiency between discrete states of a coherently driven quantum system it is often inconvenient to measure the population of the target state. Instead, after the interaction that transfers the population from the initial state to the target state, a second interaction is applied which brings the system back to the initial state, the population of which is easy to measure and normalize. If the transition probability is p in the forward process, then classical intuition suggests that the probability to return to the initial state after the backward process should be p². However, this classical expectation is generally misleading because it neglects interference effects. This paper presents a rigorous theoretical analysis based on the SU(2) and SU(3) symmetries of the propagators describing the evolution of quantum systems with two and three states, resulting in explicit analytic formulas that link the two-step probabilities to the single-step ones. Explicit examples are given with the popular techniques of rapid adiabatic passage and stimulated Raman adiabatic passage. The present results suggest that quantum-mechanical probabilities degrade faster in repeated processes than classical probabilities. Therefore, the actual single-pass efficiencies in various experiments, calculated from double-pass probabilities, might have been greater than the reported values.
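A minimal two-state illustration of the interference effect described here, using a real rotation as a stand-in for a generic resonant propagator; this is a toy model, not the paper's SU(2)/SU(3) analysis or its adiabatic-passage examples:

```python
import numpy as np

# Two-state illustration of why double-pass probabilities are not the
# squares of single-pass ones: amplitudes compose, not probabilities.
def U(theta):
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

theta = 0.6
p = np.sin(theta) ** 2                            # single-pass transition
quantum = abs((U(theta) @ U(theta))[0, 0]) ** 2   # coherent forward+back
naive = p ** 2                                    # "transfer there and back"
incoherent = p ** 2 + (1 - p) ** 2                # classical two-step return
print(p, quantum, naive, incoherent)
```

For this toy rotation the coherent return probability works out to (1 − 2p)², visibly different from both classical compositions.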
Phase selection during crystallization of undercooled liquid eutectic lead-tin alloys
NASA Technical Reports Server (NTRS)
Fecht, H. J.
1991-01-01
During rapid solidification, substantial amounts of undercooling are in general required for the formation of metastable phases. Crystallization at varying levels of undercooling and melting of metastable phases were studied during slow cooling and heating of emulsified Pb-Sn alloys. Besides the experimental demonstration of the reversibility of metastable phase equilibria, two different principal solidification paths have been identified and compared with the established metastable phase diagram and predictions from classical nucleation theory. The results suggest that the most probable solidification path is described by the 'step rule', resulting in the formation of metastable phases at low undercooling, whereas the stable eutectic phase mixture crystallizes without metastable phase formation at high undercooling.
NASA Astrophysics Data System (ADS)
Pearle, Philip
1982-03-01
In the problem of the gambler's ruin, a classic problem in probability theory, a number of gamblers play against each other until all but one of them is “wiped out.” It is shown that this problem is identical to a previously presented formulation of the reduction of the state vector, so that the state vectors in a linear superposition may be regarded as “playing” against each other until all but one of them is “wiped out.” This is a useful part of the description of an objectively real universe represented by a state vector that is a superposition of macroscopically distinguishable states dynamically created by the Hamiltonian and destroyed by the reduction mechanism.
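The multi-gambler game is straightforward to simulate, which makes the mapping tangible. A minimal Monte Carlo sketch (assuming fair one-unit exchanges between uniformly chosen pairs, so each player's capital is a martingale and survival probability equals initial stake over total stake):

```python
import random

def gamblers_ruin(stakes):
    """Return the index of the sole surviving gambler."""
    capital = list(stakes)
    alive = [i for i, c in enumerate(capital) if c > 0]
    while len(alive) > 1:
        a, b = random.sample(alive, 2)   # a random pair plays one fair round
        capital[a] += 1
        capital[b] -= 1
        alive = [i for i in alive if capital[i] > 0]
    return alive[0]

trials, wins = 2000, [0, 0, 0]
for _ in range(trials):
    wins[gamblers_ruin((1, 2, 7))] += 1
print([w / trials for w in wins])        # approximately [0.1, 0.2, 0.7]
```

On the state-vector-reduction reading, the squared amplitudes play the role of the stakes: each branch survives with probability equal to its initial "capital."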
Statistical mechanics in the context of special relativity. II.
Kaniadakis, G
2005-09-01
The special relativity laws emerge as one-parameter (light speed) generalizations of the corresponding laws of classical physics. These generalizations, imposed by the Lorentz transformations, affect both the definition of the various physical observables (e.g., momentum, energy, etc.), as well as the mathematical apparatus of the theory. Here, following the general lines of [Phys. Rev. E 66, 056125 (2002)], we show that the Lorentz transformations impose also a proper one-parameter generalization of the classical Boltzmann-Gibbs-Shannon entropy. The obtained relativistic entropy permits us to construct a coherent and self-consistent relativistic statistical theory, preserving the main features of the ordinary statistical theory, which is recovered in the classical limit. The predicted distribution function is a one-parameter continuous deformation of the classical Maxwell-Boltzmann distribution and has a simple analytic form, showing power law tails in accordance with the experimental evidence. Furthermore, this statistical mechanics can be obtained as the stationary case of a generalized kinetic theory governed by an evolution equation obeying the H theorem and reproducing the Boltzmann equation of the ordinary kinetics in the classical limit.
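Numerically, the deformation is compact. The sketch below assumes the standard Kaniadakis κ-exponential, exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ), which replaces the Boltzmann factor and recovers it in the κ → 0 classical limit:

```python
import numpy as np

def exp_kappa(x, kappa):
    """Kaniadakis kappa-exponential; reduces to exp(x) as kappa -> 0."""
    x = np.asarray(x, dtype=float)
    if kappa == 0:
        return np.exp(x)
    return (np.sqrt(1 + kappa**2 * x**2) + kappa * x) ** (1 / kappa)

E = np.linspace(0, 40, 5)                 # energy in units of kT
for kappa in (0.0, 0.3, 0.6):
    print(kappa, exp_kappa(-E, kappa))
# For kappa > 0 the tail decays as a power law ~ (2*kappa*E)**(-1/kappa)
# rather than exponentially, matching the observed power-law tails.
```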
Frank, Stefan; Roberts, Daniel E; Rikvold, Per Arne
2005-02-08
The influence of nearest-neighbor diffusion on the decay of a metastable low-coverage phase (monolayer adsorption) in a square lattice-gas model of electrochemical metal deposition is investigated by kinetic Monte Carlo simulations. The phase-transformation dynamics are compared to the well-established Kolmogorov-Johnson-Mehl-Avrami theory. The phase transformation is accelerated by diffusion, but remains in accord with the theory for continuous nucleation up to moderate diffusion rates. At very high diffusion rates the phase-transformation kinetic shows a crossover to instantaneous nucleation. Then, the probability of medium-sized clusters is reduced in favor of large clusters. Upon reversal of the supersaturation, the adsorbate desorbs, but large clusters still tend to grow during the initial stages of desorption. Calculation of the free energy of subcritical clusters by enumeration of lattice animals yields a quasiequilibrium distribution which is in reasonable agreement with the simulation results. This is an improvement relative to classical droplet theory, which fails to describe the distributions, since the macroscopic surface tension is a bad approximation for small clusters.
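For reference, the KJMA transformed fraction against which the simulations are compared has a closed form; the sketch below assumes the textbook two-dimensional exponents (n = 3 for continuous nucleation, n = 2 for instantaneous nucleation), not values fitted in the paper:

```python
import numpy as np

def kjma_fraction(t, k, n):
    """KJMA transformed fraction f(t) = 1 - exp(-(k*t)**n)."""
    return 1.0 - np.exp(-(k * t) ** n)

t = np.linspace(0.0, 3.0, 7)
print("continuous (n=3):   ", np.round(kjma_fraction(t, 1.0, 3), 3))
print("instantaneous (n=2):", np.round(kjma_fraction(t, 1.0, 2), 3))
# The crossover reported at very high diffusion rates corresponds to the
# effective Avrami exponent dropping from the continuous-nucleation value
# toward the instantaneous-nucleation value.
```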
Two-Way Communication with a Single Quantum Particle.
Del Santo, Flavio; Dakić, Borivoje
2018-02-09
In this Letter we show that communication when restricted to a single information carrier (i.e., single particle) and finite speed of propagation is fundamentally limited for classical systems. On the other hand, quantum systems can surpass this limitation. We show that communication bounded to the exchange of a single quantum particle (in superposition of different spatial locations) can result in "two-way signaling," which is impossible in classical physics. We quantify the discrepancy between classical and quantum scenarios by the probability of winning a game played by distant players. We generalize our result to an arbitrary number of parties and we show that the probability of success is asymptotically decreasing to zero as the number of parties grows, for all classical strategies. In contrast, quantum strategy allows players to win the game with certainty.
Leading-order classical Lagrangians for the nonminimal standard-model extension
NASA Astrophysics Data System (ADS)
Reis, J. A. A. S.; Schreck, M.
2018-03-01
In this paper, we derive the general leading-order classical Lagrangian covering all fermion operators of the nonminimal standard-model extension (SME). Such a Lagrangian is considered to be the point-particle analog of the effective field theory description of Lorentz violation that is provided by the SME. At leading order in Lorentz violation, the Lagrangian obtained satisfies the set of five nonlinear equations that govern the map from the field theory to the classical description. This result can be of use for phenomenological studies of classical bodies in gravitational fields.
Naive Probability: A Mental Model Theory of Extensional Reasoning.
ERIC Educational Resources Information Center
Johnson-Laird, P. N.; Legrenzi, Paolo; Girotto, Vittorio; Legrenzi, Maria Sonino; Caverni, Jean-Paul
1999-01-01
Outlines a theory of naive probability in which individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an "extensional" way. The theory accommodates reasoning based on numerical premises, and explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem.…
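The extensional recipe is essentially enumerative, so a few lines suffice to illustrate it. The toy domain below is hypothetical: equipossible fully explicit models, with probabilities read off as proportions of models, no Bayes' theorem required:

```python
from itertools import product

# Hypothetical domain: propositions A and B vary freely; C holds iff A and B agree.
models = [(a, b, a == b) for a, b in product([True, False], repeat=2)]

def prob(pred):
    """Probability of an event = proportion of equipossible models satisfying it."""
    return sum(1 for m in models if pred(m)) / len(models)

def prob_given(pred, cond):
    """Posterior by restriction: keep only models of the premise, then count."""
    sub = [m for m in models if cond(m)]
    return sum(1 for m in sub if pred(m)) / len(sub)

print(prob(lambda m: m[2]))                        # P(C)     = 0.5
print(prob_given(lambda m: m[0], lambda m: m[2]))  # P(A | C) = 0.5
```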
Cuprate High Temperature Superconductors and the Vision for Room Temperature Superconductivity
NASA Astrophysics Data System (ADS)
Newns, Dennis M.; Martyna, Glenn J.; Tsuei, Chang C.
Superconducting transition temperatures of 164 K in cuprate high temperature superconductors (HTS) and recently 200 K in H3S under high pressure encourage us to believe that room temperature superconductivity (RTS) might be possible. In considering paths to RTS, we contrast conventional (BCS) SC, such as that probably manifested by H3S, with the unconventional superconductivity (SC) in the cuprate HTS family. Turning to SC models, we show that in the presence of one or more van Hove singularities (vHs) near the Fermi level, SC mediated by classical phonons (kBTc > ℏ × phonon frequency) can occur. The phonon frequency in the standard Tc formula is replaced by an electronic cutoff, enabling a much higher Tc independent of phonon frequency. The resulting plots of Tc and isotope shift versus doping strongly resemble those seen experimentally in HTS. A more detailed theory of HTS, which involves mediation by classical phonons, satisfactorily reproduces the chief anomalous features characteristic of these materials. We propose that, while a path to RTS through an H3S-like scenario via strongly coupled ultra-high-frequency phonons is attractive, it requires features perhaps unavailable at ordinary pressures; a route involving SC mediated by classical phonons, which can be of low frequency, may instead be found.
JOURNAL SCOPE GUIDELINES: Paper classification scheme
NASA Astrophysics Data System (ADS)
2005-06-01
This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org). 1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory. 2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models. 3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods. 4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics. 5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials. 6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas.
Quasi-Static Analysis of Round LaRC THUNDER Actuators
NASA Technical Reports Server (NTRS)
Campbell, Joel F.
2007-01-01
An analytic approach is developed to predict the shape and displacement with voltage in the quasi-static limit of round LaRC Thunder Actuators. The problem is treated with classical lamination theory and Von Karman non-linear analysis. In the case of classical lamination theory exact analytic solutions are found. It is shown that classical lamination theory is insufficient to describe the physical situation for large actuators but is sufficient for very small actuators. Numerical results are presented for the non-linear analysis and compared with experimental measurements. Snap-through behavior, bifurcation, and stability are presented and discussed.
Quasi-Static Analysis of LaRC THUNDER Actuators
NASA Technical Reports Server (NTRS)
Campbell, Joel F.
2007-01-01
An analytic approach is developed to predict the shape and displacement with voltage in the quasi-static limit of LaRC Thunder Actuators. The problem is treated with classical lamination theory and Von Karman non-linear analysis. In the case of classical lamination theory exact analytic solutions are found. It is shown that classical lamination theory is insufficient to describe the physical situation for large actuators but is sufficient for very small actuators. Numerical results are presented for the non-linear analysis and compared with experimental measurements. Snap-through behavior, bifurcation, and stability are presented and discussed.
Device-independent randomness generation from several Bell estimators
NASA Astrophysics Data System (ADS)
Nieto-Silleras, Olmo; Bamps, Cédric; Silman, Jonathan; Pironio, Stefano
2018-02-01
Device-independent randomness generation and quantum key distribution protocols rely on a fundamental relation between the non-locality of quantum theory and its random character. This relation is usually expressed in terms of a trade-off between the probability of guessing correctly the outcomes of measurements performed on quantum systems and the amount of violation of a given Bell inequality. However, a more accurate assessment of the randomness produced in Bell experiments can be obtained if the value of several Bell expressions is simultaneously taken into account, or if the full set of probabilities characterizing the behavior of the device is considered. We introduce protocols for device-independent randomness generation secure against classical side information, that rely on the estimation of an arbitrary number of Bell expressions or even directly on the experimental frequencies of measurement outcomes. Asymptotically, this results in an optimal generation of randomness from experimental data (as measured by the min-entropy), without having to assume beforehand that the devices violate a specific Bell inequality.
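For orientation, the single-inequality baseline that these multi-estimator protocols refine can be sketched directly; the code below assumes the well-known CHSH guessing-probability trade-off p_guess ≤ 1/2 + (1/2)√(2 − S²/4) due to Pironio et al.:

```python
import math

def min_entropy_from_chsh(S):
    """Certified min-entropy per bit from a CHSH value S (2 < S <= 2*sqrt(2))."""
    p_guess = 0.5 + 0.5 * math.sqrt(max(0.0, 2.0 - S**2 / 4.0))
    return -math.log2(p_guess)

for S in (2.0, 2.4, 2.7, 2 * math.sqrt(2)):
    print(f"S = {S:.3f}:  H_min >= {min_entropy_from_chsh(S):.3f} bits")
# No violation (S = 2) certifies nothing; the Tsirelson bound S = 2*sqrt(2)
# certifies a full bit. Using several Bell estimators, or the full observed
# frequencies, can only tighten this bound.
```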
Quantum probabilities from quantum entanglement: experimentally unpacking the Born rule
Harris, Jérémie; Bouchard, Frédéric; Santamato, Enrico; ...
2016-05-11
The Born rule, a foundational axiom used to deduce probabilities of events from wavefunctions, is indispensable in the everyday practice of quantum physics. It is also key in the quest to reconcile the ostensibly inconsistent laws of the quantum and classical realms, as it confers physical significance to reduced density matrices, the essential tools of decoherence theory. Following Bohr's Copenhagen interpretation, textbooks postulate the Born rule outright. But recent attempts to derive it from other quantum principles have been successful, holding promise for simplifying and clarifying the quantum foundational bedrock. Moreover, a major family of derivations is based on envariance, a recently discovered symmetry of entangled quantum states. Here, we identify and experimentally test three premises central to these envariance-based derivations, thus demonstrating, in the microworld, the symmetries from which the Born rule is derived. Furthermore, we demonstrate envariance in a purely local quantum system, showing its independence from relativistic causality.
Emergent mechanics, quantum and un-quantum
NASA Astrophysics Data System (ADS)
Ralston, John P.
2013-10-01
There is great interest in quantum mechanics as an "emergent" phenomenon. The program holds that nonobvious patterns and laws can emerge from complicated physical systems operating by more fundamental rules. We find a new approach where quantum mechanics itself should be viewed as an information management tool not derived from physics nor depending on physics. The main accomplishment of quantum-style theory comes in expanding the notion of probability. We construct a map from macroscopic information as "data" to quantum probability. The map allows a hidden variable description for quantum states, and efficient use of the helpful tools of quantum mechanics in unlimited circumstances. Quantum dynamics via the time-dependent Schrödinger equation or operator methods actually represents a restricted class of classical Hamiltonian or Lagrangian dynamics, albeit with different numbers of degrees of freedom. We show that under wide circumstances such dynamics emerges from structureless dynamical systems. The uses of the quantum information management tools are illustrated by numerical experiments and practical applications.
Uncertainty and Denial: A Resource-Rational Model of the Value of Information
Pierson, Emma; Goodman, Noah
2014-01-01
Classical decision theory predicts that people should be indifferent to information that is not useful for making decisions, but this model often fails to describe human behavior. Here we investigate one such scenario, where people desire information about whether an event (the gain/loss of money) will occur even though there is no obvious decision to be made on the basis of this information. We find a curious dual trend: if information is costless, as the probability of the event increases people want the information more; if information is not costless, people's desire for the information peaks at an intermediate probability. People also want information more as the importance of the event increases, and less as the cost of the information increases. We propose a model that explains these results, based on the assumption that people have limited cognitive resources and obtain information about which events will occur so they can determine whether to expend effort planning for them. PMID:25426631
NASA Astrophysics Data System (ADS)
Arena, Dylan A.; Schwartz, Daniel L.
2014-08-01
Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics, where people's everyday experiences often conflict with normative statistical theories and a videogame might provide an alternate set of experiences for students to draw upon. The research used a game called Stats Invaders!, a variant of the classic videogame Space Invaders. In Stats Invaders!, the locations of descending alien invaders follow probability distributions, and players need to infer the shape of the distributions to play well. The experiment tested whether the game developed participants' intuitions about the structure of random events and thereby prepared them for future learning from a subsequent written passage on probability distributions. Community-college students who played the game and then read the passage learned more than participants who only read the passage.
Propensity, Probability, and Quantum Theory
NASA Astrophysics Data System (ADS)
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
Navigating the grounded theory terrain. Part 2.
Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John
2011-01-01
In this paper, the choice of classic grounded theory will be discussed and justified in the context of the first author's PhD research. The methodological discussion takes place within the context of PhD research entitled: Development of a stakeholder-led framework for a structured education programme that will prepare nurses and healthcare assistants to deliver a psychosocial intervention for people with dementia. There is a lack of research and limited understanding of the effect of psychosocial interventions on people with dementia. The first author considered classic grounded theory a suitable research methodology to investigate, as it is held to be ideal for areas of research where there is little understanding of the social processes at work. The literature relating to the practical application of classic grounded theory is illustrated using examples relating to four key grounded theory components: theory development using constant comparison and memoing; methodological rigour; emergence of a core category; and inclusion of self and engagement with participants. Following discussion of the choice and application of classic grounded theory, this paper explores the need for researchers to visit and understand the various grounded theory options. This paper argues that researchers new to grounded theory must be familiar with and understand the various options. The researchers will then be able to apply the methodologies they choose consistently and critically. Doing so will allow them to develop theory rigorously and they will ultimately be able to better defend their final methodological destinations.
Mathematical model of the SH-3G helicopter
NASA Technical Reports Server (NTRS)
Phillips, J. D.
1982-01-01
A mathematical model of the Sikorsky SH-3G helicopter based on classical nonlinear, quasi-steady rotor theory was developed. The model was validated statically and dynamically by comparison with Navy flight-test data. The model incorporates ad hoc revisions which address the ideal assumptions of classical rotor theory and improve the static trim characteristics to provide a more realistic simulation, while retaining the simplicity of the classical model.
Geometric Algebra for Physicists
NASA Astrophysics Data System (ADS)
Doran, Chris; Lasenby, Anthony
2007-11-01
Preface; Notation; 1. Introduction; 2. Geometric algebra in two and three dimensions; 3. Classical mechanics; 4. Foundations of geometric algebra; 5. Relativity and spacetime; 6. Geometric calculus; 7. Classical electrodynamics; 8. Quantum theory and spinors; 9. Multiparticle states and quantum entanglement; 10. Geometry; 11. Further topics in calculus and group theory; 12. Lagrangian and Hamiltonian techniques; 13. Symmetry and gauge theory; 14. Gravitation; Bibliography; Index.
NASA Astrophysics Data System (ADS)
Wang, Dong; Hu, You-Di; Wang, Zhe-Qiang; Ye, Liu
2015-06-01
We develop two efficient measurement-based schemes for remotely preparing arbitrary three- and four-particle W-class entangled states by utilizing genuine tripartite Greenberger-Horne-Zeilinger-type states as quantum channels, respectively. Through appropriate local operations and classical communication, the desired states can be faithfully retrieved at the receiver's place with a certain probability. Compared with previously existing schemes, the success probability in the current schemes is greatly increased. Moreover, the required classical communication cost is calculated as well. Further, several attractive discussions on the properties of the presented schemes, including the success probability and reducibility, are made. Remarkably, the proposed schemes can be faithfully achieved with unit total success probability when the employed channels are reduced to maximally entangled ones.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shore, B.W.; Knight, P.L.
The Jaynes-Cummings Model (JCM), a soluble fully quantum mechanical model of an atom in a field, was first used (in 1963) to examine the classical aspects of spontaneous emission and to reveal the existence of Rabi oscillations in atomic excitation probability for fields with sharply defined energy (or photon number). For fields having a statistical distribution of photon numbers the oscillations collapse to an expected steady value. In 1980 it was discovered that with appropriate initial conditions (e.g. a near-classical field), the Rabi oscillations would eventually revive -- only to collapse and revive repeatedly in a complicated pattern. The existence of these revivals, present in the analytic solutions of the JCM, provided direct evidence for discreteness of field excitation (photons) and hence for the truly quantum nature of radiation. Subsequent study revealed further nonclassical properties of the JCM field, such as a tendency of the photons to antibunch. Within the last two years it has been found that during the quiescent intervals of collapsed Rabi oscillations the atom and field exist in a macroscopic superposition state (a Schroedinger cat). This discovery offers the opportunity to use the JCM to elucidate the basic properties of quantum correlation (entanglement) and to explore still further the relationship between classical and quantum physics. In tribute to E. T. Jaynes, who first recognized the importance of the JCM for clarifying the differences and similarities between quantum and classical physics, we here present an overview of the theory of the JCM and some of the many remarkable discoveries about it.
The Nature of Quantum Truth: Logic, Set Theory, & Mathematics in the Context of Quantum Theory
NASA Astrophysics Data System (ADS)
Frey, Kimberly
The purpose of this dissertation is to construct a radically new type of mathematics whose underlying logic differs from the ordinary classical logic used in standard mathematics, and which we feel may be more natural for applications in quantum mechanics. Specifically, we begin by constructing a first order quantum logic, the development of which closely parallels that of ordinary (classical) first order logic --- the essential differences are in the nature of the logical axioms, which, in our construction, are motivated by quantum theory. After showing that the axiomatic first order logic we develop is sound and complete (with respect to a particular class of models), this logic is then used as a foundation on which to build (axiomatic) mathematical systems --- and we refer to the resulting new mathematics as "quantum mathematics." As noted above, the hope is that this form of mathematics is more natural than classical mathematics for the description of quantum systems, and will enable us to address some foundational aspects of quantum theory which are still troublesome --- e.g. the measurement problem --- as well as possibly even inform our thinking about quantum gravity. After constructing the underlying logic, we investigate properties of several mathematical systems --- e.g. axiom systems for abstract algebras, group theory, linear algebra, etc. --- in the presence of this quantum logic. In the process, we demonstrate that the resulting quantum mathematical systems have some strange, but very interesting features, which indicates a richness in the structure of mathematics that is classically inaccessible. Moreover, some of these features do indeed suggest possible applications to foundational questions in quantum theory. We continue our investigation of quantum mathematics by constructing an axiomatic quantum set theory, which we show satisfies certain desirable criteria. Ultimately, we hope that such a set theory will lead to a foundation for quantum mathematics in a sense which parallels the foundational role of classical set theory in classical mathematics. One immediate application of the quantum set theory we develop is to provide a foundation on which to construct quantum natural numbers, which are the quantum analog of the classical counting numbers. It turns out that in a special class of models, there exists a 1-1 correspondence between the quantum natural numbers and bounded observables in quantum theory whose eigenvalues are (ordinary) natural numbers. This 1-1 correspondence is remarkably satisfying, and not only gives us great confidence in our quantum set theory, but indicates the naturalness of such models for quantum theory itself. We go on to develop a Peano-like arithmetic for these new "numbers," as well as consider some of its consequences. Finally, we conclude by summarizing our results, and discussing directions for future work.
Nucleation theory - Is replacement free energy needed?. [error analysis of capillary approximation
NASA Technical Reports Server (NTRS)
Doremus, R. H.
1982-01-01
It has been suggested that the classical theory of nucleation of liquid from its vapor as developed by Volmer and Weber (1926) needs modification with a factor referred to as the replacement free energy, and that the capillary approximation underlying the classical theory is in error. Here, the classical nucleation equation is derived from fluctuation theory, Gibbs's result for the reversible work to form a critical nucleus, and the rate of collision of gas molecules with a surface. The capillary approximation is not used in the derivation. The chemical potential of small drops is then considered, and it is shown that the capillary approximation can be derived from thermodynamic equations. The results show that no corrections to Volmer's equation are needed.
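The quantities entering this derivation are easy to evaluate. A sketch using the Gibbs expressions for the critical radius and the reversible work of formation (the material numbers are illustrative, water-like values, not taken from the paper):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K

def critical_nucleus(sigma, v_mol, T, S):
    """Critical radius and reversible work for a droplet condensing from vapor
    at supersaturation S = p/p_eq, in the capillary (Gibbs) picture."""
    kT_lnS = K_B * T * math.log(S)
    r_star = 2.0 * sigma * v_mol / kT_lnS
    w_star = 16.0 * math.pi * sigma**3 * v_mol**2 / (3.0 * kT_lnS**2)
    return r_star, w_star

sigma, v_mol, T, S = 0.072, 3.0e-29, 293.0, 4.0    # assumed water-like values
r, w = critical_nucleus(sigma, v_mol, T, S)
print(f"r* = {r*1e9:.2f} nm,  W* = {w/(K_B*T):.0f} kT")
# Volmer-Weber nucleation rate: J ~ A * exp(-W*/kT), with the prefactor A set
# by the collision rate of vapor molecules with the nucleus surface.
```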
Effective model hierarchies for dynamic and static classical density functional theories
NASA Astrophysics Data System (ADS)
Majaniemi, S.; Provatas, N.; Nonomura, M.
2010-09-01
The origin and methodology of deriving effective model hierarchies are presented with applications to solidification of crystalline solids. In particular, it is discussed how the form of the equations of motion and the effective parameters on larger scales can be obtained from the more microscopic models. It will be shown that tying together the dynamic structure of the projection operator formalism with static classical density functional theories can lead to incomplete (mass) transport properties even though the linearized hydrodynamics on large scales is correctly reproduced. To facilitate a more natural way of binding together the dynamics of the macrovariables and classical density functional theory, a dynamic generalization of density functional theory based on the nonequilibrium generating functional is suggested.
Time and the foundations of quantum mechanics
NASA Astrophysics Data System (ADS)
Pashby, Thomas
Quantum mechanics has provided philosophers of science with many counterintuitive insights and interpretive puzzles, but little has been written about the role that time plays in the theory. One reason for this is the celebrated argument of Wolfgang Pauli against the inclusion of time as an observable of the theory, which has been seen as a demonstration that time may only enter the theory as a classical parameter. Against this orthodoxy I argue that there are good reasons to expect certain kinds of 'time observables' to find a representation within quantum theory, including clock operators (which provide the means to measure the passage of time) and event time operators, which provide predictions for the time at which a particular event occurs, such as the appearance of a dot on a luminescent screen. I contend that these time operators deserve full status as observables of the theory, and on reflection provide a uniquely compelling reason to expand the set of observables allowed by the standard formalism of quantum mechanics. In addition, I provide a novel association of event time operators with conditional probabilities, and propose a temporally extended form of quantum theory to better accommodate the time of an event as an observable quantity. This leads to a proposal to interpret quantum theory within an event ontology, inspired by Bertrand Russell's Analysis of Matter. On this basis I mount a defense of Russell's relational theory of time against a recent attack.
Using extant literature in a grounded theory study: a personal account.
Yarwood-Ross, Lee; Jack, Kirsten
2015-03-01
To provide a personal account of the factors in a doctoral study that led to the adoption of classic grounded theory principles relating to the use of literature. Novice researchers considering grounded theory methodology will become aware of the contentious issue of how and when extant literature should be incorporated into a study. The three main grounded theory approaches are classic, Straussian and constructivist, and the seminal texts provide conflicting beliefs surrounding the use of literature. A classic approach avoids a pre-study literature review to minimise preconceptions and emphasises the constant comparison method, while the Straussian and constructivist approaches focus more on the beneficial aspects of an initial literature review and researcher reflexivity. The debate also extends into the wider academic community, where no consensus exists. This is a methodological paper detailing the authors' engagement in the debate surrounding the role of the literature in a grounded theory study. In the authors' experience, researchers can best understand the use of literature in grounded theory through immersion in the seminal texts, engaging with wider academic literature, and examining their preconceptions of the substantive area. The authors concluded that classic grounded theory principles were appropriate in the context of their doctoral study. Novice researchers will have their own sets of circumstances when preparing their studies and should become aware of the different perspectives to make decisions that they can ultimately justify. This paper can be used by other novice researchers as an example of the decision-making process that led to delaying a pre-study literature review and identifies the resources used to write a research proposal when using a classic grounded theory approach.
NASA Astrophysics Data System (ADS)
Nihill, Kevin J.; Hund, Zachary M.; Muzas, Alberto; Díaz, Cristina; del Cueto, Marcos; Frankcombe, Terry; Plymale, Noah T.; Lewis, Nathan S.; Martín, Fernando; Sibener, S. J.
2016-08-01
Fundamental details concerning the interaction between H2 and CH3-Si(111) have been elucidated by the combination of diffractive scattering experiments and electronic structure and scattering calculations. Rotationally inelastic diffraction (RID) of H2 and D2 from this model hydrocarbon-decorated semiconductor interface has been confirmed for the first time via both time-of-flight and diffraction measurements, with modest j = 0 → 2 RID intensities for H2 compared to the strong RID features observed for D2 over a large range of kinematic scattering conditions along two high-symmetry azimuthal directions. The Debye-Waller model was applied to the thermal attenuation of diffraction peaks, allowing for precise determination of the RID probabilities by accounting for incoherent motion of the CH3-Si(111) surface atoms. The probabilities of rotationally inelastic diffraction of H2 and D2 have been quantitatively evaluated as a function of beam energy and scattering angle, and have been compared with complementary electronic structure and scattering calculations to provide insight into the interaction potential between H2 (D2) and hence the surface charge density distribution. Specifically, a six-dimensional potential energy surface (PES), describing the electronic structure of the H2(D2)/CH3-Si(111) system, has been computed based on interpolation of density functional theory energies. Quantum and classical dynamics simulations have allowed for an assessment of the accuracy of the PES, and subsequently for identification of the features of the PES that serve as classical turning points. A close scrutiny of the PES reveals the highly anisotropic character of the interaction potential at these turning points. This combination of experiment and theory provides new and important details about the interaction of H2 with a hybrid organic-semiconductor interface, which can be used to further investigate energy flow in technologically relevant systems.
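In practice the Debye-Waller correction reduces to a linear extrapolation of log-intensities in surface temperature. A sketch on synthetic numbers (assumed, not the paper's data), using I(T) = I₀ exp(−2W(T)) with 2W proportional to T in the high-temperature limit:

```python
import numpy as np

T = np.array([300.0, 350.0, 400.0, 450.0, 500.0])   # surface temperatures (K)
I0, slope_2w = 1.0, 1.5e-3                           # assumed illustrative values
I = I0 * np.exp(-2.0 * slope_2w * T)                 # synthetic attenuated peaks

b, a = np.polyfit(T, np.log(I), 1)                   # fit ln I = a + b*T
print(np.exp(a))   # extrapolated static-lattice intensity (recovers I0 = 1.0)
# Ratios of such extrapolated intensities give RID probabilities with the
# incoherent thermal motion of the surface atoms divided out.
```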
Classical BV Theories on Manifolds with Boundary
NASA Astrophysics Data System (ADS)
Cattaneo, Alberto S.; Mnev, Pavel; Reshetikhin, Nicolai
2014-12-01
In this paper we extend the classical BV framework to gauge theories on spacetime manifolds with boundary. In particular, we connect the BV construction in the bulk with the BFV construction on the boundary and we develop its extension to strata of higher codimension in the case of manifolds with corners. We present several examples including electrodynamics, Yang-Mills theory and topological field theories coming from the AKSZ construction, in particular, the Chern-Simons theory, the BF theory, and the Poisson sigma model. This paper is the first step towards developing the perturbative quantization of such theories on manifolds with boundary in a way consistent with gluing.
An Examination of the Flynn Effect in the National Intelligence Test in Estonia
ERIC Educational Resources Information Center
Shiu, William
2012-01-01
This study examined the Flynn Effect (FE; i.e., the rise in IQ scores over time) in Estonia from Scale B of the National Intelligence Test using both classical test theory (CTT) and item response theory (IRT) methods. Secondary data from two cohorts (1934, n = 890 and 2006, n = 913) of students were analyzed, using both classical test theory (CTT)…
Hepatitis disease detection using Bayesian theory
NASA Astrophysics Data System (ADS)
Maseleno, Andino; Hidayati, Rohmah Zahroh
2017-02-01
This paper presents hepatitis disease diagnosis using Bayesian theory, with the aim of better understanding the theory. In this research, we used Bayesian theory to detect hepatitis disease and display the result of the diagnosis process. Bayesian theory, rediscovered and perfected by Laplace, rests on a simple idea: use the known prior probability and conditional probability densities, apply Bayes' theorem to calculate the corresponding posterior probability, and then use the posterior probability to draw inferences and make decisions. Bayesian methods combine existing knowledge (prior probabilities) with additional knowledge derived from new data (the likelihood function). The initial symptoms of hepatitis include malaise, fever, and headache, and the system computes the probability of hepatitis given the presence of these symptoms. The results show that Bayesian theory successfully identified the existence of hepatitis disease.
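A minimal version of such a diagnosis step, assuming a naive-Bayes structure (conditionally independent symptoms) and entirely hypothetical prior and likelihood values; the paper's own numbers come from domain knowledge:

```python
prior = 0.01                        # assumed P(hepatitis)
likelihood = {                      # assumed P(symptom | hepatitis), P(symptom | not)
    "malaise":  (0.80, 0.10),
    "fever":    (0.70, 0.05),
    "headache": (0.60, 0.20),
}

def posterior(observed):
    """Bayes' theorem with conditionally independent symptom likelihoods."""
    p_h, p_not = prior, 1.0 - prior
    for s in observed:
        p_h *= likelihood[s][0]
        p_not *= likelihood[s][1]
    return p_h / (p_h + p_not)      # normalize to get the posterior probability

print(posterior(["malaise", "fever", "headache"]))   # ~0.77 with these numbers
```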
Nilsson, Håkan; Juslin, Peter; Winman, Anders
2016-01-01
Costello and Watts (2014) present a model assuming that people's knowledge of probabilities adheres to probability theory, but that their probability judgments are perturbed by a random noise in the retrieval from memory. Predictions for the relationships between probability judgments for constituent events and their disjunctions and conjunctions, as well as for sums of such judgments, were derived from probability theory. Costello and Watts (2014) report behavioral data showing that subjective probability judgments accord with these predictions. Based on the finding that subjective probability judgments follow probability theory, Costello and Watts (2014) conclude that the results imply that people's probability judgments embody the rules of probability theory and thereby refute theories of heuristic processing. Here, we demonstrate the invalidity of this conclusion by showing that all of the tested predictions follow straightforwardly from an account assuming heuristic probability integration (Nilsson, Winman, Juslin, & Hansson, 2009). We end with a discussion of a number of previous findings that harmonize very poorly with the predictions by the model suggested by Costello and Watts (2014). (c) 2015 APA, all rights reserved.
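The cancellation at issue is easy to reproduce. In the following schematic simulation (sampling parameters assumed for illustration), each judgment reads n memory samples, each flipped with probability d; the additive identity P(A) + P(B) − P(A∧B) − P(A∨B) = 0 survives the noise, which is why conformity to it cannot by itself adjudicate between probability-theoretic and heuristic accounts:

```python
import numpy as np

rng = np.random.default_rng(1)

def judge(true_p, d, n=300):
    """Noisy frequency readout: each sample is read correctly w.p. 1 - d."""
    flags = rng.random(n) < true_p                  # samples instantiating the event
    read = np.where(rng.random(n) < d, ~flags, flags)
    return read.mean()

pA, pB, pAB = 0.30, 0.50, 0.20
pAorB = pA + pB - pAB
d = 0.25                                            # substantial read noise

z = [judge(pA, d) + judge(pB, d) - judge(pAB, d) - judge(pAorB, d)
     for _ in range(20000)]
print(np.mean(z))   # ~0: E[judged p] = (1 - 2d)*p + d, so the d terms cancel
```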
Bukhvostov-Lipatov model and quantum-classical duality
NASA Astrophysics Data System (ADS)
Bazhanov, Vladimir V.; Lukyanov, Sergei L.; Runov, Boris A.
2018-02-01
The Bukhvostov-Lipatov model is an exactly soluble model of two interacting Dirac fermions in 1 + 1 dimensions. The model describes weakly interacting instantons and anti-instantons in the O(3) non-linear sigma model. In our previous work [arXiv:1607.04839] we have proposed an exact formula for the vacuum energy of the Bukhvostov-Lipatov model in terms of special solutions of the classical sinh-Gordon equation, which can be viewed as an example of a remarkable duality between integrable quantum field theories and integrable classical field theories in two dimensions. Here we present a complete derivation of this duality based on the classical inverse scattering transform method, traditional Bethe ansatz techniques and analytic theory of ordinary differential equations. In particular, we show that the Bethe ansatz equations defining the vacuum state of the quantum theory also define connection coefficients of an auxiliary linear problem for the classical sinh-Gordon equation. Moreover, we also present details of the derivation of the non-linear integral equations determining the vacuum energy and other spectral characteristics of the model in the case when the vacuum state is filled by 2-string solutions of the Bethe ansatz equations.
Whitley, Heather D.; Scullard, Christian R.; Benedict, Lorin X.; ...
2014-12-04
Here, we present a discussion of kinetic theory treatments of linear electrical and thermal transport in hydrogen plasmas, for a regime of interest to inertial confinement fusion applications. In order to assess the accuracy of one of the more involved of these approaches, classical Lenard-Balescu theory, we perform classical molecular dynamics simulations of hydrogen plasmas using 2-body quantum statistical potentials and compute both electrical and thermal conductivity from our particle trajectories using the Kubo approach. Our classical Lenard-Balescu results employing the identical statistical potentials agree well with the simulations.
Further Development of an Optimal Design Approach Applied to Axial Magnetic Bearings
NASA Technical Reports Server (NTRS)
Bloodgood, V. Dale, Jr.; Groom, Nelson J.; Britcher, Colin P.
2000-01-01
Classical design methods involved in magnetic bearings and magnetic suspension systems have always had their limitations. Because of this, the overall effectiveness of a design has always relied heavily on the skill and experience of the individual designer. This paper combines two approaches that have been developed to aid the accuracy and efficiency of magnetostatic design. The first approach integrates classical magnetic circuit theory with modern optimization theory to increase design efficiency. The second approach uses loss factors to increase the accuracy of classical magnetic circuit theory. As an example, an axial magnetic thrust bearing is designed for minimum power.
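A toy version of the combined approach: wrap a one-gap magnetic-circuit force model in a numerical optimizer that minimizes coil power subject to a force requirement. The circuit model and all numbers here are assumed for illustration, and the sketch omits the loss factors that the second approach introduces:

```python
import math
from scipy.optimize import minimize

MU0 = 4e-7 * math.pi
AREA, GAP, F_REQ = 1e-3, 5e-4, 200.0   # pole-face area (m^2), air gap (m), force (N)

def power(x):
    n_turns, current = x
    return current**2 * 0.01 * n_turns   # assumed coil resistance ~0.01 ohm per turn

def force(x):
    n_turns, current = x
    b = MU0 * n_turns * current / (2.0 * GAP)   # flux density from circuit theory
    return b**2 * AREA / MU0                    # two pole faces: 2 * B^2 A / (2 mu0)

res = minimize(power, x0=[200.0, 2.0],
               bounds=[(10.0, 2000.0), (0.1, 20.0)],
               constraints=[{"type": "ineq", "fun": lambda x: force(x) - F_REQ}])
print(res.x, power(res.x), force(res.x))
```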
DeVore, Matthew S; Gull, Stephen F; Johnson, Carey K
2012-04-05
We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions.
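Once the most probable joint distribution is in hand, the reported distributions are just its marginals. The sketch below uses a stand-in array for the MaxEnt joint P(N, E) over photon counts N and apparent efficiency E, purely to show the marginalization step:

```python
import numpy as np

counts = np.arange(1, 51)                     # total fluorescence photons N
eff = np.linspace(0.0, 1.0, 40)               # apparent FRET efficiency E

# Stand-in joint (hypothetical shape): geometric-ish in N, peaked near E = 0.6
joint = np.outer(np.exp(-counts / 15.0), np.exp(-((eff - 0.6) / 0.1) ** 2))
joint /= joint.sum()

p_counts = joint.sum(axis=1)                  # marginal photon-count distribution
p_eff = joint.sum(axis=0)                     # marginal FRET-efficiency distribution
print(p_counts.sum(), p_eff.sum(), eff[np.argmax(p_eff)])
```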
Are quantum-mechanical-like models possible, or necessary, outside quantum physics?
NASA Astrophysics Data System (ADS)
Plotnitsky, Arkady
2014-12-01
This article examines some experimental conditions that invite and possibly require recourse to quantum-mechanical-like mathematical models (QMLMs), models based on the key mathematical features of quantum mechanics, in scientific fields outside physics, such as biology, cognitive psychology, or economics. In particular, I consider whether the following two correlative features of quantum phenomena that were decisive for establishing the mathematical formalism of quantum mechanics play similarly important roles in QMLMs elsewhere. The first is the individuality and discreteness of quantum phenomena, and the second is the irreducibly probabilistic nature of our predictions concerning them, coupled to the particular character of the probabilities involved, as different from the character of probabilities found in classical physics. I also argue that these features could be interpreted in terms of a particular form of epistemology that suspends and even precludes a causal and, in the first place, realist description of quantum objects and processes. This epistemology limits the descriptive capacity of quantum theory to the description, classical in nature, of the observed quantum phenomena manifested in measuring instruments. Quantum mechanics itself only provides descriptions, probabilistic in nature, concerning numerical data pertaining to such phenomena, without offering a physical description of quantum objects and processes. While QMLMs share their use of the quantum-mechanical or analogous mathematical formalism, they may differ by the roles, if any, the two features in question play in them and by different ways of interpreting the phenomena they considered and this formalism itself. This article will address those differences as well.
NASA Astrophysics Data System (ADS)
Möller, Peter; Pfeiffer, Bernd; Kratz, Karl-Ludwig
2003-05-01
Recent compilations of experimental gross β-decay properties, i.e., half-lives (T1/2) and neutron-emission probabilities (Pn), are compared to improved global macroscopic-microscopic model predictions. The model combines calculations within the quasiparticle (QP) random-phase approximation for the Gamow-Teller (GT) part with an empirical spreading of the QP strength and the gross theory for the first-forbidden part of β− decay. Nuclear masses are either taken from the 1995 data compilation of Audi et al., when available, otherwise from the finite-range droplet model. Especially for spherical and neutron-(sub-)magic isotopes a considerable improvement compared to our earlier predictions for pure GT decay (ADNDT, 1997) is observed. T1/2 and Pn values up to the neutron drip line have been used in r-process calculations within the classical “waiting-point” approximation. With the new nuclear-physics input, a considerable speeding-up of the r-matter flow is observed, in particular at those r-abundance peaks which are related to magic neutron-shell closures.
Ethical and Stylistic Implications in Delivering Conference Papers.
ERIC Educational Resources Information Center
Enos, Theresa
1986-01-01
Analyzes shortcomings of conference papers intended for the eye rather than the ear. Referring to classical oratory, speech act theory, and cognitive theory, recommends revising papers for oral presentation by using classical disposition; deductive rather than inductive argument; formulaic repetition of words and phrases; non-inverted clause…
Quantum theory for 1D X-ray free electron laser
NASA Astrophysics Data System (ADS)
Anisimov, Petr M.
2018-06-01
Classical 1D X-ray Free Electron Laser (X-ray FEL) theory has stood the test of time by guiding FEL design and development prior to any full-scale analysis. Future X-ray FELs and inverse-Compton sources, where photon recoil approaches an electron energy spread value, push the classical theory to its limits of applicability. After substantial efforts by the community to find what those limits are, there is no universally agreed upon quantum approach to design and development of future X-ray sources. We offer a new approach to formulate the quantum theory for 1D X-ray FELs that has an obvious connection to the classical theory, which allows for immediate transfer of knowledge between the two regimes. We exploit this connection in order to draw quantum mechanical conclusions about the quantum nature of electrons and generated radiation in terms of FEL variables.
Plasmon mass scale and quantum fluctuations of classical fields on a real time lattice
NASA Astrophysics Data System (ADS)
Kurkela, Aleksi; Lappi, Tuomas; Peuron, Jarkko
2018-03-01
Classical real-time lattice simulations play an important role in understanding non-equilibrium phenomena in gauge theories and are used in particular to model the prethermal evolution of heavy-ion collisions. Above the Debye scale the classical Yang-Mills (CYM) theory can be matched smoothly to kinetic theory. First we study the limits of the quasiparticle picture of the CYM fields by determining the plasmon mass of the system using 3 different methods. Then we argue that one needs a numerical calculation of a system of classical gauge fields and small linearized fluctuations, which correspond to quantum fluctuations, in a way that keeps the separation between the two manifest. We demonstrate and test an implementation of an algorithm with linearized fluctuations, showing that the linearization indeed works and that Gauss's law is conserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nomura, Yasunori; Salzetta, Nico; Sanches, Fabio
We study the Hilbert space structure of classical spacetimes under the assumption that entanglement in holographic theories determines semiclassical geometry. We show that this simple assumption has profound implications; for example, a superposition of classical spacetimes may lead to another classical spacetime. Despite its unconventional nature, this picture admits the standard interpretation of superpositions of well-defined semiclassical spacetimes in the limit that the number of holographic degrees of freedom becomes large. We illustrate these ideas using a model for the holographic theory of cosmological spacetimes.
Classical theory of radiating strings
NASA Technical Reports Server (NTRS)
Copeland, Edmund J.; Haws, D.; Hindmarsh, M.
1990-01-01
The divergent part of the self force of a radiating string coupled to gravity, an antisymmetric tensor and a dilaton in four dimensions is calculated to first order in classical perturbation theory. While this divergence can be absorbed into a renormalization of the string tension, demanding that both it and the divergence in the energy momentum tensor vanish forces the string to have the couplings of compactified N = 1, D = 10 supergravity. In effect, supersymmetry cures the classical infinities.
Emergence of a classical Universe from quantum gravity and cosmology.
Kiefer, Claus
2012-09-28
I describe how we can understand the classical appearance of our world from a universal quantum theory. The essential ingredient is the process of decoherence. I start with a general discussion in ordinary quantum theory and then turn to quantum gravity and quantum cosmology. There is a whole hierarchy of classicality from the global gravitational field to the fluctuations in the cosmic microwave background, which serve as the seeds for the structure in the Universe.
How the Weak Variance of Momentum Can Turn Out to be Negative
NASA Astrophysics Data System (ADS)
Feyereisen, M. R.
2015-05-01
Weak values are average quantities, therefore investigating their associated variance is crucial in understanding their place in quantum mechanics. We develop the concept of a position-postselected weak variance of momentum as cohesively as possible, building primarily on material from Moyal (Mathematical Proceedings of the Cambridge Philosophical Society, Cambridge University Press, Cambridge, 1949) and Sonego (Found Phys 21(10):1135, 1991). The weak variance is defined in terms of the Wigner function, using a standard construction from probability theory. We show this corresponds to a measurable quantity, which is not itself a weak value. It also leads naturally to a connection between the imaginary part of the weak value of momentum and the quantum potential. We study how the negativity of the Wigner function causes negative weak variances, and the implications this has on a class of 'subquantum' theories. We also discuss the role of weak variances in studying determinism, deriving the classical limit from a variational principle.
Classical gluon and graviton radiation from the bi-adjoint scalar double copy
NASA Astrophysics Data System (ADS)
Goldberger, Walter D.; Prabhu, Siddharth G.; Thompson, Jedidiah O.
2017-09-01
We find double-copy relations between classical radiating solutions in Yang-Mills theory coupled to dynamical color charges and their counterparts in a cubic bi-adjoint scalar field theory which interacts linearly with particles carrying bi-adjoint charge. The particular color-to-kinematics replacements we employ are motivated by the Bern-Carrasco-Johansson double-copy correspondence for on-shell amplitudes in gauge and gravity theories. They are identical to those recently used to establish relations between classical radiating solutions in gauge theory and in dilaton gravity. Our explicit bi-adjoint solutions are constructed to second order in a perturbative expansion, and map under the double copy onto gauge theory solutions which involve at most cubic gluon self-interactions. If the correspondence is found to persist to higher orders in perturbation theory, our results suggest the possibility of calculating gravitational radiation from colliding compact objects, directly from a scalar field with vastly simpler (purely cubic) Feynman vertices.
Econophysics and individual choice
NASA Astrophysics Data System (ADS)
Bordley, Robert F.
2005-08-01
The subjectivist theory of probability specifies certain axioms of rationality which together lead to both a theory of probability and a theory of preference. The theory of probability is used throughout the sciences while the theory of preferences is used in economics. Results in quantum physics challenge the adequacy of the subjectivist theory of probability. As we show, answering this challenge requires modifying an Archimedean axiom in the subjectivist theory. But changing this axiom modifies the subjectivist theory of preference and therefore has implications for economics. As this paper notes, these implications are consistent with current empirical findings in psychology and economics. As we show, these results also have implications for pricing in securities markets. This suggests further directions for research in econophysics.
Combinatorial Market Processing for Multilateral Coordination
2005-09-01
In the classical auction theory literature, most of the attention is focused on one-sided, single-item auctions [86]. There is now a growing body of...
Probability and Statistics: A Prelude.
ERIC Educational Resources Information Center
Goodman, A. F.; Blischke, W. R.
Probability and statistics have become indispensable to scientific, technical, and management progress. They serve as essential dialects of mathematics, the classical language of science, and as instruments necessary for intelligent generation and analysis of information. A prelude to probability and statistics is presented by examination of the…
ERIC Educational Resources Information Center
Boyer, Timothy H.
1985-01-01
The classical vacuum of physics is not empty, but contains a distinctive pattern of electromagnetic fields. Discovery of the vacuum, thermal spectrum, classical electron theory, zero-point spectrum, and effects of acceleration are discussed. Connection between thermal radiation and the classical vacuum reveals unexpected unity in the laws of…
Alternative probability theories for cognitive psychology.
Narens, Louis
2014-01-01
Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.
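As a worked example of how event-space properties reshape the probability axioms, one standard way to state additivity on a non-boolean event space is the modular form below (our formulation, not necessarily the one used in the commentary):

% Generalized additivity for arbitrary, not merely disjoint, events:
P(A \cup B) + P(A \cap B) \;=\; P(A) + P(B),
\qquad P(\varnothing) = 0, \quad P(\Omega) = 1 .
% On a boolean algebra this is equivalent to finite additivity on
% disjoint events; on the open sets of a topology it remains usable even
% when complements, and hence enough disjoint events, are unavailable.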
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, H.
In this dissertation we study a procedure which restarts a Markov process when the process is killed by some arbitrary multiplicative functional. The regenerative nature of this revival procedure is characterized through a Markov renewal equation. An interesting duality between the revival procedure and the classical killing operation is found. Under the condition that the multiplicative functional possesses an intensity, the generators of the revival process can be written down explicitly. An intimate connection is also found between the perturbation of the sample path of a Markov process and the perturbation of a generator (in Kato's sense). The applications of the theory include the study of processes like the piecewise-deterministic Markov process, the virtual waiting time process, and the first entrance decomposition (taboo probability).
Detailed Balance of Thermalization Dynamics in Rydberg-Atom Quantum Simulators.
Kim, Hyosub; Park, YeJe; Kim, Kyungtae; Sim, H-S; Ahn, Jaewook
2018-05-04
Dynamics of large complex systems, such as relaxation towards equilibrium in classical statistical mechanics, often obeys a master equation that captures essential information from the complexities. Here, we find that thermalization of an isolated many-body quantum state can be described by a master equation. We observe sudden quench dynamics of quantum Ising-like models implemented in our quantum simulator, defect-free single-atom tweezers in conjunction with Rydberg-atom interaction. Saturation of their local observables, a thermalization signature, obeys a master equation experimentally constructed by monitoring the occupation probabilities of prequench states and imposing the principle of the detailed balance. Our experiment agrees with theories and demonstrates the detailed balance in a thermalization dynamics that does not require coupling to baths or postulated randomness.
Constrained Multiobjective Biogeography Optimization Algorithm
Mo, Hongwei; Xu, Zhidan; Xu, Lifang; Wu, Zhou; Ma, Haiping
2014-01-01
Multiobjective optimization involves minimizing or maximizing multiple objective functions subject to a set of constraints. In this study, a novel constrained multiobjective biogeography optimization algorithm (CMBOA) is proposed. It is the first biogeography optimization algorithm for constrained multiobjective optimization. In CMBOA, a disturbance migration operator is designed to generate diverse feasible individuals in order to promote the diversity of individuals on the Pareto front. Infeasible individuals near the feasible region are evolved toward feasibility by recombining with their nearest nondominated feasible individuals. The convergence of CMBOA is proved using probability theory. The performance of CMBOA is evaluated on a set of six benchmark problems; experimental results show that CMBOA performs better than, or comparably to, the classical NSGA-II and IS-MOEA. PMID:25006591
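For readers unfamiliar with biogeography-based optimization, the sketch below shows the generic rank-based migration step such algorithms build on. It is illustrative only; the function name and parameters are ours, and CMBOA's disturbance migration operator and constraint handling are not reproduced here.

import numpy as np

def bbo_migration(pop, fitness, rng):
    """One generic biogeography-style migration step (illustrative;
    not CMBOA's disturbance migration operator)."""
    n, dim = pop.shape
    order = np.argsort(fitness)               # best (lowest) fitness first
    ranks = np.empty(n)
    ranks[order] = np.arange(n)
    immigration = ranks / (n - 1)             # poor habitats import more
    emigration = 1.0 - immigration            # good habitats export more
    new_pop = pop.copy()
    for i in range(n):
        for d in range(dim):
            if rng.random() < immigration[i]:
                # roulette-wheel choice of an emigrating habitat
                j = rng.choice(n, p=emigration / emigration.sum())
                new_pop[i, d] = pop[j, d]
    return new_pop

rng = np.random.default_rng(0)
pop = rng.uniform(-1, 1, size=(20, 5))
fitness = (pop ** 2).sum(axis=1)              # toy sphere objective
pop = bbo_migration(pop, fitness, rng)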
Time of arrival in quantum and Bohmian mechanics
NASA Astrophysics Data System (ADS)
Leavens, C. R.
1998-08-01
In a recent paper Grot, Rovelli, and Tate (GRT) [Phys. Rev. A 54, 4676 (1996)] derived an expression for the probability distribution π(T_X) of intrinsic arrival times T_X at position x = X for a quantum particle with initial wave function ψ(x,t=0) freely evolving in one dimension. This was done by quantizing the classical expression for the time of arrival of a free particle at X, assuming a particular choice of operator ordering, and then regulating the resulting time-of-arrival operator. For the special case of a minimum-uncertainty-product wave packet at t=0 with average wave number
Neo-classical theory of competition or Adam Smith's hand as mathematized ideology
NASA Astrophysics Data System (ADS)
McCauley, Joseph L.
2001-10-01
Orthodox economic theory (utility maximization, rational agents, efficient markets in equilibrium) is based on arbitrarily postulated, nonempiric notions. The disagreement between economic reality and a key feature of neo-classical economic theory was criticized empirically by Osborne. I show that the orthodox theory is internally self-inconsistent for the very reason suggested by Osborne: the lack of invertibility of demand and supply as functions of price, needed to obtain price as a function of supply and demand. The noninvertibility arises from nonintegrable excess demand dynamics, a feature of their theory completely ignored by economists.
Decision analysis with cumulative prospect theory.
Bayoumi, A M; Redelmeier, D A
2000-01-01
Individuals sometimes express preferences that do not follow expected utility theory. Cumulative prospect theory adjusts for some phenomena by using decision weights rather than probabilities when analyzing a decision tree. The authors examined how probability transformations from cumulative prospect theory might alter a decision analysis of a prophylactic therapy in AIDS, eliciting utilities from patients with HIV infection (n = 75) and calculating expected outcomes using an established Markov model. They next focused on transformations of three sets of probabilities: 1) the probabilities used in calculating standard-gamble utility scores; 2) the probabilities of being in discrete Markov states; 3) the probabilities of transitioning between Markov states. The same prophylaxis strategy yielded the highest quality-adjusted survival under all transformations. For the average patient, prophylaxis appeared relatively less advantageous when standard-gamble utilities were transformed. Prophylaxis appeared relatively more advantageous when state probabilities were transformed and relatively less advantageous when transition probabilities were transformed. Transforming standard-gamble and transition probabilities simultaneously decreased the gain from prophylaxis by almost half. Sensitivity analysis indicated that even near-linear probability weighting transformations could substantially alter quality-adjusted survival estimates. The magnitude of benefit estimated in a decision-analytic model can change significantly after using cumulative prospect theory. Incorporating cumulative prospect theory into decision analysis can provide a form of sensitivity analysis and may help describe when people deviate from expected utility theory.
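To make the transformation concrete, the snippet below applies the common one-parameter Tversky-Kahneman probability weighting function to a two-outcome gamble. This is a standard CPT form chosen for illustration; the exact transformation and parameter value the authors used may differ.

import numpy as np

def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting (one common CPT form;
    the paper's exact transformation may differ)."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# Decision weights for a two-outcome gamble under cumulative weighting:
p_high = 0.10                # probability of the better outcome
w_high = tk_weight(p_high)   # decision weight on the better outcome
w_low = 1.0 - w_high         # residual cumulative weight
print(w_high)                # ~0.19: small probabilities are overweighted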
Perpetual extraction of work from a nonequilibrium dynamical system under Markovian feedback control
NASA Astrophysics Data System (ADS)
Kosugi, Taichi
2013-09-01
By treating both control parameters and dynamical variables as probabilistic variables, we develop a succinct theory of perpetual extraction of work from a generic classical nonequilibrium system subject to a heat bath via repeated measurements under a Markovian feedback control. It is demonstrated that the problem of perpetual extraction of work in a nonequilibrium system reduces to a Markov-chain problem in a higher-dimensional phase space. We derive a version of the detailed fluctuation theorem, originally derived for classical nonequilibrium systems by Horowitz and Vaikuntanathan [Phys. Rev. E 82, 061120 (2010)], in a form suitable for the analysis of perpetual extraction of work. Since our theory is formulated for generic dynamics of the probability distribution function in phase space, its application to a physical system is straightforward. As simple applications of the theory, two exactly solvable models are analyzed: a nonequilibrium two-state system, and a particle confined to a one-dimensional harmonic potential in thermal equilibrium. For the former example, it is demonstrated that the observer can lose energy on the transitory steps to the stationary state and that work larger than that achieved in the stationary state can be extracted. For the latter example, it is demonstrated that the optimal protocol for the extraction of work via repeated measurements can differ from that via a single measurement. The validity of our version of the detailed fluctuation theorem, which determines the upper bound of the expected work in the stationary state, is also confirmed for both examples. These observations provide useful insights into the exploration of realistic modeling of a machine that extracts work from its environment.
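Schematically, and in our own notation rather than the paper's, a fluctuation theorem with feedback takes the form below, where W is the work done on the system, ΔF the free-energy change, and I the information gained by measurement:

\frac{p_F(W, I)}{p_R(-W)} \;=\; e^{\beta (W - \Delta F) + I},
\qquad
\bigl\langle e^{-\beta (W - \Delta F) - I} \bigr\rangle = 1
\;\;\Longrightarrow\;\;
\langle W \rangle \;\ge\; \Delta F - k_B T\, \langle I \rangle .
% The last inequality is what makes work extraction via repeated
% measurements consistent with the second law: the measurement
% information I pays for the extracted work.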
Surprisingly rational: probability theory plus noise explains biases in judgment.
Costello, Fintan; Watts, Paul
2014-07-01
The systematic biases seen in people's probability judgments are typically taken as evidence that people do not use the rules of probability theory when reasoning about probability but instead use heuristics, which sometimes yield reasonable judgments and sometimes yield systematic biases. This view has had a major impact in economics, law, medicine, and other fields; indeed, the idea that people cannot reason with probabilities has become a truism. We present a simple alternative to this view, where people reason about probability according to probability theory but are subject to random variation or noise in the reasoning process. In this account the effect of noise is canceled for some probabilistic expressions. Analyzing data from 2 experiments, we find that, for these expressions, people's probability judgments are strikingly close to those required by probability theory. For other expressions, this account produces systematic deviations in probability estimates. These deviations explain 4 reliable biases in human probabilistic reasoning (conservatism, subadditivity, conjunction, and disjunction fallacies). These results suggest that people's probability judgments embody the rules of probability theory and that biases in those judgments are due to the effects of random noise. (c) 2014 APA, all rights reserved.
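A small simulation makes the cancellation mechanism concrete. Following the abstract's description, each remembered instance of an event is read with a small flip probability d; the joint probabilities and the value of d below are illustrative choices of ours, not the paper's.

import numpy as np

rng = np.random.default_rng(1)
d, n_samples, n_trials = 0.1, 50, 20000   # illustrative noise rate etc.

# Illustrative event structure
pA, pB, pAB = 0.4, 0.5, 0.2
pAorB = pA + pB - pAB

def noisy_estimate(p_true):
    """Estimate a probability by counting noisy memory samples:
    each sample's flag is read incorrectly with probability d."""
    flags = rng.random(n_samples) < p_true
    flips = rng.random(n_samples) < d
    return np.mean(flags ^ flips)

est = lambda p: np.array([noisy_estimate(p) for _ in range(n_trials)])
A, B, AB, AorB = est(pA), est(pB), est(pAB), est(pAorB)

print(A.mean())                      # ~(1-2d)*0.4 + d = 0.42: biased
print((A + B - AB - AorB).mean())    # ~0: the noise terms cancel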
Quantum Kramers model: Corrections to the linear response theory for continuous bath spectrum
NASA Astrophysics Data System (ADS)
Rips, Ilya
2017-01-01
Decay of the metastable state is analyzed within the quantum Kramers model in the weak-to-intermediate dissipation regime. The decay kinetics in this regime is determined by energy exchange between the unstable mode and the stable modes of the thermal bath. In our previous paper [Phys. Rev. A 42, 4427 (1990)], Grabert's perturbative approach to well dynamics in the case of the discrete bath [Phys. Rev. Lett. 61, 1683 (1988)] has been extended to account for the second order terms in the classical equations of motion (EOM) for the stable modes. Account of the secular terms reduces the EOM for the stable modes to those of the forced oscillator with time-dependent frequency (TDF oscillator). An analytic expression for the characteristic function of the energy loss of the unstable mode has been derived in terms of the generating function of the transition probabilities for the quantum forced TDF oscillator. In this paper, the approach is further developed and applied to the case of the continuous frequency spectrum of the bath. The spectral density functions of the bath of stable modes are expressed in terms of the dissipative properties (the friction function) of the original bath. They simplify considerably for one-dimensional systems, where the density of phonon states is constant. Explicit expressions for the fourth order corrections to the linear response theory result for the characteristic function of the energy loss and its cumulants are obtained for the particular case of the cubic potential with Ohmic (Markovian) dissipation. The range of validity of the perturbative approach in this case is determined (γ/ω_b < 0.26), which includes the turnover region. The dominant correction to the linear response theory result is associated with the "work function" and leads to a reduction of the average energy loss and its dispersion. This reduction increases with increasing dissipation strength (up to ~10%) within the range of validity of the approach. We have also calculated corrections to the depopulation factor and the escape rate for the quantum and the classical Kramers models. Results for the classical escape rate are in very good agreement with numerical simulations for high barriers. The results can serve as an additional proof of the robustness and accuracy of the linear response theory.
Theory of mind deficit in adult patients with congenital heart disease.
Chiavarino, Claudia; Bianchino, Claudia; Brach-Prever, Silvia; Riggi, Chiara; Palumbo, Luigi; Bara, Bruno G; Bosco, Francesca M
2015-10-01
This article provides the first assessment of theory of mind, that is, the ability to reason about mental states, in adult patients with congenital heart disease. Patients with congenital heart disease and matched healthy controls were administered classical theory of mind tasks and a semi-structured interview which provides a multidimensional evaluation of theory of mind (Theory of Mind Assessment Scale). The patients with congenital heart disease performed worse than the controls on the Theory of Mind Assessment Scale, whereas they did as well as the control group on the classical theory-of-mind tasks. These findings provide the first evidence that adults with congenital heart disease may display specific impairments in theory of mind. © The Author(s) 2013.
Probability Simulations by Non-Lipschitz Chaos
NASA Technical Reports Server (NTRS)
Zak, Michail
1996-01-01
It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices. Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.
Beauvais, Francis
2013-04-01
The randomized controlled trial (RCT) is the 'gold standard' of modern clinical pharmacology. However, for many practitioners of homeopathy, blind RCTs are an inadequate research tool for testing complex therapies such as homeopathy. Classical probabilities used in the biological sciences and in medicine are only a special case of the generalized theory of probability used in quantum physics. I describe homeopathy trials using a quantum-like statistical model, a model inspired by quantum physics and taking into consideration superposition of states, non-commuting observables, probability interferences, contextuality, etc. The negative effect of blinding on the success of homeopathy trials and the 'smearing effect' ('specific' effects of homeopathy medicine occurring in the placebo group) are described by quantum-like probabilities without supplementary ad hoc hypotheses. The differences in positive outcome rates between placebo and homeopathy groups frequently vanish in centralized blind trials. The model proposed here suggests a way to circumvent such problems in masked homeopathy trials by incorporating in situ randomization/unblinding. In this quantum-like model of homeopathy clinical trials, success in an open-label setting and failure with centralized blind RCTs emerge logically from the formalism. This model suggests that significant differences between placebo and homeopathy in blind RCTs would be found more frequently if in situ randomization/unblinding was used. Copyright © 2013. Published by Elsevier Ltd.
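The key formal ingredient of such quantum-like models is the interference term in the total-probability rule. A standard form in this literature (our notation, with θ encoding the experimental context, such as blinding) is:

P \;=\; P_1 + P_2 + 2\sqrt{P_1 P_2}\,\cos\theta .
% \cos\theta = 0 recovers the classical formula of total probability;
% \cos\theta \neq 0 lets outcome rates depend on the trial's context.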
Unified path integral approach to theories of diffusion-influenced reactions
NASA Astrophysics Data System (ADS)
Prüstel, Thorsten; Meier-Schellersheim, Martin
2017-08-01
Building on mathematical similarities between quantum mechanics and theories of diffusion-influenced reactions, we develop a general approach for computational modeling of diffusion-influenced reactions that is capable of capturing not only the classical Smoluchowski picture but also alternative theories, as is here exemplified by a volume reactivity model. In particular, we prove the path decomposition expansion of various Green's functions describing the irreversible and reversible reaction of an isolated pair of molecules. To this end, we exploit a connection between boundary value and interaction potential problems with δ- and δ'-function perturbations. We employ a known path-integral-based summation of a perturbation series to derive a number of exact identities relating propagators and survival probabilities satisfying different boundary conditions in a unified and systematic manner. Furthermore, we show how the path decomposition expansion represents the propagator as a product of three factors in the Laplace domain that correspond to quantities figuring prominently in stochastic spatially resolved simulation algorithms. This analysis will thus be useful for the interpretation of current and the design of future algorithms. Finally, we discuss the relation between the general approach and the theory of Brownian functionals and calculate the mean residence time for the case of irreversible and reversible reactions.
Open or closed? Dirac, Heisenberg, and the relation between classical and quantum mechanics
NASA Astrophysics Data System (ADS)
Bokulich, Alisa
2004-09-01
This paper describes a long-standing, though little known, debate between Dirac and Heisenberg over the nature of scientific methodology, theory change, and intertheoretic relations. Following Heisenberg's terminology, their disagreements can be summarized as a debate over whether the classical and quantum theories are "open" or "closed." A close examination of this debate sheds new light on the philosophical views of two of the great founders of quantum theory.
The role of a posteriori mathematics in physics
NASA Astrophysics Data System (ADS)
MacKinnon, Edward
2018-05-01
The calculus that co-evolved with classical mechanics relied on definitions of functions and differentials that accommodated physical intuitions. In the early nineteenth century mathematicians began the rigorous reformulation of calculus and eventually succeeded in putting almost all of mathematics on a set-theoretic foundation. Physicists traditionally ignore this rigorous mathematics. Physicists often rely on a posteriori math, a practice of using physical considerations to determine mathematical formulations. This is illustrated by examples from classical and quantum physics. A justification of such practice stems from a consideration of the role of phenomenological theories in classical physics and effective theories in contemporary physics. This relates to the larger question of how physical theories should be interpreted.
NASA Technical Reports Server (NTRS)
Paquette, John A.; Nuth, Joseph A., III
2011-01-01
Classical nucleation theory has been used in models of dust nucleation in circumstellar outflows around oxygen-rich asymptotic giant branch stars. One objection to the application of classical nucleation theory (CNT) to astrophysical systems of this sort is that an equilibrium distribution of clusters (assumed by CNT) is unlikely to exist in such conditions, due to a low collision rate of condensable species. A model of silicate grain nucleation and growth was modified to evaluate the effect of a nucleation flux orders of magnitude below the equilibrium value. The results show that a lack of chemical equilibrium has only a small effect on the ultimate grain distribution.
S-Duality, Deconstruction and Confinement for a Marginal Deformation of N=4 SUSY Yang-Mills
NASA Astrophysics Data System (ADS)
Dorey, Nick
2004-08-01
We study an exactly marginal deformation of N = 4 SUSY Yang-Mills with gauge group U(N) using field theory and string theory methods. The classical theory has a Higgs branch for rational values of the deformation parameter. We argue that the quantum theory also has an S-dual confining branch which cannot be seen classically. The low-energy effective theory on these branches is a six-dimensional non-commutative gauge theory with sixteen supercharges. Confinement of magnetic and electric charges, on the Higgs and confining branches respectively, occurs due to the formation of BPS-saturated strings in the low energy theory. The results also suggest a new way of deconstructing Little String Theory as a large-N limit of a confining gauge theory in four dimensions.
High-pressure phase transitions - Examples of classical predictability
NASA Astrophysics Data System (ADS)
Celebonovic, Vladan
1992-09-01
The applicability of the Savic and Kasanin (1962-1967) classical theory of dense matter to laboratory experiments requiring estimates of high-pressure phase transitions was examined by determining phase transition pressures for a set of 19 chemical substances (including elements, hydrocarbons, metal oxides, and salts) for which experimental data were available. A comparison between experimental transition points and those predicted by the Savic-Kasanin theory showed that the theory can be used for estimating values of transition pressures. The results also support conclusions obtained in previous astronomical applications of the Savic-Kasanin theory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khrennikov, Andrei
We present fundamentals of a prequantum model with hidden variables of the classical field type. In some sense this is the comeback of classical wave mechanics. Our approach also can be considered as incorporation of quantum mechanics into classical signal theory. All quantum averages (including correlations of entangled systems) can be represented as classical signal averages and correlations.
Is quantum theory a form of statistical mechanics?
NASA Astrophysics Data System (ADS)
Adler, S. L.
2007-05-01
We give a review of the basic themes of my recent book: Adler S L 2004 Quantum Theory as an Emergent Phenomenon (Cambridge: Cambridge University Press). We first give motivations for considering the possibility that quantum mechanics is not exact, but is instead an accurate asymptotic approximation to a deeper level theory. For this deeper level, we propose a non-commutative generalization of classical mechanics, that we call "trace dynamics", and we give a brief survey of how it works, considering for simplicity only the bosonic case. We then discuss the statistical mechanics of trace dynamics and give our argument that with suitable approximations, the Ward identities for trace dynamics imply that ensemble averages in the canonical ensemble correspond to Wightman functions in quantum field theory. Thus, quantum theory emerges as the statistical thermodynamics of trace dynamics. Finally, we argue that Brownian motion corrections to this thermodynamics lead to stochastic corrections to the Schrödinger equation, of the type that have been much studied in the "continuous spontaneous localization" model of objective state vector reduction. In appendices to the talk, we give details of the existence of a conserved operator in trace dynamics that encodes the structure of the canonical algebra, of the derivation of the Ward identities, and of the proof that the stochastically-modified Schrödinger equation leads to state vector reduction with Born rule probabilities.
A Theory of Immersion Freezing
NASA Technical Reports Server (NTRS)
Barahona, Donifan
2017-01-01
Immersion freezing is likely involved in the initiation of precipitation and determines to a large extent the phase partitioning in convective clouds. Theoretical models commonly used to describe immersion freezing in atmospheric models are based on classical nucleation theory, which however neglects important interactions near the immersed particle that may affect nucleation rates. This work introduces a new theory of immersion freezing based on two premises. First, immersion ice nucleation is mediated by the modification of the properties of water near the particle-liquid interface, rather than by the geometry of the ice germ. Second, the same mechanism that leads to the decrease in the work of germ formation also decreases the mobility of water molecules near the immersed particle. These two premises allow general thermodynamic constraints to be placed on the ice nucleation rate. Analysis of the new theory shows that active sites likely trigger ice nucleation, but they control neither the overall nucleation rate nor the probability of freezing. It also suggests that materials with different ice nucleation efficiency may exhibit similar freezing temperatures under similar conditions but differ in their sensitivity to particle surface area and cooling rate. Predicted nucleation rates show good agreement with observations for a diverse set of materials including dust, black carbon, and bacterial ice nucleating particles. The application of the new theory within the NASA Global Earth System Model (GEOS-5) is also discussed.
DeVore, Matthew S.; Gull, Stephen F.; Johnson, Carey K.
2012-01-01
We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions. PMID:22338694
ERIC Educational Resources Information Center
Ruckle, L. J.; Belloni, M.; Robinett, R. W.
2012-01-01
The biharmonic oscillator and the asymmetric linear well are two confining power-law-type potentials for which complete bound-state solutions are possible in both classical and quantum mechanics. We examine these problems in detail, beginning with studies of their trajectories in position and momentum space, evaluation of the classical probability…
Simulations of Probabilities for Quantum Computing
NASA Technical Reports Server (NTRS)
Zak, M.
1996-01-01
It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.
Uniting the Spheres: Modern Feminist Theory and Classic Texts in AP English
ERIC Educational Resources Information Center
Drew, Simao J. A.; Bosnic, Brenda G.
2008-01-01
High school teachers Simao J. A. Drew and Brenda G. Bosnic help familiarize students with gender role analysis and feminist theory. Students examine classic literature and contemporary texts, considering characters' historical, literary, and social contexts while expanding their understanding of how patterns of identity and gender norms exist and…
Aesthetic Creativity: Insights from Classical Literary Theory on Creative Learning
ERIC Educational Resources Information Center
Hellstrom, Tomas Georg
2011-01-01
This paper addresses the subject of textual creativity by drawing on work done in classical literary theory and criticism, specifically new criticism, structuralism and early poststructuralism. The question of how readers and writers engage creatively with the text is closely related to educational concerns, though they are often thought of as…
ERIC Educational Resources Information Center
Bazaldua, Diego A. Luna; Lee, Young-Sun; Keller, Bryan; Fellers, Lauren
2017-01-01
The performance of various classical test theory (CTT) item discrimination estimators has been compared in the literature using both empirical and simulated data, resulting in mixed results regarding the preference of some discrimination estimators over others. This study analyzes the performance of various item discrimination estimators in CTT:…
Louis Guttman's Contributions to Classical Test Theory
ERIC Educational Resources Information Center
Zimmerman, Donald W.; Williams, Richard H.; Zumbo, Bruno D.; Ross, Donald
2005-01-01
This article focuses on Louis Guttman's contributions to the classical theory of educational and psychological tests, one of the lesser known of his many contributions to quantitative methods in the social sciences. Guttman's work in this field provided a rigorous mathematical basis for ideas that, for many decades after Spearman's initial work,…
Generalization of the Activated Complex Theory of Reaction Rates. II. Classical Mechanical Treatment
DOE R&D Accomplishments Database
Marcus, R. A.
1964-01-01
In its usual classical form activated complex theory assumes a particular expression for the kinetic energy of the reacting system -- one associated with a rectilinear motion along the reaction coordinate. The derivation of the rate expression given in the present paper is based on the general kinetic energy expression.
NASA Astrophysics Data System (ADS)
Yang, Chen
2018-05-01
The transition from classical theories to quantum theories has attracted much interest. This paper demonstrates the analogy between the electromagnetic potentials and wave-like dynamic variables, with their connections to quantum theory, for audiences at advanced undergraduate level and above. In the first part, the counterpart relations in classical electrodynamics (e.g., the gauge transformation and the Lorenz condition) and classical mechanics (e.g., the Legendre transformation and the free-particle condition) are presented. These relations lead to similar governing equations for the field variables and dynamic variables. The Lorenz gauge, scalar potential and vector potential manifest a one-to-one similarity to the action, Hamiltonian and momentum, respectively. In the second part, the connections between the classical pictures of the electromagnetic field and the particle and the quantum picture are presented. By characterising the states of the electromagnetic field and of the particle via their corresponding variables, one finds that their evolution pictures manifest the same (isomorphic) algebraic structure. Subsequently, the pictures of the electromagnetic field and the particle are compared to the quantum picture and their interconnections are given. A brief summary of the obtained results is presented at the end of the paper.
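The correspondence, as we read the abstract (signs and factors depend on conventions), can be summarized as:

\mathbf{A} \;\leftrightarrow\; \mathbf{p} = \nabla S, \qquad
\varphi \;\leftrightarrow\; H = -\,\partial_t S, \qquad
\chi \;\leftrightarrow\; S ,
% so that the gauge transformation of electrodynamics,
\mathbf{A}' = \mathbf{A} + \nabla \chi, \qquad \varphi' = \varphi - \partial_t \chi ,
% mirrors the way momentum and Hamiltonian derive from the action in
% Hamilton-Jacobi theory.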
Operator Formulation of Classical Mechanics.
ERIC Educational Resources Information Center
Cohn, Jack
1980-01-01
Discusses the construction of an operator formulation of classical mechanics which is directly concerned with wave packets in configuration space and is more similar to that of conventional quantum theory than other extant operator formulations of classical mechanics. (Author/HM)
Epistemic View of Quantum States and Communication Complexity of Quantum Channels
NASA Astrophysics Data System (ADS)
Montina, Alberto
2012-09-01
The communication complexity of a quantum channel is the minimal amount of classical communication required for classically simulating a process of state preparation, transmission through the channel, and subsequent measurement. It establishes a limit on the power of quantum communication in terms of classical resources. We show that classical simulations employing a finite amount of communication can be derived from a special class of hidden variable theories where quantum states represent statistical knowledge about the classical state and not an element of reality. This special class has attracted strong interest very recently. The communication cost of each derived simulation is given by the mutual information between the quantum state and the classical state of the parent hidden variable theory. Finally, we find that the communication complexity for single qubits is smaller than 1.28 bits. The previously known upper bound was 1.85 bits.
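In symbols, with Ψ the quantum state and Λ the classical state of the parent hidden-variable theory (our notation), the communication cost of a derived simulation is:

C \;=\; I(\Psi ; \Lambda) \;=\; H(\Lambda) - H(\Lambda \,|\, \Psi) ,
% which for single qubits the paper bounds by C < 1.28 bits.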
An Analysis of Categorical and Quantitative Methods for Planning Under Uncertainty
Langlotz, Curtis P.; Shortliffe, Edward H.
1988-01-01
Decision theory and logical reasoning are both methods for representing and solving medical decision problems. We analyze the usefulness of these two approaches to medical therapy planning by establishing a simple correspondence between decision theory and non-monotonic logic, a formalization of categorical logical reasoning. The analysis indicates that categorical approaches to planning can be viewed as comprising two decision-theoretic concepts: probabilities (degrees of belief in planning hypotheses) and utilities (degrees of desirability of planning outcomes). We present and discuss examples of the following lessons from this decision-theoretic view of categorical (nonmonotonic) reasoning: (1) Decision theory and artificial intelligence techniques are intended to solve different components of the planning problem. (2) When considered in the context of planning under uncertainty, nonmonotonic logics do not retain the domain-independent characteristics of classical logical reasoning for planning under certainty. (3) Because certain nonmonotonic programming paradigms (e.g., frame-based inheritance, rule-based planning, protocol-based reminders) are inherently problem-specific, they may be inappropriate to employ in the solution of certain types of planning problems. We discuss how these conclusions affect several current medical informatics research issues, including the construction of “very large” medical knowledge bases.
NASA Astrophysics Data System (ADS)
Kvinge, Henry
We prove two results at the intersection of Lie theory and the representation theory of symmetric groups, Hecke algebras, and their generalizations. The first is a categorification of the crystal isomorphism B^{1,1} ⊗ B(Λ_i) ≅ B(Λ_{σ(i)}). Here B(Λ_i) and B(Λ_{σ(i)}) are two affine type highest weight crystals of weight Λ_i and Λ_{σ(i)} respectively, σ is a specific map from the Dynkin indexing set I to itself, and B^{1,1} is a Kirillov-Reshetikhin crystal. We show that this crystal isomorphism is in fact the shadow of a richer module-theoretic phenomenon in the representation theory of Khovanov-Lauda-Rouquier algebras of classical affine type. Our second result identifies the center End_{H'}(1) of Khovanov's Heisenberg category H' as the algebra of shifted symmetric functions Λ* of Okounkov and Olshanski, i.e., End_{H'}(1) ≅ Λ*. This isomorphism provides us with a graphical calculus for Λ*. It also allows us to describe End_{H'}(1) in terms of the transition and co-transition measure of Kerov and the noncommutative probability spaces of Biane.
There are no particles, there are only fields
NASA Astrophysics Data System (ADS)
Hobson, Art
2013-03-01
Quantum foundations are still unsettled, with mixed effects on science and society. By now it should be possible to obtain consensus on at least one issue: Are the fundamental constituents fields or particles? As this paper shows, experiment and theory imply that unbounded fields, not bounded particles, are fundamental. This is especially clear for relativistic systems, implying that it's also true of nonrelativistic systems. Particles are epiphenomena arising from fields. Thus, the Schrödinger field is a space-filling physical field whose value at any spatial point is the probability amplitude for an interaction to occur at that point. The field for an electron is the electron; each electron extends over both slits in the two-slit experiment and spreads over the entire pattern; and quantum physics is about interactions of microscopic systems with the macroscopic world rather than just about measurements. It's important to clarify this issue because textbooks still teach a particles- and measurement-oriented interpretation that contributes to bewilderment among students and pseudoscience among the public. This article reviews classical and quantum fields, the two-slit experiment, rigorous theorems showing particles are inconsistent with relativistic quantum theory, and several phenomena showing particles are incompatible with quantum field theories.
Extended theory of harmonic maps connects general relativity to chaos and quantum mechanism
Ren, Gang; Duan, Yi-Shi
2017-07-20
General relativity and quantum mechanics are two separate pillars of modern physics explaining how nature works. Both theories are accurate, but a direct connection between them has not yet been clarified. Recently, researchers have blurred the line between classical and quantum physics by connecting chaos and entanglement. Here we show that Duan's extended HM theory, which admits the solutions of general relativity, can also admit the solutions of the classic chaos equations and even the solution of the Schrödinger equation in quantum physics, suggesting that the extended theory of harmonic maps may act as a universal theory of physics.
Quantum Capacity under Adversarial Quantum Noise: Arbitrarily Varying Quantum Channels
NASA Astrophysics Data System (ADS)
Ahlswede, Rudolf; Bjelaković, Igor; Boche, Holger; Nötzel, Janis
2013-01-01
We investigate entanglement transmission over an unknown channel in the presence of a third party (called the adversary), which is enabled to choose the channel from a given set of memoryless but non-stationary channels without informing the legitimate sender and receiver about the particular choice that he made. This channel model is called an arbitrarily varying quantum channel (AVQC). We derive a quantum version of Ahlswede's dichotomy for classical arbitrarily varying channels. This includes a regularized formula for the common randomness-assisted capacity for entanglement transmission of an AVQC. Quite surprisingly and in contrast to the classical analog of the problem involving the maximal and average error probability, we find that the capacity for entanglement transmission of an AVQC always equals its strong subspace transmission capacity. These results are accompanied by different notions of symmetrizability (zero-capacity conditions) as well as by conditions for an AVQC to have a capacity described by a single-letter formula. In the final part of the paper the capacity of the erasure-AVQC is computed and some light shed on the connection between AVQCs and zero-error capacities. Additionally, we show by entirely elementary and operational arguments motivated by the theory of AVQCs that the quantum, classical, and entanglement-assisted zero-error capacities of quantum channels are generically zero and are discontinuous at every positivity point.
Quantization and Quantum-Like Phenomena: A Number Amplitude Approach
NASA Astrophysics Data System (ADS)
Robinson, T. R.; Haven, E.
2015-12-01
Historically, quantization has meant turning the dynamical variables of classical mechanics that are represented by numbers into their corresponding operators. Thus the relationships between classical variables determine the relationships between the corresponding quantum mechanical operators. Here, we take a radically different approach to this conventional quantization procedure. Our approach does not rely on any relations based on classical Hamiltonian or Lagrangian mechanics nor on any canonical quantization relations, nor even on any preconceptions of particle trajectories in space and time. Instead we examine the symmetry properties of certain Hermitian operators with respect to phase changes. This introduces harmonic operators that can be identified with a variety of cyclic systems, from clocks to quantum fields. These operators are shown to have the characteristics of creation and annihilation operators that constitute the primitive fields of quantum field theory. Such an approach not only allows us to recover the Hamiltonian equations of classical mechanics and the Schrödinger wave equation from the fundamental quantization relations, but also, by freeing the quantum formalism from any physical connotation, makes it more directly applicable to non-physical, so-called quantum-like systems. Over the past decade or so, there has been a rapid growth of interest in such applications. These include, the use of the Schrödinger equation in finance, second quantization and the number operator in social interactions, population dynamics and financial trading, and quantum probability models in cognitive processes and decision-making. In this paper we try to look beyond physical analogies to provide a foundational underpinning of such applications.
Quantum correlations and dynamics from classical random fields valued in complex Hilbert spaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khrennikov, Andrei
2010-08-15
One of the crucial differences between mathematical models of classical and quantum mechanics (QM) is the use of the tensor product of the state spaces of subsystems as the state space of the corresponding composite system. (To describe an ensemble of classical composite systems, one uses random variables taking values in the Cartesian product of the state spaces of subsystems.) We show that, nevertheless, it is possible to establish a natural correspondence between the classical and the quantum probabilistic descriptions of composite systems. Quantum averages for composite systems (including entangled ones) can be represented as averages with respect to classical random fields. It is essentially what Albert Einstein dreamed of. QM is represented as classical statistical mechanics with infinite-dimensional phase space. While the mathematical construction is completely rigorous, its physical interpretation is a complicated problem. We present the basic physical interpretation of prequantum classical statistical field theory in Sec. II. However, this is only the first step toward real physical theory.
Introduction to Quantum Intelligence
NASA Technical Reports Server (NTRS)
Zak, Michail
1996-01-01
An impact of ideas associated with the concept of a hypothetical quantum computer upon classical computing is analyzed. Two fundamental properties of quantum computing: direct simulations of probabilities, and influence between different branches of probabilistic scenarios, as well as their classical versions, are discussed.
Opening Switch Research on a Plasma Focus VI.
1988-02-26
Sausage Instability in the Plasma Focus: In this section the classical Kruskal-Schwarzschild theory for the sausage mode is applied to the pinch phase... The effects of (1) the shape of the pinch, (2) axial flow of plasma, and (3) self-generated magnetic fields are also presented. The Kruskal-Schwarzschild theory is the classical MHD theory for the m=0 mode in a plasma supported by a magnetic field against gravity; this is the well-known Kruskal-Schwarzschild...
Nanoscale Capillary Flows in Alumina: Testing the Limits of Classical Theory.
Lei, Wenwen; McKenzie, David R
2016-07-21
Anodic aluminum oxide (AAO) membranes have well-formed cylindrical channels, as small as 10 nm in diameter, in a close packed hexagonal array. The channels in AAO membranes simulate very small leaks that may be present for example in an aluminum oxide device encapsulation. The 10 nm alumina channel is the smallest that has been studied to date for its moisture flow properties and provides a stringent test of classical capillary theory. We measure the rate at which moisture penetrates channels with diameters in the range of 10 to 120 nm with moist air present at 1 atm on one side and dry air at the same total pressure on the other. We extend classical theory for water leak rates at high humidities by allowing for variable meniscus curvature at the entrance and show that the extended theory explains why the flow increases greatly when capillary filling occurs and enables the contact angle to be determined. At low humidities our measurements for air-filled channels agree well with theory for the interdiffusive flow of water vapor in air. The flow rate of water-filled channels is one order of magnitude less than expected from classical capillary filling theory and is coincidentally equal to the helium flow rate, validating the use of helium leak testing for evaluating moisture flows in aluminum oxide leaks.
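As a rough order-of-magnitude companion to this abstract, the sketch below compares the two classical regimes it tests: Fickian interdiffusion of vapor through an air-filled channel versus Poiseuille flow through a capillary-filled one. Every parameter value (channel length, diffusivity, contact angle, etc.) is an assumption of ours for illustration, and continuum formulas are used without Knudsen corrections.

import numpy as np

r, L = 5e-9, 60e-6          # channel radius (10 nm dia.) and length [m]
T, R = 298.0, 8.314         # temperature [K], gas constant [J/(mol K)]

# Low humidity: Fickian interdiffusion of water vapor in air
D = 2.5e-5                  # vapor-in-air diffusivity [m^2/s]
p_sat = 3169.0              # saturation vapor pressure at 298 K [Pa]
dC = p_sat / (R * T)        # concentration difference, 100% RH vs dry
n_diff = D * dC * np.pi * r**2 / L          # molar flow [mol/s]

# High humidity: capillary-filled channel, Poiseuille flow
gamma, theta, mu = 0.072, np.radians(30), 8.9e-4
dP = 2 * gamma * np.cos(theta) / r          # capillary pressure [Pa]
Q = np.pi * r**4 * dP / (8 * mu * L)        # volumetric flow [m^3/s]
n_cap = Q * 1000.0 / 0.018                  # mol/s (liquid water)

print(f"diffusive: {n_diff:.2e} mol/s, capillary: {n_cap:.2e} mol/s")
# With these toy numbers classical theory predicts a jump of roughly two
# orders of magnitude on capillary filling; the paper measures a smaller
# water-filled flow than this classical prediction.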
Two-proton pickup studies with the (6Li,8B) reaction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weisenmiller, R.B.
1976-12-03
The (⁶Li,⁸B) reaction has been investigated on targets of ²⁶Mg, ²⁴Mg, ¹⁶O, ¹³C, ¹²C, ¹¹B, ¹⁰B, and ⁹Be at a bombarding energy of 80.0 MeV, and on targets of ¹⁶O, ¹²C, ⁹Be, ⁷Li, and ⁶Li at a bombarding energy of 93.3 MeV. Only levels consistent with direct, single-step two-proton pickup reaction mechanisms were observed to be strongly populated. On T_z = 0 targets, the spectroscopic selectivity of this reaction resembles that of the analogous (p,t) reaction. Additionally, these data demonstrate the dominance of spatially symmetric transfer of the two protons. On T_z > 0 targets the (⁶Li,⁸B) reaction was employed to locate two previously unreported levels (at 7.47 ± 0.05 MeV and 8.86 ± 0.07 MeV) in the T_z = 2 nuclide ²⁴Ne and to establish the low-lying 1p-shell states in the T_z = 3/2 nuclei ¹¹Be, ⁹Li, and ⁷He. However, no evidence was seen for any narrow levels in the T_z = 3/2 nuclide ⁵H nor for any narrow excited states in ⁷He. The angular distributions reported here are rather featureless and decrease monotonically with increasing angle. This behavior can be shown by a semi-classical reaction theory to be a consequence of the reaction kinematics. A semi-classical approach also suggests that the kinematic term in the transition matrix element is only weakly dependent upon the angular momentum transfer (which is consistent with simple Distorted Wave Born Approximation calculations). However, only qualitative agreement was obtained between the observed relative transition yields and semi-classical predictions, using the two-nucleon coefficients of fractional parentage of Cohen and Kurath, probably due to the limitations of the semi-classical reaction theory.
A physically constrained classical description of the homogeneous nucleation of ice in water.
Koop, Thomas; Murray, Benjamin J
2016-12-07
Liquid water can persist in a supercooled state to below 238 K in the Earth's atmosphere, a temperature range where homogeneous nucleation becomes increasingly probable. However, the rate of homogeneous ice nucleation in supercooled water is poorly constrained, in part, because supercooled water eludes experimental scrutiny in the region of the homogeneous nucleation regime where it can exist only fleetingly. Here we present a new parameterization of the rate of homogeneous ice nucleation based on classical nucleation theory. In our approach, we constrain the key terms in classical theory, i.e., the diffusion activation energy and the ice-liquid interfacial energy, with physically consistent parameterizations of the pertinent quantities. The diffusion activation energy is related to the translational self-diffusion coefficient of water for which we assess a range of descriptions and conclude that the most physically consistent fit is provided by a power law. The other key term is the interfacial energy between the ice embryo and supercooled water whose temperature dependence we constrain using the Turnbull correlation, which relates the interfacial energy to the difference in enthalpy between the solid and liquid phases. The only adjustable parameter in our model is the absolute value of the interfacial energy at one reference temperature. That value is determined by fitting this classical model to a selection of laboratory homogeneous ice nucleation data sets between 233.6 K and 238.5 K. On extrapolation to temperatures below 233 K, into a range not accessible to standard techniques, we predict that the homogeneous nucleation rate peaks between about 227 and 231 K at a maximum nucleation rate many orders of magnitude lower than previous parameterizations suggest. This extrapolation to temperatures below 233 K is consistent with the most recent measurement of the ice nucleation rate in micrometer-sized droplets at temperatures of 227-232 K on very short time scales using an X-ray laser technique. In summary, we present a new physically constrained parameterization for homogeneous ice nucleation which is consistent with the latest literature nucleation data and our physical understanding of the properties of supercooled water.
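The skeleton of such a parameterization is compact. The sketch below is a schematic classical-nucleation-theory rate with toy coefficients of ours; the paper's actual constraints on the interfacial energy (via the Turnbull correlation) and on the diffusion activation energy (via a power-law fit to the self-diffusion coefficient) are not reproduced.

import numpy as np

k_B = 1.381e-23  # Boltzmann constant [J/K]

def cnt_rate(T, sigma, dG_v, dF_diff, prefactor=1e41):
    """Schematic CNT homogeneous nucleation rate [m^-3 s^-1].
    sigma:   ice-liquid interfacial energy [J/m^2]
    dG_v:    volumetric Gibbs energy gain of freezing [J/m^3]
    dF_diff: diffusion activation energy [J]
    Functional forms and the prefactor are illustrative only."""
    dG_star = 16 * np.pi * sigma**3 / (3 * dG_v**2)  # germ-formation barrier
    return prefactor * np.exp(-(dG_star + dF_diff) / (k_B * T))

# Toy numbers near 235 K (illustrative, not the paper's fit):
print(f"{cnt_rate(235.0, 0.022, 4.2e7, 6.0e-20):.2e}")  # ~1e19 m^-3 s^-1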
NASA Astrophysics Data System (ADS)
Miller, Steven David
1999-10-01
A consistent extension of the Oppenheimer-Snyder gravitational collapse formalism is presented which incorporates stochastic, conformal, vacuum fluctuations of the metric tensor. This results in a tractable approach to studying the possible effects of vacuum fluctuations on collapse and singularity formation. The motivation here is that it is known that coupling stochastic noise to a classical field theory can lead to workable methodologies that accommodate or reproduce many aspects of quantum theory, turbulence or structure formation. The effect of statistically averaging over the metric fluctuations gives the appearance of a deterministic Riemannian structure, with an induced non-vanishing cosmological constant arising from the nonlinearity. The Oppenheimer-Snyder collapse of a perfect fluid or dust star in the fluctuating or `turbulent' spacetime is reformulated in terms of nonlinear Einstein-Langevin field equations, with an additional noise source in the energy-momentum tensor. The smooth deterministic worldlines of collapsing matter within the classical Oppenheimer-Snyder model now become nonlinear Brownian motions due to the backreaction induced by vacuum fluctuations. As the star collapses, the matter worldlines become increasingly randomized, since the backreaction coupling to the vacuum fluctuations is nonlinear; the input assumptions of the Hawking-Penrose singularity theorems should then be violated. Solving the nonlinear Einstein-Langevin field equation for collapse - via the Ito interpretation - gives a singularity-free solution, which is equivalent to the original Oppenheimer solution but with higher-order stochastic corrections; the original singular solution is recovered in the limit of zero vacuum fluctuations. The `geometro-hydrodynamics' of noisy gravitational collapse were also translated into an equivalent mathematical formulation in terms of nonlinear Einstein-Fokker-Planck (EFP) continuity equations with respect to comoving coordinates: these describe the collapse as a conserved flow of probability. A solution was found in the dilute limit of weak fluctuations, where the EFP equation is linearized. There is zero probability that the star collapses to a singular state in the presence of background vacuum fluctuations, but the singularity returns with unit probability when the fluctuations are reduced to zero. Finally, an EFP equation was considered with respect to standard exterior coordinates. Using the thermal Brownian motion paradigm, an exact stationary or equilibrium solution was found in the infinite standard time relaxation limit. The solution gives the conditions required for the final collapsed object (a black hole) to be in thermal equilibrium with the background vacuum fluctuations. From this solution, one recovers the Hawking temperature without using field theory. The stationary solution then seems to correspond to a black hole in thermal equilibrium with a fluctuating conformal scalar field, or the Hawking-Hartle state.
Hamilton-Jacobi theory in multisymplectic classical field theories
NASA Astrophysics Data System (ADS)
de León, Manuel; Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso; Vilariño, Silvia
2017-09-01
The geometric framework for the Hamilton-Jacobi theory developed in the studies of Cariñena et al. [Int. J. Geom. Methods Mod. Phys. 3(7), 1417-1458 (2006)], Cariñena et al. [Int. J. Geom. Methods Mod. Phys. 13(2), 1650017 (2015)], and de León et al. [Variations, Geometry and Physics (Nova Science Publishers, New York, 2009)] is extended for multisymplectic first-order classical field theories. The Hamilton-Jacobi problem is stated for the Lagrangian and the Hamiltonian formalisms of these theories as a particular case of a more general problem, and the classical Hamilton-Jacobi equation for field theories is recovered from this geometrical setting. Particular and complete solutions to these problems are defined and characterized in several equivalent ways in both formalisms, and the equivalence between them is proved. The use of distributions in jet bundles that represent the solutions to the field equations is the fundamental tool in this formulation. Some examples are analyzed and, in particular, the Hamilton-Jacobi equation for non-autonomous mechanical systems is obtained as a special case of our results.
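For orientation, the mechanical Hamilton-Jacobi equation that this construction generalizes, together with one common covariant (De Donder-Weyl type) form for first-order field theories; the notation here is generic rather than the paper's.

```latex
% Mechanics: a single principal function S(q, t).
\[
  \frac{\partial S}{\partial t}
  + H\!\left(q^{i},\, \frac{\partial S}{\partial q^{i}},\, t\right) = 0
\]
% First-order field theory: one function S^{\mu} per spacetime direction x^{\mu},
% with fields y^{a}; summation over repeated indices.
\[
  \frac{\partial S^{\mu}}{\partial x^{\mu}}
  + H\!\left(x^{\nu},\, y^{a},\, \frac{\partial S^{\mu}}{\partial y^{a}}\right) = 0
\]
```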
Properties of the Boltzmann equation in the classical approximation
Epelbaum, Thomas; Gelis, François; Tanji, Naoto; ...
2014-12-30
We examine the Boltzmann equation with elastic point-like scalar interactions in two different versions of the classical approximation. Solving the Boltzmann equation numerically with the unapproximated collision term poses no problem, which allows one to study the effect of the ultraviolet cutoff in these approximations. This cutoff dependence in the classical approximations of the Boltzmann equation is closely related to the non-renormalizability of the classical statistical approximation of the underlying quantum field theory. The kinetic theory setup that we consider here allows one to study the dependence on the ultraviolet cutoff in a much simpler way, since one also has access to the non-approximated result for comparison.
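Schematically, the contrast at issue can be written for 2 → 2 scattering of bosons with occupation numbers f_i; this is the generic gain-loss structure, assumed here rather than quoted from the paper.

```latex
% Full quantum-statistical collision term (gain minus loss):
\[
  C[f] \;\propto\; f_{1} f_{2} (1+f_{3})(1+f_{4})
  \;-\; f_{3} f_{4} (1+f_{1})(1+f_{2})
\]
% Classical approximation: keep only the leading (cubic) terms at large f,
% which is where the ultraviolet cutoff sensitivity enters:
\[
  C_{\mathrm{cl}}[f] \;\propto\; f_{1} f_{2} (f_{3}+f_{4})
  \;-\; f_{3} f_{4} (f_{1}+f_{2})
\]
```

The quartic terms cancel exactly between gain and loss, so the classical form differs from the full one only by the dropped quadratic terms, which matter precisely in the ultraviolet where occupation numbers are small.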
On the motion of classical three-body system with consideration of quantum fluctuations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gevorkyan, A. S., E-mail: g-ashot@sci.am
2017-03-15
We obtain the system of stochastic differential equations (SDEs) which describes the classical motion of the three-body system under the influence of quantum fluctuations. Using these SDEs, we derive a second-order partial differential equation for the joint probability distribution of the total momentum of the system of bodies. It is shown that the equation for the probability distribution is solved jointly with the classical equations, which in turn are responsible for the topological peculiarities of the tubes of quantum currents, for transitions between asymptotic channels and, accordingly, for the emergence of quantum chaos.
ERIC Educational Resources Information Center
MacMillan, Peter D.
2000-01-01
Compared classical test theory (CTT), generalizability theory (GT), and multifaceted Rasch model (MFRM) approaches to detecting and correcting for rater variability using responses of 4,930 high school students graded by 3 raters on 9 scales. The MFRM approach identified far more raters as different than did the CTT analysis. GT and Rasch…
Marshaling Resources: A Classic Grounded Theory Study of Online Learners
ERIC Educational Resources Information Center
Yalof, Barbara
2012-01-01
Students who enroll in online courses comprise one quarter of an increasingly diverse student body in higher education today. Yet, it is not uncommon for an online program to lose over 50% of its enrolled students prior to graduation. This study used a classic grounded theory qualitative methodology to investigate the persistent problem of…
ERIC Educational Resources Information Center
Gotsch-Thomson, Susan
1990-01-01
Describes how gender is integrated into a classical social theory course by including a female theorist in the reading assignments and using "The Handmaid's Tale" by Margaret Atwood as the basis for class discussion. Reviews the course objectives and readings; describes the process of the class discussions; and provides student…
The Development of Bayesian Theory and Its Applications in Business and Bioinformatics
NASA Astrophysics Data System (ADS)
Zhang, Yifei
2018-03-01
Bayesian Theory originated from an essay by the British mathematician Thomas Bayes in 1763, and after its development in the 20th century, Bayesian Statistics has come to play a significant part in the statistical study of all fields. Due to recent breakthroughs in high-dimensional integration, Bayesian Statistics has been improved and perfected, and now it can be used to solve problems that Classical Statistics failed to solve. This paper summarizes the history, concepts and applications of Bayesian Statistics, which are illustrated in five parts: the history of Bayesian Statistics, the weakness of Classical Statistics, Bayesian Theory, its development, and its applications. The first two parts make a comparison between Bayesian Statistics and Classical Statistics in a macroscopic aspect. The last three parts focus on Bayesian Theory specifically: from introducing particular Bayesian concepts to tracing their development and finally their applications.
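A minimal worked example of the Bayesian update in question, using a conjugate Beta-binomial pair so the posterior is available in closed form; the numbers are illustrative.

```python
# Minimal illustration of a Bayesian update: a Beta(a, b) prior on an unknown
# success probability theta, combined with k successes in n binomial trials,
# yields a Beta(a + k, b + n - k) posterior exactly (no integration needed).
a, b = 2, 2          # prior pseudo-counts (an assumption for illustration)
k, n = 7, 10         # observed data: 7 successes in 10 trials

post_a, post_b = a + k, b + (n - k)
prior_mean = a / (a + b)
post_mean = post_a / (post_a + post_b)

print(f"prior mean of theta    : {prior_mean:.3f}")
print(f"posterior mean of theta: {post_mean:.3f}")  # pulled from 0.5 toward the data
```

Conjugate pairs like this predate the high-dimensional-integration breakthroughs the abstract mentions; the modern methods matter precisely where no such closed form exists.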
ERIC Educational Resources Information Center
Gómez-Torres, Emilse; Batanero, Carmen; Díaz, Carmen; Contreras, José Miguel
2016-01-01
In this paper we describe the development of a questionnaire designed to assess the probability content knowledge of prospective primary school teachers. Three components of mathematical knowledge for teaching and three different meanings of probability (classical, frequentist and subjective) are considered. The questionnaire content is based on…
NASA Astrophysics Data System (ADS)
Mojahedi, Mahdi; Shekoohinejad, Hamidreza
2018-02-01
In this paper, the temperature distribution in a continuous and pulsed end-pumped Nd:YAG rod crystal is determined using nonclassical and classical heat conduction theories. To find the temperature distribution in the crystal, the heat transfer differential equations of the crystal, with the appropriate boundary conditions, are derived based on the non-Fourier model, and the temperature distribution of the crystal is obtained by an analytical method. Then, by transferring the non-Fourier differential equations to matrix equations, using the finite element method, the temperature and stress at every point of the crystal are calculated in the time domain. According to the results, a comparison between classical and nonclassical theories is presented to investigate rupture power values. In continuous end pumping with equal input powers, non-Fourier theory predicts greater temperature and stress than Fourier theory. It also shows that with an increase in relaxation time, crystal rupture power decreases. In contrast, in single rectangular pulsed end-pumping with equal input power, Fourier theory indicates higher temperature and stress than non-Fourier theory. It is also observed that, when the relaxation time increases, the maximum temperature and stress decrease.
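The non-Fourier model in analyses of this kind is typically of Cattaneo-Vernotte (hyperbolic) type, which inserts a relaxation time τ into Fourier's law; the form below is the standard one and is an assumption here, not quoted from the paper (source terms omitted).

```latex
% Cattaneo-Vernotte flux law and the resulting hyperbolic heat equation
\[
  \tau\,\frac{\partial \mathbf{q}}{\partial t} + \mathbf{q} = -k\,\nabla T
  \quad\Longrightarrow\quad
  \tau\,\frac{\partial^{2} T}{\partial t^{2}} + \frac{\partial T}{\partial t}
  = \alpha\,\nabla^{2} T
\]
```

Fourier conduction is recovered as τ → 0, which is why the two theories agree at long times but can rank temperatures and stresses differently for short pulses.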
Probability evolution method for exit location distribution
NASA Astrophysics Data System (ADS)
Zhu, Jinjie; Chen, Zhen; Liu, Xianbin
2018-03-01
The exit problem in the framework of large deviation theory has been a hot topic in the past few decades. The most probable escape path in the weak-noise limit is characterized by the Freidlin-Wentzell action functional. However, noise in real physical systems cannot be arbitrarily small, and noise of finite strength may induce nontrivial phenomena, such as noise-induced shift and noise-induced saddle-point avoidance. Traditional Monte Carlo simulation of noise-induced escape takes an exponentially long time as the noise approaches zero, with the majority of the time wasted on uninteresting wandering around the attractors. In this paper, a new method is proposed to decrease the escape simulation time by an exponentially large factor by introducing a series of interfaces and applying reinjection on them. The method can be used to calculate the exit location distribution. It is verified on two classical examples and compared with theoretical predictions. The results show that the method performs well for weak noise but may exhibit certain deviations for large noise. Finally, some possible ways to improve the method are discussed.
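The brute-force baseline whose cost the proposed method attacks can be sketched directly; below is a toy double-well example in which the potential, noise strengths, and all parameters are assumptions for illustration.

```python
# Naive Monte Carlo estimate of noise-induced escape from a potential well,
# the brute-force baseline whose exponential cost motivates interface methods.
# Overdamped Langevin dynamics in V(x) = x^4/4 - x^2/2 (a toy choice).
import numpy as np

rng = np.random.default_rng(1)

def mean_escape_time(eps, n_paths=50, dt=1e-3, t_max=1e4):
    """Average first-passage time from the left well (x = -1) past the barrier at x = 0."""
    times = []
    for _ in range(n_paths):
        x, t = -1.0, 0.0
        while x < 0.0 and t < t_max:
            x += -(x**3 - x) * dt + np.sqrt(2 * eps * dt) * rng.normal()
            t += dt
        times.append(t)
    return np.mean(times)

for eps in (0.2, 0.15, 0.1):   # cost grows roughly as exp(dV/eps), with dV = 1/4
    print(f"eps = {eps}: mean escape time ~ {mean_escape_time(eps):.1f}")
```

Most of each trajectory is spent rattling around x = -1, which is exactly the wasted effort the reinjection-on-interfaces construction is designed to eliminate.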
NASA Astrophysics Data System (ADS)
Afanas'ev, V. P.; Gryazev, A. S.; Efremenko, D. S.; Kaplya, P. S.; Kuznetcova, A. V.
2017-12-01
Precise knowledge of the differential inverse inelastic mean free path (DIIMFP) and differential surface excitation probability (DSEP) of tungsten is essential for many fields of materials science. In this paper, a fitting algorithm is applied to extract the DIIMFP and DSEP from X-ray photoelectron spectra and electron energy loss spectra. The algorithm uses the partial intensity approach as a forward model, in which a spectrum is given as a weighted sum of cross-convolved DIIMFPs and DSEPs. The weights are obtained as solutions of the Riccati and Lyapunov equations derived from the invariant imbedding principle. The inversion algorithm utilizes a parametrization of DIIMFPs and DSEPs based on a classical Lorentz oscillator. Unknown parameters of the model are found by a fitting procedure which minimizes the residual between measured spectra and forward simulations. It is found that the surface layer of tungsten contains several sublayers with corresponding Langmuir resonances. The thicknesses of these sublayers are proportional to the periods of the corresponding Langmuir oscillations, as predicted by the theory of R.H. Ritchie.
Gambini, R; Pullin, J
2000-12-18
We consider general relativity with a cosmological constant as a perturbative expansion around a completely solvable diffeomorphism-invariant field theory. This theory is the Λ → ∞ limit of general relativity. This allows an explicit perturbative computational setup in which the quantum states of the theory and the classical observables can be explicitly computed. An unexpected relationship arises at the quantum level between the discrete spectrum of the volume operator and the allowed values of the cosmological constant.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lusanna, Luca
2004-08-19
The four (electromagnetic, weak, strong and gravitational) interactions are described by singular Lagrangians and by the Dirac-Bergmann theory of Hamiltonian constraints. As a consequence, a subset of the original configuration variables are gauge variables, not determined by the equations of motion. Only at the Hamiltonian level is it possible to separate the gauge variables from the deterministic physical degrees of freedom, the Dirac observables, and to formulate a well-posed Cauchy problem for them, both in special and general relativity. The requirement of causality then dictates the choice of retarded solutions at the classical level. However, both the problems of the classical theory of the electron, leading to the choice of (1/2)(retarded + advanced) solutions, and the regularization of quantum field theory, leading to the Feynman propagator, introduce anticipatory aspects. The determination of the relativistic Darwin potential as a semi-classical approximation to the Lienard-Wiechert solution for particles with Grassmann-valued electric charges, regularizing the Coulomb self-energies, shows that these anticipatory effects live beyond the semi-classical approximation (tree level) in the form of radiative corrections, at least for the electromagnetic interaction. Talk and 'best contribution' at The Sixth International Conference on Computing Anticipatory Systems CASYS'03, Liege, August 11-16, 2003.
The dynamical mass of a classical Cepheid variable star in an eclipsing binary system.
Pietrzyński, G; Thompson, I B; Gieren, W; Graczyk, D; Bono, G; Udalski, A; Soszyński, I; Minniti, D; Pilecki, B
2010-11-25
Stellar pulsation theory provides a means of determining the masses of pulsating classical Cepheid supergiants; it is the pulsation that causes their luminosity to vary. Such pulsational masses are found to be smaller than the masses derived from stellar evolution theory: this is the Cepheid mass discrepancy problem, for which a solution is missing. An independent, accurate dynamical mass determination for a classical Cepheid variable star (as opposed to type-II Cepheids, low-mass stars with a very different evolutionary history) in a binary system is needed in order to determine which is correct. The accuracy of previous efforts to establish a dynamical Cepheid mass from Galactic single-lined non-eclipsing binaries was typically about 15-30% (refs 6, 7), which is not good enough to resolve the mass discrepancy problem. In spite of many observational efforts, no firm detection of a classical Cepheid in an eclipsing double-lined binary has hitherto been reported. Here we report the discovery of a classical Cepheid in a well detached, double-lined eclipsing binary in the Large Magellanic Cloud. We determine the mass to a precision of 1% and show that it agrees with its pulsation mass, providing strong evidence that pulsation theory correctly and precisely predicts the masses of classical Cepheids.
Phase-Sensitive Coherence and the Classical-Quantum Boundary in Ghost Imaging
NASA Technical Reports Server (NTRS)
Erkmen, Baris I.; Hardy, Nicholas D.; Venkatraman, Dheera; Wong, Franco N. C.; Shapiro, Jeffrey H.
2011-01-01
The theory of partial coherence has a long and storied history in classical statistical optics. The vast majority of this work addresses fields that are statistically stationary in time, hence their complex envelopes only have phase-insensitive correlations. The quantum optics of squeezed-state generation, however, depends on nonlinear interactions producing baseband field operators with phase-insensitive and phase-sensitive correlations. Utilizing quantum light to enhance imaging has been a topic of considerable current interest, much of it involving biphotons, i.e., streams of entangled-photon pairs. Biphotons have been employed for quantum versions of optical coherence tomography, ghost imaging, holography, and lithography. However, their seemingly quantum features have been mimicked with classical-state light, raising the question of where the classical-quantum boundary lies. We have shown, for the case of Gaussian-state light, that this boundary is intimately connected to the theory of phase-sensitive partial coherence. Here we present that theory, contrasting it with the familiar case of phase-insensitive partial coherence, and use it to elucidate the classical-quantum boundary of ghost imaging. We show, both theoretically and experimentally, that classical phase-sensitive light produces ghost images most closely mimicking those obtained with biphotons, and we derive the spatial resolution, image contrast, and signal-to-noise ratio of a standoff-sensing ghost imager, taking into account target-induced speckle.
Bertrand's theorem and virial theorem in fractional classical mechanics
NASA Astrophysics Data System (ADS)
Yu, Rui-Yan; Wang, Towe
2017-09-01
Fractional classical mechanics is the classical counterpart of fractional quantum mechanics. The central force problem in this theory is investigated. Bertrand's theorem is generalized, and the virial theorem is revisited, both in three spatial dimensions. In order to produce stable, closed, non-circular orbits, the inverse-square law and Hooke's law should be modified in fractional classical mechanics.
Spectral Rate Theory for Two-State Kinetics
NASA Astrophysics Data System (ADS)
Prinz, Jan-Hendrik; Chodera, John D.; Noé, Frank
2014-02-01
Classical rate theories often fail in cases where the observable(s) or order parameter(s) used is a poor reaction coordinate or the observed signal is deteriorated by noise, such that no clear separation between reactants and products is possible. Here, we present a general spectral two-state rate theory for ergodic dynamical systems in thermal equilibrium that explicitly takes into account how the system is observed. The theory allows the systematic estimation errors made by standard rate theories to be understood and quantified. We also elucidate the connection of spectral rate theory with the popular Markov state modeling approach for molecular simulation studies. An optimal rate estimator is formulated that gives robust and unbiased results even for poor reaction coordinates and can be applied to both computer simulations and single-molecule experiments. No definition of a dividing surface is required. Another result of the theory is a model-free definition of the reaction coordinate quality. The reaction coordinate quality can be bounded from below by the directly computable observation quality, thus providing a measure allowing the reaction coordinate quality to be optimized by tuning the experimental setup. Additionally, the respective partial probability distributions can be obtained for the reactant and product states along the observed order parameter, even when these strongly overlap. The effects of both filtering (averaging) and uncorrelated noise are also examined. The approach is demonstrated on numerical examples and experimental single-molecule force-probe data of the p5ab RNA hairpin and the apo-myoglobin protein at low pH, focusing here on the case of two-state kinetics.
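The Markov-state-model connection mentioned above can be illustrated with a minimal two-state example; the trajectory is synthetic and the switching probabilities are assumed for illustration.

```python
# Minimal Markov-state-model flavored rate estimate for a two-state system:
# count transitions at lag 1, then read the slowest relaxation rate off the
# second eigenvalue of the estimated transition matrix.
import numpy as np

rng = np.random.default_rng(2)

# Synthetic two-state trajectory with known switching probabilities (assumed).
p01, p10, n = 0.02, 0.05, 200_000
traj = np.empty(n, dtype=int)
traj[0] = 0
for t in range(1, n):
    p = p01 if traj[t-1] == 0 else 1.0 - p10   # probability next state is 1
    traj[t] = rng.random() < p

# Count matrix at lag 1 and the row-normalized transition matrix.
C = np.zeros((2, 2))
np.add.at(C, (traj[:-1], traj[1:]), 1.0)
T = C / C.sum(axis=1, keepdims=True)

lam2 = np.linalg.eigvals(T).real.min()            # second eigenvalue (first is 1)
print("implied relaxation rate:", -np.log(lam2))  # expect ~ p01 + p10 = 0.07
```

The paper's point is what happens when the observable only imperfectly separates the two states; the spectral theory quantifies how such a naive estimate is then biased.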
Noninvasive fetal QRS detection using an echo state network and dynamic programming.
Lukoševičius, Mantas; Marozas, Vaidotas
2014-08-01
We address a classical fetal QRS detection problem from abdominal ECG recordings with a data-driven statistical machine learning approach. Our goal is to have a powerful, yet conceptually clean, solution. There are two novel key components at the heart of our approach: an echo state recurrent neural network that is trained to indicate fetal QRS complexes, and several increasingly sophisticated versions of statistics-based dynamic programming algorithms, which are derived from and rooted in probability theory. We also employ a standard technique for preprocessing and removing maternal ECG complexes from the signals, but do not take this as the main focus of this work. The proposed approach is quite generic and can be extended to other types of signals and annotations. Open-source code is provided.
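A minimal echo-state-network sketch showing the general idea, a fixed random reservoir plus a trained linear readout; this is illustrative of the architecture only, not the authors' detector, and the toy task and all parameters are assumptions.

```python
# Minimal echo state network: random fixed recurrent weights, ridge-regressed
# linear readout. Toy task: reproduce the input delayed by 5 steps.
import numpy as np

rng = np.random.default_rng(3)
n_in, n_res = 1, 100

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1 (echo state property)

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, ut in enumerate(u):
        x = np.tanh(W_in @ ut + W @ x)
        states[t] = x
    return states

u = rng.normal(size=(2000, 1))
y = np.roll(u[:, 0], 5)                 # target: input delayed by 5 steps
X = run_reservoir(u)
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
print("train MSE:", np.mean((X @ W_out - y) ** 2))
```

Only W_out is trained; the cheapness of that linear fit is what makes reservoir methods attractive for streaming biomedical signals.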
Electronic excitation and quenching of atoms at insulator surfaces
NASA Technical Reports Server (NTRS)
Swaminathan, P. K.; Garrett, Bruce C.; Murthy, C. S.
1988-01-01
A trajectory-based semiclassical method is used to study electronically inelastic collisions of gas atoms with insulator surfaces. The method provides for quantum-mechanical treatment of the internal electronic dynamics of a localized region involving the gas/surface collision, and a classical treatment of all the nuclear degrees of freedom (self-consistently and in terms of stochastic trajectories), and includes accurate simulation of the bath-temperature effects. The method is easy to implement and has a generality that holds promise for many practical applications. The problem of electronically inelastic dynamics is solved by computing a set of stochastic trajectories that on thermal averaging directly provide electronic transition probabilities at a given temperature. The theory is illustrated by a simple model of a two-state gas/surface interaction.
Fuzzy logic and causal reasoning with an 'n' of 1 for diagnosis and treatment of the stroke patient.
Helgason, Cathy M; Jobe, Thomas H
2004-03-01
The current scientific model for clinical decision-making is founded on binary or Aristotelian logic, classical set theory and probability-based statistics. Evidence-based medicine has been established as the basis for clinical recommendations. There is a problem with this scientific model when the physician must diagnose and treat the individual patient. The problem is a paradox: the scientific model of evidence-based medicine is based upon hypotheses aimed at the group, and therefore its conclusions can be extrapolated to the individual patient only to a degree. This extrapolation is dependent upon the expertise of the physician. A fuzzy logic multivalued-based scientific model allows this expertise to be numerically represented and solves the clinical paradox of evidence-based medicine.
Breakdown of the classical description of a local system.
Kot, Eran; Grønbech-Jensen, Niels; Nielsen, Bo M; Neergaard-Nielsen, Jonas S; Polzik, Eugene S; Sørensen, Anders S
2012-06-08
We provide a straightforward demonstration of a fundamental difference between classical and quantum mechanics for a single local system: namely, the absence of a joint probability distribution of the position x and momentum p. Elaborating on a recently reported criterion by Bednorz and Belzig [Phys. Rev. A 83, 052113 (2011)], we derive a simple criterion that must be fulfilled for any joint probability distribution in classical physics. We demonstrate the violation of this criterion using the homodyne measurement of a single photon state, thus providing a straightforward signature of the breakdown of a classical description of the underlying state. Most importantly, the criterion used does not rely on quantum mechanics and can thus be used to demonstrate nonclassicality of systems not immediately apparent to exhibit quantum behavior. The criterion is directly applicable to any system described by the continuous canonical variables x and p, such as a mechanical or an electrical oscillator and a collective spin of a large ensemble.
The Tensile Strength of Liquid Nitrogen
NASA Astrophysics Data System (ADS)
Huang, Jian
1992-01-01
The tensile strength of liquids has been a puzzling subject. On the one hand, classical nucleation theory has met great success in predicting the nucleation rates of superheated liquids. On the other hand, most reported experimental values of the tensile strength for different liquids are far below the prediction of classical nucleation theory. In this study, homogeneous nucleation in liquid nitrogen and its tensile strength have been investigated. Different approaches for determining the pressure amplitude were studied carefully. It is shown that Raman-Nath theory, as modified by the introduction of an effective interaction length, can be used to determine the pressure amplitude in the focal plane of a focusing ultrasonic transducer. The results obtained from different diffraction orders are consistent and in good agreement with other approaches, including Debye's theory and solving the KZK equation. The measurement of the tensile strength was carried out in a high-pressure stainless steel dewar. A high-intensity ultrasonic wave was focused into a small volume of liquid nitrogen in a short time period. A probe laser beam passes through the focal region of a concave spherical transducer with a small aperture angle, and the transmitted light is detected with a photodiode. The pressure amplitude at the focus is calculated based on the acoustic power radiated into the liquid. In the experiment, the electrical signal on the transducer is gated at its resonance frequency with gate widths of 20 μs to 0.2 ms over a temperature range from 77 K to near 100 K. The calculated pressure amplitude is in agreement with the prediction of classical nucleation theory for nucleation rates from 10^6 to 10^11 bubbles/(cm^3 s). This work provides experimental evidence that the validity of classical nucleation theory can be extended to negative pressures down to -90 atm. This is only the second cryogenic liquid to reach the tensile strength predicted by classical nucleation theory.
Costello, Fintan; Watts, Paul
2016-01-01
A standard assumption in much of current psychology is that people do not reason about probability using the rules of probability theory but instead use various heuristics or "rules of thumb," which can produce systematic reasoning biases. In Costello and Watts (2014), we showed that a number of these biases can be explained by a model where people reason according to probability theory but are subject to random noise. More importantly, that model also predicted agreement with probability theory for certain expressions that cancel the effects of random noise: Experimental results strongly confirmed this prediction, showing that probabilistic reasoning is simultaneously systematically biased and "surprisingly rational." In their commentaries on that paper, both Crupi and Tentori (2016) and Nilsson, Juslin, and Winman (2016) point to various experimental results that, they suggest, our model cannot explain. In this reply, we show that our probability theory plus noise model can in fact explain every one of the results identified by these authors. This gives a degree of additional support to the view that people's probability judgments embody the rational rules of probability theory and that biases in those judgments can be explained as simply effects of random noise. (c) 2015 APA, all rights reserved.
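A small simulation of the probability-theory-plus-noise idea as described, including a noise-cancelling combination of the kind the model predicts; the probabilities, noise rate, and sample sizes are illustrative assumptions.

```python
# Sketch of "probability theory plus noise": true probabilities are perturbed
# by random read-out noise (each sampled memory bit flips with probability d),
# biasing individual judgments, while the combination
# P(A) + P(B) - P(A and B) - P(A or B) cancels the noise in expectation.
import numpy as np

rng = np.random.default_rng(4)

def judged(p_true, d=0.15, n_samples=300):
    """Simulated judgment: relative frequency of noisy samples; E = (1-2d)p + d."""
    flips = rng.random(n_samples) < d
    bits = (rng.random(n_samples) < p_true) ^ flips
    return bits.mean()

pA, pB, pAB = 0.6, 0.5, 0.3
pAorB = pA + pB - pAB   # addition law for the true probabilities

trials = [(judged(pA), judged(pB), judged(pAB), judged(pAorB))
          for _ in range(5000)]
a, b, ab, aorb = np.mean(trials, axis=0)
print("bias on P(A and B) :", ab - pAB)           # pushed toward 0.5 by noise
print("noise-cancelling sum:", a + b - ab - aorb)  # ~ 0 despite the noise
```

Because each judgment has expectation (1-2d)p + d, the d terms cancel in the four-way sum, which is the signature of "biased yet surprisingly rational" reasoning.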
NASA Astrophysics Data System (ADS)
Dimakis, N.; Terzis, Petros A.; Zampeli, Adamantia; Christodoulakis, T.
2016-09-01
The high degree of symmetry renders the dynamics of cosmological as well as some black hole spacetimes describable by a system of finite degrees of freedom. These systems are generally known as minisuperspace models. One of their important key features is the invariance of the corresponding reduced actions under reparametrizations of the independent variable, a fact that can be seen as the remnant of the general covariance of the full theory. In the case of a system of n degrees of freedom, described by a Lagrangian quadratic in velocities, one can use the lapse by either gauge fixing it or letting it be defined by the constraint and subsequently substituting into the rest of the equations. In the first case, the system of second-order equations of motion is solvable for all n accelerations and the constraint becomes a restriction among constants of integration. In the second case, the system can be solved for only n-1 accelerations and the "gauge" freedom is transferred to the choice of one of the scalar degrees of freedom. In this paper, we take the second path and express all n-1 scalar degrees of freedom in terms of the remaining one, say q. By considering these n-1 degrees of freedom as arbitrary but given functions of q, we manage to extract a two-dimensional pure gauge system consisting of the lapse N and the arbitrary q: in a way, we decouple the reparametrization invariance from the rest of the equations of motion, which thus describe the "true" dynamics. The solution of the corresponding quantum two-dimensional system is used for the definition of a generalized probability for every configuration f_i(q), be it classical or not. The main result is that, interestingly enough, this probability attains its extrema on the classical solution of the initial n-dimensional system.
Nonequilibrium dynamics of the O(N) model on dS3 and AdS crunches
NASA Astrophysics Data System (ADS)
Kumar, S. Prem; Vaganov, Vladislav
2018-03-01
We study the nonperturbative quantum evolution of the interacting O(N) vector model at large N, formulated on a spatial two-sphere, with time-dependent couplings which diverge at finite time. This model, the so-called "E-frame" theory, is related via a conformal transformation to the interacting O(N) model in three-dimensional global de Sitter spacetime with time-independent couplings. We show that with a purely quartic, relevant deformation the quantum evolution of the E-frame model is regular even when the classical theory is rendered singular at the end of time by the diverging coupling. Time evolution drives the E-frame theory to the large-N Wilson-Fisher fixed point when the classical coupling diverges. We study the quantum evolution numerically for a variety of initial conditions and demonstrate the finiteness of the energy at the classical "end of time". With an additional (time-dependent) mass deformation, quantum backreaction lowers the mass, with a putative smooth time evolution only possible in the limit of infinite quartic coupling. We discuss the relevance of these results for the resolution of crunch singularities in AdS geometries dual to E-frame theories with a classical gravity dual.
Bojowald, Martin
2008-01-01
Quantum gravity is expected to be necessary in order to understand situations in which classical general relativity breaks down. In particular in cosmology one has to deal with initial singularities, i.e., the fact that the backward evolution of a classical spacetime inevitably comes to an end after a finite amount of proper time. This presents a breakdown of the classical picture and requires an extended theory for a meaningful description. Since small length scales and high curvatures are involved, quantum effects must play a role. Not only the singularity itself but also the surrounding spacetime is then modified. One particular theory is loop quantum cosmology, an application of loop quantum gravity to homogeneous systems, which removes classical singularities. Its implications can be studied at different levels. The main effects are introduced into effective classical equations, which allow one to avoid the interpretational problems of quantum theory. They give rise to new kinds of early-universe phenomenology with applications to inflation and cyclic models. To resolve classical singularities and to understand the structure of geometry around them, the quantum description is necessary. Classical evolution is then replaced by a difference equation for a wave function, which allows an extension of quantum spacetime beyond classical singularities. One main question is how these homogeneous scenarios are related to full loop quantum gravity, which can be dealt with at the level of distributional symmetric states. Finally, the new structure of spacetime arising in loop quantum gravity and its application to cosmology sheds light on more general issues, such as the nature of time. Supplementary material is available for this article at 10.12942/lrr-2008-4.
Aging Theories for Establishing Safe Life Spans of Airborne Critical Structural Components
NASA Technical Reports Server (NTRS)
Ko, William L.
2003-01-01
New aging theories have been developed to establish the safe life span of airborne critical structural components such as B-52B aircraft pylon hooks for carrying air-launch drop-test vehicles. The new aging theories use the equivalent-constant-amplitude loading spectrum to represent the actual random loading spectrum with the same damaging effect. The crack growth due to random loading cycling of the first flight is calculated using the half-cycle theory, and then extrapolated to all the crack growths of the subsequent flights. The predictions of the new aging theories (finite difference aging theory and closed-form aging theory) are compared with the classical flight-test life theory and the previously developed Ko first- and Ko second-order aging theories. The new aging theories predict the number of safe flights as considerably lower than that predicted by the classical aging theory, and slightly lower than those predicted by the Ko first- and Ko second-order aging theories due to the inclusion of all the higher order terms.
On the co-creation of classical and modern physics.
Staley, Richard
2005-12-01
While the concept of "classical physics" has long framed our understanding of the environment from which modern physics emerged, it has consistently been read back into a period in which the physicists concerned initially considered their work in quite other terms. This essay explores the shifting currency of the rich cultural image of the classical/modern divide by tracing empirically different uses of "classical" within the physics community from the 1890s to 1911. A study of fin-de-siècle addresses shows that the earliest general uses of the concept proved controversial. Our present understanding of the term was in large part shaped by its incorporation (in different ways) within the emerging theories of relativity and quantum theory, where the content of "classical" physics was defined by proponents of the new. Studying the diverse ways in which Boltzmann, Larmor, Poincaré, Einstein, Minkowski, and Planck invoked the term "classical" will help clarify the critical relations between physicists' research programs and their use of worldview arguments in fashioning modern physics.
Contact stresses in gear teeth: A new method of analysis
NASA Technical Reports Server (NTRS)
Somprakit, Paisan; Huston, Ronald L.; Oswald, Fred B.
1991-01-01
A new, innovative procedure called point load superposition is presented for determining the contact stresses in mating gear teeth. It is believed that this procedure will greatly extend both the range of applicability and the accuracy of gear contact stress analysis. Point load superposition is based upon fundamental solutions from the theory of elasticity. It is an iterative numerical procedure which has distinct advantages over the classical Hertz method, the finite element method, and over existing applications of the boundary element method. Specifically, friction and sliding effects, which are either excluded from or difficult to study with the classical methods, are routinely handled with the new procedure. Presented here are the basic theory and the algorithms. Several examples are given. Results are consistent with those of the classical theories. Applications to spur gears are discussed.
Large, Matthew
2013-12-01
Probability theory is at the base of modern concepts of risk assessment in mental health. The aim of the current paper is to review the key developments in the early history of probability theory in order to enrich our understanding of current risk assessment practices.
Causal inference, probability theory, and graphical insights.
Baker, Stuart G
2013-11-10
Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.
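A compact numerical instance of Simpson's paradox with a binary confounder, the situation the BK-Plot is designed to display; the counts are invented for illustration.

```python
# Simpson's paradox with a binary confounder (severity): the treatment has a
# lower event rate within EACH stratum, yet a higher event rate when pooled,
# because treated patients are concentrated in the severe stratum.
strata = {
    #           treated: (events, n)   control: (events, n)
    "mild":   ((2, 20),               (18, 120)),
    "severe": ((60, 100),             (14, 20)),
}

tot_t = tot_c = n_t = n_c = 0
for name, ((et, nt), (ec, nc)) in strata.items():
    print(f"{name:6s}  treated {et/nt:.2f}  control {ec/nc:.2f}")
    tot_t += et; n_t += nt
    tot_c += ec; n_c += nc
print(f"pooled  treated {tot_t/n_t:.2f}  control {tot_c/n_c:.2f}")
```

Within each stratum the treated rate is lower (0.10 vs 0.15; 0.60 vs 0.70), but the pooled comparison reverses (0.52 vs 0.23), which is exactly the reversal a confounder-aware analysis must adjust for.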
The internal consistency of the standard gamble: tests after adjusting for prospect theory.
Oliver, Adam
2003-07-01
This article reports a study that tests whether the internal consistency of the standard gamble can be improved upon by incorporating loss weighting and probability transformation parameters in the standard gamble valuation procedure. Five alternatives to the standard EU formulation are considered: (1) probability transformation within an EU framework; and, within a prospect theory framework, (2) loss weighting and full probability transformation, (3) no loss weighting and full probability transformation, (4) loss weighting and no probability transformation, and (5) loss weighting and partial probability transformation. Of the five alternatives, only the prospect theory formulation with loss weighting and no probability transformation offers an improvement in internal consistency over the standard EU valuation procedure.
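A sketch of the kind of probability transformation being tested, using the familiar one-parameter weighting function as a stand-in; the exact functional forms and parameter values used in the study are not reproduced here, and the loss-weighting component is omitted.

```python
# Illustrative prospect-theory probability weighting applied to a standard
# gamble: under expected utility the health-state utility equals the
# indifference probability p; under the weighted model it is w(p) instead.
def weight(p, gamma=0.61):
    """Tversky-Kahneman style weighting: w(p) = p^g / (p^g + (1-p)^g)^(1/g).
    gamma = 0.61 is a commonly cited illustrative value, assumed here."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.25, 0.5, 0.75, 0.9):
    print(f"p = {p:.2f}   EU utility = {p:.2f}   weighted = {weight(p):.3f}")
```

The characteristic inverse-S shape (overweighting small p, underweighting large p) is what lets such adjustments change the internal consistency of standard-gamble valuations.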
Break-up dynamics of fluctuating liquid threads
Petit, Julien; Rivière, David; Kellay, Hamid; Delville, Jean-Pierre
2012-01-01
The thinning dynamics of a liquid neck before break-up, as may happen when a drop detaches from a faucet or a capillary, follows different rules and dynamic scaling laws depending on the importance of inertia, viscous stresses, or capillary forces. If now the thinning neck reaches dimensions comparable to the thermally excited interfacial fluctuations, as for nanojet break-up or the fragmentation of thermally annealed nanowires, these fluctuations should play a dominant role according to recent theory and observations. Using near-critical interfaces, we here fully characterize the universal dynamics of this thermal fluctuation-dominated regime and demonstrate that the cross-over from the classical two-fluid pinch-off scenario of a liquid thread to the fluctuation-dominated regime occurs at a well-defined neck radius proportional to the thermal length scale. Investigating satellite drop formation, we also show that at the level of the cross-over between these two regimes it is more probable to produce monodisperse droplets because fluctuation-dominated pinch-off may allow the unique situation where satellite drop formation can be inhibited. Nonetheless, the interplay between the evolution of the neck profiles from the classical to the fluctuation-dominated regime and the satellites’ production remains to be clarified. PMID:23090994
Quantum structure of negation and conjunction in human thought
Aerts, Diederik; Sozzo, Sandro; Veloz, Tomas
2015-01-01
We analyze in this paper the data collected in a set of experiments investigating how people combine natural concepts. We study the mutual influence of conceptual conjunction and negation by measuring the membership weights of a list of exemplars with respect to two concepts, e.g., Fruits and Vegetables, and their conjunction Fruits And Vegetables, but also their conjunction when one or both concepts are negated, namely, Fruits And Not Vegetables, Not Fruits And Vegetables, and Not Fruits And Not Vegetables. Our findings sharpen and advance existing analysis on conceptual combinations, revealing systematic deviations from classical (fuzzy set) logic and probability theory. And, more important, our results give further considerable evidence to the validity of our quantum-theoretic framework for the combination of two concepts. Indeed, the representation of conceptual negation naturally arises from the general assumptions of our two-sector Fock space model, and this representation faithfully agrees with the collected data. In addition, we find a new significant and a priori unexpected deviation from classicality, which can exactly be explained by assuming that human reasoning is the superposition of an “emergent reasoning” and a “logical reasoning,” and that these two processes are represented in a Fock space algebraic structure. PMID:26483715
Excluding joint probabilities from quantum theory
NASA Astrophysics Data System (ADS)
Allahverdyan, Armen E.; Danageozian, Arshag
2018-03-01
Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions have been suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert spaces with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are no longer excluded, but they are still constrained by imprecise probabilities.
Brassey, Charlotte A.; Margetts, Lee; Kitchener, Andrew C.; Withers, Philip J.; Manning, Phillip L.; Sellers, William I.
2013-01-01
Classic beam theory is frequently used in biomechanics to model the stress behaviour of vertebrate long bones, particularly when creating intraspecific scaling models. Although methodologically straightforward, classic beam theory requires complex irregular bones to be approximated as slender beams, and the errors associated with simplifying complex organic structures to such an extent are unknown. Alternative approaches, such as finite element analysis (FEA), while much more time-consuming to perform, require no such assumptions. This study compares the results obtained using classic beam theory with those from FEA to quantify the beam theory errors and to provide recommendations about when a full FEA is essential for reasonable biomechanical predictions. High-resolution computed tomographic scans of eight vertebrate long bones were used to calculate diaphyseal stress owing to various loading regimes. Under compression, FEA values of minimum principal stress (σmin) were on average 142 per cent (±28% s.e.) larger than those predicted by beam theory, with deviation between the two models correlated to shaft curvature (two-tailed p = 0.03, r2 = 0.56). Under bending, FEA values of maximum principal stress (σmax) and beam theory values differed on average by 12 per cent (±4% s.e.), with deviation between the models significantly correlated to cross-sectional asymmetry at midshaft (two-tailed p = 0.02, r2 = 0.62). In torsion, assuming maximum stress values occurred at the location of minimum cortical thickness brought beam theory and FEA values closest in line, and in this case FEA values of τtorsion were on average 14 per cent (±5% s.e.) higher than beam theory. Therefore, FEA is the preferred modelling solution when estimates of absolute diaphyseal stress are required, although values calculated by beam theory for bending may be acceptable in some situations. PMID:23173199
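The beam-theory side of such a comparison reduces to closed-form section formulas; below is a minimal sketch for a hollow elliptical midshaft, where the dimensions and load are illustrative and not taken from the paper's specimens.

```python
# Classic beam theory bending stress sigma = M*c/I for an idealized hollow
# elliptical cross-section (a crude stand-in for a long-bone midshaft).
import math

M = 50.0                       # bending moment, N*m (assumed)
a_out, b_out = 0.015, 0.012    # outer semi-axes, m (assumed)
a_in,  b_in  = 0.009, 0.007    # inner (medullary) semi-axes, m (assumed)

# Second moment of area of a hollow ellipse: I = (pi/4)(a_out^3*b_out - a_in^3*b_in).
I = math.pi / 4 * (a_out**3 * b_out - a_in**3 * b_in)
sigma_max = M * a_out / I      # peak stress at the outer fibre
print(f"I = {I:.3e} m^4, max bending stress = {sigma_max/1e6:.1f} MPa")
```

The study's message is that this one-line estimate is acceptable mainly in bending on fairly straight, symmetric shafts; curvature and cross-sectional asymmetry are exactly what push it away from the FEA result.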
Generalized mutual information and Tsirelson's bound
NASA Astrophysics Data System (ADS)
Wakakuwa, Eyuri; Murao, Mio
2014-12-01
We introduce a generalization of the quantum mutual information between a classical system and a quantum system into the mutual information between a classical system and a system described by general probabilistic theories. We apply this generalized mutual information (GMI) to a derivation of Tsirelson's bound from information causality, and prove that Tsirelson's bound can be derived from the chain rule of the GMI. By using the GMI, we formulate the "no-supersignalling condition" (NSS), that the assistance of correlations does not enhance the capability of classical communication. We prove that NSS is never violated in any no-signalling theory.
The Cognitive Substrate of Subjective Probability
ERIC Educational Resources Information Center
Nilsson, Hakan; Olsson, Henrik; Juslin, Peter
2005-01-01
The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as…
"Fathers" and "sons" of theories in cell physiology: the membrane theory.
Matveev, V V; Wheatley, D N
2005-12-16
The last 50 years in the history of the life sciences are remarkable for a new and important feature that looks like a great threat to their future. The profound specialization that dominates quickly developing fields of science is causing a crisis of the scientific method. The essence of the method is a unity of two elements: the experimental data and the theory that explains them. Classically, the "fathers" of science were the creators of new ideas and theories. They were the true experts on their own theories. It is only they who had the right to say: "I am the theory". In other words, they were the carriers of theories, of theoretical knowledge. The fathers provided the necessary logical integrity to their theories, since theories in biology still cannot be based on strict mathematical proofs. This is not true for the sons. As a result of massive specialization, modern experts operate in very confined spaces. They formulate particular rules far from the level of theory. The main theories of science are known to them only at the textbook level. Nowadays, nobody can say: "I am the theory". With whom, then, is it possible to discuss today at a broader theoretical level? How can a classical theory (for example, the membrane theory) be changed or even disproved under these conditions? How can the "sons", with their narrow education, catch sight of the membrane theory's defects? As a result, "global" theories have few critics and little control. Due to specialization, we have lost the ability to work at the experimental level of biology within the correct or appropriate theoretical context. The scientific method in its classic form is now being rapidly eroded. A good case can be made for "Membrane Theory", to which we refer largely throughout this article.
Wang, Bo; Anthony, Stephen M; Bae, Sung Chul; Granick, Steve
2009-09-08
We describe experiments using single-particle tracking in which mean-square displacement is simply proportional to time (Fickian), yet the distribution of displacement probability is not Gaussian as should be expected of a classical random walk but, instead, is decidedly exponential for large displacements, the decay length of the exponential being proportional to the square root of time. The first example is when colloidal beads diffuse along linear phospholipid bilayer tubes whose radius is the same as that of the beads. The second is when beads diffuse through entangled F-actin networks, bead radius being less than one-fifth of the actin network mesh size. We explore the relevance to dynamic heterogeneity in trajectory space, which has been extensively discussed regarding glassy systems. Data for the second system might suggest activated diffusion between pores in the entangled F-actin networks, in the same spirit as activated diffusion and exponential tails observed in glassy systems. But the first system shows exceptionally rapid diffusion, nearly as rapid as for identical colloids in free suspension, yet still displaying an exponential probability distribution as in the second system. Thus, although the exponential tail is reminiscent of glassy systems, in fact, these dynamics are exceptionally rapid. We also compare with particle trajectories that are at first subdiffusive but Fickian at the longest measurement times, finding that displacement probability distributions fall onto the same master curve in both regimes. The need is emphasized for experiments, theory, and computer simulation to allow definitive interpretation of this simple and clean exponential probability distribution.
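The contrast between Fickian-but-exponential and classical Gaussian displacement statistics can be made concrete by comparing distributions with identical mean-square displacement; this is a simple illustration, not a model of the experiments.

```python
# Gaussian vs Laplace (exponential-tailed) displacement distributions with the
# same mean-square displacement: "Fickian yet non-Gaussian" diffusion in a nutshell.
import numpy as np

rng = np.random.default_rng(5)
msd = 1.0
gauss = rng.normal(0.0, np.sqrt(msd), 100_000)
lap = rng.laplace(0.0, np.sqrt(msd / 2.0), 100_000)  # var of Laplace(b) = 2*b^2

print("variances   :", gauss.var().round(3), lap.var().round(3))
print("P(|dx| > 4) :", (abs(gauss) > 4).mean(), (abs(lap) > 4).mean())
```

Both ensembles would show the same linear MSD growth when rescaled by time, yet large displacements are orders of magnitude more likely under the exponential tail, which is the experimental signature reported above.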
Berthelsen, Connie Bøttcher; Lindhardt, Tove; Frederiksen, Kirsten
2017-06-01
This paper presents a discussion of the differences in using participant observation as a data collection method, comparing the classic grounded theory methodology of Barney Glaser with the constructivist grounded theory methodology of Kathy Charmaz. Participant observations allow nursing researchers to experience activities and interactions directly in situ. However, using participant observation as a data collection method can be done in many ways, depending on the chosen grounded theory methodology, and may produce different results. This discussion shows that the differences between using participant observations in classic and constructivist grounded theory can be considerable and that grounded theory researchers should adhere to the method descriptions for performing participant observations in the selected grounded theory methodology to enhance the quality of research. © 2016 Nordic College of Caring Science.
Practical quantum coin flipping
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pappa, Anna; Diamanti, Eleni; Chailloux, Andre
2011-11-15
We show that in the unconditional security model, a single quantum strong coin flip with security guarantees that are strictly better than in any classical protocol is possible to implement with current technology. Our protocol takes into account all aspects of an experimental implementation, including losses, multiphoton pulses emitted by practical photon sources, channel noise, detector dark counts, and finite quantum efficiency. We calculate the abort probability when both players are honest, as well as the probability of one player forcing his desired outcome. For a channel length up to 21 km and commonly used parameter values, we can achieve honest abort and cheating probabilities that are better than in any classical protocol. Our protocol is, in principle, implementable using attenuated laser pulses, with no need for entangled photons or any other specific resources.
The Institution of Sociological Theory in Canada.
Guzman, Cinthya; Silver, Daniel
2018-02-01
Using theory syllabi and departmental data collected for three academic years, this paper investigates the institutional practice of theory in sociology departments across Canada. In particular, it examines the position of theory within the sociological curriculum, and how this varies among universities. Taken together, our analyses indicate that theory remains deeply institutionalized at the core of sociological education and Canadian sociologists' self-understanding; that theorists as a whole show some coherence in how they define themselves, but differ in various ways, especially along lines of region, intellectual background, and gender; that despite these differences, the classical versus contemporary heuristic largely cuts across these divides, as does the strongly ingrained position of a small group of European authors as classics of the discipline as a whole. Nevertheless, who is a classic remains an unsettled question, alternatives to the "classical versus contemporary" heuristic do exist, and theorists' syllabi reveal diverse "others" as potential candidates. Our findings show that the field of sociology is neither marked by universal agreement nor by absolute division when it comes to its theoretical underpinnings. To the extent that they reveal a unified field, the findings suggest that unity lies more in a distinctive form than in a distinctive content, which defines the space and structure of the field of sociology. © 2018 Canadian Sociological Association/La Société canadienne de sociologie.
On the effective field theory of intersecting D3-branes
NASA Astrophysics Data System (ADS)
Abbaspur, Reza
2018-05-01
We study the effective field theory of two intersecting D3-branes with one common dimension along the lines recently proposed in ref. [1]. We introduce a systematic way of deriving the classical effective action to arbitrary orders in perturbation theory. Using a proper renormalization prescription to handle logarithmic divergencies arising at all orders in the perturbation series, we recover the first order renormalization group equation of ref. [1] plus an infinite set of higher order equations. We show the consistency of the higher order equations with the first order one and hence interpret the first order result as an exact RG flow equation in the classical theory.
NASA Technical Reports Server (NTRS)
Zeng, X. C.; Stroud, D.
1989-01-01
The previously developed Ginzburg-Landau theory for calculating the crystal-melt interfacial tension of bcc elements is extended to treat the classical one-component plasma (OCP), the charged fermion system, and the Bose crystal. For the OCP, a direct application of the theory of Shih et al. (1987) yields for the surface tension 0.0012 Z²e²/a³, where Ze is the ionic charge and a is the radius of the ionic sphere. The Bose crystal-melt interface is treated by a quantum extension of the classical density-functional theory, using the Feynman formalism to estimate the relevant correlation functions. The theory is applied to the metastable He-4 solid-superfluid interface at T = 0, with a resulting surface tension of 0.085 erg/sq cm, in reasonable agreement with the value extrapolated from the measured surface tension of the bcc solid in the range 1.46-1.76 K. These results suggest that the density-functional approach is a satisfactory mean-field theory for estimating the equilibrium properties of liquid-solid interfaces, given knowledge of the uniform phases.
NASA Astrophysics Data System (ADS)
Brynjolfsson, Ari
2002-04-01
Einstein's general theory of relativity assumes that photons do not change frequency as they move from the Sun to the Earth. This assumption is correct in classical physics. All experiments proving general relativity are in the domain of classical physics. These include the tests by Pound et al. of the gravitational redshift of 14.4 keV photons; the rocket experiments by Vessot et al.; the Galileo solar redshift experiments by Krisher et al.; the gravitational deflection of light experiments by Riveros and Vucetich; and the delay of echoes of radar signals passing close to the Sun as observed by Shapiro et al. Bohr's correspondence principle assures that a quantum mechanical theory of general relativity agrees with Einstein's classical theory when the frequency and the gravitational field gradient approach zero, or when photons cannot interact with the gravitational field. When we treat photons as quantum mechanical particles, we find that the gravitational force on photons is reversed (antigravity). This modified theory contradicts the equivalence principle, but is consistent with all experiments. Solar lines and distant stars are redshifted in accordance with the author's plasma redshift theory. These changes result in a beautiful, consistent cosmology.
Applying metapopulation theory to conservation of migratory birds
Esler, Daniel N.
2000-01-01
Metapopulation theory has proven useful for understanding the population structure and dynamics of many species of conservation concern. The metapopulation concept has been applied almost exclusively to nonmigratory species, however, for which subpopulation demographic independence (a requirement for a classically defined metapopulation) is explicitly related to geographic distribution and dispersal probabilities. Defining the degree of demographic independence among subpopulations of migratory animals, and thus the applicability of metapopulation theory as a conceptual framework for understanding population dynamics, is much more difficult. Unlike nonmigratory species, subpopulations of migratory animals cannot be defined as synonymous with geographic areas. Groups of migratory birds that are geographically separate at one part of the annual cycle may occur together at others, but co-occurrence in time and space does not preclude the demographic independence of subpopulations. I suggest that metapopulation theory can be applied to migratory species but that understanding the degree of subpopulation independence may require information about both spatial distribution throughout the annual cycle and behavioral mechanisms that may lead to subpopulation demographic independence. The key for applying metapopulation theory to migratory animals lies in identifying demographically independent subpopulations, even as they move during the annual cycle and potentially co-occur with other subpopulations. Using examples of migratory bird species, I demonstrate that spatial and temporal modes of subpopulation independence can interact with behavioral mechanisms to create demographically independent subpopulations, including cases in which subpopulations are not spatially distinct in some parts of the annual cycle.
ERIC Educational Resources Information Center
Wilson, Mark; Allen, Diane D.; Li, Jun Corser
2006-01-01
This paper compares the approach and resultant outcomes of item response models (IRMs) and classical test theory (CTT). First, it reviews basic ideas of CTT, and compares them to the ideas about using IRMs introduced in an earlier paper. It then applies a comparison scheme based on the AERA/APA/NCME "Standards for Educational and…
ERIC Educational Resources Information Center
Culpepper, Steven Andrew
2013-01-01
A classic topic in the fields of psychometrics and measurement has been the impact of the number of scale categories on test score reliability. This study builds on previous research by further articulating the relationship between item response theory (IRT) and classical test theory (CTT). Equations are presented for comparing the reliability and…
ERIC Educational Resources Information Center
Mason, Brandon; Smithey, Martha
2012-01-01
This study examines Merton's Classical Strain Theory (1938) as a causative factor in intimate partner violence among college students. We theorize that college students experience general life strain and cumulative strain as they pursue the goal of a college degree. We test this strain on the likelihood of using intimate partner violence. Strain…
ERIC Educational Resources Information Center
Schlingman, Wayne M.; Prather, Edward E.; Wallace, Colin S.; Brissenden, Gina; Rudolph, Alexander L.
2012-01-01
This paper is the first in a series of investigations into the data from the recent national study using the Light and Spectroscopy Concept Inventory (LSCI). In this paper, we use classical test theory to form a framework of results that will be used to evaluate individual item difficulties, item discriminations, and the overall reliability of the…
Classical closure theory and Lam's interpretation of epsilon-RNG
NASA Technical Reports Server (NTRS)
Zhou, YE
1995-01-01
Lam's phenomenological epsilon-renormalization group (RNG) model is quite different from the other members of that group. It does not make use of the correspondence principle and the epsilon-expansion procedure. We demonstrate that Lam's epsilon-RNG model is essentially the physical space version of the classical closure theory in spectral space and consider the corresponding treatment of the eddy viscosity and energy backscatter.
New variables for classical and quantum gravity
NASA Technical Reports Server (NTRS)
Ashtekar, Abhay
1986-01-01
A Hamiltonian formulation of general relativity based on certain spinorial variables is introduced. These variables simplify the constraints of general relativity considerably and enable one to imbed the constraint surface in the phase space of Einstein's theory into that of Yang-Mills theory. The imbedding suggests new ways of attacking a number of problems in both classical and quantum gravity. Some illustrative applications are discussed.
ERIC Educational Resources Information Center
Sussman, Joshua; Beaujean, A. Alexander; Worrell, Frank C.; Watson, Stevie
2013-01-01
Item response models (IRMs) were used to analyze Cross Racial Identity Scale (CRIS) scores. Rasch analysis scores were compared with classical test theory (CTT) scores. The partial credit model demonstrated a high goodness of fit and correlations between Rasch and CTT scores ranged from 0.91 to 0.99. CRIS scores are supported by both methods.…
Conveying the Complex: Updating U.S. Joint Systems Analysis Doctrine with Complexity Theory
2013-12-10
Quantum kinetic theory of the filamentation instability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bret, A.; Haas, F.
2011-07-15
The quantum electromagnetic dielectric tensor for a multi-species plasma is re-derived from the gauge-invariant Wigner-Maxwell system and presented in a form very similar to the classical one. The resulting expression is then applied to a quantum kinetic theory of the electromagnetic filamentation instability. Comparison is made with the quantum fluid theory including a Bohm pressure term and with the cold classical plasma result. A number of analytical expressions are derived for the cutoff wave vector, the largest growth rate, and the most unstable wave vector.
A classical density-functional theory for describing water interfaces.
Hughes, Jessica; Krebs, Eric J; Roundy, David
2013-01-14
We develop a classical density functional for water which combines the White Bear fundamental-measure theory (FMT) functional for the hard sphere fluid with attractive interactions based on the statistical associating fluid theory variable range (SAFT-VR). This functional reproduces the properties of water at both long and short length scales over a wide range of temperatures and is computationally efficient, comparable to the cost of FMT itself. We demonstrate our functional by applying it to systems composed of two hard rods, four hard rods arranged in a square, and hard spheres in water.
Geometry, topology, and string theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Varadarajan, Uday
A variety of scenarios are considered which shed light upon the uses and limitations of classical geometric and topological notions in string theory. The primary focus is on situations in which D-brane or string probes of a given classical space-time see the geometry quite differently than one might naively expect. In particular, situations in which extra dimensions, non-commutative geometries as well as other non-local structures emerge are explored in detail. Further, a preliminary exploration of such issues in Lorentzian space-times with non-trivial causal structures within string theory is initiated.
Generalizing the ADM computation to quantum field theory
NASA Astrophysics Data System (ADS)
Mora, P. J.; Tsamis, N. C.; Woodard, R. P.
2012-01-01
The absence of recognizable, low energy quantum gravitational effects requires that some asymptotic series expansion be wonderfully accurate, but the correct expansion might involve logarithms or fractional powers of Newton’s constant. That would explain why conventional perturbation theory shows uncontrollable ultraviolet divergences. We explore this possibility in the context of the mass of a charged, gravitating scalar. The classical limit of this system was solved exactly in 1960 by Arnowitt, Deser and Misner, and their solution does exhibit nonanalytic dependence on Newton’s constant. We derive an exact functional integral representation for the mass of the quantum field theoretic system, and then develop an alternate expansion for it based on a correct implementation of the method of stationary phase. The new expansion entails adding an infinite class of new diagrams to each order and subtracting them from higher orders. The zeroth-order term of the new expansion has the physical interpretation of a first quantized Klein-Gordon scalar which forms a bound state in the gravitational and electromagnetic potentials sourced by its own probability current. We show that such bound states exist and we obtain numerical results for their masses.
Multiple outer-shell ionization effect in inner-shell x-ray production by light ions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lapicki, G.; Mehta, R.; Duggan, J.L.
1986-11-01
L-shell x-ray production cross sections by 0.25-2.5-MeV ⁴₂He⁺ ions in ₂₈Ni, ₂₉Cu, ₃₂Ge, ₃₃As, ₃₇Rb, ₃₈Sr, ₃₉Y, ₄₀Zr, and ₄₆Pd are reported. The data are compared to the first Born approximation and the ECPSSR theory that accounts for the projectile energy loss (E) and Coulomb deflection (C) as well as the perturbed-stationary-state (PSS) and relativistic (R) effects in the treatment of the target L-shell electron. Surprisingly, the first Born approximation appears to converge to the data while the ECPSSR predictions underestimate them in the low-velocity limit. This is explained as the result of improper use of single-hole fluorescence yields. A heuristic formula is proposed to account for multiple ionizations in terms of a classical probability for these phenomena and, after it is applied, the ECPSSR theory of L-shell ionization is found to be in good agreement with the data.
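The abstract does not reproduce the heuristic formula. A common classical ansatz in this literature, sketched below purely for orientation, treats each outer-shell electron as being ionized independently with some probability p, so that the vacancy distribution that modifies the single-hole fluorescence yields is binomial; all values used here are hypothetical.

```python
# Hedged sketch (not the paper's exact formula): binomial model for multiple
# outer-shell ionization. With n outer-shell electrons each ionized with
# probability p, the chance of k simultaneous vacancies is
# C(n, k) * p^k * (1-p)^(n-k).
from math import comb

def multiple_ionization_probabilities(n: int, p: float) -> list[float]:
    """Probability of k = 0..n simultaneous outer-shell vacancies."""
    return [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]

# Example with hypothetical values: 8 outer-shell electrons, p = 0.05.
for k, prob in enumerate(multiple_ionization_probabilities(8, 0.05)):
    print(f"k = {k}: {prob:.4f}")
```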
NASA Technical Reports Server (NTRS)
Hanks, G. W.; Shomber, H. A.; Dethman, H. A.; Gratzer, L. B.; Maeshiro, A.; Gangsaas, D.; Blight, J. D.; Buchan, S. M.; Crumb, C. B.; Dorwart, R. J.
1981-01-01
An active controls technology (ACT) system architecture was selected based on current technology system elements, and optimal control theory was evaluated for use in analyzing and synthesizing ACT multiple control laws. The system selected employs three redundant computers to implement all of the ACT functions, four redundant smaller computers to implement the crucial pitch-augmented stability function, and a separate maintenance and display computer. The reliability objective of a probability of crucial function failure of less than 1 × 10⁻⁹ per 1-hr flight can be met with current technology system components, if the software is assumed fault free and coverage approaching 1.0 can be provided. The optimal control theory approach to ACT control law synthesis yielded comparable control law performance much more systematically and directly than the classical s-domain approach. The ACT control law performance, although somewhat degraded by the inclusion of representative nonlinearities, remained quite effective. Certain high-frequency gust-load alleviation functions may require increased surface rate capability.
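To make the 10⁻⁹ objective concrete, the sketch below (hypothetical numbers, not the study's actual reliability model) shows the standard k-out-of-n redundancy calculation: a 2-out-of-3 majority-voting computer set loses its crucial function when at least two of the three channels fail, assuming independent channels, fault-free software, and coverage of 1.0 as stated in the abstract.

```python
# Minimal sketch: failure probability of a 2-out-of-3 redundant computer set.
from math import comb

def k_of_n_failure(p: float, n: int = 3, need: int = 2) -> float:
    """P(at least `need` of `n` independent channels fail), each with prob p."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(need, n + 1))

p_channel = 2.0e-5  # hypothetical per-flight-hour channel failure probability
print(k_of_n_failure(p_channel))  # ~1.2e-9, near the 1e-9 objective
```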
Quantum-like dynamics of decision-making
NASA Astrophysics Data System (ADS)
Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu
2012-03-01
In cognitive psychology, some experiments on games have been reported which demonstrated that real players do not use the “rational strategy” provided by classical game theory and based on the notion of the Nash equilibrium. This psychological phenomenon was called the disjunction effect. Recently, we proposed a model of decision making which can explain this effect (the “irrationality” of players), Asano et al. (2010, 2011) [23,24]. Our model is based on the mathematical formalism of quantum mechanics, because the psychological fluctuations inducing the irrationality are formally represented as quantum fluctuations, Asano et al. (2011) [55]. In this paper, we reconsider the process of quantum-like decision making more closely and redefine it as a well-defined quantum dynamics by using the concept of the lifting channel, an important concept in quantum information theory. We also present numerical simulations of this quantum-like mental dynamics. It is non-Markovian by nature. Stabilization to the steady-state solution (determining the subjective probabilities for decision making) is based on the collective effect of mental fluctuations collected in the working memory of the decision maker.
Probabilistic Sizing and Verification of Space Ceramic Structures
NASA Astrophysics Data System (ADS)
Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit
2012-07-01
Sizing of ceramic parts is best optimised using a probabilistic approach which takes into account the preexisting flaw distribution in the ceramic part to compute a probability of failure of the part depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also an accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: Silex telescope structure, Seviri primary mirror, Herschel telescope, Formosat-2 instrument, and other ceramic structures flying today. Throughout this period of time, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study: “Mechanical Design and Verification Methodologies for Ceramic Structures”, which is to be concluded in the beginning of 2012, existing theories, technical state-of-the-art from international experts, and Astrium experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
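The Weibull probabilistic approach named in the abstract has a compact standard form. The sketch below is a minimal illustration of it, not Astrium's actual tool; the two-parameter model, the volume scaling, and all parameter values are hypothetical.

```python
# Minimal sketch: two-parameter Weibull failure probability for a ceramic
# part under uniaxial stress, with the classic size (volume) scaling.
import math

def weibull_failure_probability(sigma: float, sigma0: float, m: float,
                                volume: float, v0: float = 1.0) -> float:
    """P_f = 1 - exp(-(V/V0) * (sigma/sigma0)^m)."""
    return 1.0 - math.exp(-(volume / v0) * (sigma / sigma0) ** m)

# Hypothetical SiC-like parameters: characteristic strength 400 MPa,
# Weibull modulus 10, stressed volume 0.5 (in units of V0).
print(weibull_failure_probability(sigma=200.0, sigma0=400.0, m=10.0, volume=0.5))
```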
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nihill, Kevin J.; Hund, Zachary M.; Sibener, S. J., E-mail: s-sibener@uchicago.edu
2016-08-28
Fundamental details concerning the interaction between H₂ and CH₃–Si(111) have been elucidated by the combination of diffractive scattering experiments and electronic structure and scattering calculations. Rotationally inelastic diffraction (RID) of H₂ and D₂ from this model hydrocarbon-decorated semiconductor interface has been confirmed for the first time via both time-of-flight and diffraction measurements, with modest j = 0 → 2 RID intensities for H₂ compared to the strong RID features observed for D₂ over a large range of kinematic scattering conditions along two high-symmetry azimuthal directions. The Debye-Waller model was applied to the thermal attenuation of diffraction peaks, allowing for precise determination of the RID probabilities by accounting for incoherent motion of the CH₃–Si(111) surface atoms. The probabilities of rotationally inelastic diffraction of H₂ and D₂ have been quantitatively evaluated as a function of beam energy and scattering angle, and have been compared with complementary electronic structure and scattering calculations to provide insight into the interaction potential between H₂ (D₂) and hence the surface charge density distribution. Specifically, a six-dimensional potential energy surface (PES), describing the electronic structure of the H₂(D₂)/CH₃–Si(111) system, has been computed based on interpolation of density functional theory energies. Quantum and classical dynamics simulations have allowed for an assessment of the accuracy of the PES, and subsequently for identification of the features of the PES that serve as classical turning points. A close scrutiny of the PES reveals the highly anisotropic character of the interaction potential at these turning points. This combination of experiment and theory provides new and important details about the interaction of H₂ with a hybrid organic-semiconductor interface, which can be used to further investigate energy flow in technologically relevant systems.
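The Debye-Waller correction mentioned above is, in atom-surface scattering, typically applied by noting that the logarithm of a diffraction peak intensity is approximately linear in surface temperature, so extrapolating to T = 0 K removes the incoherent thermal attenuation. The sketch below illustrates that procedure on synthetic data; nothing here is taken from the paper.

```python
# Minimal sketch (synthetic data): Debye-Waller extrapolation of thermally
# attenuated diffraction peak intensities, I(T) ~ I0 * exp(-2W(T)) with
# ln I approximately linear in T.
import numpy as np

T = np.array([300.0, 350.0, 400.0, 450.0])      # surface temperatures (K)
I = np.array([0.020, 0.013, 0.0085, 0.0055])    # hypothetical peak intensities

slope, intercept = np.polyfit(T, np.log(I), 1)  # ln I = intercept + slope * T
I0 = np.exp(intercept)                          # extrapolated T -> 0 intensity
print(f"attenuation slope ~ {slope:.2e} 1/K, I(T -> 0) ~ {I0:.3e}")
```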
Extended Importance Sampling for Reliability Analysis under Evidence Theory
NASA Astrophysics Data System (ADS)
Yuan, X. K.; Chen, B.; Zhang, B. Q.
2018-05-01
In early engineering practice, the lack of data and information makes uncertainty difficult to deal with. However, evidence theory has been proposed to handle uncertainty with limited information, as an alternative to traditional probability theory. In this contribution, a simulation-based approach called 'extended importance sampling' is proposed, based on evidence theory, to handle problems with epistemic uncertainty. The proposed approach stems from traditional importance sampling for reliability analysis under probability theory and is developed to handle problems with epistemic uncertainty. It first introduces a nominal instrumental probability density function (PDF) for every epistemic uncertainty variable, so that an 'equivalent' reliability problem under probability theory is obtained. Then samples of these variables are generated by importance sampling. Based on these samples, the plausibility and belief (upper and lower bounds of the failure probability) can be estimated. The approach is more efficient than direct Monte Carlo simulation. Numerical and engineering examples are given to illustrate the efficiency and feasibility of the proposed approach.
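The probabilistic core of the approach is the standard importance sampling estimator. The following is a schematic sketch of that core only (not the paper's full evidence-theory algorithm); the limit state, the nominal instrumental density, the 'equivalent' probabilistic model, and the sample size are all hypothetical.

```python
# Schematic sketch: importance sampling estimate of P(g(X) <= 0) using a
# nominal instrumental density h for an epistemically uncertain variable X.
import numpy as np

rng = np.random.default_rng(0)

def g(x):                     # hypothetical limit state: failure when g(x) <= 0
    return 4.0 - x

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Instrumental PDF h: normal shifted toward the failure region.
mu_h, sigma_h = 3.5, 1.0
x = rng.normal(mu_h, sigma_h, 100_000)

f = normal_pdf(x, 2.0, 1.0)   # hypothetical "equivalent" probabilistic model
h = normal_pdf(x, mu_h, sigma_h)

p_fail = np.mean((g(x) <= 0) * f / h)   # importance sampling estimator
print(p_fail)                            # compare: exact P(N(2,1) >= 4) ~ 0.0228
```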
Psychophysics of the probability weighting function
NASA Astrophysics Data System (ADS)
Takahashi, Taiki
2011-03-01
A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have remained unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1), which satisfies w(0) = 0 and w(1) = 1.
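Since the Prelec form is given explicitly, it is easy to evaluate. The sketch below does so; the value of α is illustrative. For 0 < α < 1 the function is regressive (small probabilities overweighted, large ones underweighted) and crosses the diagonal at the fixed point p = 1/e.

```python
# Minimal sketch: Prelec's one-parameter probability weighting function
# w(p) = exp(-(-ln p)**alpha), fixed point at p = 1/e for every alpha.
import math

def prelec_w(p: float, alpha: float = 0.65) -> float:
    if p <= 0.0:
        return 0.0
    return math.exp(-((-math.log(p)) ** alpha))

for p in (0.01, 0.1, 1 / math.e, 0.5, 0.9, 0.99):
    print(f"w({p:.3f}) = {prelec_w(p):.3f}")
```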
Deriving Laws from Ordering Relations
NASA Technical Reports Server (NTRS)
Knuth, Kevin H.
2004-01-01
The effect of Richard T. Cox's contribution to probability theory was to generalize Boolean implication among logical statements to degrees of implication, which are manipulated using rules derived from consistency with Boolean algebra. These rules are known as the sum rule, the product rule and Bayes' theorem, and the measure resulting from this generalization is probability. In this paper, I will describe how Cox's technique can be further generalized to include other algebras and hence other problems in science and mathematics. The result is a methodology that can be used to generalize an algebra to a calculus by relying on consistency with order theory to derive the laws of the calculus. My goals are to clear up the mysteries as to why the same basic structure found in probability theory appears in other contexts, to better understand the foundations of probability theory, and to extend these ideas to other areas by developing new mathematics and new physics. The relevance of this methodology will be demonstrated using examples from probability theory, number theory, geometry, information theory, and quantum mechanics.
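For reference, the three rules named in the abstract take their standard forms (a standard statement of the rules, not quoted from the paper):

```latex
% Standard forms of the sum rule, the product rule, and Bayes' theorem.
\begin{align}
  p(A \mid C) + p(\bar{A} \mid C) &= 1
    && \text{(sum rule)} \\
  p(A \wedge B \mid C) &= p(A \mid B \wedge C)\, p(B \mid C)
    && \text{(product rule)} \\
  p(B \mid A \wedge C) &= \frac{p(A \mid B \wedge C)\, p(B \mid C)}{p(A \mid C)}
    && \text{(Bayes' theorem)}
\end{align}
```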
Semiclassical theory of electronically nonadiabatic transitions in molecular collision processes
NASA Technical Reports Server (NTRS)
Lam, K. S.; George, T. F.
1979-01-01
An introductory account of the semiclassical theory of the S-matrix for molecular collision processes is presented, with special emphasis on electronically nonadiabatic transitions. This theory is based on the incorporation of classical mechanics with quantum superposition and in practice makes use of the analytic continuation of classical mechanics into the complex time domain. The relevant concepts of molecular scattering theory and related dynamical models are described, and the formalism is developed and illustrated with simple examples, such as collinear collisions of the A+BC type. The theory is then extended to include the effects of laser-induced nonadiabatic transitions. Two bound-continuum processes, collisional ionization and collision-induced emission, also amenable to the same general semiclassical treatment, are discussed.
Classical theory of atomic collisions - The first hundred years
NASA Astrophysics Data System (ADS)
Grujić, Petar V.
2012-05-01
Classical calculations of atomic processes started in 1911 with Rutherford's famous evaluation of the differential cross section for α particles scattered on foil atoms [1]. The success of these calculations was soon overshadowed by the rise of Quantum Mechanics in 1925 and its triumphant success in describing processes at the atomic and subatomic levels. It was generally recognized that the classical approach should be inadequate, and it was neglected until 1953, when the famous paper by Gregory Wannier appeared, in which the threshold law for the behaviour of the single-ionization cross section under electron impact was derived. All later calculations and experimental studies confirmed the law derived by purely classical theory. The next step was taken by Ian Percival and collaborators in the 1960s, who developed a general classical three-body computer code, which was used by many researchers in evaluating various atomic processes like ionization, excitation, detachment, dissociation, etc. Another approach was pursued by Michal Gryzinski from Warsaw, who started a far-reaching programme for treating atomic particles and processes as purely classical objects [2]. Though often criticized for overestimating the domain of the classical theory, his group's results were able to match many experimental data. The Belgrade group pursued the classical approach using both analytical and numerical calculations, studying a number of atomic collisions, in particular near-threshold processes. The Riga group, led by Modris Gailitis [3], contributed considerably to the field, as did Valentin Ostrovsky and coworkers from Saint Petersburg, who developed powerful analytical methods within purely classical mechanics [4]. We make an overview of these approaches and show some of the remarkable results, which were subsequently confirmed by semiclassical and quantum mechanical calculations, as well as by experimental evidence. Finally, we discuss the theoretical and epistemological background of the classical calculations and explain why they turned out so successful, despite the essentially quantum nature of atomic and subatomic systems.
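The 1911 calculation that opens this history is itself a one-line formula, so it makes a convenient concrete anchor. The sketch below evaluates the Rutherford differential cross section in Gaussian units; the projectile, target, and energy chosen are illustrative.

```python
# Minimal sketch: Rutherford's classical differential cross section,
# dsigma/dOmega = (Z1 Z2 e^2 / 4E)^2 / sin^4(theta/2), Gaussian units.
import math

E_ESU = 4.80320425e-10     # elementary charge (esu)
ERG_PER_MEV = 1.602176634e-6

def rutherford_dsigma_domega(Z1: int, Z2: int, E_mev: float,
                             theta_rad: float) -> float:
    """Differential cross section in cm^2 per steradian."""
    E = E_mev * ERG_PER_MEV
    prefactor = (Z1 * Z2 * E_ESU**2 / (4.0 * E)) ** 2
    return prefactor / math.sin(theta_rad / 2.0) ** 4

# Illustrative case: 5 MeV alpha particle (Z1 = 2) on gold (Z2 = 79) at 60 deg.
print(rutherford_dsigma_domega(2, 79, 5.0, math.radians(60.0)))
```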
NASA Technical Reports Server (NTRS)
Ioannou, Petros J.; Lindzen, Richard S.
1993-01-01
Classical tidal theory is applied to the atmospheres of the outer planets. The tidal geopotential due to satellites of the outer planets is discussed, and the solution of Laplace's tidal equation for Hough modes appropriate to tides on the outer planets is examined. The vertical structure of tidal modes is described, noting that only relatively high-order meridional mode numbers can propagate vertically with growing amplitude. Expected magnitudes for tides in the visible atmosphere of Jupiter are discussed. The classical theory is extended to planetary interiors, taking the effects of sphericity and self-gravity into account. The thermodynamic structure of Jupiter is described and the WKB theory of the vertical structure equation is presented. The regions for which inertial, gravity, and acoustic oscillations are possible are delineated. The case of a planet with a neutral interior is treated, discussing the various atmospheric boundary conditions and showing that the tidal response is small.
De Tiège, Alexis; Van de Peer, Yves; Braeckman, Johan; Tanghe, Koen B
2017-11-22
Although classical evolutionary theory, i.e., population genetics and the Modern Synthesis, was already implicitly 'gene-centred', the organism was, in practice, still generally regarded as the individual unit of which a population is composed. The gene-centred approach to evolution only reached a logical conclusion with the advent of the gene-selectionist or gene's eye view in the 1960s and 1970s. Whereas classical evolutionary theory can only work with (genotypically represented) fitness differences between individual organisms, gene-selectionism is capable of working with fitness differences among genes within the same organism and genome. Here, we explore the explanatory potential of 'intra-organismic' and 'intra-genomic' gene-selectionism, i.e., of a behavioural-ecological 'gene's eye view' on genetic, genomic and organismal evolution. First, we give a general outline of the framework and how it complements the (to some extent) still 'organism-centred' approach of classical evolutionary theory. Secondly, we give a more in-depth assessment of its explanatory potential for biological evolution, i.e., for Darwin's 'common descent with modification' or, more specifically, for 'historical continuity or homology with modular evolutionary change' as it has been studied by evolutionary developmental biology (evo-devo) during the last few decades. In contrast with classical evolutionary theory, evo-devo focuses on 'within-organism' developmental processes. Given the capacity of gene-selectionism to adopt an intra-organismal gene's eye view, we outline the relevance of the latter model for evo-devo. Overall, we aim for the conceptual integration between the gene's eye view on the one hand, and more organism-centred evolutionary models (both classical evolutionary theory and evo-devo) on the other.
Clerc, Daryl G
2016-07-21
An ab initio approach was used to study the molecular-level interactions that connect gene-mutation to changes in an organism's phenotype. The study provides new insights into the evolutionary process and presents a simplification whereby changes in phenotypic properties may be studied in terms of the binding affinities of the chemical interactions affected by mutation, rather than by correlation to the genes. The study also reports the role that nonlinear effects play in the progression of organs, and how those effects relate to the classical theory of evolution. Results indicate that the classical theory of evolution occurs as a special case within the ab initio model - a case having two attributes. The first attribute: proteins and promoter regions are not shared among organs. The second attribute: continuous limiting behavior exists in the physical properties of organs as well as in the binding affinity of the associated chemical interactions, with respect to displacements in the chemical properties of proteins and promoter regions induced by mutation. Outside of the special case, second-order coupling contributions are significant and nonlinear effects play an important role, a result corroborated by analyses of published activity levels in binding and transactivation assays. Further, gradations in the state of perfection of an organ may be small or large depending on the type of mutation, and not necessarily closely-separated as maintained by the classical theory. Results also indicate that organs progress with varying degrees of interdependence, the likelihood of successful mutation decreases with increasing complexity of the affected chemical system, and differences between the ab initio model and the classical theory increase with increasing complexity of the organism. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.
The Basics: What's Essential about Theory for Community Development Practice?
ERIC Educational Resources Information Center
Hustedde, Ronald J.; Ganowicz, Jacek
2002-01-01
Relates three classical theories (structural functionalism, conflict theory, symbolic interactionism) to fundamental concerns of community development (structure, power, and shared meaning). Links these theories to Giddens' structuration theory, which connects macro and micro structures and community influence on change through cultural norms.…
Lindenblatt, G.; Silny, J.
2006-01-01
Leakage currents, tiny currents flowing from an everyday-life appliance through the body to the ground, can cause a non-adequate perception (called electrocutaneous sensation, ECS) or even pain and should be avoided. Safety standards for the low-frequency range are based on experimental results for the current thresholds of electrocutaneous sensations, which however show a wide range between about 50 μA (rms) and 1000 μA (rms). In order to explain these differences, the perception threshold was measured repeatedly in experiments with test persons under an identical experimental setup, but by means of different methods (measuring strategies), namely direct adjustment, the classical threshold as the amperage of 50% perception probability, and the confidence rating procedure of signal detection theory. The current is injected using a 1 cm² electrode at the highly touch-sensitive part of the index fingertip. These investigations show for the first time that the threshold of electrocutaneous sensations is influenced both by adaptation to the non-adequate stimulus and by individual, emotional factors. Therefore, classical methods, on which the majority of the safety investigations are based, cannot be used to determine a leakage current threshold. The confidence rating procedure of modern signal detection theory yields a value of 179.5 μA (rms) at the 50 Hz power supply frequency as the lower end of the 95% confidence range, considering the variance in the investigated group. This value is expected to be free of adaptation influences, and is distinctly lower than the European limits and supports the stricter regulations of Canada and the USA. PMID:17111461
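The "classical threshold as the amperage of 50% perception probability" is conventionally obtained by fitting a psychometric function to yes/no detection data. The sketch below illustrates that step on synthetic data; the response rates, the logistic form, and the starting values are all hypothetical.

```python
# Minimal sketch (synthetic data): estimate a 50%-perception threshold by
# fitting a logistic psychometric function to detection rates.
import numpy as np
from scipy.optimize import curve_fit

amplitude = np.array([50.0, 100.0, 150.0, 200.0, 300.0, 500.0])  # uA rms
p_detect  = np.array([0.05, 0.15, 0.40, 0.65, 0.90, 0.99])       # observed rates

def logistic(x, threshold, slope):
    return 1.0 / (1.0 + np.exp(-(x - threshold) / slope))

(threshold, slope), _ = curve_fit(logistic, amplitude, p_detect, p0=(180.0, 50.0))
print(f"50% perception threshold ~ {threshold:.1f} uA rms")
```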
Linear flavor-wave theory for fully antisymmetric SU(N) irreducible representations
NASA Astrophysics Data System (ADS)
Kim, Francisco H.; Penc, Karlo; Nataf, Pierre; Mila, Frédéric
2017-11-01
The extension of the linear flavor-wave theory to fully antisymmetric irreducible representations (irreps) of SU(N) is presented in order to investigate the color order of SU(N) antiferromagnetic Heisenberg models in several two-dimensional geometries. The square, triangular, and honeycomb lattices are considered with m fermionic particles per site. We present two different methods: the first method is the generalization of the multiboson spin-wave approach to SU(N), which consists of associating a Schwinger boson with each state on a site. The second method adopts the Read and Sachdev bosons, which are an extension of the Schwinger bosons that introduces one boson for each color and each line of the Young tableau. The two methods yield the same dispersing modes, a good indication that they properly capture the semiclassical fluctuations, but the first one leads to spurious flat modes of finite frequency not present in the second one. Both methods lead to the same physical conclusions otherwise: long-range Néel-type order is likely for the square lattice for SU(4) with two particles per site, but quantum fluctuations probably destroy order for more than two particles per site, with N = 2m. By contrast, quantum fluctuations always lead to corrections larger than the classical order parameter for the tripartite triangular lattice (with N = 3m) or the bipartite honeycomb lattice (with N = 2m) for more than one particle per site, m > 1, making the presence of color order very unlikely except maybe for m = 2 on the honeycomb lattice, for which the correction is only marginally larger than the classical order parameter.
Introduction of a Classical Level in Quantum Theory
NASA Astrophysics Data System (ADS)
Prosperi, G. M.
2016-11-01
In an old paper of our group in Milano, a formalism was introduced for the continuous monitoring of a system during a certain interval of time in the framework of a somewhat generalized approach to quantum mechanics (QM). The outcome was a distribution of probability on the space of all the possible continuous histories of a set of quantities to be considered as a kind of coarse-grained approximation to some ordinary quantum observables, commuting or not. In fact, the main aim was the introduction of a classical level in the context of QM, formally treating a set of basic quantities, to be considered as beables in the sense of Bell, as continuously under observation. However, the effect of this assumption was a permanent modification of the Liouville-von Neumann equation for the statistical operator by the introduction of a dissipative term, which is in conflict with basic conservation rules in all reasonable models we had considered. Difficulties were even encountered for a relativistic extension of the formalism. In this paper, I propose a modified version of the original formalism which seems to overcome both difficulties. First, I study the simple models of a harmonic oscillator and a free scalar field, in which a coarse-grained position and a coarse-grained field, respectively, are treated as beables. Then I consider the more realistic case of spinor electrodynamics, in which only certain coarse-grained electric and magnetic fields are introduced as classical variables and no matter-related quantities.
On classical and quantum dynamics of tachyon-like fields and their cosmological implications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dimitrijević, Dragoljub D., E-mail: ddrag@pmf.ni.ac.rs; Djordjević, Goran S.; Milošević, Milan
2014-11-24
We consider a class of tachyon-like potentials, motivated by string theory, D-brane dynamics and inflation theory, in the context of classical and quantum mechanics. A formalism for describing the dynamics of tachyon fields in the spatially homogeneous, one-dimensional, classical and quantum mechanical limits is proposed. A few models with concrete potentials are considered. Additionally, possibilities for p-adic and adelic generalization of these models are discussed. Classical actions and corresponding quantum propagators, in the Feynman path integral approach, are calculated in a form invariant under a change of the background number fields, i.e. on both archimedean and nonarchimedean spaces. Looking for a quantum origin of inflation, the relevance of p-adic and adelic generalizations is briefly discussed.
The Classical Theory of Light Colors: a Paradigm for Description of Particle Interactions
NASA Astrophysics Data System (ADS)
Mazilu, Nicolae; Agop, Maricel; Gatu, Irina; Iacob, Dan Dezideriu; Butuc, Irina; Ghizdovat, Vlad
2016-06-01
Color is an interaction property: a property of the interaction of light with matter. Classically speaking, it is therefore akin to the forces. But while forces engendered the mechanical view of the world, colors generated the optical view. One of the modern concepts of interaction between the fundamental particles of matter, quantum chromodynamics, aims to fill the gap between mechanics and optics in a specific description of the strong interactions. We show here that this modern description of particle interactions has ties with both the classical and quantum theories of light, regardless of the connection between forces and colors. In a word, light is a universal model in the description of matter. The description involves classical Yang-Mills fields related to color.
NASA Astrophysics Data System (ADS)
Ivanov, Sergey V.; Buzykin, Oleg G.
2016-12-01
A classical approach is applied to calculate the pressure broadening coefficients of CO2 vibration-rotational spectral lines perturbed by Ar. Three types of spectra are examined: electric dipole (infrared) absorption, and isotropic and anisotropic Raman Q branches. Simple and explicit formulae of the classical impact theory are used along with exact 3D Hamilton equations for the CO2-Ar molecular motion. The calculations utilize the most accurate vibrationally independent ab initio potential energy surface (PES) of Hutson et al., expanded in a Legendre polynomial series up to lmax = 24. A new, improved algorithm for classical rotational frequency selection is applied. The dependences of the CO2 half-widths on the rotational quantum number J up to J = 100 are computed for temperatures between 77 and 765 K and compared with available experimental data as well as with the results of fully quantum dynamical calculations performed on the same PES. To make the picture complete, the predictions of two independent variants of the semi-classical Robert-Bonamy formalism for dipole absorption lines are included. This method, however, has demonstrated poor accuracy at almost all temperatures. On the contrary, the classical broadening coefficients are in excellent agreement both with measurements and with quantum results at all temperatures. The classical impact theory in its present variant can produce, quickly and accurately, the pressure broadening coefficients of spectral lines of linear molecules for any J value (including high J) using a full-dimensional ab initio based PES, in cases where other computational methods are either extremely time consuming (like the quantum close coupling method) or give erroneous results (like semi-classical methods).
NASA Astrophysics Data System (ADS)
Takatsuka, Kazuo; Seko, Chihiro
1996-12-01
The validity of the physical premise of the Rice-Ramsperger-Kassel-Marcus (RRKM) theory is investigated in terms of the classical dynamics of isomerization reactions in Ar7-like molecules (clusters). The passage times of classical trajectories through the potential basins of isomers in the structural transitions are examined. In the high-energy region corresponding to the so-called liquidlike phase, a remarkable uniformity of the average passage times has been found. That is, the average passage time is characterized only by the basin through which a trajectory is currently passing and hence does not depend on the basin to be visited next. This behavior is at odds with the ordinary chemical law, in that the "reaction rates" do not seem to depend on the heights of the individual potential barriers. We ascribe this seemingly strange uniformity to the strong mixing (chaos) lying behind the rate process. That is, as soon as a classical path enters a basin, it is drawn into a chaotic zone in which many paths having different channels are entangled with each other, and it effectively (in the statistical sense) loses its memory of which basin it came from and which it should visit next. This model is verified by confirming that the distributions of lifetimes for transitions from one basin to the others are exponential, with very similar exponents within each basin being traversed. The inverse of the exponent is essentially proportional to the average passage time and consequently brings about the uniformity. These distributions set a foundation for the multichannel generalization of the RRKM theory. Two cases of non-RRKM behavior have been studied. One is the nonstatistical behavior in the low-energy region, such as the so-called coexistence phase. The other is the short-time behavior. It is well established [M. Berblinger and C. Schlier, J. Chem. Phys. 101, 4750 (1994)] that in a relatively simple and small system such as H₃⁺, the so-called direct paths, which lead to dissociation before phase-space mixing is completed, increase the probability of short-time passage. In contrast, we have found in our Ar7-like molecules that trajectories of short passage time are fewer than expected from the statistical theory. It appears that a somewhat long time in the initial stage of the isomerization is spent by a trajectory finding its way out to the next basins.
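The exponential-lifetime check described above is straightforward to reproduce in outline. The sketch below does so with synthetic passage times (nothing here comes from the paper's trajectories): if basin escape is statistical, passage times are exponentially distributed and the escape rate is the inverse mean passage time, independent of the destination basin.

```python
# Minimal sketch (synthetic data): exponential-lifetime test behind
# RRKM-type statistical behavior.
import numpy as np

rng = np.random.default_rng(1)
true_rate = 0.25                                   # hypothetical escape rate
passage_times = rng.exponential(1.0 / true_rate, 5000)

rate_hat = 1.0 / passage_times.mean()              # maximum-likelihood estimate

# Survival probability S(t) should decay as exp(-rate * t): check on a grid.
t = np.linspace(0.0, 20.0, 5)
S_empirical = np.array([(passage_times > ti).mean() for ti in t])
S_model = np.exp(-rate_hat * t)
print(f"estimated rate = {rate_hat:.3f}")
print(np.round(S_empirical, 3), np.round(S_model, 3))
```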
The polymer physics of single DNA confined in nanochannels.
Dai, Liang; Renner, C Benjamin; Doyle, Patrick S
2016-06-01
In recent years, applications and experimental studies of DNA in nanochannels have stimulated the investigation of the polymer physics of DNA in confinement. Recent advances in the physics of confined polymers, using DNA as a model polymer, have moved beyond the classic Odijk theory for strong confinement and the classic blob theory for weak confinement. In this review, we present the current understanding of the behaviors of confined polymers while briefly reviewing the classic theories. Three aspects of confined DNA are presented: static, dynamic, and topological properties. The relevant simulation methods are also summarized. In addition, comparisons of confined DNA with DNA under tension and DNA in semidilute solution are made to emphasize universal behaviors. Finally, an outlook on possible future research for confined DNA is given. Copyright © 2015 Elsevier B.V. All rights reserved.
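For orientation, the two classic limits named above are commonly summarized by scaling relations of the following form (a textbook-level sketch with generic prefactors, not taken from this review): for a chain of contour length L, persistence length l_p, and effective width w in a channel of width D,

```latex
% Odijk regime (strong confinement, D << l_p): deflection segments give a
% nearly fully extended chain,
\[
  X_{\text{Odijk}} \simeq L\left[\,1 - \alpha\left(\frac{D}{l_p}\right)^{2/3}\right],
  \qquad \alpha = O(0.1),
\]
% de Gennes blob regime (weak confinement, D >> l_p): a string of blobs gives
\[
  X_{\text{blob}} \sim L\left(\frac{l_p\, w}{D^{2}}\right)^{1/3}.
\]
```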
Classical and non-classical effective medium theories: New perspectives
NASA Astrophysics Data System (ADS)
Tsukerman, Igor
2017-05-01
Future research in the electrodynamics of periodic electromagnetic composites (metamaterials) can be expected to produce sophisticated homogenization theories valid for any composition and size of the lattice cell. The paper outlines a promising path in that direction, leading to non-asymptotic and nonlocal homogenization models, and highlights aspects of homogenization that are often overlooked: the finite size of the sample and the role of interface boundaries. Classical theories (e.g. Clausius-Mossotti, Maxwell Garnett), while originally derived from a very different set of ideas, fit well into the proposed framework. Nonlocal effects can be included in the model, making order-of-magnitude accuracy improvements possible. One future challenge is to determine what effective parameters can or cannot be obtained for a given set of constituents of a metamaterial lattice cell, thereby delineating the possible from the impossible in metamaterial design.
ERIC Educational Resources Information Center
Kim, Sooyeon; Livingston, Samuel A.
2017-01-01
The purpose of this simulation study was to assess the accuracy of a classical test theory (CTT)-based procedure for estimating the alternate-forms reliability of scores on a multistage test (MST) having 3 stages. We generated item difficulty and discrimination parameters for 10 parallel, nonoverlapping forms of the complete 3-stage test and…
Wang, Wei; Takeda, Mitsuo
2006-09-01
A new concept of vector and tensor densities is introduced into the general coherence theory of vector electromagnetic fields that is based on energy and energy-flow coherence tensors. Related coherence conservation laws are presented in the form of continuity equations that provide new insights into the propagation of second-order correlation tensors associated with stationary random classical electromagnetic fields.
Application of ply level analysis to flexural wave propagation
NASA Astrophysics Data System (ADS)
Valisetty, R. R.; Rehfield, L. W.
1988-10-01
A brief survey is presented of the shear deformation theories of laminated plates. It indicates that there are certain non-classical influences that affect bending-related behavior in the same way as the transverse shear stresses do. They include bending- and stretching-related section warping with the concomitant non-classical surface-parallel stress contributions, and the transverse normal stress. A bending theory gives significantly improved performance if these non-classical effects are incorporated. The heterogeneous shear deformations that are characteristic of laminates with highly dissimilar materials, however, require that attention be paid to the modeling of local rotations. In this paper, it is shown that a ply level analysis can be used to model such disparate shear deformations. Here, the equilibrium of each layer is analyzed separately. Earlier applications of this analysis include free-edge laminate stresses. It is now extended to the study of flexural wave propagation in laminates. A recently developed homogeneous plate theory is used as the ply level model. Due consideration is given to the non-classical influences, and no shear correction factors are introduced extraneously in this theory. The results for the lowest flexural mode of travelling planar harmonic waves indicate that this approach is competitive and yields better results for certain laminates.
Geometric Theory of Reduction of Nonlinear Control Systems
NASA Astrophysics Data System (ADS)
Elkin, V. I.
2018-02-01
The foundations of a differential geometric theory of nonlinear control systems are described on the basis of categorical concepts (isomorphism, factorization, restrictions) by analogy with classical mathematical theories (of linear spaces, groups, etc.).
Comment on Gallistel: behavior theory and information theory: some parallels.
Nevin, John A
2012-05-01
In this article, Gallistel proposes information theory as an approach to some enduring problems in the study of operant and classical conditioning. Copyright © 2012 Elsevier B.V. All rights reserved.
Representational Realism, Closed Theories and the Quantum to Classical Limit
NASA Astrophysics Data System (ADS)
de Ronde, Christian
In this chapter, we discuss the representational realist stance as a pluralist ontic approach to inter-theoretic relationships. Our stance stresses the fact that physical theories require the necessary consideration of a conceptual level of discourse which determines and configures the specific field of phenomena discussed by each particular theory. We will criticize the orthodox line of research which has grounded the analysis of QM in two (Bohrian) metaphysical presuppositions, accepted in the present as dogmas that all interpretations must follow. We will also examine how the orthodox project of "bridging the gap" between the quantum and the classical domains has constrained the possibilities of research, producing only a limited set of interpretational problems which focus solely on the justification of "classical reality" and exclude the possibility of analyzing non-classical conceptual representations of QM. The representational realist stance introduces two new problems, namely, the superposition problem and the contextuality problem, which explicitly consider the conceptual representation of orthodox QM beyond the mere reference to mathematical structures and measurement outcomes. In the final part of the chapter, we revisit, from the representational realist perspective, the quantum-to-classical limit and the orthodox claim that this inter-theoretic relation can be explained through the principle of decoherence.
[Discussion on six errors of formulas corresponding to syndromes in using the classic formulas].
Bao, Yan-ju; Hua, Bao-jin
2012-12-01
The theory of formulas corresponding to syndromes is one of the characteristics of the Treatise on Cold Damage and Miscellaneous Diseases (Shanghan Zabing Lun) and one of the main principles in applying classic prescriptions. Therapeutic effect depends on following the principle of formulas corresponding to syndromes. However, some medical practitioners find that the actual clinical effect is far less than expected. Six errors in the use of classic prescriptions and of the theory of formulas corresponding to syndromes are the most important causes to be considered: paying attention only to the local syndromes while neglecting the whole; only to formulas corresponding to syndromes while neglecting the pathogenesis; only to syndromes while neglecting the pulse diagnosis; only to unilateral prescriptions while neglecting combined prescriptions; only to classic prescriptions while neglecting modern formulas; and only to the formulas while neglecting the drug dosage. Therefore, in the clinical application of classic prescriptions and the theory of formulas corresponding to syndromes, it is necessary to consider not only the patients' clinical syndromes but also the combination of the main syndrome and the pathogenesis. In addition, comprehensive syndrome differentiation, modern formulas, current prescriptions, combined prescriptions, and drug dosage all contribute to avoiding clinical errors and improving clinical effects.
Lagrange multiplier and Wess-Zumino variable as extra dimensions in the torus universe
NASA Astrophysics Data System (ADS)
Nejad, Salman Abarghouei; Dehghani, Mehdi; Monemzadeh, Majid
2018-01-01
We study the effect of the simplest geometry imposed via the topology of the universe by gauging a non-relativistic particle model on the torus and 3-torus with the help of the symplectic formalism for constrained systems. We also obtain the generators of gauge transformations for the gauged models. Extracting the corresponding Poisson structure of the constraints, we show the effect of the shape of the universe on the canonical structure of the phase spaces of the models and suggest some phenomenology to probe the topology of the universe and a probable non-commutative structure of space. In addition, we show that the number of extra dimensions in the phase spaces of the gauged embedded models is exactly two. Moreover, at the classical level, we discuss the modification of Newton's second law in order to study the origin of the terms appearing in the gauged theory.
Fourier Analysis in Introductory Physics
NASA Astrophysics Data System (ADS)
Huggins, Elisha
2007-01-01
In an after-dinner talk at the fall 2005 meeting of the New England chapter of the AAPT, Professor Robert Arns drew an analogy between classical physics and Classic Coke. To generations of physics teachers and textbook writers, classical physics was the real thing. Modern physics, which in introductory textbooks "appears in one or more extra chapters at the end of the book, … is a divertimento that we might get to if time permits." Modern physics is more like vanilla or lime Coke, probably a fad, while "Classic Coke is part of your life; you do not have to think about it twice."
Extraction of decision rules via imprecise probabilities
NASA Astrophysics Data System (ADS)
Abellán, Joaquín; López, Griselda; Garach, Laura; Castellano, Javier G.
2017-05-01
Data analysis techniques can be applied to discover important relations among features. This is the main objective of the Information Root Node Variation (IRNV) technique, a new method to extract knowledge from data via decision trees. The decision trees used by the original method were built using classic split criteria. The performance of new split criteria based on imprecise probabilities and uncertainty measures, called credal split criteria, differs significantly from the performance obtained using the classic criteria. This paper extends the IRNV method using two credal split criteria: one based on a mathematical parametric model, and the other based on a non-parametric model. The performance of the method is analyzed using a case study of traffic accident data to identify patterns related to the severity of an accident. We found that a larger number of rules is generated, significantly supplementing the information obtained using the classic split criteria.
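Credal split criteria are typically built on interval probabilities rather than point estimates. The sketch below shows the usual starting point, the imprecise Dirichlet model (IDM); it is a hedged illustration of the general idea, not the paper's exact criterion, and the class counts are hypothetical.

```python
# Hedged sketch: IDM interval probabilities at a candidate tree node. With
# class counts n_i, total N, and hyperparameter s, each class probability is
# only known to lie in [n_i/(N+s), (n_i+s)/(N+s)]. An uncertainty measure
# such as the maximum entropy over this credal set can then replace Shannon
# entropy in a decision-tree split criterion.
def idm_intervals(counts: list[int], s: float = 1.0) -> list[tuple[float, float]]:
    total = sum(counts)
    return [(n / (total + s), (n + s) / (total + s)) for n in counts]

# Example with hypothetical class counts.
for lo, hi in idm_intervals([30, 10, 2]):
    print(f"[{lo:.3f}, {hi:.3f}]")
```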
Influences on and Limitations of Classical Test Theory Reliability Estimates.
ERIC Educational Resources Information Center
Arnold, Margery E.
It is incorrect to say "the test is reliable" because reliability is a function not only of the test itself, but of many factors. The present paper explains how different factors affect classical reliability estimates such as test-retest, interrater, internal consistency, and equivalent forms coefficients. Furthermore, the limits of classical test…
A Comparison of Kinetic Energy and Momentum in Special Relativity and Classical Mechanics
ERIC Educational Resources Information Center
Riggs, Peter J.
2016-01-01
Kinetic energy and momentum are indispensable dynamical quantities in both the special theory of relativity and in classical mechanics. Although momentum and kinetic energy are central to understanding dynamics, the differences between their relativistic and classical notions have not always received adequate treatment in undergraduate teaching.…
A Comparative Analysis of Three Unique Theories of Organizational Learning
ERIC Educational Resources Information Center
Leavitt, Carol C.
2011-01-01
The purpose of this paper is to present three classical theories on organizational learning and conduct a comparative analysis that highlights their strengths, similarities, and differences. Two of the theories -- experiential learning theory and adaptive-generative learning theory -- represent the thinking of the cognitive perspective, while…
NASA Astrophysics Data System (ADS)
Hwang, Jai-Chan; Noh, Hyerim
2005-03-01
We present cosmological perturbation theory based on generalized gravity theories including string theory correction terms and a tachyonic complication. The classical evolution as well as the quantum generation processes in these varieties of gravity theories are presented in unified forms. These apply both to the scalar- and tensor-type perturbations. Analyses are made based on the curvature variable in two different gauge conditions often used in the literature in Einstein’s gravity; these are the curvature variables in the comoving (or uniform-field) gauge and the zero-shear gauge. Applications to generalized slow-roll inflation and its consequent power spectra are derived in unified forms which include a wide range of inflationary scenarios based on Einstein’s gravity and others.
Infinite derivative gravity: non-singular cosmology & blackhole solutions
NASA Astrophysics Data System (ADS)
Mazumdar, A.
Both Einstein's theory of General Relativity and Newton's theory of gravity possess a short-distance and small-time-scale catastrophe. The blackhole singularity and cosmological Big Bang singularity problems highlight that current theories of gravity are an incomplete description at early times and small distances. I will discuss how one can potentially resolve these fundamental problems at both the classical and quantum levels. In particular, I will discuss infinite derivative theories of gravity, in which gravitational interactions become weaker in the ultraviolet, thereby resolving some of the classical singularities, such as the Big Bang and the Schwarzschild singularity for compact non-singular objects with mass up to 10²⁵ grams. In this lecture, I will discuss quantum aspects of infinite derivative gravity and a few aspects which can make the theory asymptotically free in the UV.
Psychodrama: group psychotherapy through role playing.
Kipper, D A
1992-10-01
The theory and the therapeutic procedure of classical psychodrama are described along with brief illustrations. Classical psychodrama and sociodrama stemmed from role theory, enactments, "tele," the reciprocity of choices, and the theory of spontaneity-robopathy and creativity. The discussion focuses on key concepts such as the therapeutic team, the structure of the session, transference and reality, countertransference, the here-and-now and the encounter, the group-as-a-whole, resistance and difficult clients, and affect and cognition. Also described are the neoclassical approaches of psychodrama, action methods, and clinical role playing, and the significance of the concept of behavioral simulation in group psychotherapy.
Automatic and strategic effects in the guidance of attention by working memory representations
Carlisle, Nancy B.; Woodman, Geoffrey F.
2010-01-01
Theories of visual attention suggest that working memory representations automatically guide attention toward memory-matching objects. Some empirical tests of this prediction have produced results consistent with working memory automatically guiding attention. However, others have shown that individuals can strategically control whether working memory representations guide visual attention. Previous studies have not independently measured automatic and strategic contributions to the interactions between working memory and attention. In this study, we used a classic manipulation of the probability of valid, neutral, and invalid cues to tease apart the nature of such interactions. This framework utilizes measures of reaction time (RT) to quantify the costs and benefits of attending to memory-matching items and infer the relative magnitudes of automatic and strategic effects. We found both costs and benefits even when the memory-matching item was no more likely to be the target than other items, indicating an automatic component of attentional guidance. However, the costs and benefits essentially doubled as the probability of a trial with a valid cue increased from 20% to 80%, demonstrating a potent strategic effect. We also show that the instructions given to participants led to a significant change in guidance distinct from the actual probability of events during the experiment. Together, these findings demonstrate that the influence of working memory representations on attention is driven by both automatic and strategic interactions. PMID:20643386
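The cost/benefit logic in this abstract reduces to two differences relative to the neutral-cue condition. The sketch below illustrates the arithmetic with hypothetical reaction times (the numbers are not the study's data): an automatic component appears as nonzero cost and benefit even at 20% cue validity, while a strategic component appears as their growth with validity.

```python
# Minimal sketch (hypothetical RTs): benefits and costs of memory-matching
# cues, defined relative to the neutral condition.
mean_rt = {                      # hypothetical mean reaction times (ms)
    (0.20, "valid"): 610, (0.20, "neutral"): 625, (0.20, "invalid"): 642,
    (0.80, "valid"): 580, (0.80, "neutral"): 628, (0.80, "invalid"): 665,
}

for validity in (0.20, 0.80):
    benefit = mean_rt[(validity, "neutral")] - mean_rt[(validity, "valid")]
    cost = mean_rt[(validity, "invalid")] - mean_rt[(validity, "neutral")]
    print(f"validity {validity:.0%}: benefit = {benefit} ms, cost = {cost} ms")
```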
Probabilistic teleportation via multi-parameter measurements and partially entangled states
NASA Astrophysics Data System (ADS)
Wei, Jiahua; Shi, Lei; Han, Chen; Xu, Zhiyan; Zhu, Yu; Wang, Gang; Wu, Hao
2018-04-01
In this paper, a novel scheme for probabilistic teleportation via multi-parameter measurements and a non-maximally entangled state is presented. This contrasts with most previous schemes, in which the kinds of measurements used for quantum teleportation are fixed in advance. The detailed implementation procedures for our proposal are given using appropriate local unitary operations. Moreover, the total success probability and the classical information cost of the proposal are calculated. It is demonstrated that the success probability and the classical cost vary with the multi-measurement parameters and with the entanglement factor of the quantum channel. Our scheme could enlarge the research scope of probabilistic teleportation.
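The paper's multi-parameter measurement scheme is not reproduced here; the sketch below only illustrates the standard baseline result for such schemes, namely how the success probability depends on the entanglement factor of a channel of the form a|00> + b|11> with a >= b:

```python
import numpy as np

# Sketch: success probability of probabilistic teleportation through a
# partially entangled channel a|00> + b|11> (a >= b >= 0, a^2 + b^2 = 1).
# P = 2*b^2 is the standard result for this baseline; the multi-parameter
# scheme in the paper generalizes it and may differ in detail.

def success_probability(b):
    return 2 * b**2

for b in (0.0, 0.3, 0.5, 1 / np.sqrt(2)):
    print(f"entanglement factor b = {b:.3f} -> P_success = {success_probability(b):.3f}")
# b = 1/sqrt(2) (a maximally entangled channel) recovers deterministic
# teleportation with P = 1.
```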
Cappelleri, Joseph C.; Lundy, J. Jason; Hays, Ron D.
2014-01-01
Introduction The U.S. Food and Drug Administration’s patient-reported outcome (PRO) guidance document defines content validity as “the extent to which the instrument measures the concept of interest” (FDA, 2009, p. 12). “Construct validity is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity” (Strauss & Smith, 2009, p. 7). Hence both qualitative and quantitative information are essential in evaluating the validity of measures. Methods We review classical test theory and item response theory approaches to evaluating PRO measures including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized “difficulty” (severity) order of items is represented by observed responses. Conclusion Classical test theory and item response theory can be useful in providing a quantitative assessment of items and scales during the content validity phase of patient-reported outcome measures. Depending on the particular type of measure and the specific circumstances, either one or both approaches should be considered to help maximize the content validity of PRO measures. PMID:24811753
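A minimal sketch, on simulated data, of two of the classical-test-theory checks listed above: floor/ceiling effects and corrected item-total correlations. All data here are hypothetical:

```python
import numpy as np

# Sketch (hypothetical data): CTT item checks for a (respondents x items)
# matrix of 0-4 Likert responses.

rng = np.random.default_rng(0)
responses = rng.integers(0, 5, size=(200, 6))      # 200 respondents, 6 items
n_items = responses.shape[1]

totals = responses.sum(axis=1)
floor_pct = 100 * np.mean(totals == 0)             # % at minimum possible score
ceiling_pct = 100 * np.mean(totals == 4 * n_items) # % at maximum possible score
print(f"floor: {floor_pct:.1f}%  ceiling: {ceiling_pct:.1f}%")

for j in range(n_items):
    rest = totals - responses[:, j]                # total with item j removed
    r = np.corrcoef(responses[:, j], rest)[0, 1]
    print(f"item {j}: corrected item-total correlation = {r:.2f}")
```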
Spinning particles, axion radiation, and the classical double copy
NASA Astrophysics Data System (ADS)
Goldberger, Walter D.; Li, Jingping; Prabhu, Siddharth G.
2018-05-01
We extend the perturbative double copy between radiating classical sources in gauge theory and gravity to the case of spinning particles. We construct, to linear order in spins, perturbative radiating solutions to the classical Yang-Mills equations sourced by a set of interacting color charges with chromomagnetic dipole spin couplings. Using a color-to-kinematics replacement rule proposed earlier by one of the authors, these solutions map onto radiation in a theory of interacting particles coupled to massless fields that include the graviton, a scalar (dilaton) ϕ and the Kalb-Ramond axion field B_{μν}. Consistency of the double copy imposes constraints on the parameters of the theory on both the gauge and gravity sides of the correspondence. In particular, the color charges carry a chromomagnetic interaction which, in d = 4, corresponds to a gyromagnetic ratio equal to Dirac's value g = 2. The color-to-kinematics map implies that on the gravity side, the bulk theory of the fields (ϕ, g_{μν}, B_{μν}) has interactions which match those of d-dimensional "string gravity," as is the case both in the BCJ double copy of pure gauge theory scattering amplitudes and the KLT relations between the tree-level S-matrix elements of open and closed string theory.
Lamb wave extraction of dispersion curves in micro/nano-plates using couple stress theories
NASA Astrophysics Data System (ADS)
Ghodrati, Behnam; Yaghootian, Amin; Ghanbar Zadeh, Afshin; Mohammad-Sedighi, Hamid
2018-01-01
In this paper, Lamb wave propagation in homogeneous and isotropic non-classical micro/nano-plates is investigated. To consider the effect of the material microstructure on wave propagation, three size-dependent models, namely the indeterminate, modified and consistent couple stress theories, are used to extract the dispersion equations. In these theories, a parameter called the 'characteristic length' accounts for the size of the material microstructure in the governing equations. To generalize the parametric studies and examine the effects of thickness, propagation wavelength, and characteristic length on the behavior of miniature plate structures, the governing equations are nondimensionalized by defining appropriate dimensionless parameters. The dispersion curves for phase and group velocities are then plotted over a wide frequency-thickness range to study Lamb wave propagation, including microstructure effects, at very high frequencies. The results show that the couple stress theories predict more rigidity for a Cosserat-type material than the classical theory does: in a plate of constant thickness, as the ratio of thickness to characteristic length increases, the results approach those of the classical theory, and as this ratio decreases, the wave propagation speed in the plate increases significantly. In addition, it is demonstrated that at high frequencies the Lamb wave velocity converges to the (dispersive) Rayleigh wave velocity.
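As a classical baseline for the couple-stress dispersion curves discussed above, a minimal sketch of extracting one point on the lowest symmetric (S0) branch by root-finding on the classical Rayleigh-Lamb characteristic equation. The aluminium-like material constants and the scan ranges are assumptions; the paper's couple-stress equations are not reproduced:

```python
import numpy as np

cL, cT = 6320.0, 3130.0      # assumed bulk wave speeds (m/s), aluminium-like
h = 0.5e-3                   # plate half-thickness (m)

def F_sym(cp, omega):
    """Symmetric Rayleigh-Lamb characteristic function (real-valued here)."""
    k = omega / cp
    p = np.sqrt(complex(omega**2 / cL**2 - k**2))
    q = np.sqrt(complex(omega**2 / cT**2 - k**2))
    return (np.tan(q * h) * (q**2 - k**2)**2
            + 4 * k**2 * p * q * np.tan(p * h)).real

def s0_phase_velocity(freq):
    """Bracket the first sign change of F_sym between cT and cL, then bisect."""
    omega = 2 * np.pi * freq
    cps = np.linspace(3300.0, 6200.0, 3000)
    vals = np.array([F_sym(c, omega) for c in cps])
    idx = np.flatnonzero(np.sign(vals[:-1]) != np.sign(vals[1:]))
    if idx.size == 0:
        return float("nan")
    a, b = cps[idx[0]], cps[idx[0] + 1]
    for _ in range(60):
        m = 0.5 * (a + b)
        if np.sign(F_sym(m, omega)) == np.sign(F_sym(a, omega)):
            a = m
        else:
            b = m
    return 0.5 * (a + b)

for f in (0.1e6, 0.5e6, 1.5e6):  # frequencies kept below the first tan() singularity
    print(f"f·d = {f * 2 * h / 1e3:.1f} MHz·mm -> c_p(S0) ≈ {s0_phase_velocity(f):.0f} m/s")
```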
D'Ariano, Giacomo Mauro
2018-07-13
Causality has never gained the status of a 'law' or 'principle' in physics. Some recent literature has even popularized the false idea that causality is a notion that should be banned from theory. Such a misconception relies on an alleged universality of the reversibility of the laws of physics, based either on the determinism of classical theory or on the multiverse interpretation of quantum theory, in both cases motivated by mere interpretational requirements for realism of the theory. Here, I will show that a properly defined, unambiguous notion of causality is a theorem of quantum theory, which is also a falsifiable proposition of the theory. Such a notion of causality appeared in the literature within the framework of operational probabilistic theories. It is a genuinely theoretical notion, corresponding to establishing a definite partial order among events, in the same way as we do by using the future causal cone on Minkowski space. The notion of causality is logically completely independent of the misidentified concept of 'determinism', and, being a consequence of quantum theory, is ubiquitous in physics. In addition, as classical theory can be regarded as a restriction of quantum theory, causality holds also in the classical case, although the determinism of the theory trivializes it. I then conclude by arguing that causality naturally establishes an arrow of time. This implies that the scenario of the 'block Universe' and the connected 'past hypothesis' are incompatible with causality, and thus with quantum theory: they are both doomed to remain mere interpretations and, as such, are not falsifiable, similar to the hypothesis of 'super-determinism'. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).
[Fuzzy logic in urology. How to reason in inaccurate terms].
Vírseda Chamorro, Miguel; Salinas Casado, Jesus; Vázquez Alba, David
2004-05-01
Western thinking is basically binary, built on opposites, and classical logic is a systematization of this way of thinking. The methods of the exact sciences, such as physics, rest on systematic measurement, analysis and synthesis; nature is thereby described by deterministic differential equations. Medical knowledge does not fit the deterministic equations of physics well, so probabilistic methods are employed instead. This approach, however, is not free of problems, both theoretical and practical: it is often not even possible to know with certainty the probabilities of most events. Moreover, the application of binary logic to medicine in general, and to urology in particular, meets serious difficulties, such as the imprecise definitions of most diseases and the uncertainty associated with most medical acts. These difficulties explain why many medical recommendations are written in a literary language that is inaccurate, inconsistent and incoherent. Fuzzy logic is a way of reasoning coherently with imprecise concepts. Proposed by Lotfi Zadeh in 1965, it rests on two principles: the theory of fuzzy sets and the use of fuzzy rules. A fuzzy set is one whose elements have a degree of membership between 0 and 1, and each fuzzy set is associated with an imprecise property, or linguistic variable. Fuzzy rules apply the principles of classical logic to fuzzy sets, taking the degree of membership of each element in the reference fuzzy set as its truth value. Fuzzy logic makes it possible to formulate coherent urological recommendations (e.g., in which patients is a PSA test indicated? what should be done when PSA is elevated?) and to make diagnoses adapted to the uncertainty of diagnostic tests (e.g., data obtained from pressure-flow studies in women).
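A minimal sketch of this machinery: a fuzzy membership function and one fuzzy rule. The thresholds and the rule itself are illustrative assumptions, not clinical guidance:

```python
# Sketch of fuzzy sets and a fuzzy rule. Membership thresholds are
# hypothetical illustrations, not clinical cut-offs.

def mu_elevated_psa(psa):
    """Degree of membership in the fuzzy set 'elevated PSA' (0..1)."""
    if psa <= 4.0:
        return 0.0
    if psa >= 10.0:
        return 1.0
    return (psa - 4.0) / 6.0          # linear ramp between 4 and 10 ng/ml

def mu_older_patient(age):
    if age <= 50:
        return 0.0
    if age >= 75:
        return 1.0
    return (age - 50) / 25.0

# Fuzzy rule "IF PSA is elevated AND patient is older THEN consider biopsy",
# with min() as the classical fuzzy AND (t-norm):
def biopsy_support(psa, age):
    return min(mu_elevated_psa(psa), mu_older_patient(age))

print(biopsy_support(psa=6.5, age=68))   # degree of truth, here ~0.42
```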
Poisson-Nernst-Planck-Fermi theory for modeling biological ion channels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Jinn-Liang, E-mail: jinnliu@mail.nhcue.edu.tw; Eisenberg, Bob, E-mail: beisenbe@rush.edu
2014-12-14
A Poisson-Nernst-Planck-Fermi (PNPF) theory is developed for studying ionic transport through biological ion channels. Our goal is to deal with the finite size of particles using a Fermi-like distribution without calculating the forces between the particles, because such forces are both expensive and tricky to compute. We include the steric effect of ions and water molecules with nonuniform sizes and interstitial voids, the correlation effect of crowded ions with different valences, and the screening effect of water molecules in an inhomogeneous aqueous electrolyte. Including the finite volume of water and the voids between particles is an important new part of the theory presented here. Fermi-like distributions of all particle species are derived from the volume exclusion of classical particles. Volume exclusion and the resulting saturation phenomena are especially important for describing the binding and permeation mechanisms of ions in a narrow channel pore. The Gibbs free energy of the Fermi distribution reduces to that of a Boltzmann distribution when these effects are not considered. The classical Gibbs entropy is extended to a new entropy form, called the Gibbs-Fermi entropy, that describes mixing configurations of all finite-size particles and voids in a thermodynamic system where microstates do not have equal probabilities. The PNPF model describes the dynamic flow of ions, water molecules, and voids with electric fields and protein charges. The model also provides a quantitative mean-field description of the charge/space competition mechanism of particles within the highly charged and crowded channel pore. The PNPF results are in good accord with experimental currents recorded over a 10^8-fold range of Ca^2+ concentrations. The results illustrate the anomalous mole fraction effect, a signature of L-type calcium channels. Moreover, numerical results concerning water density, dielectric permittivity, void volume, and steric energy provide useful details for studying a variety of physical mechanisms ranging from binding to permeation, blocking, flexibility, and charge/space competition in the channel.
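The PNPF functional itself is beyond a snippet, but the qualitative saturation point can be illustrated with a Bikerman-style lattice-gas correction to the Boltzmann distribution. This is a schematic stand-in, not the paper's theory, and the packing fraction is an assumption:

```python
import numpy as np

# Schematic illustration (not the PNPF functional): a Bikerman-style
# steric correction saturates the Boltzmann distribution at large local
# potentials, mimicking the volume-exclusion saturation described above.

def boltzmann(c_bulk, z, phi):            # phi in units of kT/e
    return c_bulk * np.exp(-z * phi)

def fermi_like(c_bulk, z, phi, packing=0.3):
    """Saturating distribution: concentration cannot exceed close packing."""
    b = boltzmann(c_bulk, z, phi)
    return b / (1 + packing * (b / c_bulk - 1))

for phi in (0.0, -2.0, -5.0, -10.0):
    print(f"phi = {phi:+5.1f} kT/e   Boltzmann = {boltzmann(1.0, 1, phi):9.1f}   "
          f"Fermi-like = {fermi_like(1.0, 1, phi):5.2f}")
# The Boltzmann value grows without bound; the Fermi-like value saturates
# at c_bulk / packing.
```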
Soliton Gases and Generalized Hydrodynamics
NASA Astrophysics Data System (ADS)
Doyon, Benjamin; Yoshimura, Takato; Caux, Jean-Sébastien
2018-01-01
We show that the equations of generalized hydrodynamics (GHD), a hydrodynamic theory for integrable quantum systems at the Euler scale, emerge in full generality in a family of classical gases, which generalize the gas of hard rods. In this family, the particles, upon colliding, jump forward or backward by a distance that depends on their velocities, reminiscent of classical soliton scattering. This provides a "molecular dynamics" for GHD: a numerical solver which is efficient, flexible, and which applies to the presence of external force fields. GHD also describes the hydrodynamics of classical soliton gases. We identify the GHD of any quantum model with that of the gas of its solitonlike wave packets, thus providing a remarkable quantum-classical equivalence. The theory is directly applicable, for instance, to integrable quantum chains and to the Lieb-Liniger model realized in cold-atom experiments.
A classical density functional theory of ionic liquids.
Forsman, Jan; Woodward, Clifford E; Trulsson, Martin
2011-04-28
We present a simple, classical density functional approach to the study of simple models of room temperature ionic liquids. Dispersion attractions as well as ion correlation effects and excluded volume packing are taken into account. The oligomeric structure, common to many ionic liquid molecules, is handled by a polymer density functional treatment. The theory is evaluated by comparisons with simulations, with an emphasis on the differential capacitance, an experimentally measurable quantity of significant practical interest.
Complementarity and Correlations
NASA Astrophysics Data System (ADS)
Maccone, Lorenzo; Bruß, Dagmar; Macchiavello, Chiara
2015-04-01
We provide an interpretation of entanglement based on classical correlations between measurement outcomes of complementary properties: States that have correlations beyond a certain threshold are entangled. The reverse is not true, however. We also show that, surprisingly, all separable nonclassical states exhibit smaller correlations for complementary observables than some strictly classical states. We use mutual information as a measure of classical correlations, but we conjecture that the first result holds also for other measures (e.g., the Pearson correlation coefficient or the sum of conditional probabilities).
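A minimal sketch of the correlation measure used here: mutual information between measurement outcomes, computed from a joint outcome distribution. The two distributions below are illustrative extremes:

```python
import numpy as np

# Sketch: mutual information of measurement outcomes, the classical
# correlation measure used above. `p` is a joint distribution of
# outcomes (a, b); the examples are illustrative.

def mutual_information(p):
    p = np.asarray(p, dtype=float)
    pa, pb = p.sum(axis=1), p.sum(axis=0)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / np.outer(pa, pb)[mask])))

perfectly_correlated = np.array([[0.5, 0.0],
                                 [0.0, 0.5]])
independent = np.full((2, 2), 0.25)
print(mutual_information(perfectly_correlated))  # 1.0 bit
print(mutual_information(independent))           # 0.0 bits
```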
Optical Correlation Techniques In Fluid Dynamics
NASA Astrophysics Data System (ADS)
Schatzel, K.; Schulz-DuBois, E. O.; Vehrenkamp, R.
1981-05-01
Three flow measurement techniques make use of fast digital correlators. (1) Most widespread is photon correlation velocimetry, using crossed laser beams and detecting Doppler-shifted light scattered by small particles in the flow. Depending on the processing of the photon correlogram, this technique yields mean velocity, turbulence level, or even the detailed probability distribution of one velocity component. An improved data processing scheme is demonstrated on laminar vortex flow in a curved channel. (2) Rate correlation based upon threshold crossings of a high-pass filtered laser Doppler signal can be used to obtain velocity correlation functions. The most powerful setup developed in our laboratory uses a phase-locked-loop tracker and a multibit correlator to analyse time-dependent Taylor vortex flow. With two optical systems and trackers, cross-correlation functions reveal phase relations between different vortices. (3) Making use of refractive index fluctuations (e.g. in two-phase flows) instead of scattering particles, interferometry with bidirectional fringe counting, together with digital correlation and probability analysis, constitutes a new quantitative technique related to classical Schlieren methods. Measurements on a mixing flow of heated and cold air contribute new ideas to the theory of turbulent random phase screens.
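A minimal sketch of the quantity such digital correlators estimate: the normalized intensity autocorrelation g2(tau), here computed from a synthetic photon-count record with an assumed periodic modulation:

```python
import numpy as np

# Sketch: normalized intensity autocorrelation g2(tau) estimated from a
# synthetic photon-count series (Poisson counts with a sinusoidal rate).

rng = np.random.default_rng(1)
rate = 5.0 * (1 + 0.5 * np.sin(2 * np.pi * np.arange(10000) / 80.0))
counts = rng.poisson(rate)                 # synthetic detector counts per bin

def g2(counts, max_lag):
    mean2 = counts.mean() ** 2
    return np.array([np.mean(counts[:-lag] * counts[lag:]) / mean2
                     for lag in range(1, max_lag + 1)])

corr = g2(counts, 160)
print(np.round(corr[:5], 3))   # oscillates with the 80-bin modulation period
```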
Optical correlation techniques in fluid dynamics
NASA Astrophysics Data System (ADS)
Schätzel, K.; Schulz-Dubois, E. O.; Vehrenkamp, R.
1981-04-01
Three flow measurement techniques make use of fast digital correlators. The most widespread is photon correlation velocimetry, using crossed laser beams and detecting Doppler-shifted light scattered by small particles in the flow. Depending on the processing of the photon correlation output, this technique yields mean velocity, turbulence level, and even the detailed probability distribution of one velocity component. An improved data processing scheme is demonstrated on laminar vortex flow in a curved channel. In the second method, rate correlation based upon threshold crossings of a high-pass filtered laser Doppler signal can be used to obtain velocity correlation functions. The most powerful set-up developed in our laboratory uses a phase-locked-loop tracker and a multibit correlator to analyze time-dependent Taylor vortex flow. With two optical systems and trackers, cross-correlation functions reveal phase relations between different vortices. The last method makes use of refractive index fluctuations (e.g. in two-phase flows) instead of scattering particles. Interferometry with bidirectional counting, together with digital correlation and probability analysis, constitutes a new quantitative technique related to classical Schlieren methods. Measurements on a mixing flow of heated and cold air contribute new ideas to the theory of turbulent random phase screens.
A network coding based routing protocol for underwater sensor networks.
Wu, Huayang; Chen, Min; Guan, Xin
2012-01-01
Due to the particularities of the underwater environment, several negative factors seriously degrade data transmission rates, the reliability of data communication, communication range, network throughput, and the energy consumption of underwater sensor networks (UWSNs). Thus, when routing protocols for underwater sensor networks are studied, full consideration must be given to saving node energy while maintaining quick, correct and effective data transmission and extending the network life cycle. In this paper, we propose a novel routing algorithm for UWSNs. To increase energy efficiency and extend network lifetime, we propose a time-slot based routing algorithm (TSBR), and we design a probability-balanced mechanism and apply it to TSBR. The theory of network coding is then introduced into TSBR to further reduce node energy consumption and extend network lifetime; the result is time-slot based balanced network coding (TSBNC). We evaluated the proposed algorithm and compared it with other classical underwater routing protocols. The simulation results show that the proposed protocol can reduce the probability of node conflicts, shorten the routing construction process, balance the energy consumption of each node and effectively prolong the network lifetime.
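A minimal sketch of the network-coding primitive such protocols exploit: a relay XORs two packets so that one broadcast replaces two transmissions. The topology and packet contents are illustrative; this is not the TSBNC protocol itself:

```python
# Sketch: XOR network coding at a relay. Nodes A and B each hold their own
# packet; the relay broadcasts one coded packet instead of forwarding two.

def xor_packets(p1: bytes, p2: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(p1, p2))

pkt_a = b"sensor-A"                        # node A's own packet
pkt_b = b"sensor-B"                        # node B's own packet

coded = xor_packets(pkt_a, pkt_b)          # single relay broadcast
assert xor_packets(coded, pkt_a) == pkt_b  # node A recovers B's packet
assert xor_packets(coded, pkt_b) == pkt_a  # node B recovers A's packet
```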
NASA Astrophysics Data System (ADS)
Huyskens, P.; Kapuku, F.; Colemonts-Vandevyvere, C.
1990-09-01
In liquids the partners of H bonds constantly change. As a consequence the entities observed by IR spectroscopy are not the same as those considered for thermodynamic properties. For the latter, the H-bonds are shared by all the molecules. The thermodynamic "monomeric fraction", γ, the time fraction during which an alcohol molecule is vaporizable, is the square root of the spectroscopic monomeric fraction, and is the fraction of molecules which, during a time interval of 10^{-14} s, have their hydroxylic proton and their lone pairs free. The classical thermodynamic treatments of Mecke and Prigogine consider the spectroscopic entities as real thermodynamic entities. Opposed to this, the mobile order theory considers all the formal molecules as equal but with a reduction of the entropy due to the fact that during a fraction 1-γ of the time, the OH proton follows a neighbouring oxygen atom on its journey through the liquid. Mobile order theory and the classic multicomponent treatment lead, in binary mixtures of the associated substance A with the inert substance S, to expressions of the chemical potentials μ_A and μ_S that are fundamentally different. However, the differences become very important only when the molar volumes V̄_S and V̄_A differ by a factor larger than 2. As a consequence the equations of the classic theory can still fit the experimental vapour pressure data of mixtures of liquid alcohols and liquid alkanes. However, the solubilities of solid alkanes in water, for which V̄_S > 3 V̄_A, are only correctly predicted by the mobile order theory.
From Foucault to Freire through Facebook: Toward an Integrated Theory of mHealth
ERIC Educational Resources Information Center
Bull, Sheana; Ezeanochie, Nnamdi
2016-01-01
Objective: To document the integration of social science theory in literature on mHealth (mobile health) and consider opportunities for integration of classic theory, health communication theory, and social networking to generate a relevant theory for mHealth program design. Method: A secondary review of research syntheses and meta-analyses…
Definition of the Neutrosophic Probability
NASA Astrophysics Data System (ADS)
Smarandache, Florentin
2014-03-01
Neutrosophic probability (or likelihood) [1995] is a particular case of the neutrosophic measure. It is an estimation that an event occurs, together with an estimation that some indeterminacy may occur, and an estimation that the event does not occur. Classical probability deals with fair dice, coins, roulettes, spinners, decks of cards and random walks, while neutrosophic probability deals with unfair or imperfect versions of such objects and processes. For example, if we toss a regular die onto an irregular surface with cracks, it is possible for the die to get stuck on one of its edges or vertices in a crack (an indeterminate outcome). The sample space is in this case {1, 2, 3, 4, 5, 6, indeterminacy}, so the probability of getting, for example, a 1 is less than 1/6, since there are seven possible outcomes. Neutrosophic probability is a generalization of classical probability: when the chance of indeterminacy in a stochastic process is zero, the two probabilities coincide. The neutrosophic probability that an event A occurs is NP(A) = (ch(A), ch(indet A), ch(Ā)) = (T, I, F), where T, I, F are subsets of [0,1]: T is the chance that A occurs, denoted ch(A); I is the indeterminate chance related to A, ch(indet A); and F is the chance that A does not occur, ch(Ā). So NP is a generalization of imprecise probability as well. If T, I and F are crisp numbers, then ⁻0 ≤ T + I + F ≤ 3⁺. We use the same notation (T, I, F) as in neutrosophic logic and set theory.
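A minimal sketch of the cracked-die example, with the indeterminacy mass chosen as an assumed illustration (the abstract does not fix its value):

```python
from fractions import Fraction

# Sketch: neutrosophic probability triple (T, I, F) for the cracked-die
# example. The indeterminacy mass 1/10 is an assumed illustration.

def neutrosophic_probability(chance_occurs, chance_indeterminate):
    T = chance_occurs
    I = chance_indeterminate
    F = 1 - T - I          # chance the event does not occur (crisp case)
    return (T, I, F)

indet = Fraction(1, 10)                        # die stuck in a crack
p_one = (1 - indet) / 6                        # each face shares the remainder
print(neutrosophic_probability(p_one, indet))  # T = 3/20 < 1/6, I = 1/10, F = 3/4
```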
Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vourdas, A.
2014-08-15
The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2) onto the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.
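A minimal numerical sketch of the additivity relation in question, p(H1 ∨ H2) + p(H1 ∧ H2) = p(H1) + p(H2), checked for quantum probabilities p(H) = Tr(ρ P_H) on two non-commuting subspaces of C². The join/meet construction via column spaces and orthocomplements is an illustration, not the paper's operator formalism:

```python
import numpy as np

def span_projector(M, tol=1e-10):
    """Orthogonal projector onto the column space of M."""
    u, s, _ = np.linalg.svd(M, full_matrices=False)
    u = u[:, s > tol]
    return u @ u.conj().T

def join(P1, P2):                 # projector onto H1 v H2
    return span_projector(np.hstack([P1, P2]))

def meet(P1, P2):                 # H1 ^ H2 = (H1' v H2')' via orthocomplements
    I = np.eye(P1.shape[0])
    return I - join(I - P1, I - P2)

def additivity_deviation(rho, P1, P2):
    p = lambda P: np.trace(rho @ P).real
    return p(join(P1, P2)) + p(meet(P1, P2)) - p(P1) - p(P2)

v0 = np.array([[1.0], [0.0]])                  # |0>
vp = np.array([[1.0], [1.0]]) / np.sqrt(2)     # |+>
P1, P2 = span_projector(v0), span_projector(vp)
rho = P1                                       # pure state |0><0|
print(additivity_deviation(rho, P1, P2))       # nonzero (-0.5): Kolmogorov
                                               # additivity fails for [P1,P2] != 0
```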
Generalizability Theory and Classical Test Theory
ERIC Educational Resources Information Center
Brennan, Robert L.
2011-01-01
Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…
The Giffen Effect: A Note on Economic Purposes.
ERIC Educational Resources Information Center
Williams, William D.
1990-01-01
Describes the Giffen effect: demand for a commodity increases as price increases. Explains how applying control theory eliminates the paradox that the Giffen effect presents to classic economics supply and demand theory. Notes the differences in how conventional demand theory and control theory treat consumer behavior. (CH)
Personality Theories for the 21st Century
ERIC Educational Resources Information Center
McCrae, Robert R.
2011-01-01
Classic personality theories, although intriguing, are outdated. The five-factor model of personality traits reinvigorated personality research, and the resulting findings spurred a new generation of personality theories. These theories assign a central place to traits and acknowledge the crucial role of evolved biology in shaping human…
Continuous Time in Consistent Histories
NASA Astrophysics Data System (ADS)
Savvidou, Konstantina
1999-12-01
We discuss the case of histories labelled by a continuous time parameter in the History Projection Operator formulation of consistent-histories quantum theory. We describe how the appropriate representation of the history algebra may be chosen by requiring the existence of projection operators that represent propositions about time averages of the energy. We define the action operator for the consistent histories formalism, as the quantum analogue of the classical action functional, for the case of the simple harmonic oscillator. We show that the action operator is the generator of two types of time transformation that may be related to the two laws of time evolution of standard quantum theory: the 'state-vector reduction' and the unitary time evolution. We construct the corresponding classical histories and demonstrate their relation to the quantum histories, showing how the requirement of a temporal logic structure for the theory suffices for the definition of classical histories. Furthermore, we show the relation of the action operator to the decoherence functional, which describes the dynamics of the system. Finally, the discussion is extended to give a preliminary account of quantum field theory in this approach to the consistent histories formalism.
Effects of Extrinsic Mortality on the Evolution of Aging: A Stochastic Modeling Approach
Shokhirev, Maxim Nikolaievich; Johnson, Adiv Adam
2014-01-01
The evolutionary theories of aging are useful for gaining insights into the complex mechanisms underlying senescence. Classical theories argue that high levels of extrinsic mortality should select for the evolution of shorter lifespans and earlier peak fertility. Non-classical theories, in contrast, posit that an increase in extrinsic mortality could select for the evolution of longer lifespans. Although numerous studies support the classical paradigm, recent data challenge classical predictions, finding that high extrinsic mortality can select for the evolution of longer lifespans. To further elucidate the role of extrinsic mortality in the evolution of aging, we implemented a stochastic, agent-based, computational model. We used a simulated annealing optimization approach to predict which model parameters predispose populations to evolve longer or shorter lifespans in response to increased levels of predation. We report that longer lifespans evolved in the presence of rising predation if the cost of mating is relatively high and if energy is available in excess. Conversely, we found that dramatically shorter lifespans evolved when mating costs were relatively low and food was relatively scarce. We also analyzed the effects of increased predation on various parameters related to density dependence and energy allocation. Longer and shorter lifespans were accompanied by increased and decreased investments of energy into somatic maintenance, respectively. Similarly, earlier and later maturation ages were accompanied by increased and decreased energetic investments into early fecundity, respectively. Higher predation significantly decreased the total population size, enlarged the shared resource pool, and redistributed energy reserves for mature individuals. These results both corroborate and refine classical predictions, demonstrating a population-level trade-off between longevity and fecundity and identifying conditions that produce both classical and non-classical lifespan effects. PMID:24466165
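A heavily simplified agent-based sketch in the spirit of the model described above. All parameters are illustrative, and the published model's energy budgets, mating costs and density dependence are omitted; with the longevity-fecundity trade-off assumed here, the sketch reproduces only the classical direction (higher predation evolves shorter lifespans):

```python
import random

# Sketch (not the published model): each agent carries a lifespan gene L,
# faces a per-year extrinsic mortality `predation`, and pays a longevity-
# fecundity trade-off (yearly fertility ~ K / L, i.e. maintenance is costly).

random.seed(42)

def evolve(predation, generations=200, pop_size=400, K=6.0, mut=0.5):
    pop = [random.uniform(5.0, 20.0) for _ in range(pop_size)]
    for _ in range(generations):
        offspring = []
        for L in pop:
            age = 0
            while age < L and random.random() > predation:
                age += 1                            # survive one more year
            n_kids = int(K * age / L)               # trade-off: fertility ~ 1/L
            offspring += [max(1.0, L + random.gauss(0.0, mut))
                          for _ in range(n_kids)]   # offspring inherit L +/- mutation
        if not offspring:
            break
        pop = random.choices(offspring, k=pop_size) # density regulation
    return sum(pop) / len(pop)

for p in (0.02, 0.10, 0.30):
    print(f"predation {p:.2f} -> mean evolved lifespan {evolve(p):5.1f}")
```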
Assessing the quantum physics impacts on future x-ray free-electron lasers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmitt, Mark J.; Anisimov, Petr Mikhaylovich
A new quantum mechanical theory of x-ray free electron lasers (XFELs) has been successfully developed that has placed LANL at the forefront of the understanding of quantum effects in XFELs. Our quantum theory describes the interaction of relativistic electrons with x-ray radiation in the periodic magnetic field of an undulator using the same mathematical formalism as classical XFEL theory. This places classical and quantum treatments on the same footing and allows for a continuous transition from one regime to the other, eliminating the disparate analytical approaches previously used. Moreover, Dr. Anisimov, the architect of this new theory, is now considered a resource in the international FEL community for assessing quantum effects in XFELs.
Seeking parsimony in hydrology and water resources technology
NASA Astrophysics Data System (ADS)
Koutsoyiannis, D.
2009-04-01
The principle of parsimony, also known as the principle of simplicity, the principle of economy and Ockham's razor, advises scientists to prefer the simplest theory among those that fit the data equally well. In this, it is an epistemic principle, but it reflects an ontological characterization that the universe is ultimately parsimonious. Is this principle useful, and can it really be reconciled with, and applied to, our modelling approaches for complex hydrological systems, whose elements and events are extraordinarily numerous, diverse and unique? The answer underlying the mainstream hydrological research of the last two decades seems to be negative. Hopes were invested in the power of computers, which would enable faithful and detailed representation of the diverse system elements and hydrological processes, based merely on "first principles" and resulting in "physically-based" models that tend to approach the complexity of real-world systems. Today the balance of this research endeavour appears negative, as it has improved neither the predictive capacity of models nor the comprehension of processes. A return to parsimonious modelling seems once again to be the promising route. Experience from recent research, and from comparisons of parsimonious and complicated models, indicates that the former can facilitate insight and comprehension, improve accuracy and predictive capacity, and increase efficiency. In addition, and despite the aspiration that "physically based" models would have lower data requirements and even ultimately become "data-free", parsimonious models require fewer data to achieve the same accuracy as more complicated models. Naturally, the concepts that reconcile the simplicity of parsimonious models with the complexity of hydrological systems are probability theory and statistics. Probability theory provides the theoretical basis for moving from a microscopic to a macroscopic view of phenomena, by mapping sets of diverse elements and events of hydrological systems to single numbers (a probability or an expected value), and statistics provides the empirical basis for summarizing data, making inferences from them, and supporting decision making in water resource management. Unfortunately, the current state of the art in probability, statistics and their union, often called stochastics, is not fully satisfactory for the needs of modelling hydrological and water resource systems. A first problem is that stochastic modelling has traditionally relied on classical statistics, which is based on the independent "coin-tossing" prototype rather than on the study of real-world systems whose behaviour is very different from this classical prototype. A second problem is that stochastic models (particularly multivariate ones) are often not parsimonious themselves. Therefore, substantial advancement of stochastics is necessary for a new paradigm of parsimonious hydrological modelling.
These ideas are illustrated using several examples, namely: (a) hydrological modelling of a karst system in Bosnia and Herzegovina using three different approaches ranging from parsimonious to detailed "physically-based"; (b) parsimonious modelling of a peculiar modified catchment in Greece; (c) a stochastic approach that can replace parameter-excessive ARMA-type models with a generalized algorithm that produces any shape of autocorrelation function (consistent with the accuracy provided by the data) using a couple of parameters; (d) a multivariate stochastic approach which replaces a huge number of parameters estimated from data with coefficients estimated by the principle of maximum entropy; and (e) a parsimonious approach for decision making in multi-reservoir systems using a handful of parameters instead of thousands of decision variables.
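As a sketch of item (c), a series with a prescribed two-parameter autocorrelation can be generated from a Cholesky factor of the implied covariance matrix. The form ρ(τ) = (1 + κτ)^(−β) used below is an assumed generalized-Cauchy-type ACF, not necessarily the paper's exact algorithm:

```python
import numpy as np

# Sketch: synthesize a time series whose autocorrelation follows the
# two-parameter form rho(tau) = (1 + kappa*tau)**(-beta) (assumed here),
# via the Cholesky factor of the implied Toeplitz correlation matrix.

def synthesize(n, kappa, beta, rng):
    tau = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    cov = (1.0 + kappa * tau) ** (-beta)          # valid (completely monotone) ACF
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))
    return L @ rng.standard_normal(n)

rng = np.random.default_rng(0)
x = synthesize(2000, kappa=0.5, beta=0.3, rng=rng)

emp = [np.corrcoef(x[:-k], x[k:])[0, 1] for k in (1, 10, 100)]
print([f"{r:.2f}" for r in emp])   # slowly decaying, power-law-like memory
```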
Exact sampling hardness of Ising spin models
NASA Astrophysics Data System (ADS)
Fefferman, B.; Foss-Feig, M.; Gorshkov, A. V.
2017-09-01
We study the complexity of classically sampling from the output distribution of an Ising spin model, which can be implemented naturally in a variety of atomic, molecular, and optical systems. In particular, we construct a specific example of an Ising Hamiltonian that, after time evolution starting from a trivial initial state, produces a particular output configuration with probability very nearly proportional to the square of the permanent of a matrix with arbitrary integer entries. In a similar spirit to boson sampling, the ability to sample classically from the probability distribution induced by time evolution under this Hamiltonian would imply unlikely complexity theoretic consequences, suggesting that the dynamics of such a spin model cannot be efficiently simulated with a classical computer. Physical Ising spin systems capable of achieving problem-size instances (i.e., qubit numbers) large enough so that classical sampling of the output distribution is classically difficult in practice may be achievable in the near future. Unlike boson sampling, our current results only imply hardness of exact classical sampling, leaving open the important question of whether a much stronger approximate-sampling hardness result holds in this context. The latter is most likely necessary to enable a convincing experimental demonstration of quantum supremacy. As referenced in a recent paper [A. Bouland, L. Mancinska, and X. Zhang, in Proceedings of the 31st Conference on Computational Complexity (CCC 2016), Leibniz International Proceedings in Informatics (Schloss Dagstuhl-Leibniz-Zentrum für Informatik, Dagstuhl, 2016)], our result completes the sampling hardness classification of two-qubit commuting Hamiltonians.
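For very small systems the output distribution in question can of course be computed by brute force. A sketch for three qubits with illustrative integer couplings, using one natural commuting-Hamiltonian circuit family (prepare |+...+>, evolve under a diagonal Ising Hamiltonian, measure in the X basis); this is an exact statevector calculation, not a hardness demonstration:

```python
import numpy as np
from itertools import product

# Brute-force sketch (n = 3): exact output distribution after evolving
# |+>^n under a diagonal Ising Hamiltonian and measuring in the X basis.
# The couplings J are arbitrary illustrative integers.

n = 3
J = {(0, 1): 1, (1, 2): 2, (0, 2): 3}
t = 0.7

# Diagonal energies E(z) with spins s_i = 1 - 2*z_i for each basis string z:
zs = list(product((0, 1), repeat=n))
E = np.array([sum(Jij * (1 - 2 * z[i]) * (1 - 2 * z[j])
                  for (i, j), Jij in J.items()) for z in zs])

plus = np.full(2**n, 2**(-n / 2))        # |+>^n in the Z basis
psi = np.exp(-1j * E * t) * plus         # diagonal time evolution

H1 = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
Hn = H1
for _ in range(n - 1):
    Hn = np.kron(Hn, H1)                 # Hadamard on every qubit
probs = np.abs(Hn @ psi) ** 2            # X-basis outcome probabilities

print(np.round(probs, 4), probs.sum())   # distribution; total probability 1
```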
NP-hardness of decoding quantum error-correction codes
NASA Astrophysics Data System (ADS)
Hsieh, Min-Hsiu; Le Gall, François
2011-05-01
Although the theory of quantum error correction is intimately related to classical coding theory and, in particular, one can construct quantum error-correction codes (QECCs) from classical codes with the dual-containing property, this does not necessarily imply that the computational complexity of decoding QECCs is the same as for their classical counterparts. Instead, decoding QECCs can differ greatly from decoding classical codes due to the degeneracy property. Intuitively, one might expect degeneracy to simplify decoding, since two different errors need not be distinguished in order to be corrected. However, we show that the general quantum decoding problem is NP-hard regardless of whether the quantum code is degenerate or nondegenerate. This finding implies that, unless P = NP, no efficient algorithm exists for the general quantum decoding problem, and it suggests the possibility of a quantum cryptosystem based on the hardness of decoding QECCs.
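A minimal sketch of why generic decoding is a search problem: an exhaustive minimum-weight decoder over a toy parity-check matrix. This is the classical syndrome-decoding setting for illustration; degenerate quantum decoding adds further structure on top of it:

```python
from itertools import combinations
import numpy as np

# Sketch: brute-force minimum-weight syndrome decoding (ML at low noise).
# The 2x5 parity-check matrix H is an arbitrary illustration.

H = np.array([[1, 1, 0, 1, 0],
              [0, 1, 1, 0, 1]], dtype=int)   # parity checks, arithmetic mod 2

def decode(syndrome):
    n = H.shape[1]
    for weight in range(n + 1):              # exhaustive: exponential in general
        for support in combinations(range(n), weight):
            e = np.zeros(n, dtype=int)
            e[list(support)] = 1
            if np.array_equal(H @ e % 2, syndrome):
                return e                     # first hit = minimum-weight error
    return None

error = np.array([0, 0, 1, 0, 0])
s = H @ error % 2
print(decode(s))                             # recovers the weight-1 error
```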
A qualitative study of the influence of poor dental aesthetics on the lives of young adults.
Josefsson, Eva; Lindsten, Rune; Hallberg, Lillemor R-M
2010-01-01
Although many countries offer some publicly funded orthodontic treatment for children, not all conditions receive treatment and some adolescents enter adulthood with persisting poor dental aesthetics or malocclusions. The aim of this study was to generate a theory highlighting the main concerns of young adults, either native-born or of immigrant background, with poor dental aesthetics and the measures they adopt to manage their condition in everyday life. A qualitative method, classic grounded theory, was applied in order to generate a substantive theory highlighting the main concerns and managing mechanisms of 13 strategically selected 19- and 20-year-olds with poor dental aesthetics. Open interviews were conducted with each participant, the topics covering different aspects of social and dental conditions. A core category and three conceptual categories were generated. The core category was labelled "Being under the pressure of social norms" and was related to categories explaining three different ways in which these young adults handle their main concern: (1) avoiding showing their teeth; (2) minimizing the importance of appearance; and (3) seeking orthodontic treatment. The theory offers the potential for improved understanding of young adults who, despite poor dental aesthetics, are managing well with life, and also of those who have not adjusted well. In early adolescence it may be problematic to make decisions about orthodontic treatment. Undisclosed dental fear can be an important barrier. Some of the young adults in the present study would probably benefit from treatment.
A new look at the position operator in quantum theory
NASA Astrophysics Data System (ADS)
Lev, F. M.
2015-01-01
The postulate that coordinate and momentum representations are related to each other by the Fourier transform has been accepted from the beginning of quantum theory, by analogy with classical electrodynamics. As a consequence, an inevitable effect in the standard theory is the wave packet spreading (WPS) of the photon coordinate wave function in directions perpendicular to the photon momentum. This leads to the following paradoxes: if the major part of photons emitted by stars are in wave packet states (which is the most probable scenario), then we should see not separate stars but only an almost continuous background from all stars; no anisotropy of the CMB radiation should be observable; and data on gamma-ray bursts, signals from directional radio antennas (in particular, in experiments on the Shapiro delay) and signals from pulsars show no signs of WPS. In addition, the question arises of why there are no signs of WPS for protons in the LHC ring. We argue that the above postulate is based neither on strong theoretical arguments nor on experimental data, and we propose a new, consistent definition of the position operator. Then WPS in directions perpendicular to the particle momentum is absent and the paradoxes are resolved. Different components of the new position operator do not commute with each other and, as a consequence, there is no wave function in the coordinate representation. Implications of the results for entanglement, quantum locality and the problem of time in quantum theory are discussed.
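For massive particles the standard spreading law underlying the WPS discussion is explicit: σ(t) = σ₀ √(1 + (ħt/mσ₀²)²). A minimal sketch evaluating it for a proton (the photon case, which the paper treats differently, is not captured by this nonrelativistic formula, and the initial width is an illustrative choice):

```python
import numpy as np

# Sketch: standard nonrelativistic Gaussian wave-packet spreading,
# sigma(t) = sigma0 * sqrt(1 + (hbar*t / (m*sigma0**2))**2).

hbar = 1.054571817e-34          # J s
m = 1.67262192e-27              # proton mass, kg

def sigma(t, sigma0):
    return sigma0 * np.sqrt(1 + (hbar * t / (m * sigma0**2))**2)

sigma0 = 1e-9                   # illustrative 1 nm initial width
for t in (1e-6, 1e-3, 1.0):     # seconds
    print(f"t = {t:8.0e} s -> sigma = {sigma(t, sigma0):.3e} m")
```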
On quantum effects in a theory of biological evolution.
Martin-Delgado, M A
2012-01-01
We construct a descriptive toy model that considers quantum effects on biological evolution starting from Chaitin's classical framework. There are smart evolution scenarios in which a quantum world is as favorable as classical worlds for evolution to take place. However, in more natural scenarios, the rate of evolution depends on the degree of entanglement present in quantum organisms with respect to classical organisms. If the entanglement is maximal, classical evolution turns out to be more favorable.